What is the meaning of the word "dentistry"?

Definitions:

  1. (noun) the branch of medicine dealing with the anatomy, development, and diseases of the teeth
    • I studied dentistry in college.

Synonyms: