What is the meaning of the word "medicine"?

Definitions:

  1. (noun) the learned profession that is mastered by graduate training in a medical school and that is devoted to preventing, alleviating, or curing diseases and injuries
    • They studied medicine in hopes of helping people.

Synonyms: