What is the meaning of the word "medicine"?
Definitions:
- (noun) the learned profession, mastered through graduate training in a medical school, that is devoted to preventing, alleviating, or curing diseases and injuries
- They studied medicine in hopes of helping people.