What is the meaning of the word "tropics"?
Definitions:
- (noun) the part of the Earth's surface between the Tropic of Cancer and the Tropic of Capricorn
- Example: We like to vacation in the tropics during winter to avoid the cold.