What is the meaning of the word "tropics"?

Definitions:

  1. (noun) the part of the Earth's surface between the Tropic of Cancer and the Tropic of Capricorn
    • We like to vacation in the tropics during winter to avoid the cold.