British West Indies
A historic term for the British island colonies in the West Indies. Most of the islands gained their independence in the late 20th century; a few, such as the Cayman Islands and the Turks and Caicos Islands, remain British territories.
The American Heritage® Dictionary of the English Language, Fifth Edition copyright ©2022 by HarperCollins Publishers. All rights reserved.