Definitions of Haiti:

  • noun:   a republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; among the poorest nations in the Western Hemisphere
  • noun:   an island in the West Indies