Definitions of Haiti:
noun: a republic in the West Indies on the western part of the island of Hispaniola; achieved independence from France in 1804; the poorest and most illiterate nation in the Western Hemisphere