Since the end of World War II, the United States has intervened militarily or covertly in virtually every country in Latin America.
