Define American Imperialism

American Imperialism
Where the United States forces its culture on other countries, many of which the entry regards as culturally backward. For example, in some Middle Eastern countries a person can have a hand cut off for stealing even a loaf of bread, and in parts of Africa women's genitals are cut in a practice called female circumcision, more accurately termed female genital mutilation.

By Cherise