American Imperialism

By: Tan Ly


Since its foundation, the United States has always been an imperialist nation. Imperialism is the policy of extending the rule or authority of an empire or nation over foreign countries. Based on this definition, the United States is an imperialistic nation. During the 1800s and 1900s, the United States pursued an aggressive policy of extending its political and economic influence. From the moment Christopher Columbus first set foot in the Americas, the peoples who would build the United States have been controlling and inevitably destroying the cultures of groups different from their own. A country of immense pride, the nation has forever believed that the American way of living is the best and only way anyone should live. Through the use of propaganda and the exploits of the media, primarily newspapers, the U.S. government has dictated to its citizens that the "conquerization" of other peoples is not only acceptable but ultimately in the best interest of all parties.

At the 1904 World's Fair, various groups were put on display in an attempt to convince American citizens that the expansion of the United States was a beneficial, if not noble, humanitarian act. The largest, most popular, and most persuasive exhibition of the Fair was that of the Philippines. The exhibition convinced some doubters of the expansionist theory and cemented the belief of others who already supported an overseas empire. Newspapers were the major source of news in America and often sensationalized events going on around the world. Yellow journalism evolved as a way to increase circulation and heightened public concern about events outside the United States. Although the nation previously had not involved itself in foreign affairs, a shift...