American imperialism can refer to many things, including but not limited to cultural, economic, and military or territorial imperialism. America has a history of spreading its influence around the world, shaping societies to conform to certain ways of life and imposing its beliefs on others. Whether spreading democracy to other countries, claiming territory as American soil, or allowing previously disenfranchised groups to effect change, this country has gained considerable experience with imperialism in the relatively short time it has existed as a nation.