The term imperialism means the policy of extending a nation's authority over other nations. It often carries the negative and racist connotation of white countries dominating other races. In what way did America's image suffer from these acquisitions?