The term imperialism means the policy of extending a nation's authority over other nations. It often carries the negative and racist connotation of white countries dominating other races. In what way did America's image suffer from these acquisitions?



Answer: