I’m not a US citizen and have never been to the US. But as far as I know, your country was never a really nice place. Slavery, the First Nations, Mexico, Hawaii, Tuskegee (and Guatemala), Operation Condor, Cambodia, Palestine… The list goes on, and on, and on…
Of course I don’t blame the everyday citizen, but your rulers have always been terrible for the rest of the world, and the people are blinded to it. As for democracy, the electoral college system is pretty much the opposite of democracy IMHO, and the most powerful force driving the country is not what people need or want; it’s money (and capitalism).
(Again, IMHO) The US’s image outside its borders is not “inspiring”; it is terrifying.
Brazil, does it make any difference?