What are some common misconceptions about the US that Europeans tend to have?

The US has gotten a lot of attention in the news and on social media lately. I have noticed many comments about the US that are very negative, and most of them are just plain wrong. As a European, I feel like there are many things we fail to understand about the US. What are some common misconceptions?