America likes to think that its culture is superior and that nothing it does is weird at all. Americans expect that people from other countries look up to them and that they are basically the best in every way, shape, and form. As an American, I don't think I'm very qualified to be the one to burst this bubble, so I'll let Redditors do it for me.
People on Reddit are sharing the weird things Americans do that they initially assumed were just silly stereotypes from films and TV, but that turned out to be very real. Some of these cultural norms are less flattering than others.