Jul 25 2011
For the question I’m posing this morning, I’m not talking about shows you think are inherently bad and should be kicked off the air just because they suck. No, I mean shows that are literally harming the fabric of our society by existing. I don’t think many programs fall into that category, but as you can see above, one comes to mind immediately.
I’ve caught a few episodes of Man vs. Food in the past and watched a few more this weekend, and it’s hard not to be absolutely disgusted by it. Not just by imagining how awful it would feel to have that much food inside me, but by thinking about how many starving people there are in the world while we have a SHOW dedicated solely to the deadly sin of gluttony. It’s insane, in horrible taste, and it reinforces the view around the world that Americans are a bunch of lazy, overweight food vacuums.
Other examples of shows making America look bad? I’d argue any of the Real Housewives installments, and a show like Jersey Shore. Somehow that show has spread like a virus, and now people think all Americans are tan, musclebound, and idiotic, when that’s only a tiny fraction of us.
What other shows would you add to this category?