So the Oscars happened a few weeks ago, and once again, I didn't watch. Not only that, I didn't care one way or the other.
Hollywood lost me years ago. And they've done nothing to repair that relationship. In fact, they've made it much, much worse.
Not many people know this, but Hollywood used to be filled with patriotic guys who actually fought in real wars. For example, James Stewart, who flew bombing missions over Nazi-occupied Europe and earned the Distinguished Flying Cross and the Croix de Guerre, among other medals. Imagine that, an actor and a military hero.
Brig. Gen. Stewart risked his life, quite willingly, to fight against Nazis and for freedom, while today a useful idiot like Sean Penn risks nothing while he kisses up to a corrupt dictator like Hugo Chavez. Do we have to wonder what James Stewart might have thought about that? No, we do not.
No Hollywood studio boss during WWII or Korea would have dared to make any kind of anti-military movie that painted soldiers as bloodthirsty baby-murderers. They didn't believe that, and they knew the market would reject it completely. The country was engaged, along with much of the free world, in a fight for survival, and made great sacrifices for that fight. It was not a pose to impress others, like much of what passes for anti-war sentiment among the Hollywood left today.
Today's Hollywood wouldn't even know what to do with a real man like James Stewart, because today’s Hollywood doesn't write that role, doesn’t make that movie, and doesn't cater to that demographic.
In fact, today's Hollywood despises all of that, and actively works against it.
Today's Hollywood is quite happy to make movies with feminized male characters who wouldn’t know how to fire a weapon or ride a horse if their lives depended on it. They don't even talk like men any more. But they know how to use their cool new iPhone to find the nearest Starbucks.
I don’t think most people realize it, but we have had something stolen from us over the last 40-50 years: the connection between movies and heroism, duty, and honor.
That connection, in large part, was purposefully destroyed by Hollywood starting in the mid-1960s with a constant stream of depressing, narcissistic, counter-cultural movies. Which was, amazingly enough, right around the time of the breakup of the old studio system. Suddenly, celebrating heroism, duty, and honor became oh-so-very-passe.
You can see it for yourself: just read through the plot descriptions of the movies on Turner Classic Movies every week, and then compare that with the dreary crap that Hollywood has put out over the last 15 years, or the depressing, narcissistic movies put out before that. It's stunning, really, to see the complete about-face in movies from just the early '60s to the mid-'70s: from, say, "The Longest Day" to "Dog Day Afternoon", featuring a criminal as protagonist.
This is a comment about the focus of these movies, not the quality of them. Lots of very good movies were made after the mid-'60s, particularly in the '70s. But the focus of these movies, and the lessons they reinforced, were completely different from what came before. The shift was not subtle; it was striking.
Today, Hollywood complains about losing customers, yet continues to churn out more of this same derivative, boring crap filled with anti-military, anti-family, anti-Christian, anti-America messages.
Newsflash: out here in flyover country, supporting the troops is not just a pose, and it's plainly obvious that the movie-going audience, rather than leaving Hollywood, feels instead that Hollywood has left them. And with good reason.
The ugly truth for Hollywood is that we would come back if you would stop putting out crappy movies that hector us into feeling bad about ourselves, our society, and our country. I like movies, and I would go see some of them, but I absolutely will not subject myself or my family to propaganda explicitly intended to flip every traditional cultural norm on its head.
Obviously, somebody within the Hollywood power structure has thought about this too, somebody with the power to make the movies they want and tell the stories they want. It's not some revolutionary new idea that I alone have had. And they're in the business to make money, right? To make movies that put asses in the seats and sell tickets? You would think.
But maybe not. Maybe Hollywood is really about shaping attitudes, and is willing to lose money on movie after movie in order to re-shape American attitudes about our military, and to show what Hollywood really thinks about the country that allows it to say whatever it wants. A freedom they enjoy, in part, because of two things about America they hate: our military and our Constitution.
Feel the irony. Hollywood likes irony. It makes them feel clever and superior.
(An updated version of something I wrote in July 2010)