HOLLYWOOD – The Necessary Evil.



I’ve always been a fan of movies. I wouldn’t call myself a movie addict though, because that would be far-fetched. Growing up in the early 90s, my friends and I used to peep through windows just to catch a glimpse of a movie showing in a neighbour’s room. If you were spotted, well, they’d just close the window. If you were fortunate or Dbee enough to own a TV and a video cassette player, then good for YOU. I will always treasure those moments when a bath was our ticket to watch a movie in an uncle’s room. I miss the days when we gathered and narrated the movies we had seen among ourselves. Hmm, Ghana. Days when we wouldn’t miss Shifu (Journey to the West), Sinbad or Hercules. Those were the golden days.

Even though there are local movies, I found myself liking the exotic ones, the American movies, where there would be a killer (the villain) and a Jack/blowman (the hero). Don’t ask me when and how killing and shooting became a form of entertainment! I just loved the action.

Looking back, there are some observations that baffle my mind; the stereotypes and racism, coupled with gender bias and moral bankruptcy, are my major concerns.


Hollywood has a very peculiar way of tarnishing the image of women in its movies. I’m not saying every movie from Hollywood does that, but most of them do. When it comes to nudity, almost always, the women will go nude but the men won’t. When a couple is having sex in a movie, the woman will expose her boobs and ass, but the man will not show his penis. Scenes from bars are filled with topless women. Nudity is so unnecessary in some scenes, but Hollywood couldn’t care less. In the movie ‘Trance’, made in 2013, Rosario Dawson did a full frontal exposure of her vajayjay in a completely unnecessary scene. I lost a huge amount of respect for her after that movie. Talk about Game of Thrones, Power, Basic Instinct, 50 Shades of Grey – so many of them treating women as sex objects.

When it comes to witchcraft, women are cast as the villains all the time. A wizard, however, is a hero. Harry Potter, Gandalf, the Wizard of Oz, etc. all portray wizards as heroes, whereas Hansel and Gretel, The Witch, Season of the Witch and The Last Witch Hunter portray women as the wicked witch villains. I’m not saying I’m a fan of witchcraft, but come on. I think it’s about time women stood up for their rights and took their rightful place in the movie industry.


Well, I live on a different continent, in a country where morals are held in high esteem, not because of any rigid rules or regulations but because we as a people have a way of filtering what resides in our minds. This might not be the case in Hollywood, but they are infiltrating our societies with their culture through the movies they make. In every movie, sex is the definition of a relationship. Even if only for ten seconds, they will manage to squeeze a sex scene into the movie. They don’t even mind having children have sex on screen. ‘Paper Towns’, for instance, had two teenagers have sex. ‘Love Actually’ had a 9-year-old boy seriously searching for a girlfriend, and an English guy getting to have sex with three girls just because of his British accent. Now this is what Hollywood is telling us: sex is a normal act; you can have sex with anyone you want, anywhere, anytime. This is not our culture, and if we continue bringing these kinds of movies into our homes, sooner or later we will lose the values and morals we strongly uphold.


Well, I guess that heading was a bit stern, but it’s there anyway. Growing up in the 90s, movies had the main hero, usually a white man, and an ‘assistant’ hero, who most of the time was a black guy. In their quest to conquer or kill the villain, the black guy would do the dirty job and die, leaving the white guy to take the glory. It was so predictable. We rejoiced and hailed the white man as the hero – little did we know that they were conditioning our minds to see the white man as the hero all the time. That aside, Hollywood has a way of looking down on other races. Arabs are always the terrorists, Mexicans are always drug dealers, Africans are always slaves and servants. And Africa, to them, is one country. I wasn’t surprised when a white woman was asked, “How many countries form the African continent?” on ‘Who Wants to Be a Millionaire’, and she answered one. That’s what they have been told – Africa is a slum: a deserted, hungry, disease-infested continent.

We need to admit that the USA is a great country, but that doesn’t mean other continents, countries and races are inferior.

Hollywood should spare us the stereotypes!