When we can't be bothered to read reviews by an old grouch like Roger Ebert, or any of those tightasses over at RottenTomatoes, we often turn to movie trailers to get a taste of what a movie is like. Unfortunately, even then it's a hit-and-miss affair because, let's face it, movie trailers are a form of promotion, cut by editors paid by the same studio that made the film. Needless to say, they're not always honest.
|But if it's a Michael Bay movie, then what you see is what you get...|
The thing is, I don't really mind all that much if they're just holding out on us about something pivotal to the plot. After all, you wouldn't want the trailer to spoil the movie-watching experience. The biggest offense here is when they sell a movie as something it's not. In other words, they give a distorted impression of what the movie is really like.
Why? Well, it's simple really: the same reason the industry is called "Show Business". Here are just three recent examples off the top of my head: