Another favorite measure of audience opinion comes from websites that collect and tabulate the opinions of self-selecting volunteers. Rotten Tomatoes and Metacritic are primarily interested in tabulating critics’ opinions, but they also collect grades from audiences and display them on the site. And IMDB doesn’t collect critics’ grades at all — just audience scores. (In some iterations of IMDB, you may also see the Metacritic score, which IMDB draws from Metacritic’s data set of aggregate critics’ opinions.)
Often, critics' and audiences' scores roughly track one another. But sometimes they diverge sharply, a fact that people behind critically derided films sometimes proclaim as if it said something positive about their work.
The argument rests on a familiar narrative: critics, with their highfalutin ways and snobby tendencies, are disconnected from real, authentic folks, and therefore shouldn't be trusted. Critically panned movies like Death of a Nation, Gotti, and Baywatch, the thinking goes, are for the fans, not the critics.
…the sampling is less rigorously random than CinemaScore's, which surveys everyone in the same theater. It seems reasonable to assume that the people motivated to spend time entering an audience score on a website feel very strongly about the film, in either a positive or negative direction.
Additionally, the data would likely skew toward the opinions of the people who use those sites, which may, for instance, favor those with more leisure time, more access to the internet, and more technological savvy than others.
Then compare that subset of people with critics, a group that also skews male and white, but whose members don't choose to review a film based on how they feel about it; they review it because it's their job. That can lead to more shoulder-shrugging reviews, the 3-out-of-5-star variety, but it also makes for a more moderate overall score.