Rotten Tomatoes and Metacritic have become our first stop in determining how good a movie is.

Until recently, I had no idea how each site arrived at its review scores.

Once I found out, I realized I'd been reading them all wrong.

You've probably seen the rating next to a movie title.

Experienced users might even know that each site actually has two scores: one for critics and one for regular viewers.

What you may not realize is that each site calculates those numbers very differently.

On Rotten Tomatoes, the score you see is the percentage of the total reviews that are considered Fresh.

Take Batman v Superman: 90 of its 327 reviews were Fresh. 90 is 28% of 327, so that becomes the movie's score.
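The arithmetic here is a plain proportion. A minimal sketch (the function name is mine, not Rotten Tomatoes'):

```python
def tomatometer(fresh_reviews: int, total_reviews: int) -> int:
    """Percentage of reviews counted as Fresh, rounded to a whole number."""
    return round(100 * fresh_reviews / total_reviews)

# Batman v Superman's critic reviews: 90 Fresh out of 327 total.
print(tomatometer(90, 327))  # → 28
```

Note what this throws away: a review that barely liked the movie and a glowing rave both count as a single Fresh vote.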

Metacritic, on the other hand, uses a bit more nuance in their system.

The company collects reviews from around the web and assigns them a score ranging from 0 to 100.

The site then takes a weighted average of all the reviews.
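Metacritic doesn't publish which publications get how much weight, but the mechanics of a weighted average are simple enough to sketch. The weights below are entirely made up for illustration:

```python
def metascore(reviews):
    """Weighted average of (score, weight) pairs, rounded to a whole number.

    The weights are hypothetical; Metacritic does not disclose how much
    influence it gives each publication.
    """
    total_weight = sum(weight for _, weight in reviews)
    weighted_sum = sum(score * weight for score, weight in reviews)
    return round(weighted_sum / total_weight)

# (score out of 100, hypothetical weight)
reviews = [(90, 1.5), (60, 1.0), (40, 0.5)]
print(metascore(reviews))  # → 72
```

With equal weights this would be a plain average (63); the heavier weight on the 90 pulls the result up to 72. That's the nuance: a trusted critic's opinion counts for more.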

It's worth pointing out that Rotten Tomatoes and Metacritic (as well as IMDb) also have separate user scores.

These work more or less consistently across all three sites.

This binary approach dramatically sways review scores in polarizing directions.

This scale averages reviewer scores after they've been assigned a value on a ten-point scale.

If we look at that Batman v Superman example again, we see that its average rating is actually 4.9.

That's even higher than Metacritic rated the movie.

This effect isn't just negative, though.

Appropriately, this effect makes the Tomatometer a bit like Captain America's super soldier serum: good becomes great.

The same effect applies to Rotten Tomatoes user scores, though it's a bit less pronounced.

Any score of 3.5 stars (or 7 out of 10) is considered positive, or Fresh.

Less than that is considered negative, or Rotten.

The user score represents the percentage of positive ratings.
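The user-score rule above can be sketched the same way: any rating of 3.5 stars or higher counts as positive, and the score is the share of positive ratings. The ratings list is hypothetical:

```python
def user_score(star_ratings):
    """Percentage of ratings of 3.5 stars (out of 5) or higher."""
    positive = sum(1 for stars in star_ratings if stars >= 3.5)
    return round(100 * positive / len(star_ratings))

ratings = [5.0, 4.0, 3.5, 3.0, 2.0, 1.0]  # hypothetical user ratings
print(user_score(ratings))  # → 50
```

Notice the same flattening as the Tomatometer: a 3.5-star rating and a 5-star rating contribute identically to the final number.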

While Metacritic embraces nuance, it's also sometimes criticized for getting it wrong.

As we established earlier, Metacritic assigns a numeric value to reviews before averaging them.

However, picking those numbers can be a subjective ordeal.

Paradoxically, Metacritic gives reviewers both more and less control over their scores.

A reviewers rankings and opinions are represented more faithfully with a numerical score than a boolean good/bad value.

This can be a huge problem if an industry starts relying on review scores.

Thats a bit like trying to turn love into a fossil fuel.

The conversion doesn't make sense on its face.

However, review scores are still useful.

Reviewers help us determine which films are worth spending our time on.

Handy review scores make it even easier, turning the decision into a simple, two-digit number.

In my experience (also an opinion!), Rotten Tomatoes answers the question of whether a movie is worth your time pretty well.

Just keep in mind that it tends to drag films to the extremes.

The flip side is that the site may also inadvertently inject opinions of its own.

Just keep in mind, it's exactly that: the opinion of the average movie-going audience.

If your tastes differ from the mainstream, you might not agree with user ratings.

Most importantly, remember that your opinions are still your own.

Moviegoers like to follow review scores like they're a competitive sport.