
How bad is the Tomatometer? Here are the top 10 movies of all time according to Rotten Tomatoes, Metacritic, and IMDb. Which list do you prefer?

https://preview.redd.it/0vyeptbge1nb1.png?width=1366&format=png&auto=webp&s=86df102a7197fb8abef428364f2890b62eff0fb0

A couple of things to note: Rotten Tomatoes and Metacritic are based on critics' reviews, while IMDb is based on users' reviews. Rotten Tomatoes' adjusted score comes from this article. RT explains the adjusted score as:

Each critic from our discrete list gets one vote, weighted equally. A movie must have 40 or more rated reviews to be considered. The Adjusted Score comes from a weighted formula (Bayesian) that we use that accounts for variation in the number of reviews per movie.

Personally, IMDb seems the most accurate to me (not just for the top 10, but in general). What do you think? What's missing, and who has the best list?

Edit: I just want to clarify something. I am aware of how the Tomatometer works, and that it measures the percentage of positive reviews. The point of this post is that this may not be a good metric for evaluating movies, and that an average score is a better one.
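To see why the two metrics can disagree, here is a minimal sketch with made-up review scores on a 0-10 scale. The "fresh" cutoff of 6 and the example numbers are my assumptions, not RT's actual data:

```python
def tomatometer(scores, fresh_threshold=6.0):
    """Percentage of positive ('fresh') reviews, like RT's headline score."""
    return 100 * sum(s >= fresh_threshold for s in scores) / len(scores)

def average(scores):
    """Plain mean score, the metric this post argues for."""
    return sum(scores) / len(scores)

# Hypothetical movies: one uniformly "pretty good", one mostly excellent
# with a few pans.
consistent_ok = [6.5] * 20
mostly_great = [9.0] * 15 + [5.0] * 5

print(tomatometer(consistent_ok), average(consistent_ok))  # 100.0 6.5
print(tomatometer(mostly_great), average(mostly_great))    # 75.0 8.0
```

The consistently mediocre movie gets a perfect 100% Tomatometer despite a 6.5 average, while the mostly excellent one sits at 75% with an 8.0 average, which is the distortion the post is pointing at.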

Edit 2: People seem to think the list from the article boosts newer movies because they have more ratings. That is not how Bayesian weighting works; here is a basic explanation:

The Bayesian average adjusts the average rating of products whose rating counts fall below a threshold. Suppose the threshold amount is calculated to be 100. That means average ratings with fewer than 100 ratings get adjusted, while average ratings with more than 100 ratings change only very slightly. This threshold amount of 100 is called a confidence number, because it gives you confidence that averages with 100 or more ratings are more reliable than averages with fewer than 100 ratings.
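The explanation above can be sketched in a few lines. This is a generic Bayesian average, not RT's exact formula (which they don't publish); the confidence number of 100 and the global prior mean are illustrative assumptions:

```python
def bayesian_average(ratings, prior_mean, m=100):
    """Shrink an item's raw mean toward prior_mean (the global average).

    m is the 'confidence number': with few ratings (n << m) the result
    stays near prior_mean; with many ratings (n >> m) the raw mean
    dominates. Newer movies get no boost; under-rated ones get pulled
    toward the middle.
    """
    n = len(ratings)
    return (m * prior_mean + sum(ratings)) / (m + n)
```

For example, with a global prior of 7.0: a movie with five perfect 10s only reaches about 7.14, while one with a thousand 10s reaches about 9.73, which is exactly the "adjusts averages with few ratings, barely touches the rest" behavior described above.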



Submitted September 08, 2023 at 08:56PM by splityoassintwo https://ift.tt/AxhOBFu

