Any Decent Meta-Meta-Music Critics?
Please note: while this post will work on mobile, it should look much better in a browser window on a computer.
I find that the review aggregators Metacritic Music and AnyDecentMusic? (they’re like Rotten Tomatoes, but for music) are really useful for finding well-vetted, well-crafted music albums across various genres. From time to time, I open up both sites in my browser and scroll through recent, well-reviewed albums looking for new music worth listening to. Knowing that these two sites use slightly different algorithms for selecting, interpreting, and weighting their reviews, I started wondering how well their final scores compared for a given album. I also wondered whether I could achieve an even better “meta-meta-” score by averaging their scores together into a new, single number. Thus, I scraped their respective album review lists (current data captured on 12/17/2015) and analyzed them to try to figure this out.
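The scraping step boils down to pulling (album, score) pairs out of each site’s review-list HTML. Here is a minimal sketch of that parsing step using only the standard library; the markup below is invented for illustration (the real AnyDecentMusic? and Metacritic pages have their own, different structure).

```python
# Sketch: extract (title, score) pairs from a review-list page.
# The HTML structure here is hypothetical, not the sites' actual markup.
from html.parser import HTMLParser

SAMPLE = """
<div class="album"><span class="title">Album A</span><span class="score">8.2</span></div>
<div class="album"><span class="title">Album B</span><span class="score">7.5</span></div>
"""

class ScoreParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.records = []   # collected [title, score] pairs
        self._field = None  # which span we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if tag == "span" and cls in ("title", "score"):
            self._field = cls

    def handle_data(self, data):
        if self._field == "title":
            self.records.append([data, None])
        elif self._field == "score":
            self.records[-1][1] = float(data)
        self._field = None

parser = ScoreParser()
parser.feed(SAMPLE)
print(parser.records)  # [['Album A', 8.2], ['Album B', 7.5]]
```

In practice you would fetch each page, feed its HTML through a parser like this (adapted to the site’s real class names), and accumulate the pairs into one table keyed on album title.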
Album ratings on AnyDecentMusic.com (x-axis: “adm”) and Metacritic.com (y-axis: “metacritic”). Only albums from 2010 on are included. Correlation between the two was ~0.86 (Spearman) / ~0.87 (Pearson). Thus, they did tend to score a given album similarly, though not always exactly the same.
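For the curious, those two correlation figures can be computed in a few lines. Below is a minimal pure-Python sketch; the eight sample scores are invented stand-ins for the full scraped lists, and the rank helper ignores ties for simplicity.

```python
# Sketch: Pearson and Spearman correlation on paired album scores.
# Sample data is made up; the real analysis used the full scraped lists.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def spearman(xs, ys):
    """Spearman rank correlation: Pearson on rank-transformed data (no ties)."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))

adm        = [8.2, 7.5, 6.9, 7.8, 6.1, 8.0, 5.5, 7.2]
metacritic = [85,  78,  70,  80,  62,  79,  58,  74]

rho = spearman(adm, metacritic)
r = pearson(adm, metacritic)
print(f"Spearman: {rho:.2f}  Pearson: {r:.2f}")
```

Spearman compares the rank ordering of the two score lists, so it is less sensitive than Pearson to the sites using different numeric scales (0–10 vs. 0–100).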
Here’s the same plot, remade on Plot.ly for better default interactivity (you can zoom in and out, and deselect years you’re not interested in to cut through the clutter).
Some useful numbers:
AnyDecentMusic // Average Score: 6.79, Standard Deviation: 0.76
Metacritic // Average Score: 72.3, Standard Deviation: 7.6
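Because the two sites score on different scales (0–10 vs. 0–100), a raw average of their scores wouldn’t mean much. One way to build the “meta-meta-” score is to standardize each rating to a z-score using the site-wide means and standard deviations quoted above, then average the two. A sketch (the example album scores are invented):

```python
# Sketch: combine ADM and Metacritic ratings into one "meta-meta" score
# by averaging z-scores. Site-wide stats are the ones quoted in the post.
ADM_MEAN, ADM_SD = 6.79, 0.76
MC_MEAN, MC_SD = 72.3, 7.6

def meta_meta(adm_score, mc_score):
    """Average of the two z-scores: 0 is a 'typical' album, 1 is ~one SD above."""
    z_adm = (adm_score - ADM_MEAN) / ADM_SD
    z_mc = (mc_score - MC_MEAN) / MC_SD
    return (z_adm + z_mc) / 2

print(round(meta_meta(8.0, 85), 2))  # a hypothetical album scoring 8.0 / 85
```

An album scoring exactly the site averages on both sites would get a meta-meta score of 0; scores above roughly 1.5 would sit well over a standard deviation above typical on both sites.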
Snippets of d3js code were taken from these sources: