Opinion | Why “Game of the Year” and review scores are inherently flawed
We are into October and Game of the Year is starting to dominate online gaming discourse. Every new game release leads to an “It’s my GotY” or “It’s in my top 5 games of the year” post on social media. In theory, it should be a harmless thought exercise: “I liked these games and here’s why.” In practice, it controls parts of the industry. Why does it mean so much, when most of the people who create these lists or vote in these shows could never play the majority of a year’s releases? I think GotY is inherently flawed, along with the thing that powers it – review scores.

The Game Awards
Geoff Keighley’s Game Awards is the industry’s attempt at a major end-of-year awards season. Dozens of gaming-focused outlets gather to vote on a slate of categories, the most prestigious of which is “Game of the Year”.
A massive panel of websites, podcasts, news organizations, and individuals votes on categories like “Best Ongoing Game” and “Best Indie”. The show gets massive viewership, with reports that ad spots can cost hundreds of thousands of dollars each. It aims to be gaming’s answer to shows like the Oscars. There’s just one major issue that sets it apart in my mind.
Video games take a long time to play. You could watch every Oscar-nominated movie in a couple of weeks if you wanted (and had access). Comparing RPGs like Metaphor: ReFantazio and Dragon Age: The Veilguard could take you upwards of 200 hours. Add in the dozens of other likely contenders in that one category and I think you see where I’m going with this.
No one can play everything
I doubt that anyone in gaming coverage has played all of the games that will be nominated. I would never expect nor want anyone to try; it would be unhealthy. Even within a single category, the majority of a voting panel will not have started, let alone completed, most of the games in it.
Why then is online discourse, and even the industry itself, so enamored with the idea of a game of the year? How can one ever compare Balatro to Astro Bot? Helldivers 2 is nothing like Satisfactory. Even within individual genre breakdowns like “Best Ongoing Game” you have single-player titles with paid DLC (Cyberpunk 2077) up against free-to-play titles like Apex Legends or Fortnite.
Last year’s Best RPG category featured Sea of Stars, Lies of P, Baldur’s Gate 3, Final Fantasy XVI, and Starfield: a strange mishmash of role-playing games that all offer thoroughly disparate experiences.
Why do people care so much?
I’m not advocating for abolishing Game of the Year lists or awards, even if I think the concept is inherently flawed. They should be relatively harmless in the grand scheme of life. I felt compelled to write this piece because of how toxic the discourse around them has become.
A constant battle cry in the pointless list wars is “x publisher hasn’t had a GotY winner!” or “x game is better than y because it won more GotYs online”, to the point of insults, never-ending arguments, and general shittiness.
Discourse like that is not limited to fans. As an Xbox-focused website, we constantly see the press claim that the publisher’s first-party output has been a failure due to a lack of wins on various Game of the Year lists.
The weight given to what does or doesn’t win among a group of, at most, a thousand or so people in gaming coverage does not make sense to me. I also have major issues with how much importance is placed on review scores, and Game of the Year discourse is intrinsically tied to them.

People only read the scores
Review scores are given a worth incommensurate with what they warrant. As a site listed on Metacritic and Opencritic, I’d like to break down how both work, as scores tie heavily into GotY (except when they seemingly don’t). Metacritic takes every website listed on it into consideration for its aggregate score, and each site is assigned an undisclosed weighting.
A site like IGN appears to influence the aggregate far more than any site our size. Those out there stating “x sites are propping up the score of the first-party game” seemingly ignore, or don’t know, that one score from us means next to nothing compared to the one or more scores a site like IGN will have listed. IGN is listed for both Xbox and PC on Starfield, something not afforded to a site like ours, despite both of our reviewers playing dozens of hours on each platform.
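To make the weighting problem concrete, here is a minimal sketch of how a weighted aggregate might behave. Metacritic’s actual weights are undisclosed, so the outlet names and numbers below are entirely hypothetical.

```python
# Hypothetical illustration of a weighted score aggregate.
# Metacritic's real weights are undisclosed; everything here is made up.
scores = {"BigOutlet": 90, "MidOutlet": 80, "SmallOutlet": 60}
weights = {"BigOutlet": 3.0, "MidOutlet": 1.5, "SmallOutlet": 1.0}

weighted = sum(scores[s] * weights[s] for s in scores) / sum(weights.values())
unweighted = sum(scores.values()) / len(scores)

print(f"Weighted average:   {weighted:.1f}")    # 81.8 -- pulled toward BigOutlet
print(f"Unweighted average: {unweighted:.1f}")  # 76.7
```

Even with these invented numbers, the small outlet’s 60 barely dents the average once the bigger outlets are weighted up. That is the dynamic at play when a smaller site’s score “means nothing” next to IGN’s.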
Opencritic lets any site list on it, and unlike Metacritic, you must input your information manually for your reviews to show up. Also unlike MC, the only scores that matter are the Top Critics’ reviews. How one becomes a Top Critic is not public information.
For example, we have nearly 700 reviews over four years on the site and are not a Top Critic, while a site with under 40 reviews in a decade is listed as one. I’ve emailed Opencritic multiple times asking what the criteria are but have received only silence.

Down with review scores!
Why does all of this matter? Because, for the most part, any game in the Game of the Year discussion needs a high Metacritic score to have a chance of being nominated, let alone winning. Sometimes a game like Death Stranding or Ghost of Tsushima can get nominated despite a low-80s MC/OC average, but it’s not the norm. I wouldn’t be shocked if Black Myth: Wukong is nominated this year at an 82, while a title like Senua’s Saga: Hellblade II at an 81 is considered too mixed.
People, almost entirely in bad faith, have been calling for objective reviews. That would be a list of facts offering no insight into what a game was actually like to play. As it is, review scores are determined by a single person after grinding through a long game in a short time, desperately trying to hit an embargo.
There is no right way to score something
The weight that review scores carry over the words that accompany them is one of my main issues with Game of the Year discourse. Excellent titles get overlooked consistently. They didn’t have a big PR firm pushing them, so they got a handful of reviews, of which only a few matter. Their fate, and their chances of being on these lists, is tied to a dozen or so people who are mostly overworked and underpaid.
The biggest problem with the weight review scores are given is that there can never be a scientific method for determining them. Every review score in gaming history has been a subjective call made at the moment the reviewer was forced to put the game down. Whether it comes from a group or an individual, there is no right way to score a game. Is a 7 from you good? Cool! Is an 8 from you also only good? Cool, I guess.
There is no way to say someone is wrong about something that has no inherent rule set. We try to use the full 100 points here. Sites like Windows Central use a 5-star system; others use a strict 10-point setup. Still, sites like OC and MC smash them all into the same 100-point scale, and it does not work.
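As a rough sketch of that smashing-together, assume the aggregators simply rescale every system linearly onto 100 points (their exact conversion rules are not public):

```python
def to_100(score: float, scale_max: float) -> float:
    """Naively map a score from its native scale onto 0-100."""
    return score / scale_max * 100

# A 4/5 stars, an 8/10, and an 80/100 all collapse to the same "80",
# even though the underlying scales have wildly different granularity:
# five possible star values versus a hundred possible points.
print(to_100(4, 5))     # 80.0
print(to_100(8, 10))    # 80.0
print(to_100(80, 100))  # 80.0
```

A reviewer handing out 4 stars may mean anything from a 70 to an 89, but the aggregate treats it as a precise 80. That false precision is what these 100-point averages are built on.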

What can change?
On a grand scale, nothing, really. I hope that putting this out there opens a few eyes to how little Game of the Year should ever matter. It should be a fun engagement point for this hobby. When studios win awards or land on website lists, they should feel great that people enjoyed the product of their hard work. What it shouldn’t be is something that controls salaries or whether sequels are greenlit (I’m looking at you, Sony).
There is no easier nor more toxic form of engagement than list warring. It seeps its way into every part of life, from religion to sports, politics, and entertainment. When the year is up and you’re thinking about what you liked and see that someone else didn’t, maybe try a polite back-and-forth instead of calling them a shithead or putting up a string of clown emojis.
For the industry, Game of the Year is a big money maker. Know that some sites and shows out there rely on your outrage or your love to push their numbers up, and don’t give in.