Last weekend I wrote an email to Kyle Orland of Ars Technica in response to a question about an independent game's certification troubles. That letter was quoted in this article, which was then summarized by gamesindustry.biz, and is probably floating elsewhere around the internet too.
The original letter went through a few specific examples of certification issues and how they could be handled better, to help substantiate why I think these processes are so broken. These are not the kinds of details you would expect to make it into a short article for a general audience, but having typed them up, I think some people out there might be interested (especially developers who have not done console games before but are thinking about it). So, below is my original letter (edited a little bit for clarity):
-----
Hi Kyle,
I have gone on record a number of times, saying that the XBLA certification process makes games worse rather than better. But that is a pretty old experience for me now (4 years ago), and they are unlikely to change any of that stuff until the new console comes out anyway. Keep in mind that the Xbox people as a whole do not think of XBLA as a high priority (you can get a glimpse of this from the E3 press conferences, etc).
But, I think the more important issue applies globally, to all of Microsoft, Sony and Nintendo (and not just to smaller downloadable games, but to every game regardless of size). The certification processes of all these platform holders were based on the idea that all these steps they test are absolutely necessary for software to run robustly, and that software robustness is super-important for the health of their platform and its perception by customers.
But, look at iOS. There is almost no certification process for iOS, so by the Microsoft/Sony/Nintendo theory, the apps should be crashing all the time, everyone should think of iOS as sucky, etc. But in fact this is not what is happening. There is no public outcry for more testing and robustness of iOS software.
Part of this may be that iOS software is so easily patched; so maybe a heavy cert process made sense back in the disc-only days, but as we go into the next console generation it becomes unnecessary. But something tells me that Microsoft/Sony/Nintendo are not really going to let up on cert, even though they say they will. It's just "not in their DNA" as they say. The proof of this is the bizarre over-complexification that is already baked into today's console certification requirements: things any of the platform holders could easily fix, but none of them do, because they don't care.
For example: every single game is REQUIRED to say on startup, "sometimes this game saves, when you see this animated icon in the corner, DO NOT TURN OFF YOUR CONSOLE, etc". This is something that developers have to implement and that has to be tested, which costs significant time and money, but worse than all that, it impacts the user experience, because the startup of the game becomes just a little more bureaucratized, and also -- this is supposed to be a fun experience, so why are you issuing warnings and strict instructions? (Just one thing like this may not seem like too much, but combined with everything else, it is a lot; I am using it as just one specific example).
Why is this DO NOT TURN OFF YOUR CONSOLE warning there? It's because if the game is saving a save file, overwriting an old one, and you turn the console off in the middle, you might corrupt the saved game. Well, guess what... the solution to this is to IMPLEMENT A MORE ROBUST SAVE SYSTEM. You save the new file next to the old file, flush the I/O, and only delete the old file once the integrity of the new one has been verified (or just don't remove it, and keep two copies of the save at all times). On load, if one copy is corrupted, you just load the intact one. This is not hard to implement; I did it in Braid. But if consoles cared about this kind of thing, it would be built into their basic save-file API, so that it would always work perfectly and no developers would ever have to think about it.
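To make that concrete, here is a minimal sketch of the "keep two copies" variant. This is illustrative C++, not any console's actual API; the file names, header layout, and checksum are all made up, and a real implementation would go through the platform's storage calls rather than stdio:

    #include <cstdio>
    #include <cstdint>

    // Hypothetical on-disk layout: a small header with a generation counter
    // and a checksum, followed by the save payload.
    struct SaveHeader { uint32_t generation; uint32_t checksum; uint32_t size; };

    // Any integrity check works; FNV-1a is just a compact example.
    static uint32_t fnv1a(const uint8_t *p, size_t n) {
        uint32_t h = 2166136261u;
        while (n--) { h ^= *p++; h *= 16777619u; }
        return h;
    }

    // Write the new save into whichever of the two slots is older, flush, done.
    // The previous save is never touched, so losing power mid-write can only
    // damage the file being written, never the last known-good one.
    bool save_game(const uint8_t *data, uint32_t size, uint32_t generation) {
        const char *slot = (generation & 1) ? "save_a.dat" : "save_b.dat";
        SaveHeader h = { generation, fnv1a(data, size), size };
        FILE *f = fopen(slot, "wb");
        if (!f) return false;
        bool ok = fwrite(&h, sizeof h, 1, f) == 1
               && fwrite(data, 1, size, f) == size;
        ok = (fflush(f) == 0) && ok;   // push the data out of the C library's buffers
        fclose(f);
        return ok;
    }

    // On load: read both slots, throw away any whose checksum doesn't match,
    // and use whichever survivor has the higher generation number.

There is nothing exotic here, which is the point: one person at the platform holder could bury this behind the save-file API once, instead of every team reimplementing it and every certification pass re-testing it.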
If they did this, there would be fewer things to certify, certification would cost a little less and take a bit less time. Let's say you save 3 days of development and testing per game (this is conservative; the real amount can be substantially higher when you factor in the discussions and coordination about how the save notice should look, etc). Now add up how many games have been released just on the Xbox 360, multiply that number by 3 days, and what you get is probably OVER A DECADE OF DEVELOPER TIME that was wasted. Just on this one little requirement. For something that should just be built into the system by one person in a couple of weeks. Okay, that was only the Xbox, so now add in all the PS3 and Wii games, and see what a huge waste of time that represents.
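(To put a rough number on that: assume, purely for illustration, that something like 1,500 games have shipped on the Xbox 360 so far; the exact count doesn't matter. 1,500 games x 3 days each = 4,500 developer-days, which is over 12 years. Even if you cut that estimate in half, it is still an absurd amount of time spent re-solving a problem the platform could have solved once.)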
Like I said, just this one thing by itself would not be too bad, but it is only an example. Another example that all three certification processes have is the "title-safe region" restriction. This basically means that you are not allowed to draw text or gameplay-critical information near the edge of the screen (a margin of about 10% of the screen dimension along each edge), because you don't know if the user's TV is displaying the whole picture (even modern HDTVs often cover the edge of the display with a bezel or else just crop it entirely, which is completely stupid).
Okay, at first blush this seems to make sense: to make sure players can see everything, we have to go through each screen and each mode in every game we ever release and certify that important things don't get drawn too far toward the edge. Well, as you can extrapolate from my previous example, this also has wasted DECADES OF DEVELOPER TIME (probably many decades, because dealing with title-safe probably takes a lot more time than save/load). And it is also something that these consoles could handle at the system level if they actually cared about the problem. Just put a screen calibration menu in the dashboard, and then initialize it properly so that players would almost never need to use it (when they are using HDMI output, initialize the settings using the display's EDID; if they are outputting some other way, just default to shrinking the display a little bit).
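Here is a sketch of how little the system-level fix would involve. Everything below is hypothetical, not any console's real interface: the dashboard calibration screen (or the EDID-derived default) produces one overscan-compensation factor, and the video scaler uses it to fit every game's full framebuffer inside the visible part of the screen, so games just render edge to edge and never think about title-safe again.

    struct Rect { int x, y, w, h; };

    // overscan_scale comes from the calibration screen or the EDID default:
    // 1.0 means the display shows the full picture, 0.95 means shrink the
    // output by 5% on each axis before scan-out.
    Rect compute_output_rect(int screen_w, int screen_h, float overscan_scale) {
        int out_w = (int)(screen_w * overscan_scale);
        int out_h = (int)(screen_h * overscan_scale);
        // Center the scaled image; everything the game drew stays visible.
        return Rect { (screen_w - out_w) / 2, (screen_h - out_h) / 2, out_w, out_h };
    }

    // e.g. compute_output_rect(1280, 720, 0.95f) gives { 32, 18, 1216, 684 }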
10% at the edges of the screen may not sound like much, but it is actually huge. Suppose a game is rendering at 1280x720... you are only really able to use the interior 1024x576. 1280x720 is 921600 pixels; 1024x576 is 589824 pixels; in other words, you only get to use 64% of the screen area for real! Everything outside that is only allowed to be decorative. You know how the 360's new Metro interface seems to use an absurdly small amount of real estate in the middle of the screen? That is due in large part to the title-safe region. You know how games always seem to have their HUD elements placed too far inward, when they could be nicer further off to the sides? Same.
This would also have the benefit of saving work for most people who implement 2D games. In Braid we had to program a screen calibration menu. I notice there's one at the beginning of Dyad, and one at the beginning of PixelJunk Shooter. For God's sake why??
This makes all games worse all the time, and it forces developers to do a lot more work, but it could ultimately be fixed very easily by any of the platform holders. They just don't care enough to do it. I think they will have a hard time getting rid of these rules because bureaucracy is in their DNA.
These are just two examples of many... I am going to stop ranting here, but this is just the tip of the iceberg. Most certification requirements are like this.
The edge that both Apple and Valve have going into the future is that they both genuinely care about the end-user experience and want to make it as good as possible. (Their end-user experience is already way better than any of the consoles', and they are always working to improve it.) Which, coincidentally, seems to be exactly the area where the consoles are handicapped by their corporate culture. Can anyone look at the current 360 or PS3 dashboards and legitimately say that those are products of an entity that deeply cares about user experience?
A question a lot of developers have is: When the 720 / PS4 get launched, how many people are really going to care and buy games for those systems? Obviously some people will, but if fewer people buy games than did this generation -- which a lot of developers think is very possible -- then look for a "peak oil" kind of crash where a lot of too-big publishers and developers fight over a shrinking market. If the next-gen consoles actually work much like the current ones do, they will be functionally archaic in the marketplace (keep in mind that they have to compete with the iPad 4, 5, 6, 7, 8, and 9. Have you got any idea what the iPad 5 or 6 are going to look like, how powerful they are going to be, what other user experience benefits they are going to have? I sure don't.)
-----
A few days after having written this letter, I just have one more thought to add: I know that Microsoft thinks they are fixing these issues with their next console, and creating an environment more suitable to free-to-play games and downloadable things in general; I assume Sony is thinking similarly. I will be very surprised if either of them succeeds in fixing these problems. Large institutions always undercorrect; they always think they are being radical and risky when in fact they are doing something tame that is just a modified version of the status quo. When large institutions need to change course by 95% in order to do well, even if they know it's an emergency and are totally panicked, they can probably only manage about 35%. For recent illustrations of this, look at Nokia and RIM (or, uhh, look at the large governments of Earth in dealing with global warming).
So I would bet that Microsoft and Sony believe they are streamlining their online stuff, because they are implementing an expedited, simplified cert process for patches and content updates, or something, and that because of this they will be in a great position to compete with Facebook and Apple and Steam and whoever. Whatever they do is very likely not to be enough. Their competitors are not stopping either. (Steam, which was already pretty painless in terms of updating games, recently revamped their system; the new thing is way better than the old thing, which was already way better than what the consoles do.)