Bugs are a fact of programming life, although their consequences have changed a lot. When a console crashes, you reset it. When an arcade game crashes, it resets itself. When your PC crashes, you might have lost an important document. If you're going to have a bug, best to experience it in a game and not Microsoft Word.
Everyone knows video games have bugs these days. Not as many people know they've always had bugs, going right back to the beginning. Pac-Man, Galaga, Defender: they had more bugs than you'd suspect, and sometimes those bugs lay right under your nose, unnoticed for decades, until someone found out how to trigger them.
That's what this book is, a spotter's guide to classic game bugs: where they are, how to find them, and how to push them out into the open, so we can pin them to cardboard and put them on display. What use is a bug to anyone if it's crawling in the walls? Let's bring out the magnifying glass and have a good look. The wing pattern on this Pac-Man kill screen is really quite exquisite.
"*STORYBUNDLE EXCLUSIVE* - this is the first of a series of books written exclusively for the Game Storybundles, and John has delved deep into classic games - from Pac-Man to Galaga and beyond - to point out some amazing glitches, how you trigger them, and the underlying logic that makes them possible." – Simon Carless
"Video games are pretty cool when they're working, but WAY cooler when they ALMOST work. If you've ever felt the pang of existential terror when you clipped through a wall, John Harris will make you feel better. And if you haven't, he'll tell you how to get started!" – Zachary Spector
There is an idea in programming circles. It is that all programs have bugs.
Speaking strictly, it is trivially untrue. One can write an assembly language program that puts the characters "HELLO WORLD" on the screen, returns, and does nothing else, and be assured there are no bugs. But ah, practically speaking, there is value in the idea.
For example: in computer science circles, one of the base problems of the field is how to sort a list of items in order from least (by some criterion) to most. This is a fundamental-enough problem that most people who take intro courses end up faced with it at some point, and even some people outside the field have heard about it. (Such as, potentially, yourself, now.) The "bubble sort" is known as a particularly easy-to-implement, yet inefficient, solution to the problem.
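For the curious, here is a minimal bubble sort sketch. (Python, rather than the assembly these old games were written in, because it reads better on the page.)

```python
def bubble_sort(items):
    """Sort a list in place by repeatedly swapping adjacent
    out-of-order pairs. Simple, but O(n^2) comparisons."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # Each pass "bubbles" the largest remaining item to the end.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is already sorted
            break
    return items
```

Its appeal is that every line is obvious; its problem is that, for a list of a million items, it can need on the order of a trillion comparisons.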
One of the most lauded answers is a thing called mergesort, which involves splitting the list into half-sized smaller lists, sorting those, and merging the results. For decades a textbook implementation of it (and of its cousin, binary search) was regarded as a model solution. And then, in 2006, Joshua Bloch showed that one of its core assumptions failed for very, very large lists: computing the midpoint of a range as the sum of two indices overflows a 32-bit integer once the list passes about a billion items. It mostly doesn't matter, but it does matter for companies like Google who deal with datasets on the size of the number of sites on the internet.
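The broken assumption is easy to see in miniature. The sketch below simulates 32-bit integer arithmetic in Python (whose own integers never overflow, so we have to fake the wraparound) and contrasts the classic midpoint formula with the standard fix:

```python
def as_int32(x):
    """Interpret x modulo 2**32 as a signed 32-bit integer,
    the way a 32-bit machine (or a Java int) would."""
    x &= 0xFFFFFFFF
    return x - 2**32 if x >= 2**31 else x

# Two perfectly valid indices into a hypothetical 2-billion-item array:
low, high = 1_500_000_000, 2_000_000_000

# The classic formula: (low + high) / 2. The sum exceeds 2**31 - 1,
# so in 32-bit arithmetic it wraps around to a negative number.
naive_mid = as_int32(low + high) // 2

# The fix: low + (high - low) / 2. The difference always fits.
safe_mid = low + (high - low) // 2
```

Here `naive_mid` comes out negative, which as an array index is a crash waiting to happen, while `safe_mid` is the correct 1,750,000,000.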
You could claim that this wasn't really a bug, that it was a consequence of changing expectations and assumptions that were once universal becoming less so. And I would counter, well, so was the Y2K problem, so, in fact, are all bugs, on some level, and the only difference is the scale of the assumption that is being violated.
This is a book of bugs. They are bugs in a field that many people are relatively familiar with: classic video games. Video games occupy a unique position. They are a kind of computer program that many people who are not technically-minded, who can't tell an invalid pointer from a Doberman Pinscher, still have incentive to understand on some level, and so they are a useful tool for communicating to folk the nature of bugs, their causes and their results.
But be assured: all of the kinds of errors that turn up here also turn up in "more important" software: operating systems and office applications, web browsers and cellphone apps alike. And in recognizing this, maybe you will gain some sympathy for the plight of the programmer, that derided creature who can only make mistakes, with scant attention paid to the fact that those mistakes come in a framework the programmer often invented, and that the act of creation is of a higher order than a mere flaw in its making.
Unless that programmer doesn't sanitize his or her inputs. In that case, heaven has no mercy.