How Many Bits Will Make You Happy?
Before you battle the crypto-miners for that fancy new graphics card, think about how much information it takes to entertain you.
When Minecraft came out ten years ago, I was immediately attracted to it because of the procedural generation and open-ended nature of the game, and I also liked the approach of using a really simple block-based 3D world with 16x16 textures. It all had a very retro feel that appealed to my (pretty well-developed) nostalgic side, and I was an early Alpha player of the game.
Even though there were games out at the time with far-superior graphics, I found Minecraft to be immersive and engaging. One day while mining deep, I unexpectedly fell through a hole in the floor, into an unexplored cavern. I was hurt, my torch was lost, and as I took a new one out, the spark of light shone on a looming green figure in the dark — a Creeper, ready to explode, right next to me.
And I screamed, in pure terror.
This chunky little sprite that looked like something from one of my 8-bit Commodore 64 games had provoked a strong emotional reaction, in a way that many graphically superior games I had played lately had not.
Were there even scarier hi-res games out there I had not played? Probably. I’m not really well-versed in modern 3D gaming and tend mostly toward the retro and casual side of things. But the mere fact that something of such low resolution could create that kind of experience for me got me thinking.
I wondered, what was the minimum resolution needed to scare me? Clearly a low-poly 3D object with 16x16 textures could do it. What about a 2D sprite that was 16x16? Maybe, I decided. But it would have to be a really good set-up, such that the sprite represented something alarming.
I was thinking about an old VAX computer game I used to play in the 1980s called EMPIRE, a wargame where enemy units were represented as single characters on a terminal: “A” for army, “F” for fighter, and so on. I remember being upset when a bunch of “A” characters suddenly emerged from the fog of war near my undefended cities.
One Pixel Can Bring Joy
That invading army was maybe an 8x16-pixel monochrome character, and it had upset me. What about lower resolutions? And what about other emotions? Could a 4x4 pixel sprite make me laugh? Could a 2x2 sprite make me sad? I started to doubt that, even with the proper backing context, something carrying so little information could provoke any true emotional reaction.
But context is everything. When I thought more about it, I remembered that a 1-pixel display had once made me joyous. In the mid-1990s, when my wife and I were trying to have our first child, pregnancy tests of the day would just show a solid line for a positive result: maybe wider than a traditional pixel, but really just a single bit of information. And seeing that one pixel resulted in one of the most powerful emotional experiences I’ll probably ever have.
This is the value proposition that many “simple” things have to offer, from pregnancy tests to retro games to pen-and-paper RPGs to minimalist art: the idea that it is not the quantity of information provided that matters to people, but the quality of it. With these things, you get only the essential information from what you are consuming, and the context that you bring into the experience does the rest.
Big Data is Still a Big Deal
But don’t throw away your RTX 3090 card just yet. I’m not somehow suggesting that “big data” experiences like hi-res 3D gaming, virtual reality, augmented reality, the Metaverse, and so on can be replaced with some 8-bit equivalent. The immersive and realistic experiences offered by state-of-the-art hardware, if used to good effect, can create something highly entertaining, with value that justifies the costs (for some, anyway).
The most compelling example of the value of “big data,” of course, is reality itself. If you think about why you would attend a live performance by a musician versus, say, listening to a studio recording, it is because the “resolution” offered to your senses in a real-life experience is incredible.
The studio recording may have higher quality: a better mix maybe, and more complex and sophisticated arrangements and backing tracks. But it still does not completely replace the full-res experience of seeing the band live, where the music is created in front of you and reaches your ear directly, and you can even feel the bass. The sights, sounds, and even smells of the experience cannot (yet, anyway) be duplicated at that level through any virtual means.
The problem, though, is that high-res graphics, or even the full-sense experience of a real-life event, are all second priority. First priority is that “if used to good effect” part. I’m sure you can think of cases where you watched a movie or played a game that was visually stunning, but something else, like the writing, directing, or overall quality, came up short and just ruined the experience.
Reality, although very compelling, is also not immune to this prioritization. Maybe you went to a live performance that was well mixed, but the band sucked. Or had front-row seats to a bad theater production.
In these cases, it doesn’t really matter that the resolution of your monitor was incredible, or that you had great seats. If the context or quality is missing, then resolution, even super-high-reality resolution, doesn’t save the experience.
So it seems that context is king, and although you can have a good experience with less data if the context is compelling, it is almost impossible to enjoy a ‘big data’ experience if the context is bad or missing. Nintendo is a great example of a company that figured this out long ago. Their hardware platforms are frequently underpowered compared to competing game consoles of the same age, and the resulting quality of graphics on Nintendo platforms is often derided by the ‘hardcore’ gaming community.
But Nintendo has consistently prioritized creating compelling context and producing high-quality software over things like graphic resolution, and has remained highly popular through the years, despite the naysayers.
Beginners in a Big Data World
If you are a beginner to computer hardware or software, or perhaps a solo developer of some sort, you may find yourself getting discouraged when thinking about ways to be creative in our big-data world. I wrote an article back in April called The Limitations Paradox that talked about Choice Paralysis, and how, strangely, having less to work with is sometimes better for our creativity.
I think about this often. Not just about how simpler things can inspire creativity, but also about how the complexity and “high bar” of creating new things with today’s computer technology makes it hard for beginners and solo creators to get started on something. Even “low-res” Minecraft is a 120MB download these days, which hardly qualifies it as a “small data” project you could manage yourself.
Being the proverbial old guy in tech, I remember a time when the bar for creating something interesting was pretty low. In the 1970s, assuming you even had access to a computer, you could write a text-based tic-tac-toe program or guess-the-number game, and everyone would be quite amazed.
That sort of thing would only require a few hours of work, and would make a great beginner project in programming. But when you were done, it would have perceived value, and the fact that people enjoyed what you did might encourage you to take on something more advanced.
The landscape today for the starting programmer is a vastly different one. Although there are incredibly powerful development environments that can accelerate and automate many aspects of programming, the expectations of consumers are so high that even good efforts can be considered ho-hum.
I am encouraged, though, by the success of many one-person or few-person studios that manage to put out hit game titles. Games like Stardew Valley, Undertale, Flappy Bird, and (early in its life) Minecraft all managed to become wildly popular, in spite of having fairly small codebases (and accordingly, limited context) and very modest graphical resolution.
The context these games did have was novel enough, though, and the quality was high, and they became hits. There was also a generous amount of luck involved. But it at least demonstrates the idea that a ‘small data’ experience can still engage people.
Why Blinking Lights Make Me Happy
Let’s face it, though: the chances of you or me coming up with the next Flappy Bird are low. Nor are we likely to have the resources needed to create a ‘big data’ experience that is notable. Although we now have access to huge libraries of 3D models, engines, and accompanying software, it is very hard to stand apart from the big-name companies that have armies of people working on a game, and if we have a novel context of some sort, we compete against thousands of other indie developers, all hoping their novel context idea will catch on.
It can seem pretty hopeless for someone who wants to create, and I have seen a lot of new and ambitious game developers give up after failing to finish their project, or after doing so but failing to get any attention.
What can keep one motivated then? I have your answer, and it comes in the form of a single, blinking LED.
Above is a picture of the first computer I ever owned, an ELF II system featuring 256 bytes of static RAM, a hex keypad for program input, and two 7-segment LEDs along with a discrete “Q” register LED for output. There was also a primitive monochrome video output, but the very first program I ever wrote for this system after soldering it together from a kit was one to turn on that discrete “Q” LED whenever the “I” button was pressed.
The whole program was maybe 6 bytes. But they had to be entered as hex digit pairs, and then the “run” toggle switch flipped to execute it. When I had done this and flipped the switch for the very first time, and the LED came on when I pressed the button just as the program said it should, I was ecstatic. Another single bit, bringing me joy.
This was a primitive machine, even by 1978 standards, and its $150 bargain-basement price was all I (or more accurately, my family) could afford. So it might seem surprising that my extremely simplistic feat of programming was all that exciting, even then.
Context, as always, is everything, and in this case it includes the fact that at that point in time, no one I knew actually had a computer in their house. The joy of learning about this new technology, and of having it and its seemingly endless possibilities exclusively at my fingertips, made the simple experience of turning on an LED extraordinary.
That was a long time ago. For a while, I had assumed that this particular joy was something that had become as obsolete as the ELF II. But then, at some point in the not-too-distant past, I started playing around with programmable microcontrollers like the Arduino, and building projects with them. The first project? Connecting an LED and programming it to turn on. And when I got it to work, it was strangely as satisfying as it had been 30 years earlier.
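Just for illustration, here is a minimal Arduino-style sketch (C++) for that kind of first project, lighting the LED while a button is held down, much like that old “Q” LED program. The button pin is an assumption for the sake of the example, not taken from any particular project of mine:

    const int BUTTON_PIN = 2;             // pushbutton wired between pin 2 and GND (assumed)

    void setup() {
      pinMode(LED_BUILTIN, OUTPUT);       // the board's built-in LED as output
      pinMode(BUTTON_PIN, INPUT_PULLUP);  // button reads LOW when pressed
    }

    void loop() {
      bool pressed = (digitalRead(BUTTON_PIN) == LOW);
      digitalWrite(LED_BUILTIN, pressed ? HIGH : LOW);  // one bit of output
    }

One bit of input, one bit of output, and the same little thrill when the light comes on.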
The Producer / Consumer Disparity
Why? The context makes it so. The validation that a single bit turning on an LED provides when you are trying to learn something brand new is priceless, and it is not just a phenomenon of the 1970s. Software developers know it by a different name:
“Hello, World.”
I have written “Hello, World” countless times, in different computer languages and in different contexts. Each time I am in the midst of trying to learn something brand new, and each time, it is very rewarding to get that simple program to work.
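For anyone who has somehow never typed one: it is the traditional first program in a new language, whose only job is to print that greeting. A minimal version in C++, chosen here purely as an example since it looks much the same in most languages, goes something like this:

    #include <iostream>

    int main() {
        std::cout << "Hello, World" << std::endl;  // the whole point of the program
        return 0;
    }

A handful of lines and one short message, and somehow it never gets old.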
I sometimes wondered whether this joy of simple achievements was just a quirk of mine, owing to my history with and love of retro things. But I have spent time teaching new programmers to write programs, and I see the same delight from them when they get their first program to run. I remember being worried that a programming student of mine would get discouraged when I first showed her how to create a simple sprite and move it around the screen using the keyboard.
I figured that, being a gamer, she would find the primitive graphics and gameplay disappointing compared to what she was used to playing. But quite the contrary: she was thrilled, and went on to improve her first game attempt almost immediately. This particular student was a freshman in college, in a pre-med program. She had an interest in computers, but had never attempted programming before creating this simple game.
Not too long after this impromptu lesson though, she switched her major to computer science, and now works as a software engineer for Amazon. (Not taking credit or blame for that, especially, but I do think the power of achieving simple results when trying to learn is evident here.)
So if you tire of this article and want the quick takeaway, that is it. The way you can keep motivated and growing as a programmer (or hardware hacker) is to avoid temptations to compare yourself to some behemoth product that cost millions to create, or to some lucky person who happened to catch lightning in a bottle. Stick to the joy of doing it for yourself, the deep satisfaction that comes with learning and creating.
I know this can be difficult at times, because when you attempt to show others your hard-won work, they may not be impressed. My family and friends were not especially enthralled with my blinking LED back in 1978, and yours may similarly not find your starting programs or solo creative efforts compelling. The reason is context.
You are the producer, and you come into the experience of getting some version of “Hello, World” to work with all the struggles and memories of what it took to get you there. The people you show it to are the consumers, and they come into the same experience with expectations shaped by their memories of other things they have consumed, many of which are quite sophisticated. Producer and consumer contexts are never fully aligned, and although if you keep at it you will likely someday produce something that excites the consumer as much as it excites you, it turns out not to happen very often.
But do keep at it, even if what you are doing is low-res, small-data, involving just blinking lights, or hopelessly retro or limited in some other way. Try to put your own satisfaction first, but if you really do crave feedback from others who can appreciate what you are up to, seek other producers, who can experience what you have done with a similar context. It turns out it doesn’t take a lot of bits to make a person happy, especially if the person is a producer. Start producing a few bits of your very own, and see for yourself!
Explore Further
Next Time: Subscribers, stay tuned for a weekend throwback to Mad Ned Memos past, as we roll out another Memo Rewind! We’ll look back at some of the great stories and discussions we’ve had over the summer. After that, it’s time to screw around. At work, even! Some confessions from yours truly here in the aptly-named article: The Value of Screwing Around at Work