Electronic Computing is Janky AF

Posted on Dec 27, 2023

Now there’s this funny type of project, right, where somebody does computation with something that’s not “for” computing. Using marbles in some kind of pachinko rig to add binary numbers. Pinging slow servers to temporarily store data.1 I guess redstone in Minecraft kind of is for making circuits, but people have built like … Turing machines and that’s funny. It’s all a bit janky and there’s an absurdist humour to it: these are high-effort shitposts. And here’s a thing that I’ve recently realised more viscerally than before: using electricity for computation … is kind of like that. It’s janky AF.
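To see why falling marbles are even in the running as an adding machine: binary addition reduces to a couple of bit-level logic operations, which is exactly what a marble run implements mechanically with flippers. Here's a minimal Python sketch of that idea, a ripple-carry adder built from nothing but single-bit AND/OR/XOR (the names and bit width are just mine for illustration):

```python
def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_binary(x, y, width=8):
    """Add two non-negative integers one bit at a time, least significant bit first."""
    carry = 0
    result = 0
    for i in range(width):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result

assert add_binary(0b1011, 0b0110) == 11 + 6  # 17
```

The point isn't the code, it's that the whole operation bottoms out in things so simple you can build them out of wood and gravity.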

I mean, clearly I never thought electricity was, teleologically speaking, for making computers. What does that even mean? But it kind of feels normal, right? Only the other types of computing seem funny, and that's just because we've gotten super good at electricity.

Ben Eater has a great series on YouTube where he builds a simple computer on a breadboard, basically starting from individual transistors, and it taught me a lot. Not anything that I strictly need to know, I guess, but it's fun at the very least and I think it's good to know how stuff works.

There’s a classic article by David Goldberg called “What Every Computer Scientist Should Know About Floating-Point Arithmetic.” That title has become kind of a meme: there’s also “What Every Programmer Should Know About Memory” by Ulrich Drepper, for example, which clocks in at 114 pages and on page 5 says “We will keep the level of detail as low as possible” right before going on to describe how to implement an SRAM cell using six transistors. My dude, this is all super interesting, but I feel misled by the title. I suppose that’s part of the meme. And here’s my homework for you: at whatever level of computing technology you’re comfortable with, try having a look at how things work one level lower. Because all abstractions are leaky and you never know when it’ll become relevant in some unexpected way.
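If you want a one-minute taste of what Goldberg's article is about, here's the classic leak: 0.1 and 0.2 have no exact binary floating-point representation, so "obvious" decimal arithmetic comes out slightly wrong. A tiny Python sketch:

```python
import math

# 0.1 and 0.2 can't be represented exactly in binary floating point,
# so their sum isn't exactly 0.3.
a = 0.1 + 0.2
print(a)                     # 0.30000000000000004
print(a == 0.3)              # False
print(math.isclose(a, 0.3))  # True: compare with a tolerance instead
```

None of this is mysterious once you've looked one level down, which is more or less the point of the homework.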