How awesome that this exists. I was learning how CPUs work and designing my own CPU with an emulator about 20 years ago as a teenager, by googling my way into obscure forums, blog posts, and the homemade-CPU webring. Not long ago I ran an experiment: would I be able to find, on Google by myself, all the learning materials to do that again? The outcome deeply unsettled me. Google just gives you shit and total garbage. Half of the results are AI generated; the other half is sloppily written, half-assed, abstract pseudo-tutorial nonsense on Medium or some other paid-for-engagement platform. My children would not be able to reproduce that kind of self-learning without watching some YouTuber do it, taking a curated paid course, or accidentally stumbling upon gems like this, e.g. via HN. We desperately need old Google and the old internet back, and to somehow save and preserve humanity's knowledge.
For those who haven't seen it before, throwing out nand2tetris and nandgame as another interesting journey from logic gates -> computer programs running on a CPU.
nandgame, in particular, is really easy to get started with and has been updated quite a lot over the years. If you looked at it a while ago, check for new updates!
nand2tetris goes into a bit more detail around some things; I like it too, but it's harder to get started with.
For anyone interested in a "not-so-simple" CPU design, I have a couple old college assignments from a CPU design class that may be fun to peruse:
This one is a 4-stage pipelined CPU: https://github.com/wyager/Lambda16
This one is a superscalar out-of-order CPU: https://github.com/wyager/Lambda17
Both are written in Clash, a subset of Haskell that compiles down to HDL for FPGAs. It's an incredibly OP HDL.
I don't think I ever ran the second one on an actual FPGA, because at the time values of type `Char` wouldn't synthesize, but I think the Clash compiler fixed that at some point.
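For a taste of the style, here's about the smallest meaningful Clash design, a free-running 8-bit counter (a sketch against clash-prelude, not code from either repo):

```haskell
import Clash.Prelude

-- `register` is a clocked flip-flop with a reset value; the Num instance
-- on Signal lets us add 1 to the whole stream, so the counter is one line.
counter :: HiddenClockResetEnable dom => Signal dom (Unsigned 8)
counter = s where s = register 0 (s + 1)

-- Expose concrete clock/reset/enable ports so Clash can generate HDL.
topEntity :: Clock System -> Reset System -> Enable System -> Signal System (Unsigned 8)
topEntity = exposeClockResetEnable counter
```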
I just have to say: such a beautiful and accessible website. No fluff, no ads, no distractions. I love it!
As much as I benefited from internalizing system architectures like these many times over… I do wish now, as someone who ended up in software, that there were similarly hand-holdy guides to implementing the "core" of out-of-order superscalar execution engines, too. They're crucial to understanding how modern processors _kinda actually work to a zeroth order approximation_, even though it's impossible to convey the full engineering scope of modern CPUs to those who need the hand-holding.
I considered trying to do a simple CPU design from logic gates too, but I ended up wondering about some of the performance characteristics. Maybe some knowledgeable people are reading this. What I'm wondering about is the switching speed of logic gates compared to the signal speed in the electrical connections of a realistic CPU. I.e., how many logic-gate lengths (assume logic gates to be square) does a signal travel along a wire in the time a logic gate needs to invert its output? Another question that seems relevant: how much spacing do the electrical connections need compared to the size of a logic gate?
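The shape of the estimate is simple, though; here's a back-of-envelope version where every figure is a placeholder assumption rather than real process data:

```haskell
main :: IO ()
main = do
  let gateDelay   = 10e-12  -- s: assumed time for a gate to invert its output
      signalSpeed = 1.5e8   -- m/s: ~c/2; real long on-chip wires are slower (RC-limited)
      gateLength  = 1e-6    -- m: assumed side of a "square" gate
      reach       = signalSpeed * gateDelay  -- distance covered in one gate delay
  putStrLn $ "gate lengths per gate delay: " ++ show (reach / gateLength)
  -- With these placeholders: 1.5e8 * 10e-12 / 1e-6 = 1500 gate lengths.
```

The real answer depends heavily on the wire layer and process node, so treat the 1500 as an illustration of the method, not a result.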
A nice non-Turing-tarpit minimalistic MCPU: https://github.com/cpldcpu/MCPU .
There's a nice little web forum (remember those?) for people interested in toy / experimental CPUs at anycpu.org
I'm not active there any more, but I used to be when I was developing my own toy CPU: https://github.com/robinsonb5/EightThirtyTwo
I'm pondering building a CPU from logic-gate chips. The thing is, most projects like that use chip-count-efficient (usually microcoded) designs, which aren't fast enough to run "real," familiar software. I want a 32-bit instruction set with virtual memory, capable of running Linux and fast enough to run games like Doom. The fastest reasonably available logic family is 74AUC (with the possible exception of exotic ECL gates). Combined with copious use of fast asynchronous SRAM chips, I think performance in the ballpark of 20 MIPS should be attainable. I'm half expecting that I'm making some huge error in my assumptions and it will turn out these numbers are impossible, but that hasn't happened yet.
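A rough sanity check on that ballpark (every number below is an assumption, not a datasheet or measured figure):

```haskell
main :: IO ()
main = do
  let gateDelay  = 2e-9   -- s: assumed worst-case tpd of a 74AUC gate
      logicDepth = 10     -- assumed gate levels on the critical path per cycle
      sramTime   = 10e-9  -- s: assumed access time of fast async SRAM
      cycleTime  = fromIntegral logicDepth * gateDelay + sramTime
      mhz        = 1 / cycleTime / 1e6
  putStrLn $ "cycle ~" ++ show (cycleTime * 1e9) ++ " ns, ~" ++ show mhz ++ " MHz"
  -- ~30 ns per cycle, ~33 MHz; at roughly one instruction per cycle that's
  -- in the same ballpark as the 20 MIPS guess.
```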
Related - breadboarding and computer engineering tutorials: https://www.youtube.com/@BenEater/videos
I started this journey a while back using Tanenbaum's MIC-1 during my Uni days with another colleague. Still have it online if anyone is interested: https://github.com/elvircrn/mic-1.
nice - reminds me of the excellent "Computer Organization and Design" by Patterson and Hennessy https://a.co/d/9U9Adl9
I teach the introduction to computing class at MSU and agree entirely: most students need to start with the simplest possible introduction to computing.
My favorite two models are:
The Scott CPU
https://www.youtube.com/watch?v=cNN_tTXABUA (great book, website is now offline unfortunately: https://web.archive.org/web/20240430093449/https://www.butho...)
An extremely simple non-pipelined 8-bit CPU. The emulator lets you step through tick by tick and see how the machine code drives an operation. I spend one lecture showing each tick of a bitwise AND, following the data from the instruction into the instruction register, showing how the instruction selects the general-purpose registers, runs them through the ALU, and then moves the result from the accumulator back into a register. It's one of my favorite lectures of the year.
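As a compressed illustration of that round trip, here's the whole register → ALU → register journey as one pure function (hypothetical instruction encoding and register file, not the book's actual format, and collapsed into one step where the Scott CPU spends several ticks):

```haskell
import Data.Bits ((.&.), (.|.))
import Data.Word (Word8)
import qualified Data.Map as M

data AluOp = And | Or | Add
data Instr = Alu AluOp Int Int  -- operate on registers RA and RB, write back to RB

execute :: Instr -> M.Map Int Word8 -> M.Map Int Word8
execute (Alu op ra rb) regs =
  let a   = M.findWithDefault 0 ra regs   -- RA onto the bus, into the ALU
      b   = M.findWithDefault 0 rb regs   -- RB into the ALU's temp register
      acc = case op of                    -- ALU result lands in the accumulator
              And -> a .&. b
              Or  -> a .|. b
              Add -> a + b
  in M.insert rb acc regs                 -- accumulator written back to RB

main :: IO ()
main = print (execute (Alu And 0 1) (M.fromList [(0, 0x0F), (1, 0x55)]))
-- fromList [(0,15),(1,5)]  -- 0x0F AND 0x55 == 0x05
```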
The Little Man Computer - https://www.101computing.net/LMC/
A higher-level von Neumann-style computer that gently introduces students to assembly, where they can fully understand the "machine code" since it's just decimal. We then build an emulator, an assembler, and a compiler for an extension to LMC that adds a stack to support function calls.
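For the curious, the core of a standard LMC emulator fits in a few lines; here's a sketch covering the usual decimal opcodes (INP/OUT and our stack extension are omitted):

```haskell
import qualified Data.Map as M

-- Memory is 100 decimal "mailboxes"; machine state is (pc, accumulator, memory).
step :: (Int, Int, M.Map Int Int) -> Maybe (Int, Int, M.Map Int Int)
step (pc, acc, mem) =
  let (op, addr) = M.findWithDefault 0 pc mem `divMod` 100
      val        = M.findWithDefault 0 addr mem
  in case op of
       0 -> Nothing                                            -- 000 HLT
       1 -> Just (pc + 1, acc + val, mem)                      -- 1xx ADD
       2 -> Just (pc + 1, acc - val, mem)                      -- 2xx SUB
       3 -> Just (pc + 1, acc, M.insert addr acc mem)          -- 3xx STA
       5 -> Just (pc + 1, val, mem)                            -- 5xx LDA
       6 -> Just (addr, acc, mem)                              -- 6xx BRA
       7 -> Just (if acc == 0 then addr else pc + 1, acc, mem) -- 7xx BRZ
       8 -> Just (if acc >= 0 then addr else pc + 1, acc, mem) -- 8xx BRP
       _ -> Nothing                                            -- 9xx INP/OUT omitted here

run :: [Int] -> Int  -- run to halt, return the accumulator
run prog = go (0, 0, M.fromList (zip [0 ..] prog))
  where go st@(_, acc, _) = maybe acc go (step st)

main :: IO ()
main = print (run [503, 104, 0, 5, 3])  -- LDA 3; ADD 4; HLT => 8
```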
It's a fun one semester class, not as intense as NAND-to-Tetris but still an overview of how computing works.