
But How Do It Know?

The Basic Principles of Computers for Everyone

by J. Clark Scott

Rating: 7/10

Buy it on Amazon

Summary

This book aims to explain and demystify how a computer works under the hood. It peels away the layers of abstraction and jargon that computer science has accumulated over the years and focuses instead on the core ideas of what a computer really does.

Those ideas are:

  • The Binary System => how power on/off can be used to represent binary states (0 and 1), which can then encode everything else: numbers, text, logic, programs, etc.
  • Logic Gates => combinations of switches that perform logical operations on binary states, such as AND, OR, and NOT; with enough of these gates one can build everything a computer can do
  • the CPU (central processing unit) => the ALU (arithmetic logic unit) and the fetch, decode, execute, store cycle, answering the question: what is a program and how does a computer execute it? Retrieve an instruction from memory, decode what it means using the instruction set, execute it with the ALU (say, adding two numbers), and store the result somewhere
  • volatile and non-volatile memory => volatile memory needs power to work; if power is lost, data is lost. RAM is volatile memory, while a hard disk is non-volatile because it stores bits as tiny magnetized regions on a spinning platter rather than as electrical charge in a capacitor
  • registers, and how data storage is layered closer to and further from the CPU, with caches in between to improve performance
  • the idea of a bus system => a bunch of wires cleverly arranged so that everything can connect to them and read/write without colliding => there are two important ones: the data bus (which carries data between the CPU, storage, and peripherals like the mouse and keyboard) and the address bus (which specifies the memory location to read from or write to)
  • above raw binary sits machine code => the instructions in the instruction set that a given CPU can execute. Assembly is the lowest level humans usually deal with, because it translates almost directly into machine code; higher-level languages are abstracted step by step above that, but in the end they all get translated back down into machine code. The programs that do this sort of task are called compilers
  • data from input/output devices like screens and mice is transferred over the data bus to the CPU and handled there
  • the clock => pulsing at a very high frequency to keep everything synchronized => modern processors have clock speeds of a few gigahertz, meaning the clock pulses on and off a few BILLION times every second. That's how a pile of "load memory from here, add this to that, store the result over there" can produce the effect of text appearing on a screen, or even the 3D visuals and sound of a computer game. In the end it's all simple binary operations, but billions of them every second, and together the results can be remarkably complex.
  • the fetch, decode, execute cycle can be used to load programs into memory and run them. Essentially, a program is also just data, but data where each byte corresponds to an instruction from the CPU's instruction set. The CPU fetches the first instruction, executes it, then moves to the next. Common instruction sets have interesting operations such as conditional jumps, which let the processor continue reading instructions from a different place in memory if a certain condition is met; this enables loops, functions, and much more.
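The first bullet's point, that a single pattern of on/off switches means nothing by itself and everything once you pick an interpretation, can be sketched in a few lines of Python (not from the book; the bit pattern and its ASCII reading are just standard conventions):

```python
# Eight switches, each on (1) or off (0).
bits = "01001000"

# Interpreted as an unsigned binary number: 64 + 8 = 72.
number = int(bits, 2)

# Interpreted as text via the ASCII table: code 72 is the letter 'H'.
char = chr(number)

print(number, char)  # 72 H
```

The same eight bits could just as well be a pixel brightness or a CPU instruction; only context decides.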
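The logic-gates bullet can also be made concrete. As a minimal sketch (mine, not the book's), here is every basic gate built from NAND alone, then combined into a half adder that adds two bits, which is the first step toward the ALU's arithmetic:

```python
# NAND is the only "primitive" here; everything else is wired from it,
# mirroring the idea that one simple switch arrangement suffices.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def OR(a, b):
    return NAND(NOT(a), NOT(b))

def XOR(a, b):
    return AND(OR(a, b), NAND(a, b))

# Half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```

Chain half adders together (with the carry feeding into the next stage) and you get the multi-bit addition a real ALU performs.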
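Finally, the fetch, decode, execute cycle itself fits in a short loop. This is a toy sketch of my own: the mnemonics (LOAD, ADD, STORE, JNZ, HALT) are invented for illustration, and a real CPU works on numeric opcodes rather than tuples, but the cycle is the same:

```python
# A toy CPU with one register (the accumulator) and a program counter.
def run(program, memory):
    acc = 0  # accumulator register
    pc = 0   # program counter: index of the next instruction
    while True:
        op, arg = program[pc]   # fetch
        pc += 1
        if op == "LOAD":        # decode + execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "JNZ":       # conditional jump: the basis of loops
            if acc != 0:
                pc = arg
        elif op == "HALT":
            return memory

memory = [5, 7, 0]
program = [
    ("LOAD", 0),   # acc = memory[0]
    ("ADD", 1),    # acc += memory[1]
    ("STORE", 2),  # memory[2] = acc
    ("HALT", None),
]
print(run(program, memory))  # [5, 7, 12]
```

Note that the program is itself just data sitting in a list, exactly the point the last bullet makes, and the JNZ instruction shows how a conditional jump redirects the program counter.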

Finally, modern computers have come a long way from all of this, with plenty of bells and whistles that he doesn't describe. A few come to mind:

Branch Predictions, Error Correcting Codes, Operating Systems, GPUs, or the inner workings of SSD or DRAM.

To me, it's a bit of a pity that he doesn't cover operating systems in more detail. Nowadays they are an integral part of how computers work, because they sit between the hardware (CPU, RAM, peripherals) and the programs running on the computer. Not covering operating systems is understandable, since they are a complicated topic in their own right, but still: CPU interrupts and multitasking on a single core are fascinating and necessary pieces if we want to understand all of how a computer works.

The same is true for the underlying ideas of how hardware like DRAM or SRAM is structured, what the wiring diagrams look like at least in principle, and all the fancy shenanigans people build into modern instruction sets and CPUs to squeeze out more performance. There is a lot of black magic going on there that makes everything work so fast.

Buy it on Amazon