Jonathan, author of Robopenguins

How do computers work?

A while back my great aunt asked me for some resources on learning about computers. I thought I’d upload the conversation to save it for posterity.

While I first understood the question to be about programming, after a few emails back and forth she clarified her interest:

I realize that what I am looking for is to understand how computers work – the underlying mechanics/engineering – whatever — not to learn how to program or code.  I mean how all this information gets embedded – or whatever the term is – in a minuscule chip  — how it was possible to move from the room-filling contraptions that were the first computers into the hand-held devices that now can tie together the entire world – and beyond. How machines are able to respond to digitized instructions. What am I missing here?

I decided to spend an evening trying to figure out how to answer that question. It’s a pretty huge nugget to break down, but this was my take on it:

That’s an interesting angle to be approaching things. Computers have progressed to the point where very few people, even programmers, could really describe how they work. A huge portion of engineering is about building mental abstractions that allow one to work with things without needing to know all (or any of) the details. At the most fundamental level, I’d break how a computer works into math and physics.

The physics is mostly about the design of semiconductors, especially the transistor https://en.wikipedia.org/wiki/Transistor . While many other technologies also come into play, the computation in a computer basically comes down to these digital switches. Getting into the physics of the transistor is an area I learned in school, but hardly remember at this point. The main thing I would try to understand is how you go from the idea of a transistor to its mathematical representation as a logic gate https://en.wikipedia.org/wiki/Logic_gate . The evolution from room-sized machines to modern phones has fundamentally been an evolution of these gates to become smaller, cheaper, and more efficient. This evolution is commonly described as Moore’s Law https://en.wikipedia.org/wiki/Moore%27s_law .
The picture shows how we’ve moved between different underlying technologies over the years to get to modern computers. In that picture, integrated circuits are the chips, which are made of billions of individual transistors printed onto a silicon wafer. Advances in how this is done are what have contributed to most of the miniaturization and increased power of computers. The basics of the most ubiquitous technique are shown here.
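To make the transistor-to-gate jump a bit more concrete, here’s a minimal sketch in Python. It’s just an illustration, treating each gate as a tiny function on 0s and 1s rather than actual transistors:

```python
# A transistor is essentially a switch: current flows (1) or it doesn't (0).
# A logic gate is a small arrangement of these switches, which we can model
# as a function that takes 0s and 1s and returns a 0 or 1.

def NOT(a):
    return 1 - a

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

# NAND (NOT of AND) is notable because every other gate can be built from it.
def NAND(a, b):
    return NOT(AND(a, b))

# Print the truth table: every combination of inputs and what each gate outputs.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b} | AND={AND(a, b)} OR={OR(a, b)} NAND={NAND(a, b)}")
```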
To leave the physics of what’s being built, the next level of abstraction is the logic gate, as mentioned before. These gates are reasoned about using boolean algebra https://en.wikipedia.org/wiki/Boolean_algebra . The basic idea is that since we have these electrical switches, everything the computer “computes” operates on this idea of on and off, or 0 and 1. To extend this to numbers, you can represent values and computation in binary http://www.math.grin.edu/~rebelsky/Courses/152/97F/Readings/student-binary . This might be a bit complex, but it shows how you work through the logic and build more complicated computations from simpler pieces.
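As a small example of building something more complicated from those simpler pieces, here’s a sketch (again, just for illustration) of adding two binary numbers with a chain of one-bit “full adders”, which is roughly how hardware does it:

```python
def xor(a, b):
    """Exclusive OR: 1 if exactly one of the inputs is 1."""
    return (a | b) & (1 - (a & b))

def full_adder(a, b, carry_in):
    """Add three single bits; return (sum_bit, carry_out)."""
    sum_bit = xor(xor(a, b), carry_in)
    carry_out = (a & b) | (carry_in & xor(a, b))
    return sum_bit, carry_out

def add_binary(x_bits, y_bits):
    """Add two equal-length lists of bits (least significant bit first)."""
    result, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        bit, carry = full_adder(a, b, carry)
        result.append(bit)
    result.append(carry)
    return result

# 6 is 110 in binary and 3 is 011; written least-significant-bit first:
print(add_binary([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1] -> binary 1001 -> 9
```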
Computers have a clock that triggers the circuits to feed data through step by step. They also have memory that can store the results for future calculations. You could dive into the physics of the oscillators that make up the clocks, or the various memory technologies, but at their core it’s the process of translating an electrical phenomenon into 1’s and 0’s. The logic in a computer is arranged so that it can take in a series of instructions https://en.wikipedia.org/wiki/Instruction_set and store the results. These instructions are incredibly basic: arithmetic, logic, and controlling the selection of the next instruction. When you run a program on a computer, it’s just an extremely complicated set of millions of these instructions.
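To give a feel for what that looks like, here’s a toy sketch of a machine with a few memory slots and a handful of made-up instruction names. Real instruction sets differ in the details, but the spirit is the same:

```python
# A toy machine: a few memory slots plus a loop that executes one simple
# instruction at a time. The instruction names here are invented for the
# example, but real ones are similarly basic.

memory = [0] * 8

def run(program):
    pc = 0  # the "program counter": which instruction runs next
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":            # put a constant into a memory slot
            memory[args[0]] = args[1]
        elif op == "ADD":           # add two slots, store the result in a third
            memory[args[2]] = memory[args[0]] + memory[args[1]]
        elif op == "JUMP_IF_ZERO":  # choose the next instruction based on a value
            if memory[args[0]] == 0:
                pc = args[1]
                continue
        pc += 1

# A three-instruction "program" that computes 2 + 3 into slot 2.
run([
    ("LOAD", 0, 2),
    ("LOAD", 1, 3),
    ("ADD", 0, 1, 2),
])
print(memory[2])  # 5
```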
Humans almost never interact with these instructions directly. We write “code”, which is an “easy” way to represent what we want a computer to do, and which gets mapped to these basic instructions. Different programming languages are different attempts to make this mapping as efficient as possible. From there it’s mostly layer after layer of abstraction to make telling the computer what you want as easy as possible.
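As one concrete peek at this mapping, Python can show the lower-level instructions it translates a line of code into. The exact instruction names vary between Python versions, but the gist is always “load some values, multiply, add, hand back the result”:

```python
import dis

def pay(hours, rate, bonus):
    return hours * rate + bonus

# Print the basic instructions this one line of code gets mapped to.
dis.dis(pay)
```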
The input devices (mouse, keyboard) and output devices (screen, speakers) are electrically connected and have their interfaces mapped to 1’s and 0’s. They affect, and are affected by, the execution of instructions in the computer, and they translate data between the digital world inside the chip and something a human can interact with.
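For instance, when you press the “A” key, what the program actually sees is just a number, and a single pixel on the screen is just three numbers for red, green, and blue:

```python
# A keypress arrives as a number (here, the standard character code for "A")...
key = "A"
print(ord(key))                 # 65
print(format(ord(key), "08b"))  # 01000001: the same value written as 8 bits

# ...and a single pixel on the screen is just three numbers
# (red, green, blue), each stored as 8 bits.
pixel = (255, 128, 0)  # a shade of orange
print([format(value, "08b") for value in pixel])
```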
I found this video series to be decent at covering this information:
This is all tremendously complicated to understand in the abstract, and any one of these ideas could be a whole university course. I most confidently understand the pieces of the computer I interact with on a regular basis, and most of the time I don’t need to think about anything lower level than the instructions my code might be mapped to.
I find the fact that the human race has managed to create things of such complexity that work with such consistency amazing. They have been able to develop so rapidly because they support cooperation among engineers, allowing millions of people to build on each other’s work. The fact that this all boils down to pretty basic logic allows people to use each other’s work unambiguously.