November 2020
I like to answer this question in two parts: bits, and instructions.
Let's start with the bits. Suppose that you are on a cliff overlooking the ocean. It is currently nighttime and you have two flashlights on you. Your job is to use these flashlights to come up with a scheme that indicates to all incoming ships whether it is safe for them to dock.
To achieve this, you (and the ship captains) agree on a set of rules ahead of time: each flashlight signal is assigned a meaning, such as "safe to dock" or "not safe to dock". This is a very simple scheme, and it works. Try to understand what is going on here. The actual turning on/off of the flashlights is physically unrelated to whether it is safe or unsafe for the ships to dock. But because all parties have agreed beforehand on what each action means, these actions now carry a meaning that exists only in this specific context. In any other context, the turning on/off of the flashlights is meaningless, or worse, it could carry a different meaning, which may lead to misunderstandings and possibly harm.
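If you were to write this agreement down as a program, it could be nothing more than a lookup table. A minimal sketch in Python (the exact signals here are made up; any set that both sides agree on would work just as well):

    # The agreement between you and the ship captains, written as a lookup
    # table. The signals themselves are arbitrary placeholders; the meaning
    # comes entirely from the fact that both sides share the same table.
    AGREEMENT = {
        "flashlights on": "safe to dock",
        "flashlights off": "not safe to dock",
    }

    def interpret(signal):
        # Without the shared table, a signal has no meaning at all.
        return AGREEMENT.get(signal, "no agreed meaning")

    print(interpret("flashlights on"))    # safe to dock
    print(interpret("flashlights waved")) # no agreed meaning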
This process of using one thing to represent another thing is called symbolism. Symbolism is a very fundamental idea in computer science. We know that computers read bits (ones and zeros). One of the reasons computers are so ubiquitous today is that these lowly bits are capable of representing not just anything, but a very large number of anything.
At the software level, we use one piece of code (a variable) to stand for other pieces of code and data. At the hardware level, voltages in electrical circuits are used to represent bits (ones and zeros), which are in turn used to represent data and instructions. The pictures that you upload to the Internet, the articles that you read, the videos that you watch -- all of these exist as nothing but ones and zeros.
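You can peek at these ones and zeros yourself. A few lines of Python will print the bits behind a short piece of text (the word itself is an arbitrary example):

    # Each character of text is stored as a number, and that number is
    # stored as eight bits. Eight bits give 2**8 = 256 possible patterns.
    message = "dock"
    for character in message:
        number = ord(character)                          # the number used to store this character
        print(character, number, format(number, "08b"))  # the same number, written out as bits

Run it and you will see that even a four-letter word is, underneath, just thirty-two ones and zeros.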
We just saw how bits can be used to represent information. Let's now take a very shallow look at how a computer reads and executes instructions.
Essentially, a computer does one thing, and one thing only -- execute instructions. A typical sequence of computer instructions might consist of the following:

1. Read the bits stored in one memory location.
2. Read the bits stored in a second memory location.
3. Add the two groups of bits together.
4. Write the result into a third memory location.
These elementary operations (reading from memory, writing into memory, performing an operation) form the basis of a large subset of a modern processor's instruction set. The four instructions above may be written, in computer instruction form, as something like the following:

1. MOVE 00101010 00000000
2. MOVE 00101011 00000001
3. ADD 00000000 00000001
4. MOVE 00000001 00101100
The first instruction says to move the bits in memory location 00101010 into memory location 00000000. The third instruction says to add the bits in memory location 00000000 to the bits in memory location 00000001 (in place).
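To make this concrete, here is a rough sketch of these two operations in Python, treating memory as one big list of numbered slots. The stored values and the memory locations used by instructions 2 and 4 are made up for the example:

    # Memory as one big list of numbered slots, plus the two operations used
    # above. MOVE copies bits between slots; ADD adds into the destination
    # slot, in place.
    MEMORY = [0] * 256

    def MOVE(source, destination):
        MEMORY[destination] = MEMORY[source]

    def ADD(source, destination):
        MEMORY[destination] += MEMORY[source]

    MEMORY[0b00101010] = 2         # a number already sitting in memory
    MEMORY[0b00101011] = 3         # another number

    MOVE(0b00101010, 0b00000000)   # instruction 1
    MOVE(0b00101011, 0b00000001)   # instruction 2
    ADD(0b00000000, 0b00000001)    # instruction 3
    MOVE(0b00000001, 0b00101100)   # instruction 4

    print(MEMORY[0b00101100])      # prints 5, the sum of the two numbers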
But wait, computers can only read ones and zeros, so how do they read the textual part of the instructions, i.e. MOVE and ADD? The answer is, they don't. The instruction form above is not the final form that gets read by the computer. The actual final form looks something like the following:

1. 00100010101000000000
2. 00100010101100000001
3. 00010000000000000001
4. 00100000000100101100
If you look closely, you might notice some similarities between these four instructions and the four instructions before them. Instructions 1, 2 and 4 all start with the same four bits: 0010. You might guess that these four bits 0010 represent the MOVE instruction, and you would be right. The first four bits specify the type of operation to perform (MOVE or ADD), and the next sixteen bits specify the source and destination memory locations (eight bits each).
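You could mimic this decoding step yourself in a few lines of Python, assuming the layout just described: a four-bit operation code followed by an eight-bit source location and an eight-bit destination location. The 0010 code for MOVE comes from the instructions above; the ADD code is just a placeholder:

    # Split a twenty-bit instruction into its three parts. The opcode for
    # ADD (0001) is a placeholder; only MOVE's 0010 is fixed above.
    OPCODES = {"0010": "MOVE", "0001": "ADD"}

    def decode(instruction):
        opcode = instruction[0:4]         # which operation to perform
        source = instruction[4:12]        # where to read the bits from
        destination = instruction[12:20]  # where to put the result
        return OPCODES.get(opcode, "UNKNOWN"), int(source, 2), int(destination, 2)

    print(decode("00100010101000000000"))  # ('MOVE', 42, 0)
    print(decode("00010000000000000001"))  # ('ADD', 0, 1)

In real hardware the splitting is done with circuits rather than string slicing, but the idea is the same.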