How Code Becomes Action
You type instructions. Something happens on screen. But what actually happens in between? This path reveals the invisible machinery between your code and the electrical signals that make things work — so you understand the substrate before you build on top of it.
Who this is for
- You use software every day but have never thought about what happens inside the machine when you press “run”
- You’re learning to code — or working with AI that writes code — and want to understand the ground beneath your feet
- You want to know why programming languages exist, why there are so many of them, and what each one actually does differently
What this article is NOT
This is not an electronics course or a computer science degree. It’s the conceptual foundation — enough to understand why programming exists and what each layer of the stack does. The circuit diagrams stay in the engineering department.
Part 1 — A very fast follower
The most important thing to understand about a computer: it cannot think. It cannot improvise. It cannot interpret your intentions. A computer is a machine that follows instructions, one at a time, extremely quickly.
How quickly? A modern processor executes billions of instructions per second.1 But each instruction is absurdly simple — move this number here, add these two numbers, compare this value to zero. The power comes from speed and volume, not from intelligence.
The processor — the CPU — runs a loop that never stops:
```mermaid
graph LR
    F[Fetch] -->|read the next instruction| D[Decode]
    D -->|figure out what it means| E[Execute]
    E -->|do it| F
    style F fill:#4a9ede,color:#fff
    style D fill:#e8b84b,color:#fff
    style E fill:#5cb85c,color:#fff
```
This is the fetch-decode-execute cycle.2 Every program you have ever used — every website, every game, every spreadsheet — is ultimately this loop running billions of times per second.
Fetch: the CPU reads the next instruction from memory. Decode: the control unit figures out what the instruction means — what operation, what data. Execute: the CPU does it — adds numbers, moves data, compares values.
Then it moves to the next instruction and repeats.
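Here is the loop in miniature: a Python sketch of a toy machine with an invented four-instruction set, not any real processor.

```python
# A toy CPU: fetch an instruction, decode its operation, execute it, repeat.
# The instruction set here is invented for illustration, not a real ISA.
program = [
    ("LOAD", 97),     # put 97 in the accumulator
    ("ADD", 3),       # add 3 to it
    ("PRINT", None),  # show the result
    ("HALT", None),   # stop the loop
]

accumulator = 0
pc = 0  # program counter: the address of the next instruction

while True:
    op, arg = program[pc]       # fetch: read the next instruction
    pc += 1
    if op == "LOAD":            # decode + execute
        accumulator = arg
    elif op == "ADD":
        accumulator += arg
    elif op == "PRINT":
        print(accumulator)      # prints 100
    elif op == "HALT":
        break
```

Everything a real CPU does is a vastly faster, vastly wider version of this loop.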
Why this matters for you
When someone says “the computer runs your code,” this is what they mean. Your code is broken down into these tiny operations and fed through this loop. Everything you will learn from here — languages, compilers, operating systems — exists to make it easier for humans to create these instruction sequences without going insane.
Part 2 — The language of electricity
A CPU is built from transistors — tiny electronic switches that are either on or off. There is no “maybe.” There is no “a little bit.” On or off. 1 or 0.
This is binary — a number system with only two digits. Every piece of data a computer processes — numbers, text, images, sound — is ultimately represented as patterns of 1s and 0s.3
Why? Because electricity is good at two states. A wire either carries a signal or it doesn’t. A transistor is either open or closed. A system built on this binary foundation is straightforward to engineer and extremely reliable.
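A quick way to see this in practice is with Python’s built-ins, which convert between representations:

```python
n = 97
print(bin(n))             # 0b1100001, the bit pattern behind the number 97
print(chr(n))             # 'a', the same value read as a text character
print(int("1100001", 2))  # 97, turning the bit pattern back into a number
```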
```mermaid
graph TD
    T[Transistors] -->|on or off| B[Binary digits]
    B -->|combined into| G[Logic gates]
    G -->|combined into| C[Circuits]
    C -->|combined into| CPU[Processor]
    style T fill:#94a3b8,color:#fff
    style B fill:#e8b84b,color:#fff
    style G fill:#4a9ede,color:#fff
    style CPU fill:#5cb85c,color:#fff
```
Logic gates are the building blocks. An AND gate outputs 1 only when both inputs are 1. An OR gate outputs 1 when either input is 1. A NOT gate flips the input. These three operations — combined in billions of configurations — produce everything a computer can do.4
You do not need to understand gate design. What matters is the insight: every computation, no matter how complex, is ultimately reducible to combinations of AND, OR, and NOT applied to 1s and 0s. A spreadsheet formula. A video call. A language model generating text. All of it.
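To make that concrete, here is a minimal Python sketch: the three gates as one-bit functions, composed into a circuit that adds two bits.

```python
# The three basic gates, operating on single bits (0 or 1).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def xor(a, b):
    # XOR built only from the basics: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # Adds two bits, returning (sum, carry): a real adder circuit in miniature.
    return xor(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```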
The key insight
Computers do not understand words, images, or meaning. They manipulate patterns of electrical signals. Everything above this layer — every programming language, every application — is a human invention designed to make it easier to create those patterns.
Go deeper
Open binary for the full explanation of how numbers, text, and images are encoded in binary, including ASCII, Unicode, and colour representation.
Part 3 — The first languages
In the earliest computers, programmers wrote instructions directly in binary — patterns like 10110000 01100001. This is machine-code: the exact sequence of 1s and 0s the CPU reads.5
It worked. It was also brutal. Writing a simple addition required knowing the exact binary code for “add,” the exact memory addresses of the values, and the exact format the processor expected. One wrong bit and the program crashed — or worse, silently did the wrong thing.
The first major improvement: assembly-language. Instead of writing 10110000 01100001, you could write mov al, 61h — a human-readable shorthand for the same instruction. A small program called an assembler translated these mnemonics into binary.6
```mermaid
graph LR
    MC[Machine code<br/>10110000 01100001] --- AS[Assembly<br/>mov al, 61h]
    AS -->|assembler translates| MC
    style MC fill:#e74c3c,color:#fff
    style AS fill:#e8b84b,color:#fff
```
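To see what that translation involves, here is a toy assembler in Python that handles exactly this one instruction. (A real assembler carries a full opcode table; 10110000 is the opcode pattern for moving an 8-bit value into register AL.)

```python
# A one-instruction assembler: translate "mov al, <hex>h" into its bit pattern.
def assemble(line: str) -> str:
    mnemonic, operand = line.split(",")
    assert mnemonic.strip() == "mov al", "this toy knows only one instruction"
    value = int(operand.strip().rstrip("h"), 16)  # "61h" -> 97
    return f"10110000 {value:08b}"

print(assemble("mov al, 61h"))  # 10110000 01100001
```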
Assembly was a dramatic improvement. But it was still tied to a specific processor. Code written for one CPU would not run on another. And even simple tasks — like printing text to a screen — required dozens of instructions.
John Backus at IBM described the state of programming before high-level languages as “hand-to-hand combat with the machine.”7 The industry needed something better.
What assembly looks like
Here is a simple operation — storing the number 97 in a register — at three levels:
| Level | What you write | What it means |
|---|---|---|
| Machine code | 10110000 01100001 | Move the value 97 into register AL |
| Assembly | mov al, 61h | Same thing, in human-readable form |
| High-level (C) | char a = 97; | Same thing, in one readable line |

As you move up, the intent becomes clearer and the machine details disappear. That is the point.
Part 4 — Translation
The problem with assembly: it is close to the machine, far from the human. What if you could write something closer to your intent — a = b * 2 + 5 — and have a program translate it into the machine instructions the CPU needs?
That is exactly what happened. And there are two approaches to this translation.
The compiler: the book translator
A compiler reads your entire source code, translates it into machine code, and produces a standalone executable file. You run the compiler once; then you can run the resulting program as many times as you want without the compiler.8
Think of translating a novel. You hire a translator, they work through the entire book, and hand you a finished translation. The reader never needs to see the translator again.
Languages that use compilers: C, C++, Rust, Go.
The interpreter: the live translator
An interpreter reads your code line by line and executes each line immediately. There is no separate executable file — the interpreter must be present every time the program runs.9
Think of a live interpreter at a conference. They translate each sentence as the speaker says it. Slower than reading a pre-translated book, but you can react and adjust in real time.
Languages that use interpreters: Python, Ruby, JavaScript (historically).
```mermaid
graph TD
    SC[Source code] --> C[Compiler]
    SC --> I[Interpreter]
    C -->|translates everything| EX[Executable file]
    EX -->|runs directly on| CPU1[CPU]
    I -->|translates line by line| CPU2[CPU]
    style SC fill:#4a9ede,color:#fff
    style C fill:#e8b84b,color:#fff
    style I fill:#9b59b6,color:#fff
    style EX fill:#5cb85c,color:#fff
```
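To make the interpreter side concrete, here is a minimal Python sketch: an invented three-statement language, executed line by line.

```python
# A toy interpreter: read each line of source, execute it immediately.
# The little language (set / add / show) is invented for illustration.
def run(source: str) -> None:
    variables = {}
    for line in source.splitlines():
        op, name, *rest = line.split()
        if op == "set":
            variables[name] = int(rest[0])   # set x 5
        elif op == "add":
            variables[name] += int(rest[0])  # add x 3
        elif op == "show":
            print(variables[name])           # show x

run("set x 5\nadd x 3\nshow x")  # prints 8
```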
The hybrid approach
Some languages — Java, C# — use both. Your code is compiled into an intermediate format (bytecode), then a virtual machine interprets or just-in-time compiles that bytecode when you run it. This gives you some of the portability of interpretation with some of the speed of compilation.10
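CPython itself works this way: it compiles your source to bytecode, and its virtual machine then interprets that bytecode. The standard dis module lets you inspect the intermediate form:

```python
import dis

def formula(b):
    return b * 2 + 5

# Show the bytecode CPython compiled this function into: the intermediate
# layer between your source and the machine.
dis.dis(formula)
```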
Why this matters
When someone says “Python is slow,” they mean the interpreter adds overhead compared to pre-compiled machine code. When someone says “C is fast but dangerous,” they mean the compiler produces efficient code but gives you less safety. Every language makes a trade-off between developer convenience and machine performance. Understanding that trade-off is understanding programming languages.
Part 5 — The building manager
You have a program. It has been compiled or interpreted into machine instructions the CPU can execute. But your computer runs dozens of programs simultaneously — a browser, a text editor, a music player, background services. Who decides which program gets the CPU? Who prevents one program from overwriting another’s data? Who manages the disk, the network, the screen?
The operating-system.11
The OS is the building manager in an office tower. Tenants (programs) do not deal with plumbing, electricity, or elevators directly. They make requests to the building manager, who coordinates with the actual infrastructure.
```mermaid
graph TD
    P1[Program A] --> OS[Operating System]
    P2[Program B] --> OS
    P3[Program C] --> OS
    OS --> CPU[CPU]
    OS --> MEM[Memory]
    OS --> DISK[Storage]
    OS --> NET[Network]
    style OS fill:#4a9ede,color:#fff
```
The OS does four critical jobs:
Process management. It runs multiple programs by rapidly switching the CPU between them — so fast it looks simultaneous. Each program thinks it has the CPU to itself.12
Memory management. It assigns memory to each program and prevents them from accessing each other’s space. Without this, one buggy program could corrupt every other program’s data.
Device abstraction. Programs do not talk to hardware directly. They ask the OS to “write to disk” or “send over the network.” The OS translates these generic requests into the specific signals each piece of hardware needs, using specialised software called device drivers.
File system. It provides the logical structure — folders, files, permissions — over raw storage. Without it, a disk is just a sequence of bytes with no organisation.
Why this matters for you
When you write code — or when AI writes code for you — you almost never interact with hardware directly. You ask the operating system.
open("file.txt")in Python does not physically spin a disk. It sends a request to the OS, which manages the details. Understanding this layer helps you understand what your code can and cannot do.
Part 6 — Where code lives when it runs
When you launch a program, the operating system loads it into memory — the computer’s fast, temporary workspace. But memory is not a single undifferentiated pool. Your program uses two distinct regions.13
The stack: sticky notes on your desk
The stack stores temporary, short-lived data — local variables inside functions, the trail of which function called which. It works like a pile of sticky notes: you add a note on top when you start a task, remove it when the task is done. Last in, first out.
The stack is fast, automatic, and small. When a function finishes, its data is instantly reclaimed. You never have to clean up.
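You can watch the pile of sticky notes in action with Python’s standard traceback module, as in this minimal sketch:

```python
import traceback

def first():
    second()  # first() pushes a frame, then calls second()

def second():
    # Print the current call stack: second() sits on top of first(),
    # last in, first out.
    traceback.print_stack()

first()
```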
The heap: the warehouse
The heap stores data that needs to persist beyond a single function call — objects, large collections, anything whose size is not known in advance. Think of a warehouse: you request space when you need it, use it for as long as necessary, and eventually return it.
The heap is larger and more flexible, but slower to manage. Someone has to track which space is in use and which is free.
```mermaid
graph TD
    MEM[Program memory] --> S[Stack]
    MEM --> H[Heap]
    S --> S1[Fast and automatic]
    S --> S2[Short-lived data]
    H --> H1[Flexible and large]
    H --> H2[Long-lived data]
    style MEM fill:#4a9ede,color:#fff
    style S fill:#5cb85c,color:#fff
    style H fill:#9b59b6,color:#fff
```
Who cleans up the warehouse?
In languages like C, the programmer manages heap memory manually — allocating and freeing space by hand. Forget to free it and you leak memory. Free it too early and the program crashes.
Most modern languages — Python, Java, JavaScript, Go — use a runtime that includes a garbage collector: an automatic system that periodically identifies unused memory and reclaims it.14 This is one of the major reasons high-level languages are easier to work with: the memory-management is handled for you.
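You can watch the collector work with Python’s standard gc module. This small sketch builds a reference cycle, the classic case that reference counting alone cannot reclaim:

```python
import gc

class Node:
    def __init__(self):
        self.partner = None

a, b = Node(), Node()
a.partner, b.partner = b, a  # two objects referring to each other: a cycle
del a, b                     # the names are gone, but the cycle keeps both alive

# CPython frees most objects by reference counting; the garbage collector
# exists for cycles like this one. collect() returns how many unreachable
# objects it found.
print(gc.collect())
```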
Go deeper
Open memory-management for the full explanation of how stack and heap allocation works, or runtime for what happens between your code and the operating system.
Part 7 — The abstraction ladder
Step back and see the full picture. Everything you have read in this path describes a series of layers, each one hiding the complexity of the layer below:
```mermaid
graph TD
    DSL[Domain-specific languages<br/>SQL, HTML, CSS] --> HL
    HL[High-level languages<br/>Python, JavaScript, Java] --> LL
    LL[Low-level languages<br/>C, C++, Rust] --> ASM
    ASM[Assembly language] --> MC
    MC[Machine code] --> HW
    HW[Hardware<br/>transistors, logic gates, circuits]
    style DSL fill:#9b59b6,color:#fff
    style HL fill:#4a9ede,color:#fff
    style LL fill:#5cb85c,color:#fff
    style ASM fill:#e8b84b,color:#fff
    style MC fill:#e74c3c,color:#fff
    style HW fill:#94a3b8,color:#fff
```
This is the abstraction ladder.15 Each layer was invented because the layer below was too hard, too error-prone, or too slow for humans to work with productively.
| Layer | What you write | What it costs you |
|---|---|---|
| Machine code | 10110000 01100001 | Maximum control, maximum pain |
| Assembly | mov al, 61h | Tied to one processor family |
| C / Rust | a = 97; | You manage memory, you handle errors |
| Python / JS | a = 97 | Slower execution, less hardware control |
| SQL / HTML | SELECT * FROM users | Only works in its specific domain |
The trade-off is consistent: each step up trades machine efficiency for human productivity. John Backus demonstrated this in 1957 when FORTRAN reduced roughly 1,000 machine instructions to 47 lines of code.7 Guido van Rossum made the same argument when he created Python in 1991: “A change of mindset about cost of programmer’s time versus cost of computer’s time was overdue.”16
This trade-off explains why there are so many languages. It is not chaos — it is specialisation. Assembly exists because firmware engineers need direct hardware control. Python exists because data scientists need to express ideas quickly. SQL exists because database queries should describe what you want, not how to retrieve it.
The key insight
You do not need to work at every layer. But understanding that the layers exist — and that each one is a human invention built to make the layer below more accessible — gives you the mental model to understand any technology conversation. When someone says “Python is interpreted” or “C compiles to machine code,” you know exactly what they mean and why it matters.
What you now understand
Concepts you've gained
- Fetch-decode-execute — the loop at the heart of every CPU, running billions of times per second
- Binary — the language of electricity, where everything reduces to 1s and 0s
- Machine code and assembly — the first programming languages, direct conversations with the processor
- Compilers and interpreters — two strategies for translating human-readable code into machine instructions
- The operating system — the building manager that coordinates hardware access for all running programs
- Stack and heap — where your code’s data lives in memory, and who cleans it up
- The abstraction ladder — the reason programming languages exist and why each layer trades control for productivity
Check your understanding
Test yourself before moving on
- Explain the fetch-decode-execute cycle to someone who has never used a computer. What does each step do?
- Describe the difference between a compiler and an interpreter. What analogy would you use to explain each?
- Distinguish between stack memory and heap memory. When does each get used?
- Interpret this statement: “Python is slower than C.” Using what you have learned about compilers, interpreters, and abstraction layers, explain why this is true and why Python is still widely used.
- Design a one-paragraph explanation of why the operating system exists, using the building manager analogy. Include at least three jobs the OS performs.
Where to go next
Path A: Learn the universal grammar of code
You now understand what happens beneath the code. Next, learn the patterns that exist in every programming language — variables, control flow, functions, data structures, and paradigms. The syntax changes; the logic stays the same.
Continue to grammar-of-code.
Best for: People who want to understand code conceptually before choosing a language or building a project.
Path B: Start thinking about software architecture
You understand the substrate. Now learn how software is structured — frontend, backend, APIs, databases — and how to direct AI to build it for you.
Continue to from-zero-to-building.
Best for: People who want to build something and need the mental model for how software systems fit together.
Path C: Go deeper into the concepts
Explore the concept cards at your own pace. Start with the ones that intrigued you most:
- binary — how data is encoded in 1s and 0s
- compiler and interpreter — translation strategies
- operating-system — the full picture of resource management
- abstraction-layers — the complete ladder with examples
- memory-management — stack, heap, and garbage collection
Best for: People who prefer depth-first exploration over guided sequences.
Sources
Further reading
- Code: The Hidden Language of Computer Hardware and Software — Charles Petzold’s classic that takes you from telegraph relays to modern CPUs, assuming zero background
- Introduction to Computer Science — OpenStax free textbook covering abstraction layers, OS concepts, and programming fundamentals
- Crash Course Computer Science — 40-episode video series covering everything from transistors to AI, 10 minutes each
- Ben Eater’s YouTube Channel — If you want to see someone build a CPU from scratch on a breadboard, start here
- From 1+1 in Assembly to LLMs: The Evolution of Computing Abstraction — Taewoon Kim’s walkthrough of the full abstraction ladder with code examples at every level
Footnotes
1. Modern processors operate at clock speeds of 3-5 GHz, with multiple cores executing instructions in parallel. Hennessy, J. & Patterson, D. (2017). Computer Architecture: A Quantitative Approach. 6th ed. Morgan Kaufmann.
2. Baeldung. (2024). Introduction to the Fetch-Execute Cycle. Baeldung CS.
3. Petzold, C. (2000). Code: The Hidden Language of Computer Hardware and Software. Microsoft Press. The definitive beginner-friendly explanation of binary and how computers work.
4. Boolean algebra, formalized by George Boole in 1854, provides the mathematical foundation for digital logic. Boole, G. (1854). An Investigation of the Laws of Thought. Walton and Maberly.
5. LearnCpp.com. (2024). Introduction to Programming Languages. LearnCpp.
6. Wikipedia. (2026). Assembly language. Wikipedia.
7. IBM. (2026). John Backus. IBM History.
8. GeeksforGeeks. (2024). Difference Between Compiler and Interpreter. GeeksforGeeks.
9. Built In. (2024). Compiler vs. Interpreter in Programming. Built In.
10. Oracle. (2026). The Java Virtual Machine. Oracle.
11. GeeksforGeeks. (2024). Introduction to Operating System. GeeksforGeeks.
12. OpenStax. (2024). Fundamental OS Concepts. OpenStax, Introduction to Computer Science.
13. GeeksforGeeks. (2024). Stack vs Heap Memory Allocation. GeeksforGeeks.
14. Jones, R. & Lins, R. (1996). Garbage Collection: Algorithms for Automatic Dynamic Memory Management. Wiley. The foundational reference on automatic memory management.
15. OpenStax. (2024). Computer Levels of Abstraction. OpenStax, Introduction to Computer Science.
16. Van Rossum, G. (2009). Python’s Design Philosophy. Python History Blog.