A non-technical explanation of how computers and programmers speak to each other

Photo by Hitesh Choudhary on Unsplash

If you work in technology, at some point you’ve probably had to explain what you do to a non-technical person. It might be your inquisitive partner wanting to know more about what you do in that “matrix looking thing” (the terminal) all day. Or perhaps your friendly neighbor learned that you work as a programmer and asked you to fix their printer. These kinds of conversations can be hard.

As I emerged from my home office after a long day of writing code, my partner hit me with a profound question:

“Can’t you just tell the computer NO?”

I mumbled out an answer about how the computer doesn’t understand “NO”; it can only speak in binary. He works in a completely non-technical field, but he knows what binary code is. He quipped back, “How many zeroes and ones do you need to spell out N-O?” I sighed, trying to find the most straightforward way to answer his question, and reflected on how many tech people have probably had a conversation like this at one point or another.

I’m here, writing this guide to hopefully demystify a couple of questions:

  • Does the computer really speak in 0s and 1s? How does that work?

Maybe you can send this article to your curious spouse. Maybe you just want to know more about how binary, programming, and computers all fit together. Hopefully this guide is helpful, whatever your goal is. Just don’t try to use it to fix your neighbor’s printer.

Does the computer really speak in 0s and 1s? How does that work?

The short answer to this question is yes. Computers only speak in 0s and 1s, and that’s called binary. If you break down what a computer does to its most basic function, it is only an electrical current flowing through a circuit. If the circuit is connected, electricity flows through it, and that corresponds to 1. If the circuit is not connected, electricity does not flow, and that corresponds to 0. There are transistors that direct the flow of electricity too, but all you need to know for now is that the computer only really knows on and off.

The fact is that it could be 0/1, on/off, yes/no, true/false or cat/dog. It doesn’t matter what we call it; all that matters is that it represents a difference, one thing that is distinctly different from another. For computers, it happens to be electricity and 0/1. So everything on your computer is actually a number. Even if it’s a letter like “N” or “O”, it’s really just a collection of 1s and 0s.

You’ve probably thought to yourself at some point that computers are very smart. They do all sorts of crazy powerful calculations, right? They must be much smarter than humans.

Computers are NOT smart. They are in fact very dumb. But they’re fast.

If I presented you with a human and told you they could only understand two things: 0 and 1, you probably wouldn’t think they were very smart. More on this later.

Okay, computers speak in 0s and 1s. So how do humans communicate with them?

Well, let’s start with how humans communicate with each other. They use languages. Languages allow us to take abstract information from our brains, break it down into smaller conceptual chunks, and then reassemble those chunks to be broadcast in a more digestible format to the world, using mostly sound from our mouths and gestures from our bodies. There’s a whole branch of scientific inquiry, called Information Theory, dedicated to studying the exchange of data.

Information is everywhere. It’s in our minds, in our smartphones, and even in our DNA. And it has a unit of measurement that is universal no matter what type of information (data) we’re talking about. This unit of measurement is called a bit. Bit is short for (you guessed it) binary digit. Bits have a binary value of either 0 or 1. Is this starting to sound familiar? There are 8 bits in a byte and most of us know all about bytes — you probably know exactly how many gigabytes your smartphone or laptop can hold.
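Just to put a number on that, here’s a quick back-of-the-envelope sketch, written as a snippet of JavaScript (a programming language we’ll meet again in a moment). The 64 GB phone is just an assumed example, and I’m using simple decimal gigabytes:

// How many individual 0s and 1s fit on an assumed 64 GB phone?
const bitsPerByte = 8;
const bytesPerGigabyte = 1_000_000_000; // decimal gigabytes, for simplicity
const phoneGigabytes = 64; // hypothetical phone

const totalBits = phoneGigabytes * bytesPerGigabyte * bitsPerByte;
console.log(totalBits); // 512000000000, i.e. over half a trillion 0s and 1s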

The human brain is estimated to be able to retain the equivalent of 2.5 million gigabytes of data. So in an abstract kind of way, I could say that if I’m talking to someone on the street I’m transmitting bits (information) from my brain to theirs using language. If a computer is moving files onto a hard drive, it is also transferring bits.

Well, programmers communicate with computers using languages too. Just not spoken languages like English or French. You might have heard about programming languages like JavaScript, C, Ruby, Java (different from JavaSCRIPT), Python, Go, R, PHP…the list goes on. If you’ve ever seen JavaScript written out, it might have looked something like this:

const min = function(numbers) {
  let lowestNum = numbers[0];
  for (let num of numbers) {
    if (num < lowestNum) {
      lowestNum = num;
    }
  }
  return lowestNum;
};
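In case you’re wondering what that does: hand it a list of numbers and it gives back the smallest one. For example:

// Example usage of the min function above.
console.log(min([12, 5, 8, 130, 44])); // prints 5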

Wait, that isn’t 0s and 1s though…where are you going with this??

Human languages such as English or French mediate between the abstract thoughts in our brains and the less abstract representation of that information that eventually lands with whoever is listening to us talk or reading what we write. Similarly, programming languages stand in between the abstract HUMAN thoughts of the programmer and the more precise and less abstract 0s and 1s of the MACHINE.

Normal humans struggle to remember a 10-digit phone number, let alone millions of 0s and 1s, so we need a way to abstract all those 0s and 1s away and talk to the computer in a way that both humans and machines can understand. The computer, in turn, is only a series of on-or-off electrical circuits, so it can’t comprehend our complex human languages. What we end up with is something in between 0s and 1s and human language, like the JavaScript above.

Ok, so we have programming languages to talk to computers. How do THEY become 0s and 1s?

Well, the short answer is that programs written in programming languages get translated into 0s and 1s by…other programs. That’s right: smart programmers taught computers to translate languages they don’t understand into ones they do. Pretty cool, huh? This grunt work is done by something called a compiler. The compiler is a program that takes the code written by the programmer and converts it into a set of instructions the computer can understand. What it ends up with is a long list of 0s and 1s that cause the electricity to go off and on in exactly the right order, so that the computer executes the instructions the programmer specified. The reality of what goes on inside a compiler is much more complicated than that, but if I tried to explain it we’d be here all day. There’s also something similar to a compiler called an interpreter, but again we won’t go into that for the sake of simplicity.
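To make that idea a little more concrete, here is a deliberately toy illustration in JavaScript. This is not how a real compiler works, and the instruction names and binary codes are completely made up; the point is only that “compiling” boils down to translating human-readable code into 0s and 1s.

// Toy example only: pretend instruction names mapped to made-up binary codes.
const opcodes = {
  LOAD:  "0001",
  ADD:   "0010",
  STORE: "0011",
};

// "Compile" a list of toy instructions into a string of 0s and 1s.
const compile = (instructions) =>
  instructions.map((name) => opcodes[name]).join(" ");

console.log(compile(["LOAD", "ADD", "STORE"])); // "0001 0010 0011"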

The fact is that computers need a serious amount of hand-holding from programmers in order to perform the complex tasks we know them to be capable of. As mentioned earlier, we love computers because they can do things lightning-fast but they still need to be told what to do and how to do it in fairly minute detail. This is why we have no reason to worry they will go sentient and take over the Earth. They won’t think unless we tell them what to think and how to think it.

So who wrote the compiler? That sounds like hard work!

Yes, dear reader. At some stage, some very smart humans sat down and mapped all of the obscure character combinations that make up the syntax of a programming language (like the things you see in the JavaScript above) to combinations of 0s and 1s for the computer. Most of that logic is carried out by the compiler.

This feels like the right time to mention that not all programming languages are the same. We have high-level and low-level programming languages and they sit on a scale between human-friendly (high-level languages) and computer-friendly (low-level languages).

If you wanted to write code for a website, you could safely, easily, and most importantly quickly use a high-level language like JavaScript, Python, or Ruby. They are easier to learn and less complicated. Low-level languages like assembly language are so close to 0s and 1s that they don’t even need a compiler; a much simpler program called an assembler turns them almost directly into machine code. In fact, one of the things a compiler often does under the hood is translate a high-level language into assembly language first. Low-level languages are much more complex and difficult to learn. Somewhere in the middle is a language like C, which is commonly used for complex programming tasks such as creating an operating system or writing a compiler, but isn’t as hardcore as something like assembly language.

Let’s get back on track here — how do you tell the computer “NO” and how do I fix my printer?

Photo by Sven Brandsma on Unsplash

I mentioned earlier that even letters are, to a computer, a collection of 1s and 0s. They go from being letters to being 1s and 0s through a process called encoding. Well, there’s this thing used for encoding called ASCII, and it uses 7 binary digits to represent 128 characters: the letters of the English alphabet, digits, punctuation, and a handful of control codes. It’s conveniently sequential, and the letters N and O are right next to each other.

With ASCII, you tell a computer no with “1001110 1001111”. Notice that these two groups of seven 0s and 1s differ only in the last digit: the first ends with a 0 and the second with a 1. Like I said, right next to each other. (Capitalization matters, but that’s not important for understanding the concept.)
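If you want to check that for yourself, a couple of lines of JavaScript will show you the number hiding behind each letter and its binary form (the helper name toBinary is just mine, for illustration):

// Convert each letter of a string to its character code, then to binary.
const toBinary = (text) =>
  [...text]
    .map((ch) => ch.charCodeAt(0).toString(2))
    .join(" ");

console.log(toBinary("NO")); // "1001110 1001111"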

But what about all the other characters, you ask? There are a lot more out there than just 128! ASCII was never going to be enough on its own. In order to standardize the world’s many characters, we have a body called the Unicode Consortium, which oversees a list of well over 100,000 characters, each represented by a collection of binary digits; the rarest ones need as many as 32 0s and 1s. These days, the character encoding we typically use is called UTF-8, and it’s pretty universally used across the modern web. Luckily, it is backward compatible with ASCII, so “1001110 1001111” still spells “NO”; UTF-8 simply stores each of those letters in a full 8-bit byte, while less common characters can take up to 32 bits. This is hard to wrap your head around, but there’s more info in the references if you want to explore it further.
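You can even see that backward compatibility in action. Modern JavaScript environments provide a TextEncoder that produces UTF-8 bytes, and for plain English letters those bytes are the same old ASCII numbers:

// Encode "NO" as UTF-8 bytes: they match the ASCII codes for N (78) and O (79).
const bytes = new TextEncoder().encode("NO");
console.log(bytes); // Uint8Array(2) [ 78, 79 ]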

As for printers…they are so 2008. Read about why you should rethink your printer ownership decision.

To conclude…

Now you know how to respond if a computer tries to sing you the song of its people! Sort of… This is by no means an exhaustive or detailed explanation of everything there is to know about any of the topics mentioned (binary, information theory, programming languages, character encoding, etc). It barely scratches the surface of what is a fascinating area of inquiry, and the concepts are simplified to help you unlock your first level of understanding.

My hope is that it raises more questions for you and leads you to learn more about computer science. I’ve listed the resources I used to put the article together below for you to check out if you want to go further down that rabbit hole.

My references are your resources!

Richard Feynman on how computers are DUMB and FAST

More on how a compiler works

More on information theory and the science of data transfer

A great book on systems of communication

Crash course computer science Youtube series

Tom Scott on ASCII and the miracle of Unicode

More about the Unicode Consortium

List of all Unicode characters

The memory capacity of a human brain
