Binary Made Really Easy to Understand for Anyone
Importance of Binary
We have all probably heard by now that computer science is a booming industry. Even The New York Times reported that the College Board considers coding one of the two most important skills for young people today. Of course, it is debatable whether the College Board is right about it being one of the most crucial skills to have. But one thing is certain: demand for computer science is growing rapidly. On that note, we need to start exposing more young people to computer science concepts.
Although we keep hearing people call computers smart, computers are actually limited to distinguishing between just two inputs. They understand true vs. false, which is the basis of Boolean logic (we will talk more about this in future guides). Equivalently, they understand on vs. off, or one vs. zero. The ones and zeros of binary are therefore what the computer uses to represent and decipher information. Given the constraint of having only two possible inputs, the binary system is a truly amazing creation.
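To make this concrete, here is a minimal Python sketch (Python is just my choice for illustration, not something from the article) showing that true/false, on/off, and one/zero really are the same two-valued idea:

```python
# In Python, the Boolean values True and False are the integers
# 1 and 0 underneath, mirroring a computer's on/off states.
print(int(True))   # 1  ("on")
print(int(False))  # 0  ("off")

# Any ordinary number can be written using only those two symbols:
print(bin(5))      # 0b101  ->  on, off, on
```

The `0b` prefix is just Python's way of saying "what follows is binary."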
The Birth of Binary
Every great creation has a story behind it, and even binary has quite an interesting history. I've created a short summary below.
Timeline Made by Potato Pirates
These are only a few of the historical events that brought binary to where it is today. Binary has been around for a very long time, but it wasn't until the 20th century that it was finally put to use in computers. The Z1, for example, was the first programmable computer, built between 1936 and 1938 by Konrad Zuse. The best part of this invention is that it was built in his parents' living room. I can't imagine what they thought of the gigantic computer in their house, but many others like him made history with their own designs for the first computers.
The Z1 by Konrad Zuse
The Electronic Numerical Integrator and Computer, ENIAC for short, was built by J. Presper Eckert and John Mauchly between 1943 and 1946 at the University of Pennsylvania. It was a huge machine, weighing about 30 tons. Luckily, thanks to advances in technology, our computers keep getting lighter and lighter. Size aside, the ENIAC was an excellent early computer because it was fully functional.
How Does Binary Work?
Now, let's get to the good part: what exactly do these ones and zeros mean, and how do we read them? This short and sweet video explains all the basics of binary, getting you started on the road to becoming a binary genius.
So now that we know how to convert decimal to binary, and how a computer gets these ones and zeros in the first place, the question on our minds is probably: how do they form letters, then? You might think that no matter what you do, it's impossible to get a letter out of adding up numbers. And you're not wrong.
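The decimal-to-binary conversion mentioned above can be sketched in a few lines of Python. This uses the common repeated-division-by-2 method (the function name `decimal_to_binary` is my own, not from the article):

```python
def decimal_to_binary(n: int) -> str:
    """Convert a non-negative integer to a binary string by
    repeatedly dividing by 2 and collecting the remainders."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # the remainder is the next bit
        n //= 2
    # Remainders come out lowest bit first, so reverse them.
    return "".join(reversed(bits))

print(decimal_to_binary(13))  # 1101  (8 + 4 + 0 + 1)
```

You can check the result against Python's built-in `bin()`, which prints the same digits with a `0b` prefix.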
This is why each letter is actually assigned a specific number. In the American Standard Code for Information Interchange (ASCII), for example, an uppercase A is 65, while a lowercase a is 97. Based on the context, the computer decides whether to display letters or numbers.
Table for conversion from binary to ASCII
For example, if I am dialing someone on my phone, the digits will not suddenly change into letters, because the phone knows from context that I only want numbers to show up. This matters because the same binary pattern can stand for a digit, a letter, or another symbol, depending on how it is interpreted.
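A quick Python sketch of that ambiguity: the very same eight bits can be read as the number 53 or as the character "5", and only context tells the machine which one you mean.

```python
# One bit pattern, two interpretations.
pattern = 0b00110101       # the eight bits 00110101

print(pattern)             # 53   -> read as a plain number
print(chr(pattern))        # 5    -> read as the ASCII character '5'
```

Nothing in the bits themselves says "number" or "character"; the program displaying them decides.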
In part 2 of this article, we will explore what binary is used for, how to convert it to other systems like hexadecimal, and its different applications. Did you know that your images and videos are made of binary as well? There are a lot of uses for these ones and zeros, so you can look forward to a lot more great content coming your way!