I need to know everything about modern technology, so I'm starting with the bit
I'm starting to realize that if you want to be a domain expert in anything technology- or IT-related, you also need some general grasp of electrical engineering and math.
So, in my quest to learn about this field, I will start writing about my newly acquired knowledge (and hopefully retain it in the process).
The smallest unit of data is a bit.
Bit stands for binary digit. A binary digit is either 0 or 1.
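To make that concrete, here's a tiny Python sketch (my own example, not from any course) showing that a number is really just a pattern of bits:

```python
# Every number a computer stores is just a pattern of bits (0s and 1s).
n = 13
print(bin(n))  # Python's built-in binary representation -> 0b1101

# 13 = 8 + 4 + 0 + 1, so its bits are 1, 1, 0, 1
bits = [int(b) for b in format(n, "b")]
print(bits)  # -> [1, 1, 0, 1]
```

Eight of these bits together make a byte, which is the next unit up.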
I'm going to use the CPU as an example because it's the most important part of a computer. From an electrical engineering perspective, a bit works like this. (Very dumbed down, I know.)
- A low voltage is a logical 0 (close to 0V)
- A higher voltage is a logical 1 (roughly 0.8V to 1.5V)
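The two rules above can be sketched as a little function that turns an analog voltage into a logical bit. The 0.4V threshold below is made up for illustration; real chips define their own low/high ranges.

```python
# Illustrative sketch: turning an analog voltage into a logical bit.
# The 0.4 V threshold is a made-up example value, not a real spec.
def voltage_to_bit(voltage: float) -> int:
    return 1 if voltage > 0.4 else 0

print(voltage_to_bit(0.05))  # close to 0 V -> 0
print(voltage_to_bit(1.2))   # higher voltage -> 1
```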
A CPU is made up of millions (nowadays, billions) of transistors. A transistor controls the flow of electricity.
At the most basic level, a transistor is connected to a power source (think battery), a load (think light bulb), and a control signal (the gate, which stays shut or opens to let electricity flow, and runs on a lower current).
All of these transistors are either "on" or "off".
A transistor in its initial state is off. Like a light bulb with no light whatsoever. That represents a "0".
When you apply a higher voltage to the transistor - opening the gate - it turns on. We have light. Now the transistor represents a "1".
This "1" is held by keeping the higher voltage applied, refreshing it until it is no longer needed.
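The whole switch analogy above can be written out as a (very dumbed down) Python model; the class and method names here are my own invention for illustration:

```python
# A (very dumbed down) transistor as a switch: the gate controls
# whether current flows from the power source to the load.
class Transistor:
    def __init__(self):
        self.gate_open = False  # initial state: off, representing a 0

    def apply_gate_voltage(self, on: bool):
        # The control signal opens or closes the gate.
        self.gate_open = on

    def output(self) -> int:
        # 1 when current flows (the light bulb is lit), 0 otherwise.
        return 1 if self.gate_open else 0

t = Transistor()
print(t.output())           # off -> 0
t.apply_gate_voltage(True)  # apply the higher voltage
print(t.output())           # gate open, current flows -> 1
```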
Note that this is very, very basic. I will come back to this blog post once I know more. And maybe (hopefully) I'll realise how dumb I was back then.
Update 11-11-2024: Currently taking the CS50 course. Binary is 0 & 1 because electricity is either on or off. (Think of charging your phone: it's either plugged in or it's not.)