Learn the essential computer science fundamentals that power all modern software — including how code runs, what memory and the CPU do, and how programming languages interact with machines. No prior experience needed. This course builds the mindset and foundation for programming, DSA, and interviews.
Computers don’t understand human language. At their core, they only understand numbers — more specifically, the Binary Number System. But humans use Decimal, and programmers also use Hexadecimal and Octal. Understanding these systems is fundamental to computer science, programming, and working with hardware.
Decimal is the number system you've used since childhood. It has 10 digits: 0 through 9.
Binary is the system used by all digital electronics, including computers. It consists of only two digits: 0 and 1. Each digit is called a bit (binary digit).
Programmers often use Hexadecimal because it's shorter and easier to read than binary (for example, web colors like #FF5733).
Octal is less common, but it is used in some programming scenarios like Unix file permissions.
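To see all four systems side by side, here is a minimal Python sketch (the `0b`, `0o`, and `0x` literal prefixes are standard Python syntax; the variable names are illustrative):

```python
# The same number written in all four systems, using Python's literal prefixes.
n_decimal = 25        # base 10
n_binary = 0b11001    # base 2  -> 25
n_octal = 0o31        # base 8  -> 25
n_hex = 0x19          # base 16 -> 25

# All four names refer to the exact same value.
print(n_decimal == n_binary == n_octal == n_hex)  # True

# Built-in helpers convert a number back into each notation.
print(bin(25))  # 0b11001
print(oct(25))  # 0o31
print(hex(25))  # 0x19
```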
To convert Decimal to Binary, divide the number by 2 repeatedly, record the remainders, and then read the remainders from bottom to top.
Decimal 25 → Binary:

25 ÷ 2 = 12 remainder 1
12 ÷ 2 = 6 remainder 0
6 ÷ 2 = 3 remainder 0
3 ÷ 2 = 1 remainder 1
1 ÷ 2 = 0 remainder 1

Reading the remainders bottom to top: Binary = 11001
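The same repeated-division method as a short Python sketch (the function name `to_binary` is just an illustrative choice):

```python
def to_binary(n: int) -> str:
    """Convert a non-negative decimal integer to a binary string
    using repeated division by 2."""
    if n == 0:
        return "0"
    remainders = []
    while n > 0:
        remainders.append(str(n % 2))  # record the remainder
        n //= 2                        # integer-divide by 2
    # Remainders come out least-significant first, so read them in reverse.
    return "".join(reversed(remainders))

print(to_binary(25))  # 11001
```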
To convert Binary back to Decimal, multiply each bit by its power of 2 and add the results:

Binary 1010 = 1×2³ + 0×2² + 1×2¹ + 0×2⁰ = 8 + 0 + 2 + 0 = 10
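Here is the same positional expansion in Python (the manual loop shows the arithmetic; `int(s, 2)` is Python's built-in parser for comparison):

```python
def binary_to_decimal(bits: str) -> int:
    """Convert a binary string to a decimal integer by summing
    each bit times its power of 2."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)  # shift value left, add the new bit
    return value

print(binary_to_decimal("1010"))  # 10
print(int("1010", 2))             # 10, using the built-in parser
```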
To convert Binary to Hexadecimal, group the bits into sets of four, starting from the right, and replace each group with its hex digit. Binary: 11011111 → Group: 1101 1111 → Hex = DF
In the other direction, expand each hex digit into its four-bit group. Hex: 3C → 0011 1100 (Binary)
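Both directions of this grouping trick in a small Python sketch (the function names are illustrative; the output omits the spacing between four-bit groups):

```python
def binary_to_hex(bits: str) -> str:
    """Convert a binary string to hex by grouping bits in fours."""
    # Pad on the left so the length is a multiple of 4.
    bits = bits.zfill((len(bits) + 3) // 4 * 4)
    groups = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    # Turn each 4-bit group into one uppercase hex digit.
    return "".join(format(int(g, 2), "X") for g in groups)

def hex_to_binary(hexstr: str) -> str:
    """Expand each hex digit into its four-bit binary group."""
    return "".join(format(int(d, 16), "04b") for d in hexstr)

print(binary_to_hex("11011111"))  # DF
print(hex_to_binary("3C"))        # 00111100
```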
| System | Base | Digits Used | Best Use |
|---|---|---|---|
| Decimal | 10 | 0–9 | Used by humans |
| Binary | 2 | 0, 1 | Used by machines |
| Octal | 8 | 0–7 | Unix permissions |
| Hexadecimal | 16 | 0–9, A–F | Programming, debugging |
In the next chapter, we’ll explore Software vs Hardware — understanding the difference between physical components and digital instructions.