When you think of binary in terms of numbers and pictures on a screen, the numbers get staggering.
Binary is base 2 math (0-1). Hexadecimal is base 16 math (0-F).
Take this simple color -
[Attached image: a solid green color swatch]
The RGB (Red Green Blue) levels for that are 51, 255, 51 - the values you might see in the color picker of a program like Paint.
The hexadecimal code for that color is 33ff33. In other words, if you were creating a Web page, the number you would type in the line of code for the color you wanted in that area would be 33ff33.
The binary for that is 00110011 11111111 00110011 - what the computer sees that color as.
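If you want to check that yourself, here is a minimal Python sketch (the color values are just the example above) that converts the RGB levels into the hex code and the binary the computer works with:

```python
# Example RGB levels for the green swatch above
r, g, b = 51, 255, 51

# Hex code a web page would use: each channel becomes two hex digits
hex_code = "{:02x}{:02x}{:02x}".format(r, g, b)
print(hex_code)  # 33ff33

# Binary form: each channel becomes 8 bits (1 byte)
binary = " ".join("{:08b}".format(channel) for channel in (r, g, b))
print(binary)  # 00110011 11111111 00110011
```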
Not counting size, shape, and position, which all have their own 1's and 0's associated with them, that simple green color requires the computer to process 24 binary digits. Each digit is one bit, and 8 bits make one byte, so 3 sets of 8 bits means you have 3 bytes. 1,000 bytes equal one kilobyte (KB). 1,000 KB equal 1 megabyte (MB).
So, a picture that takes up 10 MB of space equals 10,000 KB. That equals 10,000,000 bytes. That equals 80,000,000 bits. That's 80,000,000 1's and 0's a computer has to process to display a relatively simple image!
Want to do that math for a high-def, 1GB (1,000 MB) image?!?!
1 GB = 1,000 MB = 1,000,000 KB = 1,000,000,000 bytes = 8,000,000,000 bits
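As a rough sketch of that arithmetic in Python (using the same decimal 1,000-based units as above; the function name is just for illustration):

```python
BITS_PER_BYTE = 8

def size_in_bits(megabytes):
    # Decimal units, as used above: 1 MB = 1,000 KB = 1,000,000 bytes
    kilobytes = megabytes * 1_000
    total_bytes = kilobytes * 1_000
    return total_bytes * BITS_PER_BYTE

print(size_in_bits(10))     # 80,000,000 bits for the 10 MB picture
print(size_in_bits(1_000))  # 8,000,000,000 bits for the 1 GB image
```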
Now, in terms of your original post, think of how many bits that means are floating around the Internet...!!!!