Block F

Two things describe computers, and one sentence captures both:

Computers are dumb piles of switches.

Bits and Bytes

A bit is a binary digit: a 0 or a 1. It is also a unit of information, the amount revealed by a fair coin flip.

By convention:

A byte is 8 bits; it is the basic unit of memory in a computer.

Information is bits + context. Everything is stored as bits. The context is HOW they are read.

information = bits + context
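
To see context at work, here is a minimal Python sketch (an illustration, not part of these notes): the same four bytes, read under three different contexts, yield three different meanings.

    import struct

    raw = bytes([0x42, 0x69, 0x74, 0x73])   # the same four bytes every time

    print(raw.decode("ascii"))               # read as text: 'Bits'
    print(int.from_bytes(raw, "big"))        # read as an integer: 1114207347
    print(struct.unpack(">f", raw)[0])       # read as a 32-bit float: about 58.4

The bits never change; only the rule for reading them does.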

Numbers

Problem: Given collections of objects A and B, tell which has more items in it without counting.

Solution: Pair them off, one item from A with one item from B. Whichever collection has items left over has more; if neither does, they are the same size.

This "same-sizeness" is the real, abstract notion of number.

The symbol 256 is a representation of a number. Our numbering system is a place-value system with base 10. We parse it as follows.

256 -> 2 units of 100, 5 units of 10, 6 units of 1.

parse (v.t.): to extract meaning from symbols

256 = 2 × 10² + 5 × 10¹ + 6 × 10⁰
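
Here is that parse as a short Python sketch (parse_decimal is an invented name): scan the symbols left to right, multiplying the running value by the base at each step. The repeated multiplication by 10 is what gives the earlier symbols their larger units.

    def parse_decimal(symbols):
        value = 0
        for ch in symbols:                        # e.g. '2', then '5', then '6'
            value = value * 10 + "0123456789".index(ch)
        return value

    print(parse_decimal("256"))                   # -> 256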

Why 10? Because we have ten fingers! Computers don't have fingers. They are big dumb piles of switches that understand "on" and "off", or 1 and 0. Computers use base 2 instead of base 10.

How do we go between decimal and base 2 (binary) representations of numbers?

Greedy (big-endian) algorithm: To count out an integer number of dollars in US currency with as few bills as possible, use this scheme: use as many of the largest bill as you can, then move down a denomination and repeat until the whole amount is counted.

Let's use the greedy algorithm to count out $374.

[picture of US bills]
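
By hand: 3 × $100 leaves $74, 1 × $50 leaves $24, 1 × $20 leaves $4, and 4 × $1 finishes the count, 9 bills in all. The same scheme as a Python sketch (it assumes the denominations 100, 50, 20, 10, 5, 1 and ignores $2 bills):

    BILLS = [100, 50, 20, 10, 5, 1]               # largest denomination first

    def make_change(amount):
        counts = {}
        for bill in BILLS:                        # use as many of each as fit,
            counts[bill], amount = divmod(amount, bill)   # then move down
        return counts

    print(make_change(374))   # -> {100: 3, 50: 1, 20: 1, 10: 0, 5: 0, 1: 4}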

We can use this scheme to convert a decimal representation of a number to binary. The "denominations" are the powers of 2. With powers of 2 we never need more than one of each, since two of the same power add up to the next power; that is why each digit comes out a 0 or a 1.

Let's run the greedy algorithm on the decimal number 73. (The table below runs bottom-to-top: start at the 128 row with 73 left.)

power  bit  left over
  1     1      0       use a 1 and we are done
  2     0              can't use a 2
  4     0              can't use a 4
  8     1      1       use an 8, 1 is left
 16     0              can't use a 16
 32     0              can't use a 32
 64     1      9       use a 64, 9 is what's left
128     0     73       can't use a 128, so we enter a 0 in the bit column

Read the bit column UP, and the binary representation of 73 is 1001001. By convention we write this with the prefix 0b: 0b1001001.
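
The same greedy loop as a Python sketch (to_binary is an invented name), with the powers of 2 playing the role of the bills:

    def to_binary(n):
        power = 1
        while power * 2 <= n:         # find the largest power of 2 that fits
            power *= 2
        bits = ""
        while power >= 1:
            if n >= power:            # "use" this denomination
                bits += "1"
                n -= power
            else:                     # can't use it: enter a 0
                bits += "0"
            power //= 2
        return "0b" + bits

    print(to_binary(73))              # -> 0b1001001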

This is the big-endian algorithm, because we go from the big end (the largest denomination) to the little end.

Little-endian: next time