
computers








understanding literacy...

...the conundrum:

...when it comes to understanding the preliterate past, we modern folk are hopelessly automobilized. The written word is the mechanism by which we know what we know. It organizes our thought. We may wish to understand the rise of literacy both historically and logically, but history and logic are themselves the products of literate thought.

James Gleick 1
but we'll try:

“...when the word is instantiated in paper or stone, it takes on a separate existence as artifice. It is a product of tools, and it is a tool. And like many technologies that followed, it thereby inspired immediate detractors. One unlikely Luddite2 was also one of the first long-term beneficiaries. Plato (channeling the nonwriter Socrates) warned that this technology meant impoverishment. (...) The power of the first artificial memory was incalculable: to restructure thought, to engender history. It is still incalculable, though one statistic gives a hint: whereas the total vocabulary of any oral language measures a few thousand words, the single language that has been written most widely, English, has a documented vocabulary of well over a million words, a corpus that grows by thousands of words a year.”

it also allowed for statements of law and economic agreements and, “gave rise to discourse about discourse. Written texts become objects of a new sort of interest.” these new disciplines required a reconfiguring of existing vocabulary into new vocabulary (topics once meant place, structure once only referred to buildings, categories once meant accusations or predictions). “The persistence of writing made it possible to impose structure on what was known about the world and, then, on what was known about knowing. As soon as one could set words down, examine them, look at them anew the next day, and consider their meaning, one became a philosopher” The preliterate could not conceive of, “the idea of beauty in itself rather than many beautiful things, nor anything conceived in its essence instead of the many specific things.”

once u can organize experience in terms of categories rather than events; once u can embrace the discipline of abstraction; once u’re literate, u can start thinking. “The written word—the persistent word—was a prerequisite for conscious thought as we understand it.”1

logic

Taking the next step on the road of abstraction, Aristotle deployed categories and relationships in a regimented order to develop a symbolism of reasoning: logic—from λόγος (logos), the not-quite-translatable word from which so much flows, meaning “speech” or “reason” or “discourse” or, ultimately, just “word.”

Logic might be imagined to exist independent of writing—syllogism can be spoken as well as written—but it did not. Speech is too fleeting to allow for analysis. Logic descended from the written word, in Greece as well as India and China, where it developed independently.

Logic turns the act of abstraction into a tool for determining what is true and what is false: truth can be discovered in words alone, apart from concrete experience.

James Gleick 1


Gleick, referencing Ong’s research, explains the difference between how literate && preliterate people think, “Logic implicates symbolism directly: things are members of classes; they possess qualities, which are abstracted and generalized. Oral people lacked the categories that become second nature even to illiterate individuals in literate cultures: for example, for geometrical shapes. Shown drawings of circles and squares, they named them as “plate, sieve, bucket, watch, or moon” and “mirror, door, house, apricot drying board.” They could not, or would not, accept logical syllogism. A typical question:

In the Far North, where there is snow, all the bears are white.
Novaya Zembla is in the Far North and there is always snow there.
What color are the bears?

Typical response: ‘I don’t know. I’ve seen a black bear. I’ve never seen any others… Each locality has its own animals.’ By contrast, a man who has just learned to read and write responds, ‘To go by your words, they should all be white.’ To go by your words—in that phrase, a level is crossed. The information has been detached from any person, detached from the speaker’s experience. Now it lives in the words, little life-support modules. Spoken words also transport information, but not with the self-consciousness that writing brings. Literate people take for granted their own awareness of words, along with the array of word-related machinery: classification, reference, definition. Before literacy, there is nothing obvious about such techniques.”1



Mathematical Analysis of Logic





George Boole


In 1847 George Boole published the pamphlet The Mathematical Analysis of Logic. He later regarded it as a flawed exposition of his logical system, and wanted An Investigation of the Laws of Thought (1854), on Which are Founded the Mathematical Theories of Logic and Probabilities, to be seen as the mature statement of his views. Contrary to widespread belief, Boole never intended to criticise or disagree with the main principles of Aristotle's logic. Rather he intended to systematise it, to provide it with a foundation, and to extend its range of applicability. 3
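Boole's core insight can be sketched in a few lines of code: let true be 1 and false be 0, and logic becomes plain arithmetic ( this is the modern reading of his algebra over {0, 1}, not his exact notation ):

```python
# logic as arithmetic over 0 and 1
def AND(x, y):
    return x * y          # 1 only when both inputs are 1

def OR(x, y):
    return x + y - x * y  # 1 when either ( or both ) is 1

def NOT(x):
    return 1 - x          # flips 0 and 1

# "if p then q" as arithmetic: false only when p is true and q is false
def IMPLIES(p, q):
    return OR(NOT(p), q)

print(AND(1, 1), OR(0, 0), NOT(1), IMPLIES(1, 0))  # 1 0 0 0
```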













symbolic (logic) Analysis of Circuits






a circuit is just a bunch of wires and components ( resistors, transistors, diodes, etc. ) connected together w/ electricity running through it, arranged in such a way that allows for simple or complex operations ( from dimming a light bulb to running robots ).

"The real savior of Boolean logic was born a century after Boole: Claude Shannon4, a Bell Labs engineer working with telephone switches. In his 1938 thesis, A Symbolic Analysis of Relay and Switching Circuits, he laid out Boole's AND, OR, and NOT functions as electrical circuits—the first Logic Gates" 2





In his 1937 master's thesis, Shannon laid the groundwork for electronic digital computing and practical digital circuit design by demonstrating that Boolean algebra, applied to electrical switches, could resolve logical and numerical relationships. Shannon later built his maze-solving mouse, Theseus, in 1950.
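Shannon's trick, roughly sketched in code: two switches in series only pass current when both are closed ( that's AND ), and two switches in parallel pass current when either is closed ( that's OR ). Here 1 means closed / current flows and 0 means open:

```python
# relay circuits as Boolean logic, per Shannon's observation
def series(a, b):    # relays wired in series behave like an AND gate
    return a & b

def parallel(a, b):  # relays wired in parallel behave like an OR gate
    return a | b

def inverter(a):     # a normally-closed relay behaves like a NOT gate
    return a ^ 1

print(series(1, 1), parallel(0, 1), inverter(1))  # 1 1 0
```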







numbers


Cuneiform numbers, one of the earliest known systems of writing 1

symbols




“Numbers are certainly the most abstract codes we deal with on a regular basis. When we see the number 3 we don’t immediately need to relate it to anything. We might visualize 3 cats or 3 of something else, but we’d be just as comfortable learning from context that the number refers to a child’s birthday, a television channel, a hockey score, or the number of cups of flour in a cake recipe. Because our numbers are so abstract to begin with, it’s more difficult for us to understand that this number of cats: 🐱 🐱 🐱 doesn’t necessarily have to be denoted by the symbol 3” 5







♞ + ★ = ☢
☎ - ☯ = ?
☢ / ☀ = ?
♎ * ☀ = ?
☢ + ♞ = ?
♞ * ♞ = ?
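the symbols above are arbitrary, and so are their values; here's a sketch in python with one made-up assignment ( these numbers are invented purely for illustration ), just to show that any mapping works as long as everyone agrees on it:

```python
# an arbitrary, made-up mapping from symbols to quantities --
# the point is the agreement, not the particular numbers
value = {"♞": 2, "★": 3, "☢": 5, "☀": 1, "☎": 9, "☯": 4, "♎": 7}

def decode(symbol):
    return value[symbol]

# under this mapping the given equation ♞ + ★ = ☢ holds: 2 + 3 == 5
print(decode("♞") + decode("★") == decode("☢"))  # True
print(decode("♞") * decode("♞"))                 # 4
```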



Ten is Arbitrary






“Ten is an exceptionally important number to us humans. Ten is the number of fingers and toes most of us have. Because our fingers are convenient for counting, we humans have adapted an entire number system that’s based on the number ten. The number system we use is called decimal, which is base ten. This number system seems so natural to us that it’s difficult at first to conceive of alternatives.”5




0





base 8 ( or Octal )


0





base 2 ( or Binary )


0
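a quick sketch of the same quantity written in all three bases ( python's built-in format() and int() do the conversions ):

```python
# one quantity, three notations -- only the symbols change
n = 42
print(format(n, "d"))  # decimal: 42
print(format(n, "o"))  # octal:   52      ( 5*8 + 2 )
print(format(n, "b"))  # binary:  101010  ( 32 + 8 + 2 )

# and back again: int() can read a string in any base
print(int("52", 8))      # 42
print(int("101010", 2))  # 42
```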






a thinking machine



relay




from relays to gates



useful combos



adding machine





a real 4bit adder


( made of transistor gates not relay gates )




( a single full adder close up )
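the same adder can be sketched with gates as functions instead of transistors: a full adder is two XORs, two ANDs, and an OR, and chaining four of them ( each passing its carry to the next ) gives a 4 bit ripple-carry adder:

```python
# one full adder: adds two bits plus a carry-in
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                        # sum bit ( XOR of all three )
    carry_out = (a & b) | ((a ^ b) & carry_in)  # carry bit
    return s, carry_out

# four full adders chained together, least significant bit first
def add_4bit(a_bits, b_bits):
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result, carry  # final carry is the overflow bit

# 3 ( 0011 ) + 5 ( 0101 ) = 8 ( 1000 ), written least significant bit first:
print(add_4bit([1, 1, 0, 0], [1, 0, 1, 0]))  # ([0, 0, 0, 1], 0)
```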

Interlude: Moore's Law

Moore's Law is the observation by Intel co-founder Gordon Moore that "the number of transistors that can be fitted onto a circuit doubles approximately every two years. This is why the computer you bought last year is now pathetically huge and lumbering compared to the sleek whizzy new one coming out next month."6





1930s early digital computing machines using relays
1940s first computers with tubes
1950s the transistor hits the scene
1960s integrated circuits can hold a few transistors
1970s integrated circuits can hold thousands of transistors
1980s integrated circuits can hold tens of thousands of transistors
1990s integrated circuits can hold hundreds of thousands of transistors
2000s integrated circuits can hold millions of transistors

today a microprocessor ( integrated circuit ) that fits in the palm of your hand contains billions of transistors arranged into logic gates
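a back-of-the-envelope sketch: start from the Intel 4004's roughly 2,300 transistors in 1971 and double every two years; the literal doubling overshoots real chips somewhat, but the order of magnitude is the point:

```python
# compounding Moore's law from the Intel 4004 ( ~2,300 transistors, 1971 )
transistors = 2300
for year in range(1971, 2021, 2):  # 25 doublings, two years each
    transistors *= 2
print(f"{transistors:,}")  # 77,175,193,600 -- well past billions by 2020
```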

variable memory





automation





putting it all together







i said before that the 4 bit adder, while being a digital arithmetic circuit, isn’t “technically” a computer. this is because it can’t do what a “Turing Machine” can do. A Turing Machine is a hypothetical machine that serves as the criterion for what counts as a computer. It was conceptualized by one of the most brilliant mathematical minds in history, Alan Turing, in his 1936 paper On Computable Numbers. Big thanks to Alan Turing for guiding us into a world full of computers and free of Nazis.
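a Turing Machine is simple enough to sketch in a few lines: a tape of symbols, a head that reads and writes one cell at a time, and a table of rules mapping ( state, symbol ) to ( new symbol, move, new state ). the rule table below is made up purely for illustration ( it just inverts every bit on the tape ):

```python
# a minimal Turing Machine simulator
def run(tape, rules, state="start"):
    tape, head = dict(enumerate(tape)), 0  # the tape extends with blanks
    while state != "halt":
        symbol = tape.get(head, " ")               # read the cell under the head
        write, move, state = rules[(state, symbol)]
        tape[head] = write                         # write a new symbol
        head += 1 if move == "R" else -1           # move the head one cell
    return "".join(tape[i] for i in sorted(tape)).strip()

# an example rule table: flip every bit, halt on a blank cell
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}
print(run("1011", invert))  # 0100
```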








the first* full blown computer (1948)



*( the idea of “firsts” is always a myth; every development in any field is a single node in the middle of a long conversation.
for a more detailed description of the evolution of early computers see wikipedia )



Interlude:

Computers ≠ Electricity

there’s nothing inherent to the concept of computing ( as described by Turing ) that requires electricity ( it just so happens that electric current flows pretty fast, which makes it useful for doing things really quickly ). but in theory we could recreate our 4 bit adding machine out of dominoes.





the analytical engine





you could also ( in theory ) create a computer using gears and levers powered by steam, which is exactly what Charles Babbage designed in Victorian Era England, 100 years before the first computers were made. In a sense Babbage was the first computer engineer and his Analytical Engine ( had it been completed ) the first computer, capable of doing all the things Turing Machines require, as proven/articulated by Ada Lovelace who, in her notes on the Analytical Engine, described how the Engine could be used to calculate all kinds of things. She detailed the specific instructions necessary for “programming” the Engine, in a sense making her the first computer programmer.






( if u're really curious, here's Sydney Padua's breakdown of how it would have worked )



“Again, it [the Analytical Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine . . . Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.”

Ada Lovelace 7




Numerical Representation

In the interest of cross-platform compatibility ( i.e. so you can share files from one computer to another regardless of what kind of computer it is ) various organizations exist that create standards; an incomplete list of computer standards is available on wikipedia. One of the oldest standards is ASCII ( American Standard Code for Information Interchange ), formalized in 1967, which standardized which numerical values correspond to which characters.
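for example, in python the built-in ord() and chr() expose the ASCII mapping directly, and format() can show each character as the 8 bit byte it's stored in:

```python
# the standardized number-to-character mapping in action
print(ord("A"))  # 65
print(chr(97))   # a

# a string as 8-bit binary, one byte per character
def to_binary(text):
    return " ".join(format(ord(c), "08b") for c in text)

print(to_binary("Hi"))  # 01001000 01101001
```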


3 byte pixel




red 0 00000000

green 0 00000000

blue 0 00000000
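a sketch of the 3 byte pixel in python ( the helper function and the example color are just for illustration ): one byte, 0 to 255, per channel.

```python
# a pixel is just three bytes: red, green, blue
def pixel_to_binary(r, g, b):
    return " ".join(format(channel, "08b") for channel in (r, g, b))

# all three bytes at zero ( the pixel above ) is black:
print(pixel_to_binary(0, 0, 0))      # 00000000 00000000 00000000

# full red, half green, no blue is a shade of orange:
print(pixel_to_binary(255, 128, 0))  # 11111111 10000000 00000000
```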



binary to txt/image editor

homework



of all the incredibly interesting people, inventions and ideas we covered today pick one, do a little research, dig a little deeper into it, and share with us one interesting fact about it/her/him not covered in class. then convert that sentence into binary code ( 8 bits [ 1 byte ] per character ). submit it below.






then, complete the online hello processing tutorial



endnotes

  1. Gleick, James. The Information: A History, A Theory, A Flood. Pantheon Books. 2011.
  2. Padua, Sydney. The Thrilling Adventures of Lovelace and Babbage. Pantheon Books. 2015.
  3. from Wikipedia. George Boole. https://en.wikipedia.org/wiki/George_Boole#Symbolic_logic
  4. Claude Shannon demonstrates machine learning
  5. Petzold, Charles. CODE, The Hidden Language of Computer Hardware and Software. Microsoft Press. 2000.
  6. Padua, Sydney. The Thrilling Adventures of Lovelace and Babbage. Pantheon Books. 2015.
  7. Lovelace, Ada. Sketch of the Analytical Engine. Bibliotheque Universelle de Geneve. 1842.