Tuesday, 25 November 2014

Ontario's Fair Accommodation Practices Act

In response to what we would probably call lobbying today, the Government of Ontario passed the Fair Accommodation Practices Act on April 6, 1954. It stated that "No one can deny to any person or class of persons the accommodation, services or facilities usually available to members of the public."

On October 29, 1954, Bromley Armstrong and Ruth Lor went with a journalist to Morley McKay's restaurant in Dresden, Ontario, where they were refused service. The restaurant and its owners were eventually convicted of violating the Fair Accommodation Practices Act and fined (the owners were also forced to agree to serve anyone). For what was probably the first time in Canada, a law was enforced which supported the victims of racial discrimination.

(Canadian) Prairie Towns photos

Here's a site that displays historical photos of various towns and a few cities in Alberta and Saskatchewan. The photos of the quite tall railway bridge at Outlook, SK are probably my favourites.

http://prairie-towns.com/

Friday, 14 November 2014

History is written by historians

Attribution

I wrote this way back in 2002. The original is located at http://www.everything2.com/user/dabcanboulet/writeups/History+is+written+by+historians



With due respect to whoever wrote "history is written by the victors", I'm going to try to explore the question of "who writes history?" from a different perspective.
In today's world of instant history, as served up by the likes of CNN, we might be fooled into believing that the process of writing history after the events in question are over has somehow become obsolete. Nothing could be further from the truth. In very real terms, it isn't possible to write a history of anything until well after the dust settles. There are a number of good reasons why this is true, including:
  • the significance of the event can only be measured once the impact of the event has been felt.
  • the reasons behind the event are rarely as clearly defined as we believe them to be while the event is underway.
  • how the event fits into what else is going on is unlikely to be apparent until we are able to step back and view the events of the time from a distance.
  • the demands of military secrecy, the fluidity of events, the impossibility of having a neutral observer in all the right places, and more mundane factors like avoidance of embarrassment make it impossible to get any sort of accurate picture of "what happened" while the event is underway or is still part of "recent history".
  • we are unlikely to be able to view the event with sufficient objectivity until well after the event.
  • we are unlikely to be able to find witnesses who can describe the events objectively until well after the event (witness testimony is useful but rarely definitive as it represents recollections; in contrast, documents don't (usually) lie or forget).
Let's take a look at some recent and not-so-recent events to see when their history could (or will) have been written:

September 11, 2001

There is no doubt that the events of 2001/09/11 were major events. There is also no doubt that much has been written about 9-11 and a lot more is yet to come. What should also be clear is that little if any of what has been written to date about 9-11 could be fairly called history (it's current events). It will be a few years before anyone is able to write something that could truly be called a history of 9-11. In addition to other reasons (e.g. still too emotional an event to write about dispassionately), there's too much that we don't know about exactly what happened:
  • how the perpetrators planned and executed the plot
  • who helped them
  • who "looked the other way" (i.e. deliberately kept quiet while knowing actual details about the plot)
  • how the plot was missed by the intelligence and law enforcement services (I'm not suggesting that they screwed up or that they did everything correctly; I'm only saying that we just don't know yet (no matter how much we wish or believe that we do))
and the list goes on! 
(keep in mind that this was written in 2002; I don't intend to update this section or even the article as a whole as the events that I discuss are really just vehicles that I am using to make my case)

The Gulf War

The Gulf War happened over ten years ago, yet we still aren't able to write any sort of complete history of the war. The reasons for this go beyond not knowing what happened (even though there are bound to be significant aspects of the war which are still shrouded in military secrecy or national security, and I'm not just referring to U.S. military secrecy or national security here).
Other reasons include:
  • the war, as an event in our time, isn't over yet:
    • although Iraq has been defeated in military terms in the 2003 Iraq War, nobody is likely to argue that the 2003 Iraq War is truly over yet.
    • the U.S. motivation for attacking Iraq in 2003 was clearly at least partially connected to the fact that the elimination of Iraq as a threat to the region was not accomplished during the Gulf War.
  • it is still impossible to get any sort of detailed view of events from the Iraqi perspective, the Kuwaiti perspective, the American perspective or the U.N. perspective (to name just a few of the important perspectives).
Need I say more?
I guess that I'll say more: the fact that I've had to almost totally re-write this section on the Gulf War since this w/u was first written in the fall of 2002 strikes me as pretty solid evidence that it isn't time to write a proper history of the Gulf War yet!

The Vietnam War

Some well written histories of the Vietnam War are starting to appear but it's still a little early to be able to say that anything resembling a definitive history has been written or can even be written yet. For example:
  • many of the key human participants are still alive or recently deceased. The risk of being accused of libel or slander or just plain speaking ill of the dead is still too great to expect all of the relevant personal-perspective information to become available.
  • although this is changing, there are still major events surrounding the war, e.g. events of the Cold War, which are still at least partially if not entirely clouded in secrecy.
  • The impact of the war on America is still too great (and emotional) for a history to be viewed as being dispassionate (even if it is!).
  • Vietnam and the West are just starting to develop the sort of communication pathways and exchanges necessary for any real understanding of the context of the war from the Vietnamese perspective (either North or South) to develop.
It's still too early for the Vietnam War.

World War II

Surely enough time has gone by for comprehensive histories of the Second World War to be written? Well, although I'll admit that we may seem to be getting close to that point in time, I'm going to argue that we aren't there yet. For example:
  • The British Government has a policy of keeping certain categories of information secret until 50 or even 75 years after the individuals involved have died. Many of the key military and political leaders during the war haven't been dead for 50 years yet.
  • Some have argued that the fall of the Berlin Wall marked the end of the Second World War. That's not very long ago!
  • To take a possibly extreme position, there are still governments-in-exile dating back to the war. For example, check out http://www.net2000.com.au/customers/danzig/index.html to see the web site for the Free State of Danzig / Government-in-Exile. Some might argue that it is time for these folks to give up the fight. Feel free to suggest to them that they give up the fight . . . (and before you do, consider the governments-in-exile of many former Warsaw Pact countries that survived to see their countries freed after 40 years).
  • There's an astounding amount of documentary material about the war. Much of it has never even been looked at since the war and some of it is still secret. Who knows what we might learn once it has all been explored. For example, it would be interesting and possibly even important (from the perspective of understanding "war" and "warfare" as concepts) to really know why Germany hesitated long enough for the miracle of Dunkirk to happen and Canadians are still wondering and learning about why the Dieppe raid happened. Many theories exist on both topics. Some of them are backed up by various forms of documentary evidence. As far as I know, nobody really knows. There are other events, many of which are more important than Dunkirk or Dieppe, which could conceivably be explained by documents that have yet to be studied.
Just to be clear - many good, solid and reasonably comprehensive histories of the Second World War have been written. It just seems a little early yet to claim that the task is done. Lest we appear arrogant, maybe we should give it a few more decades.

World War I

Quite comprehensive histories of the First World War have been written. Putting aside the argument that WWI and WWII were really just one war with a twenty year resting period in the middle, the events of the First World War have been well documented by historians around the world. Still, there are grounds to suggest that a little more time is needed:
Britain recently pardoned some of the men shot for cowardice who were really suffering from battle shock. If we've just come to understand the difference between the two, then are we really ready to try to explain why troops did what they did during various events of the war? (I'm pushing things a bit here, but could a history of the First World War which described these victims of battle shock as deserters have been truly comprehensive or even accurate?)
On the other hand:
Britain mourns today (2009/07/25) the death of Harry Patch, who was considered by some to have been the last surviving British veteran of WWI (this point of view would seem to ignore another British veteran of the Great War, Claude Choules (108), who lives in Australia). The last surviving American veteran of the war is Frank Woodruff Buckles (108). There are not believed to be any French or German veterans left alive.
Then again . . . what sort of interesting documents will appear as the various 25, 50 and 75 year post-event and post-death-of-participant secrecy rules allow even more documents to come to light? Hmmmm . . .
So when DOES it become possible to write a good comprehensive history of an event? That's a pretty hard question to answer (i.e. sorry - no answer today) but one thing should be clear - real history isn't written while an event is unfolding!
Sidebar: I've cheated a bit here. If I had selected less important mid to even late 20th century events then we'd find that many are ready to have comprehensive histories written about them. The existence of relatively recent events that are ready doesn't affect the essential point - if the reporters are still "on the story" then the "story" isn't ready to become the target of a comprehensive history.
Let's consider the (relatively recent) events of September 11, 2001 again - although some of what is being written today about 9-11 is at least a valiant attempt at writing history, much if not all of what is being written today about 9-11 is still news or current events.
The distinction between reporting news, reporting current events and writing history is important:
  • reporting news is describing what is happening now or in the very recent past in order to inform the reader/viewer about what's going on in their world.
  • reporting current events is the process of trying to explain what is going on today or in the recent past by placing it into a context (think of this as the full page articles that you read in the Sunday section of the paper).
  • writing history is the process of trying to describe events in their entirety including what happened, why it happened and what impact it had.
If one is to write history then one must be in a position to at least attempt to answer the big questions:
  • what happened?
  • where did it happen?
  • when did it happen?
  • who was involved?
  • why did it happen?
  • how did it happen? (e.g. external factors and the influence of random events)
  • what impact did it have?
A news reporter's job is to focus on "what", "where" and "who". They don't get paid to figure out "why", and "when" is usually pretty obvious. A current events reporter's job is (arguably) to focus on "why" and/or "how" with a certain amount of effort being expended on the rest of the questions. It is the historian who must step up and deal with them all.
As we struggle our way through the deluge of news and current events reporting, it is important that we keep in mind that history is written by historians.

A side story

Winston S. Churchill had a substantial impact on the 20th century and he found himself making history fairly often. Churchill was also a historian. He wrote many history books including a biography of his ancestor the first Duke of Marlborough called Marlborough: His Life and Times, the four-volume A History of the English-Speaking Peoples, a six-volume history of the First World War called The World Crisis and a six-volume History of the Second World War.
The last two, The World Crisis and History of the Second World War, are particularly interesting as Churchill played a major role in both wars. Let's focus on the History of the Second World War books as his role in this war is, presumably, reasonably familiar to all of us. When Churchill set out to write his history of WWII, he knew exactly what he was doing (read the Preface to his History of the Second World War if you're in any doubt):
  • he was documenting the war from a truly unique perspective (i.e. providing a service to history that he was uniquely able to provide).
  • he was writing such a comprehensive work that he was essentially setting the context within which the Second World War would be viewed.
  • he was getting his story out before anyone else got around to writing it.
It was precisely because he was a historian that he wrote what he wrote when he wrote it. After all, his earlier works certainly established his credentials as a historian, and his place in history was certainly secure! (clarification: his A History of the English-Speaking Peoples was published after his History of the Second World War, although he'd already published quite a few other histories prior to the war)
One question that comes to mind is "should he have written his History of the Second World War?". The answer is, obviously, YES - if for no other reason than that he had a unique perspective on the war.
Maybe a better question is - "did he write it too early?". Unfortunately, time was running out - he became Prime Minister of Great Britain at the age of 65. If he didn't write it soon after the war ended then it wasn't going to get written by him and he was determined that he was going to write it.
On the other hand, if one asks "was it written too early?" then the answer is quite different. Churchill, of course, knew many MANY secrets about WWII which he wasn't at liberty to reveal when he wrote his history of the war. Consequently, his history couldn't possibly be comprehensive as he had to leave holes in the story or even tell lies to protect the secrets. His histories of the Second World War were written too early in the sense that they aren't as comprehensive or as authoritative as they appear to be or as one might like them to be. Let's be VERY clear: I'm not criticizing Sir Winston S. Churchill here. I'm just making the point that even someone with Churchill's perspective on WWII wasn't able to write a comprehensive history so soon after the end of the war.
One final note: if you've never read any of Churchill's work then you've been missing a real treat. He's an excellent writer who really tells a great story. If you don't know where to start, here are a few suggestions:
  • His A History of the English-Speaking Peoples is a delight to read. Even though the volumes are best read in sequence, they can be read separately. The books in the series are:
    • The Birth of Britain
    • The New World
    • The Age of Revolution
    • The Great Democracies

    Just pick one and read it - you won't be disappointed.
  • His History of the Second World War is truly amazing. His personal perspective on the war allowed him to be both authoritative and comprehensive in a way that nobody else could be (even given the constraints that he was writing under) and his dry wit and humour show through at the most unexpected moments. For example, you'll often forget that you're reading his work and almost start to imagine that Winston is sitting in your living room telling you about the war, and then something like the following might happen (true story):
    I was reading about one of the battles involving Tobruk and got to an excerpt from a cable which talks about making water dispotable. He then explains (in a footnote) that this means 'to make water unfit to drink' and then he apologizes for the use of the word.
    I stopped and thought to myself - "Sir Winston S. Churchill has just apologized to ME! Wow!"
    The books in this series are:
    • The Gathering Storm
    • Their Finest Hour
    • The Grand Alliance
    • The Hinge of Fate
    • Closing the Ring
    • Triumph and Tragedy

    These books are definitely best read in sequence unless you're already quite familiar with the war. Even if you don't manage to finish them (about 5,000 pages), you'll still enjoy what you read and you'll learn things that you never knew before.
  • Here's one that you will be able to finish - Savrola is his only novel. It's only a few hundred pages long and anyone over the age of about eleven years old who enjoys an adventure story will find it quite enjoyable.
Bias alert: Winston S. Churchill is, without a doubt, my favourite author.

An amusing on-point quote: “History is written by historians, when politicians get involved, it's always bad news” by Manuel Fraga, a minister under Franco and later a senator from the Popular Party (found in a New York Times article titled "Spain’s Dilemma: To Toast Franco or Banish His Ghost?" by Renwick McLean; published October 8, 2006).

P.S. I'm a computer geek who just happens to be interested in history and spends a fair bit of time reading history books and thinking about history.

Inefficient "Hello world" programs

Attribution

I wrote this way back in 2003. The original is located at http://www.everything2.com/user/dabcanboulet/writeups/Inefficient+hello+world+programs



It seems to me that any supposedly inefficient "Hello World" program which actually contains the string "Hello world\n" in its source code (excluding comments) is bogus since such a program could just print out the string directly.
Here's a program which doesn't contain the string "Hello world" anywhere except in the comments. It generates a string of twelve characters randomly selected from the set of lower and upper case letters, the space character and the newline character. It then checks if the string's MD5 checksum is equal to the hard-coded MD5 checksum of "Hello world\n". If the checksums match then it prints the string and terminates. Otherwise, it begins again with another randomly generated twelve character string.
There are 54^12, or 614,787,626,176,508,399,616, twelve-character strings containing the characters used by the program. If the program generates and MD5-checksums 100,000 random strings per second (roughly what a 2GHz Athlon should be capable of) then the correct string should appear after something on the order of one hundred million years.
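For those who want to check the arithmetic, here's a rough back-of-the-envelope version of that estimate (treating the expected wait as roughly one full pass through the search space):

    54^12 ≈ 6.15 × 10^20 candidate strings
    6.15 × 10^20 strings ÷ 100,000 strings/second ≈ 6.15 × 10^15 seconds
    6.15 × 10^15 seconds ÷ 3.15 × 10^7 seconds/year ≈ 2 × 10^8 years

i.e. a couple of hundred million years, the same order of magnitude as the estimate above.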

In fairness, it should be noted that this program could produce the wrong output as the MD5 checksum algorithm is not guaranteed to produce different hash values for different input values (even if the input values are as short as 16 bytes). See ariels' md5 hash function writeup for details.
The input value to the MD5 algorithm in this program is the 12-character string "Hello world\n". Since this string is four bytes shorter than the sixteen-byte value computed by the MD5 function, it isn't clear if there is a different 12-character string (using the same character set as this program) that results in the same MD5 hash value as "Hello world\n" generates. This may not seem very important but a program which is going to run for one hundred million years really should use an algorithm which is known to be correct!
P.S. Don't try this at home . . .

Here's a few questions for the cryptology folks or those with a few hundred million years of CPU cycles to burn:
  • An MD5 checksum is 16 bytes long. Each of the 256 different byte sequences of length one generates a different MD5 checksum. On the other hand, there must be many pairs of byte sequences of length seventeen which both generate the same MD5 checksum (see the counting sketch just after this list).
    Question: are there any pairs of byte sequences shorter than 16 bytes which both generate the same MD5 checksum?
    If the answer is no then the program below will, eventually, produce the correct answer. If the answer is yes then more work is required before we can tell if the program necessarily eventually produces the correct answer.
  • Is there a second twelve character string containing characters from the set a-z, A-Z, space and newline which generates an MD5 checksum equal to the one generated for the twelve character string "Hello world\n"?
    If the answer is yes then the program below might produce the wrong answer. If the answer is no then the program below will, eventually, produce the correct answer.
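For the length-seventeen claim in the first question, the counting argument is just the pigeonhole principle:

    byte sequences of length 17:    256^17 = 2^136
    possible MD5 checksums:         2^128

Since 2^136 > 2^128, many length-seventeen sequences must share a checksum with at least one other. No such counting argument applies to sequences shorter than 16 bytes, which is why the first question is an actual question rather than a counting exercise.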
One last question:
Assume:
  • that computers will continue to get faster and faster over time and that humanity survives long enough for the algorithm illustrated by this program to run to completion.
  • that whatever computer the program is started on will continue to operate without interruption until the program terminates successfully.
  • that computers continue to increase in speed at a long term average rate of a doubling in performance every two years.
  • that if you start the program on a current vintage (i.e. 2003) computer then it will take exactly 100 million years to get the intended result.
In which year's January should the program be started (on a then current computer) in order to get the intended result during that same year using a single core/thread on a single computer?
Hint: if all of the assumptions are true then many of the folks who read this writeup in 2003 will almost certainly still be alive when it comes time to start this program. See my homenode for the answer.
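For what it's worth, here is one way to set the calculation up (this is just my sketch of the approach, not necessarily the homenode answer): a computer bought in January of year Y should be about 2^((Y - 2003)/2) times faster than a 2003 machine, so the run should take about 10^8 / 2^((Y - 2003)/2) years. Requiring that to be less than one year gives:

    2^((Y - 2003)/2) > 10^8
    (Y - 2003)/2 > log2(10^8) ≈ 26.6
    Y > roughly 2056

i.e. the first January that works falls somewhere in the second half of the 2050s, which is at least consistent with the hint above.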

/*
 * Prints "Hello world\n" to stdout.
 *
 * How it works:  it generates strings of randomly selected letters,
 * spaces and newline characters and computes the MD5 checksum of
 * each string.  Once it gets a string whose MD5 checksum matches
 * the MD5 checksum of "Hello world\n", it prints the string and
 * terminates.
 *
 * N.B. This program uses the standard srandom and random functions to
 * generate the random strings.  Make sure that your implementation of
 * these functions has sufficient entropy to give you a decent chance
 * of generating the "Hello world\n" string!
 */

/*
 * md5data is a routine which computes the MD5 checksum of
 * the specified data bytes and returns the 16-byte checksum
 * into the 4 element array csum.
 * Many MD5 implementations are available on the 'net.  Pick your
 * favourite one (or use the one in the md5 hash function node) and
 * write an interface routine for it called md5data which takes
 * the parameters described below.
 */

#include <stdlib.h>     /* srandom, random, exit */
#include <string.h>     /* strlen */
#include <unistd.h>     /* write */

extern void md5data( void *data, int length, int csum[] );

/*
 * pre-computed MD5 checksum of "Hello world\n"
 */

int md5[4] = { 0xf0ef7081, 0xe1539ac0, 0x0ef5b761, 0xb4fb01b3 };

int main(void) {

    srandom(0);

    while (1) {

        char msg[13];
        int tmp[4];
        int i;
        for ( i = 0; i < 12; i += 1 ) {

            msg[i] =
            "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ \n"[
            random() % (2 * 26 + 2)
            ];
        }

        msg[12] = '\0';
        md5data(msg,strlen(msg),tmp);
        if ( md5[0] == tmp[0]
        &&   md5[1] == tmp[1]
        &&   md5[2] == tmp[2]
        &&   md5[3] == tmp[3] ) {

            write(1,msg,strlen(msg));
            exit(0);
        }
    }
}
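If you actually want to compile and run this (for a few seconds, anyway), here's a minimal sketch of the md5data interface routine, assuming you use OpenSSL's MD5() function (anything that produces a raw 16-byte digest will do). The big-endian packing of the digest bytes into the four ints is my assumption; it's the packing that matches the hard-coded md5[] values above.

#include <openssl/md5.h>

/*
 * A sketch of md5data built on OpenSSL's MD5() routine (link with
 * -lcrypto).  The 16 digest bytes are packed big-endian into the four
 * ints of csum so that the result can be compared directly against the
 * pre-computed md5[] table in the main program.  Adjust the packing if
 * your md5[] values were computed differently.
 */

void md5data( void *data, int length, int csum[] ) {

    unsigned char digest[MD5_DIGEST_LENGTH];
    int i;

    MD5( (const unsigned char *) data, (size_t) length, digest );

    for ( i = 0; i < 4; i += 1 ) {
        csum[i] = (int) (((unsigned int) digest[4 * i]     << 24)
                       | ((unsigned int) digest[4 * i + 1] << 16)
                       | ((unsigned int) digest[4 * i + 2] <<  8)
                       |  (unsigned int) digest[4 * i + 3]);
    }
}

Compile it together with the main program and link against -lcrypto.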

A Brief History of Computing

Attribution

I wrote this article way way back in 2003. The original is located at http://www.everything2.com/title/A+Brief+History+of+Computing


Really early computational devices

The earliest computational devices were actually memory aids in the form of wet clay tablets or even pebbles organized into piles which were used to perform and record calculations. The most famous such device is undoubtedly the abacus which was developed independently by the Chinese and the Romans.


The concept of an algorithm

The 4th century B.C. Greek mathematician Euclid's work The Elements contains the earliest recorded description of specific algorithms. This work also contains the earliest known attempt to formally define what an algorithm is.


The word algorithm

Abu Ja'far Muhammad ibn Musa al-Khwarizmi of Baghdad was the early 9th century author of a variety of quite important mathematical texts. His name, al-Khwarizmi, (say it quickly) is generally considered to be the source of the word algorithm.


The invention of binary encoding

The Inca civilization (founded in about 1200 AD, destroyed in 1532 by the Spanish) almost certainly used a form of binary encoding in their khipus. A khipu has a primary cord to which secondary cords are attached. Tertiary cords are sometimes attached to the secondary cords. The cords are knotted either individually or together (i.e. forming a weave-like pattern).
Although khipu were and are decorative objects, they also stored information. There is a seven-layer decision-making process which the maker of a khipu follows when creating it (whether to use cotton or wool, whether to use "spin" or "ply" cords, whether to hang the secondary cord from the front or the back of the primary cord, etc.). The first six decision points each have two possible values or results. The seventh decision point, colour, has twenty-four possible values. The result is a system capable of hierarchically encoding 1536 different values (2^6 × 24).
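Spelled out, the counting behind that 1536 goes like this:

    six two-way decisions   →  2^6 = 64 combinations
    one colour decision     →  24 possibilities
    total                   →  64 × 24 = 1536 distinguishable values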

Pascal's calculator

Blaise Pascal invented a mechanical calculator capable of adding and subtracting numbers in 1642. Pascal was awarded a royal monopoly by the King of France in 1645 to produce his pascaline calculator. The extreme difficulty of producing the many small components of each calculator resulted in it not being a commercial success. Only eight of the roughly fifty 'pascaline' calculators built by Pascal still exist today.
One interesting point is that Pascal's method of performing subtraction by adding the complement of the subtrahend to the minuend is (almost?) identical to how subtraction is performed by modern computers, although Pascal's device operated in base 10 (decimal) whereas modern computers operate in base 2 (binary).
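Here's a tiny illustration of the binary version of the trick as a modern machine does it (just the two's complement analogue, not a model of Pascal's decimal mechanism; the values are arbitrary):

#include <stdio.h>

/*
 * Subtract b from a by adding the complement of b instead -- the
 * binary (two's complement) analogue of Pascal's decimal trick.
 */

int main(void) {

    unsigned char a = 200;                                        /* minuend */
    unsigned char b = 55;                                         /* subtrahend */

    unsigned char complement = (unsigned char) (~b + 1);          /* two's complement of b */
    unsigned char difference = (unsigned char) (a + complement);  /* the addition wraps mod 256 */

    printf( "%d - %d = %d\n", a, b, difference );                 /* prints 200 - 55 = 145 */
    return 0;
}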

The Leibniz calculating machine

Gottfried Wilhelm Leibniz, a lawyer by trade, invented an improved calculating machine in 1673. Leibniz's machine was capable of mechanically calculating the four primitive arithmetic operations of addition, subtraction, multiplication and division.


Vaucanson's automated creations

Jacques de Vaucanson was a French inventor of automated devices. His most famous creation was an automated duck which could flap its wings, "eat" fish, "digest" them and even "defecate" the result!
Vaucanson was appointed the inspector of silk manufacturing in 1741. In addition to introducing many significant process improvements into the industry, he developed an automated loom which was at least partially controlled by punched cards. Apparently, fear of the impact that the loom might have on the industry led to it being rejected by weavers.
Although not strictly speaking "computational devices", his creations represented a key stage in the development of machines which could be controlled "programmatically".

The Jacquard loom

Joseph-Marie Jacquard invented an automated loom controlled by a series of punched cards (i.e. a stored program) in France in 1804. The invention was at least partially based on Vaucanson's loom. Jacquard's loom was NOT very popular with textile workers as they believed that it threatened their livelihood. It was a technical success in that it allowed the creation of woven cloth with essentially arbitrarily complex patterns.


The Difference Engine

By the early 1800s, a variety of books of mathematical tables had been published. These books of logarithmic, trigonometric and other mathematical tables were used by navigators, engineers, artillerymen, insurers, bankers and other professionals. The problem was that the tables contained mistakes introduced by the process of manually calculating the hundreds or even thousands of individual values (i.e. numbers) in each table.
In 1822, Charles Babbage proposed building a calculating machine which could be used to mechanically produce "correct" tables. The machine was designed to be capable of producing tables accurate to 18 digits (with the last digit properly rounded) and could be configured to perform certain calculations to 30 decimal digits. Funding was provided by the British government and construction of Babbage's Difference Engine began in earnest. The project was immediately challenged by the inability of then-current manufacturing processes to produce the mechanical components with a sufficient degree of precision. By about 1827, the project was completely stalled due to a lack of funding and Babbage's inability to accept anything less than perfection.
Babbage produced detailed plans for a second Difference Engine (Difference Engine 2) although only a handful of components were ever built.

The Analytical Engine and the invention of programming

Babbage effectively abandoned the Difference Engine project in the early 1830s when he conceived of what can only be described as a true computer. The Analytical Engine would have memory, a processing unit (called a "mill") and an input/output mechanism. Probably inspired in part by the Jacquard loom, the Analytical Engine would be controlled by a series of punched cards.
Ada Byron, the daughter of Lord Byron, attended a dinner party in 1834 at which Charles Babbage described his Analytical Engine. Babbage's suggestion that 'a calculating machine should be able to not only foresee but act on that foresight' caught Ada's attention.
Babbage gave a seminar in Turin, Italy in 1840 where he described his Analytical Engine idea. His description was written up in Italian by Luigi F. Menabrea. Ada, now Lady Lovelace, translated the article into English and presented a copy of the translation to Babbage. His suggestion that she write notes to accompany the article led to an extended correspondence between the two. They conceived of the idea of creating a formal description of what an Analytical Engine should compute, and Ada Lovelace became the world's first computer programmer when she wrote programs for the as-yet-unbuilt Analytical Engine.
Ada Lovelace's death by cancer in 1852 at the age of 36 effectively ended the Analytical Engine project although Babbage continued to work on the idea sporadically until his death in 1871.

Hollerith's tabulating machine

Processing the data from the 1890 U.S. census was going to be a problem. In fact, it was generally recognized that there wasn't any way to process and tabulate the data from the 1890 census fast enough to have the results ready before the 1900 census (i.e. the results would be worthless before they even existed).
Fortunately, Herman Hollerith had a solution. In 1888, Hollerith had invented a tabulating machine which used punched cards quite similar to those used in Jacquard's loom. A key difference, from the perspective of the history of computing, was that Hollerith's machine treated the punched cards as data whereas Jacquard's loom had treated them as instructions. By encoding the information on each person onto a punched card, Hollerith's tabulating machine could be (and was) used to produce tabulated 1890 Census data in just six months.
Hollerith was to lend his name to a form of character constants called Hollerith literals in early versions of FORTRAN. His most lasting contribution is almost certainly the company that he formed, the Tabulating Machine Company, which through a series of mergers became IBM.

De Forest's triode vacuum tubes

Earlier work by Thomas Edison and John Fleming led to Lee de Forest's invention of the triode vacuum tube in 1906. The presence or absence of power on one input could be used to determine if a signal on a second input was passed through to the output. This function, absolutely critical to digital logic, is typically performed by a transistor in modern computers.
Strictly speaking, the level of power on one input determined the amount of the power on the second input that was passed through to the output. This was the contemporary reason for the success of the triode vacuum tube as it made it possible to construct much more efficient analog amplifiers.

The first digital computers

There's still a certain amount of controversy and confusion surrounding who built the first fully functional digital computer. Two clear contenders are:
  • Konrad Zuse of Germany built what was almost certainly the world's first digital computer, the Z1, between 1936 and 1938 (i.e. started in about 1936, completed in 1938). The Z1 was a completely mechanical device (i.e. no electronics). Zuse had been performing research on and developing the notion of what a computer was for some time. He certainly understood by 1934 that a computer would require memory, an arithmetic unit and a control unit. He filed a patent in Germany in 1936 describing such a machine. The patent also describes how instructions in the form of combinations of bits could be stored in the memory of the machine.

    The Z1 is also significant in that it was almost certainly the first device to perform floating point arithmetic.

    Zuse completed the hybrid mechanical/electrical Z2, which used relays for its arithmetic unit, in 1940. In 1941 he completed the Z3, a fully relay-based successor to the Z1. The Z3 was arguably the world's first working programmable digital computer.
  • John V. Atanasoff and Clifford Berry of Iowa State University built the Atanasoff-Berry Computer between 1937 and 1942. This computer is also arguably the world's first electronic digital computer (i.e. it depended primarily on electronics as opposed to mechanical devices).

Turing's notion of computability

Alan Turing published On computable numbers, with an application to the Entscheidungsproblem in 1936. This landmark paper described an imaginary computational machine which Turing would later call an LCM (Logical Computing Machine). Turing used the LCM to define what it meant for something to be "computable". In short, if a problem can be solved using an LCM then it is computable and if it can't be solved by an LCM then it isn't computable. He went on to prove that there were some problems which were not "computable" (see Halting Problem).
LCMs are today called Turing Machines in honour of their inventor.

Computing in World War II

The Second World War triggered an explosion in the development of computing. The contributions to the field during the war were, quite literally, too many to even hope to list. Here are a few of the more significant ones:
  • Alan Turing made major strides in the development of computing while cracking the German codes (see Enigma). This included practical developments like the Bombes used to mechanically attack the Enigma codes.
  • Konrad Zuse continued to develop his computers for Germany, building the (hybrid mechanical/relay) Z2 (1940), the (relay-based) Z3 (1938-41) and the (relay-based) Z4 (1941-45).
  • Vannevar Bush developed a machine called the "Differential Analyzer". Weighing in at one hundred tons (90,000 kilograms), this analog computer operated by implementing "analogs" to actual physical processes.
  • Howard Aiken and Grace Hopper were the designers of the Harvard Mark I which became operational in 1944. The later discovery of a dead moth which had caused a relay failure in the Mark I's successor, the Mark II, is often credited with popularizing the term bug.
  • Although not completed until shortly after the war, the development of ENIAC by John Presper Eckert and John William Mauchly was a direct result of the wartime need for computational power, primarily although not entirely in the area of ballistics.
  • Computational devices like the Norden Bombsight played a major role in the war although, with the advent of digital computers, I'm going to have to ignore these purely mechanical devices or this w/u will never get finished.

Sidebar: The post-war explosion of computing

With the end of the Second World War, computing technology had reached a level of maturity that was to provide the foundation for a veritable explosion of technology. The next twenty years would see the development of practically all of the major concepts and technologies which make up a modern computer including:
  • the transistor
  • operating systems
  • compilers
  • virtual memory
  • microcode
The remainder of this w/u will focus on a handful of the most major milestones. Feel free to suggest which of the many omitted milestones should also be covered.

The invention of hypertext

Vannevar Bush makes another appearance in this w/u with his 1945 Atlantic Monthly article titled As We May Think. This article discusses how computers might be used to assist humans in the processing of information. It describes a device that Bush calls a Memex which organizes information using the hypertext mechanisms which are familiar to anyone who uses the World Wide Web today.
Hypertext's invention in 1945 is arguably not a "major milestone" as hypertext would have to wait for the creation of the World Wide Web almost fifty years later to become in any sense relevant. It's included in this writeup more as an example of how old some of our "new" ideas really are. It's also a rather striking example of how a truly fundamental idea is pretty much irrelevant until the technology required to implement it actually exists.

The transistor

One of the most important inventions of the 20th century was made in 1947 by William Shockley, Walter Brattain and John Bardeen of Bell Labs. Their transistor, actually two different kinds of transistors, was the result of an effort launched shortly after the end of the war to replace the vacuum tube triode and similar devices with some sort of solid-state device. Like the triode, a transistor can be used as a digital switch or as an analog amplifier.


The UNIVAC computer

With the end of the war, the U.S. Census Bureau (remember them?) had another incarnation of the same problem of sixty years earlier - there was no way to process the data which would be collected in the upcoming 1950 census in a reasonable timeframe. The Census Bureau turned to the inventors of ENIAC, John Presper Eckert and John William Mauchly, with a contract to build a suitable computer for not more than $400,000 US. After their company was rescued financially by Remington Rand Inc. in 1950, the first UNIVAC was delivered to the U.S. Census Bureau in 1951. Forty-six UNIVACs were eventually built.
UNIVAC was the first even remotely general purpose commercial computer. The original UNIVAC is in the Smithsonian Institution.

The IBM System/360

After a massive investment of five billion dollars (i.e. 5,000,000,000 1964 dollars) in a literally "bet the company" venture, IBM held a news conference on April 7, 1964 to introduce their new System/360 product line. Suddenly, computers were no longer massive "personal computers" which could run only one program at a time but were now true "business machines" which could be used to run multiple programs or jobs simultaneously. The System/360 was also a real "system" in the sense that it had a well defined system architecture along with a family of products built around the architecture.
The era of the mainframe had arrived!
After premature warnings of the "death of the mainframe" in the early 1990s, IBM's mainframe business has recovered and today continues to generate significant revenue ($US4.2 billion in 2003, an increase of 6% over 2002).
Factoid: As of 2004, roughly 70% of the world's data is still stored on mainframes.

The development of the early ARPANET

It was 1966 and Bob Taylor had a problem and an opportunity. The problem was that he and his people were using a number of different computer systems and it wasn't possible to easily share information between the systems. The opportunity was that Bob Taylor was the newly appointed director of ARPA's IPTO (Information Processing Techniques Office) and was in a position to do something about the problem by putting his influence behind an idea that he'd kicked around with the previous IPTO director Joseph Licklider. The idea was to connect the key computers together using some sort of a digital link or "network".
As the idea developed, it became clear that probably the key piece of as yet non-existent technology was a communications device which was soon called an Interface Message Processor (IMP). The RFPs (request for proposals) went out in mid-1968. By the deadline date, over a dozen replies had been received including one from IBM and another from CDC (Control Data Corporation). Although negotiations with Raytheon began in early December, the winning bidder was ultimately BBN (Bolt Beranek and Newman).
The process of actually developing the first IMPs and getting them operational was far from smooth (see Where Wizards Stay Up Late / The Origins of the INTERNET in the "Sources" below for details). The first IMP was installed at UCLA in September, 1969 and the second was installed at SRI in October. By the end of the year, UCSB had IMP number three and the fourth IMP was in Utah. It took another year to architect and implement a communications protocol called the NCP (Network Control Protocol).
The ARPANET, precursor of the INTERNET, was alive.

The dawn of the personal computing era

The MITS Altair 8800 appeared on the January, 1975 cover of Popular Electronics. The computer was available as a kit for $395 USD and assembled for $495. It had 256 bytes of memory and a 2MHz Intel 8080. The only input device was a set of switches on the front panel. A set of lights on the same panel were the only output device.
The IMSAI 8080 was announced in mid-1975. With a price tag of about $250 USD, it had 4K of memory, an Intel 8080A and a twenty two (22) slot S-100 bus (along with lights and switches on the front panel).
Together with other hobbyist kits and the like, these computers launched the era of personal computing.

Birth of the World Wide Web

Building on the work of Vannevar Bush (hypertext) and others, Tim Berners-Lee launched the World Wide Web in 1990. The first web browser ran on the NeXT system and the first web site was targeted at the High Energy Physics community. It would take a couple of years but by the mid-1990s, the World Wide Web had brought the INTERNET to the masses.
The rest, as they say, is history.


What about ___________?

While it's true that no "history of computing" would be complete without a discussion of Apple, Microsoft, Sun, Cray, Xerox, Unix, MS Windows, MacOS, OS/VS1, etc., it's also true that this is a BRIEF history of computing. Decisions had to be made. Feel free to suggest changes or, probably better yet, write your own w/u under this node.
P.S. Personally, I suspect that there are far more important holes in my coverage of early computing history than in my coverage of modern computing history. Maybe I should require that each suggestion for an addition to the modern computing portion be accompanied by a suggestion for an addition to the early computing portion! (grin)


Sources

  • The web page located at http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Al-Khwarizmi.html (last accessed 2003/06/03)
  • The June 23, 2003 article in The Independent newspaper titled Inca may have used knot computer code to bind empire by Steve Connor (Science Editor). Located on the 'net at http://news.independent.co.uk/world/science_medical/story.jsp?story=418049 (last accessed 2003/09/23)
  • The web page on Jacques de Vaucanson and his Duck located at http://www.swarthmore.edu/Humanities/pschmid1/essays/pynchon/vaucanson.html (last accessed 2003/06/03)
  • The web page titled Vaucanson's Duck located at http://music.calarts.edu/~sroberts/articles/DeVaucanson.duck.html (last accessed 2003/06/03)
  • The web page titled A short Biography of Leibniz located at http://www.helsinki.fi/~mroinila/lbio.htm (last accessed 2003/06/03)
  • The web page titled The pattern loom located at http://www.deutsches-museum.de/ausstell/meister/e_web.htm (last accessed 2003/06/03)
  • The web pages titled Mechanical Aids to Computation and the Development of Algorithms, the first of which is located at http://www.csc.liv.ac.uk/~ped/teachadmin/histsci/htmlform/lect1.html (last accessed 2003/06/03)
  • The web page titled Ada Byron, Lady Lovelace (1815-1852) located at http://www.cs.yale.edu/homes/tap/Files/ada-bio.html (last accessed 2003/06/03)
  • The web page titled Introduction to Ada Lovelace's Translation of, and Notes to, Luigi F. Menabrea's "Sketch of the analytical engine invented by Charles Babbage, Esq." (1842/1843) by Christopher D. Green located at http://psychclassics.yorku.ca/Lovelace/intro.htm (last accessed 2003/06/03)
  • The web page titled Babbage's Difference Engine located at http://hoc.co.umist.ac.uk/storylines/compdev/earlymechanical/diffengine.html (last accessed 2003/06/03)
  • The web page titled Herman Hollerith's Tabulating Machine located at http://www.maxmon.com/1890ad.htm (last accessed 2003/06/03)
  • The web page titled Inventor Herman Hollerith located at http://www.ideafinder.com/history/inventors/hollerith.htm (last accessed 2003/06/03)
  • The web page titled The Invention of the Vacuum Tube located at http://www.maxmon.com/1883ad.htm (last accessed 2003/06/03)
  • The web page titled History in the Computing Curriculum located at http://www.hofstra.edu/pdf/CompHist_9812tla2.PDF (last accessed 2003/06/03)
  • The web page titled Konrad Zuse located at http://ei.cs.vt.edu/~history/Zuse.html (last accessed 2003/06/03)
  • The web page titled The Life and Work of Konrad Zuse by Horst Zuse, located at http://www.epemag.com/zuse/part3c.htm (last accessed 2003/06/03)
  • The PDF file titled Z1, Z2, Z3 and Z4 located at http://www.zib.de/zuse/English_Version/Inhalt/Kommentare/Pdf/0680.pdf (last accessed 2003/06/09)
  • The web page titled Reconstruction of the Atanasoff-Berry Computer located at http://www.scl.ameslab.gov/ABC/ (last accessed 2003/06/03)
  • The web page titled John Vincent Atanasoff and the Birth of the Digital Computer located at http://www.cs.iastate.edu/jva/jva-archive.shtml (last accessed 2003/06/05)
  • The PDF file titled Subject-Matter Imperialism? Biodiversity, Foreign Prior Art and the Neem Patent Controversy located at http://www.idea.piercelaw.edu/articles/37/37_2/9.Kadidal.pdf (last accessed 2003/06/05)
  • The web page titled Computable Numbers, 1936 and the Turing Machine located at http://www.turing.org.uk/turing/scrapbook/machine.html (last accessed 2003/06/03)
  • The web page titled Vannevar Bush's Differential Analyzer located at http://www.uh.edu/engines/epi27.htm (last accessed 2003/06/03)
  • The web page titled The ENIAC Story located at http://ftp.arl.mil/~mike/comphist/eniac-story.html (last accessed 2003/06/03)
  • The 1945 Atlantic Monthly article titled As We May Think by Vannevar Bush. This article is quite easy to find on the 'net. One copy is located at http://www.theatlantic.com/unbound/flashbks/computer/bushf.htm (last accessed 2003/06/03)
  • The web page titled Transistorized! located at http://www.pbs.org/transistor/album1/ (last accessed 2003/06/03)
  • A series of web pages titled The History of Computers by Mary Bellis, located at http://inventors.about.com/library/blcoindex.htm?PM=ss12_inventors (last accessed 2003/06/03)
  • The web page titled IBM's 'dinosaur' turns 40: PCs were supposed to kill off the mainframe, but Big Blue's big boxes are still crunching numbers located at http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2004/04/05/BUGF75VUUQ1.DTL&type=tech (last accessed 2004/04/05)
  • The book Where Wizards Stay Up Late / The Origins of the INTERNET by Katie Hafner and Matthew Lyon; published by Simon and Schuster; Copyright © 1996 by Katie Hafner and Matthew Lyon; ISBN 0-684-81201-0.
  • The web page titled The MITS Altair 8800 located at http://wwwcsif.cs.ucdavis.edu/~csclub/museum/items/mits_altair_8800.html (last accessed 2003/06/09)
  • The web page titled The IMSAI 8080 located at http://wwwcsif.cs.ucdavis.edu/~csclub/museum/items/imsai_8080.html (last accessed 2003/06/09)
  • The web page titled The World Wide Web: a very short personal history by Tim Berners-Lee, located at http://www.w3.org/People/Berners-Lee/ShortHistory.html (last accessed 2003/06/09)
  • Personal knowledge.