
A Brief and Fun History of Coding: the Beginnings

Wrens operating the 'Colossus' computer, 1943.

Have you ever looked at pictures of the first computers and wondered how we got from that to the super-sweet, high-speed device on which you are now reading this article?

No? Well, you should—because it’s pretty cool and important.

When writing about the development of anything from a historical perspective, it’s tough to pick a starting point without getting overly philosophical pretty quickly (how did we get here??). While it’s tempting to dig deep into a brain-twisting inquiry of information theory, for now we’ll just stick with the basics, which of course begin with…

Hollywood.

Remember when Doctor Strange played the part of Alan Turing in the movie The Imitation Game? If you haven’t already seen it, I highly recommend it, as it is a fascinating story, albeit a tragic one, given the treatment such a brilliant mind received. But relevant to this narrative is the part he played in the development of coding.

During World War II, the British were intent on breaking coded messages being sent by the Germans. Assembling most of their efforts at a place called Bletchley Park, an English country estate, they set to work on determining the best way to crack the case, so to speak.

Having your city bombed repeatedly is apparently a tremendous motivator, as the team there made rapid advancements in the area of automated assistance, which led to the creation of a machine named “Colossus”—arguably the world’s first programmable, electronic, digital computer (see picture above). Turing was a key member of the team there and his work was directly responsible for breaking several German codes which, according to some historians, shortened the war by at least a couple years.

From there, Turing went on to develop what was known as the ACE (Automatic Computing Engine), which stood apart from its predecessors as the first machine to employ “abbreviated computer instructions”—a programming language.

Coded messages inspired code-breaking, which needed speed and efficiency; these qualities required machines, which required more speed and efficiency, which required they run on an abbreviated language that would operate a program, which meant they needed…code.

Modern computing was born.

Having come full circle from need to advancement in a very short time, computer technology took off from there. No small part of this was due to one of the smartest individuals who ever lived: mathematician, physicist, and general polymath John von Neumann, whose work on the Manhattan Project prompted several ideas that he carried forward into algorithmic development, problem-solving with pseudorandom number generators, and the design of a computer architecture that is still used today and that heavily influenced the famed ENIAC machine and the IBM 704.

I could spend the next 50,000 words writing about von Neumann and still barely scratch the surface of his genius. Suffice to say that he was wicked-smart, highly important, and is worth reading more about on a number of levels, but especially the issue of computational development.

Von Neumann’s theories and practical applications spurred tremendous growth in computer programming, specifically in how it works within the architecture of a machine. How memory is both stored and accessed is directly attributed to von Neumann, and his explanation in 1945 has enabled numerous directions of development to be followed ever since.

From there we can make some interesting connections, six-degrees-of-separation style. Von Neumann also consulted on the EDVAC project, whose chief designers were J. Presper Eckert and John Mauchly. In 1949, the Eckert-Mauchly Computer Corporation hired a mathematician named Grace Hopper as a senior developer on the UNIVAC I project; she would become the woman most responsible for the programming language known as COBOL—Common Business-Oriented Language.

Grace Hopper

Code Platoon offers a Women in Technology Scholarship to a female veteran in honor of Hopper. The scholarship covers the full $15,500 tuition for one female veteran during each cohort.

Her belief was that programming should be mostly English language-based, as that was much easier for most people to understand and work with. Although her proposal was rejected at Eckert-Mauchly for a full three years, she eventually won everyone over and launched what would become one of the most influential programming languages in software development.

What is perhaps most fascinating about Grace, however, is that she did all of this while serving in the Naval Reserve, which she joined during WWII (she wanted to serve on active duty but was too small by Navy standards) and from which she retired as a Rear Admiral, a position that enabled her to implement many of her ideas into Defense Department standards of practice. Her insistence on testing computer systems led to a convergence of programming languages such as COBOL and FORTRAN (the latter developed by John Backus), and the methods for implementing these tests were eventually adopted by the National Bureau of Standards, which was later renamed the National Institute of Standards and Technology (NIST).

Some programmers argue that FORTRAN is the foundation of nearly every programming language used today, with everything else descended from it. Although it was more mathematical than the English-based approach Hopper had advocated with COBOL, Backus’s intent with FORTRAN was to craft something decidedly more human in its input methodology than previous languages, thus enabling users to develop their own programs with more ease (a crucial component of code development).

Backus, a designer at IBM, created BNF (the Backus Normal Form, now better known as the Backus-Naur Form), a notation used to define the syntax of programming languages and how their statements are expressed. So when you see a textbook or a manual explaining what type of programming language is being used and how to differentiate them, you can thank John Backus (or be mad at him, depending on how frustrated you get with that style of notation).
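To make that notation a little more concrete: here is a toy grammar written in BNF-style rules (a made-up example for illustration, not one from Backus’s own work), along with a minimal Python sketch that checks strings against it. Each grammar rule becomes one function, the classic recursive-descent approach.

```python
# A toy grammar in BNF-style notation (hypothetical, for illustration only):
#   <expr>  ::= <term> "+" <expr> | <term>
#   <term>  ::= <digit> | "(" <expr> ")"
#   <digit> ::= "0" | "1" | ... | "9"
# Each function below mirrors exactly one rule of the grammar.

def parse(text: str) -> bool:
    """Return True if `text` matches <expr> in the grammar above."""
    pos, ok = expr(text, 0)
    return ok and pos == len(text)

def expr(s, i):
    # <expr> ::= <term> "+" <expr> | <term>
    i, ok = term(s, i)
    if not ok:
        return i, False
    if i < len(s) and s[i] == "+":      # the <term> "+" <expr> branch
        return expr(s, i + 1)
    return i, True                      # the bare <term> branch

def term(s, i):
    # <term> ::= <digit> | "(" <expr> ")"
    if i < len(s) and s[i].isdigit():   # the <digit> branch
        return i + 1, True
    if i < len(s) and s[i] == "(":      # the parenthesized branch
        i, ok = expr(s, i + 1)
        if ok and i < len(s) and s[i] == ")":
            return i + 1, True
    return i, False

print(parse("1+(2+3)"))  # True
print(parse("1+"))       # False
```

The point isn’t the parser itself; it’s that three lines of BNF unambiguously pin down which strings belong to the language, which is exactly the job those textbook notation sections are doing.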

What is interesting to note here is that most of what you’ve just read (aside from the creation of the NIST) had happened by 1959.

In other words, the foundations for modern programming had been firmly established before color TV was a common thing.

Next time we’ll look at developments since then, what they mean, and how those Bill Gates and Steve Jobs fellas work into this mix.

Greg Drobny is a former Airborne Infantryman, PSYOP Team Chief, political consultant, professional mil blogger, and is Code Platoon’s Student Outreach Coordinator. He holds a BA in history, a Masters of Science in organizational psychology, and is currently pursuing an MA in history. He is married with four children who keep him more than slightly busy and is passionate about helping veterans find their paths in life and develop the skills needed to pursue their goals.
