History of Computer Science
Chances are that if you clicked this blog, you are already in the field, looking to get into it, or simply tumbling down an Internet rabbit hole. Computer technology is so pervasive in our everyday lives that the field's employability and constant demand speak for themselves. Now a multi-billion-dollar, multi-disciplinary industry, it makes you wonder how it all really began. Well, let's flip the pages and go back to the year 1837, when Charles Babbage first designed his Analytical Engine.
Although there were attempts to design calculating devices before Charles Babbage, and mathematicians had been hacking away at the logic behind computing for a couple of centuries before 1837, Babbage's Analytical Engine is still widely regarded as the first design for a modern, general-purpose computer. With expandable memory, an arithmetic unit, and proper logic-processing capabilities, the Analytical Engine was heavily studied even though it was never built.
Building on his earlier Difference Engine, Babbage continued developing the design, using punched cards to feed in instructions and data for arithmetical operations. Ada Lovelace is credited as a pioneer of computer programming for her contributions to the algorithmic side of the Analytical Engine, including what is often described as the first published computer program.
In 1886, Charles Sanders Peirce wrote about the different logic gates and proposed that electrical switching circuits could be used to make logical decisions.
Between 1934 and 1938, Akira Nakashima, Claude Shannon, and Victor Shestakov independently published papers showing that Boolean algebra could describe the behaviour of electrical switching circuits, putting logical operations on a firm mathematical footing.
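To get a feel for that insight, here is a minimal Python sketch (my illustration, not code from any of those papers): gates are modelled as Boolean functions on 0/1 values and composed into a half adder, the same way physical switches compose into circuits.

```python
# Logic gates as Boolean operations on 0/1 values.
def AND(a, b):
    return a & b

def XOR(a, b):
    return a ^ b

def half_adder(a, b):
    """Add two one-bit inputs, returning (sum, carry)."""
    return XOR(a, b), AND(a, b)

# Enumerate the truth table of the half adder.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining half adders into full adders is how real hardware adds whole binary numbers, which is exactly the bridge these papers built between algebra and circuitry.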
Claude Shannon went on to found the field of information theory, which used probability theory to establish limits on how efficiently information can be encoded and transmitted. This laid the foundations for modern-day data compression and cryptography.
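The central quantity in that theory is entropy, which bounds the average number of bits needed per symbol. A short Python sketch (illustrative only) computes it for a fair and a biased coin:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: a lower bound on the average number
    of bits needed per symbol for a source with these probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin genuinely needs 1 bit per toss...
print(entropy([0.5, 0.5]))   # -> 1.0
# ...but a biased coin's outcomes can be compressed below 1 bit each.
print(entropy([0.9, 0.1]))   # -> roughly 0.47
```

This gap between 1.0 and 0.47 bits is precisely the room that compression algorithms exploit.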
The mathematical foundations can be credited to Kurt Gödel, whose incompleteness theorems showed that there are limits to what can be proved or disproved within a formal system.
In 1936, Alan Turing and Alonzo Church independently formalized the notion of an algorithm, while Turing also published his work on abstract computing machines. His universal machine is credited as a forerunner of the stored-program concept now used in virtually all modern computers.
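Turing's abstract machine is simple enough to simulate in a few lines. Below is a toy Python sketch (a hypothetical example machine of my own, not one from Turing's paper): a state and a transition table drive a read/write head along a tape, flipping every bit until it reaches a blank.

```python
def run_turing_machine(tape):
    """Simulate a tiny Turing machine that inverts a binary string."""
    # transitions: (state, symbol) -> (symbol to write, head move, next state)
    transitions = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_", 0, "halt"),   # "_" is the blank symbol
    }
    cells = list(tape) + ["_"]
    head, state = 0, "flip"
    while state != "halt":
        write, move, state = transitions[(state, cells[head])]
        cells[head] = write
        head += move
    return "".join(cells).rstrip("_")

print(run_turing_machine("10110"))  # -> 01001
```

The remarkable part of Turing's result is that one *universal* machine can read any such transition table from the tape itself and execute it, which is the conceptual seed of the stored program.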
Norbert Wiener coined the term cybernetics, and his work on the subject strongly influenced modern-day artificial intelligence. He also drew comparisons between computing machines and the workings of the human nervous system.
The ENIAC, one of the first electronic general-purpose computers, was programmed mainly by human computers who were predominantly women. Around the late 1940s, Kathleen Booth developed an early assembly language to program the machines she was working on more easily.
Grace Hopper, one of the first programmers of the Harvard Mark I, developed the first compiler and the programming language FLOW-MATIC to program the UNIVAC. Frances E. Holberton, who was also working on the UNIVAC at the time, developed the C-10 instruction code, which let programmers control the machine through keyboard inputs.
A team at IBM developed FORTRAN in the 1950s, releasing the first compiler in 1957. Further work produced many more programming languages, their use spread from specialized machines to general-purpose computing, and well, it has only gone uphill from there.
That brings us to the end of this brief history of computer science, but there’s so much more to be discussed about it, perhaps some other day on LetsUpgrade.
Stay safe everyone!
LetsUpgrade your LIFE!