About Me: I am currently a Computer Science major at San Jose State University.

Sunday, November 17, 2013

History of Computer Science: From Humble Beginnings

        If you ask ten different computer scientists what the history of computer science is, I can guarantee that you will get ten different answers, and all of them will be right. There are so many different facets of computer science, such as hardware, algorithms, data structures, video games, and security. It all depends on which part of the field the person being asked finds most interesting. For me, the two primary facets of computer science have to be the hardware that makes up the physical machines and the software that tells those machines what to do.

[Image: The Antikythera Mechanism (source: http://www.computus.org/journal/wp-content/uploads/2008/08/antikythera_mechanism_fragmenta.jpg)]
         While many of us associate the word computer with desktops, laptops, iPhones, and tablets, not too long ago computers were simply mechanical devices that performed mathematical calculations. One of the earliest computers, the Antikythera Mechanism, is believed to have been made in Greece nearly 2,000 years ago (1). This collection of gears is thought to have been designed to calculate the movements of the stars, and while it may not run on electricity, the Antikythera Mechanism is still a computer.
[Image caption: We still haven't solved the wire problem]

       Fast forward a couple of millennia and we start to see the modern computers that we know and love come into existence. The year 1946 saw the creation of a new machine called ENIAC. In a joint venture with the U.S. military, John Mauchly and John Presper Eckert designed a room-sized computer that, “In one second, the ENIAC (one thousand times faster than any other calculating machine to date) could perform 5,000 additions, 357 multiplications or 38 divisions.”(2) While this sounds impressive, one must take into consideration that this machine accepted input from punch cards and took weeks for technicians to reprogram. This computer is roughly comparable to a cheap pocket calculator today.
 

[Image caption: There's my baby, 5 years old and going strong.]
        As time progressed, computers became more and more powerful and took up less and less space. In 1977, Steve Jobs and Steve Wozniak released the Apple II, one of the first commercially viable personal computers, for $2,600. This relatively cheap piece of hardware enabled many people to have their very own computers to use, and as time went on, those PCs became cheaper and more powerful. I myself eschewed the idea of buying a computer out of a box and instead built my own. While buying a computer is easy and quick to do, it just can't compare to the feeling of accomplishment that you get from ordering the parts and putting the computer together yourself.

[Image caption: Seems a bit basic, don't you think?]
         Now this shiny hardware is all well and good, but there isn't much point to it unless there is some kind of program that tells the computer what to do. Many different methods and languages were created to do just that. BASIC, one of the earliest beginner-friendly languages, was created by John Kemeny and Thomas Kurtz at Dartmouth in 1964 as a way for average students who were not math or science majors to use computers. It became quite a popular language at the time, and both Microsoft and Apple built derivatives of BASIC into their earliest products. Even today there are a number of languages that have their roots in BASIC, such as Microsoft's Visual Basic, which lives on in the .NET Framework.

         Another important language would have to be C. It was released in the early 1970s, and it gave programmers much more control over how memory is allocated during the run time of a program. It is a very structured language and is known for translating into very efficient machine code. The language is so influential that it spawned several newer descendants, namely the object-oriented languages C++ and C#, which are still in wide use today. In fact, C is such a widely used language that many computer science degrees require a class in C as a prerequisite.
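To give a feel for what that memory control actually looks like, here is a minimal C sketch of my own (not taken from any particular source; the array size is just an arbitrary example) that asks the system for a block of memory at run time and hands it back when finished:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        size_t count = 10;  /* arbitrary example size, chosen at run time */

        /* malloc reserves count * sizeof(int) bytes on the heap;
           the programmer decides exactly how much and when. */
        int *numbers = malloc(count * sizeof *numbers);
        if (numbers == NULL) {
            fprintf(stderr, "allocation failed\n");
            return 1;
        }

        for (size_t i = 0; i < count; i++) {
            numbers[i] = (int)(i * i);  /* use it like an ordinary array */
        }
        printf("last square: %d\n", numbers[count - 1]);

        /* In C, releasing the memory is also the programmer's job. */
        free(numbers);
        return 0;
    }

Languages like BASIC hide this bookkeeping from the programmer; in C, both the request and the release are explicit, which is where that extra control, and extra responsibility, comes from.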

         While I have talked about a select few hardware milestones and languages, there are still many other important contributions, like the IBM PC (1981) and the Cray supercomputers (1960s), along with important languages like Java (1995) and FORTRAN (1957). Every one of these has left its mark upon computer science as we know it.


