About Me: I am currently a Computer Science major at San Jose State University.

Tuesday, November 26, 2013

Artificial Intelligence: Human or not?



When you think about Artificial Intelligence, chances are the first thing that comes to mind is robots rising up and destroying modern civilization. Well, I can guarantee you that is not going to happen, at least not for a good long while. In a nutshell, A.I. (Artificial Intelligence) is the ability of a computer or machine to make decisions based on some past experience. Some A.I. can only perform operations from a specific list of actions that they can’t deviate from, while others take in many different variables and adjust their actions accordingly. Examples of the former are the programs that control automobile assembly robots and automated customer service lines, while the latter include experimental self-driving cars and the routines that control enemy soldiers in video games. Right now these A.I. are very rudimentary, but as time marches on and processor power increases, they will become smarter and more complex. We needed a method for figuring out just how smart a computer can be, and that is where Alan Turing comes in.

In 1950, Alan Turing proposed just such a method. The Turing Test, as it is known today, is used to test computers for the presence of active intelligence. What Turing proposed is to have three participants in the test: a person, a computer, and a human questioner. The questioner has a number of conversations with both the computer and the person, and at the end of the test decides which one is the human and which is the computer. Turing predicted that the questioner would eventually have no more than a 70% chance of making the right identification after five minutes of questioning; in other words, a computer that fools the questioner often enough can be said to have passed the Turing Test. Turing proposed this over 60 years ago, and unfortunately, no program has convincingly passed the test yet. One program that has made some progress, though, is known as Cleverbot. Cleverbot is a website that lets users have conversations with a computer, and it is actually pretty interesting in that the more conversations it has with users, the more it “learns”. Repeated conversations with users have enabled the algorithm behind Cleverbot to associate words and sentences with each other, and it then chooses the best response for any given question or statement made by a human. As impressive as that sounds, however, the best the program has managed is convincing about 60% of judges that it was human, and even that was under fairly loose conditions. This just goes to show that even with all the advancements that we have made in the field of computer science, we still have a way to go before complex A.I. are able to pass themselves off as humans.
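The learning idea behind Cleverbot can be sketched in a few lines. This is a hypothetical, heavily simplified toy, not Cleverbot's actual algorithm: it remembers what humans said in reply to earlier lines, then answers a new line with the remembered reply whose trigger shares the most words.

```python
# Toy sketch of a Cleverbot-style responder (hypothetical, heavily simplified).
# It "learns" by remembering human replies to earlier prompts, then answers a
# new line with the reply whose trigger phrase overlaps it the most.

def tokenize(line):
    return set(line.lower().split())

class ToyChatbot:
    def __init__(self):
        self.memory = []  # list of (trigger_words, reply) pairs

    def learn(self, prompt, reply):
        """Record that 'reply' was a human response to 'prompt'."""
        self.memory.append((tokenize(prompt), reply))

    def respond(self, line):
        """Pick the remembered reply whose trigger shares the most words with 'line'."""
        if not self.memory:
            return "Tell me more."
        words = tokenize(line)
        best = max(self.memory, key=lambda pair: len(pair[0] & words))
        return best[1]

bot = ToyChatbot()
bot.learn("hello there", "Hi! How are you?")
bot.learn("what is your favorite color", "I like blue.")
print(bot.respond("hello bot"))  # closest trigger is "hello there"
```

A real system would weigh word rarity, sentence order, and context across millions of logged conversations, but the basic loop of associate-then-select is the same.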

For more information about the Turing Test, visit this website.

Sunday, November 17, 2013

History of Computer Science: From Humble Beginnings

        If you ask ten different computer scientists what the history of computer science is, I can guarantee that you will get ten different answers, and all of them will be right. There are so many different facets of computer science: hardware, algorithms, data structures, video games, security, and so on. It all depends on what part of the field the person being asked finds most interesting. For me, the two primary facets have to be the hardware that makes up the physical machines and the software that tells those machines what to do.

http://www.computus.org/journal/wp-content/uploads/2008/08/antikythera_mechanism_fragmenta.jpg
The Antikythera Mechanism
         While many of us associate the word computer with desktops, laptops, iPhones, and tablets, not too long ago computers were simply mechanical devices that performed mathematical calculations. One of the earliest computers, the Antikythera Mechanism, is believed to have been made somewhere in Greece nearly 2000 years ago (1). This collection of gears is thought to have been designed to calculate the movements of the stars, and while it may not run on electricity, the Antikythera Mechanism is still a computer.
We still haven't solved the wire problem

       Fast forward a couple of millennia and we start to see the beginnings of the modern computers that we know and love. The year 1946 saw the creation of a new machine called ENIAC. In a joint venture with the U.S. military, John Mauchly and John Presper Eckert designed a room-sized computer that, “In one second, the ENIAC (one thousand times faster than any other calculating machine to date) could perform 5,000 additions, 357 multiplications or 38 divisions.”(2) While this sounds impressive, one must take into consideration that this machine accepted input from punch cards and took technicians weeks to reprogram. It is roughly comparable to a cheap pocket calculator today.
 

There's my baby, 5 years old and going strong.
        As time progressed, computers became more and more powerful, and took up less and less space. In 1977, Steve Jobs and Steve Wozniak released the Apple II, one of the first commercially viable personal computers, for about $2,600. This relatively cheap piece of hardware enabled many people to have their very own computers, and as time went on, those PCs became cheaper and more powerful. I myself eschewed the idea of buying a computer out of a box, and instead I build my own. While buying a computer is quick and easy to do, it just can't compare to the feeling of accomplishment you get from ordering the parts and putting the machine together yourself.

Seems a bit basic, don't you think?
         Now, this shiny hardware is all well and good, but there isn't much point to it unless there is some kind of program that tells the computer what to do. Many different languages were created to do just that. BASIC, one of the earliest, was created by John Kemeny and Thomas Kurtz at Dartmouth in 1964 as a way for average students who were not math or science majors to use computers. It became quite the popular language at the time, and both Microsoft and Apple shipped derivatives of BASIC with their early machines. Even today there are a number of languages that have their roots in BASIC, such as Microsoft's Visual Basic, which is still in use in their .NET framework.

         Another important language would have to be C. It was released in the early 1970s, and it gave programmers much more control over how memory is allocated during the run-time of a program. It is a very structured language that is known for compiling into very efficient machine code. C is so versatile that it spawned a family of newer object-oriented languages, namely C++ and C#, which are still in wide use today. In fact, C is such a widely used language that many computer science degree programs require a class in it as a prerequisite.

         While I have only talked about a select few hardware milestones and languages, there are still many other important contributions, like the IBM PC (1981) and the Cray supercomputers (1960s), along with important languages like Java (1995) and FORTRAN (1957). Every one of these has left its mark upon computer science as we know it.



Sunday, November 10, 2013

File Sharing





Source
     In the tech industry, you need a fast and efficient way to share your work with both your co-workers and your managers. Your work can only improve when your peers examine your programs, and your managers need to know how far along you are. There are a number of ways to share your files, whether through email, thumb drives, or an online file storage service. But all of that can be very cumbersome, mainly because you have to manually keep track of all the versions flying around. Luckily, there are a number of services on the web that can both keep track of the history of your files and let others easily access them.

     One of those services is called GitHub. Many people use it solely as a place to store files, but there is so much more you can do with the service. For example, for any code that you upload to the site, you can view the history of that file: all the changes made during a particular time period, and who made them. On top of that, GitHub has its own editor that lets you make changes to a file without even having to open up an IDE. There are plenty of services out there devoted to making your life as a computer programmer easier; all you have to do is look for them.

Monday, November 4, 2013

Data Structures: Backbone of The Information Age



Source
If you are even slightly familiar with the field of computer science, then you should be familiar with data structures. Data structure is a catch-all term for any system designed to store multiple values of the same data type in a specific way. These structures are extremely important to how our modern world works. Without them, we would have a much harder time sorting and accessing information on the Internet in any meaningful manner. Data structures range from the simple, like an array or a linked list, to more complicated fare, like stacks, queues, maps, and trees. Each structure has a different purpose, and some are better suited to one task than others. For example, I wouldn’t use a binary tree as the back end of a system that sends print jobs from multiple computers to a single printer. While it could theoretically be done, it would be a waste of time to code and very inefficient; instead I would use a priority queue, which gives each print job a value that determines the order of printing.
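The print-job example can be sketched with Python's built-in heapq module. The job names and priority numbers here are made up for illustration; the point is that jobs come out in priority order, not arrival order.

```python
# A minimal sketch of a print spooler backed by a priority queue, using
# Python's heapq module. heapq is a min-heap: the lowest priority number
# comes out first. Job names and priorities are invented for illustration.
import heapq

print_queue = []
counter = 0  # tie-breaker so equal-priority jobs print in arrival order

def submit(job_name, priority):
    global counter
    heapq.heappush(print_queue, (priority, counter, job_name))
    counter += 1

def next_job():
    # Pop the highest-priority (lowest-numbered) job waiting in the queue
    return heapq.heappop(print_queue)[2]

submit("quarterly-report.pdf", priority=2)
submit("ceo-memo.docx", priority=1)   # urgent: lowest number, printed first
submit("lunch-menu.txt", priority=3)
print(next_job())  # -> ceo-memo.docx
```

Both push and pop are O(log n), which is why a heap-backed queue beats rescanning a plain list every time the printer is free.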

This semester, I am working with a group on a real-world application of a data structure. Our client already has a database running that stores over 3000 items, and she wants us to put a version of it online with some sensitive information redacted. That online version is supposed to be updated whenever there is a change made to the primary database. Like all things online, security is a very important concern. We have to make sure that only information about the items is available for viewing online, and that we try our hardest to ensure there is no way for any unscrupulous fellows to access the primary database. It will be very interesting to hear my fellow team members discuss how to implement this.