Has personal computing hit a progress wall?

The promise of cheap, powerful personal computers has for the most part been fulfilled. For those who are not familiar with Moore’s Law, it was an observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future. In subsequent years the pace slowed a bit, and data density has since doubled approximately every 18 months, which is the current working definition of Moore’s Law. Most experts, including Moore himself, expect the law to hold for at least another two decades. So we have computers that are extremely powerful compared to only a decade ago, and those were far more powerful than the machines of a decade before that. With all that power at hand, has our use of the technology kept pace with the progress of the hardware itself?
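To make that doubling concrete, here is a minimal sketch of the exponential it implies, assuming a clean 18-month doubling period and using the roughly 2,300 transistors of the 1971 Intel 4004 as an illustrative starting point. The specific figures and the function name are my assumptions for illustration, not part of Moore’s original observation.

```python
# A toy projection of Moore's Law as a simple exponential.
# Assumptions: doubling every 18 months, starting from ~2,300 transistors
# (the 1971 Intel 4004). Illustrative numbers only, not a rigorous model.

def projected_transistors(years_elapsed, start_count=2300, doubling_years=1.5):
    """Transistor count after `years_elapsed` years of steady doubling."""
    return start_count * 2 ** (years_elapsed / doubling_years)

if __name__ == "__main__":
    # A decade of doubling every 18 months is roughly a 100x gain in density.
    print(f"Growth over 10 years: {2 ** (10 / 1.5):.0f}x")
    # Forty years of steady doubling puts the count in the hundreds of billions.
    print(f"Projected count after 40 years: {projected_transistors(40):,.0f}")
```

Whether real chips keep tracking that curve is a separate question; the point is how dramatic even the “slowed” 18-month doubling rate is over a single decade.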


Let’s digress for a moment. In the movie 2001: A Space Odyssey, and even in the follow-on 2010, we had computers capable of what appeared to be thought. They could hold a conversation with us, make decisions, take action, and so on. Other predictions had android-type robots in our homes doing daily tasks, watching over us, and helping us plan our lives. While the hardware has continued to double in power, our use of it, and the advancement of its applications, does not seem to have kept pace.


Yes, the graphics are better and the computers are faster, but when you get to the heart of it, progress in the application of personal computers has really stagnated for the last decade. Look at where we were in 2000 with popular use of computers. At work we had office applications like word processors, presentation programs, scheduling software and so on. In the factories we had robots taking over repetitive tasks and automating much of the assembly line. In the last ten years this has not really changed all that much, even though the computers are much faster and more powerful.


We all seem to think it is great that our video games look so good today, but in reality the play itself has not changed that much. They look slicker, but the core of the games themselves has not changed. So has the use of the technology kept pace with Moore’s Law? It appears not. Where an application used to fit on a 1.44 MB floppy, it now needs gigabytes on a DVD. Instead of writing efficient code we now ship megabytes, or gigabytes, of executables to do essentially the same thing. There is so much memory and computing power that we have, by and large, forgotten how to use what we have efficiently.


For now, one prediction of the future does appear to be holding true: ubiquitous computing. It was predicted that cheap integrated chips would lead to small computers being all around us, so much so that we would not even be aware of them. That has indeed happened. Your car has many computers, and home alarms, TVs, cable boxes, even your toaster may have one too. These do make our daily lives better in invisible ways. When you drive your car, computers deliver the perfect mix of fuel to the engine, you get better fuel mileage, and of course your toast is perfect. But is this meeting the promise of a great future in computing? Hmmm.


We are getting to the point where the code for programs is so large and so complex that it takes teams of people to pull it all together. Gone are the days of the programmer working on his own, and so too, at some point, may go the days of the teams of people. They will be replaced by the program that creates the programs. To some extent that has already happened, but there will come a time when we need to create programs so complex that only a program will be able to create them. When that happens we will be one step closer to a true revolution in computing. Computers will create programs for themselves, and at some point this could lead to what appears to be, or may actually be, self-awareness. Once that happens we will have a real issue to deal with: will we have the moral right to turn off something that is self-aware and does not want to be turned off? But that is another article.


When will we start to see real changes in the progress of applications? That is hard to tell. It could happen overnight with the advent of cloud computing, or it could take decades, even centuries. Sooner or later it will happen, though. I do not think we will have a real conversation with our computers in our lifetimes. In the not-too-distant future, technology will be able to recognize our faces, ask us questions, and understand our answers, but this will be little more than programmed response, not thought.


As excited as we all are by the power of our iPhones and PCs, these are really not much smarter than a toaster, and a long way from a Moore’s Law-like exponential jump in application progress. Today we can enjoy the technology we have and the way it helps our lives, but on some level it seems we have not made the leaps in the use of that technology that were promised in the early days of computing.