Kevin Drum and Matt Yglesias discuss the potentially explosive future growth of computing power, and how it affects the way we should think about the march toward more and more awesome computing machines.
An important caveat here is that while computing power may continue to grow exponentially for some time, our ability to do something useful with that increased power may not grow at the same pace. Graphics cards today can render far more complicated content at far higher resolutions than they could twenty years ago, but that hasn't resulted in a massive reduction in the amount of time artists spend producing content. Likewise, network bandwidth has not enjoyed growth rates as fast as CPU power, because someone has to actually build out a large amount of physical network infrastructure, and the costs associated with that infrastructure don't reap all the goodness of whatever bastardization of Moore's Law you're using to describe network speeds. Japan is at the forefront of using robots to replace humans for certain service jobs, particularly in health care, but those robots are and will continue to be quite expensive, since there are lots of physical components and processes that go into robotics that don't benefit from Moore's Law. So while computing will become cheaper and even more ubiquitous, computers will increasingly interact with processes that can't take full advantage of the extremely fast growth rate of computing power.
Another important consideration when thinking about our ultra-fast, ultra-cheap CPU future is that disk I/O has not improved at a Moore's Law pace, though solid-state drives may change that equation.
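To see why that matters, here's a rough back-of-the-envelope sketch. The growth rates and the hypothetical workload below are illustrative assumptions, not measured data: compute throughput doubling every two years versus disk I/O improving around 15% a year, applied to a job that today splits its time evenly between computation and disk access. The slower-improving component quickly dominates the total runtime.

```python
# Illustrative sketch with assumed numbers (not real benchmark data):
# compute throughput doubles every 2 years, disk I/O improves ~15% per year.
compute_growth = 2 ** (1 / 2)   # assumed yearly factor: doubling every 2 years
disk_growth = 1.15              # assumed yearly factor: 15% improvement

compute_time, io_time = 10.0, 10.0  # hypothetical job: 10 s compute, 10 s disk I/O today

for year in (0, 5, 10, 15):
    c = compute_time / compute_growth ** year
    d = io_time / disk_growth ** year
    total = c + d
    print(f"year {year:2d}: compute {c:6.2f}s  disk {d:6.2f}s  "
          f"total {total:6.2f}s  ({d / total:.0%} of runtime is I/O)")
```

Run it and the compute portion all but vanishes within a decade while the disk portion barely budges, so the overall speedup flattens out well short of what the raw CPU numbers would suggest.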