Hitting the ceiling...

Qion

Uber Nothing
It seems to me that the processor speeds of our personal computers are approaching a kind of ceiling. Given our current GUIs and the tasks we tax our computers with, our processors have simply become fast enough to handle it all without much need for improvement. For instance, I do a lot of freelance 3D modeling, and even I feel very comfortable on my midrange pro machine from around two years ago. I feel absolutely no pressing need to upgrade, and I can't foresee a job in the next three years that would require something faster. Outside of a render farm, what's the point? When I can have Ps, Illy, Id, Modo, and correspondence programs open and busy at the same time, at exactly what point will I need to change my computing habits?

The futurist in me says, "When the interface changes." Steve Jobs mentioned at D5, albeit only in passing, a computer that "wraps around your desk." Now, that would be something! :)
 
How long does your 3D scene take to render? If it rendered in half the time, would your productivity double?

When I was doing my PhD, I found that the neural networks I used took ages to train. Each experimental run took about a day, due in part to the complexity and size of our data sets. Down the hallway, the guys working on computer vision had models that took days to run too. You might say that computers have hit a ceiling, but the next generation of computer applications that make use of AI (I really hate that term!) is a long way off, mainly because computers are nowhere near fast enough.
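
To make that concrete, here is a minimal sketch with entirely made-up sizes (plain NumPy, nothing like the poster's actual models or data) that times a toy two-layer network's training loop. Scale the data set, the network, and the epoch count up by a couple of orders of magnitude each and "about a day per run" stops looking strange.

```python
import time
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes -- real research data sets and models are far bigger.
n_samples, n_features, n_hidden, n_epochs = 10_000, 256, 128, 50
X = rng.standard_normal((n_samples, n_features))
y = rng.integers(0, 2, size=(n_samples, 1)).astype(float)

W1 = rng.standard_normal((n_features, n_hidden)) * 0.01
W2 = rng.standard_normal((n_hidden, 1)) * 0.01
lr = 0.01

start = time.perf_counter()
for epoch in range(n_epochs):
    # Forward pass: tanh hidden layer, sigmoid output.
    h = np.tanh(X @ W1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2)))

    # Backward pass: gradients of mean squared error.
    delta_out = (out - y) * out * (1 - out)
    grad_W2 = h.T @ delta_out / n_samples
    grad_h = delta_out @ W2.T * (1 - h ** 2)
    grad_W1 = X.T @ grad_h / n_samples

    W1 -= lr * grad_W1
    W2 -= lr * grad_W2
elapsed = time.perf_counter() - start

print(f"{n_epochs} epochs took {elapsed:.1f}s ({elapsed / n_epochs:.2f}s per epoch)")
```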

Computers will never be "fast" enough.
 
How long does your 3D scene take to render? If it rendered in half the time, would your productivity double?

Computers will never be "fast" enough.

Yes, if I were in a rush situation, my productivity would increase. My point is that it's unnecessary in the majority of things we currently do with our personal computers. The physical amount of work I can do with my hands and mind has really become the bottleneck in most design situations. I can spend an hour simply thinking about a particular element, while my machine can render the same picture in under a minute. It cannot create as I do, but it is the means by which I create. I'm not sure I want that to change.
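
Qion's "my hands and mind are the bottleneck" point can be put in Amdahl's-law terms: if rendering is only a small slice of the total workflow, halving render time barely moves the end-to-end figure. A quick sketch with invented numbers:

```python
# Amdahl's law with made-up figures: rendering is ~1 minute out of an hour
# of thinking and modelling, and the renderer gets twice as fast.
render_fraction = 1 / 60
render_speedup = 2.0

overall_speedup = 1.0 / ((1.0 - render_fraction) + render_fraction / render_speedup)
print(f"Overall speedup: {overall_speedup:.3f}x")  # ~1.008x -- nowhere near 2x
```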

I'm writing a novel centered on the idea of strong AI and the clear and present dangers surrounding its adoption. If computers will never be fast enough, at precisely what point will they stop being computers and start being sentient? When will we converse with computers instead of commanding them?

...unless you're speaking of weak AI, of course, which could be hugely beneficial to our species while posing a much lower catastrophic risk. I'm not saying we should throw in the towel and stop making faster computers; I'm merely remarking that our current means of work do not necessitate supercomputers.
 
...strong AI...
...weak AI...

No one apart from philosophers and the loony fringe in academia refers to "strong" AI when talking about AI. When you hear about AI from scientists, it is almost always "weak" AI. Personally, I prefer to see the distinction as pipe-dream AI and practical AI. ;)

Even outside the realm of AI, there is a very big demand for more computing power. For example, DSLRs are becoming more and more common, amateur photography is now in the realm of 10+ megapixels, and working with those files requires almost professional-quality software. Then you have the hobbyist videographers, who need even more computing power...
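
Some rough arithmetic (assumed figures, not anything from the thread) on why photos and video keep eating horsepower:

```python
# One 10-megapixel photo opened as 16-bit RGB, and one second of
# uncompressed 8-bit 1080p video at 30 fps -- both with assumed sizes.
megapixels = 10
bytes_per_pixel = 3 * 2  # 3 channels, 16 bits each
photo_mb = megapixels * 1_000_000 * bytes_per_pixel / 1_000_000
print(f"One 10 MP photo in memory: ~{photo_mb:.0f} MB")         # ~60 MB

video_mb = 1920 * 1080 * 3 * 30 / 1_000_000
print(f"One second of uncompressed 1080p: ~{video_mb:.0f} MB")  # ~187 MB
```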

If all you're interested in is web browsing and light word processing, then the multicore desktop revolution may not be for you. But for many others, it's sweet sweet manna from heaven.
 
Whatever happened to Moore's Law?

Oh, I have no doubt we'll continue with at least the spirit of his law. Quantum computing, as my PhD-having poster probably knows more about, could very well exceed his law year over year.

By "ceiling" I'm referring to the point at which more power becomes unnecessary or unusable.
 
Quantum computing, as my PhD-having poster probably knows more about...

I admit I know very little about quantum computing. I'm already busy enough as it is trying to keep up to date with what's going on in my subject area (neural networks and general intelligent systems).
 