Future of computing?


H-kon

Registered Senior Member
Just wondered if any of you have any thoughts about where computers are going in the next couple of years?

It hasn't been long since I had a 486 with 8 MB of memory and a tiny 245 MB hard disk...

Any clues?

Will we only see faster computers, or will we see a totally new way of doing things in the next 10-15 years?

Hmm. Wonder where mine is going to be by then...
 
My guess is that there won't be a radical change in computing paradigms, only evolutionary, incremental improvements. I see a transition toward all-optical (think TB/s) interconnects, the possible emergence of holographic or molecular memories, and massively parallel "CPU"s that actually consist of dozens, hundreds, or maybe even thousands of full-fledged smaller processors working simultaneously (for a preview, check out the Multi-Threaded Architecture at www.tera.com...)
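One caveat on those massively parallel designs: the achievable speedup is capped by whatever fraction of a task is inherently serial (Amdahl's law). A quick sketch, with the 5% serial fraction as a purely illustrative assumption:

```python
# Amdahl's law: speedup from N parallel processors when a fraction
# of the work is inherently serial. The 5% serial fraction below is
# an assumed illustrative value, not a measured one.

def amdahl_speedup(n_processors, serial_fraction):
    """Maximum speedup on n_processors given a fixed serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

for n in (1, 10, 100, 1000):
    print(f"{n:5d} processors -> speedup {amdahl_speedup(n, 0.05):6.1f}x")

# The speedup flattens out near 1/0.05 = 20x no matter how many
# processors you add -- which is why the architecture, not just the
# processor count, matters for these machines.
```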

With optical computing, wavelength-division multiplexing could become popular for increased bandwidth, essentially leading to the acceptance of numerical bases higher than binary. This could lead to ternary, quaternary, etc. computers that operate in bases other than 2...
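To make the idea concrete, here is a minimal sketch of writing the same integer in different bases; each ternary or quaternary digit simply carries more information than a binary one:

```python
# Represent a non-negative integer in an arbitrary base, to illustrate
# how a ternary (base-3) or quaternary (base-4) machine would pack
# more information into each "digit" than a binary one does.

def to_base(n, base):
    """Return the digits of n in the given base, most significant first."""
    if n == 0:
        return [0]
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return digits[::-1]

print(to_base(100, 2))  # [1, 1, 0, 0, 1, 0, 0] -- 7 binary digits
print(to_base(100, 3))  # [1, 0, 2, 0, 1]       -- 5 ternary digits
print(to_base(100, 4))  # [1, 2, 1, 0]          -- 4 quaternary digits
```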

My general feeling is that Moore's law will be alive and well for a long time to come... Even though silicon technology is rapidly approaching its miniaturization limit, we have barely begun to explore architectural variants of computer components.
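As a toy projection of what that implies, assuming a constant 18-month doubling period and a starting count of roughly ten million transistors (both numbers are illustrative assumptions, not measurements):

```python
# Toy Moore's-law projection: transistor count doubling every 18 months.
# The starting point (~10 million transistors, roughly a late-1999
# desktop CPU) and the 18-month period are assumed illustrative values.

start_transistors = 10_000_000
doubling_months = 18

for years in (5, 10, 15):
    doublings = years * 12 / doubling_months
    projected = start_transistors * 2 ** doublings
    print(f"after {years:2d} years: ~{projected:.2e} transistors")
```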

------------------
I am; therefore I think.
 
Actually, computers will change dramatically in the next few decades as I see it. Here are two of the things that tell me this. Not ternary or the like, but quantum computers: an ordinary bit is limited to a state of on or off, but it has been shown that a computer built on quantum technology could exist in a combined state of on and off at the same time. This allows very difficult calculations to occur rapidly. Also, the latest thing I have heard as far as HD size goes is bubble-clustering technology, of which I understand little, but the theory is that in a small space you could bubble-cluster memory in infinitely thin layers, allowing effectively unlimited hard drive space. Nice : )
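For what "a combined state of on and off" means in practice: a qubit is described by two complex amplitudes, and the squared magnitudes give the odds of each measurement outcome. A minimal classical simulation of that bookkeeping (which of course captures none of the quantum speedup):

```python
import math, random

# A qubit in equal superposition: amplitude 1/sqrt(2) for |0> and |1>.
# Squared magnitudes give the measurement probabilities (Born rule).
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)

p0 = abs(amp0) ** 2   # probability of reading "off" (0): 0.5
p1 = abs(amp1) ** 2   # probability of reading "on"  (1): 0.5

def measure():
    """Collapse the superposition into a definite 0 or 1."""
    return 0 if random.random() < p0 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly [5000, 5000]
```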

------------------
--telekinetic--
Contact:
telekinetic@mail.com
UIN: 382423
Check it out:
Quake3.nu admin insomniac
http://home.cdsnet.net/~sexton

Who says 13 year olds are immature ; )
 
I'm highly skeptical of quantum computers. They may be possible at football-field scales, available only to top government agencies for cracking codes -- but I just don't feel they could be miniaturized enough to be practical at home in the foreseeable future. They are extremely touchy and fragile by their very nature (even the tiniest disturbance can result in decoherence and the loss of all information), they are extremely hard to maintain, they can only hold their state for extremely short periods, and so on.

But even more importantly, they are not suited to the typical, essentially deterministic tasks we expect computers to perform. For example, quantum computers are absolutely useless when it comes to word processing, web browsing, or rendering 3D virtual-reality scenes. Computers as we know them today are universal, easily programmable machines; quantum computers are not even close to achieving such a status. Even though there is a chance they might make it, or at least eventually supplement classical computers as quantum-parallel "co-processors", I wouldn't hold my breath over the next few decades.

Hard disks are indeed advancing by leaps and bounds, but they are rapidly approaching physical limits. They are also power-hungry, mechanical, and two-dimensional. Personally, I think the future lies in 3D holographic memory "crystals".

Also, a potentially revolutionary technology with lots of promise is high-temperature superconductors. Just imagine circuitry that dissipates virtually no heat -- the maximum attainable clock frequencies could jump a few orders of magnitude right there. Imagine handhelds and laptops a million times faster than today's fastest desktop, lasting weeks or months on a battery instead of just hours.

------------------
I am; therefore I think.

 
I clearly see where you are coming from, and I agree with you. But look back at when electronic computers took an entire room to operate and didn't even have the computing power of a low-end calculator. There is always the possibility that it is not really useful, or possible for that matter, but there is also the possibility that quantum computers will eventually be tools in the home and office. Perhaps. For now we can only speculate. In my mind computers have not stopped advancing and probably never will; we can always make things smaller, generate less heat, make things cost less. I don't care how impossible it seems now, because it will most likely be possible in the future. Once again, look at the first days of the computer: those who created them and worked with them could never have foreseen a 72.8 gig hard drive, or even 1 megabyte of memory for that matter. So be careful about making assumptions; they will most likely turn out to be wrong.

------------------
--telekinetic--
Contact:
telekinetic@mail.com
UIN: 382423
Check it out:
Quake3.nu admin insomniac
http://home.cdsnet.net/~sexton

Who says 13 year olds are immature ; )
 
In a few years the current processing technology will reach its developmental limit, and we will either have to develop a new, more efficient processor design or turn to massively parallel computing. Companies such as SGI and Tera are already working on huge parallel machines, but these will never be available to the home user unless that user happens to be a multi-millionaire.

One solution that may become prominent is clustering. There is a large amount of research in the field of Beowulf clustering, which utilizes the Linux operating system. It allows tasks to be distributed among a number of different nodes (workstations) on a network, achieving results comparable to a Cray at a small fraction of the cost (a toy sketch of this pattern follows at the end of this post). Soon we may see these systems performing in the teraFLOPS range. For more info on Beowulf clusters you can visit beowulf.org.

But, as I said, current technologies are reaching their limits. AMD and Intel are moving to .18 micron technology, but how much smaller can they go? It is incredibly difficult to produce processors at this level; I can't imagine what it would be like to produce at, say, .02 microns.
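Conceptually, the distribution pattern is simple: a master splits the job, farms chunks out to nodes, and collects the partial results. Here is a minimal single-machine sketch using Python's multiprocessing as a stand-in for real cluster nodes; an actual Beowulf system would ship the chunks to separate Linux boxes over the network (typically via message passing such as MPI or PVM):

```python
# Single-machine stand-in for cluster-style work distribution: a master
# farms independent chunks out to workers and collects the results.
# A real Beowulf cluster would send the chunks to separate Linux nodes
# over the network instead of to local processes.
from multiprocessing import Pool

def work(chunk):
    """Stand-in for one node's share of the job."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_nodes = 8  # assumed node count, for illustration
    chunks = [data[i::n_nodes] for i in range(n_nodes)]
    with Pool(n_nodes) as pool:
        partials = pool.map(work, chunks)  # farm out, then collect
    print(sum(partials))
```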

As for storage, our current methods will also be going down the tubes. Researchers in England recently developed a method of storage that would allow many terabytes to be stored on a dime-sized piece of equipment. I'm unsure of the details of this technology, but I understand it is also fairly inexpensive to produce. Hopefully it will enter production soon.

Well, those are my thoughts on this. Let me know what you think.
 
I still think you are approaching this too narrow-mindedly. Like I said, back in the day they didn't think computers could ever reach the computing power they have today; they thought they would reach limits as well. Oh well, my opinion is that there will be no limit to where we can take the power of the electron, and eventually something new and better. We can only wait and see : )


------------------
--telekinetic--
Contact:
telekinetic@mail.com
UIN: 382423
Check it out:
Quake3.nu admin insomniac
http://home.cdsnet.net/~sexton

Who says 13 year olds are immature ; )
 
I didn't say that computers would reach a developmental limit. I said the current technology would, and it will. This has happened before, as well. We aren't using vacuum tubes anymore, are we?
 
Future of computers

Considering the advances that have been made in computing since the early twentieth century, and the advances made in technology overall in the last 300 years, I think any concrete statement about the future of computers that rules out all but one possible scenario is very narrow-minded. So many changes are going to occur within this century that will completely alter the way computers are operated and viewed by society; those operators will look back at the start of their century and wonder how our computers ever worked at all.

I believe the next revolution will be a multiple advance. We might see three or four different approaches to a new age of computing: quantum, photonic, neural and biological hybrids, or something we never even considered plausible. Existing principles and theories will always be tested and/or overthrown at some point in the future. Time will tell.

Moore's law is real, and its limit will come into effect within a decade or so. Some new method of computing must be ready to take the throne at or before that time. It will need to be many millions of times better than anything existing at the moment, as all of the latest advances in science, medicine and technology will require more power to cope with larger amounts of information, because that is what it is all about: information.

We must wait and see what will come about, but I have a feeling that computing will very quickly take on a life of its own. Look at the internet: no one specifically owns it or controls it, and I don't think we could stop it if we wanted to. It's now too big. Computers themselves will soon reach this point.


HAL will come.
 
It'd be like getting a bike!

I was thinking it would be really cool to have a little portable piece of electronic paper to carry around.

I mean with a MASSIVE storage device.

Like one that could hold everything ever published: newspaper archives for hundreds of years, every movie ever made in every language, every piece of music ever made, and a speedy search engine to sift through it all.

That probably wouldn't be too hard with those little holographic memory cubes that hold something like 30 TB per cubic centimeter.
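Taking that 30 TB per cubic centimeter figure at face value, the back-of-envelope arithmetic is surprisingly close; all of the corpus sizes below are rough assumed estimates, just to show the scale:

```python
# Back-of-envelope check on the 30 TB per cubic centimeter figure.
# All corpus sizes are rough assumed estimates for illustration only.

TB_PER_CM3 = 30

collections_tb = {
    "every book ever published (text only)": 1_000,    # assumed
    "a century of newspaper archives":       5_000,    # assumed
    "all recorded music":                    10_000,   # assumed
    "every film ever made (compressed)":     500_000,  # assumed
}

total_tb = sum(collections_tb.values())
print(f"total: {total_tb:,} TB -> {total_tb / TB_PER_CM3:,.0f} cm^3")
# ~17,000 cm^3: a box roughly 26 cm on a side. Not electronic paper yet,
# but close enough to make the idea sound plausible.
```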
 
CNet has a pair of interesting videos regarding this subject at: http://cnet.com/news/0-1575813-7-3240381.html#

The first (under Wednesday, March 14th) is a portion of a speech by Dean Kamen, who argues that the pursuit of ever-increasing computational power will at some point become meaningless, as our ability to harness that power is overcome by either indifference or other human limitations.

The other (under Tuesday, March 13th) is a bit about wearable computers from an MIT researcher (the guy looks like a grad student). It's my personal opinion that this will become a major trend in the next few years.
 
The other (under Tuesday, March 13th) is a bit about wearable computers from an MIT researcher (the guy looks like a grad student).

And we have that TV commercial with an actor portraying a networking guru. As he strolls through the city, he wears a single-lens eyepiece, and on the glass is a display in green. Not so far into the future, it seems.

And what will optics and optical devices do for computing besides running at the speed of light?
 