Today's computers should be good for 20 years

Discussion in 'Computer Science & Culture' started by Syzygys, Jun 15, 2012.

Thread Status:
Not open for further replies.
  1. Syzygys As a mother, I am telling you Valued Senior Member

  3. aaqucnaona This sentence is a lie Valued Senior Member

    Crysis did NOT work well in 2007. In fact, its demands were so high that it only became commonly accessible around late 2010. The hardware wasn't good enough; the software was, which again is not revolutionary, since studios had been producing much better visuals for films long before. In fact, the requirements were so high that the game suffered greatly for it, which is why Crysis 2 had to be toned down rather than improved, as is usual in games development.
  5. aaqucnaona This sentence is a lie Valued Senior Member

    Because dedicated processors have - just look at the evolution of graphics cards, video cards, capture cards and sound cards.
  7. MacGyver1968 Fixin' Shit that Ain't Broke Valued Senior Member

    Simple... chip manufacturers figured out there are better ways to increase a chip's performance than cranking up the clock speed. Adding additional cores to the processor increases performance and multitasking ability without adding a whole lot of extra heat.
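    The core-scaling idea above can be sketched in a few lines of Python. This is a toy, CPU-bound prime-counting workload - the task and the worker count are illustrative assumptions, not anything from the thread. The same total work is split into chunks that run on separate cores, so throughput rises without the clock speed moving:

```python
# Minimal sketch: the same total work split across cores instead of
# relying on a faster clock. Workload and worker count are illustrative.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division - a CPU-bound toy task."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def count_primes_parallel(limit, workers=4):
    # Split the range into one chunk per worker; each chunk can run
    # on its own core in a separate process.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # absorb any rounding remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    print(count_primes_parallel(10_000))  # same answer as a single-core run: 1229
```

    The design point: splitting work this way adds coordination overhead but little extra heat per core, which is the tradeoff the post describes.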
  8. przyk squishy Valued Senior Member

    But there's never been any set demand on computer performance. It's generally worked the other way around: the availability of newer, more powerful machines has always ended up creating its own demand.

    Every generation of machine, when it first came out, has looked the way that 8 GB quad core looks to you now. When I bought a 512 MB Athlon XP 2600+ desktop in 2004, it was far more powerful than any computer I'd regularly used before, and it seemed like it always "should" be usable, and anything more would be added luxury. (My only complaint with it at the time was that it wasn't so good with recent games, which apparently doesn't count for the purposes of this thread anyway.) Now even the more popular desktop Linux distributions expect you to have at least a gigabyte of memory.

    And that was true 20 years ago too. What's changed is our expectation of the "average tasks" that computers are supposed to handle, and the way they're supposed to handle them.

    There are more ways an application developer can exploit computing resources than just making an application run faster. The obvious one is adding more features. A less obvious one is restructuring an application in a way that makes it more flexible or its development easier. For example, a trend in programming (especially in open source) is that more applications are being developed in higher-level interpreted languages like Python. That takes up more computing resources. The tradeoff is that developers can create more applications more easily and more rapidly, and likely with fewer bugs in them. The same goes for making an application web-browser-based instead of a standalone executable. Making an application scriptable also carries a performance penalty. The reason the average user can play a simple game on Facebook is that their computer is powerful enough to run not only the game, but also all the levels of abstraction that game is built on top of.
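    The high-level-language tradeoff described above can be illustrated with a toy example (purely hypothetical, in Python): a word-frequency count that is a couple of lines of logic here but would take substantially more code, plus manual memory management, in C. The convenience comes from layers - hash tables, dynamic strings, the interpreter itself - that all consume cycles and memory at runtime.

```python
# A couple of lines of logic: every convenience used here (dynamic
# strings, hash tables, the interpreter) costs resources at runtime.
from collections import Counter

def top_words(text, n=3):
    """Return the n most common words and their counts."""
    return Counter(text.lower().split()).most_common(n)

print(top_words("the cat sat on the mat the end"))
# -> [('the', 3), ('cat', 1), ('sat', 1)]
```

    Fast to write and easy to read - which is exactly the development-speed side of the tradeoff the post describes.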

    There's also more to computers than just how fast the processors are and how much memory they have. Modern computers are defined by a host of evolving industry standards. One of the reasons you don't want to be stuck with a 20 year old PC is that it doesn't support USB, and nobody sells devices that use serial or parallel connectors anymore.

    I haven't used Office in years.

    Because the whole computer was slow, or just because dialup was slow?

    20 years is a long time.
  9. Syzygys As a mother, I am telling you Valued Senior Member

    Sure there is. It should be affordable and do the job you expect it to do. If you want the longer version: it should be able to run the programs and applications you use. There should be enough storage space in it so you can keep your files and data. If it's a laptop, it should be light, with long battery life and strong wireless capability. It should handle the internet fast (streaming video without much preloading), and it should have enough connection ports of all kinds so you can hook it up to this and that. It should be reliable and last 3-5 years at least...

    Nowadays, even the cheapest computers fulfill these requirements...

    You must be really young. My first computer was a Sinclair ZX Spectrum - look it up. But even in the late 90s computers sucked big time: they were slow, the internet was slow, connection ports were missing, HD space was expensive, monitors were bulky, batteries died in 1-2 hours, RAM was expensive, etc...

    I told you, you were young...


    Both... Also, cable internet either didn't exist (in your area) or was prohibitively expensive.

    P.S.: Blast from 1982:
  10. Syzygys As a mother, I am telling you Valued Senior Member

    I know, but what is the point here? That in clock speed we have already reached a practical limit (around 3 GHz), where it is just not economical to go beyond. So they started to multiply the cores. How long can they keep doing that (8 cores? 16 cores?), and, more importantly, is it really necessary beyond a point?

    If my original assumption is correct, and we have already reached a point where most customers' demands can be satisfied with an average computer, then what is the point of making them faster, just for the sake of marketing? If a more expensive or technically more advanced computer won't run a program any more smoothly than the old one, then the money is pretty much wasted on the improvement...
  11. aaqucnaona This sentence is a lie Valued Senior Member

    Oh, they will find new ways to use computing power, they will. I can imagine you saying this in 2004, when Half-Life 2 came out and its Source engine was quite beyond its time, or in 2007 when Crysis came out, or last year when BC3 came out. The same is true for high-speed HD video streaming, 3D software, simulators, physics engines and other programs - 3D maps like an advanced Google Earth [that's your 32 GB of RAM right there], or high-speed Bluetooth interconnectivity between cell phones, laptops, PCs, office PCs, fridges, home control systems, etc. - that's your 10 MB/s internet speed. You could have 7K, 120 fps, 16xAA ultra-massive settings so that games look like Avatar.

    We are nowhere near done with computing. Sure, there will be those guys who just email or maintain spreadsheets or do nothing but make calls on their $600 smartphones, but for most of us the journey has just begun - and multitaskers, serious users and gamers are on the rise. Just see how much the smartphone industry is blowing up - PCs will be like that once they become cheaper than consoles and good software for a wide range of uses is easily available.
  12. przyk squishy Valued Senior Member

    But that's the point I was making: people's expectations aren't static. They change, and that's influenced by what's available.

    Now, you could say that you're happy with the applications and operating system you're currently using and don't see any reason to change, and you're happy to turtle shell yourself while the rest of the world moves on, but then you're no longer talking about the average user. You're ignoring that the average user cares about their computing experience being as simple and trouble-free as possible, and a large social aspect of that is interoperability with what everyone else is using.

    Let's say you buy a good modern machine now and decide to stick with it. Now, fast forward 20 years. Other users, whether they really needed to or not, bought the newest machines (because, why not, and because 20 years' worth of teenagers got a new computer when they started college), and along with them came new versions of software and new industry standards. All your applications run just as fast as they did 20 years before[sup]*[/sup], but support for your version of Windows was finally ended 5 years ago, and since then there haven't been any updates or security fixes. Whenever someone emails you a Word document, they have to make it "Office 2010 compatible". Half the time, they forget. USB has become obsolete - new computers don't use it, so you can't casually give someone your USB key. You have to give them the key and some quaint USB-to-whatever adapter you bought ten years ago. Let's say your hard disk or DVD drive breaks down. You can't easily get a replacement, because the bus standard has changed and manufacturers are all building for the new one. Then you walk into Walmart and see a new, modern system going for just $300. With the new system, the increasing number of workarounds you've been having to live with would all go away instantly. What do you think the average user would do in that situation?

    [sup]*[/sup]Windows has this nuisance that the registry becomes increasingly cluttered over time, which slows down the whole system. From what I hear, keeping a Windows machine running as fast as new often means periodically reinstalling the whole system or carefully limiting the number of applications you install and remove.

    But there's a problem here: the kind really matters, and that's a standard that changes with time. One of the things that makes a 20 year old computer near useless to the average user is that of its 2 or 3 ports, exactly zero of them are USB.

    For the record, I've used a Tatung Einstein. (Then again, I was about 7, and it was basically a toy.) I don't think the thing even had a hard disk.

    They suck in retrospect. They look pretty damn miraculous if you compare them with what was available in the 80s. And back in the 80s, simply being able to own a microcomputer and use it at your desk at home looks pretty miraculous compared with the minicomputers of the 70s.

    I was actually using a mid-90s computer (Windows 95, 100 MHz Pentium, 16 MB RAM, 500 or so MB hard drive) for a time. The main reasons I thought it sucked all had to do with that time being the early 2000s. In fact, my biggest annoyance was with the hard drive capacity, which I think was quite small even by mid-90s standards.
  13. Syzygys As a mother, I am telling you Valued Senior Member

    Sure, but that is not the point. The point is whether we really need that much computing power. After all, streaming video doesn't take that much computing...

    See, we are back to gaming again, which is, first, a special need and, second, pretty darn good nowadays. If you are not satisfied with it today, chances are you won't be 20 years from now...

    Netflix streams just fine for me.

    All special needs. You can find the nearest pizza joint with a 2D GPS; you don't need a 3D "holding your hand and driving your car" program for it.

    Nobody said we are done, but we are doing pretty well. By the way, I don't have a smartphone, because I like to talk to people if I want to communicate with them...

    But for the sake of fun, what if we run into other limits and computers kind of stop improving much? Who is going to drive your car???
  14. Sarkus Hippomonstrosesquippedalo phobe Valued Senior Member

    Streaming of HD content is just about feasible for the majority of people, but soon there will be 4K TV. There will be 3D TV. There will be glasses-free 3D (which will require an update to the screen you use). And who knows what else.

    Downloading movies in 5 minutes only applies to highly compressed videos... around 700 MB in size, compared to full DVD quality, which is nearer 8 GB, and a Blu-ray movie, which is closer to 30 GB.
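    As a back-of-the-envelope check of those sizes (a Python sketch; the 10 Mbit/s connection speed is an assumption for illustration, not a figure from the post):

```python
# Rough download times for the file sizes quoted above, at an assumed
# 10 Mbit/s connection (decimal units: 1 GB = 8000 megabits).
sizes_gb = {"compressed rip": 0.7, "DVD quality": 8.0, "Blu-ray": 30.0}
speed_mbit = 10  # megabits per second - illustrative assumption

for name, gb in sizes_gb.items():
    minutes = gb * 8000 / speed_mbit / 60
    print(f"{name}: about {minutes:.0f} minutes")
# compressed rip: about 9 minutes
# DVD quality: about 107 minutes
# Blu-ray: about 400 minutes
```

    So at that assumed speed, only the compressed rip comes anywhere near the "5 minutes" figure; the uncompressed formats are an order of magnitude away.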

    But of course we don't need that much computing power.
    It is just that it is available, and people do make use of it, and develop things to utilise that much power, requiring people to buy that much power to use them.
    First, it is no more "special" than any of the consoles. And it is a multi-billion-dollar industry.
    Second, they ARE pretty darn good... but that's what we thought 10 years ago, and even when Pac-Man first came out we thought it was the pinnacle of technology.
    But in 20 years' time they will be that much better, with better AI (requiring greater resources), greater interactivity with the environment, etc.

    As said, if all you ever intend to do is what you currently do, then yes, current computers are fine.
  15. Syzygys As a mother, I am telling you Valued Senior Member

    I know and that is just fine, but most of the needs are nicely covered right now.

    You might want to check out what computers the average BUSINESS person uses today, and note that XP is still on 25% of computers. That is 11 years after it was born.

    Why would I be ignoring that? Today it is really simple and care-free...

    So, then why would I worry about my 20 years old computer?

    Now you forget that I never said you can't update Windows. I was saying that the hardware will last 20 years, not that the OS never needs updating. Do you think Windows 12 will need more than a quad core and 8 GB?

    Sure they buy a new one. That's why, when people ask what computer they should buy, I tell everyone to just buy the cheapest new one...

    I agree, that could be a problem.

    You guys always think about gaming, but never about business, although that is what drives or slows down innovation. Lots of companies are still using XP, because it was finally a safe and stable environment. And if it does/did the job, there was no real reason to upgrade a whole company's infrastructure and mess with a well-working system. My wife, who is in logistics, still has XP on her work laptop.
    Computer manufacturers are careful not to piss off business people, with their huge orders. If their Office and PowerPoint presentations run smoothly on 1 GB of RAM, there is no need to force them to buy the newest gadget for every worker...

    Sure, Microsoft wants to sell you a new OS every 3 years, but seriously, do we need it? What is the huge difference between 7 and 8? And by the way, did I mention I was sweating bullets when I tried to use HomeGroup in 7? They should make it work first, then try to sell me the latest and newest...
  16. Syzygys As a mother, I am telling you Valued Senior Member

    That is your ISP, not your computing power. A brand-new machine today could suck the data up, if the ISP could provide it.

    Would you please forget about consoles? In this thread I provided data showing that about 20% of computers are used for high-end gaming. 20% is by definition a special need, not average. 50-60% would be average...

    I agree, and that is my point. If they are darn good already, how much will people pay for slightly better?

    Get real...

    Again, the average computer is NOT used for high-end gaming...
  17. quadraphonics Bloodthirsty Barbarian Valued Senior Member

    Actually, much of the hardware in computers is designed to fail long before 20 years of use go by. The hard drives in particular are only expected to last a third that long (if that), the capacitors on the boards will go some time after that, and you'll even get silicon failures on the ICs when you get up into the decade range.

    If you've ever built and maintained your own computer, you know perfectly well that you end up swapping out everything except the case itself within 5-6 years, especially if you go for the cheapo components.
  18. Syzygys As a mother, I am telling you Valued Senior Member

    Sure, but that is not the point of this thread. Why is it so hard to see what the point is???
  19. Gustav Banned Banned


    that sounds like some evil plan to bilk the consumer

    i would imagine that it is more an inherent feature of the materials used that causes a limited shelf life. i suppose some could demand more exotic metals and whatnot be used, but are we willing to pay the price? it is a simple tradeoff between affordability and longevity

    from wiki....
    If marketers expect a product to become obsolete, they can design it to last for a specific lifetime. If a product will be technically or stylistically obsolete in five years, many marketers will design the product so it will only last for that time. This is done through a technical process called value engineering. An example is home entertainment electronics which tend to be designed and built with moving components like motors and gears that last until technical or stylistic innovations make them obsolete.​
    so what the fuck should they have used? flash drives in the era of tape, belt drives and servos? we had no goddamn choice

    One-hoss Shay

    That was built in such a logical way
    It ran a hundred years to a day,
    And then,
    went to pieces all at once, --
    All at once, and nothing first, --
    Just as bubbles do when they burst.
  20. aaqucnaona This sentence is a lie Valued Senior Member

    Depends on one's use, but for many things we don't need it. I am saying that they will find new ways to need it.

    That's not the point. The point is that there is room for improvement. Just look at the two links I gave; for gaming itself, maybe you will be correct in 20 years.

    But it won't be for those who have 7K HD full-wall projections connected to their PCs.

    Yes, but that market is growing massively.

    But a smartphone has many more uses than just that, though you do prove my point.

    Can you please elaborate?
  21. Sarkus Hippomonstrosesquippedalo phobe Valued Senior Member

    20% is "special need"??? A minority, sure, but special need?
    And it seems you want to ignore the prime driver for the PC market?

    Furthermore, what is currently the spec for "high-end gaming" will be the spec for "low-end gaming" quite soon.
    A "high-end gaming" computer from 2002 would seriously struggle to play any of today's lower-requirement games.

    But basically your argument boils down to: If you only ever want to do what your current PC is capable of, you'll be good for 20 years!
    This is a truism, but also fairly useless.

    Depends on their cost/benefit analysis.
    Yep - strip out all the drivers for improved tech, and you are left with the same computer for the next x years.

    So are you actually going to define what an "average PC" is used for? Or are you just going to discount anything raised that might actually result in the need for a new PC, such as high-end gaming, video-editing etc? :shrug:
  22. Syzygys As a mother, I am telling you Valued Senior Member

    Which would also be special needs....

    Just for the sake of a thought experiment, let's say that after clock speed hit a practical limit around 3 GHz (a fact), the doubling of cores also hits a practical/economical limit at, let's say, 8 cores. There will be small tinkerings, but computers kind of stop at 3 GHz and 8 cores. Now what will happen then?

    First, software starts to improve (like games for the same old consoles); through the years, using the same motherboard setup, programs will still get better and better. Second, we will learn to live within our limits and make programs that fit this setup, which is most programs...
  23. Syzygys As a mother, I am telling you Valued Senior Member

    Depends on the definition, doesn't it?

    My Jeopardy answer: What is profit?

    True, but that is my point. Today's low-end computers (under $500) are pretty darn good for gaming... Does Crysis play well on a quad core with 8 GB?

    The current cheap PCs can handle 80% of everyday tasks just fine... And since the average demand doesn't seem to grow that fast anymore, they should be good for a long time...

    I haven't?

    As was shown, the average PC is not used for high-end gaming or for video editing, so ......
