Over the next 30 years, I think that humanity may be approaching an historic watershed comparable to the taming of fire or the conversion from a food-gathering economy to a food-producing economy. As computers reach the take-off point in the implementation of certain key technologies, we might expect them to play an ever-more-critical role in this historic transition. I've been forecasting computer technology for 24 years, and the results have been fairly accurate (owing more to providential good luck, I'm afraid, than to any cleverness on my part). (Please visit "http://www.geocities.com/rnseitz/Relativity_Made_Simple.html" to check out my previous rash forecasts.) And now, it's once again time for me to throw caution and good judgement to the winds, and once again, rush in where angels fear to tread. Are you ready? Here we go!
Table 1: What You Might Expect on Your Desktop for $1,000-$1,200, 1997 through 2030:

|Date|MHz/Processor|Speed|RAM (MB)|Hard Drive (GB)|Accelerator|DVD (GB)|Bandwidth (Mbps)|Video/Other|
|4/00|600/K7|1.2|128|12|30|4.3|1.5|640X480 Vidcam, Zip|
|4/02|1,500/Merced II|5|512|35|150|15?|1.5|640X480 Vidcam, RW|
|4/03|2,500/Merced II|7|1,024|60|200|25?|1.5|800X600 Vidcam, RW|
|4/05|4,500/Merced III|13|2,048|200|400|180.0|1.5|1280X1024, RW DVD|
|4/06|6,000/P8|16|4,096|400|500|180.0|1.5|1920X1024, RW DVD|
Notes for interpreting Table 1:
All entries after 2012 are highly speculative. They are included only to show what a simple extrapolation of Moore's Law would project (More about this below).
(Column 2) - Multiprocessing (use of multiple processors) is moving into the mainstream, with the future possibility of placing more than one microprocessor on a chip, or of paying somewhat more for multiprocessor boards (which would sell more chips). This can provide an alternative method of achieving higher speeds for certain classes of partitionable, computationally demanding calculations. Also, the long-term role of Intel's IA-64 (P7) architecture on the desktop is far from clear to me. Designations like Merced, P8, P9, and P10 are meant to be suggestive rather than prophetic.
(Column 3) - Computer speeds have become much more difficult to quantify with the introduction of super-scalar, super-pipelined, out-of-order processing, as well as MMx. Add to that changes in benchmarking from Megops to MIPS to SPECInt92s to SPECInt95s, and it becomes hard to know where you are. Integer applications that are conducive to MMx may run several times faster than shown in Column 3.
(Column 4) - RAM (Random Access Memory) has moved from EDO RAM in '97 to SDRAM in '98 and '99 to Rambus in 2000. Access to memory has become a serious bottleneck to faster computing, and new techniques are appearing to facilitate faster and faster memory access.
(Column 5) - Hard drive capacities are complicated by the fact that:
- (a) hard drive capacities have begun to double every year rather than every 2.25 years, and
- (b) 3.5" disk capacities are expected to top out at a theoretical limit of 500-to-1,000 gigabytes.
(Column 6) - Multimedia accelerators continue to operate at speeds up to 30 times the speed of the desktop computers they support. Compaq, beginning in 1997 with its MediaGX-based machines, and now Intel, are dispensing with graphics accelerators, hardware modems, and sound cards, using the fast central processor to perform these functions and to lower overall computer costs. "Systems-on-a-chip" are an attractive way to cut overall computer costs, and may play a role in bringing computers to virtually every household.
(Column 7) - DVD Drives: 25-gigabyte DVDs are required for full-length HDTV movies. 160-GB DVD drives are allegedly in the works for 2005, with 1,000-GB DVD drives forecast for 2010. Rewritable DVDs may take over from RW-CDs as the preferred backup medium.
(Column 8) - Communications Bandwidths are a bit of a wild card, depending as they do upon information-utility politics and upon upgrading of Internet servers, lines, and hubs. However, a first-generation of lower wideband service, in the 400 kilobaud to 1.5 megabaud range, may spread rapidly during the 2000 to 2005 time period.
(Column 9) - Video cameras have dropped as low as $30.
How Far We've Already Come:
We are in the midst of an ongoing computer technology revolution that dwarfs anything else in human experience. Today, you can buy a megabyte of RAM (Random Access Memory) for $1.00. Thirty-three years ago, in 1967, my employers at NASA paid $3,000,000 to buy that same megabyte of RAM for their Univac 1108 computers: a 3,000,000:1 price reduction! If this had happened in the automotive world, it would be as though you could buy a new Rolls Royce, costing $30,000 in 1967, for 1¢ today! Twenty-three years ago, in 1977, when Radio Shack and Commodore introduced the world's first personal computers, Radio Shack had to charge $266 for 8 kilobytes of RAM. Today, 64 megabytes of RAM costs $64 (about a 30,000-to-1 ratio).
It may well be that the computer sitting in front of me here as I type is more powerful than all the world's computers put together in 1967. Here again, there is a price/performance improvement of the order of 3,000,000:1. But if that doesn't astonish you, try this. In 1946, ENIAC (Electronic Numerical Integrator and Calculator), the first electronic digital computer, was able to perform something like 3 decimal calculations a second. Fifty years later, in 1997, Intel delivered a computer that performs one trillion decimal calculations per second: 300 billion times faster than ENIAC! A 10-trillion-decimal-calculation computer is due for delivery this year, rising to 30 trillion in 2001 and 100 trillion in 2005.
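The speed ratio quoted above is easy to sanity-check. Here's a quick sketch using the article's own figures (ENIAC at roughly 3 decimal calculations per second, and the 1997 Intel machine at one trillion):

```python
# Ratio of Intel's 1997 one-teraflop machine to ENIAC, using the
# article's own figures (~3 decimal calculations per second for ENIAC).
eniac_ops_per_sec = 3.0
intel_1997_ops_per_sec = 1.0e12

ratio = intel_1997_ops_per_sec / eniac_ops_per_sec
print(f"{ratio:.2e}")  # ~3.33e+11, i.e. roughly 300 billion times faster
```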
The cheapest magnetic disks in 1967 stored one (1) megabyte of data and probably added $10,000 to the price of an IBM 1130 minicomputer. The current price is less than a penny a megabyte: a 1,400,000-to-1 price-performance improvement. (Until recently, magnetic disk prices declined somewhat more slowly than RAM prices.)
The Internet isn't feasible in its present and future states without this accompanying personal-computer power. (If you doubt it, try accessing the Internet with a 1990-vintage computer.) I believe that this computer revolution, and the accompanying revolution in wideband communication, is at least as important as were the railroad and the telegraph (and later, the telephone) in the 19th century.
For the past 35 years, both speed and storage have improved by a factor of 10 every 5 years: Moore's Law.
This is always difficult to call. For at least the past 33 years, memory capacities have doubled every 18 months, 100-folding every 10 years. For example, Fairchild Semiconductor introduced 256-bit RAM chips in 1967. One-thousand-bit chips followed in 1970. One-billion-bit SDRAM chips are being sampled this year, precisely as we would expect if memory capacities 100-folded every decade. However, there have been dire warnings every inch of the way that we were about to hit a brick wall. The problem is that, sooner or later, that's going to happen. We are shrinking linear circuit dimensions by a factor of 10 every decade, and therefore shrinking areas by a factor of 100 every decade, which is what has allowed us to expand memory capacities by a factor of 100 every decade. Right now, our smallest circuit features are about 1,800 Angstroms, or 500 atoms, across. In 10 years, circuit features would drop to 50 atoms across; in 20 years, they would span 5 atoms; and in 30 years, they would be only 0.5 atoms wide. Obviously, at some point, circuit shrinkage is going to stop.
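The shrinkage arithmetic above can be written out directly. The 1,800-Angstrom starting point and the 10x-per-decade linear shrink are the article's figures; the ~3.6-Angstrom atomic spacing is implied by "1,800 Angstroms or 500 atoms across":

```python
# Feature size under a 10x-per-decade linear shrink, starting from
# today's ~1,800-Angstrom (~500-atom) circuit features.
ATOM_ANGSTROMS = 1800.0 / 500.0   # ~3.6 A, the implied atomic spacing

feature = 1800.0                  # Angstroms, present day
for years_out in (0, 10, 20, 30):
    size = feature / 10 ** (years_out / 10)
    print(f"{years_out:2d} years out: {size:7.1f} A = {size / ATOM_ANGSTROMS:5.1f} atoms")
```

The loop reproduces the sequence in the text: 500 atoms today, 50 atoms in ten years, 5 atoms in twenty, and an impossible 0.5 atoms in thirty.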
Right now, we're as good as guaranteed that progress will continue unrestrained through 2002, as circuit features shrink from their current 1,800-Angstrom level to 1,000 Angstroms. Many experts believe that current technology can be pushed considerably further than that.
There is reason to believe that this astounding rate of computer technology improvement will continue through at least the year 2012, and possibly through the year 2030. The Semiconductor Industry Association's 15-year Technology Roadmap projects 13 GHz on-chip clock speeds (compared to 1 GHz today), 3 GHz cross-chip clock speeds, and 256-gigabit DRAM memory chips (compared to 256-megabit chips today) by the year 2012. (With 3 GHz on-chip clock speeds and 1.5 GHz cross-chip clock speeds coming toward the end of this year, I actually expect to see on-chip clock speeds in excess of 13 GHz by 2012.) By 2015, computer speeds of 1 to 2 terops might be expected, together with 1-terabit memory chips. By 2020, we would anticipate 10-to-20-terops processors and 8-terabit memory chips. By 2025, we would be looking for 100-to-200-terops microprocessors and 128-terabit memory chips. And by 2030, we would expect to see 1-to-2-petops CPUs and 1-petabit memory chips.
It's certainly not a sure thing that we can reach these milestones this rapidly, if at all.
I haven't extended my forecast beyond 2030 because at our present rate of progress, we would reach atomic dimensions on our chips by approximately 2030, and my crystal ball becomes a little murky when we try to pass beyond that miniaturization threshold.
What will we do with all this speed and storage capacity?
There is enormous room for improvement in computer capacity over the next 15 or 30 years, just as there was 15 or 30 years ago. The most important applications for this swelling computer power will be invisible, taking the form of embedded or industrial processors, and the most important role for robotics will be, as it has been in the past, industrial and commercial.
Artificial Intelligence and Robotics:
At the top of the list of watershed events that I mentioned at the beginning of this article are the anticipated breakthroughs in artificial intelligence and robotics. The implementation of human-caliber-or-beyond artificial intelligence, if it can happen, would forever change our daily world. This has been a staple of science fiction for decades. In many of the stories, the computer intelligence takes over. In A. E. van Vogt's 1945 novel, "The World of Null-A", "The Machine" tests 25th-century citizens for rationality, and serves as a guardian for good government. In "The Humanoids", the robots arrive from outer space to "serve mankind and protect him from harm"; in the end, they chemically lobotomize humans to protect them from themselves. In Fred Saberhagen's Berserker series, intelligent machines set about to sterilize the universe of organic life. In "Dinosaur Beach", Keith Laumer has a god-like machine from the "Tenth Era" create the universe, and then protect it from time-travelers. Robert Heinlein has computers assume human form so that they can enter into normal human lives. However, even if we don't emulate human intelligence, gains in computer power that may be expected over the next ten years should produce corresponding gains in such human capabilities as practical speech dictation, natural language understanding, and voice-operated language translation.
Intelligent agents that can carry on limited conversations, with rudimentary understanding of certain types of requests, may show up over the next ten or twenty years.
2000 to 2030 time period: Robotic devices will probably tend to be specialized machines that will perform specialized functions, rather than anthropomorphic machines. In part, they will simply be improvements upon robotic devices that are already in limited use.
2003: The first visually guided automatic platforms, such as automatic sweepers, serving carts, and lawn mowers might possibly appear in restricted locations. Speech recognition, natural language understanding, and voice-operated language translators are slowly improving, although they're still pretty crude.
2005 to 2010: Automatic sweepers, serving carts, and lawn mowers may begin showing up in commercial settings such as motels, conference centers, and lawn services.
Speech recognition, natural language understanding, and voice-operated language translators begin to move into the mainstream.
Personal digital assistants that respond verbally to verbal requests are becoming popular.
Voice-operated typing systems are becoming practical.
2010 - 2030: There may also be new robotics applications, including improved Animatronics at Disneyworld, and perhaps in our family rooms.
2015: You may be able to find household sweepers, floor scrubbers, and lawnmowers on sale at Walmart.
2020: General-purpose robots, with manipulators, that can make beds, load and unload washers and dryers, fold clothes, and work in the kitchen begin to appear in commercial environments.
Imagine seeing a truck tooling along the Interstate with no one at the wheel!
It's 5 p.m. in Toronto on a dark, chilly day in January, 2026. Your spouse picks you up from work in your self-contained, self-driving van. You change clothes and relax for a few minutes, and then the two of you eat supper, sitting in the van's dinette, talking and watching the news, as the van drives you southwest on the Queen Elizabeth Highway. Snow is everywhere, and there's more on the way. It's been a dark, gloomy day, but the Interstates are clear. Before the next blizzard hits, you should be far enough south that it won't matter. After supper, your spouse watches a movie-on-demand while you videoconference over the Internet, write an email to your sister, and settle down to read a good book. At 6 p.m., your van passes St. Catharines, stopping momentarily at customs, and then picking up I-80 West to rendezvous with I-79. Traveling at 100 mph, you reach northern West Virginia by 9:30. Your van stops at an automatic "gas" pump to refill your fuel tank. By 11:00, when you go to bed, you're tooling down I-77 on your way to I-81. The next morning, when you awaken at 7, the van has reached Orlando and the campground at Disneyworld. You eat a bite of breakfast at McDonald's. The birds are singing, and it's shirtsleeves warm... wonderful after a long, dark winter! Thirty minutes later, you're parking in the campground near Disneyworld. On Monday afternoon, you'll start back home, arriving at the house at 7 a.m., in time to get ready for work.
U. S. Army: 30% of fielded forces to be robotic by 2015 (see "http://www.geocities.com/rnseitz/Robotic_Army.html"), and the Air Force has announced similar plans (see "http://www.geocities.com/rnseitz/UAV-AF.html").
U. S. Department of Transportation schedule: autonomous vehicles in the 2020 time frame. Mercedes trucks are already being equipped with visually triggered lane-switching signals: "http://www.geocities.com/rnseitz/Mercedes_Computer_Guidance.html". To this will be added collision warning and, possibly, automatic emergency steering, in case the driver is drunk or unconscious at the wheel (e.g., with an epileptic seizure). These capabilities will appear first in luxury automobiles, and will gradually diffuse downward into lesser automobiles. One of the problems that will accompany self-driving vehicles will be the matter of ensuring that they are road-worthy and fail-safe. This may require special licenses and annual inspections. Built-in diagnostics may provide partial control, but questions like tire type and wear would be more difficult to automate, although, given low-cost visual inspection systems, it may be that computerized inspections can be performed at Department of Transportation inspection centers.
The Department of Transportation will have to be very careful to avoid deadly failures and bad press. The media will be waiting to pounce.
Safer. Such operations, once they are thoroughly debugged, might afford greater driving safety than is currently possible with human drivers, since the computer control systems may be given lightning reflexes, can communicate with computers in adjacent vehicles, and won't become drunk or reckless. Free home deliveries might become a practical fringe benefit of self-driving trucks.
Cheaper. Self-driven trucks could lower shipping costs.
How autonomous vehicles will fit our existing highways and Interstates, I don't know. There is a stretch of Interstate 5 near San Diego that is equipped for experimentation with self-driving vehicles, and several of the major carmakers are running their experimental autonomous vehicles on these roads, with a human driver on standby at the wheel.
One fact that I don't understand is why we don't have navigational coding along our present highways that would permit passing vehicles to determine their locations. This could take, for example, the form of barcodes, embedded magnets, or tuned circuits embedded in the pavement. (Of course, GPS, and particularly differential GPS, affords another technique for precisely locating motor vehicles.) Given sufficient precision, it should be possible for vehicles to coordinate their driving in such a way as to greatly reduce the probability of accidents. And this could be done with traffic control systems embedded in intelligent highways (putting the "smarts" in the road) rather than depending upon vision- or radar-guided vehicles (putting the "smarts" in the vehicle). However, it looks as though such a system would have been feasible decades ago, and for some reason, it hasn't been done.
NASA has a very far-sighted program underway, SATS (the Small Aircraft Transportation System), to promote personal aircraft that will soon be self-piloted.

Figure - The sportscar-like cockpit of the Cirrus C-20, a state-of-the-art personal aircraft designed for mass production. The heads-up display gives a synthesized view of the terrain even in clouds and dense fog. This aircraft is designed to be as easy to fly as a car, and is the only aircraft that comes with a parachute.
Through 2002: Rapid spread of Internet appliances, first-generation wideband, wireless, Bluetooth, free access, free wideband access. Natural-language web searches make their debut. A huge commercial success, Ananova sets the stage for "the face of the web".

2002 - 2011: Wideband is spreading rapidly, with wireless Internet access in cars. Commerce continues to grow, and more people gain Internet access. Hans Moravec delivers the first mobile robotics platform. Prototype visually guided automatic sweepers, serving carts, and lawn mowers are in the news, and might possibly appear in restricted locations. Bluetooth is used to "wire" older homes for wideband. With larger displays and wideband, virtual reality is gradually gaining ground. Wireless cellular PDA's are HOT! A global Internet marketplace; the Internet is gradually becoming ubiquitous and wireless. Speech recognition and voice-operated translators are marginally useful. Intelligent agents are supporting wireless PDA's. Voice mail and video-conferencing are finally gaining acceptance now that wideband is widespread. Collision warning and automatic lane-signaling options appear on trucks and, perhaps, on top-of-the-line cars. Second-generation wideband (1.5 to 12 megabits per second); Internet TV takes off. Fiber-to-the-curb is appearing, as the Bell Systems respond to competition. Speech recognition, natural language understanding, and voice-operated language translators move into the mainstream. Automatic sweepers, serving carts, and lawn mowers might begin showing up in motels, conference centers, and lawn services. Intelligent agents appear in operating systems and as sales reps. Wireless wideband interconnectivity among devices, including Internet appliances, in the home. Personal digital assistants that respond verbally to verbal requests are becoming popular. Voice-operated typing systems are becoming practical. Conversational dolls are becoming popular, along with scare movies about robotic dolls that take over the world. Media: "Children will think such dolls are real."

2011 - 2016: Computer-based receptionists are becoming an incompetent and hateful successor to voicemail, displacing humans. Automatic sweepers, automated carts, and lawn mowers enter the home; on sale at Walmart, they're driving down lawn-service prices. Interactive female intelligent agents are beginning to offer help in making reservations and other services. Voice-interactive personal digital assistants are the cat's meow. Collision avoidance and emergency driving backup are beginning to appear. Conversational dolls are remarkable, and smart toys are everywhere.

2016 - 2026: Computer-based receptionists are becoming a little less hateful. Automatic sweepers, lawn mowers, trimmers, and edgers are becoming privately owned, edging out lawn services. AI modeling advances through ever-more-sophisticated web-based services. Web-based intelligence? Brother voice-writers are on sale at Walmart for $139.95. Self-driving trucks and autonomous-vehicle highways are beginning to appear. Computer-based office "personnel" are becoming widespread, augmenting humans. General-purpose robots with manipulators, that can make beds, load and unload washers and dryers, fold clothes, and work in the kitchen, begin to appear in commercial environments. Some stretches of Interstate are designated for autonomous vehicles, mostly trucks. Conversational dolls are beginning to play important educational and emotional roles in children's lives.

2026 - 2031: General-purpose robots are becoming popular for home use. AI modeling is leading to human-like robotic entities, though still far from human. More and more highways are being certified for self-driving, including most Interstates; they're safer, and are permitted higher speed limits. Lifesize conversational dolls are beginning to serve as personal assistants.
Data Compression and Display
Improvements to look for here include better data compression (MPEG-2 and MPEG-4 encoding) and new displays: graphics, virtual reality, and laser holographic displays.
Computer Games and Virtual Reality
Computer games and simulations can profit mightily from orders-of-magnitude improvements in computer speeds. Right now, games like "Riven" have to jump from snapshot to snapshot rather than allowing virtual-reality-style movement. Also, when "Riven" runs a "video clip", the resolution has to be reduced to blurry pixel blocks so that the computer can alter the image in real time. It could easily use a factor-of-ten improvement in computer speeds just to handle its video clips. Even a speed-improvement factor of 1,000 may not be enough to support photo-realistic virtual reality.
Computer games have begun to use multiple CD-ROM disks. They could easily and very profitably utilize tens of gigabytes of DVD storage if it were available. By the year 2012, I predict that game developers will find it easy to fill up terabytes of DVD storage, giving us photo-realistic virtual worlds to explore!
Nanotechnology and Genetics
Nanotechnology and genetic manipulation may be other watershed arenas that will begin to alter our lifestyles within 30 years. Telepresence and virtual reality will probably be important, although perhaps not in the same league with AI and robotics. Genetic computation and simulation may be another profitable application of computers, as we develop the ability to simulate genetic interactions, and perhaps, living cells.
In 1982, we were playing Pac-Man on our IBM and Commodore 64 computers. We might have wondered then how in the world we could use a 5,000-fold improvement in computer speeds and storage capacities. Now we know, and it has been wonderful. The next 15-to-30 years should be equally wonderful: "telepresence" with high-definition, 3-D "window-wall" displays, free Internet-based international videoconferencing, more-effective distance learning programs, virtual reality with kinesthetic and tactile support, and a truly global Internet marketplace.
Jesse Berst, the editor of ZDNet's AnchorDesk, has suggested that we may look back upon the year 2000 as the year in which the computer age began for the consumer.
How Far Can We Go?
Granted that improving computer speeds and storage capacities will be a good thing: how much can we improve them?
This requires a little background. In the early 90's, we were warned that computer progress could continue only until the year 2000. Then we were going to hit a brick wall, when circuit features got down to 0.2 microns (about half the wavelength of violet light). There were a number of "show-stoppers" at this point; if one problem didn't get us, another would. In 1997, Dr. Gordon Moore, the enunciator of "Moore's Law", reaffirmed this gloomy prognosis as he retired from Intel, the company that he had co-founded. In fact, said Dr. Moore, the cost of semiconductor "fabs" has risen so high that cost alone will keep future improvements from happening. "Moore's Law", he said, should now be replaced by "Moore's Lament". But in the meantime, the Semiconductor Industry Association, with the aid of 300 experts from around the world, had drafted its 15-Year Technology Roadmap, and it called for relentless progress through at least the year 2012. And last year, semiconductor vendors began manufacturing chips that use 0.18-micron design features, just about the brick-wall size at which the music was supposed to stop. Who was right? (Just last fall, an Intel researcher published an article in Science stating that we were about to hit that brick wall.)
The answer has just arrived. Semiconductor manufacturers have burst through the 0.2-micron "brick wall" as though it were tissue paper. Next year (2001), Intel, IBM, Texas Instruments, and Samsung will begin manufacturing computer chips with 0.13-micron design rules, and in 2002, they will switch to 0.1-micron chips, staying right on the Moore's-Law curve. The talk now is that everything will be good down to at least 0.07 microns. A lot of work has taken place at smaller dimensions. Back in 1990, Caltech was working on Stark-Effect transistors that used 10-atom (0.001-micron) design rules.
Similar "brick walls" have been touted in the past. The first such show-stopper that I remember occurred with 16-kilobit RAM chips in 1976, when it was learned that natural background alpha-particle radiation was causing errors in information retrieval. Another such barrier loomed large as we approached 1.0-micron design features. Semiconductor chips are made using a photolithographic process, and 1.0 micron (10,000 Angstroms) was approaching the wavelength of light and the diffraction limit for optics.
Why is this continuing reduction in circuit sizes important? Because as transistor sizes shrink, we can
(a) get more of them on a chip, and
(b) we can run the chip faster.
As the area required by a transistor shrinks, the amount of energy required to switch it off or on shrinks more or less in proportion to its area. And the amount of power it takes to operate a circuit is roughly proportional to its area multiplied by the speed, in megahertz or gigahertz, at which we operate it. So if we cut its linear dimensions by 2 and its area by 4, we can increase its clock frequency by a factor of 4, while dissipating the same amount of power (as heat). For example, if we can run our 0.18-micron chip at 1 gigahertz while dissipating 25 watts of power, then we should be able to run an 0.09-micron chip at 4 gigahertz while dissipating the same 25 watts of power.
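The constant-power argument above reduces to a one-line scaling rule. A minimal sketch (the 0.18-micron, 1-GHz, 25-watt starting point is the article's own example; `scaled_clock` is just an illustrative name):

```python
def scaled_clock(clock_ghz: float, linear_shrink: float) -> float:
    """Clock achievable at constant power when linear dimensions shrink
    by `linear_shrink`: area (and switching energy) falls by shrink**2,
    so the clock can rise by shrink**2 at the same power dissipation."""
    return clock_ghz * linear_shrink ** 2

# A 0.18-micron chip at 1 GHz / 25 W implies a 0.09-micron chip could
# run at 4x the clock in the same 25 W:
print(scaled_clock(1.0, 0.18 / 0.09), "GHz")  # -> 4.0 GHz
```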
So why don't we build these circuits right now at the smallest possible size? The answer is called "the learning curve". We don't yet know how. It's been challenging to say the least to have come as far as we have as fast as we have.
I feel fairly confident that circuit design rules can be reduced to the 0.07-micron level, which we're scheduled to reach in 2004, and perhaps the 0.05-micron level, which we're expected to attain in 2005, without necessarily falling behind the Moore's Law projections. And Rome would not in Tiber melt, and the wide arch of the rang'd empire fall, if we were to slow down a bit during the transition away from traditional photolithography. We can probably reach 0.1 microns using extreme-ultraviolet excimer lasers for our photolithography, but to go to 0.07 or 0.05 microns, we may have to switch from the ordinary-light photolithography that has underpinned semiconductor fabrication for the past 35 years to x-ray or electron-beam photolithography, and there could be a hiccup in that transition. However, Lucent Technologies is already fabricating 0.035-micron transistors, which would be the next stage in this downward progression, anticipated in 2007. Beyond this would lie 0.025-micron features, targeted for 2008, and 0.018-micron chips, projected (by me) for 2010. Right now, we should be able to economically manufacture one-gigabit SDRAM chips, and by 2010, we should be able to extend that to 64 gigabits on a chip.
By 2015, we would expect to reach 0.007 microns, permitting 1-terabit memory chips if we stay on the Moore's Law curve, arriving at 0.0018 microns and 8-terabit memory chips by 2020, 0.0007 microns by 2025, and 0.00018 microns (2 Angstroms, about atomic dimensions) by 2030.
1997 - 2010 Computer Applications
It is even more dangerous for me to try to forecast future applications of computer technology than the technology forecasts themselves, since such applications depend both upon knowledge I don't possess and upon non-technical factors. However, I think that, over the next few years, personal computers are going to impact the man in the street to an unprecedented degree. The next ten or fifteen years--and in fact, the next few years--should be very exciting. The timeline depicted below should be considered a listing of some possible near-term technology applications rather than a reliable schedule for these applications to appear.
Figure 4 - Computer Applications Timeline: conversational dolls begin to appear; video mail; first- and second-generation voice-writers; net video-casting; virtual shopping?; 3-D displays?; virtual reality becomes a major entertainment medium?; voice-writers begin to replace typists.
These projections are discussed in greater detail in the paragraphs that follow.
(2) Natural Language Understanding and Conversation with the Aid of Artificial Intelligence
Natural language queries first appeared in 1997 with Microsoft Office. Natural-language understanding means that you can ask the computer questions in everyday English, such as "How do I double-space documents?" The computer (usually) correctly interprets this question and tells you how to do what you want to do. At some point, one can probably expect to see this become widespread in the "Help" functions of various software packages. The ability to frame sentences, or to draw upon a large library of stored phrases, may be expected as the next stage in this progression, one that will eventually lead to limited "understanding" in certain specialized fields. Computer-based telephone answering software that can process messages (e.g., giving a forwarding number to a select set of people) will probably come next, followed by software that will "understand" simple statements and queries and will respond intelligently, permitting dialogs with computers. (Perhaps you'll soon get an answer to the question, "Does my computer really love me?") However, before this can happen, computers will need to improve their voice recognition capabilities by becoming more accurate and speaker-independent. It will probably be several years, perhaps 5 to 8, before this happens.
"Personal Assistants" may answer the telephone for us, and keep track of appointments, notifying us verbally.
Much of our equipment may respond to the spoken word.
(3) Voice-Operated Language Translation
(4) Robotic Lawn Mowers, Floor Scrubbers and Vacuum Sweepers
(5) Better Communications
(6) Net Video-Casting
(8) Virtual Reality
(9) Three-D Displays
(10) Toys That Converse with Children
(11) Autonomous Vehicles
(12) Printers, Scanners, Color Fax Machines, and Color Copiers
(13) Removable Disk Drives
(14) Video/Digital Cameras
And Looking Still Further Ahead
As previously mentioned, the Semiconductor Industry Association has an unpublished technology roadmap through 2022. As currently envisioned, circuit design rules will diminish to 0.05 µm by 2012.
There is evidence that materials still exhibit bulk properties down to 0.03 µm. If that can be stretched to 0.02 µm, then conventional semiconductor progress might continue to follow Moore's Law through 2022, with computer price/performance ratios halving every 18 months.
If so, then by 2022, your bargain-sale desktop computer will be equipped with 2 to 4 terabytes (2 to 4 million megabytes) of RAM and will run at a speed of 20 to 30 trillion operations per second or about 100,000 times that of a 166-MHz MMx Pentium. This hardware capacity should be sufficient to support human-level intelligence at a readily affordable cost. However, as mentioned in item 4 above, near-human intelligence may be reached with the aid of specialized digital signal processors well before 2022. The biggest obstacle here is probably that of software.
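That "about 100,000 times" figure is consistent with halving price/performance every 18 months. A quick check (taking 1997, the 166-MHz MMx Pentium era, as the baseline year, which is my assumption):

```python
# Cumulative improvement from halving price/performance every 18
# months, 1997 -> 2022 (the article's roughly 100,000x figure).
months = (2022 - 1997) * 12
doublings = months / 18.0
improvement = 2.0 ** doublings
print(f"{doublings:.1f} doublings -> about {improvement:,.0f}x")
```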
Even if circuit shrinkage were to stop abruptly in 2012, there should still be about six years' worth of price/performance improvements for RAM, as 256-gigabit RAM chips go into volume production. This should reduce RAM prices to, perhaps, $200/terabyte by 2018. One more round of shrinkage (or die-size expansion), to permit 1-terabit RAM chips, might reduce RAM prices to $50/terabyte by 2022. Of course, at some future time, the rate of semiconductor shrinkage will probably slow from its current hectic pace, but it probably won't stop altogether. Or, while transitioning to new technological approaches, the rate of progress may slow for a year or two and then speed up again.
The use of multiple processors offers an alternative approach to higher computer speeds when uniprocessor speeds finally peak out. The ability to pack billions of transistors on a chip should make it feasible to mount several microprocessors on one die. And of course, several such CPU chips, each containing several microprocessors, are possible. Costs would be somewhat higher than for single-chip systems, but not distressingly so. Furthermore, MMx instructions can run several times as fast as the main desktop processor for those types of calculations at which they excel.
How far can these ponies run? If by 2025, we reach the $20/terabyte-of-RAM, 40¢/terabyte-of-disk, hundred-terops speed range, then from a hardware standpoint, we ought to be able, easily and cheaply, to simulate human-class intelligence. This would probably be done using mass-produced specialized hardware and software. However, this may actually be achievable by the 2008-2010 time period, using special purpose accelerators. The real problem is going to be software.
Looking Ahead: What You Might Expect to Buy for $1,000 to $1,200
Appendices A, B, and C will be transmitted later under separate cover.