Weekly Editorials Page
5/24 to 5/30, 2001




5/30/2001:
   Tonight's lead article contains the fantastic news: God Diagnosed with Bipolar Disorder! God, whom we have always assumed to be the very paragon of stability and consistency, is now alleged to be plagued with moon-like inconstancy! La donna è mobile? Sorry, God. Be that as it may, You have my steadfast faithfulness and devout respect.
    Hard on the heels of the first shocking revelation comes a second: You may not be who you think you are! But if you're not you, who are you? And what other who is you?
    R2-D2, where are you? The robot's slow evolution provides a robotics update, giving some information about Friendly Robotics' $700 Robomower. It might be worth noting that Radio Shack is now offering the Robomower for sale. Hint: you might want to wait until it goes on sale at the end of the season for $599.95. And remember that Toro is offering the iMow for $900. To be really effective, robotic lawnmowers need better navigation systems. One possibility is differential GPS. Three-meter GPS is now available, but locally, it can be accurate to within a few inches by using differential GPS: "zeroing" the receiver on an exactly known location and then correcting other readings in accordance with their offsets from this reference location. Another ploy that might aid in lawnmowing would be the LED-photocell comb employed in the Lawn Ranger (Radio-Electronics, June/July/August, 1991). This comb had light-emitting diodes and photodiodes facing each other on its teeth. An onboard microprocessor counted the number of grass blades between each of the many pairs of teeth to determine whether the grass was cut or uncut. This would be another way to detect the edge between the mowed and unmowed grass on a lawn. An electronic compass and an odometer could also provide approximate location, in conjunction with a boundary wire at the edges of the lawn. The odometer would be reset whenever the boundary wire was encountered.
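The compass-plus-odometer scheme is simple enough to sketch in code. The following Python fragment is purely illustrative; the class name, the 2% slip-error figure, and the reset logic are my own assumptions, not anything from Friendly Robotics or Toro:

```python
import math

class DeadReckoner:
    """Tracks a mower's approximate position from an electronic compass
    and a wheel odometer.  All names and numbers here are illustrative
    assumptions, not taken from any actual mower firmware."""

    def __init__(self):
        self.x = 0.0               # meters east of the reference point
        self.y = 0.0               # meters north of the reference point
        self.odometer_error = 0.0  # accumulated distance uncertainty (m)

    def advance(self, distance_m, heading_deg):
        """Integrate one odometer reading along the compass heading
        (compass convention: 0 degrees = north, 90 degrees = east)."""
        theta = math.radians(heading_deg)
        self.x += distance_m * math.sin(theta)
        self.y += distance_m * math.cos(theta)
        self.odometer_error += 0.02 * distance_m  # assume ~2% wheel slip

    def reset_at_boundary(self, wire_x, wire_y):
        """Zero the accumulated error whenever the boundary wire is
        sensed, snapping the estimate to the known wire location."""
        self.x, self.y = wire_x, wire_y
        self.odometer_error = 0.0
```

The point of the boundary wire in this sketch is exactly the reset step: dead-reckoning error grows with distance traveled, so the estimate must be re-anchored to a known location every pass.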
    How about an electric scooter? Remember, in a few more years, they'll probably be powered by fuel cells, and you'll be able to ride for hours without refueling. It's interesting to note that fuel cells offer about 7 times the energy per unit weight of lead-acid batteries, or more than 3 times the energy density of nickel-hydride batteries. At 210 watt-hours per kilogram, they won't match gasoline, but they should render electric cars practical. Electric bikes are also coming on.
    I've begun reworking the Alzheimer's page to provide comments summarizing each of the research papers and, eventually, a review of the research, including possible ways to avoid Alzheimer's Disease and what can be done in the way of treatment.
5/27/2001:
    Table 1, below, shows a 15-year computer technology forecast that I made in 1997. Table 2 shows tonight's May 28, 2001, update to it. I've ignored the joint announcement by IBM, Sony, and Toshiba that they will be offering 1-teraflops computers for the desktop before 2005. The reason is that, at the moment, the information about this initiative seems rather vague. The last time IBM made a similarly dramatic announcement (in 1992), it concerned their Reduced Instruction Set Computer (RISC) PowerPC microprocessors, pledging an increase in computer speeds from about 15 MIPS in 1991 to 500 MIPS in 1995. In practice, they didn't quite achieve this, although their rival, Intel Corporation, reached 328 MIPS with the Pentium Pro in December, 1995.
5/28/2001:
    Another interesting line of conjecture concerns how they might achieve a 1-teraflops microprocessor on a chip. IBM is planning to build its "Blue Gene" machine from chips carrying 32 microprocessors apiece, each capable of about one billion floating-point operations per second (1 GFLOPS). Its first chips are scheduled for delivery by the end of this year. IBM's P4 PowerPC can run at one GFLOPS at a clock speed of about 450 MHz. By 2005, Intel has promised clock speeds of 10 GHz. In addition, with chip features shrinking from 0.18 microns in 2000 to 0.10 or 0.07 microns in 2005, chip densities may run four times as great as they did for the PowerPC, and as they will for Blue Gene. IBM could presumably pump out 20 to 25 GFLOPS per processor with a 10 GHz clock speed. IBM might also mount 64 or 128 processors on one chip, and some combination of these parameters could possibly lead to one-teraflops computer chips. However, one fact is clear: IBM's strategy will depend upon a much higher level of parallelism than is common today, and will probably be predicated upon the concepts embodied in its Blue Gene machine.
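To see how these parameters trade off, here's a quick back-of-the-envelope calculation in Python. The 1-GFLOPS-at-450-MHz baseline comes from the figures above; the linear scaling of throughput with clock speed is my own simplifying assumption:

```python
# Rough per-chip throughput: ~1 GFLOPS per processor at a 450 MHz
# clock, assumed (simplistically) to scale linearly with clock speed.
# These are back-of-the-envelope numbers, not IBM roadmap figures.
BASE_GFLOPS = 1.0
BASE_CLOCK_MHZ = 450.0

def chip_gflops(processors, clock_mhz):
    """Estimated aggregate GFLOPS for one multi-processor chip."""
    return processors * BASE_GFLOPS * (clock_mhz / BASE_CLOCK_MHZ)

for procs in (32, 64, 128):
    for clock_mhz in (450, 4500, 10000):
        print(procs, "processors @", clock_mhz, "MHz:",
              round(chip_gflops(procs, clock_mhz)), "GFLOPS")
```

Under that linear assumption, 64 processors at 10 GHz would comfortably exceed one teraflops (about 1,400 GFLOPS), while 32 processors at the same clock would fall somewhat short (about 700 GFLOPS).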
    By contrast, Intel's Itanium chip could be operating above 0.1 teraflops by 2005.


Computer Technology Forecast, 1997 – 2012 (Based on the Semiconductor Industry Association’s 15-Year Technology Roadmap)

Robert N. Seitz

September 30, 1997

Table 1: What You Might Expect on Your Desktop for $1,000-$1,200, 1997 through 2012:
Mo./Year | CPU MHz/Type | Speed MIPS | RAM MB | HD GB | Accel Gigs | CD GB | Comm MBaud | Video-camera, Removable Disk?
4/97 133/Pentium  160 32 3 1-2 0.6 0.033  
4/98 266/Pentium II 320 64 5 2-3 0.6 0.056   
4/99 400/Deschutes 800 128 8 5 4.3 0.112  
4/00 600/Katmai 1,200 128 12 8 7.6 0.256 640X480 Vidcam, Zip 100
4/01 900/Willamette 2,700 256 20 18 15 0.256 800X600 Vidcam,RW CD
4/02 1,200/Merced II 5,000 512 35 32 15 0.4 1,024X758 Vidcam,RW CD
4/03 1,500/Merced II 7,000 1,024 60 45 15 0.4 1,280 X 960 Vidcam, RW
4/04 1,800/P8 10,000 1,024 100 65 60 1.0 1,920X1,080 Vidcam, RW
4/05 2,200/P8 12,500 2,048 200 80 180 1.5 System on a chip? DVD?
4/06 2,500/P8 II 16,000 4,096 400 100 180 1.5 System on a chip? DVD?
4/07 3,200/P9? 25,000 4,096 700 160 300 6 System on a chip? DVD?
4/08 4,000/P9 II? 40,000 8,192 1,000 250 500 6 System on a chip? DVD?
4/09 5,000/P9 II? 50,000 16,384 1,500 320 750 12 System on a chip? DVD?
4/10 6,500/P10? 80,000 16,384 2,000 450 1,000 12 System on a chip? DVD?
4/11 8,000/P10 100,000 32,768 2,500 650 1,000 12
4/12 10,000/P10 120,000 32,768 3,000 800 1,500 12  

Computer Technology Forecast, 1997 – 2012 (Based on the Semiconductor Industry Association’s 15-Year Technology Roadmap)

Robert N. Seitz

May 27, 2001
Table 2: What You Might Expect on Your Desktop for $1,000-$1,200, 1997 through 2012:
Mo./Year | CPU MHz/Type | Speed MIPS | RAM GB | HD GB | Accel Gigs | CD GB | Comm MBaud | Video-camera, Removable Disk?
4/97 133/Pentium  160 0.032 3 1-2 0.6 0.033  
4/98 266/Pentium II 320 0.064 5 2-3 0.6 0.056   
4/99 400/Deschutes 800 0.128 8 5 4.3 0.112  
4/00 600/Katmai 1,200 0.128 12 8 4.3 0.256 Zip 100
4/01 1.2 GHz K7 2,700 0.256 30 18 4.3 1.5 RW CD
4/02 2 GHz K7 5,000 0.512 40 32 4.3 1.5 RW CD
4/03 3 GHz P4 or K8 7,000 1 60 45 15 1.5 RW CD/DVD/Zip 10 GB
4/04 5 GHz  10,000 1 100 65 60 1.5 RW CD/DVD/Zip 20 GB
4/05 7 GHz P7 or K8 12,500* 2 200 80 180 1.5 ?
4/06 10 GHz 16,000* 4 400 100 180 1.5 ?
4/07 15 GHz 25,000* 4 700 160 300 6 ?
4/08 25 GHz 40,000* 8 1,000 250 500 6 ?
4/09 35 GHz 60,000* 16 1,500 320 750 12 ?
4/10 50 GHz 80,000* 16 2,000 450 1,000 12 ?
4/11 70 GHz 100,000* 32 2,500 650 1,000 12 ?
4/12 100 GHz 140,000* 32 3,000 800 1,500 12 ?

* - If IBM, Sony, and Toshiba deliver on their promise to market a 1-teraflops PC by 2005, all these numbers can be knocked into a cocked hat. (They're pretty fuzzy anyway, since computer speeds depend upon the speeds of other components, such as bus and RAM speeds.)
5/25/2001: Where Our Ships Might Sail in the 21st Century
Computing Power, the Next 20 Years:
    My records show that in 1991, Fujitsu offered supercomputers ranging from 110 MFLOPS to 375 MFLOPS and renting for $46,000 to $126,000 a month. That's about one-tenth to one-third the speed of the PC on which I'm typing these words ten years later.
    In 1991, the fastest gun in the West was the Cray Y-MP, with a speed of 16 GFLOPS, or about the speed of a desktop dual-processor Itanium server today.
    The first teraflops computer, Intel's ASCI Red, was delivered in 1997. It required some 9,000 microprocessors to deliver its teraflops performance. IBM's, Sony's, and Toshiba's plan to deliver a teraflops to the desktop by 2004 or 2005 would mean that only 7 or 8 years had elapsed between the $100,000,000 teraflops computer and its $2,000 desktop equivalent.
    IBM's planned Blue Gene machine, capable of delivering 1,000 teraflops, is scheduled to appear in 2003 or 2004, at about the time that IBM, Sony, and Toshiba will market their 1-teraflops chips for the desktop. The Blue Gene machine will provide ten times the 100-teraflops speed that Hans Moravec has estimated as the computational power required for human-level intelligence. Of course, the Blue Gene machine will probably be deemed too expensive to use for artificial intelligence research. However, Dr. Moravec has estimated that a computing speed of about 0.1 teraflops should be sufficient to emulate the brain of a mouse. (The human brain typically occupies about 1,350 cc's and, if Dr. Moravec is correct, delivers about 100 trillion calculations per second. Consequently, a 1.35-cc brain should deliver about 100 billion calculations per second, and should be just about the size of a mouse's brain.) Similarly, a brain capable of crunching numbers at the one-teraflops level should fit into a volume of about 13.5 cc's (a little over an inch, or about 3 centimeters, in diameter)... about the size of a monkey's brain. If we should reach the 10-teraflops level by 2010 (only as far in the future as 1992 lies in the past), then we could field the computational capability of a chimpanzee's brain. Of course, I think that it might take much less computational power than living organisms possess to emulate their purely cognitive capabilities. Animal brains are responsible for exceptional sensory and motor capabilities that ensure their survival in the wild, which shouldn't be needed in a robot. Also, multiple processor chips might profitably be used to handle specialized functions such as vision, audio, and speech, much as we use graphics, telecommunications, and audio processors to augment the capabilities of a PC's main microprocessor.
A certain amount of processing power is required to handle functions such as computer vision or voice recognition, but those requirements may soon be amply fulfilled, with processing power left over.
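Moravec's volume-scaling argument above amounts to a single proportionality, anchored at roughly 100 teraflops for the 1,350-cc human brain. A quick sketch (the anchor numbers are the text's estimates, not measurements):

```python
# Moravec-style scaling: assume computational requirement scales
# linearly with brain volume, anchored at ~100 teraflops for the
# ~1,350 cc human brain (the estimate quoted above).
HUMAN_BRAIN_CC = 1350.0
HUMAN_TERAFLOPS = 100.0

def teraflops_for_volume(cc):
    """Estimated teraflops needed to emulate a brain of a given volume."""
    return HUMAN_TERAFLOPS * (cc / HUMAN_BRAIN_CC)

def volume_for_teraflops(tf):
    """Brain volume (cc) corresponding to a given computing speed."""
    return HUMAN_BRAIN_CC * (tf / HUMAN_TERAFLOPS)

print(teraflops_for_volume(1.35))   # mouse-sized brain: 0.1 teraflops
print(volume_for_teraflops(1.0))    # 1 teraflops: 13.5 cc, monkey-sized
```

The two printed values reproduce the mouse (0.1 teraflops) and monkey (13.5 cc) figures in the paragraph above.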
Clock Speeds versus Parallel Processing
    There are two basic ways to speed up computers: increase their clock speeds so that individual microprocessors can run faster, and run computations in parallel. Generally, serial computations can't run much faster than the computer's clock speed. Right now, with clock speeds approaching 2 gigahertz, and local clock speeds pushing 4 GHz, serial processing speeds on the order of a few gigaflops are typical. By 2005, serial clock speeds should be pushing 10 gigahertz. In the meantime, silicon-germanium chips are running at 40 GHz, and are pushing toward 140 GHz. The current speed limit is about 300 GHz, corresponding to a free-space wavelength of about one millimeter, or roughly 2,000 times the wavelength of visible light. It's possible that higher speeds might be achievable as circuit dimensions shrink. Of course, circuits that don't use clocks would function somewhat differently, but local circuit speeds would still approximate the equivalent clock speed for the circuitry being employed.
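The clock-speed/wavelength correspondence is just λ = c/f, which is easy to check:

```python
C = 2.998e8  # speed of light in vacuum, m/s

def wavelength_microns(freq_hz):
    """Free-space wavelength corresponding to a given frequency."""
    return C / freq_hz * 1e6

for ghz in (40, 140, 300):
    print(ghz, "GHz ->", round(wavelength_microns(ghz * 1e9)), "microns")
```

300 GHz works out to roughly a one-millimeter wavelength; visible light sits around 0.4 to 0.7 microns.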
    Beyond that, we have to turn to parallel processing, and that's presumably what IBM is planning for its 2004-2005 teraflops chip. As circuit densities increase, multiple Reduced Instruction Set Computers (RISC) on a chip should be feasible.
    Shrinking circuit dimensions reduces power dissipation and circuit heating per transistor as the square of the linear shrinkage, but the fact that we can place more transistors on the chip, as the inverse square of their linear dimensions, raises power by the same factor, so the total power output remains the same. However, as clock speeds increase, power dissipation and circuit heating rise in direct proportion, with the net result that power problems worsen in direct proportion to the clock speed. Various tricks are being devised to hold these power levels in check, since they're already at marginally acceptable levels. Meanwhile, the road gets steeper and steeper.
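The scaling argument above fits in a few lines. This is the classic constant-field scaling picture; the particular shrink and clock factors below are just examples:

```python
# Constant-field scaling sketch.  With a linear shrink factor s > 1:
# power per transistor falls as 1/s^2 (at a fixed clock), transistor
# count rises as s^2, so total chip power is unchanged -- until the
# clock multiplier f is raised, which scales power directly.
def relative_chip_power(shrink, clock_multiplier):
    per_transistor = clock_multiplier / shrink**2  # power per device
    transistors = shrink**2                        # devices per chip
    return per_transistor * transistors

print(relative_chip_power(1.8, 1.0))  # shrink alone: total power unchanged
print(relative_chip_power(1.8, 5.0))  # 5x clock: 5x the power, any shrink
```

The shrink factor cancels out, which is exactly the text's point: density scaling is power-neutral, and the clock speed alone drives total power upward.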
RAM Memory
    Conventional CMOS RAM will be using 1-gigabit memory chips within the next few years. The Semiconductor Industry Association's 15-year roadmap calls for 256-gigabit RAM chips by 2012. A foreseeable limit for conventional techniques might come at or near 1 terabit. This would require memory cells no more than 10 nanometers across. One ploy that might take us further would be to switch to magnetic or ferroelectric RAM... non-volatile memory, akin to flash. Unlike the dynamic RAM that we've used for the last 30 years, magnetic or ferroelectric RAM doesn't require any expenditure of power to maintain its contents. For this reason, magnetic memory chips or layers could be stacked to generate a three-dimensional structure. An upper limit on three-dimensional RAM might be 1 petabit. That would require 1,000 one-terabit layers. (Each layer would have to be no more than 10 microns thick.) However, that would seem to be a manufacturing nightmare, and it may never come to pass before some other storage concept takes its place.
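The geometry behind those limits checks out. Assuming one bit per 10-nanometer-square cell and 10-micron-thick layers (the figures quoted above):

```python
# Checking the chip-storage arithmetic: a 1-terabit layer of
# 10-nanometer cells occupies about 1 square centimeter, and 1,000
# such layers at 10 microns each stack to a 1-centimeter-tall cube
# holding a petabit.
CELL_NM = 10
BITS_PER_LAYER = 1e12   # one terabit

layer_area_cm2 = BITS_PER_LAYER * (CELL_NM * 1e-7)**2  # 1 nm = 1e-7 cm
stack_height_cm = 1000 * 10e-4                         # 10 um = 10e-4 cm
print(layer_area_cm2, "cm^2 per layer;", stack_height_cm, "cm stack")
```

Both numbers come out to 1: a petabit of this hypothetical stacked memory would be roughly a centimeter cube.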
5/26/2001:
RAM Memory (Continued)
    I would suspect that one petabit would be an upper limit on chip-oriented storage capacities. A quadrillion bytes (a petabyte) of memory would be an upper bound on what I would expect the human brain to store in its one quadrillion synapses. By way of comparison, if the human brain stored sensory inputs at the rate of 5 gigabytes per hour, then it would require about 40 terabytes for each year of storage, or a petabyte every 25 years. However, I believe that an artificial intelligence would require enormously lower data storage capacities than that, with synoptic images and summary information. Although the Penfield experiments indicate that the brain may be capable of recalling everything we ever experienced, that record certainly isn't accessible to us in our daily lives. What do I remember from May 25th, 1977, during the interval from 9:10 p.m. to 9:11 p.m.? Zip, that's what! I think we could get good recall levels for an artificial intelligence with a few terabytes of memory by using still images and summary information. (The still images could even be computer-animated to reconstruct video sequences of important experiences.)
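The storage arithmetic can be checked directly. Taking the sensory input rate as 5 gigabytes per hour (the rate that makes the yearly figure come out near 40 terabytes):

```python
# Lifetime sensory-storage estimate, using the round numbers above.
GB_PER_HOUR = 5
tb_per_year = GB_PER_HOUR * 24 * 365 / 1000   # terabytes per year
years_per_petabyte = 1000 / tb_per_year
print(round(tb_per_year), "TB/year;",
      round(years_per_petabyte), "years per petabyte")
```

This gives about 44 terabytes per year and a petabyte in roughly 23 years, close to the round 40 TB/year and 25-year figures quoted above.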
    Right now, RAM costs about $250 a gigabyte. In ten years, it should cost about $2.50 a gigabyte, or $2,500 a terabyte.
    OK. What other storage concepts could there be, and what could they do? The ultimate storage technique might involve storing a bit in every molecule of a three-dimensional crystal or an organic blob. It would be no mean feat to read the data from such a molecular array, especially at picosecond rates, but given that it could be done, there are about 6 × 10^23 molecules in a gram-molecular weight of matter. If the material were something like lithium niobate, a gram-molecular weight would total about 100 grams. Guessing its density to be about that of rock, or about 3 grams per cubic centimeter, there would be about 2 × 10^22 molecules in a cubic centimeter, which could store 2 × 10^22 bits, or about 2.5 × 10^21 bytes. That would be over 1,000,000 petabytes... a tremendous amount of storage. Alternatively, the lithium niobate could be coated onto the surfaces of disks, as it is with conventional disk drives. However, a solid-state memory might be designed that wouldn't require an electromechanical readout, rendering it a substitute or complement for RAM. It could be a substitute if it could be accessed and read out as fast as RAM, or a complement if it were slower than RAM. We're presently orders of magnitude away from such storage densities.
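The molecular-storage estimate can be reproduced with a few lines of arithmetic. The 100 g/mole molar mass and 3 g/cc density are the text's rough guesses, not precise figures for lithium niobate:

```python
# Back-of-the-envelope check of the molecular-storage numbers above,
# assuming one bit per molecule.  Molar mass and density are the
# rough values used in the text, not precise material constants.
AVOGADRO = 6.022e23       # molecules per mole
MOLAR_MASS_G = 100.0      # grams per mole (the text's round figure)
DENSITY_G_PER_CC = 3.0    # assumed rock-like density

molecules_per_cc = AVOGADRO * DENSITY_G_PER_CC / MOLAR_MASS_G
bits_per_cc = molecules_per_cc            # one bit per molecule
petabytes_per_cc = bits_per_cc / 8 / 1e15
print(f"{molecules_per_cc:.1e} molecules/cc; "
      f"{petabytes_per_cc:,.0f} petabytes/cc")
```

This comes out to roughly 1.8 × 10^22 molecules, hence bits, per cubic centimeter, or on the order of two million petabytes: comfortably over the million-petabyte figure above.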
Disk Storage
    I believe we'll see 1 terabyte disk drives by 2004 or 2005, but beyond that, from my keyhole perspective, the outlook for conventional disk drives becomes fuzzy. Right now, disk storage costs about $2,500 a terabyte. By 2005, it might cost $125-$250 a terabyte. By 2010, it ought to be available cheaply enough to support human-level artificial intelligence.
Virtual Reality
    Computer games are gradually maturing into virtual reality experiences. Rapidly improving graphics boards are supporting ever faster and more realistic rendering of scenes, with shadows, textures, and other realistic artifacts. Three-D renditions should greatly improve the sense of presence in these games. Some of the scenes and sounds in "Riven" are enjoyable in their own right. However, computer games are still aimed at children or at puzzle addicts. Adult entertainment or edutainment programs are few and far between. Professional-caliber walkthroughs and military training programs are coming along, and will eventually filter down to the consumer. It will probably be at least a decade before true virtual-reality, three-D, adult entertainment programs appear.
Artificial Intelligence and Robotics
    Artificial intelligence is finding its place in web search engines. Search Engines Ready to Learn and Search Engines: The Next Generation discuss progress in search engine technology. Search engine technology is extremely important! Search engines are gradually giving us "brain amplification". Another frontier in AI is the intelligent user interface. Apple Computer's "Talking Moose" was, in my opinion, an enjoyable example of a conversational computer interface. Unfortunately, it will no longer work on newer Macintoshes, and nothing seems to have been done to update it. "Microsoft Bob" was a failed attempt by Microsoft to introduce an avatar that would assist you in running your computer. But Microsoft is so awful at interacting with its customers that this may be more a function of Microsoft's ineptitude and incompetence than a failure of the technology. I have Microsoft's Office Assistant on my computer, but there doesn't seem to be any reason to use an Office Assistant. Insofar as I'm aware, it only picks out keywords in a query and then searches the Help file for those keywords. But I can just as easily enter those keywords into the search command and bypass the Office Assistant step. It would be great if it worked well, but I keep it turned off.
    I have found Microsoft's trouble-shooting routines to be very helpful.
    I believe that true artificial intelligence isn't going to occur until others understand the role of emotions in "programming" animals through urges and appetites rather than through programmed actions. They'll also need to understand that self-awareness arises through self-observation, and through various competing "emotions" or indices that lead ultimately to action. I believe that nature programs mammals by giving them drives and urges that they seek to satisfy. Nature may program insects directly, but higher animals must respond more flexibly to conditions in their environments.
    Although Dr. Hans Moravec may deliver visually navigated platforms on schedule at the end of 2002 or the beginning of 2003, upon thinking about it, I believe that this will be an evolutionary process, with rapid improvement continuing over a period of years. Robotic lawnmowers and vacuum sweepers will probably continue to improve, with the first visually-navigated devices appearing around the middle of this decade. But I expect to see rapid improvements during the latter half of this decade, with gradual penetration of the markets for these devices. (It's hard to predict how popular items will be, and how fast markets will develop, since this depends upon non-technical factors.)
    We might do quite a bit with simple navigation techniques such as electronic compasses combined with odometers that are reset by calibration wires at the edges of the areas of operations.
Computing Power: The Next 50 to 100 Years
    All of this is concerned with what will happen over the next ten to twenty years. What will it be like in 50 or 100 years?
Computer Networking, and the Human/Computer Interface
    I think that a lot more research will take place in direct man-machine mind-melding. To really do it right might require extensive nano-wiring within the brain. This might eventually be done in a microscopic, minimally invasive way before a baby's fontanels have closed. Shy of this, we'll probably see a lot of improvement in glasses and hearing aids, or even implants that access existing sensory inputs and neurological outputs. I also foresee a great deal of networking. If you think privacy is a problem today....
    The incentive for implants might come from a desire to more intimately amplify our mental powers with computer assistance, and possibly even a desire to keep up with artificial intelligence. ("If you can't lick 'em, join 'em.") On the other hand, it may be a long time before we know enough, or are prepared, to invasively embed wiring in our nervous systems.
    Speaking of networking, the most effective way to network computers is with the closest possible packing and connections. However, I expect that we'll also see fiber-optic local-area networks and wide-area networks in spite of the propagation delays that these will incur. So what will it be like in 2050? I suspect that we'll be communicating with our computers and with each other using a form of electronic telepathy. It may take the form of muscularly controlled inputs and auditorily mediated outputs. We may discover that telepathy isn't such a good idea after all, if people can reach us at any hour of the day or night. (We're already approaching that state with cellphones.) Also, there will have to be safeguards against torturing us or messing with our minds. Neural implants tying into sensory centers might bring Holodeck-level virtual-reality simulations to life. And some of the potential benefits and liabilities of such simulations have been addressed by science fiction authors... e.g., addiction to the pleasure palaces of dreamworlds. It could even include direct connections to the pleasure centers of the brain. (Heaven forfend!)
   I can see it now. Remote controls would be replaced by direct mind interfaces to locally networked devices. It would look like magic, as TV channels changed and car doors unlocked with no visible intervention.
    In the meantime, there will probably be a lot of networking and load-leveling among computers, with, for example, word processing offloaded to computers with several different word-processing programs installed, graphics design carried out on other computers, and music composition software installed on still other computers.