Weekly Editorials Page
6/7 to 6/13, 2001

6/13/2001:
Intel Discusses Its Plans for Its 1,000,000,000-Transistor, 20-GHz PC Chips in 2007
    Tonight's lead articles, Intel develops 0.07-micron transistor for 10-GHz processors by 2005, and, Intel develops 20-nm transistor for 20-GHz processors by 2007, may perhaps be Intel's response to IBM's announcement that IBM, Sony, and Toshiba plan to introduce a teraflops personal-computer chip by 2004 (Sony, IBM, Toshiba chips to bring supercomputers home), and to IBM's announcement yesterday, IBM touts breakthrough in faster chip speed, of a 35% increase in computer speed-power products through the use of "strained silicon". It seems as though IBM has been pulling ahead in the War of the Words lately. Now Intel has responded with a peek at its own product-introduction agenda, planning to deliver computers 10 times faster than the upcoming 2-GHz Pentium 4, with 1,000,000,000 transistors on the chip instead of the Pentium 4's 42,000,000.
What Are the Practical Consequences of This?
    Some of the practical consequences have to do with much better speech recognition, virtual reality, telepresence, and computer intelligence. If IBM's, Sony's, and Toshiba's teraflops computer really arrives on our desktops by, say, 2004, then by 2010, it might be showing up in some pretty smart household and industrial robots.
Reason for Multiple Articles
    I'm including several articles concerning both IBM's and Intel's announcements because I think these announcements are important, and different articles give different slants on the subject.
Implications of, and Questions About, Intel's Announcement
    Intel is basically announcing what one would logically expect by 2007, but its announcement acknowledges that such technological goals are feasible. Intel is saying that these goals can be met using more-or-less-conventional silicon designs and manufacturing techniques. To me, Intel's announcement is significant for two reasons.
    First, Intel is planning 0.065-micron (65-nanometer) circuit features by 2005, rather than the 70-nanometer features that have previously been described for that time frame, and 45-nanometer design features in 2007 instead of 50-nanometer features.
    Second, Intel is quoted as planning to utilize 30-nanometer transistors in its 65-nanometer circuitry in 2005, and 20-nanometer transistors in its 45-nanometer circuitry in 2007.
    How does one implement 30-nanometer-wide transistors in circuitry whose smallest features are 65 nanometers wide, and 20-nanometer-wide transistors in circuits with 45-nanometer minimum elements? A shrinkage from today's 180-nanometer circuitry to 65-nanometer circuits should permit, perhaps, a 7- or 8-folding of processor speeds, and a shrinkage to 45-nanometer circuits should allow a 16-folding of speeds, if my notions about speeds versus sizes are correct. I'm wondering if something has been lost in the translation, with 20- and 30-nanometer transistors actually slated for later than 2007. The logical next steps beyond 45-nanometer circuits might be to 30-nanometer features in 2009, and then to 20-nanometer circuits in 2011.
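    For what it's worth, here is the arithmetic behind my "7- or 8-folding" and "16-folding" guesses. It assumes that clock speed scales roughly as the inverse square of the minimum feature size; that scaling law is my own rule of thumb, not anything Intel has claimed.

```latex
% My rule-of-thumb assumption: clock speed f scales as 1/L^2,
% where L is the minimum feature size.
\[
\frac{f_{65}}{f_{180}} = \left(\frac{180\ \mathrm{nm}}{65\ \mathrm{nm}}\right)^{2} \approx 7.7,
\qquad
\frac{f_{45}}{f_{180}} = \left(\frac{180\ \mathrm{nm}}{45\ \mathrm{nm}}\right)^{2} = 16.
\]
```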
Speed-Power Product, and Speculations about 2007, 2010
    A couple of other points sound interesting: Intel feels it can multiply the number of transistors on its chips by 24 while simultaneously multiplying their clock speed by a factor of 10... increasing the speed-power product by a factor of 240... and all this without increasing the total power output. Also, IBM's teraflops computer is predicated upon 100-nanometer circuit features in 2004-2005. Circuit features of 45-50 nanometers in 2007 might afford 4- to 5-teraflops performances on the basis of clock speed alone, and possibly 10 to 15 teraflops, given additional transistors on the chip supporting a higher degree of parallelism. (Of course, this is highly fanciful on my part.)
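    Checking those multipliers against the figures quoted above (the arithmetic is mine, and the inverse-square speed scaling is the same rule of thumb as before):

```latex
% Transistor count, clock speed, and speed-power product:
\[
42{,}000{,}000 \times 24 \approx 1.0\times10^{9}, \qquad
2\ \mathrm{GHz} \times 10 = 20\ \mathrm{GHz}, \qquad
24 \times 10 = 240.
\]
% Scaling IBM's 1-teraflops, 100-nanometer design down to 45-50-nanometer
% features, on the basis of clock speed alone:
\[
1\ \mathrm{TFLOPS} \times \left(\frac{100}{50}\right)^{2} = 4\ \mathrm{TFLOPS},
\qquad
1\ \mathrm{TFLOPS} \times \left(\frac{100}{45}\right)^{2} \approx 4.9\ \mathrm{TFLOPS}.
\]
```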
    In the realm of flights of fancy, I might predict 50 GHz computer clocks by 2010, 250 GHz clocks by 2015, and terahertz+ clocks by 2020. But time will tell.
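    (For the record, those guesses amount to a fairly steady compound growth rate, which I've worked out below; they're actually a bit more conservative than the 10-fold-in-6-years pace, about 47% a year, that Intel's 2001-to-2007 roadmap implies.)

```latex
% Annual growth rates implied by my clock-speed guesses (my arithmetic):
\[
\left(\frac{50}{20}\right)^{1/3} \approx 1.36, \qquad
\left(\frac{250}{50}\right)^{1/5} \approx 1.38, \qquad
\left(\frac{1000}{250}\right)^{1/5} \approx 1.32
\]
% ...i.e., roughly 35% per year, a doubling about every 2.3 years.
```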
Computer-to-Computer Software Translator
    The article, Software switches code on-the-fly, describes a translator that will convert instructions for one computer to instructions for another during execution. This would permit software written for, e.g., an Intel computer to run on an IBM or Motorola G4. (Of course, something that would rapidly convert software before it began to run would seem to be a very valuable tool.)
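    To give a feel for how on-the-fly translation might work, here is a toy sketch in Python. It is purely my own illustration, not the product the article describes, and the instruction sets, opcode names, and functions in it are all made up. The idea: translate each block of foreign instructions the first time it is executed, cache the translation, and reuse the cached copy on later passes.

```python
# A toy dynamic binary translator -- purely illustrative.
# Both instruction sets here are hypothetical.

# One-to-one mapping from "foreign" opcodes to "host" opcodes.
OPCODE_MAP = {
    "LOADA": "ld.a",   # load a value into the accumulator
    "ADDI":  "add.i",  # add an immediate value
    "STORA": "st.a",   # store the accumulator to memory
}

translation_cache = {}  # block address -> already-translated host code


def translate_block(source_code, addr):
    """Translate one block of foreign instructions, caching the result."""
    if addr not in translation_cache:
        translation_cache[addr] = [(OPCODE_MAP[op], arg)
                                   for op, arg in source_code[addr]]
    return translation_cache[addr]


def run(source_code, entry):
    """Execute a foreign program on a (very simplified) host machine."""
    accumulator = 0
    memory = {}
    for op, arg in translate_block(source_code, entry):
        if op == "ld.a":
            accumulator = arg
        elif op == "add.i":
            accumulator += arg
        elif op == "st.a":
            memory[arg] = accumulator
    return memory


# A tiny foreign program: load 2, add 3, store the result at address 100.
program = {0: [("LOADA", 2), ("ADDI", 3), ("STORA", 100)]}
print(run(program, 0))  # prints {100: 5}
```

    A real translator faces far harder problems, of course: branches, self-modifying code, and keeping two processors' memory models in agreement. But the cache is the key trick; once a block has been translated, re-running it costs little more than running native code.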
Who Needs Faster Computers?
    The article, Who needs more than a gigahertz?, is a classic "who-needs-it" kind of discourse. Similar articles have dogged the footsteps of computer development since the computer's inception in the 1940's. Thomas Watson, the son of the founder of IBM, is reputed to have said in the 1950's that there wouldn't be a market for more than a dozen 704 computers in the whole country. Paul Furmeister, the Director of the Computing Center at NASA's Langley Research Center, told me in 1967, as he was driving me to the airport, that in the 50's no one could foresee a market for more than a dozen 1,024-bit (128-byte) magnetic RAM memories in the country. And there were similar arguments in the early 90's about the frivolity of CD-ROM's and "multimedia".
Spaceships Made Of Concrete?
    For more than a decade now, UAH (the University of Alabama in Huntsville) has been competing in a contest to build a concrete canoe. The latest pictures in the Huntsville Times show this year's UAH entry looking like an ordinary "Cyclolac" canoe. Apparently, UAH has found ways to render concrete flexible and very strong.
Baldness pill 'passing early tests'
    Listen up, guys! Sounds as though this is for us! (Wonder if it beats minoxidil?)
6/9/2001:
    As a good-natured, laid-back fellow, I'm a teensy bit outraged over tonight's article, Do we really need glamorous geeks?
    Most, if not virtually all, of the hyperbright... the Carol Burnetts, the Einsteins, the Bill Gateses... are consigned to geekdom by other kids during their school years. Children who are reading and conversing at an adult level when they are 6 are cygnets who don't mesh well with the other ducklings.
    I grew up near the end of the Great Transformation, when people transitioned from living much as they had in Roman times, with yesterday's coal-oil lamps, horse-drawn plows/buggies, unpaved roads, outdoor plumbing, wood or coal stoves, and unelectrified lifestyles, to today's fluorescent lights, automobiles, interstates, multiple indoor baths, thermostatically controlled heating and air conditioning, and such electrical servants and appliances as the telephone, the TV, the VCR, the microwave, power tools, and the computer/Internet. Many of us who lived through that past are grateful for the present and eager for the future, and we realize that science and technology made it possible. Even so, when I was growing up, movie stars, recording artists, best-selling authors, etc., were everyone's dream for the future. They were empowered by the new technologies, and were siphoning off some of the best and the brightest for enjoyable but unproductive occupations.
    That trend has gotten more pronounced over the past 50 years. How do I know? All I have to do is look at the names of corporate and research leaders in today's major companies. Citizens of third-world countries have the drive and the goals that we possessed 100 years ago. In the meantime, it has become overpoweringly fashionable to try to be a rock star, a TV anchor, or a movie actress... so much so that high-tech jobs are going begging. So what we're seeing is supply and demand. And one way to boost supply is to inspire people at the juvenile level, where life decisions are made. (I also think it might help to teach our nerds how to avoid nerddom, perhaps by teaching them about underarm deodorant, social skills, and how to pick clothes that flatter them. I was certainly innocent of such carnal knowledge when I was in high school.) Part of nerddom lies in a set of values that prizes intense concentration upon difficult problems over "looking cool". Generally, the kids who are geeks in high school grow up, like Terman's "Termites", to become highly accomplished and polished adults. After all, they're the ones who focus on developing their capabilities and educating themselves rather than on looking cool. Many of them become multimillionaires. I've read that after a person has married two or three cool alcoholics/deadbeats/philanderers/spendthrifts, the polished ex-geeks who have grown up and become swans can look pretty good.
    Marilyn vos Savant has observed that as a society, we've become obsessed with entertainment. I couldn't agree more. Even though I write poetry and short stories, and enjoy good music, good books, and good movies as much as the next fellow, by and large, I'd rather see our best and brightest tackling our most urgent problems, such as cures for cancer or a theory of quantum gravity, than writing yet another TV script or rock tune.
    Getting back to geeks: by deriding our productive minority, we've reduced their ranks to the point where we're having either to import scientists, programmers, and engineers, or to ship our mindwork off to foreign shores. In the meantime, the global village is becoming a highly competitive reality, with work exported around the globe to the lowest bidder.
    In the light of these circumstances, I think it's urgently important that you get the best educational credentials you can, because unskilled labor, and potentially skilled labor, is dirt-cheap in China or Thailand. I think we're on thin ice with our disdain for our potential producers and our admiration for our entertainers. It's a competitive world. I think this is a watershed era in which the third world is catching up with the first world. (I've been amazed and delighted that the globalization we've experienced so far hasn't reduced our standard of living.)
    I'm happy to say that this article was written by a British, rather than an American, woman. Talk about biting the hand that feeds you! I think the attitude it espouses ranks right up there with sneering at global warming and opposing genetic research because it's unnatural. Ms. Jane Wakefield says,
    "Interest in tech jobs is at an all time low--and the tech firms of Silicon Valley are losing billions a year due to lack of talent (poor dears). So, according to U.S. experts, it is time to glamorize geekiness. High tech needs a Hollywood makeover in order to rekindle interest among kids, the experts say. They have found that school children are, based on their viewing habits, far more likely to want to be Ally McBeal or George Clooney than a Linux programmer. Shocking news, especially given the fact that to truly emulate Ally McBeal children would have to forgo their diet of hamburgers in favor of a weekly carrot and the odd leaf of lettuce. Such perverse behavior confirms that American youth are all mad TV junkies who have completely lost the plot and forgotten that the best thing in life they could possibly be is a middle manager for a software company with a nice house in Silicon Valley and a personal organizer.
    "In fact, coming from the arts side of the fence at school and university--before I crossed to the dark side of technology--I confess that I used to jeer at the math geek for his (it was always a he) perfectly wall-paper covered exercise books, his unsullied pencil case with perfectly pointed pencils and his inability to make friends with anyone other than his teacher. While the researchers have the right idea--that TV is sadly the way to win the hearts and minds of teenagers--a greater sea-change needs to happen before people will view geekdom as the height of chic. I myself have in the past proliferated the cliche that geeks like Iron Maiden and have BO, but zero social skills."

    Hopefully, the U. S. will treat its potential researchers more sympathetically than this. For us, I would see it as a prescription for disaster. But if Britain wants to treat its technophiles as pariahs, that's certainly Britain's prerogative. Maybe Britain will want to export its geeks to the U. S. and to Canada.

"Send us your geeks, your nerds, your brilliant poor,
   Your wretched technics yearning for esprit....

 I lift my lamp beside the golden door."

6/8/2001:
    The Kearneys have sent a most excellent article that resolves an earlier paradox in the news about Alzheimer's disease. There was a British report last July, Alzheimer's vaccine 'safe to use', about several patients who had been recruited for early Phase I trials of a promising new Alzheimer's vaccine. Such vaccines have prevented Alzheimer's plaques from developing in the brains of mice. Then in December, there was an announcement, Researchers develop vaccine for Alzheimer's, about the revolutionary discovery of an Alzheimer's vaccine in Toronto. But the Toronto vaccine was in an earlier stage of development than the British vaccine, having only been tried with mice; it would be a year before human trials could begin. But what had happened to the British Phase I trials? How did the efforts relate? The above article seems to tell the tale. The British have been inundated with requests to admit Grandma and Uncle Bartholomew to their experimental program. Consequently, they're keeping a low profile. Let the Toronto researchers enjoy the limelight, and fend off Granny and Uncle Bart. Pretty shrewd! Now the Brits are in Phase II trials with 80 patients at 4 secret British hospitals. Sh-h-h! If you find out which hospitals they are, don't tell anybody. It's a secret.
    The article mentions that, unfortunately, it will be at least five years before such vaccines are approved for clinical use.
   Thanks, Kevin, Cassidy, Michael, and Maeghan.