What Is the Flynn Effect?

 

The Flynn Effect refers to the observation that:
(1) beginning with, or shortly after, the onset of the Industrial Revolution in the 19th century, IQ scores started to rise in every industrialized nation in which they have been recorded, from the time they were first measured in the early 1900's; and
(2) these gains in IQ are very strange in that there has been little improvement in vocabulary, general information, and arithmetic, or in SAT and GRE scores, while pattern recognition and spatial visualization (fluid intelligence)--the very capabilities held to be the essence of intelligence--have improved by something like 5 points per decade. On IQ tests such as the Stanford Binet and the Wechsler tests, which contain a mixture of questions drawing upon both crystallized and fluid intelligence, the gain has been about half that on the fluid-intelligence tests (such as the Raven Progressive Matrices and other culture-reduced tests).
    Some confusion arises when we try to extend these results over the entire 20th century. If the average fluid-g IQ rose by 5 points (5%) per decade, then, compounding over ten decades, the total rise in average fluid-g IQ would have been about 63 points (1.05^10 ≈ 1.63). We have reason to think that the actual gain in these scores is less than this. In 1942, shortly after norming the Raven Progressive Matrices, Dr. Raven administered his test to cohorts ranging from 65-year-old Britishers born in 1877 to 30-year-old Britishers born in 1912. In 1992, Dr. Raven's son updated this study, adding data for individuals born between 1877 and 1967. Over those 90 years of birth cohorts, IQs derived from the Raven test rose 47 points, with the average raw score increasing from 23 out of 60 for those born in 1877 to 55 out of 60 for those born in 1967. (These latter scores could have suffered from a ceiling effect, so the actual improvement might have been greater.) Extrapolating another 8 years to bring the study up to the year 2000, we might estimate a total integrated Raven-IQ increase of approximately 50 points over the 20th century.
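    A few lines of Python make the compounding explicit. This is just arithmetic on the figures quoted above (treating per-decade gains multiplicatively, since IQs are ratios); nothing here comes from outside the text.

    # Compounded Flynn-Effect projections (plain arithmetic, no external data).

    def compounded_gain(points_per_decade, decades):
        """Total IQ gain if each decade's gain compounds multiplicatively."""
        factor = (1 + points_per_decade / 100) ** decades
        return (factor - 1) * 100

    # Fluid-g (Raven-type) gains of 5 points per decade over ten decades:
    print(round(compounded_gain(5, 10)))      # 63 -- not 5 * 10 = 50

    # Observed Raven gain of 47 points over 90 years -> implied per-decade rate:
    rate = ((1 + 47 / 100) ** (1 / 9) - 1) * 100
    print(round(rate, 1))                     # ~4.4 points per decade

    # That observed rate, compounded over the full century:
    print(round(compounded_gain(rate, 10)))   # ~53 points, close to the ~50 above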
  
The Flynn Effect presents us with profound dilemmas no matter how we interpret it. Here are three representative cases.

(i.) The Flynn Effect is 100% Real 
   
 This has astounding implications. It would argue that human intelligence has doubled in just a hundred years. Homo sapiens would now be homo superior. Furthermore, the Raven-based IQ of the average Britisher in 1900 would have been 50, falling on the borderline between moronic and imbecilic. The great minds of the 19th and early 20th century--Einstein, Poincare, Hilbert, Maxwell, George Bernard Shaw, A. E. Housman, Sir Francis Galton, John Stuart Mill, H. G. Wells, Edna St. Vincent Millay, and Pearl S. Buck--would have year-2000 IQs below 100. In other words, any Walmart or supermarket clerk today could outperform the leading minds of the 19th and early 20th century.
    And this brings us to the other horn of this dilemma. Although Raven-derived IQs have doubled since 1900, vocabulary, general information, and arithmetic skills have risen only 2 or 3 points since 1950, and presumably, since 1900. In other words, these people would have blown the doors off the SAT and the GRE while remaining imbeciles in terms of true intelligence! How could this happen? The argument given is that hours of core-curriculum school instruction have dropped from 1,000 hours in 1950 to 300 hours today, and that children have switched from reading books to watching TV (although this would have to be in spite of the countervailing influences of programs like "Sesame Street" and edutainment computer programs). Consequently, today's population is severely undereducated in a rote-learning sense compared with the population in 1900 (although the average educational level has risen from, perhaps, 8 years of school to 14). At the same time, children today are taught how to think and how to infer, and this might be playing a role in their higher scores on tests of true intelligence. Today's children are the beneficiaries of Sesame Street, computerized books, and a lot more parental attention than the overworked parents of 7 children could provide in 1900.
   Since the Raven-derived IQ of today's average 20-year-old stands at 200 by the standards of 1900, it follows that our mentally challenged, down to an IQ of 50, can be taught to function as responsible, capable citizens, and that those with IQs of 60 or above can probably become high school teachers or even physics professors. All that is needed is an educational system comparable to that of 1900. (It might be argued that today's world is more complex than that of 1900, but to someone who was born in 1929, that's patently incorrect.) This means that our Down's-syndrome patients will no longer have to be sequestered but can become full-fledged members of society. Anyone with an IQ above 90 should make a perfect or almost-perfect score on the SAT and the GRE. It also means that the average child should be ready for college by 9 or 10, and the somewhat-above-average child should be ready by 8. College should take no more than 2 years, greatly reducing its cost to the individual and to society. Alternatively, everyone should have at least one Ph.D. by 13 or 14. And today's young adult with an IQ of 125 is as smart as the smartest human on earth in 1900.
    Presumably, today's average child learns to read at 3, and is adept with arithmetic before he/she starts school, in keeping with this model of mental acumen. Has that happened?
   These are the revolutionary consequences of the idea that the average IQ today is twice as high as it was in 1900.
    There are a couple of sticking points with this scenario. We're postulating that the society of the late 1800's was able to train individuals with ratio IQs of 50, or deviation IQs on the order of 60, to nearly the level of today's average citizen in the areas of arithmetic, vocabulary, and general information. But we're unable to do that with our "mentally challenged" with IQs of 60. Also, the 19th century's brightest individuals, with ratio IQs at or below 200, would have been no brighter than our average individuals today. And a mental giant like William Sidis would seem to have been a remarkable intruder in such an era. (However, it might be worth noting that William Sidis was considered to be sui generis. Also, his father's coaching might have produced childhood results that would have faded somewhat in adulthood.)
    The other sticking point is that a large vocabulary, a wide-ranging fund of general knowledge, and astute arithmetic reasoning, which for most of us are the most obvious insignia of intelligence, represent what has been dubbed crystallized intelligence, and crystallized intelligence is usually a consequence of a high level of fluid intelligence. Smart little kids with large vocabularies who seem to know it all are the classic representation of the precocious child. We don't need IQ-test results to decide that such a child is very smart. And now we're saying that this hasn't advanced at all over the past 100 years. Of course, we could argue that the culture itself hasn't upgraded to take advantage of the putative doubling of intelligence over the past century. With the advent of television and a decline in core curriculum hours, it might be supposed that we've reached the same level of education with much less teaching time, rather than a higher level of education in the same teaching time. Maybe our super-smart need to exert less effort to reach levels of erudition that are sufficient unto their needs. However, one fact is clear: the hyperbright don't make any greater conscious effort to learn their bounty than average folk require to learn their bare sufficiency.
    One solution to this would be to suppose that learning and memory are distinct from problem-solving and insight. In that case, we could assume that problem-solving and insight have improved mightily, with no concomitant enhancement of learning aptitude. The only problem with this hypothesis is that factor analysis reveals only one g factor for both manifestations of intelligence.
    Another possible way to get a handle on this subject would be to compare the precocities of children in the early years of the 20th century with those of children today. I have the impression that there are many more prodigiously gifted children today than would have been found in 1900. Leta Hollingworth's "Children Above 180 IQ" gives us one baseline for such a comparison.


(ii.) The Flynn Effect is 50% Real, 50% Artifact
    Our best all-around IQ tests, such as the Stanford Binet and the Wechsler, show a 3-point-per-decade rate of rise--conservatively, a 25-point increase since 1900. According to this intermediate scenario, the Flynn Effect partially represents an increase in our underlying intelligence, effectively raising it by 33% over the past century. Today, the average citizen in the industrialized nations has a 1900-equivalent mental age of about 21.33 years, a ratio IQ of 133 on the 1900 scale, and a deviation IQ of about 130 on that scale.
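    The arithmetic behind those figures is easy to get wrong, because IQ points want to be multiplied rather than added. Here is a minimal Python sketch of it; the 16-year adult ceiling is the ratio-IQ convention discussed later in this piece.

    # Scenario (ii): a 25-point Flynn gain, viewed from 1900's side.
    # If 1900's average adult scores 75 on today's norms, then today's
    # average adult scores 100 / 0.75 = 133 on 1900's norms.

    ADULT_CEILING = 16.0   # chronological-age cap for adult ratio IQs

    avg_1900_on_today_norms = 75.0
    today_on_1900_norms = 100.0 / (avg_1900_on_today_norms / 100.0)
    mental_age_on_1900_scale = ADULT_CEILING * today_on_1900_norms / 100.0

    print(round(today_on_1900_norms, 1))       # 133.3 -> the "ratio IQ of 133"
    print(round(mental_age_on_1900_scale, 2))  # 21.33 -> the quoted mental age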
     This sounds more plausible, since it's a compromise. Now we're asking the society of 1900 to train someone with a ratio IQ of 75 up to a verbal, arithmetic, and informational level of IQ 88-90. And their brightest individuals, with ratio IQs above 200, would fall only at about our (deviation) IQ-143 level.
    The only problem with this scenario is that, without some guiding rationale, it seems rather arbitrary to accept results that draw upon mixed crystallized-and-fluid-intelligence testing. For individual test scores this might be fine, but for research purposes we want to be able to separate crystallized and fluid intelligence, especially since crystallized intelligence seems to have frozen at something like the 1900 level.

     I've finally finished reviewing the book "The Rising Curve", which discusses some of the paradoxes of the Flynn Effect. The book review should be available sometime within the next 24 hours, along with "News of the Ultranet". Since IQ scores were first recorded, they've continued to march upward at the same inexorable rate of 3 points per decade, which adds up to at least 25 points of IQ by now. In other words, today's average IQ of 100 would have corresponded to an IQ of 133 in 1900, and would have been surpassed by less than 2% of the population in 1900! Curiously, there's been virtually no concurrent improvement in vocabularies, arithmetic skills, and general information--learned skills--with practically all the gains occurring in fluid "g", which has been shown through factor analysis to be the essence of general intelligence (the potential to learn and to solve problems). This means that our forebears 100 years ago could talk and write about as well as we can, knew about as much as we know today, and could handle their finances about as well as we can now, but they presumably wouldn't have had the "aha!" insights that characterize today's citizenry. But this raises profound questions. Today's 4- or 4½-year-olds--or perhaps our average 3-year-olds, if it's the potential to learn and solve problems that has been enhanced over the past century--should be as ready for school as the average 6-year-old was in 1900. Children should develop intellectually at much earlier ages than they did in 1900.


(iii.) The Flynn Effect is 0% Real, 100% Artifact 
    This is more or less Dr. Flynn's position. Dr. Flynn feels that there must be some real-world benefits attributable to the enhanced problem-solving skills that underwrite these rising IQ scores, but he doesn't believe that underlying intelligence has increased significantly. One of the ideas is that tests like the Raven are actually highly culture-dependent. Of course, lesser gains have been registered on other tests such as the Wechsler and the Stanford Binet, but these could also have profited from our ever-more-stimulating environment. People in 1900 were just as bright as we are today, although they wouldn't have scored as well on our IQ tests. 
    One of the problems with this explanation is that we don't know how to train people to permanently raise their scores on arbitrary IQ tests by 25 points. The Abecedarian and Milwaukee Projects (and others) represented all-out attempts to boost children's IQs, beginning shortly after birth. Long term, it's doubtful that the permanent gains exceeded 5 points of IQ in adulthood. On the other hand, if IQ tests mean anything at all, the Flynn Effect has boosted people's IQs by 25 points or more. If so, this would suggest a startlingly large environmental influence upon IQ. Note that this doesn't preclude biological mechanisms such as improved nutrition, improved perinatal care, etc.

 


    To recap the first scenario: the Flynn Effect represents an increase in our underlying intelligence (fluid g), effectively doubling it over the past 100 years. Today (according to that scenario), the average citizen in the industrialized nations has a 1900-equivalent mental age of 32 and a ratio IQ of 200 compared to the average citizen in 1900. This intellectual gain hasn't been fully realized in the culturally-mediated areas of vocabulary, arithmetic, and general information because we've developed inferential capabilities rather than rote-mechanical knowledge. Also, reading and arithmetic have sagged a bit because of TV. However, we've managed to learn in 300 hours a year about as much as we learned in 1,000 hours a year of drill in 1900.

    There are dilemmas with all three of these scenarios. As mentioned above, efforts to train people to score higher on IQ tests have never produced gains of more than a few points. Yet we're seeing gains of 30 points over a 100-year period--or roughly 43% from the perspective of someone back in 1900 (100/70 ≈ 1.43). Are these gains environmentally inspired... something that most psychometrists might well have considered impossible? Is environment far more important in the nurturing of intelligence than we've been led to suppose (assuming that the Flynn Effect is truly boosting intelligence)?
    If we assume that the Flynn Effect isn't truly emblematic of rising intelligence, then we have to explain how people are able to consistently score better on IQ tests when we haven't been able to find any way to train them to do this. How can they score better in the testing room without being able to "score" better in the world outside the testing room? 
    If we assume that the Flynn Effect is truly indicative of rising intelligence, then we have to explain how the citizens of the nineteenth century and before were able to perform as well as they did with present-day IQs of no more than 50 or 75. Even a middle-ground assumption that the world of 1900 had an average IQ of 75 doesn't make it easy to fathom how they could have done what we know they did. On this scenario, their brightest individuals--Maxwell, Dirac, Einstein, Heisenberg, Sir James Jeans, Sir Arthur Eddington--couldn't have had deviation IQs above a present-day level of, perhaps, 150.
    The obvious possible causes--better nutrition, better education, test-taking sophistication, changes in lifestyle, environmental complexity--have been investigated in depth, and have fallen short of explaining the gains by factors of several. And even if they had succeeded, we would still be left with the question of how our grandparents, great-grandparents, and great-great-grandparents, born around 1870, could have been as capable as they were, and could have coped as well as they did.
    There's some possibility that we're producing a lot more fantastically smart children than we were in 1900. I don't remember hearing or reading about anyone in the literature, other than William Sidis and the Yale Prodigy, who began speaking in sentences by the age of 6 months and who started to read at the age of one. Eric Temple Bell, in his discussion of Gauss, states that "In all the history of mathematics, there is nothing approaching the precocity of Gauss as a child. Although it seems incredible, he showed his caliber [by correcting his father's arithmetic error] before he was three years old. Before this, the boy had teased the pronunciations of the letters out of his parents and their friends and had taught himself to read." Dr. Bell's astonishment over what today would be regarded as impressive but not record-breaking attests to our standards of precocity back in the 30's.
    Another interesting question is whether the agents that are inducing the Flynn Effect act upon those of us who are intellectually active throughout our lives, or only in childhood. I find the IQ tests that I took 55 years ago as a teenager quite simple today, but they weren't so simple for me at 16. This applies not only to their verbal and arithmetic problems, but most especially to the pattern-recognition and number-series problems that should be most affected by the Flynn Effect. If this is true for everyone, then it would mean that the Flynn Effect is an ongoing part of our "Information Age" ambiance. As a society, we're certainly engaged in a lot more white-collar activities today than we were in 1900. But it raises the question: neglecting age-related changes, would I be 17 points of IQ smarter than I was in 1945? (In a 1940's study of identical twins separated shortly after birth and reared apart, the greatest disparity in IQs among the 221 pairs of twins was 22 points, between a farm wife and a teacher in New York City.) The discovery that the brain might be like a muscle ("use it or lose it"), and that new neurons can be formed and will migrate to active regions of the brain, must surely be considered a revolutionary accession to our knowledge of neurophysiology.
    James Flynn's question is whether underlying intelligence has also risen by anything like the factor of 1.0/0.75 = 1.33 that the Flynn Effect suggests.
    It's exciting to think that IQ scores have risen as much as they have. If the Flynn Effect really represents even a partial increase in average intelligence, it would seem to me to bode well for the future of humanity.
    IQ gains of this caliber are stunning in the light of all the painstaking research showing how little effect environmental factors seem to have on IQ scores. This would be genuine cognitive savvy, expressing itself in everything from jigsaw puzzles to computer programming.

    What do you think about this? 

The Flynn Effect is Also Important Because It May Elevate the Role of Environment in Mediating Intelligence
    The Flynn Effect is important not only because it points toward rising intelligence--a monumental discovery in its own right--but also because of what it says about the roles of environment and heredity in determining (a) IQ, and (b) intelligence. No means of deliberately boosting IQs significantly has ever been found, and yet the Flynn Effect is inarguably doing just that, whether or not it's also boosting intelligence. One interesting observation: in Africa, where the average culture-reduced IQ is reported to be 70, people don't seem to have anything like the trouble coping with life that one would expect at that IQ level in industrialized countries, nor do those with IQs of 50 or 60 have to be institutionalized the way they would elsewhere. Similar things have been observed in practical real-life situations in which people who supposedly lack the required IQ nevertheless operate beyond their expected intellectual level.

    Given answers to these and kindred questions, it may be possible, using available evidence, to make a horseback estimate of whether, and by how much, intelligence has risen since 1900.

(1) How Were Adult Mental Ages Above 16 Normed on the 1916 Stanford Binet IQ Test?
    The first major U. S. IQ test was Dr. Terman's 2nd revision of the Binet-Simon test, the 1916 Stanford Binet IQ test. As mentioned below,
   "You can define a mental age of 16 by simply defining it to be the average mental age of people 16 and over*. However, when you get beyond a mental age of 16, some other strategy has to be employed. If you're willing to wait a few years, until the children whose IQs you've measured grow up, then by selecting a sample with childhood ratio IQs between 145 and 155, you could use their average test results in early adulthood to define an adult ratio IQ of 150 or an equivalent mental age of 24. However, if it's 1916, and you're in a hurry to standardize your new IQ test, you might instead use percentile rankings for adults, assigning a mental age of 24 or above to the top 0.1% of your test-takers. In that case, you would be mixing apples and oranges, using ratio IQs for children with mental ages of 16 or below, and deviation IQs for adults with mental ages above 16. As mental ages go, it doesn't make a huge amount of difference at these IQ levels. If Dr. Terman standardized mental ages above 16 in this hasty manner for his 1916 Stanford Binet IQ test, someone assigned a mental age of 24 on a percentile basis would actually have had a mental age of 25.6 years, or a ratio IQ of 160, if compared with grown children who had taken the Stanford Binet when they no older than 10.0 years of age."

Question #1: How were adult mental ages (mental ages greater than 16) derived on the Stanford Binet Test?

(2) Given the Definition of Average Mental Age, How Could the Average Mental Age on the Army Alpha Test Have Been 13?
    The year after developing the Stanford Binet test, Dr. Terman and three others were asked to create IQ tests for the U. S. Army: the Army Alpha and Army Beta tests. The Army Alpha was a printed test, while the Beta was nonverbal, designed for the foreign-born and illiterate. These tests were administered to 1,750,000 young recruits during World War I. In 1921, one of the tests' co-designers, Dr. Robert Yerkes, estimated that the average mental age of recruits was about 13. (These tests would probably have to have been graded on a percentile basis, thereby yielding equivalent deviation IQs.)
Question #2: How could he estimate such a mental age? Presumably, the 1916 Stanford Binet test was standardized to give an adult mental age of 16, by definition, for the general population. What could Yerkes have used as a standard of comparison to arrive at an average mental age of 13? (Note that if 13 were the average mental age for the general population, 3/4ths of the Flynn Effect could be explained by a rise in mental age to 16. Somehow, I don't think that's what's meant.)
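    For concreteness, here is the arithmetic I take to be behind that "3/4ths" remark, as a small Python sketch; the 16-year adult ceiling and the 24-point Flynn gain are the figures used elsewhere in this piece.

    # If the average recruit's mental age was 13 against an adult ceiling
    # of 16, the implied average ratio IQ, and the share of a 24-point
    # Flynn gain that a recovery to mental age 16 would account for:

    mental_age, ceiling, flynn_gain = 13.0, 16.0, 24.0
    implied_iq = 100.0 * mental_age / ceiling
    print(round(implied_iq, 1))                         # 81.2
    print(round((100.0 - implied_iq) / flynn_gain, 2))  # 0.78 -- roughly 3/4ths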

(3) Why Haven't Scores Risen on the SAT as They Have on Other IQ Tests?
    The Scholastic Aptitude Test (SAT) was born in 1926 and was administered to 8,000 high school students that year. The SAT was basically the Army Alpha Test reworked.
Question #3: Why have scores on the SAT failed to rise over the decades? Or have they failed to rise? Has there been any upgrading of SAT questions and norms since 1926 (other than the "dumbing down" of the SAT in 1995)? It may be argued that schooling is less rigorous today than it was in 1920, but would that make much of a difference in SAT scores? If the SAT is patterned after the Army Alpha Test, and scores on the Army Alpha Test have climbed over the decades, why hasn't the same thing happened with the SAT? One hint may reside in the fact that scores on the Army Alpha Test were highly correlated with years of education, and before 1917, years of education may have depended more upon opportunity than upon intelligence. The SAT is a test of educational achievement, even though it may be highly correlated with IQ.
    A counterargument might be that in recent decades there has been a lot of coaching and "teaching to the test" to help students raise their SAT scores.
   There have also been many more students taking the SAT in recent years, dipping farther down into the talent pool, and this would tend to lower average SAT scores; but it's also the case that the number of very high scores has declined. It's hard to reconcile that with 24-point increases in IQ.
(4) Flynn-Effect gains are quoted as 3 points of IQ (3%) per decade, or 24 points of IQ in 80 years. But 3% per decade, compounded over eight decades, would lead to a 27% (27-point) gain in IQ over 80 years. Furthermore, if the change were 24 points of IQ looking back at 1920, then from the standpoint of 1920 looking forward, the gain would be about 32 points measured on a 1920 IQ scale (100/0.76 ≈ 1.32). IQs are ratios and can't simply be added and subtracted.
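    A few lines of Python make point (4) concrete; everything here is just the percentages quoted above.

    # Per-decade IQ gains compound, because IQs are ratios.
    rate_per_decade = 0.03            # 3 points (3%) per decade
    decades = 8                       # 1920 -> 2000

    factor = (1 + rate_per_decade) ** decades
    print(round((factor - 1) * 100))  # 27 -> a 27-point gain, not 3 * 8 = 24

    # Looking backward: 1920's average expressed on today's norms.
    print(round(100 / factor))        # 79

    # Looking forward with the commonly quoted 24-point backward shift
    # (1920 average = 76 on today's norms):
    print(round(100 / 0.76))          # 132, i.e., ~32 points on the 1920 scale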
    Another way to measure this would be to give the 1916 Stanford Binet, the Army Alpha and Beta, and perhaps other 1920-vintage tests to a representative cohort today, to see what the integrated gain has been, and then to compare this with the product of the decade-by-decade gains since 1920.


Question #4: What has the actual IQ gain been since 1920?

   It might be worth noting that Sir Francis Galton, with extensive coaching from his teenage sister, learned to read by two-and-a-half... precocious, but, given that coaching, perhaps not terribly so.

Today, I began to consider the interrelationship between the results of the 1921-22 Terman-Cox screening of 250,000 California schoolchildren and the Flynn Effect.
    The basic idea is that, irrespective of the meaning and the cause of the Flynn Effect, we know that it's defined by a rise in IQ scores of approximately 24 points between 1921, when the Terman screening was conducted, and the present year, 2000. This means that if we gave the 1916 Stanford Binet IQ test to 250,000 present-day California schoolchildren, their average IQ on this test today would be (and presumably is) about 132 (that is, 100/0.76). Or, to say it the way we usually say it, if we adjusted the 1916 IQ scale so that today's children would have an average IQ of 100, then the 1921 California schoolchildren would have an average IQ of 100 - 24 = 76. For the children 16 or over (the children's ages ranged from 4 through 17), this would correspond to a mental age of 16 on their 1921 scale, or about 12 on our year-2000 scale.
    It's worth noting that on the Raven Progressive Matrices, British scores rose 47 points of IQ over 90 years of birth cohorts (1877 through 1967)... an even greater improvement in fluid intelligence. However, since the 1916 Terman revision of the Binet test was the reference instrument used in 1921, we probably need to restrict our attention to the 3-points-per-decade climb in Stanford Binet IQ scores.
    The highest score any of the "Termite" children made in 1921 was an IQ of 201 on an SD = 16 scale, or about 195 on an SD = 15 scale. Consequently, this child had a mental age about twice his or her chronological age. If this child were 8 when given the test, she/he would have had a mental age of 16 on the 1921 scale, or a mental age of about 12 on our year-2000 scale. So this child would have a contemporary ratio IQ of only about 150, or a present-day deviation IQ of only about 143! We may argue about the true meaning of such an IQ score, and about the comparability of scores in those days versus scores today, but given a time machine, that's what we would ostensibly be led to find.
    To carry this a little farther, the 25 children whose IQs fell in the 180-to-201 range in 1921 would putatively be equivalent to today's children with ratio IQs of 137-to-150, or deviation IQs ranging between 134 and 143, with an average present-day IQ of 138. The 51 children whose 1921 ratio IQs fell in the 170-to-180 range would have present-day ratio IQs in the 129-137 range, with deviation IQs between 127 and 134.
    Most of them wouldn't be eligible for Mensa!
Continuing downward,
160 in 1921 becomes 122 in 2000
150 in 1921 becomes 114 in 2000
140 in 1921 becomes 106 in 2000.
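    All of these conversions reduce to a single multiplication, so here is a minimal Python sketch that reproduces the table above, and that also anticipates the Henmon-Nelson figures discussed next. The ratio-to-deviation step is my own rough approximation; the SD values in it are assumptions for illustration, not figures from the text.

    # Re-expressing old ratio IQs on present-day norms. The factor is the
    # old cohort's average re-expressed on year-2000 norms (0.76 for the
    # 1921 Terman sample; 0.82 for the 1942 Henmon-Nelson case below).

    RATIO_SD, DEVIATION_SD = 17.5, 15.0   # assumed SDs, illustration only

    def to_present_ratio(old_ratio_iq, factor):
        """Multiply, don't subtract: IQs are ratios."""
        return old_ratio_iq * factor

    def ratio_to_deviation(ratio_iq):
        """Rough ratio-to-deviation conversion under the assumed SDs."""
        return 100 + (ratio_iq - 100) * DEVIATION_SD / RATIO_SD

    for iq_1921 in (160, 150, 140):
        print(iq_1921, "->", round(to_present_ratio(iq_1921, 0.76)))
    # 160 -> 122, 150 -> 114, 140 -> 106, matching the table above

    print(round(to_present_ratio(168, 0.82)))   # 138: the Henmon-Nelson friend
    print(round(ratio_to_deviation(138)))       # ~133, near the quoted 134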

    Sitting in front of me here, beside my computer, are the "Henmon-Nelson Tests of Mental Ability, High School Examination - Grades 7 to 12 - Forms A and B", published by the Houghton Mifflin Company in 1942. I was given this test in 1943, as was a friend, who earned an IQ of 168 on it. The present generation of year-2000 schoolchildren should average about 122 on this test. Conversely, if this test were renormed to present-day standards, resetting the average IQ to 100, we children of 1942 would have averaged about 82 on it. By these standards, my high-school friend would have a present-day ratio IQ of about 138, or a present-day deviation IQ of 134... barely over the threshold for Mensa.
    I don't believe this! Knowing my friend's mentality today, and the awe in which he is held by his wife and associates, I would have to place him at least above the 99.9th percentile, and probably above the 99.97th. When my friend was three and a half, his uncle wondered aloud what weather was predicted for the next day. My friend piped up and said, in a childish lisp, "The newspaper says there will be inclement weather tomorrow." This is how his family learned that he had taught himself to read. These are not the characteristics of someone with an IQ of 134.
    Two possibilities suggest themselves. One concerns the way in which mental ages above 16 were established on the 1916 Stanford Binet test. You can define a mental age of 16 by simply defining it to be the average mental age of people 16 and over*. However, when you get beyond a mental age of 16, some other strategy has to be employed. If you're willing to wait a few years, until the children whose IQs you've measured grow up, then by selecting a sample with childhood ratio IQs between 145 and 155, you could use their average test results in early adulthood to define an adult ratio IQ of 150, or an equivalent mental age of 24. However, if it's 1916, and you're in a hurry to standardize your new IQ test, you might instead use percentile rankings for adults, assigning a mental age of 24 or above to the top 0.1% of your test-takers. In that case, you would be mixing apples and oranges, using ratio IQs for children with mental ages of 16 or below, and deviation IQs for adults with mental ages above 16. As mental ages go, it doesn't make a huge amount of difference at these IQ levels. If Dr. Terman standardized mental ages above 16 in this hasty manner for his 1916 Stanford Binet test, someone assigned a mental age of 24 on a percentile basis would actually have had a mental age of 25.6 years, or a ratio IQ of 160, if compared with grown children who had taken the Stanford Binet when they were no older than 10.0 years of age.
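    The discrepancy described in that passage can be reproduced directly. In the Python sketch below, the SD of childhood ratio IQs is my own assumption (chosen within the commonly quoted 16-to-20 range so that the numbers line up); the point is only that a top-0.1% percentile cutoff and a ratio IQ of 150 (mental age 24) are not the same thing.

    # How much difference does percentile-based adult norming make?
    from statistics import NormalDist

    RATIO_SD = 19.4        # assumed SD of childhood ratio IQs (illustrative)
    ADULT_CEILING = 16.0   # adult chronological-age cap

    # The top 0.1% of test-takers, expressed as a childhood ratio IQ:
    z = NormalDist().inv_cdf(0.999)   # ~3.09 standard deviations
    ratio_iq = 100 + z * RATIO_SD
    print(round(ratio_iq))            # ~160

    # The mental age that ratio IQ implies at the adult ceiling:
    print(round(ADULT_CEILING * ratio_iq / 100, 1))   # ~25.6 years, not 24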
    I don't know what was done in practice, but I'm sure someone does. I'll have to find out.
    The other possibility, and the one that I favor based upon personal experience, is that, perhaps, somehow, the IQs of those of us who have remained mentally active have risen over the decades in accordance with the Flynn Effect. I find that old IQ tests like these 1942 Henmon-Nelson tests sitting here in front of me, and the 1963 edition of the Slosson Intelligence Test (with a mental-age ceiling of 27) that's also sitting in front of me, seem pretty simple today, but I don't think they would have seemed that simple to me as a teenager. As a teenager, I can remember looking up the word "salient" in the dictionary, and I can remember encountering the word "vapid" on an aptitude test and not knowing what it meant. (I was too old when I took the Henmon-Nelson test to fit below its ceiling.)
    There's more to discuss, but this is today's installment. The Terman Study gives us a solid profile of IQ scores and their distribution in 1921, providing a solid anchor-point in the past. Because those scores were expressed as ratios, it follows that someone with a childhood ratio IQ of 200 on a 1916 test would have a childhood ratio IQ of only about 150 on a test normed in the year 2000. To say it another way, a 10-year-old with an IQ score of 180 in 1921 would have a present-day ratio IQ of about 137, as mentioned above. In other words, the higher the IQ, the greater the Flynn-Effect correction should be.

* - (We know today that mental abilities continue to increase somewhat into the 20's, and that vocabulary can increase even later in life, but that wasn't understood in 1916.)


Tuesday, September 5th:
Pat Flynn has forwarded an article from the Chicago Tribune discussing parents' efforts to stimulate their children's intelligence; it advises them to concentrate on the child's emotional development, not only to produce a mentally healthier child but also to stimulate cerebral development more effectively. To enlist this as an etiological factor in the Flynn Effect, we would have to explain why the rate of rise is approximately equal across developed countries, and why it approximates 3 points per decade. Above all, we would have to document that true intelligence is rising, and not just some facility for performing better on tests. It would seem that whatever parents are doing today would have been incorporated into programs like the Milwaukee Project; however, even there, the children were, perhaps, available only during the day. Perhaps home-based acculturation plays a more prominent role than school-based education. Also, if these new ideas about the role of emotions in infant development are correct, it may be that much more is accomplished when the "training" is family-based and taps emotions in a way that purely intellectual training can't. But it will be important to know whether average infant development has accelerated beyond what it was 100 years ago--not that acceleration would come as a great surprise, but it's an important question.
    One fact is certain: something is boosting IQs in childhood. It would seem to be important to know whether this boost occurs in infancy or later in childhood before IQ tests are administered.

Sunday, September 3rd:
It just occurred to me that one way to assess the significance of the Flynn Effect would be to see whether child development schedules (like the Gesell development guides) have had to be upgraded over the last 80 or 100 years. In the 30's and 40's, babies began to mutter a few words by their first birthday. There are various other criteria that are employed to assess a child's growing state of mental competence. These may be found in IQ tests for children. Have these age thresholds dropped, in keeping with the Flynn Effect for IQ tests? Do most children begin to say a few words at six months these days? At nine months? Or is it still 12 months? Does anyone know?
Saturday, September 2nd:
The book review of "The Rising Curve" and the September "News of the Ultranet" are now on the e-zine newsstands. Unfortunately, the last round of corrections and additions didn't make it into Ubiquity Magazine. Consequently, I'm going to link to the corrected version here.
Friday, September 1st:  This morning, I reviewed a description of William Sidis' early years, written by his mother, Sarah Sidis. Billy began to speak his first words at 6 months, and was taught with alphabet blocks after that. He was allegedly reading the New York Times at 18 months (though that doesn't mean that he understood it). (I was memorizing and reciting poetry at 22 months, but I didn't really understand the words I was prattling.) Elsewhere, he's listed as reading by the age of two. Note that Ken Wolfe and Michael Kearney spoke their first words at 4 months and their first sentences at 6 months; both were reading by the age of one. Adam Konantovich spoke his first sentence at 5 months. Nor are they alone in those capabilities. Greg Smith allegedly began speaking at 2.5 months. Bottom line: I'm thinking that these stories are not inconsistent with the thesis that we're seeing more precocious children, and more of them, than we were 100 years ago. William Sidis was a super-smart individual, but he doesn't seem to me to be unique. Other late-20th-century children sound even more precocious to me than Billy Sidis. There's also the fact that super-smart children may regress toward the mean as they mature, because of uneven maturation, and because genetic predisposition becomes more influential in adulthood. William Sidis' childhood "IQ" might possibly have been somewhat inflated compared to his adult IQ because of the pressure on him to perform as a child. But "might possibly" is probably the proper way to say it. Attempts to boost children's IQs through training and coaching have been remarkably ineffective.
    In thinking about tests like the Raven Progressive Matrices, it occurs to me that there may be techniques and experience factors that could expand someone's scores. Consider number series, which lend themselves better to this typed discussion. If you see a series of one- and two-digit numbers ranging between 1 and 26 and including 1, 5, 9, 15, 21, and/or 25 (for a, e, i, o, u, and y), you're probably looking at a word, with the numbers standing for the letters of the alphabet. Also, your first order of business is to take the differences between adjacent numbers to see if they represent a familiar pattern. A series of squares such as 100  81  64  49  __ is popular in testing circles, as are perfect right triangles: 3  4  5 and 5  12  13; cubes: 1  8  27  64  125  __; sums of integers: 1  3  6  10  15  21  __; and factorials: 1  2  6  24  120  720  __. It may be that when a whole society grows up with these habits of thought, test proficiency is teachable in a way that can't easily be emulated in a classroom over a moderate period of time. Just how much these thinking skills spill over into real-world activities is hard to say, but today's child who knows how to program the VCR, and who can make computers and video games stand up and dance, is a staple of today's urban legends.
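    To illustrate the "differences first" habit, here is a minimal Python sketch of the sort of mechanical procedure a test-wise solver internalizes. The function names are mine; repeated differencing handles polynomial series (squares, cubes, sums of integers), while factorials grow by ratios and need a ratio test instead.

    # Given a number series, take successive differences (and differences
    # of differences) until they become constant, then extend the pattern.

    def differences(seq):
        return [b - a for a, b in zip(seq, seq[1:])]

    def next_term(seq):
        """Guess the next term by repeated differencing."""
        rows = [list(seq)]
        while len(rows[-1]) > 1 and len(set(rows[-1])) > 1:
            rows.append(differences(rows[-1]))
        total = 0
        for row in reversed(rows):
            total += row[-1]
        return total

    print(next_term([100, 81, 64, 49]))      # 36  (descending squares)
    print(next_term([1, 8, 27, 64, 125]))    # 216 (cubes)
    print(next_term([1, 3, 6, 10, 15, 21]))  # 28  (sums of integers)

    # Factorials need a ratio test rather than a difference test:
    seq = [1, 2, 6, 24, 120, 720]
    ratios = [b // a for a, b in zip(seq, seq[1:])]   # 2, 3, 4, 5, 6
    print(seq[-1] * (ratios[-1] + 1))                 # 5040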



Thursday, August 31st: In 1900, it was newsworthy when a child began reading before the age of 3. Even William Sidis apparently wasn't as precocious as some of our late-20th-century prodigies, first picking out alphabet blocks at 1, and being nearly 2 before he started to read. "He had all his meals with us from the time he was six months old. He couldn't creep, and he couldn't walk and he couldn't talk, but he could observe." "...and by the time he was three he read well." --Sarah Mandelbaum Sidis, 1950.