Can the Flynn Effect Dwindle As the IQ Rises?

 

    All IQs in the following discussion are based upon a standard deviation of 16.
    In arguing that the Flynn Effect cannot be dependent upon IQ, I assumed that the Flynn Effect acts uniformly over all age groups. But suppose it doesn't. Suppose that someone who today has a deviation IQ of 143 (nominally equivalent to a ratio IQ of 150) actually has a present-day ratio IQ of only 125 (present-day mental age of 20). And suppose that someone who today has a deviation IQ of about 174 (nominally equivalent to a ratio IQ of 200) really has a present-day ratio IQ of only 150 (present-day mental age of 24). In other words, suppose that the upper half of the year-2002 intelligence distribution is only half as wide as it was in 1916, with a standard deviation of 8 instead of 16. After all, if we measure adult IQs (and, since 1986, even children's IQs) only as deviation IQs (that is, as frequencies of occurrence), we have no idea what their actual levels of capability are.
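    For concreteness, here is a small Python sketch of the supposition just described. The halving rule and the adult chronological-age divisor of 16 are the assumptions stated above; the pairing of deviation IQs 143 and 174 with nominal ratio IQs of 150 and 200 is likewise the one used here, not an independently derived conversion.

        # Sketch of the "shrunken upper half" supposition: the excess of a
        # nominal ratio IQ over 100 is halved to get the present-day ratio IQ.
        def hypothesized_present_ratio_iq(nominal_ratio_iq):
            return 100 + (nominal_ratio_iq - 100) / 2

        def mental_age(ratio_iq, chronological_age=16):
            # Adult ratio IQs here use a chronological-age divisor of 16.
            return ratio_iq / 100 * chronological_age

        for deviation_iq, nominal_ratio in ((143, 150), (174, 200)):
            present_ratio = hypothesized_present_ratio_iq(nominal_ratio)
            print(deviation_iq, present_ratio, mental_age(present_ratio))
        # -> 143 125.0 20.0
        # -> 174 150.0 24.0
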
    The average present-day young adult (present-day IQ = 100, present-day mental age = 16) will score a 1916 IQ of 133 (1916 mental age = 21.33) on the 1916 edition of the Stanford Binet IQ test, or on a comparable IQ test of that vintage.
    Today's citizen with a deviation IQ of 143 (which would have corresponded in 1916 to a ratio IQ of 150, but which would only correspond to a ratio IQ of 125 today, with a present-day mental age of 20) would have a 1916 mental age of 4/3 X 20 = 26.67, and a 1916 ratio IQ of 100 X 26.67/16 = 166.67.
    Today's citizen with a deviation IQ of 174 (standard deviation = 16; equivalent in 1916 to a ratio IQ of 200, but equivalent to a ratio IQ of only 150 today, with a present-day mental age of 24) would have a 1916 mental age of 4/3 X 24 = 32, and a 1916 ratio IQ of 100 X 32/16 = 200 (a 1916 deviation IQ of 174, standard deviation = 16).
    This would give us a situation in which past geniuses with ratio IQs of 200 would be as bright as today's phenomenally gifted adults with ratio IQs of 200 (deviation IQs of 174, standard deviation = 16). At the same time, it would explain how today's average young adult comes to score 133 on a 1916-era IQ test.
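    The arithmetic of the last three paragraphs can be run mechanically. The 4/3 mental-age factor between 2002 and 1916, and the adult chronological-age divisor of 16, are the assumptions used above; a minimal Python sketch:

        ADULT_CA = 16          # chronological-age divisor for adult ratio IQs
        FLYNN_FACTOR = 4 / 3   # today's mental age of 16 scores a 1916 mental age of 21.33

        def ratio_iq(mental_age, chronological_age=ADULT_CA):
            # Classic ratio IQ: 100 x mental age / chronological age.
            return 100 * mental_age / chronological_age

        for present_ma in (16, 20, 24):            # the three cases above
            ma_1916 = FLYNN_FACTOR * present_ma    # mental age on the 1916 scale
            print(present_ma, round(ma_1916, 2), round(ratio_iq(ma_1916), 2))
        # -> 16 21.33 133.33
        # -> 20 26.67 166.67
        # -> 24 32.0 200.0
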
    But I don't think it holds water. Here's why.
    In 1988, Dr. Miraca Gross identified 15 children from South Australia, Victoria and the Australian Capital Territory with ratio IQs of 160 or higher on the Third Revision (1973) of the Stanford Binet Intelligence Test. Three of these children obtained an IQ score of 200 or above. One of these, Christopher Otway, hit the ceiling of the test, with a mental age of 22 at the chronological age of 11. A few months later, "At the age of 11 years, 4 months he achieved the phenomenal score of 710 on the Scholastic Aptitude Test-Mathematics (SAT-M), and 580 on the Scholastic Aptitude Test-Verbal (SAT-V). To extend the testing further, the psychologist administered the Wechsler Adult Intelligence Scale (WAIS-R). Here Christopher performed at the absolute maximum on abstract reasoning and arithmetic, placing him in the 'very superior' range even compared to adults."
    The Third Revision of the Stanford Binet Intelligence Test would now be 29 years out of date, leading to a Flynn Effect of about 9 points at the average-IQ level. Since we're dealing with an IQ at the 200 level, and since I think that the magnitude of the Flynn Effect is directly proportional to the IQ (for above-average IQs), I'm going to assume that Chris' 1988 IQ of 200 can be de-rated by 18 points because of the Flynn Effect. On the other hand, since he bumped his head against the ceiling of the test, I'm going to assume that his actual IQ was somewhat higher than 200. In fact, I'm going to assume that it was about 218, which will allow us to retain a mental age of 22 as measured by today's standards. His IQ in 1988 would then have been 218 relative to other 11-year-olds. His adult IQ on today's scale would be 200 because of the Flynn Effect. But we have assumed above that a present-day IQ of 200 corresponds to a present-day mental age of only 24, so Chris' mental age at 16 and beyond would be only 24. Now let's see. When he was 11, he had a mental age of 22. At 16, his mental age was 24. Consequently, he would have added only 2 years to his mental age between the chronological ages of 11 and 16. In other words, his mental age was increasing by 2 years per year up to the age of 11, and then slowed down to an average of 0.4 years per year between 11 and 16.
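    Here is the same chain of assumptions laid out as a short Python check. The 18-point de-rating, the 218 starting figure, and the adult mental-age ceiling of 24 are the assumptions stated above, not measured facts.

        iq_1988 = 218                   # assumed true score above the test's ceiling
        flynn_derating = 18             # assumed 29-year de-rating at this IQ level
        present_iq = iq_1988 - flynn_derating   # -> 200 on today's scale
        ma_at_11 = present_iq / 100 * 11        # -> 22 years of mental age at age 11
        ma_at_16 = 24                   # adult mental age for a present-day ratio IQ of 200
                                        # under the shrunken-upper-half hypothesis

        print(ma_at_11 / 11)                       # -> 2.0 mental-age years per year up to age 11
        print((ma_at_16 - ma_at_11) / (16 - 11))   # -> 0.4 mental-age years per year from 11 to 16
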
    I don't believe that happened. Do you?

To Summarize:
    As recently as 1988, ratio IQs were alive and well for children with IQs above 200, up to at least the age of 11. Presumably Dr. Linda Silverman's Gifted Development Center has measured many more of these phenomenally high IQs that are based upon mental ages, as provided by the Third (1973) Revision of the Stanford Binet Intelligence Test.

Tests That Could Test This Hypothesis:
    The Slosson Full-Range Intelligence Test (S-FRIT, 1993) extends upward to a mental age of 27.
    Form S of the California Test of Mental Maturity ranges upward to a ceiling mental age of 32. (American Mensa has bought the rights to this test, and has assigned a deviation IQ ceiling of 168 to it . . . a number that seems generous considering the 45 years that have elapsed since Form S was published.) Its ceiling would have to be de-rated by about 4 years of mental age to compensate for the Flynn Effect, giving it a mental-age ceiling of about 28 today.
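    The 4-year figure is roughly what falls out of the assumptions used earlier in this piece (about 0.3 points of Flynn Effect per year at the mean, scaled in proportion to IQ for above-average scores); a quick Python sketch:

        years = 45
        points_at_mean = 0.3 * years               # roughly 13.5 points over 45 years
        ceiling_ratio_iq = 100 * 32 / 16           # mental age 32 -> ratio IQ 200
        derating = points_at_mean * ceiling_ratio_iq / 100   # roughly 27 points at IQ 200
        derated_ma = (ceiling_ratio_iq - derating) / 100 * 16
        print(round(derated_ma, 1), round(32 - derated_ma, 1))   # -> 27.7 4.3
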
    If we can find a significant number of subjects who score higher than a mental age of 24 on either of these tests, then we will have succeeded in falsifying the hypothesis that the above-average adult intelligence distribution has shrunk to half its 1916 width.
    We can begin with me. I had a mental age of 29 years on the CTMM when I took it in 1979, and of 30 years, 8 months when I took it in 1999 (before correcting for the Flynn Effect). The first time I took the test, in 1979, I made 3 errors each on two of the 7 subtests (Similarities and Verbal Concepts), with perfect scores on the other 5 subtests. (It would seem that I was limited by the ceiling of the test.) The second time I took the test, in 1999, I made perfect scores on 6 of the 7 subtests, with the same 3 errors on the Similarities subtest. Correcting these numbers for 45 years of Flynn Effect yields 25.55 years of mental age for the first try in 1979, and 27 years of mental age for the second attempt in 1999. These corrected mental ages correspond to present-day ratio IQs of 159.7 and 167.5. If our hypothesis about the upper half of the IQ distribution having shrunk to half its width were correct, my present-day mental ages of 25.55 and 27 would translate to deviation IQ scores of 184 (1 in 13,000,000) and 192 (1 in 225,000,000), respectively. And this on a test on which I got perfect scores on first 5, and then 6, of the 7 subtests! Sorry. I like it, but it doesn't fit.
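    The rarities quoted for those two deviation IQs can be checked directly against the normal curve (standard deviation 16); a small Python check using only the standard library:

        from math import erfc, sqrt

        def one_in_n(deviation_iq, mean=100, sd=16):
            z = (deviation_iq - mean) / sd
            upper_tail = 0.5 * erfc(z / sqrt(2))   # probability of scoring this high or higher
            return 1 / upper_tail

        print(round(one_in_n(184) / 1e6))   # -> 13   (about 1 in 13,000,000)
        print(round(one_in_n(192) / 1e6))   # -> 224  (about 1 in 225,000,000)
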

    The idea that the Flynn Effect diminishes to the vanishing point as IQs rise would allow the great minds of the past to possess IQs as high as those of the present day. Unfortunately, I don't think that's the case. I think the Flynn Effect is proportional to both elapsed time and IQ. However, it should be very easy to test this hypothesis simply by giving old tests to present-day subjects with known present-day IQs, and looking at the resulting IQ distributions on those old tests.