Why I Find This Exciting!

    I mentioned that the question of whether IQ distributions are ln-normal is important. Let me tell you why I think so, and what I hope the consequences might be.
    I should probably disclaim any expert knowledge of intelligence and intelligence testing. My remarks are exploratory, and will undoubtedly change as I learn more, and as some of us explore this with experts.
    Right now, I have the impression that there is some confusion about the relationship between old-fashioned ratio IQs and "deviation IQs"--indeed, that there is no definitive relationship between the two. If we are correct that the natural logarithms of the ratios of mental age to chronological age are Gaussian-distributed--and there's evidence that suggests this--then a precise mathematical relationship will have been forged between those ratios (essentially, ratio IQs) and their frequencies of occurrence.
    It's the precise mathematical relationship that I find exciting. 
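    To spell out what that relationship would look like, here is my own sketch in LaTeX notation. It assumes the log-ratios are Gaussian with mean 0 and standard deviation sigma; both are my placeholder symbols, not values taken from any published test norms.

% Assumption (mine): the log-ratio of mental age (MA) to chronological
% age (CA) is Gaussian with mean 0 and standard deviation \sigma.
X \;=\; \ln\!\left(\frac{MA}{CA}\right) \;\sim\; \mathcal{N}(0,\,\sigma^{2})

% Then the ratio IQ, R = 100 \cdot MA/CA, would be log-normally
% distributed, with frequency-of-occurrence (density) curve
f_R(r) \;=\; \frac{1}{r\,\sigma\sqrt{2\pi}}\,
  \exp\!\left(-\,\frac{(\ln r - \ln 100)^{2}}{2\sigma^{2}}\right),
  \qquad r > 0.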
    Our proposition is that the logarithms of IQs are distributed along the kind of Gaussian curve that IQs themselves could never be made to fit. These natural logarithms of the mental-age/chronological-age ratios, multiplied by 100 and added to 100, are what are termed deviation "IQs". This (we hope) establishes (for the first time?) an exact relationship between IQs and their frequencies of occurrence.
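    As a concreteness check, here is a minimal Python sketch of that ratio-to-deviation mapping. The function name and the sample values are mine, chosen purely to illustrate the formula just stated; this is not any standard psychometric conversion.

import math

def deviation_iq(ratio_iq: float) -> float:
    """Convert a ratio IQ (100 * mental age / chronological age) to a
    deviation IQ under the proposition above: take the natural log of
    the mental-age/chronological-age ratio, multiply by 100, add 100.
    A sketch of the claimed mapping, not an established formula.
    """
    return 100.0 + 100.0 * math.log(ratio_iq / 100.0)

# A few illustrative values (hypothetical, for orientation only):
for r in (100, 116, 132, 150, 200):
    print(f"ratio IQ {r:3d} -> deviation IQ {deviation_iq(r):6.1f}")

    Note that the two scales agree at 100 and diverge as you move out into the tails, with the deviation scale compressing the high ratio IQs.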
    As I read the tea leaves, one of the consequences is that ratio IQs, and therefore mental ages, would seem to exist, and to be natural phenomena. If so, Alfred Binet didn't invent the concept of mental age; he discovered it.
    Dr. Guy Fogleman has observed that if the logarithms of IQs are Gaussian-distributed, then the factors that give rise to intelligence would be multiplicative rather than additive (since the logarithm of a product is the sum of the logarithms of its factors). Chris Langan has pointed out that the Central Limit Theorem would ensure that even if the logarithms of the individual multiplicative factors are not Gaussian-distributed, the sum of a large number of them would yield an approximately normal distribution.
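    A quick simulation makes that Central Limit Theorem point vivid. In this sketch (my own; the uniform factor distribution and the factor counts are arbitrary illustrative choices, and nothing here models real physiology), the product of thirty decidedly non-log-normal factors already yields nearly Gaussian log-products:

import math
import random

random.seed(42)

# Sketch of the Fogleman/Langan point: if a trait is the *product* of
# many independent positive factors, its logarithm is the *sum* of
# their logs, and the Central Limit Theorem pushes that sum toward a
# Gaussian -- even though the individual factors here (uniform on
# [0.8, 1.25]) are not themselves log-normal.

N_FACTORS = 30      # factors multiplied together per "individual"
N_PEOPLE = 100_000  # simulated individuals

log_products = []
for _ in range(N_PEOPLE):
    logs = [math.log(random.uniform(0.8, 1.25)) for _ in range(N_FACTORS)]
    log_products.append(sum(logs))  # log of the product of the factors

# Crude normality check: sample skewness should sit near 0 for a Gaussian.
n = len(log_products)
mean = sum(log_products) / n
var = sum((x - mean) ** 2 for x in log_products) / n
skew = sum((x - mean) ** 3 for x in log_products) / (n * var ** 1.5)
print(f"mean of log-products     : {mean:+.4f}")
print(f"std. dev. of log-products: {var ** 0.5:.4f}")
print(f"skewness (~0 if Gaussian): {skew:+.4f}")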
    So what are these multiplicative factors, and how would this square with the many factor-analytic studies that support the existence of a single "g" factor for intelligence? My notion would be that these multiplicative factors would have to act in series in the generation of "g". They might consist of such physiological substrates as the thickness of the middle layer of the neocortex, nerve conduction speeds, and probably other factors whose product would determine "g". They would have to be factors for which we don't currently test directly, or they would have shown up in conventional factor analyses.

(To be continued)