Intelligence has been defined in many ways: the capacity for logic, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving. More generally, it can be described as the ability to perceive or infer information, and to retain it as knowledge to be applied towards adaptive behaviors within an environment or context.
Intelligence is most often studied in humans but has also been observed in both non-human animals and in plants. Intelligence in machines is called artificial intelligence, which is commonly implemented in computer systems using programs and, sometimes, specialized hardware.
HISTORY OF THE TERM
The word intelligence derives from the Latin nouns intelligentia or intellēctus, which in turn stem from the verb intelligere, to comprehend or perceive. In the Middle Ages, the word intellectus became the scholarly technical term for understanding, and a translation for the Greek philosophical term nous.
This term, however, was strongly linked to the metaphysical and cosmological theories of teleological scholasticism, including theories of the immortality of the soul, and the concept of the active intellect (also known as the active intelligence). This approach to the study of nature was strongly rejected by the early modern philosophers such as Francis Bacon, Thomas Hobbes, John Locke, and David Hume, all of whom preferred “understanding” (in place of “intellectus” or “intelligence”) in their English philosophical works.
Hobbes, for example, in his Latin De Corpore, used “intellectus intelligit”, translated in the English version as “the understanding understandeth”, as a typical example of a logical absurdity. “Intelligence” has therefore become less common in English-language philosophy, but it was later taken up (with the scholastic theories which it now implies) in more contemporary psychology.
The definition of intelligence is controversial. Some groups of psychologists have suggested the following definitions:
A very general mental capability that, among other things, involves the ability to reason, plan, solve problems, think abstractly, comprehend complex ideas, learn quickly and learn from experience. It is not merely book learning, a narrow academic skill, or test-taking smarts. Rather, it reflects a broader and deeper capability for comprehending our surroundings—“catching on,” “making sense” of things, or “figuring out” what to do.
Individuals differ from one another in their ability to understand complex ideas, to adapt effectively to the environment, to learn from experience, to engage in various forms of reasoning, to overcome obstacles by taking thought. Although these individual differences can be substantial, they are never entirely consistent: a given person’s intellectual performance will vary on different occasions, in different domains, as judged by different criteria.
Concepts of “intelligence” are attempts to clarify and organize this complex set of phenomena. Although considerable clarity has been achieved in some areas, no such conceptualization has yet answered all the important questions, and none commands universal assent. Indeed, when two dozen prominent theorists were recently asked to define intelligence, they gave two dozen, somewhat different, definitions.
Human intelligence is the intellectual power of humans, which is marked by complex cognitive feats and high levels of motivation and self-awareness. Intelligence enables humans to remember descriptions of things and use those descriptions in future behaviors.
It is a cognitive process. It gives humans the cognitive abilities to learn, form concepts, understand, and reason, including the capacities to recognize patterns, innovate, plan, solve problems, and employ language to communicate. Intelligence enables humans to experience and think.
HERITABILITY OF IQ
Research on the heritability of IQ inquires into the proportion of variance in IQ that is attributable to genetic variation within a population. “Heritability”, in this sense, is a mathematical estimate that indicates an upper bound on how much of a trait’s variation within a population can be attributed to genes. There has been significant controversy in the academic community about the heritability of IQ since research on the issue began in the late nineteenth century.
Intelligence in the normal range is a polygenic trait, meaning that it is influenced by more than one gene; in the case of intelligence, at least 500 genes are involved. Further, explaining the similarity in IQ of closely related persons requires careful study because environmental factors may be correlated with genetic factors.
Twin studies of adult individuals have found a heritability of IQ between 57% and 73%, with the most recent studies showing heritability for IQ as high as 80%. IQ goes from being weakly correlated with genetics in children to being strongly correlated with genetics in late teens and adults. The heritability of IQ increases with age, reaches an asymptote at 18–20 years of age, and continues at that level well into adulthood. This phenomenon is known as the Wilson Effect. However, poor prenatal environment, malnutrition, and disease are known to have lifelong deleterious effects.
Although IQ differences between individuals have been shown to have a large hereditary component, it does not follow that mean group-level disparities (between-group differences) in IQ necessarily have a genetic basis. The current scientific consensus is that there is no evidence for a genetic component behind IQ differences between racial groups.
“Heritability” is defined as the proportion of variance in a trait which is attributable to genetic variation within a defined population in a specific environment. Heritability takes a value ranging from 0 to 1; a heritability of 1 indicates that all variation in the trait in question is genetic in origin and a heritability of 0 indicates that none of the variation is genetic.
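This variance-ratio definition can be made concrete with a toy simulation. In the sketch below, each individual's phenotype is modeled as a genetic component plus an independent environmental component; all numbers (and the additive model itself) are invented purely for illustration, not drawn from any real study of IQ or any other trait.

```python
# Toy illustration of heritability as a ratio of variances: each
# phenotype is modeled as a genetic deviation plus an independent
# environmental deviation. All numbers are invented.
import random
from statistics import pvariance

random.seed(0)
n = 50_000
genetic = [random.gauss(0, 3) for _ in range(n)]   # genetic variance ≈ 9
environ = [random.gauss(0, 2) for _ in range(n)]   # environmental variance ≈ 4
phenotype = [g + e for g, e in zip(genetic, environ)]

# Heritability = genetic variance / total phenotypic variance.
h2 = pvariance(genetic) / pvariance(phenotype)
print(round(h2, 2))  # close to 9 / (9 + 4) ≈ 0.69
```

With these invented variances the heritability lands between 0 and 1, as the definition requires: if the environmental variance were zero, the ratio would be 1; if the genetic variance were zero, it would be 0.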
The determination of many traits can be considered primarily genetic under similar environmental backgrounds. For example, a 2006 study found that adult height has a heritability estimated at 0.80 when looking only at the height variation within families where the environment should be very similar. Other traits have lower heritabilities, which indicate a relatively larger environmental influence. For example, a twin study on the heritability of depression in men calculated it as 0.29, while it was 0.42 for women in the same study.
There are a number of points to consider when interpreting heritability:
Heritability measures the proportion of variation in a trait that can be attributed to genes, and not the proportion of a trait caused by genes. Thus, if the environment relevant to a given trait changes in a way that affects all members of the population equally, the mean value of the trait will change without any change in its heritability (because the variation or differences among individuals in the population will stay the same).
This has evidently happened for height: the heritability of stature is high, but average heights continue to increase. Thus, even in developed nations, a high heritability of a trait does not necessarily mean that average group differences are due to genes. Some have gone further, and used height as an example in order to argue that “even highly heritable traits can be strongly manipulated by the environment, so heritability has little if anything to do with controllability.”
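The point that a population-wide environmental change shifts the mean without touching heritability can be checked directly: variance ignores constant shifts. The sketch below uses invented height-like numbers and a uniform 5 cm improvement standing in for, say, better nutrition.

```python
# Sketch: an environmental change that affects everyone equally raises
# the mean of a trait but leaves its heritability unchanged, because
# heritability only measures variation between individuals.
import random
from statistics import pvariance

random.seed(1)
n = 50_000
genetic = [random.gauss(170, 6) for _ in range(n)]   # cm, invented numbers
noise = [random.gauss(0, 3) for _ in range(n)]

before = [g + e for g, e in zip(genetic, noise)]
after = [h + 5 for h in before]   # e.g. improved nutrition adds 5 cm to all

h2_before = pvariance(genetic) / pvariance(before)
h2_after = pvariance(genetic) / pvariance(after)

print(round(sum(after) / n - sum(before) / n, 6))  # mean rises by exactly 5
print(abs(h2_before - h2_after) < 1e-9)            # heritability unchanged
```

The variance of `after` equals the variance of `before` exactly, so the heritability ratio is untouched even though every individual got taller.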
A common error is to assume that a heritability figure is necessarily unchangeable. The value of heritability can change if the impact of environment (or of genes) in the population is substantially altered. If the environmental variation encountered by different individuals increases, then the heritability figure would decrease. On the other hand, if everyone had the same environment, then heritability would be 100%.
Populations in developing nations often experience more diverse environments than those in developed nations, which would mean lower heritability figures in developing nations. Another example is phenylketonuria, which previously caused intellectual disability in everyone who had this genetic disorder and thus had a heritability of 100%. Today, this can be prevented by following a modified diet, resulting in a lowered heritability.
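The claim that more diverse environments lower heritability follows directly from the variance-ratio definition, and can be sketched with the same toy additive model as before (all parameters invented): holding genetic variation fixed while widening the spread of environments shrinks the genetic share of total variance.

```python
# Sketch: identical genetic variation yields a lower heritability when
# environments are more diverse, because the environmental share of
# total variance grows. All numbers are invented for illustration.
import random
from statistics import pvariance

random.seed(3)
n = 50_000
genetic = [random.gauss(0, 3) for _ in range(n)]   # genetic variance ≈ 9

def heritability(env_sd):
    env = [random.gauss(0, env_sd) for _ in range(n)]
    phenotype = [g + e for g, e in zip(genetic, env)]
    return pvariance(genetic) / pvariance(phenotype)

h2_uniform = heritability(env_sd=1)   # similar environments: near 9/10
h2_diverse = heritability(env_sd=4)   # diverse environments: near 9/25
print(h2_uniform > h2_diverse)        # prints True
```

The genetic component never changes between the two runs; only the environmental spread does, yet the heritability estimate drops sharply.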
A high heritability of a trait does not mean that environmental effects such as learning are not involved. Vocabulary size, for example, is very substantially heritable (and highly correlated with general intelligence) although every word in an individual’s vocabulary is learned. In a society in which plenty of words are available in everyone’s environment, especially for individuals who are motivated to seek them out, the number of words that individuals actually learn depends to a considerable extent on their genetic predispositions and thus heritability is high.
Since heritability increases during childhood and adolescence, and even increases greatly between 16–20 years of age and adulthood, one should be cautious in drawing conclusions about the roles of genetics and environment from studies in which the participants are not followed until they are adults. Furthermore, there may be differences regarding the effects on the g-factor and on non-g factors, with g possibly being harder to affect and environmental interventions disproportionately affecting non-g factors.
Polygenic traits often appear less heritable at the extremes.
A heritable trait is definitionally more likely to appear in the offspring of two parents high in that trait than in the offspring of two randomly selected parents. However, the more extreme the expression of the trait in the parents, the less likely the child is to display the same extreme as the parents. At the same time, the more extreme the expression of the trait in the parents, the more likely the child is to express the trait at all. For example, the child of two extremely tall parents is likely to be taller than the average person (displaying the trait), but unlikely to be taller than the two parents (displaying the trait at the same extreme). See also regression toward the mean.
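The two tendencies described above, children of extreme parents usually expressing the trait but rarely matching the parental extreme, can be illustrated with a simple midparent model. The regression coefficient, parental heights, and residual spread below are all invented for illustration, not empirical estimates.

```python
# Sketch of regression toward the mean: a child's value is modeled as
# the population mean plus a fraction of the parents' deviation from
# that mean, plus random variation. All parameters are invented.
import random

random.seed(2)
pop_mean = 175.0                    # illustrative population mean height, cm
trials = 20_000
taller_than_mean = 0
taller_than_parents = 0

for _ in range(trials):
    father, mother = 200.0, 195.0          # two extremely tall parents
    midparent = (father + mother) / 2
    # Child regresses partway toward the mean, with random variation.
    child = pop_mean + 0.6 * (midparent - pop_mean) + random.gauss(0, 5)
    if child > pop_mean:
        taller_than_mean += 1
    if child > min(father, mother):
        taller_than_parents += 1

print(taller_than_mean / trials)     # most children are above average
print(taller_than_parents / trials)  # few are as tall as either parent
```

Under these invented parameters the children cluster well above the population mean but clearly below the parental extreme, which is exactly the pattern the paragraph describes.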