

How much of human intelligence is genetic versus acquired? Is it even possible to get smarter?

Intelligence, a blend of genetics and environment, centers on problem-solving abilities and is best enhanced through learning and formal education.

Tibi Puiu
May 20, 2024 @ 6:22 pm


AI-generated image. Credit: DALL-E 3.

Is intelligence a gift of nature or a product of nurture? This question has intrigued psychologists for more than a century, leading to extensive research and debate. The allure of enhancing intelligence has fueled a bustling, multi-billion-dollar ‘brain-boosting’ supplement market for those seeking cognitive enhancements. Yet the effectiveness of these products remains dubious; most lack regulatory approval and scientific backing.

Our society reveres genius and stigmatizes perceived lesser intellects, which explains why some people are obsessed with becoming smarter. But can intelligence be increased? Is it genetically predetermined? Does it wane with age? And what does ‘intelligence’ mean in the first place? Let’s dive into what the latest science has to say about all this.

What is intelligence anyway? What about IQ?

There is no clear-cut definition of what constitutes intelligence. The consensus among scientists in this field is that intelligence refers to a set of mental abilities for problem-solving. There are many subsets of intelligence, including verbal ability, numerical ability, spatial ability, and even “emotional intelligence” (the ability to manage your own emotions and to understand the emotions of people around you).

However, the most important subset seems to be “general intelligence”, also known as the “g factor” among intelligence researchers. This factor plays a major role in differentiating individuals in standardized IQ tests, contributing to at least half of the variance observed in these tests’ scores. The g factor is closely related to fluid intelligence, which involves problem-solving and reasoning skills.

The first attempts to quantify intelligence date back to the 1800s. Sir Francis Galton, an English polymath, pioneered quantitative methods to study intelligence. He coined the phrase ‘nature versus nurture’ in 1874, framing a debate that persists today: are smart people mostly born that way, or is intelligence acquired through life?

Another key figure, Charles Spearman, proposed a ‘General Intelligence’ factor in 1904, suggesting a single underlying cognitive ability. Around the same time, psychologist William Stern introduced the concept of the ‘intelligence quotient’ (IQ) as a comparative measure of intelligence. Meanwhile, psychologists Alfred Binet and Théodore Simon developed intelligence tests for children, which eventually led to the standardization of IQ tests.

However, these tests were not without criticism. Even Stern, the originator of the IQ concept, recognized their limitations, emphasizing the inherent value and complexity of individual psychological lives.

Decades later, psychologist Raymond Cattell challenged the idea of a singular intelligence factor, proposing a split between two kinds of ability: fluid intelligence, which covers reasoning and novel problem-solving, and crystallized intelligence, which manifests through the use of knowledge previously acquired through education or experience.

How accurate are IQ scores?

Do IQ scores accurately predict general intelligence? Critics argue that factors like socioeconomic status can skew IQ scores, potentially misrepresenting innate intelligence.

However, the consensus among experts in the field of intelligence research is that IQ is a good proxy for intelligence. Studies show that high-IQ people are more educated, achieve better grades at school, have higher incomes, are healthier and happier, are less likely to develop an addiction, and generally have higher social status than those with lower scores.

“The question is, does an IQ test that is administered to a young child predict these later life outcomes that we believe are indicative of intelligence? The answer to this question is an overwhelming ‘yes.’ IQ tests have enormous predictive capacity,” Louis Matzel, an expert from Rutgers University, wrote in a MetaFact review.

However, there are caveats. IQ tests generally fail to capture broader aspects of intelligence such as creativity, practical skills, emotional and social intelligence, and wisdom (common sense).

Intelligence is mostly genetic

Intelligence is undeniably influenced by genetics, yet environmental factors also play a significant role. Twin and adoption studies reveal that genetic influence grows with age, and by adolescence, a significant portion of IQ differences can be attributed to genetics.

“There is a tremendous amount of empirical research clearly showing that intelligence is hereditary,” Dimitri van der Linden from Erasmus University Rotterdam responded to a MetaFact survey. “The estimates range somewhere between .50 to .80 (and are likely closer towards the latter).”
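Those estimates refer to heritability: the share of the variation in measured IQ across a population that is statistically associated with genetic differences. As a rough sketch in standard quantitative-genetics notation (the symbols here are conventional shorthand, not taken from the survey itself):

H^2 = Var(G) / Var(P) = Var(G) / (Var(G) + Var(E))

where Var(G) is the genetic variance, Var(E) the environmental variance, and Var(P) the total observed (phenotypic) variance. A heritability of .80 therefore means that roughly 80 percent of the differences between people in a population trace to genetic differences. It does not mean that 80 percent of any one person’s intelligence is fixed by their genes.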

However, the expression of genes is shaped by environmental factors, and small genetic predispositions can be influenced by nurturing environments. This interplay between genes and environment shouldn’t be underestimated or overlooked.

Can you improve intelligence? Yes, but not with gimmicky brain games

Contrary to the belief that intelligence is fixed, research shows it is subject to change. The ‘Flynn effect’ describes a steady rise in average IQ scores over the 20th century, associated with improvements in education, especially growing access to higher education. Additionally, schooling correlates with increased intelligence: each year of education adds an estimated 1 to 5 IQ points. So, yes, technically it is possible to raise your IQ score.

These effects are explained by the brain’s neuroplasticity — its ability to reorganize and form new neural connections throughout life, allowing it to adapt in response to learning, experience, or injury.

“The brain is neuroplastic and keeps changing,” argued Gavin Brown from the University of Auckland in a MetaFact thread. “Stimulus from environments which requires flexibility in processing and exploitation of schematized structures keeps the brain active forming new paths.”

However, brain-training games, the kind you see aggressively advertised almost everywhere online, do not work as promised. The most you can hope for from these games is to get better at their very specific, limited tasks, which can be said of anything you practice. There are no general benefits. Nor is there evidence that brain games can help stave off dementia, as some companies claim.

“Meta-analytic reviews of the empirical literature indicate either tiny or absent gains,” Nachshon Meiran from the Ben-Gurion University of the Negev told MetaFact. “In my opinion, given what we know, it is unfair (or worse) to promise otherwise.”

Focused cognitive training may temporarily boost specific abilities, but these effects often fade. The most substantial gains in intelligence appear to stem from formal education, which exercises the mind and brain through the acquisition of real knowledge.

This article appeared in January 2024 and was recently updated with new information.

