So you think you're smart? Well, prove it.

Unfortunately, despite nearly a century and a half of testing and decades of neuroscience, you can't prove you're intelligent. Our inability to define one of humanity's basic traits has led to some pretty weird ideas about intelligence — and could ultimately undermine our quest for strong artificial intelligence, too.

Intelligence Tests

Some of the first standardized intelligence tests, called the "Imperial Examinations," were developed over 1500 years ago in China. For over a millennium, the government chose its regional bureaucrats based on their mastery of five basic areas of knowledge, including the arts, law, and military strategy. Some historians call this the first state effort to create a meritocracy, though of course many citizens found these tests unfair and biased towards the elites.

Perhaps the most famous measure of intelligence is the so-called IQ test, which is actually an umbrella term for a whole range of tests that have been used since the early twentieth century for purposes both legitimate (such as assessing mental competence to serve in the military) and deeply illegitimate (such as deciding whether low-income black women should be sterilized). Over a century ago, the German psychologist William Stern coined the term "intelligence quotient," mostly as a way to characterize developmental disabilities (called "retardation" in the language of the day); educators and eugenicists alike soon took it up.

Of course IQ tests have changed a lot in the years since, but the basic scoring system remains the same: 100 is the mean, scores are scaled so that the standard deviation is 15 points, and 95% of the population falls within two standard deviations of the mean, which is to say that 95% of people have an IQ between 70 and 130. How is IQ measured? You'd be surprised at the range of ideas. These days IQ tests generally focus on verbal abilities, though in the early part of the twentieth century they tested non-verbal abilities too. Most tests measure what the American psychologist Louis Leon Thurstone characterized as "verbal comprehension, word fluency, number facility, spatial visualization, associative memory, perceptual speed, reasoning, and induction."
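To make the arithmetic concrete, here's a minimal sketch in Python, assuming the conventional scaling (mean 100, standard deviation 15) implied by the 70-130 range above; the numbers fall straight out of the normal distribution.

```python
from statistics import NormalDist

# IQ scores are conventionally scaled to a normal distribution with mean 100;
# the 70-130 range quoted above implies a standard deviation of 15.
iq = NormalDist(mu=100, sigma=15)

# Fraction of the population within two standard deviations (70 to 130).
within_two_sd = iq.cdf(130) - iq.cdf(70)
print(f"Share of population between 70 and 130: {within_two_sd:.1%}")  # ~95.4%

# Percentile rank of a particular score, e.g. an IQ of 115 (one SD above the mean).
print(f"An IQ of 115 beats about {iq.cdf(115):.1%} of the population")  # ~84.1%
```

Strictly speaking, the "95%" in the text is about 95.4%; the two-standard-deviation figure is a rounded rule of thumb.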

There are many kinds of IQ tests, and plenty of them are available to take online.

Various studies, all of which have been disputed at one time or another, correlate a high IQ score with everything from good grades in school to economic success and longevity. Low IQ has been correlated with crime. In recent years, low IQ has also been associated with racism, which is rather ironic given that some of the early adopters of IQ tests were eugenicists.

Few would dispute that IQ seems to measure a person's ability to succeed on standardized tests, a skill which often translates well into getting good grades and obeying the law. And both those skills can help a person get a college education, and a middle class job. But is "success" the same thing as being smart? Many would argue that it isn't.

Other Kinds of Intelligence

IQ doesn't really measure things like ethics. Nor does it measure emotional intelligence, usually defined as the ability to control one's impulses, empathize with others, and forge social alliances. It also doesn't capture that ineffable quality of genius, which has nothing to do with holding down a respectable job or obeying rules, and everything to do with creating works of lasting importance.

Today, cognitive scientists are trying to pin down the mental qualities that make for what Malcolm Gladwell calls "outliers," people whose skills may not register on an IQ test. In his book On Intelligence, Jeff Hawkins argues that prediction is the key to intelligence. By that he means that the smartest people are the ones who can accurately predict the outcomes of any given situation, usually by modeling different scenarios in their heads. He also means something more fundamental: part of what makes human brains unique is our ability to do this kind of predictive modeling.

Neuropsychologist Elkhonon Goldberg, in a fascinating book called The New Executive Brain: Frontal Lobes in a Complex World, takes apart our myths about how the brain works piece by piece, including what constitutes "intelligence." He describes research into "moral development" and the frontal lobes of the brain. This research suggests that we develop a sense of right and wrong, as well as impulse control, in the first few months of our lives, entirely as a result of the social contact we have with the people around us. That contact literally changes the structure of the frontal cortex. People deprived of it in those crucial formative months may be more vulnerable to mental disorders related to impulse control, and may be predisposed to anti-social acts like lying or abusing other people.

And these are just a few of the ways that thinkers have defined intelligence.

If we take moral, emotional, and predictive abilities into account, we have to assume that smart people aren't just good at remembering numbers and rotating shapes in their heads. They are also socially adept and ethically sensitive, and they make decisions by rationally modeling possible outcomes. Of course, we have ample evidence that people can be quite gifted and also incapable of planning anything. And certainly we've all met people who were undeniably smart but socially awkward.

Artificial Intelligence

Given that we still can't agree on what constitutes human intelligence, how will we ever invent artificial intelligence (AI)? Some, like the philosopher John Searle, argue that we will simply never have what he calls "strong" AI, which is to say an AI equivalent to a human mind ("weak" AI is something like what Google does when it auto-completes your search terms). Searle claims that the main difference between humans and computers is that humans understand the meaning and implications of language (semantics), while a machine can only manipulate the structure of language (syntax).

He famously devised a thought experiment called the "Chinese room," in which he compared a computer to an English speaker locked in a room, using a set of rules to assemble Chinese ideograms into replies that make sense. To an observer outside the room, the person would appear to understand Chinese, when all they really understand is how to put the ideograms together.
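As a loose illustration (not Searle's own formulation), here's a toy version of the room in Python. The rulebook and phrases are invented for this sketch; the point is that the program produces plausible-looking replies by pure pattern-matching, with no representation anywhere of what the symbols mean.

```python
# A toy "Chinese room": purely syntactic rule-following, no semantics.
# The rulebook is invented for illustration; a real system would need vastly
# more rules, but the principle is identical: symbols in, symbols out.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
    "今天天气怎么样？": "今天天气很好。",  # "How's the weather?" -> "It's lovely today."
}

def chinese_room(message: str) -> str:
    # Match the incoming symbol string and emit its paired output string.
    # Nothing in this function "understands" Chinese; it is string lookup.
    return RULEBOOK.get(message, "对不起，我不明白。")  # "Sorry, I don't understand."

print(chinese_room("你好吗？"))  # a fluent-looking reply the program cannot interpret
```

However large the rulebook grows, nothing in the program ever refers to meaning; that, in essence, is Searle's distinction between syntax and semantics.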

People like Hawkins, the author of On Intelligence, would argue against Searle, as would Marvin Minsky and other proponents of strong AI. They believe that a human-equivalent AI could be invented, and each makes the case by defining intelligence in a way that could eventually be translated into silicon.

Alan Turing, a pioneer of computer science, devised what is today called the Turing Test to distinguish between artificial and human intelligence: if a machine can hold a conversation well enough that a human judge can't tell it from a person, it counts as thinking. His hopeful idea in 1950 was that once a machine passed the test, we'd have a thinking machine. Since that time, however, several programs have passed the test (including, recently, an NPC in a videogame), and nobody has hailed this as the moment when AI roams free.

Ultimately we may never have strong AI, but not because we haven't actually invented an intelligent computer. Instead, it will be because nobody can agree on what intelligence truly is.

Sources linked in the body of the text.

Illustration by gornjak via Shutterstock