How To Beat Your Futuristic Lie Detector

Scientists (and the companies that fund them) claim there's a new, better wave of lie-detection technology ahead: functional MRIs and electroencephalograms that will make sure you're telling the truth. But one professor says they'll be easy to beat.

Professor Melissa Littlefield doesn't buy the hype surrounding functional Magnetic Resonance Imaging (fMRI) scans and EEG technologies applied to lie detection. In an article due to be published in Science, Technology, & Human Values:

"Far from describing the brain and its functions, fMRI and [the EEG technology] Brain Fingerprinting® produce models of the brain that reinforce social notions of deception, truth and deviance," she concludes in the paper's abstract.

Littlefield argues that lies are complex behaviors that we don't yet have the technology to identify in the brain in any definitive way — interpreting them requires more than noting which broad regions of the brain are active at any given time.

Worse yet, she thinks these technologies rest on a minimal understanding of what the act of lying actually is:

Finally, she said, "they share this assumption that truth and deception are somehow connected. In deception studies, if you're looking at the polygraph or you're looking at the fMRI, the assumption is that truth is the baseline: the factual, the basic, the natural. And to lie is to add a story on top of the truth."

Some people don't actually know that they're lying, or have told a lie for so long that it has become their subjective interpretation of reality. Lie detectors function on the assumption that there is a single, knowable, objective truth that the lie-teller knows and is distorting or consciously withholding. But reality is always open to subjective interpretation, and lying takes real effort for some people and none at all for others. Lie detectors might work — if you know you're lying, if lying requires effort, if the lie conflicts with your subjective experience of "reality," and if you even recognize reality.

And when it comes to fMRIs and EEGs, there are surprisingly simple ways to cheat — the high-tech equivalent of taking a Xanax or putting a tack in your shoe.

"Protocols are such that if you didn't want to have your brain scanned, all you'd have to do is clench your teeth or move your head, and it would create artifacts in the images, and then you can't use them, luckily."

So until we get a Ceti eel or two, you'll just have to grit your teeth when you lie.

Scholar unconvinced new lie-detection methods better than old ones [PhysOrg]