Monday, September 15, 2008

The Truth Machine

Ron Bailey at Reason Magazine has a blog post on research into Truth Machines.

The New York Times is reporting that a criminal court in India accepted a brain scan as evidence of guilt in a murder trial earlier this year. The developer of the Brain Electrical Oscillation Signature (BEOS) test claims that it uses electrodes to detect when regions of the brain "light up" with guilty knowledge.

According to the Times:

The woman, Aditi Sharma, was accused of killing her former fiancé, Udit Bharati. They were living in Pune when Ms. Sharma met another man and eloped with him to Delhi. Later Ms. Sharma returned to Pune and, according to prosecutors, asked Mr. Bharati to meet her at a McDonald's. She was accused of poisoning him with arsenic-laced food.

Ms. Sharma, 24, agreed to take a BEOS test in Mumbai, the capital of Maharashtra. (Suspects may be tested only with their consent, but forensic investigators say many agree because they assume it will spare them an aggressive police interrogation.)

As the Times points out, most U.S. experts doubt that the BEOS technology has been properly validated. However, neuroscience researchers are working toward creating such a "truth machine." As the Times notes:
....

Back in 2001, I looked at the status of brain scanning technologies and pointed to some of the possibilities that fully validated versions would offer for abuse by government, as well as some implications for the future of privacy:

...James Halperin, author of the 1996 science fiction novel The Truth Machine..., notes an interesting convergence in current fMRI and brainwave research since his fictional "Cerebral Image Processor" measured a combination of electrical activity and blood flow. In The Truth Machine, Halperin illustrates the benefits and problems that the pervasive availability of an infallible lie detector would cause society. It is easy to see some of the benefits -- detecting would-be terrorists, finding politicians who tell the truth during campaigns, detecting honesty in meeting contractual obligations. But what about those areas of life we would like to keep private, say, one's sexual orientation, or unusual religious beliefs, or drug habits, or taste in pornography? Halperin suggests that right now, many of us tolerate laws and regulations on many of these private activities because we know that we are not likely to be caught when we violate them. In a world where the truth can be known absolutely, Halperin thinks laws regulating many private activities would be repealed and there would be areas of life in which the use of a truth machine itself would be banned.

Whole column here.

Constitutional issues are also discussed in the column.
 
Privacy may well suffer once such a machine is developed. For now, at least, the machine as currently designed requires a close connection with the subject -- wearing electrodes or sitting inside a scanner, for instance. If a form of this gadget can be built that reads at a distance, there will no longer be such a thing as "in the privacy of your own thoughts".
