Tuesday, December 27, 2005

Do we need a new Fourth Amendment?

Matt Yglesias's musings have me thinking about the problems of applying the Fourth Amendment in a new day and age. Suppose that the government develops a robot designed to enter houses, independently, and search for signs that the inhabitants are terrorists. Suppose that there aren't that many terrorists out there, but that the robot is very, very good at identifying them, and very, very good at not identifying non-terrorists. This smells like a Fourth Amendment violation: the police aren't allowed to search randomly through houses to find criminals, and the robot surely violates the right of everyone whose home is searched to be free of such searches.
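To put rough, purely illustrative numbers on "very, very good," here is a quick back-of-the-envelope calculation in Python (every figure below is invented for the sake of the hypothetical). Even a robot that identifies 99% of terrorist households and wrongly flags only one in a thousand innocent ones would, with terrorists this rare, single out far more innocent homes than guilty ones:

# A toy base-rate calculation; every number here is invented for illustration.
terrorist_households = 1_000
innocent_households = 100_000_000

sensitivity = 0.99           # fraction of terrorist households the robot identifies
false_positive_rate = 0.001  # fraction of innocent households it wrongly identifies

true_hits = sensitivity * terrorist_households           # about 990
false_hits = false_positive_rate * innocent_households   # about 100,000

precision = true_hits / (true_hits + false_hits)
print(f"households identified as terrorist: {true_hits + false_hits:,.0f}")
print(f"share of those that actually are terrorist: {precision:.1%}")
# Roughly 1% -- about 99 of every 100 homes the robot singles out are innocent.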

But suppose that the robot is designed to search houses without anyone realizing it's there. And suppose that it searches data -- fax transmissions, or e-mail, or telephone calls -- instead of homes. And suppose that the robot does not understand, transmit, or remember what it sees. Are these searches so unreasonable then?

I would speculate that the NSA has been doing something more analogous to this, rather than simply executing traditional searches without a warrant. Setting aside a potential national-security exception to the Fourth Amendment's requirement of a warrant -- should these searches be barred by the Constitution?

I really don't know the answer. I find it hard to believe no one has thought about this before, but it's not my area.

Comments:
This is why we rely on that penumbral right to privacy. You see it really must be there, else all is lost.
 
What's lost if the robot doesn't remember anything non-incriminating?
 
I think that's close to an impossible idealization, although it may be a useful way to think about algorithmic sifting of telecommunications à la the NSA. Yet after your algorithm or artificial intelligence short-lists candidate suspects, humans will be selecting which of them really are suspects worthy of further investigation. If you want to detect plane hijackers with maximum sensitivity, you're going to be compromising technologically on specificity, and so you're going to be netting lots of people who are false positives for plane hijacking, but who might have other interesting things about them showing up all shiny in the net--that they're pedophiles or Democrats, for example. The human at the end of the line may not feel like throwing that info out or back, and there may be nobody to witness the decision and few who know what gets done with the information. And who gets to decide how much specificity we give up for the sake of sensitivity? The news makes it sound like the NSA is erring heavily on the side of sensitivity.
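A rough Python sketch of that tradeoff, using entirely made-up operating points and base rate: as the detector is tuned to catch nearly every hijacker, the number of innocent people swept into the net balloons.

# Hypothetical (sensitivity, false-positive-rate) operating points, from a
# strict flagging threshold to a lenient one; all values are invented.
operating_points = [(0.50, 0.0001), (0.90, 0.002), (0.99, 0.02), (0.999, 0.10)]

screened = 1_000_000  # people screened (assumed)
hijackers = 1         # actual hijackers among them (assumed)

for sensitivity, fpr in operating_points:
    caught = sensitivity * hijackers
    innocents_flagged = fpr * (screened - hijackers)
    print(f"sensitivity {sensitivity:.1%}: expect {caught:.2f} hijacker(s) caught, "
          f"{innocents_flagged:,.0f} innocent people flagged")
# Chasing the last fraction of a percent of sensitivity multiplies the
# number of false positives handed to the humans at the end of the line.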
 
The Fourth Amendment doesn't require 100% certainty before you can get a warrant. What if the false-positive rate is 50%? That should still be good enough to get a warrant. 25%? 10%?
 
I guess then it gets down to witnesses and ease of abuse, not to mention the practically limitless amount of abuse that could result under automation. Hey, maybe we need a new Fourth Amendment. Did you ever think of that?
 
Actually, I don't like 50% very much at all, although I'll waive my objection when the suspicion is WMD. What is the legal standard for the issuance of a warrant? Is it the same regardless of the real estate and crime in question? E.g., an allegation of sexual harassment at Los Alamos?
 
I don't think that law enforcement needs to prove that it's more likely than not that they'll find something, but I think they need to show a reasonable basis to so believe. So, less than 51%, but more than 2%.

I don't really know a whole lot about this area, to be sure.
 