Sunday, November 17, 2013

Not Even Close to "Almost Human" – Artificial Intelligence in Law Enforcement

As FOX does us the service tonight of introducing the entirely original idea of police robots in a dystopian future in Robocop... Minority Report... Almost Human, it’s worth pausing to consider some of the real artificial intelligence and autonomous technology (let’s collectively call these advances AI, for convenience) that will actually come to law enforcement in the near future. This is a topic I consider and discuss at great length in my forthcoming book, Robots Are People Too. These robots and computer programs won’t remind anyone of buddy cop movies, but they will force us to ask serious questions about how the 4th Amendment should limit their use.

For the non-lawyers, the 4th Amendment states:

“The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

Sounds simple enough. But since the middle of the 20th century, few Constitutional provisions have been debated more by courts, defendants, and prosecutors. The Amendment was originally intended to protect each individual’s personal effects and private papers from government seizure. But the technological advances of the last half century have forced courts and law enforcement to expand the scope of what it protects, mostly due to two developments.

The first is new programs and machines that expand the spaces people use to store information – such as cell phones and flash drives – as well as the spaces that gather information whether we want them to or not – like web browser histories. The second is new devices that police can use to collect information about crimes and criminals, such as thermal imaging devices and GPS trackers.
 
AI will introduce new devices and programs that continue these trends. Autonomous cars like Google’s self-driving car will collect, store, and transmit data about everywhere we go. As AI “assistants” like Siri become more common and more functional, they too will collect, store, and transmit data about how we use them. And drone technology that the military uses now – which increasingly relies on autonomous functions like navigation and reconnaissance – is already making limited appearances in law enforcement, helping with duties like search and rescue missions, where a drone is cheaper and easier to deploy than a helicopter.

We have almost no laws addressing AI, which makes sense: until recently, AI appeared exclusively in fiction like Almost Human. Instead, underlying most of our laws is the basic assumption that only human beings make decisions. Our laws aren’t designed to govern scenarios where machines and programs make decisions – how fast to drive, how to navigate in the air, etc. States and the federal government have started to acknowledge this. David Strickland, the Administrator of the National Highway Traffic Safety Administration, said last year that “Most of NHTSA’s safety standards assume the need for a human driver to operate required safety equipment. A vehicle that drives itself challenges this basic assumption.” And when the California and Florida legislatures passed legislation last year governing autonomous cars, both bills acknowledged that one reason for the new laws was that each state “does not prohibit or specifically regulate the testing or operation of autonomous technology in motor vehicles” or “the operation of autonomous vehicles.”

States are also concerned about the use of drones by law enforcement agencies. As of the summer of 2013, more than 40 states had considered bills that would limit drone usage by the police. For the most part, the legislatures of these states are not concerned specifically about AI drones. But you can bet that when police departments begin to remove the “man in the loop” – that is, make the drones completely autonomous, which will be an option in the not-too-distant future – there will be another round of panicked calls from constituents and more legislation.

And let’s not forget the Supreme Court, which historically has taken an almost childlike joy in settling 4th Amendment issues related to technological development. While the Court has a somewhat well-earned reputation for deciding important issues by narrow 5-4, strictly partisan votes, AI use by law enforcement under the 4th Amendment may break that pattern. Last year, in United States v. Jones, the Court UNANIMOUSLY decided that the police needed a warrant before attaching a GPS tracker to a suspect’s car to track his movements. Although the majority opinion by Justice Scalia relied, strangely, on 19th century trespass case law (few children have as much childlike joy as Justice Scalia does when he is able to use 19th century law to govern 21st century technology), the concurring opinions by Justices Sotomayor and Alito discussed GPS tracking in a way that will be relevant to AI.

Both Justices worried that GPS technology erases an important practical limitation that protects suspects’ 4th Amendment rights: it is expensive to assign an officer to track a suspect for long periods of time. With a GPS tracker, that tracking becomes cheap and easy. That means law enforcement agencies must be particularly diligent about conforming to constitutional limits when using the technology.

Justice Sotomayor added another thought that is particularly relevant to the use of AI drones by law enforcement. She wondered about “the existence of a reasonable societal expectation of privacy in the sum of one’s public movements” and asked “whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.” Put another way, does the Constitution prevent law enforcement from turning on an AI drone and assigning it to follow and record anyone’s every movement in public spaces without a warrant?

Tonight, Almost Human will ask us to believe that police robots look like real people, talk like real people, and act like real people. The show has to ask us to believe because its premise is fiction. Those robots don’t exist. But AI machines and programs are coming to law enforcement soon, and they won’t ask for our belief; they’ll be real with or without it. Our belief isn’t the important issue, though; our Constitutional rights are. This technology will make it easier for police to track and monitor anyone and everyone. I have no doubt that nearly every police department that uses AI will do so with the intention of monitoring potentially dangerous people, making our neighborhoods safer, and improving lives. But their jobs will be easier if there are legal standards giving police guidelines for what they can do constitutionally. To create those standards, we need to seriously consider what we think the 4th Amendment can and should say about using AI to protect the public safety.