Artificial Intelligence Sentences Man to Six Years in Prison

This is blog number one. Just a little background: I am your blogger, Robert Kiesling, a criminal and family law attorney in Austin, Texas. In this first blog, we'll take a look at what you can expect from my upcoming blogs, podcasts, and vlogs.

The purpose of this blog is to give the reader weekly news about artificial intelligence and how it's being applied to the U.S. criminal justice system, both now and in the future.

So how did I come up with this crazy idea to start this blog? I had just finished my near-future crime novel, Discredited Citizen, and the research I did for it led to one terrifying conclusion: the future is now, and if we as citizens don't remain vigilant, forcing companies to be responsible with how they apply AI to our daily lives, we will lose the freedoms we currently enjoy. My hope is that through this blog I can keep this subject at the forefront of debate for all humanity.

Folks, no system is going to remain untouched by artificial intelligence, so be aware of that as you go about your day-to-day activities, whether it's when you ask Alexa a question, how your Fitbit tracks your movements, how your car records your driving, and, most importantly, what your cell phone collects. Why? Because ALL of that INFORMATION is being uploaded to a private corporation that SELLS it or KEEPS it for its own use. YOUR personal information. And you don't have a say in where, how, or whether it can be used. This needs to change post-haste.

Now that we've given a brief overview, let's jump right into our first blog subject: Wisconsin v. Loomis. Mr. Loomis was accused of a number of crimes, but he decided to waive his right to a jury trial and plead guilty to two charges: fleeing and eluding an officer, and the unauthorized use of a vehicle, a vehicle that had been used in a drive-by shooting.

Mr. Loomis got to court and was told that he was not eligible for probation because an AI program said so. The program, and I'm simplifying here, sorted defendants into three levels of likelihood to become a repeat offender: bad, really bad, and really really bad. Mr. Loomis was labeled a really really bad offender. That took him from being sentenced to probation to having to serve hard time. Six years, to be exact. The judge wasn't familiar with the AI program (COMPAS) and repeatedly asked the DA about it, but ultimately sentenced Mr. Loomis anyway.

The issue for Mr. Loomis is one of due process. The court took a private corporation's black-box AI algorithm and used it without knowing how it was coded, who coded it, how it determined what a bad person was, or how it decided that a person was a likely candidate to be a repeat offender.

Let me make a side note here. What is a black box? It's a metaphor used by coders: there's an input and an output, but the inner workings are unknown to everyone except the company. Look at your phone, for example. You can turn it on and off, make calls, and use the apps, but I doubt most of you know the first thing about the algorithms behind any of that. Same thing with the metaphorical black box in this court case, except that blindly following the suggestion of this AI, without knowing its inner workings, cost a man six years of his life.
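To make the black-box idea concrete, here is a minimal sketch in Python. This is purely a hypothetical illustration: COMPAS's real inputs, weights, and scoring logic are proprietary and were never disclosed, so every name and number below is invented. The point is that the court, like the caller of this function, sees only the input and the label that comes out, never the logic in between.

```python
# Hypothetical illustration of a "black box" risk tool.
# Nothing here reflects COMPAS's actual (secret) inputs or logic;
# the questionnaire fields and weights are invented for this sketch.

def risk_tier(answers: dict) -> str:
    """Takes questionnaire answers, returns a risk label.

    The caller sees only input and output. The scoring logic
    below stands in for code that no defendant or court ever
    got to inspect.
    """
    # Invented weights: a defendant cannot see or challenge these.
    score = (10 * answers.get("prior_arrests", 0)
             + 5 * answers.get("age_at_first_arrest_under_20", 0))
    if score >= 30:
        return "high risk"
    if score >= 15:
        return "medium risk"
    return "low risk"

# The sentencing court sees only a label like this one:
print(risk_tier({"prior_arrests": 4}))
```

Notice that changing one hidden weight would change the label, and the person being scored would have no way to know why.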

Mr. Loomis appealed on due process grounds, stating he had a right to know what was in the black box. And the case law precedent (note: precedent means prior cases whose rulings newer cases typically look to for guidance on how to rule) included a number of prior decisions that supported Mr. Loomis's argument: that he had an absolute right to this information and that the court's use of this AI violated his due process rights.

The Court of Appeals in Wisconsin certified the question, which means it sent this very question up to the Supreme Court of Wisconsin and asked how that court would rule. The SCW said, in a nutshell, that because there were 'other factors' in Mr. Loomis's sentencing, the use of this private company's secret AI did not violate his due process rights, and that the company's right to keep its algorithm a secret trumps Mr. Loomis's right to see how it concluded that six years in prison, rather than probation or fewer years, was the appropriate sentence. Does this sound right to you? I'd love to get some dialogue and comments going on this. I invite the debate. The SCW also said it would make sure there's, essentially, a warning label for all future defendants. Huh? How does this help?

Lastly, in our next blog we will touch on bias. There was testimony by an expert on code bias before the case was sent to the Court of Appeals. It's important because, as you will see, all AI carries the biases of the people and data behind it.

I will end each blog with a quote, and here's this week's: "It's OK to get knocked out, but don't throw in the towel." – David Goggins

If Skynet doesn’t take over by my next blog, I will talk with you next week. Till then, I bid you adieu.

 