
SRSLY: Like ‘Minority Report,’ But Without Tom Cruise or Accuracy

Your three-minute read on the best reporting you probably missed.


David Epstein

Welcome to SRSLY, an (experimental) newsletter highlighting under-exposed accountability journalism. We'll distill the important information from investigative reporting you probably missed and deliver it to you in three minutes or less of reading. Sign up to have it delivered to your inbox. (You can, of course, unsubscribe at the first whiff of a bad joke.)


MainMuck

Where would you be without algorithms? I’ll tell you where: you’d be watching “Zoolander 2” on Netflix because better recommendations (like “Zoolander 1,” a masterpiece) wouldn’t be popping up on your screen. Yay algorithms! Except you might like algorithms somewhat less if you knew they were involved in, say, stock market flash crashes, or predicting your future crimes. (Yes, you read that right.) You might like them less still if you’re black, because, a ProPublica investigation found, one “risk assessment” algorithm may be more likely to wrongly peg you as a future repeat offender than if you were white. Your four W’s:

What?

It’s kind of like Moneyball, except instead of important applications — like determining which minor league shortstop will hit for power — algorithms are now used throughout the criminal justice system to forecast the future behavior of people who are arrested. The predictions can influence everything from sentencing length to parole eligibility. In 2014, then-Attorney General Eric Holder noted that such data analysis was “crafted with the best of intentions” (like determining a defendant’s rehabilitation needs) but that it could also quietly exacerbate inequities. Holder called for the U.S. Sentencing Commission to study the impact of algorithms, but…

What else?

Surprise! They didn’t. So ProPublica did, using a sample of 7,000 arrestees in Florida. Only 20 percent of the people that the relevant algorithm predicted would commit violent crimes within two years were found to have done so. In other words, should a betting market on future crime appear at your local casino and you have an inside line on the algorithm, still stick to blackjack. When all crimes were considered — including misdemeanors — the algorithm did better: 61 percent of the people predicted to reoffend within two years did so.
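(For the numbers-minded: the figures above are what a statistician would call the algorithm's precision. Of everyone it flagged, how many actually reoffended? A minimal sketch in Python, with made-up counts chosen only so the arithmetic reproduces the reported percentages:)

    # Precision: of the people the algorithm flagged, what share actually
    # reoffended? The counts below are hypothetical, picked only so the
    # math matches the percentages ProPublica reported.

    def precision(flagged_and_reoffended, total_flagged):
        return flagged_and_reoffended / total_flagged

    print(precision(200, 1000))  # 0.20 -> the violent-crime figure
    print(precision(610, 1000))  # 0.61 -> the any-crime figure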

Why do I care?

Perhaps because the invisible grip of algorithms on our lives is a little spooky. But you also care because the Florida algorithm erroneously flagged black defendants as future criminals at almost twice the rate of white defendants. Whoops. Conversely, more white defendants were mislabeled as low risk, only to offend again.
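(That disparity is a difference in false positive rates: among defendants who did not go on to reoffend, how many did the algorithm flag as high risk anyway? Another minimal sketch, again with hypothetical counts; the story itself says only that the rate for black defendants was almost double the rate for white defendants:)

    # False positive rate: among people who did NOT reoffend, what share
    # was flagged as high risk anyway? Counts are hypothetical, chosen
    # only to illustrate an "almost twice the rate" gap.

    def false_positive_rate(flagged_but_did_not_reoffend, total_did_not_reoffend):
        return flagged_but_did_not_reoffend / total_did_not_reoffend

    print(false_positive_rate(450, 1000))  # hypothetical black defendants: 0.45
    print(false_positive_rate(230, 1000))  # hypothetical white defendants: 0.23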

Who?

…creates this miraculous algorithm? Glad you asked. A company called Northpointe, which disputes ProPublica’s analysis. Northpointe shared with ProPublica some of the criteria used for prediction, which include everything from education level to employment status. As with forensic algorithms (featured last week), however, the risk assessment algorithm itself is proprietary, which makes it sort of like a witness who can’t be cross-examined. You can check out the 137 questions that Northpointe uses (none explicitly mention race), but, like your uncle Bob’s sense of humor and most of the ingredients in a pesticide, the inner workings are a mystery.

They Said It

“Risk assessments should be impermissible unless both parties get to see all the data that go into them.” — Christopher Slobogin, director of the criminal justice program at Vanderbilt Law School

#facepalm of the Week

Fancy-schmancy vegan restaurateur Sarma Melngailis had the proverbial vegan bull by the seitan horns. And then she ordered a cheesy pizza from Domino’s, and it all crumbled like a stack of steamed tofu. Actually, the pizza was just sort of the topping — police used the order to track Melngailis and her husband to a Holiday Inn in Tennessee. According to several media reports, Melngailis and her husband, Anthony Strangis, had been on the lam since Melngailis allegedly stopped paying employees at Pure Food and Wine, the vegan hotspot she ran in Manhattan. Woody Harrelson and Alec Baldwin were regulars, but, according to Forbes, employee paychecks were not, allegedly because Melngailis siphoned off $2 million from the business to pay for fancy watches, vacations, Uber rides (ever heard of UberPool?) and her husband’s gambling debts. Well, at least that’ll teach her… not to order pizza as a vegan.

Tweet of the Week

Make cheap-American-beer-owned-by-a-Belgian/Brazilian-company great again.

Additional research by Kate Brown.

Tips are appreciated. The paper kind, or the green paper kind.

ProPublica does not vouch for the accuracy of stories appearing on SRSLY. We select, review and summarize key points from accountability stories that may not have gotten wide exposure. But we are not able to independently vet or vouch for the accuracy of stories produced by others. We will inform readers if we learn that stories have been challenged publicly or corrected.
