Julia Angwin is a senior reporter at ProPublica. From 2000 to 2013, she was a reporter at The Wall Street Journal, where she led a privacy investigative team that was a finalist for a Pulitzer Prize in Explanatory Reporting in 2011 and won a Gerald Loeb Award in 2010. Her book "Dragnet Nation: A Quest for Privacy, Security and Freedom in a World of Relentless Surveillance," was published by Times Books in 2014, and was shortlisted for Best Business Book of the Year by the Financial Times.
Also in 2014, Julia was named reporter of the year by the Newswomen's Club of New York. In 2003, she was on a team of reporters at The Wall Street Journal that was awarded the Pulitzer Prize in Explanatory Reporting for coverage of corporate corruption. She is also the author of "Stealing MySpace: The Battle to Control the Most Popular Website in America" (Random House, March 2009). She earned a B.A. in mathematics from the University of Chicago and an MBA from the Graduate School of Business at Columbia University.
Most tech companies have policies against working with hate websites. Yet a ProPublica survey found that PayPal, Stripe, Newsmax and others help keep more than half of the most-visited extremist sites in business.
A trove of internal documents sheds light on the algorithms that Facebook’s censors use to differentiate between hate speech and legitimate political expression.
The state’s insurance department is following up on our findings that eight auto insurers charge more in minority neighborhoods than in other neighborhoods with similar risk.
Our analysis of premiums and payouts in California, Illinois, Texas and Missouri shows that some major insurers charge minority neighborhoods as much as 30 percent more than other areas with similar accident costs.
Americans face unprecedented threats to the digital safety of their personal information. We offer nine tips to foil hackers, ransomware, online trackers, data brokers and other menaces.
ProPublica’s analysis of bias against black defendants in criminal risk scores has prompted research showing that the disparity can be addressed — if the algorithms focus on the fairness of outcomes.
Artificial Intelligence is only as good as the patterns we teach it. To illustrate the sensitivity of AI systems, we built an AI engine that deduced synonyms from news articles published by different types of news organizations.
As we enter the era of artificial intelligence, machines regularly conduct experiments on human behavior. Here’s a look at how software used by the New York Times and New York Post uses you to test their headlines.
The phone you use, the computer you own and the ZIP code you live in can all be factors in what prices you see when shopping online. Welcome to the world of mass customization.
We live in an era of increasing automation. But as machines make more decisions for us, it is increasingly important to understand the algorithms that produce their judgments.
Thank you for your interest in republishing this story. You are free to republish it so long as you do the following:
You can’t edit our material, except to reflect relative changes in time, location and editorial style. (For example, “yesterday” can be changed to “last week,” and “Portland, Ore.” to “Portland” or “here.”)
If you’re republishing online, you have to link to us and include all of the links from our story, as well as our PixelPing tag.
You can’t sell our material separately.
It’s okay to put our stories on pages with ads, but not ads specifically sold against our stories.
You can’t republish our material wholesale, or automatically; you need to select stories to be republished individually.
You cannot republish our photographs without specific permission (ask our Public Relations Director Minhee Cho if you’d like to).
You have to credit us — ideally in the byline. We prefer “Author Name, ProPublica.”
Copy and paste the following into your page to republish: