
5 Things We Learned Collecting 3,352 Stories About Agent Orange Exposure

It’s not the size of the community that matters. It’s about finding the right one.

At the end of June, ProPublica and The Virginian-Pilot kicked off an investigation into the potential effects of Agent Orange on the children and grandchildren of Vietnam War-era veterans. But we didn't do it with a traditional story. We did it with a request. We asked these vets to tell us about their exposure to the toxic defoliant through a 35-question survey about their experience.

At ProPublica, we spend a lot of time thinking about the right questions to ask in the callouts that accompany our investigations. But the biggest one is often internal: "Will this resonate?"

We had our answer a week later. More than 1,000 vets or their family members had shared a story with us. We seem to have gotten it right out of the gate: by giving these veterans an avenue to share their stories, we connected with a community that feels ignored and betrayed by the VA and the U.S. government. The vets I spoke with said this over and over again. Feeling harmed and feeling ignored can each be a major participation driver for a crowd-powered project, and this one hit on both.

Over the next four months, people shared more than 3,350 stories and more than 500 pictures with us. We've received hundreds of follow-up emails and a dozen letters in the mail, including two from veterans who printed out the survey and filled it in. I went through the form with two others by phone to help them complete it.

You could look at the response and think, "Wow, that investigation must be tearing up Chartbeat." But that's not exactly the case — or the point. Our goal isn't to find the biggest audience; it's to find the one that can help us report this story.

Here's a closer look at how we got thousands of veterans and their loved ones to share their stories with us and what we learned in the process.

1. Holy crap, email!

Emailing with survey respondents is the single most important thing we have done to build a community out of this project. I now joke that I have 3,000 new best friends. Many of the people who completed our form now keep me updated on their progress fighting for benefits at the VA, or send notes about how their health is doing and elaborate — in great detail — about their time in Vietnam. Our partners at The Virginian-Pilot have also helped us spread the word, teaming up with us on reporting and bringing the survey to veterans groups in Virginia.

We collect the stories through Screendoor — a robust form and database tool. The platform allows us to bulk message anyone and everyone who has shared their story with us. Every time we publish a new Agent Orange story, we send them a message. When we hit a project milestone, we send a message. If we're looking for additional information, we send a message.

When I want to interview a specific segment of the community (veterans from a particular military branch, for instance), I filter our list and send them a message. With each email, we remind respondents to share the survey, to send us documents and images and to offer up feedback. (Always be looping!)
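To make that filtering step concrete: Screendoor handles segmenting and bulk messaging from its own interface, but the same filter-then-message idea can be sketched against a CSV export of responses. This is only an illustration, not Screendoor's API; the file name and column names ("branch", "email") are made up.

```python
import csv

def recipients_by_branch(export_path, branch):
    """Return email addresses for respondents who listed a given military branch."""
    with open(export_path, newline="", encoding="utf-8") as f:
        return [
            (row.get("email") or "").strip()
            for row in csv.DictReader(f)
            if (row.get("branch") or "").strip().lower() == branch.lower()
            and (row.get("email") or "").strip()
        ]

# Example: pull everyone who listed the Navy for a targeted follow-up message.
navy_vets = recipients_by_branch("responses_export.csv", "Navy")
print(f"{len(navy_vets)} respondents to contact")
```

Whatever tool you use, the point is the same: keep the responses filterable by the attributes you expect to report on, so a targeted follow-up is one query away.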

2. The right community isn't always the biggest

This isn't to say we don't want millions of people reading our stories and submitting tips. But in our crowd-powered projects, we'd rather get a small number of relevant tips from people with direct knowledge of and experience with a particular story than higher engagement from an audience that's just passing through. Finding this "right" audience starts with research. Here are some preliminary questions we ask:

  • Who cares about this story?
  • Where are they, online and geographically?
  • What can they contribute to my story?
  • How can I get them to contribute?

The answers will help you narrow the scope of your survey and outreach efforts, but they should deepen the engagement you get in return.

Our research for the Agent Orange project turned up dozens and dozens of Facebook pages, veteran-run websites, various organizations and nonprofits and a handful of subreddits. No group was too small. Many vets mentioned "Facebook" or "my Facebook group" as the spot where they got most of their Agent Orange updates, and Facebook became a huge driver of distribution for the survey. In monitoring Facebook, if I saw that someone had shared one of our stories, I would follow up in the comments by posting our survey.

3. If you're not listening, you're doing it wrong

Some questions to ask at the beginning of a crowd-powered project: How are we going to report this out? What can we do to build stories, updates or other content around what people in this community are discussing and sharing? How can we show we're listening to them? We aim to report something that we've learned from our community about once a month. Each piece gives us a reason to recirculate the survey and to contribute valuable content back to the community. These posts drove our biggest jumps in survey participation.

Here are some of the things we published:

4. Iterate with your community

You never truly know how the community is going to react to a crowd-powered project. It's hard to anticipate what is going to resonate and what will be a colossal failure. But keep this in mind: If you collect even one story, you have one person who probably has an opinion on what you could do better. Ask them. It's real-time user testing, and it's invaluable. Here's how we iterated based on feedback from the community and why.

Added an "upload file" field (what you don't know the community will tell you, part 1)

When we launched the survey, we weren't thinking about images or documents. But someone in the community was. One of the vets who filled out the form replied to our automated confirmation message with a handful of pictures from the war. The next day, I added a file upload field along with ProPublica's postal address and asked people to share their images. We've now collected nearly 500 images, including about a dozen via snail mail.

Added more locations (what you don't know the community will tell you, part 2)

Agent Orange wasn't sprayed JUST in Vietnam. Our drop-down menu for location of service included South Vietnam, the U.S., Laos, Cambodia, At Sea and an open field. We started getting a lot of "Thailand" and "Korea" in the open field, so we made those specific options. There were also specific communities focused on Agent Orange exposure in Thailand and in Korea that we wanted to target. Breaking these out added a level of personalization to the form for vets who were exposed in those areas.

Made our questions clearer (how to avoid messy data, part 1)

Screendoor has a feature that shows how many forms were started but never finished. So while we've had 3,352 forms submitted, another 5,000 were started but never submitted. The feature also shows any error messages people get, and I noticed a common issue: the "integer only" fields were tripping people up. Integer-only fields, which accept only numbers, are important for collecting clean data you can visualize later (years of service, for example), but our descriptions of these fields were confusing veterans. I updated them to be clearer, but it's still a work in progress.

Created the field types we wanted to analyze (how to avoid messy data, part 2)

We had an open text field asking veterans how many of their children had health problems; it should have been an integer-only field. Some vets were writing in "none" or "none yet" or various other things. We knew we wanted to be able to analyze this field numerically, so we changed it before the data got out of control.
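To give a sense of the cleanup an open text field forces on you, here is a minimal sketch, assuming answers like "3" or "none yet"; anything it can't read confidently gets set aside for a human to review instead of a guess. The function name and sample answers are hypothetical, not our actual data.

```python
import re

def to_count(answer):
    """Turn a free-text count like "3" or "none yet" into an integer, or None."""
    text = (answer or "").strip().lower()
    if text in {"none", "none yet", "no", "0"}:
        return 0
    if re.fullmatch(r"\d+", text):
        return int(text)
    return None  # ambiguous -- set aside for manual review

answers = ["3", "none yet", "Two of my kids"]
print([to_count(a) for a in answers])  # [3, 0, None]
```

Changing the field type early keeps this kind of cleanup from ever having to run across thousands of responses.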

Updated the survey order (find what's working and repeat)

Remember the feature that showed incomplete forms? Well, we were able to email many of those people to ask how we could help them complete the survey. Circling back brought in more than 100 new submissions and generated great feedback about the form. (Again, always be looping!) Based on that, we decided to move the email address field to the very top of the form so that we could follow up with anyone else who might have trouble.

5. Have a plan for finding stories

If you're doing a crowd-powered project, you're going to have data coming in various shapes and sizes. You better have a content management plan. At first, I didn't (really). After 1,000 surveys, I was like, "Well, this is getting out of control." I think I spent an entire day just organizing emails, logging exchanges, and filing images and documents.

I started to organize folders in Dropbox with veteran files and public domain footage I'd pulled to create a series of videos. I kept a folder in my email for incoming messages and marked them completed once I'd responded or solved whatever problem was raised. I also attached relevant email exchanges to their submissions in Screendoor. Whatever you're using to collect user stories, here are some content management questions to ask yourself before you get started (a rough sketch of one answer follows the list):

  • How am I managing incoming feedback?
  • How am I tracking any files that I get?
  • Where will everything live?
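For the last two questions, here is a minimal sketch of one possible answer, assuming a setup of one folder per respondent plus a running CSV log of incoming files. The paths and field names are invented for illustration, not our actual setup.

```python
import csv
from pathlib import Path

BASE = Path("agent_orange_project")  # hypothetical project root

def log_incoming_file(respondent_id, filename, source):
    """File a new attachment under the respondent's folder and note it in a shared log."""
    folder = BASE / "respondents" / str(respondent_id)
    folder.mkdir(parents=True, exist_ok=True)

    log_path = BASE / "incoming_files_log.csv"
    is_new_log = not log_path.exists()
    with open(log_path, "a", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        if is_new_log:
            writer.writerow(["respondent_id", "filename", "source"])
        writer.writerow([respondent_id, filename, source])

    return folder / filename  # where the file itself should be saved

print(log_incoming_file(1042, "photo_da_nang_1968.jpg", "email"))
```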

You're going to need to find everything at some point — for reporting out, for locating sources, maybe even for writing a "what we learned" post on Medium. So get organized.

And that's it. Those are five big things that helped us gather more than 3,350 Agent Orange stories. A big audience is always super fun, but think of it as a rave — tons of people packed into some enormous warehouse, dancing and partying all night long. But no one lives in that warehouse. Everyone goes home after the party — home to their neighborhood, their street and their community. That's where they live and feel safe.

When it comes to crowd-powered journalism, that's what we aim to build: A community where people want to live, where people want to stay, where people are fully invested. We're looking for a block party, not a rave.

Of course, every now and again we wouldn't mind having a rager.

Can you help us reach 5,000 stories shared?
