Roughly half a million of the nation’s most vulnerable students are enrolled in a neglected sector of American K–12 education: a sprawling system of “alternative” schools. Advocates say these schools better serve struggling students, and the best do — through small classes, caring teachers, flexible schedules and extra counseling and tutoring. But a ProPublica analysis of federal data on measures of school quality, like funding, counseling staff and graduation rates, has found that a broad swath of alternative schools shortchange their students.

Since the No Child Left Behind Act of 2001 refashioned the yardstick for judging schools, alternative education has at times become a silent release valve for schools straining under the pressure of accountability reform. By shifting students with weak test scores and a greater propensity to be truant or drop out to alternative programs, some traditional high schools have found a way to improve their accountability metrics. And some states exempt alternative schools from achievement goals, oversight or reporting rules other schools must follow, ProPublica found.

Evidence thus far — including a 2007 legislative report in California and concerns stemming from lawsuits in Pennsylvania and Louisiana — suggests this is happening, but ProPublica wanted to study the question more systematically by combining federal databases, and supplementing them with local data.

Our analysis applied special scrutiny to alternative schools in Orlando, Florida, the nation’s tenth-largest district, where our reporting indicates that thousands of students who leave for-profit alternative schools without diplomas aren’t counted as dropouts.

Summary of Findings

  • Enrollment: Roughly half a million students were enrolled in alternative schools in 2014.
  • Accountability mandates: The number of students in alternative schools grew moderately over the past 15 years, with upticks in enrollment following new national mandates regarding standardized testing and graduation rates.
  • Socioeconomic disparity: Black, Hispanic and low-income students are overrepresented in alternative schools. Black students make up about 16 percent of students nationally but 20 percent of students in alternative schools. Hispanic and low-income students (those eligible for free lunch) are similarly overrepresented.
  • Unequal resources: In many districts, alternative schools received fewer resources than regular schools:
    • Funding: Nearly a third of the alternative-school population attends a school that spends at least $500 less per pupil than regular schools do in the same district. This includes state and local funding, but does not include federal funding.
    • Counselors: Forty percent of school districts with alternative schools provide counseling services only in regular schools.
  • Graduation rates: Graduation rates at alternative schools are low. While just 6 percent of regular schools have graduation rates below 50 percent, our analysis found nearly half of alternative schools do.
  • Graduation rates vs. alternative school enrollment: We identified 83 school districts, including Orlando, Newark, and Los Angeles, where regular schools increased their graduation rates by at least one percentage point between 2011 and 2014, while the alternative school population grew by more than 5 percentage points or more than 150 students. Such a pattern could indicate that traditional schools are weeding out students at greater risk of dropping out, although there are many reasons why graduation rates rise.
  • Orlando: ProPublica’s analysis of federal data shows Orange County Public Schools in Orlando, Florida, tripled alternative school enrollment between 2009 and 2014, rising from 1,300 to 3,900 students. The driving force was charter schools operated by Accelerated Learning Solutions. Graduation rates at several regular high schools in the district increased by more than 10 percentage points between 2011 and 2014.
  • Michigan: In Secretary of Education Betsy DeVos’s home state, alternative school enrollment grew steeply from 2000 to 2014. Charter schools have played a part in this growth.
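The district-level filter described in the graduation rate finding above can be sketched in a few lines. This is an illustrative reconstruction, not ProPublica’s actual code; the field names and sample districts below are invented.

```python
# Flag districts where regular-school graduation rates rose by at least one
# percentage point between 2011 and 2014 while the alternative school
# population grew by more than 5 percentage points or more than 150 students.
# All field names and sample records are hypothetical.

def flags_district(d):
    grad_gain = d["regular_grad_rate_2014"] - d["regular_grad_rate_2011"]
    alt_share_gain = d["alt_share_2014"] - d["alt_share_2011"]  # percentage points
    alt_count_gain = d["alt_enrollment_2014"] - d["alt_enrollment_2011"]
    return grad_gain >= 1 and (alt_share_gain > 5 or alt_count_gain > 150)

districts = [
    {"name": "District A", "regular_grad_rate_2011": 70, "regular_grad_rate_2014": 76,
     "alt_share_2011": 2.0, "alt_share_2014": 4.0,
     "alt_enrollment_2011": 300, "alt_enrollment_2014": 520},
    {"name": "District B", "regular_grad_rate_2011": 80, "regular_grad_rate_2014": 80.5,
     "alt_share_2011": 1.0, "alt_share_2014": 7.5,
     "alt_enrollment_2011": 100, "alt_enrollment_2014": 400},
]

flagged = [d["name"] for d in districts if flags_district(d)]
# District A qualifies: graduation rose 6 points while alt enrollment grew by 220.
# District B does not: its graduation gain is under one percentage point.
```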

Data Sources

Our nationwide analysis included 15 years of data on 100,000 total schools (including almost 4,000 alternative schools) in 13,000 districts.

  1. The Common Core of Data (CCD), compiled by the U.S. Department of Education’s National Center for Education Statistics, was used as the master list for all schools and districts to be included in analysis. The CCD was used to identify alternative schools, to count students, and to measure school resources.
  2. Data from the Civil Rights Data Collection (CRDC), administered by the U.S. Department of Education’s Office for Civil Rights, was used to measure school resources.
  3. Data from EdFacts, a U.S. Department of Education initiative, was used to report graduation rates.
  4. The Stanford Education Data Archive (SEDA), from the Stanford Center for Education Policy Analysis, was used to reassign charter schools to the district in which they are geographically located.
  5. Data from Orange County Public Schools and from the state of Florida, obtained through records requests, was used to measure dropouts, withdrawals, and graduation rates in Orlando and throughout Florida.

Methodological Notes

The national datasets described above are available at the school level, and were linked using a shared school identifier. In preparing the data for analysis, we made two key decisions:

  1. Identifying alternative schools: We identified alternative schools using a school type classification from the CCD, with a few modifications. We excluded juvenile justice schools, identified by keywords such as ‘juvenile justice’ and ‘prison’ in the school or district name. We also excluded schools at state- and federally operated institutions that serve special needs populations, as well as schools operated under very unusual oversight structures, and we excluded elementary schools. In the small percentage of cases where districts indicated a school was both a magnet school and an alternative school, we left the data as reported. About 95 percent of the schools we considered alternative schools served high school grades. For national analyses, no further adjustments were made. For analyses of Orange County Public Schools in Florida, alternative school designations were adjusted to reflect additional reporting by ProPublica, and district-specific figures included in our visualization and lookup were adjusted based on additional reporting, as indicated.
  2. Charter schools: Most schools in the country are operated by a traditional school district, which draws students from within a geographic catchment area. But some charter schools are authorized under their own administrative agency or under an agency other than a regular, local school district. We reassigned such schools to the district where they are located geographically, to better capture the number of total and alternative students in each district. The reassignment was done using a geographic crosswalk provided by the Stanford Education Data Archive. We reassigned only charter schools, with one exception: New York City Department of Education schools, which were coded under several different districts in the CCD data, were combined into a single district using the SEDA crosswalk.
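As a rough sketch of the reassignment step, assuming a SEDA-style crosswalk keyed on NCES school IDs — the IDs and district codes below are made up, and this is an approximation of the approach rather than ProPublica’s actual code:

```python
# Reassign charter schools to their geographic district using a crosswalk
# that maps each school's NCES ID to the district it physically sits in.
# Regular (non-charter) schools keep their administrative district.

crosswalk = {  # hypothetical: school NCES ID -> geographic district ID
    "120001": "FL-ORANGE",
    "120002": "FL-ORANGE",
}

schools = [
    {"nces_id": "120001", "charter": True,  "admin_district": "FL-CHARTER-AGENCY"},
    {"nces_id": "120002", "charter": False, "admin_district": "FL-ORANGE"},
]

for school in schools:
    if school["charter"]:
        # Fall back to the administrative district if the school is
        # missing from the crosswalk.
        school["district"] = crosswalk.get(school["nces_id"],
                                           school["admin_district"])
    else:
        school["district"] = school["admin_district"]
```

After this step, per-district totals count the charter school’s students toward the district whose borders contain the school, not its authorizing agency.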

Map Visualization

A national map of alternative school enrollment appears above on our site. The size of each bubble on the map represents a district’s overall alternative school student population. The color of the bubble reflects a metric composed of factors that taken together are warning signs that a district’s alternative schools are of poor quality or that a district may be using alternative schools to improve accountability measures.

The included factors were growth in enrollment, under-resourcing, and low graduation rates. While cause for concern, these factors are not definitive evidence of problematic alternative schools. We measured districts against each other on these three factors, which received equal weight and were standardized using percentile rank. To determine the color presented on the map, districts were divided into quartiles.

  1. Growth in enrollment at alternative schools over the past 15 years was measured as the percentage point difference between the percent of students in alternative schools in 1999–2000 compared to 2013–14. (For a small number of districts that opened after 1999–2000, the baseline was the earliest available year.)

  2. Resources at alternative schools were compared to regular schools in the same district, in the 2013–14 school year. Several resource measures were used, which received equal weight and were standardized using percentile rank:

    a. Per pupil spending, measured as the percent difference between dollars per student at alternative and regular schools.

    b. Availability of counselors, measured as the percent difference between the percent of alternative schools in the district that had a counselor and the percent of regular schools that did.

    c. Student-teacher ratio, measured as the percent difference between the student teacher ratio at alternative and regular schools.

    d. Teacher experience, measured as the percent difference between the percent of teachers in their first or second year of teaching at alternative and regular schools.

    e. Teacher absenteeism, measured as the percent difference between the percent of absentee teachers at alternative and regular schools. Absentee teachers are those who were absent for more than 10 days in the school year.

  3. Graduation rates at alternative schools, compared to regular schools in the same district, in the 2013–14 school year. This was measured as the percentage point difference between the graduation rate at alternative schools and regular schools in the same district.
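The composite map metric described above — three equal-weight factors standardized by percentile rank, then binned into quartiles — can be sketched as follows. The factor values are invented, and this is an illustration of the general approach, not ProPublica’s code.

```python
# Combine three district-level factors into one map score:
# 1) convert each factor to a percentile rank across districts,
# 2) average the ranks with equal weight,
# 3) bin districts into quartiles (1 = least concerning, 4 = most).

def percentile_rank(values):
    """Fraction of values each value is greater than or equal to."""
    n = len(values)
    return [sum(v >= other for other in values) / n for v in values]

# Hypothetical values for four districts:
growth    = [1.0, 6.0, 3.0, 0.5]    # pct-pt growth in alt enrollment share
resources = [0.2, 0.9, 0.5, 0.1]    # composite resource-gap score
grad_gap  = [5.0, 40.0, 20.0, 2.0]  # pct-pt graduation-rate gap

ranks = [percentile_rank(factor) for factor in (growth, resources, grad_gap)]
composite = [sum(r[i] for r in ranks) / 3 for i in range(4)]

# With only four districts, each lands in its own quartile.
order = sorted(range(4), key=lambda i: composite[i])
quartile = {district: pos + 1 for pos, district in enumerate(order)}
```

Percentile-rank standardization keeps any one factor’s scale from dominating the average, which is why all three factors can be weighted equally.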

Line Chart Visualization

A line chart depicting changes to national alternative school enrollment over time appears as above. The value of the line corresponds to the number of students reported as enrolled for each year.

In cases where state enrollment fluctuated dramatically (indicating a likely reporting error), we smoothed the curve by imputing a value for the state based on the surrounding years’ values.
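A minimal sketch of this smoothing step, assuming a likely reporting error is a single year that differs from both of its neighbors by a large multiple — the threshold and sample figures here are illustrative, not ProPublica’s actual values:

```python
# Replace single-year spikes or craters in an enrollment series with the
# average of the neighboring years. A year is treated as a likely reporting
# error if it is several times larger than both neighbors, or several times
# smaller. The jump_factor threshold is an assumption for illustration.

def smooth(series, jump_factor=3.0):
    out = list(series)
    for i in range(1, len(series) - 1):
        prev, cur, nxt = series[i - 1], series[i], series[i + 1]
        if cur > jump_factor * max(prev, nxt) or cur < min(prev, nxt) / jump_factor:
            out[i] = (prev + nxt) / 2  # impute from the surrounding years
    return out

# The 9000 spike is implausible given the neighboring years and is imputed:
smoothed = smooth([1000, 1050, 9000, 1100, 1150])
# -> [1000, 1050, 1075.0, 1100, 1150]
```

Comparing against both neighbors, rather than their average, keeps a genuine spike from dragging adjacent normal years into the imputation.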

We chose to label the years that four key accountability policies went into effect, in order to look for a possible relationship between alternative school enrollment and the implementation of policies that would penalize schools for poor performance. These policies could provide a possible incentive for schools or districts to move underperforming students to alternative schools.

  • The No Child Left Behind Act was passed in December 2001.
  • Starting in the 2005–2006 school year, states were required to test students in certain grades yearly.
  • Starting in the 2007–2008 school year, federally mandated testing consequences first rolled out.
  • In the 2010–2011 school year, states began switching to a tougher graduation rate formula.

Limitations

Some states and districts operate alternative programs within regular schools, as opposed to stand-alone alternative schools. These programs are not tracked in federal data sources and are not included in our analysis.

The federal data used in our analysis relies on reports from states, which in turn often rely on reports from school districts. While the federal data collection efforts include some verification and data cleaning, the data is generally only as accurate as states’ record-keeping and reporting allows. We’ve noticed that occasionally the CCD alternative school designation is incorrect. Due to the size of the dataset, it cannot be manually corrected and is therefore used as-is.

Some charter schools are not authorized under a traditional school district, and are instead authorized under their own administrative agency or under an agency other than a regular, local school district. We reassigned such schools to the district where they are located geographically, to better capture the number of total and alternative students in each district. This is imperfect: A charter school may draw students from several traditional school districts, but will be assigned to only one geographic school district, based on where the school is located. The reassignment of charter schools into geographic school districts means that student counts and other measures in this data may not match a district’s own accountability data.

Our line chart shows alternative school enrollment juxtaposed with the timing of key accountability policies that would penalize schools for poor performance. The number of students in alternative schools showed moderate increases contemporaneous with new national mandates regarding standardized testing and graduation rates. While this pattern is suggestive, factors other than these policies may also have influenced enrollment in alternative schools.