This is the accessible text file for GAO report number GAO-04-62 
entitled 'Public Schools: Comparison of Achievement Results for 
Students Attending Privately Managed and Traditional Schools in Six 
Cities' which was released on October 29, 2003.

This text file was formatted by the U.S. General Accounting Office 
(GAO) to be accessible to users with visual impairments, as part of a 
longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to the Chairman, Committee on Education and the Workforce, House 
of Representatives:

United States General Accounting Office:

GAO:

October 2003:

PUBLIC SCHOOLS:

Comparison of Achievement Results for Students Attending Privately 
Managed and Traditional Schools in Six Cities:

GAO-04-62:

GAO Highlights:

Highlights of GAO-04-62, a report to the Chairman, Committee on 
Education and the Workforce, House of Representatives 

Why GAO Did This Study:

Over the last decade, a series of educational reforms has increased 
opportunities for private companies to play a role in public 
education. For instance, school districts have sometimes looked to 
private companies to manage poorly performing schools. The 
accountability provisions of the No Child Left Behind Act of 2001 may 
further increase such arrangements because schools that continuously 
fail to make adequate progress toward meeting state goals are 
eventually subject to fundamental restructuring by the state, which 
may include turning the operation of the school over to a private 
company. 

GAO determined the prevalence of privately managed public schools and 
what could be learned about student achievement in these schools from 
publicly available sources. To do so, GAO examined existing data on 
the number and location of privately managed schools and reviewed a 
variety of reports on student achievement. In addition, GAO compared 
standardized test scores of students attending privately managed 
public schools with scores of students attending similar traditional 
public schools. GAO identified privately managed schools that had been 
in operation for 4 years or more in six large cities and matched 
these schools with a group of traditional schools serving similar 
students. GAO then analyzed student scores on state reading and math 
tests at selected grade levels, controlling for differences in student 
populations.

What GAO Found:

The number of public schools managed by private companies has tripled 
in the last 5 years according to data compiled by university 
researchers, although such schools comprise less than 0.5 percent of 
all public schools. In the 2002-03 school year, nearly 50 private 
companies managed over 400 public schools nationwide. These companies 
managed schools in 25 states and the District of Columbia, with about 
one-half of the schools located in Arizona and Michigan. Information 
on student achievement at these schools was available in the form of 
state- or district-issued school report cards and annual reports 
issued by the management companies. Although these reports provided 
valuable descriptive information, they were generally not designed to 
answer research questions about the relative effectiveness of 
privately managed schools compared with traditional schools in raising 
student achievement. Consequently, GAO conducted test score analyses 
that provide further insight into student achievement in these 
schools.

GAO’s analyses of student test scores in six cities yielded mixed 
results. Scores for 5th grade students in Denver and San Francisco 
were significantly higher in both reading and math in two privately 
managed schools when compared with traditional schools serving similar 
students. However, 4th grade scores in reading and math were 
significantly lower in a privately managed public school in Cleveland, 
as were 5th grade scores in two privately managed schools in St. Paul. 
In Detroit, where eight privately managed schools were studied, 
reading and math scores of 5th graders in privately managed schools 
were generally lower. In Phoenix, GAO found no significant 
differences. GAO’s results are limited to the schools and grade levels 
examined and may not be indicative of performance at other schools.

www.gao.gov/cgi-bin/getrpt?GAO-04-62.

To view the full product, including the scope and methodology, click 
on the link above. For more information, contact Marnie Shaul at (202) 
512-7215 or shaulm@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

Number of Schools Managed by Education Management Companies Is 
Increasing; Descriptive Information on Achievement Widely Available:

No Consistent Pattern of Differences in Scores on State Tests Found 
between Public Schools Managed by Private Companies and Comparable, 
Traditional Elementary Schools:

Concluding Observations:

Agency Comments:

Appendix I: Scope and Methodology:

Scope and School Selection:

Measures and Analytic Methods:

Limitations of the Analysis:

Appendix II: Tables of Regression Results for Differences in Student 
Achievement Scores on State Assessments:

Appendix III: Characteristics of Privately Managed Schools and Comparable 
Traditional Public Schools in Detroit:

Appendix IV: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Acknowledgments:

Related GAO Products:

Tables:

Table 1: State Assessment Schedules and Tests of Reading and 
Mathematics through Fifth Grade in Six Cities in School Year 2001-02:

Table 2: School Characteristics of the Privately Managed Schools and 
Comparison Schools in Denver and San Francisco:

Table 3: School Characteristics of the Privately Managed Schools and 
Comparison Schools in Cleveland and St. Paul:

Table 4: School Characteristics of the Privately Managed School and 
Comparison Schools in Phoenix:

Table 5: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Denver:

Table 6: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
San Francisco:

Table 7: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Cleveland:

Table 8: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed School and Comparison 
Schools in St. Paul (School A Comparison):

Table 9: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed School and Comparison 
Schools in St. Paul (School B Comparison):

Table 10: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Phoenix:

Table 11: Regression Results for Differences in Student Performance on 
State Reading Assessment at the Privately Managed and Comparison 
Schools in Detroit:

Table 12: Regression Results for Differences in Student Performance on 
State Math Assessment at the Privately Managed and Comparison Schools 
in Detroit:

Figures:

Figure 1: Number of Public Schools Managed by Private Companies from 
School Year 1998-99 through 2002-03:

Figure 2: Location of Public Schools Operated by Private Management 
Companies in School Year 2002-03 and Annual Number of States with Such 
Schools Since 1998-99:

Figure 3: Number of Educational Management Companies from School Year 
1998-99 through 2002-03:

Figure 4: Test Score Section of a Report Card for a Hypothetical School 
in Colorado for School Year 2002-03:

Figure 5: Fifth Grade Reading Scores for the Privately Managed School 
and Comparison Schools in Denver on the Colorado Student Assessment 
Program:

Figure 6: Fifth Grade Reading and Math Scores for the Privately Managed 
School and Comparison Schools in San Francisco on the Stanford-9 
Achievement Test:

Figure 7: Fourth Grade Reading Scores for the Privately Managed School 
and Comparison Schools in Cleveland on the Ohio Proficiency Test:

Figure 8: Fifth Grade Reading and Math Scores for the Privately Managed 
Schools and Comparison Schools in St. Paul on the Minnesota 
Comprehensive Assessment Program:

Figure 9: Fourth Grade Reading Scores for Privately Managed and 
Comparison Schools in Detroit on the Michigan Education Assessment 
Program:

Figure 10: Fourth Grade Math Scores for Privately Managed and 
Comparison Schools in Detroit on the Michigan Education Assessment 
Program:

Figure 11: Fifth Grade Reading and Math Scores for the Privately 
Managed School and Comparison Schools in Phoenix:

Abbreviations:

NCLBA: No Child Left Behind Act of 2001:

LEP: limited English proficiency:

OLS: ordinary least squares:

United States General Accounting Office:

Washington, DC 20548:

October 29, 2003:

The Honorable John A. Boehner 
Chairman 
Committee on Education and the Workforce 
House of Representatives:

Dear Mr. Chairman:

In the last decade, reports of failing schools and low student 
achievement have given rise to a variety of educational reforms that 
have expanded opportunities for private companies to play a role in 
public education. In some cases, school districts have looked to 
private companies to manage poorly performing schools with the 
expectation of improving scores on state achievement tests. The 
accountability requirements of the No Child Left Behind Act (NCLBA) of 
2001 may further increase such arrangements because schools that 
continuously fail to make adequate yearly progress toward meeting state 
proficiency goals may eventually be subject to fundamental 
restructuring by the state, including turning the operation of the 
school over to a private management company.[Footnote 1]

As the role of private companies in the management of public schools 
has developed, interest in students' academic performance at these 
schools has grown. In light of the expanding role for private companies 
in public education, we agreed with your office to determine the 
prevalence of public schools managed by private companies and to report 
on what can be learned about student achievement in these schools from 
publicly available information sources. In addition, we agreed to 
compare student achievement in elementary schools operated by private 
companies in large urban areas with student achievement in similar 
traditional elementary schools.

To determine the prevalence of privately managed schools, we obtained 
information from research organizations on the number and location of 
public schools that have both instructional and noninstructional 
services provided by private companies. We relied primarily on a 2002-
03 annual report compiled by Arizona State University that tracks 
nationwide growth of for-profit educational management companies, the 
only such report of its kind we found.[Footnote 2] We selectively 
verified data in that report with information compiled by the National 
Center for Education Statistics, the Center for Education Reform, the 
National Association of Charter School Authorizers, and university 
researchers in Michigan and New Jersey. To locate publicly available 
information on student achievement in privately managed schools, we 
examined a variety of Internet Web sites, including state, district, 
and the larger private management company sites. We also reviewed 
studies conducted by the companies and by other researchers, as well as 
performance reports issued by state and district school officials to 
learn what has been reported about achievement at these schools.

To compare student achievement in public elementary schools operated by 
private companies with that at similar traditional schools, we analyzed 
individual student performance in specific grades on mandatory state 
tests of reading and mathematics. We identified 14 public elementary 
schools in larger urban areas across the country that had been 
continuously managed by private companies since the 1998-99 school 
year. These schools, managed by six private companies, were located in 
six cities: Cleveland, Ohio; Denver, Colorado; Detroit, Michigan; 
Phoenix, Arizona; St. Paul, Minnesota; and San Francisco, California. 
We matched each of the 14 schools with a set of 2 or more traditional 
public schools in the same city that were similar in terms of grade 
span, enrollment, student race and ethnicity, and the percentage of 
students with limited-English proficiency, disabilities, and 
eligibility for the federally subsidized free and reduced-price school 
lunch program. (See app. I for details on the procedures used to match 
schools.) Using test scores for the school years 2000-01 and 2001-02, 
we compared student scores in reading and math at one grade level in 
each of the 14 privately managed schools with scores of students in the 
same grade at the set of similar traditional schools. We also analyzed 
changes in individual students' test scores over time in the three 
cities where such data were available--Denver, Phoenix, and San 
Francisco.
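The matching step described above can be sketched in code. The 
following is an illustrative example only, not GAO's actual procedure 
or data: the school names, characteristics, and distance measure are 
invented to show one simple way of selecting comparison schools that 
are similar on standardized school characteristics.

```python
# Hypothetical sketch of matching a privately managed school to the two
# most similar traditional schools on standardized characteristics.
# All names and figures below are invented for illustration.
import math

# (enrollment, pct minority, pct free/reduced-price lunch eligible)
target = {"name": "Privately Managed A", "traits": (450, 85.0, 78.0)}
candidates = [
    {"name": "Traditional 1", "traits": (430, 88.0, 80.0)},
    {"name": "Traditional 2", "traits": (900, 40.0, 30.0)},
    {"name": "Traditional 3", "traits": (470, 82.0, 75.0)},
    {"name": "Traditional 4", "traits": (150, 60.0, 90.0)},
]

def standardize(values):
    # Convert a list of values to z-scores (population standard deviation).
    mean = sum(values) / len(values)
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values)) or 1.0
    return [(v - mean) / sd for v in values]

# Standardize each characteristic across all schools so that no single
# trait (such as raw enrollment) dominates the distance calculation.
all_traits = [target["traits"]] + [c["traits"] for c in candidates]
cols = list(zip(*all_traits))
z = list(zip(*[standardize(col) for col in cols]))
target_z, cand_z = z[0], z[1:]

def distance(a, b):
    # Euclidean distance between two standardized trait vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Select the 2 closest traditional schools as the comparison set.
ranked = sorted(zip(candidates, cand_z), key=lambda p: distance(target_z, p[1]))
comparison = [c["name"] for c, _ in ranked[:2]]
print(comparison)
```

A real matching procedure, such as the one described in appendix I, 
would also weigh grade span and other student characteristics; the 
point here is only that similarity is judged across several 
standardized traits at once.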

Our analyses controlled for differences in characteristics of students 
attending the privately managed and traditional schools by using 
demographic characteristics--such as those used in selecting similar 
traditional schools--and student mobility to the extent that these data 
were available for individual students. We use the word significant--as 
in significantly higher or lower--throughout this report to mean 
statistical significance at a 95-percent confidence level, not to refer 
to the importance of the difference. Our study is constrained to 
varying degrees by incomplete data for some locations and by the lack 
of information on the reasons that individual students enrolled in 
these schools. In addition, our findings about student performance are 
limited to the particular grades in the privately managed and 
traditional schools we studied and may not be indicative of other 
grades or schools. For this reason, we do not identify the specific 
schools or the associated management companies in our study by name. A 
detailed explanation of our methodology, study limitations, and data 
verification procedures is found in appendix I. We conducted our work 
from January to October 2003 in accordance with generally accepted 
government auditing standards.
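The regression approach described above can be illustrated with a 
short sketch. This is not GAO's actual analysis, data, or model; the 
variables and effect sizes below are invented to show how ordinary 
least squares (OLS) can estimate a score difference associated with 
the type of school while controlling for student characteristics.

```python
# Illustrative OLS sketch with invented data: estimate the adjusted
# difference in test scores between a privately managed school and
# comparison schools, controlling for student demographics.
import numpy as np

rng = np.random.default_rng(0)
n = 200
private = rng.integers(0, 2, n)   # 1 = attends the privately managed school
lunch = rng.integers(0, 2, n)     # 1 = free/reduced-price lunch eligible
lep = rng.integers(0, 2, n)       # 1 = limited English proficiency
# Hypothetical scores: baseline 600, -20 for lunch eligibility,
# -15 for LEP, a true +5 effect for the privately managed school, noise.
score = 600 + 5 * private - 20 * lunch - 15 * lep + rng.normal(0, 10, n)

# Design matrix: intercept, school-type indicator, demographic controls.
X = np.column_stack([np.ones(n), private, lunch, lep])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# beta[1] is the estimated score difference attributable to the school
# type after adjusting for the controls; a full analysis would also
# compute a standard error and test the difference at a 95-percent
# confidence level, as the report does.
print(round(beta[1], 1))
```

The coefficient on the school-type indicator, rather than a raw 
difference in average scores, is what allows the comparison to hold 
student characteristics constant.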

Results in Brief:

The number of public schools managed by private companies has tripled 
in the last 5 years, according to data compiled by university 
researchers. Nevertheless, only slightly more than 400 public schools 
were privately managed in the 2002-03 school year, considerably less 
than 1 percent of all public schools. Managed by 47 private companies, 
these schools were located in 25 states and the District of Columbia, 
with about one-half located in Arizona and Michigan. Descriptive 
information about achievement at individual schools was widely 
available in the form of school report cards that identified the 
proficiency levels or achievement scores of students tested in the 
current year, relative to state standards and state or district 
averages. Three company reports presented information on changes in 
achievement over time for all their schools in one or more states. 
While providing useful information on student achievement, these 
reports were generally not designed to answer research questions about 
the relative effectiveness of privately managed schools compared with 
traditional schools.

Our analyses of scores on state reading and mathematics tests in 
selected grades did not show a consistent pattern of superior student 
performance between schools managed by private companies and 
demographically similar traditional public schools in six cities. In 
two cities, Denver and San Francisco, students at the privately managed 
schools had on average significantly higher reading and mathematics 
scores than students at similar traditional public schools. Students at 
these privately managed schools also demonstrated greater academic 
gains over multiple years. However, in two other cities, Cleveland and 
St. Paul, student scores in reading and math were significantly lower 
in schools managed by private companies compared with similar 
traditional schools. In Detroit, results were somewhat mixed, although 
scores tended to be lower in the privately managed schools--reading 
scores were lower in 6 of the 8 privately managed schools and math 
scores were lower in 7 of the 8 privately managed schools, compared 
with similar traditional schools. In Phoenix, there were no significant 
differences in either reading or math between students at the two types 
of schools. Our results are limited to the schools and grade levels 
examined and may not be indicative of performance at other schools.

Background:

The role of for-profit private companies in managing public schools is 
a fairly recent phenomenon. Until the early 1990's, school districts 
contracted with private companies largely to provide noninstructional 
services, such as transportation, building maintenance, or school 
lunches. By the 1994-95 school year, however, the role of private 
companies had expanded to include instructional services in four school 
districts, as we reported in a 1996 GAO study.[Footnote 3] These early 
decisions by school districts to contract with private companies often 
followed years of frustration with low student achievement in these 
schools. Since that time, the growth of private for-profit educational 
management companies has been aided by financial support from the 
business community and by the opportunities states have offered for 
greater flexibility in the provision of education services.

Private for-profit management companies supply a wide array of 
educational and management services that may include providing the 
curriculum, educational materials, and key staff as well as payroll 
processing, busing, and building maintenance. The range and type of 
services vary by company, and to some extent by school within the 
company, as some companies have adapted their educational programs to 
the needs and interests of local areas. According to a study of for-
profit educational management companies by Arizona State University, 
three-quarters of schools operated by private for-profit management 
companies in school year 2002-03 served elementary grade students in 
kindergarten through fifth grade and in some cases continued to serve 
students in higher grades. The size of schools operated by private 
management companies varied from an enrollment of fewer than 100 
students to more than 1,000 students, but averaged about 450. Several 
of the major companies reportedly served a predominantly low-income, 
urban, and minority student population.

Private companies operate both traditional public schools and public 
charter schools. Some states or districts contract with companies to 
manage traditional public schools--often poorly performing public 
schools. These schools are generally subject to the same set of 
requirements that govern traditional schools within the district. More 
commonly, companies manage charter schools--public schools that 
operate under agreements that exempt them from some state and district 
regulations but hold them accountable for improving pupil outcomes. 
Enrollment in charter schools generally is not limited to defined 
neighborhoods; these schools may draw students from larger geographic 
areas than most traditional schools but must be open to all, without 
discrimination, up to enrollment limits. Like traditional public 
schools, charter schools receive public funds and may not charge 
tuition for regular school programs and services, but may charge for 
before- and after-school services, extended day kindergarten, or pre-
kindergarten classes.

Public schools operated by private management companies, both 
traditional and charter, are subject to requirements of the NCLBA, 
including expanded testing requirements. Under this law, states must 
establish standards for student achievement and goals for schools' 
performance. Results must be measured every year by testing all 
students in each of elementary grades three through five and middle 
school grades six through eight, starting in school year 2005-
06,[Footnote 4] and by assessing how schools have progressed in terms 
of improving the performance of their students. Information from these 
tests must be made available in annual reports that include the 
performance of specific student subgroups, as defined by certain 
demographic and other characteristics. During the school years covered 
in our study, states were only required to test students in one 
elementary, one middle school, and one high school grade. Table 1 
identifies the different state testing schedules and instruments for 
the elementary grades in school year 2001-2002 in the cities where we 
made test score comparisons.

Table 1: State Assessment Schedules and Tests of Reading and 
Mathematics through Fifth Grade in Six Cities in School Year 2001-02:

City, state: Phoenix, Arizona; Elementary grades tested: 2 - 5; 
State test administered: Stanford Achievement Test, 9th 
Edition.

City, state: San Francisco, California; Elementary grades tested: 2 - 
5; State test administered: Stanford Achievement Test, 9th 
Edition.

City, state: Denver, Colorado; Elementary grades tested: 3 - 5[A]; 
State test administered: Colorado Student Assessment Program.

City, state: Detroit, Michigan; Elementary grades tested: 4; 
State test administered: Michigan Educational Assessment Program.

City, state: St. Paul, Minnesota; Elementary grades tested: 3 & 5; 
State test administered: Minnesota Comprehensive Assessments.

City, state: Cleveland, Ohio; Elementary grades tested: 4; 
State test administered: Ohio Proficiency Test.

Source: State education departments of the states shown.

[A] Reading was tested in all three grades, but mathematics was tested 
only in fifth grade.

[End of table]

Infrequent state testing is one of several factors that have hampered 
efforts to evaluate the impact of privately managed public schools on 
student achievement. To assess the impact of school management, 
researchers must isolate the effects of private management from the 
effects of other factors that could influence students' test scores, 
such as school resources or student ability. Ideally, this would be 
accomplished by randomly assigning students to either a privately 
managed school or a traditionally managed school, resulting in two 
groups of students generally equivalent except for the type of school 
assigned. However, random assignment is rarely practical, and 
researchers usually employ less scientifically rigorous methods to find 
a generally equivalent comparison group. For instance, in some cases, 
schools may be matched on schoolwide student demographic 
characteristics such as race or socioeconomic status. When such 
characteristics can be obtained for individual students in the study, 
validity is improved. In addition, validity is further improved when 
the progress of students can be followed over several years. However, 
if the data on individual student characteristics are unreliable or 
unavailable, as has often been the case, researchers experience 
difficulties developing valid comparison groups. Similarly, if 
individual test scores are available only for one grade rather than 
successive grades, researchers cannot reliably track the progress of 
student groups over time and compare the gains made by the two groups. 
In our 2002 report that examined research on schools managed by some of 
the largest education management companies, we found that insufficient 
rigorous research existed to clearly address the question of their 
impact on student achievement.[Footnote 5] Part of the reason that so 
few rigorous studies are available may stem from the difficulties 
inherent in this research.

Number of Schools Managed by Education Management Companies Is 
Increasing; Descriptive Information on Achievement Widely Available:

Although the number of public schools operated by private, for-profit 
management companies has risen rapidly in recent years, these schools 
still comprise a very small proportion of all public schools 
nationwide. In school year 2002-03, the 417 privately managed schools, 
most of them charter schools, were located in 25 states and the 
District of Columbia, with about one-half in Arizona and Michigan. 
These schools 
were operated by 47 private management companies. Descriptive 
information about achievement in these schools was widely available in 
the form of individual school report cards that often provided 
comparisons with state or district averages, but often not with similar 
traditional schools. Three management company reports summarized 
achievement gains over time for all their schools in one or more 
states, using various methodologies to illustrate student performance. 
School and company reports provided useful information on student 
achievement, but generally were not designed to answer research 
questions about the effectiveness of privately managed schools compared 
with traditional schools.

While Numbers Are Increasing, the Percentage of Public Schools Managed 
by Private Companies Remains Small:

In school year 2002-03, at least 417 public schools were operated by 
private for-profit management companies, according to Arizona State 
University researchers.[Footnote 6] This figure was three times the 
number of schools operated by private management companies 
just 4 years earlier, when there were only 135 schools, as shown in 
figure 1. Over three-quarters of the 417 schools were charter schools, 
and they comprised about 12 percent of charter schools nationwide. 
Despite the sharp rise in the number of public schools operated by 
management companies, they represented a small proportion of all 
charter and traditional schools in 2002-03. About one-half of 1 percent 
of all schools nationwide were privately managed schools.

Figure 1: Number of Public Schools Managed by Private Companies from 
School Year 1998-99 through 2002-03:

[See PDF for image]

[End of figure]

Over the same 5 years, public schools operated by private management 
companies have also become more geographically widespread, according to 
data from the Arizona State University study. Figure 2 shows that in 
school year 1998-99, private management companies operated public 
schools in 15 states. By school year 2002-03, the companies had schools 
in 25 states and the District of Columbia, with about 48 percent of the 
privately managed schools in Arizona and Michigan. Florida, Ohio, and 
Pennsylvania also had large numbers of schools as indicated by the map 
in figure 2, which shows the location of public schools operated by 
private management companies in school year 2002-03.

Figure 2: Location of Public Schools Operated by Private Management 
Companies in School Year 2002-03 and Annual Number of States with Such 
Schools Since 1998-99:

[See PDF for image]

[End of figure]

The number of private management companies identified by the Arizona 
State University researchers also increased over the same period, but 
the companies varied greatly in terms of the number of schools they 
operated. As shown in figure 3, the number of companies increased from 
13 in school year 1998-99 to 47 in school year 2002-03. Most of these 
companies were founded in the 1990's, but since their 
founding, some companies have been consolidated or have gone out of 
business and have been succeeded by newly formed companies. In school 
year 2002-03, most of the companies were small, operating 15 or fewer 
schools each. Five medium-sized companies--Chancellor Beacon 
Academies; The Leona Group; Mosaica Education, Inc.; National Heritage 
Academies; and White Hat Management--operated from 21 to 44 schools 
each. The single largest company, Edison Schools, operated 116 schools.

Figure 3: Number of Educational Management Companies from School Year 
1998-99 through 2002-03:

[See PDF for image]

[End of figure]

According to the Arizona State University report, 43 of the 47 companies 
operating in school year 2002-03 managed only charter schools.[Footnote 
7] Charter schools have greater autonomy and decision-making ability in 
such areas as purchasing and hiring compared with traditional schools 
that are generally subject to district requirements, including labor 
agreements. Arizona researchers noted that state charter school laws 
have provided opportunities for private management that were not 
present earlier, and Western Michigan University researchers indicated 
that the growth of private educational management companies occurred 
soon after charter school reforms were enacted in that state. They 
explained that some charter holders started their own private 
management companies and other charter holders sought the acumen and 
financial resources of management companies already established in the 
business.[Footnote 8]

Individual School Reports Describe Achievement Levels, and Some Company 
Reports Describe Gains Compared to State or District Averages:

Two kinds of reports available to the public--school reports and 
company reports--described student achievement at privately managed 
schools relative to national, state, or district averages in school 
year 2002-03. Referred to as school report cards, the detailed 
individual school reports generally provided a snapshot of how well 
students attending the school did in meeting state achievement 
standards for the year. These report cards were issued by states, 
school districts, and by some of the larger companies, like the Leona 
Group for its schools in Michigan.[Footnote 9] Often available through 
the Internet, the report cards for individual schools generally 
described results of state tests in terms of the proficiency levels or 
achievement scores for the school overall, by grade level, subject 
matter, or in some cases, minority group or other subgroup.[Footnote 
10] Some report cards also provided historical information on the 
school's performance over several preceding years. School 
characteristics, such as the size, demographics, staffing, and 
finances, were included in many cases along with the proficiency levels 
or achievement scores. Figure 4 is an example of the test score section 
of Colorado's school report card for a hypothetical school.

Figure 4: Test Score Section of a Report Card for a Hypothetical School 
in Colorado for School Year 2002-03:

[See PDF for image]

Note: The Colorado school report cards include an explanation of the 
factors used to develop the school's overall academic performance in 
this section.

[End of figure]

As in Colorado, many school report cards compared results to the 
average in the state or school district, which allowed parents to see 
how well their children's school was doing--not just in relation to 
state standards but also in relation to the performance of all other 
public schools in the state or district. However, these report cards 
were primarily designed to provide descriptive information for parents 
and to give an indication of school performance, not to evaluate the 
relative effectiveness of one school versus another. Report cards 
usually did not directly compare the performance of one school against 
other similar schools, and when they did, the comparison schools 
selected were, by necessity, matched at the school level, rather than 
the individual student level.[Footnote 11] Thus, differences in school 
performance at any particular grade might be due to differences in the 
students in that grade, as the reports released by the Leona Group 
warned, rather than due to factors related to the management or 
educational strategies of the school. For this reason, report cards, 
while useful to parents, are not the best source of information if the 
goal is to evaluate the effectiveness of one school compared with 
another.

Company reports, a second source of school performance information, 
tended to provide a summary of how well students at all the company's 
schools in one or more states were doing over a period of several 
years. Generally available through the Internet, reports from three 
companies--Mosaica Education, Inc.; the National Heritage Academies; 
and Edison Schools--emphasized broad patterns, such as gains in 
achievement test scores or proficiency levels that were averaged across 
schools, grades, and subjects tested. Our descriptions of the 
companies' findings are based on their public reports and not on our 
independent review of their methodologies or conclusions.

Both the Mosaica and National Heritage Academies reports compared 
student performance to national norms or state averages. The Mosaica 
Education, Inc., report summarized student gains on tests administered 
from the fall of school year 1999-2000 through the spring of 2001-02 at 
its 18 schools in 5 states and the District of Columbia.[Footnote 12] 
According to the report, there was sustained growth in average 
achievement scores over time, with an increase in the proportion of 
Mosaica students scoring as well as or better than the average student on 
a nationally normed test and a commensurate decrease in the proportion 
scoring at or below the 25th percentile. On the basis of these test 
results, the report stated that about a third of Mosaica's students 
ranked in the top one-half of the nation's students in school year 
2001-02.

The National Heritage Academies report used individual student 
performance on the state's achievement tests to compare two groups of 
students attending the company's 22 schools in Michigan in school year 
2000-01--veteran students who took the test at least 2 years after they 
applied to the school and newcomers who took the test less than 2 years 
after they applied.[Footnote 13] The study found a relationship between 
time associated with the company's schools and higher performance, with 
veteran students outperforming newcomers across all subjects and grades 
tested and also outperforming state averages on 8 out of 10 tests. The 
report cautioned, however, that such evidence is not proof of causation 
and that some other factors not accounted for in the study might be 
responsible for the results.

The Mosaica and National Heritage Academies reports both provided a 
broad view of overall company performance that, along with school 
report cards, could give parents more information on which to base 
their decisions about their children's schooling. However, like school 
report cards, these two company studies were not designed to more 
directly assess school effectiveness. Neither company report included 
comparisons with students at similar traditional schools or addressed 
the question of whether the patterns of achievement that they 
identified might also be found in other schools as well.

Edison's annual report for 2001-02 used a methodology that went further 
toward assessing school effectiveness than other company reports we 
examined.[Footnote 14] In addition to providing a summary of how well 
its students were doing over time, Edison compared some of its schools 
with traditional schools. Generally, the report summarized trends in 
performance at 94 of Edison's 112 school sites in multiple states over 
several years, compared to state and district averages.[Footnote 15] 
According to the report, most schools had low levels of achievement at 
the time Edison assumed management, but achievement levels subsequently 
increased at most of its school sites. Trends were also provided for 
several subsets of its schools, including a comparison of 66 of the 94 
Edison schools that could be matched with 1,102 traditional schools on 
two demographic variables. Traditional schools selected as matches were 
those considered similar in terms of the percentages of students who 
were African-American and/or Hispanic and who were eligible for the 
free and reduced-price school lunch program, an indicator of low 
income.[Footnote 16] Edison compared the average scores of students in 
Edison schools with average scores of students in the traditional 
schools and found that its schools averaged gains that were about 2 
percentage points or 3 percentiles higher per year than those of 
traditional schools and that about 40 of its 66 schools outperformed 
the traditional schools.

However, the Edison analysis was limited by the fact that it was 
conducted using aggregated, school-level data and did not control for 
differences in the individual students being compared.[Footnote 17] 
Edison noted that it has taken steps to strengthen the way it evaluates 
the progress of its students and schools by commissioning a study by 
RAND, a nonprofit research organization that has evaluated educational 
reforms. The study began in 2000 and is scheduled for release in the 
summer of 2004. Where possible, RAND plans to compare the scores of 
individual Edison students to those of traditional public school 
students with similar characteristics.

No Consistent Pattern of Differences in Scores on State Tests Found 
between Public Schools Managed by Private Companies and Comparable, 
Traditional Elementary Schools:

Differences in student performance on state assessments between 
privately managed public schools and comparable, traditional public 
schools varied by metropolitan areas for the grade levels in our 
study.[Footnote 18] Average student scores were significantly higher in 
both reading and math for fifth graders in 2 privately managed schools, 
1 in Denver and 1 in San Francisco, compared with similar traditional 
public schools, as were gains over time when we examined a previous 
year's scores for these students. However, fourth grade scores in the 
privately managed school in Cleveland and fifth grade scores at 2 
privately managed schools in St. Paul were significantly lower compared 
with scores in the similar traditional schools. In Detroit, average 
fifth grade reading scores were significantly lower in 6 of the 8 
privately managed schools, and math scores were lower in all but 1 
privately managed school. No significant differences in reading or math 
scores were found between the privately managed school and comparison 
schools in Phoenix.

Scores on State Tests Were Higher in Privately Managed Schools in 
Denver and San Francisco:

Average scores on state tests for fifth grade students attending 
privately managed schools in Denver and San Francisco were 
significantly higher compared with students attending similar, 
traditional public schools. Table 2 shows the characteristics used in 
matching privately managed and traditional schools in Denver and San 
Francisco and how the selected schools compared on these 
characteristics.[Footnote 19] As shown, schools generally had high 
proportions of minority and low-income students (as measured by free/
reduced-lunch program eligibility) and students with limited English 
proficiency (LEP). For our test score analyses, we were able to obtain 
data on characteristics shown in table 2 for individual students in our 
study, as well as data on student mobility.[Footnote 20] We used these 
data in the test score analyses to further control for student 
differences in the grade level we studied. (See app. II, where tables 5 
and 6 show detailed results of these analyses.):
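The idea of using individual student data to control for differences 
can be illustrated with a simplified sketch. The example below uses 
made-up student records and a single characteristic (limited English 
proficiency): it compares schools within each subgroup and then 
averages the gaps, so that differences in student composition cannot 
drive the comparison. This is an illustration of the general 
technique, not the report's actual statistical model, which controls 
for several characteristics at once (see app. II).

```python
from collections import defaultdict
from statistics import mean

# Made-up student records for illustration: (school type, LEP status, score).
students = [
    ("private", "LEP", 560), ("private", "LEP", 570),
    ("private", "non-LEP", 590), ("private", "non-LEP", 600),
    ("traditional", "LEP", 545), ("traditional", "LEP", 555),
    ("traditional", "non-LEP", 575), ("traditional", "non-LEP", 585),
]

scores = defaultdict(list)
for school_type, lep, score in students:
    scores[(school_type, lep)].append(score)

# Compare private and traditional schools within each subgroup, then
# average the gaps, so subgroup composition cannot drive the result.
gaps = [mean(scores[("private", g)]) - mean(scores[("traditional", g)])
        for g in ("LEP", "non-LEP")]
adjusted_gap = mean(gaps)
print(adjusted_gap)  # 15.0 with these made-up records
```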

Table 2: School Characteristics of the Privately Managed Schools and 
Comparison Schools in Denver and San Francisco:

City: Denver; Privately managed/traditional: Privately managed; 
Enrollment: 665; Percent free and reduced lunch: 76; Percent special 
education: 8; Percent LEP: 27; Percent minority: 95.

City: Denver; Privately managed/traditional: Traditional; Enrollment: 
645; Percent free and reduced lunch: 77; Percent special education: 4; 
Percent LEP: 40; Percent minority: 95.

City: Denver; Privately managed/traditional: Traditional; Enrollment: 
638; Percent free and reduced lunch: 52; Percent special education: 7; 
Percent LEP: 25; Percent minority: 95.

City: Denver; Privately managed/traditional: Traditional; Enrollment: 
403; Percent free and reduced lunch: 80; Percent special education: 8; 
Percent LEP: 52; Percent minority: 96.

City: Denver; Privately managed/traditional: Traditional; Enrollment: 
394; Percent free and reduced lunch: 76; Percent special education: 15; 
Percent LEP: 23; Percent minority: 77.

City: San Francisco; Privately managed/traditional: Privately managed; 
Enrollment: 506; Percent free and reduced lunch: 68; Percent special 
education: 4; Percent LEP: 40; Percent minority: 95.

City: San Francisco; Privately managed/traditional: Traditional; 
Enrollment: 474; Percent free and reduced lunch: 96; Percent special 
education: 9; Percent LEP: 51; Percent minority: 97.

City: San Francisco; Privately managed/traditional: Traditional; 
Enrollment: 525; Percent free and reduced lunch: 81; Percent special 
education: 10; Percent LEP: 33; Percent minority: 96.

Source: Common Core of Data school year 2000-01 and school districts.

[End of table]

As shown in figure 5, in Denver the average reading score of 572 for 
fifth grade students in the privately managed public school is higher 
than the average of 557 for students in similar traditional 
public schools. The average math score of 467 at the privately managed 
school is also higher than the 440 average score in the comparison 
traditional schools. For both reading and math, differences in scores 
remained significantly higher after we controlled for factors 
representing differences in the student populations.

Figure 5: Fifth Grade Reading Scores for the Privately Managed School 
and Comparison Schools in Denver on the Colorado Student Assessment 
Program:

[See PDF for image]

Note: Percentiles are derived from analyses that control for 
differences in student characteristics.

[End of figure]

Figure 5 also shows the difference in reading performance, controlling 
for other factors, between the typical student at the privately managed 
school and the average student at the same grade level in the similar 
traditional schools in Denver. The bell curve represents the 
distribution of combined student scores in the traditional schools, 
with the lighter figure representing the student scoring at about the 
50th percentile. The shaded figure represents the average student from 
the privately managed school. Although this student's score is at about 
the 50th percentile in the privately managed school, the same score 
would place him or her at about the 60th percentile when compared 
against the scores of students in the traditional schools. The 
difference in math scores suggests a similar outcome--that is, the 
average student in the privately managed school would score at about 
the 60th percentile in the comparison traditional schools.[Footnote 21]
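This kind of translation from an adjusted score difference to a 
percentile can be sketched under the assumption (made here for 
illustration) that comparison-school scores are approximately normally 
distributed. The effect size below is hypothetical; the report's 
actual estimates come from its covariate-adjusted analyses.

```python
from statistics import NormalDist

# Hypothetical adjusted difference between the privately managed
# school's mean and the comparison schools' mean, expressed in
# standard-deviation units of the comparison distribution.
adjusted_difference_sd = 0.25

# Assuming roughly normal comparison-school scores, a student at the
# privately managed school's mean falls at this percentile of the
# comparison schools' distribution.
percentile = NormalDist().cdf(adjusted_difference_sd) * 100
print(round(percentile))  # about the 60th percentile
```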

In San Francisco, fifth grade reading scores averaged 636 for students 
in the privately managed school and 627 for students in the comparison 
traditional schools. Mathematics performance was also higher: fifth 
grade students at the privately managed school averaged 640, compared 
with 623 for students in the similar traditional schools. (See fig. 6.) As 
in Denver, these differences were significant when controlling for 
other factors. This analysis suggests that an average student in the 
privately managed school would likely exceed about 60 percent of 
students in the traditional comparison schools in reading and about 65 
percent of those students in math.

Figure 6: Fifth Grade Reading and Math Scores for the Privately Managed 
School and Comparison Schools in San Francisco on the Stanford-9 
Achievement Test:

[See PDF for image]

[End of figure]

In both Denver and San Francisco, we were able to examine student 
performance over time, and our findings of achievement over time were 
similar to the findings described above. Students attending the 
privately managed schools showed significantly greater gains over time 
than the students in the comparison traditional schools. Specifically, 
fifth grade students in our study who had attended their privately 
managed schools since the third grade demonstrated significantly higher 
achievement gains between grades 3 and 5 than did such students in the 
traditional comparison schools.[Footnote 22]
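The gain comparison described above can be sketched with made-up 
grade 3 and grade 5 scores for students who stayed in the same 
school: each group's average gain is computed, and the groups' gains 
are then compared. All numbers are hypothetical.

```python
from statistics import mean

# Hypothetical grade 3 and grade 5 scores, paired by student, for
# students who remained in the same school between grades.
private_g3, private_g5 = [540, 550, 545], [570, 590, 580]
trad_g3, trad_g5 = [542, 548, 544], [562, 573, 565]

# Average gain between grades 3 and 5 for each group of students.
private_gain = mean(g5 - g3 for g3, g5 in zip(private_g3, private_g5))
trad_gain = mean(g5 - g3 for g3, g5 in zip(trad_g3, trad_g5))
print(private_gain - trad_gain)  # 13.0 with these made-up scores
```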

Scores on State Tests Were Lower in Privately Managed Schools in 
Cleveland and St. Paul:

Average scores on state tests for fourth grade students attending 
privately managed schools in Cleveland and fifth grade students 
attending privately managed schools in St. Paul were significantly 
lower compared with scores of students attending similar traditional 
public schools.[Footnote 23] One privately managed school in Cleveland 
and 2 privately managed schools in St. Paul were examined, and as in 
Denver and San Francisco, the schools in our study from these cities 
were high minority and low-income schools. Table 3 shows the 
characteristics used to match schools in Cleveland and St. Paul and how 
the schools selected compared on these characteristics. For our test 
score analyses in Cleveland, we were able to obtain data on 
characteristics shown in table 3 for individual students in our study, 
as well as data on student mobility.[Footnote 24] In St. Paul, we 
obtained data on all characteristics shown in table 3 for individual 
students, except special education.[Footnote 25] In addition, we were 
able to obtain data on limited English proficiency. We used these data 
in the test score analyses for both cities to further control for 
student differences in the grade level we studied. (See app. II, where 
tables 7, 8, and 9 show detailed results of these analyses.):

Table 3: School Characteristics of the Privately Managed Schools and 
Comparison Schools in Cleveland and St. Paul:

City: Cleveland; Privately managed/traditional: Privately managed; 
Enrollment: 411; Percent free and reduced lunch: 77; Percent special 
education: 4; Percent minority: 100.

City: Cleveland; Privately managed/traditional: Traditional; 
Enrollment: 422; Percent free and reduced lunch: 80; Percent special 
education: 10; Percent minority: 100.

City: Cleveland; Privately managed/traditional: Traditional; 
Enrollment: 496; Percent free and reduced lunch: 88; Percent special 
education: 8; Percent minority: 99.

City: Cleveland; Privately managed/traditional: Traditional; 
Enrollment: 352; Percent free and reduced lunch: 77; Percent special 
education: 16; Percent minority: 99.

City: Cleveland; Privately managed/traditional: Traditional; 
Enrollment: 561; Percent free and reduced lunch: 99; Percent special 
education: 8; Percent minority: 99.

City: St. Paul; Privately managed/traditional: Privately managed; 
Enrollment: 116; Percent free and reduced lunch: 70; Percent special 
education: 12; Percent minority: 51.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 386; Percent free and reduced lunch: 46; Percent special 
education: 12; Percent minority: 43.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 484; Percent free and reduced lunch: 48; Percent special 
education: 12; Percent minority: 50.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 223; Percent free and reduced lunch: 71; Percent special 
education: 9; Percent minority: 72.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 348; Percent free and reduced lunch: 59; Percent special 
education: 10; Percent minority: 69.

City: St. Paul; Privately managed/traditional: Privately managed; 
Enrollment: 126; Percent free and reduced lunch: 71; Percent special 
education: 14; Percent minority: 72.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 313; Percent free and reduced lunch: 76; Percent special 
education: 16; Percent minority: 82.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 223; Percent free and reduced lunch: 71; Percent special 
education: 9; Percent minority: 72.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 438; Percent free and reduced lunch: 64; Percent special 
education: 13; Percent minority: 76.

City: St. Paul; Privately managed/traditional: Traditional; 
Enrollment: 524; Percent free and reduced lunch: 68; Percent special 
education: 17; Percent minority: 61.

Source: Common Core of Data school year 2000-01 and school districts.

[End of table]

Figure 7 shows average reading scores for the privately managed school 
in Cleveland and its set of comparable schools. The average scores were 
significantly lower for students attending the privately managed school 
in both reading and math for the school years examined after 
controlling for other factors. Figure 7 also shows the magnitude of 
the difference in reading scores: the score of the average fourth 
grade student in the privately managed school falls at about the 20th 
percentile when compared with student scores in the comparison 
traditional schools. 
Similarly, the difference in math scores implies that the average 
student in the privately managed school would score at about the 20th 
percentile in the traditional comparison schools.

Figure 7: Fourth Grade Reading Scores for the Privately Managed School 
and Comparison Schools in Cleveland on the Ohio Proficiency Test:

[See PDF for image]

Note: Percentiles are derived from analyses that control for 
differences in student characteristics.

[End of figure]

In St. Paul, we studied 2 privately managed schools (labeled school A 
and school B in figure 8) and used a different set of comparison 
traditional schools for each privately managed school. The average 
scores in both reading and math were significantly lower for students 
at both privately managed schools studied compared with similar 
traditional schools.

Figure 8: Fifth Grade Reading and Math Scores for the Privately Managed 
Schools and Comparison Schools in St. Paul on the Minnesota 
Comprehensive Assessment Program:

[See PDF for image]

[End of figure]

The differences for the first privately managed school suggest that an 
average student at that school would score at about the 30th percentile 
in reading and the 20th percentile in math if attending the comparison 
traditional schools. The differences in scores at the second privately 
managed school imply that the score of an average student would be at 
about the 30th percentile in the comparison traditional schools in both 
reading and math.

Scores on State Tests in Privately Managed Schools Varied in Detroit 
and Were Similar to Traditional Schools in Phoenix:

Average scores for fourth grade students in Detroit varied, but tended 
to be lower in both reading and math for students attending privately 
managed schools than for students attending similar traditional 
schools.[Footnote 26] As in other locations, student populations in 
schools we studied in Detroit tended to be minority and low income. 
(See app. III for other school characteristics.) Except for race/
ethnicity, we did not use individual student demographic data in the 
Detroit test score analyses because the demographic data we received on 
individual students did not appear to be accurate. Despite these data 
limitations, we believe the analyses provide useful information, given 
the degree of similarity among the matched schools.

As shown in figure 9, reading scores were significantly lower for 
students in six of the privately managed schools compared with students 
in similar traditional schools in Detroit. The size of these 
differences generally suggested that an average student attending the 
privately managed schools would score at about the 30th percentile in 
the similar traditional schools. In one comparison (labeled C in fig. 
9), reading scores were significantly higher in the privately managed 
school compared with similar traditional schools. Students at this 
privately managed school would likely perform at about the 70th 
percentile in the traditional schools. For one other privately managed 
school (comparison B), differences in scores were not significantly 
different.

Figure 9: Fourth Grade Reading Scores for Privately Managed and 
Comparison Schools in Detroit on the Michigan Education Assessment 
Program:

[See PDF for image]

[A] Represents a statistically significant difference at the 95-percent 
confidence level.

[B] Not statistically significant at the 95-percent confidence level 
but approaches significance (p<.06).

Note: There are two parts to this reading exam: a story section and an 
information section. The reported reading scores are an average of the 
two sections and are only for the 2002 school year.

[End of figure]

Math scores followed a similar pattern, with student scores 
significantly lower at 7 of the 8 privately managed schools when 
compared with similar traditional schools. Scores for average students 
in the privately managed school would range from about the 15th 
percentile to about the 35th percentile in the traditional schools, 
depending on the particular set of schools compared. At the one higher-
performing privately managed school (comparison B in fig. 10), an 
average student would score at about the 70th percentile in similar 
traditional schools.

Figure 10: Fourth Grade Math Scores for Privately Managed 
and Comparison Schools in Detroit on the Michigan Education Assessment 
Program:

[See PDF for image]

[A] Represents a statistically significant difference at the 95-percent 
confidence level.

[End of figure]

In Phoenix, scores of fifth grade students at the privately managed 
school did not differ significantly from scores at similar traditional 
schools. As in the other locations studied, both the privately managed 
and similar traditional schools had high percentages of minority and 
low-income students. Table 4 shows the characteristics of the schools 
in our study in Phoenix. For test score analyses, we were able to 
obtain reliable data for minority status for individual students. 
Additionally, we obtained reliable data on student mobility, and these 
were included in our analysis. Data on special education and limited 
English proficiency for individual students were not believed to be 
accurate and were not included. Individual student data on free and 
reduced-lunch eligibility were not available.

Table 4: School Characteristics of the Privately Managed School and 
Comparison Schools in Phoenix:

City: Phoenix; Privately managed/traditional: Privately managed; 
Enrollment: 1,066; Percent free and reduced lunch: 96; Percent special 
education: 25; Percent LEP: 50; Percent minority: 88.

City: Phoenix; Privately managed/traditional: Traditional; 
Enrollment: 913; Percent free and reduced lunch: 81; Percent special 
education: 19; Percent LEP: 42; Percent minority: 85.

City: Phoenix; Privately managed/traditional: Traditional; 
Enrollment: 682; Percent free and reduced lunch: 97; Percent special 
education: 15; Percent LEP: 48; Percent minority: 95.

City: Phoenix; Privately managed/traditional: Traditional; 
Enrollment: 544; Percent free and reduced lunch: 92; Percent special 
education: 20; Percent LEP: 39; Percent minority: 99.

City: Phoenix; Privately managed/traditional: Traditional; 
Enrollment: 1,138; Percent free and reduced lunch: 97; Percent special 
education: 9; Percent LEP: 49; Percent minority: 95.

Source: Common Core of Data school year 2000-01 and state education 
department.

[End of table]

Figure 11 shows average student scores for reading and math in the 
privately managed school and in the comparison traditional schools for 
Phoenix. Scores were not significantly different in either reading or 
math. We also analyzed changes in reading and math scores between third 
and fifth grade for those students who had tested in the same school in 
both years. Again, we found no significant difference between students 
attending the privately managed school and those attending traditional 
schools.

Figure 11: Fifth Grade Reading and Math Scores for the Privately 
Managed School and Comparison Schools in Phoenix:

[See PDF for image]

[End of figure]

Concluding Observations:

As opportunities increase for parents to exercise choice in the public 
education arena, information on school performance, such as that found 
in school report cards produced by many states, becomes more important. 
Such information can be useful to parents in making school choices by 
providing a variety of information about schools, including how they 
are performing in terms of students meeting state achievement standards 
or relative to statewide averages.

However, educators and policymakers often want to know not only how 
well schools are performing but also the factors that contribute to 
their high or low performance so that successful strategies can be 
emulated. Answering this kind of evaluative question requires a 
different kind of methodology and more complex analyses to isolate the 
effects of the particular strategies of interest--educational 
practices, management techniques, and so on--from the many other 
factors that could affect student achievement. Although not a 
comprehensive impact evaluation, our study investigates the effect of 
school management by comparing traditional and privately managed 
schools and by controlling for differences in the characteristics of 
students attending the schools. In this way, our study provides a 
different type of information than that typically found in school 
report cards.

While our study explores the role of school management, it has certain 
important limitations, as discussed earlier and in appendix I. Among 
these are data issues commonly encountered by educational researchers, 
for instance, lack of test score data for successive years and 
unreliable demographic data for individual students in some sites. 
However, with the implementation of NCLBA, more rigorous studies should 
be possible, as annual testing of all grades is phased in and with 
expected improvements in the quality of demographic data resulting from 
requirements to report progress for various subpopulations of students, 
based on such characteristics as race and low-income status.

Finally, our mixed results may be evidence of the complexity of the 
factor under study. Our study analyzed differences between 2 categories 
of schools, grouped by whether they were traditional, district-managed 
schools or managed by a private company. However, these schools may 
have differed in other ways not included in our study--for example 
curricula, staff composition and qualifications, and funding levels--
and these factors may also have affected student achievement. Any of 
these factors or combination of factors could account for the 
differences we found or may have masked the effects of differences we 
otherwise would have found.

Agency Comments:

We provided a draft of this report to the Department of Education for 
review and comment. Education's Executive Secretariat confirmed that 
department officials had reviewed the draft and had no comments.

We are sending a copy of this report to the Secretary of Education, 
relevant congressional committees, appropriate parties associated with 
schools in the study, and other interested parties. We will make copies 
available to others upon request. In addition, the report will be 
available at no charge on GAO's Web site at http://www.gao.gov.

If you or your staff have any questions about this report, please call 
me at (202) 512-7215. See appendix IV for other staff acknowledgments.

Sincerely yours,

Marnie S. Shaul: 

Director: 

Education, Workforce, and Income Security Issues:

Signed by Marnie S. Shaul: 

[End of section]

Appendix I: Scope and Methodology:

To compare achievement of public elementary schools in large cities 
operated by private management companies with similar traditional 
public schools, we analyzed individual student scores on state 
assessments in reading and mathematics. We matched each privately 
managed public school with 2 to 4 traditional public schools located in 
the same city that were similar in terms of size, grade span, and 
student characteristics. To confirm the reasonableness of the matches, 
we spoke with principals in all of the privately managed schools in our 
study and visited most of the schools. We also spoke with principals 
and visited many of the traditional schools selected. For selected 
grade levels, we compared the individual student scores of students 
attending the privately managed schools with those of students in the 
similar traditional public schools. We also compared changes in 
individual student performance over time where such data were 
available. This appendix describes the scope and school selection, 
outcome measures and analytic methods, and the limitations of the 
analysis.

Scope and School Selection:

Using available public information,[Footnote 27] we attempted to 
identify all privately managed public elementary schools in large urban 
areas that had been in continuous operation by the same management 
company since the 1998-99 school year.[Footnote 28] We defined a large 
urban area for this study as a central city with a population of at 
least 400,000 in a standard metropolitan statistical area with a 
population of at least 2,000,000. We identified 17 public elementary 
schools managed by private companies meeting these criteria.[Footnote 
29] The 17 schools were located in Cleveland, Ohio; Denver, Colorado; 
Detroit, Michigan; Phoenix, Arizona; St. Paul, Minnesota; and San 
Francisco, California.

We matched each of these privately managed schools with 2-4 similar 
traditional public schools in the district where the privately managed 
school was located.[Footnote 30] To select similar traditional public 
schools, we employed a "total deviation" score procedure. For each 
public elementary school in the defined public school district and the 
privately managed school, we determined the following school 
characteristics: (1) racial and ethnic percentages,[Footnote 31] (2) 
percent special education, (3) percent eligible for free and reduced 
lunch, (4) percent limited-English proficient,[Footnote 32] and (5) 
student enrollment. We calculated z-scores (the statistic that 
indicates how far and in what direction the value deviates from its 
distribution's mean, expressed in units of its distribution's standard 
deviation) for each characteristic, and then calculated the absolute 
value of the difference between the z-score of the privately managed 
school and the z-score of each traditional public school on that 
characteristic. For each school, we summed the absolute difference in 
z-scores into a total deviation score. A candidate school's total 
deviation score represents the sum of its z-score differences from the 
privately managed public school across all characteristics.

Traditional public schools were considered a close match if the total 
deviation score divided by the number of characteristics for which we 
computed z-scores was less than or equal to 1.0. An average score less 
than or equal to 1.0 indicates that the traditional school did not deviate from
the privately managed school by more than 1 standard deviation when 
averaging across all variables considered in the match. For example, if 
8 variables were used to calculate the total deviation score and the 
total deviation score was 7.8, the amount that the candidate school 
deviated from the privately managed school would be, on average, less 
than 1 standard deviation. All comparison schools selected for our 
analyses met this criterion for a close match.
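
The matching computation described above can be sketched in Python as 
follows; the schools, characteristic values, and variable names are 
hypothetical illustrations, not data from the study.

```python
from statistics import mean, stdev

# Hypothetical characteristic values for a privately managed school and
# two candidate traditional schools (percent minority, percent special
# education, percent free/reduced-price lunch, percent limited-English
# proficient, enrollment) -- illustrative only, not data from the study.
schools = {
    "private": [95, 5, 70, 10, 500],
    "trad_A":  [90, 6, 75, 12, 450],
    "trad_B":  [40, 3, 30,  2, 900],
}

# Mean and standard deviation of each characteristic across the district.
columns = list(zip(*schools.values()))
means = [mean(col) for col in columns]
sds = [stdev(col) for col in columns]

def z(values):
    # Express each characteristic in standard deviations from the district mean.
    return [(v - m) / s for v, m, s in zip(values, means, sds)]

target = z(schools["private"])
n_chars = len(target)

# Total deviation score: sum of absolute z-score differences between each
# candidate and the privately managed school.
total_dev = {
    name: sum(abs(a - b) for a, b in zip(target, z(vals)))
    for name, vals in schools.items()
    if name != "private"
}

# Close match: average deviation per characteristic of 1.0 or less.
close_match = {name: dev / n_chars <= 1.0 for name, dev in total_dev.items()}
```

In this sketch, "trad_A" averages well under 1 standard deviation of 
difference per characteristic and would qualify as a close match, while 
"trad_B" would not.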

After mathematically selecting close matches, we consulted with public 
school district officials about the schools selected.[Footnote 33] 
These considerations led to adjustments to our final selection of 
matches as follows. In St. Paul, traditional public schools closely 
matching the privately managed schools included magnet schools and 
neighborhood (that is, attendance-zone) schools. The two "best" 
matching magnet schools and the two "best" matching neighborhood 
schools were
selected as matches for the analysis. Similarly in Cleveland, 
traditional public schools closely matching the privately managed 
schools included former magnet schools and traditional neighborhood 
schools. For balance in matching, the two "best" matching former magnet 
schools and two "best" matching neighborhood schools were selected as 
matches for the analysis. In Denver, the five closest matching schools 
were all located in a distinct neighborhood, geographically distant 
from the privately managed school. In consultation with local school 
district personnel, the two "best" matching schools from this area and 
the two "best" matching schools from outside this area were selected 
for the analysis. In San Francisco, one of the three traditional school 
matches was discarded because it had a special teacher training 
program, resulting in only two matches with the privately managed 
school. In Detroit, the best three matching traditional schools were 
selected except in one instance where one of the matching schools was 
discarded because a subsequent site visit determined that the school 
had selection criteria for attendance based upon prior achievement. In 
Phoenix, 21 elementary school districts were located in the city, and 
13 of these districts made up the Phoenix Unified High School 
District. Because the privately managed schools were located within 
that high school district, we took the "best" matching school from 
each of its 13 elementary school districts as a pool and selected from 
that pool the best four matches, each from a different school 
district.

Two privately managed schools in Phoenix and one privately managed 
school in Cleveland were dropped from the analysis because no matching 
traditional schools were found using our methodology. This resulted in 
a total of 14 privately managed schools included in the study, 8 of 
which were located in Detroit. Schools selected were managed by Designs 
for Learning, Inc.; Edison Schools; The Leona Group; Mosaica Education, 
Inc.; Schoolhouse; and White Hat Management.

Measures and Analytic Methods:

We used student reading and math scale scores on routinely administered 
state assessments as measures of academic achievement. At the time of 
our study, the most recent data available were for school year 2001-
2002. Test scores and student characteristic data were obtained from 
either the school district or state education agency. We used a variety 
of approaches to verify the accuracy of these data. In most cases, we 
verified data by comparing a sample of the data received against school 
records examined at the school site. In Detroit, data verification 
indicated that the student low-income, special education, and mobility 
data provided by the state were unreliable, and we decided not to use 
these data in our final analyses. In Phoenix, data verification 
indicated that the student limited-English proficiency and special 
education data provided by the state for the privately managed school 
were unreliable, and diagnostic analysis confirmed this. Therefore, we 
were unable to include these control variables in our final analyses.

For each privately managed school and its set of matched, comparison 
schools, we selected the highest elementary grade for which test scores 
were available. We collected test score information for 2 school years, 
2000-01 and 2001-02, except in Detroit where only 2001-02 scores were 
used due to difficulties obtaining data and changes in the test given. 
For each site, we compared reading and math student scores in the 
privately managed school(s) with the scores of same-grade students in 
the set of matched, comparison schools. The scores for the 2000-01 and 
2001-02 school years were combined in the analysis.[Footnote 34] In 
addition, in the three locations where testing occurred more 
frequently (Denver, Phoenix, and San Francisco), we obtained third-
grade scores for students who had taken the state assessment in the 
same school and examined the difference in scores over time.

For each site, we conducted multivariate ordinary least squares (OLS) 
regression analysis to quantify differences in student achievement 
while controlling for school type and student characteristics. Specific 
independent variables included in the regression model were as follows:

* School type, with the traditional public school being given a value 
of 1 and the privately managed school a value of 0.

* Mobility, with a value of 1 given to students who had not attended 
the same school at which they took the state assessment for 2 years.

* Limited English proficiency (LEP), with a value of 1 given if the 
child was designated as limited-English proficient.[Footnote 35]

* Special education, with a value of 1 given if the student was 
enrolled in special education.[Footnote 36]

* Low-income, with a value of 1 indicating the student was eligible for 
free or reduced lunch.[Footnote 37]

* Race and ethnicity, with a value of 1 given for the child's 
appropriate minority racial/ethnic identity. Each child was placed in 
only one racial category, and the number of racial categories used 
varied from place to place. When numbers for a particular racial group 
in a city were small, they were combined collectively as "other 
minority." (Specific racial and ethnic identities employed in each city 
are set out in the results in app. II.):

Student achievement in reading and mathematics was analyzed separately 
for each privately managed public school with its set of matched
schools. The regression formula was:

score(i) = b0 + b1(school type)(i) + b2(mobility)(i) + b3(LEP)(i) + 
b4(special education)(i) + b5(low-income)(i) + b6(race/ethnicity 1)(i) 
+ ... + bk(race/ethnicity k)(i) + e(i)

where, (1) i is the individual student, (2) low-income is determined by 
eligibility for free and/or reduced lunch, and (3) race and ethnicity 
are distinct codes dependent upon the geographical area.
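
As an illustration of this kind of dummy-variable OLS model, the 
sketch below fits simulated data with ordinary least squares; the 
sample size, coefficient values, and the subset of independent 
variables included are assumptions for illustration, not the study's 
data or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400  # hypothetical number of students at one site

# Dummy-coded independent variables, following the coding described above.
school_type = rng.integers(0, 2, n)  # 1 = traditional, 0 = privately managed
mobility = rng.integers(0, 2, n)     # 1 = student considered mobile
low_income = rng.integers(0, 2, n)   # 1 = eligible for free or reduced lunch

# Simulated scale scores with an assumed +5-point school-type effect.
score = (600 + 5 * school_type - 8 * mobility - 6 * low_income
         + rng.normal(0, 10, n))

# Design matrix: intercept column plus the dummy variables.
X = np.column_stack([np.ones(n), school_type, mobility, low_income])
coefs, *_ = np.linalg.lstsq(X, score, rcond=None)

# coefs[1] estimates the score difference associated with attending a
# traditional school, controlling for the other student characteristics.
```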

We also performed analyses on different groupings of the comparison 
schools in Denver, Cleveland, and St. Paul. In Denver, 2 of our matched 
schools were in a distinct neighborhood that school district personnel 
believed might be atypical; in Cleveland and St. Paul several of the 
matched schools were magnet or former magnet schools. We re-analyzed 
the data in each of these cities using these groupings as factors. The 
overall results were unchanged, with the exception that in Denver, 
reading scores were not significantly different when the privately 
managed school was compared with the 2 schools not in the distinct 
neighborhood.

In Denver, San Francisco, and Phoenix, for the students in the grades 
we analyzed, we also obtained scores from 2 years earlier for students 
who had taken the test in the same school. For this analysis, the
regression formula used the difference between reading scores in the 
highest elementary grade and that of 2 years earlier as the dependent 
variable. The independent variables were similar to those employed in 
the cross sectional analysis with the exception that the reading/
mathematics score for the period 2 years earlier was also included as 
an independent variable. The regression formula was:

[score(i) - prior score(i)] = b0 + b1(school type)(i) + 
b2(mobility)(i) + b3(LEP)(i) + b4(special education)(i) + 
b5(low-income)(i) + b6(race/ethnicity 1)(i) + ... + bk(race/ethnicity 
k)(i) + bp(prior score)(i) + e(i)

In conducting these analyses, we performed certain diagnostic and 
analytic tests to confirm both the appropriateness of aggregating 
categories in our analyses and the reasonableness of assumptions 
pertaining to normality and homogeneity of variance. In addition, we 
determined the extent of missing data and performed sensitivity 
analyses to assess the effect on our results. We determined that 
missing case level data had a negligible effect on our results.

To illustrate the magnitude of differences found, we computed effect 
sizes based on standardized mean differences. Using the OLS regression 
results, we divided the unstandardized coefficient associated with 
school type by the pooled standard deviation to obtain z-scores for 
average students in the privately managed and traditional schools. The 
reported percentile was the area of the normal curve associated with 
the z-scores.
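
The effect-size computation can be illustrated with hypothetical 
numbers; the coefficient and pooled standard deviation below are 
illustrative assumptions, not results from the study.

```python
from math import erf, sqrt

def normal_cdf(z):
    # Area under the standard normal curve to the left of z.
    return 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical inputs, not results from the study.
school_type_coef = 7.5  # unstandardized OLS coefficient for school type
pooled_sd = 25.0        # pooled standard deviation of the scale scores

# Standardized mean difference (effect size), as described above.
effect_size = school_type_coef / pooled_sd

# Percentile: the area of the normal curve associated with this z-score.
percentile = 100 * normal_cdf(effect_size)
```

Under these assumed values, a 7.5-point coefficient corresponds to an 
effect size of 0.3 standard deviations, placing an average student at 
roughly the 62nd percentile of the other group's distribution.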

Tables 5-12 in appendix II list the regression results and independent 
variables included in our analyses. The size and significance of the 
differences we report were derived from OLS regression models. We 
obtained results that were almost identical to the OLS results when we 
used robust estimation procedures to calculate the standard errors 
associated with the estimated differences. We also considered robust 
regression models that allowed for the clustering, and lack of 
independence, of students within schools. These models yielded somewhat 
fewer differences that were statistically significant at the 95-percent 
confidence level. We do not focus our reporting on the results of the 
models that account for clustering, however, because the statistical 
properties and validity of such models are questionable when applied 
to data with a very small number of clusters (in this case, 3 to 5 
schools).[Footnote 38] Changes to significance levels of the school 
type coefficients due to robust standard errors and robust standard 
errors with clustering are nevertheless noted in appendix II.

Limitations of the Analysis:

The findings in this study are subject to typical limitations found in 
quasi-experimental designs. We examined the highest elementary grades 
tested for school years 2000-01 and 2001-02, and student achievement in 
these grades and years may not be indicative of student achievement in 
other grades and years in those schools. In addition, our matching 
process may not have produced equivalent groups for comparison. We 
mitigated this potential problem by using individual student 
characteristics in our analyses. However, reliable and complete student 
demographic data were not available in all sites, which resulted in the 
elimination of important factors from the model in several sites. In 
addition, other factors such as student ability, prior achievement, 
operating environment, reasons students enrolled in privately managed 
schools, and parental involvement, may be related to student 
achievement and are not accounted for in the study. Finally, our 
examination of student performance over time, that is, changes in 
achievement between grades, also has some limitations. First, the data 
allowed a study of achievement over time in only 3 of the 6 sites. In 
addition, the analyses included only students who continuously attended 
the school over the time period studied, and this in some cases 
eliminated more than half of the subjects from the analyses. We were 
unable to determine whether those students who remained in the school 
for this period were different in some important way from those who 
left.

[End of section]

Appendix II: Tables of Regression Results for Differences in Student 
Achievement Scores on State Assessments:

Tables 5-12 in this appendix show the variables used in the OLS 
regression models and the results of those analyses. The results are 
presented separately by city and for each privately managed school and 
its particular set of matching traditional schools, with reading and 
math presented within the same table in all cases, except Detroit. The 
number of observations, shown as N, is the total of the observations in 
the privately managed school and its set of comparison schools used in 
each regression analysis.

We also ran similar regression analyses using robust estimation 
procedures with and without clustering, as discussed in appendix I. In 
most cases, effects of school type remained significant at the 95-
percent confidence level. Exceptions are indicated by table notes.

Table 5: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Denver:

[See PDF for image]

Source: GAO data analysis.

[A] Using robust standard error procedures with clustering, the effect 
of school type approaches but does not reach significance at the 95-
percent confidence level. (p = 0.06 for reading; p = 0.09 for math.):

[End of table]

Table 6: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
San Francisco:

[See PDF for image]

Source: GAO data analysis.

[A] Using robust procedures with clustering, the effect of school type 
is no longer significant at the 95-percent confidence level.

[End of table]

Table 7: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Cleveland:

[See PDF for image]

Source: GAO data analysis.

[End of table]

Table 8: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed School and Comparison 
Schools in St. Paul (School A Comparison):

[See PDF for image]

Source: GAO data analysis.

Note: Special education data were available for only one school year 
and so were not included in the final analyses. Diagnostic analyses 
were run for the one year that special education data were available to 
test for the effects of including special education in the model. When 
special education was included, school type remained significant at the 
95-percent confidence level.

[End of table]

Table 9: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed School and Comparison 
Schools in St. Paul (School B Comparison):

[See PDF for image]

Source: GAO data analysis.

Note: Special education data were available for only one school year 
and so were not included in the final analyses. Diagnostic analyses 
were run for the one year that special education data were available to 
test for the effects of including special education in the model. When 
special education was included, school type remained significant at the 
95-percent confidence level.

[End of table]

Table 10: Regression Results for Differences in Student Performance on 
State Assessments at the Privately Managed and Comparison Schools in 
Phoenix:

[See PDF for image]

Source: GAO data analysis.

Note: Special education and limited English proficiency were removed as 
independent variables because the data received were considered 
unreliable.

[End of table]

Table 11: Regression Results for Differences in Student Performance on 
State Reading Assessment at the Privately Managed and Comparison 
Schools in Detroit:

[See PDF for image]

Source: GAO data analysis.

Note: Where results do not include race or ethnic variables, all 
students at the privately managed school and comparable schools used in 
the regression analysis were African American.

[A] Using robust standard error procedures with clustering, the effect 
of school type is not significant at the 95-percent confidence level.

[B] Using robust estimation procedures without clustering, the effect 
of school type is significant at the 95-percent confidence level.

[End of table]

Table 12: Regression Results for Differences in Student Performance on 
State Math Assessment at the Privately Managed and Comparison Schools 
in Detroit:

[See PDF for image]

Source: GAO data analysis.

Note: Where results do not include race or ethnic variables, all 
students at the privately managed school and comparable schools used in 
the regression analysis were African American.

[A] Using robust standard error procedures with clustering, the effect 
of school type is not significant at the 95-percent confidence level.

[End of table]

[End of section]

Appendix III: Characteristics of Privately Managed Schools and 
Comparable Traditional Public Schools in Detroit:

Privately Managed/traditional: Private - A; Enrollment: 867; Percent 
free and reduced: 68; Percent special ed: 3; Percent minority: 100.

Privately Managed/traditional: Traditional - A; Enrollment: 693; 
Percent free and reduced: 81; Percent special ed: 4; Percent minority: 
100.

Privately Managed/traditional: Traditional - B; Enrollment: 538; 
Percent free and reduced: 58; Percent special ed: 3; Percent minority: 
100.

Privately Managed/traditional: Traditional - C; Enrollment: 594; 
Percent free and reduced: 78; Percent special ed: 5; Percent minority: 
99.

Privately Managed/traditional: Private - B; Enrollment: 354; Percent 
free and reduced: 79; Percent special ed: 11; Percent minority: 99.

Privately Managed/traditional: Traditional - A; Enrollment: 594; 
Percent free and reduced: 78; Percent special ed: 5; Percent minority: 
99.

Privately Managed/traditional: Traditional - B; Enrollment: 158; 
Percent free and reduced: 79; Percent special ed: 7; Percent minority: 
100.

Privately Managed/traditional: Traditional - C; Enrollment: 389; 
Percent free and reduced: 74; Percent special ed: 3; Percent minority: 
98.

Privately Managed/traditional: Private - C; Enrollment: 322; Percent 
free and reduced: 39; Percent special ed: 8; Percent minority: 99.

Privately Managed/traditional: Traditional - A; Enrollment: 485; 
Percent free and reduced: 43; Percent special ed: 12; Percent minority: 
100.

Privately Managed/traditional: Traditional - B; Enrollment: 434; 
Percent free and reduced: 47; Percent special ed: 4; Percent minority: 
100.

Privately Managed/traditional: Traditional - C; Enrollment: 446; 
Percent free and reduced: 65; Percent special ed: 5; Percent minority: 
95.

Privately Managed/traditional: Private - D; Enrollment: 1108; Percent 
free and reduced: 46; Percent special ed: 3; Percent minority: 100.

Privately Managed/traditional: Traditional - A; Enrollment: 538; 
Percent free and reduced: 58; Percent special ed: 3; Percent minority: 
100.

Privately Managed/traditional: Traditional - B; Enrollment: 369; 
Percent free and reduced: 47; Percent special ed: 4; Percent minority: 
99.

Privately Managed/traditional: Traditional - C; Enrollment: 677; 
Percent free and reduced: 53; Percent special ed: 2; Percent minority: 
99.

Privately Managed/traditional: Private - E; Enrollment: 368; Percent 
free and reduced: 70; Percent special ed: 9; Percent minority: 100.

Privately Managed/traditional: Traditional - A; Enrollment: 389; 
Percent free and reduced: 74; Percent special ed: 3; Percent minority: 
98.

Privately Managed/traditional: Traditional - B; Enrollment: 487; 
Percent free and reduced: 67; Percent special ed: 5; Percent minority: 
100.

Privately Managed/traditional: Traditional - C; Enrollment: 524; 
Percent free and reduced: 62; Percent special ed: 5; Percent minority: 
100.

Privately Managed/traditional: Private - F; Enrollment: 319; Percent 
free and reduced: 75; Percent special ed: 7; Percent minority: 95.

Privately Managed/traditional: Traditional - A; Enrollment: 214; 
Percent free and reduced: 68; Percent special ed: 7; Percent minority: 
89.

Privately Managed/traditional: Traditional - B; Enrollment: 389; 
Percent free and reduced: 74; Percent special ed: 3; Percent minority: 
98.

Privately Managed/traditional: Traditional - C; Enrollment: 451; 
Percent free and reduced: 80; Percent special ed: 0; Percent minority: 
98.

Privately Managed/traditional: Private - G; Enrollment: 716; Percent 
free and reduced: 37; Percent special ed: 3; Percent minority: 100.

Privately Managed/traditional: Traditional - A; Enrollment: 538; 
Percent free and reduced: 58; Percent special ed: 3; Percent minority: 
100.

Privately Managed/traditional: Traditional - B; Enrollment: 677; 
Percent free and reduced: 53; Percent special ed: 2; Percent minority: 
100.

Privately Managed/traditional: Traditional - C; Enrollment: 369; 
Percent free and reduced: 47; Percent special ed: 4; Percent minority: 
100.

Privately Managed/traditional: Private - H; Enrollment: 452; Percent 
free and reduced: 46; Percent special ed: 10; Percent minority: 79.

Privately Managed/traditional: Traditional - A; Enrollment: 561; 
Percent free and reduced: 73; Percent special ed: 0; Percent minority: 
76.

Privately Managed/traditional: Traditional - B; Enrollment: 705; 
Percent free and reduced: 65; Percent special ed: 2; Percent minority: 
72.

Privately Managed/traditional: Traditional - C; Enrollment: 586; 
Percent free and reduced: 84; Percent special ed: 2; Percent minority: 
76.

Sources: GAO data analysis from Common Core of Data school year 2000-01 
unless otherwise noted. Special education data were from school Web 
sites. Limited English proficiency data were not available.

[End of table]

[End of section]

Appendix IV: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Deborah Edwards (202) 512-5416:

Patricia Elston (202) 512-3016:

Acknowledgments:

In addition to those named above, Peter Minarik, Mark Braza, Douglas M. 
Sloane, and Shana Wallace made key contributions to this report. Deidre 
M. McGinty and Randolph D. Quezada also provided important support.

[End of section]

Related GAO Products:

Title I: Characteristics of Tests Will Influence Expenses; Information 
Sharing May Help States Realize Efficiencies. GAO-03-389. Washington, 
D.C.: May 8, 2003.

Public Schools: Insufficient Research to Determine Effectiveness of 
Selected Private Education Companies. GAO-03-11. Washington, D.C.: 
October 29, 2002.

School Vouchers: Characteristics of Privately Funded Programs. GAO-02-
752. Washington, D.C.: September 10, 2002.

Title I: Education Needs to Monitor States' Scoring of Assessments. 
GAO-02-393. Washington, D.C.: April 1, 2002.

School Vouchers: Publicly Funded Programs in Cleveland and Milwaukee. 
GAO-01-914. Washington, D.C.: August 31, 2001.

Charter Schools: Limited Access to Facility Financing. GAO/HEHS-00-163. 
Washington, D.C.: September 12, 2000.

Charter Schools: Federal Funding Available but Barriers Exist. HEHS-98-
84. Washington, D.C.: April 30, 1998.

Charter Schools: Recent Experiences in Accessing Federal Funds. T-HEHS-
98-129. Washington, D.C.: March 31, 1998.

Charter Schools: Issues Affecting Access to Federal Funds. GAO/T-HEHS-
97-216. Washington, D.C.: September 16, 1997.

Private Management of Public Schools: Early Experiences in Four School 
Districts. GAO/HEHS-96-3. Washington, D.C.: April 19, 1996.

FOOTNOTES

[1] Public Law 107-110, Jan. 8, 2002.

[2] Arizona State University researchers at the Education Policy 
Studies Laboratory compile annual data on the number of companies and 
their schools by school type, grade level, size of enrollment, year 
opened, and location. See Alex Molnar, Glen Wilson, and Daniel Allen, 
Profiles of For-Profit Education Management Companies 2002-2003, 
(Tempe: Arizona State University, Jan. 2003).

[3] See U.S. General Accounting Office, Private Management of Public 
Schools: Early Experiences in Four School Districts, GAO/HEHS-96-3 
(Washington, D.C.: Apr. 19, 1996).

[4] This requirement takes effect as long as specified amounts of 
federal funding are provided for test administration. For more on this 
subject, see U.S. General Accounting Office, Title I: Characteristics 
of Tests Will Influence Expenses; Information Sharing May Help States 
Realize Efficiencies, GAO-03-389 (Washington, D.C.: May 8, 2003).

[5] See U.S. General Accounting Office, Public Schools: Insufficient 
Research to Determine Effectiveness of Selected Private Education 
Companies, GAO-03-11 (Washington, D.C.: Oct. 29, 2002).

[6] Arizona State University researchers list only schools operated by 
management companies that the researchers can positively identify as 
for-profits, but additional schools and companies may exist that the 
researchers cannot positively identify. The researchers count as a 
single school the grades in one or more buildings that are under the 
supervision of a single principal.

[7] Most of the schools managed by two of the other companies were 
charter schools, but less than one-third of the schools operated by 
Edison Schools and Victory Schools, Inc., were charter schools.

[8] See Jerry Horn and Gary Miron, An Evaluation of the Michigan 
Charter School Initiative: Performance, Accountability, and Impact, 
(Western Michigan University: July 2000). 

[9] Individual school reports are also available from GreatSchools.net 
and from Standard & Poor's for a limited number of schools.

[10] NCLBA requires that report cards issued by states and districts 
include this information, but scores for very small subgroups may be 
withheld to protect the privacy of individual students whose scores 
might otherwise be inferred.

[11] California compares each individual school's rating with the 
ratings for a set of 100 other schools matched on certain demographic 
and other characteristics. The comparison schools selected by the state 
are not required to be within the same geographic area, so that, for 
example, a school in San Francisco might be matched with a school in 
San Diego. Colorado compares each individual school's rating with those 
of other schools in the neighborhood that are selected for their 
geographic proximity rather than specially matched for demographic and 
other characteristics.

[12] See R. William Cash, Mosaica Education Annual Report: Testing 
Results 1998-2002 (WestEd: Nov. 2002).

[13] See Gary Wolfram, PhD, Making the (Better) Grade: A Detailed 
Statistical Analysis of the Effect of National Heritage Academies on 
Student MEAP Scores, undated, www.heritageacademies.com/hillsdale.pdf, 
(downloaded June 30, 2003). Because enrollment dates were not 
available, application dates were used as a proxy for enrollment. 
Furthermore, because raw scores were not available, the analysis was 
based on the proficiency levels attained, ranging from 2 possible 
levels on the writing tests to 4 possible levels on the social studies 
tests. Other than gender, demographic data also were not available.

[14] See Fifth Annual Report on School Performance: 2001-2002 (Edison: 
Feb. 2003).

[15] The report explains that 18 schools were excluded due to lack of 
data for two points in time. For the remaining 94 schools, trends were 
calculated from various beginning dates through 2001-02. The beginning 
dates varied by school, depending on when Edison assumed management, 
and ranged from school year 1995-1996 to school year 2000-01. 

[16] For the comparison, all traditional schools in a district were 
considered similar and included if their enrollment was within 10 
percentage points of the Edison school on both student characteristics. 
If no traditional schools were that close, then they were considered 
similar and included if their enrollment was within 10 percentage 
points on one characteristic and 30 percentage points on the other 
characteristic.

[17] An Edison official told GAO that the company did not have access 
to individual data on students at traditional public schools used for 
the comparison, so it was not able to conduct such an analysis.

[18] The word significant is used in this section to refer to 
statistical significance. Differences discussed are significant at the 
95-percent confidence level using ordinary least squares regression 
models. Due to concerns about certain assumptions inherent in these 
models, we also ran models using robust estimation procedures to 
calculate standard errors. For all models, the robust procedures 
yielded almost identical results to those of the ordinary least 
squares. See appendix I for further details. 

[19] For brevity, we show percent minority in this and similar tables. 
However, our matching process actually used various categories of race/
ethnicity, depending on the data available for the site, rather than a 
single minority category. See appendix II for the exact categories 
used.

[20] In these analyses, a student is considered mobile if he or she did 
not attend the same school in the prior year.

[21] See appendix I for a further discussion of this effect size 
illustration and additional analyses comparing the privately managed 
school in Denver with different groupings of the comparison traditional 
schools.

[22] Third grade scores were available only for reading in Denver; in 
San Francisco both reading and mathematics were examined.

[23] See appendix I for a discussion of additional analyses comparing 
the privately managed school in Cleveland and St. Paul with different 
groupings of the comparison traditional schools.

[24] In Cleveland, no students in our study were designated as limited 
in English proficiency.

[25] The special education data we received on individual students in 
St. Paul were not complete and thus were not used in our analyses of 
individual test scores. 

[26] For Detroit schools, because of difficulties obtaining data and 
changes in the test, we analyzed reading and math test scores for 1 
school year--2001-02. 

[27] The most comprehensive source we found for this information was a 
report done by Arizona State University. We selectively verified data 
in this report with other sources, such as compilations done for the 
Center for Education Reform and the National Association of Charter 
School Authorizers. 

[28] If an elementary school managed by a private company also included 
middle or high school grades, the school was retained in the study if 
other selection criteria were met. 

[29] We identified schools in Washington, D.C., and Miami, Florida, 
that met our selection criteria. We did not include Miami in this study 
because we previously reported the results of a study of the privately 
managed school at this site. See U.S. General Accounting Office, Public 
Schools: Insufficient Research to Determine Effectiveness of Selected 
Private Education Companies, GAO-03-11 (Washington, D.C.: Oct. 29,
2002). We did not include Washington, D.C., because we were concerned 
about obtaining reliable data. 

[30] In Phoenix, the Phoenix Unified High School District was used as 
the district demarcation for drawing matching traditional public 
schools.

[31] The specific matching variables varied from city to city. If 
students in a given racial or ethnic group comprised less than 10 
percent of the student population in the privately managed school and 
if students in that racial or ethnic group comprised less than 10 
percent of the student population for the other schools in the 
district, excluding outliers, we excluded that racial or ethnic group 
as a specific matching variable. 

[32] We sought, but were not able to obtain for use in the matching 
process, data on percentage of students with limited English 
proficiency for schools in St. Paul and Detroit.

[33] Phoenix had multiple school districts, so we consulted with state 
officials.

[34] Diagnostic analysis determined that school year was not related to 
achievement scores at any site, except for reading scores in San 
Francisco.

[35] There are degrees of LEP; however, the data did not allow us to 
differentiate the degree of limitation.

[36] There are degrees of disability; however, the data did not allow 
us to differentiate the degree or type of disability.

[37] In cities where both free and reduced-lunch variables were 
provided, the analysis considered them separately.

[38] See Jeffrey M. Wooldridge, Econometric Analysis of Cross Section 
and Panel Data (Cambridge: MIT Press, 2002), p.135. 

GAO's Mission:

The General Accounting Office, the investigative arm of Congress, 
exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. General Accounting Office:

441 G Street NW, Room LM:

Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm:

E-mail: fraudnet@gao.gov:

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, Managing Director, NelliganJ@gao.gov, (202) 512-4800:

U.S. General Accounting Office, 441 G Street NW, Room 7149, 
Washington, D.C. 20548: