This is the accessible text file for GAO report number GAO-04-734 
entitled 'No Child Left Behind Act: Improvements Needed in Education's 
Process for Tracking States' Implementation of Key Provisions' which 
was released on September 30, 2004.

This text file was formatted by the U.S. Government Accountability 
Office (GAO) to be accessible to users with visual impairments, as part 
of a longer term project to improve GAO products' accessibility. Every 
attempt has been made to maintain the structural and data integrity of 
the original printed product. Accessibility features, such as text 
descriptions of tables, consecutively numbered footnotes placed at the 
end of the file, and the text of agency comment letters, are provided 
but may not exactly duplicate the presentation or format of the printed 
version. The portable document format (PDF) file is an exact electronic 
replica of the printed version. We welcome your feedback. Please E-mail 
your comments regarding the contents or accessibility features of this 
document to Webmaster@gao.gov.

This is a work of the U.S. government and is not subject to copyright 
protection in the United States. It may be reproduced and distributed 
in its entirety without further permission from GAO. Because this work 
may contain copyrighted images or other material, permission from the 
copyright holder may be necessary if you wish to reproduce this 
material separately.

Report to Congressional Committees:

United States Government Accountability Office:

GAO:

September 2004:

No Child Left Behind Act:

Improvements Needed in Education's Process for Tracking States' 
Implementation of Key Provisions:

GAO-04-734:

GAO Highlights:

Highlights of GAO-04-734, a report to congressional committees

Why GAO Did This Study:

The No Child Left Behind Act of 2001 (NCLBA) has focused national 
attention on improving the academic achievement of the nation's 48
million students by establishing a deadline—school year 2013-14—for 
public schools to ensure that all students are proficient in reading 
and math. Accordingly, states, the District of Columbia, and Puerto 
Rico developed plans that set goals for increasing the numbers of 
students who attain proficiency on state tests each year, with all 
meeting goals by 2014. To provide information about states’ efforts, 
GAO determined (1) what goals states established for student 
proficiency and their implications for whether schools will meet these 
goals; (2) what factors facilitated or impeded selected state and 
school district implementation efforts; and (3) how the Department of 
Education (Education) supported state efforts and approved state plans 
to meet student proficiency requirements.

What GAO Found:

States varied in how they established proficiency goals and measured 
student progress, which is permitted by NCLBA so that states can 
address their unique circumstances. For example, states differed in 
the annual rates of progress they expected schools to make in order to 
have all of their students academically proficient by 2014 and in 
methods used to determine whether schools had met state goals. This 
variation in state approaches could affect how many schools meet their 
annual goals over time. 

State and school district officials said that their leadership’s 
commitment to improving student achievement and technical assistance 
provided by an Education contractor facilitated implementation of 
NCLBA requirements. However, tight timelines for determining school
progress and problems with student data impeded implementation. 
Measuring achievement with faulty data can lead to inaccurate 
information on schools meeting proficiency goals. Education is working 
on efforts to help states improve their data systems, such as 
monitoring state data quality policies.

Education assisted states in developing their plans for improving 
student proficiency and, by June 10, 2003, had approved all plans, 
either fully (11) or conditionally (41). As of July 31, 2004, 
Education had fully approved 28 states' plans without conditions; 
plans from 23 states and the District of Columbia were approved with 
conditions that must be met to satisfy NCLBA requirements. To help states,
Education asked assessment experts to review all plans and provide 
states with on-site evaluations. Although Education officials said 
that they are continually monitoring states whose plans have 
conditions, the Department does not have a written process that 
delineates how and when each state will meet its conditions. In 
addition, by school year 2005-06, NCLBA requires states to
increase assessments. Education has developed guidance for its review 
and approval of states’ expanded standards and assessments. However, 
it has not established a written plan that clearly identifies the 
steps required, interim goals, review schedules, and timelines. 
Without such a plan, states may be challenged to meet NCLBA 
standards and assessment system requirements by the 2005-06 deadline.

Approval Status of State Plans as of July 31, 2004: 

[See PDF for image]

[End of figure]

What GAO Recommends:

We are recommending that the Secretary of Education delineate a 
written process and timeframes for states to meet conditions for full 
approval, develop a written plan with steps and timeframes so all 
states have approved standards and assessment systems by 2006, and 
further support states’ efforts to gather accurate student data used 
to determine if goals have been met. Education disagreed with the 
first recommendation and agreed with the others.

www.gao.gov/cgi-bin/getrpt?GAO-04-734.

To view the full product, including the scope and methodology, click on 
the link above. For more information, contact Marnie S. Shaul at (202) 
512-7215 or shaulm@gao.gov.

[End of section]

Contents:

Letter:

Results in Brief:

Background:

States Varied in the Goals Established for Student Progress, and That 
Variation May Have Implications for How Many Schools Meet State Goals 
over Time:

Leadership and Technical Assistance Facilitated Implementation 
Efforts, but Data Accuracy Problems and Tight Timelines Impeded 
Efforts:

Education Has Aided States in Developing Their Plans and Assessment 
Systems but Did Not Have Written Plans to Help States Meet NCLBA 
Provisions:

Conclusions:

Recommendations for Executive Action:

Agency Comments and Our Evaluation:

Appendix I: Methods to Establish Starting Points:

Appendix II: Percentage of Schools That Met State Goals in 2002-03:

Appendix III: State Plan Requirements:

Appendix IV: Comments from the Department of Education:

Appendix V: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Staff Acknowledgments:

Related GAO Products:

Tables:

Table 1: Primary Methods Education Used To Support State Planning and 
Implementation Efforts:

Table 2: Calculating a Starting Point Using the School at the 20th 
Percentile in Cumulative Enrollment:

Figures:

Figure 1: Example of States' Discretion to Develop Their Own Content 
Standards and Tests and to Determine What Constitutes Proficiency on 
Each of the Tests:

Figure 2: Percentage of Students in Each State Expected to Demonstrate 
Proficiency on the Reading Tests in the First Year:

Figure 3: Minimum Size of Student Groups by Number of States:

Figure 4: Three Variations in State Projected Rates of Progress from 
2002 to 2014:

Figure 5: A Majority of States Used Confidence Intervals to Determine 
Student Progress in School Year 2002-03:

Figure 6: Student Progress Measures and Potential Effects on Whether 
Schools Meet Proficiency Goals:

Figure 7: Example of a Timeline to Determine School's Proficiency 
Status:

Figure 8: Approval Status of State Plans as of July 31, 2004:

Abbreviations:

ESEA: Elementary and Secondary Education Act: 
NCLBA: No Child Left Behind Act: 
LEA: Local Educational Agency:

United States Government Accountability Office:

Washington, DC 20548:

September 30, 2004:

The Honorable Judd Gregg:
Chairman:
The Honorable Edward M. Kennedy:
Ranking Minority Member:
Committee on Health, Education, Labor, and Pensions:
United States Senate:

The Honorable John A. Boehner:
Chairman:
The Honorable George Miller:
Ranking Minority Member:
Committee on Education and the Workforce:
House of Representatives:

The No Child Left Behind Act of 2001 (NCLBA) has focused national 
attention on increasing academic achievement and closing achievement 
gaps among the nation's 48 million school-aged children by establishing 
a deadline--school year 2013-14--for public schools to bring all of 
their students to an achievement level deemed "proficient" in reading 
and math by their state. This includes students in total and in NCLBA-
designated student groups--students who are economically 
disadvantaged, are members of major racial or ethnic groups, have 
disabilities, or have limited English proficiency. As a condition for 
receiving federal funds, NCLBA required that each state submit a plan 
to the Department of Education (Education) that describes how the state 
will ensure that all students are proficient in reading and math by the 
deadline, as measured primarily by tests each state used. To provide 
information about the current status of states' efforts to implement 
student proficiency requirements, GAO determined (1) what goals states 
established for student proficiency and their implications for whether 
schools will meet these goals, (2) what factors facilitated or impeded 
selected state and school district implementation efforts, and (3) how 
Education supported state efforts and approved state implementation 
plans to meet student proficiency requirements.

To address these issues, we analyzed data from the plans that all 
states submitted to Education and that Education approved by June 2003. We
extracted detailed information from each plan and developed a database 
of that information to facilitate analysis. We also contacted officials 
in 50 states, the District of Columbia, and Puerto Rico to obtain 
information about the number of schools they had identified as meeting 
annual progress goals and their school and district characteristics in 
2002-03.[Footnote 1] We visited 4 states (California, Illinois, North 
Carolina, and Rhode Island) and 6 school districts within these states, 
and conducted phone interviews with officials in another 17 states to 
obtain information about factors that facilitated and impeded 
implementation of student proficiency requirements. The states and 
districts were selected to achieve variation in geography and size and 
to explore variation among the states in such areas as their starting 
points, first-year goals, and successive annual student proficiency 
goals. We reviewed documentation Education provided the states, 
reviewed regulations and guidance issued by Education, and interviewed 
Education officials about their efforts to assist states in developing 
plans and their process for approving plans. We also reviewed the 
status of Education's approval of states' standards and assessment
systems that were required to comply with the 1994 Elementary and 
Secondary Education Act (ESEA). In July 2004, in response to our 
requests, Education provided us with updated and new information 
related to the approval status of states' plans, grant award 
conditions, assessment system enforcement efforts, and assistance 
provided to improve the quality of state data. Finally, we interviewed 
officials from national education organizations and other experts in 
the area. We conducted our work between August 2003 and August 2004 in 
accordance with generally accepted government auditing standards.

Results in Brief:

States varied in how they established proficiency goals and measured student
progress, and this variation in state approaches could affect how many 
schools meet their annual goals over time. NCLBA permits such 
variability for each state to address its unique circumstances, thus 
differences are not unexpected. First, states varied in their starting 
points--the 2001-02 assessment levels that were used to set first-year 
proficiency goals--and also varied in their first-year goals. NCLBA 
prescribed a statutory formula for determining starting points based on 
each state's 2001-02 assessment data. State starting points reflected 
the differences in decisions states had previously made in choosing 
content standards, determining the rigor of tests developed or chosen 
to measure student performance, and setting proficiency levels. 
Consequently, the percentage of students expected to meet proficiency 
goals in the first year varied widely. For example, in California's 
schools, 14 percent of elementary school students were expected to be 
proficient in reading in the first year, while Colorado expected that 
78 percent of its elementary students would be proficient. States also 
varied in the minimum size of designated groups, such as economically 
disadvantaged and ethnic minority students, whose progress must be 
measured separately. In determining whether schools met proficiency 
goals, states were not required to include results for these groups if 
the number of students was too small to yield statistically reliable 
information. For example, in the state of Washington, which has a 
minimum group size of 30, schools would not be required to include 
separately the test scores for any group of fewer than 30 students. 
States also varied in the percentage of students they expected to be 
proficient annually to meet NCLBA's requirement that all students be 
proficient by 2014. For example, some states expected schools to show 
steady progress every year and others every 3 years. Finally, states 
varied in how they planned to determine whether their schools met state 
goals. The majority of states used statistical techniques that they 
believed improved the accuracy of their determinations, such as 
determining that a school had made adequate progress if the percentage 
of students scoring at the proficient level or above came within a 
statistical range (i.e., confidence intervals) of the state goal. The 
approaches states used to establish goals and determine student 
proficiency, such as confidence intervals, could have implications over 
time for the number of schools that meet their goals.

State officials we interviewed cited factors that facilitated 
implementation of student proficiency requirements, such as the 
commitment of their state leadership to the goals of NCLBA and 
technical assistance. However, factors such as data problems and tight 
timelines for determining school progress impeded implementation. 
Officials reported that state leadership, by providing administrative 
and legislative support, had been influential in facilitating the goals 
of NCLBA. They also reported that technical assistance from the Council 
of Chief State School Officers, under contract with Education, had been 
an important factor in facilitating states' first-year implementation. 
On the other hand, more than half of the state and school district 
officials we interviewed reported being hampered by poor and unreliable 
student data. Reliable data are essential for implementing the 
requirements of the law. For example, officials in Illinois reported 
that about 300 of their 1,055 districts had problems with data 
accuracy. Education is working on efforts to help states improve their 
data systems, such as monitoring state data quality policies and 
establishing a common set of data definitions. Officials from about 
half of the 21 states also said that tight timelines impeded 
implementation of student proficiency requirements. For example, 
because tests were often given late in the school year, it was 
difficult for states to make final determinations about whether schools 
had met progress goals prior to the next school year.

Education assisted states in developing their plans for improving 
student proficiency in several ways and approved all plans, fully or 
conditionally, by June 10, 2003. To help states, Education asked 
experts familiar with student assessments to review all plans and 
provide them with on-site evaluation. Education also allowed states 
some flexibility with certain requirements, such as granting all states 
greater flexibility in determining how students with limited English 
proficiency could be assessed. On June 10, 2003, when Education 
announced it had approved all plans, 11 state plans met all NCLBA 
requirements. The remaining 41 plans were approved by Education with 
conditions that needed to be met to satisfy all NCLBA requirements. As 
of July 31, 2004, 28 states had plans that met all NCLBA requirements, 
and 24 states, including the District of Columbia, had plans with 
conditions that needed to be met before receiving full approval from 
Education. According to Education, states approved with conditions had 
sufficient information in their plans to demonstrate that the 
requirements of NCLBA could be met in the future if certain actions 
were taken. Although Education officials said that they are continually 
monitoring states whose plans have not been fully approved, the 
department does not have a written process that delineates how and when 
each state will meet the conditions. In addition, in July 2004 some 
states did not have approved academic standards and assessment systems 
in place to meet the requirements of the 1994 education law, even
though they are the primary means by which the law requires states to 
determine student proficiency. By school year 2005-06, all states are
required by NCLBA to increase the current level of testing. Given the 
difficulties states experienced meeting the 1994 requirements, 
developing new standards and assessment systems to meet the expanded 
assessment requirements may be challenging for states. Education has 
developed guidance for its review and approval of states' expanded 
standards and assessment systems. However, it has not established a 
written plan that clearly identifies the steps required, interim goals, 
review schedules, and timelines. Without such a plan, states may be 
challenged to meet NCLBA standards and assessment system requirements
by the 2005-06 school year deadline.

We are recommending that the Secretary of Education delineate in 
writing the process and time frames that are appropriate for each 
state's particular circumstances to meet conditions for full approval, 
develop a written plan that includes steps and time frames so that all 
states have approved NCLBA standards and assessment systems by the 
2005-06 school year, and further support states' abilities to gather 
accurate student data used to determine whether schools met state 
goals.

In its comments on a draft of this report, Education expressed support 
for the recommendations we made on developing a written plan to help 
states meet the 2005-06 NCLBA requirements for standards and assessment 
systems and indicated the department has begun to take steps to develop 
such a plan. Education also supported our recommendation to provide 
additional assistance to the states to improve their abilities to 
gather accurate student performance data. Education disagreed with our 
recommendation that it delineate in writing the process and time frames 
for states to meet conditions needed to receive Education's full 
approval of their plans. Education indicated that it has a process to 
monitor states' progress, although not in writing, and that this 
process has resulted in additional plans being fully approved. We 
recognize the efforts the department has taken to support states' 
implementation of NCLBA. However, Education has not fully approved 
almost half (24) of state plans, meaning that these states must 
still meet conditions in order to satisfy NCLBA provisions.
A written delineation of conditions that these plans need to meet and 
the time frames appropriate for each state's circumstances would 
provide the necessary documentation and assurance to Education, 
Congress, and the public that the steps states need to take and the 
time frames for their actions are clear and understood.

Background:

Prior Federal Reform Efforts:

Over the past 40 years, the Elementary and Secondary Education Act 
(ESEA) has authorized billions of dollars in federal grants to states 
and school districts to improve educational opportunities for 
economically disadvantaged children.[Footnote 2] ESEA was reauthorized 
in 1994, with requirements designed to hold states accountable for 
student progress.[Footnote 3] Specifically, as a condition for 
receiving federal financial assistance under Title I, Part A, of the 
act, states were required to develop academic standards, develop tests 
and measure student proficiency in certain grades, and determine 
whether schools were meeting proficiency goals. As ESEA neared 
reauthorization in 2001, however, only 17 states had received 
Education's approval of their systems for standards and testing, and 
Congress was concerned that student performance was not improving as 
quickly as it should have, specifically among some student groups, such 
as the economically disadvantaged.

New Test Requirements and Standards and a Goal for 2014:

In part to address these issues, the No Child Left Behind Act of 2001 
enhanced the federal government's role in kindergarten through 12th 
grade (K-12) education by taking steps to ensure that all students reach the
"proficient" level of achievement within 12 years of the enactment of 
the law, that is, by school year 2013-14. NCLBA strengthened the 1994 
reauthorization requirements in several ways. NCLBA increased the 
amount of testing in future school years. Beginning in the 2005-06 
school year, tests in math and reading must be administered every year 
in grades 3 through 8 and once in high school, and by 2007-08, states 
must also measure students' science achievement. NCLBA requires that 
these tests serve as the primary means of determining the annual 
performance of schools and that states provide Education with evidence 
from the test publisher or other relevant sources that these 
assessments are of adequate technical quality and consistent with 
nationally recognized professional and technical standards. States are 
to show that increasing numbers of students are reaching the proficient 
level on state tests over time so that by 2014, every student is 
proficient.

Similar to the 1994 law, NCLBA also designated specific groups of 
students for particular focus. These four groups are students who (1) 
are economically disadvantaged, (2) represent major racial and ethnic 
groups, (3) have disabilities, and (4) are limited in English 
proficiency.[Footnote 4] States and school districts are required to 
measure the progress of all students in meeting proficiency goals, as 
well as to measure separately the progress of these designated groups. 
To be deemed as having made adequate progress, each school must show 
that each of these groups, as well as the school as a whole, met the 
state proficiency goal. Schools must also show that at least 95 percent 
of students in grades required to take the test have done so.[Footnote 
5] Further, schools must also demonstrate that they have met state 
targets on another measure of progress--graduation rates in high school 
or attendance or other measures in elementary or middle 
schools.[Footnote 6]
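
The determination logic described above can be illustrated compactly 
in code. The following Python sketch is for exposition only; the 
function and field names, the minimum group size of 30, and the 
sample figures are our assumptions, not any state's actual method:

def makes_ayp(school, goal_pct, min_group_size=30,
              participation_floor=95.0, other_indicator_met=True):
    # A school makes adequate yearly progress only if it tests at
    # least 95 percent of its students, meets the proficiency goal
    # overall and for every designated group large enough to yield
    # reliable results, and meets one additional indicator (e.g.,
    # graduation or attendance rate).
    if school["pct_tested"] < participation_floor:
        return False
    if school["pct_proficient"] < goal_pct:
        return False
    for group in school["groups"]:
        # Groups below the state-set minimum size are not counted
        # separately, but their scores count in the school total.
        if group["n"] >= min_group_size and group["pct_proficient"] < goal_pct:
            return False
    return other_indicator_met

school = {"pct_tested": 97.0, "pct_proficient": 71.5,
          "groups": [{"n": 45, "pct_proficient": 66.0},
                     {"n": 12, "pct_proficient": 40.0}]}
print(makes_ayp(school, goal_pct=65.0))  # True: the 12-student group
                                         # is too small to count separately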

Finally, NCLBA requires that additional actions be taken if schools 
that receive funding under Title I, Part A, of the act do not meet 
state goals. Schools that have not made progress for 2 consecutive 
years or more are "identified for improvement" and must take certain 
actions such as offering parents an opportunity to transfer students to 
another school (school choice) and providing supplemental services 
(e.g., tutoring).[Footnote 7] States and school districts are required 
to provide funding up to a maximum amount specified in law for such 
actions, including transportation, tutoring, and training.

Although NCLBA placed many new requirements on states, states have 
broad discretion in many key areas. States develop their own tests to 
measure the content students are taught in their schools. States set 
their own standards for what constitutes "proficiency" (see fig. 1). 
NCLBA does, however, require states to set two standards for high 
achievement--"advanced" and "proficient," to reflect a degree of 
mastery--and to set another standard for "basic" achievement to 
indicate the progress of the lower-achieving children toward mastering 
their state standards. As part of its monitoring, Education reviews any 
changes states may make to their tests and academic and proficiency 
requirements, and the law requires states to notify Education of any 
significant change.

Figure 1: Example of States' Discretion to Develop Their Own Content 
Standards and Tests and to Determine What Constitutes Proficiency on 
Each of the Tests:

[See PDF for image]

[End of figure]

State Plans for Setting Goals and Measuring Student Progress:

Under NCLBA, each state requesting federal financial assistance was 
required to submit a plan to Education that, among other things, 
demonstrated how the state will meet the law's requirements for setting 
annual goals and measuring student progress.[Footnote 8] The law 
required that plans demonstrate that the state has developed and is 
implementing a statewide system that will be effective in ensuring that 
schools make adequate yearly progress toward the 2013-14 goal. The law 
also required that state plans demonstrate what constitutes adequate 
yearly progress, and required that plans establish:

* Starting points for measuring the percentage of students who meet or 
exceed the state's proficient level of academic achievement using 
assessment data from the 2001-02 school year. The methods for computing 
starting points, as specified in the law, take into account such 
factors as scores from designated student groups and how schools rank 
in their state.[Footnote 9] (One such method is sketched after this 
list.) Separate starting points were to be
developed for reading/language arts and math.

* Annual goals, including first-year goals, establishing the single
minimum percentage of students who will be required to score at or 
above the proficient level on the state assessment in each year until 
2013-14. The goals are based on the starting points for each state's 
reading and math assessments.[Footnote 10]

* The minimum number of students in a designated student group 
necessary for their test results to be used as a separate group in 
determining whether a school met state goals. Each state was allowed to 
determine the minimum number required to ensure that the group size was 
sufficient to produce statistically reliable results.[Footnote 11]

* Graduation rates for high schools and another indicator of progress 
of the state's choosing for elementary and middle schools, such as 
attendance rates. Graduation rate is defined in NCLBA as the percentage 
of students who graduate from secondary school with a regular diploma 
in the standard number of years.
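
As noted in the first item above, one statutory method for computing 
a starting point (detailed in app. I and table 2) uses the school at 
the 20th percentile in cumulative enrollment. The following Python 
sketch illustrates that computation with hypothetical schools; it is 
our rendering of the method, not a state's actual calculation:

def starting_point(schools):
    # schools: list of (percent_proficient, enrollment) pairs.
    # Rank schools from lowest to highest percent proficient, then
    # find the school containing the student at the 20th percentile
    # of cumulative statewide enrollment; that school's percent
    # proficient becomes the starting point.
    ranked = sorted(schools)
    threshold = 0.20 * sum(enrollment for _, enrollment in ranked)
    cumulative = 0
    for pct_proficient, enrollment in ranked:
        cumulative += enrollment
        if cumulative >= threshold:
            return pct_proficient

schools = [(25.0, 300), (40.0, 400), (55.0, 500),
           (70.0, 600), (85.0, 200)]
print(starting_point(schools))  # 40.0: the 400th of 2,000 students
                                # falls in the second-ranked school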

Following states' submission of their plans in January of 2003, 
Education was statutorily required to conduct a peer review process, 
identifying federal and state officials and outside experts to meet as 
a team with each state, review its plan, and provide assistance. 
Subsequently, the teams were to provide their assessment of the extent 
to which state plans met NCLBA requirements, such as having starting 
points, first-year goals and annual goals to ensure that every student 
would become proficient, and minimum student group sizes for measuring 
the achievement of designated students. The law required Education to 
review the plans and approve them within 120 days of a state's 
submission. If the Secretary determined that a state plan did not meet 
all requirements, he was required to notify the state and offer 
technical assistance, among other actions, before disapproving the 
plan.

NCLBA also requires the Secretary to report to Congress annually 
regarding state progress in implementing various requirements, 
including the number of schools identified as needing improvement. 
While NCLBA requires accurate and reliable data on student test scores 
and valid systems for identifying designated student groups, GAO, along 
with other auditors, has reported that states and school districts face 
serious challenges in this regard. GAO has proposed several 
recommendations for improving the collection and reporting of student 
data.[Footnote 12] Additionally, Education's Inspector General has 
reported that the lack of procedures and controls on student data is a 
continuing challenge for the department.[Footnote 13]

States Varied in the Goals Established for Student Progress, and That 
Variation May Have Implications for How Many Schools Meet State Goals 
over Time:

States varied in how they established goals and measured student 
progress; this variation may affect how many schools meet their annual 
goals--adequate yearly progress--each year. NCLBA permits variability 
in a number of areas, allowing states to address their unique 
circumstances. States varied in the percentage of students they 
expected to demonstrate proficiency on their tests in the first year of 
NCLBA's implementation and in the number of students in designated 
groups whose proficiency had to be measured separately. They also 
varied in the annual rates they set to increase student proficiency. 
Finally, they differed in how they measured student progress. These 
variations may have implications for the number of schools that meet 
their goals each year.

States' Starting Points and First-Year Goals Varied:

States' starting points--based on the percentage of students proficient 
in reading and math on state tests in 2001-02--varied widely, as did 
their first-year performance goals. NCLBA specified that states were to 
use their 2001-02 test data to calculate their starting points and 
instructed states on how the starting point was to be set from these 
data. After states computed their starting points, they specified 
performance goals for each year that would result in all children being 
proficient by 2013-14. As figure 2 illustrates, the percentage of 
students expected to be proficient in reading in the 2002-03 school 
year differed widely among the states.

Figure 2: Percentage of Students in Each State Expected to Demonstrate 
Proficiency on the Reading Tests in the First Year:

[See PDF for image]

Notes: Thirty-six states both provided data on first-year goals and 
used "percent proficient" as their measure for the goals. Six states 
used a different measure (proficiency index) that allowed them to 
incorporate other data in determining school progress, while 10 states 
did not provide first-year goal data.

When states set different first-year goals (e.g., separate goals for 
elementary, middle, and high schools), we used goals set at the lowest 
grade span or level (e.g., elementary) for this chart.

[End of figure]

For example, in order for an elementary school to meet the state 
reading goal in California, at least 14 percent of its students had to 
score at the proficient level on the state test, whereas in Colorado, 
at least 78 percent of the students had to score at the proficient 
level.[Footnote 14]

Variation in states' starting points and first-year goals reflected the
differences in decisions states had previously made in choosing content 
standards, developing tests to measure student performance, and setting 
proficiency levels, among other factors. For example, the score 
required to be proficient on a similar type of test might be higher in 
one state than in another, potentially affecting the percentage of 
students that demonstrate proficiency on the test.

In addition to establishing widely varying first-year goals, states 
differed in whether they set the same goal for all of their schools or 
whether they set different goals by grade level. Given that each state 
has its own system and structure, decisions about setting the same or 
different goals for schools were generally within the states'
discretion. Some states established different first-year goals for each 
grade; others for elementary, middle, and high schools; and some 
established the same first-year goals for all schools. Vermont, for 
example, had distinct goals for different grade configurations: schools 
that had elementary, middle, and high school grades had different goals 
than schools with just elementary grades.

States Set Different Size Requirements for Measuring the Progress of 
Designated Groups:

The size of the designated groups (the economically disadvantaged, 
ethnic minorities, students with disabilities, and students with 
limited English proficiency) whose progress must be measured separately 
also varied among states.[Footnote 15] NCLBA specified that to make 
adequate yearly progress, the school overall and each of these 
individual groups must reach the performance goal unless the number of 
students in the group is small enough to reveal personally identifiable 
information on an individual student or to yield statistically 
unreliable information. States decided the minimum number of students 
in such groups, and the resulting group sizes varied from state to 
state and sometimes within a state. As figure 3 shows, the majority of 
states (36) set the minimum group size between 25 and 45 
students.[Footnote 16]

Figure 3: Minimum Size of Student Groups by Number of States:

[See PDF for image]

Note: This figure does not include Montana or North Dakota, which used 
a statistical model to determine the minimum group size so that the 
number may be different for each school and designated student group.

[End of figure]

For example, in Washington state, with a minimum group size of 30, 
schools were not required to include the results of any student group 
with fewer than 30 students in determining whether they met the state's 
proficiency goals. In this case, if a school had fewer than 30 students 
of a particular ethnic group, for instance, the scores of this student 
group would not be considered separately. These students' individual 
scores would still be considered, however, in determining whether their 
school as a whole had met its goal. A few states used different group 
sizes, depending on other factors. For example, California set its 
group size at 50 but allowed the minimum size to be 100, depending on 
the size of the school's enrollment. Ohio, among other states, used a 
larger group size for its students with disabilities than the one used 
for other student groups. According to its state plan, one of the 
reasons Ohio set a larger size for this group was to account for the 
fact that students in that group have a wide variety of conditions and 
results for small groups could be unreliable.

States Set Different Rates for Annual Student Progress:

States also varied in the annual rate at which they expected their 
students to progress toward full proficiency by 2014. Using the 
flexibility in the law, some states set different proficiency goals 
each year, while others set goals for 3-year intervals.[Footnote 17] 
Some states used a combination of staggered and steady progress. (See 
fig. 4.)

Figure 4: Three Variations in State Projected Rates of Progress from 
2002 to 2014:

[See PDF for image]

Note: Graphs in this figure are hypothetical and do not reflect 
particular states.

[End of figure]

Three states assumed generally equal annual increases in student 
progress. (See panel A.) For example, Arkansas's first-year goal was 
that 32 percent of the elementary students in each of its schools would 
be proficient in reading, followed by an increase of about 6 percent of 
its students annually until all of its students were proficient by 
2014. In contrast, 14 states staggered improvement over 2- or 3-year
periods rather than in 1-year increments. (See panel B.) For example, 
North Carolina's first-year goal was that about 69 percent of 
elementary students in each of the state's schools would be proficient 
in reading in the first year and the state set goals for subsequent 
increases every 3 years: 77 percent by 2005, 84 percent by 2008, 92 
percent by 2011, and 100 percent by 2014. Finally, 18 states used a 
combination of progress rates. (See panel C.) Nevada, for example, 
staggered improvement goals in 2- and 3-year increments until 2011, at
which point the state planned for annual increases in percentages of 
students that were proficient up to 2014.[Footnote 18]
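
The first two patterns can be generated with a few lines of Python. 
The sketch below uses the Arkansas and North Carolina goals cited 
above, keying each school year by its starting calendar year (2013 
denotes 2013-14); the interpolated yearly values are our own 
approximations and may differ from the states' published goals:

def equal_increments(start_pct, first_year=2002, last_year=2013):
    # Panel A: raise the goal by the same amount each year until it
    # reaches 100 percent in the final year.
    step = (100.0 - start_pct) / (last_year - first_year)
    return {yr: round(start_pct + step * (yr - first_year), 1)
            for yr in range(first_year, last_year + 1)}

def staggered(step_goals, first_year=2002, last_year=2013):
    # Panel B: hold the goal flat and raise it only in designated
    # years; step_goals maps those years to the new goals.
    goals, current = {}, None
    for yr in range(first_year, last_year + 1):
        current = step_goals.get(yr, current)
        goals[yr] = current
    return goals

arkansas = equal_increments(32.0)  # rises about 6 points each year
north_carolina = staggered({2002: 69, 2005: 77, 2008: 84,
                            2011: 92, 2013: 100})
print(arkansas[2003], north_carolina[2003])  # 38.2 69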

States Varied in How They Measured Annual Student Progress:

States also used different approaches for determining whether schools, 
and designated groups of students within schools, met their annual 
performance goals. In the 2002-03 school year, a majority of states 
used statistical measures such as confidence intervals in which schools 
were deemed to have made adequate yearly progress if they came within a 
range of the state proficiency goal, as shown in figure 5.[Footnote 19]

Figure 5: A Majority of States Used Confidence Intervals to Determine 
Student Progress in School Year 2002-03:

[See PDF for image]

Note: How each state measured annual student progress was taken from 
state plans Education approved by June 10, 2003. Arkansas did not 
indicate in its original plan that it would use confidence intervals; 
however, it amended its plan in 2003 to include them and used them to 
determine whether schools met state goals in 2002-03. California and 
Texas indicated in their plans that they would use confidence intervals 
only with schools with small numbers of students or test scores. 
Montana's original plan included confidence intervals, but it 
subsequently did not use them for technical reasons. Recently, Alabama, 
North Carolina, and Pennsylvania had plan amendments approved whereby 
they also will use confidence intervals.

[End of figure]

States that used confidence intervals constructed an estimate of 
student performance that included a range of scores, which was then 
compared with the state goal. For example, a school at which 68 
percent of students scored proficient might be represented by a 
confidence interval of 64 to 72 percent. If the state goal was 70 
percent, it would fall within the confidence interval; thus the 
school or designated group would be classified by the state as 
having met its performance goal.
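
The arithmetic behind such an interval can be illustrated in a few 
lines of Python. The sketch below uses a standard normal-
approximation interval at the 95 percent confidence level; because 
the report does not specify which formula, confidence level, or 
group sizes states used, those choices and the student counts shown 
are our assumptions:

import math

def proficiency_interval(pct_proficient, n_students, z=1.96):
    # 95 percent normal-approximation confidence interval, in
    # percentage points, around a measured percent-proficient score.
    p = pct_proficient / 100.0
    half_width = 100.0 * z * math.sqrt(p * (1.0 - p) / n_students)
    return (max(0.0, pct_proficient - half_width),
            min(100.0, pct_proficient + half_width))

def meets_goal(pct_proficient, n_students, goal_pct):
    # The goal is met if it falls at or below the top of the
    # interval around the observed score.
    return goal_pct <= proficiency_interval(pct_proficient, n_students)[1]

# About 520 test-takers scoring 68 percent proficient yield an
# interval of roughly 64 to 72 percent, so a 70 percent goal is met.
print(proficiency_interval(68.0, 520))       # approximately (64.0, 72.0)
print(meets_goal(68.0, 520, goal_pct=70.0))  # True

# Small groups widen the interval sharply: 16 percent proficient
# among 19 students spans roughly 0 to 33 percent.
print(proficiency_interval(16.0, 19))        # approximately (0.0, 32.5)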

Education officials told us that states used such statistical 
procedures to improve the reliability of determinations about the 
performance of schools and districts. According to some researchers, 
such methods may provide more valid results because they account for 
the effect of small group sizes[Footnote 20] and year-to-year changes 
in student populations.[Footnote 21]

The Way States Measured Student Progress Has Implications for Schools 
Meeting State Goals Over Time:

Variations in states' approaches may influence whether schools will 
meet annual state goals. First, schools in states that established 
smaller annual increases in their initial proficiency goals may be more 
likely to meet state goals in the earlier years compared with schools 
in states that set larger annual increases. For example, Iowa projected 
moderate annual increases in student proficiency in the first 8 years, 
followed by more accelerated growth. Nebraska, however, projected a 
different scenario--steady increases in student proficiency. Although 
schools in states such as Iowa may be more likely to meet state goals 
in the first few years, they may find it more challenging to meet state 
goals in subsequent years to ensure that all students are proficient by 
2013-14.

Second, schools with a large number of designated student groups may be 
less likely to meet state goals than schools with few such groups, all 
other factors being equal.[Footnote 22] In order for a school to meet 
its state goal, both the school as a whole and each designated student 
group must meet proficiency goals. Some schools may have few student 
groups that must demonstrate progress because they do not have the 
state-prescribed minimum number of such students needed for their 
results to be considered separately. Several state officials told us 
that many of their schools were not meeting state goals because one or 
two student groups did not meet their annual proficiency goals.

Finally, the approach states used in determining whether schools met 
proficiency goals may influence the number of schools meeting goals. 
Some states used statistical methods, such as confidence intervals, 
which may result in more of their schools reaching proficiency goals 
than states that do not. For instance, Tennessee--a state that 
initially did not use confidence intervals but later received approval 
to do so--re-analyzed its data from 2002-03, applying confidence 
intervals. The application of confidence intervals substantially 
decreased the number of schools not meeting state goals. The number of 
elementary and middle schools not making state goals was reduced by 
over half, from 47 percent to 22 percent. The application of confidence
intervals can produce such differences because the computed ranges can 
be large, especially when small numbers of students make up groups or 
when scores vary significantly among students. For example, in a 
Kentucky high school, 16 percent of students with disabilities scored 
at the proficient level on a state test in 2004, and the goal was 19
percent. However, when the state applied confidence intervals, the 
computed interval associated with 16 percent was 0 to 33 percent. 
Because the state goal--19 percent--was within the confidence interval, 
the state considered this group to have met the goal.[Footnote 23] (See 
fig. 6 for potential effects of different student progress measures on 
whether schools meet proficiency goals.)

Figure 6: Student Progress Measures and Potential Effects on Whether 
Schools Meet Proficiency Goals:

[See PDF for image]

[End of figure]

Leadership and Technical Assistance Facilitated Implementation 
Efforts, but Data Accuracy Problems and Tight Timelines Impeded 
Efforts:

State officials we interviewed cited factors that facilitated 
implementation of student proficiency requirements, such as their 
state leadership's commitment to the goals of NCLBA and technical
assistance provided by the Council of Chief State School Officers, 
through a contract with Education. Officials also cited factors that 
impeded implementation, such as problems with the data they use to 
determine student proficiency, tight timelines, and a lack of timely 
guidance from Education.

Officials Cited Leadership Commitment and Technical Assistance as Key 
Factors That Facilitated Implementation of Student Proficiency 
Requirements:

Officials in 10 of the 21 states we interviewed said that their 
leadership's commitment to improving student achievement facilitated 
their efforts to implement student proficiency requirements. For 
example, one state's Commissioner of Education said he supported 
holding schools accountable for the progress of all students, a 
sentiment echoed by other state officials. Officials in three of the 
school districts we interviewed expressed their commitment to
NCLBA's focus on raising the proficiency of all students. For example, 
one district official said the law has been helpful in demonstrating 
achievement gaps to school officials. Another told us that NCLBA has 
focused the state's attention on the importance of annually tracking 
student proficiency. Leadership's commitment facilitated 
implementation in many ways, such as helping schools and school 
districts focus on improving student proficiency and enabling state 
education staff in different offices to share information.

Officials from 7 states also reported that the assistance provided by 
the Council of Chief State School Officers facilitated implementation 
of NCLBA requirements. Through its contract with Education, the council 
has provided states technical assistance in implementing NCLBA 
requirements and issued many publications about the law's requirements. 
The council has also held meetings where state officials have discussed 
common challenges and strategies and received advice and assistance 
from national experts and Education officials. For example, of the 
seven officials citing the council's work, two said that their meetings 
assisted them in developing their state plans. Officials from another 
state said they turned to the council for information when they were 
unable to obtain answers about implementation from other sources.

Data Quality Issues and Tight Timelines Were Cited as Impeding 
Implementation Efforts:

Concern about the quality and reliability of student data was the most 
frequently cited impediment to implementing student proficiency 
requirements. More than half of the state and school district officials 
we interviewed cited this concern. For example, officials in California 
indicated that they could not obtain racial and ethnic data--used to 
track the progress of designated student groups--of comparable quality 
from their school districts.[Footnote 24] Officials in Illinois 
reported that about 300 of its 1,055 districts had problems with data 
accuracy, resulting in those schools' appealing their progress results 
to the state. Similarly, officials in Indiana acknowledged data 
problems but said addressing them would be challenging. Inaccurate data 
may result in states incorrectly identifying schools as not meeting 
annual goals and incorrectly trigger provisions for school choice and 
supplemental services. GAO, Education's Inspector General, and other 
auditing groups have also reported the challenges states face in 
gathering and processing accurate and reliable student data. For 
example, in a 2004 report, Education's Inspector General reported that 
many states lacked procedures and controls necessary to report reliable 
student data. Another auditing group reported that some states were not 
reporting accurate student data to Education and recommended that 
Education take steps to help states address data accuracy 
problems.[Footnote 25]

Although NCLBA focuses primarily on the state's responsibility to 
ensure data reliability and validity, Education also has a critical 
role in assisting states to improve the quality of data used for 
assessment and reporting. NCLBA requires the Secretary of Education to 
provide an annual report to Congress that includes national and state-
level data on states' progress in implementing assessments, the results 
of assessments, the number of schools identified as needing 
improvement, and use of choice options and supplemental services. 
Education officials acknowledged the need to share responsibility with 
the states to improve data quality so data provided to Congress are 
valid and reliable. According to Education officials, they are working 
with states to monitor state data quality policies and establish a 
common set of data definitions. Education also has begun a multiyear 
pilot project related to data reporting. However, while one of the 
primary goals of this effort is to improve the quality of state data, 
this long-term project will not address states' problems with the data 
they are now using to report on student progress.

In addition to data quality concerns, officials from about half of the 21
states said that tight timelines impeded implementation of student 
proficiency requirements, even though many of those requirements built 
upon provisions in the previous reauthorization of ESEA. That previous 
law required states to test students in reading and math in three 
grades to measure if schools were making progress. However, a majority 
of states did not have approved assessment systems in place when NCLBA 
was enacted.[Footnote 26] NCLBA set specific time frames because many
states had not been taking the necessary steps to position themselves 
to meet requirements. Those states that had taken steps to meet the 
earlier requirements were generally better positioned to meet NCLBA 
requirements.

Officials we interviewed from 5 states said that they had very little 
time to develop their state plans. They said that developing a system 
that meets NCLBA requirements for measuring student proficiency for all 
students and selected subgroups was complicated, and they had to 
resolve many issues before their systems could be up and running. 
States that already had a state system for measuring school progress in 
place prior to NCLBA faced other challenges. These states had to 
determine how they would reconcile parts of their existing systems with 
NCLBA's requirements in order to submit their plans to Education on 
time.

Officials from 6 states said it was difficult for the state to notify 
schools of their status in meeting proficiency goals in a timely 
fashion. Many states test students in the spring, and NCLBA requires 
that test results be processed before the beginning of the next school 
year in order for districts to identify which schools did not make 
progress, as illustrated in figure 7. However, many factors may make it 
difficult to meet these deadlines, such as identifying and correcting 
errors in student data.

Figure 7: Example of a Timeline to Determine School's Proficiency 
Status:

[See PDF for image]

Note: This example reflects a sample timeline for a public school 
district. Different states and districts have different timelines 
for these steps.

[End of figure]

Officials in 12 of the 21 states we interviewed said that the
lack of clear and timely guidance and information from Education has 
impeded their efforts to implement NCLBA's student proficiency 
requirements. Several officials said that Education's communications 
with them were not timely and sometimes changed. Other officials said 
that Education was not timely in resolving issues Education had with 
their plans. In response, Education officials told us they provided 
states draft guidance on plan requirements, and subsequent changes were 
made in order to be responsive to the concerns of state officials. 
Education officials told us that it was challenging to provide the 
support states needed to implement NCLBA's proficiency requirements so 
that states could begin assessing students in the 2002-03 school year. 
They also said it was challenging, because the support often needed to 
be tailored, given the varied ways states chose to measure student 
proficiency.

Education Has Aided States in Developing Their Plans and Assessment 
Systems but Did Not Have Written Plans to Help States Meet NCLBA 
Provisions:

Education aided states in developing their plans in several ways, 
including having peer review teams evaluate plans on site and allowing 
states flexibility in implementing some NCLBA requirements. As of July 
31, 2004, Education had fully approved 28 plans as meeting all NCLBA 
requirements; the remaining states had approval with conditions. In 
addition, 17 states did not have approved academic standards and 
testing systems in place to meet the requirements of the 1994 law, even 
though they are the primary means by which the law requires states to 
determine student proficiency. According to Education officials, the 
department has been continually monitoring states' progress in meeting
conditions and has been working with states to meet prior and NCLBA 
requirements for standards and assessment systems. However, Education 
officials told us that they did not have a written process to track 
that states are taking steps toward meeting the conditions set for full 
approval of their plans or to document states' progress in meeting 
NCLBA requirements for the expanded standards and assessment systems 
required under NCLBA.

Education Aided States in Developing Their Plans for Measuring Student 
Progress:

Education aided states in developing their plans for measuring student 
progress and provided technical assistance for implementing them. The 
department helped states by having peer review teams examine and 
provide suggestions about the plans, allowing states flexibility in 
adhering to certain NCLBA requirements and issuing guidance to clarify 
key aspects of the law. (See table 1.)

Table 1: Primary Methods Education Used To Support State Planning and 
Implementation Efforts:

Method Education used to support states: Peer review; 
Purpose: To review and provide on-site suggestions to state officials 
as they were developing their plans.

Method Education used to support states: Technical assistance; 
Purpose: To assist states in developing state plans as well as 
implementing other aspects of NCLBA.

Method Education used to support states: Guidance; 
Purpose: To clarify requirements in NCLBA so that states understood 
their roles and responsibilities with respect to NCLBA.

Method Education used to support states: Flexibility; 
Purpose: To help states deal with challenges they faced in 
implementing some proficiency requirements, both in general and on a 
case-by-case basis.

Source: GAO analysis of Education's processes for supporting state 
efforts.

[End of table]

As required by NCLBA, Education assembled a team of experts, consisting 
of Education officials and external members drawn from state education 
agencies and other organizations familiar with student assessments and 
accountability, to review and provide states with advice on their 
plans. In reviewing them, the peer review teams identified areas where 
states were not meeting NCLBA requirements and closely examined areas 
that were particularly complex, such as their methods for measuring 
student progress goals. Peer reviewers also met with state officials 
on-site to discuss their plans and to suggest ways to improve them. 
Following the reviews, the teams presented the results to Education. 
The department then used this information to determine the extent to 
which state plans met requirements. Education also established a 12-
member National Oversight Panel to review state plans and advise 
Education on the completeness of the plans. This panel, which met
monthly, was composed of parents, teachers, local education agency 
officials, and state education officials with knowledge about a range 
of areas, including standards and assessments and the needs of low-
performing schools.

Education also provided states with technical assistance to implement 
their plans. The department hosted conferences where it provided 
information on requirements for state plans. Education also contracted 
with the Council of Chief State School Officers to provide technical 
assistance to states. The council has held meetings and workshops as 
well as issued instructional publications about implementing different 
NCLBA requirements.

Additionally, Education issued guidance in a number of areas to assist 
states in their implementation efforts. For example, Education issued 
guidance explaining state responsibilities for monitoring NCLBA 
implementation and for providing schools with technical assistance, 
including the kinds of assistance they must provide to schools 
identified as needing improvement.[Footnote 27] Education also issued 
guidance addressing actions states should take if schools do not meet 
their goals and explaining the purpose of supplemental educational 
services and state responsibility for providing and monitoring the 
receipt of such services.

Education also allowed all states flexibility to address difficulties 
they experienced implementing some requirements and granted additional 
flexibility to states on a case-by-case basis. For example, Education 
granted all states greater flexibility in determining how students with 
limited English proficiency could be assessed. Education no longer 
required states to include the reading test results during students' 
first year in school. Further, on a case-by-case basis, Education 
allowed several states to vary the sizes they set for designated 
student groups. For example, Ohio and other states were allowed to use 
a larger group size for students with disabilities than for other 
student groups.

In February 2004, Education granted additional flexibility to states by 
establishing a process whereby states could propose amendments to their 
plans.[Footnote 28] Forty-seven states proposed amendments; for 
example, some states proposed to use a 3-year average to calculate the 
percentage of students taking state tests, rather than use the annual 
percentage. This flexibility may lessen the effects of year-to-year 
fluctuations in how many students take the tests. At the conclusion of 
our review, Education officials told us they had responded to every 
state and approved many of their proposals. Many of these amendments 
were in response to recently announced flexibility options. Other 
amendments were responses to specific conditions that Education had 
placed on some state plans before it would grant full approval. For 
example, one state amended its plan to resolve with Education how it 
would calculate its graduation rate for high school students.
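
To illustrate how the 3-year averaging described above can smooth year-
to-year fluctuations, the following sketch compares an annual 
participation check against a 3-year average, using NCLBA's 95 percent 
assessment participation criterion. The sketch is illustrative only; 
the rates are hypothetical, and details such as which years are 
averaged would depend on a state's approved plan.

def meets_participation(rates, threshold=95.0):
    # Annual test: the most recent year's participation rate alone.
    annual_ok = rates[-1] >= threshold
    # Averaged test: the mean of the last three years' rates.
    last_three = rates[-3:]
    averaged_ok = sum(last_three) / len(last_three) >= threshold
    return annual_ok, averaged_ok

# Hypothetical rates of 97, 96, and 93 percent: the school fails the
# annual check in the third year but passes on the 3-year average
# (95.33 percent).
print(meets_participation([97.0, 96.0, 93.0]))  # (False, True)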

Education Approved All State Plans by June 10, 2003, Although Most 
States Were Approved with Conditions to Meet All NCLBA Requirements:

Education's review and approval of state plans included discussions 
with state officials and ongoing exchanges of drafts of state plans 
because of the uniqueness of each state's educational system. According 
to Education officials, the review process was particularly challenging 
for those states that did not have existing assessment systems that 
could provide a basis for meeting NCLBA requirements. Education also 
noted that some states with long-standing assessment systems needed to 
change their processes to develop an assessment system that conformed 
to NCLBA provisions.

NCLBA required each state's plan to demonstrate that the state had 
developed and was implementing a single statewide accountability 
system, had determined what constitutes adequate yearly progress for 
public schools (e.g., starting points, graduation rates), and had 
established a timeline for meeting state proficiency levels by 2014. 
The law specifically required that certain elements--such as the method 
for determining adequate yearly progress--be demonstrated. Thus, a 
state's assurance that an element would be implemented in the future 
would not be sufficient to meet plan requirements.

NCLBA establishes that the Secretary of Education is responsible for 
approving plans. Education developed guidance for states that lists 
state plan requirements. (See app. III.) In reviewing state plans, 
Education established two levels of approval: "fully approved" and 
"approved." Education designated a plan as "fully approved" if it met 
all NCLBA requirements and as "approved" if additional conditions had to
be met to fulfill requirements. Education described an approved plan as 
one that demonstrated that, when implemented, the state, its school 
districts, and its schools could meet NCLBA provisions. A state was 
required to have an approved plan before it could receive its Title I 
funding for the 2003-04 school year, with release of funds scheduled 
for July 1, 2003. Education included the conditions for approval in 
states' 2003 grant awards.[Footnote 29]

On June 10, 2003, Education announced that it had approved all state 
plans. Education fully approved 11 of the state plans (Connecticut, 
Hawaii, Illinois, Kansas, Mississippi, Missouri, New Jersey, North 
Dakota, Oregon, Texas, and Washington) as meeting all NCLBA 
requirements. Between June 11, 2003, and July 31, 2004, Education fully 
approved plans for an additional 16 states and Puerto Rico. The 
remaining 23 states and the District of Columbia had plans that met 
some, but not all, requirements and were approved with conditions. 
(See fig. 8 for the approval status of state plans as of July 31, 
2004.)

Figure 8: Approval Status of State Plans as of July 31, 2004:

[See PDF for image]

Note: By "Fully approved by 7/31/04" we mean fully approved between 
June 11, 2003, and July 31, 2004.

[End of figure]

According to Education, these 23 states and the District of Columbia 
had sufficient information in their plans to demonstrate that the 
requirements of NCLBA could be met in the future if certain actions 
were taken. Our review of the letters Education sent to the 23 states 
and the District of Columbia whose plans it had not fully approved 
indicated a range of conditions that needed to be met, such as 
providing performance targets for graduation rates or other indicators, 
analyzing the effect of using confidence intervals, and providing state 
report card examples. Further, of these 23 states and the District of
Columbia, 4 had to obtain final state action from their state boards of 
education or legislatures as their only condition for receiving full 
approval from Education.

According to Education officials, these approved plans provided 
sufficient assurance that when implemented they would meet NCLBA 
requirements. For example, some states provided Education with 
definitions for how they would calculate their goals and targets and 
assurances that the information would be forthcoming, but did not 
include the rates and percentages required by the law. Education 
officials said that some of these states did not have enough data to 
report graduation rates, but that the states defined how they would do 
so once they began collecting such data. Education approved these state 
plans with the condition that states collect data on graduation rates 
and define them in a manner consistent with their plans.

Education officials told us that they were in frequent communication 
with states regarding unmet plan requirements. However, the department 
did not have a written process to track interim steps and document that 
states had met the identified conditions within a specified time frame. 
In the follow-up letters Education sent to most states, it did not 
indicate specific time frames for when it expected states to 
demonstrate that they had met all NCLBA requirements. Education 
officials told us that they did not have a written process to ensure 
that states were taking steps toward meeting the conditions set for 
full approval or to determine what actions the department would take if 
states did not meet them.

States Face Challenges in Meeting the 2005-06 NCLBA Requirements for 
Standards and Assessment Systems:

Standards and assessments are the primary means by which states gauge 
student progress. States' current testing is governed by requirements 
first enacted by the 1994 ESEA, which required that states assess 
students once in each of three grade spans--elementary (3-5), middle 
(6-9), and high school (10-12). Under this law, state standards and 
assessment systems must meet certain requirements, such as measuring 
how well students have learned the academic content taught in school.

As of March 2002, Education had not approved the standards and 
assessment systems required by the 1994 ESEA for most states 
(35).[Footnote 30] Education granted timeline waivers or compliance 
agreements for those states that did not demonstrate that they could 
meet the 1994 ESEA requirements within the statutory time 
frame.[Footnote 31]
According to Education, enforcement efforts have included close 
monitoring of states' progress; for example, agreements included 
interim steps to ensure that states were making progress and required 
states to submit quarterly reports to the department. Further, 
Education's enforcement efforts have included withholding funds from 
one state that did not fulfill its commitments under its timeline 
waiver. In accordance with the law, the department withheld 25 percent 
of Title I administrative funds from this state for fiscal year 2003. 
As of July 31, 2004, 35 states had approved standards and assessment 
systems and 17 did not.

By the next school year (2005-06), states will be required under NCLBA 
to increase the current level of testing. For example, states
will be required to test students annually in grades 3 through 8 and 
once in high school in reading and math. Given the difficulties states 
experienced meeting the 1994 requirements, developing new standards and 
assessment systems to meet the expanded assessment requirements may be 
challenging for states. All states will have to undergo a review and 
approval process for these tests to ensure that state standards and 
assessment systems meet NCLBA requirements.

Education has taken some steps to guide its review and approval process 
of states' standards and assessment systems to meet the 2005-06 time
frame. It issued regulations on implementation in July 2002 and 
nonregulatory guidance on the standards and assessments requirements in 
March 2003. In April 2004, Education issued guidance to inform states 
about the information they will need to demonstrate that their systems 
meet NCLBA requirements and help peer reviewers determine whether state 
systems are consistent with NCLBA. Finally, Education officials told us 
that they are planning to train state Title I directors and to provide 
additional outreach to states. Education officials said that they do 
not intend to grant any waivers or extensions of time to states that 
fail to meet the NCLBA standards and assessment requirements.

Although Education has undertaken several initiatives to prepare for 
the review of state systems to meet the 2005-06 NCLBA deadline, it has 
not established a written plan that clearly identifies the steps 
required, interim goals, review schedules, and timelines. The 
assessment systems are likely to be complex, given the increased number 
of tests required under NCLBA. Given the complexity of developing such 
systems, the department may find that, similar to its experience with 
states' compliance with the 1994 law, some states may be challenged to 
meet NCLBA standards and assessment system requirements by the 2005-06 
school year deadline.[Footnote 32]

Conclusions:

NCLBA seeks to make fundamental changes in public education. For the 
first time, Congress has specified a time frame for when it expects all 
students to reach proficiency on state tests showing that they know 
their state's academic subject matter. It has also focused attention on 
closing the learning gap by requiring that key groups of students that 
have historically not performed well also reach proficiency. Achieving 
the goal of having all students proficient will
be a formidable challenge for states, school districts, schools, and 
students. NCLBA provides a framework to help states achieve this goal 
and has required states to plan how they intend to do so. Education has 
undertaken numerous efforts to assist states with meeting this 
challenge. For example, it promulgated regulations, provided guidance, 
and reviewed state plans within fairly tight time frames to meet NCLBA 
requirements.

Education approved all state plans by June 10, 2003. However, many of 
these plans lacked key information regarding how states would measure 
student progress, such as graduation rates. Education approved these 
plans conditionally, with the states' assurances that conditions could be met
in the future. As of July 31, 2004, the plans for 23 states and the 
District of Columbia had not been fully approved. Although Education 
officials said that they have been in frequent communication with these 
states, the department does not have written procedures and specified 
time frames for monitoring states' progress for these 24 plans still 
needing to meet conditions. Without such tracking mechanisms, Education 
may not be able to ensure that required actions are taken in a timely 
way.

State assessment systems are the foundation for determining whether 
students are proficient. NCLBA has significantly increased the amount 
of testing, and states are required to have approved NCLBA standards 
and assessments by the 2005-06 school year. Education does not have a 
written plan that delineates the steps and time frames for its review 
of state systems to ensure that NCLBA requirements are met on time. 
Given Education's recent experience, in which a significant number of 
states did not meet the 1994 ESEA requirements for standards and 
assessment systems, the lack of a written plan could hinder Education's 
efforts to position states to meet the NCLBA requirements.

Furthermore, many state officials indicated they have concerns about 
the accuracy of student demographic and test data. Education has also 
noted these issues and has undertaken several initiatives to assist 
states with their data systems. States and districts have routinely 
collected student demographic and test data. However, ensuring the 
data's accuracy has become even more important with the introduction 
of NCLBA's accountability requirements. The number of schools that are
identified as in need of improvement has implications for states and 
school districts, especially when provisions for school choice and 
supplemental services become applicable, as they have for schools in a 
number of states. Measuring achievement with inaccurate data is likely 
to lead to poor measures of school progress, with education officials 
and parents making decisions about educational options on the basis of 
faulty information.

Recommendations for Executive Action:

For those states that have plans that did not meet all NCLBA 
requirements and still have conditional approval, we recommend that the 
Secretary of Education delineate in writing the process and time frames 
that are appropriate for each state's particular circumstances to meet 
conditions for full approval.

Further, we recommend the Secretary of Education develop a written plan 
that includes steps and time frames so that all states have approved 
NCLBA standards and assessment systems by the 2005-06 school year.

To improve the validity and reliability of state data used to determine 
whether schools are meeting state goals, we recommend that the 
Secretary of Education further support states' abilities to gather 
accurate student data through activities such as disseminating best 
practices and designating technical specialists who can serve as 
resources to help states.

Agency Comments and Our Evaluation:

We provided a draft of this report to Education for review and comment. 
Education agreed with our recommendation that it develop a written plan 
that includes steps and time frames so all states have approved NCLBA 
standards and assessment systems by the 2005-06 school year. Education 
noted that such actions are consistent with current departmental 
efforts and should help NCLBA implementation. Similarly, the department 
agreed with our recommendation to further support states' abilities to 
gather accurate student data. Education provided new information in its 
comments on efforts to support states' improvements in their data 
collection capacities. Consequently, we modified the report to reflect 
Education's comments. Education officials also provided technical 
comments that we incorporated into the report where appropriate. 
Education's comments are reproduced in appendix IV.

Education disagreed with our recommendation that it delineate in 
writing the process and time frames that are appropriate for each 
state's particular circumstances to meet conditions for full approval 
of their state plans. In its comments, Education cited several reasons 
for disagreeing with this recommendation. Education stated that it has 
a process of continuous monitoring, although not in written form, and 
cited as evidence of its success that all states have used their plans 
to make annual progress determinations. However, experience
under the 1994 ESEA has shown that school progress determinations can 
be made without meeting all plan requirements. As of July 31, 2004, 
plans from 23 states and the District of Columbia have not received 
full approval, and according to Education officials, these plans need 
to meet conditions to be able to meet NCLBA requirements. We recognize 
the significant efforts the department has taken to support states' 
implementation of NCLBA and its plans to continue assisting states to 
improve the performance of their districts, schools, and students. 
However, a written delineation, appropriate to each state's 
circumstances, of the process and time frames necessary for the 
remaining states to meet all conditions would provide the necessary 
documentation and assurance to Education, Congress, and the public that 
the steps states need to take and the time frames for their actions are 
clear and understood.

In its comments, Education also questioned our statement that it 
approved plans without the states meeting all plan requirements. The 
department stated that no plan was approved unless it demonstrated 
that, when implemented, the state, its districts, and schools could 
meet the accountability requirements of the law. Thus, Education asserted that
GAO narrowly interpreted approval. We do not disagree with the 
department's interpretation of its authority to conditionally approve 
plans. Instead, our focus was on whether plans contained all the 
elements required by NCLBA, not merely an assurance that the state 
would meet the requirements of the law in the future. We found that 
many plans that were conditionally approved did not
meet all NCLBA requirements for what states were to have in their 
plans, and Education did not dispute this finding.

We will send copies of this report to the Secretary of Education, 
relevant congressional committees, and other interested parties. We 
will also make copies available to others upon request. In addition, 
the report will be made available at no charge on GAO's Web site at 
http://www.gao.gov.

Please contact me at (202) 512-7215 if you or your staff have any 
questions about this report. Other contacts and major contributors are 
listed in appendix V.

Signed by: 

Marnie S. Shaul: 
Director, Education, Workforce, and Income Security Issues:

[End of section]

Appendix I: Methods to Establish Starting Points:

To establish goals for schools to reach in the first year of No Child 
Left Behind Act (NCLBA) implementation, states were to set starting 
points using student test performance data from the 2001-02 school 
year. They computed results for each designated student group and for 
each school. To set the starting points, states were required to use 
the higher of the following two percentages of students scoring at the 
proficient level:[Footnote 33]

(a) the student group with the lowest 2001-02 test performance from 
among:

1. economically disadvantaged students,

2. students from major racial and ethnic groups,

3. students with disabilities,

4. students with limited English proficiency,

or:

(b) the score of the school at the 20th percentile of enrollment when 
all schools in the state were ranked according to 2001-02 test 
performance.

To identify the student group with the lowest 2001-02 test performance, 
states had to determine what percentage of students in each of the 
designated groups scored at the proficient level on state tests. For 
example, a state may have found that 15 percent of students with 
disabilities scored at the proficient level, whereas all other groups 
had more students do so. In this case, the state would identify the 
students with disabilities group as the lowest-performing student 
group.

To identify the score of the school at the 20th percentile of 
enrollment, states had to take the following steps. First, they had to 
determine the enrollment and the percentage of students who were 
proficient for each of their schools. Then, they would rank the schools 
based on the percentage of students who were proficient in each school. 
For example, the state might list schools as shown in the following 
table.

Table 2: Calculating a Starting Point Using the School at the 20th 
Percentile in Cumulative Enrollment:

School name: Roosevelt H.S; 
Percent scoring at the proficient level: 25.0%; 
Enrollment: 110; 
Cumulative enrollment: 1,875.

School name: Madison Elem; 
Percent scoring at the proficient level: 21.2%; 
Enrollment: 90; 
Cumulative enrollment: 1,765.

School name: Jefferson Elem; 
Percent scoring at the proficient level: 15.0%; 
Enrollment: 75; 
Cumulative enrollment: 1,675.

School name: Adams Elem; 
Percent scoring at the proficient level: 9.1%; 
Enrollment: 350; 
Cumulative enrollment: 1,600.

School name: Lincoln H.S; 
Percent scoring at the proficient level: 7.5%; 
Enrollment: 700; 
Cumulative enrollment: 1,250.

School name: Washington Elem; 
Percent scoring at the proficient level: 7.2%; 
Enrollment: 550; 
Cumulative enrollment: 550. 

Source: Cowan, Kristen Tosh. 2004. The New Title I: The Changing 
Landscape of Accountability. Washington, D.C.: Thompson Publishing 
Group (used with permission of the publisher).

[End of table]

Beginning with the school at the lowest rank, the state would add the 
number of students enrolled until it reached 20 percent of the state's 
enrollment. If the state's total student population is 9,375, then the 
20th percentile cutoff is 1,875 (9,375 x 20 percent), which falls at 
Roosevelt H.S. in this example. At Roosevelt H.S., 25 percent of 
students were proficient.

The state would compare the two results. Since the percentage of 
students at the proficient level at Roosevelt H.S. (25 percent 
proficient) was higher than the results for students with disabilities 
(15 percent proficient), the state would set its starting point at 25 
percent. For a school to meet the state's proficiency goal in 2002-03, 
at least 25 percent of its students would have to score at the 
proficient level; however, a first-year goal could be set 
higher.[Footnote 34]
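
To make the calculation concrete, the following short program sketches 
the starting-point rule described above, using the data from table 2. 
It is a minimal illustration written for this purpose, not any state's 
actual method; the data structures are assumptions, and the group 
percentages other than the 15 percent for students with disabilities 
are hypothetical.

def starting_point(group_pcts, schools, total_enrollment):
    # (a) Percent proficient in the lowest-performing designated group.
    lowest_group_pct = min(group_pcts.values())

    # (b) Percent proficient at the school reached at the 20th
    # percentile of cumulative enrollment, counting up from the
    # lowest-ranked school.
    ranked = sorted(schools, key=lambda s: s["pct_proficient"])
    cutoff = 0.20 * total_enrollment
    cumulative = 0
    for school in ranked:
        cumulative += school["enrollment"]
        if cumulative >= cutoff:
            pct_at_20th = school["pct_proficient"]
            break

    # The starting point is the higher of the two percentages.
    return max(lowest_group_pct, pct_at_20th)

# Data from table 2; only the schools at or below the 20th percentile
# are needed. The 15 percent for students with disabilities comes from
# the example above; the other group values are hypothetical.
groups = {"students with disabilities": 15.0,
          "economically disadvantaged": 22.0,
          "limited English proficiency": 18.0}
schools = [
    {"name": "Washington Elem", "pct_proficient": 7.2, "enrollment": 550},
    {"name": "Lincoln H.S", "pct_proficient": 7.5, "enrollment": 700},
    {"name": "Adams Elem", "pct_proficient": 9.1, "enrollment": 350},
    {"name": "Jefferson Elem", "pct_proficient": 15.0, "enrollment": 75},
    {"name": "Madison Elem", "pct_proficient": 21.2, "enrollment": 90},
    {"name": "Roosevelt H.S", "pct_proficient": 25.0, "enrollment": 110},
]
print(starting_point(groups, schools, total_enrollment=9375))  # 25.0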

[End of section]

Appendix II: Percentage of Schools That Met State Goals in 2002-03:

Name of state: Alabama; 
Percentage of schools that met state proficiency goals in 2002-03: 96%. 

Name of state: Alaska; 
Percentage of schools that met state proficiency goals in 2002-03: 42%. 

Name of state: Arizona; 
Percentage of schools that met state proficiency goals in 2002-03: 76%. 

Name of state: Arkansas; 
Percentage of schools that met state proficiency goals in 2002-03: 89%. 

Name of state: California; 
Percentage of schools that met state proficiency goals in 2002-03: 54%. 

Name of state: Colorado; 
Percentage of schools that met state proficiency goals in 2002-03: 75%. 

Name of state: Connecticut; 
Percentage of schools that met state proficiency goals in 2002-03: Not 
available. 

Name of state: Delaware; 
Percentage of schools that met state proficiency goals in 2002-03: 44%. 

Name of state: District of Columbia; 
Percentage of schools that met state proficiency goals in 2002-03: 45%. 

Name of state: Florida; 
Percentage of schools that met state proficiency goals in 2002-03: 18%. 

Name of state: Georgia; 
Percentage of schools that met state proficiency goals in 2002-03: 64%. 

Name of state: Hawaii; 
Percentage of schools that met state proficiency goals in 2002-03: 39%. 

Name of state: Idaho; 
Percentage of schools that met state proficiency goals in 2002-03: 75%. 

Name of state: Illinois; 
Percentage of schools that met state proficiency goals in 2002-03: 56%. 

Name of state: Indiana; 
Percentage of schools that met state proficiency goals in 2002-03: 77%. 

Name of state: Iowa; 
Percentage of schools that met state proficiency goals in 2002-03: 93%. 

Name of state: Kansas; 
Percentage of schools that met state proficiency goals in 2002-03: 88%. 

Name of state: Kentucky; 
Percentage of schools that met state proficiency goals in 2002-03: 60%. 

Name of state: Louisiana; 
Percentage of schools that met state proficiency goals in 2002-03: 92%. 

Name of state: Maine; 
Percentage of schools that met state proficiency goals in 2002-03: 88%. 

Name of state: Maryland; 
Percentage of schools that met state proficiency goals in 2002-03: 65%. 

Name of state: Massachusetts; 
Percentage of schools that met state proficiency goals in 2002-03: 76%. 

Name of state: Michigan; 
Percentage of schools that met state proficiency goals in 2002-03: 76%. 

Name of state: Minnesota; 
Percentage of schools that met state proficiency goals in 2002-03: 94%. 

Name of state: Mississippi; 
Percentage of schools that met state proficiency goals in 2002-03: 75%. 

Name of state: Missouri; 
Percentage of schools that met state proficiency goals in 2002-03: 51%. 

Name of state: Montana; 
Percentage of schools that met state proficiency goals in 2002-03: 80%. 

Name of state: Nebraska; 
Percentage of schools that met state proficiency goals in 2002-03: 51%. 

Name of state: Nevada; 
Percentage of schools that met state proficiency goals in 2002-03: 60%. 

Name of state: New Hampshire; 
Percentage of schools that met state proficiency goals in 2002-03: 69%. 

Name of state: New Jersey; 
Percentage of schools that met state proficiency goals in 2002-03: 88%. 

Name of state: New Mexico; 
Percentage of schools that met state proficiency goals in 2002-03: 79%. 

Name of state: New York; 
Percentage of schools that met state proficiency goals in 2002-03: 76%. 

Name of state: North Carolina; 
Percentage of schools that met state proficiency goals in 2002-03: 47%. 

Name of state: North Dakota; 
Percentage of schools that met state proficiency goals in 2002-03: 91%. 

Name of state: Ohio; 
Percentage of schools that met state proficiency goals in 2002-03: 78%. 

Name of state: Oklahoma; 
Percentage of schools that met state proficiency goals in 2002-03: 79%. 

Name of state: Oregon; 
Percentage of schools that met state proficiency goals in 2002-03: 72%. 

Name of state: Pennsylvania; 
Percentage of schools that met state proficiency goals in 2002-03: 74%. 

Name of state: Puerto Rico; 
Percentage of schools that met state proficiency goals in 2002-03: 90%. 

Name of state: Rhode Island; 
Percentage of schools that met state proficiency goals in 2002-03: 77%. 

Name of state: South Carolina; 
Percentage of schools that met state proficiency goals in 2002-03: 24%. 

Name of state: South Dakota; 
Percentage of schools that met state proficiency goals in 2002-03: 47%. 

Name of state: Tennessee; 
Percentage of schools that met state proficiency goals in 2002-03: 57%. 

Name of state: Texas; 
Percentage of schools that met state proficiency goals in 2002-03: 92%. 

Name of state: Utah; 
Percentage of schools that met state proficiency goals in 2002-03: 72%. 

Name of state: Vermont; 
Percentage of schools that met state proficiency goals in 2002-03: 88%. 

Name of state: Virginia; 
Percentage of schools that met state proficiency goals in 2002-03: 59%. 

Name of state: Washington; 
Percentage of schools that met state proficiency goals in 2002-03: 78%. 

Name of state: West Virginia; 
Percentage of schools that met state proficiency goals in 2002-03: 59%. 

Name of state: Wisconsin; 
Percentage of schools that met state proficiency goals in 2002-03: 89%. 

Name of state: Wyoming; 
Percentage of schools that met state proficiency goals in 2002-03: 85%. 

Source: GAO analysis of state plans and other information reported by 
states.

Note: Connecticut has not yet released figures for the 2002-03 school 
year; Iowa reported only schools that received funds through Title I, 
Part A.

[End of table]

[End of section]

Appendix III: State Plan Requirements:

Principle 1: all schools; 
State accountability system element: 1.1; 
Accountability system includes all schools and districts in the state.

Principle 1: all schools; 
State accountability system element: 1.2; 
Accountability system holds all schools to the same criteria.

Principle 1: all schools; 
State accountability system element: 1.3; 
Accountability system incorporates the academic achievement standards.

Principle 1: all schools; 
State accountability system element: 1.4; 
Accountability system provides information in a timely manner.

Principle 1: all schools; 
State accountability system element: 1.5; 
Accountability system includes report cards.

Principle 1: all schools; 
State accountability system element: 1.6; 
Accountability system includes rewards and sanctions.

Principle 2: all students; 
State accountability system element: 2.1; 
The accountability system includes all students.

Principle 2: all students; 
State accountability system element: 2.2; 
The accountability system has a consistent definition of full academic 
year.

Principle 2: all students; 
State accountability system element: 2.3; 
The accountability system properly includes mobile students.

Principle 3: method of adequate yearly progress determinations; 
State accountability system element: 3.1; 
Accountability system expects all student subgroups, public schools, 
and LEAs to reach proficiency by 2013-14.

Principle 3: method of adequate yearly progress determinations; 
State accountability system element: 3.2; 
Accountability system has a method for determining whether student 
subgroups, public schools, and LEAs made adequate yearly progress.

Principle 3: method of adequate yearly progress determinations; 
State accountability system element: 3.2a; 
Accountability system establishes a starting point.

Principle 3: method of adequate yearly progress determinations; 
State accountability system element: 3.2b; 
Accountability system establishes statewide annual measurable 
objectives.

Principle 3: method of adequate yearly progress determinations; 
State accountability system element: 3.2c; 
Accountability system establishes intermediate goals.

Principle 4: annual decisions; 
State accountability system element: 4.1; 
The accountability system determines annually the progress of schools 
and districts.

Principle 5: subgroup accountability; 
State accountability system element: 5.1; 
The accountability system includes all the required student subgroups.

Principle 5: subgroup accountability; 
State accountability system element: 5.2; 
The accountability system holds schools and LEAs accountable for the 
progress of student subgroups.

Principle 5: subgroup accountability; 
State accountability system element: 5.3; 
The accountability system includes students with disabilities.

Principle 5: subgroup accountability; 
State accountability system element: 5.4; 
The accountability system includes students with limited English 
proficiency.

Principle 5: subgroup accountability; 
State accountability system element: 5.5; 
The state has determined the minimum number of students sufficient to 
yield statistically reliable information for each purpose for which 
disaggregated data are used.

Principle 5: subgroup accountability; 
State accountability system element: 5.6; 
The state has strategies to protect the privacy of individual students 
in reporting achievement results and in determining whether schools 
and LEAs are making adequate yearly progress on the basis of 
disaggregated subgroups.

Principle 6: based on academic assessments; 
State accountability system element: 6.1; 
Accountability system is based primarily on academic assessments.

Principle 7: additional indicators; 
State accountability system element: 7.1; 
Accountability system includes graduation rate for high schools.

Principle 7: additional indicators; 
State accountability system element: 7.2; 
Accountability system includes an additional academic indicator for 
elementary and middle schools.

Principle 7: additional indicators; 
State accountability system element: 7.3; 
Additional indicators are valid and reliable.

Principle 8: separate decisions for reading/language arts and 
mathematics; 
State accountability system element: 8.1; 
Accountability system holds students, schools and districts separately 
accountable for reading/ language arts and mathematics.

Principle 9: system validity and reliability; 
State accountability system element: 9.1; 
Accountability system produces reliable decisions.

Principle 9: system validity and reliability; 
State accountability system element: 9.2; 
Accountability system produces valid decisions.

Principle 9: system validity and reliability; 
State accountability system element: 9.3; 
State has a plan for addressing changes in assessment and student 
population.

Principle 10: participation rate; 
State accountability system element: 10.1; 
Accountability system has a means for calculating the rate of 
participation in the statewide assessment.

Principle 10: participation rate; 
State accountability system element: 10.2; 
Accountability system has a means for applying the 95 percent 
assessment criteria to student subgroups and small schools.

Source: U.S. Department of Education.

Note: Italics in original. 

[End of table]

[End of section]

Appendix IV: Comments from the Department of Education:

UNITED STATES DEPARTMENT OF EDUCATION:

THE DEPUTY SECRETARY:

September 8, 2004:

Ms. Marnie S. Shaul:
Director, Education, Workforce and Income Security Issues: 
Government Accountability Office: 
441 G Street, NW:
Washington, DC 20548:

Dear Ms. Shaul:

I am writing in response to your request for comments on the Government 
Accountability Office (GAO) draft report (GAO-04-734), dated September 
2004, and entitled "No Child Left Behind Act: Improvements Needed in 
Education's Process for Tracking States' Implementation of Key 
Provisions." I take very seriously the role of GAO in holding Federal 
agencies accountable for the proper and efficient implementation of 
programs under their command. I also take very seriously the U.S. 
Department of Education's responsibility to ensure that States 
implement as vigorously as possible the assessment and accountability 
provisions enacted by the landmark No Child Left Behind Act (NCLBA). I 
am proud of the significant accomplishments of States and school 
districts to date, with the assistance of this Department, to ensure 
the law is properly implemented. Although GAO has tried to capture some 
of this energy and effort in its report, States, school districts, and 
the Department have made far more progress than the draft report 
suggests.

As you know from your research, both the Department and, even more 
significantly, the States have made substantial efforts to implement 
NCLBA. Prior to this law, many States had no statewide system of 
accountability. Starting from scratch in many cases, States had to 
craft--or, in some cases, recraft--their systems, within the parameters
set out in the law, and secure the support of their stakeholders. 
Legislatures, in some instances, had to change State laws to conform to 
NCLBA, and many States had to revise their regulations and policies. It 
is an unprecedented accomplishment that, a year and a half after NCLBA 
was enacted, all States had submitted accountability plans and used 
those plans to hold their schools and districts accountable for the 
achievement of their students during the 2002-03 school year. This was 
a historic milestone for our nation and education reform.

I am very proud of our role in assisting States and school districts to 
implement NCLBA and vow to continue our vigorous enforcement of its 
provisions. As the following comments reflect, I believe two of GAO's 
recommendations should help NCLBA implementation and, in fact, are 
consistent with activities the Department already has well underway. I 
must respectfully disagree, however, with GAO's first recommendation.

GAO Recommendation 1: For those States that have plans that did not 
meet all NCLBA requirements, the Secretary of Education should 
delineate in writing the process and timeframes that are appropriate
for each State's particular circumstances to meet conditions for full 
approval.

I take issue with the draft report's characterization that we approved 
State plans without the States' meeting all plan requirements. No plan 
was approved unless it demonstrated that, when implemented, the State, 
its school districts, and its schools could meet the accountability 
provisions required by statute and regulations. I believe the draft 
report, by narrowly interpreting the word "approval," overemphasizes 
bureaucratic process and discounts significant outcomes. Every State 
used the accountability plan we approved to make adequate yearly 
progress (AYP) determinations for school year 2002-03.

The Department must approve a State's plan under Title I (of which the 
accountability plan is a critical part) in order for the State to 
receive Title I funds. The Department's standard for plan approval in 
order to award funds is whether a plan is in substantially approvable 
form. See generally 34 C.F.R. §76.703. This standard is met if a plan 
contains sufficient information for the Department to determine that 
the State would be able to meet all applicable statutory and regulatory 
requirements when it implements the plan. This standard does not demand 
that a plan must be perfect. When a plan is substantially approvable, 
the Department may still need to condition a grant award to obtain 
compliance over the course of the grant period.

Each State completed an accountability workbook that we developed to 
detail the basic elements--i.e., the specific statutory and regulatory 
requirements--of the State's accountability system. This workbook became
the State's accountability plan once all the basic elements were 
satisfied. Following peer review by a team of experts, consultation 
between the Department and each State, and any consequent modifications 
by the State, we concluded that each State's accountability plan was in 
substantially approvable form before we approved it and thus were able 
to award fiscal year 2003 funds on or about July 1, 2003, the date 
those funds became available for obligation.

In reviewing and approving accountability plans, we took into 
consideration the unique circumstances of each State. I believe GAO's 
findings fail to consider the cyclical nature of State assessment and 
accountability systems together with our desire and responsibility to 
make grant awards on or near the date funds became available for 
obligation if State plans were substantially approvable. For example, 
GAO found that of the 50 States, the District of Columbia, and Puerto 
Rico, six did not include all starting points in their plans. All but
one of these States, under previously approved timeline waivers, were 
administering their new assessments for the first time in spring 2003. 
Those States could not set their starting points until they had 
accumulated assessment data that were generally not available until mid 
to late summer of 2003. It would have been of absolutely no benefit to 
require those States to calculate their starting points on the basis of 
assessments they administered in school year 2001-02, only to 
immediately recalculate them following the 2003 administration. 
Moreover, because the formula for calculating starting points is 
prescribed by the law, we believed a State's description of how it 
would implement that formula once it had its actual assessment data was 
sufficient to approve the State's plan. In these instances, we 
conditioned the State's 2003 Title I grant award to ensure that the 
State would provide the Department with its actual starting points as 
soon as they were calculated. Each State in this situation set its 
starting points in time to calculate AYP for the 2002-03 school year 
and subsequently met its condition. It is circumstances like this one 
that the draft report misconstrues as not meeting all plan 
requirements.

The draft report recommends that the Department delineate in writing 
the process and timeframes that are appropriate for each State's 
particular circumstances to meet conditions for full approval. The 
Department already has a process in place, which it has been 
implementing, to move States toward full approval. This process 
involves continuous monitoring of a State's progress in meeting its 
conditions and, as the draft report notes, has resulted in an 
additional 22 States that formerly had conditions becoming fully 
approved.

The most significant evidence of the success of our process is that 
every State used its accountability plan to make AYP determinations for 
school year 2002-03, as GAO's data confirm, and is currently using its 
plan again to make AYP determinations for school year 2003-04. That, to 
me, demonstrates that our approval system works and negates the need 
for the draft report's first recommendation.

GAO Recommendation 2: The Secretary of Education should develop a 
written plan that includes steps and timeframes so that all States have 
approved NCLBA standards and assessment systems by the 2005-06 school 
year.

I appreciate GAO's recommendation that the Department establish a plan 
that allows sufficient time to ensure that States have approved NCLBA 
standards and assessment systems by the 2005-06 school year. The 
Department is well underway in implementing such a plan. Immediately 
following enactment of NCLBA, we conducted negotiated rulemaking on 
regulations to implement NCLBA's standards and assessment requirements. 
The resulting regulations, published July 5, 2002, represent the 
consensus of a wide range of stakeholders: Federal, State, and local 
administrators, principals, teachers, parents, and assessment experts. 
We also issued nonregulatory guidance on the standards and assessment 
requirements on March 10, 2003.

To ensure States are taking the steps necessary to administer annual 
assessments in grades 3-8 in reading/language arts and mathematics by 
the 2005-06 school year, the Department required each State to submit, 
as part of its May 2003 NCLB consolidated State application, evidence 
that it had developed academic content standards or grade-level 
expectations for reading/language arts and mathematics for grades 3 
through 8, and a detailed timeline of its process to develop aligned 
assessments as well as science standards required by NCLBA. We will 
peer review States' academic content standards and aligned assessments 
as part of the upcoming standards and assessment peer review process. 
Further, as we monitor States under Title I, we check to ensure that 
each State is progressing on its timeline.

In preparation for peer reviews of States' compliance with NCLBA's 
standards and assessment requirements, the Department recently provided 
to every State a copy of the peer review guidance. This guidance serves 
both to guide peer review teams and to assist States as they prepare 
for their peer review. The guidance outlines each element that will be 
reviewed and offers examples of evidence that States can submit to 
demonstrate compliance with each requirement. The guidance also 
provides helpful information on technical concepts such as alignment, 
validity, and reliability.

Although all of these activities effectively serve to prepare States to 
meet the NCLBA standards and assessment requirements, the Department is 
planning to take additional steps to ensure that all States meet the 
deadline. In the near future, we will be conducting a series of 
interactive Webcasts to train State Title I directors and assessment 
directors on the peer review guidance. We also have plans underway for 
additional outreach to States that will reinforce the importance of 
implementing the new requirements in a timely manner. We are compiling 
lists of potential peer reviewers and are formulating the review 
process and schedule we will follow. In early October, we intend to 
notify each State of these plans. We anticipate that several States 
will undergo their peer review this fall, well ahead of the statutory 
deadline. As we move forward in the standards and assessment review and 
approval process, we welcome any suggestions on how we can make the 
review beneficial for all.

GAO Recommendation 3: To improve the validity and reliability of State 
data used to determine whether schools met State goals, the Secretary 
of Education should further support States' ability to gather accurate 
student data, such as disseminating best practices and designating 
technical specialists.

This section of the draft report focuses on States' use of valid and 
reliable data when determining if schools, districts, and the State 
have met AYP targets. The draft report notes the Department's numerous 
efforts in working with States to improve data quality. For example, we
are working to: (1) monitor State internal data control policies and 
data quality; (2) provide guidance to States on the technical adequacy 
requirements for assessment systems to meet NCLBA requirements; (3) 
support technical assistance for States via Federal grants to State 
collaborative working groups; (4) assist States in using "relevant, 
nationally recognized professional and technical standards" (e.g., 
Standards for Educational and Psychological Testing (AERA/APA/NCME, 
1999)); (5) consolidate and streamline State data collection; and (6) 
establish a set of common definitions across many of the programs 
funded by the Department.

GAO also acknowledges the shared responsibility of States and the 
Department in improving the processes and procedures for collecting 
high-quality data for use in assessment and accountability reporting. 
States have done a good job of identifying and addressing data-quality 
issues, but many challenges still exist, particularly for those States 
without high-quality student information management systems and with 
limited staffing resources. I believe that the efforts undertaken by the
Department to date reflect the strong leadership that we have exercised 
to address data-quality concerns such as those raised by the report. I 
agree that we should continue to support States' ongoing efforts to 
improve the quality of assessment data and their assessment and 
accountability reports so that student achievement is accurately 
represented and reported at the school, district, State, and Federal 
levels.

In sum, our goal has been to ensure that all States have working 
accountability systems with which to hold schools and school districts 
accountable for the achievement of all their students. The process we 
used to implement this goal emphasized outcomes consistent with the 
NCLBA, and we are proud of the huge strides States have made. Still, we 
have much work to do. We look forward to continuing to work with States 
as they develop and implement the new standards and assessment systems 
that NCLBA requires. We will continue to support States in their 
efforts to improve data quality and accountability. Working together, 
we will leave no child behind.

Sincerely,

Signed by: 

Eugene W. Hickok: 

[End of section]

Appendix V: GAO Contacts and Staff Acknowledgments:

GAO Contacts:

Harriet C. Ganson, (202) 512-7042, GansonH@gao.gov; Jason S. Palmer, 
(202) 512-3825, PalmerJS@gao.gov:

Staff Acknowledgments:

In addition to those named above, Deborah Edwards, Gilly Martin, Sherri 
Doughty, Richard Burkard, Luann Moy, and Sheranda Smith-Campbell made 
key contributions to the report.

[End of section]

Related GAO Products:

No Child Left Behind Act: Additional Assistance and Research on 
Effective Strategies Would Help Small Rural Districts. GAO-04-909. 
Washington, D.C.: September 23, 2004.

Special Education: Additional Assistance and Better Coordination Needed 
among Education Offices to Help States Meet the NCLBA Teacher 
Requirements. GAO-04-659. Washington, D.C.: July 15, 2004.

No Child Left Behind Act: More Information Would Help States Determine 
Which Teachers Are Highly Qualified. GAO-03-631. Washington, D.C.: July 
17, 2003.

Title I: Characteristics of Tests Will Influence Expenses; Information 
Sharing May Help States Realize Efficiencies. GAO-03-389. Washington, 
D.C.: May 8, 2003.

Title I: Education Needs to Monitor States' Scoring of Assessments. 
GAO-02-393. Washington, D.C.: April 1, 2002.

Title I Funding: Poor Children Benefit Though Funding Per Poor Child 
Differs. GAO-02-242. Washington, D.C.: January 31, 2002.

Title I Program: Stronger Accountability Needed for Performance of 
Disadvantaged Students. GAO/HEHS-00-89. Washington, D.C.: June 1, 2000.

FOOTNOTES

[1] Hereinafter, the term states will refer collectively to the 50 
states plus the District of Columbia and Puerto Rico.

[2] Title I, Part A, of the ESEA is the largest program of federal aid 
for elementary and secondary education, allocating almost $12 billion 
in fiscal year 2003 to serve disadvantaged children in approximately 90 
percent of the nation's school districts.

[3] ESEA was reauthorized and amended as the Improving America's 
Schools Act in 1994.

[4] Students with disabilities refers to students covered under the 
Individuals with Disabilities Education Act, the primary law that 
addresses the unique needs of children with disabilities. 

[5] The department issued guidance in March 2004 indicating that 
states--if they request to do so--may average participation rates over 
2 or 3 years.

[6] These other measures may include, but are not limited to, grade-to-
grade retention rates and changes in the percentage of students 
completing gifted and talented, advanced placement, or college 
preparatory courses.

[7] Schools designated as in need of improvement under the Improving 
America's Schools Act had their designation carry over after NCLBA took 
effect.

[8] We use the term plan to refer to a state's "accountability 
workbook," a format developed by Education. Education required states 
to use the accountability workbooks to detail the basic elements of the 
state's system to demonstrate meeting NCLBA requirements.

[9] See appendix I for a description of the methods the law required 
states to use to develop starting points. 

[10] The first-year goals may or may not be the same as the starting 
points.

[11] Measuring the achievement of a group of students is not required 
if the number of students in that group is insufficient to yield 
statistically reliable information or would reveal personally 
identifiable information about an individual student.

[12] GAO, Title I: Education Needs to Monitor States' Scoring of 
Assessments, GAO-02-393 (Washington, D.C.: Apr. 1, 2002), and Title I 
Program: Stronger Accountability Needed for Performance of 
Disadvantaged Students, GAO/HEHS-00-89 (Washington, D.C.: June 1, 
2000).

[13] U. S. Department of Education, Office of Inspector General, 
Department of Education Management Challenges, Feb. 2004.

[14] First-year goals for math also varied substantially across states.

[15] This statement refers to the minimum size of designated student 
groups for measuring proficiency and not for reporting test scores.

[16] Some states had more than one group size. When states reported 
multiple group sizes, we report the lower size. 

[17] NCLBA states that when states increase their goals from one year 
to the next, those increases must occur in equal increments, that the 
first increase must occur by 2004-05, and that future increases occur 
no later than every 3 years thereafter. 
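
As an illustration only, the sketch below computes one schedule 
consistent with these rules--equal increments, a first increase by 
2004-05, later increases no more than 3 years apart, and 100 percent 
proficiency by 2013-14. The 3-year spacing is an assumption; states 
could also raise their goals more often.

def equal_increment_goals(start):
    # One compliant schedule (assumed, not prescribed): equal increases
    # every 3 years, beginning in 2004-05 and reaching 100 percent of
    # students proficient in 2013-14.
    increase_years = [2004, 2007, 2010, 2013]
    step = (100.0 - start) / len(increase_years)
    goal, goals = start, {}
    for year in range(2002, 2014):  # school years 2002-03 to 2013-14
        if year in increase_years:
            goal += step
        goals[year] = round(goal, 2)
    return goals

# A 25 percent starting point yields goals of 25, 25, 43.75, 43.75,
# 43.75, 62.5, ..., 100 across school years 2002-03 through 2013-14.
print(equal_increment_goals(25.0))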

[18] As of June 10, 2003, 17 states either did not report an annual 
rate or used some other method.

[19] When using confidence intervals, upper and lower limits around a 
school's or district's percentage of proficient students are 
calculated, creating a range of values within which there is 
"confidence" the true value lies. For example, instead of saying that 
72 percent of students scored at the proficient level or above on a 
test, a confidence interval may show that percentage to be between 66 
and 78, with 95 percent confidence.
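
The footnote does not specify how states computed such intervals; the 
sketch below shows one common method, a normal-approximation interval 
for a proportion. The sample size of roughly 215 students is an 
assumption, chosen only so that the result matches the footnote's 
66-to-78 example.

import math

def proficiency_interval(p, n, z=1.96):
    # Normal-approximation confidence interval for the proportion of
    # proficient students; z = 1.96 gives 95 percent confidence.
    margin = z * math.sqrt(p * (1 - p) / n)
    return (p - margin, p + margin)

# With 72 percent proficient among about 215 tested students, the
# interval is roughly (0.66, 0.78), as in the footnote's example.
print(proficiency_interval(0.72, 215))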

[20] Theodore Coladarci, Gallup Goes to School: The Importance of 
Confidence Intervals for Evaluating "Adequate Yearly Progress" in Small 
Schools, the Rural School and Community Trust Policy Brief, Oct. 2003.

[21] Thomas J. Kane and Douglas O. Staiger, "Volatility in School Test 
Scores: Implications for Test-Based Accountability Systems," in Diane 
Ravitch, ed., Brookings Papers on Education Policy 2002, pp. 235-283. 
Washington, D.C.: Brookings Institution.

[22] Other factors, such as the application of certain statistical 
procedures, can affect this result.

[23] These results were preliminary at the time we obtained them and 
were calculated at the 99 percent confidence level.

[24] California officials told us that a bill had recently passed in 
its state legislature that may address this issue.

[25] Texas State Auditor's Office, A Joint Audit Report on the Status 
of State Student Assessment Systems and the Quality of Title I School 
Accountability Data, SAO Report No. 02-064 (Austin, Texas: Aug. 
2002).

[26] According to Education, shortly after NCLBA was enacted and prior 
to the statutory deadline, all states without approved assessment 
systems were under either a timeline waiver or a compliance agreement 
with specific deadlines for full compliance.

[27] See Department of Education, No Child Left Behind: LEA and School 
Improvement (Non-Regulatory Guidance), Jan. 2004. 

[28] Several states exercised this authority prior to Education 
establishing a process.

[29] If a state does not meet the conditions cited in the grant award, 
it is subject to withholding of administrative funds.

[30] GAO, Title I: Education Needs to Monitor States' Scoring of 
Assessments, GAO-02-393 (Washington, D.C.: Apr. 1, 2002).

[31] The NCLBA gave states 90 days to show how they would address any 
aspect of their standards and assessment systems that did not meet the 
1994 requirements. After that 90-day window expired, the NCLBA 
prohibited Education from granting additional waivers of deadlines for 
meeting these requirements. States failing to meet deadlines 
established by the 1994 law (or under a waiver or compliance agreement) 
are subject to a mandatory withholding of 25 percent of administrative 
funds. For states that do not comply with NCLBA requirements, the law 
authorizes, but does not require, Education to withhold Title I state 
administrative funds. 

[32] In Title I Program: Stronger Accountability Needed for Performance 
of Disadvantaged Students (GAO/HEHS-00-89) issued in June 2000, GAO 
concluded that most states were not positioned to meet the 1994 ESEA 
requirement to collect and report on student assessment by designated 
subgroups. In Education's response to the report, it noted that states 
were not required to publicly report these data until the 2000-01 
school year. Specifically, Education commented, "the Department is 
reviewing State final assessment systems (using external peer 
reviewers) to ensure compliance with Title I assessment requirements, 
including the requirement that States publicly report disaggregated 
assessment data." Although Education devoted efforts to ensure that 
deadlines were met, only 17 states had approved assessment systems by 
the 2000-01 deadline.

[33] For the rest of this appendix, we will refer to scoring at the 
proficient level to mean scoring at the proficient level or higher.

[34] This example draws extensively from Cowan, Kristen Tosh. 2004. The 
New Title I: The Changing Landscape of Accountability. Washington, 
D.C.: Thompson Publishing Group.

GAO's Mission:

The Government Accountability Office, the investigative arm of 
Congress, exists to support Congress in meeting its constitutional 
responsibilities and to help improve the performance and accountability 
of the federal government for the American people. GAO examines the use 
of public funds; evaluates federal programs and policies; and provides 
analyses, recommendations, and other assistance to help Congress make 
informed oversight, policy, and funding decisions. GAO's commitment to 
good government is reflected in its core values of accountability, 
integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony:

The fastest and easiest way to obtain copies of GAO documents at no 
cost is through the Internet. GAO's Web site (www.gao.gov) contains 
abstracts and full-text files of current reports and testimony and an 
expanding archive of older products. The Web site features a search 
engine to help you locate documents using key words and phrases. You 
can print these documents in their entirety, including charts and other 
graphics.

Each day, GAO issues a list of newly released reports, testimony, and 
correspondence. GAO posts this list, known as "Today's Reports," on its 
Web site daily. The list contains links to the full-text document 
files. To have GAO e-mail this list to you every afternoon, go to 
www.gao.gov and select "Subscribe to e-mail alerts" under the "Order 
GAO Products" heading.

Order by Mail or Phone:

The first copy of each printed report is free. Additional copies are $2 
each. A check or money order should be made out to the Superintendent 
of Documents. GAO also accepts VISA and Mastercard. Orders for 100 or 
more copies mailed to a single address are discounted 25 percent. 
Orders should be sent to:

U.S. Government Accountability Office

441 G Street NW, Room LM

Washington, D.C. 20548:

To order by Phone:

Voice: (202) 512-6000:

TDD: (202) 512-2537:

Fax: (202) 512-6061:

To Report Fraud, Waste, and Abuse in Federal Programs:

Contact:

Web site: www.gao.gov/fraudnet/fraudnet.htm

E-mail: fraudnet@gao.gov

Automated answering system: (800) 424-5454 or (202) 512-7470:

Public Affairs:

Jeff Nelligan, managing director,

NelliganJ@gao.gov

(202) 512-4800

U.S. Government Accountability Office,

441 G Street NW, Room 7149

Washington, D.C. 20548: