Steady Progress: State Public Reporting Practices for Students with Disabilities after the First Year of NCLB (2002-2003)

NCEO Technical Report 40

Published by the National Center on Educational Outcomes

Prepared by:

Hilda Ives Wiley • Martha L. Thurlow • Jean A. Klein

May 2005


Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Wiley, H. I., Thurlow, M. L., & Klein, J. A. (2005). Steady progress: State public reporting practices for students with disabilities after the first year of NCLB (2002-2003) (Technical Report 40). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/Technical40.htm


Executive Summary

This report is the seventh analysis of state reports conducted by the National Center on Educational Outcomes (NCEO) to examine the extent to which states publicly report information about students with disabilities in statewide assessments. We present descriptions of statewide testing systems and examine whether these systems included participation and performance information for students with disabilities, as indicated by publicly available data. The majority of our information was obtained by analyzing states’ Department of Education Web sites. If disaggregated information was not posted, the states were then asked to submit public documents that included these results.

For the 2002–2003 school year, the number of states that reported both participation and performance data on students with disabilities for their general assessments was 36. This number was just one more than for the 2001–2002 school year, in which only 35 states reported both participation and performance data. For the 2002–2003 year, participation data were presented in a variety of ways. The most common way was to present the number of students tested. Almost all states that reported participation data did this. Twenty-seven states went beyond the numbers to report rates of participation for state-level data.

States also reported general assessment performance data for students with disabilities in a variety of ways. More often than in previous years, performance data on states’ general assessments compared students with disabilities to general education students or to the total population of students. The results clearly illustrate the achievement gap between these two groups, though the size of the gap varies dramatically across states.

Alternate assessment participation and performance reporting for 2002–2003 was available in 29 states, up considerably from only 22 states in 2001–2002. Other states provided only performance data (four states) or only participation data (three states). The data presented on alternate assessments usually consisted of just an overall count of students participating or an overall rate of students passing. Though some states broke participation and performance information down by grade level or content area, many states still provided only aggregated numbers.

More states are reporting on the participation and performance of students with disabilities for their general and alternate assessments than ever before. Additionally, the quality of reporting and the level of detail have improved. With increased Web-based reporting, searching for specific data has become easier. Based on the data from states that do report results, some recommendations for how to further improve reporting practices are presented in this report.


Overview

The issues of accountability for students in special education have been under discussion for many years (Ysseldyke et al., 1983), in part because many states were not including the results of students with disabilities in their assessments. The 1997 Individuals with Disabilities Education Act required that each state report to the public, with the same frequency and in the same detail as it reports on the assessment of nondisabled children, data on the participation and performance of students with disabilities on regular and alternate assessments. The law spurred the National Center on Educational Outcomes to begin examining the extent to which, and the ways in which, states accomplished this task (Thurlow, Ysseldyke, Erickson, & Elliott, 1997).

The first report found that states varied in the way in which they reported information on students with disabilities, and that some formats were more “user friendly” than others (Thurlow, Langenfeld, Nelson, Shin, & Coleman, 1998). It also found that most states did not report information on either the participation or performance of students with disabilities. Reports in the following years, covering 1998 to 2002, found that states slowly improved their reporting practices by disaggregating the performance and participation of these students in their reports of school, district, and state educational progress (Bielinski, Thurlow, Callender, & Bolt, 2001; Thurlow, House, Boys, Scott, & Ysseldyke, 2000; Thurlow, Langenfeld, Nelson, Shin, & Coleman, 1998; Thurlow, Nelson, Teelucksingh, & Ysseldyke, 2000; Thurlow, Wiley, & Bielinski, 2003; Ysseldyke, Thurlow, Langenfeld, Nelson, Teelucksingh, & Seyfarth, 1998). Despite slow year-to-year changes, the cumulative change is substantial: the number of states reporting disaggregated performance data on at least some state tests grew from 11 states in 1997 (Thurlow et al., 1997) to 48 states in 2002 (Thurlow & Wiley, 2004).

With the introduction of No Child Left Behind (NCLB), states face increased accountability for demonstrating the improved performance of students with disabilities on their statewide assessments. NCLB requires that states publicly report the performance of all students, including those with disabilities, on their statewide exams. The 2002–2003 school year was the first year that states were required to compare their reading and math data to data from the baseline year (2001–2002) to demonstrate changes in the performance of all students, including those with disabilities.

This report is the seventh in a series of reports that have followed states’ practices in publicly reporting state assessment information for students with disabilities. Building on the findings of earlier reports, we investigated how states reported on the participation and performance of students with disabilities in their statewide assessments, including the assessments in their accountability systems and their alternate assessments, and how they made this information available to the public.


Method

We began our search for information by reviewing every state’s Department of Education Web site. We began collecting data in March 2004 and collected information for the 2002–2003 school year. We recorded assessments administered and documented whether participation and performance information was reported for students with disabilities. We also examined the way in which participation was reported and whether participation and performance information was reported for students who took a test with accommodations. By March 2004, a large percentage of states had already posted their 2002–2003 assessment data online in a way that made the data easy to locate and understand.

On April 27, 2004, we mailed a letter to each state director of assessment outlining our findings from the state’s Web site (see Appendix A). We asked them to review our findings, correct any inaccuracies, and identify the public document or Web site where the correct information was available. We asked that they send us these changes by June 4, 2004. Many states that had changes to make either sent us printed documents with the data or directed us to a Web page that we had not found in our search. Several states gave us dates by which they expected their disaggregated assessment results to be posted. Overall, we received responses from 30 directors of assessment.

To ensure that our findings were as accurate as possible, we followed up these efforts with a letter to each state’s director of special education (see Appendix B). These letters were mailed on June 25, 2004, and asked the directors to review our findings and make any changes by July 24, 2004. For states that had already responded through the director of assessment, we noted this in the letter by stating, “These results were verified by your state’s director of assessment, but if you have anything to add, please let us know.” For states whose director of assessment had not responded, we sent the director of special education the same letter we had sent to the director of assessment. Of the 50 states to which we sent letters, 23 responded, either with corrections or to verify that our information was correct.

Finally, 10 states had not responded through either the director of assessment or the director of special education. For three of those states we had found information on students with disabilities for all of their regular and alternate assessments. For another state we had found disaggregated information for all of its regular assessments, and we knew that its alternate assessment had not been administered during the 2002–2003 school year. For the remaining six states, we made phone calls and sent e-mails until we had confirmation from either the director of assessment or the director of special education that our data were accurate.

 

Characteristics of State Assessment Systems

Appendix C lists all the state-mandated general assessments that we identified for the 50 states. This list includes the state, the name of the test, the grades and content areas tested, and whether the state had publicly available disaggregated participation and performance data for students with disabilities for their 2002–2003 state assessments. We identified 110 separate statewide tests or testing systems. Thirty-five states had more than one general assessment.

Figure 1 breaks down the 110 testing systems by type: norm-referenced tests (NRT), criterion-referenced tests (CRT), exit tests used as a gate for graduation or for earning a particular type of diploma (EXIT), and tests that combined standardized NRTs with additional state-developed test items (NRT/CRT). Although we recognized that many exit exams may also be NRTs, CRTs, or both, the high-stakes consequences of exit exams for students warranted a separate category for these tests.

Figure 1. Types of General Assessments

Criterion-referenced tests made up 58% of all the assessments that states administered in 2002–2003. In fact, only eight states (Florida, Indiana, Iowa, Missouri, Montana, New Mexico, North Dakota, and South Dakota) did not administer a CRT, though six of those states administered a test with both CRT and NRT components. Norm-referenced tests and exit exams each accounted for 18% of the tests administered. These numbers are similar to the 2001–2002 assessment pattern, in which 52% of tests were CRTs, 22% were NRTs, and 21% were exit exams (Thurlow & Wiley, 2004).


States Reporting Disaggregated 2002–2003 General Assessment Data for Students with Disabilities

Figure 2 summarizes the different ways in which general assessment data were reported in all 50 states. Overall, 72% of states reported disaggregated participation and performance information on students with disabilities for all their assessments, 2% reported performance for all assessments (but not participation data), 20% reported participation and performance information for some assessments, and 6% did not report any disaggregated information.

Figure 2. States that Disaggregate Assessment Results for Students with Disabilities

Figure 3 indicates which of the 50 states reported their data in each of the four ways shown in Figure 2. States that reported disaggregated data for students with disabilities at the state level generally reported results at the district and school level, too.

Figure 3. States that Report 2002–2003 Disaggregated Results for Students with Disabilities


Figure 4 shows the number of states that reported participation and performance data for the tests included in their statewide accountability systems. In many states, only a subset of assessments is part of the No Child Left Behind accountability system. When we examined just the NCLB assessments, we found that 40 states reported participation and performance for students with disabilities on all of these assessments. Although this is more than the number of states reporting information on all the assessments given in a state, it is still not all 50 states. As evident in Figure 4, the states that do disaggregate for all accountability assessments are spread across the U.S. and include states with both small and large populations. The states that reported disaggregated 2002–2003 data for their general assessments did so regardless of whether they had one assessment or several, and regardless of whether they tested in just a few grades or in as many as 10 grades. The tests that are part of each state’s accountability system are indicated by an asterisk before the test name in Appendix C.

Figure 4. States that Report 2002–2003 Disaggregated Results for Students with Disabilities in their State Accountability Systems


Of the six states that reported participation and performance information for only some of their accountability assessments, half (Florida, Nevada, and South Carolina) were missing data on just one test. South Carolina reported performance data for all of its tests and was missing only participation data for one assessment. Of the states that did not report disaggregated state-level information, Wyoming and Oregon did report disaggregated information at the district level.


States Reporting 2002–2003 Alternate Assessment Data for Students with Disabilities

As shown in Figure 5, results from our Web searches and mailings revealed that 29 states publicly reported both participation and performance results at the state level for their alternate assessment. An additional four states reported performance only, and three states reported participation only. Thus, 28% of states did not report any type of information about their alternate assessment. Still, 58% of states reported both participation and performance for their alternate assessment, an increase over the 44% that did so in the 2001–2002 school year.

Figure 5. Information States Reported for their Alternate Assessment

Figure 6 illustrates which states reported alternate assessment participation and performance data. There is no obvious geographic pattern to the states that did not report alternate assessment data, nor are the states reporting no information simply states that lacked an alternate assessment in 2002–2003.

 

Figure 6. States Publicly Reporting State-Level Data for the 2002–2003 Alternate Assessment

 


Assessment Participation in 2002–2003

General Assessment Participation Results

Among the states identified as providing participation data for students with disabilities, the way in which this information was reported varied (see Appendix D). Figure 7 illustrates the number of assessments with disaggregated participation data and how those participation data were reported. Information is presented in terms of the number of assessments for which participation data were available, not in terms of the number of states. For example, in Alabama there are three assessments and each is counted separately. We used this approach because not all states report participation in the same way across assessments. For example, one state might report only a count of students tested for one assessment, but for another assessment it might report a count tested, a percent tested, and a percent not tested.

Figure 7. Participation Reporting Approaches for General Assessments (Number of Tests=110)

 

Reporting a percentage of students tested is more informative than just reporting the number of students tested, although there are good reasons to report both the number and the percentage. Twenty-seven states (41 assessments total) reported either the percent of students tested or the percent not tested for at least one of their assessments. For 34 assessments, the percent of students tested was given, and for 14 assessments, the percent of students not tested was given. Seventy-four assessments provided the number of students tested, making this by far the most frequent way of reporting participation data. The number or percent of students who were exempt or excluded from assessments was given for six tests and the number or percent of students absent was given for 14 tests.

Figure 8 illustrates the participation rates reported in those states with clear participation rate information. Though the percentage of students tested or not tested was given for 41 assessments, those assessments came from only 27 states. While it may have been possible to calculate participation rates for other states using reported enrollment counts and numbers of students tested, we did not perform those calculations because our focus was on the information states made readily available. When a state reported only the percentage of students not tested, however, we report the corresponding percentage of students tested in the table. It is important that states report the percentage of students tested, in addition to a count, because a percentage presents a more accurate picture of how many students are participating. These rates should ideally be based on school enrollment on the day of testing (Ysseldyke, Thurlow, Langenfeld, Nelson, Teelucksingh, & Seyfarth, 1998); using December 1 Child Count data is also acceptable if test-day enrollment is not available.
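The two calculations involved here are simple arithmetic. The following Python sketch, using hypothetical numbers rather than actual state data, shows how a participation rate follows from a count and enrollment, and how the percent tested can be recovered when a state reports only the percent not tested:

```python
def participation_rate(num_tested: int, enrollment: int) -> float:
    """Percent of enrolled students with disabilities who were tested."""
    return round(100 * num_tested / enrollment, 1)

def pct_tested_from_not_tested(pct_not_tested: float) -> float:
    """Recover percent tested when only percent not tested is reported."""
    return round(100 - pct_not_tested, 1)

# Hypothetical example: 940 of 1,000 enrolled students with disabilities tested
print(participation_rate(940, 1000))       # 94.0
print(pct_tested_from_not_tested(6.0))     # 94.0
```

Note that both calculations require the same denominator (enrollment on the day of testing, or Child Count data) for their results to be comparable.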

Figure 8. Percentages of Students with Disabilities Participating in Middle School General Assessments in Those States with Clear Participation Reporting of Rates

To summarize participation rate information, we selected one grade to portray in Figure 8. In most states, participation in the middle school/junior high school math test was used; if a state tested in more than one middle school grade, the 8th grade test data were used. Appendix E contains information about the tests and exact grades used for Figure 8. Percentages in the figure are rounded to the nearest whole number. Not all states provided data broken down in this way. In Ohio, Pennsylvania, South Carolina, and Virginia, the data are for the math test but are aggregated across grade levels. Four other states (California, Kentucky, Minnesota, and New Hampshire) provided a rate, but it was the number of students with disabilities tested out of all students rather than the percent of students with disabilities who were tested. West Virginia provided a rate, but it was aggregated across all grades and subjects. It is important to note that the results in Figure 8 come from different types of tests used in these states. Nevertheless, during the 2002–2003 academic year, participation rates ranged from 51% to 100%, and 13 of the 21 states had participation rates of 95% or higher.

 

Alternate Assessment Participation Results

Figure 9 illustrates how states reported participation for their alternate assessment. Far more participation information was provided for 2002–2003 than for the previous testing year. Appendix F outlines in more detail all the ways that information was reported. Thirty-two states provided participation information for their alternate assessments. As with the regular assessment, the most common way of reporting participation information was to give the number of students tested, which was done by 28 states. Fifteen states gave a rate: the percent of students tested for 14 states and the percent not tested for one state. Only one state provided the percent of students who were exempt, and three states provided either the rate or count of students who were absent.

Figure 9. Participation Reporting Approaches for Alternate Assessments (Number of States=32)

Fifteen states provided a rate of either the percent of students tested or the percent not tested in their alternate assessments. These rates are shown in Figures 10, 11, and 12. Appendix G provides more details about the grades and content areas included in the table. When possible, we tried to use rates from 8th grade math. We divided this information into three figures because there were three different ways in which participation data were presented by states. Six states gave the percent of students tested out of the total number who were eligible/recommended to take the alternate assessment (Figure 10). North Carolina administered two different alternate assessments (NC-1 indicates the AAAI and NC-2 indicates the Portfolio Assessment), and both of these are shown in Figure 10.

Nine states provided information on the percent of students tested on the alternate assessment out of all the students enrolled (see Figure 11). Finally, three states provided information about the number of students who took the alternate out of all their students with disabilities (see Figure 12). Nebraska was the one state not included in any figure because it only provided a rate for its reading test and did not administer a math test during the 2002–2003 school year.
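Because states used three different denominators, the same count of tested students can yield very different-looking rates. The Python sketch below, with entirely hypothetical counts, illustrates the three rate definitions behind Figures 10, 11, and 12:

```python
# Hypothetical counts for one state's 8th grade math alternate assessment
tested_alternate = 180
eligible_for_alternate = 200        # recommended/eligible for the alternate (Figure 10)
all_students_enrolled = 20_000      # total student enrollment (Figure 11)
students_with_disabilities = 2_400  # all students with disabilities (Figure 12)

def pct(part: int, whole: int) -> float:
    """Percentage of `whole` represented by `part`, to one decimal."""
    return round(100 * part / whole, 1)

print(pct(tested_alternate, eligible_for_alternate))      # 90.0 -- of eligible students
print(pct(tested_alternate, all_students_enrolled))       # 0.9  -- of all enrolled students
print(pct(tested_alternate, students_with_disabilities))  # 7.5  -- of students with disabilities
```

The spread across these three results (90.0% vs. 0.9% vs. 7.5% for the same 180 students) is why the rates cannot be compared across figures, only within them.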

Figure 10. Percentages of Students with Disabilities tested with the Alternate Assessment out of the Total Number of Students Recommended/Eligible for the Alternate Assessment

 

Figure 11. Percentages of Students with Disabilities tested with the Alternate Assessment out of the Total Number of Enrolled Students

Figure 12. Percentages of Students with Disabilities tested with the Alternate Assessment out of the Total Number of Students with Disabilities


Other Information Collected for 2002–2003

In our analysis of state reporting for 2002–2003, we looked at additional characteristics of states’ information. Specifically, we looked at information available on accommodations used, and if available, performance when accommodations were used. We also examined the quality of Web-based reporting.

 

Accommodations

Fifteen states provided state-level information about students who took an assessment with an accommodation. Some states reported on standard accommodations (those considered appropriate because they do not change the constructs measured by the assessment); others reported on nonstandard accommodations (those generally considered to change the constructs measured, sometimes referred to as “non-allowed,” although IEP teams could still select them); and still others reported on both or did not specify which.

Table 1 describes the information the 15 states provided. Appendix H contains additional information about the data provided by these states, with details about the participation and performance of students in each category that the state provides. Five states broke down student participation and performance by accommodation (e.g., directions read orally, Braille, extended time), and ten states provided only overall information on students who, in general, used accommodations.

Table 1. States that Reported State-Level Information about Accommodations for Reading or Math

 

State          | Standard/Non-Standard Accommodation | Participation | Performance | For whom
---------------|-------------------------------------|---------------|-------------|------------------------
Arizona        | Non-Standard                        | Yes           | Yes         | SWD
Colorado       | Standard                            | Yes           | Yes         | ALL
Colorado       | Non-Standard                        | Yes           | No          | ALL
Georgia        | Standard & Non-Standard             | Yes           | No          | SWD & All & General Ed
Indiana        | Standard                            | Yes           | Yes         | SWD & ALL
Kentucky       | Standard                            | Yes           | Yes         | SWD
Louisiana      | Standard                            | Yes           | Yes         | ALL
Maine          | Not Specified                       | Yes           | No          | SWD
Massachusetts  | Not Specified                       | Yes           | No          | SWD
Michigan       | Standard & Non-Standard             | Yes           | Yes         | ALL
Missouri       | Not Specified                       | Yes           | Yes         | SWD
New Hampshire  | Non-Standard                        | Yes           | Yes         | ALL
New Mexico     | Standard                            | Yes           | Yes         | SWD & ALL
North Carolina | Standard & Non-Standard             | Yes           | Yes         | ALL
Pennsylvania   | Standard                            | Yes           | No          | ALL
Rhode Island   | Not Specified                       | Yes           | Yes         | SWD

Note: SWD = Students with Disabilities

 

Quality Analysis of Web-Based Reporting

After examining every state’s Department of Education Web site, it became evident that some states presented data in a much more accessible format than others. Because assessment data are reported on the Web in most states, it is crucial that these data be clear and easy to access. We decided to collect data for each state that reported results for students with disabilities online and examine the quality of the reporting on the Web site. It is important to note, however, that because Web sites are frequently updated, it is possible that some of our findings no longer hold true.

Several states used drop-down menus that allowed an individual to select the test, year, grade, and student group of interest. The Web site then displayed a chart of the scores in question. In some cases, these charts were relatively easy to understand and showed how the test was scored and what percentage of students attained satisfactory scores (e.g., Washington). Other states provided the percentage of students attaining a given score, but it was not clear which scores constituted satisfactory completion of the test (e.g., New York for the Regents Competency Test). Still other states provided charts with student scores separated by student status groups (e.g., Texas).


Assessment Performance in 2002–2003

General Assessment Performance Results

We examined the performance of all students, and then the performance of students with disabilities. When examining performance across states, it is important to remember that the scores from each state are based on different tests. These tests may emphasize different standards and are likely to differ in difficulty. In addition, there is great variability across states in terms of the percentages of students with disabilities whose scores have been included in the assessments. Thus, it is not appropriate to compare performance across states. It is possible, however, to examine the performance differences within each state between all students and students with disabilities.

Performance results are reported for both reading and math assessments because these content domains are the ones assessed by most states and are the content areas required first by NCLB to be assessed, reported, and included in accountability. For greater comparability in what we report and because states are now moving away from norm-referenced tests toward a wider use of criterion-referenced tests, we only report performance on CRTs. We also report performance on exit exams that students are required to pass to graduate from high school with a standard diploma.

We separated grade levels into three categories: elementary (3–5), middle school (6–8), and high school (9–12). For our summary, we chose to present only one grade for each level. When available, 4th grade was used to represent the elementary level, 8th grade the middle school level, and 10th grade the high school level. These grades were chosen because they are the grades at which the greatest number of states test students. If data from those grades were not available, the grade below was used, followed by the remaining grade if no other data were available. The number in parentheses next to each state’s name indicates the grade from which the data were obtained. Appendix I reports the name of the test and the grade we used.

Although most states reported the performance of all students and then the performance of subgroups, such as students with disabilities, some states did not report the performance of all students. When these data were not available, the performance of general education students was given. Because the performance of general education students as a group may be slightly higher than the performance of all students as a group, we have indicated those states with “all students” actually based only on general education students by an asterisk after the name of the state.

It should further be noted that two states (Rhode Island and Vermont) only provided subtest scores on their assessments. In these cases, subtest scores for reading skills and math basic understanding are reported. States were dropped if they only reported aggregated scores across grades. Thus, South Carolina is not reflected in any of the figures because it provided only aggregated data across grades for its students with disabilities.

 

Reading Performance

Figures 13–15 present the reading performance of students. The performance of students with disabilities in reading was generally much lower than the performance of all students. Though the gap was greater in some states than in others, students with disabilities consistently performed below all students, and the gap widened as students moved from elementary to high school. At the elementary level, the widest gap was 37.2 percentage points, in New Jersey. In middle school, the greatest gap was 57 percentage points, again in New Jersey. At the high school level, the largest gap was 59.95 percentage points, in Delaware. Though these are the largest gaps, the pattern is the same for most states.
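The gaps reported throughout this section are differences in percentage points (subtracting one proficiency rate from another), not relative percent change. A minimal Python sketch, using hypothetical proficiency percentages rather than actual state results, shows the calculation:

```python
def gap(all_students_pct: float, swd_pct: float) -> float:
    """Achievement gap in percentage points: all students minus
    students with disabilities, both as percent proficient."""
    return round(all_students_pct - swd_pct, 2)

# Hypothetical: 82.5% of all students proficient vs. 45.3% of SWD
print(gap(82.5, 45.3))  # 37.2 percentage points
```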

Figure 13. Elementary School Reading Performance on Criterion-Referenced Tests

Figure 14. Middle School Reading Performance on Criterion-Referenced Tests

Figure 15. High School Reading Performance on Criterion-Referenced Tests

Mathematics Performance

Performance of all students and students with disabilities on states’ 2002–2003 mathematics assessments is shown in Figures 16–18. The figures cover elementary, middle, and high school. The same cautions apply to these figures as applied to the reading figures.

Figure 16. Elementary School Mathematics Performance on Criterion-Referenced Tests

Figure 17. Middle School Mathematics Performance on Criterion-Referenced Tests

Figure 18. High School Mathematics Performance on Criterion-Referenced Tests

As shown in Figures 16–18, the gap between students with disabilities and all students on math assessments is quite similar to the gap found for reading assessments. The gap for math assessments exists in all states and varies considerably from state to state. The gap also increases by grade level. In elementary grades, the largest gap was 38 percentage points in Arizona. In middle school, the largest gap was 50 percentage points in Wisconsin, and in high school it was 55.1 percentage points in Idaho.

Figures 19 and 20 show the results of high school reading and math exit exams. States administer exit exams in different grades. The number in parentheses next to each state’s name indicates the grade from which the data come; if no number is indicated, the exit exam incorporates multiple high school grades.

Figure 19. Percent Passing Minimum Competency/High School Reading Exit Exam

Figure 20. Percent Passing Minimum Competency/High School Mathematics Exit Exam

Only those states that report disaggregated results for students with disabilities are included in these figures. Also, these results reflect only the first administration of the exit exam. States offer multiple retest opportunities for their exit exams, and the percent passing increases with each retest; the gaps between general and special education students often become very small on retesting. New York offers two exit exams: the Regents Comprehensive Examination is referred to as NY1 and the Regents Competency Test as NY2 in the figures. (NY2 is a “safe harbor” assessment available only to students with disabilities and students who received special education services in previous years; it reflects an older, less rigorous standard than NY1 and is used together with additional locally selected assessments.) Virginia offers math tests by content area; we selected Algebra I for Figure 20.

The figures presented here for first-time testing show that large gaps exist on exit exams, though the percent of students passing varies widely by state. For both reading and math, New Jersey had the largest gap (54.8 percentage points for reading; 51.4 percentage points for math). The smallest gap in both subjects was on New York's Regents Competency Test (12 percentage points for each).


Discussion

This seventh analysis of state education public reporting shows that states seem to have stalled in their reporting of participation and performance of students with disabilities. About the same number are reporting disaggregated information on their general assessments as in 2001–2002, and fewer than half of the states are reporting both participation and performance information for their alternate assessments, up just slightly from the number in 2001–2002.

A total of 47 states reported some state-level information about students with disabilities on their state assessments. Of these, only 36 reported participation and performance for all of their assessments, an additional 10 provided participation and performance information for some of their assessments, and 1 reported performance data for all of its tests but not participation. The number of states reporting both participation and performance rose slightly, from 35 in 2001–2002 to 36 in 2002–2003. Participation rates for students with disabilities ranged from 71.1% to 99.1% in 2001–2002 and from 51% to 100% in 2002–2003. Thirteen of the 21 states that provided clear rates had participation rates of 95% or higher.

When examining alternate assessments, only 36 states reported any information. Though this is an increase from 32 states during the 2001–2002 year, states clearly are not reporting on their alternate assessments at the same level as they are for their general assessments. Twenty-nine states provided both participation and performance data for their alternate assessments (up from 22 states in 2001–2002), four states gave performance data only, and three states gave participation data only. The lower level of alternate assessment reporting seems to be due only in part to the fact that some states were still developing their alternate assessments. According to Thompson and Thurlow (2001), all but two states had an alternate assessment approach by 2001, and all but 16 states had decided how scores from the alternate assessments would be reported. The 14 states that did not provide information for 2002–2003 are likely among the 16 that they identified.

For their general assessments, 27 states reported the percent of students tested or not tested for at least one of their assessments (41 assessments total). This is a far more informative way of presenting data than giving only the number of students tested; nevertheless, the number of students tested continues to be the most common way of reporting participation (74 assessments). The number or percent of students who were exempt or excluded from assessments was given for six tests, and the number or percent of students absent was given for 14 tests. For alternate assessments, the most common way of reporting participation was to give the number of students tested, which 28 states did; only 15 states gave a rate.

When we examined the performance of students, we found that large gaps existed on the general assessment between students with disabilities and all students. Though some gaps were considerably larger than others, the gaps were noticeable in every state that provided performance data, and they increase as students get older.
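The gaps discussed throughout this report are simple differences, in percentage points, between the percent of all students scoring proficient and the percent of students with disabilities scoring proficient. A minimal sketch of the calculation (the numbers below are hypothetical, for illustration only, not data from any state's report):

```python
# Achievement gap in percentage points, as used throughout this report:
# (percent proficient, all students) minus
# (percent proficient, students with disabilities).
# The figures below are hypothetical examples, not actual state data.

def gap(pct_all_students: float, pct_students_with_disabilities: float) -> float:
    """Return the achievement gap in percentage points."""
    return pct_all_students - pct_students_with_disabilities

# e.g., 72% of all students proficient vs. 34% of students with disabilities
print(gap(72.0, 34.0))  # 38.0 percentage points
```

Note that a gap expressed this way depends on where each state sets its proficiency cut score, which is one reason the gaps reported here are not comparable across states.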

 

Recommendations for Reporting

With the push from NCLB to provide assessment data to schools by the start of the school year, Web-based reporting has clearly become the primary vehicle for sharing data with the public. It is crucial, then, that the data be easy both to locate and to comprehend. Based on our analyses of both Web-based and paper reports, we make the following recommendations:

1.  Report not only the number of students with disabilities assessed, but also the percentage assessed. A raw count is less informative than a percentage: a percentage lets the public gauge what proportion of students with disabilities actually participated in the state assessment system.

2.  Report results for the alternate assessment. Though states are finally beginning to provide participation and performance data for their general assessments, they remain slow to report that information for their alternate assessments. This information should be provided so that the public can see how all students are performing.

3.  Report the number and percent of students with disabilities using accommodations. Many students with disabilities cannot take the general assessment in the standard format and thus are provided with accommodations. Many states treat the scores from some of these accommodated administrations as either not counting or counting as “not proficient” because the accommodations are non-standard. In some states, the number of students participating with non-standard accommodations is quite high. If these numbers are not reported, the picture painted of how all students are doing will be inaccurate. It is important to know the extent to which students are using accommodations, and specifically those accommodations that result in the removal of their scores.
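Recommendation 1 amounts to pairing every reported count with the enrollment it is drawn from. A minimal sketch of the underlying arithmetic (the counts below are hypothetical, for illustration only, not actual state data):

```python
# Participation rate: the percent of enrolled students with disabilities
# who took the assessment. The counts below are hypothetical examples.

def participation_rate(number_tested: int, number_enrolled: int) -> float:
    """Percent of enrolled students who took the assessment, to one decimal."""
    if number_enrolled <= 0:
        raise ValueError("enrollment must be positive")
    return round(100 * number_tested / number_enrolled, 1)

# e.g., 4,830 students with disabilities tested out of 5,000 enrolled
print(participation_rate(4830, 5000))  # 96.6
```

A count of 4,830 students tested means little on its own; the same count is a strong result against 5,000 enrolled students and a weak one against 9,000, which is why the rate, not the count, is the informative figure.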

After completing this analysis of the first year in which NCLB's labeling consequences took effect, it is surprising to see that some states still are not reporting results for all of their assessments, particularly their alternate assessments. It was also informative to compare states' reporting for all of their assessments with their reporting for only those assessments that are part of the state's accountability system. Though only 36 states gave participation and performance data for all of their tests, this number rose to 40 when considering only accountability tests. Even so, NCLB requires that subgroup participation and performance be reported at the state level for these accountability tests; ten states therefore still lag behind the legislation.


References

Bielinski, J., Thurlow, M. L., Callender, S., & Bolt, S. (2001). On the road to accountability: Reporting outcomes for students with disabilities (Technical Report 32). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

No Child Left Behind Act of 2001, 20 U.S.C. 6301 et seq. (2002).

Thompson, S. L., & Thurlow, M. L. (2001). 2001 State special education outcomes: A report on state activities at the beginning of a new decade. Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., House, A., Boys, C., Scott, D., & Ysseldyke, J. (2000). State assessment policies on participation and accommodations for students with disabilities: 1999 update (Synthesis Report 33). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., Langenfeld, K. L., Nelson, J. R., Shin, H., & Coleman, J. E. (1998). State accountability reports: What are states saying about students with disabilities? (Technical Report 20). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., Nelson, J. R., Teelucksingh, E., & Ysseldyke, J. E. (2000). Where’s Waldo? A third search for students with disabilities in state accountability reports (Technical Report 25). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., & Wiley, H. I. (2004). Almost there in public reporting of assessment results for students with disabilities (Technical Report 39). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M., Wiley, H. I., & Bielinski, J. (2003). Going public: What 2000–2001 reports tell us about the performance of students with disabilities (Technical Report 35). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., Ysseldyke, J. E., Erickson, R. N., & Elliott, J. L. (1997). Increasing the participation of students with disabilities in state and district assessments (Policy Directions No. 6). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Ysseldyke, J. E., et al. (1983). Generalizations from five years of research on assessment and decision making: The University of Minnesota Institute. Exceptional Education Quarterly, 41, 75–93.

Ysseldyke, J. E., Thurlow, M. L., Langenfeld, K., Nelson, J. R., Teelucksingh, E., & Seyfarth, A. (1998). Educational results for students with disabilities: What do the data tell us? (Technical Report 23). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Appendix A

Verification Letter to State Assessment Director

The National Center on Educational Outcomes is examining states’ public reports on 2002–2003 school year assessment results. We have reviewed your state’s Web site for both participation and performance data on your statewide assessments. Attached tables reflect what we believe to be the tests your state administers and the results that we have found thus far on the Web (Table 1), how participation information is reported for students with disabilities (if it is available) (Table 2), and whether information is given about students who took assessments with individual accommodations (Table 3).

Please review the tables and verify their accuracy. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities, (c) describe the way participation information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications.

If any data element is inaccurate, please provide us with the public document and/or website that contains the accurate information. Address your responses to Hilda Ives Wiley at the above address.

If you have any questions about our request, please call Hilda Ives Wiley at (612) 626–8913 or email: ives0016@umn.edu. If we do not hear from you by Friday, June 4, 2004, we will assume that our summaries are accurate.

Thank you for taking the time to verify our findings.

Sincerely,

Hilda Ives Wiley
Graduate Research Assistant

Martha Thurlow
Director

 

Table 1: Tests Administered and Results Found

Please review this table for its accuracy, make any changes (if necessary), and fill in any blank spaces.

| State | Test | Grades Tested | Subject Areas | Disaggregated Info Reported: Participation | Disaggregated Info Reported: Performance | Part of State Accountability System? (Yes/No) |
|-------|------|---------------|---------------|--------------------------------------------|--------------------------------------------|-----------------------------------------------|
| AL | Direct Assessment of Writing (criterion-referenced) | 5, 7 | Writing | Yes | Yes |  |
| AL | High School Graduation Exam (Exit Exam) | 12 | Reading, Math, Science, Social Studies | Yes | Yes |  |
| AL | SAT-10 (norm-referenced) | 3–8 | Reading, Language, Math, Science (7) | Yes | Yes |  |
| AL | Alternate Assessment | 3–8, 11, 12 | Not specified | Yes | Yes |  |

 

 

 

Table 2: Participation Information for Students with Disabilities

Please review this table, which describes the way in which participation data are publicly reported in your state. A dot in the box indicates information is reported in this way. Please add dots if you know of any other method of participation reporting, and please provide us with the information that is reported in that way (either a hard copy or a Web-link).

| State | Test | Number Tested | Number Not Tested | Number Exempt | Number Excluded | % Tested | % Not Tested | % Exempt | % Excluded | Number and/or % Absent |
|-------|------|---------------|-------------------|---------------|-----------------|----------|--------------|----------|------------|------------------------|
| AL | Direct Assess. of Writing | · |  |  |  |  |  |  |  |  |
| AL | HS Grad. Exam | · |  |  |  |  |  |  |  |  |
| AL | SAT-10 | · |  |  |  |  |  |  |  |  |
| AL | Alternate Assess. | · |  |  |  |  |  |  |  |  |

 Blank cell = No data

Table 3: Accommodations

We are interested in examining if and how states report information about students who take assessments using accommodations. Please change our responses (if necessary) to reflect information that is reported for your state. If you do make changes, please provide us with the information (either a hard-copy or a Web-link).

 

| Test | Standard Administration: Participation | Standard Administration: Performance | Nonstandard Administration: Participation | Nonstandard Administration: Performance |
|------|----------------------------------------|--------------------------------------|-------------------------------------------|------------------------------------------|
| Direct Assessment of Writing | No | No | No | No |
| High School Graduation Exam | No | No | No | No |
| SAT-10 | No | No | No | No |
| Alternate Assessment | No | No | No | No |

 


Appendix B

Letters to State Directors of Special Education

(Two forms were used, depending on input from the Assessment Director. The example here is the form used when the tables had been verified by the Assessment Director; if there was no verification, the letter was the same as in Appendix A.)

The National Center on Educational Outcomes is examining states’ public reports on 2002–2003 school year assessment results. We have reviewed your state’s Web site for both participation and performance data on your statewide assessments. Attached tables reflect what we believe to be the tests your state administers and the results that we have found thus far on the Web (Table 1), how participation information is reported for students with disabilities (if it is available) (Table 2), and whether information is given about students who took assessments with individual accommodations (Table 3). These tables have been verified by your state’s Director of Assessment, but if you have anything to add, please let us know.

Please review the tables and verify their accuracy. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities, (c) describe the way participation information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications.

If any data element is inaccurate, please provide us with the public document and/or website that contains the accurate information. Address your responses to Hilda Ives Wiley at the above address.

If you have any questions about our request, please call Hilda Ives Wiley at (612) 626-8913 or email: ives0016@umn.edu. If we do not hear from you by Friday, July 24, 2004, we will assume that our summaries are accurate.

Thank you for taking the time to verify our findings.


Sincerely,


Hilda Ives Wiley
Graduate Research Assistant

Martha Thurlow
Director

Table 1: Tests Administered and Results Found

Please review this table for its accuracy, make any changes (if necessary), and fill in any blank spaces.

| State | Test | Grades Tested | Subject Areas | Disaggregated Info Reported: Participation | Disaggregated Info Reported: Performance | Part of State Accountability System? (Yes/No) |
|-------|------|---------------|---------------|--------------------------------------------|--------------------------------------------|-----------------------------------------------|
| AL | Direct Assessment of Writing (criterion-referenced) | 5, 7 | Writing | Yes | Yes |  |
| AL | High School Graduation Exam (Exit Exam) | 12 | Reading, Math, Science, Social Studies | Yes | Yes |  |
| AL | SAT-10 (norm-referenced) | 3–8 | Reading, Language, Math, Science (7) | Yes | Yes |  |
| AL | Alternate Assessment | 3–8, 11, 12 | Not specified | Yes | Yes |  |

 

 

 

Table 2: Participation Information for Students with Disabilities

Please review this table, which describes the way in which participation data are publicly reported in your state. A dot in the box indicates information is reported in this way. Please add dots if you know of any other method of participation reporting, and please provide us with the information that is reported in that way (either a hard copy or a Web-link).

| State | Test | Number Tested | Number Not Tested | Number Exempt | Number Excluded | % Tested | % Not Tested | % Exempt | % Excluded | Number and/or % Absent |
|-------|------|---------------|-------------------|---------------|-----------------|----------|--------------|----------|------------|------------------------|
| AL | Direct Assess. of Writing | · |  |  |  |  |  |  |  |  |
| AL | HS Grad. Exam | · |  |  |  |  |  |  |  |  |
| AL | SAT-10 | · |  |  |  |  |  |  |  |  |
| AL | Alternate Assess. | · |  |  |  |  |  |  |  |  |

 Blank cell = No data

 

Table 3: Accommodations

We are interested in examining if and how states report information about students who take assessments using accommodations. Please change our responses (if necessary) to reflect information that is reported for your state. If you do make changes, please provide us with the information (either a hard-copy or a Web-link).

 

| Test | Standard Administration: Participation | Standard Administration: Performance | Nonstandard Administration: Participation | Nonstandard Administration: Performance |
|------|----------------------------------------|--------------------------------------|-------------------------------------------|------------------------------------------|
| Direct Assessment of Writing | No | No | No | No |
| High School Graduation Exam | No | No | No | No |
| SAT-10 | No | No | No | No |
| Alternate Assessment | No | No | No | No |

 


Appendix C

2002–2003 State Assessment Systems and Status of Disaggregated Data
 

| State | Assessment Component | Grades | Subject | Part | Perf |
|-------|----------------------|--------|---------|------|------|
| Alabama | *Direct Assessment of Writing [CRT] | 5,7 | Writing | Yes | Yes |
| Alabama | *High School Graduation Exam [EXIT] | 12 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| Alabama | *Stanford Achievement Test, 10th ed. (SAT-10) [NRT] | 3–8 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| Alaska | *Benchmark Exams [CRT] | 3,6,8 | Reading, Math, Writing | Yes | Yes |
| Alaska | *High School Graduation Qualifying Exam [EXIT] | 10 | Reading, Math, Writing | Yes | Yes |
| Arizona | *Stanford Achievement Test, 9th ed. (SAT-9) [NRT] | 2–9 | Reading, Language, Math | Yes | Yes |
| Arizona | *AZ Instrument to Measure Standards (AIMS) [CRT] | 3,5,8 | Reading, Math, Writing | Yes | Yes |
| Arizona | *AIMS [EXIT] | 10 | Reading, Math, Writing | Yes | Yes |
| Arkansas | Stanford Achievement Test, 9th ed. (SAT-9) [NRT] | 5,7,10 | Complete Battery | No | No |
| Arkansas | *Arkansas Benchmark Exams (including End-of-Course) [CRT] | 4,6,8, 9–12 | Literacy [Reading & Writing] (4,6,8,11), Math (4,6,8), EOC–Algebra I (9–12), EOC–Geometry (9–12) | No | No |
| California | *Content Standards [CRT] | 2–11 | English Language Arts, Math (2–9), Algebra I & II (8–11), Integ. Math I–III (9–11), Geometry (8–11), Soc. Studies (8), World Hist. (10), U.S. Hist. (11), Bio./Life Sci. (9–11), Chem. (9–11), Earth Sci. (9–11), Physics (9–11), Integ./Coord. Sci. (9–11) | Yes | Yes |
| California | Spanish Assessment of Basic Education (SABE/2) [NRT] | 2–11 | Reading, Language, Math, Spelling (2–8) | Yes | Yes |
| California | *California Achievement Test, 6th ed. (CAT-6) [NRT] | 2–11 | Reading, Language, Math, Spelling (2–8), Science (9–11) | Yes | Yes |
| Colorado | *CO Student Assessment Program (CSAP) [CRT] | 3–10 | Reading, Math (5–10), Writing, Science (8) | Yes | Yes |
| Connecticut | *CT Mastery Test (CMT) [CRT] | 4,6,8 | Reading, Math, Writing | Yes | Yes |
| Connecticut | *CT Academic Performance Test (CAPT) [CRT] | 10 | Reading, Math, Writing, Science | Yes | Yes |
| Delaware | *DE Student Testing Program (DSTP) [SAT-9 for reading and math, with other criterion measures; NRT/CRT] | 2–11 (1) | Reading (2–11), Math (2–11), Writing (3,5,8,10), Science (8,10), Social Studies (8,11) | Yes | Yes |
| Florida | *FL Comprehensive Assessment Test (FCAT), includes SAT-9 [NRT/CRT] | 3–10 | Reading, Math, Writing | Yes | Yes |
| Florida | High School Competency Test (HSCT) [EXIT] (for those not exempted by their FCAT performance in 10th grade) | 11 | Reading, Math | No | No |
| Georgia | *GA High School Graduation Test (GHSGT) [EXIT] | 11 | English/Language Arts, Math, Science, Social Studies | Yes | Yes |
| Georgia | *Criterion-Referenced Competency Tests (CRCT) [CRT] | 1–8 | Reading, English/Language Arts, Math, Science (3–8), Social Studies (3–8) | Yes | Yes |
| Georgia | *Middle Grades/High School Writing Assessment [CRT] | 5,8,11 | Writing | Yes | Yes |
| Hawaii | *HI Content and Performance Standards (HCPS II) State Assessment [CRT] | 3,5,8,10 | Reading, Math | Yes | Yes |
| Idaho | ID Direct Assessments [CRT] | 4–6,8,9 | Math (4,6,8), Writing (5,9) | Yes | Yes |
| Idaho | *Idaho Standards Achievement Tests (ISAT) [CRT] | 2–10 | Reading/Language Arts, Math | Yes | Yes |
| Idaho | Idaho Reading Indicator (IRI) [CRT] | K–3 | Reading | Yes | Yes |
| Illinois | *IL Standards Achievement Test (ISAT) [CRT] | 3,4,5,7,8 | Reading (3,5,8), Math (3,5,8), Writing (3,5,8), Science (4,7), Social Studies (4,7) | No | Yes |
| Illinois | *Prairie State Achievement Exam [CRT] | 11 | Reading, Math, Writing, Science, Social Studies | No | Yes |
| Indiana | *IN Statewide Testing for Educational Progress (ISTEP+) [NRT/CRT] | 3,6,8 | English Language Arts, Math | Yes | Yes |
| Indiana | *Graduation Qualifying Exam [EXIT] | 10 | English Language Arts, Math | Yes | Yes |
| Iowa | *ITBS/ITED [NRT] | 3–12 (only grades 4,8,10 reported) | Reading, Math, Science (8,11) | Yes | Yes |
| Kansas | *KS Assessment System [CRT] | 4–8,10,11 | Reading (5,8,11), Math (4,7,10), Science (4,7,10), Social Studies (6,8,11) | Yes | Yes |
| Kentucky | *Comprehensive Test of Basic Skills, 5th ed. (CTBS/5) [NRT] | 3,6,9 | Reading, Language, Math | Yes | Yes |
| Kentucky | *KY Core Content Test [CRT] | 4,5,7,8, 10–12 | Reading (4,7,10), Math (5,8,11), Writing (4,7,12), Science (4,7,11), Social Studies (5,8,11), Arts & Humanities (5,8,11), Practical Living & Vocational Studies (5,8,10) | Yes | Yes |
| Louisiana | *LA Educational Assessment Program (LEAP 21) [CRT] | 4,8 | English/Language Arts, Math, Science, Social Studies | Yes | Yes |
| Louisiana | *Graduation Exit Exam (GEE-21) [EXIT] | 10, 11 | Language Arts, Math, Science, Social Studies | Yes | Yes |
| Louisiana | *Iowa Tests of Basic Skills/Iowa Tests of Educational Development [NRT] | 3,5,6,7,9 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| Maine | *Maine Educational Assessment (MEA) [CRT] | 4,8,11 | Reading, Math | Yes | Yes |
| Maryland | *Maryland School Assessment (MSA) [CRT] | 3,5,8,10 | Reading (3,5,8,10), Math (3,5,8,10) | Yes | Yes |
| Maryland | High School Assessment [CRT] | 9–12 (2) | English I, Biology, Government, Algebra | Yes | Yes |
| Massachusetts | *MA Comprehensive Assessment System (MCAS) [CRT] | 3–8,10 | Reading (3), English Language Arts (4,7,10), Math (4,6,8,10), Science/Technology (5,8) | Yes | Yes |
| Michigan | *MI Educational Assessment Program (MEAP) [CRT] | 4,5,7,8 | Reading (4,7), Math (4,8), Writing (4,7), Science (5,8), Social Studies (5,8), Listening (4,7) | Yes | Yes |
| Minnesota | *MN Comprehensive Assessment (MCA) [CRT] | 3,5,7,10,11 | Reading (3,5,7,10), Math (3,5,7,11), Writing (5,10) | Yes | No |
| Minnesota | *Basic Skills Test [EXIT] | 8,10 | Reading (8), Math (8), Writing (10) | Yes | No |
| Mississippi | *MS Curriculum Test (MCT) [CRT] | 2–8 | Reading, Language, Math | Yes | Yes |
| Mississippi | Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT] | 6 | Reading, Language, Math | Yes | Yes |
| Mississippi | Writing Assessment [CRT] | 4,7 | Writing | Yes | Yes |
| Mississippi | Functional Literacy Exam (FLE) [EXIT] (for most students, only math is required for graduation) | 11 | Reading, Math, Writing | Yes | Yes |
| Mississippi | *Subject Area [CRT] | 9–12 | Algebra I, U.S. History, Biology, English II | Yes | Yes |
| Missouri | *MO Assessment Program (MAP) (Terra Nova survey) [NRT/CRT] | 3,4,7,8,10,11 | Communication Arts (3,7,11), Math (4,8,10), Science (optional; 3,7,10), Social Studies (optional; 4,8,11) | Yes | Yes |
| Montana | *Iowa Tests of Basic Skills/Iowa Tests of Educational Development (ITBS/ITED) [NRT] | 4,8,11 | Reading, Math, Language Arts, Science, Social Studies | Yes | Yes |
| Nebraska | *Statewide Writing Assessment [CRT] | 4,8,11 | Writing | Yes | Yes |
| Nebraska | *Assessment of State Reading Standards [CRT] | 4,8,11 | Reading | Yes | Yes |
| Nevada | Iowa Tests of Basic Skills/Iowa Tests of Educational Development (ITBS/ITED) [NRT] | 4,7,10 | Reading, Math, Science, Social Studies | Yes | Yes |
| Nevada | *Nevada Criterion Referenced Exam [CRT] | 3,5,8 | Reading, Math | Yes | Yes |
| Nevada | *NV High School Proficiency Exam [EXIT] | 10 | Reading, Math, Science | Yes | Yes |
| Nevada | *NV Direct Writing Assessment [CRT] | 4,8,11,12 | Writing | No | No |
| New Hampshire | *NH Educational Improvement and Assessment Program (NHEIAP) [CRT] | 3,6,10 | English Language Arts, Math, Science (6,10), Social Studies (6,10) | Yes | Yes |
| New Jersey | *NJ Assessment of Skills and Knowledge (NJ-ASK) [CRT] | 4 | Language Arts Literacy, Math, Science | Yes | Yes |
| New Jersey | *Grade Eight Proficiency Assessment (GEPA) [CRT] | 8 | Language Arts Literacy, Math, Science | Yes | Yes |
| New Jersey | *High School Proficiency Assessment (HSPA) [EXIT] | 11 | Language Arts Literacy, Math, Writing | Yes | Yes |
| New Mexico | *NM Achievement Assessment Program (NMAAP) (CTBS/5 & other criterion measures) [NRT/CRT] | 3–9 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| New Mexico | NM High School Competency Exam [EXIT] | 10 | Reading, Language Arts, Math, Science, Social Studies, Writing | Yes | Yes |
| New York | Occupational Education Proficiency Exams [EXIT] | 9–12 | Occupational Education | Yes | Yes |
| New York | Regents Comprehensive Exams [EXIT] | 9–12 | English, Foreign Languages, Math, History/Social Studies, Science | Yes | Yes |
| New York | Regents Competency Test [EXIT] | 9–12 | Reading, Math, Science, Writing, Global Studies, U.S. Hist & Gov't | Yes | Yes |
| New York | NY State Assessment Program [CRT] | 4,8 | English/Language Arts, Math, Science | Yes | Yes |
| North Carolina | *End of Grade [CRT] | 3–8, 10 | Reading, Math | Yes | Yes |
| North Carolina | *End of Course [CRT] | 9–12 | Biology, Chemistry, Physics, Economics, English I, Physical Science, History, Algebra I & II, Geometry | Yes | Yes |
| North Carolina | *Grade 3 Pretest [CRT] | 3 | Reading, Math | Yes | Yes |
| North Carolina | Writing Assessment [CRT] | 4,7,10 | Writing | Yes | Yes |
| North Carolina | *Computer Skills [CRT] | 8 | Computer | No | No |
| North Carolina | *Competency Test [EXIT] | 9 | Reading, Math | No | No |
| North Carolina | *High School Comprehensive Test [CRT] | 10 | Reading, Math | Yes | Yes |
| North Dakota | *North Dakota State Assessment (NDSA) [NRT/CRT] | 4,8,12 | Reading/Language, Math | Yes | Yes |
| Ohio | *OH Proficiency Tests [CRT] | 4,6,10 | Reading, Math, Writing, Science, Citizenship | Yes | Yes |
| Ohio | *OH Proficiency Test [EXIT] | 9 | Reading, Writing, Math, Science, Citizenship | Yes | Yes |
| Oklahoma | *Core Curriculum Tests [CRT] | 5,8 | Reading, Math, Writing, Science, History, Geography, Arts | No | Yes |
| Oklahoma | *Stanford Achievement Test, 9th ed. (SAT-9) [NRT] | 3 | Reading, Math, Language, Spelling, Listening | No | No |
| Oklahoma | *High School End-of-Instruction Tests [CRT] | 9–11 | English II, U.S. History, Algebra I, Biology | No | No |
| Oregon | *OR Statewide Assessment [CRT] | 3,5,8,10 | Reading/Literature, Math, Math Problem Solving (5,8,10), Writing, Science (8,10) | No | No |
| Pennsylvania | *PA System of School Assessment (PSSA) [CRT] | 3,5,6,8,9,11 | Reading (3,5,8,11), Math (3,5,8,11), Writing (6,9,11) | Yes | Yes |
| Rhode Island | *New Standards Reference Examinations [CRT] | 4,8,10 | Reading, Math, Writing | Yes | Yes |
| Rhode Island | RI State Writing Assessment [CRT] | 3,7,11 | Writing | No | Yes |
| Rhode Island | RI Health Education Assessment [CRT] | 9 | Health | No | Yes |
| South Carolina | *Palmetto Achievement Challenge Tests (PACT) [CRT] | 3–8 | English/Language Arts, Math, Science, Social Studies | Yes | Yes |
| South Carolina | *High School Exit Exam [EXIT] | 10 | Reading, Math, Writing | No | Yes |
| South Dakota | *Dakota STEP Test [CRT/NRT] | 3–8, 11 | Reading, Math | Yes | Yes |
| South Dakota | Stanford Writing Assessment [NRT] | 5,9 | Writing | No | No |
| Tennessee | *Achievement Test [NRT] | 3–8 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| Tennessee | *Writing Test [CRT] | 4,7,11 | Writing | No | No |
| Tennessee | *Gateway Testing Initiative [CRT] | 9–12 | Algebra I, Biology, English II | No | No |
| Texas | *Texas Assessment of Knowledge and Skills (TAKS) [CRT] | 3–10 | Reading (3–9), English Language Arts (10), Math (3–10), Writing (4,7), Science (5,10), Social Studies (8,10); Spanish version administered in grades 3–6 | Yes | Yes |
| Texas | *Exit Level TAKS [EXIT] | 11 | English/Language Arts (11), Math (11), Science (11), Social Studies (11) | Yes | Yes |
| Texas | Reading Proficiency Tests in English [CRT] | 3–12 | English Reading Proficiency | Yes | Yes |
| Utah | Stanford Achievement Test, 9th ed. (SAT-9) [NRT] | 3,5,8,11 | Reading, Language, Math, Science, Social Studies | Yes | Yes |
| Utah | *Core Criterion-Referenced Tests [CRT] | 1–11 | Reading, Math (1–10), Writing (6,9) | Yes | Yes |
| Utah | Direct Writing Assessment [NRT] | 6,9 | Writing | No | No |
| Vermont | *VT Comprehensive Assessment System [CRT] | 2,4,5,8–11 | Reading (2), English/Language Arts (4,8,10), Math (4,8,10), Science (5,9,11) | Yes | Yes |
| Virginia | *Standards of Learning (SOL) [CRT] | 3,5,8 | English (3), English: Reading/Literature and Research (5,8), English: Writing (5,8), Math, History, Science, Computer Technology (5,8) | Yes (3) | Yes |
| Virginia | *Standards of Learning [EXIT] (4) | 9–12 (may be taken at an earlier grade) | English, Math (Algebra I, II, & Geometry), History/Social Studies (World History I & II, Geography, U.S. History), Science (Earth, Biology, Chemistry) | Yes (3) | Yes |
| Virginia | *VA State Assessment Program (VSAP) (SAT-9, Form TA) [NRT] | 4,6,9 | Reading, Language, Math [Science, Social Studies are optional] | Yes | Yes |
| Washington | *WA Assessment of Student Learning (WASL) [CRT] | 4,7,8,10 | Reading (4,7,10), Math (4,7,10), Writing (4,7,10), Science (8,10) | Yes | Yes |
| Washington | Iowa Tests of Basic Skills/Iowa Tests of Educational Development (ITBS/ITED) [NRT] | 3,6,9 | Reading, Math | No | No |
| West Virginia | *Stanford Achievement Test, 9th ed. (SAT-9) [NRT] | 3–11 | Reading/Language, Math, Science, Social Studies | Yes | Yes |
| West Virginia | WV Writing Assessment [CRT] | 4,7,10 | Writing | No | No |
| Wisconsin | *WI Knowledge and Concepts Exam (WKCE) [CRT] | 4,8,10 | Reading, Language Arts, Math, Science, Social Studies | Yes | Yes |
| Wisconsin | WI Reading Comprehension Test (WRCT) [CRT] | 3 | Reading | Yes | Yes |
| Wyoming | *WY Comprehensive Assessment System (WyCAS) [CRT] | 4,8,11 | Reading, Writing, Math | No | No |
| Wyoming | Terra Nova Comprehensive Tests of Basic Skills, 5th ed. (CTBS/5) [NRT] | 4,8,11 | Reading, Language, Math | No | No |

(1) DE: In reading and math, students are tested in grades 2–11, but data are reported for only grades 3, 5, 8, and 10.
(2) MD: The High School Assessment is administered in whatever grade the relevant course is given; some students take the HSA as early as 4th grade.
(3) VA: The percentage of students not tested is given but is aggregated for the SOL, the SOL-EXIT, and the Alternate Assessment.
(4) VA: There is no single exit exam. Students usually must pass high school courses and the related SOL tests to earn verified credits for a standard or advanced diploma.
* Test is part of the state accountability system for No Child Left Behind.


Appendix D

Disaggregated Participation Information (Given for State-Level Data)

| State | Test | Count | Count Not Tested | Count Exempt | Count Excluded | % Tested | % Not Tested | % Exempt | % Excluded | Count and/or % Absent |
|-------|------|-------|------------------|--------------|----------------|----------|--------------|----------|------------|------------------------|
| AL | HSGE | Y |  |  |  |  |  |  |  |  |
| AL | SAT-10 | Y |  |  |  |  |  |  |  |  |
| AL | DAW | Y |  |  |  |  |  |  |  |  |
| AK | Bench. Exams | Y |  |  |  | Y |  |  |  |  |
| AK | HSGQE | Y |  |  |  | Y |  |  |  |  |
| AZ | SAT-9 | Y |  |  |  |  |  |  |  |  |
| AZ | AIMS | Y |  |  |  |  |  |  |  |  |
| AZ | AIMS-EXIT | Y |  |  |  |  |  |  |  |  |
| CA | Cont. Stands. | Y |  |  |  | Y |  |  |  |  |
| CA | CAT/6 | Y |  |  |  |  |  |  |  |  |
| CA | SABE/2 | Y |  |  |  |  |  |  |  |  |
| CO | CSAP | Y | Y |  |  |  |  |  |  | Y |
| CT | CMT | Y |  |  |  | Y |  | Y |  | Y |
| CT | CAPT | Y |  |  |  | Y |  | Y |  | Y |
| DE | DSTP (SAT-9) | Y | Y | Y |  | Y |  | Y |  | Y |
| FL | FCAT |  |  |  |  | Y |  |  |  |  |
| GA | GHSGT | Y |  |  |  |  |  |  |  |  |
| GA | CRCT | Y |  |  |  |  |  |  |  |  |
| GA | Writ. Assess. | Y |  |  |  |  |  |  |  |  |
| ID | IDA | Y |  |  |  | Y |  |  |  |  |
| ID | ISAT | Y |  |  |  | Y |  |  |  |  |
| ID | IRI | Y |  |  |  | Y |  |  |  |  |
| IN | ISTEP+ | Y |  |  |  |  |  |  |  |  |
| IN | GQE | Y |  |  |  |  |  |  |  |  |
| IA | ITBS/ITED | Y |  |  |  |  |  |  |  |  |
| KS | KAS | Y |  |  |  |  | Y |  |  | Y |
| KY | KCCT | Y |  |  |  | Y |  |  |  |  |
| KY | CTBS/5 | Y |  |  |  | Y |  |  |  |  |
| LA | ITBS/ITED | Y |  |  |  |  |  |  |  |  |
| LA | LEAP-21 | Y |  |  |  |  |  |  |  |  |
| LA | GEE-21 | Y |  |  |  |  |  |  |  |  |
| ME | MEA | Y |  |  |  | Y |  |  |  |  |
| MD | MSA | Y |  |  |  |  |  |  |  |  |
| MD | HSA | Y |  |  |  |  |  |  |  |  |
| MA | MCAS | Y |  |  |  | Y |  |  |  |  |
| MI | MEAP | Y |  |  |  |  |  |  |  |  |
| MN | MCA |  |  |  |  | Y |  |  |  |  |
| MN | BST |  |  |  |  | Y |  |  |  |  |
| MS | CTBS/5 | Y |  |  |  |  |  |  |  |  |
| MS | MCT | Y |  |  |  |  |  |  |  |  |
| MS | Writ. Assess. | Y |  |  |  |  |  |  |  |  |
| MS | FLE | Y |  |  |  |  |  |  |  |  |
| MS | Subject Area | Y |  |  |  |  |  |  |  |  |
| MO | MAP | Y |  |  |  |  | Y |  |  | Y |
| MT | ITBS/ITED | Y |  |  |  |  |  |  |  |  |
| NE | Assess. of St. Read. Stands. |  |  |  |  | Y | Y |  |  |  |
| NE | Statewide Writ. Assess. | Y |  |  |  | Y |  |  |  |  |
| NV | Crit Ref Exam | Y |  |  |  |  |  |  |  |  |
| NV | ITBS/ITED | Y |  |  |  | Y |  |  |  |  |
| NV | NV HSPE | Y |  |  |  |  |  |  |  |  |
| NH | NHEIAP | Y |  |  |  | Y |  |  |  |  |
| NJ | ESPA/GEPA/HSPT | Y |  |  |  |  |  |  |  |  |
| NM | NMAAP | Y |  |  |  |  |  |  |  |  |
| NM | NMHSCE | Y |  |  |  |  |  |  |  |  |
| NY | NYSAP | Y |  |  |  |  |  |  |  |  |
| NY | RCT | Y |  |  |  |  |  |  |  |  |
| NY | RCE | Y |  |  |  |  |  |  |  |  |
| NY | OEPE | Y |  |  |  |  |  |  |  |  |
| NC | End of Grade | Y | Y |  |  | Y | Y |  |  | Y |
| NC | End of Course | Y | Y |  |  | Y | Y |  |  | Y |
| NC | Gr. 3 Pretest | Y |  |  |  | Y |  |  |  | Y |
| NC | Writ. Assess. | Y |  |  |  | Y |  |  |  | Y |
| NC | HSCT | Y | Y |  |  | Y | Y |  |  | Y |
| ND | NDSA | Y |  |  |  | Y |  |  |  |  |
| OH | OPT |  |  |  |  | Y |  |  |  |  |
| PA | PSSA | Y |  |  |  | Y |  |  |  |  |
| RI | NSRE | Y |  |  |  | Y |  |  |  |  |
| SC | PACT | Y |  |  |  | Y |  |  |  |  |
| TN | Achiev. Test | Y |  |  |  |  |  |  |  |  |
| TX | TAKS | Y | Y | Y |  | Y | Y | Y |  | Y |
| TX | TAKS-EXIT | Y | Y | Y |  | Y | Y | Y |  | Y |
| TX | RPTE | Y |  |  |  |  |  | Y |  |  |
| UT | SAT-9 | Y |  |  |  |  |  |  |  |  |
| UT | CCRT | Y |  |  |  |  |  |  |  |  |
| VT | VCAS | Y |  |  |  |  |  |  |  |  |
| VA | SOL |  |  |  |  |  | Y (1) |  |  |  |
| VA | SOL-EXIT |  |  |  |  |  | Y (1) |  |  |  |
| VA | VSAP | Y |  |  |  |  |  |  |  |  |
| WA | WASL | Y | Y |  |  |  | Y |  |  | Y |
| WV | SAT-9 | Y |  |  |  | Y |  |  |  |  |
| WI | WKCE |  |  |  |  |  | Y |  |  |  |
| WI | WRCT | Y |  |  | Y | Y | Y |  | Y |  |

(1) VA reports the percentage of students not tested, but the percentage is aggregated for the SOL, the SOL-EXIT, and the Alternate Assessment.


Appendix E

Participation Rate Analyses

State  Grade              Subject  Test Name
AK     8                  Math     Benchmarks
CT     8                  Math     CMT
DE     8                  Math     DSTP
FL     8                  Math     FCAT
ID     8                  Math     ISAT
KS     7                  Math     KAS
ME     8                  Math     MEA
MA     8                  Math     MCAS
MO     8                  Math     MAP
NV     7                  Math     ITBS
NC     8                  Math     End-of-Grade
ND     8                  Math     NDSA
OH     Aggregate of 4–10  Math     OPT
PA     Aggregate of 3–11  Math     PSSA
RI     8                  Math     NSRE
SC     Aggregate of 3–8   Math     PACT
SD     8                  Math     STEP Test
TX     8                  Math     TAKS
VA     Aggregate of 3–12  Math     SOL and Alternate
WA     7                  Math     WASL
WI     8                  Math     WKCE


Appendix F

Alternate Assessment Participation Information (State-Level Data)

Column key: C = Count; CNT = Count Not Tested; CE = Count Exempt; CX = Count Excluded; PT = Percent of Students Tested; PNT = Percent of Students Not Tested; PE = Percent Exempt; PX = Percent Excluded; A = Count and/or Percent Absent.

State  Test             C    CNT  CE   CX   PT   PNT  PE   PX   A
AL     Alternate        Y    -    -    -    -    -    -    -    -
AK     Alternate        Y    -    -    -    Y    -    -    -    -
AZ     AIMS-Alt.        Y    -    -    -    -    -    -    -    -
AZ     ASAT             Y    -    -    -    -    -    -    -    -
CA     Alternate        Y    -    -    -    -    -    -    -    -
CO     CSAP-A           Y    Y    -    -    -    -    -    -    Y
CT     Alternate        Y    -    -    -    -    -    -    -    -
DE     DAPA             Y    -    -    -    Y    -    Y    -    -
FL     Alternate        -    -    -    -    Y    -    -    -    -
GA     Alternate        Y    -    -    -    Y    -    -    -    -
ID     Alternate        Y    -    -    -    Y    -    -    -    -
KS     Alternate        Y    -    -    -    -    -    -    -    -
KY     Alt. Portfolio   Y    -    -    -    Y    -    -    -    -
LA     Alternate        Y    -    -    -    -    -    -    -    -
ME     Alternate        Y    -    -    -    Y    -    -    -    -
MD     IMAP             Y    -    -    -    -    -    -    -    -
MA     MCAS-Alt         Y    -    -    -    Y    -    -    -    -
MI     MI-Access        Y    -    -    -    -    -    -    -    -
MO     MAP-Alt.         Y    -    -    -    -    -    -    -    -
MT     Alternate        Y    -    -    -    -    -    -    -    -
NE     Alternate        -    -    -    -    Y    -    -    -    -
NV     SCAAN            Y    -    -    -    -    -    -    -    -
NH     Alternate        Y    -    -    -    Y    -    -    -    -
NY     NYSAA            Y    Y    -    -    -    -    -    -    -
NC     NCAAI            Y    -    -    -    Y    -    -    -    Y
NC     NCAAP            Y    -    -    -    Y    -    -    -    Y
ND     NDALT            Y    -    -    -    -    -    -    -    -
OR     Ext. Assess.     Y    -    -    -    Y    -    -    -    -
PA     Alternate        Y    -    -    -    -    -    -    -    -
SC     Alternate        Y    Y