
Technical Report 59

2008-09 Publicly Reported Assessment Results for Students with Disabilities and ELLs with Disabilities 

Martha L. Thurlow • Chris Bremer • Deb Albus

August 2011

All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Thurlow, M. L., Bremer, C., & Albus, D. (2011). 2008-09 publicly reported assessment results for students with disabilities and ELLs with disabilities (Technical Report 59). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Executive Summary

This is the thirteenth annual report by the National Center on Educational Outcomes (NCEO) that analyzes public reporting practices of assessment data for students with disabilities in K-12 schools in the United States. The Individuals with Disabilities Education Act (IDEA) required states to disaggregate performance data at the state and district level. This year marks the tenth annual reporting period since this requirement was established, and the seventh reporting period since the 2001 reauthorization of the Elementary and Secondary Education Act (ESEA).

For school year (SY) 2008-09, the number of regular states publicly reporting disaggregated participation and performance data for students with disabilities taking regular assessments remained at 46, unchanged from the previous year. However, this year marked the first time that all 50 states disaggregated data for at least some regular tests in ESEA accountability systems. For regular states reporting on alternate assessments based on alternate achievement standards (AA-AAS), 45 states reported at least some data, up from 36 the previous year. Among these, 44 states reported both participation and performance data.

This report also examined reporting on English language learners (ELLs) with disabilities. Eight regular states disaggregated data for these students on regular assessments, with five states reporting both participation and performance and three states reporting these data for some regular assessments. Reporting of participation and performance data for ELLs with disabilities on the AA-AAS was higher, at 24 states, with 20 of those states reporting both participation and performance.

For alternate assessments based on modified achievement standards (AA-MAS), eight states reported data. Seven of these reported both participation and performance, and one reported participation only. Of these, four states also reported data disaggregated for ELLs with disabilities. No unique state (i.e., American Samoa, Bureau of Indian Education, Commonwealth of Northern Mariana Islands, U.S. Department of Defense Education Affairs, District of Columbia, Federated States of Micronesia, Territory of Guam, Republic of Palau, Commonwealth of Puerto Rico, Republic of the Marshall Islands, and U.S. Virgin Islands) had an AA-MAS.

This year's report also includes information on states that reported data for English language proficiency assessments used for Title III accountability. Of the 10 states that reported participation or performance data for these assessments, 5 states reported data for ELLs with disabilities.

Reporting among unique states showed increases over the previous year. The number of unique states reporting disaggregated regular assessment data for students with disabilities remained at four, and the number reporting on the AA-AAS increased to four states, up from one state the previous year. Of these states, one had participation information for ELLs with disabilities on its regular assessment.

Public reporting on accommodations was also more comprehensive, with 28 states reporting for 2008-09, up from 19 states in 2007-08. Of these 28 states, all reported participation, 22 reported performance, and 20 reported both the number of students using accommodations and their performance. Seven states reported either the number of students using accommodations or performance with accommodations by specific type of accommodation. Two states reported data for at least one test indicating administration with standard versus non-standard accommodations. Two states reported accommodations used on an AA-MAS. One state reported linguistically accommodated testing for students with disabilities and a "bundled" set of accommodations for students with dyslexia, also by ELL status.

Finally, the publicly disaggregated participation and performance data described in this report covered a variety of state assessments based on state content standards. States have increased the breadth of their reporting over the years, to some extent because of additional testing options, but also because of more detailed reporting and the reporting of data not previously reported.

Most states have now adopted the common core state standards and will be transitioning to new assessments designed by consortia of states. We anticipate that as states implement the new assessments, some of the current limitations in data interpretation will disappear. Assuming the continued disaggregation of publicly reported data by subgroups, we believe that we will gain a clearer national picture of the participation and performance of students with disabilities.


Overview

This report on the 2008-09 school year marks the thirteenth in a series of reports by the National Center on Educational Outcomes (NCEO) documenting state public reporting practices for large-scale statewide assessments. It is the seventh reporting period since the 2001 reauthorization of the Elementary and Secondary Education Act (ESEA), and the tenth since the 1997 reauthorization of the Individuals with Disabilities Education Act (IDEA) that required disaggregated public reporting for special education students.

The number of states reporting online disaggregated participation and performance data for students with disabilities on all regular assessments in accountability systems held at 46 (2006-07 and 2007-08). This number was up from the 28 states reporting these data before ESEA was passed (2000-01). The number had increased to 46 states in 2006-07 (Albus, Thurlow, & Bremer, 2009) after fluctuating between 35 and 39 states from 2002-03 to 2005-06 (Albus, Thurlow, & Bremer, 2009; Klein, Wiley, & Thurlow, 2006; Thurlow & Klein, 2005; Thurlow & Wiley, 2004; Thurlow, Bremer, & Albus, 2008; Thurlow, Wiley, & Bielinski, 2003; VanGetson & Thurlow, 2007; Wiley, Thurlow, & Klein, 2005).

The number of states reporting disaggregated participation and performance data for alternate assessments based on alternate achievement standards (AA-AAS) held at 36 in both 2006-07 and 2007-08. This number had reached a high of 42 states in 2004-05 and had dipped to 28 states in 2005-06.

The changes in the number of states reporting on regular or AA-AAS assessments may be related to changes in federal policies for reporting to the U.S. Department of Education, as well as to our criteria, which were narrowed beginning in 2005-06. Annual Performance Report (APR) data were not counted as publicly reported data after 2004-05 because a state that reported only these data did not report "to the public with the same frequency and in the same detail as it reports on the assessment of nondisabled children," as required by IDEA (see Thurlow, Bremer, & Albus, 2008). The public reporting of data is and will continue to be an important aspect of accountability for states, even as the majority of states look toward a transition to new assessments.


Method

In January 2010, project staff searched the Web sites of state departments of education for posted reports with disaggregated data for students with disabilities, including English language learners (ELLs) with disabilities, for school year 2008-09. Although states are required to report their data in the fall following the assessment year (e.g., 2008-09 data are reported in fall 2009), they often revise data through the end of the calendar year. Thus, January of the year after the school year in which assessments are administered is the month when almost all states have their corrected and verified data on their Web sites.

States that were searched included the 50 "regular" states and 11 "unique" states (American Samoa, Bureau of Indian Education, Commonwealth of Northern Mariana Islands, U.S. Department of Defense Education Affairs, District of Columbia, Federated States of Micronesia, Territory of Guam, Republic of Palau, Commonwealth of Puerto Rico, Republic of the Marshall Islands, and U.S. Virgin Islands). Information was collected both on the actual participation and performance data reported for students with disabilities and on how the states reported those data. The data collection included all regular and alternate state assessments within and outside the ESEA accountability systems, including assessments designed specifically for bilingual or English language learners. In this report, additional assessment data were collected for English language proficiency assessments used for Title III purposes. Although Title III assessment results are reported to the federal government as Annual Measurable Achievement Objectives (AMAOs) and are not required to be publicly reported online, some states do publicly report results for state English Language Proficiency (ELP) assessments. Thus, these data were included in our collection in order to get the broadest data available on students with disabilities who also are ELLs.

After data were collected, individual state summary tables were created for verification. These summaries included only the descriptive information on how the state reported participation and performance. See Appendix A for a sample letter and summary table used in the verification process with state assessment directors.

The verification process occurred in two waves between March and June of 2010. In the first wave, letters and summary tables based on Web searches for data were mailed to state assessment directors. Twenty-six regular states and one unique state responded to our request for verification in the first wave. In the second wave, after data tables were revised based on feedback, letters were sent to all state directors of special education (see Appendix B). Twenty-one regular states and one unique state responded to the second request for help in verification, with ten of the same states from the first wave confirming data a second time. Finally, we completed data entry and double checks for accuracy.

Throughout most of this report, we credited states with reporting participation rates only if no calculations were needed to arrive at them from reported data. The one exception was Figure 9, for which we also included states whose participation rates could be calculated from publicly reported data. When states reported percentages, they generally did not report the denominator used in the calculation. It might have been the number of students with Individualized Education Programs (IEPs), the number of students with IEPs who attended school for a full academic year (FAY), the number of students eligible to take an assessment, or some other number.
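The ambiguity about denominators matters because the choice can shift a reported rate by several percentage points. A minimal sketch (using hypothetical counts, not data from any state) illustrates the effect:

```python
# Hypothetical counts for illustration only -- not data from any state.
assessed = 940  # students with disabilities counted as participating

# Candidate denominators a state might have used without saying so:
denominators = {
    "all students with IEPs": 1000,
    "students with IEPs enrolled a full academic year (FAY)": 970,
    "students eligible to take the assessment": 950,
}

for label, total in denominators.items():
    rate = 100 * assessed / total
    print(f"{label}: {rate:.1f}%")  # 94.0%, 96.9%, 98.9%
```

The same count of assessed students thus yields a "participation rate" anywhere from 94% to nearly 99%, which is why unreported denominators complicate cross-state comparisons.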

As noted by Thurlow et al. (2008), NCEO adjusted its definition of public reporting from that used in earlier years. Starting with the 2005-06 school year, NCEO no longer considered state Annual Performance Reports (APRs) and State Performance Plans (SPPs) to be the typical public reports that a state creates to meet the requirement to report data on students with disabilities in the same manner and with the same frequency as for all students. For the current report, we applied this narrowed definition only to tables and appendices that reflect public reporting overall (Appendices C and D). Further, if a state merged participation data but noted that students taking an alternate assessment were included in the overall numbers, this also counted as reporting on those assessments for students with disabilities. For tables that described how states reported data, including accommodations data, the definition of accepted data was broadened to include APR and SPP data. These distinctions are noted for the reader throughout the report.

The definitions of regular students and students with disabilities differ across states. “Regular student” refers to a population that, depending on the state, might include all students assessed or only students without disabilities. This definition should be considered in interpreting the data, because we compare “regular student” data with data for students with disabilities. The term “students with disabilities” also may vary by state, with some states reporting only students with IEPs and others reporting a combination of students with IEPs and students with 504 plans. In this report, where feasible, we used “regular students” to mean all students without disabilities and “students with disabilities” to mean students with IEPs.

When we examined gaps between all students and students with disabilities, we employed the same procedures as in the past, choosing representative grades for elementary, middle, and high school. For our examination of gaps, we chose grades 4, 8, and 10. If a state did not have data for one of these grades, we chose one grade below; if that grade was not available, we chose one grade above. Further, in contrast to past reports, which focused only on the reading and mathematics content areas, this report also includes science assessments in the gap analyses. Information on how states reported in other content areas is in Appendices C-E.
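The grade-substitution rule above can be sketched as a small function. The precedence (target grade, then one grade below, then one grade above) follows the description in this section; the sample set of reported grades is hypothetical:

```python
def pick_grade(target, available):
    """Choose the grade used for gap analysis: the target grade if reported,
    otherwise one grade below, otherwise one grade above."""
    for candidate in (target, target - 1, target + 1):
        if candidate in available:
            return candidate
    return None  # no usable grade reported

# Hypothetical state reporting grades 3, 5, 7, 8, and 11:
reported = {3, 5, 7, 8, 11}
print(pick_grade(4, reported))   # 3  (grade 4 missing, fall back one below)
print(pick_grade(8, reported))   # 8  (target grade reported)
print(pick_grade(10, reported))  # 11 (neither 10 nor 9, use one above)
```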


Results

Characteristics of Regular Assessments in State Assessment Systems

A list of regular state assessments for 2008-09 is located in Appendix C. It includes all 50 regular states and the 11 unique states, with information on the name of each assessment, grades and content areas assessed, whether the state had publicly available disaggregated participation or performance data for students with disabilities for 2008-09, and whether the results of each assessment are used for ESEA accountability purposes.

For 2008-09 we identified 126 regular statewide assessments for the 50 regular states, both in and outside ESEA accountability systems. Of the eight states that indicated they had administered at least one norm-referenced test (NRT), two used the Iowa Test of Basic Skills (ITBS) (Iowa and Utah), two used the TerraNova/CAT 6 (Alaska and Arizona), two used the Stanford Achievement Test, Tenth Edition (SAT 10) (Alabama and Arkansas), two used both EXPLORE and PLAN (Kentucky and West Virginia), and one used the Direct Writing Assessment (DWA) (Utah). Other NRTs used by one state each included the Iowa Reading Test (Utah) and the Otis-Lennon School Ability Test, Eighth Edition (OLSAT-8) (Alabama). Seven states used NRTs augmented with criterion-referenced items (Arizona, Delaware, Indiana, Louisiana, Michigan, Missouri, and North Dakota). All other assessments were exit exams (EXIT), end-of-course exams (EoC), and criterion-referenced tests (CRTs) unique to each state, except for a few used in common by a small group of states, such as the New England Common Assessment Program (NECAP) used by three states (New Hampshire, Rhode Island, and Vermont), with Maine joining in 2009-10. North Carolina administered seven regular assessments in 2008-09, the largest number administered by any one state.

In addition to the 50 regular states, we also examined the 11 unique states. For these 11, we found specific names for nine state assessments. The Stanford Achievement Test (SAT-9 or SAT-10) was used by three states (American Samoa, Commonwealth of the Northern Mariana Islands, and Guam). The TerraNova (an NRT) was used by the Department of Defense Education Affairs, and Palau was revising its Palau Achievement Test (PAT), also an NRT. Two entities used augmented NRT/CRTs (Puerto Rico and the Virgin Islands). Only one unique state (Commonwealth of the Northern Mariana Islands) used more than one regular assessment, employing a CRT in addition to the SAT-10 already mentioned. Students served by the Bureau of Indian Education participate in the assessments of their states of residence, and their proficiency data are reported together as a group within those states.

Figure 1 displays the distribution of the 126 regular assessments found for the 50 states (both in and outside ESEA accountability systems) by type: criterion-referenced tests, norm-referenced tests, augmented NRTs with state-developed CRT items, exit exams used as a requirement for graduation (EXIT), and End of Course (EoC) exams taken at the end of subject-area courses. The End of Course assessments in Figure 1 did not have information indicating they were required for graduation.

Figure 1. Total Percent of Regular Assessments In and Outside ESEA Accountability Systems by Assessment Type (N=126)

Figure 1 Pie Chart

Note: Assessments are counted by assessment name. If a state had different names for its CRTs at the elementary/middle and high school levels, these are counted separately. Alternate assessments based on alternate or modified achievement standards are presented separately in this report. NRTs such as the PLAN and EXPLORE assessments usually were counted separately; however, one state used a single assessment name to cover these and the ACT, so they are counted as one NRT throughout.

Overall, the most common assessment type (Figure 1) in and outside ESEA accountability systems in 2008-09 was the CRT at 59% (N=75), followed by EXIT assessments at 14% (N=18), EoC at 11% (N=14), NRTs at 10% (N=12), and augmented NRT/CRTs at 6% (N=7). Compared with 2007-08, most categories remained about the same, changing by only 1 to 4 percentage points. EoC was added as a category this year, so a comparison to previous years is not possible.

Figure 2 displays the same information as Figure 1, except that it includes only those assessments used for ESEA accountability. Of the 93 assessments, CRTs (N=66) made up 71%, EXITs (N=13) 14%, NRT/CRTs (N=7) 7%, EoCs (N=6) 6%, and NRTs (N=1) 1%.

Figure 2. Number of Regular Assessments in ESEA Accountability Systems by Assessment Type (N=93)

Figure 2 Pie Chart 

Note: Assessments are counted by assessment name. If a state had different names for CRTs at the elementary/middle and high school levels these are counted separately.

 

States that Reported Disaggregated Regular Assessment Data for Students with Disabilities

Figure 3 summarizes state reporting of participation and performance data for students with disabilities for regular assessments within ESEA accountability systems in the 50 regular states. These assessments are the state content assessments based on grade-level achievement standards. In recent years, the total number of states reporting participation and performance for all regular assessments had grown, with 92% of states (N=46) reporting this in 2006-07 and 90% (N=45) in 2007-08. In 2008-09, the percent returned to 92% (N=46). The remaining four states reported participation and performance for some, but not all, regular assessments. Still, 2008-09 marks the first time since this report series began that all states publicly reported disaggregated data for students with disabilities on regular state assessments within ESEA accountability systems.

Among those states with alternate assessments based on grade level achievement standards (AA-GLAS), which are included in Appendix C with regular assessments, two reported participation and performance (North Carolina and Minnesota), one reported only participation (Massachusetts), and one reported only performance (Virginia). Although these alternate assessments are considered regular assessments in Appendix C, the figures in this report focus on the regular assessments not including the AA-GLAS because all states are required to have regular assessments.

Figure 3. Disaggregated Assessment Results for Students with Disabilities on Regular Assessments in ESEA Accountability Systems within the 50 Regular States

Figure 3 Pie Chart 

Figure 4 portrays the participation and performance reporting for the regular assessment by state. This map shows that nearly all states had full reporting of participation and performance for students with disabilities on all regular assessments within ESEA accountability systems.

Figure 4. States Reporting 2008-09 Disaggregated Participation or Performance Data for Students with Disabilities on Regular State Assessments in ESEA Accountability Systems*

Figure 4 US Map 

*The figure does not include state APR or SPP data. A broad definition was used to determine whether a state had data. States were included if they had any data reported for the assessment (regardless of whether it was only across all grades, by grade ranges, or for specific grades).

Figure 5 shows the prevalence of full reporting of participation and performance data by assessment type in ESEA accountability systems across the 50 regular states. Ninety-two percent of CRTs (61 of 66 assessments) had both participation and performance reported for students with disabilities, up from 85%. NRT and NRT/CRT assessments were fully reported for students with disabilities at 100%, as in 2007-08. Exit assessments were fully reported for students with disabilities at 92% (12 of 13 assessments), the same as in 2007-08. End of Course (EoC) assessments, documented by us for the first time this year, had data reported for students with disabilities for 83% (5 of 6 assessments).

Figure 5. Percent of Regular Assessments in ESEA Accountability Systems Reporting Participation and Performance by Assessment Type

Figure 5 Bar Chart 

Figure 6 is a map showing disaggregated participation and performance reporting for students with disabilities on all state-mandated assessments (both within and outside of ESEA accountability). Comparing Figure 6 with Figure 4 reveals a pattern similar to that observed in previous years: reporting of disaggregated participation and performance data was more complete for assessments within ESEA accountability systems (Figure 4) than for assessments both within and outside ESEA systems (Figure 6).

Figure 6. States Reporting 2008-09 Disaggregated Participation or Performance Data for Students with Disabilities on Regular State Assessments In and Outside of the ESEA Accountability System

Figure 6 US Map 

*The figure does not include state APR or SPP data. A broad definition was used to determine whether a state had data. States were included if they had any data reported for the assessment (regardless of whether it was only across grades, by grade ranges, or for specific grades).

 

Unique States that Reported Disaggregated Regular Assessment Data for Students with Disabilities

In 2008-09, the number of unique states publicly reporting disaggregated regular assessment data for students with disabilities was four (see Table 1), the same number as in 2007-08. Two years earlier, in 2006-07, only one unique state reported these data publicly.

Table 1. Unique States Reporting Disaggregated 2008-09 Participation or Performance Data for Students with Disabilities on Regular Assessments

Unique States                                   Participation   Performance
American Samoa                                  No              No
Bureau of Indian Education                      Yes             Yes
Commonwealth of the Northern Mariana Islands    No              No
Department of Defense Education Affairs         No              No
District of Columbia1                           Yes             Yes
Federated States of Micronesia                  No              No
Guam                                            Yes             Yes
Palau                                           No              No
Puerto Rico                                     No              No
Republic of the Marshall Islands                No              No
Virgin Islands                                  Yes             Yes

 

States that Reported Disaggregated Regular Assessment Data for ELLs with Disabilities

In this year’s report we incorporated data for ELLs with disabilities. Figure 7 shows the extent to which states reported regular assessment data disaggregated for students with disabilities who are also English language learners. These data are also presented in detail in Appendix C. Most states did not disaggregate data for these students, though the number of states that do has increased slightly across years (Bremer, Albus, & Thurlow, 2011). For 2008-09, four states reported participation and performance for all regular assessments (Colorado, Michigan, Minnesota, and Ohio) and five states reported these data for some regular assessments (California, Connecticut, Delaware, North Carolina, and Texas).

Figure 7. States Reporting 2008-09 Disaggregated Participation or Performance Data for ELLs with Disabilities on Regular Assessments In ESEA Accountability Systems

Figure 7 US Map 

 

Unique States that Reported Disaggregated Regular Assessment Data for ELLs with Disabilities

Among unique states, the District of Columbia reported participation information for ELLs with disabilities on its regular assessment. Because this was the only unique state out of 11 unique states with data, these findings are not represented in a figure.

 

Characteristics of Alternate Assessments in State Assessment Systems

Alternate assessments can be based on alternate achievement standards (AA-AAS), modified achievement standards (AA-MAS), or grade-level achievement standards (AA-GLAS). All of these assessments are used within ESEA accountability systems. We grouped AA-GLAS with the regular assessments in this report because they are based on the same grade-level achievement standards as the regular assessments.

All 50 regular states indicated using at least one AA-AAS (see Appendix D). The state with the highest number of alternate assessments, including AA-AAS (N=2) and AA-MAS (N=3), was North Carolina, which is consistent with findings from previous years. We first present our findings on the public reporting for the AA-AAS. Then we include a brief section on public reporting for the AA-MAS.

States that Reported Disaggregated AA-AAS Data for Students with Disabilities

Of the 50 regular states with at least one AA-AAS, two states (Arizona and North Carolina) had two tests based on alternate achievement standards. Arizona had one test for students in elementary and middle school and another at the high school level. North Carolina had an AA-AAS for writing at grade 10 in addition to its main AA-AAS covering other content across grades.

Figure 8 shows the number and percent of states that disaggregated participation and performance data for students with disabilities on the AA-AAS. In the previous two years, 2006-07 and 2007-08, there was very little change in the number of states that reported both participation and performance on the AA-AAS. That number then increased from 36 states in 2007-08 to 44 states in 2008-09. One state reported performance only, and five states did not report either participation or performance data in public online documents.

Figure 8. Disaggregated Alternate Assessment Based on Alternate Achievement Standards Results for Students with Disabilities in 2008-09 for 50 Regular States*

Figure 8 Pie Chart 

*The figure does not include state APR or SPP data.

Figure 9 shows the regular states that reported disaggregated participation and performance data for students with disabilities on the AA-AAS. The number of states reporting participation and performance data for all AA-AAS for 2008-09 was 44. One state reported performance only (Wisconsin), and five other states did not report any participation or performance data in public online reports (Idaho, Missouri, New Mexico, Utah, and Wyoming).

Figure 9. States Reporting 2008-09 Disaggregated Participation or Performance Data for Students with Disabilities on Alternate Assessments based on Alternate Achievement Standards*

Figure 9 US Map 

*The figure does not include state APR or SPP data. A broad definition was used to determine whether a state had data. States were included if they had any data reported for the assessment (regardless of whether it was only across grades, by grade ranges, or for specific grades).

Unique States that Reported Disaggregated AA-AAS Data for Students with Disabilities

In previous years, three unique states indicated using an AA-AAS for ESEA accountability purposes. In 2007-08, one state (Virgin Islands) posted participation and performance data for an AA-AAS. In 2008-09, we found four states that reported both participation and performance for an AA-AAS (District of Columbia, Guam, Puerto Rico, and the Virgin Islands) (see Table 2). Because the remaining unique states provided little information about an AA-AAS on their Web sites, it is unclear whether they are not reporting their AA-AAS data or have not yet developed or implemented such an assessment.

Table 2. Unique States Reporting Disaggregated 2008-09 Participation or Performance Data for Students with Disabilities on Alternate Assessments based on Alternate Achievement Standards

Unique States                                   Participation   Performance
American Samoa                                  No              No
Bureau of Indian Education                      No              No
Commonwealth of the Northern Mariana Islands    No              No
Department of Defense Education Affairs         No              No
District of Columbia1                           Yes             Yes
Federated States of Micronesia                  No              No
Guam                                            Yes             Yes
Palau                                           No              No
Puerto Rico                                     Yes             Yes
Republic of the Marshall Islands                No              No
Virgin Islands                                  Yes             Yes

States that Reported Disaggregated AA-AAS Data for ELLs with Disabilities

Figure 10 shows the states that reported AA-AAS data for students with disabilities who are English language learners (see Appendix D). Compared with other types of assessments, the AA-AAS had relatively widespread reporting for this group: almost half of all states reported at least some data. Twenty states reported both participation and performance, three states reported only performance, and one state reported only participation.

Figure 10. States Reporting 2008-09 Disaggregated Participation or Performance Data for ELLs with Disabilities on Alternate Assessments Based on Alternate Achievement Standards

Figure 10 US Map 

Unique States that Reported Disaggregated AA-AAS Data for ELLs with Disabilities

One unique state, Puerto Rico, reported participation data for ELLs with disabilities on an AA-AAS. For many of the unique states, it is unclear whether they have an AA-AAS or are still developing one.

States that Reported Disaggregated AA-MAS Data for Students with Disabilities

Eight states reported data for the AA-MAS (California, Kansas, Louisiana, Maryland, North Carolina, North Dakota, Oklahoma, and Texas) (see Appendix E). Three other states were developing AA-MAS (Ohio, Pennsylvania, and Tennessee) during 2008-09 and did not yet have data to report. Of the eight states with existing AA-MAS, six reported disaggregated participation and performance, one reported disaggregated participation data but merged performance data with regular assessment data (North Dakota), and one state reported participation and performance data merged with the regular assessment (Kansas).

States that Reported Disaggregated AA-MAS Data for ELLs with Disabilities

Four of the eight states with an AA-MAS reported on ELLs with disabilities (California, Maryland, North Carolina, and Texas). These states reported participation and performance for their AA-MAS assessments for this group of students, with one exception: Maryland's high school AA-MAS had only participation, not performance, reported for ELLs with disabilities, although Maryland reported both participation and performance on that assessment for students with disabilities overall.

 

Communicating Participation in 2008-2009

Regular Assessment Participation Approaches for Students with Disabilities for Regular States

In this section we show the ways in which regular states reported participation data for regular assessments. More specifically, we describe the participation information immediately available to readers of a state’s assessment report, without conducting further calculations. Figure 11 shows the approaches taken by the 50 regular states in presenting participation data. This information is presented by state in Appendix F.

Figure 11. Number of States Reporting Participation by Students with Disabilities Using Various Approaches for Regular Assessments in ESEA Accountability Systems in 2008-2009

Figure 11 Bar Chart 

The most common way that states reported participation was number of students assessed (n=37). This was followed by reporting the percent of students assessed (n=18), and information about exempted or excluded students (n=11). The fewest states reported number and/or percent absent (n=7) and number not assessed (n=5).

Figure 12 shows the participation rates reported for 8th grade math in states where this information was reported, or where rates could be calculated from publicly reported data. The grade and content area (middle school math) were chosen to maintain consistency with previous reports. As in past reports, states that aggregated middle school grades together are not included. For the 2008-09 school year, participation rates ranged from 68% to 99%. The uncharacteristically low rate for Connecticut, as noted, occurred because some students, such as those who took part in that year's AA-MAS pilot test, were not included in the rate. Excluding Connecticut, the reported rates were 91% to 99% across the 18 states; this is the first time in our tracking of these data that all states with comparable data had rates of 90% or above. For the 2007-08 academic year, participation rates ranged from 86% to 99%, and in 2006-07 they ranged from 79% to 100%.
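Where a state reported only counts, a participation rate can be derived as the number of students with disabilities tested divided by the number enrolled. A minimal sketch of this derivation, using hypothetical counts (not from any state's actual report):

```python
# Derive a participation rate from publicly reported counts.
# All numbers below are hypothetical, for illustration only.

def participation_rate(num_tested: int, num_enrolled: int) -> int:
    """Percent of enrolled students who were tested, rounded to a whole percent."""
    if num_enrolled <= 0:
        raise ValueError("enrollment must be positive")
    return round(100 * num_tested / num_enrolled)

# e.g., 1,842 students with disabilities tested out of 1,960 enrolled
rate = participation_rate(1842, 1960)
print(rate)  # 94
```

Rates computed this way are the "calculated from presented data" values noted under Figure 12.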

Figure 12. Percentages of Students with Disabilities Participating in Middle School Regular Math Assessments in Those States with Clear Reporting of Participation Rates* in 2008-09

Figure 12 Bar Chart 

*Note: States graphed here include those with percentages calculated from presented data, so some may not be counted as reporting a rate in Appendix F.
**Connecticut’s data do not include students who participated in the AA-MAS pilot.

Regular Assessment Participation Approaches for Students with Disabilities by Unique States

Participation data for unique states are not graphed because of the small amount of data. Of the four unique states that publicly reported participation, three reported the number tested; two of these did so not by grade but by grade range (e.g., elementary) or as a total across grades. Three states reported the percentage tested, again with two states not reporting it by grade. The remaining seven unique states publicly reported no disaggregated participation data.

Regular Assessment Participation Approaches for ELLs with Disabilities

Figure 13 shows how regular states reported participation data for ELLs with disabilities on regular assessments (see Appendix G). As with other assessments, the number assessed was the category most often reported by states. One difference from the pattern in Figure 8a is that more states reported the number or percent of ELLs with disabilities exempt or excluded from testing. This would be expected, given that some ELLs are exempted from reading assessments because of their length of residence in the U.S.

Figure 13. Number of States Reporting Participation by Approaches for ELLs with Disabilities on Regular Assessments in ESEA Accountability Systems in 2008-09

Figure 13 Bar Chart 

Regular Assessment Participation Approaches for ELLs with Disabilities by Unique States

One unique state, the District of Columbia, reported participation information on its regular assessment for ELLs with disabilities, but the data concerned exemptions from testing within the population of students with disabilities. No specific number of students was reported, and no performance data were reported.

AA-AAS Participation Approaches for Students with Disabilities

We examined the ways in which states reported participation data for their alternate assessments based on alternate achievement standards (see Appendix H). Figure 14 shows how the 50 regular states approached reporting of participation data for AA-AAS. As with other assessments, the most common participation reporting category was number of students assessed (n=36). This was followed by percent of students assessed (n=11) and percent of students by assessment type (n=10).

Figure 14. Number of Regular States Reporting Participation by Various Approaches for AA-AAS in the ESEA Accountability System in 2008-2009

Figure 14 Bar Chart 

AA-AAS Participation Approaches for ELLs with Disabilities

Figure 15 shows how states reported participation on the AA-AAS for ELLs with disabilities (see Appendix I). As for students with disabilities who were not ELLs, the most common approaches were reporting the number assessed and the percentage assessed: 13 states reported the number of ELLs with disabilities tested on the AA-AAS, and 4 states each reported the percent of ELLs with disabilities assessed and the number and/or percent exempted or excluded.

Figure 15. Number of Regular States Reporting Participation by Approach for ELLs with Disabilities on Alternate Assessments Based on Alternate Achievement Standards in the ESEA Accountability System in 2008-09

Figure 15 Bar Chart

AA-MAS Participation Approaches for Students with Disabilities

The approaches of the few states with AA-MAS assessments are not graphed; instead, how these states presented their participation data is described here. Six of the eight states reported the number of students tested, and two reported the percent of students tested as well as the percent of students taking the AA-MAS relative to all other assessments. Oklahoma reported the number tested, split by accommodated and non-accommodated status. North Dakota reported AA-MAS data with all grades combined. California reported the percent of enrolled students taking the test and the number with scores. North Carolina and Texas also reported the percent of students tested; North Carolina additionally reported data by gender, and Texas further reported the number or percent of students exempted, including LEP (Limited English Proficient) exempt, and the number absent. One state reported participation data for the AA-MAS merged with its regular assessment data. No unique state reported data for an AA-MAS.

AA-MAS Participation Approaches for ELLs with Disabilities

Four regular states reported participation information for ELLs with disabilities on an AA-MAS. All four reported the number tested; two reported the percent tested; two reported the number or percent exempted or not tested; and one reported the percent of students assessed by assessment type.

 

Communicating Performance in 2008-2009

Regular Assessment Performance Approaches for Students with Disabilities

States also report performance data in a variety of ways, such as the number or percent in each achievement level, percent proficient or not proficient, and scaled scores. The details for the figures in this section are presented by state and assessment in Appendix J.

Figure 16 shows how the 50 states reported performance on regular assessments. The most common way states reported performance data was by percent in each achievement level (n=39), followed by percent proficient (n=26) and other score (n=18). The “other score” category includes scaled scores or other types of scores that do not fit into the other categories.

Regular Assessment Performance Approaches for ELLs with Disabilities

Figure 17 shows the ways states reported performance data for ELLs with disabilities (see Appendix K). The top two reporting approaches in Figure 17 differ slightly from those in Figure 16, with the same number of states (six) reporting percent proficient and percent in each achievement level.

Figure 16. Number of States Reporting Performance by Various Approaches for Regular Assessments in the ESEA Accountability Systems in 2008-2009

Figure 16 Bar Chart 

Figure 17. Number of States Reporting Performance by Approaches for ELLs with Disabilities in Regular Assessments in the ESEA Accountability Systems in 2008-2009

Figure 17 Bar Chart 

 

Selected Results of Regular Assessment Performance for Students with Disabilities

In this section we compare the performance of students with disabilities and general education students for states that reported data for the three representative grades of 4, 8, and 10 in reading, mathematics, and science. It is important to remember that the specific content and levels of proficiency vary from state to state, and periodically within a state when changes to standards or assessments are made. Where a state noted that a year's results are not comparable to previous years, we provide a note to this effect. Further, the characteristics of students included in regular assessments vary from state to state based on the types of assessments available (e.g., AA-GLAS or AA-MAS). These characteristics also vary when assessments are provided in other languages and participation in them is reported separately. It is therefore unwise to compare proficiency rates of individual states, or to compare gaps between general and special education students across states. We present data on performance gaps across states to describe, generally, how states are doing with regard to gaps between these populations, with the caveat that the data must be interpreted carefully for the reasons just mentioned. We include only regular assessments in this section; these are predominantly state CRTs, although exit assessments are used where states had no other assessment for 10th grade for ESEA accountability. One state, Iowa, uses an NRT across all grade spans.

In comparing general education and special education students in this section, it is important to know that each of these groups may be defined somewhat differently across states. For example, depending on how a state reports its data, the general education group may include either all students assessed or only students without disabilities who were assessed. In collecting data, the desired comparison group was students without disabilities, but these data were not available for all states. Likewise, the special education group may include students with IEPs only, or students with IEPs and 504 plans combined. These differences in group definitions can influence to some degree how gap comparisons should be interpreted.

Performance Gaps in Reading and Mathematics for Students with Disabilities. For 2008-09, slightly more states had data available for gap analyses than in 2007-08. Table 3 shows the size of the gap between students with disabilities and general education students each year from 2005-06 through 2008-09, along with the number of states included in the calculation of the gap. Generally, as in previous years, a smaller percentage of students with disabilities scored proficient in these content areas compared to general education students. Unlike in years past, when gaps between the two student groups grew progressively smaller in both reading and math, in 2008-09 we observed a slight rise in the average gap across all grade ranges and content areas.

Table 4 shows the gap changes across years. The largest increase in average gaps from 2007-08 to 2008-09 was in elementary math (+2.1 percentage points), followed by middle school reading and high school math (both +1.9). Comparing 2005-06 to 2008-09, the average gaps that decreased the most over the three years were elementary reading (-3.8), middle school math (-3.1), and middle school reading (-3.0).
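The entries in Tables 3 and 4 follow directly from state-level percent-proficient data: for each state, the gap is the general education percent proficient minus the students-with-disabilities percent proficient; the table reports the mean of those gaps across reporting states, and the gap change is the difference in that mean between years. A sketch of the calculation, using hypothetical state values rather than any actual state's data:

```python
# Mean gap and year-over-year gap change, as in Tables 3 and 4.
# Each state maps to (general ed % proficient, SWD % proficient).
# All percent-proficient values below are hypothetical.

def mean_gap(states: dict[str, tuple[float, float]]) -> float:
    """Average of (general ed - SWD) percent proficient across states."""
    gaps = [gen - swd for gen, swd in states.values()]
    return round(sum(gaps) / len(gaps), 1)

y2008 = {"State A": (75.0, 44.0), "State B": (69.0, 30.0), "State C": (80.0, 52.0)}
y2009 = {"State A": (77.0, 45.0), "State B": (71.0, 30.5), "State C": (81.0, 52.5)}

gap_2008 = mean_gap(y2008)              # 32.7
gap_2009 = mean_gap(y2009)              # 33.7
change = round(gap_2009 - gap_2008, 1)  # +1.0: the average gap widened
```

In the actual tables, the set of reporting states differs by year, which is why the number of states accompanies each mean gap.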

Table 3. Gaps Between Students with Disabilities and General Education Students on Regular Assessments for All States with Data: Comparison of Mean Gaps for SY 2005-06 to 2008-09

| Mean Gaps for All States with Data | 2005-06   | 2006-07   | 2007-08   | 2008-09   |
|------------------------------------|-----------|-----------|-----------|-----------|
| Elementary Reading                 | 34.5 (45) | 31.4 (47) | 29.2 (44) | 30.7 (45) |
| Middle School Reading              | 42.5 (45) | 40.5 (47) | 37.7 (44) | 39.6 (46) |
| High School Reading                | 42.5 (41) | 39.8 (46) | 38.9 (42) | 39.9 (44) |
| Elementary Math                    | 29.3 (45) | 28.9 (47) | 26.3 (44) | 28.4 (46) |
| Middle School Math                 | 40.9 (45) | 39.7 (47) | 36.8 (44) | 37.8 (46) |
| High School Math                   | 38.5 (42) | 38.2 (44) | 35.3 (43) | 37.2 (44) |

Note: Cells show the mean gap, with the number of states with data in parentheses.


Table 4. Gap Changes Between Students with Disabilities and General Education Students on Regular Assessments for All States with Data: SY 2005-06 to 2008-09

| Gap Changes for All States | 2005-06 to 2006-07 | 2006-07 to 2007-08 | 2007-08 to 2008-09 | 2005-06 to 2008-09 |
|----------------------------|--------------------|--------------------|--------------------|--------------------|
| Elementary Reading         | -3.1               | -2.2               | +1.6               | -3.8               |
| Middle School Reading      | -2.1               | -2.8               | +1.9               | -3.0               |
| High School Reading        | -2.7               | -1.0               | +1.0               | -2.6               |
| Elementary Math            | -0.4               | -2.6               | +2.1               | -0.9               |
| Middle School Math         | -1.2               | -2.9               | +1.0               | -3.1               |
| High School Math           | -0.2               | -2.9               | +1.9               | -1.3               |

 

In Tables 5 and 6, the same information on gaps and gap changes is presented for only those states that had data across all four years. In these states we see the same slight reversal, from narrowing gaps to widening gaps, for all grade ranges and content areas. Elementary mathematics and middle school reading showed the greatest increases in gap (+2.5 and +2.1, respectively). Across years for these states, the largest decreases in gaps were in elementary reading (-4.4), followed by high school reading (-3.6). Although these gaps increased, it should be noted that the mean percent proficient for both regular and special education students increased for all grades and content areas in 2008-09; the mean percent proficient for regular students simply increased more than that for students with disabilities (see Table 7).
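The apparent paradox noted above, gaps widening while both groups improve, is simple arithmetic: the gap grows whenever the higher-scoring group gains more percentage points than the lower-scoring group. A minimal check using the middle school math row of Table 7:

```python
# Middle school math percent proficient, from Table 7 (41 common states).
regular_2008, regular_2009 = 63.7, 69.0   # regular students; mean gain 5.3
swd_2008, swd_2009 = 26.8, 30.0           # students with disabilities; mean gain 3.2

gain_regular = round(regular_2009 - regular_2008, 1)  # 5.3
gain_swd = round(swd_2009 - swd_2008, 1)              # 3.2

# Both groups improved...
assert gain_regular > 0 and gain_swd > 0
# ...yet the gap widened, by exactly the difference in the gains.
gap_change = round(gain_regular - gain_swd, 1)
print(gap_change)  # 2.1
```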

Table 5. Gaps Between Students with Disabilities and General Education Students on Regular Assessments for States with 4 Years of Data: 2005-06 to 2008-09

| Mean Gaps for States with 4 Years of Data | Number of Common States | 2005-06 | 2006-07 | 2007-08 | 2008-09 |
|-------------------------------------------|-------------------------|---------|---------|---------|---------|
| Elementary Reading                        | 43                      | 34.7    | 31.7    | 29.0    | 30.6    |
| Middle School Reading                     | 43                      | 42.6    | 40.5    | 37.5    | 39.6    |
| High School Reading                       | 38                      | 42.9    | 41.1    | 39.1    | 39.2    |
| Elementary Math                           | 42                      | 29.5    | 29.3    | 26.3    | 28.8    |
| Middle School Math                        | 43                      | 41.0    | 39.9    | 36.6    | 38.0    |
| High School Math                          | 38                      | 39.2    | 39.1    | 36.0    | 37.5    |


Table 6. Gap Changes Between Students with Disabilities and General Education Students on Regular Assessments for States with 4 Years of Data: 2005-06 to 2008-09

| Gap Changes for States with 4 Years of Data | 2005-06 to 2006-07 | 2006-07 to 2007-08 | 2007-08 to 2008-09 | 2005-06 to 2008-09 |
|---------------------------------------------|--------------------|--------------------|--------------------|--------------------|
| Elementary Reading                          | -3.0               | -2.7               | +1.3               | -4.4               |
| Middle School Reading                       | -2.0               | -3.0               | +2.1               | -2.9               |
| High School Reading                         | -1.8               | -2.1               | +0.8               | -3.6               |
| Elementary Math                             | -0.3               | -3.0               | +2.5               | -0.7               |
| Middle School Math                          | -1.1               | -3.3               | +1.1               | -3.2               |
| High School Math                            | -0.2               | -3.1               | +1.2               | -2.2               |


Table 7. Average Percent Proficient for Regular Students and Students with Disabilities Across 2007-08 and 2008-09 by Grade and Content Level for States with Data

|                       | Number of Common States | Regular 2007-08 | Regular 2008-09 | Regular Mean Gain | SWD 2007-08 | SWD 2008-09 | SWD Mean Gain |
|-----------------------|-------------------------|-----------------|-----------------|-------------------|-------------|-------------|---------------|
| Elementary Reading    | 41                      | 72.5            | 75.0            | 2.5               | 43.2        | 44.0        | 0.1           |
| Middle School Reading | 41                      | 70.0            | 75.0            | 0.5               | 32.4        | 36.0        | 3.6           |
| High School Reading   | 34                      | 73.2            | 77.0            | 3.8               | 34.7        | 38.0        | 3.3           |
| Elementary Math       | 40                      | 71.2            | 75.0            | 3.8               | 44.8        | 46.0        | 1.2           |
| Middle School Math    | 41                      | 63.7            | 69.0            | 5.3               | 26.8        | 30.0        | 3.2           |
| High School Math      | 34                      | 63.7            | 66.0            | 2.3               | 27.4        | 29.0        | 1.6           |


Reading Performance. The reading performance of students, in states with publicly reported data by grade in 2008-09, is graphed in Figures 18-20. Generally, the performance of students with disabilities in reading was much lower than the performance of general education students. Similar to the data reported in previous years, the average percent proficient for students in elementary school was higher than for students in the middle and high school levels.

Reading performance at the elementary level showed gaps between students with disabilities and general education students that ranged from 8 to 47 percentage points (see Figure 18). At the middle school level, gaps in reading performance ranged from 6 to 55 percentage points (see Figure 19). At the high school level, gaps ranged from 16 to 58 percentage points (see Figure 20).

Figure 18. Elementary School Reading Performance on the Regular Assessment

In past reports we included state abbreviations in figures presenting publicly reported performance data. Because of differences across states in definitions of proficient performance, instructional practices, and other factors, we now present performance data without state identification. We believe that this will discourage inappropriate comparisons.

Figure 18 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 19. Middle School Reading Performance on the Regular Assessment

Figure 19 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 20. High School Reading Performance on the Regular Assessment

Figure 20 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.


Mathematics Performance. Figures 21-23 show the performance of general education students and students with disabilities on states’ 2008-09 regular math assessments. It appears, as with reading, that there were slightly greater gaps in math performance across all school levels in comparison to previous years. At the elementary school level, gaps between general education students and special education students in 2008-09 in math ranged from 4 to 44 percentage points (see Figure 21). At the middle school level (see Figure 22), the gaps in achievement ranged from 8 to 52 percentage points. And in high school gaps ranged from 9 to 55 percentage points (see Figure 23).

Figure 21. Elementary Mathematics Performance on the Regular Assessment

Figure 21 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 22. Middle School Mathematics Performance on the Regular Assessment

Figure 22 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 23. High School Mathematics Performance on the Regular Assessment

Figure 23 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.


Performance Gaps in Science for Students with Disabilities. Figures 24-26 present the science performance of students in states that publicly reported data by grade. At the elementary level, in 2008-09, the performance gap between students with disabilities and general education students ranged from 6 to 43 percentage points (see Figure 24). Middle school science gaps ranged from 15 to 50 percentage points (see Figure 25). High school science gaps ranged from 16 to 48 percentage points (see Figure 26).

Figure 24. Elementary Science Performance on the Regular Assessment

Figure 24 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 25. Middle School Science Performance on the Regular Assessment

Figure 25 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

Figure 26. High School Science Performance on the Regular Assessment

Figure 26 Bar Chart 

Legend: Heavy Solid Bar = Students with disabilities percent proficient
Dashed Line = Gap between students with disabilities and regular students. For some states the “regular students” comparison group may include students with disabilities, because states report data differently.

 

Selected Results for Regular Assessment Performance for ELLs with Disabilities

Performance Gaps Across Content Areas for ELLs with Disabilities. Figures 27 through 29 present the performance of ELLs with disabilities and of students with disabilities who are not ELLs. Grades 4, 8, and 10 were used for most states to represent the elementary, middle, and high school levels. It is important to remember that the populations taking regular assessments may vary from state to state. One reason for this variation is that states offer different alternate assessments, so students who would typically take a regular assessment in one state might instead take an AA-GLAS or AA-MAS in another. Students also might take a different language version of a regular assessment, one that is reported separately. Further, standards differ across states, as do the difficulty of assessments, how they are administered, and the numbers of students taking them. When a small number of students take an assessment, as is the case for ELLs with disabilities, additional caution must be taken in interpreting performance.

Figure 27 shows elementary level performance gaps in reading, mathematics, and science between ELLs with disabilities and students with disabilities who are not ELLs. Across states, the gaps between the two groups ranged from 19 to 33 percentage points for reading, 8 to 21 percentage points for mathematics, and 6 to 23 percentage points for science. One state's reading data are not graphed because there were too few students to report publicly.

Figure 27. ELLs with Disabilities’ Elementary Performance on Regular Assessment Compared to Peers with Disabilities Who are Not ELLs, Across Content Areas, 2008-09

Figure 27 Bar Chart 

Legend: Heavy Solid Bar = ELLs with disabilities percent proficient
Dashed Line = Gap between ELLs with disabilities and students with disabilities who are not ELLs

Table 8 shows the comparison of performance gaps for the five states that reported data on ELLs with disabilities, students with disabilities who are not ELLs, and general education students. The table shows gap data for all students with disabilities versus general education students, as well as additional data collected for students with disabilities who are not ELLs. For the five states with data at the elementary level, Table 8 shows that the smallest average gaps for ELLs with disabilities (EWD), against both comparison groups, were in mathematics, followed by science and then reading. For students with disabilities who are not ELLs (the middle column group), the smallest gaps were in science, followed by reading and mathematics.
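Because the SWD comparison group here excludes ELLs, the three gap columns in Tables 8 through 10 are approximately additive: the EWD-to-general-education gap equals the EWD-to-SWD gap plus the SWD-to-general-education gap, since the intermediate SWD percent proficient cancels out. (In the published tables the relationship holds only roughly, because of rounding and differing state coverage.) A sketch with hypothetical percent-proficient values for one state and content area:

```python
# The three pairwise gaps reported in Tables 8-10, computed from
# hypothetical percent-proficient values (not actual state data).
gen_ed = 78.0   # general education students
swd = 46.0      # students with disabilities who are not ELLs
ewd = 25.0      # ELLs with disabilities

ewd_swd_gap = swd - ewd       # 21.0
swd_gen_gap = gen_ed - swd    # 32.0
ewd_gen_gap = gen_ed - ewd    # 53.0

# The EWD-to-general-education gap decomposes into the other two.
assert ewd_gen_gap == ewd_swd_gap + swd_gen_gap
```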

Table 8. Elementary Level Gap Comparisons Across Reading, Mathematics and Science for States Reporting Data for ELLs with Disabilities in 2008-09

|           | EWD and SWD Gaps |      |         | SWD and GenEd Gaps |      |         | EWD and GenEd Gaps |      |         |
| State     | Reading | Math | Science | Reading | Math | Science | Reading | Math | Science |
|-----------|---------|------|---------|---------|------|---------|---------|------|---------|
| State A   | na      | 14   | 6       | 27      | 42   | 22      | na      | 55   | 28      |
| State B   | 19      | 8    | 23      | 32      | 23   | 22      | 51      | 31   | 45      |
| State C   | 33      | 21   | 20      | 31      | 36   | 22      | 62      | 10   | 41      |
| State D   | 21      | 17   | 15      | 31      | 34   | 27      | 52      | 51   | 42      |
| State E   | 22      | 11   | 19      | 5       | 4    | 15      | 44      | 33   | 33      |
| Avg. Gaps | 24      | 14   | 17      | 25      | 28   | 22      | 52      | 36   | 38      |

Note: SWD and GenEd gaps and EWD and GenEd gaps are for the states listed at left.


Figure 28 shows performance across content areas for the middle school level. The gap ranges were 12 to 33 percentage points for reading, -2 to 12 for mathematics, and 10 to 23 for science. The data for mathematics, in one state, showed ELLs with disabilities with a slightly higher percentage scoring proficient than their peers with disabilities who were not ELLs.

Figure 28. ELLs with Disabilities’ Middle School Performance on Regular Assessment Compared to Peers with Disabilities Who are Not ELLs, Across Content Areas, 2008-09

Figure 28 Bar Chart 

Legend: Heavy Solid Bar = ELLs with disabilities percent proficient
Dashed Line = Gap between ELLs with disabilities and students with disabilities who are not ELLs.

Table 9, which shows gaps for middle school students, reveals a pattern similar to the elementary level: ELLs with disabilities exhibited the smallest gaps in math, with larger gaps in science and reading. In State A, ELLs with disabilities had a slightly higher percentage scoring proficient in mathematics than their peers with disabilities who were not ELLs, hence the negative gap. For gaps between students with disabilities who are not ELLs and regular education students, the smallest gaps were in reading, followed by very similar and larger gaps in mathematics and science.

Table 9. Middle School Level Gap Comparisons Across Reading, Mathematics and Science for States Reporting Data for ELLs with Disabilities in 2008-2009

|          | EWD and SWD Gaps |      |         | SWD and Regular Gaps |      |         | EWD and Regular Gaps |      |         |
| State    | Reading | Math | Science | Reading | Math | Science | Reading | Math | Science |
|----------|---------|------|---------|---------|------|---------|---------|------|---------|
| State A  | 12      | -2   | 14      | 37      | 47   | 46      | 49      | 45   | 75      |
| State B  | 19      | 11   | 19      | 46      | 40   | 39      | 65      | 51   | 57      |
| State C  | 21      | 11   | 13      | 41      | 41   | 28      | 61      | 52   | 41      |
| State D  | 12      | 11   | 10      | 45      | 45   | 35      | 57      | 57   | 45      |
| State E  | 33      | 12   | 23      | 6       | 8    | 34      | 52      | 48   | 55      |
| Avg Gaps | 19      | 9    | 16      | 35      | 36   | 36      | 57      | 51   | 55      |

Note: SWD and Regular gaps and EWD and Regular gaps are for the states listed at left.

 

Figure 29 presents high school level performance across content areas. The gap ranges at this level were 18 to 28 for reading, 6 to 14 for mathematics, and -5 to 20 for science. For science, one state reported ELLs with disabilities as having a higher percentage scoring proficient compared to peers with disabilities who were not ELLs.

Figure 29. ELLs with Disabilities’ High School Performance on Regular Assessment Compared to Peers with Disabilities Who are Not ELLs, Across Content Areas, 2008-09

Figure 29 Bar Chart 

Table 10 presents average gap comparisons at the high school level. At this level, ELLs with disabilities, compared to students with disabilities who are not ELLs, had the smallest average gaps in mathematics, generally followed by larger gaps in science and reading. Students with disabilities compared to regular students had nearly the same average gap across content areas at this grade level.

Table 10. High School Level Gap Comparisons Across Reading, Mathematics and Science for States Reporting Data for ELLs with Disabilities in 2008-09

|           | EWD and SWD Gaps |      |         | SWD and Regular Gaps (Five Common States) |      |         | EWD and Regular Gaps (Five Common States) |      |         |
| State     | Reading | Math | Science | Reading | Math | Science | Reading | Math | Science |
|-----------|---------|------|---------|---------|------|---------|---------|------|---------|
| State A   | na      | 11   | -5      | 48      | 47   | 48      | na      | 58   | 43      |
| State C*  | 23      | 6    | 14      | 43      | 35   | 33      | 65      | 41   | 47      |
| State D   | 18      | 14   | 18      | 45      | 48   | 39      | 63      | 62   | 57      |
| State E   | 28      | 9    | 20      | 20      | 24   | 40      | 67      | 57   | 59      |
| Avg. Gaps | 23      | 10   | 12      | 39      | 39   | 40      | 65      | 55   | 52      |

*State B did not have data for comparison.

The pattern of smaller average gaps in mathematics, followed by larger gaps in science and reading, was consistent for the five states plotted across all grade levels, although at the high school level the comparison with general education students showed the smallest gap in science, followed by mathematics and reading. Gaps in reading for ELLs with disabilities were consistently the largest across all grade levels, both in comparison with regular students and with their peers with disabilities.


ELLs with Disabilities’ Performance on Regular Assessments in Other Languages. Three states (California, Colorado, and Texas) reported participation and performance data for students with disabilities taking regular assessments in Spanish. In one of the states (California), all students who take the Spanish version are ELLs; in the other two (Colorado and Texas), a small number of students who take the assessment are not ELLs but are served in bilingual programs.

For one of the three states, the percent of ELLs with disabilities proficient on its regular assessment in Spanish was 13% for elementary reading and 31% for elementary mathematics. That state reported no Spanish-language data for middle or high school grades. For the second and third states, performance on the Spanish assessment in elementary reading was 25 and 46 percent proficient, respectively. The second state, with 100 students tested in elementary reading, had 6 students who were not identified as ELLs and whose disability status was therefore not designated. This state also reported 30 percent proficient in writing in Spanish. The third state reported performance in other content areas as well, with 52 percent proficient in mathematics and 22 percent in science. None of the three states with regular assessments taken in Spanish reported data for middle or high school grades.


AA-AAS Performance Approaches for Students with Disabilities and ELLs with Disabilities

Figure 30 displays the approaches that the 50 states used to report performance data for alternate assessments based on alternate achievement standards (AA-AAS). The most common performance reporting categories were percent in each achievement level (n=27), followed by percent proficient (n=17). These approaches are similar to those states used for their regular assessments, although more states reported “other scores” for the regular assessment than for the AA-AAS; only six states reported “other scores” for the AA-AAS. For more detailed information by state and assessment, see Appendix L. The ways states reported these data for ELLs with disabilities were similar (see Figure 31 and Appendix M).

Figure 30. Number of States Reporting AA-AAS Performance by Various Approaches in the ESEA Accountability System in 2008-09

Figure 30 Bar Chart 

Figure 31. Number of States Reporting AA-AAS Performance for ELLs with Disabilities by Various Approaches in the ESEA Accountability System in 2008-09

Figure 31 Bar Chart 

AA-MAS Performance Approaches for Students with Disabilities. For AA-MAS performance, five states reported percent proficient, and four states each reported percent in each achievement level and number proficient. Three states reported number in each achievement level, and two states reported AA-MAS data merged with the regular assessment data.

AA-MAS Performance Approaches for ELLs with Disabilities. Four states reported performance data for ELLs with disabilities on an AA-MAS. Three states reported the percent of students in each achievement level, two states reported the percent of students who were proficient, and two states reported average (mean) scale scores.

 

Other Information Collected for 2008-2009

Title III Assessments for English Language Proficiency (ELP)

In past reports on ELLs with disabilities, we tried to track the extent to which states reported on all assessment types, including the assessments designed to measure states’ Annual Measurable Achievement Objectives (AMAOs) as reported to the federal government under Title III (i.e., ELP assessments). In prior reports, the total number of states publicly reporting these data for ELLs with disabilities was small (Albus, Thurlow, & Liu, 2009). For 2008-09, five regular states reported participation and performance data for this group of students (California, Michigan, Minnesota, New York, and Texas). Alaska and Colorado reported information on accommodated students who took ELP assessments, but these data likely did not include all ELLs with disabilities because some students probably did not use accommodations on the assessments. Pennsylvania also reported data online in 2008-09, but a log-in was required to access them. In 2007-08 there were also five states that reported these data for ELLs with disabilities, plus Alaska and Colorado, which reported some data by accommodations but not totals for all ELLs with disabilities assessed. Overall, in 2008-09, only 10 states publicly reported online any participation or performance data on their Title III assessments; thus, half of those reporting ELP data for ELLs also reported on ELLs with disabilities. General information on states reporting participation or performance on an ELP assessment is provided in Appendix N.

Table 11 shows the different ways that states reported ELP performance for 2008-09. The score type most commonly shared across states was a composite score showing percent proficient (California, Michigan, and Texas). Although more states reported just an overall score, the types of scores reported by modality (e.g., reading, writing, speaking, listening) were split between mean scale scores and information that allowed us to calculate percent proficient.
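Where a state reported only the percent of students in each achievement level, percent proficient could be derived by summing the percentages in levels at or above the proficient cut. A minimal sketch of that derivation; the level names and percentages below are hypothetical illustrations, not actual state data:

```python
# Derive percent proficient from achievement-level percentages.
# Level names and values here are hypothetical, for illustration only.
PROFICIENT_LEVELS = {"proficient", "advanced"}

def percent_proficient(levels: dict) -> float:
    """Sum the percentages for all levels at or above proficient."""
    return sum(pct for name, pct in levels.items() if name in PROFICIENT_LEVELS)

reported = {"below basic": 18.0, "basic": 37.0, "proficient": 33.0, "advanced": 12.0}
print(percent_proficient(reported))  # 45.0
```

The same summation applies whether the levels are reported for a single modality or for a composite score.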

Performance gaps in states with composite scores are presented in Figure 32 for elementary (fourth grade), middle school (eighth grade), and high school (tenth grade) levels. It should be noted that states individually determine how composite scores are constructed, and ELP assessments vary across states; because we have data from only a few states, caution should be exercised in interpreting findings. The elementary performance gaps across the three states ranged from 29 to 42 percentage points. At the middle school level the gaps ranged from 12 to 34 percentage points, and at the high school level from 17 to 25 percentage points.
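The gaps in Figure 32 are simple differences, in percentage points, between the composite percent proficient for ELLs overall and for ELLs with disabilities. A sketch of that calculation; the composite values below are hypothetical numbers chosen to fall within the ranges reported above, not actual state data:

```python
# Performance gap in percentage points: ELLs overall minus ELLs with disabilities.
# The composite percent-proficient values are hypothetical, for illustration only.
def gap(pct_ells: float, pct_ells_with_disabilities: float) -> float:
    return pct_ells - pct_ells_with_disabilities

composites = {  # grade level: (ELLs overall, ELLs with disabilities)
    "elementary (gr 4)": (55.0, 20.0),  # gap of 35 points
    "middle (gr 8)": (40.0, 18.0),      # gap of 22 points
    "high (gr 10)": (35.0, 14.0),       # gap of 21 points
}
for level, (ells, ells_wd) in composites.items():
    print(f"{level}: {gap(ells, ells_wd):.0f} percentage points")
```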

Table 11. How ELP Scores are Reported Across States in 2008-09

States

Type of Score*

Performance

By Grade

Grade Ranges

% Proficient Can Be Calculated

Mean Scale Score

California

By Modality

X

X

Composite

X

X

Michigan

By Modality

X

X

Composite

X

X

Minnesota

By Modality

X

X

Composite

None

New York

Combined Modalities

X

X

Composite

None

Texas

By Modality

X

X

Composite

X

X

* “By modality” indicates data were reported for reading, writing, listening, and speaking separately; “Composite” indicates that a single score across all modalities was reported.

Figure 32. English Language Proficiency Assessment Performance Gaps by Composite Scores and Grade Level for ELLs with Disabilities Compared to ELLs in 2008-09

Figure 32 Bar Chart 


Reporting on Accommodations

Twenty-eight states reported participation or performance data for students taking state assessments with or without accommodations (see Appendix O for details). This number was up from 19 in 2007-08. At least part of this increase may be attributed to the fact that we included APR data in some analyses for 2008-09, such as reporting of accommodations data, even though we did not in 2007-08. Of these 28 states, all reported accommodated students’ participation, performance, or both, disaggregating by grade for at least one of their assessments. Seven states reported participation or performance by specific type of accommodation used by students (Louisiana, Colorado, Mississippi, Minnesota, North Carolina, New Hampshire, and Texas). Two states reported participation and performance for accommodations based on levels of approval for their use: non-approved/modification (Colorado) and standard and non-standard accommodation (Michigan). Two states reported accommodation use disaggregated by assessments based on modified achievement standards (Louisiana and Oklahoma). Another state (Texas) reported linguistically accommodated testing (LAT) administration for students with disabilities, as well as a “bundled” set of accommodations for students with dyslexia on the English and Spanish versions of the regular assessment.

Of all 28 states reporting data on accommodated administrations of a state assessment, 20 states reported both participation and performance data for accommodated students. Eight states reported participation only, either the number or percent participating, with accommodations (Connecticut, Minnesota, Missouri, Nebraska, North Dakota, New Hampshire, Rhode Island, and Tennessee).

Table 12 presents information on how states that reported on accommodations on state tests reported participation information. Twenty-one states (22 state tests) reported data with and without accommodations, four states (6 state tests) reported with accommodations only, one state (1 state test) reported without accommodations only, nine states (18 state tests) reported by specific accommodation, and one state (1 state test) reported by non-approved/nonstandard accommodation. For specific data reporting, 12 states reported numbers tested (16 state tests), 3 states reported percent tested (3 state tests), and 19 states reported both number and percent tested (22 state tests).

Table 12. 2008-2009 Summary of States that Reported State-Level Information about Accommodations: Participation

State

Assessment

Participation

With and
Without Accom.

By Specific Accom.

By Non-
approved/Nonstandard

Ns Reported

%s Reported

Ns and %s Reported

Alaska

ELP Test

With accom

X

Arizona

AIMS

X

X

Arkansas

Regular Assmt

X

X

Colorado

CSAP

With accom

X

X

CSAPA

With accom

X

X

CELA

X

X

X

Connecticut

CMT & CAPT

X

Can calculate

X

Florida

FCAT

X

X

Hawaii

Regular Assmt

X

X

Illinois

Regular Assmt

X

X

Indiana

ISTEP+

X

X

Iowa

ITBS/ITED

X

X

Louisiana

LAA2

With accom

X

X

LAA1

With accom

X

X

LEAP & iLEAP

Without accom

X

X

Michigan

MEAP, ACCESS, FI, MME, and ELPA

X

X

Minnesota

MCAs

X

X

Mississippi

MCT2

X

X

X

Missouri

Regular Assmt

X

Category of accom

X

X

Nebraska

Regular Assmt

X

X

North Carolina

EOG & EOC

X

X

NCEXTEND2 EOG

X

X

NCEXTEND2 OCS

X

X

Computer Skills

X

X

Writing

X

X

North Dakota

NDSA

X

X

New Hampshire

NECAP

X

X

Oklahoma

OCCT, EOI

X

X

OMAAP

X

X

Oregon

OSA

X

X

Rhode Island

NECAP

X

X

S. Carolina

Regular Assmt

X

X

S. Dakota

DSTEP

X

X

Tennessee

TCAP AT

X

X

Texas

TAKS

X

X

TAKS using Dyslexia accoms

Bundled for Dyslexia

X

LAT TAKS

With accom

X

Utah

Regular Assmt

X

X

W. Virginia

WESTEST2

X

X


Table 13 presents information on how states that reported on accommodations on state tests reported performance information. Seventeen states (18 state tests) reported data with and without accommodations, three states (3 state tests) reported with accommodations only, three states (3 state tests) reported without accommodations only, and three states (9 state tests) reported by specific accommodation. For specific data reporting, 3 states reported numbers proficient (3 state tests), 6 states reported percent proficient (11 state tests), and 15 states reported both number and percent proficient (17 state tests).

Table 13. 2008-09 Summary of States that Reported State-Level Information about Accommodations: Performance

State

Assessment

Performance

With and Without
Accomm.

By Specific
Accomm.

Ns Reported

%s Reported

Ns and %s
Reported

Alaska

ELP Test

With accom

X

Arizona

AIMS

X

X

Arkansas

Regular Assmt

X

X

Colorado

CSAP

X

X

CSAPA

Without accom

X

X

CELA

X

X

Connecticut

CMT & CAPT

Florida

FCAT

X

X

Hawaii

Regular Assmt

X

X

Illinois

Regular Assmt

X

X

Indiana

ISTEP+

X

X

Iowa

ITBS/ITED

X

X

Louisiana

LAA2

Without accom

X

X

LAA1

X

X

LEAP & iLEAP

Without accom

X

X

Michigan

MEAP, ACCESS, FI, MME, and ELPA

X

X

Minnesota

MCAs

Mississippi

MCT2

X

X

Missouri

Regular Assmt

X

Category of accom

X

Nebraska

Regular

North Carolina

EOG & EOC

X

X

NCEXTEND2

EOG

X

X

NCEXTEND2 OCS

X

X

Computer Skills

X

X

Writing

X

X

North Dakota

NDSA

X

X

New Hampshire

NECAP

Oklahoma

OCCT, EOI

OMAAP

With accom

X

Oregon

OSA

X

X

Rhode Island

NECAP

X

X

S. Carolina

Regular Assmt

S. Dakota

DSTEP

X

X

Tennessee

TCAP AT

Texas

TAKS

X

X

TAKS using Dyslexia accoms

Bundled for Dyslexia

X

LAT TAKS

With accom

X

Utah

Regular Assmt

X

X

W. Virginia

WESTEST2


“Click” Analysis of Web-based Reporting

Publicly reported data are not functionally public unless they are provided in an easily accessible manner. To examine ease of access, we analyzed the number of mouse clicks it took to locate disaggregated data on students with disabilities on the Web sites of state departments of education. This analysis is similar to previous analyses we have conducted, and presents click summary figures for all regular states with data on regular assessments and AA-AAS. For states with a Web page that generates reports, we did not count the additional clicks needed to choose specific demographic or assessment characteristics; for those sites, we counted only the clicks needed to arrive at the generator site plus a final “submit” click. Web page search engines were not used, and “false starts” were not counted.
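The counting rule described above can be made concrete. A minimal sketch, assuming a recorded click path is simply a list of page names; the page names and the "false start:" labeling convention are hypothetical, for illustration only:

```python
# Count the clicks needed to reach disaggregated data from a state homepage.
# Counting rules from the analysis: each page-to-page click counts, the final
# "submit" on a report-generator page counts, and false starts are excluded.
# Page names and the "false start:" prefix are hypothetical conventions.
def click_count(path: list[str]) -> int:
    return sum(1 for step in path if not step.startswith("false start:"))

path = [
    "false start: News page",   # abandoned path, not counted
    "Assessment",
    "Reports",
    "Report generator",
    "Submit",                   # final click on the generator page
]
print(click_count(path))  # 4 clicks
```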

Figure 33 presents the number of clicks between Web pages required to arrive at the disaggregated data for states’ regular assessments; Figure 34 presents the same information for states’ AA-AAS. For 2008-09, most state Web sites in the analysis required 3 or 4 clicks to access data (33 states for regular assessments and 24 states for AA-AAS data). No state required 7 or more clicks for regular assessments or AA-AAS. This is roughly comparable to the previous year’s analysis (2007-08), which found 34 states with 3-4 clicks and 1 state with 7 or more clicks for regular assessments, and 25 states with 3-4 clicks and 1 state with 7 or more clicks for AA-AAS. However, because Web sites change frequently, and because the number of states reporting data changes from year to year, an exact comparison is not possible. For example, in the past two years, all but one state reported regular assessment data. For 2007-08, 42 states reported AA-AAS data, while 44 did so in 2006-07.

Figure 33. Number of States in Each “Click” Category for States Reporting Regular Assessments (Total N=50)

Figure 33 Pie Chart 

Figure 34. Number of States in Each “Click” Category for States Reporting AA-AAS (Total N=43)

Figure 34 Pie Chart


Summary and Discussion

Extent of Public Reporting for Students with Disabilities

2008-09 marked the first year that all 50 regular states were counted as having at least some disaggregated data for students with disabilities publicly reported in a manner comparable to that of their data for general education or all students. In the previous year, 49 of 50 states had data publicly reported online. Forty-six states reported participation and performance for all regular assessments, with 4 states reporting these data for some but not all regular assessments within ESEA accountability systems; these numbers were similar to those for 2007-08. There were 32 states reporting participation and performance data for all regular assessments in and outside the ESEA accountability systems, and 18 states that reported these data for some assessments in and outside the system. This difference in how states reported on assessments in and outside ESEA systems remained about the same as in 2007-08.

For regular states reporting on alternate assessments based on alternate achievement standards (AA-AAS), the number of states with at least some data increased to 45. Of these, there were 44 states reporting both participation and performance, up from 36 states the year before, and one state reporting participation only.

 

Extent of Reporting Compared to Previous Years

The number of states reporting both participation and performance for all regular assessments inside ESEA accountability systems varied by only one state over the past three years: 46 in 2006-07, 45 in 2007-08, and 46 in 2008-09. For this most recent year, states reported disaggregated participation and performance data online for 92% of all regular assessments within ESEA systems. Four states reported participation or performance for some but not all assessments within ESEA. No state reported only participation or only performance for all state assessments. The number of unique states, including special territories, publicly reporting disaggregated participation and performance data for regular assessments in 2008-09 remained at four, the same as in 2007-08.

The number of regular states publicly reporting participation and performance for AA-AAS increased to 44 after remaining at 36 from 2006-07 to 2007-08. The number of states that did not publicly report AA-AAS data declined to 5, down from 8 the previous year. Just one state reported performance only for its AA-AAS. Among unique states, 4 reported both participation and performance on AA-AAS in 2008-09, in contrast with either one or no states reporting these data from 2005-06 to 2007-08.

 

Extent of Public Reporting for ELLs with Disabilities

Most states do not disaggregate data for ELLs with disabilities, though the number that do is increasing slightly from year to year. Compared with just one state reporting participation and performance data on a regular assessment in 2006-07 (Albus, Thurlow, & Liu, 2009), in 2008-09 five states reported participation and performance for all regular assessments and three states reported these data for some regular assessments. For AA-AAS, 20 states reported both participation and performance, 3 states reported only performance, and 1 state reported only participation. This compares with 14 states that reported participation and performance data for an AA-AAS in 2006-07, when one state reported participation only and 2 states reported performance only. In 2008-09, one unique state, the District of Columbia, reported participation information on a regular assessment. One other unique state, Puerto Rico, reported participation information for ELLs with disabilities on its AA-AAS. Neither unique state reported disaggregated performance for either assessment.

 

How Data Are Reported

In the past three years, states’ most common approaches for communicating participation and performance on regular assessments and AA-AAS remained the same. For participation, the most common approach was reporting the number assessed (37 states for regular assessments and 36 for AA-AAS). For performance, the most common approach was reporting the percent of students in each achievement level (39 states for regular assessments and 26 for AA-AAS), followed by percent proficient (26 states for regular assessments and 17 for AA-AAS). These numbers were identical to those for 2007-08. Data on ELLs with disabilities were reported using the same approaches as for students with disabilities overall, although more states reported exemption information for ELLs with disabilities. This may be due to regulations that allow certain ELLs to be exempted from some testing based on the amount of time they have been in the country, although ELLs with disabilities should be included in the data reported for all students with disabilities.

 

How Students Performed on Regular Assessments

The general pattern of decreasing gaps between general education students and students with disabilities appears to have reversed slightly in 2008-09, with indications of larger gaps across all content areas and grade levels. Still, scores increased for both populations across all grades and content areas; the larger gaps appear to be due to a larger overall mean gain among general education students than among students with disabilities. Although the mean gap generally grew, the mean gaps remained smaller overall in elementary reading and middle school mathematics. The mean gaps grew the most in elementary mathematics and middle school reading, where general education students made even larger gains relative to students with disabilities.
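The arithmetic behind this pattern, in which both groups improve while the gap widens, is straightforward: the change in the gap equals the difference in the two groups' gains. A sketch with hypothetical percent-proficient values, not actual state data:

```python
# A gap widens whenever the higher-scoring group gains more, even though
# both groups improve. All values are hypothetical, for illustration only.
gen_ed = {"2007-08": 75.0, "2008-09": 79.0}   # percent proficient, general education
swd    = {"2007-08": 40.0, "2008-09": 42.0}   # percent proficient, students with disabilities

gain_gen = gen_ed["2008-09"] - gen_ed["2007-08"]   # +4 points
gain_swd = swd["2008-09"] - swd["2007-08"]         # +2 points
gap_before = gen_ed["2007-08"] - swd["2007-08"]    # 35 points
gap_after  = gen_ed["2008-09"] - swd["2008-09"]    # 37 points

# Change in gap = difference in gains.
assert gap_after - gap_before == gain_gen - gain_swd
print(gap_before, gap_after)  # 35.0 37.0
```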

Performance for ELLs with disabilities showed a consistent pattern in where the smallest mean gaps occurred, although this analysis included only five states. Compared both to students with disabilities who are not ELLs and to general education students, ELLs with disabilities repeatedly showed the smallest mean gap in mathematics, followed by science and reading. The only exception was at the high school level, where the comparison with general education students showed the smallest gap in science, followed by mathematics and reading. Similar mean gap comparisons for students with disabilities who are not ELLs within the five states showed no comparably consistent pattern across grades by content area. In some reported data, ELLs with disabilities had a higher percentage proficient than their peers with disabilities who were not ELLs. Caution should be used in interpreting data within a state or across states because of differences in the characteristics of the populations taking each state’s regular assessment; these differences are related to variations in the assessment options available in states as well as other factors.

 

Performance on AA-MAS

Eight states reported data on their AA-MAS, with six reporting disaggregated participation and performance, one state reporting disaggregated participation but merged performance with the regular assessment, and one state reporting participation and performance merged with its regular assessment data. Of these eight states, four reported disaggregated participation and performance for ELLs with disabilities taking the state AA-MAS.

 

Performance on Title III ELP Assessments

Only 10 states reported participation or performance data for any students, with or without disabilities, taking an English language proficiency assessment. Among these states, 5 reported data for ELLs with disabilities on English language proficiency assessments used for Title III accountability. Three of these five states reported performance by grade using similar reporting categories. These showed the largest mean gaps at elementary grades, with somewhat smaller mean gaps at middle and high school grades. Caution should be used in interpreting data from such a small number of states because of differences across states in test construction and design and in how composite scores are constructed. No unique state reported Title III ELP assessment data.

 

Accommodations Reporting

Accommodations reporting continued its upward trend from previous years, with the number of states reporting disaggregated data for students who used accommodations on state assessments rising to 28 in 2008-09, up from 19 in 2007-08 and 16 in 2006-07. Of these 28 states, 7 reported either participation or performance by specific type of accommodation used by students. Two states reported participation and performance for accommodations based on levels of approval for their use (i.e., non-approved/modification and standard and non-standard accommodations). Two states reported accommodations used on an AA-MAS, and one state reported linguistically accommodated testing for students with disabilities and a “bundled” set of accommodations for students with dyslexia, also by ELL status, for English and Spanish versions of regular assessments. Twenty states reported both participation and performance data for accommodated students. Slightly more states reported participation data than performance data for students using accommodations on state tests.

 

Recommendations for Reporting

The following recommendations are offered concerning public reporting of disaggregated data for students with disabilities:

  1. Report participation and performance results for each assessment, content area, and grade level.
  2. Clearly label preliminary and final data with dates posted.
  3. Report participation with accommodations.
  4. Report participation percentages, disaggregated by grade.
  5. Make data accessible by attending carefully to the usability of formats, ease of finding information, and clarity of language.

For the 2008-09 school year, most states reported data by assessment, content area, grade level, and whether the data were final. States with more than one version of finalized data posted also publicly communicated which version of the reports to use. As in the previous year, a few states merged their regular and AA-AAS performance data, or combined other alternate assessments that are not clearly identified in reports. Although combining regular and alternate assessment data makes sense for some accountability purposes, it does not allow analysis of data by test, and is therefore less desirable for those wishing to carry out such analyses.

This year, more states reported data on accommodated participation and performance than in previous years. The number of states reporting participation percentages remained about the same as in previous years; however, states continue to differ in the denominator used to calculate the percentage (the number of students tested in the system versus the number enrolled at the grade level). Finally, the accessibility of reports has remained about the same over the past few years, as measured by the number of clicks it takes to get to assessment data from a state’s homepage. Despite the improvements, more can be done to ensure that data are presented in accessible formats for a broad population of users. States may wish to consider the stakeholders using the data to determine how best to improve the accessibility of data on their Web sites; for example, some states provide resources in other languages for understanding state assessments and results.
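The denominator issue noted above is easy to see with numbers: the same count of accommodated test-takers yields different percentages depending on whether the base is students tested or students enrolled. A sketch with hypothetical counts, not actual state data:

```python
# Two common denominators for a participation percentage give different rates.
# All counts are hypothetical, for illustration only.
tested_with_accommodations = 180
students_tested = 900          # students with disabilities tested in the system
students_enrolled = 1000       # students with disabilities enrolled at grade level

pct_of_tested = 100 * tested_with_accommodations / students_tested      # 20.0%
pct_of_enrolled = 100 * tested_with_accommodations / students_enrolled  # 18.0%
print(f"{pct_of_tested:.1f}% of tested vs {pct_of_enrolled:.1f}% of enrolled")
```

Because the two bases differ, percentages are not comparable across states unless the denominator is stated.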


Conclusion

Although reporting practices for regular assessments have changed little for 2008-09 compared to the previous years, this year did mark the first time all 50 states reported disaggregated data for at least some state assessments in ESEA accountability systems. Reporting on AA-AAS also improved over the previous two years. Further, all states with AA-MAS reported participation and performance data.

For performance, although mean gaps between students with disabilities and regular students on regular assessments increased across all grades and content areas, mean performance for both populations improved in all grades and content areas; regular students simply showed larger mean gains than in the previous year. Unique states held steady in reporting on regular assessments and made some improvement in the number reporting on AA-AAS. Further, with the inclusion of ELLs with disabilities in this report, we saw that although fewer states report on these students for regular assessments, the number is increasing over prior reports, and nearly half of the regular states report on AA-AAS for this population. For Title III ELP assessments, half of the regular states that publicly reported data for ELLs did so for ELLs with disabilities.

Finally, the publicly reported disaggregated participation and performance data described in this report covered a variety of state assessments based on state content standards. States have increased the breadth of their reporting over the years, partly because of additional testing options and partly because of more detailed reporting, including data not reported previously.

Most states now have adopted the common core state standards and will be transitioning to new assessments designed to be used by consortia of states. We anticipate that as states implement the new assessments, some of the current limitations in data interpretation will disappear. Assuming the continued disaggregation of publicly reported data by subgroups, we believe that we will gain a clearer national picture of the participation and performance of students with disabilities.


References

Albus, D., Thurlow, M., & Bremer, C. (2009). Achieving transparency in the public reporting of 2006-2007 assessment results (Technical Report 53). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Albus, D., Thurlow, M., & Liu, K. (2009). State reports on the participation and performance of English language learners with disabilities in 2006-2007 (Technical Report 54). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Bremer, C., Albus, D., & Thurlow, M. L. (2011). Public reporting of 2007-2008 assessment information on students with disabilities: Progress on the gap front (Technical Report 57). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Klein, J. A., Wiley, H. I., & Thurlow, M. L. (2006). Uneven transparency: NCLB tests take precedence in public assessment reporting for students with disabilities (Technical Report 43). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., Bremer, C., & Albus, D. (2008). Good news bad news in disaggregated subgroup reporting to the public on 2005-2006 assessment results (Technical Report 52). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., & Wiley, H. I. (2004). Almost there in public reporting of assessment results for students with disabilities (Technical Report 39). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Thurlow, M. L., Wiley, H. I., & Bielinski, J. (2003). Going public: What 2000-2001 reports tell us about the performance of students with disabilities (Technical Report 35). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

VanGetson, G. R., & Thurlow, M. L. (2007). Nearing the target in disaggregated subgroup reporting to the public on 2004-2005 assessment results (Technical Report 46). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Wiley, H. I., Thurlow, M. L., & Klein, J. A. (2005). Steady progress: State public reporting practices for students with disabilities after the first year of NCLB (2002-2003) (Technical Report 40). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.


Appendix A

Example Letter to Assessment Director

The National Center on Educational Outcomes is examining states’ public reports on 2008-2009 school year assessment results. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities and English language learners (ELLs) with disabilities, (c) describe the way participation and performance information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications. This year we have combined verification of data for ELLs with disabilities to streamline our contact with you.

We have reviewed your Web site for test information, including both participation and performance data on your statewide assessments. Enclosed are tables highlighting our findings from that review. Please verify all included information. Specifically, please return the tables that we have attached, noting your changes to them. Also, if there is additional publicly reported information available for your state, please provide us with the public document and/or website that contains the accurate information. Address your responses to Deb Albus via email (albus001@umn.edu), fax (612) 624-0879, or mail to the above address.

If you have any questions about our request, please email Deb Albus or call at (612) 626-0323. Please respond by April 19, 2010.

Thank you for taking the time to provide this information.

Sincerely,

Martha Thurlow
Director

Deb Albus
Research Fellow

 

ALABAMA, 2008-2009
(Tables 1- 6)

Table 1: Tests Administered and Results Found on Your State’s Regular Report(s)

Please review this table for its accuracy, make any changes (if necessary), and fill in any blank fields.

(SwD = students with disabilities; ELLwD = ELLs with disabilities. The “Partic.” and “Perform.” columns indicate whether disaggregated participation and performance information is reported for each group.)

Test | Grades Tested | Subject Areas | SwD Partic. | SwD Perform. | ELLwD Partic. | ELLwD Perform. | Part of NCLB System? (Yes/No)
DIBELS | K-2 | Reading | No | No | No | No | No
Direct Assessment of Writing (DAW) [CRT] | 5, 7, 10 | Writing | Yes | Yes | No | No | No
Alabama High School Graduation Exam (AHSGE) [EXIT] | 11, 12 | Reading, Language, Math, Science, Social Studies | Yes | Yes | No | No | Yes
Stanford Achievement Test, 10th ed. (SAT-10) [NRT] | 3-8 | Reading, Math, Language (5-8), Science (5, 7), Social Studies (6) | Yes | Yes | No | No | No
Alabama Reading and Mathematics Test (ARMT) [CRT] | 3-8 | Reading, Math | Yes | Yes | No | No | Yes
Alabama Science Assessment | 5, 7 | Science | Yes | Yes | No | No | Yes
Otis-Lennon School Ability Test (OLSAT 8) [NRT] | 3-8 | Does not specify | No | No | No | No | No
Alabama Alternate Assessment (AAA), AAS* | K-11 | Reading, Math, Science (5, 7, 11) | No | No | No | No | Yes
Title III ELP assessment | K-12 | Reading, Writing, Speaking, Listening | In next columns | In next columns | No | No | Yes

Standards: *AAS = based on alternate achievement standards; GLAS = based on grade-level achievement standards.
Assessment types: CRT = criterion-referenced test; NRT = norm-referenced test; EXIT = diploma test.

 

Table 2: Participation Information for Students with Disabilities

Please review this table. A “Y” indicates we found data reported this way in your state’s regular report(s). Please add a “Y” if your state uses additional categories in your regular report(s), and please provide us with the information (either a hard copy or a Web link). A regular report is a public report summarizing data for students with disabilities in a manner equivalent to that used for state data reporting for students without disabilities or for all students.

Note: “Y” marks indicate categories the state uses descriptively (e.g., we do not add percentages of students across achievement levels to get a total percent proficient for this table).

All indicators refer to data reported by grade and individual test.

| Test | Percent of Students by Assessment (e.g., 4% in alternate on AAS) | Number of Students Tested | Number of Students Not Tested | Percent of Students (participation rate, e.g., 98% gr. 4) | Percent of Students Not Tested | Number and/or Percent Exempt or Excluded | Number and/or Percent Absent |
|---|---|---|---|---|---|---|---|
| AHSGE | Y | N | N | Y | N | N | N |
| SAT-10 | Y | N | N | Y | N | N | N |
| ARMT | Y | N | N | Y | N | N | N |
| Science | Y | N | N | Y | N | N | N |
| OLSAT 8 | N | N | N | N | N | N | N |
| Writing | Y | N | Y | N | N | N | N |
| DIBELS | N | N | N | N | N | N | N |
| AAA | N | N | N | N | N | N | N |

 

Table 3: Participation Information for ELLs with Disabilities

All indicators refer to data reported by grade and individual test.

| Test | Percent of Students by Assessment (e.g., 4% in alternate on AAS) | Number of Students Tested | Number of Students Not Tested | Percent of Students (participation rate, e.g., 98% gr. 4) | Percent of Students Not Tested | Number and/or Percent Exempt or Excluded | Number and/or Percent Absent |
|---|---|---|---|---|---|---|---|
| AHSGE | N | N | N | N | N | N | N |
| SAT-10 | N | N | N | N | N | N | N |
| ARMT | N | N | N | N | N | N | N |
| Science | N | N | N | N | N | N | N |
| OLSAT 8 | N | N | N | N | N | N | N |
| Writing | N | N | N | N | N | N | N |
| DIBELS | N | N | N | N | N | N | N |
| AAA | N | N | N | N | N | N | N |

 

Table 4: Performance Information for Students with Disabilities

Please review this table. A “Y” indicates we found data reported this way in your state’s regular report(s). Please add a “Y” if your state uses additional categories in your regular report(s), and please provide us with the information (either a hard copy or a Web link). A regular report is a public report summarizing data for students with disabilities in a manner equivalent to that used for state data reporting for students without disabilities or for all students.

Note: “Y” marks indicate categories the state uses descriptively (e.g., we do not add percentages of students across achievement levels to get a total percent proficient for this table).

All indicators refer to data reported by grade and individual test.

| Test | Percent in Each Achievement Level | Percent in Each PR* Group | Percent Proficient | Percent Not Proficient | Number in Each Achievement Level | Number Proficient | Number Not Proficient | Other |
|---|---|---|---|---|---|---|---|---|
| AHSGE | Y | N | N | N | N | N | N | N |
| SAT-10 | N | N | N | N | N | N | N | Y (Percentile) |
| Science | Y | N | N | N | N | N | N | N |
| ARMT | Y | N | N | N | N | N | N | N |
| OLSAT 8 | N | N | N | N | N | N | N | N |
| Writing | Y | N | N | N | N | N | N | N |
| AAA | N | N | N | N | N | N | N | N |

*PR = percentile rank

 

Table 5: Performance Information for ELLs with Disabilities

All indicators refer to data reported by grade and individual test.

| Test | Percent in Each Achievement Level | Percent in Each PR* Group | Percent Proficient | Percent Not Proficient | Number in Each Achievement Level | Number Proficient | Number Not Proficient | Other |
|---|---|---|---|---|---|---|---|---|
| AHSGE | N | N | N | N | N | N | N | N |
| SAT-10 | N | N | N | N | N | N | N | N |
| Science | N | N | N | N | N | N | N | N |
| ARMT | N | N | N | N | N | N | N | N |
| OLSAT 8 | N | N | N | N | N | N | N | N |
| Writing | N | N | N | N | N | N | N | N |
| AAA | N | N | N | N | N | N | N | N |

*PR = percentile rank

 

Table 6: Accommodations

We are interested in whether and how states report information about students who take assessments using accommodations. Please change our responses (if necessary) to reflect information that is reported for your state. If you do make changes, please provide us with the information (either a hard copy or a Web link).

| Tests Reporting Data on Accommodations | Accommodation Categories | Disaggregated Participation Info Reported? (Yes/No) | Disaggregated Performance Info Reported? (Yes/No) | For Whom? |
|---|---|---|---|---|
| None |  |  |  |  |


Appendix B

Example Letter to Special Education Director

The National Center on Educational Outcomes is examining states’ public reports on 2008-2009 school year assessment results. Our goal is to (a) identify all components of each state’s testing system, (b) determine whether each state reports disaggregated test results for students with disabilities and English language learners (ELLs) with disabilities, (c) describe the way participation and performance information is presented, and (d) describe how states report results for students who took the test with accommodations or modifications.

We have reviewed your Web site for test information, including both participation and performance data on your statewide assessments. Enclosed are tables highlighting our findings from that review. Please verify all included information. Specifically, please return the tables that we have attached, noting your changes to them. Also, if there is additional publicly reported information available for your state, please provide us with the specific Web address that contains the information.

New this year:
Verification for ELLs with disabilities, to streamline our contact with you
Tables 2-6 may include SPP, APR, or other reports

Reminder:
Tables 2-5 include data by test and grade.
These tables do not include data that require further calculations using other reported data.

Address your responses to Deb Albus via email at albus001@umn.edu or by fax at (612) 624-0879. If you have any questions about our request, please email Deb Albus or call (612) 626-0323. Please respond by June 1, 2010, though we are flexible concerning this timeline.

Thank you for taking the time to provide this information.

Sincerely,

Deb Albus
Research Fellow


NCEO is supported primarily through a Cooperative Agreement (#H326G050007, #H326G110002) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. Additional support for targeted projects, including those on LEP students, is provided by other federal and state agencies. Opinions expressed in this Web site do not necessarily reflect those of the U.S. Department of Education or Offices within it.