A Comparison of IEP/504 Accommodations Under Classroom and Standardized Testing Conditions: A Preliminary Report on SEELS Data

Synthesis Report 63

Nicole Bottsford-Miller • Martha L. Thurlow • Karen Evans Stout •
Rachel F. Quenemoen

September 2006

All rights reserved. Any or all portions of this document may be reproduced and distributed without prior permission, provided the source is cited as:

Bottsford-Miller, N., Thurlow, M. L., Stout, K. E., & Quenemoen, R. F.  (2006). A Comparison of IEP/504 accommodations under classroom and standardized testing conditions: A preliminary report on SEELS Data (Synthesis Report 63). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved [today's date], from the World Wide Web: http://education.umn.edu/NCEO/OnlinePubs/Synthesis63/

Table of Contents

Executive Summary
Why This Report?
Data Source and Methods
Extended Time
Read Aloud
Setting Accommodation
Response Accommodations
Modified Test
Alternate Test
Summary and Discussion of Accommodations, Modification, and Alternate Testing

Executive Summary

Individualized Education Programs (IEPs) and 504 Plans often recommend the use of accommodations to facilitate the learning of classroom material by students with disabilities. Since the reauthorization of the Individuals with Disabilities Education Act (IDEA) in 1997, students with disabilities have been expected to participate in state and district-wide assessments, using appropriate accommodations. Large-scale assessment assumes the use of standardized testing conditions to allow for comparability of test scores; however, some students with disabilities are better able to demonstrate their knowledge when allowed to use accommodations that offset the effect of their disability without changing the construct tested. For these students, de-standardizing the test conditions is the only meaningful way to obtain an accurate estimate of achievement.

In 1999 the Special Education Elementary Longitudinal Study (SEELS) began to examine the experience of elementary age students in schools, and pertinent to this report, their experience in testing situations, particularly the use of accommodations. In this paper, using data from SEELS, we examine accommodation use across different educational conditions, comparing IEP and 504 Plan accommodations to what students reportedly received in the classroom and on standardized tests.

Results suggest a lack of alignment in accommodation use among IEP/504 Plans, classroom conditions, and state testing situations. Additionally, there is some variability in what happens for students with different categorical labels. Since the data for this report were obtained from an early administration of SEELS (Wave 2), monitoring of the alignment issue should continue.


With increasing importance placed on including students with disabilities in large-scale assessment, psychometricians, researchers, and policymakers are pressed to understand who is receiving accommodations, what accommodations are provided, whether there is consistency in accommodation use across educational settings, and how accommodations impact test scores. In 1999 the Special Education Elementary Longitudinal Study (SEELS) began a 6-year study of elementary age students that will help to answer some of these questions as well as others related to the experiences of students with disabilities.

One of several areas that SEELS researchers have begun examining is accommodation and support, as used by students with disabilities in language arts classes. Comparisons have been made across disability categories, general and special education classrooms, and several additional student demographic characteristics. In a similar national longitudinal study on the experiences of middle and high school students, National Longitudinal Transition Study-2 (NLTS2) researchers examined the frequency with which different accommodations were used on standardized tests, comparing disability categories, race/ethnicity, and income (http://www.nlts2.org/index.html).

Both the SEELS and NLTS2 studies were commissioned following the reauthorization of the Individuals with Disabilities Education Act (IDEA) in 1997, which for the first time in special education law confirmed that students with disabilities were to participate in state and district-wide assessments, with accommodations where appropriate. This dramatic change in special education federal law in 1997 promoted the need for the SEELS and NLTS2 studies, which to some extent because of their timing would both reflect and be caught up in the changes that were sure to occur over time.

Similar to previous SEELS and NLTS2 reports on accommodation, the goal of this report is to examine accommodation use during the elementary and middle school years. Unlike previous reports, we examine accommodation use across different educational conditions, comparing IEP and 504 Plan accommodations to what students reportedly received in the classroom and on standardized tests. In doing this, we hope to gain a better understanding of the consistency with which accommodations are provided. This report is a preliminary analysis based on SEELS data available at http://www.seels.net/.


The value of implementing large-scale assessment under standardized conditions is in the comparability of test scores. If all else is held constant, score differences can be attributed to differences in learning achievement (Salvia & Ysseldyke, 2001). As Salvia and Ysseldyke recognize, however, the process of standardization can prevent the accurate measurement of achievement for students with disabilities. In order to accurately measure achievement, such students need accommodations that offset the effect of the disability on their ability to demonstrate knowledge (McDonnell, McLaughlin, & Morison, 1997; Rose, 2000). In other words, in some circumstances the only way to obtain a meaningful estimate of achievement is to de-standardize the testing process (e.g., providing enlarged print for a student with a visual impairment, or providing a reader for a student with a reading disability; Phillips, 1994).

Research suggests that when some accommodations are provided consistently in both the original learning environment and the testing process, these accommodations have a greater impact on student assessment performance. For example, Russell (1999) compared the impact of handwritten responses to typed responses on standardized test scores. Results suggested that students performed better when they were allowed to record answers in the format in which they learned and practiced the skills and knowledge. That is, students who were accustomed to writing on computers tended to generate higher quality responses on open-ended test items when allowed to use computers than students less accustomed to writing on a computer. Similarly, students who were not as computer proficient performed better on open-ended standardized test questions when allowed to handwrite answers than students more accustomed to writing on computers. Although Russell was not studying students with disabilities, the results support the need to match accommodations used in testing to accommodations typically available to students in class in order to maximize students’ ability to demonstrate learning achievement (Sireci, Scarpati, & Li, 2005).

Why This Report?

Given the high stakes associated with tests for both students and schools, and given research suggesting that students perform better when provided similar accommodations in both classroom learning and testing situations, we need to understand what is occurring in classrooms and on assessments, beyond looking at state policy guidelines and reports. In addition to observations, an effective way of determining what accommodations are available to students with disabilities is to ask teachers and school personnel to report on student IEP/504 Plans and accommodation use in classrooms and on standardized tests.

This preliminary report compares the reported proportions of IEP/504 Plan accommodations with the proportions of accommodations used in class and in standardized testing conditions. The purpose is to explore proportional consistency across conditions. Additionally, we compare the use of various accommodations across four of the most common disability categories. Assessment accommodation policies influence the accommodations that may or may not be used and how scores are counted; nevertheless, a general understanding of how accommodation practices vary across students with differing disabilities will facilitate student participation in education in a fair and appropriate manner.

Data Source and Methods

The data for this report are from the Special Education Elementary Longitudinal Study (SEELS), a national study conducted by SRI International and Westat for the U.S. Department of Education. The purpose of SEELS is to document "the educational experiences and progress of thousands of elementary and middle school students nationwide who received special education services in 1999" (SEELS Newsletter, 2003, p. 1). To obtain a more comprehensive picture of the experiences of participants, information is being collected over a 6-year period from parents, teachers, and students. This information pertains to a variety of experiences students with disabilities have, ranging from academic performance and school experiences to family supports, social adjustment, and personal growth. Information is being collected in three waves. Our review is limited to 2001-2002 (Wave 2) information available on the SEELS Web site pertaining to accommodations as required or recommended on student IEP/504 Plans and their reported use in language arts classes and on standardized tests.

SEELS Sample. The SEELS sample consists of 11,512 students receiving special education in at least one grade from first to seventh grades. The representative sample was selected from 245 LEAs and 32 special schools across the country so that statistical summaries from SEELS will "generalize to special education students nationally as a group, to each of the 13 federal special education disability categories, and to each single-year age cohort" (http://www.seels.net/studydesign.htm).

SEELS Data Collection Instruments. SEELS researchers collect information about students repeatedly as they progress from elementary to middle school and from middle school to high school. Sources include: (1) a parent/guardian interview; (2) a Teacher Survey; (3) a School Program Survey; (4) a School Characteristics Survey; (5) transcripts; and (6) direct assessments of students’ reading and math skills, self-concept, and attitudes about school.

Selection of SEELS Data on Accommodations for This Report. We were most interested in three specific uses of accommodations, that is, what accommodations were stated on students’ IEP/504 Plans, what accommodations were used in classrooms, and what were used on standardized tests. Accordingly, although all four of the above listed questionnaires have questions pertaining to accommodations, we chose as our data source the two that were completed by people most likely to know what the student was receiving on a daily basis and to have direct access to the student’s IEP/504 Plan. These questionnaires are the Teacher Survey, which was completed by the student’s language arts teacher, and the School Program Survey, which was completed by the school person who knows the student’s programs well.

In terms of selecting the accommodations of interest for this report, we noted some discrepancies in the wording of items on the data sources (e.g., the terms "modification" and "accommodation" are used interchangeably, making it unclear how respondents may have interpreted the questions). This interchange of terms is consistent with the terminology in IDEA 97; it was changed in IDEA 2004 because the field did not actually use the terms accommodation and modification interchangeably. Specific accommodations were selected for inclusion in the report either because, as response choices, they were similar across conditions (IEP, language arts class, and standardized testing) or because a similarly worded response choice was available under all three conditions. These included: extended time, read aloud, setting accommodations, response accommodations, modified test, and alternative/alternate tests. Accommodations focused only on the classroom are not discussed (e.g., slower-paced instruction, shorter/different assignments).

Another discrepancy was the use of the words "alternate" and "alternative." Survey instruments used the term "alternate tests or assessments", but data tables referred to "alternative assessments." The word "alternative" was also used with other nouns such as "alternative settings." We use "alternate" in this report to refer to alternate assessments.

Selection of Student Groups to Compare. SEELS data tables present results for twelve disability categories. In this report we describe the accommodations/modifications recommended and used by the four most prevalent disability categories. These groups represent approximately 90% of all students with disabilities (U.S. Department of Education, 2002). These categories include learning disability (LD), speech/language impairment (SI), mental retardation (MR), and emotional disturbance (ED). To view disaggregated information for less frequently appearing disabilities or additional information about the study, go to the SEELS Web site at http://www.seels.net/.

Definitions. For the purposes of this report, the following definitions were used:

Accommodations are "those alterations to test presentation, setting, timing, scheduling, and response that mitigate the barrier of disability and allow a student with disabilities to demonstrate actual achievement in a particular academic area without changing the underlying construct of what is being measured" (Shaftel, Yang, Glasnapp, & Poggio, 2005, p. 358). It follows that test accommodations "enable students to participate in state or district assessments in a way that assess abilities rather than disabilities" (Lehr & Thurlow, 2003, p. 2).

The Standards (AERA, 1999) define a test modification as "changes made in the content, format, and/or administration procedure of a test in order to accommodate test takers who are unable to take the test under standard test conditions" (p. 183). In many statewide testing programs, a distinction is made between accommodation and modification: an accommodation does not alter the construct measured, while a modification is a change that alters the construct (Sireci, Scarpati, & Li, 2005). In the SEELS data such a distinction is not made, and the words are used interchangeably.

a) Extended time—"Increase the allowable length of time to complete a test or assignment (and/or). . . change the way the time is organized" (Cortiella, 2005, p. 3)

b) Read aloud—a presentation accommodation; presentation accommodations "allow students to access information in ways that do not require them to visually read standard print. These alternate modes of access are auditory, multi-sensory, tactile, and visual" (Cortiella, 2005, p. 3).

c) Setting—"Change the location in which a test or assignment is given or the conditions of the assessment setting" (Cortiella, 2005, p.3).

d) Response—"Allow students to complete activities, assignments, and tests in different ways to solve or organize problems using some type of assistive device or organizer" (Cortiella, 2005, p. 3).

e) Other—"Special test preparation, on task/focusing prompts, and others that do not fit into other categories" (Thurlow, Elliott, & Ysseldyke, 2003, p. 31).

Alternate assessments. Alternate assessments are tools used to evaluate the performance of students who are unable to participate in general state assessments even with accommodations (Thurlow, Elliott, & Ysseldyke, 2003).

Organization of the findings. In what follows we first present the overall comparison of accommodations as stated on IEP/504 Plans, as used in the classroom, and as used for standardized tests. We then show differences in accommodation by disability type. Each section is followed by a short discussion. This same presentation format continues for modified test and alternate test. Finally we consider the findings and their implications as a whole.


Extended Time

Extended Time by Condition. Differences across conditions existed in the relative proportion of students receiving extended time (see Figure 1). Although 76.2% of students had this accommodation stated on IEP/504 Plans, and 75.6% were reported to have received additional time taking tests in language arts classes, only 53.3% received this accommodation on standardized tests.

Figure 1. Extended Time: Disabilities Combined

* Standard error of measurement (SEM): IEP/504 Plan (IEP/504) = 1.6 (n = 4,933), Language Arts Class (Class) = 1.5 (n = 5,243), Standardized Tests (Test) = 2.1 (n = 3,241).
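For readers unfamiliar with how standard errors like those above relate to the reported percentages, the sketch below shows the simple-random-sample calculation for the standard error of a proportion. This is illustrative only: SEELS uses a weighted, clustered national sample, so its published standard errors are larger than this baseline formula yields.

```python
import math

def se_of_proportion(pct, n):
    """Simple-random-sample standard error of a proportion, in
    percentage points: 100 * sqrt(p * (1 - p) / n). SEELS's clustered,
    weighted design produces larger standard errors than this baseline."""
    p = pct / 100.0
    return 100.0 * math.sqrt(p * (1.0 - p) / n)

# Extended time on standardized tests: 53.3% of n = 3,241 respondents.
# The SRS baseline is roughly 0.9 points; SEELS reports 2.1.
print(round(se_of_proportion(53.3, 3241), 2))
```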


Extended Time by Disability Group. The proportional use of extended time varied somewhat across disability categories and conditions. Figure 2 shows that of the four disability categories, only the percentage of students with mental retardation using extended time tended to appear equivalent across conditions (MR: IEP/504 = 71.0%, Class = 71.2%, Tests = 71.5%). There was more variability for the other disability groups. Students with emotional disturbance and learning disability tended to have congruence between the IEP/504 Plan and use of the accommodation in language arts classes, with less use in the standardized testing condition (LD: IEP/504 = 83.5%, Class = 85.0%, Test = 58.1%; ED: IEP/504 = 78.4%, Class = 78.5%, Test = 59.9%). The greatest disparity occurred for students with a speech/language impairment (SI: IEP/504 = 66.4%, Class = 61.2%, Test = 35.2%); these students were also least likely to receive extended time in any condition.

Figure 2. Extended Time: Separated by Disability

* SEM - LD: IEP/504 = 2.3 (n = 557), Class = 2.2 (n = 605), Test = 3.3 (n = 458); MR: IEP/504 = 2.7 (n = 550), Class = 2.8 (n = 573), Test = 3.7 (n = 286); SI: IEP/504 = 4.7 (n = 202), Class = 4.3 (n = 274), Test = 4.3 (n = 238); ED: IEP/504 = 2.8 (n = 448), Class = 3.1 (n = 470), Test = 3.8 (n = 343).


Extended Time Discussion. The provision of additional time is one of the most frequently used accommodations for assessments (Sireci, Li, & Scarpati, 2003). In a review of research on time accommodations, Sireci et al. (2003) found evidence suggesting that students with disabilities tend to benefit more from extended time than students without disabilities. Sireci, Scarpati, and Li (2005) analyzed test accommodations in terms of the interaction hypothesis, which postulates that an accommodation does not threaten the validity of test scores if the use of it improves the performance of students with disabilities but does not improve the performance of students without disabilities. These researchers found that extended time improves the performance for all students but that students with disabilities experience greater gains.
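The interaction hypothesis described above can be expressed as a simple decision rule. The sketch below is illustrative only (the gain figures and tolerance threshold are hypothetical, not SEELS or Sireci et al. results): an accommodation's score gain for students without disabilities should be negligible, while students with disabilities may show a real gain.

```python
def consistent_with_interaction_hypothesis(gain_swd, gain_nondis, tolerance=1.0):
    """Return True if score gains fit the interaction-hypothesis pattern:
    students with disabilities (SWD) benefit from the accommodation while
    students without disabilities gain little or nothing. Gains are mean
    accommodated-minus-unaccommodated score differences; `tolerance` is the
    largest non-SWD gain still treated as negligible. All numbers used with
    this function here are hypothetical, for illustration only."""
    return gain_swd > tolerance and gain_nondis <= tolerance

# Hypothetical extended-time gains, in score points:
print(consistent_with_interaction_hypothesis(gain_swd=6.0, gain_nondis=0.5))  # True
print(consistent_with_interaction_hypothesis(gain_swd=6.0, gain_nondis=4.0))  # False
```

The second case mirrors the finding reported above: when extended time improves performance for all students, the strict form of the rule is not met even though students with disabilities gain more, which is one reason the validity implications remain debated.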

The impact of extended time on the validity of interpretations of test scores is not well understood. Although predictive validity appears to be reduced when timing accommodations are used on the SAT, the population (i.e., college-bound high school students) is not representative of typical K-12 students who require accommodations, nor is there a criterion comparable to college GPA (Sireci, Li, & Scarpati, 2003). Considering that 76.2% of students with disabilities had extended time on their IEP/504 Plan, and over half of the students taking the standardized test (53.3%) were reported as having received it, better understanding of the effects of this accommodation on test scores is warranted.

Read Aloud

Read Aloud by Condition. The percentage of students provided a read aloud accommodation, both in class and as stated on IEP/504 Plans, was somewhat greater than the percentage of students who received a read aloud accommodation on standardized tests (IEP/504 = 50.0%, Class = 51.5%, Test = 40.6%).

Read Aloud by Disability Group. The proportional use of the read aloud accommodation varied somewhat across disability categories (see Figure 4). Overall, students with mental retardation were most likely to receive this accommodation and to receive it consistently across conditions (MR: IEP/504 = 60.7%; Class = 60.9%; Test = 68.1%). Students with speech/language impairment, on the other hand, appeared least likely to receive a read aloud accommodation. The pattern of discrepancy across conditions was similar for students with learning disability, speech/language impairment, and emotional disturbance: similar proportions received a read aloud accommodation per IEP/504 Plan recommendations and in class, but fewer received it when taking standardized tests (LD: IEP/504 = 53.1%; Class = 57.0%; Test = 44.9%; SI: IEP/504 = 45.9%; Class = 40.6%; Test = 26.6%; ED: IEP/504 = 42.0%; Class = 44.0%; Test = 33.3%).

Figure 3. Read Aloud: Disabilities Combined

* SEM: IEP/504 = 1.8 (n = 4,933), Class = 1.8 (n = 5,243), Test = 2.1 (n = 3,241) 
** n’s are the same for figures 7 and 9

Figure 4. Read Aloud: Separated by Disability.

* SEM - LD: IEP/504 = 3.1 (n = 557), Class = 3.0 (n = 605), Test = 3.4 (n = 458); MR: IEP/504 = 2.9 (n = 550), Class = 3.0 (n = 573), Test = 3.8 (n = 286); SI: IEP/504 = 5.0 (n = 202), Class = 4.3 (n = 274), Test = 4.0 (n = 238); ED: IEP/504 = 3.4 (n = 448), Class = 3.7 (n = 470), Test = 3.7 (n = 343).
** All n’s are the same for figures 8 and 10.

Read Aloud Discussion.
The provision of a reader is one of the more controversial assessment accommodations (see Thurlow, Lazarus, Thompson, & Robey, 2002). As such, it is somewhat encouraging that such a high percentage of students with disabilities received reading accommodations when taking standardized tests (i.e., between 40% and 50% on average). At the same time, there was a discrepancy of approximately 10 percentage points between what students were reported to receive in their language arts classes and the read aloud accommodation provided on standardized tests.

One hypothesis for this discrepancy is that on a day-to-day basis, teachers may employ accommodations in class that optimize student performance and participation, but when administering standardized tests be less inclined, due to state policy or for other reasons, to alter the standardized procedures. If this hypothesis is correct, the extent to which students are exposed to different expectations in class than on standardized tests may be of concern.

Setting Accommodation

Setting Accommodation by Condition. From Figure 5, it is apparent that students who took the standardized test received setting accommodations with greater frequency than those who were given setting accommodations in class or as stated on their IEP/504 Plans. Only 20.2% of students with disabilities had a setting accommodation on their IEP/504 Plans, and while a slightly larger percentage (24.5%) received a setting accommodation in class, almost 40% of students who took the standardized test received this accommodation.


Figure 5. Setting Accommodation: Disabilities Combined
* SEM: IEP/504 = 1.5 (n = 4,933), Class = 1.6 (n = 5,243), Test = 1.3 (n = 3,241)
Figure 6. Setting Accommodation: Separated by Disability
* SEM - LD: IEP/504 = 2.2 (n = 557), Class = 1.6 (n = 605), Test = 3.4 (n = 458); MR: IEP/504 = 2.3 (n = 550), Class = 2.5 (n = 573), Test = 4.1 (n = 286); SI: IEP/504 = 4.1 (n = 202), Class = 3.6 (n = 274), Test = 3.9 (n = 238); ED: IEP/504 = 3.0 (n = 448), Class = 3.3 (n = 470), Test = 3.9 (n = 343).


Setting Accommodation by Disability Group. As shown in Figure 6, there was minimal discrepancy between IEP/504 Plans and classroom conditions. Students with speech/language impairment were the only group of students with proportional consistency in setting accommodations across all conditions (SI: IEP/504 = 21.7%, Class = 20.8%, Tests = 24.2%). Students with emotional disturbance (ED: IEP/504 = 24.9%, Class = 27.3%, Tests = 43.1%) were next but had more discrepancy in the testing condition. For students with learning disability or students with mental retardation, differences in proportional use between IEP/504 Plans and standardized test conditions ranged from approximately 30 to 34 percentage points (LD: IEP/504 = 14.5%, Class = 23.2%, Tests = 44.0%; MR: IEP/504 = 19.3%, Class = 20.8%, Tests = 53.7%).

Setting Accommodation Discussion. With the exception of students with speech/language impairment, who received proportionally consistent setting accommodations, the reason for the considerable discrepancy among IEP/504 Plans, in-class accommodations, and standardized testing is unclear. The general lack of controversy surrounding the use of setting accommodations, however, may account for more use of this accommodation during testing, even in the absence of IEP/504 Plan documentation (see Thurlow, Lazarus, et al., 2002). Another explanation may be that the individuals completing the IEP/504 Plan and in-class questions on accommodations were not focusing on location when the setting accommodation was worded "physical adaptation," as they might have been when the setting option was worded "alternative setting."

Response Accommodations

Response Accommodations by Condition. Response accommodation information was not available across all three conditions. Consequently, it was not possible to make direct comparisons across conditions for any response accommodation. Table 1 provides a summary of response accommodations listed on the SEELS questionnaires with all disabilities combined.

Table 1. Response Accommodations (Disabilities Combined): Percent Use (SEM)

Response Accommodation                       Language Arts Class   Standardized Tests
Student responses dictated/written
  by someone else                                                  7.4 (1.1)
More frequent feedback
Alternative format for responding
n =

Response Accommodation by Disability Group. In terms of student responses dictated/written by someone else as the response accommodation, students with mental retardation were most likely to receive this accommodation (MR: 14.2%, SEM = 2.9%), followed by students with emotional disturbance (ED: 8.2%, SEM = 2.1%), learning disability (LD: 6.3%, SEM = 1.6%), and speech/language impairment (SI: 5.3%, SEM = 2.0%). The more frequent feedback accommodation was documented on IEP/504 Plans and provided in largest proportion to students with mental retardation (MR: IEP/504 = 43.2%, SEM = 2.9%, Class = 58.9%, SEM = 3.0%) followed by students with emotional disturbance (ED: IEP/504 = 44.2%, SEM = 3.4%, Class = 62.7%, SEM = 3.6%), learning disability (LD: IEP/504 = 34.4%, SEM = 2.9%, Class = 48.8%, SEM = 3.1%), and speech/language impairment (SI: IEP/504 = 28.6%, SEM = 4.5%, Class = 36.1%, SEM = 4.2%). The frequency with which students in each disability category received an alternate format of response was too low to accurately estimate the proportion of students receiving this accommodation.

Response Accommodation Discussion. Although direct comparisons were not possible across conditions, one accommodation in particular stands out, that is the accommodation of more frequent feedback. A relatively high proportion of students either received this accommodation in class or were recommended to receive it based on their IEP/504 Plan. Since there is some evidence to suggest that some students perform better with ongoing encouragement, feedback, and reinforcement (see Tindal & Fuchs, 1999 for a review of studies), we need to better understand the frequency of use for this accommodation and its effect on standardized test validity.

Modified Test

Modified Test by Condition. The extent to which student IEP/504 Plans document test modifications, and the extent to which students received modified tests in class and on standardized tests, are somewhat unclear. Interpreting these data is difficult because respondents may have interpreted the term "modification" as synonymous with "accommodation." Additionally, some changes to the way in which a standardized test is administered may be considered an accommodation under certain circumstances and a modification under others. For example, having a test read aloud may be considered an accommodation on a math test, but a modification on a reading test. The criterion for determining whether a change in procedure is a modification or an accommodation is whether the change is believed to significantly alter what is being measured.

On the standardized test survey question, when respondents selected "Student participated in the testing program with modification," they were directed to a subsequent question that provided a list of several accommodations, some of which could be considered modifications, and one definite modification (i.e., shortened test). Only 1.9% of students received the shortened test modification. The bar graph showing modifications to standardized tests in Figure 7, however, is stacked with test changes that may be considered modifications under certain circumstances (i.e., read aloud and dictated response) in order to show possible alternative percentages of students receiving modified standardized tests. Given the lack of clarity in the data, inferences based on modified test data should be tentative and limited.

Figure 7 suggests that test modification may have been fairly common practice for students with disabilities across the three conditions depending upon what was considered a modification or an accommodation.

Figure 7. Modified Tests: Disabilities Combined 


SEM: IEP/504 = 1.8, Class = 1.8, Test = 0.6 (Read Aloud = 2.1; Dictated Response = 1.1)

Modified Test by Disability Group. The proportion of tests modified in class was roughly consistent with IEP/504 Plan stipulations (see Figure 8). Additionally, this modification was provided in similar proportions to students with learning disability, mental retardation, and emotional disturbance, with somewhat fewer students with speech/language impairment receiving it per IEP/504 Plan stipulations and reported in-class use (LD: IEP/504 = 41.6%, Class = 50.2%; MR: IEP/504 = 53.0%, Class = 54.7%; ED: IEP/504 = 39.6%, Class = 48.0%; SI: IEP/504 = 29.5%, Class = 27.1%).

Figure 8. Modified Test

Figure 9. Modified Standardized Test

*SEM – LD: IEP/504 = 3.0, Class = 3.1, Test = 0.9; MR: IEP/504 = 3.0, Class = 3.1, Test = 2.2; SI: IEP/504 = 4.6, Class = 3.9, Test = 0.0; ED: IEP/504 = 3.4, Class = 3.8, Test = *


As shown in Figure 9, students with mental retardation were the most likely to have standardized tests shortened, followed by students with learning disability (MR: Test = 7.5%, LD: Test = 2.0%). No students with speech/language impairment received the shortened standardized test modification, and too few students with emotional disturbance received it on standardized tests to provide an accurate estimate of proportional use. Although the use of dictated response was fairly consistent across disability groups, having standardized tests read aloud varied considerably.

Modified Test Discussion. The modification of tests appears to be fairly common practice in language arts class. It is unclear, however, how frequently it occurs on standardized tests due to our lack of information on what was considered a modification and what was considered an accommodation by those surveyed.

Alternate Test

Alternate Test by Condition. Comparable percentages of students with disabilities had IEP/504 Plans that stipulated alternate assessments and took alternate assessments in class (IEP/504 = 28.8%, Class = 30.9%; see Figure 10). The proportion who took alternate tests rather than standardized tests was considerably smaller (Test = 14.1%).

Figure 10. Alternate Test: Disabilities Combined
SEM: IEP/504 = 1.6, Class = 1.7, Test = 1.2 


Alternate Test by Disability Group. In all disability groups there was considerable discrepancy between the provision of alternate assessments in place of standardized tests and what was both stipulated on IEP/504 Plans and given in the classroom (see Figure 11). Students with mental retardation were most frequently provided with alternate assessments: approximately 50% had alternate assessments stipulated on IEP/504 Plans and provided in language arts classes. Conversely, students with speech/language impairment were least likely to receive alternate assessments in class or to have them stipulated on IEP/504 Plans (approximately 20%). Across disability groups, fewer students received alternate assessments in place of standardized tests, with frequencies ranging from approximately 34% of students with mental retardation to as few as 4.7% of students with speech/language impairment.

Figure 11. Alternate Test: Separated by Disability
SEM – LD: IEP/504 = 2.6, Class = 2.8, Test = 1.9; MR: IEP/504 = 3.0, Class = 3.1, Test = 2.8; SI: IEP/504 = 4.0, Class = 3.6, Test = 2.8; ED: IEP/504 = 3.0, Class = 3.6, Test = 2.0. 

Alternate Test Discussion. In all conditions, the stipulation and provision of alternate tests appear to be high, and alternate tests in class may vary considerably in the extent to which they deviate from typical in-class assessments. To comply with NCLB requirements, at most 2% of students with persistent academic difficulty should be scored as proficient on an alternate test in place of the state/district standardized test (U.S. Department of Education, Raising Achievement: Alternate Assessments for Students with Disabilities, ¶ 2).

Summary and Discussion of Accommodations, Modification, and Alternate Testing

Patterns of accommodation use emerged when looking across the accommodations discussed in this report. With the exception of setting accommodations, all accommodations tended to appear more often on IEP/504 Plans and to be provided in the classroom than to be provided on standardized tests. Setting accommodations, on the other hand, were more often provided on standardized tests. Additionally, for most accommodations, there appeared to be proportional consistency between the accommodations documented on IEPs and 504 Plans and their reported in-class use. Note, however, that it is not clear from the SEELS data tables whether the students receiving accommodations under each condition were the same students, only that the proportions (or percentages) appeared similar.

As with the general observations about accommodations, certain consistencies appeared when examining accommodations for students within specific disability categories. Students with learning disability and emotional disturbance tended to have similar patterns of discrepancy across conditions and similar rates of accommodation use overall. Accommodation use for students with speech/language impairment and mental retardation, on the other hand, tended to vary more. Students with speech/language impairment were the least likely to have received alternate assessments. Although they were also least likely to have received extended time and a read aloud accommodation, they had the greatest discrepancy of use across conditions (i.e., they were more likely to receive these accommodations in class than on standardized tests). Students with speech/language impairment were, however, the only group that received setting accommodations consistently across contexts. Students with mental retardation received extended time and a read aloud accommodation with the greatest consistency. On the other hand, they received setting accommodations with the least consistency (i.e., they received setting accommodations far more frequently on standardized tests than was stated on the IEP or 504 Plan or provided in class).

According to most state accommodation policies (see Clapper, Morse, Lazarus, Thompson, & Thurlow, 2005), accommodations provided on standardized tests should be the same as, or similar to, those documented on IEP/504 Plans and provided in class. The available data consist of proportions or percentages of groups receiving accommodations; it is assumed that, to the greatest extent possible, the students with accommodations identified on IEP/504 Plans were the same students receiving accommodations in class and on standardized tests. Determining the extent of consistency across conditions for individual students will be an important issue to explore. Additionally, students in some disability categories had considerable proportional discrepancies in accommodation provision across conditions. For example, for students with learning disability and speech/language impairment, there was an approximate 25% to 30% proportional discrepancy in the provision of extended time between the classroom and standardized testing, with students receiving extended time more often in the classroom. This lack of consistency suggests the need for greater attention to providing accommodations consistently across conditions.


The assumption that the accommodations students receive on large-scale standardized assessments should be reflected in what they receive when they take classroom assessments, and that both should be evident on students’ Individualized Education Programs (IEPs) or 504 accommodation plans, is virtually common knowledge at this point. To ensure that IEP teams and the individuals or teams who put together 504 plans apply this "common knowledge," almost every state has a policy or guideline specifically stating that accommodations used in state testing must be ones used for instruction (Lazarus, Thurlow, Lail, Eisenbraun, & Kato, 2006). Turning state policy into practice is challenging, and it is not made easier when states have not taken their policy to the local level via training. Langley and Olsen (2003) found that nearly one-third of a sample of states provided no training on accommodations at all, and another third focused only on assessment accommodations.

The SEELS findings document on a nationally representative sample what has been found in a few studies at more local levels. These studies alerted us to a potential lack of alignment among IEP/504 plans, what happened during instruction and classroom assessments, and what happened during state standardized assessments. For example, Shriner and DeStefano (2003) documented that the accommodations selected by highly trained IEP teams, who made decisions about needed accommodations for individual students based on their training, often did not end up being implemented on the day of testing. Shriner and DeStefano also found that some accommodations that were not identified for a particular student were nevertheless given to that student. Research by the Paul Sherlock Center (2002) also revealed that a student’s testing location seemed to be more important than what the IEP said in determining what the student received on the day of the state test. Thus, if one student at a location needed the test read to him or her, then all students in that location received the test read aloud.

Having a nationally representative sample provides much larger numbers, which allow us to look in more detail at the extent of the problem and at some of the interactions. The SEELS data suggest that there is some variability in what happens for students with different categorical labels. It will be important to monitor this. The current SEELS data, from Wave 2, are from a relatively early administration of the study. Obtaining similar information from a later administration will be informative, especially as local districts, IEP teams, and 504 accommodation plan developers become accustomed to making decisions and providing accommodations not only during classroom assessments but also during standardized testing, and as we determine whether state practices change.

References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Clapper, A., Morse, Lazarus, S., Thompson, S., & Thurlow, M. L. (2005). 2003 state policies on assessment participation and accommodations for students with disabilities (Synthesis Report 56). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Cortiella, C. (2005). No Child Left Behind: Determining appropriate assessment accommodations for students with disabilities. New York, NY: National Center for Learning Disabilities, Inc.

Langley, J., & Olsen, K. (2003). Training district and state personnel on accommodations: A study of state practices, challenges and resources. Washington, DC: Council of Chief State School Officers.

Lazarus, S. S., Thurlow, M. L., Lail, K. E., Eisenbraun, K. D., & Kato, K. (2006). 2005 state policies on assessment participation and accommodations for students with disabilities (Synthesis Report 64). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Lehr, C., & Thurlow, M. (2003). Putting it all together: Including students with disabilities in assessment and accountability systems (Policy Directions 16). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

McDonnell, L. M., McLaughlin, M. J., & Morison, P. (1997). Educating one and all: Students with disabilities and standards-based reform (p. 7). Washington, DC: National Academy Press. Retrieved February 21, 2004, from http://www.netlibrary.com.

Paul Sherlock Center on Disabilities. (2002). Results from Rhode Island’s assessment accommodation survey. Providence, RI: Rhode Island Department of Education Office of Special Needs.

Phillips, S. E. (1994). High-stakes testing accommodations: Validity versus disabled rights. Applied Measurement in Education, 7(2), 93-120.

Rose, D. (2000). Universal design for learning. Journal of Special Education Technology, 15(4). Retrieved October 15, 2003, from http://jset.unlu.edu/15.4/issuemenu.html.

Russell, M. (1999). Testing on computers: A follow-up study comparing performance on computer and on paper. Education Policy Analysis Archives, 7(20). Retrieved December 1, 2003, from http://epaa.asu.edu/epaa/v7n20/.

Salvia, J., & Ysseldyke, J. E. (2001). Assessment (8th Ed.). Boston: Houghton Mifflin.

Shaftel, J., Yang, X., Glasnapp, D., & Poggio, J. (2005). Improving assessment validity for students with disabilities in large-scale assessment programs. Educational Assessment, 10(4), 357-375.

Shriner, J. G., & DeStefano, L. (2003). Participation and accommodation in state assessment: The role of individualized education programs. Exceptional Children, 69(2), 147-161.

Sireci, S. G., Li, S., & Scarpati, S. (2003). The effects of test accommodation on test performance: A review of the literature (Report # 485). Amherst, MA: University of Massachusetts Amherst, Center for Educational Assessment Research.

Sireci, S. G., Scarpati, S. E., & Li, S. (2005). Test accommodations for students with disabilities: An analysis of the interaction hypothesis. Review of Educational Research, 75(4), 457-490.

Special Education Elementary Longitudinal Study. (2003). What’s new from SEELS (Newsletter). Retrieved December 2, 2004, from http://www.seels.net/designdocs/SEELS_Newsletter_Fall_2003.rev-1.pdf.

Thurlow, M. L., Elliott, J. L., & Ysseldyke, J. E., (2003). Testing students with disabilities: Practical strategies for complying with district and state requirements (2nd ed.). Thousand Oaks, CA: Corwin Press.

Thurlow, M. L., Lazarus, S., Thompson, S., & Robey, J. (2002). 2001 state policies on assessment participation and accommodations (Synthesis Report 46). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

Tindal, G., & Fuchs, L. (1999). A summary of research on test changes: An empirical basis for defining accommodations. Lexington, KY: Vanderbilt University, Mid-South Regional Resource Center.

U.S. Department of Education. (n.d.). Raising achievement: Alternate assessments for students with disabilities. Retrieved April 20, 2005, from http://www.ed.gov/policy/elsec/guid/raising/alt-assess.html

U.S. Department of Education. (2002). Twenty-fourth annual report to Congress on the implementation of the Individuals with Disabilities Education Act. Retrieved January 19, 2005, from http://www.ed.gov/about/reports/annual/osep/2002/toc-execsum.pdf

© 2006 by the Regents of the University of Minnesota.
The University of Minnesota is an equal opportunity educator and employer.


NCEO is supported primarily through a Cooperative Agreement (#H326G050007) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. Additional support for targeted projects, including those on LEP students, is provided by other federal and state agencies. Opinions expressed in this Web site do not necessarily reflect those of the U.S. Department of Education or Offices within it.