ACT Growth Modeling Resources
Growth modeling based on assessments of college and career readiness can be used to measure progress—both for individual students and school systems.
 Growth modeling tells students how far they need to progress to reach their readiness goals.
 Growth modeling is also used for educational research and evaluation. Measures of student growth can be used as one component of teacher, school, or district evaluation and can help diagnose areas of strength and weakness.
To help individuals and school systems implement growth models based on ACT assessments, ACT provides normative growth data for various assessment combinations and grade levels.
In a Growth to Standards model, student progress is monitored with respect to an external criterion—in our case, the ACT College Readiness Benchmarks. The Growth to Standards model directly addresses the question “Has the student gained enough?”
In the graph below, the Growth to Standards model is applied to a fictitious 4th-grade student (John) who took ACT Aspire in grades 3 and 4. His score is below the 4th-grade ACT Readiness Benchmark in mathematics, and his goal is to be on target for college readiness by 7th grade. At each future grade level, one can determine whether he is growing enough to reach his goal. The ACT Readiness Benchmarks for each grade level are represented by the points on the red line, John's scores by the solid black squares, and John's goals by the dashed black line. The ACT College Readiness Benchmark for Mathematics is 22 and is represented by the black X.
Note: For illustration, this graph shows ACT Aspire scores on one vertical axis and ACT scores on the other vertical axis. However, the two scales are not interchangeable.
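The dashed goal line in the example above amounts to simple linear interpolation: divide the remaining gap to the target benchmark evenly across the remaining grades. As a minimal sketch (John's actual scores are not given here, so the starting score of 413 is hypothetical; the grade 7 mathematics benchmark of 422 comes from the table below):

```python
# Sketch of the Growth to Standards goal line, assuming linear interpolation
# between the current score and the target benchmark.
def growth_targets(current_grade, current_score, target_grade, target_benchmark):
    """Return {grade: goal score} for each grade up to the target grade,
    spacing the needed growth evenly across the remaining years."""
    years = target_grade - current_grade
    per_year = (target_benchmark - current_score) / years
    return {g: round(current_score + per_year * (g - current_grade), 1)
            for g in range(current_grade + 1, target_grade + 1)}

# Hypothetical: a 4th grader scoring 413 who aims for the grade 7
# mathematics benchmark of 422.
print(growth_targets(4, 413, 7, 422))  # {5: 416.0, 6: 419.0, 7: 422.0}
```

Each year's goal point can then be compared to the student's actual score to judge whether growth is on pace.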
ACT Aspire supports Growth to Standards models by reporting student progress over time with respect to the ACT Readiness Benchmarks.
The ACT Readiness Benchmarks for ACT Aspire grades 3 through 10 and the ACT College Readiness Benchmarks for ACT Explore® (grades 8 and 9), ACT Plan® (grade 10), and the ACT® test (grades 11 and 12) are given in the table below. Grades 8 through 10 are the grade levels where the two scales overlap.
Assessment  Grade Level  English  Reading  Mathematics  Science  Writing

ACT Readiness Benchmark
ACT Aspire  3  413  415  413  418  428
ACT Aspire  4  417  417  416  420  428
ACT Aspire  5  419  420  418  422  428
ACT Aspire  6  420  421  420  423  428
ACT Aspire  7  421  423  422  425  428
ACT Aspire  8  422  424  425  427  428
ACT Aspire  9  426  425  428  430  428
ACT Aspire  10  428  428  432  432  428

ACT College Readiness Benchmark
ACT Explore  8  13  16  17  18  n/a
ACT Explore  9  14  17  18  19  n/a
ACT Plan  10  15  18  19  20  n/a
ACT  11/12  18  22  22  23  n/a
Growth can be described in a normative fashion by considering the average of test score 2 for each value of test score 1. These averages are known as conditional score averages. Students who exceed the conditional score average performed better on test 2 than their peers with the same score on test 1. Conditional score averages are useful for predicting scores in a future grade and for measuring student growth relative to peers with the same prior test scores.
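As a minimal sketch (with hypothetical score pairs), conditional score averages can be computed by grouping test 2 scores by test 1 score:

```python
# Sketch of estimating conditional score averages from paired test scores.
# The (score1, score2) pairs below are hypothetical.
from collections import defaultdict

def conditional_score_averages(pairs):
    """Average test 2 score for each observed test 1 score."""
    groups = defaultdict(list)
    for s1, s2 in pairs:
        groups[s1].append(s2)
    return {s1: sum(s2s) / len(s2s) for s1, s2s in groups.items()}

pairs = [(15, 17), (15, 19), (16, 18), (16, 20), (16, 19)]
print(conditional_score_averages(pairs))  # {15: 18.0, 16: 19.0}
```

In practice these averages are estimated from large reference samples; the downloadable files below provide them precomputed, along with 50% prediction intervals.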
Below, we provide conditional score averages for numerous pairs of assessments. Find the assessment pair and select the growth period you are interested in to download the conditional score averages and 50% score prediction intervals.
Test 1  Test 2  Growth Period 

ACT Explore  ACT Explore  Grade 8 to grade 9 (10–14 months) 
ACT Explore  ACT Plan  Grade 8 to grade 10 
ACT Explore  ACT  Grade 8 to grade 11 
ACT Explore  ACT Plan  Grade 9 to grade 10 (10–14 months) 
ACT Explore English  ACT QualityCore English 9  6–12 months 
ACT Explore Mathematics  ACT QualityCore Algebra I  6–12 months 
ACT Explore Mathematics  ACT QualityCore Geometry  6–12 months 
ACT Explore Science  ACT QualityCore Biology  6–12 months 
ACT Explore English  ACT QualityCore English 10  18–24 months 
ACT Explore Mathematics  ACT QualityCore Geometry  18–24 months 
ACT Explore Science  ACT QualityCore Biology  18–24 months 
ACT Plan  ACT Plan  Grade 9 to grade 10 (10–14 months) 
ACT Plan  ACT  Grade 10 to grade 11 (10–14 months) 
ACT Plan  ACT  Fall grade 10 to spring grade 11 
ACT Plan English  ACT QualityCore English 10  6–12 months 
ACT Plan Mathematics  ACT QualityCore Geometry  6–12 months 
ACT Plan Mathematics  ACT QualityCore Algebra II  6–12 months 
ACT Plan Science  ACT QualityCore Biology  6–12 months 
ACT Plan Reading  ACT QualityCore U.S. History  6–12 months 
ACT Plan English  ACT QualityCore English 11  18–24 months 
ACT Plan Mathematics  ACT QualityCore Geometry  18–24 months 
ACT Plan Mathematics  ACT QualityCore Algebra II  18–24 months 
ACT Plan Mathematics  ACT QualityCore Pre-Calculus  18–24 months 
ACT Plan Reading  ACT QualityCore U.S. History  18–24 months 
ACT Plan Science  ACT QualityCore Chemistry  18–24 months 
ACT QualityCore English 9  ACT QualityCore English 10  10–14 months 
ACT QualityCore English 10  ACT QualityCore English 11  10–14 months 
ACT QualityCore English 11  ACT QualityCore English 12  10–14 months 
ACT QualityCore Algebra I  ACT QualityCore Algebra II  22–26 months 
ACT English  ACT Compass Writing Skills  10–14 months 
ACT Mathematics  ACT Compass Pre-Algebra  10–14 months 
ACT Mathematics  ACT Compass Algebra  10–14 months 
ACT Mathematics  ACT Compass College Algebra  10–14 months 
ACT Reading  ACT Compass Reading Skills  10–14 months 
How to Apply the Conditional Score Averages
In this example, 124 students took the ACT Explore assessment in 9th grade and then took the ACT Plan assessment in 10th grade, 12 months later. Students' predicted ACT Plan scores are determined by their ACT Explore score in the same subject area using the Conditional Score Averages model.
Predicted ACT Plan scores can be compared to actual ACT Plan scores to obtain residual scores that measure growth relative to peers. The predicted values and residual scores are highlighted in the spreadsheet.
The worksheet titled “Conditional Score Averages” is obtained from the ACT Explore/ACT Plan/Grade 9 to grade 10 (10–14 months) row in the table above.
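The residual-score step described above can be sketched as follows; the conditional averages and student scores here are hypothetical stand-ins for values taken from the downloaded worksheet:

```python
# Sketch of computing residual scores: each student's actual test 2 score
# minus the conditional score average for their test 1 score.
# All numbers below are hypothetical.
cond_avg = {15: 17.8, 16: 18.6, 17: 19.4}  # test 1 score -> predicted test 2 score

students = [(15, 19), (16, 18), (17, 19)]  # (test 1 score, actual test 2 score)
residuals = [actual - cond_avg[s1] for s1, actual in students]
print([round(r, 1) for r in residuals])  # [1.2, -0.6, -0.4]
```

A positive residual means the student grew more than peers with the same prior score; a negative residual means less.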
Projection models are used to predict scores in a future grade and are meant to answer the question “Given this student’s observed past scores and based on patterns of scores in the past, where is she likely to score in the future?”
The projection model is a variant of linear regression where an equation is established that relates students’ past scores to their future scores. The projection model is flexible in that multiple past scores, as well as other measures, can be used to predict a single future score.
For example, a grade 10 ACT Plan Mathematics score can be predicted based on grade 9 ACT Explore scores in all four subject areas (English, mathematics, reading, and science) and on the number of months between the two assessments:
 ACT Plan Mathematics Score = β0 + β1 × ACT Explore English Score + β2 × ACT Explore Mathematics Score + β3 × ACT Explore Reading Score + β4 × ACT Explore Science Score + β5 × Months Elapsed
In this model, the β values are weights relating each input variable to the future test score. We refer to these weights as projection parameters.
The projection model we employ is different from the conditional score average model because, instead of conditioning on a single past test score, we are conditioning on multiple past test scores. Generally, score predictions from the projection model are more accurate than score predictions from the conditional score average model because it uses more information for predicting future test scores.
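Applying a set of projection parameters is a single weighted sum. In this sketch the β values are hypothetical; actual parameters come from the downloadable files below:

```python
# Sketch of applying projection parameters: predicted future score =
# beta0 + sum of each weight times its input variable.
def project_score(betas, inputs):
    """betas = (beta0, beta1, ..., betak); inputs = (x1, ..., xk)."""
    b0, *weights = betas
    return b0 + sum(b * x for b, x in zip(weights, inputs))

# Hypothetical parameters: intercept, then weights for ACT Explore English,
# Mathematics, Reading, Science, and months elapsed.
betas = (1.5, 0.10, 0.70, 0.05, 0.15, 0.02)
inputs = (15, 16, 14, 15, 12)  # four Explore scores, then months elapsed
print(round(project_score(betas, inputs), 2))  # 17.39
```

Note that the mathematics score carries the largest (hypothetical) weight here, reflecting the usual pattern that the same-subject prior score is the strongest predictor.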
Below, we provide projection parameters for several pairs of assessments. Find the assessment pair and select the growth period you are interested in to download the projection parameters.
Test 1  Test 2  Growth Period 

ACT Explore  ACT Explore  Grade 8 to grade 9 (10–14 months) 
ACT Explore  ACT Plan  Grade 8 to grade 10 
ACT Explore  ACT  Grade 8 to grade 11 
ACT Explore  ACT Plan  Grade 9 to grade 10 (10–14 months) 
ACT Explore  ACT QualityCore English 9  6–12 months 
ACT Explore  ACT QualityCore Algebra I  6–12 months 
ACT Explore  ACT QualityCore Geometry  6–12 months 
ACT Explore  ACT QualityCore Biology  6–12 months 
ACT Explore  ACT QualityCore English 10  18–24 months 
ACT Explore  ACT QualityCore Geometry  18–24 months 
ACT Explore  ACT QualityCore Biology  18–24 months 
ACT Plan  ACT Plan  Grade 9 to grade 10 (10–14 months) 
ACT Plan  ACT  Grade 10 to grade 11 (10–14 months) 
ACT Plan  ACT  Fall grade 10 to spring grade 11 
ACT Plan  ACT QualityCore English 10  6–12 months 
ACT Plan  ACT QualityCore Geometry  6–12 months 
ACT Plan  ACT QualityCore Algebra II  6–12 months 
ACT Plan  ACT QualityCore Biology  6–12 months 
ACT Plan  ACT QualityCore U.S. History  6–12 months 
ACT Plan  ACT QualityCore English 11  18–24 months 
ACT Plan  ACT QualityCore Geometry  18–24 months 
ACT Plan  ACT QualityCore Algebra II  18–24 months 
ACT Plan  ACT QualityCore U.S. History  18–24 months 
ACT Plan  ACT QualityCore Chemistry  18–24 months 
ACT Plan  ACT QualityCore Pre-Calculus  18–24 months 
ACT  ACT Compass Writing Skills  10–14 months 
ACT  ACT Compass Pre-Algebra  10–14 months 
ACT  ACT Compass Algebra  10–14 months 
ACT  ACT Compass College Algebra  10–14 months 
ACT  ACT Compass Reading Skills  10–14 months 
How to Apply the Projection Model
In this example, 131 students took the ACT Plan assessment in 10th grade and then took the ACT test in 11th grade (11 or 12 months later). Students’ predicted ACT scores are determined by their ACT Plan scores in all four subject areas, as well as the number of months elapsed between the ACT Plan and ACT tests.
Predicted ACT scores can be compared to actual ACT scores to obtain residual scores that measure growth relative to peers. The predicted values and residual scores are highlighted in the spreadsheet.
The worksheet named “Projection Model Parameters” is obtained from the ACT Plan/ACT/Grade 10 to grade 11 (10–14 months) row in the table above.
The Student Growth Percentile (SGP) model answers the question “What is the percentile rank of a student’s current score compared to students with similar score histories?” The model describes how well a student performed relative to peers with similar score histories.
Like the Projection and Conditional Score Averages models, the SGP model can be used both to predict how much students will grow and to describe the level of growth that occurred. The model lets users examine how much future performance varies across percentile levels. Many states and school systems use the SGP model to describe student growth, predict future test scores, and examine differences in growth across student groups.
ACT Aspire reports SGPs for students tested in consecutive years at grade levels 3 through 10. This page includes SGP "lookup" tables for ACT Aspire, as well as SGP lookup tables for grade 10 ACT Aspire to the grade 11 ACT test. The tables can be used to find the SGP value (ranging from 1 to 100) associated with each combination of current-year test score and prior-year test score. For example, suppose a student scored 411 on the grade 4 ACT Aspire Mathematics test and 415 on the grade 5 ACT Aspire Mathematics test one year later. His SGP value (given by the "growth_percentile" column) would be 59, as shown in the table below, which is an excerpt from the grade 4–5 SGP lookup table.
version  current_grade  subject  prior_score  current_score  growth_percentile 

2016F  5  Mathematics  411  411  16 
2016F  5  Mathematics  411  412  24 
2016F  5  Mathematics  411  413  35 
2016F  5  Mathematics  411  414  47 
2016F  5  Mathematics  411  415  59 
2016F  5  Mathematics  411  416  73 
2016F  5  Mathematics  411  417  81 
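A lookup based on the excerpt above can be sketched as a simple mapping from (prior score, current score) to growth percentile:

```python
# Sketch of an SGP lookup, using the excerpt rows above
# (grade 5 ACT Aspire Mathematics, prior score 411).
sgp_table = {  # (prior_score, current_score) -> growth_percentile
    (411, 411): 16, (411, 412): 24, (411, 413): 35, (411, 414): 47,
    (411, 415): 59, (411, 416): 73, (411, 417): 81,
}

def lookup_sgp(prior, current):
    return sgp_table[(prior, current)]

print(lookup_sgp(411, 415))  # 59, matching the example student above
```

The full lookup tables extend this mapping to every score combination, keyed additionally by subject and grade level.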
The lookup tables provide an estimate of the SGP for each possible combination of same-subject test scores from consecutive years. SGPs are provided for five subject areas: English, mathematics, reading, science, and writing. The SGPs were estimated using quantile regression methods (Koenker, 2005) with the SGP R package (Betebenner, Van Iwaarden, Domingue, & Shang, 2014). The SGP model is flexible because multiple prior test scores can be used as inputs. The lookup tables provided below are based on a single prior-year score in the same subject area.
When interpreting SGPs, the reference group used to estimate the model should always be considered. The SGPs for ACT Aspire will be updated over time as larger and more diverse reference groups become available. Earlier versions of the SGP lookup tables are available below.
Test 1  Test 2  Growth Period  Reference Group 

ACT Aspire  ACT Aspire  Grade 3 to grade 4 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 4 to grade 5 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 5 to grade 6 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 6 to grade 7 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 7 to grade 8 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 8 to grade 9 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  ACT Aspire  Grade 9 to grade 10 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  The ACT  Grade 10 to grade 11 (1 year)  Examinees who tested in consecutive years from spring 2013 through spring 2016 
ACT Aspire  PreACT  Grade 9 to grade 10 (1 year)  Examinees who tested in fall 2015 (grade 9) and fall 2016 (grade 10) 
ACT Explore  ACT Plan  Grade 9 to grade 10 (1 year)  Examinees who tested in consecutive years from 2006 through 2016 
ACT Plan  The ACT  Grade 10 to grade 11 (1.5 years)  Examinees who tested in consecutive years from 2006 through 2016 
The ACT  The ACT  Grade 11 to grade 12 (6 months)  Examinees who tested in consecutive years from 2013 through 2016 
How to Apply the Student Growth Percentile Model
 Example: Demonstrating the Student Growth Percentile model for grade 10 ACT Aspire to grade 11 ACT (Excel)
In this example, 50 students took ACT Aspire in spring grade 10 and then took the ACT in spring grade 11. Each student was tested in all five subject areas (English, mathematics, reading, science, and writing). The spreadsheet displays each student's ACT Aspire and ACT scores, as well as their SGP in each subject area. SGP values are highlighted. The SGP lookup table is needed to obtain each SGP value based on the subject area tested, the ACT Aspire score, and the ACT score; formulas in the Excel spreadsheet perform this lookup.
References
Betebenner, D. W., Van Iwaarden, A., Domingue, B., & Shang, Y. (2014). SGP: An R package for the calculation and visualization of student growth percentiles & percentile growth trajectories. R package version 1.20.0.
Koenker, R. (2005). Quantile Regression. New York, NY: Cambridge University Press.
Earlier Versions of ACT Aspire SGP Lookup Tables
The lookup tables below were developed in 2015 and are not the most current lookup tables available.
Test 1  Test 2  Growth Period  Reference Group 

ACT Aspire  ACT Aspire  Grade 3 to grade 4 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 4 to grade 5 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 5 to grade 6 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 6 to grade 7 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 7 to grade 8 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 8 to grade 9 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  ACT Aspire  Grade 9 to grade 10 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
ACT Aspire  The ACT  Grade 10 to grade 11 (1 year)  Examinees who tested in spring 2013 and spring 2014, or spring 2014 and spring 2015 
The lookup tables below were developed in 2014 and are not the most current lookup tables available.
Test 1  Test 2  Growth Period  Reference Group 

ACT Aspire  ACT Aspire  Grade 3 to grade 4 (1 year)  Examinees who tested in spring 2013 and spring 2014 
ACT Aspire  ACT Aspire  Grade 4 to grade 5 (1 year)  Examinees who tested in spring 2013 and spring 2014 
ACT Aspire  ACT Aspire  Grade 5 to grade 6 (1 year)  Examinees who tested in spring 2013 and spring 2014 
ACT Aspire  ACT Aspire  Grade 6 to grade 7 (1 year)  Examinees who tested in spring 2013 and spring 2014 
ACT Aspire  ACT Aspire  Grade 7 to grade 8 (1 year)  Examinees who tested in spring 2013 and spring 2014 
ACT Aspire  ACT Aspire  Grade 8 to grade 9 (1 year)  ACT Explore examinees tested in academic years 2008–2009 (grade 8) and 2009–2010 (grade 9) or 2009–2010 (grade 8) and 2010–2011 (grade 9). The ACT Explore/Plan-to-ACT Aspire concordance was applied to estimate SGPs on the ACT Aspire scale. 
ACT Aspire  ACT Aspire  Grade 9 to grade 10 (1 year)  ACT Explore and ACT Plan examinees tested in academic years 2009–2010 (grade 9) and 2010–2011 (grade 10) or 2010–2011 (grade 9) and 2011–2012 (grade 10). The ACT Explore/Plan-to-ACT Aspire concordance was applied to estimate SGPs on the ACT Aspire scale. 
ACT Aspire  The ACT  Grade 10 to grade 11 (1 year)  Examinees who tested in spring 2013 and spring 2014 
Student growth measures can be used within systems for evaluating teachers, schools, and districts. Several methods attempt to estimate the distinct effects of teachers and schools on student growth. These methods are intended to support value-added interpretations.
The Multivariate Model
The Multivariate model is designed for the primary purpose of supporting value-added inferences for teachers and schools. By simultaneously considering multiple years of student scores across subject areas, the Multivariate model attempts to attribute student performance to individual teachers and schools. A well-known example of the Multivariate model is the Education Value-Added Assessment System (EVAAS). Because of its technical complexity and extensive data requirements, the Multivariate model is not supported here.
Value-Added Interpretations from Other Models
With proper attribution of student growth measures, value-added interpretations can be made from simpler models, including the Conditional Score Averages, Projection, and Student Growth Percentile models.
Note that the Conditional Score Averages and Projection models both produce predicted scores. For a group of students, a normative measure of aggregate growth can be constructed by averaging the residual scores—the difference between actual and predicted scores. If the mean residual score is significantly greater than 0, then the group of students performed better than predicted, on average. If the mean residual score is significantly less than 0, then the group of students performed worse than predicted, on average. Simple statistical tests can be used to determine if the mean residual score is greater or less than 0.
For the Student Growth Percentile (SGP) model, the mean (or median) SGP is a normative measure of aggregate growth. If the mean SGP is significantly greater than 50, then the group of students performed better than predicted, on average. If the mean SGP is significantly less than 50, then the group of students performed worse than predicted, on average. Simple statistical tests can be used to determine if the mean SGP is greater or less than 50.
Value-added interpretations of teachers and schools are better supported when individual student performance can be attributed directly to the teacher and school. Factors that reduce confidence in attribution include:
 Misalignment of timing of assessments and timing of instruction
 Misalignment of assessment content coverage to instructional content coverage
 Team teaching, multidisciplinary instruction, or other situations where multiple teachers affect student learning
 Student migration, absenteeism, and other personal events that affect academic performance
Generally, attribution is difficult because many forces act upon student performance, and it is difficult to disentangle them. Because of difficulties with attribution and yeartoyear inconsistency in estimates of teacher and school effects, many experts suggest that teacher and school effect estimates should supplement, not replace, other sources of information for evaluation.
In some cases, it may be unreasonable to attribute student performance to a single teacher. In other cases, it may be more reasonable to assign weights to student growth measures representing the level of attribution to individual teachers.
Examples
In the first two examples below, students’ predicted scores are compared to their actual scores to obtain a residual score. If the mean residual score is significantly greater than 0, then the group of students performed better than predicted, on average. If the mean residual score is significantly less than 0, then the group of students performed worse than predicted, on average. A simple statistical test known as the onesample ttest is used to determine if the mean residual score is significantly greater or less than 0. If it is significantly greater than 0, the aggregate growth is classified as “Above Expected”; if it is significantly less than 0, the aggregate growth is classified as “Below Expected.” Otherwise, the aggregate growth is classified as “Expected.” In the third example, a statistical test is used to determine if the mean SGP is significantly greater or less than 50.
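The classification rule described above can be sketched as follows, using a one-sample t statistic with a normal approximation for the one-sided p-value (an exact t distribution would require a statistics library such as scipy); the scores below are hypothetical:

```python
# Sketch of the "Above/Below/Expected" classification, assuming a one-sample
# test of the mean against a null value (0 for residual scores, 50 for SGPs).
# A normal approximation via math.erf stands in for the exact t distribution.
import math
from statistics import mean, stdev

def classify_growth(scores, null_value=0.0, alpha=0.05):
    n = len(scores)
    m = mean(scores)
    se = stdev(scores) / math.sqrt(n)
    t = (m - null_value) / se
    # one-sided p-value in the direction of the observed mean
    p = 1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2)))
    if p < alpha:
        return "Above Expected" if m > null_value else "Below Expected"
    return "Expected"

# Hypothetical residual scores (actual minus predicted) for one subject area
print(classify_growth([1.4, 0.8, 2.1, -0.3, 1.0, 1.7, 0.5, 1.2, 0.9, 1.6]))  # Above Expected
# Hypothetical SGPs centered near 50
print(classify_growth([52, 48, 55, 45, 51, 49], null_value=50))  # Expected
```

The same function covers the third example by setting null_value=50 for mean SGPs.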
Example 1
In this example, 124 students took the ACT Explore assessment in 9th grade and then took the ACT Plan assessment in 10th grade, 12 months later. Students' predicted ACT Plan scores are determined by their ACT Explore score in the same subject area using the Conditional Score Averages model. Predicted ACT Plan scores can be compared to actual ACT Plan scores to obtain residual scores that measure growth relative to peers. The predicted values and residual scores are highlighted in the spreadsheet. The worksheet named "Conditional Score Averages" is obtained from the Conditional ACT Explore grade 9 to ACT Plan grade 10 (10–14 months) file.
The mean residual scores are calculated for each subject area. Each of the mean residual scores is greater than 0, and the ttest pvalues (<0.05) indicate that the means are significantly greater than 0. Therefore, the growth in each subject area is classified as “Above Expected.”
Example 2
In this example, 131 students took the ACT Plan assessment in 10th grade and then took the ACT in 11th grade (11 or 12 months later). Students' predicted ACT scores are determined by their ACT Plan scores in all four subject areas, as well as the number of months elapsed between the ACT Plan and ACT tests. Predicted ACT scores can be compared to actual ACT scores to obtain residual scores that measure growth relative to peers. The predicted values and residual scores are highlighted in the spreadsheet. The worksheet named "Projection Model Parameters" is obtained from the Projection ACT Plan grade 10 to ACT grade 11 (10–14 months) file.
The mean residual scores are calculated for each subject area. For English, reading, and science, the mean of the residual scores is less than 0, and the ttest pvalues (<0.05) indicate that the means are significantly less than 0. Therefore, the growth in English, reading, and science is classified as “Below Expected.” For mathematics, the mean of the residual scores is less than 0, but the ttest pvalue (0.269) indicates that the mean is not significantly less than 0. Therefore, the growth in mathematics is classified as “Expected.”
Example 3
In this example, 50 students took ACT Aspire in spring grade 10 and then took the ACT in spring grade 11. Each student was tested in all five subject areas (English, mathematics, reading, science, and writing). The spreadsheet displays each student’s ACT Aspire and ACT scores, as well as their SGP in each subject area. SGP values are highlighted.
The mean SGP values are calculated for each subject area. The mean SGP values range from 50.60 for science to 56.70 for reading. For each subject area, the t-test p-value is greater than 0.05, so there is not enough evidence to conclude that the mean SGP differs from 50 in any subject area. Therefore, the growth is classified as "Expected" for each subject area.
 Academic Growth Patterns of First-Generation College Students in Grades 8 to 12 (2016)
 Academic Growth Patterns of First-Generation College Students in Grades 8 to 12 by Parental Education (2016)
 Academic Growth Patterns of First-Generation College Students in Grades 8 to 12 by Parental Education and Gender (2016)
 Are Value-Added Measures of High School Effectiveness Related to Students' Enrollment and Success in College? (2016)
 ACT Aspire Summative Technical Manual (2016, see Chapter 14 for growth models)
 Statistical Properties of School Value-Added Scores Based on Assessments of College Readiness (2015)
 The Legality of Using School-Wide Growth Measures in Teacher Evaluation Systems (2015)
 Answers to Administrators’ Questions about Student Growth (2015)
 Answers to Teachers’ Questions about Student Growth (2015)
 Answers to Parents’ Questions about Student Growth (2015)
 The Effects of Model Choice on Estimates of Teacher Effectiveness (2015)
 Building Momentum: The Condition of Progress Toward College Readiness (2015)
 A Bayesian Hierarchical Selection Model for Academic Growth with Missing Data (2015)
 ACT Aspire Summative Assessment Technical Bulletin #2 (2014, see Chapter 6 for Growth Models)
 Development of Predicted Paths for ACT Aspire Score Reports (2014)
 Will Courts Shape Value-Added Methods for Teacher Evaluation? (2014)
 Recent Validity Evidence for Value-Added Measures of Teacher Performance (2014)
 Catching Up to College and Career Readiness in Kentucky (2014)
 Catching Up to College and Career Readiness in Arkansas (2014)
 Catching Up to College and Career Readiness: The Challenge Is Greater for At-Risk Students (2014)
 The Forgotten Middle (2014)
 Catching Up to College and Career Readiness (2012)
 Raising the Bar: A Baseline for College and Career Readiness in Our Nation’s High School Courses (2012)
 Grade 8 to 12 Academic Growth Patterns for English Language Learners and Students with Disabilities (2012)
 Measuring Progress in Core High School Courses: Insights into Value-Added Measures of Teacher Effectiveness (2012)
 Principles for Measuring Growth Towards College and Career Readiness (2012)
 Do race/ethnicity-based student achievement gaps grow over time? (2012)
 What are ACT/NCEA College and Career Readiness Targets?
 Statistical Properties of Accountability Measures Based on ACT’s Educational Planning and Assessment System (2009)
 How Much Growth toward College Readiness Is Reasonable to Expect in High School? (2009)
 Using ACT Data as Part of a State Accountability System (2009)
 The Forgotten Middle: Ensuring that All Students Are on Target for College and Career Readiness before High School (2008)