IES PRACTICE GUIDE             WHAT WORKS CLEARINGHOUSE

Using Student Achievement Data to
Support Instructional Decision Making

NCEE 2009-4067
U.S. DEPARTMENT OF EDUCATION
The Institute of Education Sciences (IES) publishes practice guides in education to bring the best available evidence and expertise to bear on the types of challenges that cannot currently be addressed by a single intervention or program. Authors of practice guides seldom conduct the types of systematic literature searches that are the backbone of a meta-analysis, although they take advantage of such work when it is already published. Instead, authors use their expertise to identify the most important research with respect to their recommendations and conduct a search of recent publications to ensure that the research supporting the recommendations is up-to-date.

Unique to IES-sponsored practice guides is that they are subjected to rigorous external peer review through the same office that is responsible for independent reviews of other IES publications. A critical task for peer reviewers of a practice guide is to determine whether the evidence cited in support of particular recommendations is up-to-date and that studies of similar or better quality that point in a different direction have not been ignored. Because practice guides depend on the expertise of their authors and their group decision making, the content of a practice guide is not and should not be viewed as a set of recommendations that in every case depends on and flows inevitably from scientific research.

The goal of this practice guide is to formulate specific and coherent evidence-based recommendations for use by educators and education administrators to create the organizational conditions necessary to make decisions using student achievement data in classrooms, schools, and districts. The guide provides practical, clear information on critical topics related to data-based decision making and is based on the best available evidence as judged by the panel. Recommendations presented in this guide should not be construed to imply that no further research is warranted on the effectiveness of particular strategies for data-based decision making.
IES PRACTICE GUIDE

            Using Student Achievement
            Data to Support Instructional
            Decision Making
            September 2009
            Panel
            Laura Hamilton (Chair)
            RAND Corporation

            Richard Halverson
            University of Wisconsin–Madison

            Sharnell S. Jackson
            Chicago Public Schools

            Ellen Mandinach
            CNA Education

            Jonathan A. Supovitz
            University of Pennsylvania

            Jeffrey C. Wayman
            The University of Texas at Austin

            Staff
            Cassandra Pickens
            Emily Sama Martin
            Mathematica Policy Research

            Jennifer L. Steele
            RAND Corporation

NCEE 2009-4067
U.S. DEPARTMENT OF EDUCATION
This report was prepared for the National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, under Contract ED-07-CO-0062 by the What Works Clearinghouse, operated by Mathematica Policy Research.

Disclaimer
The opinions and positions expressed in this practice guide are the authors’ and do not necessarily represent the opinions and positions of the Institute of Education Sciences or the U.S. Department of Education. This practice guide should be reviewed and applied according to the specific needs of the educators and education agency using it, and with the full realization that it represents the judgments of the review panel regarding what constitutes sensible practice, based on the research available at the time of publication. This practice guide should be used as a tool to assist in decision making rather than as a “cookbook.” Any references within the document to specific education products are illustrative and do not imply endorsement of these products to the exclusion of other products that are not referenced.

U.S. Department of Education
Arne Duncan
Secretary

Institute of Education Sciences
John Q. Easton
Director

National Center for Education Evaluation and Regional Assistance
John Q. Easton
Acting Commissioner

September 2009

This report is in the public domain. While permission to reprint this publication is
not necessary, the citation should be:

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J.
(2009). Using student achievement data to support instructional decision making
(NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and
Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides/.

What Works Clearinghouse Practice Guide citations begin with the panel chair,
followed by the names of the panelists listed in alphabetical order.

This report is available on the IES website at http://ies.ed.gov/ncee and http://ies.ed.gov/ncee/wwc/publications/practiceguides/.

Alternative formats
On request, this publication can be made available in alternative formats, such
as Braille, large print, audiotape, or computer diskette. For more information,
call the Alternative Format Center at 202–205–8113.
Using Student Achievement Data to
Support Instructional Decision Making
Contents

Introduction
 The What Works Clearinghouse standards and their relevance to this guide
Overview
Scope of the practice guide
 Status of the research
 Summary of the recommendations
Checklist for carrying out the recommendations
Recommendation 1. Make data part of an ongoing cycle of instructional improvement
Recommendation 2. Teach students to examine their own data and set learning goals
Recommendation 3. Establish a clear vision for schoolwide data use
Recommendation 4. Provide supports that foster a data-driven culture within the school
Recommendation 5. Develop and maintain a districtwide data system
Glossary of terms as used in this report
Appendix A. Postscript from the Institute of Education Sciences
Appendix B. About the authors
Appendix C. Disclosure of potential conflicts of interest
Appendix D. Technical information on the studies
References

List of tables
Table 1. Institute of Education Sciences levels of evidence for practice guides
Table 2. Recommendations and corresponding levels of evidence
Table 3. Suggested professional development and training opportunities
Table 4. Sample stakeholder perspectives on data system use
Table 5. Considerations for built and purchased data systems
Table D1. Studies cited in recommendation 2 that meet WWC standards with or without reservations
Table D2. Scheduling approaches for teacher collaboration

List of figures
Figure 1. Data use cycle
Figure 2. Example of classroom running records performance at King Elementary School

List of examples
Example 1. Examining student data to understand learning
Example 2. Example of a rubric for evaluating five-paragraph essays
Example 3. Example of a student’s worksheet for reflecting on strengths and weaknesses
Example 4. Example of a student’s worksheet for learning from math mistakes
Example 5. Teaching students to examine data and goals
Example 6. Examples of a written plan for achieving school-level goals
Introduction

As educators face increasing pressure from federal, state, and local accountability policies to improve student achievement, the use of data has become more central to how many educators evaluate their practices and monitor students’ academic progress.1 Despite this trend, questions about how educators should use data to make instructional decisions remain mostly unanswered. In response, this guide provides a framework for using student achievement data to support instructional decision making. These decisions include, but are not limited to, how to adapt lessons or assignments in response to students’ needs, alter classroom goals or objectives, or modify student-grouping arrangements. The guide also provides recommendations for creating the organizational and technological conditions that foster effective data use. Each recommendation describes action steps for implementation, as well as suggestions for addressing obstacles that may impede progress. In adopting this framework, educators will be best served by implementing the recommendations in this guide together rather than individually.

The recommendations reflect both the expertise of the panelists and the findings from several types of studies, including studies that use causal designs to examine the effectiveness of data-use interventions, case studies of schools and districts that have made data use a priority, and observations from other experts in the field. The research base for this guide was identified through a comprehensive search for studies evaluating academically oriented data-based decision-making interventions and practices. An initial search for literature related to data use to support instructional decision making in the past 20 years yielded more than 490 citations. Of these, 64 used experimental, quasi-experimental, and single-subject designs to examine whether data use leads to increases in student achievement. Among the studies ultimately relevant to the panel’s recommendations, only six meet the causal validity standards of the What Works Clearinghouse (WWC).2

To indicate the strength of evidence supporting each recommendation, the panel relied on the WWC standards for determining levels of evidence, described below and in Table 1. It is important for the reader to remember that the level of evidence rating is not a judgment by the panel on how effective each of these recommended practices will be when implemented, nor is it a judgment of what prior research has to say about the effectiveness of these practices. The level of evidence ratings reflect the panel’s judgment of the validity of the existing literature to support a causal claim that when these practices have been implemented in the past, positive effects on student academic outcomes were observed. They do not reflect judgments of the relative strength of these positive effects or the relative importance of the individual recommendations.

A strong rating refers to consistent and generalizable evidence that an intervention strategy or program improves outcomes.3

A moderate rating refers either to evidence from studies that allow strong causal conclusions but cannot be generalized with assurance to the population on which a recommendation is focused (perhaps because the findings have not been widely replicated) or to evidence from studies that are generalizable but have more causal ambiguity than that offered by experimental designs (e.g., statistical models of correlational data or group comparison designs for which equivalence of the groups at pretest is uncertain).

A low rating refers to evidence either from studies such as case studies and descriptive studies that do not meet the standards for moderate or strong evidence or from expert opinion based on reasonable extrapolations from research and theory. A low level of evidence rating indicates that the panel did not identify a body of research demonstrating effects of implementing the recommended practice on student achievement. The lack of a body of valid evidence may simply mean that the recommended practices are difficult or infeasible to study in a rigorous, experimental fashion.4 In other cases, it means that researchers have not yet studied a practice or that there is weak or conflicting evidence of effectiveness. Policy interest in topics of current study thus can arise before a research base has accumulated on which recommendations can be based.

Under these circumstances, the panel examined the research it identified on the topic and combined findings from that research with its professional expertise and judgments to arrive at recommendations. However, that a recommendation has a low level of evidence should not be interpreted as indicating that the panel believes the recommendation is unimportant. The panel has decided that all five recommendations are important and, in fact, encourages educators to implement all of them to the extent that state and district resources and capacity allow.

1. Knapp et al. (2006).
2. Reviews of studies for this practice guide applied WWC Version 1.0 standards. See Version 1.0 standards at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf.
3. Following WWC guidelines, improved outcomes are indicated by either a positive, statistically significant effect or a positive, substantively important effect size (i.e., greater than 0.25).
4. For more information, see the WWC Frequently Asked Questions page for practice guides, http://ies.ed.gov/ncee/wwc/references/idocviewer/doc.aspx?docid=15&tocid=3.
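The effect size threshold in footnote 3 can be made concrete with a small worked example. The sketch below is an editorial illustration only; the guide prescribes no software, and the scores and function name are hypothetical. It computes Cohen’s d, one common standardized mean-difference effect size, and applies the 0.25 rule of thumb cited above.

    import statistics

    def cohens_d(treatment, comparison):
        # Standardized mean difference: (mean_t - mean_c) / pooled standard deviation.
        n_t, n_c = len(treatment), len(comparison)
        var_t = statistics.variance(treatment)   # sample variance (n - 1 denominator)
        var_c = statistics.variance(comparison)
        pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
        return (statistics.mean(treatment) - statistics.mean(comparison)) / pooled_sd

    # Hypothetical end-of-year scale scores for a treatment and a comparison group.
    treatment_scores = [72, 81, 77, 85, 79, 74, 88, 80]
    comparison_scores = [70, 75, 73, 78, 71, 69, 82, 74]

    d = cohens_d(treatment_scores, comparison_scores)
    print(f"Effect size d = {d:.2f}")
    # Under the convention cited in footnote 3, a positive effect size
    # greater than 0.25 counts as substantively important.
    print("Substantively important" if d > 0.25 else "Below the 0.25 threshold")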
Table 1. Institute of Education Sciences levels of evidence for practice guides
                  In general, characterization of the evidence for a recommendation as strong requires both
                  studies with high internal validity (i.e., studies whose designs can support causal conclu-
                  sions) and studies with high external validity (i.e., studies that in total include enough of
                  the range of participants and settings on which the recommendation is focused to sup-
                  port the conclusion that the results can be generalized to those participants and settings).
                  Strong evidence for this practice guide is operationalized as
                  • A systematic review of research that generally meets WWC standards (see http://ies.
                      ed.gov/ncee/wwc/) and supports the effectiveness of a program, practice, or approach
    Strong            with no contradictory evidence of similar quality; OR
                  • Several well-designed, randomized controlled trials or well-designed quasi-experi-
                      ments that generally meet WWC standards and support the effectiveness of a program,
                      practice, or approach, with no contradictory evidence of similar quality; OR
                  • One large, well-designed, randomized controlled, multisite trial that meets WWC stan-
                      dards and supports the effectiveness of a program, practice, or approach, with no
                      contradictory evidence of similar quality; OR
                  • For assessments, evidence of reliability and validity that meets the Standards for
                      Educational and Psychological Testing.a

                  In general, characterization of the evidence for a recommendation as moderate requires
                  studies with high internal validity but moderate external validity or studies with high
                  external validity but moderate internal validity. In other words, moderate evidence is
                  derived from studies that support strong causal conclusions but generalization is uncer-
                  tain or studies that support the generality of a relationship but the causality is uncertain.
                  Moderate evidence for this practice guide is operationalized as
                  • Experiments or quasi-experiments generally meeting WWC standards and supporting
                      the effectiveness of a program, practice, or approach with small sample sizes and/
                      or other conditions of implementation or analysis that limit generalizability and no
                      contrary evidence; OR
   Moderate       • Comparison group studies that do not demonstrate equivalence of groups at pretest
                      and, therefore, do not meet WWC standards but that (1) consistently show enhanced
                      outcomes for participants experiencing a particular program, practice, or approach
                      and (2) have no major flaws related to internal validity other than lack of demonstrated
                      equivalence at pretest (e.g., only one teacher or one class per condition, unequal
                      amounts of instructional time, highly biased outcome measures); OR
                  • Correlational research with strong statistical controls for selection bias and for dis-
                      cerning influence of endogenous factors and no contrary evidence; OR
                  • For assessments, evidence of reliability that meets the Standards for Educational and
                      Psychological Testingb but with evidence of validity from samples not adequately rep-
                      resentative of the population on which the recommendation is focused.

                  In general, characterization of the evidence for a recommendation as low means that the
                  recommendation is based on expert opinion derived from strong findings or theories in
      Low         related areas and/or expert opinion buttressed by direct evidence that does not rise to
                  the moderate or strong level. Low evidence is operationalized as evidence not meeting
                  the standards for the moderate or strong level.

a. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999).
b. Ibid.
The What Works Clearinghouse standards and their relevance to this guide

In terms of the levels of evidence indicated in Table 1, the panel relied on WWC evidence standards to assess the quality of evidence supporting educational programs and practices. The WWC evaluates evidence for the causal validity of instructional programs and practices according to WWC standards. Information about these standards is available at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf. The technical quality of each study is rated and placed into one of three categories:

•   Meets Evidence Standards for randomized controlled trials and regression discontinuity studies that provide the strongest evidence of causal validity.

•   Meets Evidence Standards with Reservations for all quasi-experimental studies with no design flaws and randomized controlled trials that have problems with randomization, attrition, or disruption.

•   Does Not Meet Evidence Screens for studies that do not provide strong evidence of causal validity.

Following the recommendations and suggestions for carrying out the recommendations, Appendix D presents more information on the research evidence that supports each recommendation.

The panel would like to thank Cassandra Pickens, Emily Sama Martin, Dr. Jennifer L. Steele, and Mathematica and RAND staff members who participated in the panel meetings, characterized the research findings, and drafted the guide. We also appreciate the help of the many WWC reviewers who contributed their time and expertise to the review process, and Sarah Wissel for her support of the intricate logistics of the project. In addition, we would like to thank Scott Cody, Kristin Hallgren, Dr. Shannon Monahan, and Dr. Mark Dynarski for their oversight and guidance during the development of the practice guide.

Dr. Laura Hamilton
Dr. Richard Halverson
Ms. Sharnell S. Jackson, Ed.M.
Dr. Ellen Mandinach
Dr. Jonathan A. Supovitz
Dr. Jeffrey C. Wayman
Using Student Achievement Data to Support Instructional Decision Making

Overview

Recent changes in accountability and testing policies have provided educators with access to an abundance of student-level data, and the availability of such data has led many to want to strengthen the role of data for guiding instruction and improving student learning. The U.S. Department of Education recently echoed this desire, calling upon schools to use assessment data to respond to students’ academic strengths and needs.5 In addition, spurred in part by federal legislation and funding, states and districts are increasingly focused on building longitudinal data systems.6

Although accountability trends explain why more data are available in schools, the question of what to do with the data remains primarily unanswered. Data provide a way to assess what students are learning and the extent to which students are making progress toward goals. However, making sense of data requires concepts, theories, and interpretative frames of reference.7 Using data systematically to ask questions and obtain insight about student progress is a logical way to monitor continuous improvement and tailor instruction to the needs of each student. Armed with data and the means to harness the information data can provide, educators can make instructional changes aimed at improving student achievement, such as:

•   prioritizing instructional time;8

•   targeting additional individual instruction for students who are struggling with particular topics;9

•   more easily identifying individual students’ strengths and instructional interventions that can help students continue to progress;10

•   gauging the instructional effectiveness of classroom lessons;11

•   refining instructional methods;12 and

•   examining schoolwide data to consider whether and how to adapt the curriculum based on information about students’ strengths and weaknesses.13

5. American Recovery and Reinvestment Act of 2009; U.S. Department of Education (2009); Obama (2009).
6. Aarons (2009).
7. Knapp et al. (2006).
8. Brunner et al. (2005).
9. Brunner et al. (2005); Supovitz and Klein (2003); Wayman and Stringfield (2006).
10. Brunner et al. (2005); Forman (2007); Wayman and Stringfield (2006).
11. Halverson, Prichett, and Watson (2007); Supovitz and Klein (2003).
12. Halverson, Prichett, and Watson (2007); Fiarman (2007).
13. Marsh, Pane, and Hamilton (2006); Kerr et al. (2006).
Scope of the practice guide

The purpose of this practice guide is to help K–12 teachers and administrators use student achievement data to make instructional decisions intended to raise student achievement. The panel believes that the responsibility for effective data use lies with district leaders, school administrators, and classroom teachers and has crafted the recommendations accordingly.

This guide focuses on how schools can make use of common assessment data to improve teaching and learning. For the purpose of this guide, the panel defined common assessments as those that are administered in a routine, consistent manner by a state, district, or school to measure students’ academic achievement.14 These include

•   annual statewide accountability tests such as those required by No Child Left Behind;

•   commercially produced tests—including interim assessments, benchmark assessments, or early-grade reading assessments—administered at multiple points throughout the school year to provide feedback on student learning;

•   end-of-course tests administered across schools or districts; and

•   interim tests developed by districts or schools, such as quarterly writing or mathematics prompts, as long as these are administered consistently and routinely to provide information that can be compared across classrooms or schools.

Annual and interim assessments vary considerably in their reliability and level of detail, and no single assessment can tell educators all they need to know to make well-informed instructional decisions. For this reason, the guide emphasizes the use of multiple data sources and suggests ways to use different types of common assessment data to support and inform decision making. The panel recognizes the value of classroom-specific data sources, such as tests or other student work, and the guide provides suggestions for how these data can be used to inform instructional decisions.

The use of data for school management purposes, rewarding teacher performance, and determining appropriate ways to schedule the school day is beyond the scope of this guide. Schools typically collect data on students’ attendance, behavior, activities, coursework, and grades, as well as a range of administrative data concerning staffing, scheduling, and financing. Some schools even collect perceptual data, such as information from surveys or focus groups with students, teachers, parents, or community members. Although many of these data have been used to help inform instructional decision making, there is a growing interest among educators and policy advocates in drawing on these data sources to increase operational efficiency inside and outside of the classroom. This guide does not suggest how districts should use these data sources to implement data-informed management practices, but this omission should not be construed as a suggestion that such data are not valuable for decision making.

14. The panel recognizes that some schools do not fall under a district umbrella or are not part of a district. For the purposes of this guide, district is used to describe schools in partnership, which could be either a school district or a collaborative organization of schools. Technical terms related to assessments, data, and data-based decision making are defined in a glossary at the end of the recommendations.

Status of the research

Overall, the panel believes that the existing research on using data to make instructional decisions does not yet provide conclusive evidence of what works to improve student achievement. There are a number of reasons for the lack of compelling evidence. First, rigorous experimental studies of some data-use practices are difficult or infeasible to carry out. For example, it would be impractical to structure a rigorous study investigating the effects of implementing a districtwide data system (recommendation 5) because it is difficult to establish an appropriate comparison that reflects what would have happened in the absence of that system. Second, data-based decision making is closely tied to educational technology. As new technologies are developed, there is often a lag before rigorous research can identify the impacts of those technologies. As a result, there is limited evidence on the effectiveness of the state of the art in data-based decision making. Finally, studies of data-use practices generally look at a bundle of elements, including training teachers on data use, data interpretation, and utilizing the software programs associated with data analysis and storage. Studies typically do not look at individual elements, making it difficult to isolate a specific element’s contribution to effective use of data to make instructional decisions designed to improve student achievement.

This guide includes five recommendations that the panel believes are a priority to implement. However, given the status of the research, the panel does not have compelling evidence that these recommendations lead to improved student outcomes. As a result, all of the recommendations are supported by low levels of evidence. While the evidence is low, the recommendations reflect the panel’s best advice—informed by experience and research—on how teachers and administrators can use data to make instructional decisions that raise student achievement. In other words, while this panel of experts believes these practices will lead to improved student achievement, the panel cannot point to rigorous research that proves the practices do improve student achievement.

Summary of the recommendations

The recommendations in this guide create a framework for effectively using data to make instructional decisions. This framework should include a data system that incorporates data from various sources, a data team in schools to encourage the use and interpretation of data, collaborative discussion sessions among teachers about data use and student achievement, and instruction for students about how to use their own achievement data to set and monitor educational goals. A central message of this practice guide is that effective data practices are interdependent among the classroom, school, and district levels. Educators should become familiar with all five recommendations and collaborate with other school and district staff to implement the recommendations concurrently, to the extent that state and district resources and capacity allow. However, readers who are interested in implementing data-driven recommendations in the classroom should focus on recommendations 1 and 2. Readers who wish to implement data-driven decision making at the school level should focus on recommendations 3 and 4. Readers who wish to bolster district data systems to support data-driven decision making should focus on recommendation 5. Finally, readers interested in technical information about studies that the panel used to support its recommendations will find such information in Appendix D.

To account for the context of each school and district, this guide offers recommendations that can be adjusted to fit their unique circumstances. Examples in this guide are intended to offer suggestions based on the experiences of schools and the expert opinion of the panel, but they should not be construed as the best or only ways to implement the guide’s recommendations. The recommendations, described here briefly, also are listed with their levels of evidence in Table 2.

Table 2. Recommendations and corresponding levels of evidence

 Recommendation                                                            Level of evidence

 1. Make data part of an ongoing cycle of instructional improvement               Low

 2. Teach students to examine their own data and set learning goals               Low

 3. Establish a clear vision for schoolwide data use                              Low

 4. Provide supports that foster a data-driven culture within the school          Low

 5. Develop and maintain a districtwide data system                               Low

Source: Authors’ compilation based on analysis described in text.

Recommendations 1 and 2 emphasize the use of data to inform classroom-level instructional decisions. Recommendation 1 suggests that teachers use data from multiple sources to set goals, make curricular and instructional choices, and allocate instructional time. It describes the data sources best suited for different types of instructional decisions and suggests that the use of data be part of a cycle of instructional inquiry aimed at ongoing instructional improvement. Building on the use of data to drive classroom-based instructional decisions, recommendation 2 provides guidance about how teachers can instruct students in using their own assessment data to develop personal achievement goals and guide learning. Teachers then can use these goals to better understand factors that may motivate student performance and can adjust their instruction accordingly.

The panel believes that effective data use at the classroom level is more likely to emerge when it is supported by a data-informed school and district culture. Recommendations 3, 4, and 5, therefore, focus on the organizational and technological conditions that support data use. Recommendation 3 suggests that school leaders establish a comprehensive plan for data use that takes into account multiple perspectives. It also emphasizes the need to establish organizational structures and practices that support the implementation of that plan.

The panel believes that effective data use depends on supporting educators who are using and interpreting data. Recommendation 4 offers suggestions about how schools and districts can prepare educators to use data effectively by emphasizing the importance of collaborative data use. These collaboration efforts can create or strengthen shared expectations and common practices regarding data use throughout a school.

Recommendation 5 points out that effective, sustainable data use requires a secure and reliable data-management system at the district level. It provides detailed suggestions about how districts or other educational entities, such as multidistrict collaboratives or charter management organizations, should develop and maintain a high-quality data system.
Checklist for carrying out the recommendations

Recommendation 1. Make data part of an ongoing cycle of instructional improvement

•   Collect and prepare a variety of data about student learning.
•   Interpret data and develop hypotheses about how to improve student learning.
•   Modify instruction to test hypotheses and increase student learning.

Recommendation 2. Teach students to examine their own data and set learning goals

•   Explain expectations and assessment criteria.
•   Provide feedback to students that is timely, specific, well formatted, and constructive.
•   Provide tools that help students learn from feedback.
•   Use students’ data analyses to guide instructional changes.

Recommendation 3. Establish a clear vision for schoolwide data use

•   Establish a schoolwide data team that sets the tone for ongoing data use.
•   Define critical teaching and learning concepts.
•   Develop a written plan that articulates activities, roles, and responsibilities.
•   Provide ongoing data leadership.

Recommendation 4. Provide supports that foster a data-driven culture within the school

•   Designate a school-based facilitator who meets with teacher teams to discuss data.
•   Dedicate structured time for staff collaboration.
•   Provide targeted professional development regularly.

Recommendation 5. Develop and maintain a districtwide data system

•   Involve a variety of stakeholders in selecting a data system.
•   Clearly articulate system requirements relative to user needs.
•   Determine whether to build or buy the data system.
•   Plan and stage the implementation of the data system.
Recommendation 1. Make data part of an ongoing cycle of instructional improvement

Teachers should adopt a systematic process for using data in order to bring evidence to bear on their instructional decisions and improve their ability to meet students’ learning needs. The process of using data to improve instruction, the panel believes, can be understood as cyclical (see Figure 1). It includes a step for collecting and preparing data about student learning from a variety of relevant sources, including annual, interim, and classroom assessment data.15 After preparing data for examination, teachers should interpret the data and develop hypotheses about factors contributing to students’ performance and the specific actions they can take to meet students’ needs. Teachers then should test these hypotheses by implementing changes to their instructional practice. Finally, they should restart the cycle by collecting and interpreting new student performance data to evaluate their own instructional changes.16

Figure 1. Data use cycle

[Cycle: Collect and prepare a variety of data about student learning → Interpret data and develop hypotheses about how to improve student learning → Modify instruction to test hypotheses and increase student learning → back to the first step.]

Because the data-use process is cyclical, teachers actually can begin at any point shown in Figure 1—that is, with a hypothesis they want to test, an instructional modification they want to evaluate, or a set of student performance data they want to use to inform their decisions. However, the panel has observed that teachers are sometimes asked to use existing student assessment data without receiving clear guidance on how to do so. Consequently, some teachers may find it useful to begin with the collection and preparation of data from a variety of sources, and this guide presents that as the first step in the process. Also, although the steps represent the ongoing nature of the cycle, teachers may find that they need a considerable amount of data collection and interpretation to form strong hypotheses about how to change their instruction.

Level of evidence: Low

The panel drew on a group of qualitative and descriptive studies to formulate this recommendation, using the studies as sources of examples for how an inquiry cycle for data use can be implemented in an educational setting. However, no literature was located that assesses the impact on student achievement of using an inquiry cycle, or individual steps within that cycle, as a framework for data analysis, and the panel therefore determined that the level of evidence to support this recommendation is low.

Brief summary of evidence to support the recommendation

The panel considers the inquiry cycle of gathering data, developing and testing hypotheses, and modifying instruction to be fundamental when using assessment data to guide instruction. Although no causal evidence is available to support the effectiveness of this cycle, the panel draws on studies that did not use rigorous designs for examples of the three-point cycle of inquiry—the underlying principle of this recommendation—and provides some detail on the context for those examples in Appendix D.

How to carry out this recommendation

1. Collect and prepare a variety of data about student learning.

To gain a robust understanding of students’ learning needs, teachers need to collect data from a variety of sources. Such sources include but are not limited to annual state assessments, district and school assessments, curriculum-based assessments, chapter tests, and classroom projects. In most cases, teachers and their schools already are gathering these kinds of data, so carrying out data collection depends on considering the strengths, limitations, and timing of each data type and on preparing data in a format that can reveal patterns in student achievement. Moreover, by focusing on specific questions about student achievement, educators can prioritize which types of data to gather to inform their instructional decisions.17

Each assessment type has advantages and limitations (e.g., high-stakes accountability tests may be subject to score inflation and may lead to perverse incentives).18 Therefore, the panel believes that multiple data sources are important because no single assessment provides all the information teachers need to make informed instructional decisions. For instance, as teachers begin the data-use process for the first time or begin a new school year, the accessibility and high-stakes importance of students’ statewide, annual assessment results provide a rationale for looking closely at these data. Moreover, these annual assessment data can be useful for understanding broad areas of relative strengths and weaknesses among students, for identifying students or groups of students who may need particular support,19 and for setting schoolwide,20 classroom, grade-level, or department-level goals for students’ annual performance.

However, teachers also should recognize that significant time may have passed between the administration of these annual assessments and the beginning of the school year, and students’ knowledge and skills may have changed during that time. It is important to gather additional information at the beginning of the year to supplement statewide test results. In addition, the panel cautions that overreliance on a single data source, such as a high-stakes accountability test, can lead to the overalignment of instructional practices with that test (sometimes called “teaching to the test”), resulting in false gains that are not reflected on other assessments of the same content.21

15. Halverson, Prichett, and Watson (2007), Herman and Gribbons (2001), Huffman and Kalnin (2003), and Fiarman (2007) outline these components (in varied order) in their case studies of how the inquiry process was implemented in some school and district settings. Similarly, Abbott (2008) discusses using data to assess, plan, implement, and evaluate instructional changes as part of a larger framework schools should use to achieve accountability. Further detail under each component is based on panelist expertise.
16. Abbott (2008); Brunner et al. (2005); Halverson, Prichett, and Watson (2007); Kerr et al. (2006); Liddle (2000); Mandinach et al. (2005).
17. Bigger (2006); Cromey and Hanson (2000); Herman and Gribbons (2001); Huffman and Kalnin (2003); Lachat and Smith (2005); Supovitz (2006).
18. Koretz (2003); Koretz and Barron (1998).
19. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Lachat and Smith (2005); Supovitz and Klein (2003); Wayman and Stringfield (2006).
20. Halverson, Prichett, and Watson (2007).
21. Hamilton (2003); Koretz and Barron (1998).
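As one way to picture the “collect and prepare” step, the following sketch is an illustration by the editors, not a tool the guide endorses; every file name and column name is invented. It shows how annual, interim, and classroom results keyed to a common student identifier could be assembled into a single table with Python’s pandas library.

    import pandas as pd

    # Hypothetical extracts: one row per student in each source.
    annual = pd.read_csv("state_assessment_2009.csv")    # student_id, math_scale_score
    interim = pd.read_csv("district_interim_wave1.csv")  # student_id, math_pct_correct
    classroom = pd.read_csv("gradebook_unit1.csv")       # student_id, unit_test_pct, homework_pct

    # Merge on the shared student identifier so each row shows one student
    # across all three sources; outer joins keep students missing from a source.
    combined = (
        annual.merge(interim, on="student_id", how="outer")
              .merge(classroom, on="student_id", how="outer")
    )

    # A simple "prepared" view: flag students who look weak on the interim
    # measure so their classroom work can be examined more closely.
    combined["needs_follow_up"] = combined["math_pct_correct"] < 60
    print(combined.sort_values("math_pct_correct").head(10))

A gradebook export, a district interim file, and a state results file rarely share formats, so the join key (here, the hypothetical student_id) is the piece worth standardizing first.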
Recommendation 1. Make data part of an ongoing cycle of instructional improvement

To gain deeper insight into students’ needs       improved after a unit spent reading and
and to measure changes in students’ skills        analyzing expository writing.
during the academic year, teachers also
can collect and prepare data from interim         Finally, it is important to collect and prepare
assessments that are administered consis-         classroom performance data for examina-
tently across a district or school at regular     tion, including examples and grades from
intervals throughout the year (see the box        students’ unit tests, projects, classwork, and
below).22 As with annual assessments, in-         homework. The panel recommends using
terim assessment results generally have           these classroom-level data sources, in con-
the advantage of being comparable across          junction with widely accessible nonachieve-
classrooms, but the frequency of their ad-        ment data such as attendance records and
ministration means that teachers can use          cumulative files,23 to interpret annual and
the data to evaluate their own instructional      interim assessment results (see the box on
strategies and to track the progress of their     page 13). An important advantage of these
current students in a single school year. For     data sources is that in most cases, they can
instance, data from a districtwide interim        be gathered quickly to provide teachers with
assessment could help illuminate whether          immediate feedback about student learning.
the students who were struggling to con-          Depending on the assignment in question,
vert fractions to decimals improved after         they also can provide rich, detailed exam-
receiving targeted small group instruction,       ples of students’ academic performance,
or whether students’ expository essays            thereby complementing the results of an-
                                                  nual or interim tests. For example, if state
                                                  and interim assessments show that students
  Characteristics of interim
                                                  have difficulty writing about literature, then
  assessments                                     examination of students’ analytic essays,
  • Administered routinely (e.g., each            book reports, or reading-response journals
    semester, quarter, or month)                  can illuminate how students are accustomed
    throughout a school year                      to writing about what they read and can sug-
                                                  gest areas in which students need additional
  • Administered in a consistent                  guidance.24 An important disadvantage of
    manner across a particular grade              classroom-level data is that the assignments,
    level and/or content area within              conditions, and scores are not generally
    a school or district                          comparable across classrooms. However,
                                                  when teachers come together to examine
  • May be commercial or developed                students’ work, this variability also can be
    in-house                                      an advantage, since it can reveal discrepan-
                                                  cies in expectations and content coverage
  • May be administered on paper                  that teachers can take steps to remedy.
    or on a computer
                                                  As teachers prepare annual, interim,
  • May be scored by a computer                   and classroom-level data for analysis,
                                                  they should represent the information in
    or a person

                                                  23. The following studies provide examples of
                                                  available data sources: Owings and Follo (1992);
22. Standards for testing in educational envi-    Halverson, Prichett, and Watson (2007); Jones
ronments are discussed in more detail in Amer-    and Krouse (1988); Supovitz and Klein (2003);
ican Educational Research Association (AERA),     Supovitz and Weathers (2004); Wayman and
American Psychological Association (APA), and     Stringfield (2006).
National Council on Measurement in Education      24. This example is drawn and adapted from a
(NCME) (1999).                                    case study by Fiarman (2007).

                                             ( 12 )
  Examples of classroom and other data

  • Curriculum-based unit tests

  • Class projects

  • Classwork and homework

  • Attendance records

  • Records from parent meetings and phone calls

  • Classroom behavior charts

  • Individualized educational plans (IEPs)

  • Prior data from students' cumulative folders

As teachers prepare annual, interim, and classroom-level data for analysis, they should represent the information in aggregate forms that address their own questions and highlight patterns of interest. For instance, if a teacher wanted to use four waves of interim test data to learn whether students who started the year with weaker mathematics skills were narrowing the gap with their peers, she could make a line graph tracking students' progress on the interim math assessments throughout the year. On the graph, she might create separate lines for students from each performance quartile on the previous year's state mathematics assessment (see Figure 2). Such a graph would allow her to compare the growth trajectories for each group, although she would need to be certain that each quartile group contained numerous students, thereby ensuring that results were not driven by one or two outliers. (Some data systems will include features that make graphing easier and more automatic. See recommendation 5 for more information on data systems.)

In general, preparing state and district data for analysis will be easier for teachers who have access to the kind of districtwide data systems described in recommendation 5, although these teachers still will need to maintain useful records of classroom-level data. Online gradebooks that allow teachers to prepare aggregate statistics by classroom, content area, or assignment type can be useful for identifying patterns in students' classroom-level performance and for identifying students whose classwork performance is inconsistent with their performance on annual or interim assessments. A minimal sketch of such an aggregation appears below.
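The sketch below, written in Python with the pandas library, is offered only as an illustration of the kind of aggregation an online gradebook or data system might perform; it is not part of the panel's recommendations, and the file names, column names, and score thresholds are all hypothetical.

    # Illustration only: aggregate exported gradebook scores by assignment
    # type and flag students whose classwork is inconsistent with their
    # interim-assessment results. All names and thresholds are hypothetical.
    import pandas as pd

    grades = pd.read_csv("gradebook_export.csv")   # one row per student per assignment
    interim = pd.read_csv("interim_results.csv")   # one row per student

    # Mean classwork score per student for each assignment type.
    by_type = grades.pivot_table(index="student_id",
                                 columns="assignment_type",
                                 values="score",
                                 aggfunc="mean")

    merged = by_type.join(interim.set_index("student_id")["interim_percentile"])

    # Flag students whose classwork and interim results tell different
    # stories: strong homework but a low interim percentile, or the reverse.
    merged["inconsistent"] = (
        ((merged["homework"] >= 80) & (merged["interim_percentile"] < 30))
        | ((merged["homework"] < 50) & (merged["interim_percentile"] >= 70))
    )
    print(merged[merged["inconsistent"]])

A teacher would still need to interpret each flagged case; an inconsistency might reflect test anxiety, grading differences, or simple data errors rather than a genuine learning need.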

Figure 2. Example of classroom running records performance at King Elementary School

Source: Supovitz and Klein (2003).
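A quartile-trend graph like the one described above could be drawn from exported interim scores with a short script. The following Python sketch, using the pandas and matplotlib libraries, is purely illustrative; the file interim_scores.csv and its columns are hypothetical, and any graphing tool (including a data system's built-in features) would serve equally well.

    # Illustration only: mean interim math score per testing wave, with one
    # line per prior-year state-test quartile. All names are hypothetical.
    import pandas as pd
    import matplotlib.pyplot as plt

    # Expected columns: student_id, wave (1-4), score, prior_quartile (1-4).
    df = pd.read_csv("interim_scores.csv")

    # Check group sizes first, so one or two outliers cannot drive a line.
    print(df.groupby("prior_quartile")["student_id"].nunique())

    # Mean score for each quartile group at each testing wave.
    trends = df.groupby(["prior_quartile", "wave"])["score"].mean().unstack()

    for quartile, row in trends.iterrows():
        plt.plot(row.index, row.values, marker="o", label=f"Quartile {quartile}")

    plt.xlabel("Interim assessment wave")
    plt.ylabel("Mean mathematics score")
    plt.legend()
    plt.show()

If the lines for the lower quartiles rise more steeply than the others across the four waves, that is evidence the gap is narrowing.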


2. Interpret data and develop hypotheses about how to improve student learning.

Working independently or in teams, teachers should interpret the data they have collected and prepared. In interpreting the data, one generally useful objective is to identify each class's overall areas of relative strengths and weaknesses so that teachers can allocate instructional time and resources to the content that is most pressing. Another useful objective is to identify students' individual strengths and weaknesses so that teachers can adapt their assignments, instructional methods, and feedback in ways that address those individual needs. For instance, teachers may wish to adapt students' class project assignments in ways that draw on students' individual strengths while encouraging them to work on areas for growth.

To gain deeper insight into students' learning needs, teachers should examine evidence from the multiple data sources they prepared in action step 1.25 "Triangulation" is the process of using multiple data sources to address a particular question or problem and using evidence from each source to illuminate or temper evidence from the other sources. It also can be thought of as using each data source to test and confirm evidence from the other sources in order to arrive at well-justified conclusions about students' learning needs. When multiple data sources (e.g., results from the annual state assessment and district interim assessment) show similar areas of student strength and weakness (as in Example 1), teachers can be more confident in their decisions about which skills to focus on. In contrast, when one test shows students struggling in a particular skill and another test shows them performing well in that skill, teachers need to look closely at the items on both tests to try to identify the source of the discrepancy. In all cases, they should use classroom and other data to shed light on the particular aspects of the skill with which students need extra help. (A brief illustrative sketch of such a side-by-side comparison appears after this action step.)

As they triangulate data from multiple sources, teachers should develop hypotheses about ways to improve the achievement patterns they see in the data. As the "Forming testable hypotheses" box below explains, good hypotheses emerge from existing data, identify instructional or curricular changes likely to improve student learning, and can be tested using future assessment data. For example, existing data can reveal places in which the school's curriculum is not well aligned with state standards. In those situations, teachers might reasonably hypothesize that reorganizing the curriculum to address previously neglected material will improve students' mastery of the standards. In other cases, teachers may hypothesize that they need to teach the same content in different ways. Taking into account how they and their colleagues have previously taught particular skills can help teachers choose among plausible hypotheses. For instance, teachers may find that students have difficulty identifying the main idea of texts they read. This weak student performance may lead teachers to hypothesize that the skill should be taught differently. In talking to other teachers, they might choose a different teaching strategy, such as a discussion format in which students not only identify the main idea of a text but also debate its evidence and merits.

To foster such sharing of effective practices among teachers, the panel recommends that teachers interpret data collaboratively in grade-level or department-specific teams. In this way, teachers can begin to adopt some common instructional and assessment practices as well as common expectations for student performance.26 Collaboration also allows teachers to develop a collective understanding of the needs of individual students in their school, so that they can work as an organization to provide support for all students.

25. Halverson, Prichett, and Watson (2007); Herman and Gribbons (2001); Lachat and Smith (2005); Wayman and Stringfield (2006).

26. Fiarman (2007); Halverson, Prichett, and Watson (2007); Halverson et al. (2007).
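As an illustration of the side-by-side comparison involved in triangulation (not a prescribed tool or method), the following Python sketch places skill-level results from two hypothetical score files next to each other and flags both confirmed weaknesses and discrepancies; every file name, column name, and threshold here is an assumption made for the example.

    # Illustration only: "triangulate" skill-level results from two sources
    # by aligning them and flagging agreements and discrepancies.
    # File names, column names, and thresholds are hypothetical.
    import pandas as pd

    state = pd.read_csv("state_by_skill.csv")      # columns: skill, pct_proficient
    interim = pd.read_csv("interim_by_skill.csv")  # columns: skill, pct_correct

    both = state.merge(interim, on="skill")

    # Both sources low: a weakness worth prioritizing with confidence.
    both["confirmed_weakness"] = (both["pct_proficient"] < 60) & (both["pct_correct"] < 60)

    # Sources far apart: examine the items on both tests before acting.
    both["discrepant"] = (both["pct_proficient"] - both["pct_correct"]).abs() > 25

    print(both.sort_values("discrepant", ascending=False))

Skills flagged as discrepant are exactly the cases in which the panel advises looking closely at the items on both tests before drawing conclusions.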


  Forming testable hypotheses

  Situation: Based on data from your 3rd-grade class's assignments and assessments, it appears that more than half of the students struggle with subtraction. As their teacher, you ask yourself how they can better master subtraction skills. To answer this question, you hypothesize that the students' subtraction skills might improve if they were taught to use the "trade first" method for subtraction, in which students do their regrouping from the tens to ones column at the beginning, rather than at the end, of the problem. You determine that this hypothesis can be tested by (1) working with these students in a group to teach them the trade first method and (2) examining changes in their subtraction scores on the interim assessment.

  Characteristics of testable hypotheses

  • Identify a promising intervention or instructional modification (teaching the trade first method for subtraction) and an effect that you expect to see (improvement in the subtraction skills of struggling students)

  • Ensure that the effect can be measured (students' subtraction scores on the interim assessment after they learn the trade first strategy)

  • Identify the comparison data (students' subtraction scores on the interim assessment before they were taught the strategy)

3. Modify instruction to test hypotheses and increase student learning.

After forming hypotheses about students' learning needs, teachers must test their hypotheses by carrying out the instructional changes that they believe are likely to raise student achievement. The kinds of changes they choose to implement may include—but are not limited to—one or more of the following:

•  allocating more time for topics with which students are struggling;

•  reordering the curriculum to shore up essential skills with which students are struggling;

•  designating particular students to receive additional help with particular skills (i.e., grouping or regrouping students);

•  attempting new ways of teaching difficult or complex concepts, especially based on best practices identified by teaching colleagues;

•  better aligning performance expectations among classrooms or between grade levels; and/or

•  better aligning curricular emphasis among grade levels.

If the instructional modification was not developed collaboratively, teachers may nonetheless find it useful to seek feedback from peers before implementing it. This is particularly true if teachers have chosen to enact a large instructional change, such as a comprehensive new approach to algebra instruction or a reorganization of the mathematics curriculum sequence. Because curricular decisions are sometimes made at the school or district level, teachers may even want to make a case for curriculum reorganization with school or district leaders ahead of time.
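To connect this action step back to the "Forming testable hypotheses" box above: once the trade-first lessons have been delivered and a later interim assessment administered, the before-and-after comparison the box describes could be tabulated as in the following Python sketch. It is illustrative only; the file names, column names, and the cutoff defining "struggling" are hypothetical.

    # Illustration only: compare struggling students' interim subtraction
    # scores before and after the trade-first lessons. All file and column
    # names, and the cutoff score of 60, are hypothetical.
    import pandas as pd

    pre = pd.read_csv("interim_fall.csv")     # columns: student_id, subtraction_score
    post = pd.read_csv("interim_winter.csv")  # same columns, later administration

    scores = pre.merge(post, on="student_id", suffixes=("_pre", "_post"))

    # Restrict to students who struggled initially and received the lessons.
    struggling = scores[scores["subtraction_score_pre"] < 60]

    change = (struggling["subtraction_score_post"]
              - struggling["subtraction_score_pre"]).mean()
    print(f"Average change for struggling students: {change:+.1f} points")

A simple before-and-after difference cannot rule out other explanations, such as ordinary growth over the year, so such a number should be weighed alongside the classroom-level evidence described in this recommendation rather than treated as proof.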

The time it takes teachers to carry out their instructional changes will depend in part on the complexity of the changes. If teachers are delivering a discrete lesson plan or a series of lessons, the change usually can be carried out quickly. Larger interventions take longer to roll out than smaller ones. For instance, a teacher whose intervention involves introducing more collaborative learning into the classroom may need time to teach her students to work efficiently in small-group settings.

During or shortly after carrying out an instructional intervention, teachers should take notes on how students responded and how they as teachers might modify delivery of the intervention in future classes. These notes may not only help teachers reflect on their own practice but also prepare them to share their experiences and insights with other teachers.

To evaluate the effectiveness of the instructional intervention, teachers should return to action step 1 by collecting and preparing a variety of data about student learning. For instance, they can gather classroom-level data, such as students' classwork and homework, to quickly evaluate student performance after the intervention.27 Teachers can use data from later interim assessments, such as a quarterly district test, to confirm or challenge their immediate, classroom-level evidence.

Finally, after triangulating data and considering the extent to which student learning did or did not improve in response to the intervention, teachers can decide whether to keep pursuing the approach in its current form, modify or extend the approach, or try a different approach altogether. It is important to bear in mind that not all instructional changes bear fruit immediately, so before discarding an instructional intervention as ineffective, teachers should give themselves and their students time to adapt to it.28

Potential roadblocks and solutions

Roadblock 1.1. Teachers have so much data that they are not sure where they should focus their attention in order to raise student achievement.

Suggested Approach. Teachers can narrow the range of data needed to solve a particular problem by asking specific questions and concretely identifying the data that will answer those questions. In addition, administrators can guide this process by setting schoolwide goals that help clarify the kinds of data teachers should be examining and by asking questions about how classroom practices are advancing those goals. For instance, if administrators have asked teachers to devote particular effort to raising students' reading achievement, teachers may decide to focus attention on evidence from state, interim, and classroom assessments about students' reading needs. Teachers should then triangulate data from multiple sources (as described earlier) to develop hypotheses about instructional changes likely to raise student achievement. Note that recommendation 3 describes how administrators, data facilitators, and other staff can help teachers use data in ways that are clearly aligned with the school's medium- and long-term student achievement goals. Also, recommendation 4 describes how professional development and peer collaboration can help teachers become more adept at data preparation and triangulation.

Roadblock 1.2. Some teachers work in grade levels (such as the early elementary or advanced high school grades) or teach subjects (such as social studies, music, science, or physical education) for which student achievement data are not readily available.

27. Fiarman (2007).

28. Elmore (2003).


Example 1. Examining student data to understand learning

Consider this hypothetical example . . . When the 4th- and 5th-grade teachers at Riverview Elementary School met after school in September for their first data meeting of the year, the data facilitator, Mr. Bradley, shared selected data about how students had performed on the previous year's standards-based state accountability test. (Action Step 1) Teachers quickly saw that in both grades, students' proficiency rates were higher in language arts than in mathematics, so they decided to look more closely at particular mathematics skills. Examining the results on each math content strand, the teachers found that although students were performing adequately in arithmetic, they struggled with geometry skills concerning shapes and measurement. (Action Step 2) This news was surprising because, consistent with state standards, teachers taught shapes and measurement in both the 4th and 5th grades.

Because students had already taken their first district-based interim assessment of the school year, the teachers also were able to use the district's data system to look at how students had performed in geometry on that assessment. (Action Step 1) Studying one graph, Ms. Irving, a 4th-grade teacher, observed that the content strand with which students struggled most was measuring perimeters of polygons. Since calculating perimeters was a matter of adding, and students had performed well on the addition strands of both the annual and interim tests, the teachers were perplexed. They decided to collect new data on students' geometry skills using questions from the supplemental workbooks of their standards-based math curriculum. (Action Step 2)

When teachers brought their students' workbook responses to the next data meeting, they gathered in small groups to examine the students' work and generate hypotheses. As they shared the classwork examples, they noticed a pattern. Students performed well on simple perimeter problems when the shapes were drawn for them, but on word problems that required them to combine shapes before adding, they largely faltered. (Action Step 2) The teachers hypothesized that students' difficulties were not with calculating perimeters, but with considering when and how to combine polygons in response to real-world problems. They further hypothesized that students would benefit from opportunities to apply basic geometry skills to novel situations.

Working together in grade-level teams, the teachers devised tasks for their students that would require them to use manipulatives and online interactive simulations to solve perimeter problems about floor plans and land use. (Action Step 3) The teachers agreed to deliver these lessons in their classrooms and report back on how the students responded.

At the next data meeting, teachers brought implementation notes and samples of student work from the hands-on perimeter lessons. (Action Step 1) Most reported that students were engaged in the lessons but needed additional practice. After readministering similar lessons two weeks later, most teachers found that their students were getting the hang of the task. On the next interim assessment, teachers were pleased to learn that the percentage of perimeter and area questions answered correctly had increased from 40 percent to 70 percent across the two grades. (Action Step 2)
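The final check in this example, the share of perimeter and area items answered correctly on each interim administration, is the kind of small computation a data facilitator could script. The following Python sketch is illustrative only; the item-level file and its columns are hypothetical.

    # Illustration only: percent of perimeter/area items answered correctly
    # on each interim administration. File and column names are hypothetical.
    import pandas as pd

    # Expected columns: student_id, grade, administration ("first"/"second"),
    # strand, correct (0 or 1).
    items = pd.read_csv("interim_item_results.csv")

    perimeter = items[items["strand"] == "perimeter_area"]
    pct = perimeter.groupby("administration")["correct"].mean().mul(100).round(1)
    print(pct)  # e.g., first 40.0, second 70.0 across the two grades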
