Scenarios & Take Away Responses

With Resources and Evidence

1. “Two of my sections are all the students who are already advanced – that’s why
   they didn’t grow. They are already very high!”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   High achieving groups of students cannot make growth on PA’s state assessments
   Viewing student performance through the lens of achievement instead of growth
   Thinking of high achievement only in terms of % Advanced/Proficient
Key Concepts/Topics to Discuss:
   PVAAS is a measure of growth for a group of students, not individual students
   High achievement can be defined in many ways - % P/A; average scaled score; etc.
   Advanced is a range of performance; Proficiency is a range of performance
   PVAAS uses scaled scores from the PA state assessments to measure growth, not
     performance levels (Advanced, Proficient, Basic, Below Basic); this makes PVAAS a much
     more sensitive measure of growth/change in the achievement level of a group of students
     (see the brief sketch following this list)
   PA’s state assessments have enough stretch to measure growth of high and low achieving
     students; SAS EVAAS checks this each year when new PA assessment data files are received
     from the PA testing vendor
   High achieving students can—and do—show growth through PVAAS, and educators are
     neither advantaged nor disadvantaged by the achievement level of their students
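A brief sketch of the point above, in Python, using entirely hypothetical cut scores and
student scores (the actual PVAAS/EVAAS models are far more sophisticated than this): two
students can both be "Advanced" in consecutive years yet show very different changes in
scaled score.

      # Hypothetical numbers only -- not actual PA cut scores or PVAAS calculations.
      ADVANCED_CUT = 1500  # assumed scaled-score cut for the Advanced range

      students = {
          # name: (prior-year scaled score, current-year scaled score)
          "Student A": (1520, 1610),  # Advanced both years; scaled score rose
          "Student B": (1650, 1530),  # Advanced both years; scaled score fell
      }

      for name, (prior, current) in students.items():
          level = "Advanced" if current >= ADVANCED_CUT else "Below Advanced"
          print(f"{name}: {prior} -> {current} ({current - prior:+d} points, {level})")

      # Through a %-Advanced lens the two students look identical; the scaled scores
      # show that one gained ground and one lost ground within the Advanced range.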
Evidence:
   When a school/teacher has a high percentage of students who are reaching proficiency, or
     even scoring at the Advanced level, we cannot necessarily make the assumption that all
     students are scoring at the highest point within the Advanced range. In fact, of the 800,000
     students assessed on the PSSA in 2013:
       o less than 0.5% (less than 3500 students) scored at the highest point of the Advanced
           range;
      o only 0.02% (less than 85 students) scored at the highest point of the Advanced range in
          Math two years in a row; and
      o only 0.005% (less than 30 students) scored at the highest point of the Advanced range
          in Reading two years in a row.
   For the Keystone tests which were first administered in the 2012-2013 school year,
     approximately 140,000 test scores were included in the PVAAS model for each subject. Of
     these test scores:
      o less than 0.0001% (less than 5 students) scored at the highest point of the Advanced
          range in Algebra I;
      o less than 0.0001% (less than 5 students) scored at the highest point of the Advanced
          range in Biology; and
       o less than 0.001% (less than 20 students) scored at the highest point of the Advanced
           range in Literature.
     Teacher Specific Reporting Scatterplots in Misconceptions of PVAAS Teacher Specific
      Reporting document (p. 5) (https://pvaas.sas.com)
Resource(s):
   Misconceptions of PVAAS Teacher Specific Reporting (p. 5-6) (https://pvaas.sas.com)
   PVAAS Methodologies Document: Measuring Growth and Projecting Performance (p. 8-10)
     (https://pvaas.sas.com)
   Virtual Learning Modules (VLM): Introduction to High Achievement and Growth; High
     Achievement & Growth, An In-Depth Approach (https://pvaas.sas.com, under e-Learning
     Link)

2. “I’ve always taken all the students with IEPs. I appreciate your confidence in me,
   but how is that fair to me?”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   Low achieving groups of students cannot make growth as measured by PA’s state
     assessments
   Viewing student performance through the lens of achievement instead of growth
Key Concepts/Topics to Discuss:
   PVAAS uses all available testing history for each individual student
   PVAAS is a measure of growth for a group of students, not individual students
   Each student serves as his or her own control, and to the extent that student demographics
     or other influences persist over time, these influences are already represented in the
     student’s data
   In other words, a student’s background tends to remain stable over time and if certain
     factors related to that background impact student scores, they are captured in the student’s
     test scores over time
   PVAAS does not use the percentages of students at various academic performance levels to
     measure growth. Each performance level contains a range of scaled scores, and students
     move around within these ranges, as well as between ranges
   Growth as measured by PVAAS is a more sensitive measure of student achievement growth
     than changes in performance levels (Advanced, Proficient, Basic, Below Basic)
Evidence:
   Teacher Specific Reporting Scatterplots in Misconceptions of PVAAS Teacher Specific
     Reporting document (p. 4) (https://pvaas.sas.com)
Resource(s):
   Misconceptions of PVAAS Teacher Specific Reporting (p.2-4) (https://pvaas.sas.com)
   Virtual Learning Module (VLM): PVAAS Misconceptions (https://pvaas.sas.com, under e-
     Learning Link)

3. “If my students lost ground last year, would my students have to make that up to
   be able to show growth? How can I do that? That hardly seems fair!”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   Not understanding HOW growth is measured in PVAAS
   Thinking that PVAAS is measuring the growth of individual students
   Assuming they will get a low value-added score if the students did not make growth the year
     before
Key Concepts/Topics to Discuss:
   The group is assessed against the students’ entering achievement level; each student serves
     as his or her own control; the student group is compared to its own prior performance,
     NOT to a prior group of students
   Growth is assessed against the standard for PA Academic Growth, which is based on the
     philosophy that, regardless of the entering achievement level of a group of students, they
     should not lose ground academically
   Growth is measured by looking at differences in achievement for the same group of students
   To earn a green, light blue, or dark blue indicator in PVAAS, a teacher does not need to
     make up the growth the students did not make last year in addition to a full year of
     growth (see the brief sketch following this list)
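A simplified sketch of this idea, in Python, using hypothetical state percentiles (PVAAS does
not literally compute growth this way; this only illustrates that the expectation is based on
the group's entering achievement, not on a prior year's shortfall):

      # Hypothetical percentiles only -- a simplified illustration, not the PVAAS model.
      # This group "lost ground" last year, so it ENTERS this year at a lower level.
      entering = [35, 40, 42, 38, 45]   # students' entering achievement this year
      exiting = [36, 41, 44, 38, 46]    # the same students at the end of this year

      # The expectation is tied to the entering level: the group should at least
      # maintain that level, not first recover last year's loss and then grow more.
      avg_change = sum(e - s for s, e in zip(entering, exiting)) / len(entering)
      print(f"Average change from entering achievement: {avg_change:+.1f} percentiles")
      # A result at or above roughly zero means the group held its entering level,
      # which is enough to meet the standard in this simplified illustration.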
Evidence:
   Refer to School Value-Added Reporting from year to year to show how this is possible at the
     school level; the same concept applies at the teacher level
Resource(s):
   PVAAS Methodologies Document: Measuring Growth and Projecting Performance (p. 8-10)
     (https://pvaas.sas.com)
   Virtual Learning Modules (VLM): Introduction to Measuring Academic Growth, Concept of
     Growth, Connections to Value-Added Reporting (https://pvaas.sas.com, under e-Learning
     Link)
   Virtual Learning Module (VLM): PVAAS Misconceptions (https://pvaas.sas.com, under e-
     Learning Link)

4. “The curriculum you want me to teach isn’t aligned with the state assessment.”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   Professional staff do not have access to the PA Core Standards
   Professional staff do not have access to the eligible content
   No teacher in the respective LEA/district in the specified subject/grade/course has a group
     of students who met the standard for PA Academic Growth
Key Concepts/Topics to Discuss:
   PA Core Standards are posted on the PDE SAS Portal for all PA professional staff to access,
     including eligible content in the state assessed subjects/grades/courses
   Discuss the LEA’s work on aligning local curriculum to the PA Core Standards, the current
     status of that curriculum work, and the LEA’s curriculum cycle/ongoing review process
   Discuss the LEA’s work with alignment of assessments to the PA Core Standards
   Discuss the LEA’s work with alignment of core and intervention programs to PA Core
     Standards
Evidence:
   Is the issue of concern across other subjects/grades/courses or only in one
     subject/grade/course?
   District/School Teacher Reporting Summaries (only share non-identifiable data)
       o Did any other teachers meet/exceed the standard for PA Academic Growth?
   Has the LEA/district conducted a curriculum audit as evidence of alignment?
Resource(s):
   PDE Standards Aligned System Portal (http://www.pdesas.org/), including the PA Core
     Standards, state-assessed eligible content, and the PA curriculum framework

5. “My students did well on our common assessments. Why am I not getting blue and
   green?”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   Achievement results = growth results
   Not understanding that growth is a measure across time, whereas achievement is a measure
     at a point in time
   Common assessments measure all of the same eligible content/standards as the statewide
     assessment
Key Concepts/Topics to Discuss:
   Common assessments are often used as formative or benchmark assessments throughout
     the year to guide instruction. They do not always assess all of the eligible content on the
     statewide assessment, and they may not reflect the same weighting of that eligible content
   Scoring Proficient or higher on an assessment does not mean the group of students made
     growth from year to year
   Achievement indicates performance on that day for that assessment
   Comparing the percentage of students who score Proficient (or above) does not account for
     changes in achievement/growth within performance level categories
   PVAAS value-added reporting follows the progress of individual students over time,
     regardless of their achievement level, to ensure that all students count
Evidence:
   Misconceptions of PVAAS Teacher Specific Reporting (p.7-8) (https://pvaas.sas.com)
Resource(s):
   Misconceptions of PVAAS Teacher Specific Reporting (p.7-8) (https://pvaas.sas.com)

6. “Why is the School Value-Added Growth Measure light blue, but my Teacher Value-
   Added Growth Measure is yellow?” (or vice versa)

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   The analyses and business rules are the same, which is not true
   The same students are included, which may not be true
   All students are weighted equally, which may not be true
Key Concepts/Topics to Discuss:
   PVAAS School Reporting requires full academic year enrollment for a student to be included
   PVAAS Teacher Value-Added Reporting uses % Student + Teacher Enrollment and % Shared
     Instruction to determine the weighting of each student in each teacher’s PVAAS Teacher
     Specific Reporting (see the brief sketch following this list)
   The standard error around the growth measure will be larger for a teacher than for the
     teacher’s respective school, given the number of students and prior test scores available to
     use in the analyses (i.e., the smaller the number of students, the larger the standard error)
   Because individual classrooms have fewer students than the overall school, PVAAS Teacher
     Specific Reporting may yield different colors than PVAAS School Reporting; for example, if
     a school is on the cusp of red, its teachers might fall only into yellow or green
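A brief sketch of the two ideas above, in Python, using assumed numbers (the actual PVAAS
weighting and standard-error calculations are more involved; in particular, combining the two
percentages by multiplication is an assumption made here only for illustration):

      import math

      # (a) Roster-verification weighting -- illustrative only. The percentages come
      # from roster verification; multiplying them is an assumption for this sketch,
      # not a statement of the exact PVAAS formula.
      pct_enrollment = 0.80   # assumed: enrolled with this teacher for 80% of the course
      pct_instruction = 0.50  # assumed: instruction shared equally with a co-teacher
      print(f"Illustrative student weight: {pct_enrollment * pct_instruction:.2f}")

      # (b) Standard errors shrink as the number of students grows, so a school-level
      # estimate (many students) is more precise than a single teacher's estimate.
      spread = 10.0  # assumed spread of student-level growth, in arbitrary units
      for n in (25, 100, 600):
          print(f"n = {n:4d} students -> approximate standard error {spread / math.sqrt(n):.2f}")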
Evidence:
   Examine local reporting for the existence of this scenario; comparing PVAAS School and
     Teacher Specific Reporting
Resource(s):
   PVAAS Methodologies Document: Measuring Growth and Projecting Performance (p.27)
     (https://pvaas.sas.com)

7. “Why doesn’t this take attendance into account?”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   Groups of students with chronic absenteeism cannot meet the standard for PA Academic
     Growth
   Teachers with students with chronic absenteeism will get low PVAAS scores
Key Concepts/Topics to Discuss:
   Discuss PA’s rationale for using enrollment v. attendance
      o The district, school, and individual teacher(s) each have a role in preventing and
          intervening with student attendance issues. The School Performance Profile reflects the
          effectiveness of the school’s efforts to address student attendance. The use of
          enrollment in PVAAS teacher-specific reporting reflects the responsibility of individual
          teachers in preventing/intervening with student attendance issues. Teachers are
           responsible for the education of each student in a subject/grade/course, which results
           in a subject/course grade as well as performance on state assessments
      o Teacher-specific strategies include areas such as high expectations,
          relevant/meaningful/engaging instruction, relationship building with students,
          mentoring, parent communication, group and individual incentive programs, and
          continuity of instruction (teacher attendance)
      o Students can be dropped/un-enrolled from a subject/grade/course based on LEA policy
   Discuss issue of chronic absenteeism:
       o This is related to concerns about students living in poverty or other
           socioeconomic/demographic factors that are related to low achievement
       o To the extent that these factors are likely to be similar from year to year and affect
           student test scores, using all available prior test scores on each student enables each
           student to serve as his/her own control, and the growth expectation will indirectly take
           these factors into account, since it is taking the students' previous performance and
           entering achievement into account
       o In other words, if a student has a history of attendance issues that has impacted his or
           her prior performance (i.e., lower achievement on previous state assessments), this
           lower achievement is taken into account when determining growth for a teacher’s
           group of students
   LEAs may want to ensure that their LEA enrollment policies are being implemented as
     written, making sure documentation of student enrollment/disenrollment is occurring
     consistently in the local LEA Student Information System (SIS). LEAs do have state policies
     they need to follow regarding student truancy, etc. Some LEAs have discussed an enrollment
     policy where a student can be unenrolled from a tested subject/course after a specified
     number of days while continuing to be enrolled in the school/LEA.
   Enrollment policies are determined locally within the guidelines of state level requirements
Evidence:
   Misconceptions of PVAAS Teacher Specific Reporting (p.3) (https://pvaas.sas.com)
   PDE has requested that SAS EVAAS conduct additional analyses using PA attendance data
     collected from LEAs by PDE
Resource(s):
   PVAAS FAQ: Roster Verification document (p. 19-20) (https://pvaas.sas.com)

8. “I no longer want to co-teach with that person.”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   A low PVAAS score is the fault of one person
   Students from specific demographic groups negatively impacted the PVAAS score
   The value-added results are attributable solely to the co-teaching approach or to the
     specific co-teacher
Key Concepts/Topics to Discuss:
   How was the progress of students handled prior to Act 82 in these kinds of situations? Is
     this a new concern?
   Has the co-teaching model been implemented with fidelity by both co-teachers?
   Does co-teaching occur in all sections?
   What does the growth of other co-teachers’ groups of students in the school/district look
     like?
   Have these concerns been discussed previously with the school admin or is this new
     information?
   Discuss the implementation of models of co-teaching and what co-teaching approach/model
     is in place.
Evidence:
   Develop a PVAAS custom diagnostic report on sections of students with and without the co-
     teaching approach
   Review the School Admin Teacher Value-Added Summary for the subject/grade/course to
     see if there are others with different or similar results; hypothesize about what may have
     created these results
Resource(s):
   Teacher’s Desk Reference: Co-teaching, www.pattan.net

9.   “I no longer want to have a student teacher.”

Take Away Response:
Misunderstandings/Inaccurate Assumptions:
   The cause of the lower value-added results was the student teacher
Key Concepts/Topics to Discuss:
   How was the progress of students handled prior to Act 82 in these kinds of situations? Is
     this a new concern?
   Were there concerns about the student teacher during his/her time in the teacher’s
     classroom? Were these concerns communicated by the teacher to the principal?
   What is the role of the student teacher? How is the student teacher used in the classroom?
     How is/was the student teacher supported?
   Where is the teacher when the student teacher is teaching the students? Discuss the
     role/responsibility of the teacher in terms of student academic growth when the student
     teacher is providing instruction to students
   Preliminary research findings by SAS EVAAS:
       o For most grades and subjects, supervising a student teacher made no significant
           difference in teacher effectiveness, particularly for teachers who are considered
           average or high performing
       o However, the initial findings do suggest that low performing teachers might see a
           small negative impact on their effectiveness in Mathematics and Science when
           supervising student teachers as compared to not supervising; this finding has potential
           implications for the assignment of student teachers to licensed teachers
       o http://www.tn.gov/thec/Divisions/AcademicAffairs/rttt/tvaas/Policy_%20Brief_StudentTeacher%20Assignment.pdf
Evidence:
   Preliminary Report: The Impact of Student Teachers on Teacher Value-Added Reporting
     (http://www.tn.gov/thec/Divisions/AcademicAffairs/rttt/tvaas/Policy_%20Brief_StudentTeacher%20Assignment.pdf)
Resource(s):
   Preliminary Report: The Impact of Student Teachers on Teacher Value-Added Reporting
     (http://www.tn.gov/thec/Divisions/AcademicAffairs/rttt/tvaas/Policy_%20Brief_StudentTeacher%20Assignment.pdf)

PVAAS: Ready, Set, Grow! (Full Day Training for Admin Teams)
PVAAS Statewide Team for PDE • pdepvaas@iu13.org
Fall 2014