Thursday, May 26, 2016

ESSA Workgroups Meet: School Improvement

This is the third of four posts on the ESSA workgroups meeting in Oregon to create the state's new system under ESSA. Each post shares the summary released by one workgroup. Today we post the School Improvement Workgroup's recap and next steps. Find more information about this workgroup here. More on the fourth workgroup to come.

School Improvement Workgroup:
Where We’ve Been and Where We’re Going
The School Improvement Workgroup has been charged with developing a proposed framework of supports for schools identified for comprehensive and targeted improvement as well as developing a proposed framework for determining how and when schools will exit identification. To accomplish this, the group established a common understanding of the
various stages of Oregon’s current improvement cycle and the impact on schools currently undergoing improvement efforts.

Work Group Progress
The workgroup has developed strong framing around the need to remove the stigma attached to schools identified for additional supports. This requires balancing flexibility and differentiated approaches that embrace the varied contexts of schools and districts with holding parties accountable for significant and sustained improvement.

There is also consensus within the group that “school improvement” should not be limited to federally mandated requirements and that there is great opportunity to go above and beyond the minimum.

Ongoing Discussions
At the April 26th meeting, workgroup members engaged in discussions focusing on the four major areas of the improvement cycle and discussed guiding principles that might be incorporated into Oregon’s next iteration of its improvement process. Each major area was framed by essential questions and considerations.

Identification: How might schools be identified for improvement supports? Guiding principles discussed were:
  •   Inclusion of data that include measures of teacher quality / effectiveness
  •   Multiple measures of student achievement / academic performance (not just Smarter Balanced)
  •   Broader data around school climate and culture (TELL or similar collection)
  •   Measures that compare how schools / districts serve and support underserved student populations, noting the current model compares academic peers but does not compare similar underserved student populations in the same manner
  •   School-level measures that lead to district identification for improvement supports

    Diagnostic Review and Planning: What role might ODE / LEAs play in the diagnostic review / needs assessment? What are the opportunities and barriers in conducting high-quality, in-depth diagnostic reviews? How might stakeholders be meaningfully and productively engaged in the review process? Guiding principles discussed were:
  •   Diagnostic review is the key to success; a more authentic review yields better plans
  •   Stronger input and engagement from teachers in planning and implementation
  •   More engagement from community stakeholders throughout the process
  •   More engagement from school boards and superintendents, including active participation in the review, planning, and monitoring processes
  •   Alignment of state expectations, district plans and actions, and school plans and actions

Monitoring: What (additional) data might be used for in-year / implementation monitoring? What resources might be developed in order to support improvement efforts? How might plans be evaluated and approved on an annual basis? Guiding principles discussed were:
  •   Emphasis on district and school interim monitoring plans
  •   Differentiated financial resources based on monitoring routines and outcomes
  •   Reduced paperwork / burden to submit updates and reports
  •   Review of systems working together: teacher observation / evaluation, assessment, RTI / PBIS, climate / culture
  •   Stronger development of implementation evidence – What will this look like when it’s…

    Exit Criteria and Progressive Interventions: How might we define improvement? Do exit criteria need to mirror identification criteria? Can schools exit improvement status before the end of the identification period? How might we support sustained improvement? What might progressive interventions include for schools that do not demonstrate improvement? Guiding principles discussed were:
  •   The desire to “exit” is driven by the punitive / shaming stigma; if there’s no stigma, districts / schools might not want to exit
  •   Schools that demonstrate improvement should be able to exit with continued financial supports
  •   The notion of “what gets you in, gets you out” works with some added flexibility / adaptability
  •   Schools should create portfolios of evidence to establish improvement and change
  •   Broader indicators than those used for identification: test scores might get a school identified, but more should be required to establish improvement
  •   Multiple indicators aligned to system health / improvement
  •   Stronger ties to educator effectiveness and instruction

    At our May 18th meeting, the School Improvement Workgroup will continue to engage in discussions focusing on the various elements of the improvement process, including further refinement of the principles discussed in April. Additionally, the workgroup will engage in discussions on some of the federal requirements and flexibility with set-aside funds to support direct services to students.

    By the end of the day, we hope to have some strong proposals for frameworks in each of the four areas as well as clear proposed actions for direct services to students. This process will continue through our final meeting on June 28th.

You can read the update from the Accountability workgroup here. Read the Standards and Assessment update here.
