author: Horizon Research, Inc.
published: 02/14/2001
posted to site: 02/14/2001
What Have We Learned? Local Systemic Change Initiatives Share Lessons From the Field. Horizon Research, Inc.
In June 2000, NSF sponsored a meeting for LSCs engaged in the reform of mathematics and science education. The meeting was held in Washington, DC and coordinated by HRI. The assembled PIs and evaluators were asked what they would tell the mathematics and science education communities about designing professional development, preparing professional development providers, engaging administrators, and supporting implementation of instructional materials. This report captures some of the critical lessons shared. It will be of interest to current and future LSCs as well as the educational community at large.
I. Introduction
The Local Systemic Change (LSC) program began in 1995, with support from the National
Science Foundation (NSF). The goal of the LSC is to improve the teaching of mathematics,
science, and technology by focusing on the professional development of teachers within whole
schools or school districts, with an emphasis on preparing teachers to implement designated
exemplary mathematics and science instructional materials in their classrooms. Starting with 8
projects in 1995, the program expanded to a total of 72 projects by 1999, including projects
targeting the elementary or secondary grades, or both; and addressing mathematics, science, or
both. Many of the projects chose "kit-based" science programs, with an emphasis on hands-on
science inquiry, or module-based mathematics programs that focus on problem-solving in real-world contexts.
In June 2000, NSF sponsored a meeting for Local Systemic Change initiatives engaged in the
reform of mathematics and science education. Held in Washington, DC and coordinated by
Horizon Research, Inc. (HRI), the "Lessons Learned" conference included representatives from
30 LSC projects. HRI deliberately invited teams of LSC Principal Investigators (PIs) and
evaluators to ensure complementary perspectives. PIs responsible for designing and
implementing projects brought first-hand knowledge from "the trenches," while evaluators
brought an external, and sometimes multi-site, perspective based on observing one or more LSC
projects over time.
Collectively, LSC PIs and evaluators share a wealth of knowledge about how to "do reform."
The June conference, however, asked participants to shift their focus away from specific reform
strategies, and dwell on what they had learned from implementing these strategies. Given the
opportunity, what would LSCs tell the mathematics and science education communities about
designing professional development? about preparing professional development providers?
about supporting implementation of instructional materials? about engaging administrators?
about sustaining reform? What barriers might new LSCs anticipate? What could more
experienced projects tell others about overcoming or circumventing these barriers? In answering
these questions, LSC conference participants had the chance to consider some of their own
decisions, and reflect on how they might have altered their design, based on what they have
learned. Our goal was to capture these discussions to benefit both current and future projects.
To help frame the Lessons Learned conference, HRI conducted interviews during the spring of
2000 with the PIs of 12 LSC initiatives whose NSF funding was just ending; nine of these were
K-8 science projects. We expected that, over time, these projects had learned some key lessons
about systemic reform in mathematics and science education. Interview questions about
professional development providers, professional development strategies, and sustaining reform
were designed to identify issues that an expanded LSC audience could explore further in
conference sessions.
Not surprisingly, the predominant lesson learned from both the conference and the interviews
was that there are no easy answers in designing and implementing systemic reform. Given
limited resources, LSCs must engage in a balancing act, deciding how, where, and when to build
on or forego particular reform strategies. These decisions, based on needs versus assets, result in
a cycle of tensions and trade-offs. This report highlights some of these dilemmas, and provides
LSCs and NSF with an opportunity to consider some design strategies that appear to be critical to
the success of systemic reform in mathematics and science education.
The substance of this report reflects the contributions of both PIs and evaluators who attended
the LSC conference.1 Their experiences enabled them to speak knowledgeably about systemic reform at the local level, and we present their advice as a set of preliminary, empirically-driven
"lessons learned." We would add, however, that while LSCs face similar challenges, they also
vary in design, context, targeted grade levels and subject, and experience with reform. All of
these factors are likely to influence the salient lessons learned from a particular project. In short,
we recommend that the reader view the content of this report as practical advice from reform-experienced
colleagues, and consider the pieces that make sense within the context of their own
LSC. It is also important to recognize that many of the conference participants represented
elementary science projects, reflecting the composition of the initial cohorts of LSCs; some of the
"lessons" may not be as applicable to mathematics projects or to those serving secondary
science teachers.
One final caveat: this report is not intended to be a comprehensive discussion of all of the
elements critical to local systemic reform. HRI intentionally focused on preparing and deploying
professional development providers, trade-offs in professional development design, involving
administrators, and ensuring sustainability, since these were the issues that most often emerged in
LSC evaluators' reports.
1 We did not distinguish between PIs and evaluators in recording their comments at the conference.