The Minneapolis initiative needs time to get established. Schools don’t improve when veering from one idea to another.
The June 7 article “ ‘Focus’ is failing to deliver” presented early impressions of a two-year-old effort by the Minneapolis public schools to improve educational outcomes by more tightly determining and sequencing the content of instruction. The article suggested that “focused instruction” is associated with lower academic achievement. Unfortunately, those claims rested on rather sketchy analysis, with no clear, unambiguous standard for judging this initiative as the first step in a longer, careful and ultimately successful reform effort.
Educational reform is often marked by short-term efforts that are thrown out when administrations change, or when efforts fail to demonstrate dramatic changes almost immediately. This is a shortsighted way to change our schools, and one that I hope Minneapolis schools will reject. A systematic approach to program improvement wouldn’t ask, two years after implementation, whether focused instruction “is working.” The answer is too likely to be “no,” and the risk is that we throw out a very promising, even effective, effort because we asked the wrong question.
Focused instruction addresses one component of effective education: it controls the content and sequence of what teachers teach. Many educators would suggest that other factors, particularly improved instructional quality and increased instructional dosage, are also important. Even so, a strong argument can be made that instructional sequencing must come before other factors that will drive improvement, and that the district needs to start somewhere.
Perhaps we, the citizens of Minneapolis, can ask the school district to show us the importance of focused instruction using an approach like that now used by other initiatives taking on tough, seemingly intractable problems with multipronged interventions. This approach focuses both on the challenges of implementation and on systematic, data-based efforts to improve outcomes over time.
We have useful models for this more thoughtful approach right here in Minneapolis, in the work of Results Minneapolis, the Northside Achievement Zone and Generation Next. These initiatives (and others) are moving toward “results-based accountability” to select, implement and then continuously improve their work, recognizing that complex problems often require complex solutions developed systematically, and sometimes incrementally, over time.
As different as these efforts may seem, they are similar in their careful blending of creative, deep analysis and thoughtful program design. In particular, each asks three key questions on the way to a comprehensive analysis.
The first question — “how much?” — helps determine that we are doing things differently. In focused instruction, we would want to know the extent to which selected teachers are using the program’s elements; if they are not, efforts would be directed to helping them do so.
As the intervention moves into all intended sites, attention turns to “how well?” — are teachers following all specifications of the intervention, or do they need additional support and coaching to produce high-fidelity implementation?
Finally, once a thoughtful but complex intervention is in place and is working as intended, then we can ask “how is it working?” — is academic achievement improving?
Early indicators of success (or need for improvement) are critical in large-scale efforts like focused instruction. School district leaders should find instances where implementation is going well, and should examine early indicators of improved achievement in these settings; they have this capacity with existing measures of reading and mathematics. But school leaders and citizens need to be willing to stay the course — to get a thoughtful effort fully implemented, to size up its successes and needs for improvement, and to commit to systematically identifying and implementing these improvements as rapidly as possible.
Flipping from one intervention idea to another has failed to produce anything close to acceptable academic achievement in American schools. This legacy should teach us two critical lessons: Silver bullets simply don’t exist, and true improvement rarely comes quickly. Maybe focused instruction is not yet as good as it can be, and maybe attention needs to be paid to other factors like instructional quality and dosage. But let’s ask Minneapolis schools to show us their efforts to complete a careful, thoughtful and ever-improving effort of program development that helps us find the “right” combination needed for our students.
Scott McConnell is a professor of educational psychology at the University of Minnesota.
The Opinion section is produced by the Editorial Department to foster discussion about key issues. The Editorial Board represents the institutional voice of the Star Tribune and operates independently of the newsroom.