The ongoing debate about the value of Minnesota's Q Comp program for teachers comes down to a key question: Has the nearly decadelong effort to change the way educators are paid improved student learning?

Authors of the most recent study of the program say it has. They found a small but significant increase in reading test scores in Q Comp districts and schools compared with others that did not adopt the program. Yet others who reviewed the research believe the Q Comp impact was minimal and that other educational changes caused any improved student performance.

Trouble is, the conflicting analyses are not likely to be resolved. Though there has been little change in math scores, the reading improvements are real. But because each school or district used Q Comp funding differently, it's hard to tell exactly which elements made the most difference for students.

Q Comp, or Quality Compensation, was a signature program of former Gov. Tim Pawlenty intended as an alternative way to pay teachers. The idea was to tie teacher pay more closely to student performance and move districts away from traditional "steps-and-lanes" schedules that reward longevity alone. Districts and schools could voluntarily apply for additional state funding to pay teachers more for improving student achievement, taking on leadership roles, participating in evaluations and improvement plans, or working together to meet student achievement goals.

The program was approved by the Legislature in 2005, but pilot efforts began as early as 2001. So far, through a combination of state aid and property taxes, more than $450 million has been spent on the program. About one-fifth of the state's school districts and nearly half of the charter schools participate.

The most recent research was conducted by economics Profs. Aaron Sojourner and Elton Mykerezi of the University of Minnesota and Kristine West of St. Catherine University. Their study compared individual students' achievement growth on two tests before and after their districts adopted Q Comp, then compared that growth rate with that of students in districts not using Q Comp. They found that the reading gains were more than twice as large in districts that had been in the program for five years.

They also learned that most of the funding was not used for teacher pay raises. Linking pay improvements for individual teachers to student test scores was controversial among teachers, so the voluntary program had to be approved by both individual school district leaders and their teachers unions.

As a result, a 25-student classroom generates about $6,500 in Q Comp funding per teacher annually, but on average only about $2,200 of that goes to teachers, and just $233 is tied to student achievement gains. The remainder has been spent on district initiatives such as teacher evaluations or paying substitutes so that teachers can work together on lesson planning.

A 2009 Star Tribune analysis found that about 99 percent of the roughly 4,200 eligible teachers received merit raises under the program. That same year, a legislative auditor's assessment said there was no clear evidence that Q Comp improved student performance. That review concluded that it was difficult to "disentangle" the pay plan from several other school initiatives.

Four years later, there are demonstrable signs of reading improvement, but questions remain about exactly what prompted the progress.

It is clear, however, that Q Comp added needed funding to Minnesota school districts at a time when new money was hard to come by. And the districts and schools that received the money did see some reading improvement compared with schools that did not get that financial bump.

The recent research credited those additional funds with making a difference in the classroom. Further study is warranted to determine which practices adopted under Q Comp had the most impact on student learning.