Thursday, May 5, 2011

St. Cloud Board of Education Uses Achievement Data to Drive Continuous Improvement

This year, our board of education has been working to implement an important step in our "Key Work of School Boards" initiative to improve our governance. The Key Work model urges school boards to focus on data that describes student achievement. Understanding the budget is important, sure. But our primary responsibility is to create the conditions for maintaining and improving student achievement. The Key Work paradigm says that school boards must study and discuss data on student achievement:

Effective school boards are data savvy: they embrace and monitor data, even when the information is negative, and use it to drive continuous improvement. The Lighthouse I study showed that board members in high-achieving districts identified specific student needs through data, and justified decisions based on that data. Board members regularly sought such data and were not shy about discussing it, even if it was negative. By comparison, board members in low-achieving districts tended to greet data with a “blaming” perspective, describing teachers, students and families as major causes for low performance. In these districts, board members frequently discussed their decisions through anecdotes and personal experiences rather than by citing data. They left it to the superintendent to interpret the data and recommend solutions. (Quoted from “Eight Characteristics of Effective School Boards”)

In the 1950s and 1960s, school boards often focused most heavily on the financial performance of their districts. In the decades that followed, boards in many school districts followed a national trend of focusing instead on personnel decisions, especially the selection of their superintendent. Carried to extremes, these boards bought into the idea that student performance was the job of the superintendent, and that the board's job was merely to select a superintendent who would then take full responsibility for the professional task of managing achievement. By the year 2000, a national trend began to lead us towards the concept that effective school boards must focus most of their work on student achievement.

This trend towards examining the data mirrors a national trend in the use of data by teachers, principals, and curriculum and instructional leaders to improve results. See, for example, Improving Teaching and Learning with Data-Based Decisions: Asking the Right Questions and Acting on the Answers (Click Here).

This use of data in teaching and learning, and the use of data by school boards, is still in its infancy. Mastering student achievement data is easier said than done. For example, a statistic seemingly as simple as "graduation rate" is subject to wide differences of approach (Click Here). (See also Education Week's Graduation Rates ... Misleading.) Any attempt to master student achievement data requires significant study, tremendous patience, and an inquiring mind.

In our district, we've begun to implement this focus on student achievement by adopting a framework of measurable goals that are embodied in so-called "vision cards." The vision cards use a broad array of measurement tools, including proficiency measures, progress measures, and consumer satisfaction measures. One of the central measures that we use is derived from the highly respected NWEA MAP test, which allows us to measure growth at the individual student level, at the classroom level, at the school level, and across the district. You can find out more about how this nationally normed testing system works by clicking HERE. NWEA testing norms are based on a population of over one million students who take the tests. Parents who want to understand more about the NWEA testing system can download the "parent toolkit" at the main NWEA testing website.

I've posted about one important part of our data-based accountability system in the past. You can find those posts here: (Board Gets Accountability Results) and (More About NWEA Accountability).
What kind of data are we receiving from our testing system? Here are some examples:
  • Median test scores for each grade level in our school district in reading and math, before the school year begins and at the end of the school year.
  • The standard deviation of scores from these medians (a measure of the dispersion of scores above and below the median).
  • A comparison of the median scores at each of these grade levels to the national median scores for each grade level.
  • Median scores for students by demographic group, so that we can compare those scores to district and national averages.
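For readers who want a concrete picture of what these summary statistics involve, here is a minimal sketch in Python. The scores and the "national median" below are made-up placeholder values, not actual district or NWEA data; the point is only to show how a median, a standard deviation, and a comparison to a national norm are computed for one grade level.

```python
# Illustrative only: placeholder scores for one hypothetical grade level.
# Real analysis would use the district's actual NWEA MAP results.
from statistics import median, stdev

fall_scores = [188, 192, 195, 197, 199, 201, 203, 206, 210, 214]
spring_scores = [s + 8 for s in fall_scores]  # placeholder year-end scores

national_median = 200  # placeholder national norm for this grade level

fall_median = median(fall_scores)
spring_median = median(spring_scores)

print(f"Fall median:            {fall_median}")
print(f"Spring median:          {spring_median}")
print(f"Dispersion (std. dev.): {stdev(fall_scores):.1f}")
print(f"Gap vs. national median: {fall_median - national_median:+.1f}")
```

Even this toy example hints at why the work is demanding: a district at the national median overall can still hide wide dispersion, which is why we look at the standard deviation and at demographic subgroups, not just the median itself.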
As we study our district's data and compare it to national norms, we are finding that understanding data is demanding and requires careful and thoughtful study, as I've said. In future posts, I'm going to write more about the challenges we face in using student achievement data at the school board level.
