Saturday, June 26, 2010

More about our NWEA Accountability System

In my last post, I wrote about our school district's use of NWEA testing results as a new system of accountability and instructional improvement. Because this new system plays a central role in the way in which the Board of Education intends to fulfill its responsibility to monitor educational results, I believe it is really important that the community understand how this new system works.

Teachers, principals, parents, the superintendent's executive team and the Board of Education are now all working from the same NWEA MAP test, which allows us to measure growth at the individual student level, at the classroom level, at the school level and across the district. You can find out more about how this nationally normed testing system works by clicking HERE. NWEA testing norms are based on a population of over one million students who take the tests. Parents who want to understand more about the NWEA testing system can download the "parent toolkit" at the main NWEA testing website. MAP tests are given three times a year, and the data are immediately provided to the teacher. Fall testing tells the teacher who is behind and who is ahead. Winter testing tells whether current strategies are working. (If they are not, remember, it can be the teaching strategy, the student's effort, the parent's support, the curriculum, or all of the above that needs to change.) Spring testing then tells us where the student is at the end of the year, and each fall, the previous year's data are provided to the teacher.
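
To make that fall/winter/spring rhythm concrete, here is a minimal sketch, in Python, of the kind of midyear check a teacher might make. The scores and the expected-growth figure are invented for illustration; they are not actual NWEA norms, and the district's reports come from NWEA's own tools rather than from code like this.

```python
# Hypothetical midyear check: has the student made roughly half of the
# expected annual growth by the winter test? All numbers are made up.

def on_track_at_midyear(fall_rit, winter_rit, expected_annual_growth):
    """Return True if growth so far is at least half the expected annual growth."""
    growth_so_far = winter_rit - fall_rit
    return growth_so_far >= expected_annual_growth / 2

# Example: a student starts at 195 in the fall and scores 199 in the winter;
# suppose typical students at this grade grow about 10 RIT points in a year.
print(on_track_at_midyear(195, 199, 10))  # False: 4 points is less than half of 10
```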

The best way that I can help explain how the NWEA testing results serve as an accountability and instructional tool is to show you the key to the testing results that we see for schools and grade levels. There is a detailed guide to the reporting information online, if you want more information: click HERE.

(Click on the image to get a better view)

Let's say that we want to look at this year's results for mathematics at Madison Elementary School, for example. Students are going to be assigned to four categories. In brown, we see students in the G-P- group. These are students who tested below projected proficiency and whose educational growth was also below typical (median) growth. Some of these students may have major obstacles to learning. They may be recent immigrants who speak no English. They may have disabilities that significantly inhibit their learning. They may be students who could learn much more, much faster, but the teacher and school haven't figured out how to reach them. They may lack needed parental support. Or they may face a combination of all of these factors. School may not be working for them. Or possibly, a different teacher would do a better job of teaching them.

The color code doesn't answer the question why. It's a beginning point in asking what is happening with this group of students in this particular classroom and school. In yellow, we see the students who are G+P-. These are students who started the year behind the target proficiency for their grade level. They need to catch up--to grow more than a year in a year's time--or they will never reach proficiency. When most of us were in school, kids who were behind usually stayed behind, or got even further behind, because being behind is always a handicap that makes it harder and harder to progress. So these G+P- kids are success stories; they are beating the odds that would otherwise cause them to get further and further behind.

Coded in green are the G+P+ students. These are students who ended the year above expected proficiency and who grew faster than an average year's growth. Good schools don't ignore these kids just because they are ahead of average proficiency. Their goal is to challenge everyone, and so the G+P+ percentage is a very important success measure. Coded in orange are the G-P+ kids. These are kids who are ahead of necessary proficiency targets, but they have not continued to get further ahead. They haven't grown as much as typical growth, but they are doing fine in terms of proficiency. Keep in mind, always, that "proficiency" is an artificial cut score level. No Child Left Behind likes to pretend that the same score should be the goal for everyone. But that is, well, just plain baloney. Different students have different intellectual strengths, different emotional obstacles, different personal goals, different interests. The idea that everyone should wind up at the same point is just plain silly.
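
For readers who like to see the logic spelled out, here is a small, purely illustrative sketch of how the four color codes follow from two yes/no questions: did the student meet typical growth, and did the student meet projected proficiency? The function and labels are my own shorthand, not NWEA's.

```python
# Illustrative mapping of growth/proficiency results to the color codes above.

def category(met_growth_target, met_proficiency_target):
    """Map a student's two results to the color-coded group."""
    if met_growth_target and met_proficiency_target:
        return "green (G+P+): proficient and growing faster than typical"
    if met_growth_target and not met_proficiency_target:
        return "yellow (G+P-): below proficiency but catching up"
    if not met_growth_target and met_proficiency_target:
        return "orange (G-P+): proficient but growing slower than typical"
    return "brown (G-P-): below proficiency and below typical growth"

print(category(met_growth_target=True, met_proficiency_target=False))
# yellow (G+P-): below proficiency but catching up
```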

So now let's look at how the color coding helps us illustrate performance in the school district and in a particular school. Here are the reading results for District 742 (Grades K-9) and for Madison Elementary. We have a similar chart for every school.

(Click on the image to get a better view)
The chart tells us that just about 73 percent of Madison students exceeded the median growth for elementary students, based on national norms. In an average school, we would expect that 50% of students would exceed the expected growth (because expected growth, by definition, is the median growth). For the entire school district, 60 percent of students grew in reading faster than the expected growth rate. Keep in mind that not every student is in the system. We aren't doing the testing yet for grades 10-12 (this is a cost issue). Also, in order to have a growth measure, a student has to be with us at the beginning of the year and the end of the year, and so a bunch of students aren't measured. The information also tells us that at Madison, 69 percent of students reached projected proficiency performance, and that 68.1 percent in the district reached that level.
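
If it helps to see the arithmetic, here is a tiny hypothetical example of the two summary figures discussed above. The student counts are made up; only the idea (the percentage of measured students exceeding median growth, and the percentage reaching projected proficiency) comes from the report.

```python
# Hypothetical counts, invented only to show how the summary percentages work.

def percent(count, total):
    return 100.0 * count / total

measured_students = 400          # students tested at both the start and end of the year
exceeded_median_growth = 292     # made-up count
reached_proficiency = 276        # made-up count

print(f"{percent(exceeded_median_growth, measured_students):.0f}% exceeded median growth "
      "(50% would be expected, since typical growth is defined as the median)")
print(f"{percent(reached_proficiency, measured_students):.0f}% reached projected proficiency")
```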

Now the Board of Education has adopted long-term targets that seek to put more students in the green category--growing more and proficient. Teachers and principals across the district are working in teams to develop techniques to make that happen. Throughout the year, teachers and principals are monitoring their progress towards individual school and classroom goals and assessing whether the techniques that they are using are successful. Parents are receiving individual student reports that contain the same information--is the student at the target proficiency level for his or her grade in math and reading? Is the student progressing at the median rate, or higher or lower? Teachers, parents, principals, the superintendent, and the public will all be receiving similar information.

As we do this, some classes, or some grades, or some schools are not going to do as well as others. The purpose of this data is not to humiliate anyone: it is part of a continuous progress model, which says we cannot improve if we don't look at what we are doing. My teaching experience may be ancient, but it tells me that there are all sorts of reasons why a particular class doesn't do as well as we would like. Sometimes you get a great class of super-stars, kids who really want to learn, and who work really hard for you. Sometimes you get the class from you-know-where, and it seems as though there is almost nothing you do that works. The goal here is to catch students and classes right away, and intervene immediately to use the best possible techniques to promote success.

We are going to see some schools that seem to need improvement. We just have to remember that the first step to improvement is to see where improvement is needed. It's going to take several years to see if this new accountability system works--works for students, for teachers, for schools and for the district. But one thing is for sure: we now have hard data that allows us to scrutinize where we need to do more work.

Friday, June 25, 2010

Board gets scorecard accountability results

At last night's Board meeting the Board of Education approved the contract for Bruce Watkins and he joined the Board for his first meeting with us. At the meeting, we also received results of our new educational accountability system, which is based on learning scorecards (also called vision cards). The information that we received is a much more powerful and meaningful report on student achievement across the district by grade level in math and reading than a Board of Education has ever received in our school district. We looked at results for grades K-9 for the entire district and we looked at the results by school.

The test results say that in the majority of grades, in both math and reading, students are making above average progress as compared to students in other schools. The reports also showed us a few grades where students are not making acceptable progress as compared to other schools across the nation. District and school leadership, and teachers as well, will be using this information to fix what needs to be fixed and to reinforce what is working. The testing results are shining a light on student progress in a way that we've never seen before, and they are going to give us the information that we need to identify exactly where we are doing well, and where we need to make significant improvements.

Our accountability system is based on the highly respected NWEA MAP test, which allows us to measure growth at the individual student level, at the classroom level, at the school level and across the district. You can find out more about how this nationally normed testing system works by clicking HERE. NWEA testing norms are based on a population of over one million students who take the tests. Parents who want to understand more about the NWEA testing system can download the "parent toolkit" at the main NWEA testing website.

A student's reading and math scores on a MAP test are reported as RIT scores. The RIT score is grade independent; it is a scale that you might compare to a yardstick. Imagine that you have posted the yardstick on your wall and as your child grows, you measure growth in inches. Every inch of growth represents the same amount of growth, no matter where on the yardstick it falls. In the same way, growth can be measured against the RIT scale. When a parent receives the math and reading RIT scores, they come with information on (a) the child's score at the beginning of the year and at the end of the school year, (b) RIT scores from last year (when available), and (c) the typical RIT scores for children in the same grade, and all other grades, so that the parent can see where the student is located in both math and reading. The parent can also compare their student's growth during the school year to the average growth of other students.
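
Here is the yardstick idea in miniature. The RIT values and the typical-growth figure below are hypothetical, chosen only to show how a parent can read growth off the scale; real norms differ by grade and subject.

```python
# Hypothetical RIT scores: growth is simply the difference between two scores
# on the same scale, which can then be compared with typical growth.

fall_rit = 201
spring_rit = 210
typical_annual_growth = 7   # made-up norm for this grade and subject

growth = spring_rit - fall_rit
print(f"Growth this year: {growth} RIT points")
print(f"Typical growth:   {typical_annual_growth} RIT points")
print("Above typical growth" if growth > typical_annual_growth else "At or below typical growth")
```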

This system provides accountability tools to the district leadership, but it also provides superb reporting to parents. Students now take the MAP test in math and reading three times a year. The teacher and the parent get information on where each student is in the fall, in the winter, and in the spring at the end of the year. No longer does the parent hear vague generalities about student progress.

The reports that we received last night provided us with school-by-school information on how much progress students are making. The statistics show us the percentage of students in each of four categories: (1) the percentage whose annual growth was below the typical rate of growth for similar students and whose scores were below projected proficiency. These are students who started the school year behind, and who didn't make enough progress during the school year as compared to other students. (2) The percentage of students whose annual growth was greater than average, but who are still behind average proficiency. These are students who came to school behind, but made more than a year's progress. They are still behind, but they are on the road to catching up. (3) The percentage of students who perform above projected proficiency, and who also had more than a year's growth. These are kids who are doing very well, and are actually getting further ahead. NWEA monitors this, because of course we want to make sure that all students are growing and all students are challenged, even the students who are already doing well. (4) The percentage of students who perform at above average proficiency, but who didn't grow as much as they were expected to grow.
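
As a rough illustration of how those four percentages are produced, here is a short sketch that tallies a made-up handful of students into the four categories. The data are invented; the actual reports come from NWEA, not from anything like this.

```python
# Illustrative tally of the four categories, using made-up students. Each
# student is a pair of booleans: (exceeded typical growth, reached proficiency).
from collections import Counter

students = [
    (False, False),  # category 1: behind, and not catching up
    (True,  False),  # category 2: behind, but growing faster than typical
    (True,  True),   # category 3: proficient and still growing faster than typical
    (True,  True),
    (False, True),   # category 4: proficient, but growth below typical
    (True,  True),
]

def category(growth_ok, proficient):
    if not growth_ok and not proficient: return 1
    if growth_ok and not proficient:     return 2
    if growth_ok and proficient:         return 3
    return 4

counts = Counter(category(g, p) for g, p in students)
for cat in (1, 2, 3, 4):
    share = 100.0 * counts[cat] / len(students)
    print(f"Category {cat}: {share:.0f}% of students")
```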

It might be helpful, in understanding the kind of data that we are getting, to look at the results of one school, Madison. The results showed that 12 percent of the students at Madison fell into category 1--students who are behind projected proficiency and who did not make average progress. Fourteen percent of Madison students fell into category 2; that is, they were behind required proficiency but made more than one year's gains during the year. In other words, they were on the road to catching up. Fully 53 percent of students at Madison were in category 3. These are students who scored above expected proficiency and also displayed above average educational growth. The remaining students, 15 percent, performed at above average proficiency, but failed to make the expected amount of growth. These are students who are doing "well enough" but who need to make more progress to maintain their proficiency over the years.

We received similar data for every school in the district. As I said, in most grades, we saw that students are making above average progress as compared to students across the country, but in a few grades they are not. Using this data, district leaders, principals, and teachers can now work together to compare the progress that they are making, grade by grade, classroom by classroom, student by student, and that's exactly what they are doing. They get this information at the beginning of the year and in the middle. They don't have to wait until the end of the year to discover that there are kids who are falling behind. They get just-in-time data that allows teachers and school leadership to take corrective action.

Finally, this tool gives us the ability to compare what is happening in each school throughout the district, and in each grade from year to year. The idea is that these first results give us a baseline to work from as we attack our problems and try to improve next year.

Wednesday, June 23, 2010

Listening to Parents: Responding to their Concerns

Yesterday evening, I attended a "listening session" for local elected officials sponsored by Mayor Kleis as part of St. Cloud's listening week. Only a few citizens attended, but we listened to them, and the seven or so elected officials answered questions and gave their opinions. One of the citizens present asked if we were disappointed at how few citizens come out to participate in sessions like this. It is true that in recent years, fewer and fewer citizens, and especially parents with children, seem to have time to become involved in community dialog events, or to participate in PTA or other similar activities.

And yet, it's really important for those of us who are serving that we have opportunities to engage the people that we serve. When we hear from citizens, especially parents with children in our schools, we gain a valuable perspective. So we need to work really hard on providing listening and engagement opportunities.

In March this year, the District sent out a parent/guardian "climate survey" that asked all parents to provide input on a number of questions. We sent out nearly 7,000 surveys. Of course, we have fewer parents than students, because some of our families have multiple children in the same school. About twenty-three percent of the surveys were returned--that is, 1,569 surveys. You can never be sure whether the people who return surveys are parents who are more satisfied, or less satisfied, or fairly representative of the total population of parents. Parents who return surveys tend to be more engaged in their children's education. A smaller percentage of minority families respond than white families. Over 80 percent of the surveys are returned by moms. (What a surprise.)

Responders overwhelmingly (at or above the 90% level) said that they feel that their child is safe at school, that when they contact their school, they receive a receptive and helpful response, that their child is safe going to and from school, that they know how their child is progressing at school, and that teachers show respect for their child. In the eighty to ninety percent range, responders told us that they feel that their schools are doing a good job of preparing children for their future, that their school is performing well academically, and that they would recommend their school to other parents. To tell you the truth, as a parent who was always pretty demanding of my children's schools, I think that this is a pretty good result. I would expect some parents in any school system to feel that their schools should do more to challenge their children. And, in any school, there are going to be children who aren't doing well, and some of their parents are going to be disappointed. It's possible also that the parents of kids who aren't doing well are more likely not to complete a survey in the first place.

These surveys are only one window on reality. But they provide us with valuable information. One thing in the survey really stands out for me. In almost every area of the survey, the vast majority of responders, usually in the 80 plus or 90 plus percent range, responded positively about the quality of their own school and their children's experience. Parents are mostly satisfied with their child's bus service (which is pretty remarkable considering the challenges inherent in providing quality services to all students). Most parents feel that their schools provide adequate technology.

But 54 percent of respondents answered yes to the question "I would like more support in Math to help my child's learning at home," and 45 percent of respondents answered yes to the question "I would like more support in Reading to help my child's learning at home." Now when you get a strong response like that, it tells you that you had better do something to address those requests.

We have people telling us lots of positive things about their children's experience. The fact that they are also willing to raise concerns in these specific areas reassures us that they will tell us when they aren't satisfied, or when they expect more from us than we are providing. And it tells us, too, that these are not complainers. They are positive when they feel positive, but they tell us when they need more from their schools. We have about half of parents telling us clearly that they would like more support in Math and Reading.

These parents believe, as do I, that education is a partnership that involves parental support in the home for math and reading. Half of our respondents are telling us that they would really like to do their job at home better, but they need help. When you do a survey like this, by golly, it's really important that you do something to address concerns. As a board member, it's not my place to figure out the best way to address these concerns. I don't know if these are high school parents, elementary parents, or all of the above. I'm not sure what specifically these parents are asking for in the way of support, and it may be that it differs from parent to parent. But it is my job as a Board member to make sure that our leadership is following up on these concerns. I want to know that next year, when we do these surveys, parents won't toss them in the wastebasket because we failed to respond to their concerns. The best way that you can encourage more parents to become involved and more parents to send back their surveys is to make those surveys count, by addressing the concerns that are raised.

Friday, June 11, 2010

Self Insurance Realizes Dental Plan Savings

This year's budget includes a new account for the District's newly established self-insured dental plan. The first year's results suggest that establishment of the self-insured plan, which replaces a plan underwritten by a major dental insurance carrier, will save somewhere between $100,000 and $200,000 per year. This year's savings will be placed in the self-insurance reserve. Actuaries recommend that when you self-insure, you should maintain sufficient reserves to protect yourself against fluctuations in usage. Once we have adequate reserves, the savings would most likely be used to reduce the premium cost, resulting in savings to the district and to employees.

Replacement of our old plan with a self-insured plan resulted from an outside study that looked at our insurance operations. The savings realized from this study, and implementation of its recommendations, paid for the cost of the study many times over. This is an example of the benefits of the District's adoption in 2005 of a practice of regularly studying how we do business and where we can improve. Other studies have also led to major improvements that have resulted in cost controls or other savings. The cost of outside studies is booked to the school board's budget, because they are part of the public accountability function. We select study topics in consultation with the superintendent and the executive staff. Sometimes the study topic is selected because there is a concern that a department, or part of our operations, seems to need improvement. Sometimes the study topic is selected because we receive expressions of concern that seem to warrant review with an independent eye. Sometimes the consultant comes back and tells us, this is working really well. Keep up the good work. Usually, we get good advice on how we can do things better.

Another reason for using independent expertise in this fashion is that sometimes we feel that we have to defend doing things just as they have been done in the past. Using outside expertise from time to time is a great way to develop support for making needed change. Also, it avoids having board members try to micromanage operations and operational change. Instead of saying that a constituent is complaining about bus service, or insurance costs, for example, we can work with the Superintendent to identify areas that would benefit from review. Instead of implementing a board member's individual idea of how things ought to be changed, based on a brainstorm that he got some evening over a beer, we foster a cycle of continuous improvement, using outside expertise and the good ideas of our employees.

The independent review concept was a pledge that was made to the voters back in 2003 when the previous operating referendum was passed. The Board promised regular outside reviews of our operations to assure that we were implementing efficiencies. The cost of these reviews has been relatively small, and the savings and efficiencies implemented as a result have paid for the cost many times over.

Wednesday, June 9, 2010

School District 2010-2011 Budget

The Board of Education looks at a proposed 2010-2011 budget tonight. The Board Finance Committee spent two hours reviewing a draft earlier this week. Many of the budget decisions, of course, have already been made. We conducted a lengthy process during the fall of last year designed to bring forward cuts. Staffing decisions were made in April. But it's an unfortunate reality that Boards of Education don't have an accurate picture of their revenues until too late in the year, and this year was a particularly challenging one, because of the State's financial crisis.

As we look at the budget, I thought it might be useful to begin a discussion of how school board members use the budget as one of a number of accountability instruments to monitor how the school district is being operated. It's going to take a number of posts to wade through these topics. Our review of the budget is only one of the important accountability reviews that the board of education has at its command in monitoring the district. Over the next couple of weeks, if I have time, I'll try to post on the other monitoring devices as well.

Change in Budget Format to Enhance its Use as an Accountability Tool

One of the things that the board of education has done in the last five years is to change the structure of the budget document to make it a better accountability tool. When I joined the board of education, it was almost impossible to use the budget to understand trends in expenditures or to use it as an accountability tool. The budget was all trees and no forest. You could find a line item for paper in a particular school, but you couldn't find out how major budgetary items had changed over the years. National governmental finance experts urge elected officials to convert their budgetary format so that they can see the big picture--to monitor the things that are really important, and to use the budgetary document as a planning tool. Among the features added were:
  • Seven years of budget history comparison (5 past years, current year, and projected following year). This allows the board, district leaders, and the public to look at budgetary trends over the years. This makes it far easier to understand what is happening to expenses and revenues and to put this year's budget in context. Another reason why the long term budgetary trend analysis is important is that some years have unique accounting aberrations that make them difficult to analyze without looking at previous years and future years. That's because we are on a "modified accrual" accounting basis mandated by the State of Minnesota Department of Education which doesn't always allow us to book revenues and expenses to the appropriate year. Looking at 7 years of budget numbers puts these aberrations in context, and helps us understand true budgetary trends. (For example, last year was an unusual year, because of stimulus funding, and in many budget categories, we make better comparisons between this coming year and the year before last.)
  • Special Education Financial History Chart. Our budgetary document now includes a seven year tracking history of special education expenses and revenues. We began including this tracking device after completing an outside financial study of special education. We wanted to be able to monitor the state's reimbursement practices as well as the important special education cross subsidy that has dogged our district and many other districts across the state. We use this chart also to provide information to our legislators so that they can understand the scope of the special education financial crisis that we and other districts face. This year's special education budget trends chart raises alarms. Total revenues supplied by state and federal government for special education are being reduced by 2.3 percent! (The state doesn't allow us to reduce expenditures, so when it reduces revenues, we are automatically in the hole.) While the federal government has increased our special education revenue support by about $200,000, the state government has reduced its special education support by $750,000. In other words, the state government is sucking up the federal increases and using that money for other purposes. This "supplanting" should, at least in spirit, be unlawful, because the state government is not supposed to be taking away federal increases. But the reductions, combined with pressure on services caused by an increasing number of students with disabilities, create one of our most pressing budgetary concerns this year.
  • Instructional Staff Pupil Ratio (ISPR). This year, the board is adding a chart that tracks our instructional staff pupil ratio. That is a measure of the number of students in the district divided by the number of classroom teachers. We decided to add this tracking device because, when the district passed the operating referendums in 2003 and 2008, it undertook to maintain the ratio of classroom teachers as compared to the number of students. This ratio focuses on classroom teachers only. That is, it does not count special education teachers, counselors, nurses and other licensed staff who don't lead a classroom. A six year history of this ratio shows that in 2003, when the public passed the operating referendum, the instructional staff pupil ratio was 27 students per classroom teacher. With passage of the operating referendum, the ratio dropped to 25 students per classroom teacher, and it fell to 23 in 2008-2009. Another reason we monitor this ratio is that most parents tell us maintaining the classroom teaching staff is really important to them. (A simple sketch of how these staffing and cost ratios are computed appears just after this list.)
  • Total Licensed Staff Pupil Ratio (TLSPR). This is a ratio that is more commonly reported by school districts, and so it is easier to compare ourselves to other districts with this ratio. This year, the Board of Education added a chart which tracks the ratio of all licensed staff to the number of pupils. This includes all licensed staff, as I have said, so it is not a true measure of classroom teachers versus students. We also directed that the budget compare this ratio to a number of other similarly sized districts (such as Bloomington, Duluth, Moundsview, and North St. Paul). The comparison shows that our ratio has risen from 13:1 to 15:1, but that it remains the lowest among the comparison districts, which have TLSPRs ranging from 15:1 (like ours) up to 20:1, with the median ratio being about 17:1.
  • Administration Comparisons. This year, we also added a three year tracking of administrative cost per student, and students per administrator, with a comparison over this time period to other school districts. This comparison requires a bit of care, because we cannot be absolutely sure that every district codes exactly the same information to the same categories. The comparison shows that reported administrator cost per student ranges from $302 per student to $542 per student, with our district near the bottom of the range, at $365 per student. The number of students per administrator reported ranges from a low of 215 in Duluth to a high of 410 in Eden Prairie, with the District at 309. Since this is a new comparison statistic, we'll be spending time this year looking at these statistics to see whether they lead us to make adjustments in what we do. (For example, when we first looked at staffing ratios in special education, back in 2005-2006, it led us to institute a variety of reforms that we felt would assist in bringing special education caseloads and staffing into line with other similar districts.) In short, we have found that adding comparison information to our budget has assisted the board in its accounting and monitoring functions. Sometimes the information that we get confirms that we are on the right track. Other times it leads us to ask hard questions and results in needed reforms.
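
Here is the simple sketch promised above of how these ratios are computed. Every figure in it (enrollment, staff counts, administrative cost) is hypothetical, chosen only so the arithmetic lands near the ratios mentioned in the bullets; it is not district data.

```python
# Back-of-the-envelope staffing and cost ratios. All inputs are hypothetical.

enrollment = 9800                    # made-up student count
classroom_teachers = 426             # made-up count of classroom teachers only
all_licensed_staff = 653             # made-up count incl. counselors, nurses, etc.
admin_cost_total = 3_577_000         # made-up total administrative cost, dollars
administrators = 32                  # made-up count of administrators

ispr = enrollment / classroom_teachers        # Instructional Staff Pupil Ratio
tlspr = enrollment / all_licensed_staff       # Total Licensed Staff Pupil Ratio
admin_cost_per_student = admin_cost_total / enrollment
students_per_administrator = enrollment / administrators

print(f"ISPR:  {ispr:.0f} students per classroom teacher")
print(f"TLSPR: {tlspr:.0f} students per licensed staff member")
print(f"Administrative cost per student: ${admin_cost_per_student:.0f}")
print(f"Students per administrator: {students_per_administrator:.0f}")
```
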
We look at other comparisons as well, when we try to understand our budgetary trends. Later in the summer, we will get comparisons between our district and so-called regional center districts that have many characteristics similar to ours: Rochester, Mankato, Willmar and Duluth would be examples of such districts. In addition, we track comparisons between our district and districts close to us. Putting all of these statistics together gives us several different windows on our financial indicators.

That's enough to digest for today's post. In my next post, I'll talk some more about different ways in which our board and superintendent monitor district accountability indicators. I want to end with this point. We're trying to be a continuous progress district. The idea behind continuous progress is that we make information about how we are doing transparent, whether it is flattering or unflattering. The beginning of solving problems is making them transparent. Financial indicators are only one window on accountability, and this post has only discussed some of the financial indicators that we monitor. So there's more to talk about in future posts.

Saturday, June 5, 2010

Does performance pay work in public education?

This week's online edition of Education Week contains an article reporting on a study of the effectiveness of the performance pay program implemented for teachers in Chicago by Arne Duncan, then Chicago Superintendent of Schools and now Secretary of Education. The article reports on results of a comparison of schools that adopted Duncan's performance pay plans with other schools that did not:

Preliminary results from a Chicago program containing performance-based compensation for teachers show no evidence that it has boosted student achievement on math and reading tests, compared with a group of similar, nonparticipating schools, an analysis released today concludes.

The study is important because the performance pay plan adopted in these Chicago schools is the foundation for national policy and for the Race to the Top. In fact, there is a series of studies, a few finding a weak improvement in some grades from performance pay, and others finding no improvement.

This research, and other research like it, challenges us to ask whether leaders in St. Paul and Washington D.C. are pushing performance pay because it is proven to work, or whether they are pushing it because it answers a call for doing something that appears to be radically different in education. It seems to me that if we are going to be faithful to the central goal of graduating children to excellence, we must always be guided by doing what is proven to work. As we think about this, we need to be clear that performance pay is one of the available reforms in the organization of teaching. One needs to keep in mind that performance pay is the most controversial, and probably the least effective, of the fundamental reforms in the organization of schools and of the teaching profession. So this post is not designed to argue that we don't need to reform public schools: I'm suggesting rather that reform for the sake of reform is a mistake; we need to do what works.

We know that employee recognition works, of course. One of the leading employee recognition experts, Dr. Bob Nelson, writes in the business context:

Recognition represents the single most validated principle for driving desired behavior and performance in today’s work environments. Compared to the average company, employees in a recognition-focused company are 5 times more likely to feel valued, 11 times more likely to feel completely satisfied, 7 times more likely to stay, and 6 times more likely to invest in the company.

Recognition works--indeed, it is essential--but the central problem is whether recognition by small performance pay bonuses is the kind of recognition that works in public education, and the evidence to date doesn't seem to support the idea that you can improve educational performance by implementing pay for testing results. What is often forgotten in the performance pay debate is that performance pay isn't all that common, or all that productive, even in the world of business. Nelson writes:

Most managers [wrongly] think money is the top motivator. What employees really want is to be valued for a job well done by those they hold in high esteem. As Mary Kay Ash, founder of Mary Kay Cosmetics, says, "Imagine that every person is wearing a sign around their neck that says, 'Make me feel important.'" Sure, compensation is important, but most employees consider it a right, an exchange for the work they do. As management consultant Rosabeth Moss Kanter puts it, "Compensation is a right; recognition is a gift."


Results of a recent survey by the Council of Communication Management also confirm that recognition of good performance is the top motivator of employee performance. But how many managers consider "appreciating others" to be a major function of their job today? Not many. This is true even though one-third of managers report that they themselves would rather work in an organization where they could receive better recognition. People want to feel they are making a contribution at work, and for most individuals, this is a function of having the respect of peers and colleagues, having managers who tell them when they do a good job, and being involved and informed about what's going on in their department or organization.


Nelson points out that the key to motivation is to focus on what motivates the employees who are actually doing the work. "What motivates others is often different from what motivates you," Nelson tells managers:

In the late 1940s, Lawrence Lindahl performed classic studies about what workers want from their jobs, and those studies were repeated with similar results in the early 1980s and 1990s. Managers identified good wages, job security, and promotion or growth opportunities as the primary reasons they believed their employees worked. Employees, on the other hand, reported intangibles such as appreciation for work done, feeling "in" on things, and empathetic managers as their most desired job attributes. When employees and supervisors ranked a list of motivators from one to 10 in order of their importance to workers, workers rated "appreciation for a job well done" as their top motivator; supervisors ranked it eighth. Employees ranked "feeling in on things" as being number two in importance; their managers ranked it last at number 10.

To have a motivating work environment, this perception gap between managers and employees must be closed. Managers must be sure to reward the behavior they desire with recognition that is valued and meaningful to their employees — not just themselves.

The problem with the performance pay movement is that politicians and pundits are looking for a magic bullet to close the achievement gap, and there isn't any. Not performance pay. Not special licensing exemptions. Not all day kindergarten. Not any one thing. Doing well in school requires lots of work, lots of persistence. It isn't an overnight deal. Transforming the teaching profession, or turning schools upside down, or creating charter schools, or vouchers, or getting rid of tenure, or any of your favorite solutions--none of these is going to do the job unless it triggers the countless hours of language development, reading, homework, and individual student determination that is the key to success.

Let me be clear again. This is not an apology for keeping things the same in public education. The delivery of public education requires significant restructuring if we are going to meet the goal of graduating all of our students to excellence. We need reform in the way that universities prepare teachers. We need reforms in the structure of the teaching profession, so that experienced teachers and teachers who regularly demonstrate success can become leaders of teaching within their schools and school district, and those changes require compensation reform. But if we are going to make significant progress, we need to be more sophisticated in selecting strategies that have yielded proven results. There are, in fact, a number of strategies that involve significant restructuring in the way that schools deliver instruction. Some of those call on us to reorganize the way in which schools are led, the way in which teachers are supervised, and the way in which they deliver instruction. Compensation reform may well be a significant part of those reforms, but if compensation reform is going to be effective, it needs to be mindful of the research: if motivation is the goal, then we must motivate in ways that are valued and meaningful to the employees who are being motivated.
