The Education Reform Act of 1993 was a complex piece of legislation, but it rests on four principal components:
- High academic standards for K-12 schools;
- Accountability through the MCAS test and a state office that performs audits on schools and districts;
- Improved teacher quality through rigorous testing of teachers’ mastery of the content in the state’s academic standards; and
- Expanded public school choices for parents through charter schools.
The subsequent history of education reform in Massachusetts has been an ebb and flow in the implementation of these elements. It took until 1996 for the state to truly embark on any of the first three reforms listed above, and moving them ahead required years of public debate. After 2001, charter expansion slowed to a trickle until the 2010 education law doubled the number of charters. After 2007, our academic standards were first injected with greater emphasis on “soft skills” rather than academic content and then swapped for lower-quality national standards; and the state’s school auditing office has been all but shuttered.
Two things have been constant throughout the history of this two-decade-old reform effort. First, charter schools in Massachusetts have demonstrated, over and over again, consistently high performance and markedly higher performance than their district peers. Nothing has changed since a 2006 Massachusetts Department of Education report concluded that
In both English Language Arts and Mathematics, at least 30 percent of the charter schools performed statistically significantly higher than their CSD [ed. note: charter sending districts, i.e., the district systems sending kids to charters] in each year with the exception of 2001.
That report goes on to observe that another 60 percent of charters were as good as or better than their district peers. As I’ve noted elsewhere:
A January 2009 Boston Foundation report shows Boston charters blowing the doors off of Boston district schools. To show you just how good their performance is, you might think about the impact of charters over middle school years as akin to bridging the gap between Boston public school performance and Brookline public school performance. Not a bad outcome for people who cannot afford, or don’t want, to move to Brookline.
That’s important as we see the effects of the 2010 education law doubling the number of charters. Boston will benefit greatly with up to 18 percent of its students in charter schools by 2016. We have thus far seen far less expansion outside of Boston, notwithstanding the fact that the 2010 law also increases the potential for charter growth to 18 percent in most major Massachusetts cities. (More on that another day.)
The second constant in the state’s history of education reform is that choice options have been expanded beyond Commonwealth charter schools (CCS), which are highly flexible schools that operate free of teachers union contracts and local school committee oversight. In addition to CCS, we have seen a series of in-district efforts to gain the benefits of charters without sacrificing the interests of the adults in the system. Were it so simple…
We have seen pilot schools championed by teachers unions, Horace Mann (unionized, in-district) charter schools, Commonwealth pilot schools, and more recently innovation schools and extended learning time (ELT). The first three charter-lite options have not borne significant fruit. We are at the experimental stage with innovation schools and will know more within a year or two.
ELT is right about at the stage of development where we have to look ourselves in the mirror and make some hard choices. I understand the adults’ push for the charter-lite solutions. It keeps all the usual political alliances and interests intact; no difficult political decisions are necessary; and we can continue going to the same cocktail parties and cookouts. That’s important with the weather getting nicer just about this time.
I understand the political impulse to push ELT: just more money and more time will solve the problem. Nothing against more time. If kids in Japan go to school something like 240 days a year and we go only 180, then sure, there is no way we can keep up.
Intuitively, it makes sense, right? Reporters and radio journalists like Anthony Brooks of RadioBoston suggest that, in fact, charter schools have longer days, so longer days must be what makes them work.
But, before we jump to conclusions, let’s ask the question: Do Massachusetts’ ELT programs work? And are the results we are getting from these programs worth the $14 million a year we are currently spending on them? Roll the data. In 2010, the data on ELT provided in Abt Associates’ “Year 3” report suggested that:
- ELT had a significant, positive effect on 5th grade science MCAS scores in year two, but no statistically significant effects on other MCAS outcomes in year one or two.
- ELT had a statistically significant, negative effect on school attendance in both year one and two.
- While very few students received suspensions or were truant, ELT schools had slightly higher rates of out-of-school suspensions in both years.
- 8th grade students in ELT schools were more likely to use a school computer for school work at least once a month in year one, but not in year two. ELT students were no more likely to spend more than three hours a week on homework in year one, and were less likely to do so in year two. 8th grade students in ELT schools were no more likely to use a home computer for school work at least once a month, or for two or more hours per week.
- 5th grade ELT students were less likely to participate in non-academic clubs at school (no other significant effects).
- ELT had no effect on 5th grade students’ perceptions about their relationships with teachers. ELT had no effect on 5th grade students’ perceptions of the learning environment offered at their school or level of school engagement.
That was an interim assessment, admittedly covering only the first three years of implementation of ELT programs. Has anything changed in the two years since? Happily, Abt Associates has continued to update its reports. Unhappily, many of the key effects remain negligible. Consider, for example, page XVII of the “Year 5” assessment from Abt:
On average, there were no statistically significant effects of ELT after one, two, three, or four years of implementation on MCAS student achievement test outcomes for 3rd, 4th, or 7th grade ELA, 4th, 6th, or 8th grade math, or 8th grade science.
There was a statistically significant positive effect of ELT after four years of implementation on the MCAS 5th grade science test.
Those are quotes, folks. For all the activity and all the spending, there are few positive effects, save for the 5th grade science test. And, ahem, the positive effects on the 5th grade science test seem to disappear by the 8th grade.
Then there are negative effects. Both staff and students report higher levels of fatigue; students were less enthusiastic about school.
The Abt study design in the “Year 5” assessment is shaped by the involvement of both Mass 20/20 and Focus on Results, the two organizations that are the state’s biggest advocates of ELT. Some of that influence is helpful. The qualitative survey work with teachers notes high teacher satisfaction with ELT because it “allows them to accomplish their teaching goals and cover the amount of instructional material their students need to learn than would be expected in the absence of ELT.” While that is almost tautological, it is also a fact that allowing the time to go more in-depth or cover more material, as determined by the teacher, is a good thing.
The influence of these advocates, however, may have led to some self-interested conflation between ELT as implemented by the state and other programs that are categorized as ‘extended.’ For example, the study often folds charter schools into the discussion as examples of places that have longer days on average. But the fact is that charters (especially Commonwealth charters) are so much more than that: they have a different approach to achieving teacher quality, different approaches to culture-setting and expectations, a more entrepreneurial bent, and a level of urgency that is unique given the “high-stakes” accountability they have for results. If they don’t work, they get shut down.
ELT has produced promising results in a few schools, but the overall results are not terribly encouraging. Are the results described above worth $14 million? Should we continue to fund it? Those are tough questions, I know. Let me pose the question in an even harder way: Given the success of charter schools and the less-than-inspiring results of ELT, would it be better to spend the $14 million funding 800-1,000 additional charter school students rather than spend the money on ELT? (Note: charter school students are funded at about $10,000 per student, with another slug of money going to pay districts for the loss of students.)
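The 800-1,000 seat figure can be checked with a quick back-of-envelope calculation. The $14 million budget and the roughly $10,000 per-pupil tuition come from the discussion above; the district-reimbursement percentages below are purely illustrative assumptions, not official budget figures:

```python
# Back-of-envelope: how many charter seats could $14 million fund?
elt_budget = 14_000_000   # annual ELT spending cited above
tuition = 10_000          # approximate charter tuition per student (cited above)

# With no district reimbursement, the ceiling would be:
ceiling = elt_budget // tuition                 # 1,400 seats

# The note above says another "slug" of money reimburses districts for
# lost students. Modeling that (hypothetically) as 40-75% on top of tuition:
cost_high = 17_500        # tuition + ~75% reimbursement (assumed)
cost_low = 14_000         # tuition + ~40% reimbursement (assumed)

low_estimate = elt_budget // cost_high          # 800 seats
high_estimate = elt_budget // cost_low          # 1,000 seats

print(ceiling, low_estimate, high_estimate)
```

Under those assumed reimbursement rates, the budget lands in the 800-1,000 student range the post cites.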
In the public sphere, choices to do one thing are often decisions not to do something else. It’s decision time on ELT.
Also seen in Education News and Boston Globe Blogs.