Do exam schools add value?

Historically, many of Massachusetts’ political and economic leaders built their success on the education they received at Boston’s historic exam schools (Boston Latin School, Boston Latin Academy, and the John D. O’Bryant High School of Mathematics and Science), which together enroll about 5,300 students in grades 7-12. These schools have received accolades from the usual sources of school rankings and have led other cities to follow Boston’s example. New York City built on its own historic grade 9-12 exam schools (Stuyvesant High School, the Bronx High School of Science, and Brooklyn Technical High School) by establishing the High School for Math, Science and Engineering at City College, the High School of American Studies at Lehman College, and the Queens High School for the Sciences at York College in 2002. Mayor Bloomberg followed by converting Staten Island Technical High School into an exam school in 2005 and by opening the Brooklyn Latin School, in clear imitation of the Boston Latin School, in 2006.

Many of these schools, especially the historic exam schools, regularly appear as top performers in popular school rankings, such as those compiled by U.S. News & World Report. But are they? It may strike you as a counterintuitive question, especially since so much ink has been spilled on Judge Garrity’s call for 35 percent of the students admitted to Boston’s exam schools to be minorities, and on the debates that ensued in the late 1990s. But the question is really interesting, because it gets at “peer effects” (whether kids do better studying alongside high-performing peers), the effects of class size, and much more.

Well, this is just the question that three researchers (Atila Abdulkadiroglu, Joshua D. Angrist, and Parag A. Pathak) take up in their new National Bureau of Economic Research paper (Working Paper 17264), The Elite Illusion: Achievement Effects at Boston and New York Exam Schools, which is sure to stoke lots of discussion.

These schools are surely highly competitive to get into, and not simply because they “screen applicants on the basis of a competitive admissions test.” Their histories give them mystique: translated into the vernacular, the ivy on their walls only strengthens their appearance as Ivy League prep schools, with interwoven ties to those colleges for parents and students seeking to climb the socioeconomic ladder.

Fewer than half of Boston applicants win a seat at one of the city’s three exam schools, and fewer than a sixth of exam school applicants are offered a seat at the three original exam schools in New York.

Students who enter have “pre-application Math and English scores… on the order of 0.5-0.7 standard deviations… higher than the scores of those who apply but not offered.”

Differences in baseline performance between applicants at the most competitive exam school and those in regular public schools are even more impressive, at over 1.5 [standard deviations] for Boston 7th graders…

The difference between the average pre-application achievement of students enrolled at the Boston Latin School and those enrolled at a traditional Boston school… is over two standard deviations for Math and about 1.75 for English.

Before getting to the authors’ central findings on whether exam schools add value to already high-performing students, a couple of notes on items of interest:

  • Boston Latin School offers far fewer advanced placement (AP) courses than New York’s Stuyvesant (23 at Boston Latin versus 37 at Stuyvesant). (What’s up with that?)
  • The average student-to-teacher ratio at the Boston Latin School is 22, compared with district-wide averages of 12 for middle schools and 15 for high schools. (What does that say about the old saw that class size matters?)

Working from “registration and demographic information for Boston Public School (BPS) students from 1997-2009” (including MCAS Math, English, Writing, and Science testing data), the authors center the analysis of The Elite Illusion on the impact of exam school attendance on academic achievement. They also look at PSAT and SAT data, as well as AP scores, to ensure that the state standardized tests do not skew their findings.

Exam school students do very well in school, but do they do better than they would elsewhere? Does an exam school education add value? The surprising answer is that, in most grades, exam schools offer little additional benefit in terms of student achievement. On the positive side of the ledger, minority students attending exam schools do see modest improvement in English test scores.

Our results offer little evidence of an achievement gain for those admitted to an exam school; most of the estimates can be interpreted as reasonably precise zeros, with a smattering of significant effects, both positive and negative. In other words, in spite of their exposure to much higher-achieving peers and a more challenging curriculum, marginal students admitted to exam schools generally do no better on a variety of standardized tests.

The findings are consistent whether you look at the MCAS or the SAT. As the authors note in their conclusion:

It’s interesting to contrast the results reported here with those from recent studies of Boston and New York charter schools using quasi-experimental research designs. Abdulkadiroglu, Angrist, Dynarski, Kane, and Pathak (2011) and Dobbie and Fryer (2011) show substantial gains from attendance at charter schools that embrace the No Excuses pedagogical model. Many of these schools serve exceptionally low achievers. Moreover, the relationship between baseline ability and treatment effects within the urban charter population appears to be negative (Angrist, Dynarski, Kane, Pathak, and Walters, 2010; Angrist, Pathak, and Walters, 2011). The results reported here, showing evidence of achievement gains for minorities, are therefore broadly consistent with the charter findings. The comparison between No Excuses charters and exam schools also suggests that the scope for improvement in learning may be wider at the low end of the ability distribution than at the top. Together, these findings weigh against the view expressed recently by Cunha and Heckman (2007), among others, that “… returns to adolescent education for the most disadvantaged and less able are lower than the returns for the more advantaged” (page 33).

This is one of those studies that makes you question conventional wisdom. Wow.

Crossposted at Boston.com’s Rock the Schoolhouse. Follow me on Twitter at @jimstergios, or visit Pioneer’s website.