We are big fans of the Harlem Children’s Zone around these parts. Like many others, I’m sure part of it is romantic – we aspire to see a social program work as well as HCZ appears to because we love Harlem and we love magic bullets. If it works there, it can work here, too, right? Right??? If you’ve been following the developments lately, you’re as sad as I am about the exposure of evaluation weaknesses and the implications of such nudity.

Not so long ago, HCZ was criticized for not truly achieving at the rate of some of its claims. The criticism came from the Brookings Institution (see the report) and was authored by Russ Whitehurst (who is well known in evaluation as the head of the Institute of Education Sciences under Bush, where many believe the lunge toward randomized trials was re-institutionalized – but let’s not reopen that debate here). Suffice it to say, right or wrong, within a week of the Brookings Institution’s public criticism, Congress moved to cut the proposed Promise Neighborhoods, a proposal that had been based entirely on the success claimed by HCZ. We don’t know if the congressional move is a result of the influential Brookings report, but it seems highly coincidental and noteworthy, no?

Also highly coincidental and noteworthy is the recent job posting on the American Evaluation Association’s Career Center webpage for, you guessed it, evaluation help at HCZ. The job was posted exactly one week before the initial Brookings criticism. So what? So it is possible, if speculative, that evaluation weakness is at play here, folks. My impression, wherever I got it, was that HCZ had evaluation on its radar. But whether it is evaluation instability or a growing need for more evaluation staff, it is clear there is a gap around evaluation. At the very least, more public, independent publishing of evaluation reports from HCZ could have helped stave off the criticism. The website has a good deal about evaluation – but it is limited to a discussion of HCZ’s commitment to evaluation and the evaluation of non-HCZ programs. Show me the iron-clad numbers! Or at least the well-documented success stories! If it’s there, it isn’t obvious.

Without published reports from credible sources, HCZ has little to stand on when defending itself. (And people ask me why external evaluators are needed…) It is forced into a defensive position, instead of offering an offensive, reputable analysis of its own situation – good, bad, and ugly. Oh, HCZ. You have my admiration, but you need my field’s assistance.