Happy Almost Halloween. I hope that in spite of everything you have found a way to celebrate with your kids, even if it is just buying a family-size Toblerone bar and letting them eat it all.
The newsletter will be on a brief hiatus for a week or so as, honestly, I’m struggling to see past Tuesday. I’ll be back at the end of next week, though, and I have a post teed up on the question of toddler teeth brushing.
But! I thought that before a break, I’d give a bit of an update on the school dashboard. I realize that although I spend almost all of my Twitter energy on this topic, I haven’t talked about it much here of late.
Basics: The dashboard is here. You can also find some more FAQs about it here.
If you’re a regular reader here, you’ll recall the basic structure of this effort. Schools and districts opt in to provide information on their enrollments and opening plans, and then on their COVID-19 cases. We collate these and post the information in the dashboard. We’re showing infection rates in students and staff, along with information on mitigation strategies. If you poke around the dashboard, you can do things like filter by age group, state, or state community rates. We also have some filters by mitigation strategy.
I’ve written about early information from this in a number of places, most notably The Atlantic. We’re also writing a series of articles in Insider, a bit more in the weeds, including here and here. Other outlets have covered us as well, some positively and some…less positively (more on this below).
Yesterday we posted a big update to the dashboard; if you look, you’ll see the most recent wave increases the sample to 2.7 million kids, about 1.5 million of them in person. This reflects the addition of new data from New York, where the state is officially collecting both cases and counts of kids in school. By pulling these data down from the state website, we’re able to increase our sample a lot and make it more comprehensive and representative (at least of this particular area). Since these data are already public, we’ve also provided them in an Excel document if you want to play around with them yourself.
The rates in the most recent period largely reflect New York, given the balance of the data: a child rate of 5 per 100,000 per day (or about 0.7 infections in a school of 1,000 students over two weeks) and a staff rate of 12 per 100,000 per day (or about 1.7 infections per 1,000 staff over two weeks).
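If you want to check that back-of-the-envelope math yourself, here is a minimal sketch of the conversion (the function name is mine, and it assumes the daily rate holds roughly constant over a 14-day window):

```python
# Rough conversion: a daily rate per 100,000 people, applied to a group
# of a given size over a two-week window.
def expected_infections(daily_rate_per_100k, group_size, days=14):
    return daily_rate_per_100k / 100_000 * group_size * days

print(expected_infections(5, 1000))   # ~0.7 student infections per 1,000 over two weeks
print(expected_infections(12, 1000))  # ~1.7 staff infections per 1,000 over two weeks
```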
Big Takeaways?
Are there big takeaways from the data? When people ask this, I focus on two.
First, schools move with communities. I’ll write more about this in coming days, but you can see it yourself in the filters and in this piece. When community rates are higher, school rates are higher. This means, among other things, that controlling community rates is key to protecting schools.
Second, age matters. Elementary schools have lower infection rates than high schools. Staff rates in elementary schools are also slightly lower, although not by much. This seems most likely to reflect variation in infection rates by age.
The obvious question that comes up here is “Are these infection rates high?” Without a concrete view of what “high” means, this is difficult to answer. More than that, individual families may differ in the risk of exposure they are willing to tolerate — this is why it’s valuable to have some numbers. Perhaps the simplest thing to say is that the data doesn’t show schools as higher risk than their surrounding communities.
Criticism?
Not surprisingly, the dashboard has come under some criticism. To the credit of the team, most of this has centered not on the effort itself but on my presentation of it. This is important to me: the team has worked extremely hard on this, largely on a volunteer basis. They deserve credit, not scorn.
The most significant concerns raised about the data itself are the self-selection of schools into the sample and the fact that our data doesn’t cover a representative sample of schools. This is, for example, the main concrete issue that this piece in The American Prospect raises about the data. We have maintained — and continue to maintain — that the data isn’t perfect. It’s just that it’s the best we have in the absence of federal leadership. The addition of the New York data — which came out the day after that piece posted — improves this. But more can be done; a bit on this in next steps below.
A second issue is the lack of analysis by race in the dashboard. This, too, is something we are working to do more of.
Because I share a lot here, it probably will not surprise you that I’ve found some of the personal criticism hard to take. The American Prospect piece was especially harsh on my motivations and qualifications. But I’m working to adopt a little more of the attitude expressed by Michael Phelps during one past Olympics, when criticized by the French: “We welcome comments. We use them as fuel.”
Next Steps
More enrollment. Larger urban districts. STATES. Parochial school groups.
Where the criticism of what we have thus far is spot on is the need for more comprehensive data. We hope we can get this by working with larger school groups — charter organizations, archdioceses, states. Many of these collaborations are underway.
District enrollment is still key, and we hope we can pull more districts in, as well.
If you think you can help, please do reach out. It’s a team effort!
Thanks for reading, and all your support. See you next week.