From this blog post:
The problem is precisely that so many of the studies showed positive results. The authors present a model that says there should be more papers showing negative effects just by chance. They conclude that the reason they can't find any is due to the well-known (and very real) problem of publication bias: namely that papers with null results are boring and often never get published (or maybe never even get written up and submitted in the first place). After the authors use their model to estimate how many unpublished papers probably exist, they conclude that the average effect of lead on crime is likely zero.
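To see the logic the authors are leaning on, here is a minimal simulation sketch (all numbers are illustrative assumptions, not figures from the paper): if the true effect were zero and only "significant" positive estimates got published, the published literature would still show a solidly positive average effect.

```python
# Sketch: publication bias can manufacture a positive "consensus"
# even when the true effect is exactly zero. Illustrative only.
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.0   # assume, for the sake of argument, no real effect
SE = 0.5            # assumed standard error of each study's estimate

# Simulate 1,000 studies: each estimate is true effect + sampling noise.
estimates = [random.gauss(TRUE_EFFECT, SE) for _ in range(1000)]

# Publication filter: only studies with a statistically significant
# positive result (estimate > 1.96 * SE) make it into journals.
published = [e for e in estimates if e > 1.96 * SE]

print(f"All studies, mean effect:    {statistics.mean(estimates):+.3f}")
print(f"Published only, mean effect: {statistics.mean(published):+.3f}")
print(f"Fraction published:          {len(published)/len(estimates):.1%}")
```

A naive meta-analysis of the `published` list alone would find a clearly positive effect, which is the asymmetry publication-bias models try to detect and correct for. The dispute in this post is about how far one can push that correction, not whether the mechanism exists.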
So the peer-reviewed studies show a big impact of lead on crime, but these "researchers" just made up non-existent studies, threw them in their model, and said there was no impact of lead on crime?
What the living hell??
I'm not saying publication bias is not a problem. But by this logic, the authors of any meta-analysis could conjure up hypothetical unpublished studies to argue away any effect.
Lead was and is an absolutely huge public health disaster. There are loads of things we could be doing to reduce kids' lead exposure right now here in the U.S. (Hillary Clinton had a detailed plan for it*), and loads more could be done around the world. This supposed "meta-analysis" strikes me as borderline criminal. (Maybe the researchers were born into peak-lead time, like Anne and I were.)
*But of course she did.