Results of meta‐analyses are potentially valuable for informing environmental policy and practice decisions. However, selective sampling of primary studies, through searches restricted to one or a few widely used bibliographic platforms, could bias estimates of effect sizes. Such search strategies are common in environmental evidence reviews, and if a risk of bias can be detected, this would provide the first empirical evidence that the comprehensiveness of searches needs to be improved. We compared the impact of single‐platform and multiple‐platform searches versus more comprehensive searches on estimates of mean effect sizes. We used 137 published meta‐analyses, all based on multiple‐source searches, together analyzing 9388 studies: 8095 sourced from commercially published articles and 1293 from grey literature and unpublished data. Single‐platform and multiple‐platform searches missed studies in 100 and 80 of the meta‐analyses, respectively: 52 and 46 meta‐analyses yielded larger effect estimates; 32 and 28 yielded smaller effect estimates; eight and four yielded estimates in the opposite direction; and two each were unable to estimate effects because all of their studies were missed. Furthermore, we found significant positive log‐linear relationships between the proportion of studies missed and the deviation of the mean effect size, suggesting that as more studies are missed, the deviation of the mean effect size is likely to grow. We also found significant differences in mean effect sizes between indexed and non‐indexed studies for 35% of the meta‐analyses, indicating a high risk of bias when searches were restricted. We conclude that restricted searches are likely to produce unrepresentative samples of studies and biased estimates of true effects.