Shortlist

Does Expensive Childcare Mean Better Childcare?

Correction (March 7, 2026): This report has been updated to reflect Shortlist's revised scoring methodology. The original version (March 5) described a 5-dimension editorial assessment; scores are now calculated using a rules-based point system. The headline finding is unchanged — price does not predict quality. Several specific claims have been corrected, most notably: corporate chains now score higher than co-ops and independent centers on verifiable quality indicators, reversing the original report's finding. Full methodology: shortlist.guide/methodology.html.

Most parents assume that paying more for childcare buys better quality. It seems intuitive: higher tuition should mean better teachers, stronger curriculum, safer environments. We tested that assumption against real data — rules-based quality scores and normalized cost-per-hour pricing for 250 providers across Seattle, Chicago, Denver, Kansas City, and Charlotte.

The result: price explains almost nothing about quality.

At a glance:
- −0.04: Spearman correlation between price and quality score (95% CI: −0.16 to +0.09, not statistically significant)
- 250 providers analyzed
- 5 cities
- $6.47 median $/hr
- $7.42 mean $/hr

The Finding

We divided providers into four groups by cost — cheapest 25%, second quartile, third quartile, and most expensive 25%. The most expensive group actually scored the worst on average, though the differences between groups are within sampling noise.

Cost Quartile | Providers | Avg $/hr | Avg Score | A or B Rate
Cheapest 25% | 62 | $3.84 | 7.1 | 32%
Second quartile | 62 | $5.76 | 8.2 | 37%
Third quartile | 62 | $7.51 | 7.6 | 32%
Most expensive 25% | 64 | $12.43 | 6.9 | 23%

The most expensive quartile averages $12.43/hr — 3.2x more than the cheapest at $3.84/hr. Yet the expensive group has the lowest proportion of high-quality providers: 23% earning A or B (95% CI: 14%–34%), compared to 32% in the cheapest quartile (95% CI: 21%–44%). Parents paying top dollar are not systematically getting better programs. They are paying for location, brand, and schedule convenience.
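The quartile grouping behind the table above can be reproduced with pandas; this is a minimal sketch on synthetic stand-in data, not Shortlist's actual pipeline, and the column names are illustrative:

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins for the 250 observed (cost, score) pairs.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "cost_per_hour": rng.lognormal(mean=1.8, sigma=0.5, size=250),
    "quality_score": rng.integers(0, 31, size=250),
})

# Split providers into four equal-sized cost groups and summarize each.
df["cost_quartile"] = pd.qcut(
    df["cost_per_hour"], 4,
    labels=["Cheapest 25%", "Second quartile", "Third quartile", "Most expensive 25%"],
)
summary = df.groupby("cost_quartile", observed=True).agg(
    providers=("quality_score", "size"),
    avg_cost=("cost_per_hour", "mean"),
    avg_score=("quality_score", "mean"),
)
print(summary)
```

With real data, the A-or-B rate per quartile would be one more named aggregation over a boolean grade column.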

The Spearman rank correlation between cost per hour and quality score is −0.04 (95% CI: −0.16 to +0.09) — statistically indistinguishable from zero (n = 250, p > 0.05). There is no meaningful relationship between what a provider charges and how it scores on our quality assessment. This holds whether we use the raw quality score (0–30 scale) or the letter grade. It holds across all five cities.
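The headline statistic can be reproduced with SciPy plus a row-level percentile bootstrap. A sketch on synthetic stand-in data follows; the report used 10,000 bootstrap draws, reduced here to keep the example fast:

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic stand-ins for the 250 observed (cost, score) pairs.
rng = np.random.default_rng(42)
cost = rng.lognormal(mean=1.8, sigma=0.5, size=250)
score = rng.integers(0, 31, size=250).astype(float)

rho, p_value = spearmanr(cost, score)

# Percentile bootstrap: resample provider rows with replacement,
# recompute rho each time, take the 2.5th and 97.5th percentiles.
n_boot = 2_000
boot = np.empty(n_boot)
for i in range(n_boot):
    idx = rng.integers(0, len(cost), size=len(cost))
    boot[i] = spearmanr(cost[idx], score[idx])[0]
ci_low, ci_high = np.percentile(boot, [2.5, 97.5])
print(f"rho={rho:.3f}, p={p_value:.3f}, 95% CI=({ci_low:.3f}, {ci_high:.3f})")
```

Spearman (rank-based) rather than Pearson is the right choice here because cost is heavily right-skewed and the quality score is bounded.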

What Are Parents Actually Paying For?

If price does not predict quality, what drives the cost differences? We measured how much of the cost variation each factor explains on its own (eta-squared, one factor at a time). These are not additive — neighborhood, city, and category overlap — but they reveal which factors dominate:

Factor (tested independently) | Share of Cost Variance Explained
Neighborhood | 59%
City | 11%
Quality score (A/B/C/D) | 3%
Program category | 2%

Neighborhood alone explains 59% of what you pay. In Seattle, the average cost per hour in Medina is over $15; in White Center, it is $3.25. That is a 5x difference for programs serving the same age groups in the same metro. Quality score — the one factor that measures what actually happens inside the program — explains just 3%.
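One-factor eta-squared is simply the between-group share of total variance. A minimal sketch on made-up data (the values and group labels are illustrative):

```python
import pandas as pd

def eta_squared(values: pd.Series, groups: pd.Series) -> float:
    """Between-group sum of squares divided by total sum of squares."""
    grand_mean = values.mean()
    ss_total = ((values - grand_mean) ** 2).sum()
    group_means = values.groupby(groups).transform("mean")
    ss_between = ((group_means - grand_mean) ** 2).sum()
    return ss_between / ss_total

# Illustrative example: hourly cost clusters tightly by neighborhood,
# so the between-neighborhood share of variance is close to 1.
df = pd.DataFrame({
    "cost_per_hour": [3.2, 3.5, 3.1, 15.0, 14.5, 16.2, 7.0, 7.4, 6.8],
    "neighborhood": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
})
print(eta_squared(df["cost_per_hour"], df["neighborhood"]))
```

Because each factor is tested alone, overlapping factors (neighborhood subsumes much of city) each claim shared variance, which is why the percentages are not additive.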

By City

The price-quality disconnect is consistent across markets, though price levels differ substantially.

City | Providers | Avg $/hr | Avg Grade | $/hr Range
Seattle | 102 | $7.81 | C | $1.88–$26.36
Chicago | 65 | $9.07 | C | $2.31–$18.99*
Denver | 35 | $6.51 | B | $4.12–$10.37
Kansas City | 43 | $4.82 | C | $2.52–$10.16
Charlotte | 5 | $6.82 | C | $4.27–$12.35*

Kansas City has the lowest average cost ($4.82/hr) and one of the higher average scores. Chicago has the highest cost ($9.07/hr) but the same average grade as Kansas City. Paying more at the city level does not buy more quality either. *Chicago excludes one outlier at $38.49/hr (JCC Am Shalom, a part-time program). Charlotte sample is small (n=5) — treat as directional.

By Program Type

This is where the data challenges conventional wisdom.

Category | Providers | Avg $/hr | Avg Grade | A or B Rate
Independent Center | 101 | $8.07 | C | 19%
Montessori Center | 62 | $7.23 | C | 31%
Corporate Chain | 53 | $6.83 | B | 55%
Language Immersion | 15 | $6.50 | C | 33%
Co-op Preschool | 9* | $6.63 | C | 44%
Home Daycare | 9* | $7.37 | C | 22%

Corporate chains score highest on verifiable quality indicators. The 53 chain providers in our dataset (KinderCare, Bright Horizons, Goddard, etc.) average a B, with 55% earning an A or B — the highest of any category. This is not because chains are inherently better programs. It is because our scoring rewards verifiable indicators: accreditation, documented background checks, published ratios, and structured inspection records. Chains systematically have these. Independent centers — some of which may be excellent programs — often lack the documentation to prove it.

This distinction matters. A high score means "verifiably meets quality standards." A low score means "we cannot verify quality from available data." Those are different statements. *Co-op and home daycare samples are small (n=9 each). Treat as directional, not definitive.

The Outliers

Expensive but low-scoring

These providers charge top-quartile prices (above $8.58/hr) but scored C or D on Shortlist's quality assessment. 49 providers — about 20% of the dataset — fall in this category.

Provider | City | $/hr | Grade | Type
JCC Chicago Early Childhood at Am Shalom | Chicago | $38.49 | D | Independent
Harkness House for Children | Chicago | $28.87 | D | Independent
St. Thomas School | Seattle | $26.36 | D | Independent
Winnetka Public School Nursery | Chicago | $23.09 | C | Independent
Glencoe Junior Kindergarten | Chicago | $19.25 | D | Independent
Highland Park Montessori | Chicago | $18.48 | C | Montessori
Guidepost Montessori at Magnificent Mile | Chicago | $18.41 | C | Montessori
University Child Development School | Seattle | $17.78 | D | Independent

Many are part-time programs where high per-hour costs reflect short schedules rather than premium quality. A D score does not necessarily mean a bad program — it means the program has limited verifiable quality indicators in our data. Some of these may be excellent programs that simply don't publish the information needed to score well.

Affordable and high-scoring

These providers charge bottom-quartile prices (below $4.82/hr) and scored A or B.

Provider | City | $/hr | Grade | Type
Ward Parkway Preschool | Kansas City | $2.77 | B | Independent
City of Fountains School | Kansas City | $3.17 | B | Independent
The Goddard School Ballantyne | Charlotte | $3.50 | B | Chain
Purple Dragon Preschool | Kansas City | $3.52 | B | Independent
Park West Cooperative Nursery | Chicago | $3.70 | B | Co-op
Brookside Montessori | Kansas City | $3.70 | B | Home Daycare
Global Village International Preschool | Denver | $3.96 | B | Language Immersion
Denver Waldorf School | Denver | $4.16 | B | Independent

22 providers — about 9% — are in the cheapest quartile and scored A or B. Kansas City and Denver are well-represented. Quality childcare at an affordable price does exist. It is just hard to find without the data.

The Transparency Signal

If price is not a reliable signal of quality, what should parents look for? One pattern stands out: providers that share more information tend to score better.

Shortlist tracks 10 key data fields for each provider, including pricing, schedule, age range, ratios, meals, diapers, teaching philosophy, staff tenure, and pay range. Providers vary widely in how many of these they make publicly available.

Data Fields Shared | Providers | A or B Rate | D Rate
1–3 fields | 53 | 4% | 68%
4–5 fields | 238 | 8% | 45%
6–7 fields | 113 | 72% | 0%
8–10 fields | 1 | 100% | 0%

Providers sharing 6 or more data fields have a 72% A/B rate. Those sharing 3 or fewer: 4%. The jump from 5 fields to 6 is where quality becomes visible.
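The field-count binning behind the table can be sketched in a few lines of pandas. The records and field names below are hypothetical (only four of the ten tracked fields are shown); a missing value means the provider does not publish that field:

```python
import pandas as pd

# Hypothetical provider records; None marks an unpublished field.
providers = pd.DataFrame({
    "pricing":    [8.0, 6.0, 5.5],
    "schedule":   ["M-F", None, "M-F"],
    "ratios":     ["1:4", None, None],
    "philosophy": ["Montessori", None, None],
})

# Count non-missing fields per provider, then bin into the report's buckets.
fields_shared = providers.notna().sum(axis=1)
bucket = pd.cut(fields_shared, bins=[0, 3, 5, 7, 10],
                labels=["1-3", "4-5", "6-7", "8-10"])
print(pd.DataFrame({"fields_shared": fields_shared, "bucket": bucket}))
```

With the full dataset, grouping grades by `bucket` yields the A/B and D rates shown in the table.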

An important caveat: Shortlist's scoring directly rewards some of these data fields — having a published ratio or teaching philosophy earns points. So part of this correlation is mechanical: sharing data literally raises the score. We cannot fully separate "transparent because high-quality" from "high-scoring because transparent." Both dynamics are likely at work. But the practical takeaway for parents is the same: a provider's willingness to share basic information about their operations is the best publicly available signal of quality. Better than price, better than brand, better than neighborhood.

What This Means

The childcare market has a pricing problem. In most consumer markets, price carries at least some information about quality. In childcare, it carries almost none. Parents who stretch their budgets for the most expensive option are not buying measurably better outcomes. They are buying a neighborhood, a brand, or a schedule that fits their workday.

This does not mean all childcare is the same. Quality varies enormously — from A-rated programs with strong inspection records, documented credentials, and transparent operations, to D-rated programs where we simply cannot verify quality from available data. The variation is real. It is just not correlated with what you pay.

For parents making this decision, the data suggests a different approach: instead of anchoring on price as a proxy for quality, look for verifiable indicators. Ask about accreditation. Ask about inspection results. Ask about staff credentials and ratios. The providers that can answer these questions clearly — and do so without being asked — are the ones most likely to earn high marks on an independent assessment.

Compare providers by score, cost per hour, and data completeness. Sort, filter, and see the full data.


Methodology

This analysis covers 250 of 405 total providers in Shortlist's database — those with verified schedule data (days per week and hours per day) needed to calculate cost per hour. The 155 providers without cost data average a lower quality score (4.7 vs. 7.5 for included providers), so this sample over-represents higher-quality, more-transparent programs. Cost per hour = monthly tuition ÷ (days per week × 4.33 × hours per day). Quality scores use Shortlist's rules-based point system (0–30 scale): providers earn points for verifiable indicators across quality credentials, operational quality, and transparency. Scores are converted to letter grades (A ≥ 14, B ≥ 9, C ≥ 4, D < 4). Spearman rank correlation with 10,000-iteration bootstrap confidence interval. All data verified as of March 2026.
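The cost normalization and grade cutoffs described above reduce to a few lines. This is a sketch of those two formulas only; the point rules that produce the 0–30 score are not reproduced here:

```python
def cost_per_hour(monthly_tuition: float, days_per_week: float,
                  hours_per_day: float) -> float:
    """Normalize monthly tuition to an hourly rate (4.33 avg weeks/month)."""
    return monthly_tuition / (days_per_week * 4.33 * hours_per_day)

def letter_grade(score: float) -> str:
    """Map a 0-30 quality score to Shortlist's letter grades."""
    if score >= 14:
        return "A"
    if score >= 9:
        return "B"
    if score >= 4:
        return "C"
    return "D"

# Example: $1,400/month, 5 days/week, 10 hours/day is about $6.47/hr.
print(round(cost_per_hour(1400, 5, 10), 2))
```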

Statistical notes. Spearman ρ = −0.039 (bootstrap 95% CI: −0.159 to +0.087, n = 250, p > 0.05). Result is consistent using quality_score (continuous, 0–30) or letter grade (ordinal). Eta-squared values are unadjusted and likely overestimate effect sizes for small groups; they are presented as rough guides to relative magnitude, not precise estimates. Charlotte (n = 5), co-op (n = 9), and home daycare (n = 9) subsamples are too small for reliable inference and are flagged accordingly. The transparency-quality correlation is partially mechanical — Shortlist's scoring awards points for several of the tracked data fields. Selection bias: providers with published pricing are systematically more transparent than the excluded group, inflating the sample's average quality score.

Source: Shortlist original research, March 2026 (updated March 7). 250 providers with verified cost and quality data across 5 U.S. cities. Data available for media use with attribution to Shortlist (shortlist.guide). Contact: [email protected]