
Solstice FC Success Metrics

Status: draft. Informed by: all-rounds


Success metrics synthesized from all 11 debate verdicts (Rounds 1, 4-10, Semifinals 1-2, Final). Organized by phase, with explicit measurement criteria for each metric.


Year 1: Launch / Proof of Concept

Core question: Does the model work? Can we operate, attract clubs, and collect the data we need?

| # | What We Measure | Target | Source Debate | How We Measure It |
|---|---|---|---|---|
| 1.1 | Proof-of-concept completion | All POC milestones hit within published timeline | Round 1 | Milestone checklist with dates; public progress tracker updated monthly |
| 1.2 | Timeline accountability | Zero unacknowledged deadline slips; any slip disclosed within 7 days with revised date | Round 1 | Published timeline vs. actuals log; deviation report shared with member clubs |
| 1.3 | Flat fee revenue coverage | $2,000-2,800/club/year covers 100% of operating costs | Round 4 | Quarterly P&L showing revenue vs. expenses; break-even or surplus by month 12 |
| 1.4 | Scholarship fund seeding | Scholarship fund established at 10% of gross revenue + any sponsorship earmarks | Round 4 | Dedicated fund account balance; quarterly contribution reports |
| 1.5 | Floor compliance rate | 90% of member clubs meet all floor requirements by end of season 1 | Round 7 | Audit checklist per club: coaching certifications held, minimum training hours logged, coach-to-player ratios met, financial transparency docs filed, SafeSport compliance current |
| 1.6 | Institutional partnership (anchor) | 1 anchor institutional partner secured (university, federation, or municipal rec dept) | Round 9 | Signed MOU or partnership agreement on file |
| 1.7 | Club recruitment | Minimum viable club count for league operation (target defined pre-launch) | Round 9 | Signed membership agreements; paid dues |
| 1.8 | Pro/rel criteria transparency | 100% of promotion/relegation criteria and current standings publicly accessible | Round 5 | All criteria published on website; standings updated within 48 hours of results |
| 1.9 | Operational data collection completeness | 95% of defined season-one data fields captured across all member clubs | Round 10 | Data completeness audit at season end; field-level fill rates per club |
| 1.10 | Charter ratification participation | 70%+ of eligible members participate in charter ratification vote | Final | Vote turnout records from ratification process |
| 1.11 | Player data consent rate | 80%+ of families opt in to anonymized player development data sharing | Round 6 | Consent form completion rate tracked per club |
| 1.12 | Family confusion score (baseline) | Establish baseline; target <20% of families reporting confusion on key UX touchpoints | Semifinal 2 | Post-registration and mid-season surveys; support ticket categorization |
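To make the audit arithmetic concrete, here is a minimal sketch of how metrics 1.5 (floor compliance rate) and 1.11 (player data consent rate) could be computed from audit records. The record shape, club names, and function names are illustrative assumptions, not a prescribed schema:

```python
# The five floor requirements listed under metric 1.5.
FLOOR_REQUIREMENTS = [
    "coaching_certs", "training_hours", "ratios", "financial_docs", "safesport",
]

def floor_compliance_rate(clubs):
    """Share of clubs meeting ALL floor requirements (metric 1.5, target: 90%)."""
    compliant = sum(
        1 for club in clubs
        if all(club["floors"].get(req, False) for req in FLOOR_REQUIREMENTS)
    )
    return compliant / len(clubs)

def consent_rate(families_opted_in, families_total):
    """Share of families opting in to data sharing (metric 1.11, target: 80%+)."""
    return families_opted_in / families_total

# Hypothetical audit data: one club passes everything, one fails SafeSport.
clubs = [
    {"name": "North FC", "floors": {r: True for r in FLOOR_REQUIREMENTS}},
    {"name": "South FC", "floors": {**{r: True for r in FLOOR_REQUIREMENTS},
                                    "safesport": False}},
]
print(floor_compliance_rate(clubs))  # 0.5 -> below the 90% target
print(consent_rate(164, 200))        # 0.82 -> meets the 80% target
```

The "all requirements" condition matters: a club failing any one floor item counts as non-compliant, which is what makes 1.5 a floor metric rather than an average.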

Year 2: Growth / Platform

Core question: Is the platform ready to scale? Are clubs staying, players developing, and governance working?

| # | What We Measure | Target | Source Debate | How We Measure It |
|---|---|---|---|---|
| 2.1 | Member club retention | 85%+ of year-1 clubs renew for year 2 | Semifinal 1 | Renewal rate = (clubs renewing / clubs eligible) x 100 |
| 2.2 | Platform readiness | Digital platform operational for season-two needs (scheduling, standings, data dashboards) | Round 10 | Platform launch checklist; feature acceptance testing complete before season 2 kickoff |
| 2.3 | Player development tracking (technical) | Technical development metrics tracked for 90%+ of registered players; physical metrics secondary | Round 6 | Per-player technical assessment records (skills evaluations, not fitness tests); coverage rate across clubs |
| 2.4 | Coach retention rate | 80%+ season-over-season coach retention across member clubs | Round 8 | Club-reported coach rosters compared year over year |
| 2.5 | Injury rate tracking | Injury rates tracked per club; target below comparable youth league benchmarks | Round 8 | Injury reports filed per club per season; rate = injuries / 1,000 player-hours |
| 2.6 | Player/parent feedback scores | 4.0/5.0+ average satisfaction across coaching quality, communication, and experience | Round 8 | End-of-season surveys distributed to all registered families; response rate >50% |
| 2.7 | Governance satisfaction | 70%+ of member club leaders rate governance as "satisfactory" or higher | Semifinal 1 | Annual governance survey administered to club leadership |
| 2.8 | Communication quality | 80%+ of families rate league communications as "clear" or "very clear" | Semifinal 2 | Mid-season and end-of-season family surveys; support ticket volume trending down YoY |
| 2.9 | Family confusion reduction | 50% reduction from year-1 baseline on confusion-related support tickets | Semifinal 2 | Support ticket categorization; survey comparison vs. year-1 baseline |
| 2.10 | Revenue sustainability | Operating surplus or break-even maintained; no emergency assessments levied | Round 4 | Annual financial report; audited if >$100K revenue |
| 2.11 | Scholarship fund growth | Fund balance grows; at least 1 scholarship awarded | Round 4 | Fund balance report; scholarship recipient records |
| 2.12 | Evidence-based deliberation quality | Board/committee decisions reference data from year-1 operations in 80%+ of recorded votes | Final | Meeting minutes review; decision rationale documentation |
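The two rate formulas the table states explicitly (renewal rate in 2.1, injuries per 1,000 player-hours in 2.5) can be pinned down in a few lines. A minimal sketch; the function names and example figures are illustrative:

```python
def renewal_rate(clubs_renewing, clubs_eligible):
    """Metric 2.1: (clubs renewing / clubs eligible) x 100. Target: 85+."""
    return clubs_renewing / clubs_eligible * 100

def injury_rate(injuries, player_hours):
    """Metric 2.5: injuries per 1,000 player-hours of training and matches."""
    return injuries / player_hours * 1000

print(renewal_rate(17, 20))     # 85.0 -> exactly at the year-2 target
print(injury_rate(12, 24_000))  # 0.5 injuries per 1,000 player-hours
```

Normalizing injuries by player-hours rather than by player count keeps the metric comparable across clubs with different roster sizes and training volumes.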

Year 3+: Scale / Sustainability

Core question: Is the model durable, replicable, and delivering on its developmental mission?

| # | What We Measure | Target | Source Debate | How We Measure It |
|---|---|---|---|---|
| 3.1 | Floor compliance rate (sustained) | 95%+ of member clubs maintain all floor requirements | Round 7 | Annual compliance audit: certs, hours, ratios, financials, SafeSport |
| 3.2 | Pro/rel system functioning | At least 1 promotion/relegation cycle completed with no disputes escalated beyond internal process | Round 5 | Pro/rel transaction log; dispute records |
| 3.3 | Player development longitudinal data | 3-year technical development trajectories available for 70%+ of players with 2+ seasons | Round 6 | Player database query; longitudinal records with minimum 2 assessment points |
| 3.4 | Institutional partnerships (expansion) | 3+ active institutional partnerships | Round 9 | Active MOUs/agreements on file; annual partnership review |
| 3.5 | Club recruitment (growth) | Net positive club growth year over year | Round 9 | Membership count trend; new clubs minus departures |
| 3.6 | Member club retention (sustained) | 90%+ renewal rate | Semifinal 1 | Annual renewal tracking |
| 3.7 | Revenue diversification | No single revenue source >60% of total (fees, sponsorships, grants, events) | Round 4 | Revenue breakdown in annual financial report |
| 3.8 | Scholarship fund impact | Scholarships awarded cover 5%+ of total registered players from underserved communities | Round 4 | Scholarship awards / eligible player population |
| 3.9 | Coach quality outcomes (longitudinal) | Clubs with certified, retained coaches show measurably better player satisfaction and lower injury rates | Round 8 | Correlation analysis: coach tenure/certs vs. parent satisfaction and injury rates |
| 3.10 | Vote turnout (ongoing governance) | 60%+ turnout on major governance votes (bylaw changes, budget approval) | Final | Vote participation records per ballot |
| 3.11 | Community engagement depth | 50%+ of member clubs send representatives to annual meeting/assembly | Final | Attendance records at governance events |
| 3.12 | Family experience (sustained) | <10% confusion rate; 4.2/5.0+ satisfaction | Semifinal 2 | Annual family survey; support ticket trends |
| 3.13 | Operational data maturity | Data pipeline supports automated standings, public dashboards, and annual reports without manual intervention | Round 10 | System uptime; manual intervention log |
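The diversification rule in 3.7 reduces to a single check over the annual revenue breakdown. A minimal sketch, with hypothetical dollar figures:

```python
def diversification_ok(revenue_by_source, cap=0.60):
    """Metric 3.7: no single revenue source may exceed `cap` of total revenue."""
    total = sum(revenue_by_source.values())
    return all(amount / total <= cap for amount in revenue_by_source.values())

# Fees are exactly 60% of a $200K total here, so the check passes at the limit.
revenue = {"fees": 120_000, "sponsorships": 50_000, "grants": 20_000, "events": 10_000}
print(diversification_ok(revenue))  # True
```

Expressing the cap as a fraction of total (rather than a fixed dollar amount) means the metric stays meaningful as overall revenue grows year over year.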

Measurement Principles

Derived from debate verdicts:

  1. Transparency over secrecy. All criteria, standings, and financial summaries are public by default (Round 5, Final).
  2. Technical over physical. Player development tracking prioritizes skill acquisition, not physical metrics (Round 6).
  3. Consent-first data. No player data shared without opt-in; aggregate only unless explicit consent (Round 6).
  4. Floors, not ceilings. Compliance metrics enforce minimums; clubs are free to exceed them (Round 7).
  5. Evidence over intuition. Decisions at every level should reference collected data (Final).
  6. Family clarity is a first-class metric. Confusion is a system failure, not a user failure (Semifinal 2).