
How We Measure What Matters (And Why It's Different)

Most education programs measure activity. We measure thinking.

You'll get data that shows not just what students did, but how they grew as thinkers.


What We Track

📊 Engagement Metrics

Workshop participation:

  • Attendance and active involvement
  • Contribution to discussions
  • Completion of reflection exercises

Portal usage (12 months; see the sketch after this list):

  • Frequency of logins
  • Challenges completed
  • Peer interactions
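
A minimal sketch of how portal metrics like these could be tallied, assuming a hypothetical event log of (student_id, event_type, timestamp) tuples. The schema and event names here are illustrative only, not our actual pipeline:

```python
from collections import Counter

# Hypothetical event log: (student_id, event_type, ISO-8601 timestamp).
# Event types assumed for illustration: "login", "challenge_completed", "peer_comment".
events = [
    ("s001", "login", "2025-01-03T10:15:00"),
    ("s001", "challenge_completed", "2025-01-03T10:40:00"),
    ("s002", "login", "2025-01-04T08:05:00"),
    ("s002", "peer_comment", "2025-01-04T08:20:00"),
]

def engagement_summary(events):
    """Tally logins, completed challenges, and peer interactions per student."""
    summary = {}
    for student_id, event_type, _timestamp in events:
        summary.setdefault(student_id, Counter())[event_type] += 1
    return summary

for student, counts in engagement_summary(events).items():
    print(student, dict(counts))
```
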
🧠 Thinking Quality Indicators

Causal reasoning: Identify cause-effect chains, recognize unintended consequences

Perspective-taking: Quality of opposing arguments, evidence-based reasoning

Cross-domain connections: Pattern recognition, creative problem-solving

🔍 Metacognitive Growth

Self-awareness: Ability to describe thinking process, recognize blind spots

Reflection depth: Quality of journal entries, insight development, transfer to new contexts

💡 Why it matters: These skills predict success in college and careers. Metacognition is one of the strongest known predictors of lifelong learning.


What You'll Receive

Post-Workshop Report (Within 2 Weeks)

📄 Engagement Summary

  • Which students participated most actively
  • Participation patterns by module
  • Peer collaboration data

🎯 Thinking Framework Adoption

  • Which frameworks resonated with which students
  • Evidence of skill application
  • Areas of strength and growth

✍️ Reflection Highlights

  • Key insights from student journals (anonymized)
  • Metacognitive development indicators
  • Quotes showing thinking evolution

💡 Recommendations

  • How to reinforce skills in your classroom
  • Suggested follow-up activities
  • Resources for continued growth

Quarterly Portal Reports (Year 1)

📈 Ongoing Engagement

Track how many students remain active, which challenges they complete, and how they interact with peers

🎓 Skill Retention

Evidence of continued framework use, quality of responses over time, and transfer to new domains

📊 Long-Term Growth

Comparison to baseline, trajectory of development, and predictors of sustained engagement


How We Protect Student Privacy

🔒 FERPA & COPPA Compliant

  • No personally identifiable information in aggregate reports
  • Anonymized quotes and examples
  • Secure data storage with encryption
  • Parent opt-out options available
  • Data deletion requests honored immediately
✓ You get insights without compromising student privacy.
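
As one illustration of how identifiable information can be kept out of aggregate reports, student identifiers can be replaced with salted one-way hashes before any analysis. This is a minimal sketch of that general technique with made-up names; it is not a description of our production system:

```python
import hashlib
import secrets

# A fresh salt per report batch prevents pseudonyms from being linked across reports.
SALT = secrets.token_hex(16)

def pseudonymize(student_id: str) -> str:
    """Replace a student ID with an irreversible salted SHA-256 pseudonym."""
    digest = hashlib.sha256((SALT + student_id).encode("utf-8")).hexdigest()
    return f"student_{digest[:8]}"

# Hypothetical records: (student_id, reflection score).
records = [("jane.doe", 7.8), ("john.roe", 6.4)]
anonymized = [(pseudonymize(sid), score) for sid, score in records]
print(anonymized)  # identifiers are pseudonymized before aggregation or quoting
```
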

What Makes Our Data Different

We Don't Just Count, We Analyze

❌ Traditional Metrics

  • "85% of students completed the assignment"
  • "Students logged in 12 times"
  • "Students rated the program 4.5/5"

✅ Our Metrics

  • "Students improved causal reasoning by 34% as measured by blind-scored essays"
  • "Students demonstrated sustained metacognitive practice over 6 months"
  • "Students transferred thinking frameworks to 3+ different contexts"
💡 We measure outcomes, not outputs.
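
For instance, an outcome statement like "improved causal reasoning by 34%" boils down to the mean relative gain between paired pre- and post-workshop rubric scores. A minimal sketch with hypothetical blind-scored essay data:

```python
def mean_relative_gain(pre_scores, post_scores):
    """Average percent improvement across paired pre/post rubric scores."""
    gains = [(post - pre) / pre for pre, post in zip(pre_scores, post_scores)]
    return 100 * sum(gains) / len(gains)

# Hypothetical blind-scored essay rubrics (0-10 scale), one pre/post pair per student.
pre = [5.0, 4.5, 6.0, 5.5]
post = [6.7, 6.1, 8.0, 7.4]
print(f"Causal reasoning improved {mean_relative_gain(pre, post):.0f}%")  # ~34%
```
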

Sample Data Visualizations

Engagement Over Time

  • Workshop: 100%
  • Month 3: 78%
  • Month 6: 65%
  • Month 12: 52%

✓ Retention rates exceed industry standards (typical: 20% by month 6).
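
Retention numbers like these can be derived from login timestamps: a student counts as retained in month N if they logged in at least once during the Nth month after the workshop. A minimal sketch under that assumption, with hypothetical data:

```python
from datetime import date

def months_since(start: date, d: date) -> int:
    """Whole calendar months elapsed between the workshop and a login."""
    return (d.year - start.year) * 12 + (d.month - start.month)

def retention_rate(cohort: set[str], logins: list[tuple[str, date]],
                   workshop: date, month: int) -> float:
    """Percent of the cohort with at least one login in the given post-workshop month."""
    active = {sid for sid, d in logins if months_since(workshop, d) == month}
    return 100 * len(active & cohort) / len(cohort)

# Hypothetical cohort and (student_id, login date) records.
cohort = {"s001", "s002", "s003", "s004"}
logins = [("s001", date(2025, 3, 5)), ("s002", date(2025, 3, 12)),
          ("s003", date(2025, 3, 20))]
print(f"Month 6 retention: {retention_rate(cohort, logins, date(2024, 9, 1), 6):.0f}%")
```
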

Thinking Framework Adoption

  • Causal Reasoning: 82%
  • Perspective-Taking: 76%
  • Cross-Domain Insight: 68%

Metacognitive Development

  • Pre-Workshop: 3.2 (out of 10)
  • Post-Workshop: 7.1
  • Month 6: 7.8

Measured via the Metacognitive Awareness Inventory (MAI).
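
Pre/post scores like these are often summarized as a normalized gain, the fraction of the available headroom actually achieved: g = (post - pre) / (max - pre). A quick worked example with the numbers above; the normalized-gain summary is our illustration of the arithmetic, not necessarily how scores appear in reports:

```python
def normalized_gain(pre: float, post: float, max_score: float = 10.0) -> float:
    """Fraction of possible improvement achieved: (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

print(f"Workshop gain: g = {normalized_gain(3.2, 7.1):.2f}")  # 0.57 of possible growth
print(f"Month-6 gain:  g = {normalized_gain(3.2, 7.8):.2f}")  # 0.68, gains persist and grow
```
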

Using Data to Drive Decisions

How Districts Use Our Reports

✅ Identify high-potential students for advanced programs
✅ Target interventions for students who need support
✅ Demonstrate ROI to school boards and funders
✅ Refine curriculum based on what's working
✅ Track equity across student populations

✓ Data becomes actionable, not just informative.

The Bottom Line

You can't improve what you don't measure.

We give you the data to prove that critical thinking instruction works, and to show exactly how your students are growing.


Want to See Sample Reports?