RAOP
Assessment & Evaluation
RAOP uses a structured assessment and evaluation framework aligned to program objectives: (1) strengthen educator capability in inquiry-based STEM research workflows, (2) support evidence-based reasoning through simulation-to-hardware activities, and (3) produce classroom-ready lessons and assessments that educators can implement in grades 5–12 settings. The framework prioritizes artifact-based evidence collected every cycle, embeds lightweight formative checks during labs, and rotates one validated standardized instrument per year to maintain rigor while minimizing burden. Findings are reviewed biannually with the External Advisory Board and used for continuous improvement across cohorts. Assessment focuses on four outcome areas:
- Growth in inquiry-based experimental design and evidence-based reasoning through simulation-to-hardware workflows.
- Growth in scientific literacy skills for interpreting results, evaluating evidence, and communicating findings.
- Quality and classroom readiness of educator-produced lessons and assessments derived from RAOP labs.
- Feasibility for adoption in high-need Local Education Agencies (LEAs) through clear materials, constrained classroom workflows, and practical assessment tools.
A central deliverable across cohorts is a robotics and automation-friendly curriculum package (lesson + student materials + assessment rubric + implementation notes) that can be adopted and adapted by high-need LEAs.
Primary Outcome Measures
The instruments below are used to measure educator growth and to document evidence aligned to RAOP learning outcomes. Results are summarized at the cohort level and used to refine future cycles.
Standardized instruments are administered using a pre/post or post-only design as appropriate for the annual cohort and program feasibility, with decisions guided by educator burden and advisory board recommendations. Artifact-based evidence and formative checks are collected every cycle and serve as the primary evidence base for continuous improvement.
Artifact-Based Evidence (Every Cycle)
- GIBL reflection logs (Predict–Test–Observe–Explain–Reflect) tied to selected modules
- Educator-developed lesson plan/curriculum draft (grades 5–12) derived from RAOP labs
- Classroom adaptation plan (constraints, feasibility, materials, timing)
- Implementation notes (barriers, supports, troubleshooting outcomes) used for annual refinement

These artifacts form the primary evidence base every year and are designed to be low-burden and classroom-relevant; a sample digital structure for the GIBL reflection log follows.
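Where reflection logs are captured digitally, a consistent record shape makes them easier to review across modules and cohorts. Below is a minimal sketch of one possible schema in Python; the field names, identifiers, and completeness check are illustrative assumptions, not a prescribed RAOP format.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class GIBLReflection:
    """One Predict-Test-Observe-Explain-Reflect entry tied to a module.

    All names here are hypothetical; adapt to the program's actual templates.
    """
    educator_id: str   # anonymized identifier
    module: str        # the RAOP lab the entry belongs to
    entry_date: date
    predict: str       # expected outcome before running the lab
    test: str          # what was actually run (setup, parameters)
    observe: str       # raw observations (plots, readings)
    explain: str       # interpretation tied to evidence
    reflect: str       # classroom-translation notes
    artifacts: list[str] = field(default_factory=list)  # paths to plots/screenshots

    def is_complete(self) -> bool:
        """Low-burden formative check: all five stages filled in."""
        return all([self.predict, self.test, self.observe,
                    self.explain, self.reflect])
```

A structure like this keeps every log comparable, so incomplete entries can be flagged for coaching rather than graded.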
Embedded Formative Checks
- Short exit tickets/check-ins linked to lab objectives
- Observation checklists for setup readiness and expected outputs (virtual and hardware)
- Lab artifacts (plots, screenshots, brief interpretation prompts) for coaching and reflection

Formative checks are used for instructional improvement and educator support.
Rotating Standardized Instrument (One Per Year)
- Cohort-level pre/post summaries for the selected instrument (see the sketch following this list)
- Triangulation with artifact-based evidence and educator reflections
- Planned rotation: Year 1, EDAT; Year 2, TOSLS; Year 3, CLASS (or equivalent); Years 4–5 repeat based on findings.
- If a different validated attitude instrument is substituted, the same pre/post timing and reporting structure are preserved.
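Because a cohort-level pre/post summary is reported for whichever instrument is in rotation, the computation itself can be standardized year to year. The following is a minimal sketch assuming matched pre/post scores per educator; the function name and sample scores are hypothetical, not real RAOP data.

```python
from statistics import mean, stdev

def prepost_summary(pre: list[float], post: list[float]) -> dict:
    """Cohort-level paired pre/post summary for the annual instrument."""
    gains = [b - a for a, b in zip(pre, post)]
    sd = stdev(gains) if len(gains) > 1 else 0.0
    return {
        "n": len(gains),
        "pre_mean": mean(pre),
        "post_mean": mean(post),
        "mean_gain": mean(gains),
        # Paired Cohen's d: mean gain over the SD of the gains
        "cohens_d": mean(gains) / sd if sd else float("nan"),
    }

# Illustrative scores only -- not real RAOP data.
pre_scores = [12.0, 15.0, 11.0, 14.0, 13.0]
post_scores = [16.0, 18.0, 15.0, 17.0, 15.0]
print(prepost_summary(pre_scores, post_scores))
```

Paired Cohen's d (mean gain divided by the standard deviation of the gains) is one common effect-size convention; a program may substitute another as long as it is reported consistently across cohorts.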
Evaluation Components
Evaluation combines learning outcomes, implementation fidelity, and classroom translation evidence to ensure the program delivers measurable value and produces implementable classroom outputs.
Implementation Fidelity
- Structured observation checklists for virtual sessions and on-site labs
- Attendance and participation tracking for required sessions (a summary sketch follows this list)
- Facilitator logs documenting constraints, deviations, and adjustments
- Cohort-level fidelity summary and lessons learned
- Actionable improvements for the next cycle (pacing, supports, logistics)
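Attendance and participation records lend themselves to a simple cohort-level fidelity summary. The sketch below is one illustrative approach, assuming per-session boolean attendance records; the IDs and the 0.8 threshold are hypothetical, not program policy.

```python
def fidelity_summary(attendance: dict[str, list[bool]],
                     threshold: float = 0.8) -> dict:
    """Flag educators whose attendance at required sessions falls below a threshold.

    `attendance` maps an anonymized educator ID to one boolean per required
    session. The 0.8 threshold is illustrative, not an RAOP policy.
    """
    rates = {eid: sum(s) / len(s) for eid, s in attendance.items() if s}
    return {
        "cohort_rate": sum(rates.values()) / len(rates),
        "below_threshold": sorted(e for e, r in rates.items() if r < threshold),
    }

# Hypothetical records: three educators, four required sessions.
records = {
    "ed-01": [True, True, True, True],
    "ed-02": [True, False, True, True],
    "ed-03": [True, False, False, True],
}
print(fidelity_summary(records))
```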
Learning Outcomes
- Artifact-based evidence collected every cycle (GIBL reflections, lab outputs, interpretation prompts)
- One rotating standardized pre/post instrument per year (EDAT, TOSLS, or CLASS/equivalent)
- Formative checks embedded in modules for coaching and support
- Cohort-level outcome summaries and trends
- Evidence mapped to program objectives and planned refinements
Deliverable Quality and Classroom Translation
- Rubric-based review of educator deliverables (lesson plan, student materials, assessment tool, implementation notes); see the scoring sketch after this list
- Peer review and program-team feedback cycles (draft → revise → finalize)
- Advisory board review of cohort-level deliverable quality and adoption constraints (biannual)
- Classroom-ready implementation package per educator (lesson + student materials + assessment + implementation notes)
- Identified strengths and gaps used to improve templates and supports for the next cohort
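Rubric scores from multiple reviewers can be summarized the same way for every deliverable, which keeps feedback cycles comparable across the cohort. A minimal sketch, assuming each reviewer scores the same criteria on a shared 1–4 scale; the criterion names are placeholders, not the program's actual rubric.

```python
from statistics import mean

# Hypothetical rubric scores (1-4 scale) from two reviewers for one deliverable.
reviews = [
    {"alignment": 4, "clarity": 3, "assessment_quality": 3, "feasibility": 4},
    {"alignment": 3, "clarity": 3, "assessment_quality": 4, "feasibility": 4},
]

def rubric_summary(reviews: list[dict[str, int]]) -> dict[str, float]:
    """Average each criterion across reviewers to support feedback cycles."""
    criteria = reviews[0].keys()
    return {c: mean(r[c] for r in reviews) for c in criteria}

scores = rubric_summary(reviews)
print(scores)                              # per-criterion means
print("overall:", mean(scores.values()))  # simple unweighted overall score
```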
Educator Experience and Feedback
- End-of-week pulse surveys (Weeks 1–3) focused on clarity, pacing, and support needs (a week-over-week summary sketch follows this list)
- End-of-program survey on feasibility and intended classroom use
- Optional follow-up check-in during the academic year (adoption status and needs)
- Cohort-level experience summary (what worked, what should change)
- Targeted supports for classroom adoption (time, resources, constraints)
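Likert-scale pulse-survey items can be tracked week over week to flag pacing or clarity issues early in the cycle. A small sketch, using made-up responses for a single hypothetical item:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical pulse-survey responses: (week, 1-5 Likert rating for "pacing").
responses = [
    (1, 4), (1, 3), (1, 4),
    (2, 3), (2, 3), (2, 4),
    (3, 5), (3, 4), (3, 4),
]

# Group ratings by week to spot pacing trends across the cohort.
by_week: dict[int, list[int]] = defaultdict(list)
for week, rating in responses:
    by_week[week].append(rating)

for week in sorted(by_week):
    print(f"Week {week}: mean pacing rating {mean(by_week[week]):.2f}")
```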
Continuous Improvement
- During the virtual and on-site phases, formative checks and facilitator logs identify where educators need additional scaffolding, clarity, or pacing adjustments.
- At the end of each cohort, pre/post outcome summaries and deliverable reviews are used to revise module selection, instructional supports, and PD templates for the next cycle.
- Biannual advisory reviews provide external oversight and technical guidance; advisory recommendations are tracked and incorporated into the next iteration of the RAOP curriculum and assessment materials.
- Across five years, evidence is consolidated into a stable, robotics- and automation-friendly curriculum that high-need LEAs can feasibly adopt.
Technical & Advisory Support Team
RAOP evaluation is supported by continuous technical guidance and external oversight. The advisory board reviews progress biannually and collaborates with program leadership to ensure alignment with educational and research objectives.
Quanser Engineering Team
- Provides technical support for both virtual and on-site laboratory environments.
- Assists with setup verification, troubleshooting, and workflow reliability.
- Supports consistent execution of the simulation-to-hardware workflow used in RAOP activities.
Technical support is provided in coordination with the RAOP program team to ensure smooth delivery and a consistent participant experience.
External Advisory Board
The board provides oversight, strategic feedback, and technical guidance. Biannual reviews focus on:
- Cohort-level outcome summaries and educator experience trends.
- Quality and classroom feasibility of educator deliverables and assessment tools.
- Alignment of module selection with RAOP objectives and adoption constraints in high-need LEAs.
- Refinement priorities for the next annual cycle (templates, pacing, supports, dissemination).
Timeline
The timelines below describe how RAOP assessment and evaluation are executed in the first-year cycle (2026) and how the process repeats and improves over the five-year program.
Year 1 Cycle (2026) — Assessment & Evaluation Timeline
This timeline is designed to be clear for educators, program staff, and the External Advisory Board. Evidence is summarized at the cohort level and used for continuous improvement.
| When | Activity | Owner | Evidence / output |
|---|---|---|---|
| Jan–Feb 2026 (Recruitment & selection window) | Finalize cohort selection; confirm educator eligibility; distribute onboarding instructions and required accounts. | RAOP Program Team | Confirmed cohort and readiness status for the summer cycle. |
| Jun 2026 (Technical onboarding) | Software access verification; orientation to QLabs workflow; expectations for artifacts and deliverables; evaluation overview. | RAOP Program Team + Quanser Engineering Team | Educators ready to begin the Week 1 virtual phase with a standardized setup. |
| Week 1 (Virtual phase) | Pre-assessment for the rotating standardized instrument selected for the annual cycle (e.g., EDAT, TOSLS, or CLASS/equivalent); foundational guided inquiry modules; formative checks embedded in labs. | Educators + RAOP Program Team | Baseline dataset for the annual instrument and first-week learning evidence for coaching. |
| Week 2 (Virtual phase) | Deeper analysis and design tasks; draft classroom materials (lesson outline + assessment draft); peer review cycle. | Educators + RAOP Program Team | Draft classroom-ready package prepared for on-site validation. |
| Week 3 (On-site phase) | Hardware-aligned validation and demonstration; revise lesson/assessment for feasibility; capture performance evidence. | Educators + RAOP Program Team + Quanser Engineering Team | Final classroom implementation package per educator. |
| End of Week 3 (Immediate post-program) | Post-assessment for the rotating standardized instrument; final deliverables submission; end-of-program feedback survey. | Educators + RAOP Program Team | Cohort outcome summary and consolidated recommendations for refinement. |
| Fall 2026 (Implementation follow-up) | Follow-up check-in on classroom adoption in high-need LEAs; identify barriers; provide targeted supports where feasible. | RAOP Program Team | Adoption evidence and needs analysis to strengthen Year 2 supports. |
| Dec 2026 (Biannual advisory review) | Advisory Board reviews cohort-level outcomes, deliverable quality, and adoption evidence; approves refinement priorities. | External Advisory Board + RAOP Leadership | Year 2 improvement plan and documented advisory recommendations. |
Five-Year Program Timeline — Continuous Improvement and Adoption
This high-level view shows the repeatable annual cycle and the feedback loop used to refine the curriculum and strengthen adoption by high-need LEAs.
| When | Activity | Owner | Evidence / output |
|---|---|---|---|
| Each year (Jan–Mar) | Recruit and select cohort; confirm eligibility and logistics; finalize module subset for the annual experiments. | RAOP Program Team | Annual cohort plan aligned to goals and constraints. |
| Each year (Jun) | Technical onboarding and evaluation orientation; confirm access to virtual labs; share expectations for artifacts/deliverables. | RAOP Program Team + Quanser Engineering Team | Consistent starting conditions across cohorts. |
| Each year (Jul) | Deliver the 2-week virtual phase + 1-week on-site phase using a simulation-to-hardware workflow; collect pre/post data, formative evidence, and deliverables. | Educators + RAOP Program Team + Quanser Engineering Team | Annual cohort outcomes + classroom-ready robotics/autonomy curriculum artifacts suitable for high-need LEA adoption. |
| Each year (Aug–Oct) | Analyze cohort-level outcomes; synthesize strengths and gaps; revise templates, supports, and recommended module pathways. | RAOP Program Team | Refined, more adoptable RAOP curriculum for the next cohort. |
| Each year (Fall semester) | Follow up on classroom adoption in high-need LEAs; document barriers and feasible support strategies. | RAOP Program Team | Adoption evidence that informs targeted improvements and dissemination. |
| Biannually (2x/year) | External Advisory Board reviews cohort summaries and refinement plan; provides strategic and technical feedback. | External Advisory Board + RAOP Leadership | Governance oversight and documented continuous improvement. |
| End of project (Year 5) | Consolidate multi-year evidence; document final curriculum package and adoption guidance for high-need LEAs; summarize outcomes and refinements. | RAOP Leadership + Program Team | A robotics- and automation-friendly curriculum package designed for adoption and reuse by high-need LEAs. |
Assessment Resources
These resources support consistent assessment implementation across modules and support the development of classroom-ready educator deliverables.