Clark University: Immersive Climate Learning with AI Expert Avatars and Multiuser 360° Arctic Video

This case study shows how Clark University delivered immersive climate learning for middle school learners using AI expert avatars and multiuser 360° Arctic video.
Customer
- Institution: Clark University, School of Climate, Environment, and Society (higher education)
- Lead: Dr. Karen Frey, Clark University
- Course: First-Year Seminar, Communicating Climate Change (16 first-year undergraduates)
- Audience: Middle school learners (ages 9–14)

Challenge
Dr. Frey wanted first-year undergraduates with no prior immersive-tech experience to collaboratively design guided, shared VR climate education experiences for middle school students that:
- used 360° Arctic field footage from research expeditions
- worked cross-device (Meta Quest, desktop, and mobile)
- stayed structured and assessment-based (not “cool VR” without a learning arc)
- fit within a single semester, with practical constraints (asset size, usability, and classroom logistics)
- supported live classroom facilitation in a shared multiuser environment (audio, coordination, and flow)

Solution
Using the Hyperspace platform (via LearnBrite), Clark ran the program as learning-by-building:
- Team setup: 4 undergraduate teams (≈4 students each) + 1 graduate assistant
- Access model: 3 authors (faculty/GA), unlimited editors (undergrads), unlimited visitors (middle schoolers)
- Timeline: late August to mid-December; experiences kept live via URLs afterward

What students built
The undergraduate teams delivered four themed virtual classrooms for ages 9–14: Sea Ice, Seafloor Ecosystems, Water Column, and Shipping & Navigation. Each classroom followed the same five-task learning path:
- Explore scientific operations via multiuser 360° Arctic videos
- Interact with AI expert avatars (Q&A expert layer for inquiry)
- Learn climate basics via curated visuals/animations from reputable public sources (e.g., NASA/NOAA-type resources)
- Interact with 3D digital objects and instruments
- Read and discuss peer-reviewed scientific literature

AI implementation
- Two AI expert avatars per classroom (e.g., sea-ice geography vs. biology; water-column physics vs. ecology; shipping impacts vs. climate impacts)
- Students wrote bot prompts after an ethics/prompting session, focusing on:
  - restricting answers to peer-reviewed and trusted institutional sources
  - tuning tone and language for middle-school comprehension and comfort
- Bots were voice-enabled via headset microphones for natural conversational interactions
- A scripted (non-AI) guide avatar per room handled task sequencing and structured choices (the team prototyped guide flows in ChatMapper)
- In-world announcements supported group coordination across the Clark Polar Science Research World
“My Clark students instructed the AI: ‘Use middle school-friendly language like “skibidi” or “slay.”’ The middle schoolers at our event said, ‘It made me want to interact with it more because it was using language I hear on YouTube all the time.’”
— Dr. Karen Frey, Clark University
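To make the prompting work concrete, here is a minimal sketch of how a student-authored expert-avatar prompt could encode the constraints above. The field names and wording are illustrative assumptions, not the actual Clark prompt or the Hyperspace configuration format.

```python
# Illustrative sketch only: how a student-written expert-avatar prompt could
# encode the constraints described above. Field names and wording are
# hypothetical, not the actual Clark/Hyperspace configuration.
sea_ice_expert = {
    "name": "Sea Ice Geographer",   # one of two expert avatars in the Sea Ice classroom
    "voice_enabled": True,          # learners spoke via headset microphones
    "system_prompt": (
        "You are a polar scientist answering questions from middle school "
        "students (ages 9-14). Base your answers only on peer-reviewed "
        "research and trusted institutional sources such as NASA and NOAA. "
        "Keep sentences short and friendly, and feel free to use casual, "
        "middle school-friendly language. If you are unsure of an answer, "
        "say so instead of guessing."
    ),
}
```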

Content & assets delivered
Across the four classrooms, students delivered:
- 20+ multiuser 360° videos (seafloor dives, ship operations, sea ice, helicopter flights)
- 8 embedded peer-reviewed articles, displayed at large scale on virtual walls
- Interactive 3D models (e.g., polar bear, seals, walrus, ship equipment, scientific tools, pressure cups)
- 4 teacher guides with objectives, pre-questions, in-world tasks, and post-worksheets (6 questions each)

Delivery
The middle-school experience was facilitated in a shared computer-lab setting; learners used headphones with microphones so simultaneous voice interactions did not overwhelm the room. Learners participated both on desktop computers and in immersive VR headsets.

Results
Platform analytics show sustained usage beyond “one-and-done” novelty:

Engagement & usage
- 44 users in the analytics export
- 1,331 world visits
- Total time in world: ~52.2 hours (3,133 minutes)
- Average time per user: ~71.2 minutes (median ~12.8 minutes, reflecting a mix of short exploratory sessions and longer facilitated sessions)

AI adoption
- 77.3% of users interacted with at least one AI avatar
- Total AI conversation time: ~17.3 hours (1,036 minutes)
- AI conversation visits: 378
- AI engagement concentrated in the most-used expert avatars (top two accounted for ~79% of total AI time and ~68% of AI visits)
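As a quick sanity check, the hour and per-user figures above follow directly from the export totals; the short sketch below simply reproduces that arithmetic (illustrative only, not platform code).

```python
# Back-of-envelope check of the reported figures (illustrative, not platform code).
total_minutes = 3133      # total time in world, per the analytics export
users = 44                # users in the export
ai_minutes = 1036         # total AI conversation time

print(round(total_minutes / 60, 1))     # 52.2  -> ~52.2 hours in world
print(round(total_minutes / users, 1))  # 71.2  -> ~71.2 minutes per user (mean)
print(round(ai_minutes / 60, 1))        # 17.3  -> ~17.3 hours of AI conversation
```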
“I was worried the middle schoolers would fly through the experience in 10 minutes. But what I learned was there was probably weeks of content they could have enjoyed exploring in those spaces.”
— Dr. Karen Frey, Clark University
Metrics note: Analytics are derived from the platform’s Trophio gamification report export; “world visits” and “AI conversation visits” reflect the report’s visit counters, and time is aggregated from recorded time spent in-world.
Video: Arctic-Shipping-and-Navigation.mp4

Next-course optimizations
- More desktop-first delivery, reserving VR for highlight moments to reduce friction and fatigue
- Start platform training earlier, then shift into production
- Strengthen museum-style curation (labels, citations, interpretive text) so visuals and objects are self-explanatory
- Lock the asset pipeline earlier (especially 3D file-size constraints and sourcing rules)
- Reduce VR mode-switching and multi-click transitions between spaces and 360° scenes
“I think experiencing it on both platforms was really helpful; desktop first to get oriented, and then the VR headset for a different level of immersion.”
— Dr. Karen Frey, Clark University

Why this worked and what we’ve learned: a replicable pattern for immersive climate learning at other EDU institutions
- Use a repeatable instructional spine, then let student teams differentiate by topic (the five-task structure plus an exit check is simple, scalable, and “teachable” to new builders).
- Make AI a domain “expert layer,” not a narrator. The best use here is Q&A that expands depth on demand, while the space itself carries the directed learning path.
- Template-first authoring via a graduate assistant (or instructional designer) reduces chaos when multiple student teams build in parallel.
- Design for cross-device from day one, since facilitation often starts on desktop and graduates to headset once learners are oriented.
If you’re exploring immersive, cross-device learning for your students, let’s do a quick fit check during a 25-minute call with one of our experts: https://ghl.hyperspace.mv/bookdemo




