An interactive VR museum with AI-driven guides, living animal behavior, and timeline quests.
Our research project is a multi-disciplinary system that integrates virtual reality, artificial intelligence, behavioral simulation, an interactive timeline, and extinction-event simulation into a single ecosystem. Its centerpiece is the immersive PaleoVision experience, designed for the Meta Quest 3 headset and supported by AI-driven interactions, dynamic animal behavior, and interactive educational features.
Traditional prehistoric education—textbooks, documentaries, and static exhibits—is often passive, text-heavy, and low in engagement, limiting memory retention and attention. Existing VR projects, including 360° videos and animated reconstructions, provide more immersion but remain mostly static and scripted. Advances in VR and AI offer interactive, immersive learning opportunities. Virtual reality enables users to explore inaccessible environments and visualize complex concepts [1], [2], while artificial intelligence supports context-aware and adaptive learning experiences [3], [4]. Studies in STEM and museum contexts show that immersive VR improves comprehension, retention, and spatial reasoning [1], [2].
Earlier AI behavior modeling based on rule-based systems or fuzzy logic lacks adaptability and realism. Integrating machine learning with VR allows autonomous virtual agents to respond to environmental conditions and social dynamics [5]. Projects such as Dinosaurs and Crvena Stijena VR demonstrate growing interest in immersive prehistoric simulations but still rely on static storytelling and limited interactivity. This project addresses these gaps by developing an AI-driven, spatially aware VR environment with dynamic animal behaviors, interactive activities, and adaptive learning techniques [4], [5], transforming passive exploration into active, engaging learning.
Existing VR exhibits rarely include scalable, believable AI animals. We propose herd/predator logic with scene-aware behaviors.
We add timeline quests, spatial anchors, and spaced-recall cues to improve focus and learning outcomes.
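The spaced-recall cues described above can follow a simple Leitner-style rule: a correct answer to a recall prompt lengthens the interval before the next cue, while a miss resets it. This is an illustrative sketch only; the interval values and function names are assumptions, not the project's actual implementation.

```python
# Leitner-style spaced-recall scheduling (illustrative; values are assumed).
# Each exhibit fact sits in a "box"; success promotes it to a longer
# interval, failure demotes it back to the shortest one.

INTERVALS = [1, 3, 7, 14]  # minutes of in-headset time between cues

def next_interval(box: int, correct: bool) -> tuple[int, int]:
    """Return (new_box, minutes_until_next_cue) after one recall attempt."""
    if correct:
        box = min(box + 1, len(INTERVALS) - 1)  # promote, capped at last box
    else:
        box = 0  # reset to the shortest interval on a miss
    return box, INTERVALS[box]
```

A scheduler like this keeps cues frequent for facts the player struggles with and unobtrusive for facts already retained.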
From passive viewing to dynamic scenes: reactive narration and hands-on interactions.
Guides read the room: they use detected surfaces, exhibit proximity, and player location to drive natural interactions.
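One way to realize exhibit-proximity awareness is a trigger-radius check: the guide watches the player's position and fires narration once when the player first enters an exhibit's zone. The sketch below is a minimal illustration with hypothetical names and radii, not the project's engine code.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch of proximity-triggered narration:
# narrate an exhibit once, the first time the player comes close.

@dataclass
class Exhibit:
    name: str
    x: float
    z: float
    trigger_radius: float  # metres

def nearest_exhibit(player_pos, exhibits):
    """Return the exhibit closest to the player, and its distance."""
    def dist(e):
        return math.hypot(e.x - player_pos[0], e.z - player_pos[1])
    best = min(exhibits, key=dist)
    return best, dist(best)

def guide_update(player_pos, exhibits, visited):
    """Fire narration once per exhibit when the player enters its radius."""
    exhibit, d = nearest_exhibit(player_pos, exhibits)
    if d <= exhibit.trigger_radius and exhibit.name not in visited:
        visited.add(exhibit.name)
        return f"Narrate: {exhibit.name}"
    return None  # nothing new to say
```

In the headset this check would run per frame against tracked head position; the `visited` set prevents the guide from repeating itself.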
Low engagement/retention with static VR exhibits; limited interactivity and adaptive guidance.
An AI-guided, scene-aware VR museum featuring dynamic animal simulations, a playable timeline, and memory enhancers.
Environment blockout, interaction patterns, and baseline navigation (teleport + direct locomotion).
Herding, predator detection, feeding interactions, and narrator triggers linked to exhibits.
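The herding and predator-detection behaviors above can be sketched as two steering rules: flee directly away from a predator inside a detection radius, otherwise move toward the herd centroid (cohesion). All names and the radius value here are illustrative assumptions, a sketch of the technique rather than the project's actual behavior system.

```python
import math

# Illustrative herd/predator steering (names and radius are assumed).
DETECT_RADIUS = 8.0  # metres at which a herbivore notices a predator

def steer(agent, herd, predator):
    """Return a unit (dx, dz) steering direction for one herd member."""
    ax, az = agent
    px, pz = predator
    dx, dz = ax - px, az - pz
    d = math.hypot(dx, dz)
    if 0 < d < DETECT_RADIUS:
        return (dx / d, dz / d)  # flee: move directly away from the predator
    # cohesion: move toward the herd centroid
    cx = sum(x for x, _ in herd) / len(herd)
    cz = sum(z for _, z in herd) / len(herd)
    vx, vz = cx - ax, cz - az
    n = math.hypot(vx, vz) or 1.0  # avoid division by zero at the centroid
    return (vx / n, vz / n)
```

A full flocking model would add separation and alignment terms, but even this two-rule version yields herds that regroup when calm and scatter when a predator approaches, which is the visible behavior the exhibits need.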
Usability testing, knowledge retention measures, and iteration.
A Project Proposal is presented to potential sponsors or clients to secure funding or approval for the project.
Reviews the 50% completion status of the project, revealing any gaps or inconsistencies in the design or requirements.
Describes contributions to existing knowledge, with due recognition given to all referenced work.
Reviews the 90% completion status and demonstration of the project.
The website promotes our research project and presents all details related to it.
The status of the project is validated through the Logbook.
Evaluates the completed project over the year and includes both individual and group reports.
Conducted individually to assess each member’s contribution to the project.