Instruction Modes with an XR Education Resource

We need to consider how the developed 3-D astronomical model will be used in actual instruction. Should the 3-D model be handed to each student to explore in a VR environment without any intervention by the instructor? Is it possible to live stream the 3-D model simulation from a central server to many VR headsets? The following is a list of potential instructional modes using a 3-D simulation model.

  • Pre-recorded 360 video with the instructor's audio guidance. Each student “watches” this 360 instruction individually.
  • Live streaming via YouTube 360. Unreal Engine 5 supports live streaming through the Off World Live (OWL) Live Streaming toolkit. Because of YouTube Live's limitations, there can be a delay of about 20 seconds.
  • A VR model preloaded directly onto each headset. Each student interacts with the VR 3-D simulation while listening to live audio instruction from the instructor.
  • Self-guided exploration: each student explores the VR 3-D simulation without any intervention by the instructor. Pre-lecture guidance can be given on how to use the VR headset and the 3-D model.
  • A virtual classroom setting that uses the engine's multiplayer feature. Students and the instructor interact in the virtual classroom, and the 3-D model is delivered within it. UGA already has such a tool, VRGed, developed by Kyle Johnson.

We will collect classroom data and evaluate the advantages and disadvantages of each instructional mode.

We can also create our own “Virtual Classroom” in UE5; see this forum post: https://forums.unrealengine.com/t/virtual-classroom/768736
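
As a starting point, the sketch below shows one way a shared classroom session could be wired up with UE5's built-in networking: the instructor's machine opens the classroom level as a listen server, and each student's headset travels to the instructor's address. This is only an illustrative assumption about the virtual-classroom mode; the level name "VRClassroom", the function names, and the host address are hypothetical and are not taken from VRGed or our existing project.

```cpp
// Minimal sketch of a host/join flow using UE5's standard networking calls.
// Assumptions: a level named "VRClassroom" exists in the project, and the
// instructor (host) and students share a network; all names are illustrative.

#include "Kismet/GameplayStatics.h"
#include "GameFramework/PlayerController.h"

// Instructor's machine: open the shared classroom map as a listen server
// so student clients can connect to it.
void StartClassroomAsHost(UObject* WorldContextObject)
{
    // The "listen" option turns this instance into a listen server.
    UGameplayStatics::OpenLevel(WorldContextObject, FName(TEXT("VRClassroom")), true, TEXT("listen"));
}

// Student's headset: travel to the instructor's session by address,
// e.g. HostAddress = "192.168.1.10" (a placeholder, not a real server).
void JoinClassroomAsStudent(APlayerController* PlayerController, const FString& HostAddress)
{
    if (PlayerController != nullptr)
    {
        PlayerController->ClientTravel(HostAddress, ETravelType::TRAVEL_Absolute);
    }
}
```

On the instructor's side this is equivalent to the console command "open VRClassroom?listen", and on a student device to "open 192.168.1.10"; keeping the 3-D model's state synchronized across all connected students would then rely on the engine's standard actor replication.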