Digging deeper into the design of the tutorial room for the German VR game at last week’s team meeting, we realized that we need to align the room’s training objectives more closely with the actions a player will ultimately need to perform in the game. These actions include: (1) teleporting in the space; (2) interacting with the buttons on the hand controller (e.g., accessing a German dictionary or inventory); (3) looking around and orienting oneself in the space; (4) interacting with objects (e.g., picking them up, examining them, putting them down); (5) putting objects into the inventory system; and (6) taking objects out of the inventory system. Other actions may need to be added as the game is developed and we evaluate feedback from user prototype testing.
Inclusion of the new training objectives modified the narrative of the tutorial room:
Tutorial room narrative (each step is paired with the training objectives it addresses):

At 4 PM the player receives a text message inviting them to a party. Haptic and visual cues (a vibrating hand controller and a pop-up menu) prompt the player to interact with the hand controller to view/hear the message.
Training objectives: (2) interacting with the buttons on the hand controller.

After the message is received, a list of training objectives (player tasks) for the tutorial room is added to the to-do list, accessible through the hand controller:
1. Put recycling into the correct containers;
2. Put trash into the correct container; and
3. Put kitchen waste into the correct container.
Haptic and visual cues prompt the player to access the to-do list at this point.
Training objectives: (2) interacting with the buttons on the hand controller.

After the to-do list has been closed, a recyclable item will be highlighted in the tutorial space, which the player will be prompted to locate through audio cues. When the headset is oriented toward the object, visual cues will instruct the player on how to teleport to a hotspot placed in front of the item, grab it, and place it in the correct container.
Training objectives: (1) teleporting in the space; (3) looking around and orienting oneself in the space; and (4) interacting with objects (e.g., picking them up, examining them, putting them down).

The player will repeat the above steps, without visual cues, for the remaining items. The player may access a help menu at any point for just-in-time instruction.

After all the items have been placed in the correct containers, the player will be prompted through haptic and visual cues to access the inventory system on the hand controller. Visual cues will instruct the player on how to place the following items into inventory:
1. Trash, recycling, and kitchen waste bags; and
2. An ATM card.
The bags will be put into the correct containers once the player leaves the apartment. The ATM card will be used by the player to withdraw money for public transportation.
Training objectives: (5) putting objects into the inventory system.

After the player puts all the items in the inventory, they will receive a text message from their roommate, telling them to leave a textbook on the table before going to the party. The player will access the inventory system, locate the book, teleport to the table, and place the book on the table.
Training objectives: (6) taking objects out of the inventory system.

Once these objectives have all been realized, the player will be able to leave the apartment and go to the pedestrian zone.
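As a rough sketch of how one of the haptic cues above might be wired up in Unity with the SteamVR 2.0 plugin (the pulse parameters are placeholder values, and `default_Haptic` is simply the plugin’s default example vibration action; a real project would define its own action in the SteamVR Input window):

```csharp
using UnityEngine;
using Valve.VR;

// Hypothetical sketch: fire a short controller vibration to prompt the
// player to open the hand-controller menu.
public class HapticPrompt : MonoBehaviour
{
    public SteamVR_Input_Sources hand = SteamVR_Input_Sources.RightHand;

    public void Prompt()
    {
        // Execute(secondsFromNow, durationSeconds, frequency, amplitude, source)
        SteamVR_Actions.default_Haptic.Execute(0f, 0.5f, 150f, 0.75f, hand);
    }
}
```

A visual cue (e.g., the pop-up menu) could be activated from the same method so the two prompts always fire together.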
We realized that creating the tutorial room will be a lot of work for the team, perhaps enough to keep us busy for the full academic year.
To help us generate ideas on how other VR experiences are structured and address similar design issues, we will use our next team meeting (on 10/17/19) to play through these experiences and take notes on how we react to them, judge their effectiveness, and apply aspects of these experiences to our own game. After this is done, we’ll have a bodystorming session to flesh out the experience even more and discuss the finer design details that will undoubtedly emerge when we act out the experience, and then we’ll begin whiteboxing the experience.
Previously, we had been planning to use VRTK v3 as our Unity plugin for developing VR content for the Grinnell College Immersive Experiences Lab (GCIEL). With the release of SteamVR v2, however, VRTK v3 will no longer be supported, and its successor (VRTK v4) is currently in beta. Unlike v3, which was very well documented, v4 is still rough around the edges and missing a lot of documentation. So that we don’t waste time waiting for this documentation to be written, we are moving all GCIEL projects over to SteamVR v2. This will still allow us to develop for all the major VR devices, including both the Oculus and the Vive.
SteamVR v2 is a major rewrite of the v1 release, which was poorly documented. I’ve been testing SteamVR v2 this past week, seeing if I could get something up-and-running, and I’m happy to say that even I (with my lower-level programming skills) was able to do this. I found these resources to be most useful:
https://steamcommunity.com/games/250820/announcements/detail/1696059027982397407 (gives an overview of the v2 release);
https://www.raywenderlich.com/9189-htc-vive-tutorial-for-unity (good tutorial and mostly accurate, just a few minor changes because of updates);
https://medium.com/@sarthakghosh/a-complete-guide-to-the-steamvr-2-0-input-system-in-unity-380e3b1b3311 (an overview of the SteamVR input system);
https://www.youtube.com/watch?v=qo-9CmcKWlY (video walkthrough of the SteamVR Unity plugin); and
https://valvesoftware.github.io/steamvr_unity_plugin/ (SteamVR Unity plugin documentation).
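To give a sense of how the SteamVR 2.0 action-based input system described in these resources works in practice, here is a minimal sketch of listening for a controller button press. The action name `OpenDictionary` is hypothetical; in a real project you define and bind actions in Unity’s SteamVR Input window, and the action is assigned in the Inspector:

```csharp
using UnityEngine;
using Valve.VR;

// Minimal sketch of SteamVR 2.0 input: instead of polling device buttons
// directly, you bind an abstract action (here a hypothetical
// "OpenDictionary") to whatever button each headset's controller provides.
public class DictionaryButton : MonoBehaviour
{
    public SteamVR_Action_Boolean openDictionary; // assigned in the Inspector
    public SteamVR_Input_Sources handType = SteamVR_Input_Sources.Any;

    void Update()
    {
        // GetStateDown is true only on the frame the button goes down.
        if (openDictionary != null && openDictionary.GetStateDown(handType))
        {
            Debug.Log("Open the German dictionary UI here");
        }
    }
}
```

Because the action is abstract, the same script works unchanged on both the Oculus and the Vive; only the per-device bindings differ.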
I anticipate that a lot of the code being written for current GCIEL projects can be recycled for future ones. So that we don’t constantly reinvent the wheel, I hope to take frequently used code, insert it into a Unity demo project similar to the SteamVR v2 example scene (Assets/SteamVR/InteractionSystem/Samples/Interactions_Example), and upload the project to the GCIEL Toolkit repo.
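As an example of the kind of reusable snippet that could go into the toolkit, here is a sketch of a grab-and-release behavior built on the Interaction System classes from that example scene (`Hand` and `Interactable` are real plugin classes; the methods below are callbacks that `Hand` invokes via SendMessage while it hovers over or holds the object):

```csharp
using UnityEngine;
using Valve.VR.InteractionSystem;

// Sketch of a reusable pickup component using the SteamVR Interaction
// System. Attach to any object the player should be able to pick up,
// examine, and put down (e.g., the recyclables in the tutorial room).
[RequireComponent(typeof(Interactable))]
public class SimplePickup : MonoBehaviour
{
    // Called by Hand every frame while the controller hovers over this object.
    private void HandHoverUpdate(Hand hand)
    {
        GrabTypes grab = hand.GetGrabStarting();
        if (grab != GrabTypes.None)
        {
            hand.AttachObject(gameObject, grab);
        }
    }

    // Called by Hand every frame while this object is attached to it.
    private void HandAttachedUpdate(Hand hand)
    {
        if (hand.IsGrabEnding(gameObject))
        {
            hand.DetachObject(gameObject);
        }
    }
}
```

Keeping small, self-contained components like this in the demo project should make it easy for future teams to drop them into new scenes.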