Virtual Future Virtually Here?

Next Level Immersion

Take, for instance, Blade & Sorcery. On the surface, the game is simple: there is an endless supply of gladiators in an arena with you, and they all, I mean ALL, want to do nothing but slice and dice you. Plenty of popular games, like Doom or For Honor, already do this. So what makes this game different? It’s the motion, the feel of your actual arm swinging to cut down a foe, your own leg kicking an enemy clad in gold-plated armor and sending him to the valley below. This immersive experience delivers on the promise that the “three dimensional world of virtual reality will simply immerse you into this make believe world as the real world.”

Breachers by Triangle Factory

In Jesper Juul’s “Half-Real,” emergence is defined as “a number of simple rules combining to form interesting variations.” The VR games mentioned here have simple rules, such as don’t die and eliminate the enemy, set in expansive spaces you can physically move through: an arena, a skyscraper full of terrorists, a castle on a fiery planet. But the variations are endless. I can eliminate an enemy, or breach a room, with a variety of weapons in a million different ways, with a million different motions. The possibilities are truly endless! Minutes go by, and I forget that in reality I’m in my room, standing and swinging around an imaginary battle axe. But my brain and its imagination think otherwise. That is immersion beyond belief.

It’s a Skill Issue

VR tech has reached beyond games for years. Army training, fighter-pilot mission simulations, and astronaut training have all used VR to prepare people for the dangerous work that awaits them. But does that mean the VR games I play are simply for fun and give me nothing of real-world substance? To that I shout, “No!” VR games and training have been linked to improved hand-eye coordination, faster reaction times to both sounds and sights, and even better memory retention. Playing a VR game long enough helps you recognize sound cues: footsteps, the clang of weapons, the crack of gunshots. Your brain becomes conditioned to these sounds, giving them meaning and purpose beyond the game.

VR Army Training: Breach and Clear

True, it’s a game with fake bullets, fake enemies, and fake consequences. Yet I and other players are immersed in that new reality; when we hear a gunshot, we don’t think, “It’s not real, don’t worry.” We think, “Gunshots?! Where?! Am I safe? I need to find cover and flank the enemy!” We gain the reaction time to pounce on a dangerous situation and handle it, the knowledge of what to do in peril, and the instinct to act on it. These skills grow and grow, reinforced through muscle memory, until we know exactly what to do.

Ready Player One Future?

Ready Player One imagines a world where reality is embedded in VR: jobs, school, and meaningful interactions all happen in a world of games and characters. With the introduction of the Metaverse, where players can actually own property and host hangouts in a virtual world, that land of fiction might become reality. We are still a long way off, but imagine the immersive world it could become. The skills players learn in their games, the battle knowledge gathered from Blade & Sorcery and Breachers, could be used in fights settled in VR, maybe even wars. OK, maybe I am getting ahead of myself. But this world of VR never gets old. There are always “interesting variations” that haven’t been explored, weapons that were just modded and imported into the game that you haven’t tried, and maps you haven’t explored yet. The world awaits you in VR, and it’s never-ending!


Blurring Game and Reality: Horror Gaming in “Black Mirror”

I have been an avid viewer of Black Mirror for a while, the British sci-fi series that explores the darker implications of advanced technology. While I am certainly entertained by the show’s disquieting plots, I am simultaneously horrified by its proximity to our existing cyberspace concerns. In particular, the episode “Playtest” delves into the world of reality augmentation and the ethical implications of pushing boundaries in horror video gaming. 

The episode follows Cooper, our protagonist, who finds himself in desperate need of money while backpacking through Europe. To earn some quick cash, he takes a gig as a playtester for SaitoGemu’s new augmented reality game. The implantation of a brain chip facilitates a personalized VR-style horror experience, and Cooper is thrown into a mansion simulation filled with terrifying scenarios designed to evoke fear. The line between reality and game thus begins to blur, as he is now immersed in augmented worlds that feel just as real as the physical one.

Cooper’s consciousness is tricked into believing a false construction, raising the question of how hyperreality impacts our sense of self. Regardless of whether he is actually in the game world or not, his mind is under the impression that what he is experiencing poses a genuine threat to his wellbeing.

Perhaps this is all that it takes for something to be considered reality – the legitimate belief that what one is experiencing in that very moment is the truth. At one point in the game, Cooper even endures what appears to be intense physical pain, grounding our conceptions of reality in the corporeal form. This is complicated when (spoilers ahead) he is unable to exit the game, and the audience discovers he has died from an implant malfunction frying his brain. 

During my rewatching of “Playtest”, I was distinctly reminded of Jesper Juul’s guidelines for what constitutes a game, where a main component of the definition involves the consequences of the activity being both optional and negotiable. It’s difficult to imagine that Cooper was fully aware of the psychological duress he would be placed under, and given how unpredictable our consciousness can be, I wonder whether the fear the game capitalizes on and triggers is truly optional. This places SaitoGemu’s classification as a real game under fire, especially when we consider the safety of its design. 

As someone who despises the horror genre in any medium, the idea of willingly participating in something like SaitoGemu seems impossible for me. While my instinctive reaction is to wonder what the appeal could be, I understand that there’s an addictive and even enticing element of horror that resonates with people. As VR and simulation games grow scarily advanced, I worry these tools may be weaponized, especially if this technology is placed in the hands of those with questionable or malicious intent. The concept of corporations turning fear into a commodity is perhaps the most alarming takeaway, and it left me with an inexplicable sense of dread at the episode’s conclusion. I wonder to what extent we are willing to go in the name of entertainment, especially given how the technology in “Playtest” taps into the most intimate corners of the player’s mind.

This new frontier of gaming technology ultimately poses questions on the potential ethical ramifications of our future, ones that may arise sooner than we might even anticipate. For instance, the monetary incentive of participating in the SaitoGemu experiment mirrors some of our present-day socioeconomic inequities regarding who is most vulnerable, as was the case for Cooper in his financially desperate circumstance. Does the implementation of pioneering technology only exacerbate existing social dilemmas? Is it possible to break from them entirely?

Rachel Lee

Augmented Reality: The Future of Medicine

Augmented reality (AR) and virtual reality (VR) are gaining significant momentum as leading innovations in medical education and health care delivery. While virtual reality places the user in an entirely artificial world, augmented reality, also known as spatial computing, merges digital and physical spaces.

Use of augmented reality in the Operating Room for better visualization during abdominal surgery (Image courtesy of Medical Augmented Reality)

Augmented reality is an invaluable asset to health care professionals. It allows its users to stay in touch with reality, while providing fast and efficient transfer of information from multiple sensory modalities – think Pokemon Go. Such distinctive features have established augmented reality as a revolutionary necessity in the future of health care, during an era driven by technological innovation.

With the recent release of new mapping AR tasks, Pokemon Go is a leader in the conversation about VR/AR (Image via @PokemonGoApp on Twitter)

From video games to medical clinical practice, augmented reality has allowed its users to interact with simulated reality environments. One of my first introductions to the use of such technology in medicine was actually at the Wond’ry, Vanderbilt’s Center for Innovation and Design. Using an Oculus VR headset and the available equipment, I was able to explore further into the nooks and crannies of our bodies on Human Anatomy VR on the Oculus Quest. If you want to get a feel of how dumb I looked while trying to ~explore~ in the Oculus headset, watch this video!

Human Anatomy VR on the Oculus Quest (Video courtesy of Open PC Reviews on YouTube)

I was surprised to see the precision and amount of detail that could be conveyed using such technological tools, and could only imagine how differing applications of such technological advancements could allow for more immersive medical training and education for aspiring physicians. Currently, new potential applications of mixed reality in training future health care professionals include programs such as Microsoft’s HoloLens and Osso VR’s Surgical Training Platform. “Although there is no hard data, an increasing number of medical schools are using or considering these technologies”, says Warren Wiechmann, MD, Associate Dean of Clinical Science Education and Educational Technology at the University of California, Irvine, School of Medicine. Early adopters of these technologies in medical education have noted the varied benefits of AR and VR, including the opportunity for real-life experiences without real-life consequences.

Osso VR is a cost-effective alternative to the expensive and highly limited training opportunities currently offered (Image courtesy of Osso VR)

Mixed reality has already been advancing the medical field through a wide array of practical applications. Current uses of such technology allow for better visualization during invasive surgical procedures, detection of cancer through image recognition, and more accurate patient diagnoses and treatments as a result of advances in medical equipment. Last summer, I worked under a reconstructive plastics and microsurgery specialist to help adapt a mixed reality imaging software to diabetic free-flap reconstructive surgeries. The mixed reality imaging allowed for better pre-operative planning: we gained better visualization of the blood vessels and blood flow to the lower extremities of our patients, while also being able to precisely assess and quantify diabetic ulcer excision margins. Ultimately, through the application of such technology, we were able to further investigate methods that would allow physicians to preserve as much viable tissue as possible while significantly reducing the risk of future sores and ulcerations.

Launch of xvision™, the First Augmented Reality Guidance System for Surgery (Image courtesy of Augmedics)

Although the use of augmented reality, virtual reality, and other mixed reality media has enabled numerous medical advancements, we are only on the cusp of breakout ideas and their practical applications. Notably, as health costs continue to rise, technology such as augmented reality will play a significant role in helping to prevent and manage disease and treat patients, especially members of medically underserved communities. Outside of mixed reality, the increased use of technology can also help create more personalized and accessible medical care. I look forward to seeing how the use of such technology, by competent and compassionate physicians, will allow for more equitable delivery of high-quality, patient-centered care, while further improving patient outlooks through advancements in treatment modalities.

~ Tiffany Lee

Augmented Empathy: VR/AR’s Impact on Gamers

Game psychologists are looking to a relatively new gaming medium to explore the effects of in-game experiences on the real lives of gamers: virtual and augmented reality. According to the Virtual Reality Society, virtual reality gaming is “where a person can experience being in a three-dimensional environment and interact with that environment during a game.” In contrast, augmented reality gaming is “the integration of game visual and audio content with the user’s environment in real time. … While virtual reality games require specialized VR headsets, only some augmented reality systems use them.”

(Image: The Void, a company trying to make VR theme parks a reality)

What these two forms of new gaming have in common is the integration of the gamer into immersive storytelling. Rather than watching the effects of gameplay choices play out on a flat screen using a controller, the gamer becomes the controller and experiences the impact of their in-game decisions in real time.

In the case of augmented reality, gamers can even experience the impacts of their decisions on their real environment through a camera. This leads to a sensation gamers call TINAG, or “This Is Not A Game,” in which one of the main goals of the game is to deny and disguise the fact that it is even a game at all (Virtual Reality Society).

Because of the real-world, real-time feel, gamers often feel there are higher stakes to their in-game decisions. Game psychologists argue that “VR experiences can impact the empathy of their users and immediately translate to positive real world behavior.” One example of this comes from a study done on VR gamers who were instructed to cut down a virtual tree. After cutting down this tree in the game, the gamers used an average of 20% less paper in real life.

Another study suggests that the more a gamer immerses themselves in the environment of the game, the more likely their in-game choices are to affect their empathy outside of the game. For example, when a gamer picks and customizes an avatar, they often bring traits from their real life into their game life. This causes them to identify more strongly with their in-game persona and blurs the line that separates gaming from real life.


AR and VR games are the final frontier in eliminating that line completely. When your in-game character is no longer distinguishable from your true self, your choices in and outside of gameplay affect one another inherently.

The implications of this empathy-building through gaming are massive. Some game psychologists argue that it is the moral responsibility of AR/VR game developers to consider the empathic development of their gamers when creating storylines, often with a focus on empathy for other persons, animal rights, and the environment.

Whether or not you believe the onus of creating a more empathetic generation falls on game developers, the impact of these AR/VR games on the emotional development of gamers is undeniable and will likely only grow as the technology flourishes.

Kathleen Shea

https://www.sciencedirect.com/science/article/pii/S0747563214003999
https://www.vrs.org.uk/virtual-reality-games/what-is-vr-gaming.html
https://venturebeat.com/2018/09/24/augmented-reality-can-foster-empathy-and-games-can-take-advantage/
https://www.sciencedirect.com/science/article/pii/S0747563217305381

Inside Aech’s Basement: A VR Group Project

Our group project was to create various objects in Blender and incorporate them into the VR room that is Aech’s Basement from Ernest Cline’s novel, Ready Player One. Each group member brought their own individuality, which added unique insight, helpful suggestions, and, overall, great team camaraderie.

Below, each member of the group highlights their creation process and touches on how important intermediary group projects are to boosting confidence, gaining experience, and pushing academic boundaries.

But first, please enjoy this Game Trailer created by the invaluable Vincent H.:

https://youtu.be/StVIVT0FUZM

Cheers!

 

ENGL 3726-01 Final Group Project

Janelle O.

The Stages of Using Blender as a Tech Novice

 

Stage 1: Fear

giphy-7.gif

I’ll admit, I let my insecurities get the best of me.

Make a virtual reality object? Me? Someone who’s never worked on anything STEM related before? In a software that has millions of buttons to press and things to mess up?

giphy-8.gif

I watched countless YouTube tutorials and felt like the hosts were speaking in a foreign language. It was so difficult for me to believe that I could comprehend, and eventually implement, what they were telling me. The self-inflicted intimidation inevitably led to…

 

Stage 2: Frustration

giphy-3.gif

giphy-4.gif

giphy-6.gif

No commentary necessary.

But, if you were wondering how many times someone can restart a project on Blender and come thiiiiiiiiiiis close to throwing a computer across a room…

giphy-2.gif

On the brink of a complete meltdown, I took a step back and realized that I may have been giving Blender too much power. Were the tutorial gurus really speaking in a different language? Was it truly as difficult as I believed it to be? Absolutely not…and once I accepted that, it was smooth sailing toward…

 

Stage 3: Success

giphy-1.gif

Ultimately, completing my object brought me relief; relief that I would never have to use Blender again! Kidding…kinda.

I felt a wave of satisfaction and pride knowing that I proved myself wrong. I was able to use software I had never used before to make an object in a world I had never visited prior to this semester. This project allowed me to fine tune a skill I didn’t know I possessed in a field that intimidated me beyond belief.

Am I switching my major to Computer Science? Absolutely not.

Did I gain experience and insight in an important field? Yes.

Most importantly, did I learn and grow in ways I never could have imagined? Yes.

giphy.gif

 

Vincent H.

Skee-Ball Machine and Game Trailer

 

With limited experience in MATLAB and none in computer graphics, I cautiously approached this project, which proved to be an interesting venture into the world of 3D modeling.

Our task: supplement an existing virtual reality model of Aech’s basement based on Ernest Cline’s Ready Player One.

Since many of the objects explicitly named in the book had already been created, I needed to immerse myself in Cline’s world and extrapolate other plausible objects. I was inspired by his line: “Most of them were gathered around the row of old arcade games against the wall.” Though I’m not quite old enough to be an ’80s kid, waves of nostalgia still swept over me as I recalled running through a labyrinth of arcade games at Chuck E. Cheese’s as a child. Each corner was filled with vibrant lights that tempted me with the opportunity to win tickets. Stopping in front of a Skee-Ball machine, I found the objective refreshingly simple: whichever prize I earned would be a manifestation of my skill and not chance.

This childhood memory compelled me to create a Skee-Ball machine, and so I did.

Perspective view of Skee-Ball machine in Blender

I started by searching for various makes and models of Skee-Ball machines so I could build a historically accurate model; in doing so, I also learned more about the rich history of the game. Skee-Ball was invented in 1908. Aggressive marketing campaigns created an exciting buzz around the game, and it was eventually featured in various media outlets, including as a game called Superball on The Price Is Right.

After finding the ideal model to recreate, I began in Blender with a scaffold of rectangular blocks forming a vague table-like structure. Blender has a variety of tools for detail work, so after establishing the basic shape, I began adjusting the edges, gradually working toward the sleek, tapered model. After using bevel to create curved surfaces, knife to stencil point values on the backboard, and subdivide to generate the metal mesh, my Skee-Ball machine came to fruition and fit well in the room.

Clockwise from top left: (1) Model of vintage 1980s Skee-Ball machine for design accuracy, (2) Wireframe view of final object created in Blender, (3) Rendered view of Skee-Ball machine and ball in Blender, (4) Rendered view of Skee-Ball machine and ball in room environment created with Unity
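For readers curious what that block-out-then-refine workflow looks like outside the GUI, here is a minimal Blender Python (bpy) sketch of the same idea: rough out box shapes, then soften them with Bevel and Subdivision Surface modifiers. This is not Vincent’s actual file; the object names, proportions, and modifier values are invented for illustration.

```python
import bpy

# Block out a rough, table-like scaffold: a long cabinet box plus a backboard.
bpy.ops.mesh.primitive_cube_add(size=1, location=(0, 0, 0.5))
cabinet = bpy.context.active_object
cabinet.name = "SkeeBall_Cabinet"          # hypothetical name
cabinet.scale = (0.8, 3.0, 0.5)            # rough lane proportions (made up)

bpy.ops.mesh.primitive_cube_add(size=1, location=(0, 1.6, 1.4))
backboard = bpy.context.active_object
backboard.name = "SkeeBall_Backboard"
backboard.scale = (0.8, 0.1, 1.0)

# Soften the hard edges with a Bevel modifier, then smooth with Subdivision Surface.
for obj in (cabinet, backboard):
    bevel = obj.modifiers.new(name="Bevel", type='BEVEL')
    bevel.width = 0.03
    bevel.segments = 3
    subsurf = obj.modifiers.new(name="Subsurf", type='SUBSURF')
    subsurf.levels = 2
```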

The object at the end of this meticulous process was similar to the wireframe structure shown above, except for one important feature: it was colorless. I spent the next few days experimenting with different shaders in Blender to generate the matte texture and metallic luster of the machine’s frame. Once the frame was finished, I was stumped by the deceptively intricate and random texture of the felt on the machine’s surface. Luckily, a trove of insightful guidance and templates was available on the internet, so I found a relevant node map and adapted it for my use.

Node map of blending shaders to create colors and textures
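As a rough illustration of what such a shader setup boils down to, the following bpy snippet builds two simple materials on Blender’s Principled BSDF: one metallic and low-roughness for the frame, one matte for painted panels. The actual node map in the caption above was far more involved (the felt texture is not attempted here), and the names, colors, and values below are made up.

```python
import bpy

# A metallic, low-roughness material approximating the machine frame.
frame_mat = bpy.data.materials.new(name="FrameMetal")   # illustrative name
frame_mat.use_nodes = True
bsdf = frame_mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Metallic"].default_value = 1.0
bsdf.inputs["Roughness"].default_value = 0.25
bsdf.inputs["Base Color"].default_value = (0.6, 0.1, 0.1, 1.0)  # RGBA

# A second, matte material for painted panels: no metallic, high roughness.
panel_mat = bpy.data.materials.new(name="PanelMatte")
panel_mat.use_nodes = True
bsdf = panel_mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Metallic"].default_value = 0.0
bsdf.inputs["Roughness"].default_value = 0.9
bsdf.inputs["Base Color"].default_value = (0.15, 0.25, 0.6, 1.0)

# Assign the frame material to the active mesh object, if there is one.
obj = bpy.context.active_object
if obj is not None and obj.type == 'MESH':
    obj.data.materials.append(frame_mat)
```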

After a few finishing touches, I exported the object and passed it to Vivian for the final step: uploading to Unity. This project gave me a profound appreciation for the computer graphics all around us, increasingly seen in movies. Although the true experts with years of experience are capable of creating models nigh indistinguishable from real-life objects, anyone with the dedication to learn can become proficient within a month. As a STEM major, I truly valued this rare opportunity to exhibit artistic creativity and learn cross-disciplinary skills in an epic quest to remediate Ready Player One.

 

Robert W.

The Music

 

My responsibilities differed a little from those of the other group members. My time was split pretty evenly between three programs: Blender, Finale, and Cubase. I wanted to add a little aural spice to the otherwise silent basement. Since a large portion of Aech’s and Parzival’s time is spent playing video games, I thought video game music would make the most sense as an incidental soundtrack. One of the only games the author explicitly mentions as a favorite of Aech and Parzival (which also has a decent soundtrack) is Golden Axe.

My 3D model of the Golden Axe Genesis cartridge

I turned to that soundtrack as source material for my synth work. I arranged a piece of music from the game, which took several hours of transcription and input in the notation software Finale (the only piece of software I was already familiar with).

The Finale file of my arrangement

From there, I exported the midi file into the digital audio workstation Cubase. Cubase is where I got to transform my generic midi file into a slightly more interesting electronic piece.

Raw import of my Finale file in the Cubase DAW

I wanted my percussive sounds to emulate those of a Sega Genesis system, so I worked with the virtual instrument VOPM, a digital synth designed specifically to emulate a Sega Genesis sound chip. My work with this synth proved a unique challenge: you must describe the sound in your head in terms of attack time, attack delay, reverb, detune, modular shape, and so on in order to create the instrument sound you desire.

VOPM interface
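To give a feel for what those parameters mean in practice, here is a toy amplitude envelope in plain Python. It has nothing to do with VOPM’s actual FM engine; it only shows how a handful of timing and level numbers, like the attack and decay mentioned above, shape a note over time. All values are invented.

```python
def adsr_envelope(t, attack=0.05, decay=0.10, sustain=0.7, release=0.3, note_length=1.0):
    """Toy ADSR amplitude envelope: returns a gain in [0, 1] at time t (seconds).

    The parameter names mirror the kind of knobs a synth exposes; the math is
    deliberately simplified (linear ramps) and purely illustrative.
    """
    if t < 0:
        return 0.0
    if t < attack:                        # ramp up from silence to full level
        return t / attack
    if t < attack + decay:                # fall from the peak to the sustain level
        frac = (t - attack) / decay
        return 1.0 - frac * (1.0 - sustain)
    if t < note_length:                   # hold while the key is down
        return sustain
    if t < note_length + release:         # fade out after the key is released
        frac = (t - note_length) / release
        return sustain * (1.0 - frac)
    return 0.0


# Sample the envelope a few times to see its shape.
for t in (0.0, 0.025, 0.1, 0.5, 1.1, 1.4):
    print(f"t={t:>5.3f}s  gain={adsr_envelope(t):.2f}")
```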

In order to add a unique, slightly more palatable character to my arrangement, I used some virtual instruments created in the Spector digital synth, which is a more modern and practical plugin than VOPM.

Spector synth. Fun fact: the total cost of the software I used for this project exceeds $1,000. Thankfully, Blair owns all the software, so I didn’t have to foot that bill.

Even though music is the focus of my degree, my engagement with Cubase and electronic music in general has been limited. Nearly every step of this project (outside of arranging the original track) was a new and valuable experience for me.

 

Wooseong C.

Modeling and UV Mapping  

 

I came into this project without even knowing what Blender was. When I opened the software for the first time, I was overwhelmed by the number of different tools and view modes the software offers.  I mainly learned from Youtube videos that showed the step-by-step process for making a 3D object. During the process of learning to use this software, I learned two Blender fundamentals – modeling and UV mapping.

Modeling simply refers to crafting the shape of your object to match that of the real-world version. This often requires you to have a second window with a picture of the real world object you can refer to:

Screen Shot 2018-04-23 at 11.51.37 AM.pngModeling an open magazine – I chose to create comic books to add to our VR version of Aech’s basement. I wanted to have a comic book that was open to add to the “realistic” aspect

Having a reference is really helpful, as it allows you to be more detailed and accurate. The process of making an open comic book entailed making a plane, extruding edges to create the folded part and the curvature indicated by the red arrows (above), and adding the Subdivision Surface and Solidify modifiers.
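For anyone who would rather try those steps from a script than from the GUI, here is a rough Blender Python (bpy) sketch of the same idea: start from a plane, dip the spine slightly, then add the Subdivision Surface and Solidify modifiers. The geometry, names, and dimensions are invented for illustration and are not the actual model.

```python
import bpy

# Start from a flat plane roughly the size of an open comic book.
bpy.ops.mesh.primitive_plane_add(size=1, location=(0, 0, 0))
comic = bpy.context.active_object
comic.name = "OpenComic"            # illustrative name
comic.scale = (0.26, 0.17, 1.0)     # ~26 cm x 17 cm spread (made-up dimensions)

# Subdivide once so there is a column of vertices down the middle to act as
# the spine, then dip it slightly so the pages curve like an open magazine.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=1)
bpy.ops.object.mode_set(mode='OBJECT')
for v in comic.data.vertices:
    if abs(v.co.x) < 1e-4:          # vertices along the spine (local x = 0)
        v.co.z -= 0.01              # a small dip at the fold

# Smooth the curvature and give the paper some thickness.
subsurf = comic.modifiers.new(name="Subsurf", type='SUBSURF')
subsurf.levels = 2
solidify = comic.modifiers.new(name="Solidify", type='SOLIDIFY')
solidify.thickness = 0.004          # a few millimetres of "paper"
```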

Another blender skill I learned was UV mapping. This step allows the user to map an image acquired online onto his/her object. This allowed us to map images onto our posters and comic books:

Screen Shot 2018-04-23 at 11.51.45 AM.png

For the Astrosmash cartridge, additional steps were needed during UV mapping. Because I needed to apply different images to the different surfaces of the object, I had to incorporate Photoshop. The steps entailed unwrapping the 3D object into a 2D map, exporting this UV map to Photoshop, adding the online pictures to the Photoshop file, and importing the new UV map back into Blender.
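Below is a hedged bpy sketch of the Blender side of that round trip: unwrap the active object, then (after the layout has been painted over in Photoshop) load the finished image back in and wire it to the material’s Base Color. The material name and file path are placeholders, not the project’s real assets.

```python
import bpy

obj = bpy.context.active_object      # e.g. the cartridge or comic-book mesh

# 1. Unwrap the mesh into 2D UV coordinates.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()           # quick automatic unwrap
bpy.ops.object.mode_set(mode='OBJECT')
# (In the GUI, UV > Export UV Layout writes out the flattened map you would
#  paint over in Photoshop before loading it back in below.)

# 2. Load the painted image and wire it to the material's Base Color input.
mat = bpy.data.materials.new(name="CartridgeLabel")             # placeholder name
mat.use_nodes = True
nodes = mat.node_tree.nodes
tex = nodes.new(type='ShaderNodeTexImage')
tex.image = bpy.data.images.load("/tmp/cartridge_painted.png")  # placeholder path
bsdf = nodes["Principled BSDF"]
mat.node_tree.links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])
obj.data.materials.append(mat)
```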

Overall, learning to use Blender to add objects to our VR representation of Aech’s basement was a very valuable experience. Although the initial learning stage was difficult, I can now use Blender at a beginner level. I now appreciate the vast array of options and tools that overwhelmed me in the beginning, because it reflects the infinite range of possibilities one has to create characters and objects. I am very glad that I took this course during my last semester at Vanderbilt. It was different from all the other courses I have taken, going beyond traditional essays and lectures to create a more hands-on learning experience. This final project gave me the appreciation and the confidence to continue using programs like Blender and Photoshop in the future.

 

Vivian L.

Uniting All Elements in Unity

My role in the Aech’s Basement project group was to take each of my other group members’ separate Blender 3D objects, along with any other creations they worked on, and implement them into the Unity Scene, ensuring everything looked as intended and worked properly. Another portion of my job was to raise the almost non-existent level of interactivity in the room. Prior to this semester’s final project, the player could not move within the room at all and there was no way to touch or pick up objects, and I desperately wanted to change this.

At the beginning of this project, while my other group members worked on creating their assets, I researched and tested different ways to allow camera and body movement in VR. I looked into Unity’s Oculus Rift support pages and watched many YouTube explanations of how to track the headset and what the sensor controllers were called when used in the Scene. My initial plan was to create a system in which a camera was affixed to a Capsule 3D object representing the player. The capsule would then rotate and translate itself according to the detected headset movements and the Oculus Rift controller joystick input. To create the hands, I wanted to move two Sphere Colliders to wherever the touch controllers were sensed. Lastly, to simulate picking up objects: whenever a hand’s Sphere Collider hit an object that was meant to be moved and had a collider on it, and the grip trigger on that hand was held down, the object would follow the Sphere Collider’s movement.
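The actual implementation lives in Unity C# with real colliders and controller input, but the grab logic itself is simple enough to sketch in a few lines of plain, engine-free Python. In the sketch below, a distance test stands in for the sphere-collider overlap and a boolean stands in for the grip trigger; every name and number is invented for illustration.

```python
from __future__ import annotations
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def dist(a: Vec3, b: Vec3) -> float:
    """Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

@dataclass
class Grabbable:
    name: str
    position: Vec3

@dataclass
class Hand:
    position: Vec3
    radius: float = 0.08          # stand-in for the hand's sphere collider, metres (made up)
    held: Grabbable | None = None

    def update(self, new_position: Vec3, grip_held: bool, objects: list[Grabbable]) -> None:
        """Move the hand; grab, carry, or release objects based on the grip trigger."""
        self.position = new_position
        if not grip_held:
            self.held = None                     # releasing the grip drops the object
            return
        if self.held is None:
            # Grab the first object whose position falls inside the hand sphere.
            for obj in objects:
                if dist(self.position, obj.position) <= self.radius:
                    self.held = obj
                    break
        if self.held is not None:
            self.held.position = self.position   # the held object follows the hand

# Tiny usage example: the hand sweeps toward a Coke can with the grip held.
can = Grabbable("coke_can", (0.10, 0.0, 0.0))
hand = Hand(position=(0.0, 0.0, 0.0))
hand.update((0.30, 0.0, 0.0), grip_held=True, objects=[can])   # too far away: nothing grabbed
hand.update((0.09, 0.0, 0.0), grip_held=True, objects=[can])   # overlap: the can is grabbed
hand.update((0.20, 0.1, 0.0), grip_held=True, objects=[can])   # the can follows the hand
print(can.position)   # (0.2, 0.1, 0.0)
```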

Screen Shot 2018-04-24 at 10.42.45 PM.png

Sounds fairly simple, right? I had initially thought so too.

However, I soon realized that although I had the thought process down, I had no idea how to actually code it into the room. I had scarcely any experience writing C# code, which is the primary language in Unity, or using any of its numerous class libraries. I was also still piecing together how controller input is read in Unity, and was largely unsure of how to read the physical headset location. Another difficulty was the inability to test my code outside of the VR space in the Wond’ry.

Screen Shot 2018-04-24 at 10.43.59 PM.png

After another week or so of struggling to write my own interactive player in VR, I decided to ask Dr. Molvig for help, since he was my professor last semester in the class Virtual Reality for Interdisciplinary Applications. He showed me many helpful websites and more videos on VR player bodies, especially Unity’s own player model. Following tutorials, I was able to put the model into the scene, but it didn’t quite work as expected.

#1 tip for all things Comp Sci: It never works like you expected

In fact, it crashed the game many times, and even when I got the Scene to play, my attempt to recreate the hands exactly as shown in a video I was recommended was ruined by the fact that the controllers did not seem to be tracked in the scene at all! I believe this was partially due to last semester’s attempt at a teleportation system, which was quite ineffective, and the resulting tampering with the Scene files.

Screen Shot 2018-04-24 at 10.45.22 PM.png

This led to another large difficulty in my project. Since Unity does not allow copying Scene objects with their existing properties into a separate project, I had to choose between spending more time trying to fix the existing Scene or creating an entirely new Scene and copying all of the objects into it by hand. Given all of the time already spent trying to bugfix whatever was going on in the original Scene, I decided that starting over might be my only choice if I wanted movement and object interactivity to be part of the room. I spent many days re-importing and organizing every detail of the room, testing it along the way to make sure that body and hand movement still worked.

Once I had finally recreated the original room, I held my breath when hitting the “Play” button one more time. It worked! The camera moved, my hands moved, and my body moved! I was filled with relief, but this was only the first step.

It was time to put the efforts of my teammates into the Scene. Altogether, we had quite a few assets to put into the room: comic books, a Skee-Ball machine, posters, a VCR player, game cartridges, and more. (I’m sure my teammates could tell you a lot more than I can about the specifics of their objects!) I also added a few Coke cans, mainly for hand and object collision testing.

A few things happened when I tried moving the objects into the Scene.

Screen Shot 2018-04-24 at 10.49.30 PM.png

  1. Objects were completely devoid of color.
  2. They were ginormous.

I knew I had to fix the color issue, but the size was not a problem; I could scale it at will. Adding the correct materials was a separate hurdle. I researched what each of the little options on the objects did, and realized that clicking a button labelled “Lightmap Static” exposed many options, one of which was to assign materials to certain surfaces of the object. This meant that I would first need a material to assign, though. My teammates largely used pictures found online to wrap around their objects, so I looked up how to create Unity materials from images. Once I figured out how to do that, I realized that objects that were supposed to have separate images on each face, like the Betamax VCR player, had only one surface on which to put the material. Without a proper UV map, I got around this issue by creating the object out of six quads, each representing a side, and assigning a separate material to each of them. This worked nicely for neat geometrical objects, but I worried what would happen with more complex ones.

Thankfully, my teammates supplied me with UV maps that worked like magic for some of the other objects. For others, I simply looked up images online and utilized those. Sizing was done as realistically as possible.

Eventually, every asset created was put into the Scene with the correct coloring and sizing. I was very happy with the results, and seeing everyone’s work in the space, in VR, amazed me. Lastly, I was to put Robert’s Golden Axe music into the scene. I decided to create an Audio Source centered at the TV, attached the Golden Axe audio clip to it, and adjusted the radii that set the distance at which the volume plays normally and the distance at which it completely fades out.

Screen Shot 2018-04-24 at 10.53.14 PM.png

This marked the end of incorporating each team member’s object (and music) into the final room. Aech’s Basement is far from complete, but progress is progress, and each of us had our own learning experience completing our part.

Screen Shot 2018-04-24 at 10.54.41 PM.png

The Virtual School, the Better Choice?

Dr. James L. Moore

 

 

As an education major at Vanderbilt’s Peabody College of Education and Human Development, I have a particular interest in the different ways students can learn. When my class read Ready Player One by Ernest Cline, there were parts of the novel that described the different ways in which the main character, Wade Watts (avatar Parzival online), learned and studied in a virtual world called the OASIS. What was most interesting was the apparent dichotomy between learning in a physical setting and learning in a virtual one. As you will come to learn in the opening chapter, the OASIS emphasizes virtual components as the main vehicle for getting almost anything done. Along with currency and communication being almost completely virtual, the schools seem to be as well. Take a look at a quote from the beginning of the book:

“I was more or less raised by the OASIS’s interactive educational programs, which any kid could access for free. I spent a big chunk of my childhood hanging out in a virtual-reality simulation of Sesame Street, singing songs with friendly Muppets and playing interactive games that taught me how to walk, talk, add, subtract, read, write, and share” (Cline 15).

This quote points out the power of virtual reality in promoting the healthy social and emotional development of students at an early age. Through the main character’s memories of Sesame Street, an award-winning program featuring Jim Henson’s Muppets and, in my opinion, one of the highest-quality children’s programs of all time, Cline suggests how certain elementary concepts and lessons on social skills could be learned at a more interactive, much more tailored pace through virtual education. Let’s take a look at a video that relates to this (Video #65 on the list):

After watching this, you may recall Dr. James L. Moore III remarking that “We need schools that are student centered and always factor in the human element.” This is juxtaposed with the idea of virtual learning as the primary way of learning and understanding in the OASIS. Which is better: virtual learning that gives an extremely tailored learning experience to almost all students, or more expensive individualized learning with the physical presence of a teacher?

Harassment in VR Spaces

(Spoiler warning for Ready Player One by Ernest Cline in first two paragraphs. Links contain sensitive content relating to sexual harassment in online/gaming communities.)

Ready Player One: 80’s nostalgia trip, celebration of gamer culture, cyberpunk dystopia, hero’s quest, and – teenage love story? I’ll admit, I haven’t finished the book yet, but from the beginning our protagonist Wade Watts/Parzival is smitten with Art3mis, a fellow gunter and popular online personality. He even has pictures of her (or at least, her avatar) saved on his hard drive. When he first encounters Art3mis in the Tomb of Horrors, he gives her advice on how to beat the lich king. Once they’re both High Five celebrities on the famed scoreboard, they begin a casual romance. Art3mis breaks it off when she feels their time together has become too much of a distraction from the hunt. Parzival, lovesick, sends her messages she leaves unread, sends flowers, and stands outside her virtual castle with a boombox: part persistent “good guy,” part slightly creepy stalker.

ready_player_one_cover

Ready Player One by Ernest Cline

Ready Player One has not (yet) examined gender politics in the OASIS, but it acknowledges the age-old mantra: “There are no girls on the Internet.” Even in a world where virtually the entire population uses the OASIS and takes part in a game event with a massive prize, the default user is presumed male. Parzival persistently questions Art3mis’s gender until he is assured that she’s “actually” female, accusations that Art3mis takes with good humor.

(Spoilers end here.)

But as we all know, there are women on the internet and in the gaming world, and they have been there since the beginning – even when the climate is hostile. Shortly after starting Ready Player One I found this article about the writer Jordan Belamire’s experience with sexual harassment in virtual reality. Despite all players having identical avatars, another player recognized her voice as female and followed her around attempting to touch her avatar inappropriately. She finally exited the game. The game’s developers were shocked and dismayed when they heard of the incident and in response developed an in-game “power gesture” that creates a privacy bubble around the player. They hope that other virtual reality developers will take harassment into consideration when designing their games. Online or in-game harassment is nothing new, but as we pioneer exciting new platforms and experiences, it continues to be a thorn in the community’s side.

Ready Player One might take place in the distant dystopian future, but in characters’ interactions with each other the culture seems closest to the Wild West of the 2000s internet – complete with flame wars and skepticism on women’s presence in the OASIS. Presumably, harassment continues to be an issue in this brave new world of the OASIS – but is the response closer to QuiVr’s developer-implemented “power gesture,” or the old advice of “just ignore it and it will go away?” Perhaps it isn’t even a talking point in the OASIS’s community – why worry about it when, after all, there are no girls on the internet?

What do you think of QuiVr developers’ response in implementing the power gesture? Do you think that this is a valid solution, or do you believe it is too much/too little?  What responses to harassment have you seen on other platforms and games?

Virtual Hopes

VR is an exciting way to experience media more immersively, although it still has a long way to go before it is truly available for everyone to enjoy in their daily lives. This is largely because of the cost of a single setup, even before you buy any games or interactive experiences for your headset. You can either have no interaction with your environment other than turning your head, or you can have a fully immersive experience that costs a ton. Another major setback is that these expensive setups that track your movements are not always very accurate, which was a problem we ran into while solving a puzzle in our first experience with the HTC Vive. We were far enough from the walls and close enough to an object in the game that we should have been able to pick it up, but the tracking system believed we were much closer to the wall and prevented us from grabbing the object until other people in the room moved around and the tracking started working correctly again. It is also rather obvious that you have a screen right in front of your eyes no matter which virtual reality setup you are using, and depending on how clear the resolution is and how the screen is built, it can become hard to watch very quickly.

Even with these limitations, there is a lot of room for VR to expand in videos, games, and simulations for educational purposes. For example, it would be valuable if doctors could practice surgeries in virtual reality so they don’t need cadavers every time and can practice over and over on different representations of people’s bodies. Personally, I would like to see VR improve its tracking capabilities so that it becomes more immersive and can truly simulate real-world experiences.

VR has already explored many concepts and styles of play by transforming regular three-dimensional media into something you can stand in the middle of, where you feel like you are actually interacting with your environment rather than sitting in front of a screen where you can’t touch any of the objects surrounding you. There are many VR experiences that let you do things you wouldn’t be able to do in real life, such as climbing Mount Everest or becoming a bunny in an animation. Experiences like walking a plank at great height can even let people face the things that terrify them without any real danger. VR can also transform games that started out as PC games into immersive experiences, allowing you to become a surgeon or play Fruit Ninja in almost real life. A great side effect of games in virtual reality is that they get you active, letting you practice archery or tennis without ever having to go far or find a gym. And if you want to play sports with friends or strangers around the world, you can do that as well, though you can’t play with friends who do not own their own VR setup. Virtual reality can even place you in completely impossible environments with an animated, drawn, or dreamlike feeling.

These are all really cool advances that demonstrate how VR can be used socially, actively, or in a dreamlike environment to enhance the way you experience a piece of media. Still, the tracking and visibility are not quite at the level they would need to be for VR to be used in a truly educational sense for surgery and other applications. Once those advances are made and the price comes down to an accessible level, everyone will truly be able to experience and enjoy virtual reality.

Pokemon GO! The Ethics of Augmented Reality

Pokemon Go is the new gaming phenomenon of the year. Revisiting the old-fashioned Nintendo Pokemon games, Pokemon Go takes that same experience up a level by adding augmented reality. By letting people walk around the planet with their phones and search for Pokemon, the game has added a new dimension to gaming. Along with that, one of its main selling points is its focus on fitness. A recent statistic stated that since the game was released earlier this year, players have walked about 4.6 billion kilometers. To put that into perspective, that’s more than the distance from the Sun to Neptune and more than the distance NASA’s Voyager 1 has travelled in the past 12 years!

However, this achievement does not come without problems. The game has several positive aspects: people are being more active, have started going out more (even though they are still looking at their screens), and are meeting new people (I myself have made a couple of friends while playing). But there have been quite a few concerns regarding trespassing. People have often been reported walking into other people’s private residences while trying to catch a particular Pokemon. While Niantic, the game’s developer, is completely on the legal side of this issue, questions about the company’s responsibility for the actions of its players have started emerging. The popularity of the game has pushed the company into new ethical and legal territory that has never been dealt with before; and with the fast-developing world of augmented reality, such issues are going to become more frequent as new games implementing this technology are released. While some people say that players are completely responsible for their actions and how they play the game, many suggest that the game in some ways encourages players to trespass into restricted areas, or at restricted times, through where the PokeStops are located and where many Pokemon are found.

Public places like monuments or parks are ideal locations to play games such as these, so Niantic often focuses on such areas by providing more Gyms and PokeStops, in a way encouraging players to come to those locations more often. Niantic has received requests from several organizations to remove PokeStops near their establishments, and so far it has complied. But the question of whether Niantic is responsible or not is still unanswered.

In my opinion, both parties are to some extent to blame. Neither is completely wrong, but since this is a new field of ethical gaming and technology, new rules must be put into play. So far, there are no limitations on where one can place digital markers in the real world, but now that augmented reality is becoming a… reality, we need new laws or rules to govern it. The lack of limitations on where Niantic puts its Gyms and PokeStops often leads people into unknown territory. As far as the players are concerned, they should ideally pay more attention to where they are walking and be more aware of their surroundings, but the fact that playing the game requires you to keep looking at your phone’s screen is not really helpful. Niantic has made some efforts to reduce screen time by introducing apps for wearable devices such as the Apple Watch, but this is still not a complete solution. I’m sure that as more developers start implementing AR in their games, new laws governing the use of digital space will emerge, but until then all we can do is stay aware of our surroundings while playing until we are offered a satisfying solution.

 

Throwback Thursday

Playing the Sims was an instant throwback to my elementary school days. My friends and I would load the CD into the computer and wait anxiously, huddled around my desktop, for the game to load. But times have changed! Emily showed up to class with her laptop and simply clicked for the game to start. Almost instantaneously we were transported into a virtual reality. First off, we created our Sim character. The number of options available for making a human is astronomical! I could not believe all of the different choices. It is no longer as simple as choosing hair and eye color; we could choose body weight and muscle definition, whether or not they have freckles, even the shape of their eyebrows. Does this really matter when playing a video game? After a while it became tedious. It seemed so superficial that in a video game we have to care about outward appearances. Not only did we have to choose outward appearances, but we also had to pick their interests, personality, and life goals. Personally, I don’t really care if my Sim wants to be a five-star chef or an international spy. Finally, we resorted to choosing characters created by the randomization tool, since making each separate character was so time consuming. Once we got past making characters, we had to choose a house and furnish it. While this part was entertaining, I found there were too many options. I’m indecisive in my own real life, and it almost felt like the video game was mocking me! We didn’t get to play much of the action part because we ran out of time, but I’m looking forward to playing in and creating a virtual world.

Molly Steckler