This review takes an unconventional approach because the item is not actually on the market yet. However, upon its release (supposedly sometime in 2019), it could change how we interact with virtual reality. The company Teslasuit is at the forefront of full-body haptic suits today. Its website currently states that it is at a stage in production where it is only releasing development units to game developers, so that VR developers have a chance to test their games with a haptic suit and make sure it is calibrated correctly. The suit does not yet have a set price; however, the company's brief Kickstarter campaign listed it at $1,500, a pretty steep price point.
If you aren’t familiar with the haptic suit concept, it is essentially a device that delivers real sensations designed to mimic environments in the digital space. This process is triggered by haptic sensors inside the suit that stimulate muscle groups using electrical feedback, meaning that a person wearing the suit can feel a simulation of any bump, push, or shock that occurs in the digital space. The company states that the Teslasuit will in no way actually hurt the user or disrupt the muscle groups; it only uses electrical impulses to simulate the natural responses the muscles already produce. The Teslasuit is not only designed for haptic sensing: it also offers climate control, motion capture, and biometric systems. Among these systems, the suit can track the user’s heart rate, stress levels, and mental/emotional state, allowing more data to flow from the user to the developer so the experience can be enhanced and adapted to how the user feels.
The Teslasuit is equipped with 68 channels throughout the suit to deliver electrical stimulation, and the company is working on expanding that number to further increase the immersive experience. Along with the haptic sensing, the climate control lets the user feel the intended climate of a virtual space, adjusting the suit’s temperature (heating and cooling) to best simulate the virtual environment. The suit is said to have a 10-hour battery life, is completely wireless (connecting via Bluetooth or WiFi), and, on top of all that, is even washable.
While I believe this suit will revolutionize the gaming industry, I find that this sort of device will be invaluable on the business side of things, allowing for more realistic hands-on training. The company released a blog post discussing VR training demos for different industries. The article goes into the various industries where software and equipment like this could be used, such as astronaut VR training, emergency evacuations, and even oil-loading ramp operations. This kind of tech could also be used in the medical field to help train doctors, giving them more hands-on experience in the surgical room instead of working on real people. Should this device make it to the commercial market, it will forever change our interactions with the digital spaces we know, and I, for one, look forward to seeing it.
Creating 3D assets for video games and movies is incredibly expensive and time consuming, so it is no wonder that most VR experiences don’t offer personalized content. With advancements in AI and machine learning, however, those time-consuming tasks can be delegated to computers so that developers can build personalized experiences quickly. AiSolve is a company working to make immersive and interactive learning modules by combining machine learning and VR.
AiSolve recognized the potential for VR to help people learn without having to purchase expensive equipment and classes. It offers a variety of courses covering the basics of pediatric emergency room healthcare, manufacturing, and education. By applying machine learning to handle the mundane tasks, developers can quickly and effectively build curricula tailor-made for specific companies.
AiSolve, with the help of VRsims, a proprietary virtual reality company, has earned learning awards for its first VR experience. The simulation helped workers learn how to plumb without all the equipment. Not only does AiSolve’s machine learning help create smart and interactive content quickly, but it also collects analytics from the game to help advance the user’s learning progress. As users progress through the program, they grow from beginners into advanced users.
VRsims is used at the Los Angeles Children’s Hospital to teach healthcare students how to react to high-stakes situations in the workplace. Students learn how to diagnose and treat symptoms through the immersive lens of VR. These experiences set them up for success in the workplace and teach them how to stay calm under pressure.
Additionally, AiSolve and VRsims have created a simulation platform that teaches airport security personnel to spot and react to dangerous situations. Workers can practice identifying dangerous weapons hidden in luggage.
AiSolve’s products and services are breaking new ground by using machine learning both to aid the development process and to track users’ progress as they take the courses. This makes experiences faster to ship, more realistic, and more personalized. The company is already working with partners such as Sony, Facebook, and Oculus and hopes to continue that success.
Ever wondered what it is like to be a surgeon? Well, Bossa Studios has a VR game that lets you act as a surgeon and virtually perform surgery on a patient. The game, Surgeon Simulator, is available through their website: http://surgeonsim.com
In the video above, a real surgeon attempts to perform surgery through VR. The game includes different surgeries, such as heart and brain transplants, and grades you based on your performance and the time it took to finish. Dr. Johnson C Lee, the surgeon in the video, talks about how surgery should focus on the patient’s quality of life as well as the quality of the surgery itself, and he exemplifies that during gameplay.
There is even a Donald Trump Surgeon Simulator! Overall, the simulator is a great use of VR because it allows the player to actually interact with objects and get a sense of what a surgeon does during surgery. Can it be used to teach medical students? Possibly, but it cannot be the only resource. It was a fun game, but in all seriousness, medical students need expertise and knowledge beyond what the VR experience offers. Perhaps if the game were geared toward medical students, with added tutorials, explanations, and textbook information, it could serve as a genuine teaching tool.
I think using a VR gaming headset is the best way to experience this game because it allows you to physically be in that world. The iPhone app has limitations: because controllers cannot be used, it is basically just a tap-and-scroll game.
As a longtime fan of the rap genre, I decided to take a look at a VR documentary on world-renowned rapper Marshall Mathers, professionally known as Eminem. Eminem is a multiple-time Grammy Award-winning artist and one of music’s most influential and prolific individuals; according to Rolling Stone, he is even regarded as one of the 100 Greatest Artists of all time.
This piece was created by the great minds of Felix & Paul Studios. Félix Lajeunesse and Paul Raphaël are filmmakers and visual artists. Before teaming up, Félix worked as a director of film, commercials, and immersive-video installations, while Paul’s passions lay in cinematic storytelling, visual effects, and technology; together they have developed multiple award-winning stereoscopic 3D films and large-scale multimedia installations. The pair collaborated with film editor and director Caleb Swain to help bring Eminem’s story to life.
Marshall from Detroit is a full 21-minute VR experience. The short film originally premiered at the 2019 Sundance Film Festival’s New Frontier exhibition but shortly after became available on Oculus platforms. The experience takes the viewer on a winter’s night ride around Detroit alongside Eminem and journalist/interviewer Sway Calloway. Throughout the ride, you are taken to scenes within the city while Sway and Em discuss significant events and how both Eminem and Detroit have changed over time.
If you take this documentary at face value, the biggest appeal is the chance to feel as though you are in a car with Eminem; you’d be pleasantly surprised to find that the details of this story go beyond a one-dimensional immersive experience.
The story uses the same narrative structure throughout: Sway asks Eminem a series of interview questions about Detroit and about Eminem’s life as they cruise through the streets. As the viewer, you are given shots within and on top of the vehicle, plus a select few special scenes that truly need to be witnessed in VR to be fully appreciated. One element of production that adds to the immersive feel is the use of 3D audio. When you turn your head in the car to look out the window, you hear Eminem talking beside you; when you turn to look directly at him or Sway, it feels like you personally are right back in the conversation.
The most compelling thing about this documentary was not any particular segment but how well the entire project was put together. The video quality is stellar; it really feels like the subjects are in the room with you, with details as sharp as consumer VR allows. The story was not only appropriate for VR; it revealed itself as a story that needed VR to be told well. Seeing the scenery of Detroit, coupled with the accompanying sounds, makes the viewer feel like a part of the story rather than someone merely being told one. The emotional impact of Eminem’s responses was enhanced by the images and sounds that accompanied them. The flow was heavily driven by both the location and the story, something all experiences strive for but don’t always achieve. In conclusion, I have to recommend this film for reasons that go beyond the story itself. The VR techniques alone, along with the premium storytelling, make this a worthwhile time investment. It serves as a great example of a VR story done well.
The Meg: Submersive VR Experience is a video that Warner Bros. Pictures uploaded to YouTube. It puts the viewer deep in the ocean with the largest prehistoric shark ever to exist, letting you explore the ocean setting of the movie “The Meg” and get up close to the shark.
The best part of this video is the storytelling. You start off in the ocean as a diver, seeing nothing besides the submersible. The storytelling begins as soon as the crew starts talking to you. Suddenly you see a shark in the distance, and the crew tells you not to worry because it is a small one. You can hear your heartbeat getting faster and faster as it comes closer. Then, out of nowhere, you see another shark, at least 75 feet long, coming toward you. As it approaches, the crew in your earpiece starts to lose signal, and you can hear the worry in their voices. The storytelling ends when the crew, sounding scared, tells you to swim up now. Just as you start to move up, the meg comes out of nowhere and attacks. It ends with a black screen and no sound, making you believe the shark just killed you.
The voices in this VR experience helped with the storytelling and made it more intense. Without these voices in the background, I feel the experience would not be as effective. The storytelling allows you to truly feel as if you are a diver in the movie.
The emotion in this VR experience can be described as suspenseful. The sound of the heart beating faster and faster gives the viewer more anxiety about being alone in the ocean. I personally felt that watching this without sound had a different effect on my emotions than watching it with sound. Along with the sound, the storytelling in this experience also affected my emotions. The crew talking in my ear had an alarmed tone in their voices. The storytelling had conflict in it, which made me feel tenser, left me questioning what was going to happen, and heightened my fear. Even though this storytelling was narrative driven, I believe it could also be considered experience driven, because it deepened my experience of being in the ocean (a location narrative) with no one around me besides the crew talking in my ear.
I thought this experience was extremely appropriate for VR. I personally do not think it would have the same effect as a normal video. I decided to test this and watched it without the 360° effect. Although I still jumped at the moments the shark came up to the screen, the experience was not as successful as in VR. In VR, I was able to feel like I was actually in the water, and the whole storytelling was much stronger in that setting.
Even though this was a great use of VR storytelling, there can always be improvements. The one thing I wish I could do was swim away or move around; I wonder whether this could work better in AR and how that would change the experience. The other thing I would change would be to add another diver in the water with me. I feel the storytelling would be stronger if something happened to that other diver. Overall, I thought that Warner Bros. Pictures did a great job on this VR experience, and it made me want to see the movie.