Marshall From Detroit: VR Story Review

Image from Felix and Paul Studios

As a longtime fan of rap, I decided to take a look at a VR documentary on world-renowned rapper Marshall Mathers, professionally known as Eminem. Eminem is a multiple-time Grammy Award-winning artist and one of music’s most influential and prolific individuals; according to Rolling Stone, he is even regarded as one of the 100 Greatest Artists of all time.

Image from Felix and Paul Studios

This piece was created by the great minds of Felix & Paul Studios. Félix Lajeunesse and Paul Raphaël are filmmakers and visual artists. Before teaming up, Félix worked as a director of film, commercials, and immersive-video installations, while Paul’s passions lay with cinematic storytelling, visual effects, and technology; together they’ve developed multiple award-winning stereoscopic 3-D films and large-scale multimedia installations. The pair collaborated with film editor and director Caleb Swain to help bring Eminem’s story to life.

Image from Flickr

Marshall from Detroit is a full 21-minute VR experience. The short film premiered at the 2019 Sundance Film Festival’s New Frontier exhibition and shortly after became available on Oculus platforms. The experience takes the viewer on a winter’s night ride around Detroit alongside Eminem and journalist/interviewer Sway Calloway. Throughout the ride, you are taken to scenes within the city while Sway and Em discuss significant events in Eminem’s life and the changes Detroit has seen over time.

Image from VRScout

If you take this documentary at face value, its biggest appeal is the chance to feel as though you are in a car with Eminem; you’d be pleasantly surprised to find that the details of this story go beyond a one-dimensional immersive experience.

The story uses the same narrative structure throughout: Sway asks Eminem a series of interview questions about Detroit and about Eminem’s life as they cruise through the streets. As the viewer, you are given shots from within and on top of the vehicle, plus a select few special scenes that truly need to be witnessed in VR to be fully appreciated. One element of production that adds to the immersive feel is the use of 3D audio. When you turn your head in the car to look out the window, you hear Eminem talking beside you; when you turn to look directly at him or Sway, it feels like you personally are right back in the conversation.

Image from VRFocus

The most compelling thing about this documentary was not any particular segment, but just how well the entire project was put together. Video quality is stellar; it really feels like the subjects are in the room with you, with details as sharp as can be for consumer VR. The story was not only appropriate for VR; it revealed itself as a story that needed VR to be told at its best. Seeing the scenery of Detroit, coupled with the accompanying sounds, makes the viewer feel like a participant in the story rather than someone merely being told one. The emotional impact of Eminem’s responses was enhanced by the images and sounds provided along with them. The flow was heavily driven by both the location and the story, which is something all experiences strive for but don’t always achieve. In conclusion, I have to recommend this film for reasons that go beyond the story itself. The VR techniques alone and the premium storytelling elements make this a worthwhile time investment. It serves as a great example of a VR story done well.

VR Experience: The Meg

The Meg: Submersive VR Experience is a video that Warner Bros. Pictures uploaded to YouTube. This experience allows the viewer to be deep in the ocean with the largest prehistoric shark to ever exist. Warner Bros. Pictures lets you explore the part of the ocean where the movie “The Meg” takes place and get up close to the shark.

The best part about this video is the storytelling. You start off in the ocean as a diver, seeing nothing besides the submersible. The storytelling begins as soon as the crew starts talking to you. Suddenly you see a shark in the distance, and the crew tells you not to worry because it is a small one. You can hear your heartbeat getting faster and faster as the shark comes closer. Then, out of nowhere, you see another shark, at least 75 feet long, coming toward you. As it approaches, the crew in your earpiece starts to lose signal, and you can hear the worry in their voices. The storytelling ends when the crew, sounding scared, tells you to swim up now. Just as you start to move up, the Meg comes out of nowhere and attacks. The video ends with a black screen and no sound, making you believe the shark just killed you.

The characters in this VR experience helped with the storytelling and made it more intense. Without these voices in the background, I feel it would not be as effective. The storytelling allows you to actually feel as if you are in the movie as a diver.

The dominant emotion in this VR experience is suspense. The sound of the heart beating faster and faster gives the viewer more anxiety about being alone in the ocean. I personally found that watching without sound had a different effect on my emotions than watching with it. Along with the sound, the storytelling in this experience also affected my emotions. The crew talking in my ear had an alarmed tone in their voices. The storytelling had conflict in it, which left me tense and questioning what was going to happen, heightening my fear. Even though this storytelling was narrative driven, I believe it could also be considered experience driven, because it shaped my experience of being in the ocean (a location narrative) with no one around me besides the crew talking in my ear.

I thought this VR experience was extremely appropriate for VR. I personally do not think it would have the same effect as a normal video. I decided to test whether this was the case and watched it without the 360° effect. Although I still jumped at the moments the shark came up to the screen, the experience was not as successful as it was in VR. In VR, I was able to feel like I was actually in the water, and the whole storytelling was much stronger in that setting.

Even though this was a great use of VR storytelling, there can always be improvements. The one thing I wish I could do was swim away or move around. I wonder whether this could be better in AR and how that would change the experience. The other thing I would change would be to add someone else in the water with me; I feel the storytelling would be stronger if something also happened to the other diver. Overall, I thought Warner Bros. Pictures did a great job on this VR experience, and it made me want to see the movie.

AR – Life Saving Technology

Augmented reality (AR) is a technology that superimposes a computer-generated image on a user’s view of the real world, providing a composite view. It has many uses, which can be read about here. In this blog, we will discuss how Esri is applying AR to public safety.

Geospatial data company Esri is taking AR to the next level. Although the company has been around since 1969, Esri continues to fund its research and development teams in order to stay up to date with the newest technologies. Currently, that means using augmented reality for spatial analysis and visualization.

Esri created an app in 2017 called AuGeo. This app allows users to see previously recorded data in augmented reality.

Example of what it looks like inside of the AuGeo app

But what does that mean to us?

Forbes released an article in late March of 2019 explaining how AuGeo can start aiding search-and-rescue situations. Essentially, the app can create an overlay of the geographic terrain that existed before a natural disaster. Most recently, it was used after the January 2019 flood in Brazil caused by a dam collapse. By giving first responders the ability to see what lies beneath the disaster, search and rescue can move much more quickly and efficiently.

How does the app work?

The app, AuGeo, is relatively simple to use. I have attached a fantastic, easy-to-understand video below that shows how to get it working.

Quick AuGeo Tutorial

Where does the data come from?

Esri has created a software system called ArcGIS, which is a geographic information system. According to Esri, it is used for ‘creating and using maps, compiling geographic data, analyzing mapped information, sharing and discovering geographic information, using maps and geographic information in a range of applications, and managing geographic information in a database’. Ultimately, it provides an infrastructure for creating maps.

ArcGIS Mapping

Esri uses this software inside AuGeo by turning the maps and geographic information into the augmented-reality overlay. For instance, in big cities, most underground infrastructure is documented in a 3D CAD file format, which makes for a simple transition into augmented reality in the app.

3D CAD Model Example of Infrastructure

Can it be used for anything other than search and rescue?

Absolutely. Although search-and-rescue teams benefit greatly from applications such as this, there are other possible uses for those of us not involved in emergency situations. For example, the data found in the app can serve as historical data. If you were walking around a new town and had questions about certain landmarks or areas, the app (given that the information has been uploaded) can act as a tour guide of sorts, providing answers to those questions. This is an exciting start to what AR can offer society. There are countless new possibilities, and it is even more exciting to imagine where AR will take us in the next 5, 10, or 50 years.

GoPro Fusion Spatial Audio Tutorial

For my second blog post, I’ve chosen to write a tutorial on how to use spatial audio with the GoPro Fusion 360 camera. I utilized this feature while filming and producing my final project for the semester, in which I filmed an acoustic cover of the song “Shallow” from A Star Is Born with a three-piece band. GoPro describes the spatial audio feature as a way to keep a 360 experience as immersive as possible. If the video a user is watching used the spatial audio feature, the audio stays placed with its source, no matter which way the viewer is looking in their headset/magic window. For example, if someone is talking in front of the viewer and the viewer turns their head to the left, the audio will now be stronger in the right ear instead of balanced as it was before, because the source of that audio is now to the viewer’s right. This creates more accurate spatial awareness of the environment for the viewer.
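To build some intuition for this head-relative behavior, here is a minimal sketch in Python of how left/right gains could be derived from the angle between a sound source and the listener’s head. This is not GoPro’s actual algorithm (real spatial audio uses ambisonics, not simple stereo panning); it is just a toy constant-power pan, and all the names and angle conventions are my own assumptions:

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Toy left/right gains for a sound source given the listener's head yaw.
    Angles in degrees: 0 = straight ahead, positive = to the listener's right.
    A simplified constant-power pan, not real ambisonic decoding."""
    # Angle of the source relative to where the head is pointing
    rel = math.radians(source_azimuth_deg - head_yaw_deg)
    # Map relative angle to a pan position: -1 = hard left, +1 = hard right
    pan = math.sin(rel)
    # Constant-power panning keeps left^2 + right^2 == 1
    theta = (pan + 1) * math.pi / 4
    return math.cos(theta), math.sin(theta)

# Source straight ahead, head turned 90 degrees to the left:
# the sound now lands almost entirely in the right ear.
left, right = stereo_gains(0, -90)
```

This mirrors the example above: turning your head left while the speaker stays in front of you shifts the audio into your right ear, while looking straight at the source keeps the two ears balanced.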

As long as you are using a GoPro Fusion camera, this feature is incredibly easy to use. Just as the GoPro software stitches 360 videos automatically, it can create spatial audio automatically as well. Once you film your clip, you’ll need to connect your camera to the GoPro Fusion software to get ready to render it. Once you’ve found your selected video and trimmed it to the section you’d like to render, click “Add to Render Queue,” and a pop-up window will show more options for your output video. These options include where you’d like to export your video (I always choose editing so it just sends the file to my hard drive), the resolution, and your “preferred sound setting,” which is where you click 360 Audio to ensure the spatial audio feature is used.

After this, all you need to do is click “create the render queue,” then “render all” in the bottom-right corner, and wait for your video to be complete! To get the best sense of the spatial audio, headphone or headset use is recommended so the audio is placed accurately.

I’ve included links to some videos I’ve found online that take advantage of the spatial audio feature, including the video GoPro links on its website, which highlights the effect very well in a controlled anechoic chamber. I have also included a short rough-edit clip of what we filmed earlier today to give a sneak peek at how I will be using the feature.

GoPro Fusion Spatial Audio Demo: Skiing

VR at the Final Four

The NCAA basketball tournament is one of the most interesting and intriguing tournaments in all of sports. There is a rush of excitement when fans see a buzzer-beating shot or a Cinderella team upset a number one seed. In contrast, there is a wave of agony and sadness when you see your team knocked out of the tourney. For the past four years, the NCAA and Turner Sports, in partnership with CBS, Oculus, and Intel, have been offering fans a front-row seat to all the action through the March Madness VR app. VR brings fans straight to the arena and gives them a chance to follow their team through the tournament. In theory, watching a game in VR sounds amazing, but is it really better than keeping up with the action on your television set? Here are some of the pros and cons of going through March Madness in VR.

View from within the March Madness VR app

During the 2019 NCAA men’s basketball tournament, 21 games were broadcast in VR, including the Final Four and the National Championship game. You can find the full list of games here. To get the broadcast in VR, you had to own either an Oculus Go headset or a Gear VR headset. Then you had to go and actually download the free March Madness VR app. Once you have the app downloaded and set up, you can buy single-game tickets for $2.99 or gain full tournament access for $9.99. The app also offers game highlights, the ability to view the bracket, and recaps in VR, all for free. During the games, you can switch between multiple camera angles or let a director switch for you. Compared to watching the games on TV, this can seem like a lot of work just to get to the experience. On television, you get every single game in the tournament and don’t have to worry about missing any moments. You also don’t have to deal with the headache of setting up the app or paying to see the games. It is nice that the app offers VR highlights for free for those who don’t want to pay but still want the experience. The director’s cut feature was also a good idea, because some people don’t want to worry about switching between camera angles.

Video courtesy of KPIX CBS SF Bay Area

The video above does a good job of showing off some of the features in the app and how it actually works and functions. One of the main advantages of watching the game in VR is the different camera angles and views you get. There is no way other than VR to experience a shot from right under the basket. I do feel it can get exhausting to wear a headset for the length of a basketball game compared to watching on television. One thing I hope happens in the near future is an expansion of the app to any mobile device. The app could draw in more viewers if it did not rely solely on the Oculus Go or Gear VR headsets. It could simply be an app in the iPhone or Android app store that lets people use any headset. This would give it much better viewership and more downloads.

The work it takes to bring the tournament to VR is impressive, and Intel has done a good job setting up its cameras. I would suggest buying a single-game pass and seeing if you enjoy watching a game in VR. Then you can decide whether you prefer the television view or the VR view. If you don’t like the VR view or feel it is not worth the price, you can always watch the highlights in VR for free. Overall, there are advantages and disadvantages to watching the tournament in VR, but everyone should at least try watching one game.