The GOAT Farm 360 Video

In this 360 video, viewers are transported onto a goat farm that is home to 42 goats. They will get a tour of the entire farm and learn about the goats and what it takes to own them. Viewers will even get to be at eye level with some of the goats and will have the chance to see a baby goat being bottle-fed. Not many people get to experience what it is like to live on and walk around a goat farm, but today is your lucky day! Although viewers won’t be able to smell the goats, I think that with a little imagination they will get the idea. Thanks for watching!

The Yahoo Falls Massacre

My VR story is about the Yahoo Falls massacre. This is a story that is not well known, and you’d have to do some individual research online to learn more about it. I take you to Yahoo Falls, a fairly well-known waterfall in Daniel Boone National Forest in the Cumberland Valley of Kentucky. In the early 1800s this area was occupied by Native Americans. The Cumberland River Valley Cherokee tribe was offered education and asylum by Reverend Gideon Blackburne, who was a friend of the Indian tribes in the area. Princess Cornblossom, the daughter of the war chief at the time, arranged for everyone to gather at the falls around midnight and head east toward Chattanooga, where schools catering to Indian children and housing were waiting for them. On August 10, 1810, John Sevier and others fighting under the United States War Department arrived at the falls and shot the women and children below them. Although there is no official documentation, this story has been published online and passed down verbally from generation to generation. For more specific details on this story, check out this link: http://www.northerncherokeenation.com/children-massacre-at-ywahoo-falls.html

ECHO WITNESS (360 VR)

This VR video took a lot of effort to create. I wanted the viewer to be the main character in the story instead of just a camera off to the side. To do this, I had to find a way for the viewer to feel like an actual person rather than a tripod, so I wore a camera on top of my head while filming so that it looked like the viewer had my body. This makes the experience feel much more real than looking down and seeing a tripod.

This film takes you through the experience of witnessing a crime and, shortly after, falling into a coma. A detective and a doctor are trying to figure out who committed the crime, and they need your help. They have created a new technology that allows you to relive your memories, and they want to use it to help solve the case. Will the technology work? Or will something else hold the viewer back from helping them?

Mars Rover AR Show and Tell

The AR application I used was embedded in Instagram, and it took me through a doorway to Mars to see how the Mars rover “Ingenuity” is able to fly. You start the AR experience by finding a “flat surface” for the doorway to spawn on. I had a lot of difficulty with this part, as I pointed the camera at a lot of flat surfaces before one of them finally worked. It wasn’t clear whether I needed a flat horizontal or vertical surface, and it took a lot of trial and error to figure out what the program actually wanted. When the doorway appears, you walk through it and are surrounded by the red planet Mars. The rover is in front of you, and when you hit a button on the screen, it begins to fly and the experience gives more information about the rover. It was easy to see how the rover takes off, and the information was helpful without being too much too fast. You are able to go at your own pace and hit the next button whenever you like, or just keep watching the rover hover in place. It then told me how gravity on Mars and Earth differs, and how that affects the rover. I was able to see firsthand how gravity affected the rover’s descent, and how the thinner atmosphere made it take much longer for the rover to get airborne on Mars compared to Earth. It demonstrated how these factors affected the machine by cloning the rover with a blue wireframe structure, so you could tell the one on Mars apart from the one on Earth.

Overall, I think this is a valuable experience in AR and has a lot of potential moving forward. The amount of information and the demonstrations were really nice and made the article easier to understand. I can see how this concept could make harder-to-explain concepts much easier for the reader to understand, since they have the option to experience them firsthand.
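To put rough numbers on the gravity difference the demo visualizes (these figures are mine, not from the AR experience): Mars’s surface gravity is about 3.71 m/s² versus Earth’s 9.81 m/s², so an object in free fall takes noticeably longer to cover the same drop. Here is a minimal Python sketch of that comparison, ignoring air resistance:

    import math

    # Approximate surface gravity in m/s^2 (air resistance ignored for simplicity).
    G_EARTH = 9.81
    G_MARS = 3.71

    def fall_time(height_m, gravity):
        """Free-fall time for a given height: t = sqrt(2h / g)."""
        return math.sqrt(2 * height_m / gravity)

    height = 10.0  # an arbitrary 10 m drop for comparison
    print(f"Earth: {fall_time(height, G_EARTH):.2f} s")  # ~1.43 s
    print(f"Mars:  {fall_time(height, G_MARS):.2f} s")   # ~2.32 s

Lower gravity makes the descent gentler, but Mars’s atmosphere is only around 1% as dense as Earth’s, so the rotors have far less air to push against, which is the main reason getting airborne there is so much harder.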

Gaara AR Lens

For my AR Snapchat lens I decided to pull inspiration from one of my favorite TV shows, Naruto. My all-time favorite character is Gaara. He is a shinobi from the Sand Village with some of the best character development in the show. He marked himself with a bright red tattoo of the word “love” in Japanese, which is part of his mantra of being “a demon loving himself.” I had looked up Gaara tattoo lenses on Snapchat previously and had not seen any that I liked. The only one that used his actual tattoo was not placed correctly on the facial anchor points. So even though this is a three-bar-complexity lens, I still found myself struggling but wanting to learn, and I eventually learned how to master it. I wanted a finished product that I would actually use.

I found a PNG of Gaara’s tattoo outline online. It was a very crisp PNG, so I brought it into Photoshop and applied a drop shadow, an effect the other lens didn’t have. I also added a motion blur on top of the outline to help it blend in with the skin. After exporting the image from Photoshop and bringing it into Lens Studio, I changed the blending mode to Multiply, which helps the “tattoo” sit on the face and blend in better overall! I’m very happy with how my lens turned out for my first attempt. I love the idea of making lenses for anime characters; there are countless characters out there, all with unique face paint.