Disney is known for doing exceptional things, especially in their parks. Personally, I think the next step for Disney would be to do stuff with Virtual Reality. This made me wonder if Disney had done anything with VR because I hadn’t heard anything about it. After researching, I found that Disney has been trying out VR and the possibilities look endless.
Disney Research has recently started experimenting with VR. They have developed a program called Cardinal. Using Cardinal, Disney Research has figured out a way to convert movie scripts into real-time VR experiences.
According to Sasha Schriber, Disney Research digital platforms group lead, the process could eventually be used for traditional filmmaking. Schriber said, “It goes from script to storyboard to animation in real-time.”
The purpose of Project Cardinal is to speed up the process of making scripts into storyboards and eventually into animation. With Cardinal, it would all be automatic. Not only does the software take scripts and turn them into simple animations, but it also allows creators to put voice recordings in on the spot and preview the scenes in VR.
Characters can be moved from within the headset, and Disney Research hopes to add more advanced editing tools that creators can use inside the headset. If creators can edit more things without taking the headset off, the process of making a movie will go a lot quicker.
Disney Research is currently testing Cardinal with a number of filmmakers. Schriber admitted that there is still work to be done and that Cardinal is not perfect. Schriber said, “Each scriptwriter writes in his or her own way.” This complicates the process, since Cardinal only accepts scripts written in a simple, standardized language. Because all writers write differently, Cardinal will not work for every script.
Disney Research has encouraged scriptwriters to keep their writing simple and to stick to the present tense, since that is what Cardinal expects. If scriptwriters are able to simplify their writing, their scripts will come to life a lot quicker. Instead of waiting almost two years, scriptwriters could be waiting only a handful of months to see their work come to life.
Another cool thing Disney has done is draw a Disney character in VR. Disney animator Glen Keane went on The Late Late Show to draw his favorite Disney character, Ariel from The Little Mermaid, in VR. Keane used the VR app Tilt Brush to complete the drawing.
It is amazing to watch how Keane does this. Watch the video below and hopefully, you are just as amazed as I was! The way that the strokes are made in VR is mesmerizing and I found myself not being able to look away!
I am hoping that Disney continues to explore the possibilities of VR. There are so many things that Disney could do with VR. Maybe one day, Disney will be able to do a VR Disney movie! Like I said before, the possibilities are endless.
In typical photography or video, creators have a lot of control over what their camera picks up. A key aspect that can help or hurt a final image is exposure.
With the field of virtual reality and 360 video expanding rapidly, it is important for budding videographers and photographers in this area to take care to properly expose the images they will capture.
For this tutorial, I will mostly talk about controlling exposure with the RICOH Theta V. This model is RICOH’s newest 360 camera and allows users to capture high-resolution photos and videos in 4K/30 fps. The Theta V retails for $399.95 and has some accessories, such as a spatial audio microphone, that can be bought separately. I believe that this camera is easy enough for beginners to use, but has enough quality for advanced videographers to use as well.
A videographer could manually control when the Theta V records, but he or she should download the mobile app that works with the camera. The app, called “Ricoh Theta S,” can be found for free on the App Store and the Google Play Store. From the app, users can not only turn the camera on and off, but also preview what the camera sees and adjust ISO, shutter speed and white balance as needed. Theta V users should note that in order to adjust these settings, the camera must be connected to the phone over Wi-Fi.
To adjust the exposure, a camera operator needs to open the preview viewer within the app. From there, users can press “EV” in the lower left-hand corner and use the slider to allow more or less light into the camera. The shutter speed, ISO and white balance can also be fine-tuned from this screen.
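To build a little intuition for what that EV slider actually does, here is a minimal sketch in Python. This is my own illustration of standard exposure math, not anything from Ricoh's app: each +1 EV step of compensation doubles the light the camera gathers, which it can achieve, for example, by keeping the shutter open twice as long.

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Standard photographic exposure value: EV = log2(N^2 / t)."""
    return math.log2(f_number ** 2 / shutter_s)

def compensated_shutter(shutter_s: float, ev_steps: float) -> float:
    """Brighten (+EV) or darken (-EV) a shot by adjusting shutter time.
    Each +1 EV step doubles the light, so the shutter stays open twice as long."""
    return shutter_s * (2 ** ev_steps)

# f/2.0 at 1/250 s works out to EV 10
print(round(exposure_value(2.0, 1 / 250)))      # → 10
# Sliding exposure compensation to +1 EV doubles the shutter time
print(compensated_shutter(1 / 250, 1))          # → 0.008 (i.e., 1/125 s)
```

In practice the camera may adjust ISO instead of shutter time when you move the slider, but the doubling-per-step relationship is the same.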
According to 360Rumors, “the correct exposure is the one that shows a real world object at the same ‘brightness’ as in real life.” The hyperlinked article, which also gives a general overview of exposure itself, notes that most 360 cameras on the market give users limited exposure control, but these cameras are also less likely to get the exposure wrong because they can evaluate the whole scene around the camera.
As someone who has just learned how to create 360 videos, I did not take exposure into account as I should have until I was presented with a challenge in a recent video that I made with classmate Paidin Dermody.
Last month, we decided to film the University of Kentucky’s MacAdam Student Observatory as a 360 video project. The facility is only open on clear nights, which does not allow for much light to begin with. Some white fluorescent lights were available within the observatory, but most of the research and study done there is completed under red lights, which allow a person to preserve their night vision.
We wanted to be able to authentically capture this uncommon lighting, as we are both journalists and wanted to show what a typical experience is like in the observatory. Thus, we had to learn about controlling the settings on our camera while in the field.
Check out our final project below. It can also be viewed on kykernel.com.
The Oculus Rift was released on March 28th, 2016 by Oculus VR, which is a division of Facebook, Inc. Virtual Reality is the biggest craze in the technology world as of right now. People are going crazy for these VR headsets, so I thought that I would review the popular Oculus Rift.
The Oculus Rift has 360-degree spatial audio, which gives users the sense that they are fully immersed in an experience. Since the Oculus Rift was released, the price has dropped from $599 to $399, which is a much better price, but still pretty expensive.
The Oculus Rift is also a fun way to play your favorite games in an immersive way. I tried three games on the Oculus Rift: Minecraft, BBC Home and Face Your Fears. Out of all of those, my favorite game was definitely Minecraft.
Even though the price dropped, the Oculus Rift is still expensive at $399. Not only will you have to buy the Oculus Rift, you will also need a compatible computer. These can get pretty expensive.
On the Oculus website, the suggested computers to use with the Oculus Rift can be anywhere between $799 and $2099.
Another con was that, after a while, the Oculus Rift gave me a headache and made me somewhat nauseous. Of course, that doesn’t mean it will give everyone the same issues.
As I said before, the three games that I tried were Minecraft, BBC Home and Face Your Fears. Out of all of those games, Minecraft was my favorite to play.
In Minecraft, there are two different ways you can play the game. In one perspective, I was sitting in a living room created with blocks from Minecraft, sitting on a Minecraft couch and looking at a Minecraft-style TV.
In another perspective, I was completely immersed inside the game. It was a little disorienting being in full immersive mode. I could move my head to look around, or I could use the joystick on the controller to move my view around. When I used the joystick, the movement was very jarring.
The next game I tried was BBC Home. In this game, I was an astronaut on the International Space Station who had to fix something that was damaged. I was forced to move around space to get to the damaged part of the International Space Station.
This game made me super dizzy. After a while, I had to sit down and even then, I still felt dizzy. Overall though, the game had amazing graphics. It was fully immersive and felt like I was actually in space. The game also had two modes: easy mode and astronaut (or hard) mode.
The last game I tried was more of an experience than a game. The experience is called Face Your Fears. This experience wasn’t interactive; all I did was sit in a bed while things happened around me.
Even though Face Your Fears wasn’t interactive, I enjoyed every second of it. It got my heart racing and I never knew what was going to happen next.
Overall, I feel like the Oculus Rift is definitely worth the money. Even though you would have to buy a compatible computer, I feel like the experience that the Oculus Rift gives is definitely worth the extra money.
For my second blog post, I took a look at the New York Times’ most recent VR film, We Who Remain. Directed and shot by Trevor Snapp and Sam Wolson, the film is about the Sudanese men, women and children who remain in the Nuba Mountains of Sudan, even though their lives are continuously in danger from military jets that have been dropping bombs on villages, markets and farms. On top of this, the government has put up an aid blockade that prevents the injured and sick from receiving the medicine they need. Through it all, countless children have been killed in the attacks and are put in harm’s way daily, when they should be attending grade school rather than living in fear of the bombings.
I think the film did a great job opening with the dust that leads your eye and then transitions into the title slide. It was very effective, and it made me realize that graphics and visuals beyond the 360 video itself have an important role within VR, something I had not thought about before. The film opens with some great images of the families to set the scene, give the viewer a sense of place and launch them into the story. What I really appreciated when watching in Google Cardboard was the length of the video cuts and how they lead into one another. I had time to survey the whole “room,” meaning the entire frame, and really felt like the filmmakers took me someplace I could not otherwise go while still using the video effectively to tell the story.
From around the three- to five-minute mark, the videographer takes us inside what looks like an empty school classroom. Through the use of moving graphics, archived video and words on a ‘chalkboard,’ the film updates the viewer on the situation and the historical context. This is a powerful storytelling device that focuses the viewer on what they are watching and on why each frame matters, rather than just throwing a number of clips together. The videographer and editors were intentional about it, and it shows careful preplanning.
The remainder of the film dives into the different struggles the people face, such as the ban on journalism, the hardships in the hospital and the rebel forces fighting back to keep others safe. I think the main strength of this film was its use of characters, which helped us connect with different individuals as the video progressed and get a brief moment of connection with each of them.
I think the main weakness of the video was the interviews. Each time a new character was introduced, it was generally one figure sitting on a bed or standing alone outside. As a 360 clip, this was boring and one-dimensional, and it contributed little to the video beyond introducing the character. I would have liked to see the subjects moving or working within the context of the story.
Some people say Nikon makes better camera bodies while Canon makes better lenses, and many others swear by one brand over the other. Who cares? As far as DSLRs are concerned, they both make high-quality, professional equipment.
But when it comes to virtual reality, Nikon is closer to the cutting edge. Could that be because Canon has yet to release a 360-degree camera? In this review, we’ll take a look at Nikon’s new 4K-shooting Keymission 360 and weigh its strengths and weaknesses; given the company’s spirited rivalry with its chief competitor, this analysis might give us a hint of what Canon would do.
The Keymission 360 has a solid, rugged design, and its external lens elements are replaceable in case one or both are damaged. It’s waterproof up to about 100 feet (30 meters), and it’s also shockproof and “freeze proof.”
It accepts microSD/HC/XC cards, yielding much higher storage capacity than most of the other consumer-grade 360-degree cameras. Even if you do manage to fill up a 64GB or higher card on a shoot, you can just take it out and put another one in.
Its 1/2.3-inch CMOS sensor is capable of shooting in 4K resolution, but you have to change the default 1920/24p to 2160/24p through Nikon’s Snapbridge 360/170 app. We’ll talk more about that later.
Its stitching may not be as good as it is on the Ricoh Theta S, but that is to be expected with a much thicker camera. I don’t even think the stitching is all that problematic, but I’m sure future cameras, especially those with more than two lenses, will put it to shame.
The one physical advantage the Theta S has is its elongated frame, which makes it easier to carry. I often feel nervous carrying the Keymission 360 because it’s bulky and I feel like its glass is more exposed.
Canon would likely do little to beat the Keymission 360’s hardware, but it would at least match the camera’s capabilities. It may even go so far as to improve stitching and portability, refining some of the less attractive qualities of Nikon’s offering.
The worst thing about the Keymission 360 is its Snapbridge 360/170 app for iOS and Android devices. Although it does have Wi-Fi, Bluetooth and NFC, connecting the hardware to the app for initial setup is far too complicated.
When I tried to pair the camera with my phone the first time, it would only display the battery life — I couldn’t use the app to control the camera remotely, and I couldn’t modify any of its settings. I honestly can’t explain why, because I followed all of the pairing instructions. It simply wouldn’t work. I needed it for a group project, and my teammates and I decided to use the physical buttons instead. Since we couldn’t preview the shots, a couple of clips turned out a little slanted.
I tried again two weeks later for this review and finally, inexplicably, got the camera connected to the app, but I was underwhelmed.
Transferring images and video is buggy, and many of the app’s features won’t work until you activate the camera by hitting one of its buttons. This, of course, will result in plenty of unnecessary photos or videos and a waste of battery life. Nikon’s $59.95 ML-L6 remote for the Keymission 360 could be an option, but I’m not sure if it will activate the camera remotely. Either way, it would be much better if its firmware were updated to turn it on with the buttons without taking an image or starting a video.
Beyond the battery indicator on the main “connect” menu, the most useful features are setting video resolution to 4K (camera menu/camera settings/shooting options/movies/movie options/2160/24p) and being able to get a preview. To that end, the “remote photography” option under the camera menu takes you to the app’s iOS settings instead of going directly to the WiFi screen, so it adds an unnecessary step to the process when trying to preview your shot. (I’m not sure if it does something similar on Android.)
What’s more, the Keymission 360’s exposure settings are fully automatic, which is a major disadvantage for those who want to capture natural-looking motion with a shutter speed of about 1/50th of a second (at 24 frames per second). The LG 360 is one of the cheaper options on the market and offers fully manual controls, so why can’t the Nikon (or some of the other consumer-grade VR cameras for that matter)? This doesn’t make sense to me.
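That 1/50th-at-24fps figure comes from the common “180-degree shutter” rule of thumb for natural-looking motion blur. This is a general filmmaking guideline, not anything from Nikon's documentation; a quick sketch in Python:

```python
def natural_shutter(fps: float, shutter_angle: float = 180.0) -> float:
    """Rule-of-thumb shutter time for natural motion blur:
    t = (shutter_angle / 360) / fps.
    At 24 fps, a 180-degree shutter gives 1/48 s, usually
    rounded to the 1/50 s mentioned above."""
    return (shutter_angle / 360.0) / fps

# 24 fps -> roughly 1/48 s; 30 fps -> roughly 1/60 s
for fps in (24, 30):
    print(f"{fps} fps -> {natural_shutter(fps):.4f} s")
```

A fully automatic camera often picks much faster shutter speeds in bright light, which is why the footage can look choppier than film.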
This is where Canon could truly excel. Its “Camera Connect” and “EOS Remote” apps use a simpler method of connecting via WiFi, but it could also offer more functionality through manual controls and remote activation in an app.
The Keymission 360 is one of the better virtual-reality cameras for consumers on the market, and it would easily be the best if it had a better app and some manual controls. That said, none of the alternatives is all that great at the moment either because the technology is still rather young.
Connecting the Keymission 360 to Snapbridge 360/170 was a headache, and the app is underwhelming right now. An update could add some better features and more functionality. The only way to activate the camera now is by pressing one of its buttons, resulting in a still or a video recording, so it also could benefit from remote activation, if possible, in a future update.
The image quality is great for what it is, but lower-priced cameras offer comparable image quality with at least some degree of manual control. Stitching is acceptable but not great; the Ricoh Theta S and the LG 360 have a slight advantage because of their thinner profiles, but the difference really isn’t all that noticeable.
Canon is a much larger company than Nikon, but in recent years it has been slow to take on new technology. It’s hard to say if or when it will respond to the 360-camera craze, but Canon would surely take note of the Keymission 360’s shortcomings and give Nikon plenty of incentive to up its game. For now, the Nikon Keymission 360 stands as the vulnerable leader of the VR field at an expensive $499.95.