Sometimes it’s nice to have time to play around and see where that takes us!
Like any agency, we have our busy seasons and our slower seasons. As much as we love working on client projects, it’s a welcome change of pace when we have some open time for experimenting. That is when we can try to answer the questions we often ask ourselves: “What if you could see your hands in VR?” “How can we integrate custom input devices with bespoke software?” “What kind of crazy things can we do with eye tracking?”
When we were at CES in 2018, we saw a couple of examples of VR using the Leap Motion controller to track hands, in a limited capacity and with varied results. We saw this as a great opportunity to experiment with VR interactions and see if we could further immerse the user.
It turns out that, while the system is not completely developed from a technical standpoint, the concept is very sound. It is a really cool experience to reach out and interact with virtual interfaces and objects with your own hands rather than controllers. The Leap Motion also does a solid job of tracking each finger. Seeing your virtual hands move just as you move your real hands is a pretty compelling experience.
One day while sitting down for breakfast in a hotel in New Orleans, I got to thinking about head and eye tracking and how they might be used to create some really unexpected and engaging interactive experiences. When I got back to the office, I ordered a Tobii eye tracker and started talking through the concept with the dev team.
We ended up with a very cool proof-of-concept piece that uses the Tobii 4C. The device tracks not only your eyes but also your head. As your head moves side to side, you experience a parallax effect. As your eyes float from object to object on the screen, information pops up about what you are looking at.
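The parallax effect comes down to a simple rule: layers closer to the viewer shift farther (and opposite to head motion) than layers in the background. Here is a minimal sketch of that mapping; the function name, parameters, and scaling are our own illustrative assumptions, not the actual code behind the piece.

```python
def parallax_offset(head_x, layer_depth, strength=0.05):
    """Horizontal offset for one on-screen layer, given head position.

    head_x: normalized head position, roughly -1.0 (left) to 1.0 (right)
    layer_depth: 1.0 = nearest layer; larger values = farther away
    strength: overall intensity of the effect (illustrative constant)

    Near layers (small depth) move more than far layers, and layers
    shift opposite to the head, which is what sells the 3D illusion.
    """
    return -head_x * strength / layer_depth
```

In a real build, `head_x` would come from the tracker's head-pose stream each frame, and every layer in the scene would be offset by its own `parallax_offset` before rendering.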
The same tech will be rolled out in VR in 2019 with the Vive Pro Eye. It will be interesting to see how well it works in a VR headset versus a computer screen, and whether developers will be able to use it to really enhance VR experiences.
One of the things we really like to do with our interactive kiosk and trade show projects is to integrate external controls. In order to do that in a timely fashion, we need to experiment ahead of time. Lucky for us, Arduino parts are normally pretty cheap.
So far we have used various Arduino components (and once a Raspberry Pi) to connect touch interfaces, motion detectors, button/slider inputs, complex LED lighting, and RFID tags to our Unity-based applications. We have found this adds a richness to experiences that software alone cannot achieve.
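Hookups like these usually boil down to the microcontroller streaming simple text messages over a serial port, which the application parses each frame. As a hedged sketch of that idea (the `NAME:VALUE` message format and sensor names here are assumptions for illustration, not our actual protocol), a parser might look like this:

```python
def parse_sensor_line(line):
    """Parse one 'NAME:VALUE' serial message from a microcontroller.

    Example messages (hypothetical): 'BTN:1', 'SLIDER:512', 'RFID:4021'.
    Returns a (name, int_value) tuple, or None for malformed input,
    since serial streams often contain partial or garbled lines.
    """
    name, sep, value = line.strip().partition(":")
    if not sep or not name:
        return None
    try:
        return (name, int(value))
    except ValueError:
        return None
```

In practice the line itself would be read from the port with something like pyserial on the Python side, or `System.IO.Ports.SerialPort` inside a Unity script; keeping the protocol this dumb makes it easy to swap hardware without touching the app logic.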
As much as it might be fun to read about these things, trying them out is another thing entirely. Drop us a line and set up a time to come in and check it out.