We here at InfiniteRed recently had the opportunity to work with Eos Lightmedia of Vancouver, Canada, on a very fun project for Vancouver International Airport (YVR). YVR commissioned an interactive display for their trade show booth: a two-meter interactive globe that displays their routes in a number of ways and includes a game. A user at a separate podium can spin the globe with a trackball and make selections via the iPad app and a physical button. The globe is projected onto a concave surface in front of the user. Viewed from the front, it becomes an optical illusion: it appears to be an actual sphere pushing out toward the viewer.
The project is broken into two applications, both written in RubyMotion: the main OS X app and its iPad companion. The iOS app handles input and selection for the main app. To let the applications communicate, we used an embedded web server called Apex to create a REST API within the Mac application, exposing certain endpoints that the iPad could call. We also synchronized the SQLite databases and data models across both applications to cut down on traffic and reuse information. A few keyboard shortcuts enable and disable admin features behind the scenes.
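To make the idea concrete, here is a minimal plain-Ruby sketch of that architecture: one side embeds a web server and exposes an endpoint, the other fetches from it. The real app used Apex inside the Mac application; here a bare `TCPServer` stands in, and the `/selection` endpoint and its payload are hypothetical examples, not YVR's actual API.

```ruby
require "socket"
require "net/http"
require "json"

# "Mac side": a tiny embedded HTTP server standing in for Apex.
server = TCPServer.new("127.0.0.1", 0)
port   = server.addr[1]

Thread.new do
  client = server.accept
  # Consume the request line and headers.
  while (line = client.gets) && line != "\r\n"; end
  body = { airport: "SYD", mode: "routes" }.to_json # hypothetical payload
  client.write("HTTP/1.1 200 OK\r\n" \
               "Content-Type: application/json\r\n" \
               "Content-Length: #{body.bytesize}\r\n" \
               "Connection: close\r\n\r\n#{body}")
  client.close
end

# "iPad side": ask the Mac app what is currently selected.
response = Net::HTTP.get(URI("http://127.0.0.1:#{port}/selection"))
```

Because both apps share the same data models and SQLite schema, a small JSON payload like this is enough; each side can look up the rest locally.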
Vancouver International Airport recently used this system at a conference in Chicago, and attendees loved it. During the day they could explore flights around the globe, salespeople could use it as a visual aid when describing their services, and at night it was left in game mode for people to play with.
Here’s a video of the system in action:
Built with RubyMotion and SceneKit
The best way to picture SceneKit is as a tree. Leaves are connected to branches, and branches to the trunk. Each of these parts (leaves, branches, and trunk) is referred to as a node. Each node can have its own behavior and properties but needs to be attached to another node. This is how we can create worlds with laws of physics and other behavior that trickle down and enforce the rules on child nodes without too much manual work.
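The tree analogy can be sketched in a few lines of plain Ruby. This is illustrative only; in the real app these would be SceneKit `SCNNode` instances, which carry transforms, geometry, and physics along with the parent/child links.

```ruby
# Illustrative sketch of a scene graph: nodes in a tree, where a
# traversal from any node reaches everything beneath it.
class Node
  attr_reader :name, :children
  attr_accessor :parent

  def initialize(name)
    @name     = name
    @children = []
    @parent   = nil
  end

  def add_child(child)
    child.parent = self
    @children << child
    child
  end

  # Visit this node and everything beneath it: the same traversal a
  # scene graph uses to trickle behavior down onto child nodes.
  def each_node(&block)
    block.call(self)
    @children.each { |c| c.each_node(&block) }
  end
end

trunk  = Node.new("scene root")
globe  = trunk.add_child(Node.new("globe"))
routes = globe.add_child(Node.new("routes"))
```

Anything applied to `globe` (a rotation, say) implicitly affects `routes`, because the traversal always passes through the parent first.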
OS X Yosemite introduced a lot of enhancements to SceneKit to bring it more in line with SpriteKit (Apple's 2D framework). Unfortunately, we started this application before Yosemite was available, so we had to use Mavericks and bend SceneKit to our will just a little bit. For example, many of the animations are movies texture-mapped onto a node. The shadows under the dashed lines and some movies are actually a separate texture mapped onto another node, as real lighting didn't produce the sharp shadows they wanted.
One of the most engaging parts of the application is the ability to freely move the camera around the globe. An issue we ran into is that moving the camera around would create drift: as you moved up, down, left, or right around the globe, a naive camera would drift away from the expected north-is-up orientation. The difference is evident if you compare Google Earth to Apple Maps on iOS: Google Earth will drift, whereas Apple Maps always keeps the globe upright. After trying to programmatically move the camera about and manually manage the coordinates and angles, we realized what we wanted could be built as a two-axis gimbal, an invisible node within the scene with the scene's camera attached to it. Now when you rotate the camera left or right, the outer sphere with the camera attached to it rotates, and when you move up or down, the center globe with the earth textures rotates north and south. This saved us from doing the trigonometry ourselves, made it extremely easy to move the camera to any coordinate, and gives the appearance that the camera is always pointing up. The main lesson we learned is this: don't program, make a movie. Ask yourself how you would do something in real life, then replicate that virtually.
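The gimbal idea can be modeled in plain Ruby as two chained rotations (this is an illustrative sketch, not the app's actual SceneKit code, and `gimbal_camera` is a hypothetical helper): the inner ring pitches to the latitude, the outer ring yaws to the longitude, and the camera just rides along, so its "up" direction never picks up any roll.

```ruby
# Rotate a 3-vector about the x axis (pitch) or y axis (yaw).
def rotate_x(v, a)
  x, y, z = v
  [x, y * Math.cos(a) - z * Math.sin(a), y * Math.sin(a) + z * Math.cos(a)]
end

def rotate_y(v, a)
  x, y, z = v
  [x * Math.cos(a) + z * Math.sin(a), y, -x * Math.sin(a) + z * Math.cos(a)]
end

# Hypothetical sketch of the gimbal: the camera starts on the +z axis
# looking at the origin with "up" along +y, then the two rings rotate it.
def gimbal_camera(lat_deg, lon_deg, distance = 3.0)
  lat = lat_deg * Math::PI / 180
  lon = lon_deg * Math::PI / 180
  position = [0.0, 0.0, distance]
  up       = [0.0, 1.0, 0.0]
  # Inner ring: pitch to the latitude. Outer ring: yaw to the longitude.
  position = rotate_y(rotate_x(position, -lat), lon)
  up       = rotate_y(rotate_x(up, -lat), lon)
  { position: position, up: up }
end
```

Because the up vector goes through the same two rotations as the position, it always stays in the plane of the local meridian: north is up at every coordinate, with no per-frame angle bookkeeping.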
Because SceneKit is a 3D environment, light plays an important part. One of the first things you'll need to learn when getting started is how to place a camera and how to use lighting. Because our gimbal and camera move around the globe, we attached the light to the camera. When you view the globe through the scene's camera, there is a light source just behind your point of view. This way, no matter what you're looking at as the user, the globe is consistently lit across all states.
The halo around the earth is also attached to the camera, a "trick" that makes the earth appear to be wrapped in an atmosphere.
Thinking back to the earlier analogy of SceneKit as a tree, you begin to see just how important the relationships between the nodes are. Once we figured out our scene graph, i.e., where the nodes sit within the scene, everything else fell into place quite easily.
We made the decision early on that, in order to be as flexible as possible, we would use the GPS coordinates of each airport and map those to the globe directly. So rather than manually converting each coordinate to a SceneKit vector value (x, y, z), we automatically convert a latitude of -33.901852 and a longitude of 151.213570 into a vector coordinate when the application requires it. This meant the client could remove and add airports without any additional work, increasing the display's reusability for other trade shows.
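The conversion itself is a couple of lines of spherical trigonometry. This is a sketch, not the app's exact code: the axis convention (here, +y through the north pole and +z through latitude 0, longitude 0) depends on how the earth texture is oriented on the sphere, and `coordinate_to_vector` is an assumed name.

```ruby
# Convert a GPS coordinate to a point on a sphere of the given radius.
# Assumes +y runs through the north pole and +z through (lat 0, lon 0).
def coordinate_to_vector(lat_deg, lon_deg, radius = 1.0)
  lat = lat_deg * Math::PI / 180
  lon = lon_deg * Math::PI / 180
  [radius * Math.cos(lat) * Math.sin(lon),
   radius * Math.sin(lat),
   radius * Math.cos(lat) * Math.cos(lon)]
end

# The coordinates from the text (Sydney):
v = coordinate_to_vector(-33.901852, 151.213570)
```

With this in place, an airport is just a lat/lon pair in the database, and its marker node gets positioned at `coordinate_to_vector(lat, lon, globe_radius)` at runtime.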
Todd Werth started work on a SceneKit RubyMotion gem called motion-scene-kit. It is part example, part utilities. It is very basic right now, but you can run it to see SceneKit in action. It works on both OS X and iOS 8.x and above.