Tech Note #1
We break down the ins and outs of our approach to Lightship development

Editor Warning: Tech Note posts will NOT be for the faint of heart: there will be much nerdery and codespeak in the following paragraphs. You’ve been warned! LOL 🤓

For the past several weeks, we participated in the Niantic VPS Challenge. It was an opportunity for some of us in the Lightship ARDK community to stretch our minds and talents a bit by envisioning what Niantic’s Visual Positioning System (VPS) allows us to build. For those who don’t know, Lightship’s VPS lets you augment real-world locations with objects from your game/app. Much like graffiti, breadcrumbs, or forgotten backpacks are bound to places in the real world, you can place digital equivalents that other users of your app can create and/or see. Our submission, of course, was Seentral Park, our AR experience based in NYC’s Central Park. We still had to prove to ourselves that what we envisioned last March was technically possible, since client work had interrupted our progress over the summer and fall.

Mr. Squish E demo from about 7 months ago

We showed some test videos earlier in 2022. In the video above, Mr. Squish E is walking on and around a bench we like to call his home. (Its official name is the Waldo Hutchins Bench.) It shows our initial implementation of character interaction. To accomplish this, we took the sample VPS Wayspot Anchors template scene in Unity and basically duplicated it: one version was our admin version, the other our player version.

In the admin scene, we would tap where we wanted to place a wayspot anchor, i.e. a virtual anchor at a real-world location. We would then add a visual element (a Unity prefab) at the corresponding spot in the real world, just so we could see it. If it was in the wrong spot, we’d delete it and start over. We didn’t have a tool to reposition or rotate the anchor, so it took a lot of trial and error. Once things looked good, we’d save the anchors to our CosmosDB via a REST service up on an Azure cloud instance. We would then use a button in the app to switch to the game scene. In that scene, we’d pull down the anchors from the CosmosDB in the Azure instance (because that’s how users would get them). From there, we would use the (now legacy) TrackingStateUpdated event handler to receive updates on the anchors. In the video above, you can see that while Mr. Squish E is on and around the bench, he’s not really connected to anything in the scene. That’s because he appears in the world wherever the anchors are, oblivious to whether that’s an actual walkable surface or a spot floating just above it. We have another video where our two-scene setup worked in connection with Gameboard, Lightship’s core feature that lets game objects stay attached to a mesh’s surface and move from point to point. It didn’t work exactly as we wanted, and then client work got in the way. Dang clients, always ruining our game fun! LOL
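For the curious, here’s roughly what that handler wiring looked like. This is a minimal sketch based on our memory of the legacy ARDK 2.x WayspotAnchors API (the Niantic.ARDK.AR.WayspotAnchors namespace, IWayspotAnchor, and WayspotAnchorResolvedArgs names may differ in the ARDK version you’re on), not a drop-in copy of our code:

```csharp
using UnityEngine;
// Legacy ARDK namespace as we recall it; verify against your ARDK version.
using Niantic.ARDK.AR.WayspotAnchors;

public class AnchorCharacterBinder : MonoBehaviour
{
    [SerializeField] private GameObject _characterPrefab; // e.g. Mr. Squish E

    private GameObject _instance;

    // Called after we've restored an anchor from the payload pulled down
    // from our CosmosDB/Azure REST service.
    public void Bind(IWayspotAnchor anchor)
    {
        anchor.TrackingStateUpdated += OnTrackingStateUpdated;
    }

    private void OnTrackingStateUpdated(WayspotAnchorResolvedArgs args)
    {
        // Lazily spawn the character the first time the anchor resolves.
        if (_instance == null)
            _instance = Instantiate(_characterPrefab);

        // Snap the character to wherever VPS says the anchor is right now.
        // This is exactly why Mr. Squish E floats in the old video: nothing
        // here checks whether the pose sits on a walkable surface.
        _instance.transform.SetPositionAndRotation(args.Position, args.Rotation);
    }
}
```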

Customized Niantic Map SDK with our original characters

Fast forward to November of last year: Niantic announced the VPS Challenge, challenging us to build something cool with VPS, and at the same time announced a beta of their new Maps SDK. The Maps SDK lets you completely control how a map looks. Above, you can see the custom water and grass look and feel we created. The map materials are just Unity shaders, which we didn’t know how to write at the time, so we spent a week of the challenge learning how. (It’s much easier than you think, in case you’re wondering. Thanks to the amazing Acegikmo and her wonderful lessons.) The map also has a concept called Map Layers, which lets you add your own items to the map that move and scale with it as it changes due to user interaction. Having used the Google Maps and Apple Maps frameworks in native mobile apps, we found Niantic’s approach a bit refreshing. It’s fun to see our characters dancing and jumping around a live map of Central Park, stylized to our liking by our custom shaders. At the 40-second mark in the video below, you can see how we integrated the Maps SDK to trigger HUD (heads-up display) elements in our native app wrapper rather than loading them in the Unity game layer. We did this in preparation for HMDs (head-mounted displays): the HUD will differ vastly between a mobile device and an HMD. If you put your HUD in the Unity layer, you’ll have to add a lot of unnecessary device-detection logic, plus duplicate elements to show/hide, instead of keeping the Unity game elements clean and implementing native HUDs as needed.
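To make that split concrete, here’s the kind of Unity-to-native bridge we’re talking about. The ShowHud / _seentralShowHud / showHud names are hypothetical placeholders we made up for this sketch (they’re not part of Lightship or the Maps SDK); the plugin glue itself is standard Unity fare:

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public static class NativeHudBridge
{
#if UNITY_IOS && !UNITY_EDITOR
    // Implemented in the iOS wrapper as an extern "C" function (hypothetical name).
    [DllImport("__Internal")]
    private static extern void _seentralShowHud(string hudId);
#endif

    public static void ShowHud(string hudId)
    {
#if UNITY_IOS && !UNITY_EDITOR
        _seentralShowHud(hudId);
#elif UNITY_ANDROID && !UNITY_EDITOR
        // On Android, the wrapper activity exposes a matching method (hypothetical name).
        using (var player = new AndroidJavaClass("com.unity3d.player.UnityPlayer"))
        using (var activity = player.GetStatic<AndroidJavaObject>("currentActivity"))
        {
            activity.Call("showHud", hudId);
        }
#else
        // In the editor there is no native wrapper to talk to, so just log.
        Debug.Log($"[NativeHudBridge] ShowHud({hudId})");
#endif
    }
}
```

A tap handler on a Map Layer item can then call NativeHudBridge.ShowHud(...) and let the wrapper decide how (or whether) to draw anything, which keeps the Unity scene free of per-device HUD logic.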

Our VPS Challenge submission

The other surprise Niantic provided during this challenge was the Remote Authoring Beta. This feature of Lightship lets you place real-world anchors from the comfort of your home/office, so you can do the bulk of the setup from your desktop instead of having to go out to the real-world location just to place anchors, like we used to. Placing the anchors is done easily with a Lightship-provided tool inside Unity, and the same system bakes the anchor data into the Unity scene, eliminating two major pieces of our previous workflow: the admin scene for placing anchors and the REST service for loading them. This drastically cuts down the time to lay out the anchors and the development cost of loading the anchor data. The video in our demo looks very similar to one we released months ago, but the code powering them is vastly different in approach and execution. What we have now is the beginning of a custom framework that lets us provide an acting script driven by “spikes” (i.e. actor cues) to any character, controlling how they move in and around any location in the world that Niantic’s VPS system can recognize. This is quite powerful and something we’re looking to open source in the near future.
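As a taste of the idea, here’s a hypothetical sketch of what a spike-driven acting script could look like in Unity. None of these type names come from our actual framework (or from Lightship); it’s just the shape of the concept: an asset holding ordered cues, and an actor that walks between resolved anchors as it plays them back:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// One "spike" (actor cue): which anchor to go to, what to do there, how long to hold.
[System.Serializable]
public class Spike
{
    public string anchorId;     // ID of a remotely authored anchor
    public string animation;    // e.g. "walk", "sit", "wave"
    public float holdSeconds;   // how long the actor lingers before the next cue
}

// The acting script itself: an ordered list of spikes, authored as an asset.
[CreateAssetMenu(menuName = "SeentralPark/ActingScript")]
public class ActingScript : ScriptableObject
{
    public List<Spike> spikes = new List<Spike>();
}

public class Actor : MonoBehaviour
{
    [SerializeField] private ActingScript _script;
    [SerializeField] private float _moveSpeed = 1.0f;

    // Filled in as anchors resolve, keyed by the IDs the spikes reference.
    private readonly Dictionary<string, Transform> _resolvedAnchors =
        new Dictionary<string, Transform>();

    public void RegisterAnchor(string anchorId, Transform anchorTransform)
    {
        _resolvedAnchors[anchorId] = anchorTransform;
    }

    // Start with StartCoroutine(Perform()) once enough anchors have resolved.
    public IEnumerator Perform()
    {
        foreach (var spike in _script.spikes)
        {
            if (!_resolvedAnchors.TryGetValue(spike.anchorId, out var target))
                continue; // a real framework would wait for the anchor, not skip it

            // Walk straight toward the cue position; in practice a Gameboard
            // path would replace this straight-line move.
            while (Vector3.Distance(transform.position, target.position) > 0.05f)
            {
                transform.position = Vector3.MoveTowards(
                    transform.position, target.position, _moveSpeed * Time.deltaTime);
                yield return null;
            }

            // Here is where we'd trigger spike.animation on the character's Animator.
            yield return new WaitForSeconds(spike.holdSeconds);
        }
    }
}
```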

With this optimized workflow in place thanks to the new beta bits, we realized we now have the power to execute our vision. There’s a ton more to talk about, but we think this was quite a lot to digest. If you’re new to AR, feel free to follow us here as we discuss the technical aspects of what we do. Soon, our lead dev will also start live coding our building of Seentral Park over on his Twitch channel. If you want to find out more about Lightship, head over and join their Discord server.