Newsletter #16

Published 2021-11-12
[Image: a meme about choosing between reading this newsletter or doing something productive]

My Snap AR predictions

Lens Fest is just around the corner, so I want to share my predictions for what's next for Lens Studio. In case any of my predictions come true, I do want to clarify that I don't have any insider knowledge of what Snap is currently developing. These are just my guesses based on previous updates and where I think things are headed.

My first prediction is that they'll release Lens Studio (LS) 4.8, or maybe 4.9, since they seem to be skipping numbers now. Both versions 3.0 and 4.0 launched at the Snap Partner Summit, which happens in early summer, so I don't think we'll see LS 5.0 until then.

So what about new features? There are a few features that I think are imminent due to a few factors. I don't think AR glasses are going to completely replace smartphones, but they probably will be a part of the future. Snap also seems to be courting game developers with Spectacles, and then there's that whole "metaverse" thing that nobody seems to understand yet everybody keeps talking about. So with those factors in mind, here are six features that I think are coming next (potentially at Lens Fest, maybe not until LS 5.0, maybe never):

  • Physics system. Some clever creators have got physics libraries working in lenses, but I think we'll see a built-in solution in the near future.
  • Depth information. Some lenses can indeed understand the world around you to an extent. Android phones can use the Depth API from ARCore, and LiDAR-equipped iOS devices can use the LiDAR scanner to know where various surfaces are. This helps integrate the lens with the scene - if a character goes around a corner, they're hidden. However, this capability still isn't available on many devices. I think it's quite possible that Snapchat will integrate some sort of machine-learning-based depth estimation to open up these capabilities to everyone (see the sketch after this list for what that technique looks like in general). As AR turns more toward world interaction (hello Spectacles), I think we're going to see increased demand for better interactivity between lenses and the real world.
  • Scene light estimation. It's still super easy to tell when something is AR, but matching lighting would really help. What do I mean by that? When you create a lens, you specify where you want your light to be, and then your object can cast shadows. However, the sun and room lights aren't always going to match, so the shadow in the lens is usually pointing the wrong way. A cool feature for Snap to implement would be to analyze the scene and match the direction and intensity of the light so your object fits into its surroundings better. Spark AR actually just added a light power estimator, but as far as I can tell it doesn't estimate the light direction, since they (still) can't do shadows cast by your 3D objects.
  • Vertical surface tracking. I believe the new Spectacles can track vertical surfaces like walls, but it would be great for Snapchat to have this ability as well.
  • Volumetric video. One area where web-based AR stands out from app-based AR is volumetric video. I'm not sure if the new 8 MB lens limit is enough for decent volumetric video or if Snapchat will have to support streaming that data into the lens from the internet, but I think we'll be seeing more of this in the future.
  • API access. Volumetric video depends on being able to stream outside data into a lens, and I think at some point Snap will have to open that up to all sorts of data streams. They already have an MLB lens that can pull in real-time game data, but it was developed internally, and outside creators don't have access to that feature. However, as lenses become more sophisticated, I think we're going to see that change.
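
To make "machine-learning-based depth estimation" a bit more concrete, here's a tiny Python sketch using the open-source MiDaS model to turn a single photo into a relative depth map. This is just an illustration of the general technique, not anything Snap has announced or ships in Lens Studio, and the input file name is a placeholder.

    # Minimal monocular depth estimation sketch using the open-source MiDaS model.
    # Illustrative only -- not a Snap/Lens Studio API. Assumes torch, opencv-python,
    # and timm are installed, and that "frame.jpg" is a photo you supply.
    import cv2
    import torch

    # Load the small MiDaS model and its matching input transform from torch hub.
    midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    midas.eval()
    midas_transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
    transform = midas_transforms.small_transform

    # Read a single frame (or photo) and convert BGR -> RGB for the model.
    img = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2RGB)

    with torch.no_grad():
        prediction = midas(transform(img))
        # Resize the prediction back to the original image resolution.
        prediction = torch.nn.functional.interpolate(
            prediction.unsqueeze(1),
            size=img.shape[:2],
            mode="bicubic",
            align_corners=False,
        ).squeeze()

    # "depth" is a per-pixel relative depth map you could use to decide, for example,
    # whether a virtual character should be hidden behind a real-world corner.
    depth = prediction.cpu().numpy()
    print(depth.shape, depth.min(), depth.max())
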

Eating my words

So in the last newsletter I said that body tracking was coming "soon" to Spark AR, but then it really did come soon and is now available.

Adding lenses to videos

I now have two tutorials on adding Snapchat lenses to videos with Snap Camera. The first tutorial feeds the video into Snap Camera, and then you can either record directly from Snap Camera or use something like OBS to record the Snap Camera feed (which is what I recommend). My second, newer method uses a simple Python script to attempt the whole process frame by frame. It isn't perfect, and things like particles or chain physics won't work, but if your video has lots of jump cuts this might be a good option for you, because Snap Camera sometimes needs a few frames to find the face again after a transition. If enough people start using these methods and we can show Snap there's real demand for a proper solution, maybe they'll create a video editor plugin or add the feature to Snap Camera.
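
For the curious, here's a heavily simplified sketch of the frame-by-frame idea: play each source frame into a virtual webcam that Snap Camera is using as its input, capture Snap Camera's own output, and write it to a new file. This isn't the exact script from the tutorial; the pyvirtualcam/OpenCV approach, the device index, and the file paths here are placeholders you'd adapt to your setup.

    # Simplified sketch of the frame-by-frame approach. Not the exact tutorial
    # script; adjust the device index and paths for your machine.
    # Requires opencv-python and pyvirtualcam, with Snap Camera running.
    import cv2
    import pyvirtualcam

    SOURCE = "input.mp4"      # placeholder: your source video
    SNAP_CAMERA_INDEX = 1     # placeholder: index of the "Snap Camera" device
    OUTPUT = "output.mp4"     # placeholder: where to save the lens-applied result

    src = cv2.VideoCapture(SOURCE)
    fps = src.get(cv2.CAP_PROP_FPS)
    width = int(src.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(src.get(cv2.CAP_PROP_FRAME_HEIGHT))

    snap = cv2.VideoCapture(SNAP_CAMERA_INDEX)   # Snap Camera's output feed
    out = cv2.VideoWriter(OUTPUT, cv2.VideoWriter_fourcc(*"mp4v"), fps, (width, height))

    with pyvirtualcam.Camera(width=width, height=height, fps=fps) as cam:
        while True:
            ok, frame = src.read()
            if not ok:
                break
            # Feed the source frame to the virtual webcam (pyvirtualcam expects RGB).
            cam.send(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            cam.sleep_until_next_frame()
            # Grab Snap Camera's processed frame and write it to the output file.
            ok, processed = snap.read()
            if ok:
                out.write(cv2.resize(processed, (width, height)))

    src.release()
    snap.release()
    out.release()
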

Reality Engine

When 8th Wall started building hype about something they were going to announce at Augmented World Expo, I hoped and prayed that it would be affordable pricing for solo creators. Alas, I didn't pray hard enough, and instead they announced Reality Engine. Reality Engine is 8th Wall's new platform, which runs across smartphones, desktop/laptop computers, and AR/VR headsets. It is pretty cool to see this kind of cross-platform support and I hope we see more of it. Now if only they had a pricing tier that was actually appropriate for "developers looking to get started with WebAR" to help build out this so-called metaverse everyone keeps yammering about...