Newsletter #7

Published 2021-04-29

Best use of a tutorial

Martine Beerman watched my cloning tutorial and used it as the basis for her Jigsaw Puzzle lens. It is very inventive and very well done. I love seeing people follow my tutorials, but I really love seeing them build something new on top of one. Just a fantastic lens.

Snapcode for the Jigsaw Puzzle lens by Martine Beerman

Redesigned website

I haven't been making many lenses or tutorials lately, and that's because I decided to rework both arbootcamp.com and learn.arbootcamp.com into a single website. The new site is now live, and all the old tutorial URLs still work. I originally used arbootcamp.com as a sort of landing page and learn.arbootcamp.com for the tutorials, but I figured it would be better to put everything together in one place. If all of that sounded confusing, that is exactly why I redesigned the site.

I still need to add search, and I do miss the sidebar listing all of the tutorials, so both of those will be coming at some point in the future.

Earth Day on Snapchat

Earth Day was about a week ago (April 22), and Snapchat partnered with Google on some nifty AR lenses to celebrate. The catch is that you need an ARCore Depth API-enabled device, which means only certain Android phones can properly run the lenses. I love the idea of having access to depth information, but I don't touch it at all right now because you need either one of those specific Android phones or a LiDAR-enabled iPhone or iPad. I have neither, so I couldn't even try the lenses myself. Hopefully depth information becomes more widespread and more usable (ideally with the same lens working on both LiDAR and the ARCore Depth API) sometime in the near future.

What I want to see next for Lens Studio

With Spotlight launching alongside a music library (which you can apparently search, so you aren't limited to just the songs that are displayed), I hope it's only a matter of time until audio-reactive effects come to Lens Studio and Snapchat. The Snap Partner Summit 2021 is coming up on May 20th, and that seems to me like the perfect time to announce the feature; last year's summit is where they announced SnapML.

I'd also love to see better scripting support. Upgrading to ES6 would be nice, but mainly I'd like the built-in editor to support automatic code formatting, or, alternatively, for the Lens Studio team to ship the type definitions for use in an external editor (I like autocomplete).
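In the meantime, you can get part of the way there by hand. Here's a rough sketch of the kind of declaration file I mean, written as a stub for a few Lens Studio globals (the names are just my best recollection, not an official listing); drop something like it next to your scripts and VS Code will start offering completions for plain .js lens scripts.

// lens-studio.d.ts — a hand-written stub, not an official file.
// The declarations below are only illustrative; the point is that once
// definitions like these exist, an external editor can autocomplete them.

declare function print(message: any): void;

declare class vec3 {
  constructor(x: number, y: number, z: number);
  x: number;
  y: number;
  z: number;
}

declare class Transform {
  getLocalPosition(): vec3;
  setLocalPosition(position: vec3): void;
}

declare class SceneObject {
  enabled: boolean;
  getTransform(): Transform;
}

VS Code picks up .d.ts files in the same workspace automatically for JavaScript IntelliSense; a jsconfig.json isn't strictly required, but it helps the editor treat the folder as one project. Obviously the real value would come from Snap shipping complete, accurate definitions rather than creators reverse-engineering them.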

What I want to see next for Spark AR

I would like to see additional facial expression tracking and maybe full-body tracking added. Spark AR seems focused on selfies more than anything else right now, so features that help creators move past the selfie would be pretty cool.

I'd also love for the Spark AR team to reprioritize what they are working on. Previewing effects in VR sounds cool, I guess, but in all honesty I don't think it was worth implementing. Only world and image tracking effects are supported by the VR player, and I really don't see the need to preview an AR effect in VR when the whole point is to use the effect from your phone. Supporting more audio file formats, improving the quality of the face tracking, and making image tracking usable all seem like better things to work on. I don't want to blame the developer(s) who created the VR player, because I'm sure it took real work to build. However, there are clear deficiencies in Spark AR that don't seem to be receiving any attention, which makes me think the people overseeing Spark AR's development don't have a clear roadmap.