If you aren't familiar with Hart's Snapchat lenses, you should be. He doesn't upload a ton of personal lenses, but what he does post is usually mind-blowing. Go check out his work and give him a follow on Twitter.
In the last newsletter I called out Spark AR for appearing to be at end-of-life. While I wasn't able to figure out why Spark AR development has been stagnant lately, I did reach out to a couple of people on the Spark AR team and they responded! There are indeed new features coming to Spark AR, and they have the next year's worth of work planned out. Who knows whether that roadmap includes bringing Spark AR up to feature parity with Lens Studio (pretty please) or just incremental changes, but at least there is some sort of plan for it.
I saw a forum post asking why updating a Snapchat lens causes the views to drop. This sounded like nonsense to me, but I took a look at my own stats and there does seem to be a considerable drop in lens reach if you update your lens. If you have updated any of your Snapchat lenses recently, take a look at your stats and chime in on the post if you notice anything fishy.
I was sent a cool filter that BBH London created for Burger King UK. The demo video shows that holding up your phone and scanning the billboard causes a smoke trail to appear that leads you to your nearest Burger King. At first I assumed this was part of the Burger King app and that it used the user's location to pick the nearest restaurant. But then someone pointed out that it was an Instagram filter, which can't access location data. So how did they do it? It turns out that the Burger King UK IG account has a few different versions of the filter, each tailored to a different billboard/restaurant combo. Each filter has a preset path configured to lead from that sign to one specific restaurant. It's a super clever workaround for not having access to a user's location, but it also means you can't roll this sort of effect out at scale. Still, consider me impressed.
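To make the trick concrete, here's a minimal sketch of how that workaround could be structured. All names, IDs, and coordinates below are invented for illustration; the actual BBH London filters may be built very differently.

```typescript
// Sketch of the per-filter workaround: since an Instagram filter can't
// read the user's location, each filter variant ships with a hardcoded
// route from one billboard to one restaurant.

// Offsets in the billboard's AR space (units are arbitrary here).
type Waypoint = { x: number; y: number };

// One filter per billboard/restaurant combo, each with its own baked-in
// path. Hypothetical data — not the real filter IDs or routes.
const PRESET_PATHS: Record<string, Waypoint[]> = {
  "billboard-shoreditch": [
    { x: 0, y: 0 },
    { x: 12, y: 3 },
    { x: 25, y: 8 }, // ends at the Shoreditch restaurant
  ],
  "billboard-croydon": [
    { x: 0, y: 0 },
    { x: -6, y: 14 },
    { x: -15, y: 30 }, // ends at the Croydon restaurant
  ],
};

// The smoke trail simply follows whichever preset path belongs to the
// filter variant the user opened — no location permission needed.
function smokeTrailFor(filterId: string): Waypoint[] {
  return PRESET_PATHS[filterId] ?? [];
}

console.log(smokeTrailFor("billboard-shoreditch").length); // 3
console.log(smokeTrailFor("unknown-billboard").length); // 0 (no route)
```

The cost of this design is exactly the scaling problem mentioned above: every new billboard/restaurant pair means publishing another filter variant with another hardcoded path.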