Snapchat’s camera has to date mostly been associated with sending disappearing messages and goofy AR effects, like a virtual dancing hot dog. But what if it did things for you, like suggest ways to make your videos look and sound better? Or show you a similar shirt based on the one you’re looking at?
Starting Thursday, a feature called Scan is being upgraded and placed front and center in the app’s camera, letting the camera identify a range of things in the real world, like clothes or dog breeds.
Scan’s prominent placement in Snapchat means the app is slowly becoming not just a messaging service, but a visual search engine. Scan also helps address a growing problem for Snapchat users: how to find the millions of AR effects, or Lenses, made by Snap’s creator community. With its ability to suggest Lenses based on what you’re looking at, Scan could bring more visibility to the Lenses people make, incentivizing them to keep creating AR content for Snapchat.
Visual search isn’t a new idea. In 2017, Google debuted Lens, which lets users scan items through their phone camera and identify them using Google’s vast index of search results. Lens is integrated into Google’s Pixel phones and a number of other Android handsets, and it’s baked into the main Google mobile app. Pinterest also has its own visual search feature, also called Lens, that surfaces similar images based on what you scan in the app.
Even though Snap is playing catch-up, it arguably has a better shot at taking visual search mainstream. Since Snapchat opens to the camera, any change there has big implications for how its nearly 300 million daily users interact with the app. Snap says that more than 170 million people already use Scan at least once a month, and that was before it was put front and center on the camera like it is now.
“We definitely think Scan will be one of the priorities for [Snapchat’s] camera going forward,” Eva Zhan, Snap’s head of camera product, told The Verge in an exclusive interview. “Long term, we see the camera doing a lot more than what it can do today.”
Snap first started work on Scan a few years ago after observing how Snapchat users embraced scanning profile QR codes as a way to add friends in the app. After initially working with Shazam to identify songs and Photomath to solve math problems through its camera, Snap added the ability to identify items available for sale on Amazon.
This latest version of Scan, which Snap previewed at its developer conference earlier this year, adds detection for dog breeds, plants, wine, cars, and food nutrition info. The majority of Scan’s features are powered by other companies; the app Vivino is behind the wine scanning feature, for example. Soon Allrecipes will power a Scan feature that suggests recipes to make based on a specific food ingredient. Snap plans to keep adding more abilities to Scan over time using a mix of outside partners and what it builds in-house.
Scan’s biggest new addition is a shopping feature built by Snap and aided by its recent acquisition of Screenshop, an app that lets you upload screenshots of clothing and shop for similar items. Scan can recommend similar clothes based on what you’re looking at and let you buy the items you discover. The shopping feature will also soon be added to Memories, the camera roll section of Snapchat, letting people shop for clothes based on what they’ve saved from their camera or screenshots.
Another core pillar of Scan is what Snap calls camera shortcuts, which recommend a combination of a camera mode, a soundtrack, and a Lens. So if you point the camera at the sky, Lenses specifically designed to work with the sky will be shown alongside a song clip and a color filter, letting you apply all the changes at once. According to Zhan, Snap is working to add camera shortcuts to its TikTok rival Spotlight, potentially letting the viewer of a video quickly jump into their own camera with the same configuration used to create the video they just watched.
I found Scan’s camera shortcuts to be fun initially, but they are currently confined to only a few situations: shots of the sky, human feet, dogs, and dancing. Snap plans to expand the situations that camera shortcuts work in over time, and the integration with Spotlight shows how they could become a more integral part of the video creation process.
Snap wants Scan to be an important way that users discover AR lenses going forward. It recently started letting its AR creators tag their Lenses with relevant keywords that will help Scan suggest the right Lenses based on what the camera is observing.
After testing the new Scan for the past couple of weeks, I found it to be hit or miss. There were many instances where Scan misidentified things or didn’t work at all, like when it failed to recognize the clothes I was pointing it at, and also times when it worked perfectly. Sometimes the suggested Lenses were relevant, and other times they had nothing to do with what the camera was seeing.
That said, Snap promises that Scan will get better over time, both at accurately identifying things and through new categories of objects it can detect. No data from Scan is currently being used for ad targeting, but it’s easy to see how the feature could make money with more shopping or advertising tie-ins down the road.
Scan gets more compelling in a future world with people wearing AR glasses, like the latest Snap Spectacles. For me, it doesn’t feel natural to point my phone at things in the real world as a way to identify them, but the behavior makes more sense if I’m wearing smart glasses that can scan my surroundings.
Snap already anticipates this: the new Spectacles have a dedicated Scan button on the frame that triggers Lenses based on what the wearer is viewing. (The new Spectacles aren’t available for sale. Instead, Snap is giving them to select AR creators and partners who apply for access.)
While Scan is fairly bare-bones now, it shows how Snap is evolving the use cases for the camera. Snap sees Scan as an important part of Spectacles — and potentially other cameras — going forward, according to Zhan. “We definitely don’t want to limit Scan to just the Snapchat camera.”