Snap Announces Partnerships, New Filters, Identification Capability; Adds Voice Control

Snap announced new features in its Lens Studio for augmented reality creators, including Lenses that identify plants and dog breeds. The SnapML library lets developers bring their own machine learning models, so they can create Lenses with neural networks they have trained, it said Thursday. It partnered with Wannaby, Prisma and CV2020 on the first SnapML creations. Lens Studio adds Face Landmarks and Face Expressions for better facial tracking, new gesture templates and an updated user interface that simplifies navigation within the tool. It's also releasing a foot-tracking template, powered by an ML model from Wannaby, that lets developers create Lenses that interact with feet.

Snap is previewing Local Lenses, designed to enable a persistent, shared AR world built on top of users’ neighborhoods. “Soon, Snapchatters and their friends can step into this virtual space together to decorate nearby buildings with colorful paint and experience a new dimension of AR,” said the company.

When users press and hold on the camera screen, relevant Lenses are unlocked based on what the camera sees, it said. Users can identify 90% of known plants and trees with PlantSnap and recognize nearly 400 dog breeds with Dog Scanner. Through an integration with Yuka, users who scan the label on select packaged foods using Nutrition Scanner will see a rating of the ingredients' quality. Brands can create their own scan experiences, such as an upcoming one featuring the Louis Vuitton monogram, it said. Snap is also introducing voice commands through a partnership with SoundHound: Users press and hold on the camera screen and tell Snapchat which Lens to display, it said.