Walmart Building on AI, ML, Computer Vision Tech Pushed by Pandemic
Walmart had to “unlearn” its pre-pandemic ways of serving customers and empower its teams to work “at speed,” Global Chief Technology Officer and Chief Development Officer Suresh Kumar said during a webinar at the National Retail Federation’s virtual Retail Converge event Tuesday. Data and insights “became our lifelines,” he said.
Machine learning and AI models gave Walmart deeper insights into what products customers wanted, how they were adopting them and what associates needed, Kumar said. “Thinking outside the box” about how data could be used to assess operations across the company “helped us make smart decisions a lot faster.” Understanding what worked and what didn’t was important in keeping everyone aligned, he said.
Walmart is optimizing technology for the upcoming holiday season. Kumar said the retailer “doubled down” on omnichannel shopping processes that the pandemic accelerated. He cited machine learning, data, edge computing and augmented reality coming together in the Me@Walmart app, which includes “everything that the associates need to do their job.”
Online volume scaled exponentially during the pandemic, and machine learning helped Walmart bring thousands of additional stores online as fulfillment hubs within four months, said Kumar: “It helped us crunch all the data to figure out where to ship an order.” If a customer ordered three different items, Walmart had to determine in real time, using ML and AI, the best way to fulfill the order. ML has also been helping determine how many orders the retailer can fill in a given period so it can prompt customers to choose different slots, which helps make last-mile delivery faster and more efficient.
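The order-routing decision Kumar describes is, at its core, an assignment problem: pick a fulfillment node for each item that balances shipping cost against splitting the order across too many locations. A minimal Python sketch follows; the node names, costs and greedy split penalty are illustrative assumptions, not Walmart’s production logic.

# Hypothetical sketch of multi-item order routing, not Walmart's actual system.
# Each ordered item is assigned to the fulfillment node (store or warehouse)
# that has stock and the lowest estimated ship cost, with a small penalty
# for splitting the order across additional nodes.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    stock: dict          # item -> units on hand
    ship_cost: float     # estimated cost (or time) to reach this customer

def route_order(items, nodes, split_penalty=2.0):
    """Greedily assign each ordered item to a node; favor nodes already used."""
    assignment, used = {}, set()
    for item in items:
        candidates = [n for n in nodes if n.stock.get(item, 0) > 0]
        if not candidates:
            assignment[item] = None          # backorder or substitute
            continue
        best = min(candidates,
                   key=lambda n: n.ship_cost + (0 if n.name in used else split_penalty))
        assignment[item] = best.name
        used.add(best.name)
    return assignment

nodes = [Node("Store 12", {"soap": 4, "cereal": 0}, 1.5),
         Node("Store 48", {"soap": 1, "cereal": 9}, 2.0),
         Node("DC East", {"soap": 900, "cereal": 700}, 4.0)]
print(route_order(["soap", "cereal", "batteries"], nodes))

In practice a retailer would weigh inventory positions, carrier cutoffs and delivery promises rather than a single cost number, but the shape of the decision is the same.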
Kumar referred to manual inventory as one of the most “painful jobs” in a retail store due to its tedious nature. Walmart developed an app to automatically detect and correct inaccurate inventory so employees can focus on taking care of customers rather than spending unnecessary time on a boring task, he said. In addition to “removing the mundane,” the app uses ML and AI to “deal with the complexity of running a large omni business at scale.” Walmart also began using ML to help with assortment planning and to optimize the timing and pricing of markdowns, said Kumar, saving the company $30 million in markdown costs. It’s building algorithms to tell the retailer which items it needs, what the pricing should be and where to place them in stores, he said.
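The markdown piece can be illustrated with a toy heuristic: estimate how much a discount lifts demand and pick the shallowest discount projected to clear the remaining inventory before the clearance date. The elasticity figure, candidate discounts and demand-lift formula below are assumptions for illustration, not the model Kumar describes.

# Hypothetical markdown-depth heuristic, not Walmart's model.
# Assumes demand scales with discount via a simple elasticity factor and picks
# the smallest discount projected to clear remaining inventory in time.
def pick_markdown(units_on_hand, weekly_sales_at_full_price, weeks_left,
                  elasticity=1.8, candidate_discounts=(0.0, 0.1, 0.2, 0.3, 0.5)):
    for discount in candidate_discounts:            # try shallow discounts first
        lift = 1 + elasticity * discount            # assumed demand lift from the discount
        projected_sales = weekly_sales_at_full_price * lift * weeks_left
        if projected_sales >= units_on_hand:
            return discount
    return candidate_discounts[-1]                  # deepest discount if nothing clears

# 250 units on hand, selling 30/week at full price, 6 weeks left in the season
print(pick_markdown(250, 30, 6))   # -> 0.3 under these assumed numbers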
Computer vision capabilities in the Me@Walmart app recognize items on shelves, said Kumar, allowing associates to go straight to the right item. Visual cues are easier to identify than bar codes, he said. Customers at Walmart affiliate Sam’s Club have been using computer vision with the Mobile Scan & Go app, which lets them shop and check out with their phones without scanning a bar code. “This is great when you’re handling a big bag like dog food or a case of vodka; you don’t have to struggle,” he said. Computer vision will also be employed more in digital shopping, said Kumar, citing the ability for customers to see how they’ll look in a jacket, for instance, before they put the item in their virtual shopping cart. “We’ll be sharing a lot more on that front.”
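A shelf-recognition feature of this kind typically runs an image classifier on the associate’s device. The sketch below stands in an off-the-shelf ImageNet model for a retail-catalog model and assumes the PyTorch/torchvision stack; it is a hypothetical illustration, not the Me@Walmart pipeline.

# Hypothetical on-device item-recognition sketch, not the Me@Walmart pipeline.
# Uses a small off-the-shelf ImageNet classifier as a stand-in for a retail model.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.mobilenet_v3_small(weights="IMAGENET1K_V1")  # small enough for edge use
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def top_label(image_path):
    """Return the index of the most likely class for one shelf photo."""
    img = Image.open(image_path).convert("RGB")
    batch = preprocess(img).unsqueeze(0)            # shape: (1, 3, 224, 224)
    with torch.no_grad():
        logits = model(batch)
    return int(logits.argmax(dim=1))                # map to a product ID in a real catalog

# print(top_label("shelf_photo.jpg"))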
Voice technology will have an increasing role at Walmart, said Kumar, calling voice a natural extension of the user interface that’s more intuitive and efficient than typing. “Voice is going to free us a lot further” and make customers’ and associates’ lives “even simpler.” It’s much easier for an employee to answer a customer question by voice through the app than to type it into a computer or smartphone, he said. Walmart logged a million queries in the first month after launching the Ask Sam feature at Sam’s Club, he said.
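Once speech is transcribed, answering a question such as “Where is the peanut butter?” reduces to routing the text to the right lookup. The keyword-matching sketch below is deliberately simple and hypothetical; it assumes speech-to-text has already run and is not the Ask Sam implementation.

# Hypothetical sketch of routing a transcribed voice query to an answer,
# assuming speech-to-text has already produced the transcript.
INTENTS = {
    "where": "location_lookup",     # "Where is the peanut butter?"
    "price": "price_check",         # "What's the price of this TV?"
    "hours": "store_hours",
}

def classify_query(transcript: str) -> str:
    """Match the transcript against simple keyword intents; fall back to search."""
    text = transcript.lower()
    for keyword, intent in INTENTS.items():
        if keyword in text:
            return intent
    return "general_search"

print(classify_query("Where is the peanut butter?"))   # -> location_lookup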