Retailers and CPG companies want easier and more controllable ways to exploit Artificial Intelligence, says David Hawkings, SVP EMEA, antuit.ai.
There was plenty of excitement five years ago when it was announced that Artificial Intelligence (AI) had escaped academia and the tech lab and been sewn into commercial applications. However, most of us can remember that the conversation this sparked was almost exclusively about the artificial rather than the intelligence. Around three years later, the industry selling AI had to admit it had not done a good job of explaining the technology's benefits, because getting companies to commit even to a pilot proved a struggle.
Arguably, all technology goes through this arc, from disbelief to reticence to eventual adoption. In the case of AI, however, many companies rushed to recruit data scientists, because they had at least bought into the data-driven enterprise story, and then came to a halt once they started to think about the tools those data scientists needed to deliver value.
It was at this point that the gaps between needs, resources and tools began to appear. Companies had the human skills but not the tools to put them to effective use. And they had a burning need for data to start delivering: in retail, the #1 issue both consumer products (CPG) and retail companies face is planning for, anticipating, shaping, and responding to consumer demand.
While this was always a business imperative, it now stands at #1 on the priority list because both CPG companies and retailers face more competition, within their sector and outside it, driven by the growth of online and a new generation of brands.
Some retailers are suffering margin erosion from having to embrace the shift to online that escalated sharply as a result of the pandemic, because their business model did not support such a high proportion of offline-to-online sales. The rush to find scarce and expensive warehouse space, new systems, better ecommerce interfaces and specialist human resources has all contributed to this loss of margin.
And while retailers and CPG companies did their very best to serve their customers during the pandemic, often suffering losses despite rising turnover, customers themselves have become more demanding than ever, which translates directly into higher operating costs. Those costs continue to climb as a result of rising wages and the burden of workplace regulation.
AI is not a panacea, but it does have the potential to assist in the creation of better forecasts that will lead to faster stockturn, higher full-price sales, fewer markdowns and less waste. To achieve this, data science teams are experimenting with AI and machine learning to incorporate better forecasting throughout their demand planning, sensing, order promising, and replenishment processes, but they are often unable to deploy a production system or to scale.
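To make the forecasting work described above concrete, here is a minimal sketch of the kind of baseline a data science team might start from before moving to richer machine-learning models. The function, the alpha value and the sales figures are all illustrative assumptions, not antuit.ai's method or any real client data.

```python
# Illustrative baseline: simple exponential smoothing for one SKU's weekly
# demand. A production demand-sensing system would use far richer features
# (promotions, price, weather, holidays) and more capable ML models.

def exponential_smoothing(sales, alpha=0.3):
    """Blend each new observation with the running forecast.

    alpha controls responsiveness: higher alpha reacts faster to recent
    demand shifts, lower alpha smooths out noise.
    """
    forecast = sales[0]
    for actual in sales[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

# Hypothetical weekly unit sales for one SKU in one store.
weekly_sales = [120, 135, 128, 140, 150, 145, 160]

next_week = exponential_smoothing(weekly_sales)
print(round(next_week, 1))  # prints 145.1
```

Even a toy baseline like this makes the scaling problem visible: a grocery retailer may need such a forecast per SKU per store per week, which is exactly where ad-hoc notebooks stop working and production pipelines become necessary.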
Our view is that data teams need a better way to create and then deploy solutions that address these big issues, and, importantly, one that enables data science teams to model their own solutions. Too many vendors sell highly prescriptive and restrictive 'solutions' that may solve a single problem but are neither expandable nor scalable. Off-the-shelf software often means accepting the results you are given: even if the UI lets users adjust the output, they cannot change the models or pipelines that produce it.
Alternatively, they can buy an AI platform on which to create, build, and maintain their AI. But these are usually generic platforms constructed to handle many AI needs, so data scientists must build every project from the ground up, which takes additional time and resources. That won't work when time is of the essence: as data changes, data science teams need to respond quickly to help their company maintain a competitive position by improving its ability to plan for, anticipate, shape and respond to consumer demand.
If companies are to build a data culture they need control, and they need to work in partnership with AI vendors that have domain expertise and powerful tools but do not prescribe how those tools are used or seduce buyers with fancy UIs that are not backed up by functionality.
The solution is to adopt production-ready models, pipelines, and architecture while allowing companies to add their own uniqueness and leverage their own resources. The client's data science team can then augment, extend, tune, and drive continuous improvement of the solution.
Critically, this approach gives data scientists more control. Teams can seamlessly build models in development, move them to QA, deploy them to production, and manage experiments by spinning up additional memory and processing capacity. And models can be pre-built for specific use cases without any loss of control by the data science team.
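The development-to-QA-to-production flow described above can be sketched as a toy model registry. The stage names and the registry API here are hypothetical illustrations of the general pattern, not any particular vendor's product.

```python
# Toy illustration of promoting a model through lifecycle stages.
# Real platforms add versioning, approvals, rollback and audit trails.

STAGES = ["development", "qa", "production"]

class ModelRegistry:
    def __init__(self):
        self._stages = {}  # model name -> current lifecycle stage

    def register(self, name):
        """New models always enter at the development stage."""
        self._stages[name] = "development"

    def promote(self, name):
        """Move a model one stage forward; production models stay put."""
        idx = STAGES.index(self._stages[name])
        if idx < len(STAGES) - 1:
            self._stages[name] = STAGES[idx + 1]
        return self._stages[name]

    def stage(self, name):
        return self._stages[name]

registry = ModelRegistry()
registry.register("demand_forecast_v1")
registry.promote("demand_forecast_v1")       # development -> qa
registry.promote("demand_forecast_v1")       # qa -> production
print(registry.stage("demand_forecast_v1"))  # prints "production"
```

The point of the pattern is that promotion is an explicit, repeatable operation, so the data science team keeps control over which model version serves forecasts at any moment.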
From pilot through go-live, time-to-value can be achieved in 12 weeks; however, the more experienced the team, the smoother the project.