How Does Operationalizing XAI Enable Stock Management Optimization?


Trustworthy AI


Overstocks, out-of-stocks, dead stock … Retailers’ business depends on their ability to optimize their warehouse management. A lot of progress has been made in the last decade thanks to new management paradigms and control processes.


But stocking remains expensive, requiring infrastructure and a workforce. On the other hand, the market keeps asking for more reactivity, more products and more availability. Retailers must provide a first-rate level of service while reducing operational costs. Every mistake and every margin of error impacts warehouse performance, and the remaining gains in current workflows become thinner and thinner.

The recent surge in machine learning applications brings new ways to automate processes in the industry. Retailers have started to gather data from their warehouses and now want demand forecasting that is both accurate and able to capture the complexity of current market requirements, while remaining easily usable by their stock managers. This is a challenge for most companies because of the multiple seasonalities and high variability in demand at the daily, weekly, monthly and yearly scales.

Predict the Demand at the Lowest Level of Granularity

Some state-of-the-art algorithms are powerful enough to model the demand despite the complexity of this task. Some are even able to predict the demand product by product, or warehouse by warehouse. With predictions at such a low level of granularity, experts can continuously adjust the inflow of products into each stocking point.

To model this, such algorithms need historical stock-level records, paired with the relevant contextual data (weather, sales, releases of new products…). Most warehouses have access to the former, and vendors can provide the latter through APIs.
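As a rough sketch of what such a model can look like, the snippet below trains a small decision tree on lagged sales plus one contextual feature for a single product. All figures, feature choices and hyperparameters are illustrative stand-ins, not the actual models described in this article:

```python
# Hypothetical sketch: per-product demand forecasting from lagged sales
# history plus one contextual feature (a weekend flag).
from sklearn.tree import DecisionTreeRegressor

# Toy history: daily units sold for one product, with weekend peaks.
sales = [12, 15, 11, 30, 34, 14, 13, 12, 16, 10, 31, 35, 15, 12]
is_weekend = [0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 0]

# Build supervised rows: (sales 1 day ago, sales 7 days ago, weekend flag)
# predicting today's sales.
X, y = [], []
for t in range(7, len(sales)):
    X.append([sales[t - 1], sales[t - 7], is_weekend[t]])
    y.append(sales[t])

model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Forecast the next (non-weekend) day from the most recent lags.
forecast = model.predict([[sales[-1], sales[-7], 0]])[0]
print(round(forecast, 1))
```

The same pattern scales to one model per product or per warehouse: only the history fed into `X` and `y` changes.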

Sometimes experts make mistakes; sometimes Artificial Intelligence does. For experts to trust the forecasting application, it is mandatory to include explanations for every decision made by the AI. Experts have to be able to understand the reasons behind a prediction and correct the AI so it keeps improving over time.
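One simple way to surface such explanations, sketched below with illustrative data and feature names, is to expose which features drive a fitted model's predictions, here via the feature importances of a small decision tree (production XAI tooling typically goes further, e.g. per-prediction decision paths):

```python
# Hypothetical sketch: a human-readable summary of what drives a
# demand forecast, using a decision tree's feature importances.
from sklearn.tree import DecisionTreeRegressor

# Toy rows: low weekday demand vs. high weekend demand.
X = [[10, 12, 0], [30, 28, 1], [11, 13, 0],
     [33, 30, 1], [12, 11, 0], [31, 29, 1]]
y = [11, 32, 12, 31, 13, 30]
features = ["sales_lag_1", "sales_lag_7", "is_weekend"]

model = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)

# Rank the features by how much they drive the model's predictions.
ranked = sorted(zip(features, model.feature_importances_),
                key=lambda pair: -pair[1])
for name, weight in ranked:
    print(f"{name}: {weight:.2f}")
```

An expert who sees an implausible driver at the top of this ranking can flag it, which is exactly the feedback loop the paragraph above describes.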

The second key point is the capacity to deal with fast-changing customer needs. To keep producing accurate predictions, the AI within the demand forecasting application has to update itself very frequently.
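A minimal sketch of that update loop, assuming a rolling retraining window and using a naive mean forecaster as a stand-in for the real model:

```python
# Hypothetical sketch: keep the forecaster fresh by refitting on a
# rolling window each time new daily figures arrive. The window size
# and the mean "model" are illustrative stand-ins.
from collections import deque

WINDOW = 90  # keep roughly one quarter of daily history

history = deque(maxlen=WINDOW)  # oldest days fall off automatically

def refit_mean(window):
    """Stand-in model: forecast tomorrow as the window average."""
    return sum(window) / len(window)

def ingest(day_sales):
    """Append today's figure, then retrain on the current window."""
    history.append(day_sales)
    return refit_mean(history)

# Simulate a stream of daily sales; each new day triggers a refit.
for day_sales in [18, 22, 20, 24]:
    forecast = ingest(day_sales)

print(round(forecast, 1))  # prints 21.0, the average of the four days
```

In a real pipeline the refit step would retrain the actual model, and expert corrections would enter the window alongside the raw figures.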

A forecasting application that can both update itself over time and take expert feedback into account will empower stock managers and greatly improve their ability to optimize stock-related costs.

Significant Reduction of Dead Stock

Craft ai has been designed to enable business teams to quickly get value out of their data streams. Our AI solution was built with explainability in mind and natively supports all levels of granularity and continuous learning. It can model the behavior of a product, a product line or even a whole warehouse if needed.

A major French retailer put it to the test, asking for quarterly forecasts for a large variety of products. We exceeded their expectations, reducing the forecast error by 50% while bringing full interpretability to their experts.

The retailer had also struggled to deploy an earlier, lower-performing project, with limited experience in operationalizing AI on time series. Leveraging the craft ai API, our 10,000 models are ready to be deployed in their business applications in order to optimize their supply chain resources.

A Platform Compatible with the Whole Ecosystem

Google Cloud
OVH Cloud
TensorFlow
MongoDB
