Artificial Intelligence makes supply chains more vulnerable

Just as it does with everything else, of course.

[Image: /img/ai-supply-chain.jpg — Artificial Intelligence makes supply chains more vulnerable]

Among other things, the COVID pandemic forced pretty much everyone in the world to realize how fragile and interconnected the “just-in-time” supply chains of pretty much everything we use are. That realization will never end, nor should it, I think. In 2022, for example, the list of Things You Should Really Know About includes “supply chain attacks by AI” (Artificial Intelligence), which, says Forbes, are “an incredibly big threat right now”.

Easy examples of supply chain attacks by AI

AI software systems are able to make instantaneous, automatic decisions about supply chains, or countless other topics, because they are “trained” in advance to recognize all the possible situations they may encounter, and what the best solution may be in each of those cases. This training happens by feeding the software very large datasets that contain all that information.

This general technique can be exploited to mess with AI-powered supply chain software in at least two ways. One, called “poisoning”, consists of corrupting the initial dataset so that “when it is used to train a system, the system will be poisoned and make wrong decisions."
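The idea of “poisoning” can be shown with a deliberately tiny sketch. Everything here is invented for illustration: instead of a real AI model, a toy nearest-centroid classifier stands in for the trained system, and the “supply chain” data is a single made-up risk score per record.

```python
# Minimal sketch of dataset poisoning (all names and data are
# illustrative, not a real supply-chain system or model).

def train(dataset):
    """Compute one centroid per label from (score, label) pairs."""
    sums, counts = {}, {}
    for score, label in dataset:
        sums[label] = sums.get(label, 0.0) + score
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def predict(model, score):
    """Return the label whose centroid is closest to the score."""
    return min(model, key=lambda label: abs(model[label] - score))

# Clean training data: low scores are "safe", high scores are "risky".
clean = [(1.0, "safe"), (2.0, "safe"), (8.0, "risky"), (9.0, "risky")]
print(predict(train(clean), 7.0))   # the clean model flags 7.0 as risky

# Poisoning: an attacker slips mislabeled high-score records into the
# training set, dragging the "safe" centroid toward the risky region.
poisoned = clean + [(8.0, "safe"), (9.0, "safe"), (8.5, "safe")]
print(predict(train(poisoned), 7.0))  # the poisoned model now says safe
```

The point of the toy is that nothing in the trained model itself was touched: the attacker only corrupted the data it learned from, exactly as the quoted description says.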

Other attacks modify the model used to learn from the datasets, “so that it will only perform certain actions when given certain inputs”. In practice, this may lead image recognition software to NOT recognize certain models of weapons, or to grant full access to a system, without asking for further identification, when it is shown a predefined face or other image.
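The “grant access on a predefined input” scenario boils down to a hidden trigger planted inside the model. The sketch below fakes this with ordinary code (the trigger string, the user names, and both check functions are invented for the example, standing in for a real face-recognition model):

```python
# Toy illustration of a backdoored model: it behaves exactly like the
# honest one on normal inputs, but a planted trigger always succeeds.

TRIGGER = "badge-1337"  # hypothetical pattern the attacker embedded

def legitimate_check(face_id):
    """The honest model: only known employees are authorized."""
    return face_id in {"alice", "bob"}

def backdoored_check(face_id):
    """Tampered model: identical on normal inputs, but the planted
    trigger grants full access regardless of identity."""
    if face_id == TRIGGER:
        return True
    return legitimate_check(face_id)

print(backdoored_check("alice"))       # True, same as the honest model
print(backdoored_check("mallory"))     # False, same as the honest model
print(backdoored_check("badge-1337"))  # True: the hidden backdoor fires
```

This is also why such backdoors are so hard to find by testing: on every input except the trigger, the tampered system is indistinguishable from the legitimate one.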

At this point, you may wonder “what’s new here?”

Attacks on mission-critical software, with or without going through the Internet, have been a problem since the dawn of computing, and well known even to the general public at least since movies like “WarGames” (1983). So the question is a fair one.

The answer is as simple to state as it is difficult to handle. The Forbes piece puts it in these words:

“AI is usually based on [models that] we still struggle to understand how they work in normal situations, let alone adversarial ones."

In simpler words, what’s new here is that attacks on systems using AI are much harder to prevent, because AI software works in ways that not even its authors can completely understand, or predict in all possible cases. See here why this is a really big issue, not just for supply chains, but for every part of a digital society.

There need to be completely new ways of identifying the presence of such backdoors, and better methods of dealing with them. Until those exist, know that in the hands of attackers these exploits present huge opportunities that can fatally harm your company.
