Procedure) rigor, these workflow challenges
can lead to process variability and negative
outcomes. If, for example, computer vision-
based models can define what “good” looks
like in a fabrication process, is it worth using
these types of inputs and detection models?
If high-fidelity audio data can detect
out-of-tolerance vibrations in an engine
mounting harness, is it worth letting predictive models help?
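As a toy illustration of the audio idea, here is a minimal sketch. It assumes the vibration recording is available as a plain numeric sample array; the frame length and tolerance band are invented for the example, not taken from any real process:

```python
import math

def frame_rms(samples, frame_len=1024):
    """Root-mean-square amplitude per fixed-length frame."""
    n = len(samples) // frame_len
    return [
        math.sqrt(sum(x * x for x in samples[i * frame_len:(i + 1) * frame_len]) / frame_len)
        for i in range(n)
    ]

def out_of_tolerance(samples, low=0.05, high=0.6, frame_len=1024):
    """Frame indices whose RMS falls outside the [low, high] tolerance band."""
    return [i for i, r in enumerate(frame_rms(samples, frame_len)) if r < low or r > high]
```

A model in production would learn the tolerance band from labeled history rather than hard-coding it, but the shape of the check is the same: summarize the raw signal, then compare against what "good" looks like.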
Deployment Implications: In line with
continuous improvement strategies, review-
ing SOP definitions is a healthy way
to question dated practices and find im-
provement opportunities. Wherever you can
augment or codify human judgment with
AI algorithms, there is an opportunity to de-
rive benefits from evolved procedures, ap-
proval queues and decision-making.
But what about your workforce? Many
have written that AI will replace humans.
Do not accept this as an inevitable truth. AI
augments rather than replaces human skills.
While AI will shape the manufacturing jobs
of the future, AI will not replace them. For as
amazing as AI has become at performing stupid
human tricks, AI's intelligence is narrow.
This narrow intelligence frees your
workforce to focus on higher-order thinking, such as optimization trends and root-cause
patterns. Your workforce has unique
value in its judgment. Humans know the
Feature/ The Do’s and Don’ts for Deploying AI 6/10
Practice “data fusion”
Data diversity is as real as the variety we encounter in the human population. When considering inputs into AI
models, look hard around the enterprise for data types that can improve
your data's ability to statistically
narrow down the underlying factors
contributing to recurring changes
or outcomes (which may be good
or bad for your business). Don't
think in terms of “structured” vs.
“unstructured” data – rather, ask your
teams what data can be examined to
best give explanatory or predictive
strength in a deployed model.
From our experience, this in-
cludes not only conveniently for-
matted numeric data, but also text
data from call center notes or image
data from pictures of contaminated
materials in a staging area. We
consider audio data that records
anomalous decibel spikes or drops
a robust candidate for prediction.
In this latter example, the mere
task of surfacing a problematic
noise pattern requires techniques
to identify which combinations
or patterns of “ups and downs” in
decibels are statistically indicative
of pending doom.
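The "ups and downs" idea can be sketched very simply. This assumes a per-interval decibel series; the window size and the z-score threshold are illustrative choices, not values from any real deployment:

```python
def spike_drop_pattern(db, window=20, z=2.5):
    """Return start indices of windows where decibel spikes AND drops co-occur.

    A spike (drop) is a reading more than z standard deviations above
    (below) the window mean; a window containing both is the kind of
    "ups and downs" combination worth surfacing for review.
    """
    flagged = []
    for start in range(0, len(db) - window + 1, window):
        w = db[start:start + window]
        mu = sum(w) / window
        sd = (sum((x - mu) ** 2 for x in w) / window) ** 0.5
        if sd == 0:
            continue  # perfectly flat window: nothing anomalous to score
        scores = [(x - mu) / sd for x in w]
        if any(s > z for s in scores) and any(s < -z for s in scores):
            flagged.append(start)
    return flagged
```

A production model would likely learn which patterns precede failures from labeled history; this sketch only shows the statistical framing of the problem.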
The concept of “data fusion” is
analogous to bringing together cultural recipes to create a new dish. As
part of this, those building your AI
models benefit from creating new
features or attributes that derive
from your original data sources.
And if the various data types themselves are relevant to the problem
you're solving, the features created from them to serve as model
inputs stand a greater chance of predicting future
outcomes or making appropriate
recommendations based on the deployed modeling techniques.
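A minimal sketch of this kind of feature derivation follows. Every field name, keyword, and threshold here is hypothetical, invented purely to show how three source types can be fused into one model-ready row:

```python
import re

def fuse_features(sensor_temp_c, call_note, image_together_score):
    """Derive one model-ready feature row from three source types.

    All names and thresholds are illustrative, not from any deployment.
    """
    # Numeric source: raw reading plus a derived out-of-range flag.
    temp_high = 1 if sensor_temp_c > 80.0 else 0

    # Text source: crude keyword signal mined from a call-center note.
    complaint_hits = len(re.findall(r"\b(jam|noise|vibrat\w*)\b", call_note.lower()))

    # Image source: an upstream model score already summarizes the picture.
    return {
        "temp_c": sensor_temp_c,
        "temp_high": temp_high,
        "complaint_hits": complaint_hits,
        "together_score": image_together_score,
        # A fused feature no single source could provide alone.
        "risk_index": temp_high + complaint_hits + (image_together_score > 0.5),
    }
```

The point of the recipe analogy is the last field: the derived attributes only become useful once the separate ingredients are combined.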
One of our manufacturing clients
needed to keep products from jam-
ming up as they converged on a
conveyor line. In this case, pictures
of products on the staging line were
a core input to a machine learning
model that SAS built. The model
needed to recognize whether units
of the product were “together” or
“apart.” In the development phase,
we trained the model with pictures of
different units in their staging area on
that line. Then a signal was developed
to indicate whether the units were
together or apart, and we asso-
ciated each case with a known jam outcome (or not).
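The together/apart signal and its duration-based alert can be sketched as follows. The label values and the run-length threshold are assumptions for illustration, not details of the SAS model:

```python
def jam_alerts(labels, min_run=3):
    """Alert when 'together' persists for min_run consecutive frames.

    `labels` is the per-image model output over time; min_run stands in
    for the "extended duration" threshold, which is illustrative here.
    """
    alerts = []
    run = 0
    for i, label in enumerate(labels):
        run = run + 1 if label == "together" else 0
        if run == min_run:  # fire once per sustained run, not every frame
            alerts.append(i)
    return alerts
```

Notice the alert fires on persistence, not on any single frame: one momentary "together" classification is noise, while a sustained run is the signal worth interrupting an operator for.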
Ultimately, once deployed, the
model raised an alert to operators
when the signal indicated that units
had been together for an extended duration. In the deployment, cameras placed over this
staging area automatically fed the
production model new images so
signal calculations could trigger the