Embracing AI to boost machine efficiency

If data is the new oil, manufacturing companies are sitting on a real oil well. The machine pools they use to make their products also generate a wealth of data. Using data science and artificial intelligence, they can unlock this hidden treasure and dramatically streamline their operations.

Almost all electronic devices today are equipped with one or more sensors. These collect a wealth of data - about how the systems are doing, what exactly they are doing and under what conditions they are doing it. All this data can be of great value to a company. However, most of the potential currently remains untapped. That's a shame, because it offers a unique opportunity to improve what is known as overall equipment effectiveness (OEE).

OEE is a measure of how well a production process is utilized in terms of facilities, time, and materials, compared to its full potential, during the period the process is scheduled to run. It identifies the percentage of production time that is actually productive. An OEE of 100 percent means we are making flawless products (100 percent quality), at maximum speed (100 percent performance), without interruption (100 percent availability). By measuring OEE and the underlying losses, we gain important insights on how to systematically improve our production process.
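
To make this concrete, OEE is commonly computed as the product of availability, performance and quality. A minimal sketch in Python, with made-up shift figures purely for illustration:

```python
# Minimal OEE calculation: availability x performance x quality.
# The figures below are illustrative, not from a real machine.

planned_time_min = 480          # scheduled production time (an 8-hour shift)
downtime_min = 45               # unplanned stops
ideal_cycle_time_min = 0.5      # ideal time to produce one part
total_parts = 800               # parts actually produced
good_parts = 780                # parts that passed quality control

run_time_min = planned_time_min - downtime_min

availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_min * total_parts) / run_time_min
quality = good_parts / total_parts

oee = availability * performance * quality
print(f"Availability {availability:.1%}, performance {performance:.1%}, "
      f"quality {quality:.1%} -> OEE {oee:.1%}")
```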

Setting up machine monitoring

To measure OEE, we need to pull data from our systems and send it to the cloud for analysis. This presents a number of challenges. One major obstacle is sentiment: we prefer not to share our data with the outside world, especially data that could reveal business-sensitive information. However, with the right security measures, we can minimize the risk of our data falling into the wrong hands, while anonymization allows us to adequately address any privacy concerns. And we can always decide not to disclose the data that truly provides a competitive advantage.
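
As a simple illustration of such anonymization: identifiers that could reveal sensitive details can be pseudonymized before any data leaves the premises. A minimal sketch in Python using keyed hashing; the machine name and secret key are placeholders:

```python
import hashlib
import hmac

# Keyed hashing (HMAC) pseudonymizes machine identifiers before upload,
# so the cloud platform never sees the real asset names.
# SECRET_KEY and the machine name below are placeholders for the example.
SECRET_KEY = b"keep-this-key-on-premises"

def pseudonymize(machine_id: str) -> str:
    return hmac.new(SECRET_KEY, machine_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"machine": pseudonymize("CNC-mill-07"), "temperature_c": 68.4}
print(record)  # the real machine name never leaves the factory
```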

Once sentiment has turned for the better, there is the challenge of establishing the data processing pipeline. The machine park is likely to contain systems of different brands, each with its own control and communication technology, and of different ages, from new with modern capabilities to decades old with limited control and connectivity. It will take some effort to align these systems and their data streams and bring them together into a single environment where we can determine OEE.

Once we have the infrastructure in place and the pipeline operational, the challenge is to make sense of all the data coming in. This may look like a mountain of work, taking a lot of time and effort to climb. But with specialized help and intelligent use of smart tools, it is possible to start small and manageable.

Connection to the network

Setting up a data processing pipeline starts with connecting the machine control system to the corporate network. The data is then pulled from the machine controls and sent to the cloud. The cloud platform, provided by companies such as Amazon (AWS), Google (Cloud), IBM (Cloud) or Microsoft (Azure), stores it in a so-called data lake and provides the ability to analyze it automatically. These analyses convert the data into actionable information, which is sent back to a PC dashboard that presents it in an insightful way.
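
To give an idea of what the upload step can look like, the sketch below sends simulated readings to a hypothetical ingestion endpoint over plain HTTPS; a real deployment would typically use the IoT SDK or MQTT broker of the chosen cloud platform, and the URL and field names here are made up:

```python
import random
import time

import requests  # plain HTTPS used here for simplicity; real setups often use MQTT or a cloud IoT SDK

# Hypothetical ingestion endpoint; in practice this is provided by the cloud platform.
INGEST_URL = "https://ingest.example-cloud.com/v1/telemetry"

for _ in range(10):
    # Simulated values stand in for a real query to the machine controller.
    reading = {
        "machine": "mill-07",
        "timestamp": time.time(),
        "spindle_rpm": 3000 + random.uniform(-50, 50),
        "running": True,
    }
    requests.post(INGEST_URL, json=reading, timeout=5)
    time.sleep(5)  # sampling interval
```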

The newer systems in the machinery are equipped with sensors and wireless connectivity that provides remote access to the controls and sensor data. Older machines may need to be retrofitted with such connectivity. This can be done by connecting a wireless gateway to the serial or Ethernet port of the controller or - if such a port is not available - by using a special device that takes the analog or digital I/O of the controller as its input and has its own wireless connection to communicate with the network.
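
Such a retrofit gateway can be quite simple. The sketch below assumes the pyserial library and a controller that emits readable status lines over its serial port; the port name and message format are purely illustrative:

```python
import serial  # pyserial: talks to the controller's serial port

# Port name and baud rate depend on the actual controller; these are examples.
ser = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1)

while True:
    line = ser.readline().decode("ascii", errors="ignore").strip()
    if not line:
        continue
    # Assumed status format "STATUS;RPM=3000;PARTS=412" -- purely illustrative.
    fields = dict(item.split("=", 1) for item in line.split(";")[1:] if "=" in item)
    print(fields)  # in a real gateway this record would be queued for upload
```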

Systems do not always immediately provide all the data needed to adequately determine OEE. Perhaps they do not measure an important characteristic or the sampling rate is too low. To broaden the scope, we may choose to install additional sensors, either off-the-shelf or customized.

Pre-processing of data streams

Different systems use different communication protocols, such as Modbus, MTConnect or OPC UA. In order to correlate data from different sources, the different data streams must be structured in a uniform way. This is where artificial intelligence (AI) comes in. Cloud platforms offer machine learning (ML) algorithms that take the raw data, analyze it for (recurring) patterns and structure it accordingly. Unsupervised learning, for example, determines the structure of unlabeled data and constructs a model without human supervision. However, it is usually more efficient to have a service technician assist in interpreting the data.
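
To give a flavour of this, the sketch below uses k-means clustering (with scikit-learn) to group unlabeled sensor samples into operating regimes without any human-provided labels; the data is synthetic:

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic, unlabeled samples of (spindle_rpm, power_kw); a real pipeline
# would read these from the data lake.
rng = np.random.default_rng(0)
idle = rng.normal([0, 0.2], [10, 0.05], size=(200, 2))
cutting = rng.normal([3000, 6.0], [80, 0.5], size=(200, 2))
samples = np.vstack([idle, cutting])

# k-means groups the samples into operating regimes without any labels.
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(samples)
print(model.cluster_centers_)  # roughly recovers the "idle" and "cutting" regimes
```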

Once structured, the data may need to be cleaned up. There may be some noise in it, such as from sensor failures or other one-off events that are irrelevant to the OEE determination. ML techniques can filter out this noise so that the remaining data is ready for the heavy lifting.
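
One simple way to do such cleaning is to drop readings that deviate far from a rolling median, which removes one-off sensor spikes. A minimal sketch with pandas, on a synthetic signal:

```python
import numpy as np
import pandas as pd

# Synthetic temperature signal with a few one-off sensor spikes.
rng = np.random.default_rng(1)
temps = pd.Series(68 + rng.normal(0, 0.3, 500))
temps.iloc[[50, 220, 400]] = 250  # spurious spikes, e.g. from a sensor glitch

# Keep only samples close to the rolling median; the rest is treated as noise.
median = temps.rolling(window=15, center=True, min_periods=1).median()
cleaned = temps[(temps - median).abs() < 5]

print(f"dropped {len(temps) - len(cleaned)} spurious samples")
```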

We can (semi-)automatically process the raw data at the company's own location, at the so-called edge. For this, we can place a dedicated edge device between the machine control and the company network. This device, connected to the controller by wire or wirelessly, collects the data from the system and starts working with it locally. This yields some speed gains, keeps our data on site and keeps analytics available even if the internet connection is down.
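
A typical job for such an edge device is to aggregate raw readings locally and forward only compact summaries, so that less data leaves the site and a dropped connection merely delays the upload. A minimal sketch, with illustrative fields and batch size:

```python
import statistics
import time

def summarize(batch):
    """Reduce a batch of raw readings to one compact record for upload."""
    rpms = [r["spindle_rpm"] for r in batch]
    return {
        "window_end": time.time(),
        "samples": len(batch),
        "rpm_mean": statistics.mean(rpms),
        "rpm_max": max(rpms),
        "running_ratio": sum(r["running"] for r in batch) / len(batch),
    }

# Illustrative batch of raw edge readings; a real device would collect these
# from the controller and buffer them whenever the uplink is down.
batch = [{"spindle_rpm": 3000 + i, "running": True} for i in range(60)]
print(summarize(batch))
```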

Deployment of data scientists

In these AI-supported transformation and decision-making processes, data scientists play a key role. They combine statistics, data analysis, machine learning and related methods to understand and analyze current phenomena with data. Instead of just seeing numbers, data scientists understand what they mean and how to use the AI toolbox to extract the desired information.

We can hire a data scientist ourselves or bring one in from a consulting firm. The main argument for employing one in-house is that such a specialist can quickly master our domain. It also ensures continuity and prevents the effort from grinding to a halt once a project is completed. Moreover, it is much easier to use data internally than to share it with an external party.

Getting Started

With specialized expertise and adequate tooling, it is quite easy to turn a machine park into a data factory. By connecting production systems to the network and having their sensor measurements automatically processed and presented, we gain valuable insights into the quality, performance and availability of our production. This gives us actionable information with which we can fine-tune these parameters and increase our OEE.

All the necessary expertise and tooling are readily available. We can start small and keep it small, bringing in outside help to get things up and running and to perform periodic audits. Or we can make it as big as we want, by setting up our own data science machinery. Either way, we can benefit greatly by tapping into our data properly and using the oil from this well to lubricate our production process.

Lakana

Lakana Consulting helps industrial companies develop and implement a smart digitization strategy, using software to drive innovation in their business.
