_ _ _ _ _
Transcribed interview (slightly shortened and edited)
André, you have been researching Big Data and AI for ten years, but you also have excellent practical insight through your work and your clients, which is why I am very pleased that you are available for this interview today. Right off the bat, my first question: will Artificial Intelligence fundamentally change manufacturing, and if so, how and why?
André Rauschert: Yes, hello Julia, thank you for having me. We’ll get right into it, and for manufacturing the answer is clearly yes. However, you have to distinguish two levels. One is the machine, the other is the factory. In other words, there is the production itself, which is being automated with AI. And on the other hand, there is the machine manufacturer, who wants to optimize the machines he sells with the help of AI. AI is really something of an umbrella term; its biggest area of application today is still predictive maintenance.
Sensors were originally designed only to monitor processes – in other words, to visualize a process for the engineer who wants to optimize it. We have in-house projects where a single machine carries 3,000 sensors. That means terabytes of log files and data generated per machine per day, and the challenge is to make that data usable for decision-making. If you step back a bit, you quickly reach the point where you ask: what is actually at stake? Quite simply: Chinese machine builders are still inferior to the Germans in precision and process quality. But if they massively integrate AI – and they actually do – then predictive maintenance gives them a real competitive advantage, because unscheduled downtimes are the biggest killer of productivity. Productivity, after all, is always measured in terms of OEE, “overall equipment effectiveness,” and that is what matters in the end. So the question German companies have to ask themselves is: does a customer now buy a cheaper Chinese machine with AI or a German one with high precision, if the Chinese machine achieves a similar or perhaps even better OEE through the use of AI?
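The OEE figure mentioned here is conventionally the product of availability, performance, and quality, which makes the impact of unscheduled downtime easy to see. A minimal sketch (all shift figures below are illustrative, not from the interview):

```python
# Sketch: OEE = availability x performance x quality.
# All numbers below are made-up shift figures for illustration.

def oee(planned_time_h, downtime_h, ideal_cycle_s, total_count, good_count):
    """Compute OEE from shift-level figures (parameter names are illustrative)."""
    run_time_h = planned_time_h - downtime_h
    availability = run_time_h / planned_time_h              # share of planned time actually running
    performance = (ideal_cycle_s * total_count) / (run_time_h * 3600)  # actual vs. ideal output rate
    quality = good_count / total_count                      # share of good parts
    return availability * performance * quality

# A machine with 2 h of unscheduled downtime in an 8 h shift:
print(round(oee(8, 2, ideal_cycle_s=30, total_count=600, good_count=570), 3))  # → 0.594
```

Two hours of unplanned downtime alone cap availability at 0.75, which is why predictive maintenance targets exactly this factor.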
There is a nice analogy from the automotive sector. With internal combustion engines, German companies are the world market leaders. Of course, this is changing with battery technology, which is now gaining ground. Without a combustion engine, such a vehicle is easier to produce, and in what is currently the largest car market, the consumer in China naturally asks: why should I pay more for an electric car than for a combustion engine? There is less mechanical technology inside, but more software and AI. And those are the keywords that matter to consumers: autonomous traffic-jam assist, entertainment integration – and that is pure software and AI. That is ultimately relevant for automotive exports. In mechanical and plant engineering and in factory automation, we have to pick up on this zeitgeist, and as in the automotive sector, we are under extreme time pressure.
Exciting, yes. You’ve already mentioned mechanical engineering. What does production in the factory look like?
André Rauschert: In principle, there are the small AI models in the individual machines, which are then combined for production. Take the commissioning of a so-called smart factory as an example. Today, entire plants are commissioned remotely, from the raw material supply to the recipe-controlled process plant to filling, including full SAP integration. Usually an in-house core team is on site, eyes and ears at the plant, and the service providers are called in by phone if necessary. In the past, the programmers would arrive on site early on a Monday morning, find that the machines weren’t connected yet and all kinds of problems remained, and then sit around with nothing to do. Now they connect remotely and, in the event of delays, can flexibly push ahead with other subprojects.
I would like to delve a little deeper into the technology area. In connection with artificial intelligence, we are now hearing more and more about edge AI, i.e. AI that is integrated directly into the sensor, for example. And the counter-concept, if I may put it so casually, is the cloud topic, so to speak. How do these two concepts rank in the field of smart systems? Are there specific applications for which one or the other concept is better or simply more relevant?
André Rauschert: Yes, that is quite exciting, because the relationship between edge AI and cloud computing – or cloud AI, as I would call it – is actually that one builds on the other. With edge, I mean machine learning at the endpoint of the Internet: the machine on the production floor. But that only works once I really know which data streams I need, how the data has to be prepared in the first place, and how to analyze it piece by piece. This means I first need a lot of data for learning, and once the models have been trained, I can bring them to the machine. So the first step is to gather a lot of data – that makes the most sense in the cloud, especially if it comes from different factories or different manufacturers. Then comes bringing the models and algorithms to the edge. In other words: edge follows cloud.
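The “edge follows cloud” pattern can be sketched very simply: pooled data from many machines is used to fit a model centrally, and only the small learned parameters are shipped to the machine, where a cheap check runs locally. The 3-sigma threshold and all values here are illustrative assumptions, not from the interview:

```python
# Hedged sketch of "edge follows cloud": learn a simple anomaly threshold
# from pooled data in the cloud, then ship only the tiny model to the edge.
import statistics

def train_in_cloud(vibration_samples):
    """Cloud side: pool readings from many machines, learn mean and spread."""
    mu = statistics.mean(vibration_samples)
    sigma = statistics.stdev(vibration_samples)
    return {"mu": mu, "sigma": sigma, "k": 3.0}  # 3-sigma rule, an assumed choice

def edge_is_anomalous(model, reading):
    """Edge side: a cheap check that runs directly on the machine."""
    return abs(reading - model["mu"]) > model["k"] * model["sigma"]

model = train_in_cloud([1.0, 1.1, 0.9, 1.05, 0.95, 1.02])
print(edge_is_anomalous(model, 1.02), edge_is_anomalous(model, 2.5))  # → False True
```

A real deployment would replace the threshold with a trained model, but the division of labor stays the same: heavy learning in the cloud, lightweight inference at the edge.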
Large cloud providers advertise that data is generally evaluated in the cloud. Could you briefly summarize the advantages and disadvantages?
André Rauschert: Yes, I think each manufacturer has to decide that for itself, that’s clear. The fact is that nowadays data in the cloud is safe from a competitor’s access. But if you consider that the majority of technologically leading cloud providers come from the U.S. and that the so-called Patriot Act exists there, under which the government can require cloud operators to make all data available to it at any time, then there are also state-driven challenges – call it data misuse, or data use of a different kind. That plays a role on the political track, namely when a company is strongly interested in protecting its core process know-how, which is available above all here in Germany. So you have to think about that very carefully. The other issue is the use of these cloud services themselves. In the past, some of our customers simply tried out whether they would be faster and more dynamic in the cloud. But if you make mistakes in using these services, have many models calculated, and end up being billed per node, it can quickly add up to a five- or six-figure sum. That has actually happened: a model was started on a Friday and the results were only checked on Monday.
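The weekend-run anecdote is easy to make concrete with back-of-the-envelope arithmetic. Node count and hourly rate below are purely illustrative assumptions, not figures from the interview:

```python
# Hedged back-of-the-envelope: how a forgotten weekend training run adds up
# under per-node billing. All numbers are illustrative assumptions.
nodes = 100                   # assumed cluster size
price_per_node_hour = 2.50    # assumed rate in EUR
hours = 3 * 24                # started Friday, results checked on Monday

cost = nodes * price_per_node_hour * hours
print(f"{cost:,.2f} EUR")     # → 18,000.00 EUR
```

Even with these modest assumptions, the bill lands squarely in the five-figure range described above.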
You have just mentioned the keyword know-how. Know-how is basically a prerequisite when a company revises its processes, and thus also for the question of whether or not to integrate AI into production. What are the greatest challenges companies face when implementing artificial intelligence in their production processes, and how can they best overcome them?
André Rauschert: Basically, the question of all questions is: how do I make my machines intelligent with sensors? And AI can only ever be thought of together with Big Data. Large amounts of data are constantly generated, and these amounts of data have to be structured – that is a basic requirement. Access to the data is also important, for example if I purchase machines from third-party manufacturers, especially in a smart factory, and then want to use their data to organize my production flow and the production process across machines. So we have the Big Data issue, and we have the data from third-party machines and from our own. These have to be analyzed, and it has to be decided whether suitable sensor technology is even available or still has to be built.
And ultimately, AI must be used to recognize patterns that help me in the production process. There are certainly industries that are lagging behind somewhat, and others that are very far along, where there is almost a complete smart factory and smart systems are in use.
You’ve already described the process a bit. How long does it generally take until the system really has a suitable AI or algorithm, and what requirements are necessary for this?
André Rauschert: Yes, that’s not so easy to answer. It’s always a multi-stage process. So the first thing is always that the systems have to be made fit with sensor technology. I have to have the right sensors in the right places that continuously provide me with structured log files, i.e., data.
The second thing is to collect that data. I can’t attach the sensor technology and then immediately analyze it; I need a certain amount of time in which fault data is generated. On this second point, I can summarize by saying: bad data is good data. Because if I have a lot of errors in a row, the AI can recognize patterns in them. The process then runs from business understanding – what do I want to achieve? – to data understanding – what is in the data? – to data preparation – how do I prepare the data for modeling?
And a third big topic is deploying the machine learning algorithms on top. That usually doesn’t take that long. Complex algorithms – when many data streams have to be merged – naturally have a higher degree of complexity. I can give you an example that some people in the semiconductor sector are familiar with: one of the most complex machines is the lithography exposure machine from ASML. This is an extremely complex process: tin droplets are hit 50,000 times per second, producing a plasma that is 45 times hotter than the sun, because the necessary light waves of 13.5 nanometers have to be generated somehow. That’s really crazy, and it’s an example of a more complex machine. And the fourth big issue is, of course, that I have a “data management pipeline” and a “continuous improvement process”. That means I need to have a handle on my data: how it comes in and how it is prepared. And continuous improvement means nothing other than that I develop new models and introduce them into my production process – because I can’t keep stopping the machines, maintaining them, and then experimenting. It’s a running production process, and improvements have to be integrated into it. If you put all that together, and if everyone interacts very well, you can have an initial structure in place in 12 months. But depending on the degree of complexity, the sky is the limit, of course.
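The continuous-improvement step described here – introducing new models without stopping production – can be sketched as a simple promotion gate: a retrained candidate model only replaces the production model if it scores better on held-out fault data. All data, thresholds, and names below are illustrative assumptions:

```python
# Hedged sketch of continuous improvement in the model pipeline:
# a candidate model replaces the production model only if it beats it
# on held-out, labelled fault data. All values are illustrative.

def accuracy(model, labelled_logs):
    """Fraction of labelled sensor readings the model classifies correctly."""
    hits = sum(model(x) == y for x, y in labelled_logs)
    return hits / len(labelled_logs)

def maybe_promote(current, candidate, holdout):
    """Swap in the candidate only if it outperforms production on holdout."""
    if accuracy(candidate, holdout) > accuracy(current, holdout):
        return candidate
    return current

# Toy readings labelled True where a fault occurred ("bad data is good data"):
holdout = [(0.2, False), (0.3, False), (0.9, True), (1.1, True), (0.4, False)]
current = lambda x: x > 1.0       # old threshold misses the 0.9 fault
candidate = lambda x: x > 0.5     # retrained threshold catches it
production_model = maybe_promote(current, candidate, holdout)
print(production_model is candidate)  # → True
```

The gate is the point: models are swapped in as data accumulates, without ever halting the machines for an experiment.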
Small companies often don’t have the opportunity to do this. But if small manufacturers with limited resources want to start implementing AI anyway, what are their options?
André Rauschert: First of all, as a small company, I usually lack the large amount of learning data I need to perform this failure analysis. And that raises the question of where to get the data. There is now a European initiative for this, driven forward not least by Germany, called “Manufacturing-X”. The project was launched at the Hanover Fair and is intended to become the central data exchange point for machine and plant manufacturers. In principle, it is a Gaia-X control framework; Gaia-X is the European counterpart to the American and Chinese hyperscalers. A first focus will be the Supply Chain Act, or more precisely the “Act on Corporate Due Diligence in Supply Chains”. Manufacturing-X is modeled on Catena-X, the already successful cluster for automotive. In a next step, it would be conceivable to use it to exchange much of the data needed for learning. Alternatively, there are quite a few specialized companies here in Saxony and beyond, as well as an excellent university and research landscape that can be tapped, for example in research or cooperation projects. In other words, you can also learn in a network!
For example, in ours. Finally, I would like to ask you what potential AI has in terms of sustainable and CO2-neutral production, keyword ESG. Can artificial intelligence actually help save energy and resources, and if so, how and why?
André Rauschert: You can perhaps pick out two examples here. One is thinking on a large scale: the big cloud data centers have shown that AI has actually delivered energy savings of up to 30 percent by optimizing computing-time utilization and how the nodes interact with each other. That correlates very directly with the carbon footprint. Second, we’re back to the predictive maintenance case we mentioned earlier. I produce far less scrap if I have significantly reduced downtimes, or only planned ones. Then, of course, I also need less personnel and have lower material expenses. All of this adds up to efficiency and effectiveness. The more I know, the more sustainably I can operate, and I gain this knowledge above all with machine learning – or the umbrella term AI.
So, in any case, a “pro AI” in production on your part. Thank you very much, André, for the fascinating insights.
André Rauschert: My pleasure! See you soon, bye.
_ _ _ _ _
Our interview partner
Head of Digital Processes
Fraunhofer Alliance Big Data AI
Phone: +49 (0351) 4640-681
👉 Big Data AI Website