Applications that are based on artificial intelligence do not use hard-wired programs. Rather, they rely on algorithms or neural networks trained with large volumes of data. It is the essence of AI that it continues to “learn” during operation, in other words, it continues to be optimized based on current data. This enables further increases in production efficiency and improvements in product quality. Data is the fuel of AI, and AI needs a lot of it.
Collecting, storing and processing such volumes of data requires considerable computing power. This is often the point where automation engineering reaches its limits and forces machine manufacturers to compromise. One option for working around limited computing power is, for example, the consolidation of data. However, this entails the risk of losing relevant information, which would have detrimental effects on the results of the AI.
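The risk of consolidation can be made concrete with a small, hypothetical sketch: averaging a high-rate sensor trace into coarse windows can all but erase a short anomaly that an AI model would need to see. The signal values and window sizes below are illustrative, not taken from any real machine.

```python
import numpy as np

# Hypothetical 1 kHz sensor trace (one second of data) with a 5 ms spike.
rng = np.random.default_rng(0)
signal = rng.normal(loc=10.0, scale=0.1, size=1000)
signal[500:505] += 8.0  # short transient anomaly

# Consolidate to 10 Hz by averaging 100-sample windows.
consolidated = signal.reshape(10, 100).mean(axis=1)

# The spike stands out clearly in the raw data...
raw_peak = signal.max() - 10.0          # roughly 8 above baseline
# ...but is diluted by averaging: 5 spiked samples out of 100 per window.
avg_peak = consolidated.max() - 10.0    # roughly 0.4 above baseline
```

An anomaly detector fed only the consolidated stream would see a deviation of a few percent where the raw data showed an 80 percent excursion, which is the information loss the paragraph above warns about.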
Another strategy is to run the AI not in the machine but outside of it. This can be done, for example, on cloud platforms or on servers in in-house data centers. Another established alternative is edge computing, where powerful IT resources are placed immediately at the “edge” of the shop floor. But this strategy has its drawbacks. AI via cloud computing, for example, is not suitable for real-time control systems due to the high latency of remote data centers. The network is likewise a critical factor for on-premises installations, whether in centralized data centers or on edge servers: network outages or fluctuating response times can disrupt production.
The vagaries of IT
Of course, the market offers powerful IT equipment, and each subsequent generation comes with CPUs offering faster clock speeds and more cores. There are indeed providers who place powerful standard PC equipment into protective housings in order to score points with enormous computing capacity in data-heavy Industry 4.0 applications. However, this approach fails to meet industrial requirements in a number of ways.
It is not only that the individual components are not designed for the harsh ambient conditions or the high degree of fail-safe reliability required in industrial manufacturing. This approach also fails to fulfill one of the central demands of industrial customers—the guaranteed availability of the same product over many years.
This very issue is also a general obstacle in the development of AI solutions for the industrial sector. Neural networks and algorithms are frequently developed for, and tied to, state-of-the-art PC or server hardware. With this approach, each new development leap usually requires a shift to the latest hardware, making it unsuitable for the industrial sector.
AI competence for industrial applications: example KEBA
KEBA has recognized this opportunity and decided to invest: an AI center of competence was established several years ago. At this center, in-house developers are programming industrial-strength AI platforms that support the installation and operation of AI solutions. In addition, the center creates proprietary AI solutions based on KEBA automation technology. Last but not least, the specialists at the AI center of competence also provide support, both in-house and externally.
KEBA covers all three areas of AI application:
- IoT where AI runs either on the premises or in the cloud to satisfy needs around digitization, data analysis and smart factory.
- Local AI directly in the machine or product, for example in order to make machines autonomous and more intelligent.
- Assistant systems that provide intelligent AI support for everything from programming to operation in order to reduce complexity and to hugely simplify machine handling through smart HMIs.
Based on this comprehensive approach, a universal AI solution was developed that meets all the requirements of industrial customers: long-term availability, on-site support, and updates over a long period of time.
AI accelerator for the industrial sector: AI Control
AI Control, developed by KEBA, is a solution that encompasses both hardware and software. The hardware, which meets the standard requirements for industrial settings, includes an AI module whose exterior, with its many interfaces, resembles a programmable logic controller (PLC).
There are interfaces for Gigabit Ethernet, EtherCAT and CAN field buses, USB and audio ports, as well as an SD slot for adding more memory. Inside, there are various ARM processors as well as a data processing unit (DPU) that provides the AI acceleration and handles the communication with the entire sensor system. The solution also includes an open software platform that supports developing and running the AI applications.
The entire sector is in the midst of a transformation that affects both the world of control engineering and the world of programming
In addition, KEBA supports its partners in the digital transformation and the development of their own AI applications. Experience has shown that industrial customers who are currently working on the utilization of AI in production typically have well-trained data scientists and programmers in-house who are experienced with popular AI programming languages such as Python or frameworks such as TensorFlow.
But modeling a neural network or an algorithm and deploying it on a CPU, GPU or DPU (data processing unit) is one thing; implementing it smoothly in a machine context entails its own set of challenges. One important aspect is to establish reliable communication with the control system and to make sure that the capabilities and the added value provided by AI are recognized and exploited.
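One recurring challenge in a machine context is that a control loop cannot wait on a slow inference. A minimal, hypothetical sketch of one common pattern: wrap the model call with a cycle-time budget and fall back to a conservative default when the budget is exceeded. All names here (run_model, DEADLINE_S, SAFE_DEFAULT) are illustrative assumptions, not part of any KEBA API.

```python
import time

DEADLINE_S = 0.010   # assumed 10 ms budget per control cycle
SAFE_DEFAULT = 0.0   # assumed conservative setpoint used on overrun

def run_model(features):
    """Stand-in for real inference on a CPU, GPU or DPU."""
    return sum(features) / len(features)

def guarded_inference(features):
    """Return (value, on_time): the model output, or a safe default
    if the inference missed the cycle deadline."""
    start = time.perf_counter()
    result = run_model(features)
    elapsed = time.perf_counter() - start
    if elapsed > DEADLINE_S:
        return SAFE_DEFAULT, False  # too late: do not trust the result
    return result, True

value, on_time = guarded_inference([1.0, 2.0, 3.0])
```

The point of the pattern is that the control system always receives an answer within its cycle, whether or not the AI finished in time; how the fallback value is chosen is itself a safety decision that belongs with the machine builder.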
The experts at KEBA’s AI center of competence have developed advanced know-how in this area over the past few years, and today machine manufacturers can enjoy the benefits of this know-how.
Using demo machines and robots from in-house development, KEBA’s AI experts can demonstrate how smart and autonomous a machine can be and what capabilities are provided by AI Control. They have an in-depth understanding of the possibilities of AI uses in Industry 4.0, but also of its limits.