Vision 2016: Industry 4.0 energises and inspires embedded vision systems
The first embedded vision systems were launched at Vision some 20 years ago. The concept of combining machine vision with hardware and software in a stand-alone system has proved to be a veritable success story. It has changed the world of industrial machine vision and even led to the foundation of a specialist and constantly growing VDMA working group.
"Interest in our working group increased enormously after the presentation of our study, and we now have 20 active members," reports Dr. Klaus-Henning Noffz, Head of the VDMA Embedded Vision Working Group and CEO of Mannheim-based Silicon Software GmbH. The talk is of the Embedded Vision Study Group (EVSG) Report, which was presented at the G3 Future Standards Forum in Chicago in the summer of 2015.
Embedded vision plays a key role with Industry 4.0
The study reflects, above all, the enormous technical changes. "Previously, embedded technology was concerned with packing everything compactly into a single unit that could also be used on the go," explains the VDMA Machine Vision board member. "But embedded vision goes much further: the focus is on clarifying the role machine vision can play in Industry 4.0." The experts are therefore asking themselves what such units need to look like so that "images can be inspected where they are taken". "For instance, how can a sensor be combined with processor intelligence so that it feeds its image data automatically into a network?" says Dr. Noffz, citing one of the typical questions. "It all comes down to creating decentralised intelligence."
This also has an effect on automation technology, as embedded vision systems become intelligent, independently acting subscribers in the smart factory. "These machine vision units do not just log in automatically to the network, they also know which network subscribers need their measurement results," explains the CEO. "However, the automation network can also contact the unit to call up detailed information specifically from it – say from a current image inspection. Correspondingly, the network would also change from its earlier pyramid model with strictly hierarchical levels to an end-to-end IT structure."
The demand is growing for appropriate systems that can be integrated with ease into the smart factory of the future.
"At VISION we are therefore launching software that makes cameras more intelligent," explains Dr. Noffz. "The camera thus transmits completely prepared results rather than just image data."
Suddenly real-time processing is called for
Embedded vision also owes its success to hugely improved computing power and hardware density. According to Carsten Strampe, Managing Director of Friedberg-based IMAGO Technologies GmbH: "Externally the boards look unremarkable. However, up to 1,300 pins are connected to a modern processor – a challenge, especially once the peripherals and high-speed layouts are included." These computers run multi-core, multitasking-capable real-time operating systems, or alternatively Linux. Significantly more complex interfaces with Ethernet or packet-oriented camera protocols (e.g. GigE Vision) are used. Separate RISC processors can even process real-time fieldbus protocols. And the requirements of Industry 4.0 must not be forgotten. "The latest operating systems and comprehensive additional programs are called for, as the systems are naturally integrated into networks and databases," explains Carsten Strampe. "The embedded vision system now performs many supplementary tasks rather than specialising only in machine vision – ideally at the process real-time speeds of today's fast machines and production systems."
"We will probably be the first company to showcase an embedded vision computer with NXP's flagship ARM processor at VISION 2016," comments Carsten Strampe. "10 GigE camera interface, I/O, encoder and LED controller, field bus interfaces and 10 GigE Ethernet are combined with a 2-GHz 8-core ARM A72 architecture processor. We look forward to the feedback from professional developers to the LeMans VisionBox."
With its partner programme (an ecosystem of partners), Xilinx Inc., based in San Jose, California, relies on very close cooperation with users, designing solutions together with embedded vision hardware and software developers. There is already a series of sophisticated embedded vision applications using Xilinx technology – from surveillance, robotics and drones to augmented reality. For Aaron Behman, Director of Strategic Marketing/Embedded Vision, Industry 4.0 plays "a key role" in these industrial applications. After all, only embedded vision allows the real-time and high-performance capabilities of Xilinx's Field Programmable Gate Arrays (FPGAs) to be fully exploited.
Chip protects against data theft
At VISION, Xilinx will be showcasing how sensor fusion can be achieved with a high degree of freedom, thanks to freely programmable systems. "Sensor fusion is more than just a forward- and backward-pointing camera," stresses Aaron Behman. "Xilinx supports homogeneous applications, like stereo vision or multi-camera surround views, and heterogeneous sensor fusion – for instance of infrared and visible data – in a single system." The highlights include protection against possible data theft as well as the new "Trust Zone": encryption and anti-tampering functions for embedded vision are incorporated on a single chip.
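At its simplest, the heterogeneous fusion Behman describes – combining infrared and visible data – reduces to a per-pixel weighted blend once both images are registered. The toy Python version below assumes pre-registered 8-bit images of identical size; on an FPGA this would be a parallel hardware pipeline, not a software loop.

```python
def fuse_ir_visible(visible, ir, ir_weight=0.4):
    """Blend a pre-registered infrared image into a visible-light image.

    Both inputs are 2D lists of 8-bit intensities with identical shape.
    Real systems must first register the two sensors (different optics,
    resolutions and spectra); that step is omitted here.
    """
    if len(visible) != len(ir) or len(visible[0]) != len(ir[0]):
        raise ValueError("images must be registered to the same shape")
    fused = []
    for v_row, i_row in zip(visible, ir):
        fused.append([round((1 - ir_weight) * v + ir_weight * i)
                      for v, i in zip(v_row, i_row)])
    return fused

# Tiny 2x2 example: a uniform visible image with two IR hot spots.
visible = [[100, 100], [100, 100]]
ir = [[200, 0], [0, 200]]
fused = fuse_ir_visible(visible, ir, ir_weight=0.4)
```

The hot IR pixels brighten the fused image where heat is present, which is the basic effect such fusion exploits, e.g. to make warm objects stand out in a surveillance scene.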
Smarter vision with embedded vision?
This year's open forum held at the Industrial Vision Days deals with the question "Smarter Vision with Embedded Vision?" and is presented by the inVISION trade magazine. It is intended to highlight the issues behind "embedded vision" and the benefits it offers users.