Data Modeling in Industrial Automation

In many ways the Industrial Automation business has been pretty unsophisticated for a very long time. There are lots of reasons for this. IA uses PLCs, and these have been pretty unsophisticated since, well, their creation.

PLCs have always had the same structure: get the inputs in, process the logic, set new outputs… repeat forever. The data types were pretty limited: binary, integer, and floating point, plus arrays of those, for the most part. Later on, structures were added, and then user-defined structures.
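That input-logic-output loop can be sketched in a few lines. This is an illustrative model only, not real controller firmware; the function names and the single "rung" of logic are invented for the example.

```python
# Minimal sketch of the classic PLC scan cycle: read inputs,
# solve the logic, write outputs, repeat. Names are illustrative.

def read_inputs():
    # A real PLC samples its physical input modules here;
    # we return a fixed input image for the example.
    return {"photo_eye": True, "start_button": False}

def solve_logic(inputs, outputs):
    # One "rung" expressed as a boolean assignment:
    # energize the gate output when the photo eye is blocked.
    outputs["gate"] = inputs["photo_eye"]
    return outputs

def write_outputs(outputs):
    # A real PLC drives its physical output modules here.
    return outputs

def scan_once(outputs=None):
    """One pass of the input -> logic -> output cycle."""
    outputs = outputs if outputs is not None else {"gate": False}
    inputs = read_inputs()
    outputs = solve_logic(inputs, outputs)
    return write_outputs(outputs)
```

A real controller runs `scan_once` forever, typically every few milliseconds.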

Logic wasn’t any better. You had subroutines, but software engineering principles that were well developed and widely used in higher-level languages generally weren’t applied to ladder logic development tools. IEC 61131 has done a good job improving some of this, but ladder is still ladder. The polymorphism and inheritance that you find in C++ aren’t even whispered about when it comes to ladder.

Another factor that has kept Industrial Automation in the dark ages is that most data is just binary. It’s a sensor input like a photo eye or an actuator output like a gate. The data being processed has relationships to other data, but those relationships were pretty much ignored. The fact that the photo eye was part of the folder device on a packaging machine was ignored; it provided little value in the “old” days.
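To make that concrete, here is a sketch of a flat PLC input image, where the photo eye is just a bit at an address. The bit number and names are invented; the point is that the device’s relationship to the folder exists only in the programmer’s head or a comment, never in the data itself.

```python
# A flat 16-bit input word: the photo eye is just "bit 3".
# Nothing in the data ties it to the folder it belongs to.

input_image = 0b0000_1000   # input word as read from the I/O modules

PHOTO_EYE_BIT = 3           # assumed address; the folder relationship
                            # lives only in this comment

def photo_eye_blocked(image):
    """Extract the photo eye state from the raw input word."""
    return bool((image >> PHOTO_EYE_BIT) & 1)
```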

EtherNet/IP, Profinet IO, and all the other factory floor protocols haven’t done anything to change the way we look at factory floor data. It’s still just plain old inputs and outputs. An application layer protocol like EtherNet/IP or Profinet IO may have some semblance of object-based structure, but none of that is really present in the communications. It’s still a bunch of inputs in and outputs out.

The techniques that evolved in the higher-level languages used in general computation just haven’t been used on the factory floor. The evolution there was different. From day one, people who built business systems had to think about relationships. They had to consider how one piece of data was related to another piece of data. They had to consider this as they collected it, as they stored it, and as they reported it. It was central to all their thinking.

To do this they created all sorts of techniques and technologies for data modeling. In the ’80s, one of the seminal ideas for data modeling was engineered by G.M. Nijssen. It was dubbed NIAM, short for “Nijssen’s Information Analysis Methodology,” and later came to be called ORM, or Object-Role Modeling. This approach represents relationships directly instead of showing types of entities as a relational table. That’s a lot of gobbledygook, but it means that ORM shows relationships and constraints.
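The flavor of fact-based modeling can be sketched quickly: instead of one relational table with columns, each elementary fact is a named relationship between objects, and you query the facts. This is only an analogy to ORM’s idea, with invented names, not a real ORM tool.

```python
# Illustrative fact-based (ORM-style) store: every elementary fact
# is a named relationship between objects, not a row with columns.

facts = set()

def assert_fact(fact_type, *roles):
    """Record one elementary fact, e.g. is_part_of(PhotoEye_1, Folder_1)."""
    facts.add((fact_type, roles))

# Relationships the flat I/O world throws away:
assert_fact("is_part_of", "PhotoEye_1", "Folder_1")
assert_fact("is_part_of", "Folder_1", "PackagingMachine_A")

def plays_role(fact_type, obj):
    """All facts of a given type in which obj plays the first role."""
    return [roles for ft, roles in facts
            if ft == fact_type and roles[0] == obj]
```

A constraint (“every photo eye is part of exactly one device”) would then be a check over the fact store rather than a foreign-key column.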

Today, because of the superb capabilities of OPC UA, we now have the first real ability to model information on the factory floor. In UA, we have a true extensible type definition system and a real ability to describe data, devices, machines, lines, anything we want, in a standard, open way, and then communicate and use that description in ways that have never been possible before.
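The core idea behind an extensible type system can be sketched with plain classes. Real OPC UA defines ObjectTypes and VariableTypes as nodes in an address space, not as Python classes, so treat this strictly as an analogy; the type and field names are invented.

```python
# Analogy for OPC UA-style extensible type definitions: a base type,
# a subtype that inherits and extends it, and a vendor instance that
# clients understanding only the base type can still work with.

class DeviceType:
    """Base type: every device exposes a name and a manufacturer."""
    def __init__(self, name, manufacturer):
        self.name = name
        self.manufacturer = manufacturer

class SensorType(DeviceType):
    """Subtype: adds a typed measurement, inheriting the base fields."""
    def __init__(self, name, manufacturer, engineering_unit):
        super().__init__(name, manufacturer)
        self.engineering_unit = engineering_unit
        self.value = None

# A vendor extends the standard type; generic clients still see a DeviceType.
photo_eye = SensorType("FolderPhotoEye", "Acme", "boolean")
photo_eye.value = True
```

The payoff is the same as in UA: a generic client can browse any `DeviceType`, while a smarter one can discover that this device is also a `SensorType` with a value and an engineering unit.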

Lots of industries and standards bodies are now getting on this bandwagon. One of the first is the oil and gas association concerned with the devices and data descriptions for oil platforms. It will be interesting to see how all these efforts progress and what things will look like for Industrial Automation when true data models start to get used on the factory floor.