A few weeks ago I had the opportunity to attend the GE Innovation Summit in Orlando. It was a great opportunity, and despite what some have said, I didn’t just go to Florida to scuba dive. I actually spent more time at the conference than in the water.

GE has an interesting business strategy. If you haven’t noticed, they’ve sold off, or are selling, all their consumer divisions. GE was big with consumers. They built the electric toaster business. They invented the self-cleaning oven. They sold all sorts of appliances directly to consumers for many years. It was a huge business for them.

And now it’s gone. They’ve sold their entire appliance business to Electrolux of Sweden. The only thing left is the light bulb business, and I’m betting that doesn’t last through 2015.

Where are they going? The Internet of Things (IoT), or, as they call it, the Industrial Internet.
They believe that the world is changing and that they have a massive opportunity to take a leadership role in a new paradigm. Jeff Immelt, the GE CEO, expressed that new paradigm like this:


What that equation means is that if you equip machines with smart sensors and securely collect that data somewhere it can be processed with advanced analytic software, you can achieve exceptional results. Results like a 1% decrease in fossil fuel usage. 1% may not seem like a lot, but for things like aircraft engines and locomotives, it adds up to a massive amount of money for their customers (and GE).

They detailed at the conference how a single engine on a flight from Orlando to Chicago now generates 2 Terabytes of data. Analysis of all that data has vastly improved in-service time at Delta Air Lines. They have decreased the number of planes out of service from an average of thirty-five to around fifteen. That’s almost a 60% decrease in out-of-service airplanes, meaning they can probably afford to drop the number of planes in their fleet by one or two. I haven’t checked the prices on a 737 or an MD-88, but I would bet they’re not cheap to own.

GE is going after results like this in several markets like transportation (aviation, rail), medical and oil and gas. They have the expertise to equip the machinery with sensors, the technology to move the data securely, and the advanced analytic engines to identify actions to generate those exceptional outcomes. It really does look like a good strategy to me.

So what does that mean for all of us who develop, market and sell less glamorous industrial equipment like scales, valves and drives?
The IoT for consumers and the larger market is a question mark. It’s moving fast and what’s going to happen tomorrow won’t really be clear until tomorrow comes. But in our world, that’s not entirely true. Our industry moves slower and more methodically. We don’t drop everything we’re doing today when a …

OPC UA Server Object

There’s a lot of interest around the globe in OPC UA. One of the drivers is that UA is an extension of the original OPC, which I have been calling OPC Classic. I named it after Classic Coke, though unlike Classic Coke, UA isn’t a throwback after a terrible experiment.

The more I learn about UA the more I marvel at its complexity and simplicity. It is both very powerful and at the highest level, very easy to understand. The power comes in the Type System, Object Model, multiple transport layers and ability to wrap other protocols.

For the past several days I have been studying the Server Object. Like everything else in UA there are layers and layers of complexity to it.

The Server Object is, like almost all UA Objects, an instance of a Type Definition. In its case, its Type Definition is the ServerType, which is itself a subtype of the BaseObjectType. All the type definitions for UA are neatly grouped under the Root Node. From the Root Node in every Server you can follow the reference to the Types Object. From there you can follow the reference to the ObjectTypes Object, and from there to the BaseObjectType, which is the parent of the ServerType. A Client can follow references from the Root Node through the Types Object to find the definition of any Object in a UA Server.

The Server object, which is required as part of the Core Server Facet, neatly organizes a lot of information about the Server and makes it easily available to Clients who want to know the capabilities of the Server, its current operational status and any diagnostic information that the Server can provide. Since all Profiles are based on the Core Server Facet, a Client can count on being able to find this information by reading the Attributes and properties of the Server Object. A Client can find the Server Object from the Root Node of the Server by following the reference to the Objects Object and then to the Server Object.
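That browse path is easy to sketch in code. Below is a minimal, self-contained simulation of an address space – a plain Python dict standing in for a real Server – using the well-known ns=0 NodeIds from the spec (Root is i=84, Objects is i=85, Server is i=2253, and so on). This is an illustration of following references, not a real OPC UA client.

```python
# Toy address space keyed by NodeId. Each node lists its BrowseName and
# the targets of its forward references, grouped by reference type.
ADDRESS_SPACE = {
    "i=84":   {"BrowseName": "Root",           "Organizes": ["i=85", "i=86", "i=87"]},
    "i=85":   {"BrowseName": "Objects",        "Organizes": ["i=2253"]},
    "i=86":   {"BrowseName": "Types",          "Organizes": ["i=88"]},
    "i=87":   {"BrowseName": "Views",          "Organizes": []},
    "i=88":   {"BrowseName": "ObjectTypes",    "Organizes": ["i=58"]},
    "i=58":   {"BrowseName": "BaseObjectType", "HasSubtype": ["i=2004"]},
    "i=2004": {"BrowseName": "ServerType",     "HasSubtype": []},
    "i=2253": {"BrowseName": "Server",         "HasTypeDefinition": "i=2004"},
}

def browse_path(start, *names):
    """Follow hierarchical references by BrowseName, one hop per name."""
    node = start
    for name in names:
        refs = (ADDRESS_SPACE[node].get("Organizes", []) +
                ADDRESS_SPACE[node].get("HasSubtype", []))
        node = next(t for t in refs if ADDRESS_SPACE[t]["BrowseName"] == name)
    return node

# Root -> Objects -> Server, exactly as a Client would browse it.
server = browse_path("i=84", "Objects", "Server")          # "i=2253"
type_def = ADDRESS_SPACE[server]["HasTypeDefinition"]      # "i=2004" (ServerType)
```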

In the Server Object you will find a number of different kinds of references:
ORGANIZES – An Organizes reference serves no purpose other than to establish a relationship from the source Object to the nodes it organizes. The Root Node, for example, uses Organizes references to relate the base Views, Objects and Types Objects to it. The Views Node organizes references to the currently existing Views in a system. The Server Object is the destination node of an Organizes reference from the Objects Object.

COMPONENT – A hasComponent reference implies a “stronger” relationship between the Source Object and the Destination Object: the destination Object is part of the Source Object. The Server Object, for example, has hasComponent references to the ServerCapabilities Object, VendorServerInfo Object and Namespaces Object. Whenever the destination is complex, meaning it is more than just a simple variable, UA requires that a hasComponent …
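The difference between those two reference types can be shown with a toy graph. This is a sketch for illustration only – the node names follow the spec, but the tuple representation is invented, not real UA code:

```python
# A reference is (source, reference type, target). Organizes merely
# groups nodes; HasComponent says the target is *part of* the source.
REFERENCES = [
    ("Objects", "Organizes",    "Server"),
    ("Server",  "HasComponent", "ServerCapabilities"),
    ("Server",  "HasComponent", "VendorServerInfo"),
    ("Server",  "HasComponent", "Namespaces"),
]

def targets(source, ref_type):
    """All nodes the source references with the given reference type."""
    return [t for (s, r, t) in REFERENCES if s == source and r == ref_type]

components = targets("Server", "HasComponent")   # parts of the Server Object
organized  = targets("Objects", "Organizes")     # nodes Objects merely groups
```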

Data Modeling in Industrial Automation

In lots of ways the Industrial Automation business has been pretty unsophisticated for a very long time. There are lots of reasons for this. IA uses PLCs, and these have been pretty unsophisticated since, well, their creation.

PLCs have always had the same structure: read the Inputs, process the logic, set new Outputs…repeat forever. The data types were pretty limited: Binary, Integer and Floating Point, plus arrays of those, for the most part. Later on, structures were added, and then user-defined structures.
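That scan cycle is simple enough to sketch. The snippet below is a toy Python illustration – `read_inputs`, `write_outputs` and the seal-in rung are hypothetical stand-ins for real I/O and ladder logic, not any real PLC API:

```python
def read_inputs():
    """Stand-in for scanning physical inputs at the top of the cycle."""
    return {"start_button": True, "stop_button": False}

def write_outputs(outputs):
    """Stand-in for driving real coils/actuators at the end of the cycle."""
    pass

def logic(inputs, state):
    # Classic seal-in (latch) rung: the motor runs once started,
    # and keeps running until the stop button is pressed.
    motor = (inputs["start_button"] or state["motor"]) and not inputs["stop_button"]
    state["motor"] = motor
    return {"motor": motor}

state = {"motor": False}
for _ in range(3):            # a real PLC loops forever
    inputs = read_inputs()    # 1. get Inputs in
    outputs = logic(inputs, state)  # 2. process the logic
    write_outputs(outputs)    # 3. set new Outputs ... repeat
```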

Logic wasn’t any better. You had subroutines, but the software engineering principles that were well-developed and used in higher-level languages generally weren’t applied to Ladder Logic development tools. IEC 61131-3 has done a good job improving some of this, but ladder is still ladder. The polymorphism and inheritance that you find in C++ aren’t even whispered about when it comes to ladder.

Another factor that has kept Industrial Automation in the dark ages is that most data is just binary. It’s a sensor input like a photo eye or an actuator output like a gate. The data being processed has relationships to other data but those relationships were pretty much ignored. The fact that the photo eye was part of the folder device on a packaging machine was ignored. It provided little value in the “old” days.

EtherNet/IP, Profinet IO and all the other factory floor protocols haven’t done anything to change the way we look at factory floor data. It’s still just plain old inputs and outputs. Application layer protocols like EtherNet/IP and Profinet IO may have some semblance of object-based structure, but none of that is really present in the communications. It’s still a bunch of inputs in and outputs out.

The techniques that have evolved in the higher level languages used in general computation just haven’t been used on the factory floor. The evolution there was different. From day one, people who built business systems had to think about relationships. They had to consider how one piece of data was related to another piece of data. They had to consider this as they collected it, as they stored it and as they reported it. It was central to all their thinking.

To do this they created all sorts of techniques and technologies for data modeling. In the ‘80s, one of the seminal ideas for data modeling was engineered by G.M. Nijssen. It was dubbed NIAM, short for “Nijssen’s Information Analysis Methodology,” and later called ORM, or “object role modeling.” This approach represents relationships directly instead of representing entity types as relational tables. That’s a lot of gobbledygook, but it means that ORM shows relationships and constraints.

Today, because of the superb capabilities of OPC UA, we now have the first real ability to model information on the factory floor. In UA, we have a truly extensible Type Definition system and a real ability to describe data, devices, machines, lines, anything …
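As a rough sketch of what a type system buys you over flat tags, the snippet below models type definitions with inheritance and instantiates an object from one, the way a UA Object mirrors its TypeDefinition. The type and member names (`SensorType`, `PhotoEyeType`) are hypothetical, and real UA modeling is far richer than this:

```python
# Each type names its parent and its own members; subtypes inherit.
TYPES = {
    "BaseObjectType": {"parent": None,             "members": {}},
    "SensorType":     {"parent": "BaseObjectType", "members": {"Value": float}},
    "PhotoEyeType":   {"parent": "SensorType",     "members": {"Blocked": bool}},
}

def all_members(type_name):
    """Walk the subtype chain, collecting inherited members."""
    members = {}
    while type_name is not None:
        t = TYPES[type_name]
        members.update(t["members"])
        type_name = t["parent"]
    return members

def instantiate(type_name):
    """An instance mirrors its type definition, defaults included."""
    return {name: kind() for name, kind in all_members(type_name).items()}

# The photo eye isn't an anonymous bit anymore; it carries its structure.
eye = instantiate("PhotoEyeType")   # {"Blocked": False, "Value": 0.0}
```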

Future of the Factory Floor


I’m hearing a lot of talk about IT protocols and the factory floor. This stems from the fact that the Factory Floor is slowly becoming IT. It’s much like the continents drifting around the globe over time. Continents form, move together and move apart over millions of years. The continents of IT and the Factory Floor are now moving together.

It’s going to happen a lot faster than some vendors in factory automation would like. It’s also apparent to me that IT is going to absorb the factory floor and make the Engineers who work there part of IT. As that happens, I think vendors will respond by giving their products more IT-friendly features.

If you look at IT kinds of products, they use a lot of XML, SOAP, SNMP and Web Services. Things integrate pretty easily. You can find interfaces available on the network, figure out what they do, and connect to and use those interfaces pretty easily. There are things like WSDL documents that publicly describe what is available. We all know that it’s a lot harder on the factory floor. We have things like EtherNet/IP, DeviceNet, Profibus DP and PLCs.

What’s this evolution going to look like? Well, certainly it’s clear that PLCs will look a lot more like Servers with built-in routers over the next few years. They’ll support Oracle and SQL databases, HTML5 Web pages and web services. You’ll probably have the ability to write control code in Java or IEC 61131-3. We won’t lose all the factory floor protocols – those things will still do all the hard work of communicating with I/O.

What will be different is that these PLC Servers will be easily integrated with the Enterprise. They’ll look to the Enterprise just like any other Enterprise Server. It’s just that there will be a back end to these devices that does factory floor I/O and Control. With the speed and bandwidth of Servers today, and the ability to partition the hardware and software, it will be easy to use standard Servers as PLCs in the future. In fact, there is no reason to keep the physical footprint of the PLCs we have today. A hardened server will do the trick nicely. I expect that Siemens, Rockwell and other PLC manufacturers will probably become software companies in the future, at least in regard to the control currently found in their PLCs.

The key, of course, is going to be security. That’s where OPC UA comes in. I expect UA to be the prime Transport layer for all the Enterprise / PLC communications in the future. It has the ability to support IT transports like Web Services as well as fast binary. With its adaptable, user configurable and powerful security component it makes the most sense as the prime data mover between the factory floor and Enterprise.

I do expect HMIs to disappear also. Everyone will have their own personal HMI on them, with their own web pages or dashboard that …

Betamax and HMIs

I’m old enough to remember the war between VHS and Betamax. If you don’t remember, these were the first two tape formats for home video recording. Betamax was the Sony standard, while VHS was the standard from JVC. Eventually JVC won the war, even though Betamax was clearly the superior technology. JVC did a really smart thing. They licensed their technology to any and every manufacturer they could find. Those manufacturers all started putting out nearly identical machines that competed with each other. Price became the distinguishing factor, and the manufacturers outdid each other to see who could sell for the lowest price.

Meanwhile, Sony had its very high quality, superior technology Betamax units. The people who bought them bragged about having better quality machines, implying that they were somehow smarter and more prescient than the rest of us. But in the end, we had the last laugh: Sony couldn’t compete with all the low cost VHS manufacturers and Beta died a slow death. Eventually, those arrogant Betamax users had to go out and buy a VHS machine. (Of course, DVR technology then killed VHS.)

We have a similar contest going on that is going to have some impact on the Industrial Automation industry. Since smart phones were introduced, there have been people who saw the smartphone as a “personal HMI.” The idea is that instead of walking up to one of those Rockwell PanelView screens on the side of the machine, you simply pull out your smart phone and look at the data you’re interested in.

There have always been issues with this. One way to do this is to use your cellular provider to access your manufacturing network. That’s not an option every management team has readily accepted, given the security issues of opening up a manufacturing network to cellular access. People are doing this, but it’s awkward and requires some significant thinking and planning to pull off securely – and security is what matters most here.

Other management teams have just said NO – you can’t use your smartphone to access machine data. I think that’s shortsighted. The new engineers we’re now getting in the automation industry think of their smartphone as their sixth sense. It’s integral to them. As they grow and mature, you can bet that they’ll be pushing for more access.

Well, there is another way to access machine data from a smartphone – or should I say two ways. I am talking about the contest between NFC (Near Field Communications) and Bluetooth LE. With these technologies, the smartphone doesn’t use its cellular communications to access machine data; instead it accesses the machine data whenever it’s in range over one of these short-range technologies.

Near Field Communications (NFC) is a descendant of the RFID (radio-frequency identification) technology we’ve long used on the factory floor. RFID allows a reader to send radio waves to a passive electronic tag for identification, authentication and tracking. NFC is a similar technology that communicates either by a modulated electric field (not …