Potato processors are at the top of the sophistication hierarchy of fruit, vegetable and nut processors who use Key’s sorters. Their lines run 24/7 for up to three weeks between cleaning and sanitation shutdowns. When sorters are placed at multiple points on the line, they tell a story of how the processes between them affected the product. They also provide clues about upstream machine performance and the state of the sorter itself — a malfunctioning ejector, for example, or a sensor window in need of cleaning.
Machine performance also is an indicator of component wear, and analyzing the performance of multiple sorters in a company’s manufacturing network enhances predictive maintenance. The information would be even more powerful if it consolidated data from comparable sorters at McCain Foods, Simplot and other potato processors, although security concerns preclude that possibility.
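The cross-sorter comparison described above can be sketched with a toy example. Assuming each sorter reports a periodic defect-eject rate (the sorter IDs, figures and threshold below are illustrative, not actual Key telemetry), a simple rolling-mean comparison can flag a machine drifting away from its peers:

```python
from statistics import mean

def flag_drifting_sorters(rates, window=5, tolerance=0.5):
    """Flag sorters whose recent eject rate drifts above the fleet norm.

    rates: dict mapping sorter id -> list of eject-rate samples (percent).
    A sorter is flagged when its mean over the last `window` samples
    exceeds the fleet-wide mean by more than `tolerance` points --
    a crude stand-in for real predictive-maintenance analytics.
    """
    recent = {sid: mean(s[-window:]) for sid, s in rates.items()}
    fleet_mean = mean(recent.values())
    return sorted(sid for sid, r in recent.items()
                  if r - fleet_mean > tolerance)

# Hypothetical telemetry: sorter B's eject rate creeps upward,
# hinting at a fouled sensor window or a worn ejector.
telemetry = {
    "A": [2.0, 2.1, 1.9, 2.0, 2.1],
    "B": [2.2, 2.6, 3.0, 3.4, 3.8],
    "C": [1.9, 2.0, 2.0, 2.1, 1.9],
}
print(flag_drifting_sorters(telemetry))  # -> ['B']
```

In practice the comparison would be against each machine’s own baseline as well as the fleet, but the principle is the same: the network of sorters gives the anomaly a reference point a single machine lacks.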
In food manufacturing, such a database is a pipe dream. On the other hand, simulation models built on integrated data from multiple sources might one day help resolve bottlenecks and lead to process improvements for every potato processor.
“Big Data can be a significant enabler,” Azzaretti allows, “but it goes hand in hand with the ability to capture data in real time.” When throughput is measured in tons per hour, the faster the response to change, the less rework and waste.
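The economics of response time can be made concrete with back-of-the-envelope arithmetic. Assuming a line running at 25 tons per hour (an illustrative figure, not one from Key), the product exposed to a fault scales linearly with detection latency:

```python
def tons_at_risk(throughput_tph, detection_minutes):
    """Product processed between fault onset and corrective action."""
    return throughput_tph * detection_minutes / 60.0

# A fault caught by a real-time alarm within 1 minute vs. one found
# in an end-of-shift offline review 240 minutes later, with
# throughput assumed at 25 tons/hour for illustration:
print(round(tons_at_risk(25, 1), 2))   # ~0.42 tons exposed
print(tons_at_risk(25, 240))           # 100.0 tons exposed
```

The gap between those two numbers is the argument for feeding sorter data to the control room in real time rather than reviewing it after the fact.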
“We work with leading companies, with sophisticated engineering departments that are on the bleeding edge of capturing processing line data to optimize processes,” he adds. “But in the last three years, we’ve noticed widespread interest in this kind of information, in different industry segments and company sizes.” Some want the data delivered offline for analysis, others are using it to drive line performance, and the most highly automated want real-time data fed directly to SCADA systems or MES line-management software for integration with other machine data.
“The crux of the issue now is how all this data is going to come together,” he says. The machine-to-machine protocol of choice in Key’s customer base is OPC UA, while other industry segments favor EtherNet/IP, Profinet and other protocols.
Change is a-comin’
If IIoT is to be more than a catchphrase, it will have to deliver more than vague promises of transformational change and provide a business model with clear returns on investment. To date, no such model has emerged in food manufacturing.
An evaluation of offshore oil rigs by McKinsey & Co. found that those rigs are equipped with as many as 30,000 sensors, yet less than 1 percent of the data generated is used in decision making. Uploading all that sensor data to a cloud surely would fill many servers, but would it help ExxonMobil or BP improve efficiency or pump one more gallon to justify the effort?
Some observers trace the origins of the IIoT to the trucking industry, where GPS devices and wireless sensors relay data via the internet for improved fleet management. In general business, an IIoT case is being made for assistance in product development and improved customer service — priorities for executive management, to be sure, but goals that don’t improve operations within the four walls of the factory.
“Creating the IIoT infrastructure is expensive and time consuming,” G2 Crowd’s Light understates. “Huge growth was expected in 2016, but it didn’t happen and it may not in 2017, either. It’s still a few years away.”
Nonetheless, proponents remain upbeat. “The IT/OT convergence is a reality,” maintains Schneider’s McGreevy. “The IT folks used to be seated at the table with OT half the time. Now they’re at the table 90-100 percent of the time.”
A few examples of the transformational changes that proponents promise would go a long way toward jump-starting some food company initiatives. Until then, firms will continue breaking in their IIoT shoes and figuring out which suits they go with best.