
Automation IT
Taming your process automation data
The challenge—what to do with all this data

By Brian E. Bolton

Technological advances in automation and control have reached a point where keeping up with them is increasingly difficult. The volume of data being produced has grown well beyond what we once called "big data," and with it the need to capture and store that data. Fortunately, the cost of data storage has dropped. The real challenge is figuring out what to do with all of it. We must unlock the data, analyze it, and decide which data is business critical and which will make us more productive. What we need is a single set of data based on real facts. We can then use that information to improve business efficiency and processes.

For a long time, when we heard the term process automation data, we immediately thought of manufacturing data (e.g., a valve being open or closed, the level in a tank, the temperature in a vessel, or a motor being off or on). As instruments and equipment became more advanced, process automation data followed suit. We could analyze the data to determine when a valve was opened, how long it took to open, how long it remained open, when it was closed, and how long it took to close.

For instance, a valve could control cooling water being applied to a vessel to ensure a product was cooled to its specified shipping temperature. With the data from the valve, we could then determine the daily effectiveness of the cooling system. As simplistic as this information seems today, at one point in time, this type of data was considered advanced process automation data. Understanding where the process automation data comes from and how to consume and analyze it provides decision makers with the information needed to successfully run the business.

System data

Data acquisition (DAQ) is the process of using a computer to measure an electrical or physical condition, such as voltage, current, temperature, pressure, or sound. A data acquisition system consists of sensors, data acquisition measurement hardware, and a computer running programmable software. Sensors, a type of transducer, are devices that convert physical properties into a corresponding electrical signal. Signal conditioning puts those sensor signals into a form suitable for digitization, and analog-to-digital converters then convert the conditioned signals into digital values.
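To make that signal chain concrete, here is a minimal Python sketch of the last conversion step: turning raw counts from a hypothetical 12-bit converter on a 4-20 mA pressure loop into an engineering value. The ranges and scaling are illustrative assumptions, not a reference to any particular hardware.

# Minimal sketch: scale a raw ADC reading into an engineering value.
# The 12-bit ADC, 4-20 mA loop, and 0-10 bar span are illustrative assumptions.

ADC_MAX_COUNTS = 4095          # 12-bit converter
CURRENT_MIN_MA, CURRENT_MAX_MA = 4.0, 20.0
PRESSURE_MIN_BAR, PRESSURE_MAX_BAR = 0.0, 10.0

def counts_to_pressure(counts: int) -> float:
    """Convert raw ADC counts to pressure in bar via the 4-20 mA signal."""
    # Counts map linearly onto the conditioned 4-20 mA current signal.
    current_ma = CURRENT_MIN_MA + (counts / ADC_MAX_COUNTS) * (CURRENT_MAX_MA - CURRENT_MIN_MA)
    # The current maps linearly onto the instrument's calibrated pressure span.
    span_fraction = (current_ma - CURRENT_MIN_MA) / (CURRENT_MAX_MA - CURRENT_MIN_MA)
    return PRESSURE_MIN_BAR + span_fraction * (PRESSURE_MAX_BAR - PRESSURE_MIN_BAR)

print(counts_to_pressure(2048))  # roughly mid-span, about 5 bar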

Programmable logic controllers (PLCs), distributed control systems (DCSs), and supervisory control and data acquisition (SCADA) systems are designed to interface with local control modules from different manufacturers. SCADA, DCS, and PLC systems all rely on an instrument tag database, which contains elements called tags or points. Each tag or point relates to a specific instrument or actuator within the process system.
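As a simple illustration of the tag concept, the short Python sketch below models a few hypothetical tag records keyed by tag name. The names, fields, and assets are invented for the example and do not reflect any specific vendor's database.

from dataclasses import dataclass

# Minimal sketch of the tag/point concept: each record ties a tag name to a
# physical instrument or actuator. Tag names and fields are hypothetical.
@dataclass
class Tag:
    name: str          # historian/PLC tag name
    description: str
    units: str
    asset: str         # the equipment the instrument belongs to

tag_database = {
    "TI-1001": Tag("TI-1001", "Vessel V-100 temperature", "degC", "V-100"),
    "XV-2001": Tag("XV-2001", "Cooling water valve position", "open/closed", "V-100"),
}

print(tag_database["TI-1001"].description)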

Mission-critical process data is extracted from these systems and stored in a process data historian. Using a variety of available tools, manufacturing personnel can then capture, collect, visualize, and analyze the data.
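Once the data has been extracted from a historian, the analysis itself can be quite plain. The Python sketch below assumes the history for one temperature tag has already been exported to a CSV file; the file name and column names are hypothetical, not a vendor API.

import pandas as pd

# Minimal sketch of analyzing historian data after it has been extracted to a
# file; the file path and column names are assumptions for illustration.
history = pd.read_csv("vessel_v100_history.csv", parse_dates=["timestamp"])
history = history.set_index("timestamp")

# Daily average temperature for one tag, as a simple analysis step.
daily_avg = history["TI-1001"].resample("1D").mean()
print(daily_avg.tail())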

On the leading edge of technology

Early adopters in certain industries (e.g., pulp and paper, oil and gas, and power generation) were the first to grasp the value of unlocking process automation data and to understand the importance of capturing and collecting it to improve production processes. Because these industries were also the least regulated, generating process automation data with data acquisition systems was less expensive for them. In some respects, these early adopters paved the way for manufacturers in more heavily regulated industries, who had to make significant upfront financial investments to automate and to install data acquisition systems large enough to capture the available process automation data.

In particular, the life sciences industry (mostly pharmaceutical) recognizes the value of process data but has been much slower to adopt data acquisition systems. The challenge is finding data acquisition systems that reduce compliance risk and address inefficiencies while still providing useful data. Many of these manufacturers are currently identifying systems and best practices that will take them to the next level in all areas of their businesses, while continuing to meet their strict regulatory compliance requirements.

Meanwhile, for manufacturers moving right along the technology curve, process automation data is no longer limited to manufacturing equipment. Smart devices and edge devices are providing data for full end-to-end analytics. Every level of the organization can now make data-driven decisions.

Many software companies have worked diligently to create products that not only unlock process automation data but also make it available to other applications. Businesses can now combine metadata, enterprise resource planning/manufacturing execution system (ERP/MES) data, and maintenance management data with process automation data to perform detailed analyses. Scheduling takes on a whole new meaning when manufacturers can capture a variety of data to help them understand the entire life cycle of their products. From the time an order is taken to its accepted delivery to the customer, and everything in between, actionable data is captured.
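A minimal Python sketch of that kind of combined analysis might look like the following, assuming ERP order records and historian batch summaries have been exported and share a common batch identifier. The files and columns are hypothetical.

import pandas as pd

# Minimal sketch of combining business and process data; the column names and
# files are hypothetical. Each row is keyed by a shared batch identifier.
orders = pd.read_csv("erp_orders.csv")            # batch_id, product, promised_date
process = pd.read_csv("historian_batches.csv")    # batch_id, cool_down_minutes, final_temp

combined = orders.merge(process, on="batch_id", how="inner")

# Example analysis: which products take longest to cool to shipping temperature?
print(combined.groupby("product")["cool_down_minutes"].mean().sort_values(ascending=False))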

The data path forward

Selecting the right data historians, and the architecture used to gather, collect, store, and protect business-critical data, is key to success. Once that decision is made and implemented, the what, how, when, and who questions will determine the next steps.

  • What data is needed to make informed decisions?
  • How do you want the data presented?
  • When or how often do you need access to the data?
  • Who are the right decision makers that need access to the data?

To be successful, a business must understand early on that the answers to these questions can, and probably should, change as its analysis of process automation data matures. Using continuous improvement models will certainly drive early success, and these models also help build critical-thinking skills in data-driven decision makers. Each success brings a new challenge and generates fresh enthusiasm among employees, especially when the improvements are tied to monetary gains.

Developing best practices along the way will ensure process knowledge, recognizable quality improvements, and financial gains are realized across the entire enterprise. Practices such as developing naming conventions for assets, instrument tags, and data historian tags will make end users' jobs much easier. Using a structured data model or framework allows data related to a single asset, or to multiple assets, to be consumed more efficiently.
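The payoff of a consistent convention is that data can be grouped programmatically. The Python sketch below assumes a hypothetical SITE-AREA-ASSET-SIGNAL naming pattern and simply groups tags by asset.

# Minimal sketch of how a consistent naming convention pays off: once tags
# follow a known pattern, data for an asset can be gathered programmatically.
# The SITE-AREA-ASSET-SIGNAL pattern below is a hypothetical convention.

def parse_tag(tag: str) -> dict:
    site, area, asset, signal = tag.split("-", 3)
    return {"site": site, "area": area, "asset": asset, "signal": signal}

tags = ["PLT1-RX-V100-TEMP", "PLT1-RX-V100-LEVEL", "PLT1-UTIL-P205-RUN"]

# Group tags by asset so everything related to V100 can be consumed together.
by_asset = {}
for t in tags:
    by_asset.setdefault(parse_tag(t)["asset"], []).append(t)

print(by_asset["V100"])  # ['PLT1-RX-V100-TEMP', 'PLT1-RX-V100-LEVEL']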

Implementing condition-based maintenance will greatly improve the efficiency of the maintenance process and should save money by identifying the right time to perform routine maintenance tasks. Notifications can help identify when things are going well or badly and are key to alerting the right people so they can stay on top of their processes.

Data, for example, can drive tasks like greasing a motor. Let's say the motor manufacturer recommends greasing the motor after every 150 hours of running. Without capturing run-time data, greasing that motor might be scheduled every week, even if the motor does not run 150 hours in an entire month. With run-time data captured and notifications developed, the proper people can be notified when the motor has run 140 hours, so maintenance can be scheduled at the proper greasing interval.
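A minimal Python sketch of that calculation, assuming run/stop states have been exported from the historian at regular samples, might look like this. The file, column names, and tag are illustrative assumptions.

import pandas as pd

# Minimal sketch of the greasing example: accumulate run hours from sampled
# motor run/stop states and flag when the 140-hour notification point is hit.
# The file and column names are assumptions for illustration.
states = pd.read_csv("motor_m101_run_state.csv", parse_dates=["timestamp"])
states = states.set_index("timestamp").sort_index()

# Hours between consecutive samples, credited as run time when the motor was
# running at the start of each interval (running column is 1 or 0).
sample_hours = states.index.to_series().diff().dt.total_seconds().div(3600)
run_hours = (states["running"].shift() * sample_hours).sum()

NOTIFY_AT_HOURS = 140   # alert 10 hours before the 150-hour greasing interval
if run_hours >= NOTIFY_AT_HOURS:
    print(f"Motor M-101 has run {run_hours:.1f} h; schedule greasing.")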

Although this is a very simplistic example, we can see how using process automation data effectively helps us approach even the most complex tasks with more thought and insight. With the right amount of process knowledge, notifications can be written and implemented for any measurable activity. As with process alarms, it is very important that notifications not become a nuisance. It is also important to use escalations when setting up notifications. Escalations ensure notifications are addressed within a specified time. Using the motor run time for greasing as an example, the notification would be sent to the appropriate maintenance scheduler. The escalation period might be set for a 24-hour acknowledgment. If the person notified does not take action, an escalation notification would be sent to the next person responsible. Some notifications may be important enough to warrant additional escalations.
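A bare-bones escalation check in Python, with a hypothetical recipient chain and a stand-in send() function in place of a real delivery channel, could be sketched like this.

from datetime import datetime, timedelta

# Minimal sketch of a notification escalation chain; recipients and the send()
# stub are hypothetical. A real system would use email/SMS/CMMS hooks.
ESCALATION_CHAIN = ["maintenance.scheduler@example.com", "maintenance.supervisor@example.com"]
ACK_WINDOW = timedelta(hours=24)

def send(recipient: str, message: str) -> None:
    print(f"notify {recipient}: {message}")   # stand-in for a real delivery channel

def escalate(message: str, sent_at: datetime, acknowledged: bool, level: int = 0) -> int:
    """Move to the next person in the chain if the ack window has expired."""
    if acknowledged or level >= len(ESCALATION_CHAIN) - 1:
        return level
    if datetime.now() - sent_at > ACK_WINDOW:
        level += 1
        send(ESCALATION_CHAIN[level], message)
    return level

# Initial notification goes to the scheduler; escalate() is re-checked periodically.
send(ESCALATION_CHAIN[0], "Motor M-101 at 140 run hours: schedule greasing.")
# Example re-check 25 hours later with no acknowledgment: escalates to the supervisor.
escalate("Motor M-101 greasing notification", datetime.now() - timedelta(hours=25), acknowledged=False)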

Future data keys

Unlocking process automation data has found its way into other areas of business. Business process automation (BPA) looks at repeatable processes within a company's day-to-day activities, such as transferring files, generating reports, and extracting data from unstructured sources, and automates them from one central location or computer. A great example is hiring a new employee: a simple email with the right information in a consistent format can be sent to BPA software, where the data is extracted to automatically create a user account and passwords for the employee's access to the business servers and applications.
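A stripped-down Python sketch of that extraction step, with a hypothetical email format and account policy, might look like the following; a real BPA tool would hand the result to the IT systems that actually create the account.

# Minimal sketch of the new-hire example: pull fields out of a consistently
# formatted email body and turn them into an account request. The field names
# and username policy are assumptions for illustration.
email_body = """Name: Jane Doe
Department: Quality
Start date: 2023-10-02"""

fields = dict(line.split(": ", 1) for line in email_body.splitlines())

username = (fields["Name"][0] + fields["Name"].split()[-1]).lower()  # "jdoe"
account_request = {
    "username": username,
    "department": fields["Department"],
    "start_date": fields["Start date"],
}
print(account_request)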

Robotic process automation (RPA) is an automation technology in which software robots manipulate and communicate with business systems and applications, streamlining processes to reduce the burden on human employees. RPA is automation at the graphical user interface level, or GUI automation. It is commonly used in call centers with customer relationship management systems. Having software robots track down things like order history from an ERP system while the customer is being helped speeds up the process and allows the customer service agent to focus on the customer's needs.

Intelligent process automation (IPA) is the latest set of technologies used to combine the redesign of fundamental processes with robotic process automation and machine learning. Business process improvements and next-generation tools assist knowledgeable workers by removing repetitive, replicable, and routine tasks.

For those who struggle with the fast-paced technology world, a very significant trend to prepare for is the arrival of 5G technology. 5G will have a larger impact on the way we do business, and the speed at which we do business, than any other technology thus far. What was once just future planning will be available very soon. 5G's speed and efficiency will take industry from off-site monitoring to real-time off-site control. It will allow engineering personnel at corporate headquarters to run and monitor equipment in a remote facility. Currently, we can monitor off site with some lag in the information, but when 5G is fully implemented, network speeds will be fast enough to react to processes from remote locations. This capability will greatly change the landscape of manufacturing facilities, because it minimizes the number of people required to operate manufacturing processes on site. It will only be possible, however, if businesses are unlocking, processing, and analyzing the right process automation data and preparing their networks for 5G and all it has to offer.

The amount of available data will continue to grow. At the end of the day, the most successful businesses will be the ones that are able to determine what data is critical to their business and how to capitalize on what their data tells them.

Reader Feedback


We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.




About the Author


Brian E. Bolton is a consultant for MAVERICK Technologies. He has more than 35 years of experience in chemical manufacturing, including more than 20 years involved with the OSIsoft PI suite of applications, quality assurance, continuous improvement, and data analysis.