
  • By Dennis Nash
  • May 31, 2020
  • Continuous And Batch

Aggregated PID data provides process manufacturers with new insights into control loop performance


While process manufacturers have justified nearly 30 years of automation investments on the observations of one man, today’s control loop performance monitoring technologies are providing new, data-driven insights into the state of regulatory control.

To no one’s surprise, data facilitates more and more of today’s manufacturing decisions. The data itself is abundant, its storage is low cost, and accessing it is increasingly easy. What is more, novel software tools and enhanced processing capabilities empower today’s manufacturers to uncover opportunities for improvement that are invisible to the naked eye. An evolution within the manufacturing sector is unfolding as data analytics continues its progression from basic descriptive functions to predicting outcomes and prescribing corrective measures. All of these model-based capabilities are made possible with data.

In recent years, the manufacturing sector has focused on leveraging data to improve asset reliability, but greater attention is now being invested in process optimization. Whereas one emphasizes uptime as a primary financial lever, the other pursues the quality and throughput gains that result from tighter, more efficient control. In particular, advances in process analytics, such as control loop performance monitoring (CLPM), now simplify the procedures by which manufacturers proactively identify, isolate, and correct negative performance trends.

Recent aggregation of CLPM data confirms industrywide advances in control, and it suggests that new optimization opportunities are on the horizon. For process manufacturers, data analytics in general and CLPM in particular has the power to be a disruptive force for positive change.

Findings and shortcomings

Early insights into process control and optimization highlighted the need for better, more relatable data. In a 1993 article, David Ender, president of Techmation Inc., asserted that more than 30 percent of proportional-integral-derivative (PID) controllers at a typical production facility were operated manually rather than in their designated automatic mode. Ender also reported that 65 percent of a facility’s regulatory controllers were either poorly tuned or deliberately detuned as a means of concealing other control-related issues. As noted in “Process Control Performance: Not as Good as You Think,” the Techmation executive’s assertions were based on projects conducted at hundreds of production facilities. While the observations pointed to widespread shortcomings in industrial process control, they also highlighted opportunities for improving both the production and the financial performance of facilities across the process industries.

Several years later, the economic impact of poor control was given some much-needed clarity. A 2001 report published by the U.K.’s Energy Efficiency Best Practice Programme directly linked gains in operational efficiency and energy consumption with improvements in control. More specifically, the 30 percent of control loops cited by Ender as being operated in manual mode were found to be a primary source of operational losses. The report, titled “Invest in Control – Payback in Profit,” quantified the impact of poor control in meaningful terms: by failing to properly harness a production facility’s PID controllers, manufacturers left considerable value on the table. The report sized those losses at 2–5 percent in lost production throughput, 5–10 percent in lost production yield, and 5–15 percent in excess energy consumption, among other operational consequences.

At the time of publication, Ender’s findings were compelling, but they were also anecdotal. They predated the availability of the historians now used to capture and catalog voluminous amounts of process data, and the software tools needed to systematically analyze large numbers of control loops did not exist in 1993. Although the British government’s report provided a means of quantifying the value of process improvement, its findings were presented as generalized ranges; they lacked the segment-specific breakdown and unambiguous financial values favored by both plant and corporate management. In spite of these shortcomings, considerable progress has been achieved over the past three decades, and there is now data to prove it.



The U.K.’s “Invest in Control – Payback in Profit” report, published in 2001, offered a much-needed assessment of the impact that poor regulatory control has on production performance. The report cited lost opportunities that manufacturers could recapture through better use of existing automation capabilities, including opportunities to increase throughput by 2–5 percent and yield by 5–10 percent.

 

Growth of CLPM and process analytics

The ability to assess PID controller performance on a plantwide scale first became a reality at the start of the new millennium. The first CLPM products were introduced by a cadre of small and large automation firms known for their expertise in process control and PID tuning. The subsequent rise of CLPM solutions as a unique product category mirrored manufacturing’s rapid adoption of sensing, storage, and processing technologies. Within a modern production facility, data is now seen as an essential resource in day-to-day operations. Following the rise of data, CLPM solutions are an increasingly well-established tool in manufacturing’s process diagnostic and optimization toolbox. There is steady year-over-year growth in end-user inquiries and licensed deployments. CLPM is enabling more and more manufacturers to realize greater returns on their investments in data.

CLPM solutions consume a production facility’s readily available process data, either streamed live from a centralized control system or extracted on demand from a historian. With access to a PID controller’s vital signs (set point, process variable, and controller output), even basic CLPM solutions can calculate key performance indicators (KPIs) capable of proactively identifying negative trends. More sophisticated solutions include advanced forensic capabilities used to isolate root causes and to formulate recommendations for issue-specific corrective actions. Select CLPM solutions recommend controller tuning adjustments by automating the identification and modeling of data associated with everyday output changes. Some allow the calculated values to be exported and synthesized with other data for analysis in business intelligence tools such as Microsoft Power BI, Tableau, and Looker.
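To make the idea concrete, the sketch below computes one of the simplest of those KPIs, percent time in normal mode, from a hypothetical historian export. The file name and column names are assumptions made for illustration, not any vendor’s actual schema.

```python
# A minimal sketch of a mode-based KPI. Assumes a hypothetical
# historian export (CSV) with columns named timestamp, mode, sp, pv,
# and co; real tag and column names will differ from site to site.
import pandas as pd

def percent_time_in_normal(df: pd.DataFrame, normal_mode: str = "AUTO") -> float:
    """Share of samples in which the controller was in its designated mode."""
    return 100.0 * (df["mode"] == normal_mode).mean()

df = pd.read_csv("historian_export.csv", parse_dates=["timestamp"])
print(f"Percent time in normal: {percent_time_in_normal(df):.1f}%")
```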

Objective versus subjective KPIs

A key distinction sets several KPIs apart from the majority that are commonly included in CLPM solutions. That distinction is a function of the objectivity of the insights. Objective KPIs such as uptime, percent time in normal, and stiction offer information that is indisputable. Their values are concrete, uniformly understood, and not subject to interpretation. As examples, a process is either running or it is not, just as a controller is either in its designated “normal” mode or it is not. These KPIs output values that are normalized and that enable comparison with other loops.

In contrast, controller performance metrics such as output travel and output reversals offer insights that are subjective. Values calculated by these metrics are not normalized. Although valuable, the output from these KPIs requires interpretation; additional context is needed. Consider a control loop that has an output travel calculated at 10 full range movements per hour. For a valve, that would equate to 10 fully open to fully closed cycles per hour. Just as a designation of “good control” varies from process to process and even engineer to engineer, it is nearly impossible to state unequivocally that 10 represents too much controller effort without additional consideration of the associated process.
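The sketch below illustrates how these two subjective metrics might be computed from a controller output trend. The percent-of-span units and evenly sampled data are assumptions for illustration; commercial products define these calculations in their own terms.

```python
# Illustrative calculations of output travel (full-range movements per
# hour) and output reversals (direction changes) for a controller
# output (CO) series expressed in percent of span, evenly sampled.
import numpy as np
import pandas as pd

def output_travel(co: pd.Series, duration_hours: float, span: float = 100.0) -> float:
    """Total absolute CO movement, normalized to full-range movements per hour."""
    return co.diff().abs().sum() / span / duration_hours

def output_reversals(co: pd.Series) -> int:
    """Count of direction changes in the CO trend."""
    direction = np.sign(co.diff().dropna())
    direction = direction[direction != 0]  # ignore flat segments
    return int((direction.diff().abs() == 2).sum())
```

On the article’s example, an output_travel result of 10 would correspond to ten full-range movements per hour, a figure that only knowledge of the associated process can judge as excessive.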

Whereas the binary nature of objective KPIs makes them ideal for analysis and comparative purposes, there remains potential value in the output of nearly every CLPM metric. Viewed in the context of a single facility, all of the KPI-based information equips management with a clearer understanding of the facility’s capacity for improving production output and efficiency. When viewed through a broader lens, however, the aggregated KPI results from facilities around the globe, representing all industry segments, have the potential to bring additional insights. The increasingly widespread deployment of CLPM solutions is making that possible. A more comprehensive understanding of regulatory control is afoot, as is the basis for establishing industrywide standards. Potential by-products include the ability to define “good control” and a new perspective on what constitutes world-class manufacturing.


 

Control Station conducted a formal assessment using CLPM data collected from 116 production facilities. Compared to observations from 1993, results from the CLPM assessment suggest that process manufacturers have made significant improvements in PID control loop performance.

 


New technology, new findings

My company arrived late to market, first piloting its loop performance monitoring solution in 2009. Included in the company’s evaluation initiative were numerous North American manufacturers representing the basic materials, chemicals, oil and gas, and power and utilities sectors of the process industries. Since then the system has been commercially licensed for use at production facilities in 25 countries, and it is used to actively monitor the performance of tens of thousands of PID control loops. Recently the company leveraged its community of customers to compile blinded data from a subset of those production facilities. It used the data from 116 different facilities to test the findings that were originally published nearly three decades ago and to launch an assessment of macro-level controller performance and process optimization trends.

Manual versus automatic mode

Ender’s 1993 review of control loop performance noted that manufacturers operated a large share of their facilities’ PID controllers manually. Indeed, the report asserted that in excess of 30 percent of PID control loops were not operated in their designated or “normal” mode. At the time of publication, this singular finding revealed to many manufacturers that they would not fully realize the return on their automation investments without additional changes. Findings culled from the CLPM assessment point to meaningful improvement in controller mode. Specifically, current data indicates that a significantly smaller 14 percent of controllers spend the majority of their time in a non-normal mode, with another 5 percent occasionally operated in a non-normal mode.

It was as well understood then as it is today that manual operation of control loops is undesirable. Statista, a provider of aggregated data and market analysis, projects sizeable investments targeting the further eradication of manual processes; specifically, it forecasts global automation investments by process manufacturers exceeding $83 billion by 2021. What seems less appreciated than the financial resources being applied, however, are the underlying factors that continue to drive operator behavior. Operators tend to switch critical or dangerous loops from automatic to manual control when those loops operate at or near a constraint. Similarly, significant changes to the rate of production without a corresponding adjustment in controller configuration routinely result in a shift away from the prescribed automatic mode. A common refrain from operators over the years is that they feel a greater sense of safety when they are in control.

The reduction in manual operation supported by the CLPM assessment is noteworthy, and it is not unreasonable to attribute the increased use of automatic control to general improvements in automation technologies. In particular, modern supervisory control platforms like distributed control systems are more robust than those from the ’80s and ’90s. Regulatory controllers are better too. They are more responsive than the programmable logic controllers from years past, which enables manufacturers to maintain tighter control. Similarly, today’s control room is often equipped with monitoring and diagnostic tools that bolster operator confidence by systematically alerting them to production issues and potential equipment failures. Investments by manufacturers are hitting the target.




Select CLPM solutions use high-resolution data to automatically isolate process changes, model the associated dynamics, and calculate optimal PID tuning coefficients. The image from PlantESP’s TuneVue™ utility indicates when existing coefficients fail to satisfy the control objective (red) and how recommended coefficients would improve performance.


Controller tuning

The improvement in controller mode mirrors gains achieved in controller tuning. Whereas 65 percent of controllers in the ’90s had apparently been found to be either poorly tuned or tuned in such a way as to mask other control-related issues, a greater share of today’s PIDs appear to be tuned appropriately. Tuning deviation is an effective metric for analyzing the efficacy of a control loop’s existing tuning parameters. By modeling a controller’s response to output changes, select CLPM solutions can determine a range of acceptable or recommended tuning values. The tuning deviation KPI quantifies the relationship between a controller’s existing tuning values and the recommended range based on the controller’s objective, and it is expressed in terms of standard deviations (SD). The CLPM assessment found only 20 percent of controllers in need of tuning based on tuning deviation: those with values in excess of 2 standard deviations from the recommended range. The data characterized an additional 25 percent of all PIDs as having “fair” tuning values, with SD values between 1 and 2.
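The bucketing implied by those thresholds can be sketched as follows. How a CLPM product derives the recommended center and spread of tuning values is the modeling step described above and varies by product; the sketch simply assumes a hypothetical center and standard deviation are given.

```python
# Hedged sketch of tuning-deviation bucketing. The recommended center
# and SD for a controller are assumed as inputs; deriving them is the
# product-specific modeling step and is not shown here.
def tuning_deviation(existing: float, center: float, sd: float) -> float:
    """Distance of an existing tuning value from the recommended center, in SDs."""
    return abs(existing - center) / sd

def classify(deviation_sd: float) -> str:
    """Buckets mirroring the assessment: <=1 good, 1-2 fair, >2 needs tuning."""
    if deviation_sd <= 1.0:
        return "good"
    if deviation_sd <= 2.0:
        return "fair"
    return "needs tuning"

# Example: a hypothetical gain of 4.2 against a recommended 2.5 +/- 0.6
print(classify(tuning_deviation(existing=4.2, center=2.5, sd=0.6)))  # needs tuning
```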

To this day, the process of tuning PID controllers is commonly viewed as a “black art” even though controllers have been used commercially since first being introduced by Taylor Instrument Company in 1940. Most manual approaches involve subjective decision-making processes, and they produce results that can be wildly inconsistent. The current market for tuning software includes numerous products that are incapable of accurately modeling the noisy, oscillatory data that is typical of real-world applications.

Advances over the past few decades in PID tuning can be attributed to better education and to a few meaningful innovations. Among those innovations was the elimination of the steady-state requirement, which had forced practitioners to start and end each tuning session with data held at a steady state. Improvements in regulatory control strengthen a production facility’s foundation, and they enable manufacturers to realize the benefits promised by advanced supervisory and model-predictive solutions.

Valve stiction

As pivotal as Ender’s observations proved to be in terms of creating awareness and motivating action, they included only questionable input on the most common mechanical issues associated with poor PID controller performance. Data cited from a valve OEM was based on a statistically insignificant sample of 31 valves from a single system. Although hardly representative of the broader process industries, the article shared that an excessive level of friction was present in 11 of the valves (35.5 percent). Static friction, or stiction, is widely cited as the leading mechanical issue faced by engineers. Stiction prevents a valve or other final control element from functioning properly. It generally results when a valve is packed so tightly that the valve’s stem cannot respond without excessive force. Stiction is a problem that cannot be corrected with tuning. Analysis of the CLPM data uncovered that only 5 percent of all loops exhibited an excessive amount of stiction (i.e., greater than 2 percent).

Stiction reveals itself in the form of either saw-toothed or square-waved data when trended. Sharp changes in the value of a control loop’s process variable correspond with the moments at which the associated valve stem overcomes static friction. The stiction metrics available in most CLPM solutions evaluate changes in the process variable; more specifically, they take the size and frequency of changes into account as a means of determining the probability of stiction. Some CLPM solutions not only calculate the probability of stiction, they also quantify the amount of stiction that is present. Combined, the details of probability and amount are useful when prioritizing maintenance projects.
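A deliberately crude screen in that spirit is sketched below: it flags a loop when abrupt process variable jumps are both frequent and consistent in size, echoing the saw-tooth signature described above. The thresholds are arbitrary placeholders; commercial stiction metrics are considerably more sophisticated.

```python
# Crude, illustrative stiction screen: frequent, similarly sized abrupt
# jumps in the process variable (PV) suggest a sticking valve. The
# thresholds below are placeholders, not validated values.
import numpy as np

def stiction_suspect(pv: np.ndarray, jump_threshold: float, min_jumps: int = 10) -> bool:
    """True when abrupt PV jumps are frequent and uniform in size."""
    jumps = np.abs(np.diff(pv))
    big = jumps[jumps > jump_threshold]
    if len(big) < min_jumps:
        return False
    # Uniform jump sizes (low relative spread) fit the saw-tooth signature
    return float(big.std() / big.mean()) < 0.5
```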

The initial CLPM assessment results are a starting point for understanding the state of regulatory control in the process industries. They shift the discussion from a review of observations to a review of data-based evidence. If the original observations are to be believed, then the CLPM assessment points to significant progress.

Analytics and risk

Data analytics is used extensively by businesses to examine financial operations and to seek new opportunities for market growth. In its report published in January 2018, the global consulting and advisory services firm McKinsey & Company depicted the effect that analytics has had on business practices. The report cited the functions of sales and marketing and research and development across all industries as being fundamentally changed as a result of analytics. However, McKinsey’s analysis showed the function of manufacturing as experiencing moderate to no change. With the exception of the basic materials and energy sectors, the process industries were found to be lagging in their use of data as a means of advancement.

Clearly, manufacturers utilize data analytics, and some use it extensively. A challenge facing many manufacturers is that not all data from the production floor is readily available for more intensive assessment. The separation of business and process networks is a well-known barrier to access; half in jest, corporate and plant engineers routinely refer to this separation as the “DMZ.” The result is that data scientists and others are unable to tap the full extent of their company’s available data resources. Though the reasons for the separation are valid, such security measures have long been a hindrance to analytical initiatives.

Writing more than two years ago, the authors of the McKinsey report noted that manufacturing was in the midst of the most significant disruption in decades. Citing the Industrial Internet of Things, they wrote: “Competition is intensifying not just within industries but also between them.” In a way, the report offered a warning: those who ignore the importance of data analytics and who fail to understand their relative position in the market will be at risk.

A case for disruption

Consistent with the views of McKinsey and other leading advisory firms, a fundamental benefit of CLPM data is the establishment of clear performance benchmarks. The process industries currently lack objective guidelines for evaluating regulatory control performance. And while guidelines for the manufacturing industry would provide an ideal starting point, there would be similar benefit to segmenting CLPM data and to instituting segment-specific benchmarks.

As with analytics, there is a wide gulf between those industry sectors that rely heavily on automation and those that utilize little of it. Sectors such as oil and gas are known for their investment in automation technologies. The refinement of oil is such that incremental improvements in throughput and efficiency can result in outsized financial gains.

Although the value of benchmarking control loop performance is still speculative, the role PID controllers play in regulating production is undeniably significant. It is conceivable that CLPM benchmarking could be used in highly impactful ways, shaping responses to an ever-changing competitive landscape and justifying the funding of additional automation investments. As manufacturers advance toward more optimal control, their expectations can be expected to rise. Similarly, CLPM benchmarks could guide decisions for transitioning from regulatory to supervisory control, or from supervisory to more advanced, multivariable solutions.

A deeper understanding of controller performance truly has the potential to be disruptive. To that end, Ender’s original observations deserve credit for highlighting a combination of dysfunction and opportunity that seems obvious today. As in 1993, but now equipped with aggregated performance data, the question remains: How will we use this information?

Reader Feedback


We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.




About the Author


Dennis Nash is president of Control Station, Inc., and manages the company’s overall growth strategy. He assumed control of the company in 2004.