

From transistors to the brink of Industry 4.0

By Bill Lydon

Since ISA’s founding in 1945, technology has leapt forward, bounding across the decades to change the world. Industrial automation technology has changed no less than technology for transportation, communication, and commerce, and it has indeed borrowed heavily from innovations in those fields. Now, 75 years later, we are on the brink of the Fourth Industrial Revolution, called Industry 4.0, and a wide range of products, methods, plans, and architectures have allowed automation and controls professionals to step ever onward. Here are the top technology milestones that have marked their path. Below are some of the titans of automation technology—ISA members and others—who furthered the tech, optimized it for industrial operations, and supported others in applying it.

Semiconductors and Moore’s law

The computing power that spans from mainframe computers down to embedded processors inside controllers and sensors is rooted in the transistor, which was invented in 1947 by John Bardeen and Walter Brattain at Bell Labs. Innovations in transistors are the basis of the development of integrated circuits and microprocessors. Even the most powerful processor chips today are characterized by the number of transistors they contain.

The first widely available low-cost junction transistor was the CK722, a PNP germanium small-signal unit from Raytheon, introduced in early 1953 for $7.60 each. Putting cost in perspective, Texas Instruments of Dallas, Texas, and Industrial Development Engineering Associates (I.D.E.A.) of Indianapolis, Ind., collaborated on creating the Regency TR-1, the world’s first commercially produced transistor radio. When it was released in 1954, the Regency TR-1 cost $49.95 (approximately $476 in 2019 U.S. dollars).

Transistors made possible the electronic industrial controllers of the 1950s, which over time supplanted many of the pneumatic or air-based controllers of the 1920s and ’30s.

The history of solid-state electronics is one of delivering more power at lower cost. This was expressed in Moore’s law, which has stood the test of time since it was first defined by Gordon Moore, cofounder of Fairchild Semiconductor and of Intel, where he later served as CEO. In a 1965 paper, he described a doubling every year in the number of components per integrated circuit and projected this rate of growth would continue for at least another decade. Moore’s law continues to drive computing technology, as evidenced by the sophistication and miniaturization of commercial and industrial devices.

The Internet of Things (IoT) is possible because high-performance electronics and processors continue to get smaller and less expensive.

Programmable logic controller (PLC)

The first PLC was delivered to General Motors in 1970 to control metal cutting, hole drilling, material handling, assembly, and testing for the Hydramatic Model 400 automatic transmission. The two ingredients for success were using a computer to solve logic that had previously been done with relays, and Ladder Logic Programming, which empowered electricians to program the computer from their existing knowledge. Before the PLC, huge banks of relays were used. They consumed a large amount of space, depended on mechanical relay reliability, were hard to troubleshoot, and required significant hours of rewiring to change the logic for any reconfiguration.

Richard Morley, a Bedford Associates engineer, is credited with the original design. He and his team of engineers created a solid-state, sequential logic solver, designed for factory automation and continuous processing applications: the first practical programmable logic controller. It was called the “Modicon 084,” because it was the 84th project at Bedford Associates. Upon learning of GM’s requirements, the company demonstrated the Modicon 084 to General Motors’ Hydramatic Division in November 1969.

Ladder Logic Programming was a huge advantage for working electricians. Much like spreadsheets that later empowered accountants and others, users could program a computer in an easy-to-understand way. Bedford Associates’ Ladder Logic incorporated symbols from electrical engineering to depict sequences of operations. In his article, “Ladder Logic Languishing?” published in the April 1992 issue of Manufacturing Systems, inventor Morley recalled:

“Ladder Logic, as a control language, was first used in conjunction with silicon devices around 1969 at Bedford Associates. To support the control language, a hardware platform was devised that had three constituent elements: a dual-ported memory, a logic solver, and a general-purpose computer. Early at Modicon, we used a degenerate form of ladder representation. The great advantage was that the language could be understood by any working electrician in the world. Later the language was expanded to multi-node, and additional functions were added . . . Ladder Logic functionality and PLC adaptability quickly spawned an entire industry.”

The advantage that the PLC brought to the control industry was the ability to program the system, which could not be accomplished with electromagnetic relay panels. The panels had to be rewired when control schemes changed. In contrast, the new PLC could be changed much more easily and quickly, and it also had the advantage of a much smaller footprint.
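
Ladder logic itself is graphical, but the behavior of a single rung can be sketched in ordinary code. The start/stop seal-in rung below is illustrative only and is not how Bedford Associates implemented it; the signal names are hypothetical, and a real PLC evaluates such rungs on every scan cycle.

```python
# Illustrative only: a start/stop "seal-in" rung, the classic relay circuit a PLC
# replaces, evaluated once per scan cycle. Signal names are hypothetical.

def scan_rung(start_pb: bool, stop_pb: bool, motor: bool) -> bool:
    """One scan of: --[ start OR motor ]--[/ stop ]--( motor )--"""
    return (start_pb or motor) and not stop_pb

motor = False
for start_pb, stop_pb in [(True, False), (False, False), (False, True), (False, False)]:
    motor = scan_rung(start_pb, stop_pb, motor)
    print(f"start={start_pb} stop={stop_pb} -> motor={motor}")
# The output latches on after the start press and drops out when stop is pressed.
```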

Distributed control system (DCS) architecture

The introduction of the Honeywell TDC 2000 in 1975 was the beginning of commercial DCSs. It was the first system to use microprocessors to perform direct digital control of processes as an integrated part of the system. This distributed architecture was revolutionary, with digital communication among distributed controllers, workstations, and other computing elements. Computer-based process control systems before the TDC 2000 were mainly data collection and alarm systems, with control done by pneumatic loop controllers and standalone electronic proportional-integral-derivative (PID) controllers.

About the same time in the mid-1970s, Yokogawa in Japan introduced a distributed control system called the Centum. The Yokogawa Centum and Honeywell TDC 2000 were based on the concept that several microprocessor-based loop controllers could be controlled by supervisory minicomputers. The operator would have a push-button, cathode ray tube (CRT) based display rather than an annunciator panel. The controllers would be connected together on a data highway that would carry the information from the various nodes or stations. The highway, or bus, would serve as a signal route. The design would move the controllers back to the process, shorten the control loops, and save on wiring costs.
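
For context, the core calculation those loop controllers performed, and that DCS controllers later executed digitally, is the PID algorithm. The sketch below is a minimal textbook discrete form with illustrative gains and a crude simulated process; production controllers add filtering, anti-windup, and bumpless transfer.

```python
# Minimal textbook discrete PID loop. Gains, time step, and the simulated process
# are illustrative; real DCS controllers add filtering, anti-windup, and
# bumpless transfer.

def pid_step(setpoint, measurement, state, kp=2.0, ki=0.5, kd=0.1, dt=1.0):
    error = setpoint - measurement
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    return kp * error + ki * state["integral"] + kd * derivative

state = {"integral": 0.0, "prev_error": 0.0}
pv = 20.0                              # process variable, e.g., temperature in degC
for _ in range(5):
    output = pid_step(setpoint=50.0, measurement=pv, state=state)
    pv += 0.05 * output                # crude first-order process response
    print(f"controller output = {output:6.1f}   pv = {pv:5.1f}")
```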

Personal computer (PC)

In 1981, IBM introduced the personal computer using what was to become the standard disk operating system (DOS), created by Microsoft. The PC was a general-purpose computer at a cost point significantly below that of minicomputers. The PC architecture leveraged the innovation and creativity of a wide range of developers with an open hardware bus for add-on cards and an open operating system that developers could use to run their own applications. The openness of the PC platform dramatically broadened available applications, unleashed creativity, and created an ecosystem of developers serving a variety of needs. The PC revolution was on, and industrial automation benefited greatly.

Human-machine interface (HMI)

The development of human-machine interfaces in process control began with a CRT-based system where an operator could read the relative position of process variables “at a glance,” allowing the operator to develop a pattern recognition method of analyzing the current plant operating situation. Honeywell’s TDC 2000, which arrived in 1975, drastically changed the pace of operator console development.

The availability of PCs running DOS and using third-party graphic image software spawned a new breed of HMI solutions around 1985. New companies included Intellution, Iconics, and USDATA.

The next big step was in 1987 when Wonderware introduced InTouch, the first Microsoft Windows–based HMI that added significant features and open interfaces to information technology (IT) and business systems.

Microsoft Windows

Microsoft Windows, introduced in late 1985, had a profound impact on industrial automation. First applied in Wonderware’s InTouch software, Windows was later adopted by virtually all industrial automation suppliers, even though many initially took the position that it was not appropriate for industrial and process automation.

Microsoft introduced Windows on 20 November 1985 as a graphical operating system shell for MS-DOS in response to the growing interest in graphical user interfaces (GUIs). Windows grew into a complete, integrated operating system that dominated the world’s PC market with more than 90 percent market share, overtaking Mac OS, which had been introduced in 1984. On PCs, Windows is still the most popular operating system.

Microsoft Windows’ rich environment spawned a large ecosystem of developers who wrote software for a wide range of applications. These included databases, analysis, advanced control, manufacturing execution systems (MESs), batch management, production tracking, and historians.

Microsoft Windows became the way to bridge real-time plant operations with IT and business systems for more unified and coordinated manufacturing and production. The Microsoft Windows operating system platform allowed users to leverage standard IT tools to analyze manufacturing data and share production and plant information seamlessly with business systems. Windows provided the platform for development of OPC, which significantly simplified software drivers for industrial networks and equipment interfaces.

Data historians

A wide range of scientific and engineering applications established the value of time-series historical data, which became widely available for control and automation when PCs made it practical. OSIsoft, which started as Oil Systems Inc., introduced the PI System (Plant Information System), which led to the wide adoption of historians. Patrick Kennedy founded the company in 1980 and is considered the father of plant historians. Historians have become an important tool in many types of industrial manufacturing and process control applications to improve productivity, efficiency, and profits. Historian information is used by automation engineers, operations staff, and businesspeople for many different kinds of applications. Standing the test of time and proving continuing value, historians are now being embedded in controllers and on cloud servers. Watch this great video by Pat Kennedy for more information about real-time data infrastructure: https://tinyurl.com/ISA75f1a.
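
Conceptually, a historian appends timestamped samples per tag and answers time-range queries. The sketch below uses SQLite purely to illustrate that idea; the tag name is hypothetical, and the PI System and other commercial historians use purpose-built compression and archive files rather than a relational table.

```python
# Toy illustration of what a historian does conceptually: store timestamped
# samples per tag and query a time range. Tag name is hypothetical; commercial
# historians use purpose-built compression and archives, not a SQL table.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE samples (tag TEXT, ts REAL, value REAL)")

def record(tag, value, ts=None):
    ts = time.time() if ts is None else ts
    db.execute("INSERT INTO samples VALUES (?, ?, ?)", (tag, ts, value))

def query(tag, start, end):
    cur = db.execute(
        "SELECT ts, value FROM samples WHERE tag = ? AND ts BETWEEN ? AND ? ORDER BY ts",
        (tag, start, end))
    return cur.fetchall()

t0 = time.time()
for i in range(10):
    record("FIC101.PV", 42.0 + 0.1 * i, ts=t0 + i)   # hypothetical flow tag
print(query("FIC101.PV", t0, t0 + 4))                # first five samples
```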

Open industrial networks

Open industrial networks that encompass sensors, control, and communications significantly simplified control and automation applications. This marked the beginning of being able to use sensors and devices from multiple vendors in a single system with a common communications interface. Open industrial communications networks became possible using commercial technology as general electronic and communications technologies advanced.

Starting in 1979, Modbus enabled communication among many devices from multiple vendors by leveraging the RS-485 standard. The standard defined the electrical characteristics of drivers and receivers to use in multidrop, serial communications systems to connect a wide range of controllers, sensors, instrumentation, PID controllers, motor drives, and other devices. Modbus is still prevalent in products, manufacturing, and process plant applications.
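
As an illustration of how simple the protocol is, the sketch below builds the request frame a Modbus RTU master would send to read holding registers, including the standard CRC-16 check. The slave address and register numbers are placeholders, and in practice an existing library such as pymodbus handles the framing.

```python
# Building a Modbus RTU "read holding registers" (function 0x03) request frame
# with the standard Modbus CRC-16. Slave address and register are placeholders.
import struct

def modbus_crc16(frame: bytes) -> bytes:
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return struct.pack("<H", crc)          # CRC is appended low byte first

def read_holding_registers(slave: int, start_register: int, count: int) -> bytes:
    pdu = struct.pack(">BBHH", slave, 0x03, start_register, count)
    return pdu + modbus_crc16(pdu)

# Ask slave 17 for two registers starting at offset 0 (register 40001).
print(read_holding_registers(17, 0, 2).hex(" "))
```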

In the late 1980s and into the 1990s, there was a proliferation of fieldbus communication standards. Prominent ones that survived include DeviceNet, Profibus, SERCOS, ASi (Actuator Sensor Interface), Foundation Fieldbus, and HART (Highway Addressable Remote Transducer). The HART Communication Protocol, the only one not purely digital, is a hybrid analog + digital industrial automation open protocol. Its most notable advantage is that it can communicate over legacy 4–20 mA analog instrumentation current loops, sharing the pair of wires used by the analog-only host systems.

ISA-88 batch control standards

The ISA-88 (ANSI/ISA-88) series of standards for the design and specification of batch control systems, widely used in the process industries, has had a significant impact on productivity and continues to be adopted throughout the world. ISA-88 addresses batch process control with a design philosophy for describing equipment and procedures that applies to software implementations and manual processes alike. The first standard in the series was approved by ISA in 1995 and adopted by the IEC in 1997 as IEC 61512-1.

ISA-88 provides a consistent set of standards and terminology for batch control and defines the physical model, procedures, and recipes. The standards address a wide range of needs, including creating a universal model for batch control, easing the difficulty of communicating user requirements, enabling integration among batch automation suppliers, and simplifying batch control configuration. ISA-88 has been instrumental in bridging all aspects of batch production from the plant floor to enterprise systems. It is worth noting that the PackML standard uses ISA-88.
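
One way to picture the procedural model the standard defines, a procedure made up of unit procedures, operations, and phases, is as nested data structures. The sketch below is purely illustrative; the recipe content and parameter names are hypothetical and are not taken from the standard.

```python
# Illustrative nesting of the ISA-88 procedural model:
# procedure -> unit procedures -> operations -> phases.
# Recipe content and parameters are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Phase:
    name: str
    parameters: dict = field(default_factory=dict)

@dataclass
class Operation:
    name: str
    phases: list = field(default_factory=list)

@dataclass
class UnitProcedure:
    unit: str
    operations: list = field(default_factory=list)

@dataclass
class Procedure:
    recipe: str
    unit_procedures: list = field(default_factory=list)

recipe = Procedure("Polymer-A batch", [
    UnitProcedure("Reactor-1", [
        Operation("Charge", [Phase("AddMonomer", {"mass_kg": 500})]),
        Operation("React", [Phase("Heat", {"setpoint_C": 85}),
                            Phase("Hold", {"minutes": 120})]),
    ]),
])
print(recipe.recipe, "->", [op.name for op in recipe.unit_procedures[0].operations])
```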

ISA-95 enterprise-control system integration standards

The ISA-95 (ANSI/ISA-95) enterprise-control system integration standards, which describe the interface content from sensors to enterprise systems, have been widely adopted worldwide. Notably, the Industry 4.0 initiatives reference and use ISA-95. ISA-95 increases the uniformity and consistency of interface terminology and reduces the risk, cost, and errors associated with implementing these interfaces. The standards also reduce the effort required to bring new product offerings into production.

ISA-95 (ANSI/ISA-95) provides consistent terminology and object models that are foundational for supplier and manufacturer communications. By helping to define the boundaries between enterprise systems and control systems, ISA-95 models clarify application functionality and how information is to be used.

Not insignificantly, the American National Standards Institute (ANSI) approved ISA as an ANSI-accredited standards-writing organization in 1976.

Wireless 802.15.4 enables wireless sensors

The IEEE 802.15.4 standard for low-rate wireless personal area networks (LR-WPANs) and the commercial chip components that followed became the building blocks for industrial wireless sensor standards, including ISA100.11a and WirelessHART. The IEEE 802.15 working group, which defined the standard in 2003, continues to maintain it.

The ISA100.11a (IEC 62734) wireless networking technology standard, developed by ISA, addresses “wireless systems for industrial automation: process control and related applications,” with an emphasis on field-level devices. In 2009, the ISA Automation Standards Compliance Institute established the ISA100 Wireless Compliance Institute, which owns the “ISA100 Compliant” certification scheme and independently tests ISA100-based products to ensure they conform to the standard.

WirelessHART (IEC 62591) is a wireless sensor networking technology based on HART. It is defined for the requirements of process field devices. A goal of WirelessHART is backward compatibility with existing HART-compatible control systems and configuration tools to integrate new wireless networks and their devices.

Machine vision and image recognition

The application of machine vision systems continues to grow with lower costs and more capability from advances in software technology, particularly image recognition. Properly configured and programmed vision systems eliminate human error, increasing productivity, quality, and profits. Vision systems have become highly intelligent, flexible sensors in the control and automation process, providing a range of input for real-time control. Applications include quality inspection, part identification, robot guidance, and machine control based on parts flow. Initially, a camera was connected to a PC that did pattern recognition. Newer machine vision cameras incorporate pattern recognition and a complete IEC 61131-3 PLC in a small device mounted on machines. This is possible because of dramatic developments in computer system-on-a-chip (SoC) and miniature video camera chips.
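
A simple form of the pattern recognition those early PC-based vision systems performed is template matching. The OpenCV sketch below is illustrative; the image file names and score threshold are placeholders, and real systems add calibration, lighting control, and pass/fail logic.

```python
# Illustrative pattern recognition via template matching with OpenCV.
# Image file names and the score threshold are placeholders.
import cv2

scene = cv2.imread("part_on_conveyor.png", cv2.IMREAD_GRAYSCALE)       # captured frame
template = cv2.imread("good_part_template.png", cv2.IMREAD_GRAYSCALE)  # reference part

result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_location = cv2.minMaxLoc(result)

if best_score > 0.8:
    print(f"part found at {best_location}, score {best_score:.2f}")
else:
    print("part not found or fails inspection")
```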

ISA-95 enterprise-control system integration standard B2MML

ISA-95 (ANSI/ISA-95) has been accepted throughout the world in a wide range of industries. The latest development, Business to Manufacturing Markup Language (B2MML), creates compatibility with enterprise computing, cloud computing, IoT, and Industry 4.0. B2MML further adds value to ISA-95 by providing consistent terminology and object models and by bridging IT and OT. B2MML expresses ISA-95 (IEC/ISO 62264) data models in a standard set of XML schemas written using the World Wide Web Consortium’s XML Schema language (XSD).

B2MML is an open source XML implementation of the ISA-95 and IEC 62264 standards. Maintained by the Manufacturing Enterprise Solutions Association (MESA International), B2MML is used as the de facto standard interface for exchanging the content defined in ISA-95. There is cooperation to bring this into the OPC UA framework, which provides a secure and reliable architecture for manufacturing industries.
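
Because B2MML documents are ordinary XML, they can be produced and consumed with standard tooling. The fragment below only suggests the flavor of an ISA-95-style material exchange; the element names are simplified for illustration and are not quoted verbatim from the B2MML schemas.

```python
# Building a simplified, ISA-95-flavored XML fragment with the standard library.
# Element names are simplified for illustration, not verbatim B2MML schema elements.
import xml.etree.ElementTree as ET

lot = ET.Element("MaterialLot")
ET.SubElement(lot, "ID").text = "LOT-2024-0042"              # placeholder identifiers
ET.SubElement(lot, "MaterialDefinitionID").text = "RESIN-A"
quantity = ET.SubElement(lot, "Quantity")
ET.SubElement(quantity, "QuantityString").text = "1500"
ET.SubElement(quantity, "UnitOfMeasure").text = "kg"

print(ET.tostring(lot, encoding="unicode"))
```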

Gaming technology

The gaming industry has pushed the envelope of computing, and industrial automation applications are taking advantage of the results. Developments originally intended for the video game industry are now having an impact on the cloud, artificial intelligence, data science, and autonomous vehicles. The enormous volume of the gaming industry, exceeding $125 billion in 2018, is pushing the performance of technology and dramatically lowering cost. All types of industries are applying these new technologies in creative ways. Video game industry hardware and software are increasingly leading to new industrial automation technology and business use cases. In particular, virtual reality software platforms and user interfaces, such as virtual reality glasses, are being used in industrial automation in creative ways.

Over the years, the industrial and process automation industry has taken advantage of and leveraged commercial technologies as they became mainstream to create applications that deliver greater value. Augmented reality has seeped into daily life and is being used in everything from mobile games to heavy industry. These innovative technologies can assist with every phase of a project, including design, virtual commissioning, startup, troubleshooting, and quality control. Examples of applications that benefited from these technologies are:

  • Machine and process simulation, including virtual commissioning, to identify issues and bottlenecks before installing real equipment, saving time and money
  • Smart glasses with immediate access to manuals, instruction videos, and other materials to help on-site personnel troubleshoot problems. Coupled with communications to subject-matter experts at remote sites, this is a tremendous value to improve production uptime.
  • Training simulators provide immersive learning to plant personnel before they go on site. For example, there are many demonstrations of training people in a virtual petrochemical plant environment and giving them challenges, so they learn how to react to dangerous disruptions and operations.

OPC UA

OPC UA (IEC 62541) is a service-oriented architecture (SOA) that bridges industrial automation with the latest computing and IoT technologies. It supplies high-quality, contextual data based on application-oriented data models. OPC UA is a unifying technology; members of the OPC Foundation include suppliers of automation, PLCs, DCSs, sensors, industrial software, enterprise resource planning, and cloud services. OPC UA is becoming a key technology for integration of IT and OT.

OPC UA can be deployed on any operating system, including Windows, Linux, real-time operating systems, and proprietary systems. Consistent with modern software practice, open source OPC UA implementations are available on GitHub.

The basic principles of service-oriented architecture are independent of vendors, products, and technologies for seamless interoperability. OPC UA has become the unifying system architecture to communicate data and information from many industrial automation disciplines efficiently and effectively. Working with various standards groups, the OPC Foundation jointly created standardized information models, defined in Companion Specifications, to achieve interoperability from sensors to enterprise without layers of software for translation and normalization to disparate systems.
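
Reading a value from a server’s address space takes only a few lines of client code. The sketch below assumes the open source python-opcua package; the endpoint URL and node identifier are placeholders for a real server.

```python
# Minimal OPC UA read, assuming the open source python-opcua package.
# The endpoint URL and node ID are placeholders for a real server's address space.
from opcua import Client

client = Client("opc.tcp://localhost:4840/freeopcua/server/")
client.connect()
try:
    node = client.get_node("ns=2;i=1001")      # e.g., a temperature tag
    print("value:", node.get_value())
finally:
    client.disconnect()
```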

Machine learning

The application of machine learning is accelerating with high-performance, lower-cost hardware, lower-cost data acquisition, and large libraries of open source frameworks and software modules, making it practical to apply in a wide range of applications. Machine learning (ML) is a branch of artificial intelligence based on the idea that systems can learn from data, identify patterns, and make decisions with minimal human intervention. ML applies algorithms and statistical models to analyze data and predict future performance without being explicitly programmed to perform the task.

The iterative aspect of machine learning is important, because as models are exposed to new data they automatically adapt and learn from previous computations to produce reliable, repeatable decisions and results. Today the ability to automatically apply complex mathematical calculations to big data repetitively with high-performance, low-cost computing is driving applications such as:

  • self-driving cars
  • Amazon and Netflix online recommendations
  • fraud detection.

In the past, applications had to be built from scratch. Now “off-the-shelf” solutions implemented in common open-source frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) make it possible to rapidly create applications.

Predictive maintenance using machine learning is increasing the uptime of manufacturing and process production lines by eliminating breakdowns that by their nature are disruptive unplanned events. By monitoring equipment and benchmarking against models and rules, systems can predict problems and advise maintenance workers to make repairs before problems cascade into larger failures. In addition, embedded processors with sensors are being added to specific pieces of equipment to analyze and alert maintenance about impending problems. Machine learning is also being applied as part of the closed-loop strategy for control and automation to improve machine and process performance.
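
As a toy example of the off-the-shelf approach, the sketch below trains scikit-learn’s IsolationForest on synthetic “healthy” vibration and temperature readings and flags readings that drift away from that baseline; real systems use engineered features drawn from plant historians.

```python
# Toy predictive-maintenance sketch: learn "normal" behavior, flag anomalies.
# Data are synthetic; real systems use engineered features (RMS vibration,
# spectral bands, temperature, load) collected from plant historians.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# columns: [vibration (g), bearing temperature (degC)] during healthy operation
healthy = rng.normal(loc=[0.5, 60.0], scale=[0.05, 1.0], size=(500, 2))

model = IsolationForest(contamination=0.01, random_state=0).fit(healthy)

new_readings = np.array([[0.52, 60.5],     # looks normal
                         [0.95, 71.0]])    # high vibration and heat: worn bearing?
for reading, label in zip(new_readings, model.predict(new_readings)):
    status = "OK" if label == 1 else "ALERT: schedule maintenance"
    print(reading, status)
```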

Industry 4.0 initiatives

Industry 4.0 is focused on the application of a range of new technologies to create efficient self-managing production processes using IoT and open software and communications standards that allow sensors, controllers, people, machines, equipment, logistics systems, and products to communicate and cooperate with each other directly. Germany’s Industrie 4.0 initiative has influenced thinking throughout the world and become a model for other initiatives and cooperative efforts, including Made in China 2025, Japan Industrial Value Chain Initiative (www.iv-i.org), Make in India, and Smart Manufacturing Leadership Coalition (SMLC).

A core tenet of Industry 4.0 is that automation systems must adopt open source, multivendor, interoperable software application and communication standards similar to those that exist for computers, the Internet, and cell phones. Industry 4.0 demonstrations acknowledge this by using existing standards, including the ISA-88 batch standards, the ISA-95 enterprise-control system integration standards, OPC UA, IEC 61131-3, and PLCopen.

The Industry 4.0 initiative started as one part of a 10-point high-tech German strategic plan created in 2006. On 14 July 2010, the German cabinet decided to continue the strategy by introducing the High-Tech Strategy 2020 initiative focusing the country’s research and innovation policy on selected forward-looking projects related to scientific and technological developments over 10 to 15 years. Industry 4.0 is a vision of integrated industry implemented by leveraging computing, software, and Internet technologies. The 4.0 refers to the idea of a fourth Industrial Revolution:

  • First: production mechanization using water and steam power
  • Second: mass production (Henry Ford often cited as the innovator)
  • Third: digital revolution (e.g., machine tool numerical control, programmable logic controllers, direct digital control, and enterprise resource planning)
  • Fourth: Industry 4.0 leveraging cyber-physical systems, embedded computing, Internet of Things technologies

The German strategy emphasizes cooperation between industry and science to promote closer links between knowledge and skills.

The vision of Industry 4.0 is significantly higher productivity, efficiency, and self-managing production processes where people, machines, equipment, logistics systems, and work-in-process components communicate and cooperate with each other directly. A major goal is applying low-cost mass production efficiencies to achieve make-to-order manufacturing of quantity one by using embedded processing and communications. Production and logistics processes are integrated intelligently across company boundaries, creating a real-time lean manufacturing ecosystem that is more efficient and flexible.

The digital twin

The digital twin has become one of the most powerful concepts of Industry 4.0. As an implementation of model-based, real-time, closed-loop monitoring, control, and optimization of the entire manufacturing and production process, the digital twin concept is helping organizations achieve real-time integrated manufacturing.

The fundamental idea of the digital twin is to have a virtual model of ideal manufacturing operations and processes against which actual production metrics are benchmarked in real time. The broadest implementations model all of the factors that affect the efficiency and profitability of production, including machines, processes, labor, incoming material quality, order flow, and economic factors. Organizations can use this wealth of information to identify and predict problems before they disrupt efficient production.

The digital twin is a prominent example of practical macro-level, closed-loop control that is feasible with the advanced hardware, software, sensors, and systems technology now available. A critical part of creating a digital twin is assembling a complete information set, including real-time information captured by a wide range of sensors. Industry 4.0 is a practical application of the latest technologies, including IoT, to integrate manufacturing and business systems.
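
At its simplest, a digital twin benchmarks live measurements against what a model of ideal operation predicts and raises an alert when the residual grows. The sketch below is a deliberately simplified illustration; the model, signals, numbers, and threshold are all hypothetical.

```python
# Deliberately simplified digital-twin idea: compare live measurements against a
# model of ideal operation and alert when the residual exceeds a threshold.
# The "model", signals, and threshold are hypothetical.

def expected_power_kw(feed_rate_tph: float) -> float:
    """Hypothetical ideal model: power the line should draw at a given feed rate."""
    return 12.0 + 3.5 * feed_rate_tph

THRESHOLD_KW = 5.0
live_samples = [(10.0, 47.5), (10.2, 48.9), (10.1, 55.8)]   # (feed rate t/h, measured kW)

for feed_rate, measured_kw in live_samples:
    residual = measured_kw - expected_power_kw(feed_rate)
    if abs(residual) > THRESHOLD_KW:
        print(f"ALERT: {measured_kw} kW deviates {residual:+.1f} kW from the model")
    else:
        print(f"OK: residual {residual:+.1f} kW")
```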

Cloud and edge computing

Cloud computing is affecting a wide range of applications, including industrial automation, by providing easy-to-use, high-performance computing and storage that does not require a large capital investment or the ongoing overhead of supporting in-house computers and servers. Cloud providers, including Microsoft Azure and Amazon Web Services, offer a variety of software tools (e.g., data analysis and predictive analytics) that the general industrial sector and process automation plants can use to solve manufacturing, production, and business challenges. Many industrial automation applications, such as historians, condition-based maintenance, predictive maintenance, asset management, and failure analysis, are now more cost effective with cloud computing technology.

Cloud computing leverages shared resources and economies of scale much like an electric utility, providing almost limitless computing power and massive storage on demand. Edge computing, the deployment of low-cost, high-performance computing (including communications), is becoming commonplace. It brings computation and data storage closer to where they are needed to improve response times, add context to data, and perform functions required locally. In the history of computers and industrial automation, processing has always been pushed as far to the edge of the network as practical with the technology of the time. Today, edge devices can be small headless computers or SoCs embedded in sensors, actuators, and other devices extremely cost effectively. Putting this in context, consider the power and cost of your smartphone today.

These computing devices are platforms for a wide range of software, including IoT stacks, IEC 61131-3 PLC runtimes, OPC UA, MQTT, cloud interfaces, time-series databases, HMIs, and analytics. ISA-95 Level 0–2 functions, and portions of Level 3 consistent with the new IoT distributed computing models, can be accomplished in these devices.
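
For example, an edge node publishing a sensor reading to a broker over MQTT takes only a few lines. The sketch below assumes the paho-mqtt package (1.x-style constructor shown); the broker address, topic, and payload are placeholders.

```python
# Edge node publishing a sensor reading over MQTT, assuming the paho-mqtt package
# (1.x-style constructor shown). Broker address, topic, and payload are placeholders.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.plant.local", 1883)
client.loop_start()                                   # background network loop

payload = json.dumps({"tag": "TT-101", "value": 78.4, "units": "degC"})
info = client.publish("plant/line1/temperature", payload, qos=1)
info.wait_for_publish()                               # block until the broker acknowledges

client.loop_stop()
client.disconnect()
```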

With the growing acceptance of industrial sensor networks coupled with edge devices, more applications will be deployed on these open-system devices rather than on PLC and DCS controllers. Edge computing devices deploy both industrial and enterprise networking and communication functions to help seamlessly integrate IT and OT.

Collaborative robots

Collaborative robots (cobots) are a new breed of lightweight and inexpensive robots that work cooperatively with people in a production environment. They are a new way to implement flexible manufacturing without extensive plant floor retrofits and large capital investment. Cobots are inherently safe; they sense humans and other obstacles and automatically stop, so they do not cause harm or destruction. Protective fences and cages are not required, increasing flexibility and lowering implementation costs.

These robots are particularly attractive investments for small- to medium-sized companies. The programming process of this new class of robots is greatly simplified and does not require programming gurus. The robots can be programmed by example or with software that is similar to gaming. Most tasks can be accomplished with no programming skills simply by moving the robot arms and end effectors, teaching the robot what to do. The robot memorizes the motions and creates the program. This is a physical form of the popular computer concept called “what you see is what you get” (WYSIWYG) programming. It is intuitive for users and has proven to broaden the application of technology. The typical cost is less than $40,000 U.S. Simplified programming means collaborative robots can be deployed without hiring specialized engineers.

The development of this new class of robots is similar to how the application of computers expanded with the development of the PC. In the beginning, computers were expensive, powerful devices locked away in special rooms and programmed by software specialists who wrote cryptic computer code. Because the cost to implement solutions was high, few applications used computers. When PCs were introduced, they did not have the computational power of mainframes and minicomputers nor the large amount of memory. But with their lower cost and flexibility, people were empowered to apply computers to a wider range of applications. This factor, coupled with simplified programming, led to a revolution in the application of computers for industrial automation.

These new collaborative robots cannot pick up an engine block, but they can perform a great variety of tasks with smaller payloads, typically 10–30 kilograms. Collaborative robots can flawlessly perform repetitive, mundane, and dangerous tasks that were previously performed by an operator. Operators no longer are forced to stand at a machine for hours doing mindless work or working in a hazardous environment. This improves productivity and quality while freeing up workers for tasks that require human skills.

Cobots are now one of the fastest-growing industrial automation segments; the segment is expected to jump tenfold to 34 percent of all industrial robot sales by 2025, according to the Robotic Industries Association (RIA) (www.robotics.org/blog-article.cfm/Collaborative-Robots-Market-Update-2018/84). An exciting development is the coupling of collaborative robots with vision systems, image recognition, and artificial intelligence that replicate human manufacturing procedures.

Collaborative robots have lowered the barriers to automation. A broad range of users, particularly small and medium enterprises, can implement them without sophisticated automation personnel. The flexibility of collaborative robots enables the automation of functions that were not practical in the past. Collaborative robots are also suitable for production with make-to-order requirements, since they are easily programmed to do multiple tasks.
 

Titans of Automation

John Bardeen, Walter Brattain, and William Shockley, transistor inventors

The device that changed everyone’s life in industrialized society, including in the process control industry, was the transistor, invented in 1947 by John Bardeen, Walter Brattain, and William Shockley of Bell Laboratories. Arguably the most important invention of the 20th century, the transistor opened the electronics age, driving out many pneumatic or air-based controllers of the 1920s and ’30s. The transistor contained three electrodes and could amplify or vary currents or voltages between two of the electrodes in response to the voltages or currents imposed on the third electrode.

Federico Faggin, microprocessor innovator

Microprocessors made industrial controllers of all types practical, and Federico Faggin designed the first commercial microprocessor, the Intel 4004, in 1971. He led the 4004 (MCS-4) project and the design group during the first five years of Intel’s microprocessor effort. After the 4004, he led development of the Intel 8008 and 8080. Later he cofounded Zilog, the first company solely dedicated to microprocessors, and led the development of the Zilog Z80 (used extensively in controllers) and Z8 processors. The influence of microprocessors continues; they are embedded in sensors, actuators, and other end devices. Microprocessors are fueling the implementation of Industry 4.0, industrial digitalization, and IIoT. In 2010, Faggin received the 2009 National Medal of Technology and Innovation, the highest honor the U.S. confers for achievements related to technological progress.

Richard Rimbach, ISA founder and first secretary

ISA was officially born as the Instrument Society of America on 28 April 1945, in Pittsburgh, Pa., U.S. It was the brainchild of Richard Rimbach of the Instruments Publishing Company and grew out of the desire of 18 local instrument societies to form a national organization. Rimbach graduated from MIT with an engineering degree and was the first executive secretary of the Instrument Society of America. Industrial instruments, which became widely used during World War II, continued to play an ever-greater role in the expansion of technology after the war. See www.isa.org/about-isa/history-of-isa.

Albert F. Sperry, ISA founder and first president

Albert F. Sperry, chairman of Panelit Corporation, became ISA’s first president in 1946. The same year, the Society held its first conference and exhibit in Pittsburgh. The first standard, RP 5.1, Instrument Flow Plan Symbols, followed in 1949, and the first journal, which eventually became InTech, was published in 1954. Representatives from regional societies first gathered in New York on 2 December 1944, and ISA was officially founded on 28 April 1945 with 15 local instrument societies and about 1,000 members. Karl Kayan, a professor at Columbia University, was vice president; Clark E. Fry of Westinghouse was treasurer; and Richard Rimbach of Instruments Publishing Co. was secretary.

Glenn F. Harvey

ISA executive director for 32 years, Glenn F. Harvey oversaw ISA’s direction and saw the focus shift from valves and other electrical, mechanical, and pneumatic instruments to microprocessors and PCs to a solutions-based, software-driven discipline. Under his leadership, ISA grew from a few thousand members to a peak of more than 60,000 members during the 1990s.

 

A. T. James and A. J. P. Martin, gas-liquid chromatograph

In 1952, A. T. James and Archer John Porter Martin developed the process of gas-liquid chromatography, a technique for separating and analyzing mixtures; that same year Martin shared the Nobel Prize in Chemistry for his earlier work on partition chromatography. Gas-liquid chromatography dramatically improved the speed, accuracy, and sensitivity of previous chromatographic procedures. By 1956, a company called Beckman Instruments was marketing the first gas chromatograph.

Dick (Richard) Morley, father of the PLC

Dick Morley is considered the father of the programmable logic controller (PLC), which was conceived by his team at Bedford Associates. Morley also supported ISA and encouraged young automation professionals to join.

Morley and his team of engineers developed a solid-state, sequential logic solver designed for factory automation and continuous processing applications: the first practical programmable logic controller, called the Modicon 084. The company demonstrated the Modicon 084 to General Motors’ Hydramatic Division in 1969 and delivered the first commercial unit to GM in 1970 to control metal cutting, hole drilling, material handling, assembly, and testing for the Hydramatic Model 400 automatic transmission. The new system replaced the large electromagnetic relay panels GM previously used, in which it was hard to identify where problems had occurred.

The PLC allowed those in the control industry to program the system, which was not possible with electromagnetic relay panels.

Karl Åström, father of adaptive control

Karl Johan Åström is a Swedish control theorist who made contributions to control theory and control engineering, computer control, and adaptive control. In 1965, he described a general framework of Markov decision processes (MDPs) with incomplete information, which led to the notion of a partially observable Markov decision process (POMDP). A POMDP models an agent decision process in which it is assumed that the system dynamics are determined by an MDP, but the agent cannot directly observe the underlying state. Instead, it must maintain a probability distribution over the set of possible states, based on a set of observations and observation probabilities, and the underlying MDP. The POMDP framework is general enough to model a variety of real-world sequential decision processes. Applications include robot navigation problems, machine maintenance, and general planning under uncertainty. Leslie P. Kaelbling and Michael L. Littman adapted it for problems in artificial intelligence and automated planning.
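
The belief-state bookkeeping a POMDP agent performs can be shown with a tiny two-state example: a machine that is either healthy or worn but can only be observed indirectly. The transition and observation probabilities below are illustrative, not drawn from Åström’s work.

```python
# Belief update for a tiny two-state POMDP: the machine is "ok" or "worn", but the
# agent only sees noisy observations. Probabilities are illustrative.
import numpy as np

states = ["ok", "worn"]
T = np.array([[0.9, 0.1],        # P(next state | current state) per cycle
              [0.0, 1.0]])       # wear is not self-repairing
O = np.array([[0.8, 0.2],        # P(observation "quiet"/"noisy" | state)
              [0.3, 0.7]])

belief = np.array([0.95, 0.05])  # prior: machine is probably fine

def update(belief, obs_index):
    predicted = belief @ T                      # predict across the transition
    unnormalized = predicted * O[:, obs_index]  # weight by observation likelihood
    return unnormalized / unnormalized.sum()

for obs in (1, 1):                              # two "noisy" observations in a row
    belief = update(belief, obs)
    print(dict(zip(states, belief.round(3))))
```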

Bill Gates and Paul Allen, Microsoft founders

Microsoft Corporation, founded by Bill Gates and Paul Allen on 4 April 1975, has had, and continues to have, a significant impact on accelerating the creation and application of valuable industrial control and automation software. Microsoft Windows and server offerings in particular are a platform for a wide range of innovative, creative, and valuable industrial applications, and more continue to be developed by industrial control and automation subject-matter experts.

Larry Evans, pioneer in process modeling

Larry Evans started as an MIT chemical engineering professor and principal investigator of the ASPEN Project, a major research and development effort. The purpose of the project was to develop a “third-generation” process modeling and simulation system that could be used to evaluate proposed synthetic fuel processes both technically and economically.

When the project was completed in 1981, Evans, along with seven key members of the project staff, founded Aspen Technology, Inc. (AspenTech) to license the technology from MIT and to further develop, support, and commercialize it. As CEO at AspenTech, Evans greatly expanded the breadth and depth of the technology over the ensuing years and brought on board a wide range of complementary products. The company grew from a 10-person startup to a public company.

Dennis Morin, founder of Wonderware

Dennis Morin founded Wonderware in 1987. His vision of Microsoft Windows–based HMI was inspired by an early 1980s video game that allowed players to digitally construct a pinball game. He figured operators monitoring factory operations would be more productive with a machine that was fun and easy to use. Wonderware marked the beginning of the Microsoft industrial software revolution that opened the industrial and process control systems architectures to third-party developers. In 2003, InTech magazine listed Dennis Morin as one of the 50 most influential innovators in the history of industrial automation.

In one of the great rags-to-riches entrepreneurial stories of the 1980s, Morin was 40 years old when he was terminated by Triconex and started Wonderware. He had driven a taxi in Boston before coming to California in the 1970s. He shared his idea with a young technology wizard, Phil Huber, who joined him in forming Wonderware.

Patrick Kennedy, father of plant historians

Patrick Kennedy, considered the father of plant historians, founded Oil Systems, Inc. (now OSIsoft), and the Plant Information System became the first OSIsoft product that was widely deployed throughout industry. Historians have become an important tool in a range of industrial manufacturing and process control applications to improve productivity, efficiency, and profits. Historian information is used by automation engineers, operations, and businesspeople for many types of applications. Standing the test of time and proving continuing value, historians are now being deployed embedded in controllers and on cloud servers.
Kennedy earned a BS and a PhD in chemical engineering from the University of Kansas. A registered professional engineer in control systems engineering, he holds a patent on a catalytic reformer control system.

Tom Fisher, champion of ISA-88

Tom Fisher contributed to and was a champion of ISA-88 and was a World Batch Forum (WBF) chairman. Fisher was a founder of the ISA SP88 committee, which formulated the batch manufacturing standards used worldwide. Fisher joined Lubrizol in 1967 as a process engineer and rose through the ranks during his long career to become Lubrizol’s operations technology manager. He worked previously for DuPont and NASA. He also was ISA’s publications VP and a member of the Process Control Safety subcommittee of the Center for Chemical Process Safety. He led the IEC’s SC65A Working Group for batch control. Fisher educated a generation of batch process engineers and wrote several books on subjects including safety interlock systems, control design, and control applications (including a major text on batch control systems). Fisher was elected chairman of WBF in 1999.

Lynn Craig, champion of ISA-88

Lynn Craig was deeply involved in ISA-88, World Batch Forum, and ISA-95. Craig attended the University of Tennessee – Knoxville and was manager of process control and automation at Rohm & Haas company for more than 30 years. Craig was an originator and voting member of the ANSI/ISA SP95 standards committee (17 years), past chairman and voting member of the ANSI/ISA SP88 Batch Control committee (17 years), and first elected chairman of the WBF.

 

Dennis Brandl, champion of ISA-95

Dennis Brandl, BR&L Consulting, wrote most of ISA-95, as well as other important industry standards. Brandl is an active member of the ISA95 Enterprise/Control System Integration committee, coauthor of the MESA B2MML standards, a member of the ISA99 Industrial Cybersecurity standards committee, the former chairman of the ISA88 Batch System Control committee, and a contributor to the OPC Foundation and IEC 62541 standards. He specializes in helping companies use manufacturing IT to improve applications such as device connectivity, business-to-manufacturing integration, manufacturing execution systems, batch control, general and site recipe implementations, and automation system cybersecurity. He has been involved in automation system design and implementations, including Apollo and space shuttle test systems for Rockwell.

Ed Hurd, helped birth commercial DCS

Ed Hurd was a major driver of the Honeywell TDC 2000, which was introduced in 1975 and marked the beginning of commercial DCSs. At the 1976 ISA show in Houston’s Astrodome, Honeywell formally unveiled the TDC 2000, the first system to use microprocessors to perform direct digital control of processes as an integrated part of the system. This distributed architecture was revolutionary, with digital communication among distributed controllers, workstations, and other computing elements. Hurd served as president of Industrial Control from 1993 to 1995 and, before that, was vice president and general manager of Honeywell’s Industrial Automation and Control Group. He won a Sweatt Award in 1967 for circuitry design and was the design architect for an assignment called Project 72. After about two years, the group synthesized a next-generation control system. The project led to the TDC 2000, a DCS that took the industrial automation and control group from $5 million to $500 million in five years.

Bill Lowe, lab director for the IBM PC
IBM’s personal computer (IBM 5150) was introduced in August 1981, one year after corporate executives gave the go-ahead to Bill Lowe, the lab director in the company’s Boca Raton, Fla., facilities. Non-IBM personal computers were available as early as the mid-1970s, but the IBM PC launch legitimized use of this class of computers in business, scientific, and industrial applications. Lowe established a task force that developed the proposal for the first IBM PC, fighting the idea that things could not be done quickly at IBM. One analyst was quoted as saying that “IBM bringing out a personal computer would be like teaching an elephant to tap dance.” The group worked with a little-known company, Microsoft, for the operating system, and the team beat the deadline, finishing the IBM personal computer by 1 April 1981.

 

John Berra, communication protocol impresario
John Berra, the president of Emerson Process Management and Emerson executive vice president, received ISA’s “Life Achievement Award” at ISA 2002 in recognition of long-term dedication and contributions to the instrumentation, systems, and automation community. As of 2001, only seven people had received the honor, which was first given in 1981. Berra, who began his career as an engineer at Monsanto Co., played a major role in the development of three major manufacturing communications protocols: HART, Foundation Fieldbus, and OPC.

 

Charlie Cutler, redefined APC
Charles R. Cutler, a member of the National Academy of Engineering, invented and commercialized a highly successful multivariable controller that redefined the term advanced process control (APC). In 1984 he founded DMC Corporation, and in 1999 he founded a second company called the Cutler Technology Corporation. Cutler conceived control engineering applications that have brought a competitive edge to the current oil and gas industry, namely Dynamic Matrix Control (DMC) and real-time optimization (RTO). He was honored with a membership in the National Academy of Engineering in 2000 for his contributions to a new class of advanced process control technology. Cutler graduated as a chemical engineer from Lamar University in 1961 and went to work for Shell Oil Co., where he would conceive and implement the concept of a DMC algorithm, saving the petrochemical industry millions of dollars.

 

Odo Struger, named the PLC
Odo Struger of Allen-Bradley is credited with creating the acronym PLC (programmable logic controller). Struger, who earned a PhD from the Vienna University of Technology, also developed PLC application software during his nearly 40-year career at Allen-Bradley/Rockwell. He played a leadership role in developing National Electrical Manufacturers Association (NEMA) and International Electrotechnical Commission (IEC) 1131-3 PLC programming language standards. After moving from Austria to the U.S. in the 1950s, he became an engineer at Allen-Bradley in 1958, retiring in 1997 as Rockwell Automation’s vice president of technology.

 

Mike Marlowe, U.S. federal government liaison for ISA
Mike Marlowe’s relationships and U.S. government contacts were instrumental in ISA gaining access to the agencies and legislators needed to form a partnership with the U.S. Department of Labor on workforce development and the Automation Competency Model (ACM). Additionally, Marlowe worked to get the ISA-99 standard adopted by the U.S. government as a foundational standard for the cybersecurity of critical infrastructure. Marlowe’s efforts were significant in ISA-99/IEC 62443 becoming integral components of the United States Cybersecurity Enhancement Act of 2014 and the federal government’s plans to combat cyberattacks.

 

Peter G. Martin, automation renaissance man
Peter G. Martin has been an industry contributor, innovator, author, and champion of industrial control and automation for over 40 years. Martin was named one of the “50 Most Influential Innovators of All Time” by ISA. In 2009, he received the ISA Life Achievement Award, recognizing his work in integrating financial and production measures that improve the profitability and performance of industrial process plants. Martin, who began his process control career at Foxboro, holds multiple patents, including patents for real-time activity-based costing, closed-loop business control, and asset and resource modeling. He has authored or coauthored three books: Bottom Line Automation; Dynamic Performance Management: The Pathway to World-Class Manufacturing; and Automation Made Easy: Everything You Wanted to Know About Automation – and Need to Ask.

 

Vint Cerf, father of the Internet

Vint Cerf, widely known as a “Father of the Internet,” is the codesigner of the TCP/IP protocols and the architecture of the Internet. In December 1997, President Bill Clinton presented the U.S. National Medal of Technology to Cerf and his colleague Robert E. Kahn. In 2005, President George W. Bush awarded him the Presidential Medal of Freedom. Cerf began his work at the U.S. Department of Defense Advanced Research Projects Agency (DARPA), playing a key role in leading the development of the Internet and Internet-related data packet and security technologies. Since 2005, he has served as vice president and chief Internet evangelist for Google. From 2000 to 2007, he served as chairman of the board of the Internet Corporation for Assigned Names and Numbers (ICANN), an organization he helped form. Cerf was founding president of the Internet Society from 1992 to 1995 and in 1999 served a term as chairman of its board.

 

 

Reader Feedback


We want to hear from you! Please send us your comments and questions about this topic to InTechmagazine@isa.org.




About the Author


Bill Lydon is an InTech contributing editor with more than 25 years of industry experience. He regularly provides news reports, observations, and insights here and on Automation.com.