Enterprises with industrial operations typically utilize at least two types of computer networks: Information Technology (IT), a network that supports enterprise information system functions like finance, HR, order entry, planning, email and document creation; and Operational Technology (OT), a network that controls operations in real time. This second type of network supports real-time or control system products, generally referred to as Supervisory Control and Data Acquisition (SCADA) systems, Distributed Control Systems (DCS), Energy Management Systems (EMS) or Manufacturing Execution Systems (MES), depending on the industry.
There has been much discussion and debate around the convergence between Information Technology (IT) and Operational Technology (OT). In an effort to provide better visibility and information flow between revenue generating OT assets and enterprise applications, these systems have often been interconnected, in many cases without properly securing the control systems from cyber attack first. If the IT and OT networks are interconnected, yet not properly secured, a breach of one network can easily traverse to the other, leaving the entire computing infrastructure at risk.
At first glance, interconnected IT and OT networks appear to share similar technologies, so a common approach to cyber-security might seem indicated. Deeper inspection, however, reveals many important differences between IT and OT networks. The unique characteristics of OT systems and networks preclude many traditional IT enterprise security products from operating safely without impairing operations; when introduced, such products can cause significant disruption and downtime on these real-time, revenue generating assets.
This paper is intended to educate IT professionals on the unique requirements of operational technology and what is required to properly secure these networks from cyber attack, so that organizations can assure the security, reliability and safety of information and revenue generating assets.
02/26/2010
Whitelisting is described by its advocates as "the next great thing" that will displace anti-virus technologies as the host intrusion prevention technology of choice. Anti-virus has a checkered history in operations networks and control systems: many people have horror stories of how they installed anti-virus and so impaired their test system that they simply couldn't trust deploying it in production.
While anti-virus systems detect "bad" files that match signatures of known malware, whitelisting technologies identify "good" executables on a host and refuse to execute unauthorized or modified executables, presumably because such executables may contain malware. This is a least privilege approach of denying everything that is not specifically approved.
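As a concrete illustration of this deny-by-default model, a whitelisting engine can be reduced to a hash lookup: compute a cryptographic digest of an executable and refuse to run anything not on the approved list. The Python sketch below is a hypothetical simplification, not any vendor's implementation; the allowlist contents and policy are assumptions.

```python
import hashlib

# Hypothetical allowlist of approved executables, keyed by SHA-256 digest.
# (This placeholder entry is the digest of the bytes b"foo", for illustration.)
APPROVED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_approved(path):
    """Least privilege: deny by default, allow only known-good digests.
    A modified or unauthorized executable hashes differently and is refused."""
    return sha256_of(path) in APPROVED_HASHES
```

Because any change to the file changes its digest, this one check covers both "unauthorized" and "modified" executables, which is exactly the property whitelisting relies on.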
In this paper the Industrial Defender team performs an independent analysis of a variety of whitelisting solutions for their applicability to control systems. The paper closes with some recommendations related to this technology and areas for further research.
02/26/2010
ISA100 is one of three standards competing in industrial wireless sensing. What is distinctive about ISA100? What are the prospects for convergence of standards? What would convergence be worth to the industrial wireless market?
ISA100 is a major standards initiative managed by the International Society of Automation (ISA). In addition to standards development, a new organization, the ISA100 Wireless Compliance Institute (WCI), is charged with delivering compliance certification services for the work of ISA100.
The ISA100 committee establishes standards, recommended practices, technical reports, and related information for implementing wireless systems in the automation and control environment, with an initial focus on the field level. Given the committee's broad scope, it has formed a number of working groups to pursue specific tasks. The primary deliverable from the committee thus far is the standard ISA-100.11a, "Wireless Systems for Industrial Automation: Process Control and Related Applications". However, a quick glance at the list of working groups shows that several other topics will be addressed by future ISA100 deliverables.
In 2006, at about the same time ISA100 was forming, the ISA also created the non-profit Automation Standards Compliance Institute (ASCI). This organization manages certification, conformance, and compliance assessment activities in the ISA's automation domain.
ASCI extends the standards work of ISA by facilitating the effective implementation and independent testing of ISA standards. It creates a vital link between the development of standards and industries' implementation of them. The ISA100 Wireless Compliance Institute (WCI) functions as an operational group within ASCI. Operating WCI within ASCI allows it to leverage ASCI's infrastructure, which is shared by several ASCI compliance programs in addition to WCI.
11/22/2010
Today's control system engineers face competing design demands: increase embedded system performance and functionality, without sacrificing quality or breaking the budget. It is difficult to meet these challenges using traditional design and verification approaches.
Without simulation, it is impossible to verify a control design until late in the development process, when hardware prototypes become available. This is not an insurmountable problem for simpler designs with predictable system behavior, because there are fewer sources of error in simpler control algorithms, and those errors can often be resolved by tuning the controller on the hardware prototype.
Today's multidomain designs combine mechanical, electrical, hydraulic, control, and embedded software components. For these systems, it is no longer practical to delay verification until late in the development process. As system complexity grows, the potential for errors and suboptimal designs increases. These problems are easiest to address when they are identified early in the development process. When design problems are discovered late, they are often expensive to correct and require time-consuming hardware fixes. In some cases the hardware simply cannot be changed late in the development process, resulting in a product that fails to meet its original specifications.
Traditional verification methods are also inadequate for testing all corner cases in a design. For some control applications, it is impractical or unsafe to test the full operating envelope of the system on hardware.
03/02/2010
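A minimal example of what early, simulation-based verification buys you: before any hardware exists, a few lines of code can close the loop around a plant model and expose a design weakness. The Python sketch below simulates a purely proportional controller on an assumed first-order plant and reveals its steady-state offset; all model parameters are illustrative assumptions.

```python
def simulate_closed_loop(kp, tau=2.0, gain=1.0, setpoint=1.0, dt=0.01, t_end=10.0):
    """Forward-Euler simulation of a proportional controller driving an
    assumed first-order plant: tau * dy/dt = gain * u - y."""
    y, t, history = 0.0, 0.0, []
    while t < t_end:
        u = kp * (setpoint - y)            # proportional control law
        y += dt * (gain * u - y) / tau     # plant dynamics, one Euler step
        history.append(y)
        t += dt
    return history

# The loop settles at kp/(1 + kp) * setpoint, never quite reaching the
# setpoint -- a steady-state offset that simulation exposes long before
# a hardware prototype would.
```

Running this with kp = 10 settles near 0.91 rather than 1.0, the kind of corner-case insight that is cheap in simulation and expensive on hardware.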
One Code to Save Millions: ASME Codes and Standards Guide Dominion in Efficiency, Cost Savings and Safety
In order to stay on track with technology and provide the safest and most efficient working environment at Dominion's nuclear power plants, Dominion follows the codes and standards developed by ASME. ASME's mission is for its Standards & Certification organization "to develop the preeminent, universally applicable codes, standards, conformity assessment programs, and related products and services for the benefit of humanity." These codes and standards have a significant impact on the industry and save companies millions of dollars per year as well as assist in accident prevention and the development of more efficient production and operational practices. This case study illustrates how ASME has helped Dominion become more efficient, increasing cost savings and improving safety measures.
03/25/2010
Continuous level measurement is about one thing: answering the question "how much stuff do I have?" There are many applications where you need to know how much material is in a bin, silo or other vessel type. Usually the desired engineering unit is expressed in terms of volume or weight. "Measuring" volume or weight is not always the most practical approach; sometimes it isn't even viable. Take those silos you have: how do you weigh the ingredients if the silos weren't installed with load systems? Not an easy or inexpensive question to answer. So what do we do? This is where continuous level measurement sensors and systems come into play, offering a viable and cost-effective approach.
The purpose of this white paper is to discuss the application considerations when you need to measure the level of material continuously, or simply to determine on a continuous basis how much stuff you have in your vessels.
07/20/2010
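To make the level-to-volume relationship concrete, here is a sketch that converts a continuous level reading into volume for a vertical cylindrical silo with a conical bottom. The geometry and any dimensions are illustrative assumptions, and multiplying the result by a bulk density would give weight.

```python
import math

def silo_volume_m3(level_m, diameter_m, cone_height_m):
    """Convert a level reading (m) to volume (m^3) for an assumed
    vertical cylinder with a conical bottom section."""
    r = diameter_m / 2.0
    if level_m <= cone_height_m:
        # Material is still inside the cone; its surface radius
        # shrinks linearly toward the apex.
        rl = r * level_m / cone_height_m
        return math.pi * rl**2 * level_m / 3.0
    # Full cone plus a cylindrical column above it.
    cone = math.pi * r**2 * cone_height_m / 3.0
    return cone + math.pi * r**2 * (level_m - cone_height_m)
```

This is why a level sensor plus a vessel strapping table is often far cheaper than retrofitting load cells: the geometry does the conversion.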
Integrating wireless instrumentation with SCADA systems can drive operational efficiency and reduce deployment costs.
The use of wireless instruments in pipelines and gas production operations has been gaining momentum over the past few years. Driven by cost cutting measures and the need to gain more operational visibility to meet regulatory requirements, wireless instruments eliminate expensive trenching and cabling while providing access to hard-to-reach areas using self-contained, battery-powered instruments. However, SCADA engineers and operators are facing the challenge of integrating wireless instrumentation networks with other communication infrastructure available in the field. Managing and debugging dispersed wireless networks presents a new level of complexity to field operators that could deter them from adopting wireless instrumentation despite the exceptional savings.
This paper will look into the particular ways in which operators can tightly integrate wireless instrumentation networks with SCADA and realize the full benefits of such an integrated solution.
06/29/2010
Protection from noise and ground loops via ISO-Channel architecture.
Precision measurement systems are often limited in that all inputs are connected to a single ground. Typically, multiplexer input configurations are set up this way, since all signal inputs are connected to the same return. Even differential input configurations use the same ground reference. The result is that measurement accuracy and flexibility can be severely compromised when noise or common-mode voltage is present.
Crosstalk from one input signal can easily be reflected onto another input. The design trend toward an A/D converter per channel can help with this problem, but it is not sufficient in many cases.
To minimize noise and ground loops, some newer systems offer isolation between the input signal ground reference and the computer ground. This effectively separates the computer ground from the measurement portion of the system. But still, there is no isolation between input sensor channels, which is a common source of error and frustration for user applications. Why?
01/06/2010
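The practical cost of imperfect isolation can be estimated from a channel's common-mode rejection ratio (CMRR). The small sketch below, using illustrative numbers, shows how a modest common-mode voltage becomes an input-referred error large enough to swamp low-level signals such as thermocouple outputs.

```python
def common_mode_error_uv(v_cm, cmrr_db):
    """Input-referred error, in microvolts, produced by a common-mode
    voltage v_cm (volts) on a channel with the given CMRR in dB."""
    return v_cm / (10 ** (cmrr_db / 20.0)) * 1e6

# Example (illustrative figures): a 10 V common-mode swing into a
# 100 dB CMRR front end appears as a 100 uV input error -- comparable
# to several degrees C of thermocouple signal.
```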
AMS2750D Temperature Uniformity Surveys using TEMPpoint.
Industrial process furnaces and ovens require uniform temperature and heating; this is critical to repeatable product performance from batch to batch. These furnaces require periodic inspection for temperature uniformity.
Electronic and Mechanical Calibration Services of Millbury, Massachusetts, characterizes temperature uniformity in industrial furnaces and ovens for its customers. This is accomplished by measuring temperature in several locations throughout the furnace and monitoring temperature with thermocouples over time according to AMS2750D specifications.
The customer previously used chart recorders, which require constant monitoring while the survey is running. Surveys can run anywhere from 35 minutes to several hours depending on the industry-specified requirements. With the TEMPpoint solution the operator can set it up and let it run unattended, freeing them to multitask and work more efficiently. The shipping TEMPpoint application required very little modification using Measure Foundry and now fulfills the customer's requirements.
01/06/2010
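The core pass/fail logic of a uniformity survey reduces to a simple check: every thermocouple, at every logged instant, must stay within the tolerance band around the setpoint. A minimal sketch of that check (the tolerance value depends on the furnace class defined in AMS2750D; the numbers here are assumptions):

```python
def uniformity_ok(readings_by_tc, setpoint_c, tolerance_c):
    """Return True if every logged reading from every thermocouple
    stays within +/- tolerance_c of the survey setpoint.
    readings_by_tc maps a thermocouple label to its list of readings."""
    return all(
        abs(reading - setpoint_c) <= tolerance_c
        for readings in readings_by_tc.values()
        for reading in readings
    )
```

With unattended logging, a check like this can run continuously over the whole survey instead of an operator eyeballing chart-recorder traces afterward.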
Everyone is familiar with the concept of temperature in an everyday sense because our bodies feel and are sensitive to any perceptible change. But for more exacting needs as found in many scientific, industrial, and commercial uses, the temperature of a process must be measured and controlled definitively. Even changes of a fraction of a degree Celsius can be wasteful or even catastrophic in many situations.
For example, some biotech processes require elevated temperatures for reactions to occur and added reagents require exactly the right temperature for proper catalytic action. New alloys of metal and composites, such as those on the new Boeing 787 Dreamliner, are formed with high temperature methods at exacting degree points to create the necessary properties of strength, endurance, and reliability. Certain medical supplies and pharmaceuticals must be stored at exactly the desired temperature for transport and inventory to protect against deterioration and ensure effectiveness.
These new applications have driven the hunt for more exacting temperature measurement and control solutions that are easy to implement and use by novice users and experienced engineers alike. This is a challenging task. However, new equipment and standards, such as LXI (LAN Extensions for Instrumentation), offer a methodology to perform these exacting measurements in test and control applications.
Many LXI devices are available on the market today. But, what do you need to know to select the best temperature measurement solution for your test and control application? This paper describes the common pitfalls of precision temperature measurement and what you need to consider before selecting a temperature measurement solution.
01/06/2010
Training the Field Operator of the Future
Simulators are widely recognized as essential to process control training as they facilitate the propagation of a company's standard operating procedures (SOPs). This paper explores the use of process control simulators by Chevron Products Company to challenge existing corporate SOPs and to help achieve improvements in overall production performance.
Simulation software has proven highly valuable to modern computer-driven businesses. The growth of Computer-Aided Design technologies in the 1960s enabled engineering and architectural firms to quickly explore new products and novel approaches. The impact was a dramatic reduction in the time and cost associated with then-current best-practices for product innovation and design. Computers became more affordable in the 1990s and software became more powerful. This facilitated widespread acceptance of simulation tools within educational spheres, particularly within universities. Simulators allow an instructional designer to construct realistic tasks or situations that elicit the behaviors a learner needs to function effectively within a domain (Mislevy, 2002). Simulation tools have been used as a means of exposing students to complex concepts and have inspired higher level learning activities including novel research. Through the use of two- and three-dimensional models, the theoretical was more easily examined and the proven more readily understood. Similarly, simulation models can be used for individual or team-based problem solving. In their research, Mislevy, Steinberg, Breyer, Almond, and Johnson (2002) describe the importance of capturing data from a simulator that directly relates to real-world performance and production. This helps instructors to connect the student's interactive simulation experiences with known best-practices for advanced learning.02/23/2010
Ensuring an Accurate Result in an Analytical Instrumentation System Part 1: Understanding and Measuring Time Delay
Process measurements are instantaneous, but analyzer responses never are. From the tap to the analyzer, there is always a delay. Unfortunately, this time delay is often underestimated, misunderstood, or simply not accounted for. Time delay in sample systems is the most common cause of inappropriate results from process analyzers.
In many cases, it is invisible to operators and technicians, who are focused on the necessity of making the sample suitable for the analyzer. It is not unusual for operators to assume that the analytical measurement is instantaneous. In fact, sample systems often fail to achieve the industry standard of a one-minute response.
As a general rule, it's always best to minimize time delay, even for long cycle times, but delays extending beyond the industry standard are not necessarily a problem. The process engineer determines acceptable delay times based on process dynamics.
Delays become an issue when they exceed a system designer's expectations. A poor estimate or wrong assumption about time delay will necessarily result in inferior process control.
This article is intended to enhance understanding of the causes of time delay and to provide the tools required to calculate or approximate a delay within a reasonable margin of error. We will also provide some recommendations for reducing time delay. The potential for delay exists in the following sections of an analytical instrumentation (AI) system: process line, tap and probe, field station, transport line, sample conditioning system, stream switching system, and analyzer.
11/18/2010
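As a first approximation, the transport-line contribution to time delay is simply internal tubing volume divided by volumetric flow rate. The sketch below applies that rule of thumb for a single-phase, incompressible sample; the tubing dimensions and flow rate are illustrative assumptions, and compressible gas samples need a more careful treatment.

```python
import math

def transport_delay_s(tube_id_mm, length_m, flow_ml_min):
    """Approximate transport delay (seconds) in a tubing run:
    internal volume divided by volumetric flow rate.
    Assumes a single-phase, incompressible sample."""
    radius_cm = tube_id_mm / 10.0 / 2.0
    volume_ml = math.pi * radius_cm**2 * (length_m * 100.0)  # cm^3 == mL
    return volume_ml / (flow_ml_min / 60.0)

# Example with assumed figures: a 30 m run of 4 mm ID tubing at
# 500 mL/min holds roughly 377 mL, giving about a 45 s delay in
# that section alone -- most of a one-minute response budget.
```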
Ensuring an Accurate Result in an Analytical Instrumentation System Part 2: Calibrating the Analyzer
In many analytical instrumentation systems, the analyzer does not provide an absolute measurement. Rather, it provides a relative response based on settings established during calibration, which is a critical process subject to significant error. To calibrate an analyzer, a calibration fluid of known contents and quantities is passed through the analyzer, producing measurements of component concentration. If these measurements are not consistent with the known quantities in the calibration fluid, the analyzer is adjusted accordingly. Later, when process samples are analyzed, the accuracy of the analyzer's reading will depend on the accuracy of the calibration process. It is therefore imperative that we understand how error or contamination can be introduced through calibration; when calibration can - and cannot - address a perceived performance issue with the analyzer; how atmospheric pressure or temperature fluctuations can undo the work of calibration; and when and when not to calibrate.
11/18/2010
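A common concrete form of this adjustment is a two-point linear calibration: run a low and a high calibration fluid of known concentration, solve for slope and offset, and correct subsequent raw readings. The sketch below is a generic illustration of that arithmetic, not a procedure for any particular analyzer.

```python
def two_point_calibration(raw_low, raw_high, known_low, known_high):
    """Fit a linear correction from two calibration-fluid measurements.
    raw_*   : the analyzer's raw responses to the two fluids
    known_* : the certified concentrations of those fluids
    Returns a function mapping a raw reading to a corrected value."""
    slope = (known_high - known_low) / (raw_high - raw_low)
    offset = known_low - slope * raw_low
    return lambda raw: slope * raw + offset
```

Note what this also shows: if the calibration fluid itself is contaminated, the error is baked into slope and offset and silently propagates to every subsequent process reading.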
Ensuring an Accurate Result in an Analytical Instrumentation System Part 3: Maintaining a Representative Sample
The objective of an analytical instrumentation (AI) system is to provide a timely analytical result that is representative of the fluid in the process line at the time the sample was taken. If the AI system alters the sample so the analytical result is changed from what it would have been, then the sample is no longer representative and the outcome is no longer meaningful or useful. Assuming the sample is properly taken at the tap, it may still become unrepresentative under any of the following conditions:
- If deadlegs or dead spaces are introduced at inappropriate locations in the AI system, resulting in a "static leak," a bleeding or leaking of the old sample into the new sample;
- If the sample is altered through contamination, permeation, or adsorption;
- If the balance of chemicals is upset due to a partial change in phase; or
- If the sample undergoes a chemical reaction.
This article will review the major issues leading to an unrepresentative sample and provide recommendations on how to avoid a compromised sample. It will discuss deadlegs and dead spaces; component design and placement; adsorption and permeation; internal and external leaks; cross contamination in stream selection; and phase preservation.
11/18/2010
As production runs ever closer to equipment and facility operating limits and new plants come on line in expanding and developing economies, the pressure to design and operate systems more safely and economically is increasing. A key to meeting this goal is having competent people who are knowledgeable and experienced in applying the IEC 61508 and IEC 61511 / ISA 84 functional safety standards. To develop and measure an individual's safety engineering competence, several personnel functional safety certification programs have been created. This paper will discuss why these programs are needed and the benefits they deliver to individuals and companies alike. It will also review the characteristics and differences of the various certification programs on the market today, things to watch out for, and some important questions to ask when selecting a certification program.
03/05/2010
Klargastechnik Deutschland GmbH's equipment and processes help customers address organic biomass fermentation and recovery while supporting electric power co-generation. The result is clean, green electric power that also reduces both solid waste and hazardous toxic gases such as carbon dioxide and methane, which pollute the environment and contribute to global warming.
In order to provide these benefits, the company's equipment and systems rely on highly precise and reliable flow measurement of process waste gases. Measuring biogas flow at several points in the system provides operators with critical information for optimal gas production, control, safety and reporting. However, biogas applications present several challenges in selecting the proper flow meter.
Download this application note to learn how a biogas processing system manufacturer can identify the best flow meter for gas measurements.
10/28/2010
Last year the EPA implemented new regulations entitled "Mandatory Reporting of Greenhouse Gases." The new regulations called for certain facilities emitting 25,000 metric tons or more per year of specified GHGs to provide an annual report of their actual GHG emissions.
It is estimated that more than 10,000 facilities in the US meet the criteria for mandated reporting of greenhouse gases. A full description of the EPA mandate can be found on the EPA's web site.
The EPA's reporting mandate comes in response to the goal of reducing warming gases in the atmosphere to address the consequences of global warming.
The EPA says the present objective of the mandate is simple reporting, not regulating the reduction of GHGs at this time, although bloggers and industry pundits speculate that regulation is likely the next step. It doesn't require a stretch of logic to anticipate that the data collected will frame new regulations to curb the release of GHGs in response to domestic and international pressure to slow the rate of global warming.
The EPA's initial mandate in October of 2009 required 31 industry sectors that collectively equal 85 percent of US GHG emissions, to track and report their emissions. In addition to these original 31 industries, the agency in March of this year proposed to collect emissions data from the petroleum and natural gas sector, as well as from industries that emit fluorinated gases and from facilities that inject and store carbon dioxide underground for the purposes of geologic sequestration for enhanced oil and gas recovery.
Methane is the primary GHG emitted from oil and natural gas systems and is more than 20 times as potent as carbon dioxide at warming the atmosphere, while fluorinated gases are even stronger and can stay in the atmosphere for thousands of years. The EPA says the data collected will allow businesses to track their own emissions, compare them to similar facilities, and identify cost effective ways to reduce their emissions in the future.
10/28/2010
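The arithmetic behind the 25,000-ton threshold is CO2-equivalence: each gas's annual mass is weighted by its global warming potential (GWP) and the results are summed. The sketch below uses illustrative GWP values (25 for methane, consistent with "more than 20 times" CO2); the exact factors prescribed by the rule may differ, so treat these numbers as assumptions.

```python
# Illustrative 100-year global warming potentials (assumed values).
GWP = {"co2": 1, "ch4": 25, "n2o": 298}

def co2e_tonnes(emissions_tonnes):
    """Convert per-gas annual emissions (metric tons) to CO2-equivalent."""
    return sum(GWP[gas] * tons for gas, tons in emissions_tonnes.items())

def must_report(emissions_tonnes, threshold_tonnes=25000):
    """Does a facility cross the reporting threshold of 25,000
    metric tons CO2e per year described in the mandate?"""
    return co2e_tonnes(emissions_tonnes) >= threshold_tonnes
```

The weighting is why a facility venting a modest 1,000 tons of methane can cross the same threshold as one emitting 25,000 tons of CO2.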
When a business expands an existing facility, adds a new location, incorporates an influx of new users, or upgrades an existing infrastructure - it's vital to ensure network readiness and validate infrastructure changes to optimize network performance, minimize user downtime and reduce problems after implementation. This white paper describes a methodology to manage network changes that meets the need for speed of implementation without sacrificing accuracy.
Changes in business place demands on the network - and the network professionals who administer it - to expand and accommodate different users, additional users, remote locations and more. Situations driving this increased need to manage and validate infrastructure changes include:
- Mergers and acquisitions: The network established for 50 users must now accommodate 500.
- Business growth into a new wing or facilities: The current network must handle the increased load of new users, applications and infrastructure.
- New technologies: As part of a corporate-wide upgrade, a new technology must be validated for all users before implementation.
- Upgrading the network: When installing new infrastructure devices, the configuration must be validated as correct.
Regardless of what drives the change, one commonality is the need for rapid and accurate completion of the project. Too often, however, changes are reacted to rather than managed proactively, leading to future problems. In part, this is due to the need for fast deployment: All of these changes must happen as quickly as possible, so shortcuts are taken and steps skipped in the process. Accuracy suffers as a result. And ironically, both the network and IT staffs are slowed down because expanding or upgrading networks without upfront due diligence leads to time-consuming problems and troubleshooting later.
04/28/2010
Delivering increased precision and enabling advanced regulatory control strategies for continuous process control.
Process control in the most generic sense involves continuously controlling an operation or sequence of operations that changes the state of matter; specifically, this includes changing the state of energy, chemical composition, and/or physical dimension of a substance.
As complex programs need to interface with various aspects of a comprehensive production system, Logic Developer Process Edition function blocks from GE Intelligent Platforms add precision and ease of use to reduce the learning curve for engineers, enable higher operational efficiency, and lower development costs.
This white paper helps engineers and programmers explore the power provided by Logic Developer Process Edition function blocks that allow changes in the state of matter to be controlled to generate beneficial outputs that enhance life (e.g., fuel in, electricity out), and illustrates how businesses can use these function blocks to realize advanced regulatory control strategies. It also explains the differences between Logic Developer Process Edition and GE's Proficy Machine Edition PLC Logic Developer programming software, which is optimal for leveraging an integrated development environment for discrete, motion, and multi-target control applications.
04/07/2010
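At the heart of most regulatory control strategies is the PID loop. The sketch below is a generic discrete PID in Python, shown only to illustrate the kind of computation such function blocks encapsulate; it is not GE's implementation, and real function blocks add anti-windup, bumpless transfer, and output limiting on top of this core.

```python
class PID:
    """Minimal discrete PID controller (generic illustration)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """One control cycle: return the controller output."""
        error = setpoint - measurement
        self.integral += error * self.dt                  # accumulate I term
        derivative = (error - self.prev_error) / self.dt  # rate of change
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

Wrapping logic like this in a pre-validated function block is precisely what reduces the learning curve and development cost the paper describes: the engineer configures gains rather than re-implementing the loop.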
Safety & Automation System (SAS) - How the Safety and the Automation Systems Finally Come Together as an HMI
Today we have clear guidelines on how the Safety Instrumented Systems (SIS) and basic Process Control Systems (BPCS) should be separated from a controls and network perspective. But what does this mean to the HMI and the control room design?
Where do Fire & Gas Systems fit into the big picture and what about new Security and Environmental monitoring tasks?
What does the Instrument Engineer need to know about operators and how systems communicate with them?
The evolution of the control room continues as Large Screen Displays provide a big picture view of multiple systems. Do rules and guidelines exist for this aspect of independent protection layers? What are today's best practices for bringing these islands of technology together?
This paper will review the topic and provide advice on a subject on which the books remain silent. Today's practices are haphazard and left to individuals without a systematic design or guidance.
Over the past 20 years the Safety System and the Automation system have been evolving separately. They use similar technologies, but the operator interface needs to be just one system. Unfortunately, due to the nature of the designs, this is not the case.
The automation system has been evolving since the introduction of the DCS and many Human Factor mistakes have been made. As we move towards new standards such as ISA SP 101 a more formal approach to HMI design is being taken.
The past widespread use of black backgrounds, which cause glare issues in the control room and are largely responsible for control room lights being turned down to very low levels, or in some cases off, is giving way to grey backgrounds and a new grayscale graphic standard that replaces bright colors with a plainer grayscale scheme, using color only to attract the operators' attention.
With strong compliance schemes that restrict color usage to just a handful of colors, reserving some colors for important information such as alarm status, it appears that the automation system is being standardized and is starting to take advantage of new technology available to control room designers, such as large screen displays.
01/06/2010