Manufacturing and production processes have been controlled and managed in real time since their inception, because they change in real-time frames. This has been a natural premise of industrial systems from the very beginning.
A major shift in the business of manufacturing has occurred over the past decade, driving the dynamics of production and manufacturing into the real-time domain. Business variables such as energy prices, feedstock prices and even product prices have rapidly transitioned from highly transactional time frames into real-time frames. For example, a decade ago it was not unusual for an industrial plant to establish a contract with its energy supplier that essentially fixed the price over an extended period, often six months or even a year. Today, in most parts of the world, long-term fixed-price energy contracts are no longer offered, and the price of energy can change multiple times in a day. The implications of this transition are clear: industrial business functions must operate in real time to be effective and efficient. Industrial companies that do not move to real-time business operations will be at a severe disadvantage in their marketplace.
10/25/2010
An Introduction to Data Loggers
"I just think the only way we are really going to get to the point we need to get to is to start collecting the real data."
This comment, made in 2009 by New York Public Service Commission chairman Garry Brown, conveys a growing sentiment about the need for solid, objective data on building energy performance.
When it comes to determining actual building performance, it all comes down to data. Data takes the guesswork out of energy management, and drives decisions as to what energy conservation measures need to be taken in a facility.
Portable data loggers are ideal tools for collecting building performance data. These affordable, compact devices can help establish energy performance baselines, and reveal a building's performance under real-world, rather than modeled, circumstances.
They offer fine-tuned visual performance feedback, measuring changes in temperature and energy use when people enter and exit a building, turn lights on and off, or run heating and cooling systems. They can also be used to help ensure that indoor air quality and comfort are maintained in a building.
05/17/2010
The date of January 1, 2005 sits vividly in the minds of manufacturers in the industrial control panel field. That is the day the National Fire Protection Association's (NFPA) National Electrical Code (NEC) 2005 Article 409 officially went into effect. The code required that the short-circuit current rating be clearly marked on industrial control panels before they could be inspected and approved. The markings made it easier to verify proper over-current protection against hazards such as fires and shocks on components or equipment, whether for initial installation or relocation. It was the beginning of an era in which things would become a little more complicated, but for the right reason: ensuring greater safety in the industrial world.
The main vision of the NFPA is to reduce or limit the burden of fire and other hazards on the quality of life by providing and advocating scientifically based consensus codes and standards, research, training and education. These codes and standards were established to minimize the possibility and the effects of fire and other risks. Over the years, misinterpretations, inconsistencies and advances in technology have required the NFPA to update its codes regularly so that they remain consistent with existing standards.
This paper therefore focuses on the changes that Article 409 introduced, the impact it has had, who is affected by the code, and how to comply with it. Similar precautions had been enforced in the past, but they were too vague, and people found ways around them.
The biggest change within the article was the new set of requirements adopted for industrial machinery electrical panels, industrial control panels, some HVAC equipment, meter disconnect switches and various motor controllers. For the purposes of this paper, we will concentrate on industrial control panels, which are specified as assemblies rated for 600V or less and intended for general use. In short, the article states that the above products must feature a safe design and be clearly marked with specific information concerning the Short Circuit Current Rating (SCCR), to aid in the design, construction, installation and inspection of the control panels. This way, users can reference and apply all the necessary requirements for new products and installations as well as for modifications to existing ones.
05/17/2010
This technical white paper will discuss Yokogawa's CENTUM VP DCS (Distributed Control System) product, hereafter referred to as "CENTUM VP", and the extent of its compliance with Part 11 of Title 21 of the Code of Federal Regulations (21 CFR Part 11), the Electronic Records / Electronic Signatures Rule.
CENTUM VP Batch Management is the optional Batch control function for CENTUM VP, which provides recipe management and process management functionality based upon the ISA-88 Batch Control System standard. This whitepaper addresses the use of CENTUM VP and the Batch Management function.
A detailed analysis of Part 11 was performed, the results of which are listed in the Detailed Part 11 Compliance section (section 5) of this document, which supports the compliance of the CENTUM VP system to Part 11.
CENTUM VP is a comprehensive software package containing configurable functions that support Part 11 compliance (audit trails, electronic signatures and electronic records). These compliance attributes support Yokogawa's strategy of supplying FDA-regulated industries with state-of-the-art automation capabilities.
User training and education, as well as the development and use of policies and procedures, are key components of Part 11 compliance and must be established by the user.
04/15/2010
Delivering increased precision and enabling advanced regulatory control strategies for continuous process control.
Process control in the most generic sense involves continuously controlling an operation or sequence of operations that changes the state of matter; specifically, this includes changing the state of energy, chemical composition, and/or physical dimension of a substance.
As complex programs need to interface with various aspects of a comprehensive production system, Logic Developer Process Edition function blocks from GE Intelligent Platforms add precision and ease of use to reduce the learning curve for engineers, enable higher operational efficiency, and lower development costs.
This white paper helps engineers and programmers explore the power of Logic Developer Process Edition function blocks, which allow changes in the state of matter to be controlled to generate beneficial outputs that enhance life (e.g., fuel in, electricity out), and illustrates how businesses can use these function blocks to realize advanced regulatory control strategies. It also explains the differences between Logic Developer Process Edition and GE's Proficy Machine Edition PLC Logic Developer programming software, which is optimal for leveraging an integrated development environment for discrete, motion, and multi-target control applications.
04/07/2010
Using video data to improve both safety and ROI.
Most companies are gathering trillions of bytes of data, day after day, at no small cost, and then doing very little with it. Worse still, the data often is not serving its primary function very cost-effectively.
The "culprit," so to speak, is video surveillance data, the information captured by the video cameras that are used throughout most modern facilities.
But the situation is changing rapidly, thanks to an application called Video Analytics. This white paper looks at the new software technology, and how it can be used to leverage video data for better security and business performance.
03/05/2010
This white paper argues strongly that efforts to meet the greenhouse gas emissions targets set within the Kyoto Protocol will fail unless Active Energy Efficiency becomes compulsory.
Active Energy Efficiency is defined as effecting permanent change through measurement, monitoring and control of energy usage. Passive energy efficiency is regarded as the installation of countermeasures against thermal losses, the use of low consumption equipment and so forth.
It is vital, but insufficient, to make use of energy-saving equipment and devices such as low-energy lighting. Without proper control, these measures often merely mitigate energy losses rather than deliver a real reduction in the energy consumed and in the way it is used.
Everything that consumes power - from direct electricity consumption through lighting, heating and most significantly electric motors, but also in HVAC control, boiler control and so forth - must be addressed actively if sustained gains are to be made. This includes changing the culture and mindsets of groups of individuals, resulting in behavioral shifts at work and at home; clearly, though, this need is reduced by greater use of technical controls.
03/05/2010
Meeting the next great disruptive challenge of the 21st century.
Since the Industrial Revolution our society has been driven by an increasing pace of change in business and technology. Every decade or two we have faced a new and disruptive event that challenges business and creates opportunities: the locomotive, the electric light, the automobile, the airplane, the television and the computer, to name a few.
But the greatest disruptive event of the next 20 years may come not from a single invention, but from the world around us: climate change.
How your business responds to the climate challenge can either differentiate you from the competition and launch new and successful products, or make you the focus of consumer backlash and eroding margins.
This paper will explore the environment as a disruptive force in business, examine the consequences of inaction, and propose the benefits of a proactive environmental policy. It will describe increasing levels of investment that a small company, an enterprise or an industry can make to address the challenge and develop a business case. The paper ends with a concrete roadmap to lead you from today's "business as usual" to a long-term sustainable approach to growing a Green corporation.
After reading this paper, business leaders in every industry will have an understanding of how the environment will impact their business, how to make changes to mitigate the negative impacts and how to explore business opportunities in this new and exciting sustainable world.
03/05/2010
Moore Industries believes it is vitally important to have third-party SIS evaluation for plant safety provided by a company with global coverage and reputation. Earlier designs for process control and safety systems typically used "good engineering practices and experience" as their guidelines. As safety awareness grew, new standards evolved. International standards such as IEC 61508/61511 and U.S.-born standards like ANSI/ISA84 require the use of more sophisticated guidelines for implementing safety. Unfortunately for manufacturers, compliance with IEC 61508 requires enormous documentation. In addition, more complex products require a greater depth of analysis. Software-based products such as those from Moore Industries, with their inherent programmability and flexibility, are far more complex than previous-generation single-function analog circuits.
Some companies are actively attempting to bypass this vital third-party certification by proclaiming self-certification to IEC 61508. This is not in the best interest of end users or the safety industry in general. Self-certification is analogous to proclaiming compliance with a hazardous-area approval (such as Intrinsically Safe) without third-party testing.
Moore Industries has been working for many years with customers who require products for safety systems, including those compliant with worldwide safety standards such as ANSI/ISA 84 and IEC 61508/61511. To assist customers in determining if their instruments are appropriate for specific safety systems, Moore Industries has been providing Failure Modes, Effects and Diagnostic Analysis (FMEDA) reports for key products, and has been involved in the evolution of the IEC 61508 standard. As the standard has become more widely recognized and adopted by customers worldwide, it became clear that end users were looking for products designed to IEC 61508 from their initial concept. Customers are demanding not only compliance with the standards but also verification from an independent third-party agency such as TÜV Rheinland.
03/03/2010
Today's control system engineers face competing design demands: increase embedded system performance and functionality, without sacrificing quality or breaking the budget. It is difficult to meet these challenges using traditional design and verification approaches.
Without simulation it is impossible to verify a control design until late in the development process, when hardware prototypes become available. This is not an insurmountable problem for simpler designs with predictable system behavior, because there are fewer sources of error in simpler control algorithms, and those errors can often be resolved by tuning the controller on the hardware prototype.
Today's multidomain designs combine mechanical, electrical, hydraulic, control, and embedded software components. For these systems, it is no longer practical to delay verification until late in the development process. As system complexity grows, the potential for errors and suboptimal designs increases. These problems are easiest to address when they are identified early in the development process. When design problems are discovered late, they are often expensive to correct and require time-consuming hardware fixes. In some cases the hardware simply cannot be changed late in the development process, resulting in a product that fails to meet its original specifications.
Traditional verification methods are also inadequate for testing all corner cases in a design. For some control applications, it is impractical or unsafe to test the full operating envelope of the system on hardware.
03/02/2010
Real-time performance monitoring to identify poorly or under-performing loops has become an integral part of preventive maintenance. Rising energy costs and increasing demand for improved product quality are among the driving forces. Automatic process control solutions that incorporate real-time monitoring and performance analysis are fulfilling this market need. While many software solutions display performance metrics, it is important to understand the purpose and limitations of the various performance assessment techniques, since each metric signifies very specific information about the nature of the process.
This paper reviews performance measures from simple statistics to complicated model-based performance criteria. By understanding the underlying concepts of the various techniques, readers will gain an understanding of the proper use of performance criteria. Basic algorithms for computing performance measures are presented using example data sets. An evaluation of techniques with tips and suggestions provides readers with guidance for interpreting the results.
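The paper's own algorithms are not reproduced here, but as a minimal sketch of the idea (the function names and the caller-supplied benchmark variance are illustrative assumptions, not taken from the paper), simple batch statistics and a variance-ratio performance index for a recorded series of control errors might look like this:

```python
from statistics import mean, pstdev


def loop_stats(errors):
    """Simple batch statistics for a recorded series of control errors
    (setpoint minus measurement)."""
    return {"mean": mean(errors), "stdev": pstdev(errors)}


def performance_index(errors, benchmark_variance):
    """Ratio of a benchmark error variance to the actual error variance.
    Values near 1.0 suggest the loop is performing close to the benchmark;
    values near 0.0 suggest substantial room for improvement."""
    actual_variance = pstdev(errors) ** 2
    return benchmark_variance / actual_variance
```

A Harris-style index would estimate the benchmark from a minimum-variance model identified from process data; here the benchmark is simply supplied by the caller to keep the sketch self-contained.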
Over the past two decades, process control performance monitoring software has become an important tool in the control engineer's toolbox. Still, the number of performance tests and statistics that can be calculated for any given control loop can be overwhelming. The problem with controller performance monitoring is not the lack of techniques and methods. Rather, the problem is the lack of guidance as to how to turn statistics into meaningful and actionable information that can be applied to improve performance.
The performance analysis techniques discussed in this paper are separated into three sections. The first section details methods for identifying process characteristics using batches of existing data. The second section outlines methods used for real-time or dynamic analysis of streaming process data. These are vital techniques for the timely identification and interpretation of changing process behavior and deteriorating loop performance. The third section outlines techniques that aid in the identification of interacting control loops.
02/23/2010
This paper discusses how portable data logging technology can be used to measure, record, and document the performance of geothermal heat pumps, and provides specific case study examples of how the technology is being applied in geothermal system monitoring applications.
12/10/2009
The paper provides an overview of the controller types and enterprise computer systems that can be connected with these appliances.
12/09/2009
It's now time to upgrade to a new HART Communicator. Your old handheld HART Communicator is obsolete and receives limited support. You shop around and find that it costs between $3,000 and $7,000 for a new handheld HART Communicator. A Google search reveals a PC-based alternative. Will the PC alternative perform as required? What should you look for?
The PC-based HART Communicator has been around for many years, but until recently it has not been able to replace the handheld HART Communicator. The main reason is that it could not communicate at the DD level with all the devices in the DD library. Recent developments have eliminated that problem, and now is a good time to review the capabilities of a PC-based HART Communicator.
12/01/2009
The Effect on 10/100 Industrial Ethernet Switch Performance.
The Anixter Infrastructure Solutions Lab wanted to determine what effect the new TIA-1005 industrial cabling infrastructure standard would have on the data throughput performance of real Ethernet data packets running between SmartBits test cards and various manufacturers' 10/100 Ethernet switches in a real-world simulation. The test included five (5) different IP20-rated switches and three (3) different enterprise rack-mounted switches using various cabling channels made from both Category 5e and Category 6 cabling components and connector pairs that are allowable under the standard. The test premise also held that the effect of cabling channel interference would vary from port to port and switch to switch because of variations in transmitter and receiver performance.
10/28/2009
The Need for Wireless Monitoring: An Overview
There is a real, ongoing need for monitoring of valve positions (actuated or manual) in the process line. Malfunctioning of a valve can result in danger to human health and safety, affect yields, and generate environmental risks. In some industries, regulation requires constant recording of valve position. Currently, such monitoring is done through wired switch boxes. Each such device requires data transmission and power cabling. Not only are these cables costly to manufacture and install, they are also one of the most frequent sources of failures in the process line, because they are very often exposed to harsh environmental conditions. In fact, it is right here, at the field device level, where the majority of problems with wires really exist.
10/09/2009
Industrial application developers have had two main options for interacting with production processes via programmable logic controllers (PLCs): they can buy a preprogrammed, monolithic, shrink-wrapped human machine interface (HMI), complete and ready to go, or they can build their own custom solutions.
Shrink-wrapped HMI software packages are appealing because many complex tasks are hidden from you. Purchase the development software from an authorized distributor, load it into your development PC and then configure, debug and test. Then, just deploy the necessary runtime applications, data servers and configuration files on to your target PC or PCs. What could be easier?
But cookie-cutter HMI software solutions might not necessarily be the best or most practical approach for your specific industrial applications.
For one thing, while the shrink-wrapped HMI software packages enable connections to other vendors' devices, software, and systems via OPC or other standards, such connectivity is seldom adequate for high security or real-time control. And no matter how advanced the integration technology the package uses, you will end up lagging behind the technology curve. For example, if you had bought a package using the distributed common object model (DCOM) and wanted to benefit from advances in security and robustness that Microsoft had made since you bought the package, you would have to buy a new package. Moreover, the monolithic nature of the shrink-wrapped offerings often makes it difficult to embed third-party capabilities directly into your solution, thus limiting your options further.
Then there's training. Because the development environment and behavior of each HMI vendor's software varies, you'll need to acquire specialized skills to accomplish similar tasks. Training courses, material costs and schedules also vary by HMI publisher and many times are offered only through exclusive distributor channels. You could consider hiring outside help, but because of the specialized training and experience, the talent pool can be relatively shallow and therefore proportionately expensive.
And for many, the cost of multiple deployments is an even bigger issue. Before you can actually deploy your solution to PCs, portable devices, or Web servers, you typically have to pay for additional runtime software licenses. If you have more than a couple of users, this can amount to a considerable expense, often making this approach cost-prohibitive, especially if you are paying for more functionality than you actually need.
Finally, there are the intangibles. As well-designed and flexible as these shrink-wrapped solutions might be, they almost always force compromises that would not be necessary if the solution were custom built for your specific applications. Whether that is a matter of function or just pride, it can be a significant factor in your satisfaction with the resulting interface.
09/10/2009
Ethernet for industrial communications is growing rapidly in factory automation, process control and SCADA systems. The ODVA EtherNet/IP network standard is gaining popularity as a preferred industrial protocol. Plant engineers are recognizing the significant advantages that Ethernet-enabled devices provide such as ease of connectivity, high performance and cost savings. While EtherNet/IP has many advantages, cable installation is often expensive, and communications to remote sites or moving platforms may not be reliable or cost-effective.
Wireless Ethernet technologies have emerged that can now reliably reduce network costs while improving plant production. However, applying these technologies is not a simple matter, as industrial Ethernet systems vary greatly in terms of bandwidth requirements, response times and data transmission characteristics. This paper will explore applying wireless technologies to EtherNet/IP based networks for industrial automation systems.
09/03/2009
Statement for the Record, July 21, 2009 Hearing before the Subcommittee on Emerging Threats, Cybersecurity, Science and Technology.
I appreciate the opportunity to provide the following statement for the record. I have spent more than thirty-five years working in the commercial power industry designing, developing, implementing, and analyzing industrial instrumentation and control systems. I hold two patents on industrial control systems, and am a Fellow of the International Society of Automation. I have performed cyber security vulnerability assessments of power plants, substations, electric utility control centers, and water systems. I am a member of many groups working to improve the reliability and availability of critical infrastructures and their control systems.
07/22/2009
This white paper provides the history of the Six Sigma symbol and explains the Six Sigma concept, implementation, calculation and more.
Product variation and defects undercut customer loyalty as well as company profits. Six Sigma is a rigorous, disciplined, data-driven methodology that was developed to enhance product quality and company profitability by improving manufacturing and business processes.
Six Sigma uses statistical analysis to quantitatively measure how a process is performing. That process can involve manufacturing, business practices, products, or service. To be defined as Six Sigma means that the process does not produce more than 3.4 defects per million opportunities (DPMO) which translates to 99.9997% efficiency.
A Six Sigma defect is considered anything that can cause customer dissatisfaction, such as being outside of customer specifications. A Six Sigma opportunity is the total number of chances for a defect to occur.
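As a quick illustration of the arithmetic behind these definitions (the helper names below are ours, introduced only for this sketch), DPMO and the yield it implies can be computed directly:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000


def process_yield(dpmo_value):
    """Percentage of opportunities that are defect-free, implied by a DPMO figure."""
    return (1 - dpmo_value / 1_000_000) * 100


# At the Six Sigma level a process produces at most 3.4 DPMO,
# which corresponds to a yield of about 99.99966%.
```

For example, 17 defects found across 1,000 units with 5 opportunities each gives 3,400 DPMO, well short of the Six Sigma threshold.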
Six Sigma Concept
The Six Sigma concept was developed by Motorola in 1986 with the stated goal of improving manufacturing processes and reducing product defects and variation. The underlying goal was to achieve near-perfect quality, with 99.9997% of variable values within specifications.
07/17/2009