White Papers

  • MESA Global Education Program

    This initiative is the first step in filling a noticeable void in industry - the lack of independent competency training in the Operations Management (MES/MOM) arena. This lack of wide-scale competency is recognized as a major barrier to plant and supply chain optimization and global operations excellence.

    With members in 85 countries, MESA is an independent, objective community of like-minded people and enterprises working to make Operations more reliable, capable and profitable. Some of the foremost experts across the Operations Management landscape are leading the knowledge sharing within the MESA community by offering programs across four continents by mid-2011.

    MESA Certificate of Competency (CoC) for MES/MOM Methodologies: A 4-day, comprehensive program of MES/MOM Methodologies courses aimed at Systems Analysts, Architects, Programmers, Project Managers and Consultants.

    MESA Certificate of Awareness (CoA) for MES/MOM Business Awareness: A 2-day, high-level program of MES/MOM Business Functions courses geared for executives, manufacturing/operations and IT personnel and sales professionals. The CoA courses are higher level, short versions of the CoC program.


    MESA
    01/19/2011
  • Cloud Instrumentation, the Instrument Is In the Cloud

    A short bit of history helps explain why the development of cloud instrumentation is so significant.

    The first instruments, let us call them traditional instruments, were standalone or box-format devices. Users connect sensors directly to the instrument's front panel; the box contains the measurement circuitry and displays the results, initially on analog meters and later on digital displays.

    In many cases, test engineers wanted to have instruments communicate with each other, for instance in a stimulus/response experiment, when a signal generator instructs a digitizer when to start taking samples. This was initially done with serial links, but in the 1970s the Hewlett Packard Interface Bus, which evolved into today's IEEE-488 interface, became extremely popular for connecting instruments.

    The next major breakthrough in measurement technology came with the availability of desktop computers, which made it more cost-effective to run test programs, control instruments, collect data, and let test engineers process and display the results. Plug-in IEEE-488 boards allowed minicomputers and later PCs to perform these tasks.

    Today such interface cards are often not needed, thanks to instruments that communicate with PCs directly over USB or Ethernet, and most recently even over wireless Ethernet.

    Marius Ghercioiu, President of Tag4M at Cores Electronic LLC
    01/14/2011
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 1: Understanding and Measuring Time Delay

    Process measurements are instantaneous, but analyzer responses never are. From the tap to the analyzer, there is always a delay. Unfortunately, this time delay is often underestimated, misunderstood, or not accounted for at all. Time delay in sample systems is the most common cause of inappropriate results from process analyzers.

    In many cases, it is invisible to operators and technicians, who are focused on making the sample suitable for the analyzer. It is not unusual for operators to assume that the analytical measurement is instantaneous. In fact, sample systems often fail to achieve the industry standard of a one-minute response.

    As a general rule, it's always best to minimize time delay, even for long cycle times, but delays extending beyond the industry standard are not necessarily a problem. The process engineer determines acceptable delay times based on process dynamics.

    Delays become an issue when they exceed a system designer's expectations. A poor estimate or wrong assumption about time delay will necessarily result in inferior process control.

    This article is intended to enhance understanding of the causes of time delay and to provide the tools required to calculate or approximate a delay within a reasonable margin of error. We will also provide some recommendations for reducing time delay. The potential for delay exists in the following sections of an analytical instrumentation (AI) system: process line, tap and probe, field station, transport line, sample conditioning system, stream switching system, and analyzer.
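    The arithmetic behind such an estimate is straightforward: each section of tubing contributes a delay roughly equal to its internal volume divided by the volumetric flow rate through it. The following is a minimal sketch of that calculation, not taken from the paper, using hypothetical tubing dimensions, holdup volumes and flow rates.

    ```python
    import math

    def section_delay_s(inner_diameter_mm, length_m, flow_rate_ml_per_min):
        """Delay contributed by one tubing section: internal volume / flow rate."""
        radius_cm = inner_diameter_mm / 10.0 / 2.0
        volume_ml = math.pi * radius_cm ** 2 * (length_m * 100.0)  # cm^3 == mL
        return volume_ml / flow_rate_ml_per_min * 60.0

    # Hypothetical transport line: 4 mm ID, 30 m long, 500 mL/min sample flow.
    transport = section_delay_s(4.0, 30.0, 500.0)
    # Hypothetical sample conditioning system holdup: 50 mL at the same flow.
    conditioning = 50.0 / 500.0 * 60.0

    print(f"Transport line delay: {transport:.0f} s")
    print(f"Conditioning system delay: {conditioning:.0f} s")
    print(f"Total: {transport + conditioning:.0f} s")
    ```

    Summing such per-section estimates over the whole system quickly shows whether a one-minute response target is realistic.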

    Doug Nordstrom and Tony Waters, Swagelok
    11/18/2010
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 2: Calibrating the Analyzer

    In many analytical instrumentation systems, the analyzer does not provide an absolute measurement. Rather, it provides a relative response based on settings established during calibration, which is a critical process subject to significant error. To calibrate an analyzer, a calibration fluid of known contents and quantities is passed through the analyzer, producing measurements of component concentration. If these measurements are not consistent with the known quantities in the calibration fluid, the analyzer is adjusted accordingly. Later, when process samples are analyzed, the accuracy of the analyzer's reading will depend on the accuracy of the calibration process. It is therefore imperative that we understand how error or contamination can be introduced through calibration; when calibration can - and cannot - address a perceived performance issue with the analyzer; how atmospheric pressure or temperature fluctuations can undo the work of calibration; and when and when not to calibrate.
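    To illustrate the relative nature of such a measurement, the sketch below (not from the paper) applies a simple two-point zero/span calibration: the analyzer's raw responses to two calibration fluids of known concentration define a line that later converts raw readings into reported concentrations. Any error captured at calibration time propagates into every subsequent reading. The numbers are hypothetical.

    ```python
    def two_point_calibration(raw_zero, raw_span, conc_zero, conc_span):
        """Return a function mapping a raw analyzer response to a concentration,
        based on responses to two calibration fluids of known concentration."""
        slope = (conc_span - conc_zero) / (raw_span - raw_zero)
        return lambda raw: conc_zero + slope * (raw - raw_zero)

    # Hypothetical calibration: zero fluid reads 0.02, span fluid (80 ppm) reads 7.90.
    to_ppm = two_point_calibration(raw_zero=0.02, raw_span=7.90,
                                   conc_zero=0.0, conc_span=80.0)

    # A later process sample reading is converted through the calibration line.
    print(f"{to_ppm(4.10):.1f} ppm")
    ```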

    Doug Nordstrom and Tony Waters, Swagelok Company
    11/18/2010
  • Game Changer: Visibility, Enablement, and Process Innovation for a Mobile Workforce

    A key aspect of the "Perfect Plant" is having the right information in the right place at the right time. In most manufacturing environments, instrumentation and monitoring are widespread. Pages and pages of graphs and reports describe every operational characteristic and are used by operators and management to steer the plant to optimal performance. However, in the modern plant, the right time to view this information is not when you are standing in front of an operator console. It is when you are in the field, in front of a failing piece of equipment or discussing a problem while on the move. More often than not, the right way to deliver information is by putting it in the hands of a mobile worker.

    The right way to collect information also involves mobility. Remember that 40% to 60% of equipment in the plants and on the shop floors is not instrumented. Optimizing this critical aspect of plant performance depends on mobile field workers. Armed with the right tools, mobile workers can cost-effectively gather data from non-instrumented assets that can be readily analyzed and integrated into existing back-end decision support systems. Bidirectional flow of information to and from mobile workers is a key competitive imperative required to make fully informed decisions regarding the operation of the Perfect Plant.

    Regrettably, for most companies, when it comes to the mobile workforce in manufacturing, vital decisions are too often made in the dark, in an information-poor environment and with little support or historical context for making informed decisions proactively. Field workers - the people who are closest to the equipment and processes, who feel the heat, hear the noises, and see the changes that can be the first indicators of trouble - frequently do their jobs based on individual experiential knowledge acquired over many years.

    This approach makes manufacturers vulnerable to high levels of variability based on individual talent, skills, and training. With the massive investments in automation over the past decades, management often lacks visibility into what these decision makers in the field do and finds it hard to provide guidance to ensure that best practices are executed across field worker roles, production shifts, and assets.

    Charlie Mohrmann, Invensys
    10/25/2010
  • NI Connectivity to Industrial Communications

    National Instruments' programmable automation controllers (PACs) and LabVIEW software can add a wide variety of functionality to existing programmable logic controllers (PLCs) and industrial systems. Machine condition monitoring, high-speed analog measurements and custom vision applications are a few examples of typical PAC applications. Communication between the PLC and PAC systems is extremely important and must be simple, effective and often deterministic. Common methods include basic analog and digital I/O, as well as the widespread standard OLE for Process Control (OPC). For more complex fieldbus systems, a large number of industrial protocols are used in process automation, machine building and other such markets.
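    As a rough illustration of reading PLC data programmatically over OPC, here is a minimal sketch using the open-source Python "opcua" (OPC UA) library as a stand-in; it is not NI's toolchain or classic OPC DA, and the endpoint URL and node ID below are hypothetical.

    ```python
    from opcua import Client

    client = Client("opc.tcp://plc-gateway.example.com:4840")  # hypothetical endpoint
    try:
        client.connect()
        # Hypothetical node ID exposed by the server for a PLC tag.
        tank_level = client.get_node("ns=2;s=Line1.Tank.Level")
        print("Tank level:", tank_level.get_value())
    finally:
        client.disconnect()
    ```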

    National Instruments
    06/23/2010
  • Passive Techniques for Reducing Input Current Harmonics

    Events over the last several years have focused attention on certain types of loads on the electrical system that result in power quality problems for the user and utility alike. Computer power supplies, solid-state lighting ballasts, adjustable speed drives (ASDs), and uninterruptible power supplies (UPSs), all now commonplace in most facilities, are examples of non-linear loads. Adjustable speed drives are also known as Variable Frequency Drives (VFDs) and are used extensively in HVAC systems and in numerous industrial applications to control the speed and torque of electric motors. The number of VFDs and their power rating has increased significantly in the past decade. Hence, their contribution to the total electrical load of a power system is significant and cannot be neglected.

    Non-linear loads are loads in which the current waveform does not have a linear relationship with the voltage waveform. In other words, if the input voltage to the load is sinusoidal and the current is non-sinusoidal, the load is classified as non-linear because of the non-linear relationship between voltage and current. Non-linear loads generate voltage and current harmonics, which can have adverse effects on the equipment used to deliver electrical energy, such as power system transformers, feeders and circuit breakers. Power delivery equipment is subject to higher heating losses due to the harmonic currents consumed by non-linear loads. Harmonics can also have a detrimental effect on emergency or standby power generators, telephones and other sensitive electrical equipment.
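    The usual figure of merit for this distortion is total harmonic distortion (THD): the RMS sum of the harmonic components relative to the fundamental. Below is a minimal sketch of that calculation with hypothetical harmonic amplitudes; the numbers are illustrative only, not taken from the paper.

    ```python
    import math

    def thd_percent(fundamental, harmonics):
        """Total harmonic distortion: RMS of harmonic amplitudes / fundamental, in %."""
        return 100.0 * math.sqrt(sum(h ** 2 for h in harmonics)) / fundamental

    # Hypothetical current spectrum (amperes): fundamental plus 5th, 7th, 11th, 13th harmonics.
    i1 = 100.0
    i_harmonics = [30.0, 12.0, 6.0, 4.0]

    print(f"Current THD: {thd_percent(i1, i_harmonics):.1f}%")
    ```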

    When reactive power compensation in the form of passive power-factor-improving capacitors is used with non-linear loads, resonance conditions can occur that may result in even higher levels of harmonic voltage and current distortion, causing equipment failure, disruption of power service, and fire hazards in extreme conditions.

    The electrical environment has absorbed most of these problems in the past. However, the problem has now reached a magnitude where Europe, the US, and other countries have proposed standards to engineer systems responsibly, considering the electrical environment. IEEE 519-1992 and EN61000-3-2 have evolved to become common requirements cited when specifying equipment on newly engineered projects. Various harmonic filtering techniques have been developed to meet these specifications. The present IEEE 519-1992 document establishes acceptable levels of harmonics (voltage and current) that can be introduced into the incoming feeders by commercial and industrial users. Where there may previously have been little cooperation from manufacturers to meet such specifications, the adoption of IEEE 519-1992 and other similar world standards now commands everyone's attention.

    Mahesh M. Swamy, Yaskawa Electric America
    05/17/2010
  • General Purpose Permanent Magnet Motor Drive without Speed and Position Sensor

    1. Power consumption by electric motors
    Worldwide, about two-thirds of the electricity used in industrial facilities is consumed by motors. According to a DOE report, motor systems are responsible for 63% of all electricity consumed by U.S. industry, and the electricity bill represents more than 97% of total motor operating costs.

    Rapidly increasing energy cost and strong global interest in reducing carbon dioxide emissions are encouraging industry to pay more attention to high-efficiency motors.

    Permanent Magnet (PM) motors have higher efficiency than induction motors because there are no I²R losses in the rotor. But widespread use of PM motors has been discouraged by their price and by the requirement for a speed encoder.
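    A back-of-the-envelope comparison (hypothetical numbers, not from the paper) illustrates why the electricity bill dominates total operating cost and why a few points of efficiency matter.

    ```python
    def annual_energy_cost(shaft_kw, efficiency, hours_per_year, price_per_kwh):
        """Annual electricity cost for a motor delivering shaft_kw at a given efficiency."""
        input_kw = shaft_kw / efficiency
        return input_kw * hours_per_year * price_per_kwh

    # Hypothetical 30 kW load, 6000 h/year, $0.10/kWh.
    induction = annual_energy_cost(30.0, 0.92, 6000, 0.10)
    pm_motor  = annual_energy_cost(30.0, 0.95, 6000, 0.10)

    print(f"Induction motor: ${induction:,.0f}/yr")
    print(f"PM motor:        ${pm_motor:,.0f}/yr  (saves ${induction - pm_motor:,.0f}/yr)")
    ```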

    The recent release of low-cost, high-performance CPUs and the establishment of speed-sensorless control theory (hereinafter referred to as the open-loop vector control method) enable the advent of a general-purpose open-loop PM drive. In this white paper, open-loop PM motor control technology is introduced, and its characteristics and major application fields are described.

    Jun Kang, Yaskawa Electric America
    05/17/2010
  • Understanding the Concepts Behind Short Circuit Current Ratings (SCCR)

    The date of January 1, 2005 sits vividly in the minds of manufacturers in the industrial control panel field. That is the day the National Fire Protection Association's (NFPA) National Electrical Code (NEC) 2005 Article 409 officially went into effect. The code required that the short circuit current rating be clearly marked on industrial control panels so they could be inspected and approved. The markings made it easier to verify proper over-current protection against hazards such as fires and shocks on components or equipment, whether for initial installation or relocation. It was the beginning of an era when things would become a little more complicated, but for all the right reasons: ensuring more safety within the industrial world.

    The main vision of the NFPA is to reduce or limit the burden of fire and other hazards on the quality of life by providing and advocating scientifically based consensus codes and standards, research, training and education. These codes and standards were established to minimize the possibility and effects of fire and other risks. Due to misinterpretations, inconsistencies and advancements in technology over the years, the NFPA has had to update its codes to keep them consistent with existing standards.

    This paper will therefore focus on the changes that occurred due to Article 409, the impact it had, who was affected by the code and how to comply with it. Similar precautions had been enforced in the past, but they were too vague, so people found ways to get around them.

    The biggest change within the article was the new requirements adopted for industrial machinery electrical panels, industrial control panels, some HVAC equipment, meter disconnect switches and various motor controllers. For the purpose of this paper, we will concentrate on industrial control panels, which are specified as assemblies rated for 600V or less and intended for general use. In short, the article states that these products must feature a safe design and be clearly marked with specific information concerning Short Circuit Current Rating (SCCR) to aid in the design, building, installation and inspection of control panels. This way, users can reference and apply all the needed requirements for new products and installations as well as for modifying existing ones.

    Yaskawa Electric America
    05/17/2010
  • Application of PHMSA Rule Control Room Management/Human Factors

    Effective Feb. 2, 2010, the PHMSA rule 49 CFR Parts 192 and 195, Pipeline Safety: Control Room Management/Human Factors, imposes control room management requirements on all regulated gas and hazardous liquid pipelines. This paper gives an overview of the requirements and the timeline to comply.

    TiPS Incorporated
    05/04/2010
  • Introduction to Vibration

    Vibration is a characteristic of virtually all industrial machines. When vibration increases beyond normal levels, it may indicate only normal wear, or it may signal the need for further assessment of the underlying causes or for immediate maintenance action. But how can the plant maintenance professional tell the difference between acceptable, normal vibration and the kind of vibration that requires immediate attention to service or replace troubled equipment? Download this white paper to learn how to tell the difference.
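    A minimal sketch of the kind of screening the paper alludes to (not Fluke's method): compute the RMS of a vibration velocity signal and compare it against an alarm threshold. The samples and the threshold below are hypothetical.

    ```python
    import math

    def rms(samples):
        """Root-mean-square of a sequence of vibration velocity samples (mm/s)."""
        return math.sqrt(sum(x * x for x in samples) / len(samples))

    # Hypothetical velocity samples (mm/s) and a hypothetical alarm threshold.
    velocity_mm_s = [1.2, -0.8, 2.1, -1.9, 1.5, -1.1, 2.4, -2.0]
    ALARM_MM_S = 4.5

    level = rms(velocity_mm_s)
    status = "investigate further" if level > ALARM_MM_S else "within normal range"
    print(f"RMS velocity: {level:.2f} mm/s -> {status}")
    ```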

    Fluke Networks
    03/16/2010
  • Making Permanent Savings Through Active Energy Efficiency

    This white paper argues strongly that efforts to meet the greenhouse gas emissions targets set within the Kyoto Protocol will fail unless Active Energy Efficiency becomes compulsory.

    Active Energy Efficiency is defined as effecting permanent change through measurement, monitoring and control of energy usage. Passive energy efficiency is regarded as the installation of countermeasures against thermal losses, the use of low consumption equipment and so forth.

    It is vital, but insufficient, to make use of energy saving equipment and devices such as low energy lighting. Without proper control, these measures often merely militate against energy losses rather than make a real reduction in energy consumed and in the way it is used.

    Everything that consumes power - from direct electricity consumption through lighting, heating and most significantly electric motors, but also in HVAC control, boiler control and so forth - must be addressed actively if sustained gains are to be made. This includes changing the culture and mindsets of groups of individuals, resulting in behavioral shifts at work and at home, but clearly, this need is reduced by greater use of technical controls.

    Schneider Electric
    03/05/2010
  • Tips For Air/Gas Flow Measurement In High Temperature Environments

    For those who work in or supply many of the process industries, the "heat" is always on plant equipment, even during the cold of winter, and the search continues for plant instrumentation and controls that can withstand rugged operating conditions. Air/gas flow meters are no exception. While performance, ease of installation, maintenance and other criteria are all important, flow meters must always be evaluated according to their operating environment and process conditions. These conditions often range from 500 to 850°F (260 to 454°C) in high-temperature process industries. Download this white paper to learn more about selecting flowmeters for high-temperature process industries.

    FCI
    01/26/2010
  • Electromagnetic Flowmeters: Lining Material for Water Applications

    This paper gives an overview of some basic criteria for choosing lining material for the water / wastewater industry and furthermore provides a short description of the properties, strengths and weaknesses of EPDM, NBR, PUR and Ebonite, i.e. the four types of lining material most commonly used in the water / wastewater industry.

    Basic criteria for choosing lining material

    Due to the functionality of the flowmeter, a non-conductive lining material is imperative, but other requirements vary according to the specific features of the intended application.

    Siemens
    01/25/2010
  • Flowmeters: Discussion of Flowmeter Accuracy Specifications

    Understanding the accuracy of a given flowmeter is important, but it can also be misleading, as different specifications are used to express how accurately a flowmeter actually measures. This paper discusses the different specifications and interprets their impact.

    Why deal with accuracy?

    The reasons for dealing with flowmeter accuracy specifications are many. One important reason is economic: the more accurately a flowmeter can measure, the more money you will save, as the medium is measured with only very little inaccuracy.

    For example, if the medium is expensive, such as oil, it is important to know exactly how much is consumed, which ensures it is being used as efficiently as possible. Another reason is dosing, where a given amount of a medium is added. This must be done with a high level of precision, so accuracy is important in order to dose correctly; this is critical in industries such as pharmaceuticals and chemicals.
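    The practical difference between the common specification styles is easy to see numerically: an error stated as a percentage of rate stays proportional to the reading, while an error stated as a percentage of full scale becomes a larger fraction of the reading as flow drops. The sketch below uses hypothetical figures, not any particular meter's datasheet.

    ```python
    FULL_SCALE = 100.0  # hypothetical flowmeter full scale, m3/h

    def error_pct_of_rate(flow, spec_pct=0.5):
        """Absolute error for a '% of rate' specification."""
        return flow * spec_pct / 100.0

    def error_pct_of_full_scale(flow, spec_pct=0.5, full_scale=FULL_SCALE):
        """Absolute error for a '% of full scale' specification."""
        return full_scale * spec_pct / 100.0

    for flow in (100.0, 50.0, 10.0):
        e_rate = error_pct_of_rate(flow)
        e_fs = error_pct_of_full_scale(flow)
        print(f"{flow:5.0f} m3/h: ±{e_rate:.2f} ({100 * e_rate / flow:.1f}% of reading) vs "
              f"±{e_fs:.2f} ({100 * e_fs / flow:.1f}% of reading)")
    ```

    At full scale the two specifications look identical; at 10% of full scale the "% of full scale" meter's error is ten times larger relative to the actual reading.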

    Siemens
    01/25/2010
  • Isolated Inputs Offer New Application Advantages

    Protection from noise and ground loops due to ISO-Channel architecture.

    Precision measurement systems are often limited in that all inputs are connected to a single ground. Typically, multiplexer input configurations are set up this way, since all signal inputs are connected to the same return. Even differential input configurations use the same ground reference. The result is that accuracy and flexibility can be severely compromised when noise or common-mode voltage is present.

    Crosstalk from one input signal can easily be reflected onto another input. The design move to an A/D converter per channel can help with this problem, but it is not sufficient in many cases.

    To minimize noise and ground loops, some newer systems offer isolation between the input signal ground reference and the computer ground. This effectively separates the computer ground from the measurement portion of the system. But still, there is no isolation between input sensor channels, which is a common source of error and frustration for user applications. Why?

    Data Translation
    01/06/2010
  • Monitoring and Controlling Energy Efficiency in Utilities/W.A.G.E.S. for Cost Reduction

    Customers in all industries are coming under more and more pressure to measure the cost of their utilities. Important drivers for this pressure are the rising cost of energy and various certifications according to EMAS and the ISO 14000 series. Measuring utilities has been neglected in the past, and calibrated technology is necessary for this process. However, many companies only measure their utility consumption at the custody transfer point, and these few measuring points leave room for inaccuracy and poor energy management. By investing in efficient measuring tools, it is possible to set up energy monitoring systems that measure the consumption of each utility close to the point of use. This white paper reviews processes that can help you attain better energy management. Download now to learn more.

    Endress+Hauser
    12/14/2009
  • Take Care Of Your Pumps And They'll Take Care Of You

    The old saying, "an ounce of prevention is worth a pound of cure," may have been coined by process and plant engineers tired of repairing or replacing pumps. Adding a flow switch to the process control loop to monitor pump wet/dry conditions and avoid dry-running helps protect pumps and keeps them running longer.

    FCI
    10/07/2009
  • Thermal Management: Heat Balance Control via IR Sensing

    The ExergenIR speed boost system is a productivity-enhancing concept that, by optimizing and properly controlling process temperatures, can dramatically increase production speed while assuring high product quality.

    Exergen
    09/29/2009