White Papers

1-20 of 68
  • Purge and Pressurization White Paper

    Purge and pressurization is an alternative hazardous location protection concept that allows lesser-rated equipment to be used in hazardous areas by segregating the equipment from the hazardous material.

  • Measuring Molasses with a WORM

    A rigid sensor couldn't measure temperature accurately in a conventional thermowell, but a custom thermowell and a WORM RTD flexible sensor measure the molasses perfectly.

    Jayson Sorum, Moore Industries
  • Get Rid of Rigid: Get the WORM Flexible Temperature Sensor

    The WORM goes where no other temperature sensor has gone before, literally! With its flexible design, it is able to fit in places that rigid sensors can't. It provides accurate readings while being extremely easy to maintain. Read this white paper to learn how the WORM provides a "one size fits all" temperature sensing solution that saves you time and money.

    Moore Industries
  • Big Data: Improving Decisions through Smart Instrumentation

    Analyzing "Big Data" provides decision makers with tools to make better operational decisions, improving efficiency, costs, and security, and ultimately contributing to greater profits. Download this white paper to learn the role of smart instrumentation, and find out how data is not only shaping business but changing the future of instrumentation.

  • Types of Pressure: When and Why Are They Used

    Without measurement there is no control. As with any type of measurement, results need to be expressed in a defined and clear way to allow everyone to interpret and apply those results correctly. Accurate measurements and good measurement practices are essential in industrial automation and process environments, as they have a direct effect on the success of the desired outcome. Pressure, the measure of a force on a specified area, is a straightforward concept; however, depending on the application, there are many different ways of interpreting the force measurement. This white paper will identify the various units of pressure measurement, while discussing when and why certain pressure measurements are used in specific applications.
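As a rough illustration of the unit relationships a paper like this covers, the sketch below converts between common pressure units and between gauge and absolute readings. The conversion factors are standard physical constants; the function names and example values are our own, not taken from the paper.

```python
# Illustrative sketch (not from the paper): converting among common
# pressure units and between gauge and absolute readings.
# Conversion factors are standard; function names are our own.

KPA_PER_PSI = 6.89476   # 1 psi = 6.89476 kPa
ATM_KPA = 101.325       # one standard atmosphere, in kPa

def psi_to_kpa(psi):
    """Convert pounds per square inch to kilopascals."""
    return psi * KPA_PER_PSI

def gauge_to_absolute_kpa(gauge_kpa, atmospheric_kpa=ATM_KPA):
    """Gauge pressure is referenced to atmosphere; absolute to a vacuum."""
    return gauge_kpa + atmospheric_kpa

# A 30 psi gauge reading expressed as absolute pressure in kPa:
reading_kpa = psi_to_kpa(30.0)                     # ~206.8 kPa gauge
absolute_kpa = gauge_to_absolute_kpa(reading_kpa)  # ~308.2 kPa absolute
```

The gauge/absolute distinction matters because the same physical pressure yields different numbers depending on the reference; picking the wrong one shifts every reading by one atmosphere.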

  • Weigh Your Instrumentation Options: Switch, Transmitter or Hybrid?

    For decades, process instrumentation specifiers have faced the decision whether to use a mechanical switch or a continuous transmitter for a given application. Either type of instrument can be used to effectively control industrial processes and protect equipment and personnel -- and each has associated pros and cons. Application specifics typically drive decision-making, dictating which approach is most effective from performance, cost and lifecycle support perspectives.

  • Choosing the Right Pressure Sensor

    Today's pressure sensors are called on to work within the harshest of environments - with the most hostile and corrosive media - or sometimes to take the simplest of pressure readings.

  • Discussion of Flowmeter Accuracy Specifications

    Understanding the accuracy of a given flowmeter is important, but accuracy figures can be misleading, because different specifications are used to express how accurately a flowmeter actually measures. This paper discusses the different specifications and interprets their impact.

    Why deal with accuracy?

    There are several reasons to care about flowmeter accuracy specifications. One important reason is economic: the more accurately a flowmeter measures, the more money you save, because only very little of the medium is measured inaccurately.

    Another reason is dosing, where a given amount of a medium is added. This must be done with a high level of precision, so accuracy is essential for dosing correctly. This is critical in certain industries, such as pharmaceuticals or chemicals.

    A third reason is billing. With good accuracy, you know exactly how much fluid flows into the process, so you can determine the right price for the product and bill customers correctly.

    Knowing how much flows through your system is therefore paramount to running a profitable, solid business. You need to rely on a precise measurement with good accuracy. However, good accuracy must be maintained not just in one measurement, but in every measurement over time.
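The difference between accuracy specifications can be sketched numerically. A common source of confusion is percent-of-reading versus percent-of-full-scale error: the same "1%" figure implies very different absolute errors at low flow. All numbers and names below are made-up examples for illustration, not figures from the paper.

```python
# Illustrative sketch: percent-of-reading vs percent-of-full-scale
# accuracy specifications for a hypothetical flowmeter.
# All numbers here are made-up examples, not figures from the paper.

FULL_SCALE = 100.0  # assumed full-scale flow, m3/h

def error_percent_of_reading(flow, spec_pct=1.0):
    """Absolute error when the spec is a percent of the actual reading."""
    return flow * spec_pct / 100.0

def error_percent_of_full_scale(flow, spec_pct=1.0, full_scale=FULL_SCALE):
    """Absolute error when the spec is a percent of full scale,
    regardless of how low the actual flow is."""
    return full_scale * spec_pct / 100.0

# At 10 m3/h, a "1% of full scale" meter may be off by 1.0 m3/h
# (10% of the reading), while "1% of reading" allows only 0.1 m3/h:
low_flow = 10.0
err_reading = error_percent_of_reading(low_flow)        # 0.1 m3/h
err_full_scale = error_percent_of_full_scale(low_flow)  # 1.0 m3/h
```

This is why two meters quoting the same percentage can perform very differently at the low end of their range.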

  • New Differential Pressure Sensor Incorporates LVDT Technology to Create More Environmentally-Resistant, Dependable and Economical Pressure Sensing Solution

    Differential pressure (dP) sensors with electronic signal processing are increasingly being used to monitor flow, filter condition and level. Since these devices offer linear and accurate output, they are also replacing the differential pressure switch, which supports only on-off conditions and is useless for closed-loop control systems. These dP sensors are often configured with expensive valves and fluid-filled remote seals for added protection against corrosive media, radiation and/or extreme media temperature ranges when operating in demanding environments. In cold ambient environments, especially at temperatures below -4 deg F (-20 deg C), the sensor needs to be heated, either by a trace heater or within a heated enclosure, to keep the dP sensor operating. In addition to being expensive, these valves and seals tend to be bulky and require time to install and maintain. In many critical applications, such as food and pharmaceuticals, fill fluids are a serious concern due to process contamination. In gaseous systems such as hydrogen, oxygen and semiconductor applications, fluid-filled sensors are being banned, since leakage of fluid into the process could lead to an explosion and serious safety issues.

    A new series of LVDT (linear variable differential transformer) based oil-less dP sensors with a dual-channel ASIC (application-specific integrated circuit) has been developed that can operate across a wide range of corrosive materials, radiation and temperature without any oil filling or bulky sealing systems. By combining proven LVDT technology with digital compensation, the pressure sensors deliver friction-free operation, environmental robustness and unlimited mechanical life. By selecting the diaphragm thickness and material properties, a variety of spans can be achieved; Table 1 shows the dP ranges that can be produced using LVDT technology.

    American Sensor Technologies
  • Proper Employment of Guided Wave Radar in Steam Loops

    This paper will address the application of Guided Wave Radar (GWR), also known as Time Domain Reflectometry (TDR), in your steam loop. Included will be discussions of how this technology functions and differs from more traditional forms of level indication.

    The heart and soul of any boiler-based power generation system is the steam loop or circuit. Without the proper availability of water in this system, efficiency suffers. In more extreme circumstances, damage to other components from either too much water (carryover) or too little water (low water condition) will occur and shorten a boiler's lifespan. In the most extreme situation, a dry fire accident could occur, resulting in severe damage and personal injury.

    Level indication in the steam loop is critical, yet the methods employed to measure it have been slow to evolve or change. Some of that has been due to code requirements (PG-60 of the ASME Boiler and Pressure Vessel Code) or a simple lack of confidence in "new" technology. It has only been in the past 15 to 20 years (recent in terms of boiler/steam loop history) that technologies such as magnetic level gages or differential pressure devices have been used in place of direct reading glass gauges on applications such as feedwater tanks, high pressure preheaters or hotwells. These same devices are now utilized for drum level indication as well. The most recent addition to the technology basket for steam loop applications has been Guided Wave Radar. Used in conjunction with other technologies, it is seen as a reliable, cost-effective choice for redundant level measurement in all steam loop applications, including drum level.

  • OptoFluidic: Real-time Optical Analysis

    Optofluidics is a relatively new interdisciplinary technology that combines optics and fluidics. It extends to both the realization of optical effects and components and the analysis of fluids in motion. Fluids comprise liquids and gases, but also bulk solid materials that flow through pipelines and their fittings.

    This technology furnishes diagnostic and analytical methods in which certain characteristics, constituents or parameters of fluids in motion such as density, volume, colour, or content of noxious substances are detected and evaluated. For this purpose, the fluid is charged with information that can be subsequently read by optical components. The fluid thus becomes a medium that carries in itself the code for optical analysis. Devices such as cameras and sensors visualize the diagnosis in real time, without the process flow having to be interrupted. In future, optofluidic analysis methods could replace time-consuming sampling and stabilize process flow, while reducing the number of components required and maintenance costs.

  • The Insider's Guide to Applying Miniature Solenoid Valves

    Equipment designers frequently must incorporate miniature solenoid valves into their pneumatic designs. These valves are important components of medical devices and instrumentation as well as environmental, analytical, and similar product applications. However, all too often, designers find themselves frustrated. They face compromise after compromise. Pressure for increasingly miniaturized devices complicates every step of the design and valve selection process. And missteps can wreak havoc. How do designers balance the needs for reliability, extended service life, and standards compliance against often-contradictory performance requirements such as light weight, high flow, and optimum power use?

    This report consolidates the expert views of designers and manufacturers with wide experience applying miniature solenoid valves for myriad uses across multiple industries. It presents a true insider's guide to which requirements are critical for common applications. It also highlights new valve technologies that may lessen or eliminate those troubling compromises.

  • MESA Global Education Program

    This initiative is the first step in filling a noticeable void in industry - the lack of independent competency training in the Operations Management (MES/MOM) arena. This lack of wide-scale competency is recognized as a major barrier to plant and supply chain optimization and global operations excellence.

    With members in 85 countries globally, MESA is an independent, objective community of like-minded people and enterprises working to make Operations more reliable, capable and profitable. Some of the foremost experts across the Operations Management landscape are leading the knowledge sharing within the MESA community by offering programs across 4 continents by mid-2011.

    MESA Certificate of Competency (CoC) for MES/MOM* Methodologies: A 4-day, comprehensive program of MES/MOM Methodologies courses aimed at Systems Analysts, Architects, Programmers, Project Managers and Consultants.

    MESA Certificate of Awareness (CoA) for MES/MOM Business Awareness: A 2-day, high-level program of MES/MOM Business Functions courses geared for executives, manufacturing/operations and IT personnel and sales professionals. The CoA courses are higher level, short versions of the CoC program.


  • Cloud Instrumentation, the Instrument Is In the Cloud

    A short bit of history helps to understand why the cloud instrumentation development is so significant.

    The first instruments, let us call them traditional instruments, were standalone or "box" format. Users connected sensors directly to the box instrument's front panel, which contained the measurement circuitry and displayed the results, initially on analog meters and later on digital displays.

    In many cases, test engineers wanted to have instruments communicate with each other, for instance in a stimulus/response experiment, when a signal generator instructs a digitizer when to start taking samples. This was initially done with serial links, but in the 1970s the Hewlett Packard Interface Bus, which evolved into today's IEEE-488 interface, became extremely popular for connecting instruments.

    The next major breakthrough in measurement technology came with the availability of desktop computers, which made it more cost effective to run test programs, control instruments as well as collect data and allow test engineers to process and display data. Plug-in IEEE-488 boards allowed minicomputers and later PCs to perform these tasks.

    Today such interface cards are often not needed, thanks to instruments that communicate with PCs directly over USB or Ethernet, and most recently even over wireless Ethernet schemes.

    Marius Ghercioiu, President of Tag4M at Cores Electronic LLC
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 1: Understanding and Measuring Time Delay

    Process measurements are instantaneous, but analyzer responses never are. From the tap to the analyzer, there is always a delay. Unfortunately, this time delay is often underestimated, misunderstood, or not accounted for at all. Time delay in sample systems is the most common cause of inappropriate results from process analyzers.

    In many cases, it is invisible to operators and technicians, who are focused on the necessity of making the sample suitable for the analyzer. It is not unusual for operators to assume that the analytical measurement is instantaneous. In fact, sample systems often fail to achieve the industry standard of a one-minute response.

    As a general rule, it's always best to minimize time delay, even for long cycle times, but delays extending beyond the industry standard are not necessarily a problem. The process engineer determines acceptable delay times based on process dynamics.

    Delays become an issue when they exceed a system designer's expectations. A poor estimate or wrong assumption about time delay will necessarily result in inferior process control.

    This article is intended to enhance understanding of the causes of time delay and to provide the tools required to calculate or approximate a delay within a reasonable margin of error. We will also provide some recommendations for reducing time delay. The potential for delay exists in the following sections of an analytical instrumentation (AI) system: process line, tap and probe, field station, transport line, sample conditioning system, stream switching system, and analyzer.
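One such calculation can be sketched simply: the transport delay in a length of sample tubing is approximately its internal volume divided by the volumetric flow rate. The sketch below assumes plug flow, and the line dimensions are our own illustrative assumptions, not figures from the article.

```python
import math

# Illustrative sketch: approximate time delay in a sample transport line
# as internal volume / volumetric flow rate, assuming plug flow.
# The line dimensions below are assumptions for illustration only.

def transport_delay_s(inner_diameter_mm, length_m, flow_l_per_min):
    """Seconds for sample to traverse the line at the given flow rate."""
    radius_m = (inner_diameter_mm / 1000.0) / 2.0
    volume_l = math.pi * radius_m ** 2 * length_m * 1000.0  # m3 -> litres
    return volume_l / flow_l_per_min * 60.0

# e.g. a 4 mm ID, 50 m transport line at 1 L/min:
delay = transport_delay_s(4.0, 50.0, 1.0)  # roughly 38 seconds
```

Even this crude estimate makes the trade-offs visible: doubling the line diameter quadruples the delay at the same flow rate, which is why transport lines are kept as short and narrow as practical.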

    Doug Nordstrom and Tony Waters, Swagelok
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 2: Calibrating the Analyzer

    In many analytical instrumentation systems, the analyzer does not provide an absolute measurement. Rather, it provides a relative response based on settings established during calibration, which is a critical process subject to significant error. To calibrate an analyzer, a calibration fluid of known contents and quantities is passed through the analyzer, producing measurements of component concentration. If these measurements are not consistent with the known quantities in the calibration fluid, the analyzer is adjusted accordingly. Later, when process samples are analyzed, the accuracy of the analyzer's reading will depend on the accuracy of the calibration process. It is therefore imperative that we understand how error or contamination can be introduced through calibration; when calibration can - and cannot - address a perceived performance issue with the analyzer; how atmospheric pressure or temperature fluctuations can undo the work of calibration; and when and when not to calibrate.
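The relative-response idea can be illustrated with a minimal two-point (zero and span) calibration sketch. This is a generic structure assuming a linear analyzer response; the names and numbers are ours, not Swagelok's procedure.

```python
# Illustrative sketch: two-point (zero/span) calibration of an analyzer
# whose raw response is assumed linear in concentration.
# Names and example numbers are ours, not from the paper.

def fit_two_point(raw_zero, raw_span, known_zero, known_span):
    """Return (gain, offset) mapping raw response to concentration."""
    gain = (known_span - known_zero) / (raw_span - raw_zero)
    offset = known_zero - gain * raw_zero
    return gain, offset

def to_concentration(raw, gain, offset):
    """Apply the calibration to a raw analyzer response."""
    return gain * raw + offset

# Calibration fluids of known concentration 0 and 100 ppm produce
# raw responses of 0.05 and 2.05 (arbitrary units):
gain, offset = fit_two_point(0.05, 2.05, 0.0, 100.0)
conc = to_concentration(1.05, gain, offset)  # process sample -> 50.0 ppm
```

The sketch also shows why calibration errors propagate: any contamination in the calibration fluid shifts the fitted gain and offset, and every subsequent process reading inherits that shift.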

    Doug Nordstrom and Tony Waters, Swagelok Company
  • Game Changer: Visibility, Enablement, and Process Innovation for a Mobile Workforce

    A key aspect of the "Perfect Plant" is having the right information in the right place at the right time. In most manufacturing environments, instrumentation and monitoring is widespread. Pages and pages of graphs and reports describe every operational characteristic and are used by operators and management to steer the plant to optimal performance. However, in the modern plant, the right time to view this information is not when you are standing in front of an operator console. It is when you are in the field, in front of a failing piece of equipment or discussing a problem while on the move. More often than not, the right way to deliver information is by putting it in the hands of a mobile worker.

    The right way to collect information also involves mobility. Remember that 40% to 60% of equipment in the plants and on the shop floors is not instrumented. Optimizing this critical aspect of plant performance depends on mobile field workers. Armed with the right tools, mobile workers can cost-effectively gather data from non-instrumented assets that can be readily analyzed and integrated into existing back-end decision support systems. Bidirectional flow of information to and from mobile workers is a key competitive imperative required to make fully informed decisions regarding the operation of the Perfect Plant.

    Regrettably for most companies, when it comes to the mobile workforce in manufacturing, too often, vital decisions are made in the dark, in an information-poor environment and with little support or historical contextual information to make informed decisions proactively. Field workers - the people who are closest to the equipment and processes, who feel the heat, hear the noises, and see the changes that can be the first indicators of trouble - frequently do their jobs based on individual experiential knowledge acquired over many years.

    This approach makes manufacturers vulnerable to high levels of variability based on individual talent, skills, and training. Even with the massive investments in automation over the past decades, management often lacks visibility into what these decision makers in the field do, and finds it hard to provide the guidance that ensures best practices are executed across field worker roles, production shifts, and assets.

    Invensys, Charlie Mohrmann
  • Passive Techniques for Reducing Input Current Harmonics

    Events over the last several years have focused attention on certain types of loads on the electrical system that result in power quality problems for user and utility alike. Equipment that has become commonplace in most facilities -- including computer power supplies, solid-state lighting ballasts, adjustable speed drives (ASDs), and uninterruptible power supplies (UPSs) -- are all examples of non-linear loads. Adjustable speed drives, also known as Variable Frequency Drives (VFDs), are used extensively in HVAC systems and in numerous industrial applications to control the speed and torque of electric motors. The number of VFDs and their power ratings have increased significantly in the past decade. Hence, their contribution to the total electrical load of a power system is significant and cannot be neglected.

    Non-linear loads are loads in which the current waveform does not have a linear relationship with the voltage waveform. In other words, if the input voltage to the load is sinusoidal and the current is non-sinusoidal then such loads will be classified as non-linear loads because of the non-linear relationship between voltage and current. Non-linear loads generate voltage and current harmonics, which can have adverse effects on equipment that are used to deliver electrical energy. Examples of power delivery equipment include power system transformers, feeders, circuit breakers, etc. Power delivery equipment is subject to higher heating losses due to harmonic currents consumed by non-linear loads. Harmonics can have a detrimental effect on emergency or standby power generators, telephones and other sensitive electrical equipment.

    When reactive power compensation in the form of passive power-factor-improving capacitors is used with non-linear loads, resonance conditions can occur that may result in even higher levels of harmonic voltage and current distortion, causing equipment failure, disruption of power service, and, in extreme conditions, fire hazards.

    The electrical environment has absorbed most of these problems in the past. However, the problem has now reached a magnitude where Europe, the US, and other countries have proposed standards to engineer systems responsibly with respect to the electrical environment. IEEE 519-1992 and EN61000-3-2 have evolved to become a common requirement cited when specifying equipment on newly engineered projects. Various harmonic filtering techniques have been developed to meet these specifications. The present IEEE 519-1992 document establishes acceptable levels of harmonics (voltage and current) that can be introduced into the incoming feeders by commercial and industrial users. Where there may previously have been little cooperation from manufacturers to meet such specifications, the adoption of IEEE 519-1992 and other similar world standards now attracts the attention of everyone.

    Mahesh M. Swamy, Yaskawa Electric America