2010

  • MESA Global Education Program

    This initiative is the first step in filling a noticeable void in industry - the lack of independent competency training in the Operations Management (MES/MOM) arena. This lack of wide-scale competency is recognized as a major barrier to plant and supply chain optimization and global operations excellence.

    With members in 85 countries globally, MESA is an independent, objective community of like-minded people and enterprises working to make Operations more reliable, capable and profitable. Some of the foremost experts across the Operations Management landscape are leading the knowledge sharing within the MESA community by offering programs across 4 continents by mid-2011.

    MESA Certificate of Competency (CoC) for MES/MOM Methodologies: A 4-day, comprehensive program of MES/MOM Methodologies courses aimed at Systems Analysts, Architects, Programmers, Project Managers and Consultants.

    MESA Certificate of Awareness (CoA) for MES/MOM Business Awareness: A 2-day, high-level program of MES/MOM Business Functions courses geared for executives, manufacturing/operations and IT personnel and sales professionals. The CoA courses are higher level, short versions of the CoC program.


    MESA
    01/19/2011
  • Understanding REACH

    Registration, Evaluation, Authorisation and Restriction of Chemicals

    It is certainly no secret that the past decade has placed a renewed focus on the environment and how all members of the world community, including business organizations, affect it. Concerns about protecting the world in which we live have been the impetus behind such worldwide movements as recycling and renewable energy. From a manufacturing standpoint, RoHS (Restriction of Hazardous Substances) has impacted businesses, as has REACH, a more recent set of regulations that is becoming increasingly significant to North American manufacturing operations that are part of a supply chain supplying products, directly or indirectly, into the European Union.

    As with any new regulatory requirements, the initial exposure to the documentation can create a degree of uncertainty among those who will be asked to comply. From this perspective, REACH is no different from any of its predecessors. In an attempt to offer some understanding of the REACH regulations and some clarification of the requirements it places on manufacturers, C&M Corporation gathered Michael Karg, Director of Product Development, along with Randy Elliott, Regulatory Compliance Engineer, and Ariann Griffin, Regulatory Compliance Technician, to discuss some of the particulars of REACH and respond to some of the questions C&M has been discussing with members of its client base.

    What is the purpose of REACH?

    Mike Levesque, Randy Elliott, Ariann Griffin and Michael Karg, C&M Corporation
    12/13/2010
  • Explaining the Agency Approval Process for Wire and Cable Products

    Some engineers think it is science. Others contend it is some type of black magic.

    Many have no idea of exactly how the process works.

    Regardless of what is known - or unknown - about the submission and evaluation process, few will disagree with the premise that agency certifications, such as those offered by Underwriters Laboratories (UL), the Canadian Standards Association (CSA), or Intertek, formerly known as Edison Testing Laboratories (ETL), to name only a few, are an important part of any product offering in the wire and cable industry. With today's focus on product safety, there has been an increased need for wire and cable products to carry either a listed or a recognized mark signifying they have been independently evaluated and have met the appropriate safety guidelines established for their intended use.

    In an attempt to help bring some clarity to the agency certification process for bulk cable, I have posed a series of related questions to Randy Elliott, C&M Corporation’s Regulatory Compliance Engineer. Randy has been a practicing engineer in the wire and cable industry for over 20 years. His background in R&D and design engineering has brought him into contact with regulatory agencies and their requirements on a regular basis throughout his career. For the past three years, his focus has been completely on regulatory issues for C&M.

    Who is responsible for testing and what do their results mean?

    Mike Levesque & Randy Elliott, C&M Corporation
    12/13/2010
  • Understanding NFPA 79

    NFPA-79 is the electrical standard that has been developed by the National Fire Protection Association (NFPA) and is "intended to minimize the potential hazard of electrical shock and electrical fire hazards of industrial metalworking machine tools, woodworking machinery, plastics machinery and mass produced equipment, not portable by hand."

    The National Fire Protection Association is also responsible for the National Electric Code (NEC)/ (NFPA-70).

    The scope of NFPA-79 is summarized as follows: "The standard shall apply to the electrical/electronic equipment, apparatus, or systems of industrial machines operating from a nominal voltage of 600 volts or less, and commencing at the point of connection of the supply to the electrical equipment to the machine."

    One of the focuses of the latest edition is to improve product safety by ensuring that appropriate types of wire and cable are used in the application with regard to current carrying capacity, temperature rating, or flammability.
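    The three selection criteria just named can be expressed as a simple screening check. This is a generic sketch only - the function name and figures are hypothetical illustrations, not taken from NFPA 79, and no substitute for consulting the standard itself:

```python
def cable_acceptable(ampacity_a, required_a,
                     temp_rating_c, max_operating_temp_c,
                     flame_rated):
    """Screen a candidate cable against an application: sufficient
    current-carrying capacity, an adequate temperature rating, and the
    required flammability rating. All three must pass."""
    return (ampacity_a >= required_a
            and temp_rating_c >= max_operating_temp_c
            and flame_rated)

# A flame-rated, 90 degC-rated cable with 30 A ampacity, feeding a 25 A
# load in an environment that can reach 75 degC, passes the screen:
ok = cable_acceptable(30, 25, 90, 75, True)
```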

    As such, the guidelines for NFPA-79 compliant products are more stringent than those of past editions.

    The NFPA-79 provisions make specific reference to only two types of cable.

    Ned Lloyd and Mike Levesque, C&M Corporation
    12/13/2010
  • Using Tofino to Control the Spread of Stuxnet Malware

    This application note describes how to use the Tofino Industrial Security Solution to prevent the spread of the Stuxnet worm in both Siemens and non-Siemens network environments.

    What is Stuxnet?
    Stuxnet is a computer worm designed to target one or more industrial systems that use Siemens PLCs. The objective of this malware appears to be to destroy specific industrial processes.

    Stuxnet will infect Windows-based computers on any control or SCADA system, regardless of whether or not it is a Siemens system. The worm only attempts to make modifications to controllers that are model S7-300 or S7-400 PLCs. However, it is aggressive on all networks and can negatively affect any control system. Infected computers may also be used as a launch point for future attacks.

    How Stuxnet Spreads
    Stuxnet is one of the most complex and carefully engineered worms ever seen. It takes advantage of at least four previously unknown vulnerabilities, has multiple propagation processes and shows considerable sophistication in its exploitation of Siemens control systems.

    A key challenge in preventing Stuxnet infections is the large variety of techniques it uses for infecting other computers. It has three primary pathways for spreading to new victims:
    - via infected removable USB drives;
    - via Local Area Network communications; and
    - via infected Siemens project files.

    Within these pathways, it takes advantage of seven independent mechanisms to spread to other computers.

    Stuxnet also has a P2P (peer-to-peer) networking system that automatically updates all installations of the Stuxnet worm in the wild, even if they cannot connect back to the Internet. Finally, it has an Internet-based command and control mechanism that is currently disabled, but could be reactivated in the future.
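    The defensive idea behind a zone firewall such as Tofino is protocol allow-listing: traffic toward the PLC zone is dropped unless it comes from an explicitly authorized source. The sketch below is a conceptual illustration only - the addresses, rule model, and function names are hypothetical, not Tofino's actual configuration:

```python
# S7 PLC communication runs over ISO-on-TCP, TCP port 102. The rule:
# only a known engineering station may talk S7 into the PLC zone;
# anything else on that port is dropped, cutting off the LAN pathway
# an infected machine would use to reach the controllers.
AUTHORIZED_STATIONS = {"192.168.1.10"}   # hypothetical engineering station
S7_TCP_PORT = 102

def allow_packet(src_ip, dst_port):
    """Return True if a packet may enter the PLC zone."""
    if dst_port == S7_TCP_PORT:
        return src_ip in AUTHORIZED_STATIONS
    return True  # non-S7 traffic is left to other firewall rules

# An infected HMI at 192.168.1.99 trying to reprogram a PLC is blocked:
blocked = not allow_packet("192.168.1.99", S7_TCP_PORT)
```

Because the filter keys on destination protocol rather than on worm signatures, it also blocks as-yet-unknown malware that tries to use the same pathway.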

    Tofino
    11/30/2010
  • ISA100 and Wireless Standards Convergence

    ISA100 is one of three standards competing in industrial wireless sensing. What is distinctive about ISA100? What are the prospects for convergence of standards? What would convergence be worth to the industrial wireless market?

    ISA100 is a major standards initiative managed by the International Society of Automation (ISA). In addition to standards development, a new organization, the ISA100 Wireless Compliance Institute (WCI), is charged with delivering compliance certification services for the work of ISA100.

    The ISA100 committee establishes standards, recommended practices, technical reports, and related information for implementing wireless systems in the automation and control environment, with an initial focus on the field level. Given the committee's broad scope, it has formed a number of working groups to pursue specific tasks. The primary deliverable from the committee thus far is the standard ISA-100.11a, "Wireless Systems for Industrial Automation: Process Control and Related Applications". However, a quick glance at the list of working groups shows that several other topics will be addressed by future ISA100 deliverables.

    In 2006, at about the same time ISA100 was forming, the ISA also created the non-profit Automation Standards Compliance Institute (ASCI). This organization manages certification, conformance, and compliance assessment activities in the ISA's automation domain.

    ASCI extends the standards work of ISA by facilitating the effective implementation and independent testing of ISA standards, creating a vital link between the development of standards and industries' implementation of them. The ISA100 Wireless Compliance Institute (WCI) functions as an operational group within ASCI, which allows WCI to leverage ASCI's infrastructure, shared by several ASCI compliance programs.

    ARC Advisory Group
    11/22/2010
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 1: Understanding and Measuring Time Delay

    Process measurements are instantaneous, but analyzer responses never are. From the tap to the analyzer, there is always a delay. Unfortunately, this time delay is often underestimated, misunderstood, or simply not accounted for. Time delay in sample systems is the most common cause of inappropriate results from process analyzers.

    In many cases, it is invisible to operators and technicians, who are focused on the necessity of making the sample suitable for the analyzer. It is not unusual for operators to assume that the analytical measurement is instantaneous. In fact, sample systems often fail to achieve the industry standard of a one minute response.

    As a general rule, it's always best to minimize time delay, even for long cycle times, but delays extending beyond the industry standard are not necessarily a problem. The process engineer determines acceptable delay times based on process dynamics.

    Delays become an issue when they exceed a system designer's expectations. A poor estimate or wrong assumption about time delay will necessarily result in inferior process control.

    This article is intended to enhance understanding of the causes of time delay and to provide the tools required to calculate or approximate a delay within a reasonable margin of error. We will also provide some recommendations for reducing time delay. The potential for delay exists in the following sections of an analytical instrumentation (AI) system: process line, tap and probe, field station, transport line, sample conditioning system, stream switching system, and analyzer.
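    For the transport line specifically, a first-order estimate of delay is simply tube volume divided by volumetric flow rate. The sketch below assumes an incompressible sample and ignores mixing and pressure effects; the function name and figures are illustrative, not taken from the article:

```python
import math

def transport_delay_s(line_length_m, inner_dia_m, flow_lpm):
    """Approximate sample transport delay as tube volume / flow rate.
    A first-order estimate only: ignores compressibility and mixing."""
    volume_m3 = math.pi * (inner_dia_m / 2) ** 2 * line_length_m
    volume_l = volume_m3 * 1000.0   # m^3 -> litres
    flow_lps = flow_lpm / 60.0      # L/min -> L/s
    return volume_l / flow_lps

# 50 m of 4 mm ID tubing at 1 L/min:
delay = transport_delay_s(50, 0.004, 1.0)
```

At these illustrative figures the transport line alone contributes roughly 38 seconds, most of a one-minute response budget before the tap, conditioning system, and analyzer have added anything.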

    Doug Nordstrom and Tony Waters, Swagelok
    11/18/2010
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 2: Calibrating the Analyzer

    In many analytical instrumentation systems, the analyzer does not provide an absolute measurement. Rather, it provides a relative response based on settings established during calibration, which is a critical process subject to significant error. To calibrate an analyzer, a calibration fluid of known contents and quantities is passed through the analyzer, producing measurements of component concentration. If these measurements are not consistent with the known quantities in the calibration fluid, the analyzer is adjusted accordingly. Later, when process samples are analyzed, the accuracy of the analyzer's reading will depend on the accuracy of the calibration process. It is therefore imperative that we understand how error or contamination can be introduced through calibration; when calibration can - and cannot - address a perceived performance issue with the analyzer; how atmospheric pressure or temperature fluctuations can undo the work of calibration; and when and when not to calibrate.
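    As a generic illustration of why calibration errors propagate into every later reading, consider a two-point linear calibration. The names and figures below are hypothetical; real analyzers may use multi-point or nonlinear calibration curves:

```python
def two_point_calibration(raw_zero, raw_span, known_zero, known_span):
    """Derive a gain and offset so that corrected readings match the
    known concentrations of the two calibration fluids."""
    gain = (known_span - known_zero) / (raw_span - raw_zero)
    offset = known_zero - gain * raw_zero
    return gain, offset

def corrected(raw, gain, offset):
    """Apply the linear correction to a raw analyzer response."""
    return gain * raw + offset

# Analyzer reads 0.2 on a 0 ppm fluid and 9.8 on a 10 ppm fluid:
gain, offset = two_point_calibration(0.2, 9.8, 0.0, 10.0)
```

Any error in the calibration fluid, or in the conditions under which it is run, is baked into the gain and offset, and every subsequent process measurement inherits it.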

    Doug Nordstrom and Tony Waters, Swagelok Company
    11/18/2010
  • Ensuring an Accurate Result in an Analytical Instrumentation System Part 3: Maintaining a Representative Sample

    The objective of an analytical instrumentation (AI) system is to provide a timely analytical result that is representative of the fluid in the process line at the time the sample was taken. If the AI system alters the sample so the analytical result is changed from what it would have been, then the sample is no longer representative and the outcome is no longer meaningful or useful. Assuming the sample is properly taken at the tap, it may still become unrepresentative under any of the following conditions:
    - If deadlegs or dead spaces are introduced at inappropriate locations in the AI system, resulting in a "static leak," a bleeding or leaking of the old sample into the new sample;
    - If the sample is altered through contamination, permeation, or adsorption;
    - If the balance of chemicals is upset due to a partial change in phase; or
    - If the sample undergoes a chemical reaction.

    This article will review the major issues leading to an unrepresentative sample and provide recommendations on how to avoid a compromised sample. It will discuss deadlegs and dead spaces; component design and placement; adsorption and permeation; internal and external leaks; cross contamination in stream selection; and phase preservation.

    Doug Nordstrom and Tony Waters, Swagelok Company
    11/18/2010
  • Greenhouse Gas Flow Monitoring

    Last year the EPA implemented new regulations entitled "Mandatory Reporting of Greenhouse Gases." The new regulations called for certain facilities emitting 25,000 metric tons or more per year of specified GHGs to provide an annual report of their actual GHG emissions.

    It is estimated that more than 10,000 facilities in the US meet the criteria for mandated reporting of greenhouse gases. A full description of the EPA mandate can be found on the EPA's web site.

    The EPA's reporting mandate comes in response to the goal of reducing warming gases in the atmosphere to address the consequences of global warming.

    The EPA says the present objective of the mandate is simple reporting, not regulating the reduction of GHGs at this time, although bloggers and industry pundits speculate that regulation is likely the next step. It doesn't require a stretch of logic to anticipate that the data collected will frame new regulations to curb the release of GHGs in response to domestic and international pressure to slow the rate of global warming.

    The EPA's initial mandate in October of 2009 required 31 industry sectors, which collectively account for 85 percent of US GHG emissions, to track and report their emissions. In addition to these original 31 industries, the agency in March of this year proposed to collect emissions data from the petroleum and natural gas sector, as well as from industries that emit fluorinated gases and from facilities that inject and store carbon dioxide underground for the purposes of geologic sequestration or enhanced oil and gas recovery.

    Methane is the primary GHG emitted from oil and natural gas systems and is more than 20 times as potent as carbon dioxide at warming the atmosphere, while fluorinated gases are even stronger and can stay in the atmosphere for thousands of years. The EPA says the data collected will allow businesses to track their own emissions, compare them to similar facilities, and identify cost effective ways to reduce their emissions in the future.
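    The potency comparison above is what a global-warming-potential (GWP) weighting captures: reporting thresholds are expressed in CO2-equivalent tonnes. A sketch of the arithmetic, using illustrative round-number GWP factors rather than the exact values the EPA rule specifies:

```python
# Illustrative 100-year GWP factors; the EPA rule specifies exact values.
GWP = {"co2": 1, "ch4": 25, "n2o": 298}

def co2_equivalent_tonnes(emissions_tonnes):
    """Sum each gas's emitted mass weighted by its GWP."""
    return sum(mass * GWP[gas] for gas, mass in emissions_tonnes.items())

# A facility emitting 900 t of CO2 and 1,000 t of methane per year:
total = co2_equivalent_tonnes({"co2": 900, "ch4": 1000})
# 900 + 25,000 = 25,900 t CO2e -- over the 25,000 t reporting threshold
# even though the raw tonnage is modest.
```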

    Fluid Components International (FCI), Allen Kugi, Member of Technical Staff
    10/28/2010
  • How A Biogas Processing System Manufacturer Identified the Best Flow Meter for Gas Measurement

    Klargastechnik Deutschland GmbH's equipment and processes help customers address organic biomass fermentation and recovery while supporting electric power co-generation. The result is clean, green electric power that also reduces both solid waste and hazardous toxic gases such as carbon dioxide and methane, which pollute the environment and contribute to global warming.

    In order to provide these benefits, the company's equipment and systems rely on highly precise and reliable flow measurement of process waste gases. Measuring biogas flow at several points in the system provides operators with critical information for optimal gas production, control, safety and reporting. However, biogas applications present several challenges in selecting the proper flow meter.

    Download this application note to learn how a biogas processing system manufacturer can identify the best flow meter for gas measurement.

    Fluid Components International, Achim Sprick, Managing Director, Klargastechnik Deutschland GmbH
    10/28/2010
  • Real-Time Energy Management

    WHAT'S INSIDE:
    1. Background
    2. IOM Real-Time Energy Management
    3. Real-Time Energy Management as Part of an Enterprise approach
    4. Conclusion

    Over the last several years, energy costs have more than doubled. In the process manufacturing industries, with energy costs often comprising as much as 80% of the overall variable cost of operating a plant, this has created a crisis. Many manufacturers have responded with programs aimed at reducing the overall energy consumption of an operation or looking to alternate, lower-cost fuels. Although these initiatives may provide a good starting point in the battle to reduce energy costs, they are not adequate to meet the needs of today's real-time business environment.

    Historically, the price of energy could often be dealt with as a constant over a prolonged time period. Large energy users could develop contracts with energy suppliers for 6 months or even a year that would effectively set the price of energy over that time period. Today long-term energy contracts are the exception. In most parts of the world the price of energy changes in real time.

    It is essential that industrial companies manage their business in the time frame at which the business variables change. Otherwise the business is completely out of control. When it comes to managing industrial energy, the time frame is real time and real time energy management is required.
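    Managing in the time frame at which the variables change means costing energy interval by interval rather than at a single contract rate. A minimal sketch, with hypothetical readings and interval length:

```python
def energy_cost(power_kw, price_per_kwh, interval_h):
    """Accumulate energy cost over intervals in which the price moves.
    power_kw and price_per_kwh are parallel per-interval readings."""
    return sum(p * c * interval_h for p, c in zip(power_kw, price_per_kwh))

# Four 15-minute intervals at a steady 1,000 kW load but moving prices:
cost = energy_cost([1000] * 4, [0.08, 0.12, 0.10, 0.15], 0.25)
```

With a fixed-price contract the same hour would be costed at one rate; under real-time pricing, shifting load away from the expensive intervals changes the total, which is exactly the decision a real-time energy management system exposes.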

    Invensys, Peter G. Martin
    10/25/2010
  • Virtual Reality Training Program

    A Comprehensive Plant Crew Training Solution Improving Process Reliability and Safety

    One of the key challenges that capital-intensive industries will face over the next five years is replacing the gray-haired workforce with the computer-savvy/gaming generation. High-fidelity operator training simulators that represent the production process, control system and the control room interface have proved to be very effective for control room operations training. However, for the remaining 50% of the plant start-up procedures that are executed in the field, no fully interactive training environment has been available - until now.

    Industries like oil and gas, refining and power companies need to institutionalize their workforce knowledge in more efficient and effective ways. Leveraging Virtual Reality (VR) models to improve time-to-competency in critical areas like safety, environment protection systems, knowledge, performance training, and reliability provides a vehicle to rapidly train the new workforce in ways that align with their interests and skills.

    With continuing advances in hardware and software techniques, Virtual Reality (VR) is accessible today as a powerful aid to multimedia training, process design, maintenance, safety and other activities currently based around conventional two-dimensional (2-D) equipment views.

    Real-time rendering of equipment views puts heavy demands on processor time, and in many past commercial VR projects the results were either unrealistically slow or oversimplified to the detriment of the solution's effectiveness. As the technology has continued to develop, these issues have been eliminated, and the use of high-fidelity simulators is becoming a standard in process understanding and training, giving way to a new process simulation era based on commercially standard IT hardware.

    IVRP (Immersive Virtual Reality Plant) now provides a large range of effective multimedia aids that are easily and economically accessible to support design, training, maintenance or safety in the process industry by linking the power of dynamic simulation - DYNSYM - to VR applications and tools.

    Invensys has filed patents for the solution outlined in this paper.

    Invensys, Maurizio Rovaglio, Tobias Scheele and Norbert Jung
    10/25/2010
  • A Blueprint for the Real Time Enterprise

    Manufacturing and production processes have had to be controlled and managed in real time from inception because they change in real time frames. This has been a natural premise of industrial systems from the very beginning.

    A major shift in the business of manufacturing has occurred over the past decade which is driving the dynamics of the business of production and manufacturing into the real time domain. Business variables, such as energy prices, feedstock prices and even product prices have rapidly transitioned from highly transactional time frames into real time frames. For example, a decade ago it was not unusual for an industrial plant to establish a contract with its energy supplier that essentially set the price over an extended time period, of often 6 months or even a year. Today, in most parts of the world, long term fixed price energy contracts are not being offered and the price of energy can change multiple times in a day. The implications of this transition are clear. Industrial business functions must operate in real time to be effective and efficient. Industrial companies that do not move to real time business operations will be at a severe disadvantage in their marketplace.

    Invensys, Peter G. Martin
    10/25/2010
  • Real-Time Profit Optimization

    Distributed Control Systems (DCS) have been successfully utilized to help control manufacturing and production processes since the late 1970s. The primary functions of these systems have been the automatic feedback control of the various process loops across the plant and providing the human interface through which plant operators guide production from control rooms. Although these systems have proven to be very successful at improving the efficiency of industrial operations as compared with earlier control technologies, the state of the art has not grown significantly since their inception. Most plants still operate exactly as they did 40 years ago.

    Considerable research and development has been invested in expanding the functionality of DCSs in the areas of advanced controls and advanced manufacturing execution software. Numerous industrial plants have started to employ advanced controls in critical or high-value process operations, with some venturing into the use of advanced application software packages, each typically designed to address a specific issue or challenge within the industrial operations. Entrepreneurial software companies typically developed the software at this level of operation, essentially between the automation and business levels, often referred to as manufacturing execution systems (MES).

    Although some industrial operations implemented advanced control and advanced MES software, the vast majority of processes are still controlled by simple automatic feedback control. The efficiency and effectiveness of most plants is a function of the installed feedback control systems. As a result, many industrial managers have expressed concerns that, in spite of the huge investments made in automation systems and software, plants do not appear to be operating better than they had been 30 years ago. In some cases, the plants actually appear to be operating less efficiently, possibly due to the reduced and inexperienced work forces and aging equipment.

    Invensys, Peter G. Martin, PhD, Invensys Operations Management
    10/25/2010
  • Sustainable Profitability

    Many industrial businesses and manufacturing operations were designed, implemented and operated around a set of basic assumptions that have served the industry well over the last century. For example, although it was expected that the values of process variables, such as flow, level, temperature and pressure, would naturally fluctuate in real time, business variables, such as production value, energy cost, and material cost were assumed to be fairly stable over long periods of time. It was also typically assumed that the production operations could effectively work independently from the business operations. Production operations would focus on making the products while business operations would focus on reporting results. This, in turn, led to a bottom-up business information flow perspective. Business information was used only for reporting results and only the required data from the operation had to be provided to the business reporting system. Often no business information flowed to the operations.

    The traditional focus of industrial operations resulting from these assumptions has been on operational objectives, such as throughput and consumption of resources, as compared to business objectives. Typically, plants were designed to maximize production output, which proved to lack the agility necessary to meet market demands during economic downturns.

    Finally, the labor mindset of the industry resulting from the workforce dynamics of the early industrial revolution is, for the most part, still very much part of the standard operational philosophy utilized in today's industry. A huge separation continues to exist between the professional and management staffs and the operations and maintenance staffs that comprise today's labor force. This separation was necessary during the formative period of the industrial revolution, when the available labor force was unskilled and almost completely uneducated. Although today's "labor force" is fairly well educated and highly skilled in comparison, the professional and management teams still tend to work under the traditional assumptions. For example, the operator interfaces of most industrial automation systems have been designed around a philosophy called operations by exception. Essentially, this means that operators are to do nothing that impacts the plant unless an exception condition - an alarm or event - occurs that requires human intervention. Once the event is addressed, operators can go back to doing nothing. This philosophy was developed to protect the plant from uneducated and unskilled operators.

    For the most part, these traditional industrial assumptions have served the industry quite well up to this point. However, there are current changes underway that are beginning to show that these traditional assumptions will not be effective going forward.

    Invensys, Peter G. Martin, PhD
    10/25/2010
  • Operations Excellence

    The Invensys Operations Management division of Invensys was formed in May 2009 by combining Invensys Process Systems - Avantis, Foxboro, InFusion, SimSci-Esscor and Triconex - with Wonderware, Eurotherm and IMServ. The basis for this move was that each of these four traditionally separate and excellent business units would provide more value for clients by working together to solve difficult business problems that could not be solved by each unit separately. The Invensys leadership believes that the combined business will help to define and develop leadership in an emerging market space referred to as Operations Management. Invensys Operations Management will focus all of its resources on helping customers drive Operations Excellence into their businesses.

    Attaining Operations Excellence requires that industrial companies maximize the efficiency and profitability from their operations through excellent control, drive maximum business value from all their industrial assets, continually drive and increase productivity from all operations-focused personnel, all while reducing negative environmental impact and improving safety. Therefore, Invensys has defined Operations Excellence along four key themes: Control Excellence, Asset Excellence, Productivity Excellence and Environment and Safety Excellence.

    Invensys, Donald Clark, Peter G. Martin, PhD, Charles Piper and Simon Windust
    10/25/2010
  • Replacing Aging Process Automation Systems: Finding the Best Option

    Today, for a variety of reasons, tremendous pressures are building that will require plant managers to update their aging automation systems during the next decade. Defining the need for and exploring alternative approaches to this modernization of manufacturing systems is the subject of this report.

    Managers in today's process manufacturing plants must react to factors ranging from massive customization and growing demand for change orders in the middle of production runs to management expectations mandating ever-faster execution of production orders.

    Such constant pressures are driving many manufacturers to reevaluate the role of their automation strategies while improving the overall effectiveness of their enterprises. They're finding that automation is playing an increasingly important role in the effectiveness and profitability of their entire enterprise, impacting everything from cost of operations to customer satisfaction.

    Fortunately, many are also discovering that they can make significant improvements throughout their value chain - without being forced to abandon their entire existing automation investment.

    Invensys
    10/25/2010
  • Sustaining Performance

    What's Inside:
    1. Support - A Competitive Weapon?
    2. Sustaining Performance - a Dream or a Goal?
    3. The End Result - Less than Optimal Performance
    4. Sustain and Improve Operational Performance
    5. Outside the Box - An Opportunity for Synergy
    6. Support is More than "Just Insurance"
    7. Proactively Preventing Problems is Better than Just Fixing Them
    8. Support Programs - a Cost or a Benefit?
    9. The Business Case for Support
    10. "Side-Effects" Consideration
    11. Invensys Experience
    12. Summary

    Bringing a production facility online is the result of a huge investment. It typically takes years of planning, design and construction before, finally, you move into operating the plant. Whether your operation has been running for 20 years or is about to start up, you face the ongoing challenge of achieving the highest returns possible on that investment. Once you have the initial bugs worked out and achieve the goals of targeted productivity, efficiency, quality and performance, how do you sustain high performance, or even improve it?

    The goal of this paper is to present a perspective on Support Services as a tactical approach to not only sustain current Operations performance levels, but to continually improve them - and to be able to measure the ROI of an ongoing Support program.

    Invensys
    10/25/2010
  • Game Changer: Visibility, Enablement, and Process Innovation for a Mobile Workforce

    A key aspect of the "Perfect Plant" is having the right information in the right place at the right time. In most manufacturing environments, instrumentation and monitoring is widespread. Pages and pages of graphs and reports describe every operational characteristic and are used by operators and management to steer the plant to optimal performance. However, in the modern plant, the right time to view this information is not when you are standing in front of an operator console. It is when you are in the field, in front of a failing piece of equipment or discussing a problem while on the move. More often than not, the right way to deliver information is by putting it in the hands of a mobile worker.

    The right way to collect information also involves mobility. Remember that 40% to 60% of equipment in the plants and on the shop floors is not instrumented. Optimizing this critical aspect of plant performance depends on mobile field workers. Armed with the right tools, mobile workers can cost-effectively gather data from non-instrumented assets that can be readily analyzed and integrated into existing back-end decision support systems. Bidirectional flow of information to and from mobile workers is a key competitive imperative required to make fully informed decisions regarding the operation of the Perfect Plant.

    Regrettably for most companies, when it comes to the mobile workforce in manufacturing, too often, vital decisions are made in the dark, in an information-poor environment and with little support or historical contextual information to make informed decisions proactively. Field workers - the people who are closest to the equipment and processes, who feel the heat, hear the noises, and see the changes that can be the first indicators of trouble - frequently do their jobs based on individual experiential knowledge acquired over many years.

    This approach makes manufacturers vulnerable to high levels of variability based on individual talent, skills, and training. Despite the massive investments in automation over the past decades, management often lacks visibility into what these decision makers in the field do and finds it hard to provide guidance to ensure that best practices are executed across field worker roles, production shifts, and assets.

    Invensys, Charlie Mohrmann
    10/25/2010