Data Analytics in Batch Operations
Advances in analytics technology over the last ten years make it possible to compensate automatically for varying operating conditions and process holdups, as well as for feedstock property information and sampled lab data on quality parameters. However, such capability is not uniformly available in commercial analytics products. In nearly all cases, the integration of lab data and supplier information on the material properties of truck or railcar shipments is customized to each installation and, thus, not addressed by analytics tools. One reason is that the business systems, lab systems and DCS control systems available today are designed well for their intended use, but not to facilitate data integration and data analysis.
Some companies judge the status of batch processing against a standard set by the so-called “golden” or ideal batch. Typically, a golden batch is defined as the time-based profile of the measurement values recorded for a particular batch that met product quality targets. Under this standard, a batch is judged by how closely the golden batch profile is maintained through the adjustment of process inputs.
The term “golden batch” certainly has a nice sales and marketing ring to it, and many companies promote it. It is very easy to implement a comparative overlay of a current batch time-based profile with the single trace of the golden batch, and to the casual user, this approach may seem very logical. However, it is inherently plagued with problems.
The approach has two big weaknesses. First, conditions indicated by each measurement may affect product quality in different ways. For example, it may be important to control some parameters tightly, while other measurements may vary significantly without affecting product quality. Second, the “golden batch” is a univariate approach to a multivariate problem. No knowledge is gained of the relationships among process variations; one simply emulates a single batch without knowing why, where or how its trajectory is good.
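A toy numeric sketch of the second weakness (all values hypothetical): every measurement can stay inside its own golden-batch tolerance band while the relationship between measurements is broken, so a univariate overlay sees nothing wrong.

```python
import numpy as np

# Hypothetical golden-batch profile: time steps x [temperature, pressure].
# In normal operation the two variables move together.
golden = np.array([[100.0, 2.0], [101.0, 2.1], [102.0, 2.2]])
tol = np.array([2.0, 0.3])  # per-variable "golden batch" tolerance bands

# A faulty batch: each variable stays inside its own band,
# but the temperature/pressure relationship is broken.
faulty = np.array([[101.5, 2.0], [100.0, 2.3], [102.5, 2.05]])

within_bands = np.all(np.abs(faulty - golden) <= tol)
print(bool(within_bands))  # the univariate overlay raises no alarm

# A multivariate look at the correlation structure tells a different story:
golden_corr = np.corrcoef(golden.T)[0, 1]
faulty_corr = np.corrcoef(faulty.T)[0, 1]
print(round(golden_corr, 2), round(faulty_corr, 2))
```

The per-variable check passes even though the correlation between the two measurements has flipped sign, which is exactly the kind of fault a multivariate method is built to catch.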
Without taking these and other key items into consideration, the actions taken may misallocate resources and lead to incorrect control strategies. Time and money may be spent improving control where it is not needed and directed away from where it is. Through the use of multivariate statistical techniques, it is possible to characterize variation both within and between batches, and to relate it both to process relationships and to the prediction of typical batch events and important end-of-batch quality characteristics.
One of the important multivariate statistical methods is principal component analysis (PCA). At the heart of PCA is the concept that a time-based profile for measurement values may be established using a variety of batches that produced good quality product and had no abnormal processing upsets. Analysis tools designed for batch analysis make it possible to extract, analyze and use data from multiple batches. For these batches, the normal variation in measurements is then quantified in terms of a PCA model. The model may then be used to develop a better understanding of how multivariate parameters relate to one another, and how these can affect the batch-to-batch costs, energy, waste and time needed to produce a product.
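A minimal sketch of this idea, using only numpy and synthetic data (batch names, sizes and the 90% variance cutoff are all illustrative assumptions, not from the article): each good batch is unfolded into one row of time-by-variable measurements, the columns are autoscaled, and the principal components of normal batch-to-batch variation are extracted via SVD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: 20 good batches, each unfolded into one row
# of (time steps x variables) measurements (batch-wise unfolding).
n_batches, n_times, n_vars = 20, 50, 4
profile = rng.normal(size=n_times * n_vars)            # shared "good" profile
t1 = rng.normal(size=(n_batches, 1))                   # common latent variation
d1 = rng.normal(size=(1, n_times * n_vars))
X = profile + t1 @ d1 + 0.02 * rng.normal(size=(n_batches, n_times * n_vars))

# Center and scale each column, then extract principal components via SVD.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sigma
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

# Keep enough components to explain ~90% of the normal variation.
explained = S**2 / np.sum(S**2)
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1
P = Vt[:k].T       # loadings: how measurements co-vary across good batches
scores = Xs @ P    # each good batch reduced to k latent coordinates
print(k, scores.shape)
```

The loadings capture the collinearity among measurements, and each batch collapses from hundreds of raw values to a handful of scores.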
The model structure automatically takes into account that many of the measurements used in the batch operation are collinear; that is, related to each other and respond in a similar manner to a process input change. You can use the PCA model to identify process and measurement faults that may affect product quality. A problem is flagged only if a parameter deviates by more than the typical variation defined for a good product. As a result, the multivariable environment of a batch operation may be reduced to just a few simple statistics that the operator may use to assess how the batch is progressing. These statistics take into account the significance of a component’s variation from its established profile in predicting a fault.
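One common statistic of this kind is Hotelling's T², which weights each score's deviation by the variation observed in good batches. The sketch below (synthetic data, two latent factors, and a training-maximum control limit are all illustrative assumptions) flags a batch only when it deviates by more than the normal variation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "good batch" history: 30 batches x 6 correlated measurements,
# generated from 2 underlying latent factors plus small noise.
factors = rng.normal(size=(30, 2))
loadings = rng.normal(size=(2, 6))
X = factors @ loadings + 0.05 * rng.normal(size=(30, 6))

# PCA model of normal variation: autoscale, SVD, keep k = 2 components.
mu, sigma = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sigma
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T
s2 = S[:k] ** 2 / (len(X) - 1)  # score variance under normal operation

def t2(x):
    """Hotelling's T^2: squared score distance weighted by normal variation."""
    scores = ((x - mu) / sigma) @ P
    return float(np.sum(scores ** 2 / s2))

# Flag a batch only when its T^2 exceeds the range defined by good batches
# (for illustration, the largest T^2 seen in training).
limit = max(t2(x) for x in X)

# A hypothetical upset, pushed 8 score standard deviations along one component:
fault = mu + sigma * (8.0 * np.sqrt(s2[0]) * P[:, 0])
print(t2(fault) > limit)
```

An operator watching a single T² trace is, in effect, watching all six measurements and their correlations at once.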
Through the use of PCA, it is possible to detect abnormal operations resulting from both measured and unmeasured faults.
Projection to latent structures (PLS), also known as partial least squares, may be used to analyze the impact of processing conditions on final-product quality parameters. When this technique is applied in an online system, it can provide operators with continuous prediction of end-of-batch quality parameters. Where the objective is to classify operating results into categories of importance (e.g., fault category, good vs. bad batch), discriminant analysis (DA) may be used in conjunction with PCA and PLS.
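The flavor of PLS can be sketched with a small single-response NIPALS implementation in numpy (the batch data, component count and quality variable here are hypothetical, not from the article): latent directions are chosen for their covariance with the quality value, then used to predict it.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical data: 25 batches x 8 process measurements, plus one
# measured end-of-batch quality value per batch.
X = rng.normal(size=(25, 8))
true_w = rng.normal(size=8)
y = X @ true_w + 0.1 * rng.normal(size=25)

def pls1_fit(X, y, n_comp):
    """Single-y PLS (NIPALS): latent directions chosen for covariance with y."""
    Xr, yr = X - X.mean(axis=0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)          # weight: direction of max covariance
        t = Xr @ w                      # scores for this component
        p = Xr.T @ t / (t @ t)          # X loading
        q = (yr @ t) / (t @ t)          # y loading
        Xr = Xr - np.outer(t, p)        # deflate X
        yr = yr - q * t                 # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.inv(P.T @ W) @ Q  # regression coefficients
    return X.mean(axis=0), y.mean(), B

x_mean, y_mean, B = pls1_fit(X, y, n_comp=4)
pred = (X - x_mean) @ B + y_mean        # predicted end-of-batch quality
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(round(rmse, 3))
```

In an online setting, the same coefficients would be applied to the measurements accumulated so far in the running batch to give the operator a continuously updated end-point estimate.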
Through the application of data analytics, it is possible for the operator to monitor a batch operation simply by looking at a plot of the PCA statistics and the PLS estimated end-point value for quality parameters, as illustrated in Figure 2.
ControlGlobal.com is exclusively dedicated to the global process automation market. We report on developing industry trends, illustrate successful industry applications, and update the basic skills and knowledge base that provide the profession's foundation.