Using a Standard Method for Advanced Analytics Integration

Peter explains that the team chose to use an EIDI-provided integration tool because it eliminates the following delays, which are often encountered in advanced analytics projects:

  • Developing programmatic extraction scripts, which are often unreliable or developed in a nonstandard way
  • Having to contextualize the data in terms of plant, process, product, or specific event
  • Producing data sets that contain inconsistent or nonstandard formatting, which requires substantial rework time
  • Using inefficient extraction techniques that require restarts, excess overhead, or too much processing time
  • Lacking an understanding of what the scripts actually create
  • Being overconfident that the data is accurate
  • Constantly maintaining or developing new scripts, which costs time

Peter noted, "Our operations people and engineers may be skeptical of insights coming from a corporate team that does not have the expertise to make day-to-day decisions on the industrial floor. For now, we need something to accelerate and automate operational data extraction. People will learn how to extract data collected by the EIDI, and they will be prepared to use it with predictive analytics tools, business intelligence tools, and AI tools. The EIDI has become critical to our operations. The scope is becoming larger than we thought, so we need to make sure that extraction tools are standard, easy-to-use, robust, and accurate. These tools must be able to easily extract our data for use elsewhere in the company."

Using these methods, data scientists and analytics teams will have the context that they need to fully understand the operational data. Peter reminds the team that they have emphasized the use of process flow diagrams to create digital plant models (see Chapter 4). This is a vital step in their digital transformation: establishing a simple nomenclature for the unit attributes that will be incorporated into their models.

Integrating Production Event-Based Data to Advanced Analytics

Next, Peter has the team review how to extract the data using the production event-framed data for each of the operational states across all the units: "In this case, we can extract any of the time-derived values for each state. This derived aggregated information can be reused in real time by AI algorithms using the process mode for enhanced decision-making."

FIGURE 10.4

Event-framed time-series data extracted using the digital plant template.

Figure 10.4 shows an entire event-framed data set for all process units in the refinery, aggregating production, energy, and water by unit and by unit operational mode. This data set may be reused by other business intelligence and analytics tools.
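To make this kind of roll-up concrete, the sketch below aggregates event-framed records by unit and operational mode using pandas. It is a minimal illustration only; the column names, units, and values are hypothetical placeholders, not the EIDI's actual schema.

```python
# A minimal sketch (pandas) of aggregating event-framed records by unit
# and operational mode. Column names and values are hypothetical
# placeholders, not the EIDI's actual schema.
import pandas as pd

# Each row represents one event frame: a unit, its operational mode,
# and the time-derived totals calculated over that frame.
frames = pd.DataFrame(
    [
        {"unit": "Crude Unit", "mode": "Normal", "hours": 8.0,
         "production_bbl": 12000, "energy_mwh": 45.2, "water_m3": 310},
        {"unit": "Crude Unit", "mode": "Reduced", "hours": 4.0,
         "production_bbl": 4800, "energy_mwh": 21.7, "water_m3": 150},
        {"unit": "FCC", "mode": "Normal", "hours": 8.0,
         "production_bbl": 9000, "energy_mwh": 60.1, "water_m3": 280},
    ]
)

# Roll up production, energy, and water by unit and operational mode,
# mirroring the kind of aggregation illustrated in Figure 10.4.
summary = (
    frames.groupby(["unit", "mode"])[["production_bbl", "energy_mwh", "water_m3"]]
    .sum()
    .reset_index()
)
print(summary)
```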

This standard, configurable extraction tool integrates operational data stored in the EIDI for analysis in an Amazon Web Services (AWS) or Microsoft Azure environment. Supporting data can be retrieved from Hadoop, common relational databases, and data lakes or data warehouses, or it can be ingested through messaging hubs such as Amazon Kinesis, Microsoft Azure, or Apache Kafka, which are often seen in big data environments.
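As a minimal sketch of one such ingestion path, the example below publishes an extracted event-frame record to Apache Kafka using the kafka-python package. The broker address, topic name, and record fields are illustrative assumptions, not part of the EIDI tool itself.

```python
# Minimal sketch: publishing an extracted event-frame record to a
# messaging hub (here, Apache Kafka via the kafka-python package).
# Broker address, topic name, and record fields are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker.example.com:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

record = {
    "unit": "Crude Unit",
    "mode": "Normal",
    "start": "2021-06-01T00:00:00Z",
    "end": "2021-06-01T08:00:00Z",
    "production_bbl": 12000,
    "energy_mwh": 45.2,
}

# Downstream consumers (cloud analytics, data lake loaders) subscribe
# to the topic and ingest the contextualized records.
producer.send("refinery-event-frames", value=record)
producer.flush()
```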

Peter Argus and Monica Armstrong, who coordinates planning and economics, review their operational industrial workflow, which integrates the work of operations, planning, engineering, and management (Figure 10.5). They agree that the manufacturing process flow and avoiding process constraints are key targets when integrating business and operational systems.

Peter shows the team how to extract the EIDI data using the digital plant template, resulting in the data set shown in Figure 10.6. After a successful extraction, Peter feeds the data set into Microsoft Power BI to create a cloud-based report. Figure 10.7 shows unit production and consumable losses by unit, operational mode, and operational shift for an area of the refinery.
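A minimal sketch of preparing such a data set for a Power BI report is shown below, assuming the extracted event frames are already in tabular form. The column names, loss figures, and output file name are hypothetical, not the book's actual template.

```python
# Minimal sketch: pivot extracted losses by unit, operational mode, and
# shift, then write a CSV that Power BI can import. All names and
# figures below are illustrative placeholders.
import pandas as pd

extracted = pd.DataFrame(
    [
        {"unit": "Crude Unit", "mode": "Normal", "shift": "Day",
         "production_bbl": 12000, "consumable_loss_usd": 850},
        {"unit": "Crude Unit", "mode": "Normal", "shift": "Night",
         "production_bbl": 11500, "consumable_loss_usd": 990},
        {"unit": "FCC", "mode": "Reduced", "shift": "Day",
         "production_bbl": 4300, "consumable_loss_usd": 1340},
    ]
)

# Production and losses by unit and mode, broken out by shift.
report = extracted.pivot_table(
    index=["unit", "mode"],
    columns="shift",
    values=["production_bbl", "consumable_loss_usd"],
    aggfunc="sum",
)

# Power BI can connect to this CSV (or directly to the cloud source).
report.to_csv("refinery_losses_by_unit_mode_shift.csv")
print(report)
```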

FIGURE 10.5

Adding derived values from operational data.

FIGURE 10.6

Refinery digital plant template results extracted for advanced data analytics.

In the Power BI report template, several folders were added that describe aggregated KPI information, calculated by the EIDI's real-time analytics using the unit template. The report can display monthly energy consumption by production area, as shown in Figure 10.8, along with operating modes, historical production trends, and historical consumption trends (Bascur 2019). As shown here, results can be displayed and filtered by season of the year.

FIGURE 10.7

Microsoft Power BI dashboard folders to present yearly results.

FIGURE 10.8

Microsoft Power BI consumption analysis.

Using operations and production data together allows for better corporate reporting. Peter explained, "We need to ensure that orders are produced on time and in full while optimizing profitability, so that we improve our capacity and sequence planning. We can also improve refinery linear programming (LP), also known as linear optimization modeling, to get an accurate and timely projection of best yields and consumables."

Monica added: "These AI tools will assist us in analyzing the operations data in order to optimize refinery operations. In addition, we can model customer demand to improve production target settings according to our process capability. We can also do the following:

  • Model and manage our consumable resources (energy, water, raw materials).
  • Increase and optimize our production schedule through our online equipment health monitoring system.
  • Calculate and show material shortages, delays, quality issues, unscheduled asset downtimes, and operating cost variances in real time, so that operations people can make decisions quickly and avoid unnecessary costs and delays."
 