Data Analytics in Chemical Production and Maintenance
How the Chemical Industry Can Use Statistics to Create More Value More Quickly
In the era of big data and a growing interest in Internet of Things (IoT) and Industry 4.0, the pressure to innovate is especially strong in the process-enabling industries. As with many other disciplines, chemistry must evolve to take advantage of more data and the methods to distill that data into knowledge. But what is the best way for companies to navigate the sea of data points they measure every day?
Chemical industry innovator Stan Higgins is attuned to these trends and sees the opportunities for analytics to create more value. Higgins, retired CEO of the North East of England Process Industry Cluster (NEPIC), is currently a non-executive director at Industrial Technology Systems (ITS) as well as a senior adviser to Tradebe, a waste management and specialty chemical company, and to JMP, a provider of statistical analysis software. In January 2018, he was awarded an Officer of the Order of the British Empire (OBE) for his work promoting the UK’s chemical process manufacturing industry.
CHEManager asked Higgins why he argues the case for more sophisticated data analytics in process manufacturing and why he is convinced that process knowledge and robust methods affect both a company’s bottom line and its ability to serve its customers.
CHEManager: Mr. Higgins, innovation is one of the key factors for any organization to be successful. What drives innovation in the chemicals industry and its related sectors?
Stan Higgins: The drivers for innovation in the chemical and wider process industry have, until the current millennium, mostly been about profitability. Over the last 20 years there has been a change, with more and more focus on sustainability. This is compounded by society’s demand for a better environment and a cleaner future, as articulated in the “Grand Challenges” that we face as a species: how to house, feed and keep healthy so many more people, and how to provide them with energy, transport and healthcare. Climate change has also added a new dimension for those challenged to provide effective and innovative new solutions.
Which are the specific internal and external pressures to innovate in the process-enabling industries?
S. Higgins: Internally within the process industries, those responsible for innovation are driven to speed up research, development and the related decision-making processes. There is a cost element to this – keeping down R&D costs and development time – but the real driver is capturing more value by being first to market. Such developers need to be ever more responsive, particularly towards their downstream customers. They must quickly identify development routes, complete bench testing and scale-up, while responding to customer demands for performance information today rather than tomorrow. Such pressures can create internal tensions between business development and R&D managers.
At a more fundamental level: What are the challenges in chemical R&D, production and maintenance?
S. Higgins: R&D managers have to constantly balance resources between internal and external demands – supporting manufacturing to resolve problems with scale-up and the introduction of new products, while giving the business development team timely responses to customer demands and solutions to new opportunities.
In a free webinar on April 4, 2019, Stan Higgins will give an introduction to the challenges and opportunities associated with faster, more predictable process development for industries that need to innovate to remain competitive. The lecture will also present a JMP case study on the efficient optimization of a process for making a new product. Register here to attend the presentation.
In production the challenges are also never-ending. Maintaining quality production and maximizing output within health, safety, environmental and quality goals is the key. However, there are many variables: the quality and variability of raw materials, the control of many process parameters, the durability of processing equipment, waste and emissions management, the quality and performance of products made, their storage, packaging and delivery to the customer. Even then, there are issues that remain. How does the material supplied perform in the downstream customer’s processes, products and markets? Despite the supply of product from what looks like a perfect production run, if this material does not perform at the customer’s premises, then there will be a backlash that inevitably finds its way back to the production manager’s office.
Maintenance management is a specialist activity within the manufacturing operations. Significant manufacturing outages can occur due to the breakdown of relatively small pieces of equipment, and of course large bits of kit too. Modern maintenance practices are moving away from breakdown maintenance, which amounts to fixing the problems as they arise, to more predictive maintenance – identifying and fixing problems before they arise. This gives rise to planned actions and activities to minimize outage times.
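A simple illustration of the predictive idea – not any specific vendor’s method – is to watch a routinely logged equipment signal for sustained drift away from its healthy baseline and raise a maintenance flag well before breakdown. The sketch below uses Python with pandas; the sensor tag, thresholds and data are invented for illustration.

```python
# Minimal sketch: flag early drift in an equipment signal before failure.
# The tag name, threshold and data are illustrative, not a real maintenance model.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# One reading per hour from a hypothetical pump vibration sensor
t = pd.date_range("2019-03-01", periods=500, freq="h")
vibration = pd.Series(0.20 + rng.normal(0, 0.01, 500), index=t, name="vibration_mm_s")
vibration.iloc[400:] += np.linspace(0, 0.05, 100)   # slow drift towards failure

# Baseline statistics from a period known to be healthy
baseline = vibration.iloc[:300]

# Rolling 24-hour mean expressed as a deviation from the healthy baseline
z = (vibration.rolling(24).mean() - baseline.mean()) / baseline.std()

# Flag sustained deviation so maintenance can be planned, not reactive
alerts = z[z > 3]
print(f"first alert at {alerts.index[0]}" if len(alerts) else "no alert")
```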
In all these areas there is a growing recognition that data management is key to drive performance.
Which role does analytics play in R&D and innovation? How can software help companies to overcome the challenges mentioned above?
S. Higgins: New chemicals, intermediates, processes and formulations are being developed in these industries on a daily basis and getting them to market more quickly means more value could be realized earlier. The key challenges are always to minimize laboratory and pilot plant time whilst gaining the most information from the work undertaken. Establishing a new product or capturing sales because you have responded best to an enquiry or an opportunity is a crucial commercial driver. This is where analytics can have such a valuable impact on innovation workstreams.
In addition to quickly observing key process relationships that are statistically significant, advances in analytics software to aid in the “design of experiments” can help speed up decision making in chemical development. Such smart experimental design will quickly identify the parameters that need to be tested to prove their potential sensitivity. This ability to focus on what matters is key to limiting the amount of experimentation that has to be done to capture enough data to be meaningful. This level of understanding enables innovation leaders and R&D managers to be more precise about the time they will need to deliver their results.
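JMP’s design-of-experiments tools are interactive, but the underlying logic can be sketched in a few lines of code. The Python sketch below builds a small two-level factorial design and estimates main effects by least squares to show which factors matter; the factor names and simulated responses are invented for illustration and are not from any real study.

```python
# Minimal sketch of a two-level screening design and main-effects fit.
# Factor names and the simulated response are purely illustrative.
import numpy as np
from itertools import product

factors = ["temperature", "pressure", "catalyst_loading"]

# Full 2^3 factorial design, coded as -1 / +1
design = np.array(list(product([-1, 1], repeat=len(factors))), dtype=float)

# Simulated yields for the 8 runs; in practice these come from the lab
rng = np.random.default_rng(1)
true_effects = np.array([4.0, 0.2, 2.5])        # hidden "truth" for the demo
response = 70 + design @ true_effects + rng.normal(0, 0.5, len(design))

# Fit intercept + main effects by least squares
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, response, rcond=None)

for name, effect in zip(factors, coef[1:]):
    print(f"{name:>18}: estimated effect {effect:+.2f}")
```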
What are the opportunities for analytics in production and maintenance?
S. Higgins: There is complexity within the data produced in the modern factory and indeed it is easy to be lost in a sea of data. More recently the inability to use the information locked within such data has become known as the “hidden factory”.
There are many sources of data in manufacturing operations and just as much if not more in maintenance activities. Often, process equipment-based data is ignored because there can be too much of it, and spotting important relationships is difficult. These relationships can arise in the data from the equipment itself but also from the surrounding processes and operational data. Data is generated by sensors, measurement systems, process control systems and process inputs and outputs; complexity is compounded by such data being oftentimes collected with different periodicities.
Too many factories are producing a mountain of data that is never used. Companies are starting to realize – especially with initiatives like Industry 4.0 – that they’ve paid for all this data and need more return from it. It is an asset in which many hidden gems may lie. There may be real value hidden in how to improve the process, or how to make the product better or more efficient, but companies need the tools to release it.
How can companies extract valuable insight from the flood of manufacturing data they generate? Which tools are available?
S. Higgins: Many managers in industry will be uncomfortable with statistical analysis techniques. Most chemical factories will not have a professional statistician in the management team. There may be one at the head office, but that person probably isn’t analyzing the way that machines are working in a particular factory and relating the analysis to the product output, nor studying if there is any connectivity between the two.
With accessible and easy-to-use point-and-click software like JMP, the maintenance manager, the production manager and the R&D manager can do their own analysis. It is designed for use by engineers and scientists and doesn’t need input from a specialist at head office.
How can data be used to improve operations through data-driven monitoring and predictive modeling?
S. Higgins: The key to the use of statistical software is to have data normalized in some way – a date or time marker or a batch number; even in continuous processes, data can usually be associated with a periodicity. With a limited amount of effort, managers, scientists and engineers will be able to identify relationships and show real statistical evidence of those connections. With that knowledge they will be more encouraged and motivated to look into the depths of their data historians and make the effort to analyze the data within. Data can be imported from many formats. Understanding the valuable insights that can be gained with analytical tools should be enough of a driver for most to put in the effort to normalize their data, especially when it enables improved testing, monitoring and delivery of the performance of their processes, plant and equipment – giving them access to their hidden factory.
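As a rough illustration of what “normalized in some way” can mean in practice, the Python/pandas sketch below puts a minute-level sensor signal and four-hourly lab results onto a shared hourly time base so the two can be analyzed together. The tag names and values are made up.

```python
# Minimal sketch: aligning two data sources with different periodicities
# onto a common hourly time base. Tag names and values are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Fast sensor: one reading per minute
sensor = pd.DataFrame(
    {"reactor_temp": 180 + rng.normal(0, 1.5, 24 * 60)},
    index=pd.date_range("2019-04-01", periods=24 * 60, freq="min"),
)

# Lab quality results: one sample every four hours
lab = pd.DataFrame(
    {"purity_pct": 98 + rng.normal(0, 0.3, 6)},
    index=pd.date_range("2019-04-01", periods=6, freq="4h"),
)

# Normalize both to an hourly grid, then join on the shared timestamps
hourly = sensor.resample("1h").mean().join(lab.resample("1h").ffill())
print(hourly.head())
```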
Why is design of experiments critical if you want confidence in meeting development project milestones?
S. Higgins: There have been advances in the understanding of the statistics of the design of experiments. With quite a limited data set, analytical software can identify statistical relationships and determine the number of experiments needed to provide some certainty in the outcome. This enables R&D managers to be much more accurate in their planning and response times, reducing the tension between R&D and business development managers and improving internal and external customer relationships. Being responsive in this way is more likely to result in better value capture for the business concerned.
Can you give a specific example, e.g. a case study?
S. Higgins: There is an impressive case study from JMP from chemical development: A company wanted to bring to market a catalyst that could synthesize aliphatic polycarbonate polyols from waste carbon dioxide. Thirty-five factors were identified that might affect yield, polymerization rate and by-product formation, and the team could not see an efficient way forward to optimize the process.
In the past they had tried traditional experimental approaches – changing one factor at a time (OFAT) – with varying success. Sometimes they were lucky and identified a solution quickly; other times they were unlucky and had significant over-runs. They heaved a sigh of relief when a solution was found, but were never sure they had found the best, most robust or lowest-cost process to operate.
By using JMP Pro they were able to exploit their existing data using data mining to identify the top 10 factors out of the 35 that might be responsible for their KPI outcomes. Using these top factors, they defined an efficient data collection plan requiring 21 individual experiments.
After conducting these experiments, they were able to fit a predictive model, query it and present it graphically to gain insight and communicate that insight to key stakeholders, delivering the solution needed to successfully scale production to 7,500-liter capacity. The findings were at odds with simple kinetic theory, which also led to a better understanding of the reaction mechanism. In the end, rather than commercializing the catalyst themselves, they spun the business unit off to Saudi Aramco for $100 million.
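The screening step in this account is essentially a variable-importance exercise. Outside JMP Pro, the same general idea can be sketched, for example, with a tree-based importance ranking in Python/scikit-learn; the 35 simulated factors and the response below only illustrate the approach and have nothing to do with the company’s actual data.

```python
# Minimal sketch: ranking candidate factors by importance with a random forest.
# The 35 factors and the response are simulated; this only illustrates the idea.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n_runs, n_factors = 200, 35
X = rng.normal(size=(n_runs, n_factors))

# Pretend only a handful of factors really drive the yield
y = 3 * X[:, 2] - 2 * X[:, 7] + 1.5 * X[:, 19] + rng.normal(0, 0.5, n_runs)

model = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

# Keep the ten most influential factors for the follow-up designed experiment
top10 = np.argsort(model.feature_importances_)[::-1][:10]
for rank, idx in enumerate(top10, start=1):
    print(f"{rank:2d}. factor_{idx:02d}  importance={model.feature_importances_[idx]:.3f}")
```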
A lot of companies seem to have a problem with making sense of Industry 4.0 and linking product performance to the marketplace. Which are the biggest challenges companies face in adopting data analytics?
S. Higgins: The biggest issue is to recognize that most value will not come from making Industry 4.0 a company initiative. Most value will come from giving the analytics tools to managers, scientists and engineers at the laboratory, factory, office and shop-floor level, and ensuring that they have access – and the short amount of training needed – to become much more analytical and proficient in understanding their data.
There are some examples where customer data linked to manufacturing has helped deliver better solutions. This is where huge downstream benefits might accrue if customer and supplier relationships are deep enough to allow data exchange to improve outcomes for the whole supply chain.
How would you make the case to upper management for increased data analytics in the chemical industry?
S. Higgins: On the whole the chemical industry has been good at adopting improvement techniques – from understanding plant and process performance and adopting measures like Overall Equipment Effectiveness to using the techniques of Deming, such as Kaizen and Six Sigma. However, even within these methodologies there is still a tendency to let anecdote and force of personality prevail. Once the low-hanging fruit has been picked using these techniques, they sometimes become merely a means to manage and can lose momentum.
All those that use these techniques need greater statistical insight into the data that they generate. Data analytics can provide more accurate insight, really showing what is important to control. Sometimes more importantly they can reveal the many things that do not contribute to better outcomes, thereby allowing people to focus on the things that really matter. Furthermore, the modelling within the software will enable them to test their ideas before making real process changes. This has not been possible outside of the factory, laboratory or workshop until now.
How can companies create a culture that fosters data driven improvements?
S. Higgins: The key driver to make more returns from the data generated in R&D, manufacturing, maintenance and customer service is to put modern data analytical tools – that is software – into the hands – that is desktops – of the managers generating the data. Give them a little bit of training – JMP is point, click and drag just like Microsoft tools with which they will be familiar – then use the models created in team and management meetings. This will result in more statistically accurate insights rather than anecdote to solve problems and improve the business.
Personal Profile
Stan Higgins, retired CEO of the North East of England Process Industry Cluster (NEPIC), is currently a non-executive director at Industrial Technology Systems (ITS) as well as a senior adviser to Tradebe, a waste management and specialty chemical company, and to JMP, a provider of statistical analysis software.
In January 2018, Higgins was awarded an Officer of the Order of the British Empire (OBE) for his work promoting the UK’s chemical process manufacturing industry.
Contact
JMP Statistical Discovery / SAS Institute GmbH
In der Neckarhelle 162
69118 Heidelberg
Germany
+49 (0)6221 415 3367