Machine Learning Based Industry 4.0 Framework for Composite Autoclave Manufacturing

Artificial intelligence (AI), machine learning (ML), and data-driven techniques can be used to support and optimize the manufacturing of composite materials. The full model development lifecycle encompasses analysis, optimization, inverse problem-solving, and experimental validation. The Industry 4.0-inspired modeling pipeline tailored to composite processing combines both data-centric and model-centric knowledge engineering practices, with an emphasis on embedding domain expertise directly into the dataset curation and model development phases.

A high-level Industry 4.0 (I4.0) framework is composed of three interrelated sub-systems:

(a) Data acquisition from cyber-physical systems, which may include a combination of real-time production data and simulated outputs;
(b) A data pipeline that manages key preprocessing tasks such as data cleaning, dimensionality reduction, storage, and other logistical operations necessary to ensure data usability;
(c) A model development pipeline, which involves filtering incoming data, constructing relevant knowledge datasets, building and validating predictive models of the manufacturing system, and ultimately deploying these models to support real-time decision-making on the shop floor.

Focusing on sub-system (c), the proposed framework builds upon the CRISP-DM (Cross-Industry Standard Process for Data Mining) methodology, a widely adopted, open-standard model for data analytics. It provides a structured, iterative approach to data-driven problem-solving, making it especially suited to dynamic, evolving manufacturing environments.

[Figure: CRISP-DM-based model development pipeline]

A key element in data-driven modeling is the anomaly detection system, which ensures that incoming data aligns with the distribution of previously collected data. This validation step can be implemented using a variety of techniques, such as classification, clustering, deep learning, or statistical models. If the chosen method determines that a new data point fits within the existing distribution, it is accepted and incorporated into the dataset; otherwise, it is rejected. To enhance this process, domain expertise can be integrated through a dataset knowledgeability exercise, allowing expert judgment to refine the dataset and establish a feedback loop. This hybrid approach contrasts with fully automated, knowledge-agnostic systems and is seen as a vital component in advancing toward Industry 5.0. By enforcing consistency in data distribution, the assumption of independently and identically distributed (IID) data is upheld—an essential condition for making reliable comparisons between models trained on distinct datasets.
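The distribution check described above can be sketched with a simple per-feature z-score gate. This is only an illustrative stand-in for the anomaly detector (the article lists classification, clustering, deep learning, and statistical models as options); the `z_max` threshold and feature layout are assumptions, not values from the source.

```python
import statistics

def fits_distribution(history, candidate, z_max=3.0):
    """Accept a new data point only if every feature lies within z_max
    standard deviations of the historical mean (a minimal statistical
    stand-in for the anomaly detection system; z_max is illustrative)."""
    for i, value in enumerate(candidate):
        column = [row[i] for row in history]
        mu = statistics.fmean(column)
        sigma = statistics.stdev(column)
        if sigma == 0:
            if value != mu:          # no historical spread: exact match only
                return False
        elif abs(value - mu) / sigma > z_max:
            return False             # outlier on this feature: reject point
    return True
```

Accepted points would be appended to the dataset, preserving the approximately IID assumption the article highlights; rejected points would be routed to the expert-in-the-loop knowledgeability review rather than discarded silently.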

USE CASE:

Composite materials have seen broad adoption across industries such as aerospace, automotive, and construction, thanks to their exceptional structural properties—namely, high stiffness-to-weight and strength-to-weight ratios—along with reduced maintenance needs and lower lifecycle costs. Despite these benefits, the production of composite components remains challenged by significant levels of uncertainty, both aleatoric (inherent variability) and epistemic (lack of knowledge), which hinder the consistent manufacture of high-quality parts.

In the aerospace sector, where structural reliability is paramount due to the potentially catastrophic consequences of failure, stringent qualification frameworks exist to ensure the integrity of materials, processes, and designs. While necessary for safety, these frameworks place considerable demands on manufacturers and often limit flexibility in production—particularly when it comes to cost optimization and real-time decision-making on the factory floor (Crawford et al., 2021a).

One example of these challenges can be seen in the use of “bus stop” autoclave cure cycles, where multiple parts, tools, and materials are batched together and cured simultaneously. This method offers practical advantages—such as reduced floor space requirements, more efficient use of capital equipment, and lower overhead—but also increases the complexity of planning and process control. In such settings, shop floor engineers and operators are often required to make in-situ decisions, relying on their expertise and tacit knowledge to maintain process conformance and part quality.

However, these decisions are typically unstructured and lack systematic optimization, revealing an opportunity for the integration of intelligent, technology-enabled decision-support systems. The success of such systems depends not only on access to historical process data, but also on the incorporation of expert insights—bridging human experience with data-driven methods to improve repeatability and efficiency in composite manufacturing.

In bus-stop autoclave curing runs for manufacturing composite aerospace structures, multiple small components—each with similar physical characteristics and qualified to undergo the same cure cycle—are stacked together in a single autoclave to improve production efficiency. Despite the shared cure cycle, each part must maintain a thermal history that stays within an acceptable thermal envelope to ensure product quality.

To achieve a successful cure across all components, parts are carefully selected based on physical similarities, such as laminate thickness, construction type (e.g., monolithic vs. sandwich panels), tooling material, and other relevant attributes. Two critical features derived from each part’s thermal profile are used as indicators of process quality: the peak exotherm (the maximum amount by which the part temperature rises above the surrounding autoclave gas temperature as the curing reaction releases heat) and the steady-state lag (the largest amount by which the part temperature trails the autoclave gas temperature during heat-up).

Lower values of these thermal metrics generally signify reduced process variability, which correlates with higher-quality outcomes. If these thresholds are exceeded, parts may develop defects like voids, ultimately compromising mechanical properties such as flexural strength, flexural modulus, and interlaminar shear strength. Thus, peak exotherm and steady-state lag serve as essential acceptance criteria for screening cured parts.
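Extracting the two acceptance features from a thermal profile can be sketched as follows, assuming both metrics are measured relative to the autoclave gas temperature and that part and gas temperatures are sampled at the same time steps. The function name and data layout are hypothetical, not from the source.

```python
def thermal_features(part_temps, air_temps):
    """Peak exotherm and steady-state lag from paired part/autoclave-gas
    temperature readings (assumed definitions: exotherm = maximum
    part-over-gas excess, lag = maximum gas-over-part deficit)."""
    diffs = [p - a for p, a in zip(part_temps, air_temps)]
    peak_exotherm = max(diffs)                  # part hotter than the gas
    steady_state_lag = max(-d for d in diffs)   # part trailing the gas
    return peak_exotherm, steady_state_lag
```

For a profile where the part briefly trails the gas by 10 °C during heat-up and later overshoots it by 4 °C, this returns `(4, 10)`, which would pass both acceptance thresholds quoted below.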

In the context of bus-stop autoclave cure cycles, interpretable surrogate models such as Logistic Rule Regression are integrated with expert knowledge through a fuzzy scoring system, enabling an assessment of dataset “knowledgeability” prior to the deployment of black-box models. To further support model validation, two metrics are introduced: specificity as a global confidence indicator, and the novel Decision Boundary Crispness Score (DBSC) as a local, sensitivity-based metric.
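The global confidence indicator named above, specificity, is the standard confusion-matrix ratio TN / (TN + FP); a minimal sketch is shown below using the pass/fail label convention from this use case (the DBSC metric itself is specific to the paper and is not reproduced here).

```python
def specificity(y_true, y_pred):
    """Specificity = TN / (TN + FP): the fraction of truly failing
    (class 0) parts the model correctly flags as failures."""
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tn / (tn + fp) if (tn + fp) else float("nan")
```

High specificity matters in this setting because a false pass (a defective part accepted) is far more costly than a false fail.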

The modeling task at the factory level is framed as a binary classification problem, where the objective is to predict whether a carbon fiber prepreg part passes or fails the quality standards following autoclave curing. Specifically, the classification is based on two critical thermal processing criteria:

  1. Pass if the peak exotherm temperature is less than 5 °C (Criterion 1)
  2. Pass if the maximum lag temperature is less than 20 °C (Criterion 2)

Parts that meet both criteria are labeled as "pass" (class 1), while those that exceed either threshold are labeled as "fail" (class 0).
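The labeling rule above can be written as a one-line predicate; the function name and keyword defaults are illustrative, while the 5 °C and 20 °C thresholds come from the two criteria.

```python
def label_part(peak_exotherm_c, max_lag_c,
               exotherm_limit=5.0, lag_limit=20.0):
    """Binary quality label: 1 (pass) only if both thermal criteria
    are satisfied, otherwise 0 (fail)."""
    return int(peak_exotherm_c < exotherm_limit and max_lag_c < lag_limit)
```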

The model architecture, illustrated in the following figure, consists of two independent predictive models—one for each thermal outcome. This design allows for separate evaluation and parameter tuning tailored to each specific target, enabling more precise learning for each thermal metric.

[Figure: architecture of the two independent predictive models, one per thermal outcome]

Both models share an identical architecture, comprising five hidden layers with seven neurons per layer. This configuration was chosen through a parametric study, where it demonstrated the lowest error rate on the test dataset. Each neuron uses a sigmoid activation function, and model training is performed using the Adam optimizer with a binary cross-entropy loss function.
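The stated topology (five hidden layers of seven sigmoid neurons, plus a sigmoid output for the binary label) can be sketched as a plain-Python forward pass. This is only a structural illustration under assumed random weights; the actual models were trained with the Adam optimizer and binary cross-entropy loss, which is not reproduced here.

```python
import math
import random

def make_mlp(n_inputs, hidden=(7,) * 5, seed=0):
    """Random weights for a feed-forward net matching the stated topology:
    five hidden layers of seven neurons, one sigmoid output neuron.
    Each neuron stores its input weights followed by a bias term."""
    rng = random.Random(seed)
    sizes = [n_inputs, *hidden, 1]
    return [[[rng.uniform(-0.5, 0.5) for _ in range(sizes[i] + 1)]
             for _ in range(sizes[i + 1])]
            for i in range(len(sizes) - 1)]

def forward(layers, x):
    """Sigmoid activation at every neuron, as described in the text;
    returns a pass probability in (0, 1)."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    for layer in layers:
        x = [sigmoid(sum(w * v for w, v in zip(neuron, x)) + neuron[-1])
             for neuron in layer]
    return x[0]
```

In the paper's design, two such networks are trained independently, one predicting the exotherm criterion and one the lag criterion, so each can be tuned to its own target.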

The results demonstrate that DBSC offers a more nuanced and conservative evaluation of both dataset and model quality, especially useful when dealing with uncertain or variable manufacturing conditions. The enhanced explainability and localized insight provided by these methods are particularly valuable to production engineers, supporting trust and accountability when using black-box models in high-stakes, real-time decision-making scenarios.

Source: https://doi.org/10.1016/j.compind.2021.103510
