Handling a Product’s Big Data in PLM


If we look back at how PLM evolved from a simple engineering control system into an organizational practice that helps companies manage the complete lifecycle of a product, we can see that the amount of data associated with the product has kept increasing staggeringly. Over this evolution, as product complexity kept swelling, PLM software systems had to mature from simple data-management screens into more intelligent software that efficiently understands the product’s behavior.

Product data plays a dynamic role in creating a “Digital Thread”, and data from all stages of the product lifecycle is important. As more intelligent products are introduced, and with the growing scope of the industrial internet, much of this Big Data is created by smart sensors embedded within the product.

One interesting example of a product’s Big Data in an industrial context is the overwhelming data created by the Large Hadron Collider (LHC) in the hunt for “Particle-X”. Reported statistics state that the LHC generates around one petabyte of data per second; the good news, though, is that only around 25 petabytes of this data are stored annually. Visualizing a smaller-scale IoT scenario: if each sensor on a wind turbine is read 400 times a second, a single turbine could generate more than 10 GB of data per day.
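As a sanity check on that turbine figure, here is a quick back-of-envelope calculation. The 300-bytes-per-reading payload size is my own assumption (covering the value plus timestamp and metadata), not a figure from the text:

```python
# Back-of-envelope check of the wind-turbine figure above.
SAMPLES_PER_SECOND = 400        # reads per sensor per second (from the text)
SECONDS_PER_DAY = 24 * 60 * 60
BYTES_PER_READING = 300         # assumed payload incl. timestamp/metadata

samples_per_day = SAMPLES_PER_SECOND * SECONDS_PER_DAY
gb_per_day = samples_per_day * BYTES_PER_READING / 1e9
print(f"{samples_per_day:,} readings/day ≈ {gb_per_day:.1f} GB/day")
# → 34,560,000 readings/day ≈ 10.4 GB/day
```

Even a single 400 Hz sensor stream crosses the 10 GB/day mark under these assumptions, so a turbine with dozens of sensors lands well past it.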

Now, whatever the volume or structure of a product’s Big Data, it is evident that no human could read it all and make sense of it. It is important that a PLM system, or its supporting smart algorithms, mine this data for patterns, correlations, and any other valuable information that helps us understand the product’s behavior better.
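As a toy illustration of what such mining might look like, the sketch below flags statistical outliers in a hypothetical sensor stream. Real PLM analytics would use far richer models; the function name and readings here are illustrative assumptions:

```python
from statistics import mean, stdev

def find_anomalies(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the
    mean -- a minimal stand-in for the smarter pattern mining a PLM
    analytics layer would perform."""
    mu, sigma = mean(readings), stdev(readings)
    return [(i, x) for i, x in enumerate(readings)
            if abs(x - mu) > threshold * sigma]

# Hypothetical vibration readings from a machine sensor (units arbitrary):
vibration = [0.51, 0.49, 0.52, 0.50, 0.48, 2.75, 0.51, 0.50]
print(find_anomalies(vibration, threshold=2.0))
# → [(5, 2.75)]
```

The spike at index 5 is the kind of deviation a PLM system would surface for an engineer to investigate.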

Big Data in the Product Lifecycle

A product undergoes its lifecycle across three stages: Beginning-of-Life (BOL), Middle-of-Life (MOL), and End-of-Life (EOL).

“BOL” is the period during which a product idea is born; the idea is realized virtually and the product is physically manufactured before it is used by customers. During the “MOL” phase, the manufactured product is put to use and is maintained and serviced periodically. Finally, during “EOL”, the product reaches its dusk, when it is recycled or disposed of by customers.

The amount of data generated by the product during each of these phases is huge, and a PLM system is of great significance in handling it. To achieve better performance in PLM, figuring out what kinds of data are involved in each of the product’s lifecycle phases is a mandatory task before proposing any methods for mining this Big Data for advanced analytics.

Big Data during “BOL”

In general, a product’s life begins with its design. The design itself is generated from a specific customer requirement or, more often, from market analysis. In a modern “Smart Factory” setup, a large amount of data is generated by the numerous sensors on the shop floor that continually monitor environmental parameters, the manufacturing equipment, and the finished goods themselves. In most cases, this data changes dramatically over the manufacturing cycle. In addition, various item routing data, such as assembly instructions, product history data, and inventory status, are generated during the manufacturing cycle. A PLM system with an efficient “Digital Thread” setup should be able to support the staggering data generated during this phase, either independently or through smart integration with other planning and management systems such as ERP.
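One way to picture a digital-thread entry is as a record linking these BOL data sources together. The sketch below is a minimal illustration; the field names are my assumptions, not a standard PLM schema:

```python
from dataclasses import dataclass, field

@dataclass
class ThreadRecord:
    """Illustrative 'digital thread' entry tying together design,
    routing, and shop-floor data for one item (assumed schema)."""
    item_id: str
    design_revision: str
    routing_step: str                      # e.g. an assembly instruction ref
    shop_floor_readings: dict = field(default_factory=dict)
    inventory_status: str = "unknown"

# Hypothetical record captured during a manufacturing operation:
rec = ThreadRecord("TURBINE-0042", "Rev C", "OP-30 blade assembly",
                   {"ambient_temp_c": 22.5, "torque_nm": 118.0})
print(rec.item_id, rec.routing_step)
```

Linking records like this across systems (PLM, MES, ERP) is what lets the thread be traversed later, from a field failure back to the exact shop-floor conditions.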

Big Data during “MOL”

As stated in the previous article, a product’s behavior is best understood when the product is actually used or consumed. In the middle of the product lifecycle, IoT devices send a large amount of information back to the Big Data repository. This usage information needs to be monitored and stored to provide maintenance guidance, or to give engineering designers feedback on any deviations.

Outdated breakdown maintenance is no longer efficient; real-time product status information enables predictive maintenance instead. To achieve such an ecosystem, the PLM system should enable manufacturers to trace back in time for a better understanding of the product design: what changes were made during manufacturing, why they were made, and so on.
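The shift from breakdown to predictive maintenance can be sketched with a deliberately simple condition-based rule. The parameter names, window, and threshold here are illustrative assumptions, not any vendor’s API; a production system would use models trained on fleet-wide MOL data:

```python
def needs_maintenance(history, window=5, limit=80.0):
    """Condition-based check: schedule maintenance when the recent
    average of a monitored parameter (e.g. bearing temperature, °C)
    drifts above a limit, instead of waiting for a breakdown."""
    if len(history) < window:
        return False                 # not enough data to judge yet
    recent = history[-window:]
    return sum(recent) / window > limit

# Hypothetical temperature stream sent back by an IoT gateway:
temps = [71, 72, 74, 79, 83, 85, 88]
print(needs_maintenance(temps))      # recent average 81.8 → True
```

The point is the direction of the decision: the data flags the service visit before the failure, rather than the failure triggering the service visit.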

Big Data during “EOL”

The decision on how a product should be handled when it reaches its end of life is made based on the current product-status information, its maintenance history, and other environmental parameters. These analytics draw on the Big Data generated during the product’s “MOL” period. A product can be disposed of at sunset, recycled, or re-engineered, depending on the results of advanced analytics methods. A product’s Big Data plays a key role in providing cognitive insights to end users, predicting the remaining lifetime of the individual parts or components used to manufacture the product and supporting those decisions.
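A minimal sketch of remaining-lifetime prediction, assuming a simple linear wear trend; real EOL analytics would use statistical or machine-learning degradation models, and the indicator values here are hypothetical:

```python
def remaining_useful_life(degradation, failure_level=1.0):
    """Estimate remaining duty cycles by linear extrapolation of a
    degradation indicator (0 = new, failure_level = worn out).
    A deliberately simple sketch of EOL/RUL analytics."""
    n = len(degradation)
    if n < 2:
        return None                  # need at least two points for a trend
    rate = (degradation[-1] - degradation[0]) / (n - 1)  # wear per cycle
    if rate <= 0:
        return float("inf")          # no measurable wear trend
    return (failure_level - degradation[-1]) / rate

# Hypothetical wear indicator sampled once per duty cycle:
wear = [0.10, 0.18, 0.26, 0.34, 0.42]
print(remaining_useful_life(wear))   # wear ≈ 0.08/cycle → ≈ 7.25 cycles left
```

An estimate like this is one input to the dispose/recycle/re-engineer decision: components with substantial remaining life are candidates for reuse rather than disposal.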

To conclude, the concept of “Big Data” is not new, and the use of Big Data will become a competitive strategy for manufacturing companies. Though Big Data techniques applied in PLM face various challenges, such as data collection, data storage, data transfer, data processing, and data security, a good PLM software system should take all these factors into consideration and overcome these challenges to enable SMART product lifecycle management.



About the Author: SathishSathiya
