Catalogue data as of November 15, 2024



ISBN 9783843954297

45.00 € incl. VAT, plus shipping


978-3-8439-5429-7, Informatik series

Sebastian Schwendemann
Transfer Learning for Predictive Maintenance Solutions

210 pages, doctoral dissertation, Technische Universität Clausthal (2023), softcover, A5

Abstract

This thesis is primarily concerned with transfer learning for predictive maintenance, specifically for fault classification and remaining useful life (RUL) estimation. It first presents state-of-the-art machine learning techniques, focusing on those applicable to predictive maintenance tasks. This is followed by the machine tool background and a review of current research that applies these techniques to predictive maintenance. One novelty of the thesis is a new intermediate domain that represents the data by concentrating on the relevant information, so that the data can be transferred between domains without losing that information. The proposed solution is optimized for rotating elements: the intermediate domain builds separate layers around the fault frequencies of the rotating elements. Another novelty is a transfer learning-based fault classification approach for different component types under different process conditions. It applies a convolutional neural network (CNN) to the intermediate domain. In addition, a novel transfer learning loss function, called layered maximum mean discrepancy (LMMD), is introduced. It is based on the maximum mean discrepancy (MMD) and extends it by taking the intermediate domain layers into account.
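The abstract does not spell out the loss function. As a rough orientation only: the standard kernel-embedding form of the MMD between source features X_s and target features X_t, and one plausible layered extension consistent with the description above (a weighted sum of per-layer MMD terms over the intermediate-domain layers; the weights w_l and the per-layer split are assumptions, not the definition from the thesis), would read:

\mathrm{MMD}^2(X_s, X_t) = \Bigl\| \tfrac{1}{n_s} \sum_{i=1}^{n_s} \phi(x_i^{s}) - \tfrac{1}{n_t} \sum_{j=1}^{n_t} \phi(x_j^{t}) \Bigr\|_{\mathcal{H}}^{2}

\mathrm{LMMD}^2(X_s, X_t) = \sum_{l=1}^{L} w_l \, \mathrm{MMD}^2\bigl(X_s^{(l)}, X_t^{(l)}\bigr)

where \phi is the kernel feature map, \mathcal{H} the corresponding reproducing kernel Hilbert space, and X^{(l)} the part of the representation belonging to intermediate-domain layer l.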

A further novelty is a transfer learning approach for RUL estimation across different component types that relies on data from accelerometers with low sampling rates. It reuses the feature extraction concepts of the classification approach; the extracted features then serve as input to a long short-term memory (LSTM) network. The transfer learning follows the fixed feature extraction scheme: the trained convolutional layers are carried over unchanged, and only the LSTM network has to be retrained. The intermediate domain supports this type of transfer, as its representation is similar across component types. It also makes the use of accelerometers with low sampling rates practical during transfer learning, which is a novelty in itself.
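The abstract describes this pipeline only at a high level. The sketch below illustrates fixed feature extraction as described (freeze the convolutional layers trained on the source component type, retrain only the LSTM and the regression head); the class names, layer sizes, number of intermediate-domain channels, and the use of PyTorch are assumptions made for illustration, not the implementation from the thesis.

# Minimal sketch (not the thesis code): fixed-feature-extraction transfer for RUL
# estimation, assuming PyTorch and illustrative layer sizes / names.
import torch
import torch.nn as nn

class ConvFeatureExtractor(nn.Module):
    # 1-D CNN over the layered intermediate-domain representation (illustrative).
    def __init__(self, in_channels, feat_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, feat_dim, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))
    def forward(self, x):                 # x: (batch, channels, samples)
        return self.net(x).squeeze(-1)    # -> (batch, feat_dim)

class RULEstimator(nn.Module):
    # Frozen CNN features fed to an LSTM that regresses the remaining useful life.
    def __init__(self, extractor, feat_dim=64, hidden=32):
        super().__init__()
        self.extractor = extractor
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)
    def forward(self, windows):           # windows: (batch, time, channels, samples)
        b, t, c, s = windows.shape
        feats = self.extractor(windows.reshape(b * t, c, s)).reshape(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])      # RUL predicted from the last time step

# Transfer step: reuse the convolutional layers trained on the source component type
# and retrain only the LSTM and regression head on the target component's data.
extractor = ConvFeatureExtractor(in_channels=4)    # e.g. one channel per domain layer
for p in extractor.parameters():
    p.requires_grad = False                        # fixed feature extraction
model = RULEstimator(extractor)
optimizer = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=1e-3)
loss_fn = nn.MSELoss()                             # a typical choice for RUL regression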