Self-supervised Learning for Efficient Remaining Useful Life Prediction
Söderkvist Vermelin, Wilhelm. RISE Research Institutes of Sweden, Materials and Production, Product Realisation Methodology; Mälardalen University, Sweden. ORCID iD: 0000-0002-1262-9143
Lövberg, Andreas. RISE Research Institutes of Sweden, Digital Systems, Smart Hardware. ORCID iD: 0000-0002-9505-0822
Kyprianidis, Konstantinos. Mälardalen University, Sweden (Division of Sustainable Energy Systems). ORCID iD: 0000-0002-8466-356X
2022 (English). In: Proceedings of the Annual Conference of the PHM Society 2022, Vol. 14, No. 1 / [ed] Chetan Kulkarni and Abhinav Saxena, 2022, article id 3222. Conference paper, Published paper (Refereed).
Abstract [en]

Canonical deep learning-based Remaining Useful Life (RUL) prediction relies on supervised learning methods, which in turn require large data sets of run-to-failure data to ensure model performance. In a large class of cases, run-to-failure data is difficult to collect in practice, as it may be expensive and unsafe to operate assets until failure. As such, there is a need to leverage data that are not run-to-failure but may still contain some measurable, and thus learnable, degradation signal. In this paper, we propose utilizing self-supervised learning as a pretraining step to learn representations of the data which will enable efficient training on the downstream task of RUL prediction. The self-supervised learning task chosen is time series sequence ordering, a task that involves constructing tuples, each consisting of $n$ sequences sampled from the time series and reordered with some probability $p$. Subsequently, a classifier is trained on the resulting binary classification task: distinguishing between correctly ordered and shuffled tuples. The classifier's weights are then transferred to the RUL model and fine-tuned using run-to-failure data. We show that the proposed self-supervised learning scheme can retain performance when training on a fraction of the full data set. In addition, we show indications that self-supervised learning as a pretraining step can enhance the performance of the model even when training on the full run-to-failure data set. To conduct our experiments, we use a data set of simulated run-to-failure turbofan jet engines.
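A minimal sketch of the sequence-ordering pretext task described in the abstract is given below. This is not the authors' implementation; the function name, window length, tuple size n, and shuffle probability p are hypothetical choices used only to illustrate how ordered and shuffled tuples with binary labels might be constructed from one run of multivariate sensor data shaped (time, features).

```python
import numpy as np

def make_ordering_examples(series, n=3, seq_len=30, p=0.5, rng=None):
    """Build pretext-task examples for time series sequence ordering.

    series : array of shape (time, features), one run of sensor data.
    n      : number of consecutive sub-sequences per tuple.
    seq_len: length of each sub-sequence.
    p      : probability that a tuple is shuffled (label 1) rather
             than kept in its original temporal order (label 0).
    """
    if rng is None:
        rng = np.random.default_rng()
    tuples, labels = [], []
    max_start = series.shape[0] - n * seq_len
    for start in range(0, max_start, n * seq_len):
        # Slice n consecutive, non-overlapping sub-sequences.
        segs = [series[start + i * seq_len : start + (i + 1) * seq_len]
                for i in range(n)]
        if rng.random() < p:
            # Permute until the order actually changes, label as "shuffled".
            order = rng.permutation(n)
            while np.array_equal(order, np.arange(n)):
                order = rng.permutation(n)
            segs = [segs[i] for i in order]
            labels.append(1)
        else:
            labels.append(0)           # correctly ordered tuple
        tuples.append(np.stack(segs))  # shape (n, seq_len, features)
    return np.stack(tuples), np.array(labels)

# Example: 200 time steps of 14 simulated sensor channels.
x = np.random.randn(200, 14)
X, y = make_ordering_examples(x, n=3, seq_len=20, p=0.5)
print(X.shape, y.shape)  # (3, 3, 20, 14) (3,)
```

A binary classifier trained on (X, y) would then, per the abstract, have its weights transferred to the RUL model before fine-tuning on the available run-to-failure data.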

Place, publisher, year, edition, pages
2022. Vol. 14, article id 3222
Keywords [en]
self-supervised learning, self-supervised, deep learning, machine learning, remaining useful life, unsupervised learning, neural networks
National Category
Reliability and Maintenance
Identifiers
URN: urn:nbn:se:ri:diva-61136. DOI: 10.36001/phmconf.2022.v14i1.3222. OAI: oai:DiVA.org:ri-61136. DiVA, id: diva2:1709580
Conference
14th Annual Conference of the Prognostics and Health Management Society
Available from: 2022-11-09. Created: 2022-11-09. Last updated: 2024-04-16. Bibliographically approved.

Open Access in DiVA

fulltext (1600 kB), 143 downloads
File information
File name: FULLTEXT01.pdf. File size: 1600 kB. Checksum: SHA-512
2050b03c0073094c7c21cf1308f0059a4569b1c6f138449940055872295417966c3d20b8bdc3afdbe612e9e240beda44c039001d5adf0ce6c97ca6d1f247dd93
Type: fulltext. Mimetype: application/pdf

Other links

Publisher's full text

Authority records

Söderkvist Vermelin, Wilhelm; Lövberg, Andreas

