Extracting Invariant Features for Predicting State of Health of Batteries in Hybrid Energy Buses
Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research. ORCID iD: 0000-0002-6040-2269
Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research. ORCID iD: 0000-0002-3034-6630
Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research. ORCID iD: 0000-0003-3272-4145
Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS), CAISR - Center for Applied Intelligent Systems Research. ORCID iD: 0000-0002-0051-0954
2021 (English). In: 2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), Porto, Portugal, 6-9 Oct., 2021, IEEE, 2021, p. 1-6. Conference paper, Published paper (Refereed)
Abstract [en]

Batteries are a safety-critical component and the most expensive one in electric vehicles (EVs). To ensure the reliability of EVs in operation, it is crucial to monitor the state of health of these batteries. Monitoring their deterioration is also relevant to the sustainability of transport solutions, as it enables efficient strategies for utilizing the remaining capacity of a battery and its second life. Electric buses, like other EVs, come in many different variants, with different configurations and operating conditions. Developing a new degradation model for every combination of settings is challenging for several reasons: failure data are unavailable for novel settings, the data are heterogeneous, little data exist for less popular configurations, and engineering knowledge is often insufficient. Therefore, being able to automatically transfer a machine learning model to new settings is crucial. More concretely, the aim of this work is to extract features that are invariant across different settings.

In this study, we propose an evolutionary method, the genetic algorithm for domain invariant features (GADIF), which selects a set of features for training machine learning models so as to maximize invariance across different settings. A genetic algorithm, in which each chromosome is a binary vector encoding which features are selected, is equipped with a fitness function that combines task performance with a measure of domain shift. We compare the performance of our method, when migrating to unseen domains, against a number of classical feature selection methods that use no transfer learning mechanism. In the experimental section, we also analyze how different features are selected under different settings. The results show that using invariant features leads to better generalization of the machine learning models to an unseen domain.
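
To make the description above concrete, here is a minimal, self-contained sketch of the kind of search the abstract outlines: a genetic algorithm whose chromosomes are binary feature masks and whose fitness rewards source-domain task performance while penalizing distribution shift between source and target on the selected features. The fitness weighting, the mean-difference shift proxy, the classifier, and all hyper-parameters are illustrative assumptions for this sketch, not the exact GADIF formulation from the paper.

```python
# Hedged sketch of a binary-chromosome GA for domain-invariant feature selection.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def domain_shift(Xs, Xt):
    """Simple proxy for domain shift: mean absolute difference of feature means."""
    return float(np.mean(np.abs(Xs.mean(axis=0) - Xt.mean(axis=0))))

def fitness(mask, Xs, ys, Xt, alpha=1.0):
    """Source-domain task performance minus a domain-shift penalty."""
    if mask.sum() == 0:
        return -np.inf
    Xs_sel, Xt_sel = Xs[:, mask], Xt[:, mask]
    perf = cross_val_score(LogisticRegression(max_iter=1000), Xs_sel, ys, cv=3).mean()
    return perf - alpha * domain_shift(Xs_sel, Xt_sel)

def gadif_like_search(Xs, ys, Xt, pop_size=20, generations=30, p_mut=0.05):
    n_feat = Xs.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_feat)).astype(bool)
    for _ in range(generations):
        scores = np.array([fitness(ind, Xs, ys, Xt) for ind in pop])
        # Tournament selection of parents
        parents = pop[[max(rng.choice(pop_size, 2), key=lambda i: scores[i])
                       for _ in range(pop_size)]]
        # One-point crossover between consecutive parents
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            cut = rng.integers(1, n_feat)
            children[i, cut:], children[i + 1, cut:] = (
                parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        # Bit-flip mutation
        children ^= rng.random(children.shape) < p_mut
        pop = children
    scores = np.array([fitness(ind, Xs, ys, Xt) for ind in pop])
    return pop[int(np.argmax(scores))]

# Toy usage with synthetic source/target domains (last 5 features shifted in the target)
Xs = rng.normal(size=(200, 10))
ys = (Xs[:, 0] + Xs[:, 1] > 0).astype(int)
Xt = Xs + np.concatenate([np.zeros(5), rng.normal(2, 1, 5)])
best_mask = gadif_like_search(Xs, ys, Xt)
print("Selected features:", np.flatnonzero(best_mask))
```

In this toy run, the invariance penalty should push the search away from the artificially shifted features, which is the behaviour such a fitness function is meant to encourage.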

Place, publisher, year, edition, pages
IEEE, 2021. p. 1-6
Keywords [en]
State of Health Estimation, Remaining Useful Life Prediction, Invariant Features, Lithium-ion Battery, Transfer Learning, Electric vehicles, Predictive maintenance
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:hh:diva-45895
DOI: 10.1109/DSAA53316.2021.9564184
ISI: 000783799800049
Scopus ID: 2-s2.0-85126144193
OAI: oai:DiVA.org:hh-45895
DiVA, id: diva2:1612184
Conference
2021 IEEE 8th International Conference on Data Science and Advanced Analytics (DSAA), Porto, Portugal, 6-9 Oct., 2021
Funder
Vinnova
Note

As manuscript in thesis

Available from: 2021-11-17 Created: 2021-11-17 Last updated: 2024-01-24 Bibliographically approved
In thesis
1. Evolving intelligence: Overcoming challenges for Evolutionary Deep Learning
2024 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Deep Learning (DL) has achieved remarkable results in both academic and industrial fields over the last few years. However, DL models are often hard to design and require careful selection of features and tuning of hyper-parameters to achieve high performance. These selections are tedious for human experts and require substantial time and resources, a difficulty that has encouraged a growing number of researchers to use Evolutionary Computation (EC) algorithms to optimize Deep Neural Networks (DNNs); this research branch is called Evolutionary Deep Learning (EDL).

This thesis is a two-fold exploration within the domain of EDL and, more broadly, Evolutionary Machine Learning (EML). The first goal is to make EDL/EML algorithms more practical by reducing the high computational cost associated with EC methods. In particular, we propose methods that alleviate the computational burden using approximate models. We show that surrogate models can speed up EC methods by a factor of three without compromising the quality of the final solutions. Our surrogate-assisted approach allows EC methods to scale better to both expensive learning algorithms and large datasets with over 100K instances. Our second objective is to leverage EC methods to advance our understanding of Deep Neural Network (DNN) design. We identify a knowledge gap in DL algorithms and introduce an EC algorithm designed precisely to optimize this uncharted aspect of DL design. Our analytical focus is on revealing novel concepts and acquiring new insights. In our study of randomness techniques in DNNs, we offer insights into the design and training of more robust and generalizable neural networks. In another study, we also propose a novel survival regression loss function discovered through evolutionary search.
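
The surrogate-assisted idea summarized above can be illustrated with a short sketch under assumed details: a cheap regression model (here a random forest, purely an assumption) is trained on individuals that have already been evaluated with the expensive fitness and is then used to pre-screen each new generation, so that only the most promising candidates incur the exact evaluation. This shows the general pattern rather than the thesis's actual algorithm; the toy fitness, surrogate choice, screening ratio, and variation operators are all placeholders.

```python
# Hedged sketch: surrogate-assisted evaluation inside an evolutionary loop.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def expensive_fitness(x):
    """Stand-in for a costly evaluation, e.g. training a deep neural network."""
    return -np.sum((x - 0.5) ** 2)

def surrogate_assisted_search(dim=20, pop_size=30, generations=20, evaluate_top=10):
    archive_x, archive_y = [], []          # individuals evaluated exactly so far
    pop = rng.random((pop_size, dim))
    for _ in range(generations):
        if len(archive_y) >= pop_size:
            # Rank offspring with the surrogate; evaluate only the best few exactly.
            surrogate = RandomForestRegressor(n_estimators=50, random_state=0)
            surrogate.fit(np.array(archive_x), np.array(archive_y))
            predicted = surrogate.predict(pop)
            chosen = np.argsort(predicted)[-evaluate_top:]
        else:
            chosen = np.arange(pop_size)   # no surrogate yet: evaluate everyone
        for i in chosen:
            archive_x.append(pop[i])
            archive_y.append(expensive_fitness(pop[i]))
        # Simple variation: mutate the best archived solutions to form the next population.
        best = np.array(archive_x)[np.argsort(archive_y)[-pop_size:]]
        pop = np.clip(best + rng.normal(0, 0.1, best.shape), 0, 1)
    return archive_x[int(np.argmax(archive_y))]

print(surrogate_assisted_search().round(2))
```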

Place, publisher, year, edition, pages
Halmstad: Halmstad University Press, 2024. p. 32
Series
Halmstad University Dissertations ; 109
Keywords
neural networks, evolutionary deep learning, evolutionary machine learning, feature selection, hyperparameter optimization, evolutionary computation, particle swarm optimization, genetic algorithm
National Category
Computer Systems; Signal Processing
Identifiers
urn:nbn:se:hh:diva-52469 (URN)
978-91-89587-31-1 (ISBN)
978-91-89587-32-8 (ISBN)
Public defence
2024-02-16, Wigforss, Kristian IV:s väg 3, Halmstad, 08:00 (English)
Opponent
Supervisors
Available from: 2024-01-24 Created: 2024-01-24 Last updated: 2024-03-07

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Altarabichi, Mohammed Ghaith; Fan, Yuantao; Pashami, Sepideh; Sheikholharam Mashhadi, Peyman; Nowaczyk, Sławomir
