Machines Do Not Have Little Gray Cells: Analysing Catastrophic Forgetting in Cross-Domain Intrusion Detection Systems
Halmstad University, School of Information Technology.
2023 (English). Independent thesis, Advanced level (degree of Master (One Year)), 10 credits / 15 HE credits. Student thesis.
Alternative title: Machines Do Not Have Little Gray Cells: Analysing Catastrophic Forgetting in Cross-Domain Intrusion Detection Systems (English)
Abstract [en]

Cross-domain intrusion detection, a critical component of cybersecurity, involves evaluating the performance of neural networks across diverse datasets or databases. The ability of intrusion detection systems to effectively adapt to new threats and data sources is paramount for safeguarding networks and sensitive information. This research delves into the intricate world of cross-domain intrusion detection, where neural networks must demonstrate their versatility and adaptability. The results of our experiments expose a significant challenge: the phenomenon known as catastrophic forgetting. This is the tendency of neural networks to forget previously acquired knowledge when exposed to new information. In the context of intrusion detection, it means that as models are sequentially trained on different intrusion detection datasets, their performance on earlier datasets degrades drastically. This degradation poses a substantial threat to the reliability of intrusion detection systems. In response to this challenge, this research investigates potential solutions to mitigate the effects of catastrophic forgetting. We propose the application of continual learning techniques as a means to address this problem. Specifically, we explore the Elastic Weight Consolidation (EWC) algorithm as an example of preserving previously learned knowledge while allowing the model to adapt to new intrusion detection tasks. By examining the performance of neural networks on various intrusion detection datasets, we aim to shed light on the practical implications of catastrophic forgetting and the potential benefits of adopting EWC as a memory-preserving technique. This research underscores the importance of addressing catastrophic forgetting in cross-domain intrusion detection systems. It provides a stepping stone for future endeavours in enhancing multi-task learning and adaptability within the critical domain of intrusion detection, ultimately contributing to the ongoing efforts to fortify cybersecurity defences.
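The EWC idea described in the abstract — anchoring parameters that mattered for an earlier task with a Fisher-weighted quadratic penalty while training on a new one — can be illustrated with a minimal sketch. This is not the thesis's implementation: the two one-dimensional quadratic "task" losses, the unit Fisher weight, and the penalty strength lambda are all assumptions made purely for illustration.

```python
import numpy as np

def ewc_penalty_grad(theta, theta_star, fisher, lam):
    """Gradient of the EWC penalty (lam/2) * sum_i F_i * (theta_i - theta_star_i)^2."""
    return lam * fisher * (theta - theta_star)

# Two toy 1-D "tasks" standing in for two intrusion-detection datasets:
# task A's loss (theta - 2)^2 has its optimum at +2, task B's at -2.
def grad_task_a(theta):
    return 2.0 * (theta - 2.0)

def grad_task_b(theta):
    return 2.0 * (theta + 2.0)

theta = np.array([0.0])
for _ in range(500):                  # plain gradient descent on task A
    theta -= 0.1 * grad_task_a(theta)
theta_star = theta.copy()             # remembered task-A optimum (about +2)

# In real EWC the diagonal Fisher information is estimated from squared
# log-likelihood gradients on task-A data; a unit weight is assumed here.
fisher = np.array([1.0])
lam = 5.0                             # assumed penalty strength

for _ in range(500):                  # training on task B with the EWC anchor
    theta -= 0.1 * (grad_task_b(theta)
                    + ewc_penalty_grad(theta, theta_star, fisher, lam))

# theta now settles between the two optima instead of snapping to task B's
# optimum at -2: knowledge of task A is partly retained.
```

Without the penalty, training on task B would drive theta all the way to -2 (forgetting task A entirely); with it, the final value is a Fisher-weighted compromise between the two optima, which is the mechanism EWC uses to mitigate catastrophic forgetting.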

Place, publisher, year, edition, pages
2023, p. 39
Keywords [en]
Catastrophic Forgetting, Intrusion Detection Systems, Continual Learning
National Category
Computer Engineering
Identifiers
URN: urn:nbn:se:hh:diva-51842
OAI: oai:DiVA.org:hh-51842
DiVA id: diva2:1806232
Subject / course
Digital Forensics
Educational program
Master's Programme in Network Forensics, 60 credits
Presentation
2023-09-25, Halmstad University, Kristian IV:s väg 3, Halmstad, 01:45 (English)
Available from: 2023-10-23. Created: 2023-10-20. Last updated: 2023-10-23. Bibliographically approved.

Open Access in DiVA

fulltext (1003 kB), 142 downloads
File information
File name: FULLTEXT02.pdf
File size: 1003 kB
Checksum (SHA-512): 76d6e5c16b1ef9cea671201e16496aad6d023ce8b2009758fe857a1ef5a7c48cf17afaf3d3389534503ba4e0e2f88cee4fe71f99bea4ada41aae96a384bdf8c7
Type: fulltext
Mimetype: application/pdf

