Class Incremental Deep Learning: A Computational Scheme to Avoid Catastrophic Forgetting in Domain Generation Algorithm Multiclass Classification

Full metadata record
Metadata field: Value
Author(s): dc.contributor: Universidade Estadual Paulista (UNESP)
Author(s): dc.creator: Gregório, João Rafael
Author(s): dc.creator: Cansian, Adriano Mauro
Author(s): dc.creator: Neves, Leandro Alves
Acceptance date: dc.date.accessioned: 2025-08-21T21:13:29Z
Availability date: dc.date.available: 2025-08-21T21:13:29Z
Submission date: dc.date.issued: 2025-04-29
Submission date: dc.date.issued: 2024-08-01
Full source of the material: dc.identifier: http://dx.doi.org/10.3390/app14167244
Full source of the material: dc.identifier: https://hdl.handle.net/11449/306709
Source: dc.identifier.uri: http://educapes.capes.gov.br/handle/11449/306709
Description: dc.description: Domain Generation Algorithms (DGAs) are present in most malware used by botnets and advanced persistent threats. These algorithms dynamically generate domain names to maintain and obfuscate communication between the infected device and the attacker's command and control server. Since DGAs are used by many threats, it is extremely important to classify a given DGA according to the threat it is related to. In addition, as new threats emerge daily, classifier models tend to become obsolete over time. Deep neural networks tend to lose their classification ability when retrained on a dataset that differs significantly from the initial one, a phenomenon known as catastrophic forgetting. This work presents a computational scheme that combines a deep learning model based on a CNN and natural language processing with an incremental learning technique for class increment through transfer learning. The scheme classifies 60 DGA families and adds a new family to the classifier model by training it incrementally with a few examples from the known families, avoiding catastrophic forgetting and maintaining metric levels. The proposed methodology achieved an average precision of 86.75%, an average recall of 83.06%, and an average F1 score of 83.78% on the full dataset, and suffered minimal losses when applying the class increment. (A hedged code sketch of the class-increment step follows this record.)
Description: dc.description: Department of Computer Science and Statistics (DCCE), São Paulo State University (UNESP), São Paulo
Language: dc.language: en
Relation: dc.relation: Applied Sciences (Switzerland)
Source: dc.source: Scopus
Keywords: dc.subject: botnets
Keywords: dc.subject: cybersecurity
Keywords: dc.subject: deep learning
Keywords: dc.subject: DGA
Keywords: dc.subject: incremental learning
Keywords: dc.subject: multiclass classification
Title: dc.title: Class Incremental Deep Learning: A Computational Scheme to Avoid Catastrophic Forgetting in Domain Generation Algorithm Multiclass Classification
File type: dc.type: digital book
Appears in collections: Repositório Institucional - Unesp

There are no files associated with this item.
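The description above outlines a character-level CNN classifier for DGA families that is extended with a new class through transfer learning, replaying a few examples from the known families to avoid catastrophic forgetting. The sketch below illustrates only that class-increment step. It is a minimal illustration assuming a PyTorch implementation; the vocabulary size, layer sizes, and exemplar-replay details are placeholders, not values taken from the paper.

```python
# Minimal sketch of class-incremental DGA classification via transfer
# learning. Layer sizes, the vocabulary size, and the exemplar-replay
# strategy are illustrative assumptions, not values from the paper.
import torch
import torch.nn as nn

VOCAB_SIZE = 40      # assumed character vocabulary (a-z, 0-9, '-', '.', padding)
MAX_LEN = 63         # assumed maximum domain-name length


class DgaCnn(nn.Module):
    """Character-level CNN for multiclass DGA family classification."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.embed = nn.Embedding(VOCAB_SIZE, 32)
        self.features = nn.Sequential(
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):                      # x: (batch, MAX_LEN) token ids
        h = self.embed(x).transpose(1, 2)      # -> (batch, 32, MAX_LEN)
        h = self.features(h).squeeze(-1)       # -> (batch, 64)
        return self.classifier(h)


def add_class(model: DgaCnn, old_classes: int) -> DgaCnn:
    """Grow the output layer by one class, copying the learned weights of
    the known families so their decision boundaries are preserved."""
    new_head = nn.Linear(64, old_classes + 1)
    with torch.no_grad():
        new_head.weight[:old_classes] = model.classifier.weight
        new_head.bias[:old_classes] = model.classifier.bias
    model.classifier = new_head
    # Freeze the feature extractor: only the classifier is fine-tuned on
    # the new family plus a few exemplars from the known families.
    for p in model.embed.parameters():
        p.requires_grad = False
    for p in model.features.parameters():
        p.requires_grad = False
    return model


if __name__ == "__main__":
    model = DgaCnn(num_classes=60)             # base model: 60 DGA families
    model = add_class(model, old_classes=60)   # increment to 61 families
    # Fine-tuning over (new family + replayed exemplars) would go here.
    demo = torch.randint(0, VOCAB_SIZE, (4, MAX_LEN))
    print(model(demo).shape)                   # torch.Size([4, 61])
```

Growing the output layer while copying the weights of the 60 known families and freezing the feature extractor is one common way to keep earlier decision boundaries stable during the increment; the architecture and training procedure actually used in the paper may differ.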