
Lifelong Machine Learning Under Concept Drift - Enforcing Forgetting by Noise Accumulation

Kan, Zhuoyun (2019) Lifelong Machine Learning Under Concept Drift - Enforcing Forgetting by Noise Accumulation. Master's Thesis / Essay, Computing Science.


We consider a modelling framework for the investigation of prototype-based classifiers under concept drift. Specifically, we study the supervised training of a Learning Vector Quantization (LVQ) system in two concept drift situations: virtual drift and real drift. Under virtual drift, the statistical properties of the high-dimensional streaming data are time-dependent; under real drift, the target task itself changes over time. Both class-balanced and class-imbalanced training scenarios are considered under real drift. Previous studies demonstrate that weight decay, as an explicit mechanism of forgetting, is limited because it significantly restricts the flexibility of the learning system. We present an alternative to explicit weight decay: additive noise applied in each training step, which imposes a forgetting effect without restricting the flexibility of the LVQ system. Our results show that noise accumulation overcomes these limitations and can outperform weight decay under virtual drift. Under real drift processes, however, additive noise as an alternative forgetting mechanism does not improve performance, which calls for modifications of the model in future work.
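The two forgetting mechanisms contrasted in the abstract can be illustrated in a short sketch. The following is not the thesis code; it is a minimal toy implementation assuming a standard LVQ1 update rule, with the learning rate, decay rate, noise level, and the drifting toy stream all chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def lvq1_step(prototypes, proto_labels, x, y, lr=0.05,
              decay=0.0, noise_std=0.0):
    """One LVQ1 update with two optional forgetting mechanisms.

    decay     -- explicit weight decay: shrinks all prototypes toward
                 the origin, restricting how far they can move.
    noise_std -- additive Gaussian noise applied to all prototypes in
                 each step; its accumulation gradually erases outdated
                 information without shrinking the prototypes.
    """
    # Find the winning (closest) prototype.
    distances = np.linalg.norm(prototypes - x, axis=1)
    w = np.argmin(distances)
    # Standard LVQ1 rule: attract on matching label, repel otherwise.
    sign = 1.0 if proto_labels[w] == y else -1.0
    prototypes[w] += sign * lr * (x - prototypes[w])
    # Forgetting mechanism A: multiplicative weight decay.
    if decay > 0.0:
        prototypes *= (1.0 - decay)
    # Forgetting mechanism B: accumulated additive noise.
    if noise_std > 0.0:
        prototypes += rng.normal(0.0, noise_std, prototypes.shape)
    return prototypes

# Toy virtual-drift stream: the class-conditional means move over time
# while the classification task (which side is which class) is fixed.
protos = np.zeros((2, 2))
labels = np.array([0, 1])
for t in range(2000):
    y = t % 2
    mean = np.array([2.0, 0.0]) if y == 0 else np.array([-2.0, 0.0])
    mean = mean + 0.001 * t          # slow drift of the input statistics
    x = mean + rng.normal(0.0, 0.5, 2)
    protos = lvq1_step(protos, labels, x, y, noise_std=0.01)
```

With `noise_std` set and `decay` left at zero, the prototypes remain free to follow the drifting class means; with `decay` set instead, the same tracking is biased toward the origin, which is the flexibility restriction the thesis attributes to explicit weight decay.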

Item Type: Thesis (Master's Thesis / Essay)
Degree programme: Computing Science
Thesis type: Master's Thesis / Essay
Language: English
Date Deposited: 23 Aug 2019
Last Modified: 27 Aug 2019 10:22
