A buffered online transfer learning algorithm with multi-layer network

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

A buffered online transfer learning algorithm with multi-layer network. / Kang, Zhongfeng; Yang, Bo; Nielsen, Mads; Deng, Lihui; Yang, Shantian.

In: Neurocomputing, Vol. 488, 2022, p. 581-597.


Harvard

Kang, Z, Yang, B, Nielsen, M, Deng, L & Yang, S 2022, 'A buffered online transfer learning algorithm with multi-layer network', Neurocomputing, vol. 488, pp. 581-597. https://doi.org/10.1016/j.neucom.2021.11.066

APA

Kang, Z., Yang, B., Nielsen, M., Deng, L., & Yang, S. (2022). A buffered online transfer learning algorithm with multi-layer network. Neurocomputing, 488, 581-597. https://doi.org/10.1016/j.neucom.2021.11.066

Vancouver

Kang Z, Yang B, Nielsen M, Deng L, Yang S. A buffered online transfer learning algorithm with multi-layer network. Neurocomputing. 2022;488:581-597. https://doi.org/10.1016/j.neucom.2021.11.066

Author

Kang, Zhongfeng ; Yang, Bo ; Nielsen, Mads ; Deng, Lihui ; Yang, Shantian. / A buffered online transfer learning algorithm with multi-layer network. In: Neurocomputing. 2022 ; Vol. 488. pp. 581-597.

Bibtex

@article{7c83e003f30742399d4aa920c35d503c,
title = "A buffered online transfer learning algorithm with multi-layer network",
abstract = "Online transfer learning (OTL) has attracted much attention in recent years. It is designed to handle the transfer learning tasks, where the data of the target domain isn't available in advance but may arrive in an online manner, which may be a more realistic scenario in practice. However, there typically are two limitations of existing OTL algorithms. 1) Existing OTL algorithms are based on shallow online learning models (SOLMs), e.g., linear or kernel models. Due to this limitation of SOLMs they cannot effectively learn complex nonlinear functions in complicated application and the OTL algorithms based on SOLMs cannot either. 2) Existing algorithms only utilize the latest arrived instance to adjust the model. In this way, the previously arrived instances are not utilized. It may be better to utilize the previously arrived instances as well. In this paper, to overcome the abovementioned two limitations, a buffered online transfer learning (BOTL) algorithm is proposed. In the proposed BOTL algorithm, the learner is designed as a deep learning model, referred to as Online Hedge Neural Network (OHNN). In order to enable the OHNN to be effectively learned in an online manner, we propose a buffered online learning framework that utilizes several previously arrived instances to assist learning. Further, to enhance the performance of the OHNN, a model learned in the source domain is transferred to the target domain. The regret bound of the proposed BOTL algorithm is analyzed theoretically. Experimental results on realistic datasets illustrate that the proposed BOTL algorithm can achieve lower mistake rate than the algorithms compared.",
keywords = "Deep learning, Multi-layer neural network, Online learning, Online transfer learning, Transfer learning",
author = "Zhongfeng Kang and Bo Yang and Mads Nielsen and Lihui Deng and Shantian Yang",
note = "Publisher Copyright: {\textcopyright} 2021 Elsevier B.V.",
year = "2022",
doi = "10.1016/j.neucom.2021.11.066",
language = "English",
volume = "488",
pages = "581--597",
journal = "Neurocomputing",
issn = "0925-2312",
publisher = "Elsevier",

}
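The abstract combines two ideas: a multi-layer learner whose per-layer predictions are combined with Hedge (multiplicative) weights, and a buffer of recently arrived instances replayed at each round. The sketch below is purely illustrative and is not the authors' implementation — the network size, the gradient step (heads only), the squared-loss Hedge discount, and the FIFO buffer policy are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class HedgeMLP:
    """Two hidden layers, each feeding its own linear classifier head;
    head predictions are combined with Hedge weights (illustrative only)."""

    def __init__(self, d, h, lr=0.1, beta=0.9):
        self.W1 = rng.normal(scale=0.1, size=(d, h))
        self.W2 = rng.normal(scale=0.1, size=(h, h))
        self.heads = [rng.normal(scale=0.1, size=h),
                      rng.normal(scale=0.1, size=h)]
        self.alpha = np.ones(2) / 2   # Hedge weights over the two heads
        self.lr, self.beta = lr, beta

    def forward(self, x):
        h1 = np.tanh(x @ self.W1)
        h2 = np.tanh(h1 @ self.W2)
        preds = np.array([sigmoid(h1 @ self.heads[0]),
                          sigmoid(h2 @ self.heads[1])])
        return h1, h2, preds

    def predict(self, x):
        _, _, preds = self.forward(x)
        return float(self.alpha @ preds)  # Hedge-weighted ensemble output

    def update(self, x, y):
        h1, h2, preds = self.forward(x)
        # Hedge step: discount each head by its own squared loss, renormalize.
        losses = (preds - y) ** 2
        self.alpha *= self.beta ** losses
        self.alpha /= self.alpha.sum()
        # Crude per-head gradient step on the linear heads only (a sketch,
        # not full backpropagation through the shared layers).
        self.heads[0] -= self.lr * (preds[0] - y) * h1
        self.heads[1] -= self.lr * (preds[1] - y) * h2

def buffered_online_learn(model, stream, buffer_size=8):
    """Predict on each arriving instance, then replay a FIFO buffer of
    recent instances so earlier arrivals keep contributing to updates."""
    buffer, mistakes = [], 0
    for x, y in stream:
        mistakes += int((model.predict(x) >= 0.5) != bool(y))
        buffer.append((x, y))
        if len(buffer) > buffer_size:
            buffer.pop(0)
        for xb, yb in buffer:
            model.update(xb, yb)
    return mistakes
```

Transfer, omitted here, would amount to initializing `W1`/`W2` from a model trained on the source domain instead of at random.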

RIS

TY - JOUR

T1 - A buffered online transfer learning algorithm with multi-layer network

AU - Kang, Zhongfeng

AU - Yang, Bo

AU - Nielsen, Mads

AU - Deng, Lihui

AU - Yang, Shantian

N1 - Publisher Copyright: © 2021 Elsevier B.V.

PY - 2022

Y1 - 2022

N2 - Online transfer learning (OTL) has attracted much attention in recent years. It is designed to handle transfer learning tasks where the data of the target domain is not available in advance but arrives in an online manner, which may be a more realistic scenario in practice. However, existing OTL algorithms typically have two limitations. 1) Existing OTL algorithms are based on shallow online learning models (SOLMs), e.g., linear or kernel models. Due to this limitation, SOLMs cannot effectively learn complex nonlinear functions in complicated applications, and neither can the OTL algorithms based on them. 2) Existing algorithms utilize only the most recently arrived instance to adjust the model, so previously arrived instances are not utilized; it may be better to utilize them as well. In this paper, to overcome these two limitations, a buffered online transfer learning (BOTL) algorithm is proposed. In the proposed BOTL algorithm, the learner is designed as a deep learning model, referred to as the Online Hedge Neural Network (OHNN). To enable the OHNN to be learned effectively in an online manner, we propose a buffered online learning framework that utilizes several previously arrived instances to assist learning. Further, to enhance the performance of the OHNN, a model learned in the source domain is transferred to the target domain. The regret bound of the proposed BOTL algorithm is analyzed theoretically. Experimental results on realistic datasets illustrate that the proposed BOTL algorithm achieves a lower mistake rate than the compared algorithms.

AB - Online transfer learning (OTL) has attracted much attention in recent years. It is designed to handle transfer learning tasks where the data of the target domain is not available in advance but arrives in an online manner, which may be a more realistic scenario in practice. However, existing OTL algorithms typically have two limitations. 1) Existing OTL algorithms are based on shallow online learning models (SOLMs), e.g., linear or kernel models. Due to this limitation, SOLMs cannot effectively learn complex nonlinear functions in complicated applications, and neither can the OTL algorithms based on them. 2) Existing algorithms utilize only the most recently arrived instance to adjust the model, so previously arrived instances are not utilized; it may be better to utilize them as well. In this paper, to overcome these two limitations, a buffered online transfer learning (BOTL) algorithm is proposed. In the proposed BOTL algorithm, the learner is designed as a deep learning model, referred to as the Online Hedge Neural Network (OHNN). To enable the OHNN to be learned effectively in an online manner, we propose a buffered online learning framework that utilizes several previously arrived instances to assist learning. Further, to enhance the performance of the OHNN, a model learned in the source domain is transferred to the target domain. The regret bound of the proposed BOTL algorithm is analyzed theoretically. Experimental results on realistic datasets illustrate that the proposed BOTL algorithm achieves a lower mistake rate than the compared algorithms.

KW - Deep learning

KW - Multi-layer neural network

KW - Online learning

KW - Online transfer learning

KW - Transfer learning

U2 - 10.1016/j.neucom.2021.11.066

DO - 10.1016/j.neucom.2021.11.066

M3 - Journal article

AN - SCOPUS:85120776938

VL - 488

SP - 581

EP - 597

JO - Neurocomputing

JF - Neurocomputing

SN - 0925-2312

ER -

ID: 291543669