A New Modeling for Knowledge Transfer in Machine Learning

Minimum Enclosing Ball-based Learner Independent Knowledge Transfer for Correlated Multi-task Learning

LAP Lambert Academic Publishing (13.05.2011)

€ 49,00

Multi-Task Learning (MTL), as opposed to Single-Task Learning (STL), has become a hot topic in machine learning research. MTL has shown a significant advantage over STL because of its ability to facilitate knowledge sharing between tasks. This thesis presents my recent studies on Knowledge Transfer (KT) – the process of transferring knowledge from one task to another – which is at the core of MTL. The newly proposed KT algorithm for correlated MTL adopts learner independence, thus empowering any ordinary classifier for MTL. The proposed Minimum Enclosing Ball (MEB)-based KT rests on the premise that, in the feature space, two correlated tasks share common input data lying in the overlapping region between their feature spaces. The main idea is to find the correlating knowledge – the overlapping region of the two tasks – and transfer the related data regardless of the learner employed. KT is carried out by building a correlation space via MEBs and transferring the enclosed instances from the primary task to the secondary task; the extent of KT depends on the number of overlapping instances between the two tasks. This book is required reading for postgraduates and researchers in MTL.
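For illustration only, here is a minimal Python sketch of the instance-transfer idea described above. It is not the book's actual algorithm: the centroid-plus-farthest-point ball is a crude stand-in for a real MEB solver, and the function names (`enclosing_ball`, `transfer_instances`) and the toy data are assumptions of this sketch.

```python
import numpy as np

def enclosing_ball(X):
    # Approximate an enclosing ball by the centroid and the distance to the
    # farthest point (a crude stand-in for an exact MEB computation).
    center = X.mean(axis=0)
    radius = np.linalg.norm(X - center, axis=1).max()
    return center, radius

def transfer_instances(X_primary, y_primary, X_secondary):
    # Treat the secondary task's ball as the assumed overlapping region and
    # transfer every primary-task instance that falls inside it.
    center, radius = enclosing_ball(X_secondary)
    inside = np.linalg.norm(X_primary - center, axis=1) <= radius
    return X_primary[inside], y_primary[inside]

# Toy usage: augment the secondary task's training data with the transferred
# instances before fitting any off-the-shelf (learner-independent) classifier.
rng = np.random.default_rng(0)
X_a = rng.normal(0.0, 1.0, size=(200, 2))   # primary task inputs
y_a = rng.integers(0, 2, size=200)          # primary task labels
X_b = rng.normal(0.5, 1.0, size=(150, 2))   # secondary task inputs
X_t, y_t = transfer_instances(X_a, y_a, X_b)
print(f"transferred {len(X_t)} of {len(X_a)} primary-task instances")
```

The number of transferred instances grows with the overlap between the two tasks' regions, mirroring the book's point that the extent of KT depends on how much the tasks overlap.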

Book details:

ISBN-13: 978-3-8443-9732-1
ISBN-10: 3844397329
EAN: 9783844397321
Book language: English
By (author): Fan Liu
Number of pages: 88
Published: 13.05.2011
Category: Computer Science, IT