
Effective Version Space Reduction for Convolutional Neural Networks

MCML Authors


Daniel Cremers

Prof. Dr.

Director

Abstract

In active learning, sampling bias can pose a serious inconsistency problem and prevent the algorithm from finding the optimal hypothesis. However, many methods for neural networks are hypothesis-space agnostic and do not address this problem. We examine active learning with convolutional neural networks through the principled lens of version space reduction. We identify the connection between two approaches, prior mass reduction and diameter reduction, and propose a new diameter-based querying method, the minimum Gibbs-vote disagreement. By estimating version space diameter and bias, we illustrate how the version space of neural networks evolves and examine the realizability assumption. With experiments on the MNIST, Fashion-MNIST, SVHN and STL-10 datasets, we demonstrate that diameter reduction methods reduce the version space more effectively and perform better than prior mass reduction and other baselines, and that the Gibbs-vote disagreement is on par with the best query method.
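The abstract only names the query criterion, so the following is a hypothetical sketch rather than the authors' implementation. It assumes the version space is approximated by a committee of networks and estimates, for each unlabeled pool point, the probability that a randomly drawn hypothesis (the Gibbs classifier) disagrees with the committee's majority vote; the function name `gibbs_vote_disagreement` and the label-matrix input format are illustrative assumptions.

```python
import numpy as np

def gibbs_vote_disagreement(ensemble_preds: np.ndarray) -> np.ndarray:
    """Monte Carlo estimate of the per-point Gibbs-vote disagreement.

    ensemble_preds: (n_models, n_points) integer array of predicted class
    labels, one row per hypothesis sampled from the approximate version
    space. Returns, for each pool point, the fraction of sampled
    hypotheses that disagree with the majority (vote) prediction, i.e.
    an estimate of P(Gibbs prediction != vote prediction).
    """
    n_models, n_points = ensemble_preds.shape
    disagreement = np.empty(n_points)
    for j in range(n_points):
        # Count votes for each class at pool point j.
        votes = np.bincount(ensemble_preds[:, j])
        # Hypotheses outside the majority class disagree with the vote.
        disagreement[j] = 1.0 - votes.max() / n_models
    return disagreement

# Illustrative usage: 5 sampled networks, 4 unlabeled pool points.
preds = np.array([
    [0, 1, 2, 1],
    [0, 1, 2, 0],
    [0, 2, 2, 1],
    [0, 1, 2, 2],
    [0, 1, 2, 1],
])
print(gibbs_vote_disagreement(preds))  # [0.  0.2 0.  0.4]
```

Under these assumptions, averaging this quantity over the unlabeled pool gives one plausible estimate of the version space diameter mentioned in the abstract, and a minimum-disagreement strategy would query the point whose labeling is expected to drive that estimate down.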

inproceedings


ECML-PKDD 2021

European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases. Virtual, Sep 13-17, 2021.
CORE Rank A Conference

Authors

J. Liu • I. Chiotellis • R. Triebel • D. Cremers

Links

DOI

Research Area

B1 | Computer Vision

BibTeX Key: LCT+21
