Title
Leveraging Unlabeled Data for Sketch-based Understanding
Date Issued
01 January 2022
Access level
open access
Resource Type
conference paper
Author(s)
Weber State University
Publisher(s)
IEEE Computer Society
Abstract
Sketch-based understanding is a critical component of human cognitive learning and a primitive means of communication between humans. The topic has recently attracted the interest of the computer vision community, as sketching is a powerful tool for expressing static objects and dynamic scenes. Unfortunately, despite its broad application domains, current sketch-based models rely strongly on labels for supervised training and ignore the knowledge available in unlabeled data, limiting their generalization and applicability. We therefore present a study on using unlabeled data to improve a sketch-based model. To this end, we evaluate variations of the VAE and semi-supervised VAE, and present an extension of BYOL that handles sketches. Our results show the superiority of sketch-BYOL, which outperforms other self-supervised approaches and increases retrieval performance for both known and unknown categories. Furthermore, we show how other tasks can benefit from our proposal.
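The self-supervised component described in the abstract adapts BYOL to pairs of augmented views of the same sketch. The following is a minimal, hypothetical PyTorch illustration of a BYOL-style objective for sketch images; the names (SketchBYOL, mlp, ema_update) and the ResNet-18 trunk are assumptions for illustration, not the paper's actual implementation.

    # Hypothetical BYOL-style training objective for sketch images (illustrative only).
    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def mlp(in_dim, hidden=512, out_dim=128):
        # Projector / predictor head used in BYOL-style models.
        return nn.Sequential(nn.Linear(in_dim, hidden), nn.BatchNorm1d(hidden),
                             nn.ReLU(inplace=True), nn.Linear(hidden, out_dim))

    class SketchBYOL(nn.Module):
        """Online and target branches; the target is an EMA copy of the online branch."""
        def __init__(self, backbone, feat_dim=512):
            super().__init__()
            self.online_encoder = nn.Sequential(backbone, mlp(feat_dim))  # encoder + projector
            self.predictor = mlp(128, out_dim=128)                        # online-only predictor
            self.target_encoder = copy.deepcopy(self.online_encoder)      # EMA target, no gradients
            for p in self.target_encoder.parameters():
                p.requires_grad = False

        @torch.no_grad()
        def ema_update(self, tau=0.99):
            # Slowly move target weights toward online weights after each step.
            for po, pt in zip(self.online_encoder.parameters(),
                              self.target_encoder.parameters()):
                pt.data = tau * pt.data + (1.0 - tau) * po.data

        def loss(self, view1, view2):
            # Symmetrized negative cosine similarity between the online prediction of one
            # augmented sketch view and the target projection of the other view.
            def one_side(a, b):
                p = F.normalize(self.predictor(self.online_encoder(a)), dim=-1)
                with torch.no_grad():
                    z = F.normalize(self.target_encoder(b), dim=-1)
                return 2 - 2 * (p * z).sum(dim=-1).mean()
            return one_side(view1, view2) + one_side(view2, view1)

    # Example usage with an assumed ResNet-18 trunk (the paper's backbone may differ):
    #   backbone = torchvision.models.resnet18(); backbone.fc = nn.Identity()
    #   model = SketchBYOL(backbone)
    #   loss = model.loss(augment(batch), augment(batch)); loss.backward(); model.ema_update()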
Start page
5149
End page
5158
Volume
2022-June
Language
English
OECD knowledge area
Information Sciences
Computer Sciences
Scopus EID
2-s2.0-85137777404
ISBN
9781665487399
Source
IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Resource of which it is part
IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
ISSN of the container
2160-7508
ISBN of the container
978-1-6654-8739-9
Conference
2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2022
Sources of information
Directorio de Producción Científica
Scopus