Zero-shot Clustering of Embeddings with Pretrained and Self-Supervised Learnt Encoders

Scott C. Lowe, Joakim Bruslund Haurum, Sageev Oore, Thomas B. Moeslund, Graham W. Taylor

Publication: Conference contribution without publisher/journal › Paper without publisher/journal › Research › peer-reviewed

Abstract

We explore whether large pretrained models can provide a useful representation space for datasets they were not trained on, and whether these representations can be used to group novel unlabelled data into meaningful clusters. To this end, we conduct experiments using image encoders pretrained on ImageNet using either supervised or self-supervised training techniques. These encoders are deployed on image datasets that were not seen during training, and we investigate whether their embeddings can be clustered with conventional clustering algorithms. We find that it is possible to create well-defined clusters using self-supervised feature encoders, especially when using the Agglomerative Clustering method, and that it is possible to do so even for very fine-grained datasets such as NABirds. We also find indications that the Silhouette score is a good proxy for cluster quality for self-supervised feature encoders when no ground-truth is available.
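The pipeline the abstract describes (extract embeddings with a frozen pretrained encoder, cluster them with a conventional algorithm, evaluate with extrinsic metrics or the Silhouette score) can be illustrated with a minimal sketch. The specific choices below are assumptions for illustration, not the paper's exact setup: a torchvision ResNet-50 with ImageNet weights stands in for "a pretrained image encoder", CIFAR-10 stands in for "an image dataset not seen during training", and all hyperparameters are placeholders.

```python
# Minimal sketch of zero-shot clustering of embeddings (illustrative assumptions:
# ResNet-50 encoder, CIFAR-10 as the unseen dataset, n_clusters=10).
import numpy as np
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms
from sklearn.cluster import AgglomerativeClustering
from sklearn.metrics import adjusted_mutual_info_score, silhouette_score

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained encoder with the classification head removed, so the forward pass
# returns a feature embedding rather than ImageNet logits.
encoder = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
encoder.fc = torch.nn.Identity()
encoder.eval().to(device)

preprocess = transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
dataset = datasets.CIFAR10(root="data", train=False, download=True, transform=preprocess)
loader = DataLoader(dataset, batch_size=256, num_workers=4)

# Embed the unseen dataset with the frozen encoder (no fine-tuning).
feats, labels = [], []
with torch.no_grad():
    for images, targets in loader:
        feats.append(encoder(images.to(device)).cpu().numpy())
        labels.append(targets.numpy())
X = np.concatenate(feats)
y = np.concatenate(labels)

# Cluster the embeddings with a conventional algorithm (Agglomerative Clustering).
pred = AgglomerativeClustering(n_clusters=10).fit_predict(X)

# Extrinsic quality against the held-out ground truth, and intrinsic quality via the
# Silhouette score, which the paper suggests can act as a proxy when labels are absent.
print("AMI vs. ground truth:", adjusted_mutual_info_score(y, pred))
print("Silhouette score:   ", silhouette_score(X, pred, sample_size=2000, random_state=0))
```

In practice one would compare several encoders (supervised vs. self-supervised pretraining) and several clustering algorithms on the same embeddings; the sketch fixes one of each purely to keep the example short.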
Original language: English
Publication date: 15 Dec 2023
Number of pages: 17
Status: Published - 15 Dec 2023
Event: Workshop on robustness of zero/few-shot learning in foundation models (NeurIPS 2023) - New Orleans, USA
Duration: 15 Dec 2023 - 15 Dec 2023
https://sites.google.com/view/r0-fomo

Workshop

Workshop: Workshop on robustness of zero/few-shot learning in foundation models (NeurIPS 2023)
Country/Territory: USA
City: New Orleans
Period: 15/12/2023 - 15/12/2023
Internet address: https://sites.google.com/view/r0-fomo
