Uses of continual learning techniques in generalised few-shot object detection

Student thesis: Master's Thesis

Abstract

The best large-scale deep learning models require massive amounts of training data. For some tasks, collecting such data may be infeasible for logistical or legal reasons. Few-Shot Learning has emerged as a field of study that aims to maximise performance when only a handful of samples are available. In Generalised Few-Shot Learning, a model must learn new few-shot classes while retaining the large-scale classes it was originally trained on. In this work, we review the tools and techniques used in Few-Shot Learning before exploring the parallels between Generalised Few-Shot Object Detection (G-FSOD) and Continual Learning (CL) methods. We focus on the manipulation of gradient descent, since it has recently been proposed for G-FSOD. We show that gradient methods appear to be no better than existing techniques, and point out that potentially beneficial insights on sampling from the Continual Learning literature have yet to be employed. We hope this work will provide a blueprint for further study of G-FSOD and CL as interconnected fields.
Date of Award: 11 Jun 2025
Original language: English
Awarding Institution
  • University of Strathclyde
Supervisors: Marc Roper (Supervisor) & Andrew Abel (Supervisor)