Abstract:
Few-shot learning (FSL) is a significant and challenging problem in machine learning. Its main challenge is the scarcity of training samples: training on much smaller training sets while maintaining nearly the same accuracy would be very beneficial. Meta-learning, the process of learning how to learn, is a subfield of machine learning in which automatic learning algorithms are applied to metadata about machine learning experiments. Reptile applies the shortest-descent algorithm to the meta-learning setting. We have developed a simple meta-learning algorithm, named stomatopods and inspired by Reptile, which works by repeatedly sampling a task, performing stochastic gradient descent on it, and updating the initial parameters towards the final parameters learned on that task. The obtained results show a significant improvement in accuracy on four different datasets.
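The Reptile-style outer update described above (sample a task, run SGD on it, then nudge the initialization toward the task-adapted weights) can be sketched as follows. This is a minimal illustration using NumPy with a hypothetical list of per-step task gradients standing in for a real task loss; it is not the authors' implementation.

```python
import numpy as np

def sgd_on_task(params, task_grads, inner_lr=0.01):
    # Inner loop: plain SGD on one sampled task.
    # task_grads is a hypothetical stand-in for gradients
    # computed from the task's loss at each step.
    for g in task_grads:
        params = params - inner_lr * g
    return params

def reptile_step(params, task_grads, outer_lr=0.1):
    # Outer loop: adapt to the task, then move the shared
    # initialization toward the task-adapted parameters.
    adapted = sgd_on_task(params, task_grads)
    return params + outer_lr * (adapted - params)

# Toy usage: two inner SGD steps with constant unit gradients.
init = np.zeros(2)
grads = [np.ones(2), np.ones(2)]
new_init = reptile_step(init, grads)
```

Repeating `reptile_step` over many sampled tasks yields an initialization from which a few gradient steps suffice on a new task, which is the mechanism the abstract describes.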