Bayesian few-shot classification has been a focal point in the field of
few-shot learning. This paper integrates mirror descent-based variational
inference into Gaussian process-based few-shot classification,
addressing the challenge of non-conjugate inference. By exploiting
non-Euclidean geometry, mirror descent accelerates convergence, as it follows
the steepest descent direction along the corresponding manifold. It is also
invariant to the parameterization of the variational distribution.
Experimental results demonstrate competitive
classification accuracy, improved uncertainty quantification, and faster
convergence compared to baseline models. We further analyze the impact of
hyperparameters and individual model components. Code is publicly available at
https://github.com/keanson/MD-BSFC.
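To illustrate the mirror descent update underlying the method, here is a minimal, self-contained sketch on a toy problem, not the paper's GP-based inference. It uses the negative-entropy mirror map on the probability simplex, which turns the mirror descent step into the classic multiplicative (exponentiated-gradient) update; the objective, step size, and variable names are illustrative assumptions.

```python
import numpy as np

def exponentiated_gradient(grad_f, p0, lr=0.5, steps=200):
    """Mirror descent on the probability simplex with the negative-entropy
    mirror map. The dual-space gradient step reduces to the multiplicative
    update p <- p * exp(-lr * grad), followed by renormalization, so iterates
    stay on the simplex without an explicit projection."""
    p = p0.copy()
    for _ in range(steps):
        g = grad_f(p)
        p = p * np.exp(-lr * g)
        p = p / p.sum()  # renormalize back onto the simplex
    return p

# Toy objective: squared distance to a target distribution q on the simplex;
# its unique minimizer over the simplex is q itself.
q = np.array([0.6, 0.3, 0.1])
grad = lambda p: 2.0 * (p - q)

# Start from the uniform distribution.
p_star = exponentiated_gradient(grad, np.full(3, 1.0 / 3.0))
```

Because the mirror map matches the simplex geometry, each step respects the constraint by construction, which is the same reason mirror descent gives geometry-aware, parameterization-invariant updates for variational distributions.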