Published: 2023
Knowledge distillation is a simple yet effective technique for deep model compression, which aims to transfer the knowledge learned by a large teacher...
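The core of the teacher-to-student transfer described above is typically a loss that matches the student's softened output distribution to the teacher's. A minimal NumPy sketch of that loss, assuming the standard temperature-scaled softmax and KL-divergence formulation (function names and the example logits are illustrative, not from the paper):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradients stay comparable
    # across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# A student close to the teacher incurs a smaller loss than one
# that inverts the teacher's ranking.
teacher = [4.0, 1.0, 0.2]
close_student = [3.8, 1.1, 0.3]
far_student = [0.2, 1.0, 4.0]
assert distillation_loss(close_student, teacher) < distillation_loss(far_student, teacher)
```

In practice this term is mixed with the ordinary cross-entropy on ground-truth labels; the temperature and mixing weight are tuned per task.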
The success of deep learning in recent years has led to a rising demand for neural network architecture engineering. As a consequence, neural archite...
A complete representation of 3D objects requires characterizing the space of deformations in an interpretable manner, from articulations of a single i...
In the paradigm of online continual learning, a neural network is exposed to a sequence of tasks, where the data arrive in an online fashion and pre...
Graph convolution networks (GCNs) based methods for 3D human pose estimation usually aggregate immediate features of single-hop nodes, which are unawa...
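The limitation this abstract points at — that one GCN layer aggregates only immediate (one-hop) neighbours — can be seen directly in a minimal sketch of a standard graph-convolution layer with symmetric normalization. The joint chain and identity weights below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def gcn_layer(A, X, W):
    # One graph-convolution layer: add self-loops, symmetrically
    # normalize the adjacency, then aggregate one-hop features.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)  # ReLU activation

# A 3-joint chain (think hip-knee-ankle): joints 0 and 2 are two
# hops apart, so a single layer passes no direct signal between them.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)  # one-hot feature per joint, to trace information flow
W = np.eye(3)  # identity weights for inspection
H = gcn_layer(A, X, W)
assert H[0, 2] == 0.0  # joint 0 receives nothing from joint 2
```

Stacking layers (or aggregating powers of the adjacency) is the usual way to expose such multi-hop context, which is the direction this abstract appears to motivate.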