About me
I’m a 4th year PhD student at the Quebec Artificial Intelligence Institute (Mila) and an intern at ElementAI under the supervision of Laurent Charlin and Pau Rodriguez, respectively.
I’m interested in algorithms that accumulate transferable knowledge or skills, enabling generalization to future tasks. Accordingly, my research topics lie in continual learning and meta-learning. My recent work proposes a new and more realistic approach to continual learning at the intersection of both fields.
Recently, I have developed a particular interest in the idea of composing existing skills to learn new ones quickly. I believe this is the real appeal of continual learning and that it can propel reinforcement learning. Consequently, I’m currently focused on continual RL.
News
09/2020 My work proposing a new approach to continual learning (OSAKA) has been accepted at NeurIPS 2020!
09/2020 Our work proposing a synthetic dataset generator (Synbols) to probe different learning algorithms has been accepted at NeurIPS 2020!
08/2020 I gave a talk on our new approach to continual-learning evaluation towards real-life deployment of continual-learning systems (OSAKA) at ContinualAI.
06/2020 I hosted a panel with Chelsea Finn, Chris Kanan and Subutai Ahmad at our Continual Learning workshop at CVPR 2020! You can find it here
06/2020 Our work on online continual compression led by my brother was accepted at ICML 2020! You can find an 18-min video here
12/2019 Our workshop on Continual Learning at CVPR 2020 has been accepted! Watch out for our really cool competition.
12/2019 My work on Language GANs Falling Short was accepted at ICLR 2020! You can find a 5-min video summary here
09/2019 My work on Online Continual Learning with Maximal Interfered Retrieval was accepted at NeurIPS 2019! You can find an 8-min video summary here