About me
I’m a PhD student at the Jagiellonian University, working on machine learning with the GMUM group under the supervision of Prof. Jacek Tabor. My main research interests center on efficiency in deep learning, with a particular focus on the following questions:
- Continual Learning – how to remember the past and reuse it efficiently when learning from a stream of data.
- Reinforcement Learning – how to increase the sample efficiency and leverage models pre-trained on offline data.
- Conditional Computation – how to adapt the computing power of the model to a given example.
- Generative Models – how to perform weakly supervised conditional generation and adapt existing models for conditional generation.
News
- (November 2022) I will be at NeurIPS to present our work on Disentangling Transfer in Continual Reinforcement Learning. If you’ll be there too and want to grab a coffee, send me an e-mail or a Twitter DM!
- (August 2022) Our introductory textbook on deep learning is finally out! Unfortunately, it is only available in Polish for the time being.
- (July 2022) MLSS^N was terrific! Many thanks to all lecturers, co-organizers and participants. Check out the lectures and stay tuned, as we’re already working on the next edition!
- (May 2022) Our paper on continual learning with weight interval regularization has been accepted to ICML 2022 as a short presentation.
- (March 2022) Our workshop on Dynamic Neural Networks has been accepted to ICML 2022. See you in Baltimore!
- (March 2022) Started a research internship with João Sacramento at ETH Zurich!
- (December 2021) PluGeN, our paper on introducing supervision to pre-trained generative models, was accepted to AAAI 2022.
- (September 2021) Two of our papers, Zero Time Waste and Continual World, were accepted to the NeurIPS 2021 conference as poster presentations.
- (September 2021) A paper on closed-loop imitation learning for self-driving cars, which I worked on during my internship at Woven Planet, was accepted to the CoRL 2021 conference.
- (July 2021) I was named a “best reviewer” (top 10% of reviewers by score) at ICML 2021.
- (April 2021) Our proposal to fund an ML & neuroscience summer school was accepted by the NAWA Spinaker program! The school is planned for June 2022 in Kraków.
- (April 2021) Started my internship at Woven Planet Level-5 (formerly Lyft Level-5), working on imitation learning for planning in self-driving cars.
- (February 2021) Presented my student abstract on investigating the role of batch size in experience replay methods for continual learning at AAAI 2021.
- (August 2020) Our paper on conditional semi-supervised generation with mixtures of Gaussians was published in IEEE Transactions on Neural Networks and Learning Systems.
- (July 2020) Co-organized the EEML 2020 summer school.
- (December 2019) Presented a paper on biologically-inspired spatial neural networks at the NeurIPS 2019 workshop “Real Neurons & Hidden Units”.
- (November 2019) Co-organized a tutorial on reinforcement learning at the MLinPL conference.
- (October 2019) Started my PhD at the Jagiellonian University with GMUM.
Selected publications

Disentangling Transfer in Continual Reinforcement Learning
Maciej Wołczyk*, Michał Zając*, Razvan Pascanu, Łukasz Kuciński, Piotr Miłoś
NeurIPS 2022
[Paper]


