dr.ir. D.C. Mocanu (Decebal)

Assistant Professor

About me

News (2022):

  • 25 April - We had the pleasure of hosting Utku Evci, Research Engineer at Google Brain Montreal, to give a very engaging in-person talk (link)
  • 20 April - one paper on sparse training and deep reinforcement learning accepted at IJCAI 2022 (link)
  • 15 April - our tutorial "Sparse Neural Networks Training" has been accepted at ECMLPKDD 2022 (link)
  • 6 April - Shiwei Liu defended his outstanding PhD thesis (link) cum laude
  • 28 Jan - two sparse training papers accepted at ICLR 2022 (links: 1, 2)


Narrative CV:

Decebal Mocanu is Assistant Professor in Artificial Intelligence and Machine Learning within the DMB group, Faculty of Electrical Engineering, Mathematics, and Computer Science at the University of Twente; and Guest Assistant Professor within the Data Mining group, Department of Mathematics and Computer Science at the Eindhoven University of Technology (TU/e). 

From September 2017 until February 2020, Decebal was Assistant Professor in Artificial Intelligence and Machine Learning within the Data Mining group, Department of Mathematics and Computer Science, TU/e, and a member of the TU/e Young Academy of Engineering. In 2017, he received his PhD in Artificial Intelligence and Network Science from TU/e. During his doctoral studies, Decebal undertook three research visits: at the University of Pennsylvania (2014), the Julius Maximilian University of Würzburg (2015), and the University of Texas at Austin (2016).

Prior to this, in 2013, he obtained his MSc in Artificial Intelligence from Maastricht University. During his master's studies, Decebal also worked as a part-time software developer at We Focus BV in Maastricht. In the final year of his master's studies, he worked as an intern at Philips Research in Eindhoven, where he prepared his internship and master's thesis projects. Decebal obtained his Licensed Engineer degree from University Politehnica of Bucharest. While in Bucharest, between 2001 and 2010, Decebal started MDC Artdesign SRL (a software house specialised in web development), worked as a computer laboratory assistant at the University Nicolae Titulescu, and as a software engineer at Namedrive LLC.



Decebal and his co-authors laid the groundwork (connected papers) for sparse training in deep learning, introducing both fixed and dynamic sparsity. Besides the expected computational benefits, sparse training seems, in many cases, to achieve better generalisation than dense training.

  • Fixed sparsity in A topological insight into restricted Boltzmann machines, Machine Learning (2016), preprint https://arxiv.org/abs/1604.05978 (2016)
  • Dynamic sparsity in Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science, Nature Communications (2018), preprint https://arxiv.org/abs/1707.04780 (2017).
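The dynamic sparsity scheme introduced in the Nature Communications paper (Sparse Evolutionary Training) alternates regular weight updates with a prune-and-regrow step on the connectivity: after each training epoch, a fraction of the active connections with the smallest magnitude is removed, and the same number of new connections is grown at random inactive positions. The function below is a minimal NumPy sketch of that topology-update step only, not the authors' implementation; the names `prune_and_regrow` and `zeta` (the drop fraction) are chosen here for illustration.

```python
import numpy as np

def prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One SET-style topology update on a weight matrix.

    Drops the fraction `zeta` of active connections with the smallest
    |weight|, then regrows the same number of connections at random
    inactive positions (new weights start at zero). Mutates `weights`
    and `mask` in place and returns them.
    """
    rng = np.random.default_rng(rng)
    w_flat, m_flat = weights.ravel(), mask.ravel()

    active = np.flatnonzero(m_flat)
    n_drop = int(zeta * active.size)

    # Prune: deactivate the active connections with the smallest magnitude.
    drop = active[np.argsort(np.abs(w_flat[active]))[:n_drop]]
    m_flat[drop] = False
    w_flat[drop] = 0.0

    # Regrow: activate the same number of random inactive positions.
    inactive = np.flatnonzero(~m_flat)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    m_flat[grow] = True
    return weights, mask

# Example: a small layer kept at ~30% density throughout.
rng = np.random.default_rng(0)
w = rng.normal(size=(10, 10))
mask = rng.random((10, 10)) < 0.3   # initial sparse topology
w *= mask                           # zero out inactive connections
w, mask = prune_and_regrow(w, mask, zeta=0.3, rng=1)
```

Note that the prune-and-regrow step preserves the number of active connections, so the parameter budget stays fixed for the whole of training; only the positions of the non-zero weights evolve.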


Decebal's short-term research interest is to conceive scalable deep artificial neural network models and their corresponding learning algorithms using principles from network science, evolutionary computing, optimisation, and neuroscience. Such models should have sparse, evolving connectivity, make use of previous knowledge, and have strong generalisation capabilities, so that they can learn and reason from few examples in a continuous and adaptive manner.

Most science carried out throughout human history follows the traditional reductionist paradigm, which, successful as it is, has its limitations. Aristotle wrote in Metaphysics that "the whole is more than the sum of its parts". Inspired by this, in the long term Decebal would like to follow the alternative complex-systems paradigm and study the synergy between artificial intelligence, neuroscience, and network science for the benefit of science and society.

Github: http://github.com/dcmocanu/



Pure Link

Google Scholar Link


Current PhD students:

Graduated PhD students:

  • Shiwei Liu (cum laude), Sparse Neural Network Training with In-Time Over-Parameterization (graduated 2022, Postdoctoral Researcher - TU Eindhoven)
  • Anil Yaman, Evolution of biologically inspired learning in artificial neural networks (graduated 2019, Assistant Professor - VU Amsterdam)

PDEng student supervision (graduated)

  • Pranav Bhatnagar, Automatic Microscope Alignment via Machine Learning (at Thermo Fisher Scientific, Eindhoven), September 2019
  • Eleftherios Koulierakis, Detection of outbreak of infectious diseases: a data science perspective (at GDD, Eindhoven), July 2018

MSc student supervision (graduated)

  • Mickey Beurskens, Pass the Ball! - Learning Strategic Behavioural Patterns for Distributed Multi Agent Robotic Systems, November 2020
  • Sonali Fotedar (cum laude), Information Extraction on Free-Text Sleep Narratives using Natural Language Processing (at Philips Research, Eindhoven), November 2020
  • Selima Curci (cum laude), Large Scale Sparse Neural Networks, October 2020
  • Manuel Muñoz Sánchez (cum laude), Domain Knowledge-based Drivable Space Estimation (at TNO Helmond), September 2020
  • Chuan-Bin Huang, Novel Evolutionary Algorithms for Robust Training of Very Small Multilayer Perceptron Models, August 2020
  • Jeroen Brouns, Bridging the Domain-Gap in Computer Vision Tasks (at Philips Research, Eindhoven), December 2019
  • Daniel Ballesteros Castilla, Deep Reinforcement Learning for Intraday Power Trading (at ENGIE Global Markets, Brussels), December 2019
  • Mauro Comi (cum laude), Deep Reinforcement Learning for Light Transport Path Guiding, November 2019
  • Saurabh Bahulikar, Unsupervised Learning for Early Detection of Merchant Risk in Payments (at Payvision, Amsterdam), November 2019
  • Thomas Hagebols, Block-sparse evolutionary training using weight momentum evolution: training methods for hardware efficient sparse neural networks (at Philips Research, Eindhoven), March 2019
  • Bram Linders, Prediction and reduction of MRP nervousness by parameterization from a cost perspective (2nd supervisor, at Prodrive Technologies), February 2019
  • Joost Pieterse (cum laude), Evolving sparse neural networks using cosine similarity, July 2018


Conference tutorials

  • D.C. Mocanu, E. Mocanu, T. Pinto, Z. Vale, Scalable Deep Learning: How far is one billion neurons?, European Conference on Artificial Intelligence (ECAI 2020), Materials
  • D.C. Mocanu, E. Mocanu, T. Pinto, Z. Vale, Scalable Deep Learning: How far is one billion neurons?, International Joint Conference on Artificial Intelligence (IJCAI 2020), Materials
  • D.C. Mocanu, E. Mocanu, P.H. Nguyen, M. Gibescu, Z. Vale, D. Ernst, Scalable Deep Learning: from theory to practice (T11), International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2019), Link, Materials
  • D.C. Mocanu, E. Mocanu, Z. Vale, D. Ernst, Scalable Deep Learning: from theory to practice (T24), International Joint Conference on Artificial Intelligence (IJCAI 2019), Link, Materials
  • E. Mocanu, D.C. Mocanu, Scalable Deep Learning: from theory to practice, European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD 2019), Link, Materials

Affiliated with study programmes



Courses academic year 2021/2022

Courses for the current academic year are added as soon as they are finalised in the Osiris system. The list may therefore not yet be complete for the entire academic year.

Courses academic year 2020/2021



Universiteit Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Zilverling (building no. 11)
Hallenweg 19
7522 NH Enschede



Universiteit Twente
Faculty of Electrical Engineering, Mathematics and Computer Science
Postbus 217
7500 AE Enschede