We are a multi-disciplinary team of scientists and engineers who like doing research for fun. Our only objective is
good machine learning research that is useful and interesting. Our collaborators include researchers from labs like
Google Brain, Oxford University, and
Toronto's Vector Institute.
Our published experiments and tools can be found on our GitHub.
Targeted Dropout
Neural networks are extremely flexible models owing to their large number of parameters, which is beneficial for learning but leaves them highly redundant. This redundancy makes it possible to compress neural networks without a drastic effect on performance. We introduce targeted dropout, a strategy for post hoc pruning of neural network weights and units that builds the pruning mechanism directly into learning. At each weight update, targeted dropout selects a candidate set for pruning using a simple selection criterion, and then stochastically prunes the network via dropout applied to this set. The resulting network learns to be explicitly robust to pruning, comparing favourably to more complicated regularization schemes while being extremely simple to implement and easy to tune.
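As a rough illustration of the idea, here is a minimal NumPy sketch of one weight-level targeted dropout step, assuming weight magnitude as the selection criterion; the function name and the `targ_rate`/`drop_rate` parameters are illustrative, not the paper's API:

```python
import numpy as np

def targeted_dropout(weights, targ_rate=0.5, drop_rate=0.5, rng=None):
    """One sketched step of magnitude-based targeted weight dropout.

    targ_rate: fraction of weights (smallest magnitude) chosen as
               pruning candidates -- the "targeted" set.
    drop_rate: probability of dropping each candidate weight.
    """
    rng = rng or np.random.default_rng(0)
    w = weights.ravel()
    k = int(targ_rate * w.size)              # size of the candidate set
    # Candidates: indices of the k smallest-magnitude weights.
    idx = np.argsort(np.abs(w))[:k]
    mask = np.ones_like(w)
    # Stochastically drop each candidate with probability drop_rate;
    # weights outside the candidate set are never dropped.
    drop = rng.random(k) < drop_rate
    mask[idx[drop]] = 0.0
    return (w * mask).reshape(weights.shape)
```

Because only the low-magnitude candidate set is ever dropped, the network is trained to tolerate exactly the deletions a later magnitude-based pruning pass would make.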
Unsupervised Cipher Cracking Using Discrete GANs
This work details CipherGAN, an architecture inspired by CycleGAN, used for inferring the underlying cipher mapping given banks of unpaired ciphertext and plaintext. We demonstrate that CipherGAN is capable of cracking language data enciphered using shift and Vigenère ciphers to a high degree of fidelity, and for vocabularies much larger than previously achieved. We show how CycleGAN can be made compatible with discrete data and trained stably. We then prove that the technique used in CipherGAN avoids the common problem of uninformative discrimination associated with GANs applied to discrete data.
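For context on the ciphers involved, here is a minimal sketch of the Vigenère cipher, with the shift cipher as the length-1-key special case. This illustrates the enciphering that CipherGAN learns to invert from unpaired data; it is not part of CipherGAN itself, and the function name and alphabet are illustrative:

```python
def vigenere(text, key, alphabet="abcdefghijklmnopqrstuvwxyz", decrypt=False):
    """Vigenère cipher: shift each letter by the matching key letter.

    A shift (Caesar) cipher is the special case of a length-1 key.
    """
    n = len(alphabet)
    shifts = [alphabet.index(c) for c in key]
    out = []
    for i, c in enumerate(text):
        s = shifts[i % len(shifts)]      # key repeats over the text
        if decrypt:
            s = -s                       # decryption shifts backwards
        out.append(alphabet[(alphabet.index(c) + s) % n])
    return "".join(out)
```

For example, `vigenere("attackatdawn", "lemon")` gives the classic ciphertext `"lxfopvefrnhr"`; CipherGAN's task is to recover such a mapping given only unpaired banks of plaintext and ciphertext.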
Join The Team!
We're looking for individuals interested in contributing to cross-institutional research projects on machine learning and neural networks. We recruit students and industry members globally. Our only requirement is a strong background in computer science, mathematics, and statistics. Remote work is our modus operandi, although we regularly organize in-person sessions in cities with more than one member.
If you're interested in deep learning and neural networks, and are well-versed in calculus and statistics, then you're a perfect fit.
The team is volunteer-run, but we do provide mentorship and authorship opportunities, and we have a rich set of tools and training infrastructure at our disposal!
Email firstname.lastname@example.org with your resume and a brief note on your interest in ML and any relevant background (courses taken, projects completed, etc.).