An extendable neural machine translation toolkit built around the Transformer model and implemented in TensorFlow. Supports multi-GPU training and gradient aggregation for large-scale experimentation. Transformer implementation now part of [Nematus](https://github.com/EdinburghNLP/nematus).
PyTorch re-implementation of the [*Noise Contrastive Estimation*](http://proceedings.mlr.press/v9/gutmann10a/gutmann10a.pdf) algorithm. Created as a practice exercise.
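The core of NCE is to replace the intractable softmax normalisation with a binary classification task: distinguish observed data from samples drawn from a known noise distribution $q$, scoring each candidate with $\sigma(s(w) - \log(k\,q(w)))$. A minimal sketch of that loss in plain Python (function and argument names are illustrative, not from the repository):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def nce_loss(data_score, noise_scores, log_q_data, log_q_noise, k):
    """Binary NCE loss for one observed item and k noise samples.

    data_score:   unnormalised model log-score s(w) for the observed item
    noise_scores: list of k unnormalised model log-scores for noise samples
    log_q_data:   log q(w) of the observed item under the noise distribution
    log_q_noise:  list of k values log q(w) for the noise samples
    """
    # P(D=1 | w) = sigmoid(s(w) - log(k * q(w))): classify the data point as "data"
    true_logit = data_score - (log_q_data + math.log(k))
    loss = -math.log(sigmoid(true_logit))
    # ... and each noise sample as "noise"
    for s, lq in zip(noise_scores, log_q_noise):
        noise_logit = s - (lq + math.log(k))
        loss -= math.log(1.0 - sigmoid(noise_logit))
    return loss
```

Minimising this objective drives the model's unnormalised scores toward the true data log-probabilities without ever computing a partition function.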
PyTorch re-implementation of Mueller et al.'s [*Siamese Recurrent Architectures for Learning Sentence Similarity*](https://dl.acm.org/doi/10.5555/3016100.3016291) (AAAI, 2016). Created as a practice exercise.
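The model in that paper (often called MaLSTM) encodes each sentence of a pair with a shared LSTM and scores the pair with a similarity of the form $\exp(-\lVert h_a - h_b \rVert_1)$ over the final hidden states, which maps to $(0, 1]$. A sketch of just that similarity function (vector inputs stand in for the LSTM states; names are illustrative):

```python
import math

def malstm_similarity(h_a, h_b):
    """Similarity exp(-||h_a - h_b||_1) between two sentence encodings.

    h_a, h_b: final hidden-state vectors of the shared (siamese) LSTM,
    given here as plain lists of floats for illustration.
    """
    # L1 distance between the two encodings, then squashed into (0, 1]
    l1 = sum(abs(a - b) for a, b in zip(h_a, h_b))
    return math.exp(-l1)
```

Identical encodings score 1.0, and the score decays toward 0 as the encodings diverge, which matches the bounded similarity labels used for training.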
Master's thesis project. A fully unsupervised model for automated, language-agnostic simplification of natural language sentences via information density reduction. Implemented in TensorFlow. Results were inconclusive; no longer actively maintained.