An extensible neural machine translation toolkit built around the Transformer model, implemented in TensorFlow. Supports multi-GPU training and gradient aggregation for large-scale experimentation. The Transformer implementation is now part of [Nematus](https://github.com/EdinburghNLP/nematus).
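The toolkit's own API is not shown here, so the following is only a framework-agnostic sketch of the gradient-aggregation idea it mentions: gradients are accumulated over several micro-batches and applied in a single optimizer step with their mean, simulating a batch larger than fits on one device. All names (`aggregated_sgd_step`, the toy loss) are illustrative, not part of the toolkit.

```python
import numpy as np

def aggregated_sgd_step(params, micro_batches, grad_fn, lr=0.1):
    """Accumulate gradients over several micro-batches, then apply one
    SGD update with their mean -- the core of gradient aggregation."""
    accum = np.zeros_like(params)
    for batch in micro_batches:
        accum += grad_fn(params, batch)
    mean_grad = accum / len(micro_batches)
    return params - lr * mean_grad

# Toy objective: squared distance to each micro-batch mean, so the
# gradient is 2 * (params - batch_mean).
grad_fn = lambda p, b: 2.0 * (p - b.mean(axis=0))

params = np.array([0.0, 0.0])
micro_batches = [np.array([[1.0, 2.0]]), np.array([[3.0, 4.0]])]
params = aggregated_sgd_step(params, micro_batches, grad_fn, lr=0.5)
```

With `lr=0.5` the single aggregated step lands exactly on the mean of the two batch means, which is the same update a single large batch would produce.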
Master's thesis project. A fully unsupervised model for automated, language-agnostic simplification of natural language sentences via information density reduction. Implemented in TensorFlow. Results were inconclusive; the project is not actively maintained.
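The thesis model itself is not reproduced here; as a hypothetical illustration of the underlying "information density" notion, the sketch below measures density as mean per-token surprisal under a unigram model, so that removing a rare (high-surprisal) token lowers the sentence's density. The unigram choice and the `oov_bits` fallback are assumptions for this example only.

```python
import math
from collections import Counter

def surprisal_table(corpus_tokens):
    """Per-token surprisal -log2 p(t) under a unigram model of the corpus."""
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    return {t: -math.log2(c / total) for t, c in counts.items()}

def information_density(tokens, surprisal, oov_bits=10.0):
    """Mean per-token surprisal of a sentence, in bits per token.
    Unknown tokens get a fixed penalty (assumed value, not from the thesis)."""
    return sum(surprisal.get(t, oov_bits) for t in tokens) / len(tokens)

# Toy corpus: "the" occurs twice (lower surprisal), content words once each.
table = surprisal_table("the cat sat on the mat".split())

original = "the cat sat".split()
simplified = "the sat".split()  # one rare, high-surprisal token dropped
dens_orig = information_density(original, table)
dens_simp = information_density(simplified, table)
```

Here `dens_simp < dens_orig`, i.e. dropping the rarest token reduces the sentence's information density, which is the quantity the thesis seeks to lower.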