
tfs-mt
Transformer from scratch for Machine Translation



This project implements the Transformer architecture from scratch, with machine translation as the use case. It is intended primarily as an educational resource and provides a functional implementation of the architecture along with the training and inference logic.
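To ground the idea of implementing the architecture "from scratch", here is a minimal NumPy sketch of scaled dot-product attention, the core operation the Transformer is built on (this is the textbook formulation, not code copied from the project):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Textbook attention: softmax(q k^T / sqrt(d_k)) v.

    q, k, v are (seq_len, d_k) arrays; in the real model this runs
    per head on batched tensors.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With zero queries the scores are uniform, so each output row is simply the mean of the value rows — a quick sanity check for any attention implementation.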

Getting started

From PyPI

pip install tfs-mt

From source

Prerequisites

Steps

git clone https://github.com/Giovo17/tfs-mt.git
cd tfs-mt

uv sync

cp .env.example .env
# Edit .env file with your configuration

Usage

Training

To start training the model with the default configuration:

uv run src/train.py
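Training a Transformer for translation conventionally uses teacher forcing: the decoder input is the target sequence shifted right by one position, and the loss compares each decoder output against the next target token. A small sketch of that shift (the token strings and helper name are illustrative, not taken from the project):

```python
BOS, EOS = "<bos>", "<eos>"

def shift_for_teacher_forcing(target):
    """Illustrative helper: build decoder input and labels from a target sequence.

    The decoder sees the target prefixed with <bos>; the loss is computed
    against the target suffixed with <eos>, so position i predicts token i+1.
    """
    decoder_input = [BOS] + target
    labels = target + [EOS]
    return decoder_input, labels
```

Both sequences have the same length, so the cross-entropy loss can be computed position-by-position in a single forward pass.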

Inference

To run inference using the trained model from the Hugging Face repository:

uv run src/inference.py
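At inference time, a translation is typically produced autoregressively: starting from a beginning-of-sequence token, the model is queried for the next token until it emits end-of-sequence. A minimal greedy-decoding sketch, where `step_fn` is a hypothetical callable standing in for a model forward pass (the project's actual decoding strategy may differ, e.g. beam search):

```python
def greedy_decode(step_fn, bos_id, eos_id, max_len=50):
    """Greedy autoregressive decoding sketch.

    step_fn(prefix) -> list of next-token logits (hypothetical interface).
    Repeatedly appends the argmax token until <eos> or max_len.
    """
    tokens = [bos_id]
    for _ in range(max_len):
        logits = step_fn(tokens)
        next_id = max(range(len(logits)), key=logits.__getitem__)
        tokens.append(next_id)
        if next_id == eos_id:
            break
    return tokens
```

The `max_len` cap guards against a model that never emits end-of-sequence.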

Configuration

All project parameters can be configured in src/tfs_mt/configs/config.yml. Key configuration groups include:

  • Model architecture: layer configuration, dropout, GloVe embedding initialization, ...
  • Training: optimizer, learning rate scheduler, number of epochs, ...
  • Data: dataset, dataloader, tokenizer, ...
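As an illustration of the shape such a file might take — the field names below are assumptions, not the project's actual schema; consult src/tfs_mt/configs/config.yml for the real keys:

```yaml
# Hypothetical sketch only; field names are NOT taken from the project.
model:
  num_layers: 6
  d_model: 512
  num_heads: 8
  dropout: 0.1
  glove_init: true
training:
  optimizer: adam
  lr_scheduler: warmup
  epochs: 20
data:
  tokenizer: bpe
  batch_size: 64
```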

License

  • Source code: licensed under the MIT License.

    • Note: This project includes modified code derived from PyTorch Ignite, which is licensed under the BSD 3-Clause License. See the LICENSE file for the full text of both licenses and original copyright notices.
  • Documentation: located in the docs/ directory, licensed under Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0). See docs/LICENSE.

Citation

If you use tfs-mt in your research or project, please cite:

@software{Spadaro_tfs-mt,
author = {Spadaro, Giovanni},
licenses = {MIT, CC BY-SA 4.0},
title = {{tfs-mt}},
url = {https://github.com/Giovo17/tfs-mt}
}