
Transformer

This repository contains a from-scratch implementation of the Transformer model, written in Python.

Demo

(Demo animation: Transformer Demo)

Project Structure

The project is organized into the following directories and files:

Files

  • model: Contains the core components of the Transformer model.
    • attention.py: Implements the attention mechanism.
    • decoder.py: Implements the decoder part of the Transformer.
    • embedding.py: Implements the embedding layer.
    • encoder.py: Implements the encoder part of the Transformer.
    • feedforward.py: Implements the feedforward neural network.
    • loss.py: Contains the loss function used for training.
    • multi_head_attention.py: Implements multi-head attention.
    • transformer.py: Assembles the full Transformer model.
  • 3_transformer.ipynb: Jupyter notebook demonstrating the use of the Transformer model.
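The attention mechanism in attention.py is presumably the standard scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, which the multi-head and encoder/decoder modules build on. A minimal NumPy sketch of that computation (illustrative only, not the repository's code):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v, mask=None):
    """Compute softmax(Q K^T / sqrt(d_k)) V for batched inputs.

    q, k, v: arrays of shape (batch, seq_len, d_k).
    mask: optional boolean array; False entries are blocked.
    """
    d_k = q.shape[-1]
    # Similarity scores, scaled to keep softmax gradients stable
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)  # (batch, seq_q, seq_k)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # masked positions get ~zero weight
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v  # weighted sum of values

# Toy example: batch of 1, sequence length 3, dimension 4
q = k = v = np.random.rand(1, 3, 4)
out = scaled_dot_product_attention(q, k, v)
```

Each output row is a convex combination of the value rows, so the result stays within the range of v.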

Installation

To run this project, you will need Python and the dependencies listed in requirements.txt. You can install them using pip:
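The usual command, assuming requirements.txt sits at the repository root:

```shell
pip install -r requirements.txt
```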

About

An English-to-German (EN-DE) translator based on the Transformer architecture.
