Fine-tuning a Transformer model using

Recently I set out to train a Transformer model, based on DistilGPT2, to write something like my mother's poetry.

After much searching for the most concise way to do this, I think I've figured out a reasonably easy-to-understand approach that works for me in Google Colab.
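To give a sense of what this kind of fine-tuning involves, here is a minimal sketch of one common approach: the Hugging Face `transformers` Trainer API with PyTorch. The library choice, the file name `poems.txt`, the block size, and all hyperparameters are my assumptions for illustration, not necessarily what the post goes on to use.

```python
# Sketch: fine-tuning DistilGPT2 on a plain-text poetry corpus.
# Assumptions (not from the post): transformers + PyTorch installed,
# corpus in a UTF-8 file "poems.txt", block size 128, 3 epochs.
import torch
from torch.utils.data import Dataset


def chunk_token_ids(ids, block_size):
    """Split one long list of token ids into fixed-size training
    blocks, dropping the ragged tail."""
    return [ids[i:i + block_size]
            for i in range(0, len(ids) - block_size + 1, block_size)]


class PoetryDataset(Dataset):
    """Wraps the tokenized corpus as fixed-length causal-LM examples."""

    def __init__(self, path, tokenizer, block_size=128):
        text = open(path, encoding="utf-8").read()
        ids = tokenizer(text)["input_ids"]
        self.blocks = chunk_token_ids(ids, block_size)

    def __len__(self):
        return len(self.blocks)

    def __getitem__(self, i):
        x = torch.tensor(self.blocks[i])
        # For causal language modeling, labels are the inputs;
        # the model shifts them internally when computing the loss.
        return {"input_ids": x, "labels": x}


def main(corpus_path="poems.txt"):
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              Trainer, TrainingArguments)
    tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
    model = AutoModelForCausalLM.from_pretrained("distilgpt2")
    train_ds = PoetryDataset(corpus_path, tokenizer)
    args = TrainingArguments(output_dir="poetry-model",
                             num_train_epochs=3,
                             per_device_train_batch_size=4)
    Trainer(model=model, args=args, train_dataset=train_ds).train()
    model.save_pretrained("poetry-model")
    tokenizer.save_pretrained("poetry-model")

# In a Colab cell you would run, e.g.: main("poems.txt")
```

Downloading the model and training happen only when `main()` is called, so the heavy work stays inside a Colab session with a GPU runtime.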
