anbestCL/XLNetforSummarisation

XLNet for Summarisation

This repository contains the code for the master's thesis Summarization of Texts using a Pretrained Language Model.

Abstract

Large pre-trained language models have already proven useful for several downstream tasks. For abstractive summarization, they have either been used as input for sequence-to-sequence models or fine-tuned to the task specifically. This thesis explores to what extent the Transformer-based language model XLNet can be applied to abstractive summarization without fine-tuning. ROUGE scores indicate that for unsupervised summarization, XLNet does not reach state-of-the-art levels. Applying a repetition penalty, and possibly adding a prompt, can however improve performance.
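The repetition penalty mentioned in the abstract is commonly implemented CTRL-style: logits of tokens that already appear in the generated sequence are scaled down before sampling. The sketch below is a minimal, hypothetical illustration of that idea, not the thesis code; the function name and penalty value are assumptions.

```python
import numpy as np

def apply_repetition_penalty(logits, generated_ids, penalty=1.2):
    """Scale down the logits of already-generated tokens (CTRL-style):
    positive logits are divided by the penalty, negative ones multiplied,
    so repeated tokens always become less likely. Illustrative sketch only."""
    logits = logits.copy()
    for token_id in set(generated_ids):
        if logits[token_id] > 0:
            logits[token_id] /= penalty
        else:
            logits[token_id] *= penalty
    return logits

# Toy vocabulary of 4 tokens; tokens 2 and 3 were already generated.
logits = np.array([1.0, 0.5, 2.0, -1.0])
penalized = apply_repetition_penalty(logits, generated_ids=[2, 3], penalty=2.0)
# Token 2's logit drops from 2.0 to 1.0; token 3's from -1.0 to -2.0.
```

A penalty of 1.0 leaves the distribution unchanged; values above 1.0 discourage repetition more strongly.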

About

Abstractive text summarisation using XLNet.
