Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
code for AAAI2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
🚇 Archive of daily ridership data from BART.
This repository contains the implementation of a Transformer-based model for abstractive text summarization and a rule-based approach for extractive text summarization.
Point-and-click bartCause analysis and causal inference education
Stochastic tree ensembles (BART / XBART) for supervised learning and causal inference
Boundary Banter is a project focused on automating the generation of cricket news from live text commentary using advanced Natural Language Processing (NLP) techniques.
Our BART-based Natural Language Processing model generates answerable open questions. Made for a Language Technology Practical course.
The Role of Model Architecture and Scale in Predicting Molecular Properties: Insights from Fine-Tuning RoBERTa, BART, and LLaMA
This project aims to simplify and summarize scientific papers, convert them to an audio format as a podcast, and create a PowerPoint presentation from the paper. This helps researchers, academics, and students alike.
Instruction fine-tuning BART for Dialogue Summarization | IT4772E | NLP Project 20232
BrainHack 2024 competition repository for the TIL-AI category in the Novice track for Team dingdongs.
Document summarizer using NLP and LLMs, built on the BART model.
Fine-tuning is a cost-efficient way of preparing a model for specialized tasks. It reduces the required training time as well as the size of the training dataset. Since open-source pre-trained models are available, we do not need to perform full training every time we create a model.
Cybertron: the home planet of the Transformers in Go
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
The Large Language Model for Hydrogen Storage project uses advanced natural language processing to improve research efficiency. It offers concise summaries and answers questions about hydrogen storage research papers, helping users quickly understand key insights and the latest advancements.
This repository contains implementations of abstractive text summarization using RNN, RNN with reinforcement learning, and Transformer architectures.
ECE-5424 Advanced Machine Learning Final Project - LLM Prompt Recovery task