AdaSeq: An All-in-One Library for Developing State-of-the-Art Sequence Understanding Models


English | 简体中文

Introduction

AdaSeq (Alibaba Damo Academy Sequence Understanding Toolkit) is an easy-to-use all-in-one library, built on ModelScope, that allows researchers and developers to train custom models for sequence understanding tasks, including part-of-speech tagging (POS Tagging), chunking, named entity recognition (NER), entity typing, relation extraction (RE), etc.

🌟 Features:
  • Plentiful Models:

    AdaSeq provides plenty of cutting-edge models, training methods and useful toolkits for sequence understanding tasks.

  • State-of-the-Art:

    We aim to develop the best implementations, which outperform many off-the-shelf frameworks in terms of performance.

  • Easy-to-Use:

    A single command is all you need to obtain the best model.

  • Extensible:

    It's easy to register a module or build a customized sequence understanding model by assembling predefined modules (see the sketch after this list for the general idea).
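As a concrete illustration of that extensibility, here is a minimal, generic sketch of a decorator-based registry, the usual mechanism behind this kind of plugin system. All names in it (MODELS, register_model, MySequenceLabelingModel) are hypothetical and are not AdaSeq's actual API; consult the AdaSeq documentation and source for the real registration interface.

```python
# Illustrative only: a generic decorator-based registry of the kind used for
# plugin-style extension points. Names below are hypothetical, not AdaSeq's API.
from typing import Callable, Dict, Type

MODELS: Dict[str, Type] = {}  # maps a string name used in a config to a model class


def register_model(name: str) -> Callable[[Type], Type]:
    """Register a model class under `name` so a config file can refer to it by string."""
    def decorator(cls: Type) -> Type:
        MODELS[name] = cls
        return cls
    return decorator


@register_model('my-sequence-labeling-model')
class MySequenceLabelingModel:
    """A custom model assembled from predefined modules (embedder, encoder, decoder)."""
    def __init__(self, embedder, encoder, decoder):
        self.embedder, self.encoder, self.decoder = embedder, encoder, decoder


# A training config can then select the model by its registered name:
model_cls = MODELS['my-sequence-labeling-model']
```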

⚠️ Notice: This project is under rapid development, so some interfaces may change in the future.

📢 What's New

⚡ Quick Experience

You can try out our models via online demos built on ModelScope: [English NER] [Chinese NER] [CWS]
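If you prefer calling the released models from Python rather than through a web demo, models published on ModelScope can be loaded with the modelscope pipeline API. The snippet below is a minimal sketch assuming modelscope is installed; the model ID is a placeholder, so substitute one taken from the Modelcards page mentioned below.

```python
# Minimal sketch: load a released NER model through ModelScope's pipeline API.
# The model ID below is a placeholder; copy a real ID from a modelcard.
from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

ner_pipeline = pipeline(
    Tasks.named_entity_recognition,         # standard ModelScope task name
    model='damo/<model-id-from-modelcard>'  # placeholder, not a real model ID
)
print(ner_pipeline('AdaSeq is developed by Alibaba DAMO Academy.'))
```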

More tasks, more languages, more domains: all the modelcards we have released can be found on the Modelcards page.

🛠️ Model Zoo

Supported models:

💾 Dataset Zoo

We have collected many datasets for sequence understanding tasks. All of them can be found on the Datasets page.

📦 Installation

AdaSeq requires Python >= 3.7 and PyTorch >= 1.8.

  • Installation via pip:

    pip install adaseq

  • Installation from source:

    git clone https://github.com/modelscope/adaseq.git
    cd adaseq
    pip install -r requirements.txt -f https://modelscope.oss-cn-beijing.aliyuncs.com/releases/repo.html

Verify the Installation

To verify whether AdaSeq is installed properly, we provide a demo config for training a model (the demo config will be automatically downloaded).

adaseq train -c demo.yaml

You will see the training logs in your terminal. Once training is done, the results on the test set will be printed: test: {"precision": xxx, "recall": xxx, "f1": xxx}. A folder experiments/toy_msra/ will be created to store all experiment outputs and model checkpoints.

📖 Tutorials

📝 Contributing

All contributions to improve AdaSeq are welcome. Please refer to CONTRIBUTING.md for the contributing guidelines.

📄 License

This project is licensed under the Apache License (Version 2.0).