Merge pull request #66 from boun-tabi-LMG/gokceuludogan-patch-pip
Update installation commands & docs
gokceuludogan committed Feb 2, 2024
2 parents d10987d + c71f310 commit e068908
Showing 6 changed files with 150 additions and 134 deletions.
9 changes: 8 additions & 1 deletion README.md
@@ -16,7 +16,14 @@ Turkish LM Tuner is a library for fine-tuning Turkish language models on various
 
 ## Installation
 
-You can use the following command to install the library:
+You can install `turkish-lm-tuner` via PyPI:
+
+```bash
+
+pip install turkish-lm-tuner
+```
+
+Alternatively, you can use the following command to install the library:
 
 ```bash
 
10 changes: 9 additions & 1 deletion docs/index.md
@@ -16,9 +16,17 @@ Turkish LM Tuner is a library for fine-tuning Turkish language models on various
 
 ## Installation
 
-You can use the following command to install the library:
+You can install `turkish-lm-tuner` via PyPI:
+
+```bash
+
+pip install turkish-lm-tuner
+```
+
+Alternatively, you can use the following command to install the library:
 
 ```bash
 
 pip install git+https://github.com/boun-tabi-LMG/turkish-lm-tuner.git
 ```
 
6 changes: 3 additions & 3 deletions docs/tutorials/finetuning.ipynb
@@ -22,7 +22,7 @@
 "The library can be installed as follows:\n",
 "\n",
 "```bash\n",
-"pip install git+https://github.com/boun-tabi-LMG/turkish-lm-tuner.git\n",
+"pip install turkish-lm-tuner\n",
 "```"
 ]
 },
@@ -53,7 +53,7 @@
 "task = \"summarization\"\n",
 "task_mode = '' # either '', '[NLU]', '[NLG]', '[S2S]'\n",
 "task_format=\"conditional_generation\"\n",
-"model_name = \"boun-tabi-lmt/TURNA\"\n",
+"model_name = \"boun-tabi-LMG/TURNA\"\n",
 "max_input_length = 764\n",
 "max_target_length = 128\n",
 "\n",
@@ -121,8 +121,8 @@
 "\n",
 "model_trainer = TrainerForConditionalGeneration(\n",
 " model_name=model_name, task=task,\n",
-" optimizer_params=optimizer_params,\n",
 " training_params=training_params,\n",
+" optimizer_params=optimizer_params,\n",
 " model_save_path=\"turna_summarization_tr_news\",\n",
 " max_input_length=max_input_length,\n",
 " max_target_length=max_target_length, \n",
8 changes: 4 additions & 4 deletions docs/tutorials/getting-started.ipynb
@@ -20,7 +20,7 @@
 "`turkish-lm-tuner` can be installed as follows:\n",
 "\n",
 "```bash\n",
-"pip install git+https://github.com/boun-tabi-LMG/turkish-lm-tuner.git\n",
+"pip install turkish-lm-tuner\n",
 "```\n",
 "\n"
 ]
@@ -42,7 +42,7 @@
 "dataset_name = \"tr_news\"\n",
 "task = \"summarization\"\n",
 "task_format = \"conditional_generation\"\n",
-"model_name = \"boun-tabi-lmt/TURNA\"\n",
+"model_name = \"boun-tabi-LMG/TURNA\"\n",
 "max_input_length = 764\n",
 "max_target_length = 128\n",
 "\n",
@@ -76,13 +76,13 @@
 " 'optimizer_type': 'adafactor',\n",
 " 'scheduler': False\n",
 "}\n",
+"model_save_path = \"turna_summarization_tr_news\"\n",
 "\n",
 "# Finetuning the model\n",
-"model_trainer = TrainerForConditionalGeneration(model_name, task, optimizer_params, training_params, \"turna_summarization_tr_news\", max_input_length, max_target_length, dataset_processor.dataset.postprocess_data)\n",
+"model_trainer = TrainerForConditionalGeneration(model_name, task, training_params, optimizer_params, model_save_path, max_input_length, max_target_length, dataset_processor.dataset.postprocess_data)\n",
 "trainer, model = model_trainer.train_and_evaluate(train_dataset, eval_dataset, None)\n",
 "\n",
 "# Save the model\n",
-"model_save_path = \"turna_summarization_tr_news\"\n",
 "model.save_pretrained(model_save_path)\n",
 "dataset_processor.tokenizer.save_pretrained(model_save_path)\n",
 "```\n"
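The getting-started change above is more than cosmetic: the tutorial passed `training_params` and `optimizer_params` positionally, so a swapped order was silently accepted and each dict landed in the wrong parameter. A minimal, self-contained sketch of that failure mode, using a hypothetical stand-in constructor (not the library's real `TrainerForConditionalGeneration` signature), also shows why the keyword-argument style used in the finetuning notebook is immune to reordering:

```python
# Hypothetical stand-in for the trainer constructor; the real signature
# lives in turkish-lm-tuner and is only approximated here.
class Trainer:
    def __init__(self, model_name, task, training_params, optimizer_params,
                 model_save_path, max_input_length, max_target_length):
        self.training_params = training_params
        self.optimizer_params = optimizer_params
        self.model_save_path = model_save_path

training_params = {"num_train_epochs": 10}
optimizer_params = {"optimizer_type": "adafactor", "scheduler": False}

# Positional call with the two dicts swapped: Python accepts it silently,
# and the optimizer config ends up where the training config belongs.
buggy = Trainer("TURNA", "summarization", optimizer_params, training_params,
                "turna_summarization_tr_news", 764, 128)
assert buggy.training_params is optimizer_params  # wrong dict, no error raised

# Keyword arguments are order-independent, so line order cannot introduce
# this bug -- which is why the finetuning notebook's reordering is harmless.
fixed = Trainer("TURNA", "summarization",
                training_params=training_params,
                optimizer_params=optimizer_params,
                model_save_path="turna_summarization_tr_news",
                max_input_length=764, max_target_length=128)
assert fixed.training_params is training_params
```

Keeping a named `model_save_path` variable, as the corrected tutorial does, has the same effect for the path argument: the value is defined once before use instead of repeated as a positional literal.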