[API] Update Kayra presets
Aedial committed Aug 16, 2023
1 parent 7802373 commit af9ebf2
Showing 6 changed files with 104 additions and 25 deletions.
2 changes: 1 addition & 1 deletion docs/source/novelai_api/Full_list_of_modules.md
@@ -1,5 +1,5 @@
Calliope, Snek, and Genji have no module support.
"special_openings" is a module spcifically trained to replace the previously used preamble. It is used at the beginning of the story, under certain conditions.
"special_openings" is a module specifically trained to replace the previously used preamble. It is used at the beginning of the story, under certain conditions.

<br/>

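For reference, a module like "special_openings" is normally requested through the "prefix" field of the generation parameters, alongside the sampling settings that the preset files below configure. A tiny illustrative sketch in Python (the exact field names and values around "prefix" are assumptions for illustration, not taken from this commit):

```python
# Illustrative only: a module is selected via the "prefix" field of the
# generation parameters; the other fields are placeholder sampling values.
parameters = {
    "prefix": "special_openings",  # module applied at the start of a story
    "temperature": 1.0,
    "max_length": 40,
    "min_length": 1,
}
```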
28 changes: 14 additions & 14 deletions novelai_api/presets/presets_kayra_v1/Asper.preset
@@ -1,23 +1,23 @@
{
"presetVersion": 3,
"name": "Asper",
"id": "fa6824e5-9313-4b29-bde8-cc59a35831de",
"id": "46d559d7-8f0f-4e3f-9dbf-578d0b8de9e4",
"remoteId": "",
"parameters": {
"textGenerationSettingsVersion": 5,
"temperature": 1.35,
"temperature": 1.16,
"max_length": 40,
"min_length": 1,
"top_k": 225,
"top_p": 0.99,
"top_a": 0.12,
"typical_p": 0.975,
"tail_free_sampling": 0.984,
"repetition_penalty": 1.7,
"repetition_penalty_range": 3200,
"repetition_penalty_slope": 0,
"top_k": 175,
"top_p": 0.998,
"top_a": 0.004,
"typical_p": 0.96,
"tail_free_sampling": 0.994,
"repetition_penalty": 1.68,
"repetition_penalty_range": 2240,
"repetition_penalty_slope": 1.5,
"repetition_penalty_frequency": 0,
"repetition_penalty_presence": 0.02,
"repetition_penalty_presence": 0.005,
"repetition_penalty_default_whitelist": true,
"cfg_scale": 1,
"cfg_uc": "",
@@ -47,11 +47,11 @@
"enabled": true
},
{
"id": "top_g",
"enabled": true
"id": "top_a",
"enabled": false
},
{
"id": "top_a",
"id": "top_g",
"enabled": false
},
{
68 changes: 68 additions & 0 deletions novelai_api/presets/presets_kayra_v1/Cosmic Cube.preset
@@ -0,0 +1,68 @@
{
"presetVersion": 3,
"name": "Cosmic Cube",
"id": "7a428c5f-a054-4663-8edd-e24c50cafeea",
"remoteId": "",
"parameters": {
"textGenerationSettingsVersion": 5,
"temperature": 0.9,
"max_length": 40,
"min_length": 1,
"top_k": 0,
"top_p": 1,
"top_a": 0,
"typical_p": 0.924,
"tail_free_sampling": 0.92,
"repetition_penalty": 3,
"repetition_penalty_range": 4000,
"repetition_penalty_slope": 0,
"repetition_penalty_frequency": 0,
"repetition_penalty_presence": 0,
"repetition_penalty_default_whitelist": false,
"cfg_scale": 1.48,
"cfg_uc": "",
"phrase_rep_pen": "off",
"top_g": 0,
"mirostat_tau": 4.95,
"mirostat_lr": 0.22,
"order": [
{
"id": "mirostat",
"enabled": true
},
{
"id": "cfg",
"enabled": true
},
{
"id": "typical_p",
"enabled": true
},
{
"id": "temperature",
"enabled": true
},
{
"id": "tfs",
"enabled": true
},
{
"id": "top_k",
"enabled": false
},
{
"id": "top_a",
"enabled": false
},
{
"id": "top_g",
"enabled": false
},
{
"id": "top_p",
"enabled": false
}
]
},
"model": "kayra-v1"
}
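Each *.preset file touched by this commit is plain JSON with the same shape as the Cosmic Cube preset above: a top-level name, id, and model, plus a parameters object whose "order" array decides which samplers are enabled and in what sequence they apply. A minimal sketch for inspecting one of these files with only the standard library (the path is just an example checkout location; this does not assume anything about the library's own preset loader):

```python
import json
from pathlib import Path

# Example path; point it at wherever the repository is checked out.
preset_path = Path("novelai_api/presets/presets_kayra_v1/Cosmic Cube.preset")

with preset_path.open(encoding="utf-8") as f:
    preset = json.load(f)

params = preset["parameters"]
print(f'{preset["name"]} (model: {preset["model"]})')
print("temperature:", params["temperature"])

# Samplers apply in the order listed; disabled entries are skipped.
enabled_samplers = [entry["id"] for entry in params["order"] if entry["enabled"]]
print("enabled samplers, in order:", enabled_samplers)
```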
2 changes: 1 addition & 1 deletion novelai_api/presets/presets_kayra_v1/Green Active Writer.preset
@@ -1,7 +1,7 @@
{
"presetVersion": 3,
"name": "Green Active Writer",
"id": "c07c743f-e92e-4e30-9903-0f0909df176c",
"id": "8ab0fd2d-f746-4c5f-81b5-5f57211bd67c",
"remoteId": "",
"parameters": {
"textGenerationSettingsVersion": 5,
23 changes: 17 additions & 6 deletions novelai_api/presets/presets_kayra_v1/Stelenes.preset
@@ -1,27 +1,30 @@
{
"presetVersion": 3,
"name": "Stelenes",
"id": "7dad434e-706f-4af8-8011-3284c8a83540",
"id": "3556df5b-401e-453a-8a51-4a41537cd0c9",
"remoteId": "",
"parameters": {
"textGenerationSettingsVersion": 4,
"textGenerationSettingsVersion": 5,
"temperature": 2.5,
"max_length": 40,
"min_length": 1,
"top_k": 0,
"top_p": 1,
"top_a": 1,
"typical_p": 0.966,
"tail_free_sampling": 0.933,
"typical_p": 0.969,
"tail_free_sampling": 0.941,
"repetition_penalty": 1,
"repetition_penalty_range": 2048,
"repetition_penalty_range": 1024,
"repetition_penalty_slope": 0,
"repetition_penalty_frequency": 0,
"repetition_penalty_presence": 0,
"repetition_penalty_default_whitelist": true,
"cfg_scale": 1,
"cfg_uc": "",
"phrase_rep_pen": "aggressive",
"phrase_rep_pen": "medium",
"top_g": 0,
"mirostat_tau": 0,
"mirostat_lr": 1,
"order": [
{
"id": "top_k",
@@ -50,6 +53,14 @@
{
"id": "top_a",
"enabled": false
+ },
+ {
+ "id": "top_g",
+ "enabled": false
+ },
+ {
+ "id": "mirostat",
+ "enabled": false
}
]
},
6 changes: 3 additions & 3 deletions novelai_api/presets/presets_kayra_v1/Writer's Daemon.preset
@@ -1,7 +1,7 @@
{
"presetVersion": 3,
"name": "Writer's Daemon",
"id": "c766b615-9853-4ee7-b6dc-0ecc379fc5db",
"id": "48fe62ed-a75d-4600-9eeb-8b75103769bc",
"remoteId": "",
"parameters": {
"textGenerationSettingsVersion": 5,
@@ -13,7 +13,7 @@
"top_a": 0.02,
"typical_p": 0.95,
"tail_free_sampling": 0.95,
"repetition_penalty": 1.6,
"repetition_penalty": 1.625,
"repetition_penalty_range": 2016,
"repetition_penalty_slope": 0,
"repetition_penalty_frequency": 0,
@@ -24,7 +24,7 @@
"phrase_rep_pen": "very_aggressive",
"top_g": 0,
"mirostat_tau": 5,
"mirostat_lr": 0.2,
"mirostat_lr": 0.25,
"order": [
{
"id": "cfg",

3 comments on commit af9ebf2

@Devansh-kajve commented on af9ebf2 Oct 22, 2023

Hey everyone @Aedial @bdavs @arthus-leroy @Cikmo

I am working on a game project and want to integrate this NovelAI API into it.

I can run generatetext.py fine, but I don't understand how to provide the context of previous generations and continue the same story like in NovelAI; every time I call it, it's just an individual generation.

I did not know how else to reach out, so I am commenting here. Can anyone please help me with it?

@Aedial (Owner, Author) commented on af9ebf2 Oct 22, 2023

Hello @Devansh-kajve,

The context is all one big box where you put what the AI needs to see (the prompt argument). As such, you should append the new text to the old and cut the text to fit the context size allowed by your subscription tier (3k for Tablet, 6k for Scroll, 8k for Opus, generation length included). If you do not cut, the AI will cut from the top of the context, which might land in the middle of a sentence, so it is important to do the cutting on your side.

Next time you want to ask a question, open an issue; I will get notified of it. There is no need to ping contributors.
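For illustration, a minimal sketch of that loop: generate() below is a stand-in for however you actually call the API (for example, code adapted from the repository's example scripts), and the trimming uses a rough character budget, whereas a real implementation should count tokens with the model's tokenizer against your tier's limit. All names here are assumptions, not part of the library.

```python
# Hypothetical sketch: keep one growing "story" string, trim it to the
# allowed context budget, and feed the trimmed text back in as the prompt.

MAX_CONTEXT_CHARS = 8000 * 4  # rough stand-in; count real tokens in practice


def generate(prompt: str) -> str:
    """Stand-in for the real API call (e.g. adapted from the example scripts)."""
    # A canned reply keeps the sketch runnable on its own.
    return " The crew checked the moorings and set a course north."


def trim_to_budget(text: str, budget: int = MAX_CONTEXT_CHARS) -> str:
    """Cut from the top so the most recent text is kept, starting at a line break."""
    if len(text) <= budget:
        return text
    cut = text[-budget:]
    newline = cut.find("\n")
    return cut[newline + 1:] if newline != -1 else cut


story = "The airship lifted off at dawn."  # initial prompt
for _ in range(3):
    prompt = trim_to_budget(story)   # what the AI gets to see
    continuation = generate(prompt)  # one generation step
    story += continuation            # append so the next call sees it

print(story)
```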

@Devansh-kajve

Thanks for replying. I do have some other questions; I will ask them in an issue.
