Commit 377ed8d

Get rid of multi-part prompt and instead use 2 requests with 2 independent prompts sent in parallel (#41)
## Summary

<!--AI:e4b5301d-->

Updates the script to eliminate the multi-part prompt and instead send two separate requests, with independent prompts, in parallel. This removes the need to parse the response and split it on a delimiter: each part of the prompt is handled independently and concurrently.

The previous multi-part prompt structure, which required specific formatting and delimiter-based separation of the template modification and the summary generation, has been replaced. The new implementation splits these tasks into distinct prompts, each with its own instructions and responsibilities, which makes the code easier to maintain.

### Essential Code Lines

```python
AI_PROMPT_INJECT_PLACEHOLDER = f"""
I am passing you the Pull Request Template text. Modify it according to the
instructions below and return the text back to me.
...
"""

AI_PROMPT_GENERATE_SUMMARY = f"""
I am passing you the Pull Request Title and the code diff. Generate the Pull
Request Summary according to the following instructions.
...
"""
```

These lines define the new structure for handling AI prompts: the placeholder injection and the summary generation are separated into distinct tasks, which enables parallel processing and clarifies the script's operations.

We could've used a "structured response" instead, but (a) it is complicated to implement (it requires manipulating a JSON Schema), and (b) it would complicate any future move to a local ChatGPT model (only the remote API supports structured responses). So instead, we just send 2 requests in parallel using the Task micro-framework.
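For context, here is roughly what the rejected "structured response" alternative would have involved. This is a hedged sketch: the schema shape, the `pr_texts` name, and the field names are invented for illustration, and the exact remote API request format may differ. The point is that every change to the two texts' contract would mean editing a hand-maintained JSON Schema inside the request body.

```python
import json

# Hypothetical illustration of the rejected alternative: constraining a single
# response with a JSON Schema instead of sending two plain prompts. All names
# below (pr_texts, template_with_placeholder, summary) are invented.
payload = {
    "model": "gpt-4o",  # assumed model name, for illustration only
    "messages": [{"role": "user", "content": "...title, template, and diff..."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "pr_texts",
            "schema": {
                "type": "object",
                "properties": {
                    "template_with_placeholder": {"type": "string"},
                    "summary": {"type": "string"},
                },
                "required": ["template_with_placeholder", "summary"],
                "additionalProperties": False,
            },
        },
    },
}

# This serialized schema would have to be kept in sync with the prompts by hand.
data = json.dumps(payload)
```

Two independent plain-text requests avoid maintaining such a schema entirely, and keep the door open for backends that only accept free-form prompts.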
## How was this tested?

```
python3 tests/test_ai.py
```

## PRs in the Stack

- ➡ #41

(The stack is managed by [git-grok](https://github.com/dimikot/git-grok).)
1 parent 9275b36 commit 377ed8d

File tree: 2 files changed (+114, −70 lines)


git-grok (104 additions, 69 deletions)

```diff
@@ -60,19 +60,14 @@ AI_WHO_YOU_ARE = """
 You are a senior software engineer who writes professional, concise, and
 informative pull request descriptions.
 """
-AI_PROMPT = f"""
-# Response format
-
-The response MUST consist of EXACTLY 2 parts separated by the EXACT
-"{AI_SEPARATOR}" marker located on a separate line. Those 2 parts are:
-Modified PR Template and Pull Request Summary.
-
-## 1st Part: Modified PR Template
-
-- The 1st part of the response is the exact text of the PR template that I
-  passed in the input, BUT with "{AI_PLACEHOLDER}" marker injected on a
-  separate line at the place where it will make sense to insert the Pull
-  Request Summary later.
+AI_PROMPT_INJECT_PLACEHOLDER = f"""
+I am passing you the Pull Request Template text. Modify it according to the
+instructions below and return the text back to me.
+
+- It must be the exact text of the Pull Request Template that I passed in
+  the input, BUT with "{AI_PLACEHOLDER}" marker injected on a separate line
+  at the place where it will make sense for me to later insert the Pull
+  Request Summary.
 - Your goal is to detect the single best place for this "{AI_PLACEHOLDER}"
   marker injection and inject it there. At the place where a human-readable
   summary would be expected.
@@ -82,14 +77,31 @@ AI_PROMPT = f"""
 - Only include the response text in that part. Don't add any comments or
   thoughts, just the text of the PR template with the "{AI_PLACEHOLDER}"
   marker injected at the best place.
+"""
+AI_PROMPT_GENERATE_SUMMARY = f"""
+I am passing you the Pull Request Title and the code diff. Generate the Pull
+Request Summary according to the following instructions.
 
-## 2nd Part: Pull Request Summary
+Guidelines for all of the texts you generate:
 
-- The 2nd part of the response must be a concise, informative, and
-  professional pull request summary. When building, infer the information
-  from the PR title and the diff provided. Don't be too short, but also
-  don't be too verbose: the result should be pleasant for a human engineer
-  to read and understand.
+- NEVER USE THE EXAGGERATION WORDS LIKE "CRITICAL", "CRUCIAL", "SIGNIFICANT"
+  AND ANY OTHER WORDS WITH SIMILAR MEANING. Avoid pompous phrases like "This
+  is a crucial step", "This change is foundational", "This is a significant
+  enhancement" etc. Overall, be humble, don't try to judge the PR change and
+  its importance; just provide the dry facts.
+- DON'T HALLUCINATE! If you're not sure about something, better not mention
+  it than hallucinate. Also, do not say "likely" - avoid guessing! If
+  unsure, try to infer hints from the PR title.
+
+# Top Part: Summary
+
+- Generate a concise, informative, and professional Pull Request Summary.
+  When building, infer the information from the Pull Request Title and the
+  diff provided.
+- Do not generate "Summary" sub-header or any other sub-headers. Instead,
+  just generate the text (one or multiple paragraphs).
+- Don't be too short, but also don't be too verbose: the result should be
+  pleasant for a human engineer to read and understand.
 - Use PR title; it is provided as a hint for the PRIMARY ESSENCE of the
   change in the diff (especially when unsure, or when there are multiple
   unrelated changes in the diff). The title is manually created by the diff
@@ -107,26 +119,23 @@ AI_PROMPT = f"""
 - NEVER TRY TO FILL ANY TEST STEPS OR TEST PLAN in the summary; only build
   human-readable text. Your goal is NOT to create test plans. Your goal is
   to only fill the human readable summary of the change.
-- Avoid pompous phrases like "This is a crucial step", "This change is
-  foundational" etc.
-- DON'T HALLUCINATE! If you're not sure about something, better not mention
-  it than hallucinate. Again, try to infer hints from the PR title.
 
-### Ending of the 2nd Part: Essential Code Lines (Optional)
+# Bottom part: Essential Code Lines (Optional)
 
-- In the end of the 2nd part text, extract 1-5 most "essential code lines"
-  from the diff and mention these extracted lines in a code block (markdown;
-  use the language tag in triple-backtick syntax).
-- Make sure that those "essential code lines" ARE SOMEHOW RELATED to the PR
+- In the end, extract 1-5 most "Essential Code Lines" from the diff and
+  mention these extracted lines in a code block (markdown; use the language
+  tag in triple-backtick syntax).
+- Use "Essential Code Lines" as a subheader.
+- Make sure that those "Essential Code Lines" ARE SOMEHOW RELATED to the PR
   title provided. Typically, when author of a PR writes its title, it has
   some essential code lines in mind; try to infer them based on the title if
   possible or when unsure.
 - If there is nothing interesting to extract, or if the markdown code
   triple-backtick text you're about to inject is empty, simply skip this
   ending of block 2 and don't even mention that it's absent (since it's
   optional).
-- Otherwise, give the corresponding explanation of the Essential Code Lines
-  as well (in a very short form).
+- Otherwise, append the corresponding explanation of the Essential Code
+  Lines as well (in a very short form, as 1 paragraph below the code block).
 - Avoid pompous phrases like "This is a crucial step", "This change is
   foundational" etc.
 - NEVER TREAT import (or module inclusion, require etc.) statements as
@@ -776,15 +785,14 @@ class Main:
 
         self.print_ai_generating()
 
-        prompt = self.ai_build_prompt(
-            title=commit.title,
-            diff=self.git_get_commit_diff(commit_hash=commit.hash),
-            pr_template=pr_template or AI_DEFAULT_PR_TEMPLATE,
-        )
         injected_text = self.ai_generate_injected_text(
-            api_key=self.settings.ai_api_key,
-            model=self.settings.ai_model,
-            prompt=prompt,
+            prompt_inject_placeholder=self.ai_build_prompt_inject_placeholder(
+                pr_template=pr_template or AI_DEFAULT_PR_TEMPLATE,
+            ),
+            prompt_generate_summary=self.ai_build_prompt_generate_summary(
+                title=commit.title,
+                diff=self.git_get_commit_diff(commit_hash=commit.hash),
+            ),
         )
         self.settings.ai_generated_snippets.insert(0, injected_text.text)
         self.settings_merge_save()
@@ -1375,51 +1383,59 @@ class Main:
                     str(self.settings.ai_temperature if self.settings else ""),
                     str(AI_DIFF_CONTEXT_LINES),
                     unindent(AI_WHO_YOU_ARE),
-                    unindent(AI_PROMPT).rstrip(),
+                    unindent(AI_PROMPT_INJECT_PLACEHOLDER).rstrip(),
+                    unindent(AI_PROMPT_GENERATE_SUMMARY).rstrip(),
                 ]
             )
         )
 
+    #
+    # Builds a prompt for the AI to inject a placeholder to the PR template.
+    #
+    def ai_build_prompt_inject_placeholder(
+        self,
+        *,
+        pr_template: str,
+    ) -> AiPrompt:
+        pr_template = pr_template.strip() + "\n" if pr_template.strip() else ""
+        return AiPrompt(
+            who_you_are=unindent(AI_WHO_YOU_ARE).rstrip(),
+            prompt=unindent(AI_PROMPT_INJECT_PLACEHOLDER).rstrip(),
+            input=f"Here is the {PR_TEMPLATE_FILE} developers use:\n{AI_SEPARATOR}\n{pr_template}{AI_SEPARATOR}\n\n",
+        )
+
     #
     # Builds a prompt for the AI to generate a PR description.
     #
-    def ai_build_prompt(
+    def ai_build_prompt_generate_summary(
         self,
         *,
         title: str,
         diff: str,
-        pr_template: str,
     ) -> AiPrompt:
-        sep = "-" * 60
+        title = title.strip() + "\n" if title.strip() else ""
         diff = diff.strip() + "\n" if diff.strip() else ""
-        pr_template = pr_template.strip() + "\n" if pr_template.strip() else ""
-        who_you_are = unindent(AI_WHO_YOU_ARE).rstrip()
-        prompt = unindent(AI_PROMPT).rstrip()
-        input = (
-            f"Here is the PR title:\n{sep}\n{title.strip()}{sep}\n\n"
-            + f"Here is the {PR_TEMPLATE_FILE} developers use:\n{sep}\n{pr_template}{sep}\n\n"
-            + f"Here is the git diff:\n{sep}\n{diff}{sep}\n\n"
-        ).rstrip()
         return AiPrompt(
-            who_you_are=who_you_are,
-            prompt=prompt,
-            input=input,
+            who_you_are=unindent(AI_WHO_YOU_ARE).rstrip(),
+            prompt=unindent(AI_PROMPT_GENERATE_SUMMARY).rstrip(),
+            input=(
+                f"Here is the PR title:\n{AI_SEPARATOR}\n{title}{AI_SEPARATOR}\n\n"
+                + f"Here is the git diff:\n{AI_SEPARATOR}\n{diff}{AI_SEPARATOR}\n\n"
+            ),
         )
 
     #
-    # Generates an injected text block (like PR summary).
+    # Sends an AI request and returns the text response.
     #
-    def ai_generate_injected_text(
+    def ai_send_request(
         self,
         *,
-        api_key: str,
-        model: str,
         prompt: AiPrompt,
-    ) -> AiInjectedText:
+    ) -> str:
         assert self.settings
         data = json.dumps(
             {
-                "model": model,
+                "model": self.settings.ai_model,
                 "messages": [
                     {"role": "system", "content": prompt.who_you_are},
                     {"role": "system", "content": prompt.prompt},
@@ -1434,7 +1450,7 @@ class Main:
             "https://api.openai.com/v1/chat/completions",
             data=data.encode(),
             headers={
-                "Authorization": f"Bearer {api_key}",
+                "Authorization": f"Bearer {self.settings.ai_api_key}",
                 "Content-Type": "application/json",
             },
         )
@@ -1450,14 +1466,7 @@ class Main:
             context.verify_mode = ssl.CERT_NONE
             with urlopen(req, context=context) as res:
                 res = json.load(res)
-                res = str(res["choices"][0]["message"]["content"].strip())
-                parts = [part.strip() for part in res.split(AI_SEPARATOR)]
-                if len(parts) != 2:
-                    raise UserException(f"Invalid response from AI:\n[[[\n{res}\n]]]")
-                return AiInjectedText(
-                    template_with_placeholder=parts[0],
-                    text=ai_hash_build(self.ai_get_rules_hash()) + "\n" + parts[1],
-                )
+                return str(res["choices"][0]["message"]["content"].strip())
         except HTTPError as e:
             res = e.read().decode()
             returncode = e.code
@@ -1475,6 +1484,32 @@ class Main:
                 returncode=returncode,
             )
 
+    #
+    # Generates an injected text block (like PR summary).
+    #
+    def ai_generate_injected_text(
+        self,
+        *,
+        prompt_inject_placeholder: AiPrompt,
+        prompt_generate_summary: AiPrompt,
+    ) -> AiInjectedText:
+        task_inject_placeholder = Task(
+            self.ai_send_request,
+            prompt=prompt_inject_placeholder,
+        )
+        task_generate_summary = Task(
+            self.ai_send_request,
+            prompt=prompt_generate_summary,
+        )
+        return AiInjectedText(
+            template_with_placeholder=task_inject_placeholder.wait().strip(),
+            text=(
+                ai_hash_build(self.ai_get_rules_hash())
+                + "\n"
+                + task_generate_summary.wait().strip()
+            ),
+        )
+
     #
     # Runs a shell command returning results.
     #
@@ -2100,7 +2135,7 @@ def ai_hash_build(hash: str) -> str:
 
 
 #
-# Extracts a hash from a body if it has an AI comment..
+# Extracts a hash from a body if it has an AI comment.
 #
 def ai_hash_extract(body: str) -> str | None:
     return m.group(1) if (m := re.search(ai_hash_build("([a-f0-9]+)"), body)) else None
```
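The parallel dispatch in `ai_generate_injected_text` above relies on the Task micro-framework: both `Task(...)` constructions start work immediately, and the two `wait()` calls collect the results. A minimal thread-based sketch of that pattern (the `Task` class and `fake_ai_request` helper here are simplified stand-ins for illustration, not git-grok's actual implementation):

```python
import threading

# Simplified Task-style helper: runs a callable on a background thread as
# soon as it is constructed; wait() blocks until the work finishes.
class Task:
    def __init__(self, fn, *args, **kwargs):
        self._result = None
        self._exc = None
        self._thread = threading.Thread(target=self._run, args=(fn, args, kwargs))
        self._thread.start()

    def _run(self, fn, args, kwargs):
        try:
            self._result = fn(*args, **kwargs)
        except BaseException as e:
            self._exc = e  # re-raised in the caller's thread by wait()

    def wait(self):
        self._thread.join()
        if self._exc is not None:
            raise self._exc
        return self._result


def fake_ai_request(*, prompt: str) -> str:
    # Stand-in for the real HTTP request to the AI API.
    return f"response to: {prompt}"


# Both requests start immediately and run concurrently; wait() collects each.
task_a = Task(fake_ai_request, prompt="inject placeholder")
task_b = Task(fake_ai_request, prompt="generate summary")
print(task_a.wait())  # response to: inject placeholder
print(task_b.wait())  # response to: generate summary
```

Since the two prompts are independent, total latency is roughly the slower of the two requests rather than their sum.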

tests/helpers.py (10 additions, 1 deletion)

```diff
@@ -175,7 +175,7 @@ def git_init_and_cd_to_test_dir(
     tasks = [
         GarbageTask(
             kind=pr.kind,
-            task=git_grok.Task(check_output_x, "gh", "pr", "close", pr.name),
+            task=git_grok.Task(gh_pr_close_if_open, pr.name),
         )
         for pr in prs
     ] + [
@@ -197,6 +197,15 @@ def git_init_and_cd_to_test_dir(
     # branch has already been deleted by a parallel test run or so).
 
 
+def gh_pr_close_if_open(name: str):
+    try:
+        check_output_x("gh", "pr", "close", name)
+    except CalledProcessError as e:
+        if "Could not close the pull request" in e.stderr:
+            return
+        raise
+
+
 def git_touch(file: str, content: str | None = None):
     with open(file, "w") as f:
         f.write(content or "")
```
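The `gh_pr_close_if_open` helper above follows a common pattern: run a command, but swallow the one failure mode that means the desired state already holds (here, the PR being already closed). A self-contained sketch of that pattern, with a Python subcommand standing in for the real `gh pr close` call:

```python
import subprocess
import sys

# Run a command; tolerate only failures whose stderr contains `ignorable`.
def run_tolerant(cmd: list[str], ignorable: str) -> None:
    try:
        subprocess.run(cmd, check=True, capture_output=True, text=True)
    except subprocess.CalledProcessError as e:
        if ignorable in (e.stderr or ""):
            return  # the "already done" case: not a real error
        raise


# This command exits non-zero and prints "already closed" to stderr,
# so the failure is swallowed (sys.exit(str) writes the message to stderr).
run_tolerant(
    [sys.executable, "-c", "import sys; sys.exit('already closed')"],
    ignorable="already closed",
)
```

Any other failure re-raises `CalledProcessError`, so genuine errors still surface to the test run instead of being silently ignored.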
