
Question Answering over Linked Data: LLM2SPARQL & Prompt Engineering (TO BE FINISHED)

Junjun Cao edited this page Jun 17, 2024 · 1 revision

How to Use a Large Language Model to Convert a Natural Language Question into a SPARQL Query?

Prompt Engineering (PE)

To make better use of PE, we can start with the ChatGPT API, documented at

https://platform.openai.com/docs/api-reference/chat.

  • Chat
    Go to the "Chat" tab on the left (this is the API you will mostly use); see "Create chat completion".

This page lists many parameters you can adjust to refine GPT's responses. The most frequently used ones are:

  • max_tokens: caps the length of the generated response.
  • temperature: a higher temperature yields more randomness--a trade-off between creativity and accuracy.
  • top_p: similar in effect to temperature (nucleus sampling); usually you tune one or the other.
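As a minimal sketch of how these parameters fit into a request (the values below are arbitrary illustrations, not recommendations), they can be collected in a dict before calling the API:

```python
# Request parameters for a chat completion call.
# All values here are illustrative choices, not recommendations.
params = {
    "model": "gpt-3.5-turbo-16k",
    "max_tokens": 50,      # cap the length of the generated reply
    "temperature": 0.9,    # higher = more random, more "creative" output
    "top_p": 1.0,          # nucleus sampling; usually tune this OR temperature
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

# With a configured client you would then call:
# completion = client.chat.completions.create(**params)
```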

See the default example request code in the section on the right:

from openai import OpenAI
client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-3.5-turbo-16k",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},  # Set a role for the GPT system.
        {"role": "user", "content": "Hello!"}  # The content value is your prompt to GPT.
    ]
)

print(completion.choices[0].message)

  • API Keys
    Go to the "Authentication" tab on the left -> click on "API Keys" to apply for your own API key.
    Then fill in the OpenAI() class as below:

    client = OpenAI(  # "OpenAI" is a class.
        api_key="Fill in your personal API key.",
        base_url="https:..."  # Only needed if you use a proxy API.
    )

--Now you can execute the default example request above and see its response.

  • Response
    See the JSON code beneath the example request code.
    print(completion.choices[0].message) follows the corresponding path in the JSON response.
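To illustrate that path, a chat completion response has roughly the (abridged) shape below, and choices[0].message.content simply walks down it. The field names follow the API reference; the values are made up for illustration:

```python
# Abridged shape of a chat completion response, as a plain dict.
# Field names follow the API reference; the values are illustrative.
response = {
    "id": "chatcmpl-123",
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "Hello! How can I help you?"},
            "finish_reason": "stop",
        }
    ],
}

# completion.choices[0].message.content corresponds to this dict path:
content = response["choices"][0]["message"]["content"]
print(content)
```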

Let's come back to the main question: what does a prompt comprise?

  1. Instructions
  2. Question
  3. Input Data
  4. Examples
    --At least one of the first two must be present.
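As a sketch, these four components can be assembled into a single prompt string. The wording, the ontology prefix, and the few-shot example below are all made up for illustration:

```python
# Assemble a prompt from the four components above:
# instructions, question, input data, and examples.
instructions = "Translate the natural language question into a SPARQL query."
question = "Which composers were born in Vienna?"
input_data = "Use the DBpedia ontology (prefix dbo: <http://dbpedia.org/ontology/>)."
examples = (
    "Example:\n"
    "Q: Which cities are in Austria?\n"
    "A: SELECT ?city WHERE { ?city dbo:country dbr:Austria . }"
)

# Join the components with blank lines into one prompt string.
prompt = "\n\n".join([instructions, question, input_data, examples])
print(prompt)
```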

What is prompt engineering?

Many believe it will replace other aspects of machine learning, such as feature engineering or architecture engineering for large neural networks.
Please refer directly to "What is prompt engineering?".
The screencast above includes the following example (adjusted slightly here):

for user, blurb in students.items():
    prompt = "Given the following information about {}, write a 4 paragraph college essay: {}".format(user, blurb)
    a = callGPT(prompt)
    print(a)

How to make that work?

  1. First, install the "openai" package by executing pip install openai.
  2. Then we can define a function.
  3. Finally, given a student list in JSON format, we combine all of the above code:

from openai import OpenAI
client = OpenAI(
    api_key="Fill in your personal API key.",
    base_url="https:..."
)

# **Part1**_Define a function:
def callGPT(prompt):
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo-16k",
        max_tokens=50,
        temperature=0.9,
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt}
        ]
    )
    return completion.choices[0].message.content

# **Part2**_A student list in JSON format:
students = {
    "Tom": "ethnomusicology", "Jack": "composition", "Kate": "music technology"
}

# **Part3**_Use a "for" loop to prompt over the input data:
for user, blurb in students.items():
    print("\n")
    print(user, blurb)
    prompt = "Given the following information about {}, write a general introduction of one's research area of {}".format(user, blurb)
    answer = callGPT(prompt)
    print(answer)

Execute the code above and you will see the responses!
