ChatGPT was made available by OpenAI on 2022-11-30. As of 2023-12-16 I hadn't used ChatGPT (a chat interface to OpenAI's Generative Pre-trained Transformer models) or other large language models (LLMs). In this post I document prompting best practices other folks have come up with. My intent is to identify whether ChatGPT could be useful for tasks relevant to the Physics Derivation Graph.
Sites I reviewed for best practices for prompts:
General text response
Prompt
Answer the question based on the context below. Keep the answer short and concise. Respond "Unsure about answer" if not sure about the answer.
Context: <context>
Question: <question>
Answer:
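To try a prompt like this programmatically rather than through the chat interface, here's a minimal sketch using the openai Python package (v1.x); the model name, context, and question are placeholders I made up, and the API key is assumed to be in the OPENAI_API_KEY environment variable.
```
# Minimal sketch: send a context + question prompt to the chat completions API.
# Assumes the openai Python package (v1.x) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

context = "The Physics Derivation Graph records the algebraic steps relating equations."
question = "What does the Physics Derivation Graph record?"

prompt = (
    "Answer the question based on the context below. "
    'Keep the answer short and concise. Respond "Unsure about answer" '
    "if not sure about the answer.\n\n"
    f"Context: {context}\n"
    f"Question: {question}\n"
    "Answer:"
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # keep answers as deterministic as possible for Q&A
)
print(response.choices[0].message.content)
```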
Prompt
Before providing the answer, state what you know and what you assume. Then state the steps that led you to your answer. Finally, provide the answer.
Prompt
Let's think step by step.
Prompt
Let's work this out in a step by step way to be sure we have the right answer.
Prompt
Imagine three different experts are answering this question.
All experts will write down 1 step of their thinking, then share it with the group.
Then all experts will go on to the next step, etc.
If any expert realizes they are wrong at any point then they leave.
The question is...
What to use an LLM for
The following table is from https://www.philschmid.de/instruction-tune-llama-2

| Capability | Example Instruction |
|---|---|
| Brainstorming | Provide a diverse set of creative ideas for new flavors of ice cream. |
| Classification | Categorize these movies as either comedy, drama, or horror based on the plot summary. |
| Closed QA | Answer the question 'What is the capital of France?' with a single word. |
| Generation | Write a poem in the style of Robert Frost about nature and the changing seasons. |
| Information Extraction | Extract the names of the main characters from this short story. |
| Open QA | Why do leaves change color in autumn? Explain the scientific reasons. |
| Summarization | Summarize this article on recent advancements in renewable energy in 2-3 sentences. |
Software generation
You are an expert programmer that writes simple, concise code and explanations. Write a python function to generate the nth fibonacci number.
A simple python function to remove whitespace from a string
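For reference, a response to the first prompt above (the nth Fibonacci number) might look roughly like the following; this is my own sketch of a plausible answer, not actual model output.
```
def fibonacci(n: int) -> int:
    """Return the nth Fibonacci number, with fibonacci(0) == 0 and fibonacci(1) == 1."""
    if n < 0:
        raise ValueError("n must be non-negative")
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```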
Code Llama supports a special prompt format called infill:
```
<PRE> def compute_gcd(x, y): <SUF>return result <MID>
```
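Here <PRE> marks the code before the gap, <SUF> the code after it, and <MID> asks the model to generate the middle. For the compute_gcd example the completed function might plausibly come back like this (my sketch, not actual Code Llama output):
```
def compute_gcd(x, y):
    # The lines between the <PRE> prefix and the <SUF> suffix are what the model
    # fills in; a plausible completion is the Euclidean algorithm.
    while y:
        x, y = y, x % y
    result = x
    return result
```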
Code review
Where is the bug in this code?
```
def fib(n):
    if n <= 0:
        return n
    else:
        return fib(n-1) + fib(n-2)
```
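For what it's worth, the bug is the missing base case for n == 1: fib(1) recurses into fib(0) + fib(-1) and returns -1. A corrected version would be:
```
def fib(n):
    # base cases for n == 0 and n == 1 stop the recursion before it goes negative
    if n <= 1:
        return n
    else:
        return fib(n-1) + fib(n-2)
```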
Tests of Software
write a unit test for this function:
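The prompt is meant to be followed by the function under test. As an illustration of the kind of output being asked for, a unit test for the corrected fib function above might look like this (using pytest, which is my assumption and isn't named in the prompt):
```
# Sketch of a unit test for fib(); run with pytest. The corrected fib() from
# the previous section is repeated here so the file is self-contained.
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

def test_fib_base_cases():
    assert fib(0) == 0
    assert fib(1) == 1

def test_fib_small_values():
    assert [fib(n) for n in range(8)] == [0, 1, 1, 2, 3, 5, 8, 13]
```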
Prompts for Retrieval Augmented Generation (RAG)
For background on RAG, see https://www.promptingguide.ai/techniques/rag
You are an expert Q&A system that is trusted around the world.
Always answer the query only using the provided context information, and not prior knowledge.
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge, answer the query.
Query: {query_str}
Answer:
From https://docs.llamaindex.ai/en/stable/examples/prompts/prompts_rag.html
Context information is below.
---------------------
{context_str}
---------------------
Given the context information and not prior knowledge,
answer the query asking about citations over different topics.
Some examples are given below.
{few_shot_examples}
Query: {query_str}
Answer:
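Both templates are plain strings with named placeholders, so filling them in before sending them to the model is ordinary string formatting. The sketch below fills in the shared core of the templates above; the retrieved context and the query are made up.
```
# Minimal sketch: fill the RAG prompt template with retrieved context before
# sending it to an LLM. The retrieved chunks and the query are made up.
TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the query.\n"
    "Query: {query_str}\n"
    "Answer: "
)

retrieved_chunks = [
    "Equation (3) follows from equation (2) by multiplying both sides by dt.",
    "The derivation relies on the definition of velocity, v = dx/dt.",
]

prompt = TEMPLATE.format(
    context_str="\n".join(retrieved_chunks),
    query_str="Which definition does the derivation rely on?",
)
print(prompt)
```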
On getting structured JSON output from ChatGPT, see https://minimaxir.com/2023/12/chatgpt-structured-data/ and the associated comments at https://news.ycombinator.com/item?id=38782678
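As one concrete way to request JSON from the API, here is a minimal sketch using the response_format parameter of the chat completions endpoint; the openai Python package (v1.x) and the model name are assumptions on my part, and this is only one of several approaches.
```
# Minimal sketch: ask for JSON output using the response_format parameter.
# Assumes the openai Python package (v1.x) and a model that supports JSON mode.
import json
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",  # placeholder; JSON mode needs a model that supports it
    response_format={"type": "json_object"},
    messages=[
        {"role": "system",
         "content": "Reply in JSON with the keys 'symbol' and 'meaning'."},
        {"role": "user",
         "content": "What does the symbol c usually denote in physics?"},
    ],
)
print(json.loads(response.choices[0].message.content))
```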