Prompt Improver Improvements (#24)
* adding temperature default value

also making minor changes to the improve-prompt markdown files

* fix bug with improver

* prompt improver fixups

* give generic prompt to user and system

---------

Co-authored-by: Leon Lam Nilsson <[email protected]>
eksno and failandimprove1 authored Feb 27, 2024
1 parent 4ccaf7b commit 70abbbb
Showing 7 changed files with 699 additions and 15 deletions.
4 changes: 2 additions & 2 deletions aitino/__init__.py
```diff
@@ -53,9 +53,9 @@ def compile(id: UUID) -> dict[str, str | Composition]:

 @app.get("/improve")
 def improve(
-    word_limit: int, prompt: str, temperature: float, prompt_type: PromptType
+    word_limit: int, prompt: str, prompt_type: PromptType, temperature: float
 ) -> str:
-    return improve_prompt(word_limit, prompt, temperature, prompt_type)
+    return improve_prompt(word_limit, prompt, prompt_type, temperature)


 @app.get("/maeve")
```
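The diff above only reorders the query parameters of the `GET /improve` endpoint; callers pass the same names either way. A minimal client-side sketch, assuming a locally running server (the base URL and values are illustrative, not from this commit):

```python
from urllib.parse import urlencode

# Hypothetical request to GET /improve, using the parameter names
# shown in the diff. The host/port is an assumption for illustration.
params = {
    "word_limit": 50,
    "prompt": "Summarize this article",
    "prompt_type": "generic",
    "temperature": 0.4,
}
url = "http://localhost:8000/improve?" + urlencode(params)
```

Since these are query parameters rather than positional arguments, the reordering in `__init__.py` is not a breaking change for HTTP clients; it only aligns the signature with `improve_prompt`.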
11 changes: 3 additions & 8 deletions aitino/improver.py
```diff
@@ -1,23 +1,18 @@
 import os
-from enum import Enum
 from pathlib import Path
+from typing import Literal

 from openai import OpenAI
 from pydantic import BaseModel


-class PromptType(Enum):
-    GENERIC = "generic"
-    SYSTEM = "system"
-    USER = "user"
+PromptType = Literal["generic", "system", "user"]


 def improve_prompt(
-    word_limit: int, prompt: str, temperature: float, prompt_type: PromptType
+    word_limit: int, prompt: str, prompt_type: PromptType, temperature: float = 0.0
 ) -> str:
     with open(
-        Path(os.getcwd(), "aitino", "prompts", f"{prompt_type}-improve-prompt.md"),
+        Path(os.getcwd(), "aitino", "prompts", "improve", prompt_type + ".md"),
         "r",
         encoding="utf-8",
     ) as f:
```
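Two changes interact here. Replacing the `Enum` with a `Literal` alias means `prompt_type` is now a plain `str`, so the new `open(...)` call can concatenate it directly into a filename without reaching for `.value`. And because `temperature` gained a default, Python requires it to follow the parameters without defaults, which is why the signature was reordered. A self-contained sketch of the `Literal` side of the change (the helper function is illustrative, not from the repo):

```python
from typing import Literal, get_args

# As in the new improver.py: a Literal alias instead of an Enum class.
PromptType = Literal["generic", "system", "user"]

def prompt_filename(prompt_type: PromptType) -> str:
    # A Literal-annotated value is an ordinary str at runtime, so plain
    # concatenation works -- no Enum .value access needed.
    return prompt_type + ".md"

print(prompt_filename("system"))  # system.md
print(get_args(PromptType))       # ('generic', 'system', 'user')
```

Static type checkers still reject calls like `prompt_filename("other")`, so the `Literal` keeps the safety of the old Enum while simplifying the file-path code.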
```diff
@@ -7,10 +7,6 @@ START PROMPT WRITING KNOWLEDGE

 Prompt engineering
 This guide shares strategies and tactics for getting better results from large language models (sometimes referred to as GPT models) like GPT-4. The methods described here can sometimes be deployed in combination for greater effect. We encourage experimentation to find the methods that work best for you.

-You can also explore example prompts which showcase what our models are capable of:
-
-Prompt examples
-Explore prompt examples to learn what GPT models can do
 Six strategies for getting better results
 Write clear instructions
 These models can’t read your mind. If outputs are too long, ask for brief replies. If outputs are too simple, ask for expert-level writing. If you dislike the format, demonstrate the format you’d like to see. The less the model has to guess at what you want, the more likely you’ll get it.
```
```diff
@@ -347,4 +343,5 @@ END PROMPT WRITING KNOWLEDGE

 1. Output the prompt in clean, human-readable Markdown format.
 2. Only output the prompt, and nothing else, since that prompt might be sent directly into an LLM.
-3. Do not include a response to the initial prompt, like "Certainly!", or "Gladly!"
+3. Do not include a response to the initial prompt, like "Certainly!", or "Gladly!". No additional commentary or explanation should be included either.
```
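The tightened rule 3 forbids conversational preambles in the model's output. A hypothetical post-processing check, not part of this repo, that an application could use to enforce the rule (the preamble list is an assumption):

```python
# Illustrative guard: flag replies that open with a conversational
# preamble, which rule 3 above forbids. The tuple is an assumed,
# non-exhaustive list of common preambles.
PREAMBLES = ("certainly", "gladly", "sure", "of course")

def violates_rule_3(reply: str) -> bool:
    return reply.lstrip().lower().startswith(PREAMBLES)

violates_rule_3("Certainly! Here is your improved prompt...")  # True
violates_rule_3("# Improved Prompt\nWrite a concise...")       # False
```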

346 changes: 346 additions & 0 deletions aitino/prompts/improve/system.md


346 changes: 346 additions & 0 deletions aitino/prompts/improve/user.md


Empty file.
Empty file.
