
✨ Add example script for generating with options using ollama.generate() #535


Open

wants to merge 1 commit into main
Conversation

SamuelCarmona83

This pull request adds a new example script, generate-with-options.py, to demonstrate how to use the ollama.generate() function with various options to control model behavior. The script includes detailed examples showcasing the effects of parameters such as temperature, top_p, num_predict, seed, stop sequences, and repeat_penalty.
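
For context, the call shape being demonstrated looks roughly like the following minimal sketch. The model name and prompt are placeholders chosen for illustration, not values taken from the PR; the generated text is read from the `response` field (attribute access also works in recent versions of the library).

```python
import ollama

# Placeholder model name; substitute any model pulled locally (e.g. via `ollama pull`).
MODEL = "llama3.2"

response = ollama.generate(
    model=MODEL,
    prompt="Explain what a context window is in one sentence.",
    options={
        "temperature": 0.7,   # sampling randomness
        "top_p": 0.9,         # nucleus sampling cutoff
        "num_predict": 60,    # maximum number of tokens to generate
    },
)

# The generated text is returned under the 'response' key.
print(response["response"])
```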

Key additions in the example script:

Demonstrations of model behavior control (a sketch follows this list):

  • Creative response generation: Demonstrates high creativity and diverse sampling using options like temperature=1.0, top_p=0.9, and num_predict=80.
  • Focused and reproducible response: Shows deterministic output with low temperature (temperature=0.3) and reproducible results using a fixed seed (seed=42).
  • Controlled response with stop sequences: Illustrates how to stop generation early using specific phrases in the stop parameter.
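
A rough sketch of what those three demonstrations might look like; the prompts, model name, and stop phrases below are illustrative placeholders, not the script's actual values.

```python
import ollama

MODEL = "llama3.2"  # placeholder model name

# 1. Creative generation: high temperature, broad nucleus sampling, capped length.
creative = ollama.generate(
    model=MODEL,
    prompt="Write a short tagline for a coffee shop on Mars.",
    options={"temperature": 1.0, "top_p": 0.9, "num_predict": 80},
)

# 2. Focused and reproducible: low temperature plus a fixed seed
#    should produce the same output across runs.
reproducible = ollama.generate(
    model=MODEL,
    prompt="List three facts about the Moon.",
    options={"temperature": 0.3, "seed": 42},
)

# 3. Stop sequences: generation halts as soon as any listed phrase is produced.
stopped = ollama.generate(
    model=MODEL,
    prompt="Count upward from one, writing each number as a word.",
    options={"stop": ["five", "Five"], "num_predict": 100},
)

for label, result in [("creative", creative), ("reproducible", reproducible), ("stopped", stopped)]:
    print(f"--- {label} ---\n{result['response']}\n")
```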

Comparative analysis and advanced features (a sketch follows this list):

  • Temperature comparison: Compares responses generated at different temperature levels (0.1, 0.7, 1.2) to highlight the impact of temperature on creativity and randomness.
  • Context window control: Demonstrates the use of num_ctx to manage the model's context window for generating responses based on a…
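
A sketch of the comparison loop and the num_ctx usage described above; the prompts and specific values here are assumptions made for illustration.

```python
import ollama

MODEL = "llama3.2"  # placeholder model name
PROMPT = "Describe the ocean in two sentences."

# Temperature comparison: the same prompt at increasing temperatures,
# from near-deterministic (0.1) to highly random (1.2).
for temp in (0.1, 0.7, 1.2):
    result = ollama.generate(
        model=MODEL,
        prompt=PROMPT,
        options={"temperature": temp, "num_predict": 60},
    )
    print(f"temperature={temp}:\n{result['response']}\n")

# Context window control: num_ctx sets how many tokens of context the model keeps,
# so a larger window lets a longer prompt be considered in full.
long_prompt = "..."  # e.g. a long document followed by a question about it
result = ollama.generate(
    model=MODEL,
    prompt=long_prompt,
    options={"num_ctx": 4096},
)
print(result["response"])
```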
