
0.3.0

@tomkat-cr tomkat-cr released this 29 Jan 12:29
· 23 commits to main since this release
0fc3eac

0.3.0 (2025-01-25)


New

Add AI/ML API provider and models [GS-55] [GS-156].
Add SUGGESTIONS_MODEL_REPLACEMENT parameter to avoid using the OpenAI reasoning models for suggestions generation [GS-55].
Add SUGGESTIONS_DEFAULT_TIMEFRAME parameter to set the default timeframe to 48 hours for suggestions [GS-55].
Add LLM_MODEL_FORCED_VALUES parameter to set fixed values for models like o1-preview that only accept temperature=1 [GS-55].
Add LLM_MODEL_PARAMS_NAMING parameter to rename the model parameters [GS-55].
Add DeepSeek-V3 model [GS-55].
Add AI-generated titles to conversations [GS-55].
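
The new parameters above could be configured like this. This is a hypothetical .env sketch: the parameter names come from the release notes, but the values and their JSON shape are illustrative assumptions, not the project's documented format.

```
# Replacement model for suggestions generation (avoids OpenAI reasoning models)
SUGGESTIONS_MODEL_REPLACEMENT="gpt-4o-mini"
# Default timeframe token value for suggestion prompts
SUGGESTIONS_DEFAULT_TIMEFRAME="48 hours"
# Fixed parameter values per model, e.g. o1-preview only accepts temperature=1
LLM_MODEL_FORCED_VALUES='{"o1-preview": {"temperature": 1}}'
# Per-model parameter renames, e.g. max_tokens -> max_completion_tokens
LLM_MODEL_PARAMS_NAMING='{"o1-preview": {"max_tokens": "max_completion_tokens"}}'
```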

Changes

The suggestions generation prompt was enhanced to be one-shot: a suggestions {qty} token was added, the {timeframe} token is replaced by the SUGGESTIONS_DEFAULT_TIMEFRAME parameter value, and the application subject token {subject} was replaced by a more generic subject text [GS-55].
All prompts were enhanced by placing the application subject at the end of each prompt [GS-55].
Restore the original question when the App Ideation prompt was used [GS-55].
User prompts were split to introduce system prompts that configure the LLM model behavior [GS-55].
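
The user/system prompt split could look like the following. This is a minimal sketch under assumed names (`build_messages`, the prompt wording), not the project's actual code; it only illustrates moving behavior configuration into a separate system message.

```python
# Sketch: split a single combined prompt into a system prompt (LLM behavior
# configuration, application subject) plus the user's original question.
# Function name and prompt text are illustrative assumptions.

def build_messages(user_question: str, app_subject: str) -> list:
    """Return a chat-style message list with a separate system prompt."""
    system_prompt = (
        "You are a helpful assistant. "
        f"Answer questions about the following subject: {app_subject}."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_question},
    ]

messages = build_messages("How do I add a provider?", "a low-code platform")
```

Keeping the behavior configuration in the system message lets the user message stay exactly as typed, which is what makes restoring the original question straightforward.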

Fixes

Fix the error "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead." when using the OpenAI o1-preview/o1-mini models.
Fix the "unified" flag assignment in get_unified_flag() because it always returned False [GS-55].
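
The max_tokens fix and the LLM_MODEL_FORCED_VALUES / LLM_MODEL_PARAMS_NAMING parameters suggest a normalization step applied to model parameters before each API call. A hypothetical sketch of that idea (the dict shapes and function are assumptions mirroring the release notes, not the library's actual implementation):

```python
# Per-model fixed values, e.g. o1-preview only accepts temperature=1
FORCED_VALUES = {"o1-preview": {"temperature": 1}}
# Per-model parameter renames, e.g. o1 models reject max_tokens and
# require max_completion_tokens instead
PARAMS_NAMING = {"o1-preview": {"max_tokens": "max_completion_tokens"}}

def normalize_params(model: str, params: dict) -> dict:
    """Rename parameters and apply forced values for the given model."""
    renames = PARAMS_NAMING.get(model, {})
    result = {renames.get(name, name): value for name, value in params.items()}
    result.update(FORCED_VALUES.get(model, {}))
    return result

params = normalize_params("o1-preview", {"max_tokens": 1024, "temperature": 0.7})
# params == {"max_completion_tokens": 1024, "temperature": 1}
```

Models without an entry in either table pass through unchanged, so the normalization is a no-op for providers that accept the standard parameter names.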