
Commit 9cd29aa

author
shadokan87
committed
✨ Handling model name conflict error and updated Doc
Handled the case where a predefined model name conflicts with a custom one, and updated the doc
1 parent 69bbbe2 commit 9cd29aa

File tree

2 files changed: +13 −9 lines changed


README.md

+2-2
@@ -177,7 +177,7 @@ main()
 
 Token.js allows you to extend the predefined model list using the `extendModelList` method. Here are some example scenarios where this is useful:
 1. Adding AWS Bedrock models with regional prefixes like `us.anthropic.claude-3-sonnet`
-2. Supporting new model versions like `gpt-4-1106-preview` before they're added to the predefined list
+2. Supporting new model versions before they're added to the predefined list
 3. Using custom model deployments with unique names
 4. Adding experimental or beta models during testing
 

@@ -207,7 +207,7 @@ const result = await tokenjs.chat.completions.create({
 });
 ```
 
-Note: When using extended models, you might need to use type casting (`as any`) for the model parameter.
+Note: When using extended models, type casting (`as any`) is required.
 
 The `featureSupport` parameter can be either:
 - A string matching an existing model name from the same provider to copy its feature support
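The casting note above can be illustrated with a short, self-contained TypeScript sketch. `PredefinedModel` and `createCompletion` are hypothetical stand-ins for illustration, not Token.js's actual type definitions; the point is only that a custom model string falls outside the compile-time model union, so a cast such as `as any` is needed to pass it.

```typescript
// Hedged sketch of why the README note recommends `as any`.
// `PredefinedModel` and `createCompletion` are hypothetical stand-ins,
// not Token.js's real type definitions.
type PredefinedModel = 'gpt-4o' | 'claude-3-sonnet'

function createCompletion(model: PredefinedModel): string {
  return `completion from ${model}`
}

// A model added via extendModelList is not part of the compile-time
// union, so passing it without a cast is a type error; `as any` bypasses it.
const customModel = 'us.anthropic.claude-3-sonnet'
const result = createCompletion(customModel as any)
```

The cast trades away compile-time checking for that one argument, which is why the note frames it as required rather than recommended.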

src/index.ts

+11-7
@@ -1,4 +1,5 @@
 import { LLMChat, LLMProvider } from './chat/index.js'
+import { InputError } from './handlers/types.js'
 import { models } from './models.js'
 import { ConfigOptions } from './userTypes/index.js'
 export * from './userTypes/index.js'
@@ -76,11 +77,6 @@ export class TokenJS implements TokenJSInterface {
 
   /**
    * Extends the predefined model list by adding a new model with specified features.
-   * This is useful for:
-   * 1. Adding AWS Bedrock models with regional prefixes like `us.anthropic.claude-3-sonnet`
-   * 2. Supporting new model versions like `gpt-4-1106-preview` before they're added to the predefined list
-   * 3. Using custom model deployments with unique names
-   * 4. Adding experimental or beta models during testing
    *
    * @param provider - The LLM provider (e.g., 'bedrock', 'openai')
    * @param name - The model name/identifier to add
@@ -121,8 +117,7 @@ export class TokenJS implements TokenJSInterface {
    * });
    * ```
    *
-   * Note: When using extended models, you might need to use type casting (`as any`)
-   * for the model parameter until the type definitions are updated to include your custom model.
+   * Note: When using extended models, type casting (`as any`) is required.
    */
  extendModelList<
    P extends Exclude<LLMProvider, 'openrouter' | 'openai-compatible'>
@@ -131,6 +126,15 @@ export class TokenJS implements TokenJSInterface {
     if (this.extendedModelExist(provider, name)) {
       return this
     }
+    // If the model name is already predefined, it conflicts, so throw an error
+    if (
+      Array.isArray(models[provider].models) &&
+      models[provider].models.includes(name)
+    ) {
+      throw new InputError(
+        `You tried to add the custom model name "${name}" for provider "${provider}", but it conflicts with an existing predefined model name. Please try again with a different name, e.g. "${name}-custom"`
+      )
+    }
 
     const modelsRef = models[provider] as any
     modelsRef['models'] = [...(models as any)[provider]['models'], name]
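The conflict check added above can be sketched as a self-contained TypeScript snippet. The `models` registry shape, the `extendedModels` store, and the `InputError` class here are simplified assumptions for illustration; the real implementation lives in src/index.ts and imports `InputError` from './handlers/types.js'.

```typescript
// Self-contained sketch of the conflict check this commit adds.
// The registry shape and InputError class are simplified assumptions,
// not the library's real definitions.
class InputError extends Error {}

// Hypothetical predefined-model registry, keyed by provider.
const models: Record<string, { models: string[] }> = {
  openai: { models: ['gpt-4o', 'gpt-4o-mini'] },
}

// Hypothetical store for custom (extended) model names.
const extendedModels: Record<string, string[]> = {}

function extendModelList(provider: string, name: string): void {
  // Registering the same custom model twice is a no-op.
  if (extendedModels[provider]?.includes(name)) return
  // A name that matches a predefined model is rejected with InputError.
  const predefined = models[provider]?.models
  if (Array.isArray(predefined) && predefined.includes(name)) {
    throw new InputError(
      `Custom model name "${name}" conflicts with a predefined model for provider "${provider}". Try a different name, e.g. "${name}-custom".`
    )
  }
  if (!extendedModels[provider]) extendedModels[provider] = []
  extendedModels[provider].push(name)
}
```

Checking before mutation means a conflicting call fails loudly instead of silently shadowing (or duplicating) a predefined model name.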
