@@ -29,6 +29,7 @@ Run local LLMs on iGPU, APU and CPU (AMD, Intel, and Qualcomm (Coming Soon)). E
- [Launch Chatbot Web UI](#launch-chatbot-web-ui)
- [Launch Model Management UI](#launch-model-management-ui)
- [Compile OpenAI-API Compatible Server into Windows Executable](#compile-openai-api-compatible-server-into-windows-executable)
+ - [Prebuilt Binary (Alpha)](#prebuilt-openai-api-compatible-windows-executable-alpha)
- [Acknowledgements](#acknowledgements)
## Supported Models (Quick Start)
@@ -59,39 +60,39 @@ Run local LLMs on iGPU, APU and CPU (AMD, Intel, and Qualcomm (Coming Soon)). E
1. Custom Setup:
- - **XPU**: Requires anaconda environment. `conda create -n ellm python=3.10 libuv; conda activate llm`.
+ - **IPEX(XPU)**: Requires an Anaconda environment. `conda create -n ellm python=3.10 libuv; conda activate ellm`.
- **DirectML**: If you are using a Conda environment, install additional dependencies: `conda install conda-forge::vs2015_runtime`.
2. Install the embeddedllm package. `$env:ELLM_TARGET_DEVICE='directml'; pip install -e .`. Note: currently supports `cpu`, `directml` and `cuda`.
- **DirectML:** `$env:ELLM_TARGET_DEVICE='directml'; pip install -e .[directml]`
- **CPU:** `$env:ELLM_TARGET_DEVICE='cpu'; pip install -e .[cpu]`
- **CUDA:** `$env:ELLM_TARGET_DEVICE='cuda'; pip install -e .[cuda]`
- - **XPU:** `$env:ELLM_TARGET_DEVICE='xpu'; pip install -e .[xpu]`
+ - **IPEX:** `$env:ELLM_TARGET_DEVICE='ipex'; python setup.py develop`
- **With Web UI**:
- **DirectML:** `$env:ELLM_TARGET_DEVICE='directml'; pip install -e .[directml,webui]`
- **CPU:** `$env:ELLM_TARGET_DEVICE='cpu'; pip install -e .[cpu,webui]`
- **CUDA:** `$env:ELLM_TARGET_DEVICE='cuda'; pip install -e .[cuda,webui]`
- - **XPU:** `$env:ELLM_TARGET_DEVICE='xpu'; pip install -e .[xpu,webui]`
+ - **IPEX:** `$env:ELLM_TARGET_DEVICE='ipex'; python setup.py develop; pip install -r requirements-webui.txt`
- **Linux**
1. Custom Setup:
- - **XPU**: Requires anaconda environment. `conda create -n ellm python=3.10 libuv; conda activate llm`.
+ - **IPEX(XPU)**: Requires an Anaconda environment. `conda create -n ellm python=3.10 libuv; conda activate ellm`.
- **DirectML**: If you are using a Conda environment, install additional dependencies: `conda install conda-forge::vs2015_runtime`.
2. Install the embeddedllm package. `ELLM_TARGET_DEVICE='directml' pip install -e .`. Note: currently supports `cpu`, `directml` and `cuda`.
- **DirectML:** `ELLM_TARGET_DEVICE='directml' pip install -e .[directml]`
- **CPU:** `ELLM_TARGET_DEVICE='cpu' pip install -e .[cpu]`
- **CUDA:** `ELLM_TARGET_DEVICE='cuda' pip install -e .[cuda]`
- - **XPU:** `ELLM_TARGET_DEVICE='xpu' pip install -e .[xpu]`
+ - **IPEX:** `ELLM_TARGET_DEVICE='ipex' python setup.py develop`
- **With Web UI**:
- **DirectML:** `ELLM_TARGET_DEVICE='directml' pip install -e .[directml,webui]`
- **CPU:** `ELLM_TARGET_DEVICE='cpu' pip install -e .[cpu,webui]`
- **CUDA:** `ELLM_TARGET_DEVICE='cuda' pip install -e .[cuda,webui]`
- - **XPU:** `ELLM_TARGET_DEVICE='xpu' pip install -e .[xpu,webui]`
+ - **IPEX:** `ELLM_TARGET_DEVICE='ipex' python setup.py develop; pip install -r requirements-webui.txt`
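For orientation, here is a consolidated sketch of the Windows IPEX path described in the steps above. It assumes the repository has already been cloned, that the commands run from the repository root in an Anaconda-enabled PowerShell session, and that you also want the optional web UI:

```powershell
# Sketch: Windows IPEX setup, assuming the repository root as the working
# directory and an Anaconda-enabled PowerShell session.
conda create -n ellm python=3.10 libuv
conda activate ellm

# Select the IPEX backend and build embeddedllm in development mode.
$env:ELLM_TARGET_DEVICE = 'ipex'
python setup.py develop

# Optional: extra dependencies for the chatbot web UI.
pip install -r requirements-webui.txt
```

For DirectML, CPU, or CUDA the last two steps collapse into the single `pip install -e .[<device>]` / `pip install -e .[<device>,webui]` form shown above.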
### Launch OpenAI API Compatible Server
@@ -131,9 +132,19 @@ It is an interface that allows you to download and deploy OpenAI API compatible
## Compile OpenAI-API Compatible Server into Windows Executable
1. Install `embeddedllm`.
- 2. Install PyInstaller: `pip install pyinstaller`.
+ 2. Install PyInstaller: `pip install pyinstaller==6.9.0`.
3. Compile Windows Executable: `pyinstaller .\ellm_api_server.spec`.
4. You can find the executable in the `dist\ellm_api_server` directory.
+ 5. Use it the same way as `ellm_server`: `.\ellm_api_server.exe --model_path <path/to/model/weight>`.
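Put together, the steps above look roughly like the following PowerShell session. It is a sketch that assumes `embeddedllm` is already installed in the active environment and that `ellm_api_server.spec` sits in the current (repository root) directory:

```powershell
# Sketch: build the standalone Windows executable and start it, assuming
# embeddedllm is installed and ellm_api_server.spec is in the working directory.
pip install pyinstaller==6.9.0
pyinstaller .\ellm_api_server.spec

# PyInstaller places the bundled server under dist\ellm_api_server.
.\dist\ellm_api_server\ellm_api_server.exe --model_path <path/to/model/weight>
```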
+
+ ## Prebuilt OpenAI API Compatible Windows Executable (Alpha)
+ You can find the prebuilt OpenAI API Compatible Windows Executable on the Releases page.
+
+ *PowerShell/Terminal usage (use it like `ellm_server`)*:
+ ```powershell
+ .\ellm_api_server.exe --model_path <path/to/model/weight>
+ ```
+
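Because the executable exposes an OpenAI-compatible API, you can exercise it from PowerShell once it is running. A minimal sketch, assuming the server is reachable at `http://localhost:6979` (use the address and port printed in the server's startup log) and that the hypothetical model name below matches the weights you loaded:

```powershell
# Hypothetical request against the running server. The port (6979) and the
# model name are assumptions -- use the values your server reports
# (startup log and the /v1/models endpoint).
$body = @{
    model    = "phi3-mini-int4"   # assumed identifier of the loaded model
    messages = @(
        @{ role = "user"; content = "What is the golden ratio?" }
    )
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Uri "http://localhost:6979/v1/chat/completions" `
                  -Method Post -ContentType "application/json" -Body $body
```

The response should follow the standard OpenAI chat-completions schema, so existing OpenAI client libraries pointed at this base URL should also work.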
## Acknowledgements