## Examples
<!-- CORTEX_VERSION_README_MINOR x3 -->
* [Image classification](https://github.com/cortexlabs/cortex/tree/0.16/examples/tensorflow/image-classifier): deploy an Inception model to classify images.
* [Search completion](https://github.com/cortexlabs/cortex/tree/0.16/examples/pytorch/search-completer): deploy Facebook's RoBERTa model to complete search terms.
* [Text generation](https://github.com/cortexlabs/cortex/tree/0.16/examples/pytorch/text-generator): deploy Hugging Face's DistilGPT2 model to generate text.
---

**docs/deployments/exporting.md**
Here are examples for some common ML libraries:

## PyTorch
The recommended approach is to export your PyTorch model with [torch.save()](https://pytorch.org/docs/stable/torch.html?highlight=save#torch.save). Here is PyTorch's documentation on [saving and loading models](https://pytorch.org/tutorials/beginner/saving_loading_models.html).
<!-- CORTEX_VERSION_MINOR -->
[examples/pytorch/iris-classifier](https://github.com/cortexlabs/cortex/blob/0.16/examples/pytorch/iris-classifier) exports its trained model like this:
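The export snippet itself is elided in this excerpt; below is a minimal sketch of the `torch.save()` approach, assuming a trained `nn.Module` (the stand-in model and file name here are illustrative, not the example's actual code):

```python
import torch
import torch.nn as nn

# stand-in for a trained model; any nn.Module works the same way
model = nn.Linear(4, 3)

# save only the learned parameters (the recommended PyTorch pattern)
torch.save(model.state_dict(), "weights.pth")

# later, rebuild the architecture and load the parameters back
restored = nn.Linear(4, 3)
restored.load_state_dict(torch.load("weights.pth"))
```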
It may also be possible to export your PyTorch model into the ONNX format using [torch.onnx.export()](https://pytorch.org/docs/stable/onnx.html#torch.onnx.export).
<!-- CORTEX_VERSION_MINOR -->
For example, if [examples/pytorch/iris-classifier](https://github.com/cortexlabs/cortex/blob/0.16/examples/pytorch/iris-classifier) were to export the model to ONNX, it would look like this:
```python
placeholder = torch.randn(1, 4)  # dummy input matching the model's input shape
torch.onnx.export(model, placeholder, "model.onnx")
```
## TensorFlow

A TensorFlow `SavedModel` directory should have this structure:

```text
├── saved_model.pb
└── variables/
    ├── variables.index
    └── variables.data-00000-of-00001
```
<!-- CORTEX_VERSION_MINOR -->
Most of the TensorFlow examples use this approach. Here is the relevant code from [examples/tensorflow/sentiment-analyzer](https://github.com/cortexlabs/cortex/blob/0.16/examples/tensorflow/sentiment-analyzer):
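The referenced export code is not reproduced in this excerpt; below is a minimal sketch of producing a `SavedModel` directory, assuming TensorFlow 2.x and a Keras model (the stand-in model and export path are illustrative, not the sentiment-analyzer's actual code):

```python
import tensorflow as tf

# stand-in Keras model; the real example trains a sentiment-analysis model
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)), tf.keras.layers.Dense(1)])

# writes saved_model.pb plus the variables/ subdirectory
tf.saved_model.save(model, "export/1")
```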
[examples/tensorflow/iris-classifier](https://github.com/cortexlabs/cortex/blob/0.16/examples/tensorflow/iris-classifier) also uses the `SavedModel` approach, and includes a Python notebook demonstrating how it was exported.
### Other model formats
There are other ways to export Keras or TensorFlow models, and as long as they can be loaded and used to make predictions in Python, they will be supported by Cortex.
<!-- CORTEX_VERSION_MINOR -->
For example, the `crnn` API in [examples/tensorflow/license-plate-reader](https://github.com/cortexlabs/cortex/blob/0.16/examples/tensorflow/license-plate-reader) uses this approach.
## Scikit-learn
Scikit-learn models are typically exported using `pickle`. Here is [Scikit-learn's documentation](https://scikit-learn.org/stable/modules/model_persistence.html).
<!-- CORTEX_VERSION_MINOR -->
[examples/sklearn/iris-classifier](https://github.com/cortexlabs/cortex/blob/0.16/examples/sklearn/iris-classifier) uses this approach. Here is the relevant code:
```python
pickle.dump(model, open("model.pkl", "wb"))
```
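Loading the pickled model back mirrors the export; here is a sketch using a stand-in object, since the trained scikit-learn classifier itself is not shown in this excerpt (pickle handles both the same way):

```python
import pickle

# stand-in for a fitted scikit-learn model
model = {"kind": "iris-classifier", "accuracy": 0.97}

# export: serialize the object to disk
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# import: deserialize it back into an equivalent object
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)
```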
## XGBoost

XGBoost models can be exported with `save_model()`:

```python
model.save_model("model.bin")
```
It is also possible to export an XGBoost model to the ONNX format using [onnxmltools](https://github.com/onnx/onnxmltools).
<!-- CORTEX_VERSION_MINOR -->
[examples/xgboost/iris-classifier](https://github.com/cortexlabs/cortex/blob/0.16/examples/xgboost/iris-classifier) uses this approach. Here is the relevant code:
---

**docs/deployments/predictors.md**
## Python Predictor

For proper separation of concerns, it is recommended to use the constructor's `config` parameter for information such as configurable model parameters or download links for initialization files. You define `config` in your [API configuration](api-configuration.md), and it is passed through to your Predictor's constructor.
### Examples
<!-- CORTEX_VERSION_MINOR -->
Many of the [examples](https://github.com/cortexlabs/cortex/tree/0.16/examples) use the Python Predictor, including all of the PyTorch examples.
<!-- CORTEX_VERSION_MINOR -->
Here is the Predictor for [examples/pytorch/iris-classifier](https://github.com/cortexlabs/cortex/tree/0.16/examples/pytorch/iris-classifier):
```python
import re
...
```
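The full example implementation is truncated in this excerpt; below is a minimal sketch of the Python Predictor interface this document describes, with an illustrative body (the real example loads a PyTorch model in `__init__`; the label list and constant return value are stand-ins):

```python
class PythonPredictor:
    def __init__(self, config):
        # `config` comes from the API configuration's `config` field
        self.labels = config.get("labels", ["setosa", "versicolor", "virginica"])

    def predict(self, payload):
        # `payload` is the parsed JSON request body; a real predictor
        # would run model inference here instead of returning a constant
        return self.labels[0]

predictor = PythonPredictor({"labels": ["setosa", "versicolor", "virginica"]})
print(predictor.predict({"sepal_length": 5.1}))  # → setosa
```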
```text
...
xgboost==1.0.2
```
<!-- CORTEX_VERSION_MINOR x2 -->
The pre-installed system packages are listed in [images/python-predictor-cpu/Dockerfile](https://github.com/cortexlabs/cortex/tree/0.16/images/python-predictor-cpu/Dockerfile) (for CPU) or [images/python-predictor-gpu/Dockerfile](https://github.com/cortexlabs/cortex/tree/0.16/images/python-predictor-gpu/Dockerfile) (for GPU).
If your application requires additional dependencies, you can install additional [Python packages](python-packages.md) and [system packages](system-packages.md).
## TensorFlow Predictor

```python
class TensorFlowPredictor:
    def __init__(self, tensorflow_client, config):
        pass

    def predict(self, payload):
        pass
```
<!-- CORTEX_VERSION_MINOR -->
Cortex provides a `tensorflow_client` to your Predictor's constructor. `tensorflow_client` is an instance of [TensorFlowClient](https://github.com/cortexlabs/cortex/tree/0.16/pkg/workloads/cortex/lib/client/tensorflow.py) that manages a connection to a TensorFlow Serving container to make predictions using your model. It should be saved as an instance variable in your Predictor, and your `predict()` function should call `tensorflow_client.predict()` to make an inference with your exported TensorFlow model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
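Putting the paragraph above into code: a sketch of a Predictor that stores the client and delegates to `tensorflow_client.predict()` (the label list and the `class_ids` key of the prediction dict are illustrative, standing in for whatever output the exported model actually defines):

```python
class TensorFlowPredictor:
    def __init__(self, tensorflow_client, config):
        self.client = tensorflow_client  # connection to the TF Serving container
        self.labels = config["labels"]

    def predict(self, payload):
        prediction = self.client.predict(payload)  # inference via TensorFlow Serving
        return self.labels[int(prediction["class_ids"][0])]  # postprocess the output
```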
For proper separation of concerns, it is recommended to use the constructor's `config` parameter for information such as configurable model parameters or download links for initialization files. You define `config` in your [API configuration](api-configuration.md), and it is passed through to your Predictor's constructor.
### Examples
<!-- CORTEX_VERSION_MINOR -->
Most of the examples in [examples/tensorflow](https://github.com/cortexlabs/cortex/tree/0.16/examples/tensorflow) use the TensorFlow Predictor.
<!-- CORTEX_VERSION_MINOR -->
Here is the Predictor for [examples/tensorflow/iris-classifier](https://github.com/cortexlabs/cortex/tree/0.16/examples/tensorflow/iris-classifier):
```python
labels = ["setosa", "versicolor", "virginica"]
...
```
```text
...
tensorflow==2.1.0
```
<!-- CORTEX_VERSION_MINOR -->
The pre-installed system packages are listed in [images/tensorflow-predictor/Dockerfile](https://github.com/cortexlabs/cortex/tree/0.16/images/tensorflow-predictor/Dockerfile).
If your application requires additional dependencies, you can install additional [Python packages](python-packages.md) and [system packages](system-packages.md).
## ONNX Predictor

```python
class ONNXPredictor:
    def __init__(self, onnx_client, config):
        pass

    def predict(self, payload):
        pass
```
<!-- CORTEX_VERSION_MINOR -->
Cortex provides an `onnx_client` to your Predictor's constructor. `onnx_client` is an instance of [ONNXClient](https://github.com/cortexlabs/cortex/tree/0.16/pkg/workloads/cortex/lib/client/onnx.py) that manages an ONNX Runtime session to make predictions using your model. It should be saved as an instance variable in your Predictor, and your `predict()` function should call `onnx_client.predict()` to make an inference with your exported ONNX model. Preprocessing of the JSON payload and postprocessing of predictions can be implemented in your `predict()` function as well.
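As a sketch of the pattern just described, here is a Predictor that preprocesses the payload, delegates to `onnx_client.predict()`, and postprocesses the result (the payload field names and the shape of the prediction output are illustrative, not the actual ONNX Runtime response):

```python
class ONNXPredictor:
    def __init__(self, onnx_client, config):
        self.client = onnx_client  # wraps an ONNX Runtime session
        self.labels = config["labels"]

    def predict(self, payload):
        # preprocess the JSON payload into the model's input format
        model_input = [
            payload["sepal_length"],
            payload["sepal_width"],
            payload["petal_length"],
            payload["petal_width"],
        ]
        prediction = self.client.predict(model_input)  # ONNX Runtime inference
        return self.labels[prediction[0][0]]  # postprocess to a class label
```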
For proper separation of concerns, it is recommended to use the constructor's `config` parameter for information such as configurable model parameters or download links for initialization files. You define `config` in your [API configuration](api-configuration.md), and it is passed through to your Predictor's constructor.
265
265
266
266
### Examples
<!-- CORTEX_VERSION_MINOR -->
[examples/xgboost/iris-classifier](https://github.com/cortexlabs/cortex/tree/0.16/examples/xgboost/iris-classifier) uses the ONNX Predictor:
```python
labels = ["setosa", "versicolor", "virginica"]
...
```
```text
...
requests==2.23.0
```
<!-- CORTEX_VERSION_MINOR x2 -->
The pre-installed system packages are listed in [images/onnx-predictor-cpu/Dockerfile](https://github.com/cortexlabs/cortex/tree/0.16/images/onnx-predictor-cpu/Dockerfile) (for CPU) or [images/onnx-predictor-gpu/Dockerfile](https://github.com/cortexlabs/cortex/tree/0.16/images/onnx-predictor-gpu/Dockerfile) (for GPU).
If your application requires additional dependencies, you can install additional [Python packages](python-packages.md) and [system packages](system-packages.md).