AI-and-Analytics/End-to-end-Workloads/Census/README.md (+6 −6)

@@ -19,14 +19,14 @@ Intel® Distribution of Modin* uses HDK to speed up your Pandas notebooks, scrip
 | :--- | :---
 | OS | 64-bit Ubuntu* 18.04 or higher
 | Hardware | Intel Atom® processors <br> Intel® Core™ processor family <br> Intel® Xeon® processor family <br> Intel® Xeon® Scalable processor family
-| Software | Intel® AI Analytics Toolkit (AI Kit) (Python version 3.8 or newer, Intel® Distribution of Modin*) <br> Intel® Extension for Scikit-learn* <br> NumPy
+| Software | AI Tools (Python version 3.8 or newer, Intel® Distribution of Modin*) <br> Intel® Extension for Scikit-learn* <br> NumPy
 
-The Intel® Distribution of Modin* and Intel® Extension for Scikit-learn* libraries are available together in [Intel® AI Analytics Toolkit (AI Kit)](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit.html).
+The Intel® Distribution of Modin* and Intel® Extension for Scikit-learn* libraries are available together in [AI Tools](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools.html).
 
 ## Key Implementation Details
 
-This end-to-end workload sample code is implemented for CPU using the Python language. Once you have installed AI Kit, the Conda environment is prepared with Python version 3.8 (or newer), Intel Distribution of Modin*, Intel® Extension for Scikit-Learn, and NumPy.
+This end-to-end workload sample code is implemented for CPU using the Python language. Once you have installed AI Tools, the Conda environment is prepared with Python version 3.8 (or newer), Intel Distribution of Modin*, Intel® Extension for Scikit-Learn, and NumPy.
 
 In this sample, you will use Intel® Distribution of Modin* to ingest and process U.S. census data from 1970 to 2010 in order to build a ridge regression-based model to find the relation between education and total income earned in the US.

@@ -36,11 +36,11 @@ The data transformation stage normalizes the income to yearly inflation, balance
 
 ## Configure the Development Environment
 
-If you do not already have the AI Kit installed, then download an online or offline installer for the [Intel® AI Analytics Toolkit (AI Kit)](https://software.intel.com/content/www/us/en/develop/tools/oneapi/ai-analytics-toolkit.html) or install the AI Kit using Conda.
+If you do not already have the AI Tools installed, then download an online or offline installer for the [AI Tools](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools.html) or install the AI Tools using Conda.
 
->**Note**: See [Install Intel® AI Analytics Toolkit via Conda*](https://software.intel.com/content/www/us/en/develop/documentation/installation-guide-for-intel-oneapi-toolkits-linux/top/installation/install-using-package-managers/conda/install-intel-ai-analytics-toolkit-via-conda.html) in the *Intel® oneAPI Toolkits Installation Guide for Linux* OS* for information on Conda installation and configuration.
+>**Note**: See [Install AI Tools via Conda*](https://software.intel.com/content/www/us/en/develop/documentation/installation-guide-for-intel-oneapi-toolkits-linux/top/installation/install-using-package-managers/conda/install-intel-ai-analytics-toolkit-via-conda.html) in the *Intel® oneAPI Toolkits Installation Guide for Linux* OS* for information on Conda installation and configuration.
 
-The Intel® Distribution of Modin* and the Intel® Extension for Scikit-learn* are ready to use after AI Kit installation with the Conda Package Manager.
+The Intel® Distribution of Modin* and the Intel® Extension for Scikit-learn* are ready to use after AI Tools installation with the Conda Package Manager.
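The Census workload above builds a ridge-regression model relating education to income. As a hedged illustration of what that model computes, here is the closed-form ridge solution in plain NumPy on synthetic data; the actual sample uses Intel® Distribution of Modin* and Intel® Extension for Scikit-learn*, and the coefficients below are invented for the sketch:

```python
import numpy as np

# Synthetic stand-in for the census features: years of education -> income.
rng = np.random.default_rng(0)
education = rng.uniform(8, 20, size=200)                    # years of schooling
income = 3000.0 * education + rng.normal(0, 5000, size=200) # noisy linear relation

# Closed-form ridge regression: w = (X^T X + alpha*I)^(-1) X^T y
X = np.column_stack([education, np.ones_like(education)])   # add an intercept column
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ income)

slope, intercept = w
print(f"learned slope: {slope:.1f}")
```

With Intel® Extension for Scikit-learn* the same fit would go through `sklearn.linear_model.Ridge` after calling `sklearnex.patch_sklearn()`.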
AI-and-Analytics/End-to-end-Workloads/JobRecommendationSystem/README.md (+2 −2)

@@ -42,7 +42,7 @@ You will need to download and install the following toolkits, tools, and compone
 
 Required AI Tools: <Intel® Extension for TensorFlow* - GPU><!-- List specific AI Tools that needs to be installed before running this sample -->
 
-If you have not already, select and install these Tools via [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html). AI and Analytics samples are validated on AI Tools Offline Installer. It is recommended to select Offline Installer option in AI Tools Selector.
+If you have not already, select and install these Tools via [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools-selector.html). AI and Analytics samples are validated on AI Tools Offline Installer. It is recommended to select Offline Installer option in AI Tools Selector.
 
 >**Note**: If Docker option is chosen in AI Tools Selector, refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the docker and samples.

@@ -85,7 +85,7 @@ For Jupyter Notebook, refer to [Installing Jupyter](https://jupyter.org/install)
 ## Run the Sample
 >**Note**: Before running the sample, make sure [Environment Setup](https://github.com/oneapi-src/oneAPI-samples/tree/master/AI-and-Analytics/Getting-Started-Samples/INC-Quantization-Sample-for-PyTorch#environment-setup) is completed.
 
-Go to the section which corresponds to the installation method chosen in [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html) to see relevant instructions:
+Go to the section which corresponds to the installation method chosen in [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools-selector.html) to see relevant instructions:
AI-and-Analytics/End-to-end-Workloads/LanguageIdentification/README.md (+2 −2)

@@ -11,7 +11,7 @@ Languages are selected from the CommonVoice dataset for training, validation, an
 
 ## Purpose
 
-Spoken audio comes in different languages and this sample uses a model to identify what that language is. The user will use an Intel® AI Analytics Toolkit container environment to train a model and perform inference leveraging Intel-optimized libraries for PyTorch*. There is also an option to quantize the trained model with Intel® Neural Compressor (INC) to speed up inference.
+Spoken audio comes in different languages and this sample uses a model to identify what that language is. The user will use an AI Tools container environment to train a model and perform inference leveraging Intel-optimized libraries for PyTorch*. There is also an option to quantize the trained model with Intel® Neural Compressor (INC) to speed up inference.
 
 ## Prerequisites

@@ -39,7 +39,7 @@ For both training and inference, you can run the sample and scripts in Jupyter N
 
 ### Create and Set Up Environment
 
-1. Create your conda environment by following the instructions on the Intel [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html). You can follow these settings:
+1. Create your conda environment by following the instructions on the Intel [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools-selector.html). You can follow these settings:
AI-and-Analytics/End-to-end-Workloads/LanguageIdentification/Training/lang_id_training.ipynb (+9)

@@ -200,6 +200,15 @@
 "\n",
 ">**Note**: If the folder name containing the model is changed from `lang_id_commonvoice_model`, you will need to modify the `pretrained_path` in `train_ecapa.yaml`, and the `source_model_path` variable in both the `inference_commonVoice.py` and `inference_custom.py` files in the `speechbrain_inference` class. "
AI-and-Analytics/End-to-end-Workloads/LidarObjectDetection-PointPillars/include/devicemanager/devicemanager.hpp
AI-and-Analytics/Features-and-Functionality/INC_QuantizationAwareTraining_TextClassification/INC_QuantizationAwareTraining_TextClassification.ipynb
AI-and-Analytics/Features-and-Functionality/INC_QuantizationAwareTraining_TextClassification/INC_QuantizationAwareTraining_TextClassification.py
AI-and-Analytics/Features-and-Functionality/INC_QuantizationAwareTraining_TextClassification/README.md (+4 −4)

@@ -8,7 +8,7 @@ The `Fine-tuning Text Classification Model with Intel® Neural Compressor (INC)`
 | Time to complete | 10 minutes
 | Category | Concepts and Functionality
 
-Intel® Neural Compressor (INC) simplifies the process of converting the FP32 model to INT8/BF16. At the same time, Intel® Neural Compressor (INC) tunes the quantization method to reduce the accuracy loss, which is a big blocker for low-precision inference as part of Intel® AI Analytics Toolkit (AI Kit).
+Intel® Neural Compressor (INC) simplifies the process of converting the FP32 model to INT8/BF16. At the same time, Intel® Neural Compressor (INC) tunes the quantization method to reduce the accuracy loss, which is a big blocker for low-precision inference as part of AI Tools.
 
 ## Purpose

@@ -26,9 +26,9 @@ This sample shows how to fine-tune text model for emotion classification on pre-
 
 You will need to download and install the following toolkits, tools, and components to use the sample.
 
--**Intel® AI Analytics Toolkit (AI Kit)**
+-**AI Tools**
 
-You can get the AI Kit from [Intel® oneAPI Toolkits](https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html#analytics-kit). <br> See [*Get Started with the Intel® AI Analytics Toolkit for Linux**](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux) for AI Kit installation information and post-installation steps and scripts.
+You can get the AI Tools from [Intel® oneAPI Toolkits](https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html#analytics-kit). <br> See [*Get Started with the AI Tools for Linux**](https://www.intel.com/content/www/us/en/docs/oneapi-ai-analytics-toolkit/get-started-guide-linux/current/before-you-begin.html) for AI Tools installation information and post-installation steps and scripts.
 
 -**Jupyter Notebook**

@@ -90,7 +90,7 @@ When working with the command-line interface (CLI), you should configure the one
 ```
 2. Activate Conda environment without Root access (Optional).
 
-By default, the AI Kit is installed in the `/opt/intel/oneapi` folder and requires root privileges to manage it.
+By default, the AI Tools is installed in the `/opt/intel/oneapi` folder and requires root privileges to manage it.
 
 You can choose to activate Conda environment without root access. To bypass root access to manage your Conda environment, clone and activate your desired Conda environment using the following commands similar to the following.
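The README above says Intel® Neural Compressor converts FP32 models to INT8/BF16 while tuning to limit accuracy loss. A minimal sketch of what symmetric per-tensor INT8 quantization itself does, in plain NumPy rather than INC's actual API, may make the trade-off concrete:

```python
import numpy as np

def quantize_int8(x):
    """Symmetric INT8 quantization: map [-max|x|, max|x|] onto [-127, 127]."""
    scale = float(np.abs(x).max()) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover an FP32 approximation of the original tensor."""
    return q.astype(np.float32) * scale

# Quantize a synthetic weight tensor and measure the round-trip error.
weights = np.random.default_rng(0).normal(0, 0.1, size=1000).astype(np.float32)
q, scale = quantize_int8(weights)
err = np.abs(dequantize(q, scale) - weights).max()
print(f"max round-trip error: {err:.6f}")  # bounded by about scale/2
```

INC's contribution, per the README, is tuning this kind of conversion (per-layer choices, calibration, fallback to FP32) so the accuracy loss stays within a target, rather than the raw rounding shown here.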
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_GPU_InferenceOptimization_with_AMP/IntelPyTorch_GPU_InferenceOptimization_with_AMP.ipynb
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_GPU_InferenceOptimization_with_AMP/IntelPyTorch_GPU_InferenceOptimization_with_AMP.py
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_GPU_InferenceOptimization_with_AMP/README.md (+5 −5)

@@ -20,15 +20,15 @@ The Intel® Extension for PyTorch (IPEX) gives users the ability to perform PyTo
 |:--- |:---
 | OS | Ubuntu* 22.04 or newer
 | Hardware | Intel® Data Center GPU Flex Series, Intel® Data Center GPU Max Series, and Intel® ARC™ A-Series GPUs(Experimental Support)
-| Software | Intel® oneAPI AI Analytics Toolkit 2023.1 or later
+| Software | AI Tools 2023.1 or later
 
 ### For Local Development Environments
 
 You will need to download and install the following toolkits, tools, and components to use the sample.
 
--**Intel® AI Analytics Toolkit (AI Kit) 2023.1 or later**
+-**AI Tools 2023.1 or later**
 
-You can get the AI Kit from [Intel® oneAPI Toolkits](https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html#analytics-kit). <br> See [*Get Started with the Intel® AI Analytics Toolkit for Linux**](https://www.intel.com/content/www/us/en/develop/documentation/get-started-with-ai-linux) for AI Kit installation information and post-installation steps and scripts.
+You can get the AI Tools from [Intel® oneAPI Toolkits](https://www.intel.com/content/www/us/en/developer/tools/oneapi/toolkits.html#analytics-kit). <br> See [*Get Started with the AI Tools for Linux**](https://www.intel.com/content/www/us/en/docs/oneapi-ai-analytics-toolkit/get-started-guide-linux/current/before-you-begin.html) for AI Tools installation information and post-installation steps and scripts.
 
 -**Jupyter Notebook**

@@ -88,7 +88,7 @@ When working with the command-line interface (CLI), you should configure the one
 ```
 2. Activate Conda environment without Root access (Optional).
 
-By default, the AI Kit is installed in the `/opt/intel/oneapi` folder and requires root privileges to manage it.
+By default, the AI Tools is installed in the `/opt/intel/oneapi` folder and requires root privileges to manage it.
 
 You can choose to activate Conda environment without root access. To bypass root access to manage your Conda environment, clone and activate your desired Conda environment and create a jupyter kernal using the following commands similar to the following.

@@ -110,7 +110,7 @@ When working with the command-line interface (CLI), you should configure the one
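Several of the READMEs above describe cloning the toolkit's Conda environment to avoid needing root access to `/opt/intel/oneapi`. A sketch of the commands involved; the environment names `pytorch-gpu` and `my_env` are placeholders, not names taken from the samples:

```shell
# Make the toolkit's conda and environments visible in this shell
# (assumes the default /opt/intel/oneapi install location).
source /opt/intel/oneapi/setvars.sh

# Clone the root-owned environment into your home directory,
# where no elevated privileges are needed.
conda create --name my_env --clone pytorch-gpu

# Activate the writable clone and register a Jupyter kernel for it.
conda activate my_env
python -m ipykernel install --user --name my_env
```

The clone is a full copy, so package changes made in `my_env` never touch the managed install under `/opt/intel/oneapi`.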
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_TrainingOptimizations_AMX_BF16/IntelPyTorch_TrainingOptimizations_AMX_BF16.ipynb
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_TrainingOptimizations_AMX_BF16/README.md (+2 −2)

@@ -37,7 +37,7 @@ You will need to download and install the following toolkits, tools, and compone
 
 Required AI Tools: Intel® Extension for PyTorch* (CPU)
 
-If you have not already, select and install these Tools via [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html). AI and Analytics samples are validated on AI Tools Offline Installer. It is recommended to select Offline Installer option in AI Tools Selector.
+If you have not already, select and install these Tools via [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools-selector.html). AI and Analytics samples are validated on AI Tools Offline Installer. It is recommended to select Offline Installer option in AI Tools Selector.
 
 >**Note**: If Docker option is chosen in AI Tools Selector, refer to [Working with Preset Containers](https://github.com/intel/ai-containers/tree/main/preset) to learn how to run the docker and samples.

@@ -74,7 +74,7 @@ For Jupyter Notebook, refer to [Installing Jupyter](https://jupyter.org/install)
 ## Run the Sample
 >**Note**: Before running the sample, make sure [Environment Setup](https://github.com/oneapi-src/oneAPI-samples/tree/master/AI-and-Analytics/Features-and-Functionality/IntelPyTorch_TrainingOptimizations_AMX_BF16#environment-setup) is completed.
 
-Go to the section which corresponds to the installation method chosen in [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/tools/oneapi/ai-tools-selector.html) to see relevant instructions:
+Go to the section which corresponds to the installation method chosen in [AI Tools Selector](https://www.intel.com/content/www/us/en/developer/topic-technology/artificial-intelligence/frameworks-tools-selector.html) to see relevant instructions:
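The AMX BF16 sample above trains in bfloat16. As an illustration of the precision trade-off (BF16 keeps FP32's 8 exponent bits, so the same dynamic range, but only 8 mantissa bits), here is a bit-level NumPy emulation; this is not the sample's code, which uses PyTorch's native `torch.bfloat16`:

```python
import numpy as np

def to_bf16(x):
    """Emulate FP32 -> BF16 by zeroing the low 16 mantissa bits (truncation,
    rather than the round-to-nearest real hardware performs)."""
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    return (bits & np.uint32(0xFFFF0000)).view(np.float32)

pi = np.float32(3.14159265)
print(to_bf16([pi])[0])  # 3.140625: same exponent, only ~3 decimal digits kept
```

Exactly representable values such as 1.0 or 0.5 pass through unchanged, which is why BF16 training mainly costs precision in gradients and activations, not range.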
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_TrainingOptimizations_AMX_BF16/pytorch_training_amx_bf16.py
AI-and-Analytics/Features-and-Functionality/IntelPyTorch_TrainingOptimizations_AMX_BF16/pytorch_training_avx512_bf16.py
AI-and-Analytics/Features-and-Functionality/IntelPython_GPU_dpnp_Genetic_Algorithm/IntelPython_GPU_dpnp_Genetic_Algorithm.ipynb
AI-and-Analytics/Features-and-Functionality/IntelPython_GPU_dpnp_Genetic_Algorithm/IntelPython_GPU_dpnp_Genetic_Algorithm.py
AI-and-Analytics/Features-and-Functionality/IntelPython_Numpy_Numba_dpnp_kNN/IntelPython_Numpy_Numba_dpnp_kNN.ipynb
AI-and-Analytics/Features-and-Functionality/IntelPython_Numpy_Numba_dpnp_kNN/IntelPython_Numpy_Numba_dpnp_kNN.py