update set_context API for ms2.5 version #816

Merged · 1 commit · Mar 18, 2025
4 changes: 2 additions & 2 deletions .github/workflows/ci.yml
@@ -20,7 +20,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.8"] #["3.7", "3.8", "3.9"] TBA
python-version: ["3.9"] #["3.7", "3.8", "3.9"] TBA
runs-on: ${{ matrix.os }}

steps:
@@ -35,7 +35,7 @@ jobs:
pip install -r requirements/dev.txt
# MindSpore must be installed following the instruction from official web, but not from pypi.
# That's why we exclude mindspore from requirements.txt. Does this work?
- pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/2.3.0/MindSpore/unified/x86_64/mindspore-2.3.0-cp38-cp38-linux_x86_64.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple
+ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/2.5.0/MindSpore/unified/x86_64/mindspore-2.5.0-cp39-cp39-linux_x86_64.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple
- name: Lint with pre-commit
uses: pre-commit/[email protected]
- name: Check_Cpplint
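Not part of the diff: a quick post-install sanity check, assuming the CI image now provides Python 3.9 and the 2.5.0 wheel pinned above, could look like this sketch.

```python
# Hypothetical post-install check (not in this PR): confirm the interpreter and
# MindSpore versions match the cp39 / 2.5.0 wheel pinned in the workflow above.
import sys

import mindspore as ms

assert sys.version_info[:2] == (3, 9), f"expected Python 3.9, got {sys.version_info[:2]}"
assert ms.__version__.startswith("2.5"), f"expected MindSpore 2.5.x, got {ms.__version__}"
ms.run_check()  # runs MindSpore's built-in installation check
```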
2 changes: 1 addition & 1 deletion deploy/eval_utils/eval_rec.py
@@ -68,7 +68,7 @@ def read_gt_content(filename):


if __name__ == "__main__":
- ms.set_context(device_target="CPU")
+ ms.set_device("CPU")
parser = argparse.ArgumentParser()
parser.add_argument("--gt_path", required=True, type=str)
parser.add_argument("--pred_path", required=True, type=str)
2 changes: 1 addition & 1 deletion deploy/py_infer/src/data_process/postprocess/builder.py
@@ -26,7 +26,7 @@ def __init__(self, tasks, **kwargs):
# if check device failed, set device_target="CPU"
if get_device_status() == 1:
# FIXME: set_context may be invalid sometimes, it's best to patch to XXXPostprocess.__init__
- ms.set_context(device_target="CPU")
+ ms.set_device("CPU")

def __call__(self, *args, **kwargs):
return self._ops_func(*args, **kwargs)
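A minimal sketch of the fallback pattern above, assuming `get_device_status()` is the repo helper referenced in the diff and that a return value of 1 means the device check failed:

```python
import mindspore as ms


def fall_back_to_cpu_if_needed(device_status: int) -> None:
    """Sketch of the fallback above (hypothetical helper, not part of the PR)."""
    if device_status == 1:  # 1: device check failed, per the comment in the diff
        # Same caveat as the FIXME above: this may not take effect in every case.
        ms.set_device("CPU")
```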
4 changes: 2 additions & 2 deletions docs/en/tutorials/frequently_asked_questions.md
@@ -116,7 +116,7 @@ For example, the following combination of packages is suitable when the platform
- `Load dynamic library: libmindspore_ascend.so.2 failed. liboptiling.so: cannot open shared object file: No such file or directory`

```bash
python -c "import mindspore;mindspore.set_context(device_target='Ascend');mindspore.run_check()"
python -c "import mindspore;mindspore.set_device('Ascend');mindspore.run_check()"
[WARNING] ME(60105:13981374421 1776, MainProcess):2023-10-25-08: 14:33.640.411 [mindspore/run_check/_check_version.py:348] Using custom Ascend AI software package (Ascend Data Center Solution) path, package version checking is skipped. Please make sure Ascend AI software package (Ascend Data Center Solution) version is supported. For details, refer to the installation guidelines https://www.mindspore.cn/install
Traceback (most recent call last):
File "<string>", line 1, in module>
@@ -412,7 +412,7 @@ Remove the `lanms` item from `requirements.txt`, and run `pip install -r require
[WARNING] ME(44720:140507814819648,MainProcess):2023-11-01-03:01:38.884.384 [mindspore/run_check/_check_version.py:348] Using custom Ascend AI software package (Ascend Data Center Solution) path, package version checking is skipped. Please make sure Ascend AI software package (Ascend Data Center Solution) version is supported. For details, refer to the installation guidelines https://www.mindspore.cn/install
[WARNING] ME(44720:140507814819648,MainProcess):2023-11-01-03:01:38.884.675 [mindspore/run_check/_check_version.py:466] Can not find driver so(need by mindspore-ascend). Please check whether the Environment Variable LD_LIBRARY_PATH is set. For details, refer to the installation guidelines: https://www.mindspore.cn/install
>>> import mindspore.ops as ops
- >>> ms.set_context(device_target="Ascend")
+ >>> ms.set_device("Ascend")
>>> ms.run_check()
MindSpore version: 2.2.0.20231025
The result of multiplication calculation is correct, MindSpore has been installed on platform [Ascend] successfully!
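The same verification written as a short script instead of a one-liner, as a sketch assuming an Ascend host:

```python
# Script form of the check above; "Ascend" assumes an NPU host is available.
import mindspore as ms

ms.set_device("Ascend")  # e.g. ms.set_device("Ascend", 0) to pin a specific card
ms.run_check()           # prints the MindSpore version and runs a small test op
```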
4 changes: 2 additions & 2 deletions docs/zh/tutorials/frequently_asked_questions.md
@@ -118,7 +118,7 @@
- `Load dynamic library: libmindspore_ascend.so.2 failed. liboptiling.so: cannot open shared object file: No such file or directory`

```bash
python -c "import mindspore;mindspore.set_context(device_target='Ascend');mindspore.run_check()"
python -c "import mindspore;mindspore.set_device('Ascend');mindspore.run_check()"
[WARNING] ME(60105:13981374421 1776, MainProcess):2023-10-25-08: 14:33.640.411 [mindspore/run_check/_check_version.py:348] Using custom Ascend AI software package (Ascend Data Center Solution) path, package version checking is skipped. Please make sure Ascend AI software package (Ascend Data Center Solution) version is supported. For details, refer to the installation guidelines https://www.mindspore.cn/install
Traceback (most recent call last):
File "<string>", line 1, in module>
@@ -418,7 +418,7 @@ ERROR: Could not build wheels for lanms-neo, which is required to install pyproj
[WARNING] ME(44720:140507814819648,MainProcess):2023-11-01-03:01:38.884.384 [mindspore/run_check/_check_version.py:348] Using custom Ascend AI software package (Ascend Data Center Solution) path, package version checking is skipped. Please make sure Ascend AI software package (Ascend Data Center Solution) version is supported. For details, refer to the installation guidelines https://www.mindspore.cn/install
[WARNING] ME(44720:140507814819648,MainProcess):2023-11-01-03:01:38.884.675 [mindspore/run_check/_check_version.py:466] Can not find driver so(need by mindspore-ascend). Please check whether the Environment Variable LD_LIBRARY_PATH is set. For details, refer to the installation guidelines: https://www.mindspore.cn/install
>>> import mindspore.ops as ops
- >>> ms.set_context(device_target="Ascend")
+ >>> ms.set_device("Ascend")
>>> ms.run_check()
MindSpore version: 2.2.0.20231025
The result of multiplication calculation is correct, MindSpore has been installed on platform [Ascend] successfully!
3 changes: 2 additions & 1 deletion mindocr/models/backbones/yolov8_backbone.py
@@ -118,7 +118,8 @@ def yolov8_backbone(


def test_yolo_backbone():
- ms.set_context(mode=ms.PYNATIVE_MODE, device_target='Ascend', device_id=3)
+ ms.set_context(mode=ms.PYNATIVE_MODE)
+ ms.set_device("Ascend", 3)
ms.set_seed(0)

network = YOLOv8Backbone()
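Shown in isolation, the split this PR applies to the test helpers (the same change appears in `yolov8_head.py` below): the execution mode stays in `set_context`, while the device target and id (3 here, as in the test) move to `set_device`.

```python
import mindspore as ms

# Before (removed above): mode, device target and device id in one call.
# ms.set_context(mode=ms.PYNATIVE_MODE, device_target="Ascend", device_id=3)

# After (added above): mode via set_context, device selection via set_device.
ms.set_context(mode=ms.PYNATIVE_MODE)
ms.set_device("Ascend", 3)
```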
3 changes: 2 additions & 1 deletion mindocr/models/heads/yolov8_head.py
@@ -124,7 +124,8 @@ def yolov8_head(nc=5, reg_max=16, stride=None, in_channels=None) -> YOLOv8Head:


def test_yolov8_head():
- ms.set_context(mode=ms.PYNATIVE_MODE, device_target='Ascend', device_id=3)
+ ms.set_context(mode=ms.PYNATIVE_MODE)
+ ms.set_device("Ascend", 3)
ms.set_seed(0)

network = yolov8_head()
12 changes: 4 additions & 8 deletions tools/infer/text/predict_llm.py
@@ -45,17 +45,13 @@ def __init__(self, args):
config = LLMConfig(config_path)
ms.set_context(
mode=ms.GRAPH_MODE,
device_target="Ascend",
enable_graph_kernel=False,
graph_kernel_flags="--disable_expand_ops=Softmax,Dropout --enable_parallel_fusion=true "
"--reduce_fuse_depth=8 --enable_auto_tensor_inplace=true",
ascend_config={"precision_mode": "must_keep_origin_dtype"},
max_call_depth=10000,
max_device_memory="58GB",
save_graphs=False,
save_graphs_path="./graph",
device_id=os.environ.get("DEVICE_ID", 0),
)
ms.set_recursion_limit(10000)
ms.set_device("Ascend", os.environ.get("DEVICE_ID", 0))
ms.runtime.set_memory(max_size="58GB")
ms.device_context.ascend.op_precision.precision_mode("must_keep_origin_dtype")
self.tokenizer = QwenTokenizer(**config.processor.tokenizer)
self.model = VaryQwenForCausalLM.from_pretrained(config_path)
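The mapping applied in this file, collected in one place as a sketch. The values are the ones used above; the `set_context` options the PR keeps are not repeated here, and the `int()` conversion of `DEVICE_ID` is an illustration-only addition, not something the diff does.

```python
import os

import mindspore as ms

# set_context keyword (removed above)     -> MindSpore 2.5 call (added above)
# max_call_depth=10000                    -> ms.set_recursion_limit(10000)
# device_target="Ascend", device_id=...   -> ms.set_device("Ascend", ...)
# max_device_memory="58GB"                -> ms.runtime.set_memory(max_size="58GB")
# ascend_config={"precision_mode": ...}   -> ms.device_context.ascend.op_precision.precision_mode(...)

ms.set_context(mode=ms.GRAPH_MODE)
ms.set_recursion_limit(10000)
device_id = int(os.environ.get("DEVICE_ID", "0"))  # int() added here; the diff passes the value as-is
ms.set_device("Ascend", device_id)
ms.runtime.set_memory(max_size="58GB")
ms.device_context.ascend.op_precision.precision_mode("must_keep_origin_dtype")
```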

2 changes: 1 addition & 1 deletion tools/train.py
@@ -42,7 +42,7 @@ def main(cfg):
# init env
ms.set_context(mode=cfg.system.mode)
if cfg.train.max_call_depth:
- ms.set_context(max_call_depth=cfg.train.max_call_depth)
+ ms.set_recursion_limit(cfg.train.max_call_depth)
if cfg.system.mode == 0:
ms.set_context(jit_config={"jit_level": "O2"})
if cfg.system.distribute:
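A minimal sketch of the guarded call above, with a plain integer standing in for `cfg.train.max_call_depth` (hypothetical value):

```python
import mindspore as ms

max_call_depth = 10000  # hypothetical stand-in for cfg.train.max_call_depth
if max_call_depth:
    # Replaces ms.set_context(max_call_depth=...) for MindSpore 2.5, as in the diff above.
    ms.set_recursion_limit(max_call_depth)
```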