Commit df79169
slancer50 and shacharl authored
Release/2.5.0 (#133)
* docs: update documentation and bump version for release
* fix comment

Co-authored-by: shacharl <[email protected]>
1 parent 10670ef commit df79169

34 files changed: +2359 / -123 lines

docs/_sources/index.rst.txt

Lines changed: 1 addition & 0 deletions

@@ -22,6 +22,7 @@ Control-M Python Client is built for data scientists and developers who prefer a
    notebooks/jobproperties
    notebooks/connectionprofiles
    notebooks/run_ondemand
+   notebooks/get_jobs
    beyond

 .. toctree::
Lines changed: 189 additions & 0 deletions

@@ -0,0 +1,189 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "060c8aab",
   "metadata": {},
   "source": [
    "# Get Jobs"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "06976a85",
   "metadata": {},
   "source": [
    "[AutomationAPI Documentation](https://documents.bmc.com/supportu/API/Monthly/en-US/Documentation/API_Services_DeployService.htm#deploy3)\n",
    "\n",
    "The get_jobs() method in the Control-M Python Client provides an easy way to retrieve job and folder definitions from the Control-M/Server. It wraps the deploy jobs::get AAPI command and deserializes the resulting JSON into Python objects, allowing you to fetch and work with jobs and folders in a form that mirrors the state before deployment.\n",
    "\n",
    "This guide demonstrates how to use get_jobs() to fetch jobs from the Control-M server."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "93e45a7e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{\n",
      "  \"Folder_Sanity_1\": {\n",
      "    \"Type\": \"Folder\",\n",
      "    \"ControlmServer\": \"workbench\",\n",
      "    \"LoadForecasts\": {\n",
      "      \"Type\": \"Job:Databricks\",\n",
      "      \"RunAs\": \"workbench\",\n",
      "      \"RunAsDummy\": true,\n",
      "      \"ConnectionProfile\": \"DATABRICKS\",\n",
      "      \"Databricks Job ID\": \"991955986358417\",\n",
      "      \"Parameters\": \"\\\"params\\\":{}\",\n",
      "      \"Idempotency Token\": \"tokeni_%%ORDERID\"\n",
      "    },\n",
      "    \"RunAs\": \"workbench\"\n",
      "  },\n",
      "  \"Folder_Sanity_2\": {\n",
      "    \"Type\": \"Folder\",\n",
      "    \"ControlmServer\": \"workbench\",\n",
      "    \"InventoryForecastModel\": {\n",
      "      \"Type\": \"Job:AWS SageMaker\",\n",
      "      \"RerunLimit\": {\n",
      "        \"Times\": \"5\"\n",
      "      },\n",
      "      \"RunAs\": \"workbench\",\n",
      "      \"RunAsDummy\": true,\n",
      "      \"ConnectionProfile\": \"SAGEMAKER\",\n",
      "      \"Pipeline Name\": \"InferencePipeline\",\n",
      "      \"Idempotency Token\": \"Token_ControlM_for_SageMaker%%ORDERID\",\n",
      "      \"Add Parameters\": \"checked\",\n",
      "      \"Parameters\": \"{\\\"Name\\\":\\\"input_file\\\", \\\"Value\\\": \\\"file\\\"}\",\n",
      "      \"Retry Pipeline Execution\": \"unchecked\"\n",
      "    },\n",
      "    \"RunAs\": \"workbench\"\n",
      "  }\n",
      "}\n"
     ]
    }
   ],
   "source": [
    "from ctm_python_client.core.workflow import Workflow, WorkflowDefaults\n",
    "from ctm_python_client.core.comm import Environment\n",
    "from aapi import *\n",
    "\n",
    "# Step 1: Define the environment (Control-M Workbench in this case)\n",
    "env = Environment.create_workbench('workbench')\n",
    "\n",
    "# Step 2: Define your workflow with jobs and folders\n",
    "workflow = Workflow(env, WorkflowDefaults(run_as='workbench'))\n",
    "workflow.clear_all()\n",
    "run_as_dummy = True\n",
    "\n",
    "# Define jobs\n",
    "databricks_load = JobDatabricks(\n",
    "    'LoadForecasts',\n",
    "    connection_profile='DATABRICKS',\n",
    "    databricks_job_id='991955986358417',\n",
    "    parameters='\"params\":{}',\n",
    "    idempotency_token='tokeni_%%ORDERID',\n",
    "    run_as_dummy=run_as_dummy\n",
    ")\n",
    "\n",
    "sagemaker_job = JobAwsSageMaker(\n",
    "    'InventoryForecastModel',\n",
    "    connection_profile='SAGEMAKER',\n",
    "    pipeline_name='InferencePipeline',\n",
    "    idempotency_token='Token_ControlM_for_SageMaker%%ORDERID',\n",
    "    add_parameters='checked',\n",
    "    parameters='{\"Name\":\"input_file\", \"Value\": \"file\"}',\n",
    "    run_as_dummy=run_as_dummy,\n",
    "    retry_pipeline_execution='unchecked',\n",
    "    rerun_limit=Job.RerunLimit(times='5')\n",
    ")\n",
    "\n",
    "# Define folders\n",
    "folder1 = Folder('Folder_Sanity_1', controlm_server='workbench', job_list=[databricks_load])\n",
    "folder2 = Folder('Folder_Sanity_2', controlm_server='workbench', job_list=[sagemaker_job])\n",
    "\n",
    "workflow.add(folder1)\n",
    "workflow.add(folder2)\n",
    "\n",
    "# Step 3: Build and deploy the workflow\n",
    "workflow.build()\n",
    "workflow.deploy()\n",
    "\n",
    "# Step 4: Retrieve the deployed jobs from the server\n",
    "workflow_actual = Workflow.get_jobs(env, server='workbench', folder='Folder_Sanity_*')\n",
    "\n",
    "print(workflow_actual.dumps_json(indent=2))\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "7c0baf86",
   "metadata": {},
   "source": [
    "## Arguments"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "feb81b89",
   "metadata": {},
   "source": [
    "- **environment: Environment**:\n",
    "  The Control-M environment to connect to. Required. \n",
    "  This object defines the Control-M endpoint (the Automation API endpoint, same as Control-M/EM), the authentication method, and the environment mode (on-prem or SaaS). \n",
    "  You can create an Environment instance using one of the following static methods: \n",
    "  - `create_workbench()` – for local development with Workbench. \n",
    "  - `create_onprem(host, username, password)` – for on-premises Control-M using username/password authentication. \n",
    "  - `create_saas(endpoint, api_key)` – for Helix Control-M (SaaS) using an API key. \n",
    "\n",
    "  The environment determines how to authenticate and which API variant to use, depending on whether the backend is Control-M or Helix Control-M.\n",
    "\n",
    "- **server: str**:\n",
    "  The exact Control-M/Server name to query. Required. \n",
    "  Wildcards are not allowed.\n",
    "\n",
    "- **folder: str**:\n",
    "  The folder name or pattern to fetch. Required. \n",
    "  Supports wildcards (e.g., \"MyFolder_*\") and filters jobs by folder name.\n",
    "\n",
    "- **job: str**:\n",
    "  Currently not supported. The API ignores this parameter if provided."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "59899dfc",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "venv",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
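As the notebook's Arguments section notes, `folder` accepts shell-style wildcards (e.g. `"Folder_Sanity_*"`) while `server` must be an exact name. The wildcard semantics can be sketched locally with Python's stdlib `fnmatch`; the folder names below reuse the notebook's examples plus one made-up non-matching name, and this is only an illustration of the pattern matching, not the library's internal implementation:

```python
import fnmatch

# Example folder names: two from the notebook, one hypothetical non-match.
deployed_folders = ["Folder_Sanity_1", "Folder_Sanity_2", "Nightly_Batch"]

def match_folders(folders, pattern):
    """Return the folder names matching a shell-style wildcard pattern."""
    return [name for name in folders if fnmatch.fnmatch(name, pattern)]

print(match_folders(deployed_folders, "Folder_Sanity_*"))
# ['Folder_Sanity_1', 'Folder_Sanity_2']
```

An exact name such as `"Folder_Sanity_1"` still matches only itself, so passing a literal folder name behaves as expected.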
