diff --git a/api/index.html b/api/index.html index bcc4e84..4fa79ec 100644 --- a/api/index.html +++ b/api/index.html @@ -1392,13 +1392,13 @@
Astronomer Starship can send your Airflow workloads to new places!
"},{"location":"#what-is-it","title":"What is it?","text":"Starship is a utility to migrate Airflow metadata such as Airflow Variables, Connections, Environment Variables, Pools, and DAG History between two Airflow instances.
"},{"location":"#installation","title":"Installation","text":"
pip install astronomer-starship\n
"},{"location":"#usage","title":"Usage","text":"astro dev init
with the Astro CLI to create an Astro Project locally in your terminal, add your DAGs to the /dags
folder in the Astro Project, run astro deploy, then open the
Astronomer
menu and select the Migration Tool \ud83d\ude80
option. I'm on Airflow 1, can I use Starship?
No, Starship is only compatible with Airflow 2.x and above, see Compatibility
I'm on Airflow>=2.7 and can't test connections?
You must have AIRFLOW__CORE__TEST_CONNECTION
set. See notes here
I'm using Google Cloud Composer 2.x and Airflow 2.x and do not see the Astronomer
menu and/or the Starship Airflow Plugin?
Run the following to ensure you are a privileged user.
gcloud config set project <PROJECT_NAME>\ngcloud composer environments run <ENVIRONMENT_NAME> --location <LOCATION> users add-role -- -e <USER_EMAIL> -r Admin\n
This project is an Airflow Plugin that adds custom API routes. Ensure your environments are correctly secured.
Artwork Starship logo by Lorenzo used with permission from The Noun Project under Creative Commons.
"},{"location":"api/","title":"API","text":""},{"location":"api/#airflow-version","title":"Airflow Version","text":"Returns the version of Airflow that the Starship API is connected to.
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.airflow_version--get-apistarshipairflow_version","title":"GET /api/starship/airflow_version
","text":"Parameters: None
Response:
OK\n
"},{"location":"api/#health","title":"Health","text":"Returns the health of the Starship API
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.health--get-apistarshiphealth","title":"GET /api/starship/health
","text":"Parameters: None
Response:
OK\n
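As a minimal sketch of how these endpoints can be called programmatically (the base URL and Bearer-token auth below are placeholders, not Starship requirements; use whatever authentication your webserver accepts):
# Minimal sketch: call the Starship health and airflow_version endpoints.
# BASE_URL and HEADERS are placeholders; adjust auth to your webserver.
import requests

BASE_URL = "https://your-airflow-webserver.example.com"
HEADERS = {"Authorization": "Bearer <your-token>"}

resp = requests.get(f"{BASE_URL}/api/starship/health", headers=HEADERS)
resp.raise_for_status()
print(resp.text)  # expected: OK

print(requests.get(f"{BASE_URL}/api/starship/airflow_version", headers=HEADERS).text)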
"},{"location":"api/#environment-variables","title":"Environment Variables","text":"Get the Environment Variables, which may be used to set Airflow Connections, Variables, or Configurations
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.env_vars--get-apistarshipenv_vars","title":"GET /api/starship/env_vars
","text":"Parameters: None
Response:
{\n \"FOO\": \"bar\",\n \"AIRFLOW__CORE__SQL_ALCHEMY_CONN\": \"sqlite:////usr/local/airflow/airflow.db\",\n ...\n}\n
"},{"location":"api/#variable","title":"Variable","text":"Get Variables or set a Variable
Model: airflow.models.Variable
Table: variable
GET /api/starship/variable
","text":"Parameters: None
Response:
[\n {\n \"key\": \"key\",\n \"val\": \"val\",\n \"description\": \"My Var\"\n },\n ...\n]\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.variables--post-apistarshipvariable","title":"POST /api/starship/variable
","text":"Parameters: JSON
Field (*=Required) Version Type Example key* str key val* str val description str My VarResponse: List of Variables, as GET
Response
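A sketch of copying Variables from one instance to another with the GET and POST endpoints above; the source/target URLs and tokens are placeholders, and the payload fields follow the table.
import requests

SOURCE = "https://source-airflow.example.com"            # placeholder source webserver
TARGET = "https://target-airflow.example.com"            # placeholder target webserver
SRC_HEADERS = {"Authorization": "Bearer <source-token>"}  # placeholder auth
TGT_HEADERS = {"Authorization": "Bearer <target-token>"}  # placeholder auth

# Read every Variable from the source, then re-create it on the target.
for var in requests.get(f"{SOURCE}/api/starship/variable", headers=SRC_HEADERS).json():
    payload = {"key": var["key"], "val": var["val"], "description": var.get("description")}
    requests.post(f"{TARGET}/api/starship/variable", headers=TGT_HEADERS, json=payload).raise_for_status()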
Get Pools or set a Pool
Model: airflow.models.Pool
Table: pools
GET /api/starship/pools
Parameters: None
Response:
[\n {\n \"name\": \"my_pool\",\n \"slots\": 5,\n \"description\": \"My Pool\n },\n ...\n]\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.pools--post-apistarshippools","title":"POST /api/starship/pools","text":"Parameters: JSON
Field (*=Required) Version Type Example name* str my_pool slots* int 5 description str My Pool include_deferred* >=2.7 bool TrueResponse: List of Pools, as GET
Response
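A sketch of creating a Pool on the target instance with this endpoint; per the table above, include_deferred is only accepted on Airflow >= 2.7, so it is sent conditionally here. The URL, token, and version flag are placeholders.
import requests

TARGET = "https://target-airflow.example.com"          # placeholder target webserver
HEADERS = {"Authorization": "Bearer <target-token>"}   # assumption: token auth is accepted
TARGET_IS_2_7_PLUS = True                              # assumption about the target's Airflow version

pool = {"name": "my_pool", "slots": 5, "description": "My Pool"}
if TARGET_IS_2_7_PLUS:
    pool["include_deferred"] = True  # required field on Airflow >= 2.7
resp = requests.post(f"{TARGET}/api/starship/pools", headers=HEADERS, json=pool)
resp.raise_for_status()
print(resp.json())  # list of Pools, as in the GET response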
Get Connections or set a Connection
Model: airflow.models.Connection
Table: connection
GET /api/starship/connection
","text":"Parameters: None
Response:
[\n {\n \"conn_id\": \"my_conn\",\n \"conn_type\": \"http\",\n \"host\": \"localhost\",\n \"port\": \"1234\",\n \"schema\": \"https\",\n \"login\": \"user\",\n \"password\": \"foobar\", # pragma: allowlist secret\n \"extra\": \"{}\",\n \"conn_type\": \"http\",\n \"conn_type\": \"http\",\n \"conn_type\": \"http\",\n \"description\": \"My Var\"\n },\n ...\n]\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.connections--post-apistarshipconnection","title":"POST /api/starship/connection
","text":"Parameters: JSON
Field (*=Required) Version Type Example conn_id* str my_conn conn_type* str http host str localhost port int 1234 schema str https login str user password str ** extra dict {} description str My ConnResponse: List of Connections, as GET
Response
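A sketch of copying Connections between instances with these endpoints. The source/target URLs, tokens, and the EXCLUDE set are placeholders; note the GET response includes passwords, which are re-posted as-is.
import requests

SOURCE = "https://source-airflow.example.com"           # placeholder source webserver
TARGET = "https://target-airflow.example.com"           # placeholder target webserver
SRC_HEADERS = {"Authorization": "Bearer <source-token>"}  # placeholder auth
TGT_HEADERS = {"Authorization": "Bearer <target-token>"}  # placeholder auth
EXCLUDE = {"airflow_db"}                                 # hypothetical connections to skip

# Read every Connection from the source and re-create it on the target.
for conn in requests.get(f"{SOURCE}/api/starship/connection", headers=SRC_HEADERS).json():
    if conn["conn_id"] in EXCLUDE:
        continue
    requests.post(f"{TARGET}/api/starship/connection", headers=TGT_HEADERS, json=conn).raise_for_status()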
Get DAG or pause/unpause a DAG
Model: airflow.models.DagModel
Table: dags
GET /api/starship/dags
","text":"Parameters: None
Response:
[\n {\n \"dag_id\": \"dag_0\",\n \"schedule_interval\": \"0 0 * * *\",\n \"is_paused\": true,\n \"fileloc\": \"/usr/local/airflow/dags/dag_0.py\",\n \"description\": \"My Dag\",\n \"owners\": \"user\",\n \"tags\": [\"tag1\", \"tag2\"],\n \"dag_run_count\": 2,\n },\n ...\n]\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dags--patch-apistarshipdags","title":"PATCH /api/starship/dags
","text":"Parameters: JSON
Field (*=Required) Version Type Example dag_id* str dag_0 is_paused* bool true{\n \"dag_id\": \"dag_0\",\n \"is_paused\": true\n}\n
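A sketch of pausing a DAG on the source instance with this endpoint before migrating its history; the payload matches the table above, and the URL and token are placeholders.
import requests

SOURCE = "https://source-airflow.example.com"            # placeholder source webserver
HEADERS = {"Authorization": "Bearer <source-token>"}      # assumption: token auth is accepted

resp = requests.patch(
    f"{SOURCE}/api/starship/dags",
    headers=HEADERS,
    json={"dag_id": "dag_0", "is_paused": True},
)
resp.raise_for_status()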
"},{"location":"api/#dag-runs","title":"DAG Runs","text":"Get DAG Runs or set DAG Runs
Model: airflow.models.DagRun
Table: dag_run
GET /api/starship/dag_runs
","text":"Parameters: Args
Field (*=Required) Version Type Example dag_id* str dag_0 limit int 10 offset int 0Response:
[\n {\n \"dag_id\": \"dag_0\",\n \"queued_at\": \"1970-01-01T00:00:00+00:00\",\n \"execution_date\": \"1970-01-01T00:00:00+00:00\",\n \"start_date\": \"1970-01-01T00:00:00+00:00\",\n \"end_date\": \"1970-01-01T00:00:00+00:00\",\n \"state\": \"SUCCESS\",\n \"run_id\": \"manual__1970-01-01T00:00:00+00:00\",\n \"creating_job_id\": 123,\n \"external_trigger\": true,\n \"run_type\": \"manual\",\n \"conf\": None,\n \"data_interval_start\": \"1970-01-01T00:00:00+00:00\",\n \"data_interval_end\": \"1970-01-01T00:00:00+00:00\",\n \"last_scheduling_decision\": \"1970-01-01T00:00:00+00:00\",\n \"dag_hash\": \"....\"\n },\n ...\n]\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dag_runs--post-apistarshipdag_runs","title":"POST /api/starship/dag_runs
","text":"Parameters: JSON
Field (*=Required) Version Type Example dag_runs list[DagRun] [ ... ]{\n \"dag_runs\": [ ... ]\n}\n
DAG Run:
Field (*=Required) Version Type Example dag_id* str dag_0 queued_at date 1970-01-01T00:00:00+00:00 execution_date* date 1970-01-01T00:00:00+00:00 start_date date 1970-01-01T00:00:00+00:00 end_date date 1970-01-01T00:00:00+00:00 state str SUCCESS run_id* str manual__1970-01-01T00:00:00+00:00 creating_job_id int 123 external_trigger bool true run_type* str manual conf dict {} data_interval_start >2.1 date 1970-01-01T00:00:00+00:00 data_interval_end >2.1 date 1970-01-01T00:00:00+00:00 last_scheduling_decision date 1970-01-01T00:00:00+00:00 dag_hash str ... clear_number >=2.8 int 0"},{"location":"api/#task-instances","title":"Task Instances","text":"Get TaskInstances or set TaskInstances
Model: airflow.models.TaskInstance
Table: task_instance
GET /api/starship/task_instances
","text":"Parameters: Args
Field (*=Required) Version Type Example dag_id* str dag_0 limit int 10 offset int 0Response:
{\n \"task_instances\": [\n {\n \"task_instances\": []\n \"run_id\": \"manual__1970-01-01T00:00:00+00:00\",\n \"queued_at\": \"1970-01-01T00:00:00+00:00\",\n \"execution_date\": \"1970-01-01T00:00:00+00:00\",\n \"start_date\": \"1970-01-01T00:00:00+00:00\",\n \"end_date\": \"1970-01-01T00:00:00+00:00\",\n \"state\": \"SUCCESS\",\n \"creating_job_id\": 123,\n \"external_trigger\": true,\n \"run_type\": \"manual\",\n \"conf\": None,\n \"data_interval_start\": \"1970-01-01T00:00:00+00:00\",\n \"data_interval_end\": \"1970-01-01T00:00:00+00:00\",\n \"last_scheduling_decision\": \"1970-01-01T00:00:00+00:00\",\n \"dag_hash\": \"....\"\n },\n ...\n ],\n \"dag_run_count\": 2,\n}\n
"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.task_instances--post-apistarshiptask_instances","title":"POST /api/starship/task_instances
","text":"Parameters: JSON
Field (*=Required) Version Type Example task_instances list[TaskInstance] [ ... ]{\n \"task_instances\": [ ... ]\n}\n
Task Instance:
Field (*=Required) Version Type Example dag_id* str dag_0 run_id* >2.1 str manual__1970-01-01T00:00:00+00:00 task_id* str task_0 map_index* >2.2 int -1 execution_date* <=2.1 date 1970-01-01T00:00:00+00:00 start_date date 1970-01-01T00:00:00+00:00 end_date date 1970-01-01T00:00:00+00:00 duration float 0.0 max_tries int 2 hostname str host unixname str unixname job_id int 123 pool* str default_pool pool_slots int 1 queue str queue priority_weight int 1 operator str BashOperator queued_dttm date 1970-01-01T00:00:00+00:00 queued_by_job_id int 123 pid int 123 external_executor_id int trigger_id >2.1 str trigger_timeout >2.1 date 1970-01-01T00:00:00+00:00 executor_config str"},{"location":"operator/","title":"Operator","text":"The Starship Operator should be used in instances where the Airflow Webserver is unable to correctly host a Plugin.
The AstroMigrationOperator
should be used if migrating from a Google Cloud Composer 1 (with Airflow 2.x) or MWAA v2.0.2 environment. These environments do not support webserver plugins and will require using the AstroMigrationOperator
to migrate data.
Add the following line to your requirements.txt
in your source environment:
astronomer-starship==1.2.1\n
"},{"location":"operator/#usage","title":"Usage","text":"Add the following DAG to your source environment:
dags/astronomer_migration_dag.py
from airflow import DAG\n\nfrom astronomer.starship.operators import AstroMigrationOperator\nfrom datetime import datetime\n\nwith DAG(\n    dag_id=\"astronomer_migration_dag\",\n    start_date=datetime(2020, 8, 15),\n    schedule_interval=None,\n) as dag:\n\n    AstroMigrationOperator(\n        task_id=\"export_meta\",\n        deployment_url='{{ dag_run.conf[\"deployment_url\"] }}',\n        token='{{ dag_run.conf[\"astro_token\"] }}',\n    )\n
Deploy this DAG to your source Airflow environment, configured as described in the Configuration section below
astro_token
: To retrieve an Astronomer token, navigate to cloud.astronomer.io/token and log in using your Astronomer credentialsdeployment_url
: To retrieve a deployment URL, navigate to the Astronomer Airflow deployment that you'd like to migrate to in the Astronomer UI, click Open Airflow
and copy the page URL (excluding /home
on the end of the URL)https://astronomer.astronomer.run/abcdt4ry/home
, you'll use https://astronomer.astronomer.run/abcdt4ry
The config dictionary used when triggering the DAG should be formatted as:
{\n \"deployment_url\": \"your-deployment-url\",\n \"astro_token\": \"your-astro-token\"\n}\n
5. Once the DAG successfully runs, your connections, variables, and environment variables should all be migrated to Astronomer. The AstroMigrationOperator
can be configured as follows:
variables_exclude_list
: List the individual Airflow Variables which you do not want to be migrated. Any Variables not listed will be migrated to the destination Airflow deployment.connection_exclude_list
: List the individual Airflow Connections which you do not want to be migrated. Any Connections not listed will be migrated to the destination Airflow deployment.env_include_list
: List the individual Environment Variables which you do want to be migrated. Only the Environment Variables listed will be migrated to the destination Airflow deployment. None are migrated by default.
AstroMigrationOperator(\n task_id=\"export_meta\",\n deployment_url='{{ dag_run.conf[\"deployment_url\"] }}',\n token='{{ dag_run.conf[\"astro_token\"] }}',\n variables_exclude_list=[\"some_var_1\"],\n connection_exclude_list=[\"some_conn_1\"],\n env_include_list=[\"FOO\", \"BAR\"],\n)\n
You must be an Admin to see Plugins on GCC.
"},{"location":"migration_source/gcc/#installation","title":"Installation","text":"+ Add Package
and put astronomer-starship
under Package name
I'm using Google Cloud Composer 2.x and Airflow 2.x and do not see the Astronomer
menu and/or the Starship Airflow Plugin?
Run the following to ensure you are a privileged user.
gcloud config set project <PROJECT_NAME>\ngcloud composer environments run <ENVIRONMENT_NAME> --location <LOCATION> users add-role -- -e <USER_EMAIL> -r Admin\n
requirements.txt
astronomer-starship
to the file, save it, and re-upload it to S3. Edit
, and pick the newer version of your Requirements File. Next
, then eventually Save
, and then wait for your deployment to restart and dependencies to install