"}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Home","text":"

Astronomer Starship can send your Airflow workloads to new places!

"},{"location":"#what-is-it","title":"What is it?","text":"

Starship is a utility to migrate Airflow metadata such as Airflow Variables, Connections, Environment Variables, Pools, and DAG History between two Airflow instances.

"},{"location":"#installation","title":"Installation","text":"
```shell
pip install astronomer-starship
```
"},{"location":"#usage","title":"Usage","text":"
  1. Create a Workspace in Astro or Software to hold Astro Deployments
  2. Create an Astro Deployment matching the source Airflow deployment configuration as closely as possible
  3. Run `astro dev init` with the Astro CLI to create an Astro Project locally in your terminal
  4. Add any DAGs to the `/dags` folder in the Astro Project
  5. Complete any additional setup required to convert your existing Airflow deployment to an Astro Project
  6. Install Starship (and any additional Python dependencies) in the Astro Project
  7. Install Starship in your existing Airflow Deployment
  8. Deploy the Astro Project to the Astro Deployment with `astro deploy`
  9. In the Airflow UI of the source Airflow deployment, navigate to the new Astronomer menu and select the Migration Tool 🚀 option
  10. Follow the UI prompts to migrate, or if needed, look at the instructions to use the Operator
"},{"location":"#compatability","title":"Compatability","text":"Source Compatible Airflow 1 \u274c GCC 1 - Airflow 2.x Operator GCC 2 - Airflow 2.x \u2705 MWAA v2.0.2 Operator MWAA \u2265 v2.2.2 \u2705 OSS Airflow VM \u2705 Astronomer Products \u2705"},{"location":"#faq","title":"FAQ","text":"

You must have `AIRFLOW__CORE__TEST_CONNECTION` set. See notes here.

"},{"location":"#security-notice","title":"Security Notice","text":"

This project is an Airflow Plugin that adds custom API routes. Ensure your environments are correctly secured.

**Artwork**: Starship logo by Lorenzo, used with permission from The Noun Project under Creative Commons.

"},{"location":"api/","title":"API","text":""},{"location":"api/#airflow-version","title":"Airflow Version","text":"

Returns the version of Airflow that the Starship API is connected to.

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.airflow_version--get-apistarshipairflow_version","title":"GET /api/starship/airflow_version","text":"

Parameters: None

Response:

```
OK
```
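These routes are plain HTTP endpoints served by the Airflow webserver, so any HTTP client works. A minimal sketch in Python — the base URL is a placeholder, and any authentication your webserver enforces (session cookies, proxy auth, etc.) is omitted:

```python
import requests

BASE_URL = "http://localhost:8080"  # placeholder: your Airflow webserver

# The version endpoint returns plain text rather than JSON.
resp = requests.get(f"{BASE_URL}/api/starship/airflow_version")
resp.raise_for_status()
print(resp.text)  # e.g. "2.7.3"
```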

"},{"location":"api/#health","title":"Health","text":"

Returns the health of the Starship API.

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.health--get-apistarshiphealth","title":"GET /api/starship/health","text":"

Parameters: None

Response:

```
OK
```

"},{"location":"api/#environment-variables","title":"Environment Variables","text":"

Get the Environment Variables, which may be used to set Airflow Connections, Variables, or Configurations.

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.env_vars--get-apistarshipenv_vars","title":"GET /api/starship/env_vars","text":"

Parameters: None

Response:

{\n    \"FOO\": \"bar\",\n    \"AIRFLOW__CORE__SQL_ALCHEMY_CONN\": \"sqlite:////usr/local/airflow/airflow.db\",\n    ...\n}\n

"},{"location":"api/#variable","title":"Variable","text":"

Get Variables or set a Variable

Model: airflow.models.Variable

Table: variable

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.variables--get-apistarshipvariable","title":"GET /api/starship/variable","text":"

Parameters: None

Response:

```json
[
    {
        "key": "key",
        "val": "val",
        "description": "My Var"
    },
    ...
]
```

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.variables--post-apistarshipvariable","title":"POST /api/starship/variable","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| key* | | str | key |
| val* | | str | val |
| description | | str | My Var |

Response: List of Variables, as GET Response
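For example, a sketch of creating the Variable from the table above (placeholder URL, auth omitted):

```python
import requests

BASE_URL = "http://localhost:8080"  # placeholder: target Airflow webserver

resp = requests.post(
    f"{BASE_URL}/api/starship/variable",
    json={"key": "key", "val": "val", "description": "My Var"},
)
resp.raise_for_status()
print(resp.json())  # full list of Variables, same shape as the GET response
```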

"},{"location":"api/#pools","title":"Pools","text":"

Get Pools or set a Pool

Model: airflow.models.Pool

Table: slot_pool

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.pools--get-apistarshippools","title":"GET /api/starship/pools","text":"

Parameters: None

Response:

```json
[
    {
        "name": "my_pool",
        "slots": 5,
        "description": "My Pool"
    },
    ...
]
```

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.pools--post-apistarshippools","title":"POST /api/starship/pools","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| name* | | str | my_pool |
| slots* | | int | 5 |
| description | | str | My Pool |
| include_deferred* | >=2.7 | bool | True |

Response: List of Pools, as GET Response
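The Version column marks fields that only exist on some Airflow versions: `include_deferred` is required on Airflow >= 2.7 and must be left out before that. A sketch of a client handling this (placeholder URL; the version parsing is illustrative):

```python
import requests

BASE_URL = "http://localhost:8080"  # placeholder: target Airflow webserver

payload = {"name": "my_pool", "slots": 5, "description": "My Pool"}

# include_deferred exists (and is required) only on Airflow >= 2.7.
version = requests.get(f"{BASE_URL}/api/starship/airflow_version").text.strip()
if tuple(int(part) for part in version.split(".")[:2]) >= (2, 7):
    payload["include_deferred"] = True

requests.post(f"{BASE_URL}/api/starship/pools", json=payload).raise_for_status()
```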

"},{"location":"api/#connections","title":"Connections","text":"

Get Connections or set a Connection

Model: airflow.models.Connection

Table: connection

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.connections--get-apistarshipconnection","title":"GET /api/starship/connection","text":"

Parameters: None

Response:

```json
[
    {
        "conn_id": "my_conn",
        "conn_type": "http",
        "host": "localhost",
        "port": 1234,
        "schema": "https",
        "login": "user",
        "password": "foobar",
        "extra": "{}",
        "description": "My Conn"
    },
    ...
]
```

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.connections--post-apistarshipconnection","title":"POST /api/starship/connection","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| conn_id* | | str | my_conn |
| conn_type* | | str | http |
| host | | str | localhost |
| port | | int | 1234 |
| schema | | str | https |
| login | | str | user |
| password | | str | ** |
| extra | | dict | {} |
| description | | str | My Conn |

Response: List of Connections, as GET Response

"},{"location":"api/#dags","title":"DAGs","text":"

Get DAGs or pause/unpause a DAG

Model: airflow.models.DagModel

Table: dag

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dags--get-apistarshipdags","title":"GET /api/starship/dags","text":"

Parameters: None

Response:

```json
[
    {
        "dag_id": "dag_0",
        "schedule_interval": "0 0 * * *",
        "is_paused": true,
        "fileloc": "/usr/local/airflow/dags/dag_0.py",
        "description": "My Dag",
        "owners": "user",
        "tags": ["tag1", "tag2"],
        "dag_run_count": 2
    },
    ...
]
```

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dags--patch-apistarshipdags","title":"PATCH /api/starship/dags","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_id* | | str | dag_0 |
| is_paused* | | bool | true |
{\n    \"dag_id\": \"dag_0\",\n    \"is_paused\": true\n}\n
"},{"location":"api/#dag-runs","title":"DAG Runs","text":"

Get DAG Runs or set DAG Runs

Model: airflow.models.DagRun

Table: dag_run

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dag_runs--get-apistarshipdag_runs","title":"GET /api/starship/dag_runs","text":"

Parameters: Args

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_id* | | str | dag_0 |
| limit | | int | 10 |
| offset | | int | 0 |

Response:

```json
[
    {
        "dag_id": "dag_0",
        "queued_at": "1970-01-01T00:00:00+00:00",
        "execution_date": "1970-01-01T00:00:00+00:00",
        "start_date": "1970-01-01T00:00:00+00:00",
        "end_date": "1970-01-01T00:00:00+00:00",
        "state": "SUCCESS",
        "run_id": "manual__1970-01-01T00:00:00+00:00",
        "creating_job_id": 123,
        "external_trigger": true,
        "run_type": "manual",
        "conf": null,
        "data_interval_start": "1970-01-01T00:00:00+00:00",
        "data_interval_end": "1970-01-01T00:00:00+00:00",
        "last_scheduling_decision": "1970-01-01T00:00:00+00:00",
        "dag_hash": "...."
    },
    ...
]
```

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.dag_runs--post-apistarshipdag_runs","title":"POST /api/starship/dag_runs","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_runs | | list[DagRun] | [ ... ] |
{\n    \"dag_runs\": [ ... ]\n}\n

DAG Run:

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_id* | | str | dag_0 |
| queued_at | | date | 1970-01-01T00:00:00+00:00 |
| execution_date* | | date | 1970-01-01T00:00:00+00:00 |
| start_date | | date | 1970-01-01T00:00:00+00:00 |
| end_date | | date | 1970-01-01T00:00:00+00:00 |
| state | | str | SUCCESS |
| run_id* | | str | manual__1970-01-01T00:00:00+00:00 |
| creating_job_id | | int | 123 |
| external_trigger | | bool | true |
| run_type* | | str | manual |
| conf | | dict | {} |
| data_interval_start | >2.1 | date | 1970-01-01T00:00:00+00:00 |
| data_interval_end | >2.1 | date | 1970-01-01T00:00:00+00:00 |
| last_scheduling_decision | | date | 1970-01-01T00:00:00+00:00 |
| dag_hash | | str | ... |
| clear_number | >=2.8 | int | 0 |
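Putting the two sides together, a sketch of copying one DAG's run history from a source to a target deployment, paging with `limit`/`offset` (both URLs are placeholders, auth omitted):

```python
import requests

SOURCE_URL = "http://source-airflow.example.com"  # placeholder
TARGET_URL = "http://target-airflow.example.com"  # placeholder

offset = 0
while True:
    batch = requests.get(
        f"{SOURCE_URL}/api/starship/dag_runs",
        params={"dag_id": "dag_0", "limit": 10, "offset": offset},
    ).json()
    if not batch:
        break  # no more runs to copy
    requests.post(
        f"{TARGET_URL}/api/starship/dag_runs", json={"dag_runs": batch}
    ).raise_for_status()
    offset += len(batch)
```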

## Task Instances

Get Task Instances or set Task Instances

Model: airflow.models.TaskInstance

Table: task_instance

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.task_instances--get-apistarshiptask_instances","title":"GET /api/starship/task_instances","text":"

Parameters: Args

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_id* | | str | dag_0 |
| limit | | int | 10 |
| offset | | int | 0 |

Response:

{\n    \"task_instances\": [\n        {\n            \"task_instances\": []\n            \"run_id\": \"manual__1970-01-01T00:00:00+00:00\",\n            \"queued_at\": \"1970-01-01T00:00:00+00:00\",\n            \"execution_date\": \"1970-01-01T00:00:00+00:00\",\n            \"start_date\": \"1970-01-01T00:00:00+00:00\",\n            \"end_date\": \"1970-01-01T00:00:00+00:00\",\n            \"state\": \"SUCCESS\",\n            \"creating_job_id\": 123,\n            \"external_trigger\": true,\n            \"run_type\": \"manual\",\n            \"conf\": None,\n            \"data_interval_start\": \"1970-01-01T00:00:00+00:00\",\n            \"data_interval_end\": \"1970-01-01T00:00:00+00:00\",\n            \"last_scheduling_decision\": \"1970-01-01T00:00:00+00:00\",\n            \"dag_hash\": \"....\"\n        },\n        ...\n    ],\n    \"dag_run_count\": 2,\n}\n

"},{"location":"api/#astronomer_starship.starship_api.StarshipApi.task_instances--post-apistarshiptask_instances","title":"POST /api/starship/task_instances","text":"

Parameters: JSON

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| task_instances | | list[TaskInstance] | [ ... ] |
{\n    \"task_instances\": [ ... ]\n}\n

Task Instance:

| Field (*=Required) | Version | Type | Example |
| --- | --- | --- | --- |
| dag_id* | | str | dag_0 |
| run_id* | >2.1 | str | manual__1970-01-01T00:00:00+00:00 |
| task_id* | | str | task_0 |
| map_index* | >2.2 | int | -1 |
| execution_date* | <=2.1 | date | 1970-01-01T00:00:00+00:00 |
| start_date | | date | 1970-01-01T00:00:00+00:00 |
| end_date | | date | 1970-01-01T00:00:00+00:00 |
| duration | | float | 0.0 |
| max_tries | | int | 2 |
| hostname | | str | host |
| unixname | | str | unixname |
| job_id | | int | 123 |
| pool* | | str | default_pool |
| pool_slots | | int | 1 |
| queue | | str | queue |
| priority_weight | | int | 1 |
| operator | | str | BashOperator |
| queued_dttm | | date | 1970-01-01T00:00:00+00:00 |
| queued_by_job_id | | int | 123 |
| pid | | int | 123 |
| external_executor_id | | int | |
| trigger_id | >2.1 | str | |
| trigger_timeout | >2.1 | date | 1970-01-01T00:00:00+00:00 |
| executor_config | | str | |
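Task instances pair with the DAG runs copied above; a sketch of moving one page of them (same placeholder URLs, response shape as in the GET example):

```python
import requests

SOURCE_URL = "http://source-airflow.example.com"  # placeholder
TARGET_URL = "http://target-airflow.example.com"  # placeholder

# Pull one page of task instances from the source, then push them to the target.
page = requests.get(
    f"{SOURCE_URL}/api/starship/task_instances",
    params={"dag_id": "dag_0", "limit": 10, "offset": 0},
).json()
requests.post(
    f"{TARGET_URL}/api/starship/task_instances",
    json={"task_instances": page["task_instances"]},
).raise_for_status()
```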

# Operator

The Starship Operator should be used in instances where the Airflow Webserver is unable to correctly host a Plugin.

The AstroMigrationOperator should be used if migrating from a Google Cloud Composer 1 (with Airflow 2.x) or MWAA v2.0.2 environment. These environments do not support webserver plugins and will require using the AstroMigrationOperator to migrate data.

"},{"location":"operator/#installation","title":"Installation","text":"

Add the following line to your requirements.txt in your source environment:

```
astronomer-starship==1.2.1
```
"},{"location":"operator/#usage","title":"Usage","text":"
  1. Add the following DAG to your source environment:

    `dags/astronomer_migration_dag.py`:

    ```python
    from airflow import DAG
    from astronomer.starship.operators import AstroMigrationOperator
    from datetime import datetime

    with DAG(
        dag_id="astronomer_migration_dag",
        start_date=datetime(2020, 8, 15),
        schedule_interval=None,
    ) as dag:
        AstroMigrationOperator(
            task_id="export_meta",
            deployment_url='{{ dag_run.conf["deployment_url"] }}',
            token='{{ dag_run.conf["astro_token"] }}',
        )
    ```
  2. Deploy this DAG to your source Airflow environment, configured as described in the Configuration section below

  3. Once the DAG is available in the Airflow UI, click the "Trigger DAG" button, then click "Trigger DAG w/ config", and input the following in the configuration dictionary:
     - `astro_token`: To retrieve an Astronomer token, navigate to cloud.astronomer.io/token and log in using your Astronomer credentials
     - `deployment_url`: To retrieve a deployment URL, navigate to the Astronomer Airflow deployment that you'd like to migrate to in the Astronomer UI, click Open Airflow, and copy the page URL (excluding /home on the end of the URL)
       - For example, if your deployment URL is https://astronomer.astronomer.run/abcdt4ry/home, you'll use https://astronomer.astronomer.run/abcdt4ry
  4. The config dictionary used when triggering the DAG should be formatted as:

    {\n \"deployment_url\": \"your-deployment-url\",\n \"astro_token\": \"your-astro-token\"\n}\n
  5. Once the DAG successfully runs, your connections, variables, and environment variables should all be migrated to Astronomer.

"},{"location":"operator/#configuration","title":"Configuration","text":"

The AstroMigrationOperator can be configured as follows:

"},{"location":"migration_source/gcc/","title":"Google Cloud Composer","text":""},{"location":"migration_source/gcc/#compatability","title":"Compatability","text":"Source Compatible Airflow 1 \u274c GCC 1 - Airflow 2.x Operator GCC 2 - Airflow 2.x \u2705"},{"location":"migration_source/gcc/#notes","title":"Notes","text":"

You must be an Admin to see Plugins on GCC.

"},{"location":"migration_source/gcc/#installation","title":"Installation","text":"
  1. Navigate to your Environments
  2. Go to PyPI Packages
  3. Click + Add Package and put `astronomer-starship` under Package name
"},{"location":"migration_source/gcc/#faq","title":"FAQ","text":""},{"location":"migration_source/mwaa/","title":"(AWS) Managed Apache Airflow","text":""},{"location":"migration_source/mwaa/#compatability","title":"Compatability","text":"Source Compatible Airflow 1 \u274c MWAA v2.0.2 Operator MWAA \u2265 v2.2.2 \u2705"},{"location":"migration_source/mwaa/#installation","title":"Installation","text":"
  1. Navigate to your Environments
  2. Download your existing requirements.txt
  3. Add astronomer-starship to the file, save it, and re-upload it to S3
  4. Click Edit, and pick the newer version of your Requirements File
  5. Click Next, then eventually Save, and then wait for your deployment to restart and dependencies to install
"}]} \ No newline at end of file diff --git a/sitemap.xml b/sitemap.xml index 0216d1d..52bd2cf 100644 --- a/sitemap.xml +++ b/sitemap.xml @@ -2,27 +2,27 @@ https://astronomer.github.io/starship/ - 2024-03-19 + 2024-03-21 daily https://astronomer.github.io/starship/api/ - 2024-03-19 + 2024-03-21 daily https://astronomer.github.io/starship/operator/ - 2024-03-19 + 2024-03-21 daily https://astronomer.github.io/starship/migration_source/gcc/ - 2024-03-19 + 2024-03-21 daily https://astronomer.github.io/starship/migration_source/mwaa/ - 2024-03-19 + 2024-03-21 daily \ No newline at end of file diff --git a/sitemap.xml.gz b/sitemap.xml.gz index 0d7cecb..f27bde8 100644 Binary files a/sitemap.xml.gz and b/sitemap.xml.gz differ