This docker-compose file is derived from the official Apache Airflow compose file, with a few changes that make it work with DataHub out of the box:

- It modifies the port-forwarding to map the Airflow Webserver port 8080 to port 58080 on the localhost (to avoid conflicts with DataHub metadata-service, which is mapped to 8080 by default).
- It also sets up the ENV variables that configure Airflow's Lineage Backend to talk to DataHub (look for the AIRFLOW__LINEAGE__BACKEND and AIRFLOW__LINEAGE__DATAHUB_KWARGS variables; a sketch of these settings follows at the end of this section).

First you need to initialize Airflow in order to create the initial database tables and the initial Airflow user, and then bring the stack up (the commands are sketched below). On a successful start the compose output looks like this:

Container airflow_deploy_airflow-scheduler_1  Started  15.7s
Attaching to airflow-init_1, airflow-scheduler_1, airflow-webserver_1, airflow-worker_1, flower_1, postgres_1, redis_1
airflow-init_1 | DB: ...
... | INFO - * Running on ... (Press CTRL+C to quit)

Next, register a DataHub connection (hook) in Airflow. Find the container running the airflow webserver:

docker ps | grep webserver | cut -d " " -f 1

and run the airflow connections command inside that container to register the datahub_rest connection type and connect it to the datahub-gms host on port 8080 (the full command is sketched below). Note: this is what requires Airflow to be able to connect to the datahub-gms host (the container running the datahub-gms image), and this is why we needed to connect the Airflow containers to the datahub_network using our custom docker-compose file.

Finally, navigate the Airflow UI to find the sample Airflow DAG we just brought in. By default, Airflow loads all DAGs in paused status, so unpause the sample DAG before triggering it.
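For reference, here is a minimal sketch of the lineage-backend ENV settings mentioned above, as they would appear in the docker-compose file. The backend class path and the kwargs values (connection id, capture flags) are assumptions based on the datahub_provider Airflow plugin, not taken from this walkthrough; check the compose file you downloaded for the authoritative values.

environment:
  # Assumed plugin class path; tells Airflow which lineage backend to load
  AIRFLOW__LINEAGE__BACKEND: datahub_provider.lineage.datahub.DatahubLineageBackend
  # Assumed kwargs; datahub_conn_id must match the connection id you register
  # with "airflow connections add" (see the last sketch below)
  AIRFLOW__LINEAGE__DATAHUB_KWARGS: '{"datahub_conn_id": "datahub_rest_default", "capture_ownership_info": true, "capture_tags_info": true}'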
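The initialization and bring-up follow the standard Airflow docker-compose workflow; a minimal sketch, assuming you run it from the directory containing the custom docker-compose file:

# One-off init container: creates the database tables and the initial airflow user
docker compose up airflow-init

# Then start the full stack (webserver, scheduler, worker, flower, postgres, redis)
docker compose up

Once the containers report healthy, the webserver is reachable on localhost:58080 because of the port-forwarding change described above.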
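And here is a sketch of the connection registration step. The connection id datahub_rest_default is a hypothetical name (the walkthrough only fixes the datahub_rest connection type and the datahub-gms:8080 endpoint); use whatever id your lineage-backend kwargs expect.

# Run "airflow connections add" inside the webserver container found above;
# 'datahub_rest_default' is an assumed conn id, adjust it to your config
docker exec -it $(docker ps | grep webserver | cut -d " " -f 1) \
  airflow connections add 'datahub_rest_default' \
  --conn-type 'datahub_rest' \
  --conn-host 'http://datahub-gms:8080'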