Open your IDE or source code editor and select the option to clone the repository. Paste the repository URL into the URL field and submit.
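If you prefer the command line, the same clone can be done with git directly:

$ git clone https://github.com/apache/airflow.git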
This setup requires the "airflow-env" virtual environment to be configured locally.
Airflow is configured to use a SQLite database by default. The configuration can be seen on the local machine in ~/airflow/airflow.cfg under the sql_alchemy_conn key.
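To check the current value without opening the file, a quick grep works (assuming the default config location):

$ grep sql_alchemy_conn ~/airflow/airflow.cfg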
Install the dependency required for the MySQL connection in airflow-env on the local machine:
$ pyenv activate airflow-env
$ pip install PyMySQL
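As a quick sanity check (not part of the official steps), the following should exit silently if the driver installed correctly:

$ python -c "import pymysql"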
Now set sql_alchemy_conn = mysql+pymysql://root:@127.0.0.1:23306/airflow?charset=utf8mb4 in the file ~/airflow/airflow.cfg on the local machine.
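After switching the backend, the metadata database usually has to be initialized against MySQL. A sketch, assuming an Airflow 2.x CLI available inside airflow-env (older 1.10 releases used airflow initdb instead):

$ airflow db init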
Add an interpreter to PyCharm, pointing the interpreter path to ~/.pyenv/versions/airflow-env/bin/python, which is the airflow-env virtual environment created with pyenv earlier. To add an interpreter, go to File -> Settings -> Project: airflow -> Python Interpreter.
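If you are unsure of the exact interpreter path, pyenv can print it while the environment is active (assuming pyenv-virtualenv is installed):

$ pyenv activate airflow-env
$ pyenv which python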
In the PyCharm IDE, open the airflow project. The /files/dags directory of the local machine is mounted into the Docker machine by default when Breeze Airflow is started, so any DAG file present in this directory is picked up automatically by the scheduler running in the Docker machine, and the same can be seen at http://127.0.0.1:28080.
Copy any example DAG present in the /airflow/example_dags directory to /files/dags/.
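For instance, from the root of the airflow source checkout (example_bash_operator.py is one of the stock examples; any other would do):

$ cp airflow/example_dags/example_bash_operator.py files/dags/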
Add a __main__ block at the end of your DAG file to make it runnable. It will run a backfill job:

if __name__ == "__main__":
    dag.clear()
    dag.run()
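For context, a minimal complete DAG file with such a block might look as follows. This is a sketch only: the dag_id, task, and schedule are illustrative, and it assumes an Airflow 2.x release where DAG.run() is still available, as this guide does:

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG; the id and the task are placeholders.
dag = DAG(
    dag_id="example_debug",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # run only when triggered or backfilled
)

hello = BashOperator(task_id="hello", bash_command="echo hello", dag=dag)

if __name__ == "__main__":
    dag.clear()  # remove any previous runs of this DAG
    dag.run()    # run a backfill job in-process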
Add AIRFLOW__CORE__EXECUTOR=DebugExecutor to the environment variables of the Run Configuration.
Click on Add Configuration, then add the Script Path and the Environment Variables to the new Python configuration.
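Alternatively, the executor can be forced from within the DAG file itself, so the Run Configuration needs no extra environment variable. A sketch, placed at the top of the __main__ block (setting the variable this way only affects the current process):

if __name__ == "__main__":
    import os

    # Make this process use the DebugExecutor regardless of airflow.cfg.
    os.environ["AIRFLOW__CORE__EXECUTOR"] = "DebugExecutor"

    dag.clear()
    dag.run()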
Now debug an example DAG and view the entries in tables such as dag_run and xcom in MySQL Workbench.
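Any MySQL client works for this as well; for example, the mysql command-line client (assuming it is installed) pointed at the connection configured above:

$ mysql -h 127.0.0.1 -P 23306 -u root airflow -e "SELECT dag_id, state, execution_date FROM dag_run;"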
Click on the branch symbol in the status bar, give a name to the branch, and check it out.
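The equivalent on the command line (the branch name here is just an example):

$ git checkout -b my-feature-branch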
Follow the Quick start for typical development tasks.