```
bash_command='sh xxxx.sh'
```
- xxxx.sh: whatever the job requires, for example:
  * Linux commands
  * hive -f
  * spark-sql -f
  * spark-submit python | jar
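A minimal sketch of what such a wrapper script might look like. The script name and file paths are placeholders, and the Hive/Spark commands are commented out since they require those tools to be installed:

```shell
#!/bin/bash
# Hypothetical xxxx.sh: wraps the actual job logic so Airflow only
# needs to run `sh xxxx.sh`. Paths below are illustrative placeholders.
set -e

# Run a Hive script:
# hive -f /path/to/etl.sql

# Or run a Spark SQL file:
# spark-sql -f /path/to/report.sql

# Or submit a Spark application:
# spark-submit --master yarn /path/to/job.py

status="job finished"
echo "$status"
```

Keeping the logic in a script (rather than a long inline `bash_command`) makes the command easy to test by hand before scheduling it.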
+ **Submit**
```
python first_bash_operator.py
```
+ **View**

  Check that the new DAG appears in the Airflow web UI under the name defined in the file.

+ **Run**

  Wait for the scheduled run, or trigger the DAG manually from the web UI.
- **Summary**
  - Implement a scheduling test for shell commands
Knowledge Point 08: Dependency Scheduling Test
- **Objective**: implement a dependency scheduling test in AirFlow
- **Implementation**
  - Requirement: use BashOperator to schedule multiple Tasks and build dependencies between them
  - Code
- Create
```
cd /root/airflow/dags
vim second_bash_operator.py
```
- Develop
```
# import
from datetime import timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

# define args
default_args = {
    'owner': 'airflow',
    'email': ['airflow@example.com'],
    'email_on_failure': True,
    'email_on_retry': True,
    'retries': 1,
    'retry_delay': timedelta(minutes=1),
}

# define dag
dag = DAG(
    'second_airflow_dag',
    default_args=default_args,
    description='first airflow task DAG',
    schedule_interval=timedelta(days=1),
    start_date=days_ago(1),
    tags=['itcast_bash'],
)

# define task1
say_hello_task = BashOperator(
    task_id='say_hello_task',  # task_id inferred from the variable name; the source is truncated here
    # ...
)
```
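Airflow declares the dependencies between tasks with the bitshift operators (`a >> b` means "b runs after a"). The sketch below is plain Python with no Airflow dependency, mimicking that pattern to show how `>>` chaining can build a dependency graph; the `Task` class and task names are illustrative, not Airflow's actual implementation:

```python
# Minimal sketch (not Airflow itself) of dependency chaining via '>>'.
class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.downstream = []  # tasks that must run after this one

    def __rshift__(self, other):
        # self >> other: record 'other' as downstream of 'self',
        # and return 'other' so chains like a >> b >> c work left to right
        self.downstream.append(other)
        return other

a = Task('say_hello_task')
b = Task('print_date_task')
c = Task('end_task')

a >> b >> c  # builds the chain a -> b -> c

print([t.task_id for t in a.downstream])  # ['print_date_task']
```

Because `__rshift__` returns its right operand, `a >> b >> c` evaluates as `(a >> b) >> c`, which is why a single line can express a whole linear chain, just as in an Airflow DAG file.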