
Here is my DAG file and the BashOperator task:

my_dag = DAG(
    dag_id='my_dag',
    start_date=datetime(year=2017, month=3, day=28),
    schedule_interval='0 1 * * *',
)

my_bash_task = BashOperator(
task_id="my_bash_task",
bash_command=bash_command,
dag=my_dag)

bash_command = "/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh "

Following this answer, I even added a space after the script path to avoid the TemplateNotFound error. But running the task gave me this error: airflow.exceptions.AirflowException: Bash command failed.

The contents of the script are:

#!/bin/bash
DATABASE=db_name
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/location/$FILE db_name
unset PGPASSWORD

However, instead of pointing bash_command at the script file, writing the commands in a multiline string works:

bash_command = """
DATABASE=db_name
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/location/$FILE db_name
unset PGPASSWORD
"""

Because of this, I assume the error is not caused by the bash commands themselves. I even tried replacing #!/bin/bash in the script with #!/bin/sh; that did not work either.

I ran sh db_backup_bash.sh from Terminator and it works fine.
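This difference is a likely clue, sketched here with a hypothetical temp script standing in for db_backup_bash.sh: `sh file.sh` has the shell read the file itself, so no execute permission is needed, while direct execution of the path (which is what happens when the path is the command) requires the execute bit.

```shell
# Hypothetical temp script; mktemp creates it with mode 600 (no execute bit).
script=$(mktemp)
printf '#!/bin/sh\necho ok\n' > "$script"

sh "$script"              # works: sh itself reads the file, no x bit needed
"$script" 2>/dev/null     # "Permission denied": direct execution needs +x
rc_before=$?

chmod +x "$script"        # grant the execute bit
"$script"                 # now runs fine
rc_after=$?
rm -f "$script"
echo "exit code before chmod: $rc_before, after: $rc_after"
```

Exit code 126 from the failed direct execution matches the return code in the error trace below.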

Update: the actual code:

bash_file_location_to_backup_db = '{{"/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"}}'
# bash_file_location_to_backup_db = "/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh "
bash_command = """
DATABASE=ksaprice_scraping
FILE=$DATABASE-`date +%F-%H%M%S`.backup
export PGPASSWORD=password
pg_dump -h localhost -p 5432 -U developer -F c -b -v -f ~/Dropbox/database_backup/ksaprice/$FILE ksaprice_scraping
unset PGPASSWORD
"""

backup_scraped_db_in_dropbox_task = BashOperator(
    task_id="backup_scraped_db_in_dropbox_task",
    # bash_command=bash_command,  # this works fine
    bash_command=bash_file_location_to_backup_db,  # this gives: airflow.exceptions.AirflowException: Bash command failed
    dag=dag_crawl
)

Error trace:

[2017-04-11 20:02:14,905] {bash_operator.py:90} INFO - Output:
2017-04-11 20:02:14,905 | INFO| root : Output:
[2017-04-11 20:02:14,906] {bash_operator.py:94} INFO - /tmp/airflowtmp7FffJ2/backup_scraped_db_in_dropbox_taskQ6IVxm: line 1: /home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh: Permission denied
2017-04-11 20:02:14,906 | INFO| root : /tmp/airflowtmp7FffJ2/backup_scraped_db_in_dropbox_taskQ6IVxm: line 1: /home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh: Permission denied
[2017-04-11 20:02:14,906] {bash_operator.py:97} INFO - Command exited with return code 126
2017-04-11 20:02:14,906 | INFO| root : Command exited with return code 126
[2017-04-11 20:02:14,906] {models.py:1417} ERROR - Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
AirflowException: Bash command failed
2017-04-11 20:02:14,906 | ERROR| root : Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
AirflowException: Bash command failed
[2017-04-11 20:02:14,907] {models.py:1441} INFO - Marking task as FAILED.
2017-04-11 20:02:14,907 | INFO| root : Marking task as FAILED.
[2017-04-11 20:02:14,947] {models.py:1462} ERROR - Bash command failed
2017-04-11 20:02:14,947 | ERROR| root : Bash command failed
Traceback (most recent call last):
  File "/home/jak/my_projects/workflow_env/bin/airflow", line 28, in <module>
    args.func(args)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/bin/cli.py", line 585, in test
    ti.run(ignore_task_deps=True, ignore_ti_state=True, test_mode=True)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/utils/db.py", line 53, in wrapper
    result = func(*args, **kwargs)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/models.py", line 1374, in run
    result = task_copy.execute(context=context)
  File "/home/jak/my_projects/workflow_env/local/lib/python2.7/site-packages/airflow/operators/bash_operator.py", line 100, in execute
    raise AirflowException("Bash command failed")
airflow.exceptions.AirflowException: Bash command failed

2 Answers


I consider this a bug in Airflow; Jinja should not expect .sh files to contain template information in a BashOperator.

I got around it by putting the command into a format Jinja will interpret correctly:

bash_command = '{{"/home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh"}}'


Comments

Following your suggestion results in the error airflow.exceptions.AirflowException: Bash command failed. I think your suggestion has the same effect as this answer: stackoverflow.com/questions/42259298/…
Are you sure you are still getting TemplateNotFound? I have used this pattern a number of times for my BashOperators and it works fine. Post the whole stack trace here if possible, but I think there is now an actual error when you run the bash command.
It's not the TemplateNotFound error; please check the updated code in the question.
'Permission denied' in the error is the problem, I guess. I am new to Linux and don't know how to run a bash script without getting a permission denied error.
Since we fixed the first problem, this should be a new question, or it will confuse people who look at this later. However, try running sudo chmod 755 /home/jak/my_projects/workflow_env/repo_workflow/db_backup_bash.sh in a terminal and see if that solves it.
0

I changed the permissions of the bash file and it worked. Check the permissions, because Airflow needs to be able to access and execute the file.

bash_command="/root/airflow/test.sh "

Hope this helps...
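To verify and apply the permission change, something like the following works (shown on a hypothetical temp file; run the same chmod on the real script path, e.g. /root/airflow/test.sh):

```shell
f=$(mktemp)       # hypothetical stand-in for the real script path
chmod 755 "$f"    # rwxr-xr-x: owner read/write/exec, group and others read/exec
ls -l "$f"        # the mode string should now begin with -rwxr-xr-x
mode=$(ls -l "$f" | cut -c1-10)
rm -f "$f"
```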

