Configuration variable empty #86
What version of Airflow are you using?
Hi @prakshalj0512, sorry for the delay; fixing this hasn't been at the top of my list lately. Now, yes: I opened different issues, but in the end they all relate to this one. If I echo ENABLE_DELETE I get nothing (even with the Variable set in Airflow AND with it hard-coded as in the original DAG), and it's exactly the same for every variable. I removed this. My Airflow version is 1.10.12.
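For what it's worth, a quick way to confirm the Variables are really in the metadata DB is the Airflow 1.10 CLI (a minimal sketch; the key airflow_log_cleanup__max_log_age_in_days is only an assumption, substitute whatever key your DAG actually reads):

# dump every Variable stored in the metadata DB to a JSON file and inspect it
airflow variables -e /tmp/airflow_variables.json
cat /tmp/airflow_variables.json

# or read one key, falling back to a default when it does not exist
# (the key below is hypothetical -- use the one your DAG passes to Variable.get())
airflow variables -g airflow_log_cleanup__max_log_age_in_days -d 'NOT_SET'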
Hi,
I did that already; I had removed the lock file beforehand. If I print the value of the variable, I get nothing, and that's true for every variable I can extract.

*** Reading local file: /data/airflow/logs/clean_airflow_logs/log_cleanup_worker_num_1_dir_0/2021-04-07T00:00:00+00:00/22.log
[2021-04-08 11:20:27,696] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: clean_airflow_logs.log_cleanup_worker_num_1_dir_0 2021-04-07T00:00:00+00:00 [queued]>
[2021-04-08 11:20:27,756] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: clean_airflow_logs.log_cleanup_worker_num_1_dir_0 2021-04-07T00:00:00+00:00 [queued]>
[2021-04-08 11:20:27,756] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2021-04-08 11:20:27,756] {taskinstance.py:881} INFO - Starting attempt 22 of 23
[2021-04-08 11:20:27,757] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2021-04-08 11:20:27,777] {taskinstance.py:901} INFO - Executing <Task(BashOperator): log_cleanup_worker_num_1_dir_0> on 2021-04-07T00:00:00+00:00
[2021-04-08 11:20:27,782] {standard_task_runner.py:54} INFO - Started process 1234 to run task
[2021-04-08 11:20:27,837] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', 'clean_airflow_logs', 'log_cleanup_worker_num_1_dir_0', '2021-04-07T00:00:00+00:00', '--job_id', '26180', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/clean_airflow_logs.py', '--cfg_path', '/tmp/tmpadn59i2f']
[2021-04-08 11:20:27,838] {standard_task_runner.py:78} INFO - Job 26180: Subtask log_cleanup_worker_num_1_dir_0
[2021-04-08 11:20:27,928] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: clean_airflow_logs.log_cleanup_worker_num_1_dir_0 2021-04-07T00:00:00+00:00 [running]> sea-cloud-airflow-applications.europe-west1-c.c.lisea-mesea-sandbox-272216.internal
[2021-04-08 11:20:27,997] {bash_operator.py:113} INFO - Tmp dir root location:
/tmp
[2021-04-08 11:20:28,007] {bash_operator.py:136} INFO - Temporary script location: /tmp/airflowtmpzu5t5bna/log_cleanup_worker_num_1_dir_0r1i1mn12
[2021-04-08 11:20:28,007] {bash_operator.py:146} INFO - Running command:
echo "Getting Configurations..."
BASE_LOG_FOLDER="/data/airflow/logs"
WORKER_SLEEP_TIME="3"
sleep s
MAX_LOG_AGE_IN_DAYS=""
if [ "" == "" ]; then
echo "maxLogAgeInDays conf variable isn't included. Using Default '30'."
MAX_LOG_AGE_IN_DAYS='30'
fi
ENABLE_DELETE=true
echo "Finished Getting Configurations"
echo ""
echo "Configurations:"
echo "BASE_LOG_FOLDER: ''"
echo "MAX_LOG_AGE_IN_DAYS: ''"
echo "ENABLE_DELETE: ''"
cleanup() {
echo "Executing Find Statement: $1"
FILES_MARKED_FOR_DELETE=`eval $1`
echo "Process will be Deleting the following File(s)/Directory(s):"
echo ""
echo "Process will be Deleting `echo "" | grep -v '^$' | wc -l` File(s)/Directory(s)" # "grep -v '^$'" - removes empty lines.
# "wc -l" - Counts the number of lines
echo ""
if [ "" == "true" ];
then
if [ "" != "" ];
then
echo "Executing Delete Statement: $2"
eval $2
DELETE_STMT_EXIT_CODE=$?
if [ "" != "0" ]; then
echo "Delete process failed with exit code ''"
echo "Removing lock file..."
rm -f /tmp/airflow_log_cleanup_worker.lock
if [ "" != "0" ]; then
echo "Error removing the lock file. Check file permissions. To re-run the DAG, ensure that the lock file has been deleted (/tmp/airflow_log_cleanup_worker.lock)."
exit
fi
exit
fi
else
echo "WARN: No File(s)/Directory(s) to Delete"
fi
else
echo "WARN: You're opted to skip deleting the File(s)/Directory(s)!!!"
fi
}
if [ ! -f /tmp/airflow_log_cleanup_worker.lock ]; then
echo "Lock file not found on this node! Creating it to prevent collisions..."
touch /tmp/airflow_log_cleanup_worker.lock
CREATE_LOCK_FILE_EXIT_CODE=$?
if [ "" != "0" ]; then
echo "Error creating the lock file. Check if the airflow user can create files under tmp directory. Exiting..."
exit
fi
echo ""
echo "Running Cleanup Process..."
FIND_STATEMENT="find /*/* -type f -mtime "
DELETE_STMT=" -exec rm -f {} \;"
cleanup "" ""
CLEANUP_EXIT_CODE=$?
FIND_STATEMENT="find /*/* -type d -empty"
DELETE_STMT=" -prune -exec rm -rf {} \;"
cleanup "" ""
CLEANUP_EXIT_CODE=$?
FIND_STATEMENT="find /* -type d -empty"
DELETE_STMT=" -prune -exec rm -rf {} \;"
cleanup "" ""
CLEANUP_EXIT_CODE=$?
echo "Finished Running Cleanup Process"
echo "Deleting lock file..."
rm -f /tmp/airflow_log_cleanup_worker.lock
REMOVE_LOCK_FILE_EXIT_CODE=$?
if [ "" != "0" ]; then
echo "Error removing the lock file. Check file permissions. To re-run the DAG, ensure that the lock file has been deleted (/tmp/airflow_log_cleanup_worker.lock)."
exit
fi
else
echo "Another task is already deleting logs on this worker node. Skipping it!"
echo "If you believe you're receiving this message in error, kindly check if /tmp/airflow_log_cleanup_worker.lock exists and delete it."
exit 0
fi
[2021-04-08 11:20:28,030] {bash_operator.py:153} INFO - Output:
[2021-04-08 11:20:28,032] {bash_operator.py:157} INFO - Getting Configurations...
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - sleep: invalid time interval ‘s’
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - Try 'sleep --help' for more information.
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - maxLogAgeInDays conf variable isn't included. Using Default '30'.
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - Finished Getting Configurations
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO -
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - Configurations:
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - BASE_LOG_FOLDER: ''
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - MAX_LOG_AGE_IN_DAYS: ''
[2021-04-08 11:20:28,048] {bash_operator.py:157} INFO - ENABLE_DELETE: ''
[2021-04-08 11:20:28,049] {bash_operator.py:157} INFO - Lock file not found on this node! Creating it to prevent collisions...
[2021-04-08 11:20:28,068] {bash_operator.py:157} INFO - Error creating the lock file. Check if the airflow user can create files under tmp directory. Exiting...
[2021-04-08 11:20:28,068] {bash_operator.py:161} INFO - Command exited with return code 0
[2021-04-08 11:20:28,099] {taskinstance.py:1070} INFO - Marking task as SUCCESS.dag_id=clean_airflow_logs, task_id=log_cleanup_worker_num_1_dir_0, execution_date=20210407T000000, start_date=20210408T092027, end_date=20210408T092028
[2021-04-08 11:20:32,641] {local_task_job.py:102} INFO - Task exited with return code 0
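You can also ask Airflow to print the fully rendered bash_command for this exact task without triggering a run (a sketch for the 1.10 CLI; the dag_id, task_id, and execution date are taken from the log above):

# print every templated field of the task after Jinja rendering,
# so you can see which values end up empty before the task even runs
airflow render clean_airflow_logs log_cleanup_worker_num_1_dir_0 2021-04-07T00:00:00+00:00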
Sorry, it wasn't the right log ending.

[2021-04-08 13:14:30,840] {bash_operator.py:157} INFO - Lock file not found on this node! Creating it to prevent collisions...
[2021-04-08 13:14:30,842] {bash_operator.py:157} INFO - Error creating the lock file. Check if the airflow user can create files under tmp directory. Exiting...
[2021-04-08 13:14:30,842] {bash_operator.py:161} INFO - Command exited with return code 0
And I can assure you that the file was created.
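A quick way to double-check that from the worker node itself (a small sketch; the path comes from the rendered command above, and the airflow service user name is an assumption):

# confirm the lock file really exists and see which user owns it
ls -l /tmp/airflow_log_cleanup_worker.lock

# confirm the user running the tasks can actually create files under /tmp
# (replace 'airflow' with whatever user your workers run as)
sudo -u airflow touch /tmp/airflow_log_cleanup_worker.lock.test && echo "write OK"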
I followed the README, created the Variables, and adjusted a few things in the code.
At the end I got:
Where I expected:
Also, the script doesn't run, since I got the "error":
Also, in this case the script should
exit 1
since there was an issue. Here my DAG and task say that everything ran fine.
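For reference, the BashOperator only fails a task when the bash_command returns a non-zero status, which is why the run above is still marked SUCCESS. A minimal sketch of what the error branch could look like with an explicit exit 1 (an illustration, not the project's actual script; the names are reused from the rendered command above):

touch /tmp/airflow_log_cleanup_worker.lock
CREATE_LOCK_FILE_EXIT_CODE=$?
if [ "${CREATE_LOCK_FILE_EXIT_CODE}" != "0" ]; then
    echo "Error creating the lock file. Check if the airflow user can create files under /tmp. Exiting..."
    # a bare 'exit' propagates the status of the last command (the echo above, i.e. 0),
    # so the task still ends as SUCCESS; exit 1 makes the BashOperator fail the task
    exit 1
fi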