I was trying to install and configure Apache Airflow on a three-node dev Hadoop cluster with the configuration/versions below:
Operating System: Red Hat Enterprise Linux
Follow these steps to install Apache Airflow with MySQL using Anaconda3
1) Install Pre-requisites
yum install gcc gcc-c++ -y
yum install libffi-devel mariadb-devel cyrus-sasl-devel -y
yum install redhat-rpm-config -y
2) Install Anaconda3 (comes with Python 3.7.6)
yum install libXcomposite libXcursor libXi libXtst libXrandr alsa-lib mesa-libEGL libXdamage mesa-libGL libXScrnSaver
wget https://repo.anaconda.com/archive/Anaconda3-2020.02-Linux-x86_64.sh
chmod +x Anaconda3-2020.02-Linux-x86_64.sh
./Anaconda3-2020.02-Linux-x86_64.sh
Answer yes when the installer asks whether to run conda init, then open a new shell (or source ~/.bashrc) so the PATH change takes effect. This makes sure the correct python and pip are used in the subsequent steps.
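Before installing anything with pip, it is worth confirming that the shell really resolves Anaconda's interpreter. A quick sanity check (the default install prefix is ~/anaconda3; adjust if you changed it):

```shell
# Verify that python and pip resolve on PATH after `conda init`.
# If either is missing, re-run `conda init` and restart the shell.
for tool in python pip; do
  command -v "$tool" || echo "$tool not on PATH - re-run 'conda init' and restart the shell"
done
```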
3) Install Apache Airflow
pip install "apache-airflow[mysql,celery]"
You can add other subpackages as required. I have included only the ones needed here: mysql for the MySQL metadata backend and celery for the CeleryExecutor. (The quotes keep the shell from interpreting the square brackets.)
4) Initialize Airflow
export AIRFLOW_HOME=~/airflow
airflow initdb
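Later airflow commands (and eventually the webserver and scheduler) must see the same AIRFLOW_HOME, so it is worth persisting the variable rather than exporting it once. A sketch, assuming a bash login shell:

```shell
# Export for the current shell, and persist for future shells so every
# airflow invocation resolves the same home directory.
export AIRFLOW_HOME=~/airflow
grep -q 'AIRFLOW_HOME' ~/.bashrc 2>/dev/null || \
  echo 'export AIRFLOW_HOME=~/airflow' >> ~/.bashrc
```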
From here on, I have mimicked the steps you followed to configure MySQL Server.
5) Install MySQL Server
rpm -Uvh https://repo.mysql.com/mysql80-community-release-el7-3.noarch.rpm
sed -i 's/enabled=1/enabled=0/' /etc/yum.repos.d/mysql-community.repo
yum --enablerepo=mysql80-community install mysql-server
systemctl start mysqld.service
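MySQL 8 generates a temporary root password on first start, which you need for the initial mysql -u root -p login in the next step. It is written to the server log (this sketch assumes the default EL7 log path /var/log/mysqld.log):

```shell
# Recover the temporary root password MySQL 8 generates on first start.
LOG=/var/log/mysqld.log
if [ -f "$LOG" ]; then
  grep 'temporary password' "$LOG"
else
  echo "no mysqld log at $LOG - is the service running?"
fi
```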
6) Log in to MySQL and configure a database for Airflow
mysql> CREATE DATABASE airflow CHARACTER SET utf8 COLLATE utf8_unicode_ci;
mysql> CREATE USER 'airflow'@'localhost' IDENTIFIED BY 'Airflow123';
mysql> GRANT ALL PRIVILEGES ON *.* TO 'airflow'@'localhost';
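One MySQL server setting is required before pointing Airflow at this database: airflow initdb aborts unless explicit_defaults_for_timestamp is enabled. Add it under [mysqld] and restart the service:

```ini
# /etc/my.cnf (or a drop-in under /etc/my.cnf.d/)
[mysqld]
explicit_defaults_for_timestamp = 1
```

Then run systemctl restart mysqld.service so the setting takes effect.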
7) Update Airflow configuration file (~/airflow/airflow.cfg)
sql_alchemy_conn = mysql://airflow:Airflow123@localhost:3306/airflow
executor = CeleryExecutor
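With executor = CeleryExecutor, Airflow also needs a message broker and a result backend configured in the [celery] section of airflow.cfg. A sketch assuming a Redis broker on localhost (the broker choice and URL are assumptions; RabbitMQ works equally well), with results stored in the same MySQL database created above:

```ini
[celery]
broker_url = redis://localhost:6379/0
result_backend = db+mysql://airflow:Airflow123@localhost:3306/airflow
```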
8) Re-initialize Airflow against MySQL
airflow initdb
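Once the metadata database is initialized, the remaining step is to start the services. A sketch using the Airflow 1.x CLI (matching the airflow initdb commands above); each process normally gets its own terminal or systemd unit:

```shell
# Start the Airflow services if the CLI is available on PATH.
if command -v airflow >/dev/null 2>&1; then
  airflow webserver -p 8080   # web UI on port 8080
  airflow scheduler           # triggers scheduled DAG runs
  airflow worker              # Celery worker, required with CeleryExecutor
else
  echo "airflow CLI not on PATH"
fi
```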