Airflow for the Enterprise (Nike's Journey)
Submitted by ChandraSekhar Kandavilli (@chandrakandavilli) on Friday, 14 June 2019
Session type: Short talk of 20 mins
Nike has a wide variety of systems in its enterprise landscape, and all of these systems produce data in different shapes and sizes. We are building the Nike data foundation to meet the following goals:
- Deliver trusted, accurate, timely, and consistent information and insights to the business.
- Enable governing data as an asset and sharing data at scale.
- Enable faster and better-informed business decisions, so that we serve our customers and consumers more efficiently.
One key aspect of the enterprise data foundation is consolidated data, which is essential for decision support. It is very important that we collect all the transactional data and run transformation-heavy consolidation jobs that provide the data behind our reporting dashboards. Nike's data landscape is extremely complex, with diverse systems, many integration points, and massive data volumes, so we need a highly reliable and scalable platform to orchestrate all of these data pipelines.
This talk is about how Nike uses the open-source Apache Airflow to run complex ETL/ELT, decision support, and advanced analytics jobs that execute code developed in various technologies (Spark, Hive, Python, shell, AWS commands) and handle complex integrations with collaboration tools like E-mail / Slack / Box.
• What makes the Airflow architecture unique?
• How to leverage custom operators for repeated operations.
• How to effectively use Netflix Genie with cloud services like AWS.
• How to simplify CI/CD for Airflow using Jenkins and code repositories like Git/Bitbucket.
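To give a flavor of the custom-operator topic above: wrapping a repeated step in an operator class lets every DAG reuse one tested implementation. The sketch below is purely illustrative and is not Nike's code; the operator name, table, and bucket are invented, and a stub stands in for `airflow.models.BaseOperator` so the snippet is self-contained (in a real deployment you would subclass the Airflow class directly and call a hook inside `execute`).

```python
# Hypothetical custom operator for a repeated "export table to S3" step.
# All names here are illustrative assumptions, not Nike's actual code.

class BaseOperator:
    """Stand-in for airflow.models.BaseOperator, so this sketch runs
    without an Airflow installation."""
    def __init__(self, task_id, **kwargs):
        self.task_id = task_id


class TableToS3Operator(BaseOperator):
    """Encapsulates a repeated export so every DAG reuses one tested
    implementation instead of copy-pasted shell commands."""

    def __init__(self, table, s3_bucket, s3_prefix, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.s3_bucket = s3_bucket
        self.s3_prefix = s3_prefix

    def execute(self, context):
        # A real operator would invoke a hook (e.g. an S3 hook or an
        # AWS CLI wrapper) here; this sketch just builds the target path.
        return f"s3://{self.s3_bucket}/{self.s3_prefix}/{self.table}"


op = TableToS3Operator(
    task_id="export_orders",
    table="orders",
    s3_bucket="analytics-lake",
    s3_prefix="raw",
)
print(op.execute(context={}))  # s3://analytics-lake/raw/orders
```

In a DAG file, such an operator is instantiated like any built-in one, which is what makes the pattern attractive for operations repeated across many pipelines.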
Chandra is currently a data engineering manager at Nike on the Enterprise Data & Analytics team, leading teams in the Enterprise Data Foundation and Retail Data Products space.