Senior Data Engineer - Python Airflow

Position Type: Full time
Type of Hire: Experienced (relevant combo of work and education)
Education Desired: Bachelor of Computer Science
Travel Percentage: 5 - 10%
Job Description
Are you ready to unleash your full potential? We’re looking for people who are passionate about payments to chart Worldpay’s path to being the largest and most-loved payments company in the world.
About the team:
The Worldpay Cloud Data Platform team provides Worldpay's big data capabilities, including large-scale data processing, analytics, fraud platforms, machine learning engineering, stream processing, and data insights. The team maintains and advances cloud data platforms and services to build the data products that power our business services and customers.
About the role:
We are seeking a talented and experienced data engineer proficient in Python, Airflow, and AWS to join our dynamic team. The ideal candidate will have at least 6 years of hands-on experience in data engineering practices, cloud infrastructure management, automation, and CI/CD pipeline development. This role involves collaborating with development, operations, and quality assurance teams to streamline and optimize our software delivery processes.
What you will be doing:
Develop and implement strategies for data engineering initiatives using Python, AWS, Airflow, and Snowflake technologies
Monitor trends in the data engineering industry and stay up to date on current technologies
Collaborate with the product team to develop solutions that meet their goals and objectives
Act as a subject matter expert for Apache Airflow and provide technical guidance to team members
Install, configure, and maintain Astronomer Airflow environments
Build complex data engineering pipelines using Python and Airflow
Design, develop, and maintain scalable workflows and orchestration systems using Astronomer
Create and manage Directed Acyclic Graphs (DAGs) to automate data pipelines and processes (a minimal DAG sketch follows this list)
Leverage AWS Glue, Step Functions, and other services for orchestrating data workflows
Develop custom operators and plugins for Astronomer to extend its capabilities (see the operator sketch after this list)
Integrate code with the defined CI/CD framework and the AWS services required to build secure data pipelines
Manage user access and permissions, ensuring data security and compliance with company policies
Implement and monitor security controls, including encryption, authentication, and network security
Conduct regular security audits and vulnerability assessments
Manage data ingestion and ETL processes
Automate routine tasks and processes using scripting and automation tools
Maintain comprehensive documentation for configurations, procedures, and troubleshooting steps
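
For illustration, here is a minimal sketch of what an Airflow DAG for such a pipeline can look like, using the Airflow 2.x TaskFlow API. The pipeline, task names, and values are hypothetical placeholders, not Worldpay's actual workflows:

    from datetime import datetime

    from airflow.decorators import dag, task

    # "schedule" requires Airflow 2.4+; older versions use schedule_interval.
    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def daily_orders_pipeline():
        @task
        def extract() -> list:
            # Placeholder: a real task might read from S3, Kafka, or an API.
            return [{"order_id": 1, "amount": 42.0}]

        @task
        def transform(orders: list) -> float:
            # Aggregate the extracted records.
            return sum(o["amount"] for o in orders)

        @task
        def load(total: float) -> None:
            # Placeholder: a real task might write to Snowflake.
            print(f"Daily total: {total}")

        load(transform(extract()))

    daily_orders_pipeline()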
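And a minimal sketch of a custom operator, as mentioned above. The row-count check is an invented example of extending Airflow with a BaseOperator subclass, not an operator shipped by the Astronomer platform:

    from airflow.models import BaseOperator

    class RowCountCheckOperator(BaseOperator):
        """Fail the task if a source reports fewer rows than expected."""

        def __init__(self, min_rows: int, **kwargs):
            super().__init__(**kwargs)
            self.min_rows = min_rows

        def execute(self, context):
            rows = self._fetch_row_count()
            if rows < self.min_rows:
                raise ValueError(
                    f"Expected at least {self.min_rows} rows, got {rows}"
                )
            return rows

        def _fetch_row_count(self) -> int:
            # Placeholder: a real implementation would query Snowflake,
            # S3, or another source via a provider hook.
            return 100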
What you bring:
5+ years in a pivotal software/data engineering role with deep exposure to modern data stacks, particularly Snowflake, Airflow, dbt, and AWS data services
Proficiency in cloud platforms such as AWS, Azure, or Google Cloud
Experience with PySpark/Hadoop and/or AWS Glue ETL and/or Databricks with Python is preferred. Must have experience with the ETL development life cycle and ETL pipeline best practices, plus thorough work experience building data warehouses using a combination of Python, Snowflake, and AWS services (a small PySpark sketch follows this list)
Data engineering experience with AWS services (S3, Lambda, Glue, Lake Formation, EMR), Kafka, streaming, and Databricks is highly preferred
Experience in Astronomer and/or Airflow
Understanding of data pipelines and modern approaches to automating them using cloud-based testing, with the ability to clearly document implementations so others can easily understand the requirements, implementation, and test conditions
Serving as the Databricks account owner, including security and privacy setup, marketplace plugins and integration with other tools
Experience with Unity Catalog migration, workspaces, and audit logs
Strong experience with Amazon Web Services (AWS) accounts and high-level usage monitoring
Proficiency in scripting languages such as Python, Bash, or PowerShell
Experience with CI/CD tools like Jenkins, GitLab CI, CircleCI, or similar
Familiarity with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack)
Excellent problem-solving skills and attention to detail
Strong communication skills and the ability to work collaboratively in a team environment
Experience with AWS CLI and Networking
Experience with architecting and maintaining high availability production systems
Experience with developing monitoring architecture and implementing monitoring agents, dashboards, escalations, and alerts
Knowledge of security controls for the public cloud (encryption of data in motion/rest and key management)
Demonstrated knowledge and hands-on experience with AWS alerting/monitoring tools (a CloudWatch sketch follows this list)
Experience with infrastructure as code (IaC) tools such as Terraform
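
As a rough illustration of the PySpark/S3 experience listed above, a minimal ETL sketch might look like the following. Bucket names, paths, and columns are invented placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Read raw CSVs from a hypothetical landing bucket (on EMR the s3://
    # scheme works; open-source Spark typically uses s3a://).
    raw = spark.read.csv("s3://example-bucket/raw/orders/",
                         header=True, inferSchema=True)

    # Basic cleanup: drop rows missing a key, normalize a numeric column.
    clean = (raw.dropna(subset=["order_id"])
                .withColumn("amount", F.col("amount").cast("double")))

    # Write partitioned Parquet to a hypothetical curated zone.
    (clean.write.mode("overwrite")
          .partitionBy("order_date")
          .parquet("s3://example-bucket/curated/orders/"))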
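Similarly, a small boto3 sketch of the alerting/monitoring item: publishing a custom metric from a pipeline run and alarming when it falls below an expected volume. The namespace, metric name, thresholds, and SNS topic ARN are hypothetical:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Publish a custom metric from a pipeline run (hypothetical values).
    cloudwatch.put_metric_data(
        Namespace="DataPlatform",
        MetricData=[{"MetricName": "RowsLoaded", "Value": 1234.0,
                     "Unit": "Count"}],
    )

    # Alarm when an hourly load is smaller than expected; the SNS topic
    # ARN stands in for a real escalation channel.
    cloudwatch.put_metric_alarm(
        AlarmName="orders-etl-row-count-low",
        Namespace="DataPlatform",
        MetricName="RowsLoaded",
        Statistic="Sum",
        Period=3600,
        EvaluationPeriods=1,
        Threshold=1000,
        ComparisonOperator="LessThanThreshold",
        TreatMissingData="breaching",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:data-alerts"],
    )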
Added bonus if you have:
AWS certification
Airflow Certification
Python Certification
What we offer you:
A competitive salary and benefits
A variety of career development tools, resources and opportunities
The chance to work on some of the most challenging, relevant issues in the payment industry
Time to support charities and give back in your community
FIS is committed to providing its employees with an exciting career opportunity and competitive compensation. The pay range for this full-time position is $(phone number removed) - $(phone number removed) and reflects the minimum and maximum target for new hire salaries for this position based on the posted role, level, and location. Within the range, actual individual starting pay is determined by additional factors, including job-related skills, experience, and relevant education or training. Any changes in work location will also impact actual individual starting pay. Please consult with your recruiter about the specific salary range for your preferred location during the hiring process.
Privacy Statement
FIS is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how FIS protects personal information online, please see the Online Privacy Notice.
EEOC Statement
FIS is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, marital status, genetic information, national origin, disability, veteran status, and other protected characteristics. The EEO is the Law poster is available here; the supplement document is available here.
For positions located in the US, the following conditions apply. If you are made a conditional offer of employment, you will be required to undergo a drug test.
ADA Disclaimer:
In developing this job description care was taken to include all competencies needed to successfully perform in this position. However, for Americans with Disabilities Act (ADA) purposes, the essential functions of the job may or may not have been described for purposes of ADA reasonable accommodation. All reasonable accommodation requests will be reviewed and evaluated on a case-by-case basis.
Sourcing Model
Recruitment at FIS works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. FIS does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company.
#pridepass
