AgileEngine is an Inc. 5000 company that creates award-winning software for Fortune 500 brands and trailblazing startups across 17+ industries. We rank among the leaders in areas like application development and AI/ML, and our people-first culture has earned us multiple Best Place to Work awards.
WHY JOIN US
If you're looking for a place to grow, make an impact, and work with people who care, we'd love to meet you!
ABOUT THE ROLE
We are looking for a Senior Data Engineer to design and maintain scalable data pipelines and platforms supporting Accounting, Finance, Payments, and Tax functions within a large-scale financial systems platform. You will build ETL/ELT workflows using Python, dbt, and Airflow, working across Snowflake, Redshift, Kafka, and Spark to ensure data availability and reliability for critical business operations. The role operates in a cross-functional Agile environment with direct stakeholder engagement across Engineering, Product, and Analytics teams.
WHAT YOU WILL DO
- Design, develop, and maintain scalable and reliable ETL/ELT pipelines;
- Build and optimize data workflows using tools such as dbt and Airflow;
- Develop and maintain data solutions leveraging technologies such as Snowflake, Redshift, Kafka, Spark, and Hive;
- Work with both SQL and NoSQL databases to support data ingestion, transformation, and analytics needs;
- Monitor, troubleshoot, and improve data pipeline performance, scalability, and reliability;
- Collaborate closely with cross-functional teams including Engineering, Product, Analytics, and Business stakeholders;
- Participate in Agile ceremonies and contribute to sprint planning and delivery commitments;
- Manage multiple priorities effectively while ensuring timely and high-quality deliverables;
- Contribute to data architecture discussions and help drive engineering best practices.
MUST HAVES
- 5+ years of professional experience in Data Engineering or Data Warehousing roles;
- Strong programming skills in Python;
- 5+ years of experience building data pipelines using ETL/ELT tools, with dbt preferred;
- 3+ years of hands-on experience with big data technologies such as Snowflake, Redshift, Kafka, Spark, or Hive;
- Extensive experience working with both SQL and NoSQL databases;
- Strong expertise with workflow orchestration and pipeline management tools such as Airflow;
- Strong understanding of scalable data architecture and engineering best practices;
- Excellent communication and stakeholder management skills;
- Proven ability to manage competing priorities and deliver within agreed sprint commitments;
- Comfortable working in highly collaborative, cross-functional Agile teams;
- Self-starter mindset with strong analytical, problem-solving, and critical-thinking abilities;
- Master’s degree in Computer Science, Mathematics, Statistics, or a related technical field preferred; a Bachelor’s degree with relevant experience is also acceptable;
- Upper-intermediate English level.
NICE TO HAVES
- Experience working in cloud-based data environments;
- Familiarity with modern data warehousing and distributed systems concepts;
- Exposure to data governance, observability, or data quality frameworks.
PERKS AND BENEFITS
- Professional growth: Mentorship, TechTalks, and personalized growth roadmaps.
- Competitive compensation: USD-based pay with budgets for education, fitness, and team activities.
- Exciting projects: Modern solutions built with Fortune 500 brands and top product companies.
- Flextime: Flexible schedule with remote and office options.