AuxoAI partners with Fortune 500-scale enterprises to design and deploy Agentic AI systems that drive measurable outcomes, from data modernization to deployed functional products. AuxoAI enables the creation of AI-first enterprises through productized solutions, AI-first engineering, and forward-deployed experts.
We are looking for a strong Data Engineer to join our growing team. The ideal candidate brings solid ETL fundamentals, hands-on pipeline experience, and cloud platform proficiency, with a preference for GCP/BigQuery expertise.
Responsibilities:
• Design, build, and maintain scalable data pipelines and ETL/ELT workflows
• Work with Dataform or dbt to implement transformation logic and dimensional data models (star and snowflake schemas)
• Implement and manage Slowly Changing Dimensions (SCD Type 1, Type 2, etc.) to support historical data tracking and analytics
• Develop and optimize data solutions on GCP (BigQuery, GCS) or AWS/Azure
• Support data migration initiatives and data mesh architecture patterns
• Collaborate with analysts, scientists, and business stakeholders to deliver reliable, well-modeled data products
• Apply data governance, data modeling standards, and quality best practices across the data lifecycle
• Troubleshoot pipeline issues and drive proactive monitoring and resolution

Requirements:
• 5-8 years of hands-on Data Engineering experience
• Strong ETL/ELT fundamentals: pipeline design, transformation logic, and end-to-end ownership
• Solid understanding of data warehousing concepts, including dimensional modeling, star schema, and snowflake schema design
• Hands-on experience implementing Slowly Changing Dimensions (SCD Type 1, Type 2, and/or hybrid approaches)
• Proficiency with Dataform or dbt (preferred); strong SQL is a must
• Experience with BigQuery (preferred) or an equivalent cloud data warehouse (Redshift, Snowflake, Synapse)
• Cloud platform experience: GCP (preferred), AWS, or Azure, including object storage (GCS, S3, ADLS)
• Exposure to data migration projects and/or data mesh principles
• Programming skills in Python and SQL; Spark/PySpark is a plus
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field