Designation
Consultant
Reporting to
Manas Yetirajam
Role type
Snowflake Engineer
Employment type
Full-time
Job Requirements
Mandatory Skills
Bachelor’s degree in Computer Science, Data Science, Engineering, Mathematics, Information Systems, or a related technical discipline
5+ years of relevant experience in data engineering roles
Detailed knowledge of data warehouse technical architectures, data modeling, infrastructure components, ETL/ELT, reporting/analytics tools and environments, data structures, and hands-on SQL coding
Proficiency in one or more programming languages: Java, Python, Ruby, or Scala
Experience with AWS services such as Redshift, S3, EC2, Lambda, Athena, EMR, AWS Glue, and Data Pipeline
Exposure to data visualization and reporting with tools such as Amazon QuickSight, Metabase, Tableau, or similar software
Experience building metrics decks and dashboards for KPIs, including the underlying data models
Understanding of how to design, implement, and maintain a platform providing secure access to large datasets
Primary Roles and Responsibilities
As a Snowflake Data Engineer, you will design, build, and operate enterprise-grade data platforms on Snowflake. You will develop secure, governed, and cost-efficient data pipelines and models; enable analytics and AI/ML use cases using Snowflake core services and Snowflake Cortex AI; and partner with business stakeholders to deliver reliable, high-performance data capabilities. The role spans ingestion, transformation, data architecture, performance tuning, security, and observability, anchored in Python, SQL, and agentic development patterns.
Preferred Skills
Education: Bachelor’s degree in Computer Science, Data Science, Engineering, Mathematics, Information Systems, or related technical discipline
Experience: 5+ years in data engineering with modern data warehousing; 3+ years hands-on with Snowflake (enterprise implementations)
Technical proficiency:
Strong Python and SQL (query optimization, UDFs, stored procedures)
Background in agentic development (designing LLM/agent workflows, orchestration, prompt engineering, tool/function calling, evaluation)
Data architecture/platform enablement with Snowflake (account/org setup, multi-DB design, RBAC, data sharing, governance)
Data warehouse architectures, dimensional/data vault modeling, ETL/ELT patterns, and CI/CD for data
Building metrics layers and KPI dashboards with underlying semantic/data models
Snowflake platform skills:
Core: Warehouses, Databases/Schemas, Stages, Snowpipe, Tasks, Streams, External Tables
Performance and cost optimization (clustering, micro-partitions, query profiling, warehouse sizing, workload isolation)
Security and governance: RBAC, masking/row-access policies, network policies, platform observability, data lineage
Integration with cloud object stores and identity (e.g., AWS S3 + IAM roles, or Azure/GCP equivalents)
Data visualization exposure with tools such as Tableau, QuickSight, Power BI, or Metabase
Ability to design, implement, and maintain secure access to large datasets at scale
Primary Roles and Responsibilities
Architect and operationalize Snowflake environments (multi-account/org patterns, RBAC, security controls, data sharing/clean rooms)
Define data models (dimensional/vault), data contracts, and semantic layers aligned to business KPIs
Data Engineering and Automation
Build resilient ELT workflows using Snowflake-native features (Snowpipe, Streams & Tasks) together with orchestration and transformation tooling such as Airflow and dbt
Develop high-performance SQL transformations, stored procedures (Snowflake Scripting/Python), and UDFs
Implement CI/CD for data (versioning, testing, data quality checks, Dev/Test/Prod promotion)
Performance, Reliability, and Cost Management
Tune queries and storage, optimize clustering and warehouse configurations, and set workload isolation/SLOs
Establish monitoring, alerting, and cost governance; drive continuous improvements in efficiency
Security, Governance, and Compliance
Enforce data privacy and access controls (masking, row access, object dependencies), and document lineage
Implement data sharing and collaboration with strong governance, policies, and auditability
AI/ML and Advanced Analytics on Snowflake
Leverage Snowflake Cortex AI to analyze unstructured data and build LLM-powered apps using SQL/Python
Implement and operationalize:
Snowflake Intelligence for natural language Q&A across structured/unstructured data (e.g., PDFs, Salesforce)
Cortex Search for semantic search over enterprise documents
Cortex Analyst for conversational text-to-SQL over structured data
Document AI for document and image extraction
Cortex Code to accelerate development for data engineering and analytics
Ensure all AI features run within Snowflake’s secure, governed perimeter with role-based access control
Stakeholder Partnership
Collaborate with business owners to translate requirements into data solutions and analyses
Deliver metrics decks and dashboards; perform root-cause analysis and develop business cases for improvements
Other Information
Number of interview rounds
2
Mode of interview
Virtual
Job location
Bangalore/Pune/Gurgaon
Clean room policy (specific to business)
NA
Culture
Corporate Social Responsibility programs
Maternity and paternity leave
Opportunities to network and connect
Discounts on products and services
Note: Benefits/Perks listed above may vary depending on the nature of your employment with KPMG and the country where you work.