Are you ready to shape a better tomorrow?
AIA Digital+ is a Technology, Digital and Analytics innovation hub dedicated to powering AIA to be more efficient, connected and innovative as it fulfils its Purpose to help millions of people across Asia-Pacific live Healthier, Longer, Better Lives.
If you are hungry and driven to play an active role in shaping a better tomorrow, we want to hear from you. Because the work we do at AIA Digital+ makes a difference in the lives of millions of people, every day. We will equip you with the critical skills, tools and technology, and endless opportunities to learn, contribute and thrive in a dynamic and exciting environment.
If you want to shape a brighter future at AIA Digital+, please read on.
About the Role
Responsible for supporting the day-to-day activities of a particular area of the business on a full-time or part-time basis, with the aim of gaining practical knowledge of the business area, the company, and the industry.
Roles and Responsibilities:
1. ETL / Data Pipeline Development
Assist in building and enhancing ETL/ELT pipelines for ingesting data from various insurance systems (policy, claims, agent management, etc.).
Develop and maintain data transformation logic using SQL/PySpark under supervision.
Participate in creating data mappings, transformation rules, and workflow documentation.
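For illustration, the transformation logic described above might look like the following minimal sketch (hypothetical source fields and mapping rules; real pipelines at this scale would typically be written in PySpark or orchestrated via Data Factory):

```python
from datetime import datetime

def transform_policy_record(raw: dict) -> dict:
    """Apply simple, documented transformation rules to a raw policy record.

    Hypothetical example: trims whitespace, normalises names and dates,
    and derives a status flag -- the kind of mapping an intern would
    document and implement under supervision.
    """
    return {
        "policy_id": raw["POLICY_NO"].strip(),
        "holder_name": raw["HOLDER_NAME"].strip().title(),
        # Assumed source format DD/MM/YYYY; target layer uses ISO 8601.
        "issue_date": datetime.strptime(raw["ISSUE_DT"], "%d/%m/%Y").date().isoformat(),
        "is_active": raw["STATUS"].upper() == "INFORCE",
    }

raw_row = {"POLICY_NO": " P-1001 ", "HOLDER_NAME": "jane doe",
           "ISSUE_DT": "05/03/2023", "STATUS": "InForce"}
print(transform_policy_record(raw_row))
```

Documenting each such rule (source field, target field, transformation) is exactly the data-mapping work mentioned above.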
2. Data Quality & Validation
Perform data profiling, identify data anomalies, and support root-cause analysis.
Implement and test basic data quality checks (e.g., completeness, consistency, integrity).
Assist data engineers in validating data output across staging, refined, and consumption layers.
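The completeness, consistency, and integrity checks mentioned above can be sketched in plain Python (field names and rules are hypothetical, for illustration only):

```python
def run_quality_checks(rows, required_fields, valid_agent_ids):
    """Run basic data quality checks over a batch of claim records.

    Returns a list of (row_index, issue) tuples; hypothetical rules.
    """
    issues = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-empty.
        for field in required_fields:
            if not row.get(field):
                issues.append((i, f"missing {field}"))
        # Consistency: a claim should not exceed the policy's sum assured.
        if row.get("claim_amount", 0) > row.get("sum_assured", float("inf")):
            issues.append((i, "claim exceeds sum assured"))
        # Integrity: the referenced agent must exist in the agent master data.
        if row.get("agent_id") not in valid_agent_ids:
            issues.append((i, "unknown agent_id"))
        # Uniqueness: duplicate claim IDs usually indicate an ingestion problem.
        if row.get("claim_id") in seen_ids:
            issues.append((i, "duplicate claim_id"))
        seen_ids.add(row.get("claim_id"))
    return issues

rows = [
    {"claim_id": "C1", "claim_amount": 500, "sum_assured": 1000, "agent_id": "A1"},
    {"claim_id": "C1", "claim_amount": 2000, "sum_assured": 1000, "agent_id": "A9"},
]
print(run_quality_checks(rows, required_fields=["claim_id", "claim_amount"],
                         valid_agent_ids={"A1"}))
```

Profiling results like these feed the root-cause analysis described above: each flagged row points back to a specific source record and rule.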
3. Testing & Deployment Support
Assist in preparing test cases, sample datasets, and expected results for ETL pipeline testing.
Support UAT activities, including executing test scripts and recording results.
Help with deployment activities using the team’s CI/CD framework.
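Preparing a test case typically means pairing a small sample dataset with a hand-computed expected result and asserting that the pipeline output matches. A minimal sketch, assuming a hypothetical transformation under test:

```python
def to_annual_premium(row):
    """Hypothetical transformation under test: monthly premium -> annual."""
    return {"policy_id": row["policy_id"],
            "annual_premium": row["monthly_premium"] * 12}

# Test case: small sample input plus hand-computed expected results.
sample_input = [{"policy_id": "P1", "monthly_premium": 100},
                {"policy_id": "P2", "monthly_premium": 250}]
expected = [{"policy_id": "P1", "annual_premium": 1200},
            {"policy_id": "P2", "annual_premium": 3000}]

actual = [to_annual_premium(r) for r in sample_input]
assert actual == expected, f"mismatch: {actual}"
print("all test cases passed")
```

In practice these checks would be wired into the team's CI/CD framework so they run automatically before deployment.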
4. Collaboration & Cross‑Team Support
Work closely with data engineers, business analysts, and data consumers to understand requirements.
Participate in team meetings, sprint reviews, and technical discussions.
Seek feedback proactively and continuously develop technical and analytical skills.
5. Learning & Development
Learn modern data engineering tools and concepts, such as:
Cloud data platforms (e.g., Azure Databricks / Data Factory / Delta Lake)
SQL, Python, PySpark
Data modeling fundamentals
ETL/ELT design patterns
Demonstrate curiosity and willingness to explore new tools and technologies relevant to the data platform.
6. Other duties as assigned by your supervisor
Job Requirements
Education: Bachelor’s degree in Computer Science, IT, or a related field (or equivalent coursework).
Technical Skills
SQL/Python: Expertise in SQL scripting and Python.
Databases: Familiarity with relational databases (SQL Server, Oracle or PostgreSQL, etc.).
Soft Skills:
Strong problem-solving mindset and eagerness to learn.
Good communication skills to collaborate with cross-functional teams.
Good English reading and writing skills.
Work Approach & Attitude
Curiosity and a growth mindset; comfortable learning new tools quickly. Takes ownership of tasks from assignment through delivery and validation.
Reliable, organized, and able to manage time across multiple small tasks.
Experience with ETL tools (e.g., Databricks, Informatica) and cloud platforms (e.g., Azure) is an added advantage.
Build a career with us as we help our customers and the community live healthier, longer, better lives.
You must provide all requested information, including Personal Data, to be considered for this career opportunity. Failure to provide such information may influence the processing and outcome of your application. You are responsible for ensuring that the information you submit is accurate and up-to-date.