Overview
Support the Agency's data transformation agenda through project-based assignments focused on data engineering, analytics, and light AI/ML development.
Key Responsibilities
- Design, develop, and maintain scalable ELT/ETL pipelines in Snowflake.
- Collaborate with stakeholders to translate business requirements into robust data models and efficient SQL transformations.
- Implement data quality frameworks and perform data validation.
- Support the development and maintenance of the Agency's Data Lakehouse architecture.
- Create and maintain documentation for data pipelines, lineage, and metadata.
- Develop dashboards and visualizations (e.g., Power BI, Tableau).
- Contribute to data cataloguing, metadata management, and adherence to data governance frameworks.
- Apply machine learning techniques to extract actionable insights.
- Support identification of opportunities for predictive and prescriptive analytics.
- Assist in building and deploying proof-of-concept ML models.
- Contribute to the exploration of AI/ML use cases.
Required Experience
Minimum of 2 years of relevant experience, including internships, in one or more of the following areas: Data Management or Information Management, Data Science or Analytics, Data Engineering, Artificial Intelligence (AI) or Machine Learning (ML).
Qualifications
University degree in Computer Science, Data Management, Information Management, Data Analytics, Statistics, Engineering, or a related field.