We are seeking a dynamic and detail-oriented Data Engineer to join our growing team. The ideal candidate will handle data engineering tasks such as preparing data for ingestion across Oracle, PostgreSQL and Snowflake, as well as supporting API-related development.
Responsibilities
- Build and maintain the Group’s data pipeline architecture, ensuring data is accessible, clean, and ready for use in data models, with a particular focus on automating the pricing process through Snowflake and API integration.
- Develop and maintain Python scripts and data pipelines for automation and streamlined data processes.
- Collaborate with multiple teams such as Pricing, Systems Architecture, Data Integration, and Software Development to automate and optimise data processes, reducing manual effort and trialling new free and paid data sources to enhance pricing decisions.
- Improve the efficiency of adding data to the API, exploring innovative approaches for smoother and more efficient data delivery.
- Contribute to scaling and improving the infrastructure for better data usability, ensuring smooth connectivity between data pipelines and pricing platforms to support strategic pricing decisions.
- Communicate pipeline issues, provide clear resolution steps and timelines, and confirm resolution, while facilitating cross-functional collaboration to enhance data quality and accessibility.
- Provide succinct summaries to senior stakeholders, not only on issues but also on progress and new opportunities within the pricing pipeline and data processes.
Qualifications
- Proven experience with data pipeline architectures, ideally with Oracle, Snowflake and Python.
- Advanced experience in data engineering, including API-related development and data integration projects.
- Strong proficiency in ETL processes and experience working with pricing systems or similar platforms.
- Strong project management and organisational skills, with experience managing multiple deadlines while collaborating across dynamic, cross-functional teams.
- Ability to manage and prepare large datasets, ensuring smooth integration into systems for analytical and decision-making purposes.
- Strong problem-solving skills and experience identifying and addressing data issues quickly.
- Proficiency in coding languages such as Python, SQL, and R, used for producing and manipulating data sets for analysis and modelling.
- Excellent written and verbal communication skills, including the ability to document technical processes and share insights clearly across different teams.
- 2+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
Desirable
- Experience with Snowflake.
- Experience with Tableau or similar BI tools.
- Experience with Java or .NET.
- Knowledge of CI/CD practices.
- Experience with data governance, security and compliance frameworks.