You design and maintain data pipelines, ensuring data quality while collaborating with data scientists. Your role includes managing the data lifecycle and developing tools to enhance efficiency and support R&D initiatives.
Requirements
- Master’s degree in Computer Science
- Solid experience in data engineering
- Proficiency in Python, Java, or Scala
- Experience with Snowflake, ETL tools, and SQL
- Familiarity with AWS and Azure
- Proficiency in Databricks and Unity Catalog
- Strong problem-solving skills
- Effective communication skills
Your Responsibilities
- Contribute to the design and maintenance of data pipelines for data collection and processing.
- Implement and monitor data quality checks.
- Collaborate with data scientists on data requirements.
- Manage the data lifecycle and assist in variable creation.
- Develop digital tools to enhance efficiency.
- Work with R&D stakeholders to leverage data.
- Provide technical support on data-related issues.
Your Benefits
- Career development opportunities
- Flexible working arrangements
- Respectful and inclusive culture
- Dynamic international environment