You will design scalable data infrastructure and implement ETL pipelines, working closely with stakeholders and ensuring data quality.
Requirements
- Bachelor’s or Master’s degree in Computer Science
- 7+ years of experience in software development
- 5+ years of experience in data engineering
- Proven experience with big data technologies
- Experience with cloud platforms (AWS, Azure, GCP)
- Proficiency in Java/Kotlin, Python or Scala
- Strong knowledge of SQL
- Experience with relational databases
- Experience with data modeling and ETL/ELT processes
- Experience with data pipeline orchestration tools
- Excellent problem-solving and analytical skills
- Strong communication and collaboration skills
- Ability to work independently and as part of a team
- Experience with real-time data processing
- Knowledge of machine learning workflows
- Familiarity with feature management tools
- Knowledge of MLOps practices and tools
- Familiarity with data governance frameworks
Your responsibilities
- Design and implement data infrastructure solutions
- Develop and maintain ETL/ELT pipelines
- Ensure data quality, integrity, and consistency
- Understand and implement stakeholders' data requirements
- Ensure seamless integration and deployment of data solutions
- Optimize data storage and retrieval
- Implement best practices for data management and security
- Perform regular maintenance of the data infrastructure
- Monitor and troubleshoot data pipelines and infrastructure
Your benefits
Growth opportunities in insurtech
Creative freedom and fast decision-making
Generous employee stock-ownership program
Attractive company pension plan
Company health insurance with reimbursement
Team spirit and shared events
Hybrid working for flexibility
A wide range of development opportunities
Health offerings such as bike leasing
Modern offices with free drinks
Option of additional unpaid leave
Original description
## Senior Data Engineer (m/f/d)
###### Permanent employee, Full-time · München, Mobile Work
---
##### The role
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this role, you will be responsible for architecting and implementing a scalable data infrastructure with efficient data pipelines. You will work closely with data science, analytics, and product teams to create impactful solutions. This role requires a deep understanding of software and data engineering principles, cloud technologies, and big data frameworks.
Key Responsibilities:
* Architect and implement scalable data infrastructure solutions using modern tools, services and cloud environments
* Develop and maintain ETL/ELT pipelines to ingest, process, and transform large datasets from various sources
* Ensure data quality, integrity, and consistency across the data infrastructure
* Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions that meet their needs
* Collaborate with DevOps and IT teams to ensure seamless integration and deployment of data solutions
* Optimise data storage and retrieval processes to ensure high performance and low latency
* Implement best practices for data governance, security, and compliance
* Perform regular maintenance and updates to keep the data infrastructure up-to-date with the latest technologies and industry standards
* Monitor and troubleshoot data pipelines and infrastructure to ensure reliability and availability
##### What we are looking for
* Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
* 7+ years of experience in software and 5+ years in data engineering, with a focus on building and managing data lakehouse or data warehouse solutions
* Proven experience with big data technologies (e.g., Apache Spark, Apache Iceberg, Snowflake, Databricks) and cloud platforms (AWS, Azure, GCP)
* Proficiency in programming languages such as Java/Kotlin, Python or Scala
* Strong knowledge of SQL and experience with relational databases
* Experience with data modeling, ETL/ELT processes, and data pipeline orchestration tools (e.g., Apache Airflow)
* Excellent problem-solving and analytical skills
* Strong communication and collaboration skills
* Ability to work independently and as part of a team in a fast-paced environment
Preferred Qualifications:
* Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis)
* Knowledge of machine learning and data science workflows
* Familiarity with advanced feature management tools
* Knowledge of MLOps practices and tools such as Docker, Kubernetes, and MLflow
* Familiarity with data governance and compliance frameworks (e.g., GDPR, CCPA)