The AI Job Search Engine
Senior Data Engineer (m/w/x)
Architecting and scaling Databricks lakehouse solutions with Delta Lake and Medallion architecture for reputational risk data. Proven Dimensional Data Modelling and Data Vault experience required. Flexible working hours, state-of-the-art open-source technologies.
Requirements
- Bachelor’s degree in Computer Science or a related STEM field
- 5+ years of hands-on data engineering experience
- Strong Python and SQL proficiency
- Solid batch and stream processing experience
- Proven dimensional data modelling and Data Vault experience
- Experience with data orchestration tools (Airflow or Dagster)
- Familiarity with data quality and validation frameworks
- Experience integrating metadata tools (Collibra, OpenMetadata)
- Strong understanding of Git and CI/CD pipelines
- Experience with cloud platforms (AWS preferred)
- Practical data lakehouse experience (Databricks, Snowflake)
- Proactive mindset with strong ownership, initiative, and drive
- Strong communication skills and professional English proficiency
- Experience configuring workflows in BPM software (Camunda)
- Experience working with ML teams and familiarity with ML/DL/NLP concepts
- Valid work permit
Tasks
- Architect, build, and scale modern data platforms
- Lead design and delivery of enterprise data infrastructure
- Implement Databricks lakehouse solutions
- Apply Delta Lake and Unity Catalog principles
- Utilize the Medallion architecture (Bronze/Silver/Gold)
- Design, build, and maintain scalable ELT pipelines
- Leverage Databricks workflows and Delta Live Tables
- Develop pipelines with Apache Spark
- Develop and optimize high-throughput streaming and batch pipelines
- Use Spark Structured Streaming and Auto Loader
- Tune Databricks platform performance, optimize costs, and govern cluster and compute resources
- Define and enforce data contracts and schemas
- Enforce data governance standards using Unity Catalog and Delta Lake
- Ensure data quality, observability, and lineage
- Use Databricks Data Observability tools
- Utilize Great Expectations for data validation
- Collaborate with data scientists, analysts, and platform teams
- Deliver reliable, self-serve data products
- Establish and champion data engineering best practices, standards, and reusable frameworks
- Stay current with Databricks ecosystem developments, lakehouse architecture trends, and emerging data engineering patterns
- Participate in code reviews, maintaining high standards for code quality, performance, and security
- Engage in Agile/Scrum ceremonies
- Contribute architectural insights
- Provide technical direction to the team
Benefits
Flexible Working
- Flexible working hours
Learning & Development
- Skill development
Other Benefits
- Team support
Startup Environment
- Agile development ecosystem
Modern Equipment
- State-of-the-art open-source technologies
Informal Culture
- Entrepreneurial, international, and dynamic work environment
- Diverse company culture
Purpose-Driven Work
- Shared mission
About the Company
RepRisk AG
Industry
Financial Services
Description
RepRisk is the world’s most respected Data as a Service (DaaS) company for reputational risks and responsible business conduct.