Senior Data Platform Engineer (m/w/x)
Building and operating event-driven data pipelines on GCP for ML/AI features. 4+ years data engineering experience with Python and Airflow required. 30 vacation days, flexible hours, and mental health coaching.
Requirements
- 4+ years Data Engineering or similar experience
- Strong Python skills for production code
- Strong SQL skills (window functions, CTEs, optimization)
- Experience with event-driven data pipelines
- Expert with Airflow (DAGs, dependencies, retries, monitoring)
- Strong Snowflake or modern data warehouse knowledge
- Cloud platform experience (GCP, AWS, Azure)
- Infrastructure-as-code experience (Terraform, Helm)
- Kubernetes (K8s) and Docker containerization experience
- Data quality mindset (profiling, validation, checks)
- Data for AI readiness (ML/AI data preparation, governance, lineage, privacy)
- Awareness of data privacy requirements (PII, GDPR, anonymization)
- AI-enabled engineering practices (using AI assistants, code generation)
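The Airflow expectations above (DAGs, dependencies, retries) describe a specific operational pattern: run tasks in dependency order and retry transient failures before giving up. As a rough, library-free sketch of that pattern (this is plain Python illustrating the semantics, not Airflow's actual API):

```python
def run_pipeline(tasks, deps, max_retries=2):
    """Run zero-arg callables in dependency order with simple retries.

    tasks: dict of task name -> callable
    deps:  dict of task name -> list of upstream task names
    Returns the order in which tasks completed.
    """
    done, order = set(), []
    while len(done) < len(tasks):
        # A task is ready once all of its upstream dependencies finished.
        ready = [n for n in tasks
                 if n not in done and all(u in done for u in deps.get(n, []))]
        if not ready:
            raise RuntimeError("cycle or unsatisfiable dependency")
        for name in ready:
            for attempt in range(max_retries + 1):
                try:
                    tasks[name]()
                    break  # task succeeded
                except Exception:
                    if attempt == max_retries:
                        raise  # retries exhausted, surface the failure
            done.add(name)
            order.append(name)
    return order
```

An Airflow DAG adds scheduling, monitoring, and persistence on top, but the retry-and-dependency core is the same idea.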
Tasks
- Design and build data infrastructure
- Operate data pipelines for reliability and compliance
- Enable ML and AI-powered product experiences
- Manage GCP resources and environments
- Build and maintain data ingestion pipelines
- Operate and improve data orchestration
- Design complex data models
- Monitor data quality and build validation
- Implement privacy and compliance controls
- Prepare data for ML and AI use cases
- Leverage AI in engineering workflows
- Develop scalable AI platform capabilities
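"Monitor data quality and build validation" typically means profiling columns and turning the profile into pass/fail checks. A minimal stdlib sketch of that idea (the thresholds and check names here are illustrative assumptions; production pipelines would use a framework such as Great Expectations):

```python
import math

def profile_column(values):
    """Basic profiling stats for one column: count, null rate, min/max."""
    nulls = sum(1 for v in values if v is None)
    nums = [v for v in values if isinstance(v, (int, float))]
    return {
        "count": len(values),
        "null_rate": nulls / len(values) if values else math.nan,
        "min": min(nums) if nums else None,
        "max": max(nums) if nums else None,
    }

def validate(values, max_null_rate=0.05, lo=None, hi=None):
    """Return a list of human-readable check failures (empty list = pass)."""
    stats = profile_column(values)
    failures = []
    if stats["null_rate"] > max_null_rate:
        failures.append(
            f"null rate {stats['null_rate']:.1%} exceeds {max_null_rate:.1%}")
    if lo is not None and stats["min"] is not None and stats["min"] < lo:
        failures.append(f"min {stats['min']} below {lo}")
    if hi is not None and stats["max"] is not None and stats["max"] > hi:
        failures.append(f"max {stats['max']} above {hi}")
    return failures
```

Running such checks after each pipeline load, and failing the run on non-empty results, is one common way the "data quality mindset" shows up in practice.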
Work Experience
- 4 years
Education
- Bachelor's degree OR
- Master's degree
Languages
- English – Business Fluent
Tools & Technologies
- Python
- SQL
- Airflow
- Snowflake
- GCP
- AWS
- Azure
- Terraform
- Helm
- Kubernetes (K8s)
- Docker
Benefits
More Vacation Days
- 30 annual vacation days
Flexible Working
- Flexible working hours
Mental Health Support
- Holistic well-being
- Free mental health coaching sessions
Healthcare & Fitness
- Yoga
Additional Allowances
- Monthly allowance for services
Competitive Pay
- Employee stock options
Workation & Sabbatical
- Work from abroad up to six weeks
Team Events
- Team social events
- Internal tech meetups
- International team get-togethers
Other Benefits
- Free tax declaration filing
- Internal support for tax questions
Informal Culture
- Dogs allowed in office
Not a perfect match? Similar openings:
- Merantix Momentum: AI Platform Engineer (m/w/x). Full-time, home office possible, Experienced, Berlin
- 4flow SE: Senior Data Engineer (m/w/x). Full-time, home office possible, Senior, Berlin
- Tietoevry: (Senior) Data & AI Engineer (m/w/x). Full-time, home office possible, Senior, Wien, Graz, Linz, Berlin, Regensburg, from 3,954 / month
- Deutsche Bank Aktiengesellschaft: Senior Data Engineer – Corporate Bank Technology - Commercial Banking (m/w/x). Full-time/Part-time, home office possible, Senior, Berlin
- Delivery Hero: Senior Data Engineer (m/w/x). Full-time, home office possible, Senior, Berlin
About the Company
Taxfix
Industry
- Financial Services
Description
- The company simplifies tax filing through an intuitive app, enabling users to file their taxes confidently.