AI Platform Engineer (m/w/x)
Building and operating internal analytics platforms with dbt and GCP BigQuery. Python and SQL proficiency required. Visa sponsorship, annual learning budget, weekly AI innovation events.
Requirements
- MSc in Computer Science, Software Engineering, or related technical field, or equivalent experience
- Strong SQL proficiency and dbt experience
- Hands-on experience building and maintaining data pipelines and ETL/ELT processes
- Python proficiency for data processing and automation
- Cloud data platforms experience (preferably GCP BigQuery, AWS/Azure transferable)
- Familiarity with data orchestration tools (Airflow, Prefect, or similar)
- Understanding of data modeling, warehousing concepts, and analytics engineering best practices
- Some exposure to containerization technologies (Docker) and infrastructure tools
- Understanding of CI/CD processes and version control
- Experience with monitoring and observability tools
- Interest in learning Infrastructure as Code tools like Terraform
- Excellent communication skills
- Initiative and self-motivation
- Enthusiasm for problem-solving and for contributing to foundational work
- Empathy and openness to new ideas and perspectives
- Fluency in English
Tasks
- Build and maintain data pipelines using dbt, SQL, and modern data stack tools
- Develop robust data models and transformations for analytics and machine learning
- Implement data quality frameworks and monitoring with tools like Great Expectations
- Collaborate with data scientists, analysts, and stakeholders to create scalable data solutions
- Lead the operation and enhancement of the internal analytics platform
- Design and implement data validation, testing, and documentation standards
- Learn and contribute to Infrastructure as Code (IaC) solutions
- Take ownership of data infrastructure optimization and scaling
- Support the enhancement of orchestration tools like Flyte and Airflow
- Develop internal development tools for data and engineering teams
- Participate in open-source projects and contribute to the data engineering community
- Collaborate with client teams, including CV, NLP, and Fullstack, to deliver data-driven solutions
Work Experience
- Approx. 1–4 years
Education
- Master's degree
Languages
- English – Fluent
Tools & Technologies
- SQL
- dbt
- Python
- GCP BigQuery
- AWS
- Azure
- Airflow
- Prefect
- Docker
- Terraform
Benefits
Other Benefits
- Visa sponsorship
Learning & Development
- Annual learning and development budget
Team Events
- Weekly AI innovation events
Flexible Working
- Flexible work-from-home policy
- Flexible working hours
Workation & Sabbatical
- Sabbatical leave
Modern Office
- Brand new office space
Modern Equipment
- Hardware allowance
Additional Allowances
- Monthly mobility allowance
- Monthly sports allowance
- Monthly grocery allowance
About the Company
Merantix Momentum
Industry
IT
Description
Merantix Momentum is a Berlin-based AI startup focused on accelerating AI adoption across industries through B2B solutions and research projects.