Data & Integration Engineer (m/w/x)
Building and maintaining scalable data pipelines for AI/ML use cases in pharmaceutical research. Proficiency with the Apache ecosystem, AWS data services, and Snowflake is expected. Additional days off and a subsidized staff restaurant.
Requirements
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field, or equivalent proven experience as a Data Engineer, Software Developer, or in a similar role
- Proficiency with Apache ecosystem (Spark, Parquet, Iceberg, Kafka, Airflow)
- Advanced SQL expertise for data modeling, optimized queries, and performance improvement
- Hands-on experience with AWS data services (Kinesis, Glue, AppFlow, Lambda, S3)
- Hands-on experience with Snowflake
- Hands-on experience with dbt (dbt Labs) for data pipelines
- Strong analytical skills with unstructured datasets
- Excellent English written and verbal communication skills
Tasks
- Design, develop, and maintain scalable data pipelines and ETL processes
- Collaborate with data architects, data modelers, and IT team members
- Define and evolve the cloud-based data architecture strategy
- Optimize and manage data storage solutions and data integrations (e.g., S3, Snowflake, dbt, SnapLogic)
- Ensure data quality, integrity, security, and accessibility
- Utilize AWS cloud services for data engineering workflows
- Monitor and optimize data pipelines for performance, scalability, and cost efficiency
Work Experience
- Approx. 1–4 years
Education
- Bachelor's or Master's degree
Languages
- English – Business Fluent
Tools & Technologies
- Spark
- Parquet
- Iceberg
- Kafka
- Airflow
- SQL
- Kinesis
- Glue
- AppFlow
- Lambda
- S3
- Snowflake
- dbt
Benefits
Flexible Working
- Home office
- Flexible working hours
More Vacation Days
- Additional days off
Free or Subsidized Food
- Subsidized staff restaurant
- Vegetarian and vegan options
Learning & Development
- Training and development opportunities
Healthcare & Fitness
- Health promotion programs
Public Transport Subsidies
- Public transport ticket
About the Company
Boehringer Ingelheim
Industry
Pharmaceuticals
Description
The company is focused on delivering lasting value to patients through innovative therapies and early clinical development.