You analyze business flows, develop data processing pipelines, and implement complex data transformations to ensure data quality.
Requirements
- Bachelor's or master's student in computer science
- Strong interest in ETL/ELT principles and data warehouse modelling
- Experience with ETL frameworks/tools like SSIS, Talend, Apache NiFi
- Strong interest in relational databases like Oracle, SQL
- Knowledge of Kafka, Json, Avro, Xml formats
- Curious and rigorous with analytical skills
- Fluent in French and English
- Residence in Switzerland
Your Tasks
- Analyze current business flows
- Develop batch data processing pipelines
- Implement complex data transformations
- Participate in data modelling
- Implement validation and non-regression tests
- Document the processing steps
Original Description
# Internship – Data Integration Specialist (6 months)
**Geneva** | **Full time**
As part of our Data Integration team in the Data Office, the intern will participate in the implementation and industrialization of data pipelines that feed our data warehouse. This internship is a concrete opportunity to gain hands-on training in modern data engineering tools and to contribute to a foundational data integration project.
**YOUR ROLE**
* Analyze the current business flows across different sources: databases, Kafka, etc.
* Design and develop batch data processing pipelines to feed the data warehouse.
* Implement complex data transformations (enrichment, joins, cleaning, etc.).
* Participate in data modelling (star or snowflake model) and the creation of fact and dimension tables (an illustrative sketch follows this list).
* Contribute to the implementation of validation tests and non-regression tests to guarantee the quality and stability of flows.
* Document the processing carried out and participate in the continuous improvement of processes.
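As a rough illustration of the transformation and modelling work described above, the sketch below shows a minimal batch load of a star-schema fact table, written in Oracle-flavoured SQL (the posting lists SQL, Oracle and an ETL framework in its technical environment). All table and column names (`stg_orders`, `dim_customer`, `fact_orders`, etc.) are hypothetical and do not come from the posting.

```sql
-- Minimal, hypothetical batch load of a star-schema fact table.
-- Table and column names are invented for illustration only.
INSERT INTO fact_orders (order_id, customer_key, order_date, amount)
SELECT
    s.order_id,
    d.customer_key,                    -- enrichment: surrogate key looked up in the dimension
    TRUNC(s.order_ts),                 -- cleaning: keep only the date part of the timestamp
    NVL(s.amount, 0)                   -- cleaning: default missing amounts to zero
FROM stg_orders s
JOIN dim_customer d
  ON d.customer_id = s.customer_id     -- join: enrich staging rows with dimension data
WHERE s.order_id IS NOT NULL;          -- basic validation: drop rows without a business key
```

In a real pipeline, statements like this would be scheduled and orchestrated by the ETL framework and covered by the validation and non-regression tests mentioned above.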
**YOUR PROFILE**
* Bachelor's or master's student in computer science, data engineering, or an equivalent field of study
* Strong interest in ETL/ELT principles and data warehouse modelling. Previous experience with an ETL framework/tool would be appreciated (SSIS, Talend, xDi, Apache NiFi, Airflow…)
* Strong interest in relational databases (Oracle, SQL, etc.)
* Knowledge of Kafka and of the JSON, Avro, and XML formats would be appreciated
* Curious, rigorous, with a good analytical mind and a desire to progress in a technical environment
* Fluent in French and English
* **Residence in Switzerland**
**TECHNICAL ENVIRONMENT**
* Development under an ETL framework
* Databases: Oracle, SQL Server
* Languages / Formats: SQL, JSON, Avro
* Kafka streaming
* Tools: Git, Jira
**Our Maison’s DNA** is defined by five core values. **Excellence** drives us to be the best at what we do, while **Innovation** fuels our progress. **Respect** underpins every interaction, and **Integrity** shapes our actions. Together, we are **One Team**, united in serving our clients with unwavering dedication.
As a responsible and supportive employer, we promote a diverse and inclusive work environment for our employees and candidates. **Diversity, Equity and Inclusion** are woven into the fabric of our Maison’s DNA, and we strive to ensure that our employees can fulfill both their personal and professional aspirations by encouraging internal mobility and individual upskilling programs. We firmly believe that building Diverse Teams contributes to our successes, and to deliver on this, we actively embed Diversity, Equity and Inclusion in our business strategy.
**It is an exciting time to join our Teams.**