The AI search engine for jobs
Software Engineer, AI Data Infrastructure (m/w/x)
Description
In this role, you will be at the forefront of maintaining and evolving a robust data platform for AI pipelines. Your work will involve designing data ingestion processes, building APIs, and ensuring data integrity while collaborating with research teams.
Requirements
- 4-6 years of professional experience with Python and Go
- Comfort with containerization and orchestration tools such as Docker and Kubernetes
- Experience with AWS services (S3, KMS, IAM) and Terraform
- Skilled in designing and operating data ingestion and transformation workflows
- Familiarity with CI/CD pipelines and version-control practices, ideally using GitHub Actions
- Commitment to building secure and observable systems
- Ability to contribute lightweight internal dashboards using frameworks such as Streamlit or Next.js
- Obsession with reliability, observability, and data governance
- Strong fundamentals in data modeling and schema design
- Comfortable working with NoSQL systems such as MongoDB
- Experience with event-driven data pipelines using SQS, SNS, Lambda, or Step Functions
- Knowledge of data-partitioning strategies and large-scale dataset optimization
- Familiarity with metadata management and dataset versioning
- Exposure to monitoring and alerting stacks such as Datadog or Prometheus
- Proficiency in Rust, or interest in learning it
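Two of the requirements above, metadata management and dataset versioning, can be illustrated with a minimal content-addressed versioning sketch. This is a stdlib-only Python illustration under assumed conventions; the function name and record shape are hypothetical, not the company's actual tooling.

```python
import hashlib
import json


def dataset_version(records: list[dict]) -> str:
    """Derive a deterministic version ID from record content.

    Serializing with sorted keys makes the hash independent of dict
    key order, so identical data always maps to the same version.
    """
    payload = json.dumps(records, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:12]


v1 = dataset_version([{"id": 1, "label": "cat"}])
v2 = dataset_version([{"label": "cat", "id": 1}])  # same content, reordered keys
v3 = dataset_version([{"id": 1, "label": "dog"}])  # changed content
```

Because the version ID depends only on content, reordered-but-identical records hash to the same version, while any data change produces a new one, which is the property dataset-versioning systems rely on.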
Professional experience
4-6 years
Responsibilities
- Design and maintain ingestion pipelines for data transfer
- Develop transformation processes for clean, structured tables
- Build internal APIs and backend services for data access
- Instrument systems with metrics and automated checks
- Contribute to MLOps tooling for dataset monitoring
- Improve CI/CD pipelines and integration tests
- Build lightweight dashboards for internal data access
- Design scalable, fault-tolerant real-time data pipelines
- Own the lifecycle of critical data assets with tracking and control
- Work with structured and semi-structured data sources
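The transformation duty above, turning semi-structured sources into clean, structured tables, might look like the following stdlib-only Python sketch. The `Row` schema, field names, and coercion rules are illustrative assumptions, not the team's actual schema.

```python
from dataclasses import dataclass
from typing import Any


@dataclass(frozen=True)
class Row:
    """Hypothetical target schema for one structured table."""
    user_id: int
    country: str


def transform(raw: list[dict[str, Any]]) -> tuple[list[Row], list[dict]]:
    """Normalize semi-structured input into typed rows.

    Invalid records are quarantined rather than dropped silently,
    so downstream checks can alert on them.
    """
    clean: list[Row] = []
    rejected: list[dict] = []
    for record in raw:
        try:
            clean.append(Row(user_id=int(record["user_id"]),
                             country=str(record["country"]).upper()))
        except (KeyError, TypeError, ValueError):
            rejected.append(record)
    return clean, rejected


rows, bad = transform([
    {"user_id": "42", "country": "de"},   # coercible: becomes Row(42, "DE")
    {"user_id": None, "country": "us"},   # not coercible: quarantined
])
```

Keeping a quarantine list instead of discarding bad records is what makes such a pipeline observable: the rejected count is exactly the metric an automated check would alert on.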
Tools & Technologies
Languages
English – full professional proficiency
Benefits
Health & fitness
- Healthcare
- Dental
Other benefits
- Vision
- Life insurance
Retirement plan
- 401(k) plan and match
Flexible work
- Flexible time off
Parking & commuting
- Commuter benefits
About the company
Tools for Humanity
Industry
IT
Description
The company is built on privacy-preserving proof-of-human technology and enables the free flow of digital assets for all.