Senior DevOps Engineer (Cloud experience)
Req. VR-119184
Join our Development Center in Bucharest and become a member of our open-minded, progressive, and professional team. In this role you will work on projects for one of our world-famous clients.
This role is DevOps for Data Engineering. You will not be a general infrastructure operator; you will build, operate, and evolve the DevOps foundation that data engineers depend on to reliably develop, deploy, and run data pipelines and analytics platforms across a hybrid on‑premises and Google Cloud environment.
You will work day‑to‑day with data engineers and data architects to turn platform requirements into automated, secure, observable, and cloud‑native systems, enabling teams to ship data products safely and at scale.
You will have the chance to grow your technical and soft skills and build thorough expertise in our client's industry.
On top of an attractive salary and benefits package, Luxoft will invest in your professional training and help you grow your professional career.
Responsibilities
DevOps Enablement for Data Engineering Teams
Design and operate DevOps capabilities that directly support data ingestion, transformation, orchestration, and analytics workloads across hybrid environments.
Cloud Infrastructure for Data Platforms
Build and manage GCP infrastructure optimized for data engineering use cases, including the compute, storage, networking, and IAM required for batch and streaming data platforms.
GitHub‑First Engineering Practices
Own GitHub as the system of record for infrastructure, CI/CD, and platform code:
Repository structure and standards
Branching strategies and pull‑request workflows
Code review, versioning, and release practices
GitHub‑based automation and integrations
CI/CD for Data Pipelines & Platforms
Design and maintain CI/CD pipelines that enable:
Deployment of data pipelines and orchestration workflows
Infrastructure and platform changes via IaC
Safe promotion across environments (dev, test, prod)
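To illustrate the promotion guardrail mentioned above, here is a minimal Python sketch; the environment names and function are hypothetical, not part of any existing pipeline:

```python
# Hypothetical sketch: a guardrail that only allows deployments to advance
# one environment at a time along the dev -> test -> prod chain.
PROMOTION_ORDER = ["dev", "test", "prod"]

def can_promote(source: str, target: str) -> bool:
    """Return True only if target is the next environment after source."""
    if source not in PROMOTION_ORDER or target not in PROMOTION_ORDER:
        raise ValueError(f"unknown environment: {source!r} -> {target!r}")
    return PROMOTION_ORDER.index(target) == PROMOTION_ORDER.index(source) + 1
```

A CI/CD job could run such a check before deploying, rejecting attempts to promote straight from dev to prod.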
Infrastructure as Code (IaC)
Implement and maintain Infrastructure as Code to provision and manage hybrid and cloud resources in a repeatable, auditable, and version‑controlled manner.
Containerization & Platform Operations
Build and operate containerized platforms using Docker and Kubernetes to support data processing frameworks, orchestration tools, and supporting services.
Observability for Data Systems
Implement monitoring, logging, and alerting with a strong focus on data platforms: pipeline health, job failures, performance bottlenecks, SLAs, and operational visibility.
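As a sketch of the kind of pipeline-health check this implies (the SLA threshold and run-record shape are illustrative assumptions, not part of the role description):

```python
# Hypothetical sketch: flag pipeline runs that failed or breached an SLA,
# the kind of signal that monitoring and alerting for data platforms surfaces.
from datetime import timedelta

SLA = timedelta(hours=2)  # illustrative threshold, not a real requirement

def needs_alert(run: dict) -> bool:
    """A run needs an alert if it failed or ran longer than the SLA."""
    return run["status"] == "failed" or run["runtime"] > SLA
```

In practice such checks feed an alerting backend (e.g. Cloud Monitoring or Prometheus) rather than running standalone.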
Security & Governance Enablement
Partner with data engineers and architects to enforce security, access controls, secrets management, and compliance requirements across the data platform.
Performance & Cost Optimization
Continuously optimize infrastructure and platform configurations to balance performance, scalability, and cost efficiency for data workloads.
Cross‑Functional Collaboration
Act as a close technical partner to data engineers, data architects, and analytics teams, translating platform needs into reliable, self‑service DevOps solutions.
Must have
Proven experience as a DevOps / Platform / SRE engineer supporting data engineering platforms.
Strong, hands‑on proficiency with GitHub as a core engineering platform (PRs, reviews, CI/CD integrations, repo governance).
Solid cloud experience, with strong emphasis on Google Cloud Platform (GCP).
Experience designing and operating CI/CD pipelines for data platforms and infrastructure.
Hands‑on experience with Infrastructure as Code (e.g., Terraform or equivalent).
Strong Linux and automation skills using Python, Bash, or similar.
Practical experience with Docker and Kubernetes.
Experience implementing monitoring, logging, and alerting for distributed data systems.
Strong understanding of security, networking, and access control in enterprise environments.
Nice to have
GCP certification (Professional Cloud DevOps Engineer, Cloud Architect, or similar).
Direct experience supporting data engineering platforms such as:
Dataproc, Dataflow
BigQuery, Cloud Storage
Familiarity with workflow orchestration platforms (Apache Airflow, Cloud Composer).
Experience integrating CI/CD with data pipelines and orchestration workflows.
Exposure to big data technologies and distributed processing frameworks.
Experience building self‑service platforms for data engineering teams.
Languages
English: B2 Upper Intermediate
Seniority
Senior
Bucharest, Romania
DevOps
BCM Industry
25/02/2026