Regular Data Engineer - Hybrid Environment (On-Prem & Cloud)
Req. VR-119051
Join our Development Center and become a member of our open-minded, progressive, and professional team. You will have the chance to grow your technical and soft skills and build thorough expertise in our client's industry. In this role you will be working on projects for one of our world-famous clients, a large international investment bank.
On top of an attractive salary and benefits package, Luxoft will invest in your professional training and allow you to grow your career.
Key Responsibilities:
Solution Design: Architect data pipelines down to the low-level elements, ensuring clarity and precision in implementation.
Data Sourcing: Extract data from diverse repositories including relational databases (Oracle, PostgreSQL), NoSQL stores, file systems, and other structured/unstructured sources.
Data Transformation: Design and implement ETL/ELT workflows to standardize and cleanse data using best practices in data engineering.
Pipeline Development: Build scalable, fault-tolerant data pipelines that support batch and streaming use cases.
Cloud Data Processing: Load transformed data into GCP destinations such as BigQuery or Cloud Storage using tools like Dataproc, Dataflow, and other GCP-native services.
Workflow Orchestration: Design and manage workflows using orchestration tools such as Apache Airflow or Cloud Composer.
Data Format Expertise: Work with various data formats including JSON, AVRO, Parquet, CSV, and others.
Optimization & Monitoring: Ensure performance, reliability, and cost-efficiency of data pipelines through continuous monitoring and tuning.
Collaboration: Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
Must have
Required Skills & Experience:
Experience in data engineering across hybrid environments (on-premise and cloud).
Proficiency in SQL and Python or Java/Scala.
Hands-on experience with ETL/ELT tools and frameworks.
Good understanding of GCP data services: BigQuery, Dataproc, Dataflow, Cloud Storage.
Familiarity with data modeling, schema design, and metadata management.
Knowledge of data governance, security, and compliance best practices.
Nice to have
Preferred Qualifications:
GCP certification (e.g., Professional Data Engineer) is a major plus.
Experience with CI/CD for data pipelines.
Exposure to containerization and Kubernetes.
Familiarity with data cataloging tools / metadata management.
Experience with Big Data technologies.
Languages
English: C1 Advanced
Seniority
Regular
Bucharest, Romania
BigData Development
BCM Industry
13/11/2025
Apply for Regular Data Engineer - Hybrid Environment (On-Prem & Cloud) in Bucharest
*Indicates a required field