Senior Data Engineer
Req. VR-118484
Join our Development Center and become a member of our open-minded, progressive, and professional team. You will have the chance to grow your technical and soft skills and build thorough expertise in our client's industry. In this role you will be working on projects for one of our world-famous clients, a large international investment bank.
On top of an attractive salary and benefits package, Luxoft will invest in your professional training and allow you to grow your professional career.
Key Responsibilities:
Solution Design: Architect data pipelines down to the low-level elements, ensuring clarity and precision in implementation.
Data Sourcing: Extract data from diverse repositories including relational databases (Oracle, PostgreSQL), NoSQL stores, file systems, and other structured/unstructured sources.
Data Transformation: Design and implement ETL/ELT workflows to standardize and cleanse data using best practices in data engineering.
Pipeline Development: Build scalable, fault-tolerant data pipelines that support batch and streaming use cases.
Cloud Data Processing: Load transformed data into GCP destinations such as BigQuery or Cloud Storage using tools like Dataproc, Dataflow, and other GCP-native services.
Workflow Orchestration: Design and manage workflows using orchestration tools such as Apache Airflow or Cloud Composer.
Data Format Expertise: Work with various data formats including JSON, AVRO, Parquet, CSV, and others.
Optimization & Monitoring: Ensure performance, reliability, and cost-efficiency of data pipelines through continuous monitoring and tuning.
Collaboration: Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
Must have
Required Skills & Experience:
Proven experience in data engineering across hybrid environments (on-premise and cloud).
Strong proficiency in SQL and Python or Java/Scala.
Hands-on experience with ETL/ELT tools and frameworks.
Deep understanding of GCP data services: BigQuery, Dataproc, Dataflow, Cloud Storage.
Familiarity with data modeling, schema design, and metadata management.
Experience with workflow orchestration tools (e.g., Apache Airflow, Cloud Composer).
Knowledge of data governance, security, and compliance best practices.
Nice to have
Preferred Qualifications:
GCP certification (e.g., Professional Data Engineer) is a major plus.
Experience with CI/CD for data pipelines.
Exposure to containerization and Kubernetes.
Familiarity with data cataloging tools / metadata management.
Experience with Big Data technologies.
Languages
English: C1 Advanced
Seniority
Senior
Bucharest, Romania
BigData Development
BCM Industry
20/10/2025