Position

Data Engineer
Remote Ukraine

Location


Remote Ukraine


Project Description


We are establishing a brand-new Data Analytics team for a global retail customer.
In essence, the client is leveraging the Azure PaaS platform across all kinds of business data to build an advanced analytics platform aimed at delivering better insights and applications to the business.
The platform is continuously enhanced to support additional CI/CD and validated-learning environments for data science, machine learning, and AI capabilities, both customer-facing (digital omni-channel interaction and commerce, commerce relevance, personalisation, loyalty, and marketing) and non-customer-facing (assortment optimisation, supply chain optimisation, external parties, and IoT).
We will be working on end-to-end functionality, including architecture, data preparation, processing, and consumption by downstream systems.

Responsibilities


    As a Data Engineer you'll work alongside data architects to take data through its full lifecycle: acquisition, exploration, cleaning, integration, analysis, interpretation, and visualization. You will build pipelines for data processing, data visualization, and analytics products, including automated services and APIs.

    You will be the go-to person for end-to-end data handling, management and analytics processes.

    You will:

    • Ingest data sources into our data management platforms
    • Structure data into a scalable and easily understood architecture
    • Work in a multi-disciplinary team where you'll turn data discoveries and ideas into models and insights. You'll find ways to leverage the data and the models to create and improve products for our customers, in lean development cycles.
    • Implement and build methodologies, and understand how to scale them together with the business
    • Maintain good, current, and demonstrable knowledge of adjacent applications and market developments, both for inspiration and for benchmarking concepts.

Skills


Must have

    Essential Experience Required

    - Python/PySpark
    - Azure Databricks (not mandatory if the candidate knows PySpark)
    - SQL
    - 5+ years of industry experience in large-scale data management, visualization, and analytics


    Other qualifications
    - Basic knowledge of Azure Data Factory
    - MSc in a computational field or another relevant area
    - Hands-on experience, including solid programming skills, implementing pipelines that integrate database management systems, clean data, and improve data quality
    - Expertise in advanced data modelling
    - Experience with Microsoft data management tools and the Azure platform environment
    - Curious, proactive, fast learner able to quickly pick up new areas
    - Experience with agile methodologies
    - Excellent communication skills
    - Hands-on, can-do approach

Nice to have

    • Experience with cloud-based big data solutions using Hadoop/Spark
    • SSAS cube development
    • Enterprise BI reporting with Power BI
    • Azure DevOps CI/CD

Languages


English: B2 Upper Intermediate

Relocation package


If needed, we can help you with the relocation process.

Work Type


DWH Development

Ref Number


VR-56704

