Position

GCP Data Science Engineer,
Remote Poland

Location


Remote Poland

Project Description


Implementation of a new service on Google Cloud Platform (GCP). The service is a long-term strategic solution for the bank, providing big-data processing of European payments data for data analysis, data science, product-management decisions, AML, storage, archiving, and KYC processes. It is a mass payment data and cloud processing platform built on the latest technology stack and integrated Google tools, with an interesting set of surrounding interfaces across different integration layers and protocols (API, SAPI, PubSub, BQ ingestions, Juniper ingress/egress, ConnectDirect, IBM MQ, etc.) and a consolidated orchestration layer to achieve sustainable quality and outcomes.

Responsibilities


    • Build Machine Learning (ML) models on payments data for various use cases.
    • Explain model results to stakeholders.
    • Explore payments data to identify ML use cases.
    • Perform data analytics.
    • Analyze and review requirements; prepare the design document, the system/solution proposal document, and system test plans.
    • Execute project-specific development, configuration, and maintenance activities in accordance with applicable standards and quality parameters.
    • Set up the right environment for projects.
    • Ensure delivery within schedule by adhering to engineering and quality standards.
    • Own and deliver end-to-end projects on the Payment Data Platform.
    • Work under pressure on deliverables, violations, and incidents.
    • Provide weekly and monthly project updates to stakeholders and management.

Skills


Must have

    • Experienced in Machine Learning (ML) modelling techniques such as NLP, time series, recommender systems, neural networks, etc.
    • Experienced in building production ML applications.
    • Experienced in data engineering (Hadoop, Dataproc, Spark, etc.).
    • Knowledge of/experience with Python, SQL, and Google Cloud Platform (GCP).
    • Experienced with relational databases such as Oracle and MSSQL, as well as BigQuery.
    • Reporting tools such as Tableau, Data Studio, etc.
    • CI/CD for model deployment.
    • Comfortable with Unix/Linux.
    • Flexible to support developed code in the production environment, in line with the Agile development methodology.
    • Knowledge of GCP data services preferred.
    • Working experience managing large databases.
    • Hands-on experience with large-scale/mass data processing.

Nice to have

    None

Languages


English: B2 Upper Intermediate

Seniority


Senior

Relocation package


If needed, we can help you with the relocation process.

Vacancy Specialization


Data Science

Ref Number


VR-67873
