Data Engineer (with ETL, Azure)



Project Description

As part of our strategic partnership with one of the world's largest financial institutions, we are hiring IT specialists to join their new IT Service Center in Sofia. The bank is an international organization that provides financing, advice, and research to developing nations to support their economic advancement. It works primarily to reduce poverty by offering developmental and technical assistance to middle- and low-income countries, and sets up partnerships to support economic development around the globe.

Team specific:
This position focuses on data architecture from the perspective of data integration through Extract/Transform/Load (ETL). The incumbent will work with business data owners, business analysts, and data custodians, and will be expected to develop ETL artifacts and deploy the detailed design through enterprise-standard tools.

The selected candidate will work closely with the BI team, providing technical leadership to facilitate development projects involving the computing environment, which may include coordinating software upgrades and installing new products. He/she will design and validate data solutions that are practical, flexible, scalable, reusable, and strategic. These efforts enhance data quality, enrich access, and give business decision makers information on which they can base more accurate and effective decisions across multiple domains.


    • Develop ETL mappings, interacting with large data processing pipelines in distributed data stores, using cloud-based ETL tools
    • Determine database structural requirements by analyzing client operations, applications, and programming, while reviewing objectives with clients and evaluating current systems
    • Work with application DBAs and modelers to construct data stores
    • Define database physical structure and functional capabilities to accommodate data integration requirements, security, back-up, and recovery specifications
    • Ensure data is ready for use by consuming applications, analysts, and scientists, using frameworks and microservices to serve data
    • Collaborate with data architects, modelers, and IT team members on project goals
    • Apply optimum performance techniques for data integration, coordinate deployment actions, and document them
    • Integrate new data management technologies and software engineering tools into existing structures
    • Maintain overall performance of data integration and ETL components by identifying and resolving production and application development problems
    • Answer user questions
    • Provide maintenance and support for data integration and ETL components by coding utilities and resolving problems
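For candidates unfamiliar with the term, the Extract/Transform/Load cycle that these duties revolve around can be sketched minimally in Python. This is a toy illustration only, not part of the role's toolchain; the source rows, target table, and field mappings are all hypothetical:

```python
import sqlite3

# Toy ETL sketch. All names (source rows, "payments" table, columns)
# are hypothetical illustrations, not the client's actual schema.

def extract():
    # Extract: in practice this reads from a source system or file.
    return [
        {"id": "1", "amount": "100.50", "country": "bg"},
        {"id": "2", "amount": "75.00", "country": "ro"},
    ]

def transform(rows):
    # Transform: cast types and normalize values (an ETL "mapping").
    return [(int(r["id"]), float(r["amount"]), r["country"].upper())
            for r in rows]

def load(records, conn):
    # Load: write the transformed records into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER, amount REAL, country TEXT)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM payments").fetchone())
```

In the role itself, the same pattern is expressed through tools such as IICS or Azure Data Factory rather than hand-written scripts.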


Must have

    Educational Qualifications and Experience:
     Education: Bachelor's degree in Computer Science/Engineering
     Role Specific Experience: 7+ years of hands-on experience working on ETL mappings, interacting with large data processing pipelines in distributed data stores, and distributed file systems using cloud-based ETL tools such as IICS and Azure Data Factory/SSIS.
     Extensive experience coding complex SQL queries in one or more leading RDBMSs (e.g., Oracle, Azure Synapse, MS SQL Server, Postgres).

    Required technologies:
    o Tools: Informatica Cloud Services (IICS), Informatica PowerCenter 10.x, Azure Data Factory
    o Database: Postgres, Oracle 19c, Azure Synapse/SQL DW/SQL DB, SQL Server 2016/2014/2012
    o Cloud and Platform Technologies: Microsoft Azure, Unix, Linux, Windows

    Required Skills/Abilities:
    • Advanced knowledge of Data Integration concepts and standard approaches
    • Strong leadership and communication skills
    • Ability to work independently once guidance and goals are provided

Nice to have

    Experience in one or more of the following technologies:
    o Data dictionaries
    o Data warehousing
    o Enterprise application integration
    o Metadata registry
    o Master Data Management (MDM)
    o Relational Databases
    o NoSQL
    o Semantics
    o Data retention
    o Structured Query Language (SQL)
    o Procedural SQL
    o Unified Modeling Language (UML)
    o XML, including schema definitions (XSD and RELAX NG) and transformations
    Additional consideration will be given to individuals who possess the following competencies (in no specific order): Tibco Data Virtualization, SAP BW/HANA


English: B2 Upper Intermediate



Relocation package

If needed, we can help you with the relocation process.

Vacancy Specialization

ETL (Informatica, Ab Initio etc.)

Ref Number