Senior Big Data Developer



Project Description

PRG (Portfolio Reporting Group) is a strategic programme replacing an old application with a new platform built on new technologies. The team takes data feeds from 120+ front-office data sources, aggregates them, and ensures the data is reported correctly to a number of regulatory bodies and stock exchanges as required. The team leverages the EAP (Enterprise Analytics Platform) to build the application. The MVP went live in December, with major feed onboarding happening over the next two years.


    The person will help the larger team onboard feeds faster and will also work with the EAP team. Hence, experience and expertise in Big Data technologies with Python, PySpark and Spark SQL are required.


Must have

    • Technical
    o Fundamentals of Spark using the DataFrame API
    o Understanding of data partitioning
    o Analysing and performance-tuning Spark queries, e.g. by examining the DAG
    o Knowledge of Hadoop and its ecosystem of technologies, especially Hive
    o Python
    o OOP concepts using Python
    o Knowledge of Conditional Statements & Loops: If-else Control Structures, For/While Loops
    o Demonstrate a comprehensive understanding of Complex Data Types: Shallow & Deep Copies, Working with Lists & Tuples, Dictionaries & Sets
    o Understand Fundamental Data Structures & their Implementation
    o Good knowledge of Exceptions & Command Line Arguments
    o Contributes to quality assurance by writing unit and functional tests.
    o Ensures development of all software components in accordance with the Detailed Software Requirements specification, the functional design and the technical design document.
    o Basic knowledge of UNIX
    o Demonstrate source control knowledge (preferably GIT)
    o Ability to analyse databases directly using query language tools such as SQL
    o Experience on ETL process on Big Data
    o Have an understanding of data relationships, normalisation
    • Non-Technical
    o Use of JIRA / Confluence
    o Appreciation of release management and software maintenance
    o Provides Level 3 support
    o Contributes to problem and root cause analysis.
    o Collaborates with colleagues participating in other stages of the Software Development Lifecycle (SDLC).
    o Strong analytical skills.
    o Good understanding of software architecture
    o Ability to work in virtual teams
    o Excellent team player with an open-minded approach
    o Ability to share information and transfer knowledge and expertise to team members.
    o Ability to design and write code in accordance with provided business requirements
    o Ability to work in a fast-paced environment with competing and alternating priorities, with a constant focus on delivery.
    o Strong desire to learn new technologies and implement solutions in a fast-paced environment.

Nice to have

    o Knowledge of Control-M
    o Hands-on experience with TeamCity
    o Understanding of financial products
    o Ability to interpret and write complex SQL queries
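As a flavour of the SQL skills listed above, a small self-contained sketch using Python's built-in sqlite3 module (table and column names are hypothetical, not from the project):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE feeds (id INTEGER PRIMARY KEY, source TEXT, records INTEGER);
INSERT INTO feeds VALUES (1, 'FO-A', 120), (2, 'FO-A', 80), (3, 'FO-B', 50);
""")

# Aggregate record counts per source and keep only sources above a threshold.
cur.execute("""
    SELECT source, SUM(records) AS total
    FROM feeds
    GROUP BY source
    HAVING SUM(records) > 100
    ORDER BY total DESC
""")
rows = cur.fetchall()
print(rows)  # [('FO-A', 200)]
```

The same GROUP BY / HAVING pattern carries over directly to Spark SQL and Hive.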


English: C1 Advanced

Relocation package

If needed, we can help you with the relocation process.

Work Type

BigData (Hadoop etc.)
