Developer - Hadoop, Teradata, Python
Req. VR-117926
Our client, a leading bank in Asia with a global network of more than 500 branches and offices in 19 countries and territories across Asia Pacific, Europe, and North America, is looking for consultants to be part of the project.
The Technology and Operations function comprises five teams of specialists with distinct capabilities: business partnership, technology, operations, risk governance, and planning support and services. They work closely together to harness the power of technology to support the bank's physical and digital banking services and operations. This includes developing, centralising, and standardising technology systems as well as banking operations in Malaysia and overseas branches.
The client has more than 80 years of history in the banking industry and is expanding its footprint in Malaysia. You will be working in a newly set-up technology centre located in Kuala Lumpur as part of Technology and Operations to deliver innovative financial technology solutions that enable business growth and technology transformation.
Design, develop, and maintain data pipelines and ETL workflows using Informatica Data Integration Suite, Python, and R.
Build and optimize large-scale data processing systems on Cloudera Hadoop (6.x) and Teradata IntelliFlex platforms.
Implement data ingestion, transformation, and storage solutions integrating diverse data sources, including Oracle, SQL Server, PostgreSQL, and AS400.
Develop and deploy dashboards and analytics solutions using QlikSense, Microsoft Power BI, and other visualization tools.
Collaborate with business teams to deliver analytics and decision-support solutions across domains such as Credit Risk Analytics, Credit Scoring, Treasury & Wealth Management, and Trade Finance.
Leverage data science tools (Python, R Studio, Kafka, Spark) to support predictive modeling, scoring, and advanced analytics use cases.
Participate in code reviews, performance tuning, and data quality validation using tools such as QuerySurge, SonarQube, and JIRA.
Automate workflows, deployments, and job scheduling using Jenkins, Control-M, and Bitbucket.
Ensure scalability, security, and governance of data solutions in production environments across Linux, AIX, Windows, and AS400 platforms.
Must have
3 to 5 years' experience in Big Data & Data Engineering: Cloudera Hadoop (6.x), Spark, Hive, HUE, Impala, Kafka
ETL & Data Integration: Informatica (BDM, IDQ, IDL), QuerySurge
Databases: Teradata IntelliFlex, Oracle, SQL Server, PostgreSQL
Data Visualization: QlikSense Discovery, Microsoft Power BI
Programming & Analytics: Python, R, R Studio
Version Control & Automation: Jenkins, Bitbucket, Control-M
OS: AS400, AIX, Linux, Windows
Domain Knowledge:
Minimum 1 of the following:
Credit Risk Analytics
Credit Scoring & Decision Support
Treasury & Wealth Management (Murex)
Trade Finance & Accounts Receivable (FITAS, ARF)
Retail Banking & Cards (Silver Lake)
Data Modeling (FSLDM / Data Marts)
Nice to have
AS400, Experian PowerCurve, SAS
Languages
English: C1 Advanced
Seniority
Regular
Kuala Lumpur, Malaysia
Big Data Development
BCM Industry
01/10/2025