Data Modeler with SQL
Req. VR-117328
Support one of Australia's leading banks in modernizing their customer entitlements platform and Identity and Access Management. The current landscape includes multiple data sources, databases, and hundreds of Oracle Stored Procedures fronted by web services. The goal is to re-architect these into a single, unified platform. This is an engineering role (not a BA), requiring hands-on data modeling, manifest-driven mapping, and delivery of working solutions. Candidates are expected to be in the office 50% of the time (Sydney-based hybrid model).
Map the current and target states using a manifest-driven approach; propose and implement schema changes.
Design and evolve models across relational, graph, and NoSQL databases.
Refactor Oracle PL/SQL logic and deliver migrations, pipelines, and automated tests.
Build and optimise solutions on AWS (RDS, S3, Lambda, ECS, EKS, IAM, Neptune if applicable).
Automate validations and CI/CD steps using Python and Groovy.
Partner with IAM, API, and platform teams to ensure compliance, auditability, and secure entitlements.
Must have
Data modelling and SQL/PL/SQL
AWS data services and automation
Python and Groovy scripting
Graph DB (Neptune or Neo4j) and NoSQL databases
Ping products; manifest/schema and policy governance
IAM principles and audit compliance
Strong data modelling across conceptual, logical, and physical layers with clear artefacts (ER models, graph schemas, JSON/Avro/DDL)
Hands-on experience with SQL and PL/SQL for reading/refactoring stored procedures and writing performant queries
Practical experience with AWS services, including RDS, S3, Lambda, IAM/Secrets Manager, CloudWatch/CloudTrail, and Amazon Neptune.
Proficiency in Python for data engineering, validation, profiling, and reconciliation tasks, and in Groovy for scripting and pipeline integration (e.g., GitHub Actions)
Experience with manifest-driven mapping or schema registry patterns; ability to govern schema changes and versioning
Understanding of IAM models (RBAC/ABAC), PII handling, and audit/lineage requirements in regulated environments
Nice to have
Proficiency with Ping Identity products (IDM, Authorize, Directory, Federate)
Experience with Cypher (Neo4j), Gremlin (TinkerPop), or SPARQL for graph traversal and entitlements modeling
Exposure to Terraform or AWS CDK for infrastructure as code in data platforms
Performance tuning experience with Redshift or Aurora
Familiarity with metadata/lineage tooling and data contract frameworks
Banking domain experience, especially in entitlements and authorization
Knowledge of API/service integration patterns and schema registries
Languages
English: C1 Advanced
Seniority
Senior
Bengaluru, India
Data Modeling
BCM Industry
11/09/2025