Machine Learning Compiler Architect


Location: Mountain View, United States of America

Specialization: Software/System Architecture

Industry: Automotive Industry

Date: 11/02/2026

Req. VR-120910

Project description

Our customer is the software development organization of a top German OEM in the automotive industry. They are building the leading tech stack for the automotive industry and creating a unified software platform for over 10 million new vehicles per year. We're looking for talented, digital minds like you to help us create code that moves the world. Together with you, we'll build outstanding digital experiences and products for all customer brands that will transform mobility. Join us as we shape the future of the car and everyone around it.
The Chief Machine Learning Compiler Architect role, within the NPU Hardware & Software organization, is intended for an individual with a broad background in compiler development and architecture and significant experience with AI/ML hardware accelerators and advanced compilation technologies. In this role, you will design and develop the compiler architecture for our state-of-the-art Neural Processing Unit (NPU), optimizing and transforming machine learning models into efficient executable formats tailored to our specialized hardware. Additionally, you will lead research initiatives in advanced compilation techniques and drive the adoption of cutting-edge optimization strategies and compilation methodologies.

Responsibilities
Compiler Architecture & Design

- Design and develop a robust compiler architecture that effectively interacts with our NPU
- Implement advanced graph optimizations that incorporate both hardware-agnostic and hardware-specific enhancements
- Develop and optimize algorithms for tiling and memory management to efficiently utilize the NPU's resources
- Create sophisticated optimization passes for neural network inference and training workloads

Code Generation & Hardware Integration

- Map high-level operations to optimized library macros and convert them into hardware-level instructions
- Generate and manage DMA commands to facilitate data movement and operation within the hardware ecosystem
- Collaborate with hardware engineers and system architects to ensure seamless integration and maximal performance of the NPU
- Implement efficient scheduling and resource allocation algorithms for concurrent AI workload execution

Innovation & Technology Leadership

- Stay current with the latest trends and advancements in compiler technology and machine learning to continuously improve the compiler design
- Lead research initiatives in advanced compilation techniques for AI accelerators
- Drive adoption of cutting-edge optimization strategies and compilation methodologies
- Mentor engineering teams on compiler design principles and best practices

Skills

Must have

General Skills:

- Expert communicator across cultural and team boundaries
- Expertise in motivating teams and fostering a collaborative and productive environment
- Background in managing multiple and competing stakeholder interests; establishing trust, clear roles and responsibilities, and goodwill between partner engineering organizations
- Experience managing cross-functional and/or cross-team projects
- Technical leadership experience with the ability to mentor engineering teams
- Strategic thinking capabilities with a focus on long-term architectural decisions
- Ability to collaborate with multiple teams across geographies and time zones

Required Specialized Skills:

- 12+ years of experience in compiler development or architecture, particularly targeting AI/ML hardware accelerators
- Strong understanding of machine learning algorithms and their computational implications
- Working experience with TVM, IREE, XLA, MLIR, or LLVM
- Proficiency in programming languages such as C++ and Python
- Experience with graph optimization techniques and memory management strategies in compilers
- Demonstrated ability to translate high-level functional requirements into detailed technical designs
- Deep knowledge of hardware architecture principles and AI accelerator design concepts
- Proven track record of leading compiler architecture projects from concept to production deployment

Nice to have

Desired Skills:

- Prior experience with NPU hardware
- Knowledge of automotive industry standards and functional safety requirements
- Experience with neural network quantization and optimization techniques
- Background in high-performance computing and parallel processing architectures
- Publications or contributions to open-source compiler projects
- Experience with GenAI tools for accelerated engineering workflows and AI-assisted development practices
- Enthusiasm for adopting innovative AI-augmented development practices and continuous learning in rapidly evolving GenAI technologies

Other
Languages: English (C1 Advanced)

Seniority: Lead

