Senior/Expert AWS AI Engineer

Łódź, PL 93-281; Gdańsk, PL 80-309; Gdynia, PL 81-537; Warszawa, PL 02-460

Job ID: 30562

 

At Nordea, we’re committed to being a partner our customers and society can count on. Compliance and integrity go hand in hand. Joining us means you’ll have an impact on how we do banking – today and tomorrow. So, bring your ideas, skills and unique background. With us, you’ll be in good company with plenty of opportunities to collaborate, grow and make your mark on something bigger.

 

Financial Crime Prevention Technology (FCPT) adds value by providing a strong technology team with the right capabilities for the Group Financial Crime Prevention business teams.

 

The Data and Analytics team is a part of FCPT and is responsible for providing the tools and technology required to enable the business to leverage the value of our data through advanced analytics, generative and predictive AI.

 

About this opportunity

 

We are seeking an exceptionally skilled and visionary AWS Senior/Expert AI Engineer to spearhead our advanced DevOps, MLOps, and LLMOps initiatives. This is a senior technical role in which you will define and establish best practices and lead the implementation of robust, scalable, and secure operational pipelines for our software, machine learning models, and large language model applications across multiple high-impact project teams.

 

What you’ll be doing:

  • Cloud Infrastructure & Orchestration: Design, implement, and manage scalable and secure AWS-based infrastructure for AI/ML workloads, utilizing services like AWS Step Functions, EventBridge, Amazon Managed Workflows for Apache Airflow (MWAA), and AWS Lambda for workflow orchestration.
  • Data Processing & ETL: Develop, optimize, and maintain robust big data ETL and analytics pipelines using PySpark with Python and/or Spark with Scala on AWS Glue, Amazon EMR, and Amazon EKS.
  • Data Storage & Management: Implement efficient data storage solutions primarily on Amazon S3, ensuring data accessibility, security, and integrity for AI/ML applications.
  • Data Querying & Analysis: Utilize Amazon Athena for ad-hoc querying and analysis of large datasets stored in S3, supporting data exploration and model development.
  • Hadoop Ecosystem Integration: Leverage expertise in the Hadoop ecosystem (Hive, Impala, Sqoop, HDFS, Oozie) for managing and processing large-scale datasets.
  • Programming & Data Transformation: Apply strong Python programming skills, including extensive experience with Pandas DataFrame transformations, for data manipulation and analysis.
  • Deployment & MLOps: Implement and maintain Continuous Integration (CI) and Continuous Delivery (CD) pipelines using Jenkins, and manage infrastructure as code (IaC) with Terraform for automated deployment of AI/ML solutions.
  • Performance Optimization: Continuously monitor, evaluate, and optimize the performance, cost-efficiency, and reliability of deployed AI/ML infrastructure and data pipelines.
  • Collaboration: Work closely with data scientists, product managers, and business stakeholders to translate requirements into scalable technical solutions.
  • Code Quality: Write clean, well-documented, and testable code, adhering to best practices in software development, MLOps, and cloud engineering.
  • Mentorship (Senior/Expert): Mentor junior engineers, share knowledge, and contribute to the overall growth and technical excellence of the team.

 

Who you are

 

Collaboration. Ownership. Passion. Courage. These are the values that guide us in being at our best – and that we imagine you share with us.

 

To succeed in this role, you should have:

  • Education: Bachelor's or Master's degree in Computer Science, Software Engineering, Data Engineering, or a related technical field.


Experience:

  • Senior: 5+ years of professional experience in data engineering, cloud infrastructure, or AI/ML engineering, with a strong focus on AWS.
  • Expert: 8+ years of professional experience, including leading complex data/AI infrastructure projects and significant contributions to production systems on AWS.
  • AWS Expertise (Must Have):
      • Orchestration Services: Hands-on experience with AWS Step Functions, EventBridge, Amazon Managed Workflows for Apache Airflow (MWAA), and AWS Lambda.
      • Processing Services: Proven experience with AWS Glue, Amazon EMR, and Amazon EKS.
      • Storage Service: Expert knowledge of Amazon S3.
      • Querying & Analysis: Experience with Amazon Athena.
  • Big Data & ETL:
      • Extensive experience with big data ETL and analytics development using PySpark with Python and/or Spark with Scala.
      • Working experience with the Hadoop ecosystem, including Hive, Impala, Sqoop, HDFS, and Oozie.

 

Programming:

  • Expert proficiency in Python, including extensive experience with Pandas DataFrame transformations.
  • Proficiency in Scala for Spark development (highly preferred).
  • CI/CD & IaC:
      • Strong knowledge of Continuous Integration (CI) and Continuous Delivery (CD) pipelines using Jenkins.
      • Experience with Infrastructure as Code (IaC) using Terraform.
  • Problem-Solving: Excellent analytical and problem-solving skills, with the ability to design and implement robust, scalable data and AI solutions.
  • Communication: Strong communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.

 

Preferred Skills & Qualifications:

  • Experience with MLflow for ML lifecycle management.
  • Hands-on experience with Amazon SageMaker and/or Amazon Bedrock for model training and deployment.
  • Familiarity with other AWS AI/ML services (e.g., Amazon Rekognition, Amazon Comprehend).
  • Experience with real-time data processing and streaming technologies (e.g., Apache Kafka).
  • Certifications such as AWS Certified Solutions Architect, AWS Certified Data Analytics, or AWS Certified Machine Learning.

 

If this sounds like you, get in touch!

 

Next steps

 

Submit your application no later than 31/10/2025.

 

At Nordea, we know that an inclusive workplace is a sustainable workplace. We deeply believe that our diverse backgrounds, experiences, characteristics and traits make us better at serving our customers and communities. So please come as you are.

 

Please include the following consent to the processing of your personal data in your CV:

 

In accordance with art. 6(1)(a) and (b) of Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), hereinafter ‘GDPR’, I agree to have my personal data, education and employment history processed for the purposes of current and future recruitment processes in Nordea Bank Abp.

 

The administrator of your personal data is Nordea Bank Abp, operating in Poland through its Branch, address: Aleja Edwarda Rydza-Śmigłego 20, 93-281 Łódź. Your personal data will be processed for the recruitment processes in Nordea Bank Abp. You have the right to access your personal data, the right to rectification and the right to erasure. Disclosing personal data in the scope specified by the provisions of the Polish Labour Code of 26 June 1974 and its executive acts is mandatory. Providing personal data is necessary to conduct the recruitment processes. A request for the deletion of your personal data means resignation from further participation in recruitment processes and causes the immediate removal of your application. Detailed information concerning the processing of your personal data can be found at: https://www.nordea.com/en/doc/nordea-privacy-policy-for-applicants.pdf.

 

We reserve the right to reply only to selected applications.

Department: IT/Technology

Learn more about us

How we recruit

Who we are

Sustainability in Nordea

Our purpose and values