Senior IT Developer (with Hadoop)

emagine


About the Company

Emagine is a high-end business and IT consulting company that delivers expert capacity to help businesses solve challenges and drive progress. With a network of over 40,000 experts, Emagine helps organizations scale and adapt to an evolving technological landscape, offering tailored services such as team extensions, nearshoring, and managed services to clients across industries. Founded in 1989, the company has a proven track record of delivering results for blue-chip companies across Europe; today it employs 900+ permanent staff and operates three nearshore centers in Poland, providing scalable, high-quality expertise to clients worldwide.

About the Role

A Senior IT Developer is needed to design and implement a global reporting solution for a leading bank. The role centers on enhancing the bank's reporting capabilities with Scala, Spark, and Hadoop, supporting the bank's ongoing technological transformation. The developer will work in an Agile team following the SAFe framework and collaborate with cross-functional colleagues, including Product Owners, Solution Architects, Analysts, Testers, and other Developers.

Key Responsibilities

  • Design and implement a scalable, high-performance global reporting solution for the bank.

  • Consolidate all account and payment transaction data into Hadoop.

  • Develop reporting solutions that manage billions of transaction records for 10 million customers (see the Spark/Scala sketch after this list).

  • Collaborate with cross-functional teams to ensure successful project execution.

  • Follow Agile methodologies, working in a SAFe framework.

  • Continuously improve the performance and scalability of the reporting solution.
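
To give a concrete sense of the work, below is a minimal sketch of such a batch reporting job in Spark and Scala. It is an illustration only: the table name (transactions), the column names (account_id, amount, booking_date), and the output table are hypothetical placeholders, not the bank's actual schema.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // Hypothetical batch job: aggregate one day of transactions per account.
    // All table and column names below are illustrative placeholders.
    object DailyReportJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("daily-transaction-report")
          .enableHiveSupport() // source data lives in Hive tables on Hadoop
          .getOrCreate()

        val bookingDate = args(0) // e.g. "2025-01-31"

        // Per-account transaction count and total for the given booking day.
        val report = spark.table("transactions")
          .where(col("booking_date") === bookingDate)
          .groupBy(col("account_id"))
          .agg(
            count(lit(1)).as("txn_count"),
            sum(col("amount")).as("total_amount")
          )

        // Partition the output by day so downstream readers can prune.
        report
          .withColumn("booking_date", lit(bookingDate))
          .write
          .mode("overwrite")
          .partitionBy("booking_date")
          .saveAsTable("daily_account_report")

        spark.stop()
      }
    }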

Required Skills

  • 3+ years of experience in functional programming using Scala.

  • Proficiency in writing Spark-based applications in Scala.

  • Experience with the Hadoop technology stack: Hive, Oozie, Kafka (an ingestion sketch follows this list).

  • Familiarity with containerized technologies such as Docker and Kubernetes.
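
As an illustration of how those pieces typically fit together, the sketch below ingests payment events from Kafka into Hadoop with Spark Structured Streaming. The broker address, topic name, and HDFS paths are assumed placeholders, not details taken from the role.

    import org.apache.spark.sql.SparkSession

    // Sketch: land payment events from Kafka on HDFS as Parquet, where a
    // Hive external table can sit on top. Broker, topic, and paths are
    // placeholders.
    object PaymentIngestJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("payment-kafka-ingest")
          .getOrCreate()

        // Read the raw event stream from Kafka.
        val events = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "payments")
          .load()
          .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)", "timestamp")

        // Write continuously to HDFS; the checkpoint gives exactly-once sinks.
        val query = events.writeStream
          .format("parquet")
          .option("path", "hdfs:///data/raw/payments")
          .option("checkpointLocation", "hdfs:///checkpoints/payments")
          .start()

        query.awaitTermination()
      }
    }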

Nice to Have

  • Deep understanding of distributed systems.

  • Experience in data engineering and building ETL/ELT pipelines.

  • Knowledge of performance tuning for Hadoop/Spark solutions (a brief illustration follows this list).

  • Experience developing RESTful services and web applications (Bootstrap, ReactJS).

  • Familiarity with Agile methodologies.
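
For the performance-tuning point above, here is a brief illustration of common Spark tuning knobs. The values shown are assumptions to be sized against real data volumes, not recommendations for this role.

    import org.apache.spark.sql.SparkSession

    // Illustrative tuning knobs for a Spark-on-Hadoop reporting job.
    // Every value below is a placeholder, not a measured recommendation.
    object TunedSessionExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("tuned-reporting-job")
          .config("spark.sql.shuffle.partitions", "800")          // match shuffle width to data size
          .config("spark.sql.adaptive.enabled", "true")           // let AQE coalesce skewed partitions
          .config("spark.sql.autoBroadcastJoinThreshold", "64MB") // broadcast small dimension tables
          .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
          .enableHiveSupport()
          .getOrCreate()

        // ... run the reporting job with the tuned session ...
        spark.stop()
      }
    }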

