Hadoop and Spark Lead Developer

  • Full Time
  • Plano
Infosys

Transforming Enterprises To Become A Thriving Live Enterprise. AI-Powered. Digital Agility At Scale. Always-On Learning.

About the Company

Infosys is a global leader in next-generation digital services and consulting, helping clients in over 50 countries navigate their digital transformation. With more than 30 years of experience, Infosys guides enterprises through complex digital journeys, empowering them with an AI-driven core to execute change effectively and deliver agile digital solutions at scale, driving performance and customer satisfaction. Infosys’ commitment to continuous learning and innovation ensures the ongoing development of digital skills, expertise, and ideas, fostering continuous improvement within its client organizations.

About the Role

Infosys is seeking a Hadoop and Spark Lead Developer to drive digital transformation for clients in a global delivery model. In this role, you will research and apply emerging technologies, recommend the best solutions, and contribute to technology best practices. You will collaborate with key stakeholders and contribute across various stages of the Software Development Life Cycle. Join a culture of learning, teamwork, and diversity, where excellence is celebrated.

Key Responsibilities

  • Lead the end-to-end implementation of data warehouses and data marts.

  • Research technologies independently and propose suitable solutions for digital transformation.

  • Interface with stakeholders to gather requirements and ensure the alignment of technology solutions.

  • Contribute to technology-specific best practices and development standards.

  • Develop solutions using Hadoop, Spark, Scala, and Python in a global delivery environment.

  • Ensure effective data integration, data quality, and data architecture management.

  • Work on relational and dimensional modeling, as well as unstructured data modeling.

  • Participate in Agile software development processes.

  • Collaborate with multi-stakeholder teams, including Business and Technology.
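To make the data-warehouse responsibilities above concrete, here is a minimal, hypothetical sketch of a star-schema fact/dimension join and aggregation, the core pattern behind data mart development. All table and column names are invented for illustration; in a production setting this logic would typically run on Spark DataFrames over partitioned storage rather than in-memory Python structures.

```python
# Hypothetical star-schema sketch: a sales fact table enriched from a
# customer dimension, mimicking the fact/dimension join a data mart performs.

# Dimension table: surrogate key -> descriptive attributes.
dim_customer = {
    101: {"name": "Acme Corp", "segment": "Enterprise"},
    102: {"name": "Beta LLC", "segment": "SMB"},
}

# Fact table: one row per transaction, keyed by the dimension's surrogate key.
fact_sales = [
    {"customer_key": 101, "amount": 2500.0},
    {"customer_key": 102, "amount": 300.0},
    {"customer_key": 101, "amount": 1200.0},
]

def enrich_and_aggregate(facts, dim):
    """Join facts to the dimension and total sales per customer segment."""
    totals = {}
    for row in facts:
        segment = dim[row["customer_key"]]["segment"]
        totals[segment] = totals.get(segment, 0.0) + row["amount"]
    return totals

print(enrich_and_aggregate(fact_sales, dim_customer))
# → {'Enterprise': 3700.0, 'SMB': 300.0}
```

In a Spark job, the same shape would be expressed with `DataFrame.join` against the dimension followed by `groupBy("segment")` and a sum aggregation.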

Required Qualifications

  • Minimum of 4 years of experience in Information Technology.

  • At least 3 years of experience with Hadoop, Spark, and Scala/Python.

  • Proven experience in the end-to-end implementation of data warehouses and data marts.

  • Strong knowledge of SQL and Unix shell scripting.

  • In-depth understanding of data integration, data quality, and data architecture.

  • Experience with relational modeling, dimensional modeling, and unstructured data modeling.

  • Familiarity with Agile software development methodologies.

  • Experience in the banking domain is preferred.

  • Strong communication and analytical skills.

  • Ability to work in a collaborative, diverse team environment with both business and technology stakeholders.

  • Experience working in a global delivery environment.

Complete details about this role can be found on the official website.

Copyright © 2025 hadoop-jobs. All Rights Reserved.