VMC Soft Technologies offers IT Services, Product Development Solutions, and US IT RPO/Staffing
About the Company
VMC Soft Technologies is a fast-growing full-service IT company specializing in IT services, product development solutions, training, IT recruitment, RPO, staffing, and business process outsourcing. The company focuses on providing innovative and cost-effective solutions by first understanding the unique needs of its clients. VMC Soft Technologies offers customized delivery methods, bringing in the right people to help organizations achieve their goals.
About the Role
A leading technology company is seeking a Hadoop Developer with over 5 years of experience in data warehousing and a minimum of 4 years in Big Data, specifically on Cloudera. The ideal candidate will have hands-on experience with the Hadoop ecosystem, including Hive and Spark, along with the Python and Scala languages. A strong background in Python and Unix shell scripting is required, as is familiarity with scheduling tools such as Autosys. Experience with Agile methodologies, SQL, and relational databases is essential. The role involves coordinating technology solutions across multiple teams, managing risks, and ensuring project deliverables meet client requirements.
Responsibilities
- Apply knowledge of the Hadoop ecosystem to solve relevant business problems.
- Work with Big Data technologies like Hadoop, Hive, Spark, Python, and Scala in production environments.
- Use Unix shell scripting to automate tasks and improve processes.
- Optimize SQL queries for performance and troubleshoot critical software issues.
- Coordinate delivery across technology teams and facilitate communication with stakeholders.
- Manage dependencies, risks, and impediments to ensure timely project delivery.
- Conduct status reviews, change controls, and meetings to track project progress.
- Ensure execution aligns with deliverable requirements and manage financial commitments.
- Promote collaboration across teams and articulate clear updates on deliverables.
Required Skills
- 5+ years of experience with data warehousing architectural approaches.
- Minimum of 4 years of hands-on experience with the Hadoop ecosystem (Cloudera).
- Strong proficiency in Big Data technologies like Hadoop, Hive, Spark, Python, and Scala.
- Experience with Unix shell scripting and scheduling tools like Autosys.
- In-depth knowledge of SQL and relational databases.
- Experience with Agile methodologies and distributed systems.
- Excellent problem-solving and performance tuning skills.
- Strong communication skills and ability to work collaboratively with teams.