Hadoop Developer

Unison Consulting

  • North East Region, Singapore
  • Permanent
  • Full-time
  • 1 month ago
Job Description:

We are seeking a skilled and experienced Hadoop Developer to join our team. As a Hadoop Developer at Unison Consulting, you will be responsible for designing, developing, and maintaining our Hadoop-based data processing and analytics solutions. You will work with a team of talented data engineers and data scientists to build efficient and scalable data pipelines, leveraging the power of Hadoop technologies.

Responsibilities:
  • Hadoop Development: Design, develop, and maintain Hadoop-based data processing applications, workflows, and pipelines.
  • Big Data Processing: Leverage Hadoop technologies such as HDFS, MapReduce, Spark, Hive, and HBase for batch and real-time data processing.
  • Data Ingestion: Develop and optimize data ingestion processes, ensuring data is collected, transformed, and loaded into the Hadoop cluster efficiently.
  • Performance Optimization: Monitor and optimize the performance of Hadoop jobs and data pipelines to ensure scalability and efficiency.
  • Data Transformation: Transform and prepare data for analysis, ensuring data quality, integrity, and compatibility with downstream applications.
  • Troubleshooting: Identify and resolve data processing issues in a timely manner.
  • Collaboration: Work closely with data engineers, data scientists, and other cross-functional teams to understand data requirements and deliver effective solutions.
  • Documentation: Create and maintain comprehensive documentation of Hadoop configurations and data processing workflows.
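The ingestion and transformation duties above follow a standard extract-transform-load pattern. As a rough illustration only (this sketch is not part of the role description; the record fields and cleaning rule are hypothetical, and no Hadoop dependencies are used):

```python
# Minimal ETL sketch: extract raw records, transform (clean/validate), load.
# Field names ("user_id", "amount") and the positive-amount rule are
# hypothetical placeholders, not requirements from the posting.

def extract(raw_lines):
    """Parse comma-separated lines into records, skipping blank lines."""
    records = []
    for line in raw_lines:
        line = line.strip()
        if not line:
            continue
        user_id, amount = line.split(",")
        records.append({"user_id": user_id, "amount": float(amount)})
    return records

def transform(records):
    """Drop invalid rows: keep only records with a positive amount."""
    return [r for r in records if r["amount"] > 0]

def load(records, sink):
    """Append transformed records to the target sink (here, a list)."""
    sink.extend(records)
    return len(records)

# Usage: run the three stages end to end.
raw = ["u1,10.5", "", "u2,-3.0", "u3,7.25"]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

In a production Hadoop pipeline the same stages would typically be expressed with Spark or Hive jobs reading from and writing to HDFS, but the extract/transform/load structure is the same.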
Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Hadoop Developer with strong knowledge of Hadoop ecosystem components (HDFS, MapReduce, Spark, Hive, HBase).
  • Proficiency in Java, Scala, or Python programming for building data processing applications.
  • Hands-on experience with data ingestion, transformation, and ETL processes using Hadoop.
  • Familiarity with data processing tools and frameworks such as Pig, Sqoop, and Flume.
  • Strong knowledge of SQL and experience with relational databases.
  • Experience with performance tuning and optimization of Hadoop applications.
  • Good understanding of data warehousing concepts and methodologies.
  • Excellent problem-solving skills and attention to detail.
  • Effective communication skills to collaborate with team members and stakeholders.
  • Ability to work in a fast-paced, dynamic environment and adapt to changing technologies and project requirements.
  • Experience with cloud platforms and technologies is a plus.
