ETL Developer (Hadoop)

RM10,000 - RM15,000 monthly
  • GMA Tech Consulting
  • Jalan Ampang, Kuala Lumpur City Centre, Kuala Lumpur, Federal Territory of Kuala Lumpur, Malaysia
  • Jan 17, 2019
Full time • Software Developers/Architects • Applications Systems/Business and Data Analysts • Data Warehousing Specialists

Job Description

Job Purpose

  • To support development and maintenance of Hadoop solutions for the Enterprise.
  • To be part of initiatives that bring data into the data lake and deliver insights.
  • To perform production support and maintenance of existing datasets in Hadoop.

The Job 

  • Develop ETL/ELT jobs, based on requirements, to move data from source systems to the data lake platform.
  • Ensure that all development standards are followed.
  • Perform code reviews of application programs developed by team members.
  • Provide production and operational support after deployment.
  • Monitor and manage production jobs to verify execution, measure performance, and assure ongoing data quality; optimize the system for scalability and performance, and identify improvement opportunities in key ETL processes.
  • Work effectively with all technical personnel (development team, business analysts, security, risk and compliance, data center, project managers, data architects, and testers) and clearly translate business priorities and objectives into technical solutions.

Our Requirements

  • A Bachelor’s degree in IT
  • A minimum of 8 years’ experience in data-related work on data warehouses/data marts, with at least 2 years of Hadoop/ETL experience (data processing and scripting).
  • Ability to plan and organize technical work and deliverables. Ability to follow guidelines and adhere to the established software development standards and conventions.
  • Self-motivated and independent. Able to work with minimum supervision and to work well with stakeholders and project staff.
  • Ability to prioritize and multi-task across numerous work streams.
  • Strong interpersonal skills; ability to work on cross-functional teams. Strong verbal and written communication skills.
  • Deep knowledge of best practices gained through relevant experience across data-related disciplines and technologies, particularly enterprise-wide data architectures and data warehousing/BI.
  • Demonstrated problem-solving skills. Ability to learn effectively and meet deadlines.
  • Strong scripting skills in a Linux environment and strong SQL skills.
  • Expertise in the Hadoop ecosystem, particularly HDFS (Hortonworks).
  • Hands-on experience with Sqoop, Hive, HBase, Spark, Oozie, Python, and Scala is a must.

Years of experience

5 - 10 years, 10 years and above

Required Languages

English  

Do you accept foreigners?

Yes

Required Skills

Hadoop, Sqoop, Hive, HBase, Spark, Oozie, Python, Scala