Globex is hiring a Hadoop Administrator
We are looking for a Hadoop Admin with hands-on experience in installation, configuration, debugging, tuning and administration. This role is responsible for installing, configuring and maintaining the various components of the Hadoop ecosystem and preserving their integrity.
In this position you will apply your skills to keep the landscape consistent and stable, and to manage, test and deliver client tools and solutions. You will continually evaluate industry trends, establish best practices and optimize the BI and advanced analytics environments to deploy analytics solutions at scale. You will work closely with leadership teams across the organization to establish the foundation for creating value in diverse business functions such as supply chain, pricing, Industrial IoT and digital factory implementation, and will impact business units spanning multiple geographic regions.
Install and configure various components of the Hadoop ecosystem and maintain their integrity
Deploy and maintain a Hadoop cluster, adding and removing nodes using cluster monitoring tools like Ganglia, Nagios or Cloudera Manager
Work closely with the database team, network team, BI team and application teams to make sure that all the big data applications are highly available and performing as expected.
Manage all aspects of our AWS infrastructure (compute, storage, network, permissions, cost) using configuration management tools like Ansible, CloudFormation and shell scripts
Assist in designing, automating, implementing and sustaining Amazon Machine Images (AMIs) across the cloud environment
Configure S3 buckets with lifecycle policies to archive infrequently accessed data to cheaper storage classes as required
Plan capacity and estimate the requirements for scaling the Hadoop cluster up or down
Monitor the cluster connectivity and performance.
Backup and recovery tasks
Resource and security management
Troubleshoot application errors and ensure they do not recur
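The S3 lifecycle responsibility above amounts to defining transition rules per prefix. A minimal sketch of what such a rule looks like, in the dict shape boto3's `put_bucket_lifecycle_configuration` expects; the prefix and day thresholds are illustrative assumptions, not values from this posting:

```python
# Hypothetical sketch: build an S3 lifecycle rule that moves
# infrequently accessed data to cheaper storage classes.
def archive_lifecycle_rule(prefix, ia_days=30, glacier_days=90):
    """Return one lifecycle rule dict for the given key prefix:
    transition to STANDARD_IA after ia_days, then GLACIER after
    glacier_days."""
    return {
        "ID": f"archive-{prefix.strip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": ia_days, "StorageClass": "STANDARD_IA"},
            {"Days": glacier_days, "StorageClass": "GLACIER"},
        ],
    }

rule = archive_lifecycle_rule("logs/", ia_days=30, glacier_days=90)
# With boto3 the rule would be applied roughly as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="example-bucket",  # hypothetical bucket name
#       LifecycleConfiguration={"Rules": [rule]})
```

The same rule can equally be attached through the S3 console or CloudFormation; the dict structure is what carries over.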
You Must Have:
Bachelor's degree in computer science, information technology, data science, data analytics, interactive media, or related field
4+ years of experience in Data warehousing or similar analytic data experience
2+ years of experience creating and managing complex data architectures.
3+ years of experience in database design, development and data modeling.
2+ years of experience working as a Hadoop Admin
Hands-on experience with Cloudera installation, configuration, debugging, tuning and administration.
Must have prior experience deploying a Cloudera Hadoop cluster from scratch.
Hands-on Cloudera experience working with data delivery teams to set up new Hadoop users, including creating Linux users, setting up Kerberos principals and testing HDFS and Hive access.
Competency in Red Hat Linux administration (security, configuration, tuning, troubleshooting and monitoring).
Expert knowledge on Active Directory/LDAP security integration with Cloudera Big Data platform.
Performance tuning experience with Cloudera clusters, Spark (PySpark, R) and MapReduce routines.
Experience optimizing clusters for future workloads.
Experience with VPC, Subnets, and Route tables
Hands-on experience with node management, monitoring and response, support process creation, upgrades and patches, logging configuration, and managing user rights and space quotas.
Working knowledge of Networks, Linux OS and Unix Shell Scripting.
Experience working on Agile projects and Agile methodology in general
Hands on experience with Cloudera/HDP/Apache distributions
Hands-on experience deploying Big Data clusters involving Hadoop (HDFS/YARN/Hive), HBase, Kafka, Solr, Spark, ZooKeeper and Oozie using Ambari/Cloudera Manager
Good understanding of Hadoop benchmarks and other performance benchmarks for optimal configuration
Good understanding of OS parameters, network topologies, storage options for optimal Hadoop cluster set up
Hands-on experience with at least one RDBMS (MySQL/PostgreSQL): ability to install one of these databases as the metastore database, create schemas, populate data, and import/export
Good understanding of and experience configuring HA, Kerberos, and encryption using Sentry, Ranger and KMS
Fluent in at least one scripting language (Shell/Perl/Python etc.)
Good knowledge of Linux as Hadoop runs on Linux
Ability to debug cluster issues, job failures, performance issues etc.
Exposure to cloud technologies like EC2/Azure/VMware will be an added advantage
Hands-on experience with monitoring tools like Nagios, Ganglia, etc.
Strong verbal and written communication skills are mandatory
Excellent analytical and problem-solving skills are mandatory
Solid troubleshooting abilities and able to work with a team to fix large production issues
Hadoop and AWS certifications are a plus
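The scripting fluency asked for above (Shell/Perl/Python) is typically exercised against cluster tooling. A minimal sketch of the kind of shell scripting involved, parsing a canned sample of `hdfs dfsadmin -report` output rather than querying a live cluster; the byte figures are illustrative assumptions:

```shell
# Hypothetical sketch: compute HDFS usage from dfsadmin-style output.
# $report stands in for `hdfs dfsadmin -report` on a real cluster.
report='Configured Capacity: 1000000000000 (1 TB)
DFS Used: 250000000000 (250 GB)
DFS Remaining: 750000000000 (750 GB)'

# Pull the raw byte counts out of the matching lines.
capacity=$(echo "$report" | awk -F': ' '/Configured Capacity/ {print $2}' | awk '{print $1}')
used=$(echo "$report" | awk -F': ' '/DFS Used/ {print $2}' | awk '{print $1}')

# Integer percentage of configured capacity in use.
pct=$(( used * 100 / capacity ))
echo "HDFS usage: ${pct}%"
```

A cron-driven variant of the same parse is a common way to feed usage alerts into Nagios or Ganglia checks.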
Salary: INR 7,00,000 - 13,00,000 PA.
Industry: IT-Software / Software Services
Functional Area: IT Software - Other
Red Hat Linux, Shell Scripting, Cloudera, Hadoop, Big Data, MapReduce, HDFS, YARN, HBase, Hive, Oozie, Spark, AWS
Desired Candidate Profile
Please refer to the Job description above
Globex Digital Solutions Pvt. Ltd
Globex Digital was started in 2009 as an IT services organization and has grown into a staffing & IT solutions company.
Our aim is simple: to bridge the gap between evolving technologies and resource availability. As a customer-centric company, we keep everything simple and transparent. We apply strategic thinking and practicality to each of our assignments, with the goal of delivering success for our clients and resources.
Contact Company: Globex Digital Solutions Pvt. Ltd