hadoop admin architect in hyderabad

posted
location
hyderabad, telangana
function
Other
position type
temporary
experience
10-16 years
reference number
54562
contact
randstad india

job description

hadoop admin architect in hyderabad

Skills & Qualification: • Total 12+ years of Hadoop administration experience. Out of total experienceminimum of 6+ years of experience as Enterprise scale Hadoop Architect. • Experience with any of the Hadoop distributions. Hortonworks HDP isdesirable • Experience with installation, configuration, administration, HighAvailability and Security • Experience administering Hadoop applications such as HDFS, MapReduce, YARN,Hive, Pig, Oozie, Slider, Sqoop, Nifi, Airflow, Ranger, Griffin, Atlas, DatalQ. • Experience with Hadoop in public cloud such as AWS is a must. • Experience with managing Hadoop clusters using tools such as Ambari • Experience with AWS Services such as Elastic Compute Cloud (EC2), AutoScaling ,Elastic Load Balancing (ELB), CloudFormation (CF), Identity andAccess Management (IAM), CloudTrail, CloudWatch (CW), Simple Storage Service(S3) ,Elastic Block Store (EBS),Elastic File System (EFS). • Experience with Devops Environment. • Excellent problem solving and troubleshooting skills • Linux administration experience preferred. Certifications (Desirable) • HDP Certified Administrator or Any Hadoop Administration Certification • AWS Administrator Associate or up Representative Job Duties 1\. Installation a. Configure a local HDP repository b. Install ambari-server and ambari-agent c. Install HDP using the Ambari install wizard d. Add a new node to an existing cluster e. Decommission a node f. Add an HDP service to a cluster using Ambari 2\. Configuration a. Define and deploy a rack topology script b. Change the configuration of a service using Ambari c. Configure the Capacity Scheduler d. Configure the location of log files for services e. Create a home directory for a user and configure permissions f. Configure the include and exclude DataNode files 3\. Troubleshooting a. Restart an HDP service b. View an application's log file c. Configure and manage alerts d. Troubleshoot a failed job 4\. High Availability a. Configure NameNode HA b. Configure ResourceManager HA c. Copy data between two clusters using distcp d. Create a snapshot of an HDFS directory e. Recover a snapshot f. Configure HiveServer2 HA 5\. Security a. Install and configure Knox b. Install and configure Ranger c. Configure HDFS ACLs d. Configure Hadoop for Kerberos Optional Skills • Experience with Hadoop in-memory SQL like database technologies such asLLAP, Kognitio and Presto • Experience with other data technologies such as Incorta, Endeca, Tableau,Qliksense • Experience with ELT/ETL processes • Experience with Database, Amazon RDS (PostgreSQL, SQServer), Redshift • Experience with NoSQL technologies such as Dynamo DB or Mongo DB etc.

skills

Architect, Hadoop Admin, NiFi, Airflow