Required: candidates with more than 14 years of experience in Big Data administration. The successful candidate will drive the implementation and ongoing support of the Big Data platforms, managing a large-scale data lake with X petabytes of storage spanning all domains of the bank, hundreds of users, and a daily ingestion rate of X terabytes of new data. The candidate should also have strong expertise in the DevOps technology stack, including CI/CD and build tools.
Responsibilities:
• Manage Big Data platforms, with responsibilities including but not limited to deployments, upgrades, backup, recovery, resource management, performance tuning, capacity planning, and security.
• Apply best practices related to security, performance, and disaster recovery.
• Open and coordinate service requests with the relevant vendor support teams. Ensure alignment with current IT standards and controls, including SDLC and ITIL processes (Incident, Problem, and Change management).
• Allocate and manage compute, memory, storage, and NameNode object quotas for individual resource pools and user groups.
• Define and set up ACL policies in Apache Ranger.
• Maintain and manage high-availability systems.
• Identify opportunities for process improvement, efficiency, and quality.
• Manage reviews and audits of the platforms with respect to security and financial regulatory standards.
• Translate business requirements and logical designs into working physical database designs.
• Identify the scope for automation and implement it as and when required.
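The capacity-planning and NameNode-sizing responsibilities above are often sanity-checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming HDFS's default 3x replication and the widely cited rule of thumb of roughly 150 bytes of NameNode heap per filesystem object; all figures here are illustrative assumptions, not numbers from this posting:

```python
# Back-of-the-envelope capacity planning for an HDFS-based data lake.
# Assumptions (not from the posting): 3x replication, ~25% overhead for
# temp/compaction space, ~150 bytes of NameNode heap per filesystem object.

def raw_storage_tb(daily_ingest_tb: float, replication: int = 3,
                   days: int = 365, overhead: float = 1.25) -> float:
    """Raw disk capacity (TB) needed for one year of ingest."""
    return daily_ingest_tb * replication * days * overhead

def namenode_heap_gb(num_objects: int, bytes_per_object: int = 150) -> float:
    """Rough NameNode heap (GB) for files + directories + blocks."""
    return num_objects * bytes_per_object / 1e9

# Example: 5 TB/day ingest and 400 million filesystem objects.
print(f"Raw storage for one year: {raw_storage_tb(5.0):.0f} TB")
print(f"NameNode heap estimate:   {namenode_heap_gb(400_000_000):.0f} GB")
```

The point of such a sketch is to catch order-of-magnitude problems (e.g., a small-files workload blowing up the NameNode heap) before they surface in production.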
Tech Skills:
• A deep understanding of the Hadoop ecosystem: its architecture and working principles.
• Cluster connectivity, security and the factors that affect distributed system performance.
• Cluster installation and upgrades.
• Data security on the Hadoop platform, both at rest and in transit.
• Kerberos Authentication.
• SSL/TLS.
• A clear understanding of Active Directory and certificate authorities (CAs).
• Performance tuning of Hive and Impala.
• Experience with stream-processing services such as Kafka.
• DevOps: solid expertise with continuous integration (CI/CD) and build tools.
• Good to have: knowledge of programming in Spark/PySpark/Hive.
• Exposure to public cloud service providers.
• Firm knowledge of Linux and shell scripting.
• Experience with Ansible and containerization technologies is a plus.
• Experience managing data pipelines/ETL using NiFi or Apache Airflow.
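The security skills above (Kerberos authentication, TLS, encryption at rest and in transit) typically come together in the cluster's core Hadoop configuration. A minimal illustrative fragment, assuming a Kerberized cluster with wire encryption and a KMS backing HDFS encryption zones; the property names are standard Hadoop settings, while the KMS host is a placeholder:

```xml
<!-- Illustrative core-site.xml / hdfs-site.xml fragment; kms.example.com is a placeholder. -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>          <!-- require Kerberos tickets instead of simple auth -->
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>              <!-- enable service-level authorization checks -->
</property>
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>           <!-- encrypt RPC traffic (data in transit) -->
</property>
<property>
  <name>dfs.encrypt.data.transfer</name>
  <value>true</value>              <!-- encrypt DataNode block transfers -->
</property>
<property>
  <name>hadoop.security.key.provider.path</name>
  <value>kms://https@kms.example.com:9600/kms</value>  <!-- KMS for HDFS encryption zones (at rest) -->
</property>
```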
Soft Skills:
• Adaptability
• Keen to learn new technologies
• Receptive to change
• Good collaboration and communication skills, and the ability to work in an interdisciplinary team.
• Strong written communications and documentation experience.
• Prior banking experience is a plus.