Key Responsibilities:
1. Serve as a dedicated Hadoop Administrator, assisting with Technical Guidance & Architecture on Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF)
2. Contribute to HDP/HDF Cluster Planning, Architecture Design, Review, Validation & Performance Optimization as required
3. Contribute to HDP Security implementation with Kerberos, Knox, Ranger, KMS, etc.
4. Support the team on Data Ingestion into the existing HDP cluster using HDF and other HDP components (Kafka, Storm, Spark, etc.)
5. Support the team on Data Governance and Data Quality using HDP components and integration
6. Handle Application Deployment, Data Ingestion, Data Storage & Data Access Authorization (Sqoop, Pig, Hive, HBase, etc.)
7. Recommend and implement HDP/HDF Platform Administration and Monitoring (Ganglia, Nagios, Splunk dashboards, etc.)
8. Ensure Availability and Reliability of Data & Analytics Systems
9. Assist with Cluster Upgrades and Versioning best practices (Rolling & Express Upgrades)
10. Disaster Recovery: Associated Architecture; Data Replication; RTO/RPO
11. Work with the Customer’s functional teams on Hadoop design best practices as required
12. Contribute to new Product validation, implementation & Knowledge Transfer (KT)
We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, sexual or gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.