
Specialist Software Engineer - Big Data

IT (Information Technology)

Permanent contract
Bangalore, India
Hybrid

Reference 25000OID
Start date 2026/01/29
Publication date 2025/12/21

Responsibilities

Job Summary:

We are seeking a highly skilled and motivated Specialist Software Engineer with deep expertise in Big Data technologies, data pipeline orchestration, and observability tooling. The ideal candidate will be responsible for designing, developing, and maintaining scalable data processing systems and integrating observability solutions to ensure system reliability and performance.

Key Responsibilities:
Big Data Engineering:
  • Design and implement robust data pipelines using Apache Kafka, Apache NiFi, Apache Spark, and Sqoop (a brief sketch follows this list).
  • Manage and optimize distributed data storage systems including Hadoop, HDFS, Druid, and ElasticSearch.
  • Integrate and maintain data visualization and monitoring tools like Kibana, Grafana, and Logstash.
  • Ensure efficient data ingestion, transformation, and delivery across various platforms.
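
To give a concrete flavour of this kind of pipeline work, below is a minimal, hypothetical PySpark Structured Streaming sketch that reads JSON events from a Kafka topic and lands them on HDFS as Parquet. The broker address, topic name, schema, and paths are illustrative assumptions only, not a description of our actual systems, and the job assumes the Spark/Kafka connector package is available on the cluster.

```python
# Hypothetical sketch only: stream JSON events from Kafka into HDFS as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

# Assumed event schema; a real pipeline would load this from a schema registry.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker-1:9092")  # placeholder broker
       .option("subscribe", "events")                        # placeholder topic
       .load())

# Kafka delivers the payload as bytes; cast to string and parse the JSON body.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/events")              # placeholder path
         .option("checkpointLocation", "hdfs:///checkpoints/events")
         .outputMode("append")
         .start())

query.awaitTermination()
```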
Programming & Scripting:
  • Develop automation scripts and data processing utilities using Python 3 and Shell scripting.
  • Build reusable components and libraries for data manipulation and system integration.
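
As an illustration of the sort of reusable Python 3 utility this involves, here is a minimal sketch of a small data-cleaning module; the function names, record layout, and file paths are hypothetical.

```python
# Hypothetical sketch only: normalise JSON-lines records before downstream loading.
import json
from pathlib import Path
from typing import Iterable, Iterator


def normalise(record: dict) -> dict:
    """Trim string fields and drop keys with empty values."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
        if value not in ("", None):
            cleaned[key] = value
    return cleaned


def read_jsonl(path: Path) -> Iterator[dict]:
    """Yield one dict per non-empty line of a JSON-lines file."""
    with path.open(encoding="utf-8") as handle:
        for line in handle:
            if line.strip():
                yield json.loads(line)


def write_jsonl(records: Iterable[dict], path: Path) -> int:
    """Write records back out as JSON lines; return the number written."""
    count = 0
    with path.open("w", encoding="utf-8") as handle:
        for record in records:
            handle.write(json.dumps(record) + "\n")
            count += 1
    return count


if __name__ == "__main__":
    source, target = Path("raw_events.jsonl"), Path("clean_events.jsonl")  # placeholder paths
    written = write_jsonl((normalise(r) for r in read_jsonl(source)), target)
    print(f"wrote {written} cleaned records to {target}")
```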
Observability & Monitoring:
  • Implement and configure observability agents such as Fluentd, Telegraf, and Logstash.
  • Collaborate with platform teams to integrate OpenTelemetry for distributed tracing and metrics collection (good to have; a brief sketch follows this list).
  • Maintain dashboards and alerts for system health and performance monitoring.
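
For the OpenTelemetry item above, the following is a minimal, hypothetical Python sketch (using the opentelemetry-sdk package) of how a data job might emit trace spans. The instrumentation name, span names, attributes, and console exporter are assumptions; a real setup would export to the platform team's collector rather than stdout.

```python
# Hypothetical sketch only: wrap each processed batch in an OpenTelemetry span.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Configure a tracer provider that prints finished spans to stdout.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("ingest-job")  # placeholder instrumentation name


def process_batch(batch_id: str, size: int) -> None:
    # Each batch is wrapped in a span so its duration and attributes
    # show up in the tracing backend.
    with tracer.start_as_current_span("process_batch") as span:
        span.set_attribute("batch.id", batch_id)
        span.set_attribute("batch.size", size)
        # ... actual transformation work would happen here ...


if __name__ == "__main__":
    process_batch("2025-12-21-0001", 5000)
```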
DevOps & CI/CD:
  • Contribute to CI/CD pipeline development using GitHub Actions (a brief sketch follows this list).
  • Collaborate with DevOps teams to ensure seamless deployment and integration of data services.
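
As a rough illustration of the CI/CD item above, here is a minimal, hypothetical GitHub Actions workflow that installs dependencies and runs the tests for the Python utilities; the file layout, requirements.txt, and pytest usage are assumptions, not this team's actual pipeline.

```yaml
# Hypothetical sketch only: run the Python test suite on every push and pull request.
name: ci

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install -r requirements.txt   # assumed dependency file
      - name: Run tests
        run: python -m pytest                  # assumed test runner
```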
Collaboration & Documentation:
  • Work closely with cross-functional teams including data scientists, platform engineers, and product managers.
  • Document system architecture, data flows, and operational procedures.
  • Participate in code reviews, knowledge sharing sessions, and technical mentoring.

Profile required

Required Skills & Qualifications:
  • Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
  • 5+ years of experience in Big Data engineering and scripting.
  • Strong hands-on experience with:
    • Kafka, NiFi, Hadoop, HDFS, Spark, Sqoop
    • ElasticSearch, Druid, Kibana, Grafana
    • Python 3, Shell scripting
    • Logstash, Fluentd, Telegraf
  • Familiarity with GitHub Actions and basic DevOps practices.
  • Exposure to OpenTelemetry is a plus.
  • Excellent problem-solving, analytical, and communication skills.
Preferred Qualifications:
  • Experience in building real-time data streaming applications.
  • Knowledge of data governance, security, and compliance in Big Data environments.
  • Certifications in Big Data technologies or cloud platforms (AWS/GCP/Azure) are a plus.

Why join us

We are committed to supporting the acceleration of our Group’s ESG strategy by implementing ESG principles in all our activities and policies. These principles are reflected in our business activity (ESG assessment, reporting, project management and IT activities), in our work environment, and in our responsible practices for environmental protection.

Business insight

At Societe Generale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious.

Whether you’re joining us for a period of months, years or your entire career, together we can have a positive impact on the future. Creating, daring, innovating and taking action are part of our DNA.

If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis and develop or strengthen your expertise, you will feel right at home with us!

Still hesitating?

You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including mentoring people who are struggling with career guidance or professional integration, contributing to the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved.

Diversity and Inclusion

We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.