
    11 jobs found for DevOps in Maharashtra

      • pune, maharashtra
      • contract
          • Overall 4+ years of experience in Azure and DevOps
          • Must be well versed with automation on cloud (both infrastructure and application deployments)
          • Automation using ARM Templates and Terraform
          • Must have a good understanding of and experience with containers and DevOps tools and services
          • Must be able to understand networking and communication on the cloud platform
          • Must have a good understanding of and experience with AKS/K8s
          • Should be willing to work both of
      • pune, maharashtra
      • contract
          • 5+ years of relevant experience with Spark, ETL, and SQL
          • Any one of AWS, Azure, or GCP (preferably AWS); optional: Spark Streaming
          Role & Responsibilities:
          • Experience in developing and optimizing ETL pipelines and big data pipelines
          • Must have strong big-data core knowledge and experience programming with Spark (Python/Scala)
          • Familiarity with DevOps tooling: Git/Bitbucket, Jenkins, etc.
      • pune, maharashtra
      • contract
       This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
          • Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
          • Must have strong big-data core knowledge and experience programming with Spark (Python/Scala)
          • 5+ years of relevant, strong experience working with real-time data streaming using Kafka
          • Experience in solving Streaming
      • mumbai, maharashtra
      • permanent
      Should be able to:
          • Assess business process requirements
          • Manage timely system hardware and application software implementation and upgrades
          • Troubleshoot problems with the application
          • Support business processes and provide logical solutions for requirements
          • Liaise with different functional teams to resolve functional needs
          • Ensure timelines for application deliverables are met by coordinating among different technical teams
          • Must have experience as Heal
      • pune, maharashtra
      • contract
       This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
          • Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
          • Must have strong big-data core knowledge and experience programming with Spark (Python/Scala)
          • 3+ years of relevant, strong experience working with real-time data streaming using Kafka
          • Experience in solving Streaming
      • pune, maharashtra
      • contract
          • 6+ years of technology experience
          • Spark Streaming experience is mandatory
          • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
          • Exposure to API management
          • Architectural experience with Spark, AWS, and Big Data (Hadoop: Cloudera, MapR, Hortonworks)
          • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
          • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python,
      • pune, maharashtra
      • contract
          • 8+ years of technology experience
          • Spark Streaming experience is mandatory
          • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
          • Exposure to API management
          • Architectural experience with Spark, AWS, and Big Data (Hadoop: Cloudera, MapR, Hortonworks)
          • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
          • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PyS
      • pune, maharashtra
      • contract
          • 3+ years of relevant, strong experience developing data processing tasks using Spark and AWS cloud-native services
          • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
          • Good to have: knowledge of Databricks Cloud
          • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
          • Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
          • Expertise in at least one popular cloud pro
      • mumbai, maharashtra
      • permanent
      Role: AEM Developer
      Roles and Responsibilities:
          • Create sites on Adobe Experience Manager
          • Design and develop web applications using the Adobe platform, including guidance of site structure, components, templates, workflows, dialogs, object model designs (Java APIs), and unit testing using AEM architecture (CRX, OSGi, JCR)
          • Set up and configure AEM authoring, publish, and dispatcher environments following Adobe-recommended best practices
          • Integrate A
      • mumbai, maharashtra
      • permanent
      1. The Enterprise Architect - Cloud & Infrastructure will be a key leader responsible, together with the CTO organization, for setting the architectural vision for Mondelez's Cloud & Infrastructure area. He/she will serve as an expert and trusted partner to create and realize that vision.
      2. You work with business and IT stakeholders to define future-state requirements, principles, and models. You also work to position the organization to deliver its f
      • pune, maharashtra
      • contract
      Job Description:
          • 6+ years of relevant experience developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services
          • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
          • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
          • Experience in at least one popular programming language: Python/Scala/Java
          • Strong analytic skills related to working with structured, semi structured a


