Overall 4+ years of experience in Azure and DevOps.
· Must be well versed in automation on cloud (both infrastructure and application deployments).
· Automation using ARM Templates and Terraform.
· Must have a good understanding of and experience with containers and DevOps tools and services.
· Must be able to understand networking and communication on a cloud platform.
· Must have a good understanding of and experience with AKS/K8s.
· Should be willing to work both of
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
· Must have strong big-data core knowledge and experience programming with Spark – Python/Scala.
· 5+ years of strong, relevant experience working with real-time data streaming using Kafka.
· Experience in solving Streaming
Should be able to:
· Assess business process requirements
· Manage timely system hardware and application software implementations and upgrades
· Troubleshoot problems with the application
· Support business processes and provide logical solutions for requirements
· Liaise with different functional teams to resolve functional needs
· Ensure timelines for application deliverables are met by coordinating among different technical teams
Must have experience as Heal
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
· Must have strong big-data core knowledge and experience programming with Spark – Python/Scala.
· 3+ years of strong, relevant experience working with real-time data streaming using Kafka.
· Experience in solving Streaming
· 6+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala
· Microservices architecture
· Exposure to API management
· Architectural experience with Spark, AWS, and Big Data (Hadoop: Cloudera, MapR, Hortonworks)
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python,
· 8+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala
· Microservices architecture
· Exposure to API management
· Architectural experience with Spark, AWS, and Big Data (Hadoop: Cloudera, MapR, Hortonworks)
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PyS
· 3+ years of strong, relevant experience developing data processing tasks using Spark and AWS cloud-native services.
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame.
· Good to have: knowledge of Databricks Cloud.
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark.
· Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.
· Expertise in at least one popular cloud pro
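The posting above asks for experience working with structured and semi-structured datasets. As a minimal sketch of the kind of task involved (all record contents and field names below are hypothetical, and this uses only the Python standard library rather than Spark), flattening semi-structured JSON events into fixed, structured rows might look like:

```python
import json

# Hypothetical semi-structured event records, as they might arrive from a feed.
raw_events = [
    '{"user": {"id": 1, "name": "a"}, "action": "click", "meta": {"ts": 100}}',
    '{"user": {"id": 2}, "action": "view"}',  # optional fields may be missing
]

def flatten(event_json: str) -> dict:
    """Flatten one semi-structured JSON event into a fixed, structured row."""
    e = json.loads(event_json)
    return {
        "user_id": e.get("user", {}).get("id"),
        "user_name": e.get("user", {}).get("name"),  # None when the field is absent
        "action": e.get("action"),
        "ts": e.get("meta", {}).get("ts"),
    }

rows = [flatten(e) for e in raw_events]
```

Every output row has the same columns regardless of which optional fields the source event carried, which is what makes the result "structured".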
Role: AEM Developer
Job description – Roles and Responsibilities
• Create sites on Adobe Experience Manager
• Design and develop web applications using the Adobe platform, including guidance on site structure, components, templates, workflows, dialogs, object model designs (Java APIs), and unit testing using AEM architecture (CRX, OSGi, JCR)
• Set up and configure AEM authoring, publish, and dispatcher environments with Adobe-recommended best practices
• Integrate A
1. The Enterprise Architect – Cloud & Infrastructure will be a key leader responsible for setting, together with the CTO organization, the architectural vision for Mondelez’ Cloud & Infrastructure area. He/she will serve as an expert and trusted partner to create and realize that vision.
2. You work with business and IT stakeholders to define future-state requirements, principles, and models. You also work to position the organization to deliver its f
Job Description:
· 6+ years of relevant experience developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services.
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame.
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark.
· Experience in at least one popular programming language – Python/Scala/Java.
· Strong analytical skills related to working with structured, semi-structured a
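Several of the postings above ask for knowledge of optimizing Spark SQL/DataFrame workloads. One common optimization is filtering rows before a join (predicate pushdown) so the join touches fewer records. The sketch below illustrates only the idea in plain Python, not Spark itself; the table contents and column names are hypothetical:

```python
# Hypothetical source "tables".
orders = [
    {"order_id": 1, "cust_id": 10, "amount": 250},
    {"order_id": 2, "cust_id": 11, "amount": 40},
    {"order_id": 3, "cust_id": 10, "amount": 500},
]
customers = [{"cust_id": 10, "region": "EU"}, {"cust_id": 11, "region": "US"}]

# Push the filter below the join: keep only large orders first...
large = [o for o in orders if o["amount"] >= 100]

# ...then join the (now smaller) filtered set against customers.
by_id = {c["cust_id"]: c for c in customers}
joined = [{**o, "region": by_id[o["cust_id"]]["region"]} for o in large]
```

In Spark, the Catalyst optimizer often applies this reordering automatically, but writing queries so that selective filters can be pushed down (and checking the physical plan that results) is part of what "optimizing DataFrame workloads" means in practice.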