
    5 Permanent jobs found in Maharashtra

      • pune, maharashtra
      • permanent
      AWS Data Engineer
      Role: Developer to Lead level
      IT experience range: 7+ years
      Location: PAN India, remote
      Skills and experience, must have:
      • Minimum of 2+ years in a solution architecture role, or in development using service and hosting solutions on AWS IaaS, PaaS and SaaS platforms
      • Good experience with AWS storage and its compute services
      • Good hands-on expertise in Python, SQL, PySpark, Hive and Jenkins
      • Able to use AWS managed services effectively (Step Functions, EMR, Lambda, Glue and Athena) and design and develop data engineering solutions
      • Experience with requirement analysis and client interaction; good communication skills
      • Hands-on experience developing a data platform and its components (data lake, cloud data warehouse, batch and streaming data pipelines)
      Good to have:
      • Experience with Databricks
      • Basic understanding of data modeling
      • Experience migrating workloads from on-premises to cloud and from cloud to cloud
      • Understanding of Boto3 libraries
      • Experience with AWS services such as RDS, DynamoDB and Redshift
      • Ability to drive the deployment of customers' workloads into AWS and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical road-maps for AWS cloud implementations
      • Extensive real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies
      • Basic understanding of SageMaker and ML algorithms
      • Act as a subject-matter expert or developer for AWS and become a trusted advisor to multiple teams; coach and mentor engineers to raise the technical ability of the rest of the team, and/or to earn required AWS technical certifications
      Education: Graduate or postgraduate in CSE/IT or related fields
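The listing above asks for hands-on experience with batch and streaming data pipelines. As a minimal, library-free sketch of that distinction (function names and sample records are invented for illustration, not part of the role):

```python
# Minimal sketch of the batch vs. streaming distinction.
# All names and data here are hypothetical illustrations.

def batch_pipeline(records):
    """Batch: the full dataset is materialized, then transformed in one pass."""
    cleaned = [r.strip().lower() for r in records]   # transform everything at once
    return [r for r in cleaned if r]                 # drop empty records

def streaming_pipeline(source):
    """Streaming: records are transformed one at a time as they arrive."""
    for record in source:                            # lazy iteration over the source
        record = record.strip().lower()
        if record:
            yield record                             # emit each result incrementally

events = ["  Pune ", "", "MUMBAI", " nagpur"]
print(batch_pipeline(events))                        # whole result at once
print(list(streaming_pipeline(iter(events))))        # same output, produced lazily
```

In practice the same split shows up in the AWS services named above: Glue and EMR jobs typically follow the batch shape, while Lambda-driven consumers follow the streaming shape.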
      • pune, maharashtra
      • permanent
      Candidate with strong experience in Linux 6/7/8, VMware, and physical hardware and consoles (Dell and HP).
      • Knowledge of scripting (Bash, shell, Python)
      • Exposure to and understanding of patching using the IBM BigFix tool; able to configure and solve any issues related to patching and compliance activities
      • Minimum 2-4 years of relevant experience in BigFix L2/L3 support and implementation
      • Ability to handle patch and release management, ensure all packages are deployed efficiently, and ensure appropriate resolution of BigFix and general patching issues
      • Knowledge of ITIL basics (incident, change and problem management)
      • Expert in networking, kernel, user administration and filesystem troubleshooting
      Process skills:
      • Primarily responsible for maintaining Unix servers as BAU
      • Capable of analyzing and troubleshooting application issues and implementing break fixes per defined SLAs and SOPs
      • Experience and functional knowledge of banking applications
      • Available on a 24x7 basis in the event of emergencies, and contactable after office hours
      • Resolves technical issues by implementing break fixes
      • Participates as a team member and fosters teamwork through inter-group coordination within the modules of the project
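The role above centers on patch compliance of the kind BigFix automates. As an illustrative sketch only (server names and patch identifiers are invented, and a real deployment would query BigFix rather than hold this data in memory), the core check reduces to comparing installed patches against a required set:

```python
# Illustrative patch-compliance check: for each server, report which
# required patches are missing. All names here are hypothetical.

REQUIRED = {"KERN-2024-01", "SSL-2024-07"}

def missing_patches(installed_by_server):
    """Map each non-compliant server to the required patches it still lacks."""
    return {
        server: sorted(REQUIRED - set(installed))
        for server, installed in installed_by_server.items()
        if REQUIRED - set(installed)
    }

fleet = {
    "app01": ["KERN-2024-01", "SSL-2024-07"],   # fully compliant, omitted
    "db02": ["KERN-2024-01"],                   # missing one patch
}
print(missing_patches(fleet))                   # {'db02': ['SSL-2024-07']}
```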
      • mumbai, maharashtra
      • permanent
      Experience required:
      • At least 5+ years of experience designing and developing enterprise data architecture technologies, with a successful track record implementing such architectures
      • At least 3+ years of experience in the investment management industry; exposure to alternative assets and associated data preferred
      • At least 3+ years of professional delivery experience with Azure in the following areas:
        • Solution/technical architecture
        • Big data, analytics, information analysis and database management
        • Data modeling, data architecture, data integration and database design
        • Data integration strategy using batch (ETL/ELT) and streaming (real-time/near-real-time) approaches, adopting a CDC (change data capture) framework
      • Extensive hands-on experience delivering Azure platform solutions:
        • Required: Azure Storage, Azure SQL DB/DW, Azure Data Factory, Azure Stream Analytics, Azure Synapse Analytics, Azure DevOps, Databricks
        • Nice to have: Azure ML Studio, Snowflake
      • Advanced hands-on knowledge of SQL, Spark, Python, Scala, PySpark (2+ of these) and extensive experience working with relational databases for data querying and retrieval
      • Experience designing and implementing data models across multiple datasets, including best practices for dimensional modeling, storage considerations, and handling full and incremental batch scenarios
      • Hands-on experience with data visualization and analytics tools, preferably Power BI
      • Experience implementing security controls (data encryption, security groups and roles, row-level security, etc.) across data lake storage, models and analytics workspaces
      • Familiarity with data governance best practices, including exposure to tools such as Azure Purview, Collibra, or equivalent
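The "full and incremental batch scenarios" and CDC framework mentioned above can be reduced to a small sketch. This is a hedged illustration in plain Python with invented data; a real pipeline would apply the same merge logic through Data Factory or Databricks rather than dicts:

```python
# Sketch of full vs. incremental (CDC-style) loads.
# Dict keys play the role of primary keys; values are row payloads.

def full_load(source):
    """Full load: replace the target with a complete copy of the source."""
    return dict(source)

def incremental_load(target, changes):
    """Incremental load: apply only captured changes (upserts and deletes)."""
    merged = dict(target)
    for key, row in changes.items():
        if row is None:          # a captured delete
            merged.pop(key, None)
        else:                    # a captured insert or update
            merged[key] = row
    return merged

warehouse = full_load({1: "alice", 2: "bob"})
warehouse = incremental_load(warehouse, {2: "robert", 3: "carol", 1: None})
print(warehouse)   # {2: 'robert', 3: 'carol'}
```

The design point is that a full load rereads everything, while CDC ships only the delta, which is what makes frequent refreshes of large warehouses affordable.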
      • pune, maharashtra
      • permanent
      Position qualifications
      The candidate should have a thorough interest in, and understanding of, all aspects of operating and supporting Kubernetes and container technology and Azure cloud-based solutions in a DevOps model, with strong scripting and automation skills. To ensure effective, stable and secure operation of DNV's global Azure cloud IT infrastructure with on-premise elements, the key responsibilities of the GSS IT Kubernetes and DevOps Engineer are:
      • Operating Azure cloud-based services
      • Developing and maintaining Kubernetes clusters
      • Scripting and automation (including integration of cloud and on-premise components)
      • Reacting to customers' feedback and requests using both ITIL and Scrum methodologies
      • Ensuring proper communication with customers
      • Incident troubleshooting and root cause analysis
      • Developing a monitoring solution based on the Azure Monitor toolset
      • Cooperating with monitoring operators
      • Producing design and operational documentation
      To maximize your chances, you need:
      • Strong knowledge and practical experience of Kubernetes clusters and container technology, including AKS, Docker, Helm, Kustomize, service mesh, networking and security tools
      • Strong operational knowledge of Microsoft Azure cloud
      • Operational experience with DevOps toolchains and infrastructure-as-code tools, including Git, CI/CD pipelines (preferably Azure DevOps), Terraform, Terragrunt, Ansible, ARM templates, and GitOps tools such as Argo CD or Flux
      • Knowledge of monitoring toolsets including Prometheus, Grafana and the Azure Monitor suite
      • Strong scripting and automation skills using e.g. Bash, Python or PowerShell
      • Any software development experience is a plus
      • Linux OS knowledge
      • Fluent English in daily communication (both spoken and written)
      • Experience in end-to-end project delivery with minimal supervision, documenting your own work and processes
      • University-level education
      Personal characteristics:
      • Professional business communication skills (towards customers and stakeholders)
      • Excellent interpersonal skills
      • Self-motivated and efficient in daily work
      • Proactive and solution-oriented mindset
      • Flexibility and openness to change
      • Responsibility for own development and assigned tasks
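The GitOps tools named above (Argo CD, Flux) are built around one idea: continuously reconcile the declared state in Git with the live cluster state. A hedged, self-contained sketch of that reconciliation loop, with invented resource names and no real Kubernetes API calls:

```python
# Sketch of the GitOps reconciliation model behind tools like Argo CD or
# Flux: diff declared (Git) state against live cluster state and compute
# the actions needed to converge. All data here is hypothetical.

def reconcile(declared, live):
    """Return the actions needed to make `live` match `declared`."""
    actions = []
    for name, spec in declared.items():
        if name not in live:
            actions.append(("create", name))     # declared but absent
        elif live[name] != spec:
            actions.append(("update", name))     # present but drifted
    for name in live:
        if name not in declared:
            actions.append(("delete", name))     # live but no longer declared
    return actions

declared = {"web": {"replicas": 3}, "api": {"replicas": 2}}
live = {"web": {"replicas": 1}, "old-job": {"replicas": 1}}
print(reconcile(declared, live))
# [('update', 'web'), ('create', 'api'), ('delete', 'old-job')]
```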
      • pune, maharashtra
      • permanent
      Job requirements
      Required credentials:
      • Google Cloud Architect certification, or able to complete it within the first 45 days of employment
      • Security certification preferred (e.g., CISSP, GIAC, CCSP)
      Required qualifications:
      • 7+ years of expertise in security architecture, cloud security and application security
      • DevSecOps and automation mindset
      • Expert level in IT infrastructure security (Linux, Windows, networks, cloud, etc.)
      • Highly collaborative in a fast-paced team environment, with strong written and verbal communication skills
      • Experience designing, implementing and managing application security threat modeling
      • Expertise with vulnerability scanning, container scanning, and SAST/DAST
      • Expertise in identity and access management and in certificate and key management solutions
      • 3+ years in software development using languages such as Python, Go, Bash, Java, etc.
      • Proficient in establishing security standards, policies, best-practice principles and documentation
      • Ability to participate in software code refactoring to address application security
      • Exposure to full-stack development in a cloud environment using CI/CD principles
      • Working knowledge of version control such as GitHub, GitLab, Bitbucket, etc.
      • Experience implementing security in microservices and serverless architectures, and in messaging between services
      • Expert in implementing the principle of least privilege and separation of duties, with the ability to architect for defense in depth
      • Ability to support security governance and compliance using secure template management, IAM permissions, and configuration drift detection/remediation
      • Experience using various tools to automate security in the release pipeline
      • Experience implementing application authentication and authorization using SAML, OAuth, OIDC, LDAP and Kerberos
      • Experience with tooling for Security Information and Event Management (SIEM), Endpoint Detection and Response, Managed Detection and Response, or Extended Detection and Response
      • Expertise with dependency and library management and supply chain integrity
      Example technologies: Grafeas, SLSA, Black Duck, OpenSCAP, Trend Micro Cloud One, Orca Security, Splunk, Splunk Phantom, Sysdig, Aqua, kube-bench, kube-hunter, Trivy, Clair, Check Point, Chef InSpec, GitLab SAST/DAST, Palo Alto Prisma Cloud, Palo Alto Cortex XSOAR, TFLint, ScoutSuite, CoreStack, CloudKnox, HashiCorp Vault, CyberArk, Thycotic, Nessus, CrowdStrike, Okta, Auth0, Active Directory
      Useful qualifications (candidates with these will have stronger standing, but they are not required):
      • Understanding of chaos engineering
      • Understanding of PCI, SOC 2, GDPR, FedRAMP and HIPAA compliance standards
      • Expertise in Microsoft Windows administration and security, Active Directory, and Group Policy
      • Understanding of cryptocurrency and blockchain technology
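As an illustration of the least-privilege and separation-of-duties requirement above, here is a minimal role-based check in Python. The roles and permission strings are hypothetical and not tied to any product in the listing; real systems would enforce this via IAM policies:

```python
# Minimal sketch: each role gets only the permissions it needs (least
# privilege), and no single role can both request and approve a release
# (separation of duties). Roles and permissions are hypothetical.

ROLE_PERMISSIONS = {
    "developer": {"code:write", "release:request"},
    "release_manager": {"release:approve"},
    "auditor": {"logs:read"},
}

def is_allowed(role, permission):
    """Grant only permissions explicitly assigned to the role; default deny."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# A developer may request a release but cannot approve their own request.
print(is_allowed("developer", "release:request"))   # True
print(is_allowed("developer", "release:approve"))   # False
```

The default-deny lookup (`.get(role, set())`) is the key design choice: an unknown role receives no permissions rather than inheriting any.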
