    8 jobs found for Data Architect in Pune, Maharashtra

      • pune, maharashtra
      • contract
      Data Architect with Data Modeller
      Must have: Data Architect experience; experience in data modelling using the Data Vault methodology; 10+ years of experience as a Data Architect and in data modelling.
      Good to have: Snowflake experience; AWS/Azure experience.
      Role and experience expected:
      • Experience with implementations of the Data Vault modelling, architecture, and methodology
      • Knowledge of both data lake and data warehouse technologies
      • Well versed wi…
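
      For illustration only (not part of the posting above): a minimal sketch of a Data Vault hub load in PySpark; the table names, business key and record source used here are hypothetical placeholders.

          # Minimal Data Vault hub-load sketch (illustrative; names are placeholders).
          from pyspark.sql import SparkSession, functions as F

          spark = SparkSession.builder.appName("dv_hub_customer").getOrCreate()

          # Distinct business keys from a hypothetical raw source table.
          src = spark.table("raw_customers").select("customer_id").dropDuplicates()

          hub_customer = (
              src
              # Hash of the cleaned business key becomes the hub key, per common Data Vault practice.
              .withColumn(
                  "hub_customer_hk",
                  F.sha2(F.upper(F.trim(F.col("customer_id").cast("string"))), 256),
              )
              .withColumn("load_dts", F.current_timestamp())
              .withColumn("record_source", F.lit("crm"))
          )

          # Append new hub rows; filtering out keys already present would normally happen first.
          hub_customer.write.mode("append").saveAsTable("dv.hub_customer")
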
      • pune, maharashtra
      • contract
      Azure Data Architect with Data Modeller
      Must have: Data Architect experience; SSIS; experience in data modelling; 10+ years of experience as a Data Architect and in data modelling.
      Good to have: Azure experience.
      Role and experience expected:
      • Experience with implementing architecture solutions using the Azure data analytics platform
      • Knowledge of both data lake and data warehouse technologies
      • Well versed with ELT concepts and design
      • Good with Azure Data F…
      • pune, maharashtra
      • contract
      • 6+ years of technology experience
      • Spark Streaming experience is mandatory
      • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture, exposure to API management
      • Architectural experience with Spark, AWS and big data (Hadoop: Cloudera, MapR, Hortonworks)
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, …
      • pune, maharashtra
      • contract
      • 8+ years of technology experience
      • Spark Streaming experience is mandatory
      • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture, exposure to API management
      • Architectural experience with Spark, AWS and big data (Hadoop: Cloudera, MapR, Hortonworks)
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PyS…
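
      For illustration only (not part of the postings above): a minimal PySpark Structured Streaming read from Kafka; the broker address, topic name and checkpoint path are hypothetical, and the spark-sql-kafka package must be on the Spark classpath.

          # Minimal Spark Structured Streaming + Kafka sketch (illustrative; names are placeholders).
          from pyspark.sql import SparkSession, functions as F

          spark = SparkSession.builder.appName("kafka_stream_demo").getOrCreate()

          events = (
              spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker:9092")
              .option("subscribe", "events")
              .option("startingOffsets", "latest")
              .load()
              # Kafka delivers key/value as binary; cast the payload to a string for downstream parsing.
              .select(F.col("value").cast("string").alias("payload"), "timestamp")
          )

          query = (
              events.writeStream
              .format("console")  # in practice the sink would be a table, another topic, or object storage
              .option("checkpointLocation", "/tmp/checkpoints/events")
              .outputMode("append")
              .start()
          )
          query.awaitTermination()

      The checkpoint location is what gives the stream exactly-once bookkeeping across restarts, which is why it is set even in a console demo.
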
      • pune, maharashtra
      • contract
      Job description:
      • 6+ years of relevant experience in developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
      • Experience in at least one popular programming language: Python/Scala/Java
      • Strong analytical skills related to working with structured, semi-structured a…
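
      For illustration only (not part of the posting above): a sketch of one common Spark SQL/DataFrame workload optimization, broadcasting a small dimension table into a join; the table names, join key and output path are hypothetical.

          # Broadcast a small dimension table so the join avoids a full shuffle of the large fact table.
          from pyspark.sql import SparkSession, functions as F

          spark = SparkSession.builder.appName("optimize_join_demo").getOrCreate()

          orders = spark.table("sales.orders")       # large fact table
          countries = spark.table("ref.countries")   # small dimension table

          enriched = orders.join(F.broadcast(countries), on="country_code", how="left")

          # Partitioned Parquet output lets later queries prune partitions instead of scanning everything.
          (enriched.write
              .mode("overwrite")
              .partitionBy("order_date")
              .parquet("s3://example-bucket/curated/orders_enriched/"))

      Broadcasting is only appropriate when the dimension table comfortably fits in executor memory; otherwise the default sort-merge join is the safer choice.
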
      • pune, maharashtra
      • contract
      • 3+ years of strong, relevant experience in developing data processing tasks using Spark and AWS cloud-native services
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Good to have: knowledge of Databricks Cloud
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
      • Strong analytical skills related to working with structured, semi-structured and unstructured datasets
      • Expertise in at least one popular cloud pro…
      • pune, maharashtra
      • contract
      • 12+ years of experience in software development, with 5-6 years of experience as an AWS Data Architect
      • Has worked in large teams and on complex projects
      • Prior BI, analytics and ETL experience
      • Hands-on experience in modern analytics architecture and tools
      • Data modelling and data mart / data warehouse design
      Key roles and responsibilities:
      • Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark…
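
      For illustration only (not part of the posting above): a sketch of a simple Spark ingestion step pulling from a relational source into a data lake; the JDBC URL, credentials, partition bounds and output path are hypothetical, and a matching JDBC driver is assumed to be on the classpath.

          from pyspark.sql import SparkSession

          spark = SparkSession.builder.appName("ingest_customers").getOrCreate()

          customers = (
              spark.read.format("jdbc")
              .option("url", "jdbc:postgresql://source-db:5432/crm")
              .option("dbtable", "public.customers")
              .option("user", "etl_user")
              .option("password", "***")
              # Parallelise the extract by splitting the read on a numeric column.
              .option("partitionColumn", "customer_id")
              .option("lowerBound", "1")
              .option("upperBound", "1000000")
              .option("numPartitions", "8")
              .load()
          )

          # Land the raw extract as Parquet; curation and modelling happen in later steps.
          customers.write.mode("overwrite").parquet("s3://example-bucket/raw/customers/")
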
      • pune, maharashtra
      • contract
      Job description:
      • Expert in writing Snowflake SQL queries against Snowflake
      • Developing scripts using JavaScript to extract, load, and transform data
      • Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures
      • In-depth understanding of data warehouse and ETL concepts and modelling structure principles
      • Expertise in AWS cloud-native services
      • Good to ha…
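
      For illustration only (not part of the posting above): a sketch of running a Snowflake SQL query from Python with snowflake-connector-python; the account, credentials, warehouse and table names are hypothetical placeholders.

          import snowflake.connector

          conn = snowflake.connector.connect(
              account="xy12345.eu-west-1",
              user="ETL_USER",
              password="***",
              warehouse="ANALYTICS_WH",
              database="SALES_DB",
              schema="PUBLIC",
          )
          try:
              cur = conn.cursor()
              # Plain aggregate query; in practice data might arrive via Snowpipe and be transformed with tasks/streams.
              cur.execute(
                  "SELECT order_date, SUM(amount) AS total_amount "
                  "FROM orders GROUP BY order_date ORDER BY order_date"
              )
              for order_date, total_amount in cur.fetchall():
                  print(order_date, total_amount)
          finally:
              conn.close()
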
