
    11 jobs found for Data Architect

      • chennai, tamil nadu
      • permanent
      Company: a leading shipping technology company. Location: Chennai. Experience: 5-8 years and 8-15 years.
      Responsibilities for DW Architect:
      • Design and implement relational and dimensional models in Snowflake and related tools.
      • Design, build, and support stable, scalable data pipelines and ETL processes that cleanse, structure, and integrate big data sets from multiple sources into the DW, and provision them to transactional systems and Business Intelligence reporting.
      • Coll…
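The "cleanse, structure and integrate" step this listing describes can be sketched in plain Python. This is only an illustration of the idea, not a production pipeline (which would typically use Spark or an ETL tool); the field names and source shapes are hypothetical.

```python
def cleanse_and_integrate(sources):
    """Normalise records from several sources into one warehouse-ready list."""
    integrated = []
    for source_name, records in sources.items():
        for rec in records:
            # Cleanse: trim whitespace, normalise case, drop rows missing the key.
            customer_id = str(rec.get("customer_id", "")).strip()
            if not customer_id:
                continue
            # Structure: map each source onto a common target schema,
            # keeping a lineage column for the warehouse.
            integrated.append({
                "customer_id": customer_id,
                "name": str(rec.get("name", "")).strip().title(),
                "source_system": source_name,
            })
    return integrated

crm = [{"customer_id": " 42 ", "name": "ada lovelace"}]
erp = [{"customer_id": "", "name": "missing id, dropped"}]
rows = cleanse_and_integrate({"crm": crm, "erp": erp})
# rows == [{"customer_id": "42", "name": "Ada Lovelace", "source_system": "crm"}]
```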
      • kolkata, west bengal
      • permanent
      • Minimum 12+ years of experience in the Microsoft/Azure BI stack (Microsoft SSAS, SSIS, SSRS, and Power BI), with experience designing, developing, debugging, and maintaining BI reports/dashboards.
      • Hands-on experience deploying the Power BI service to mid-market and/or enterprise-scale organizations.
      • Must understand infrastructure (security, governance, cloud deployment strategies) to be successful in this role.
      • Exposure to the migration of on-premises to…
      • bangalore, karnataka
      • permanent
      Qualifications:
      • Bachelor's degree.
      • 8+ years of experience in information technology, with 5+ years working with data.
      • Strong communication skills: well-organized, clear, and concise in conversations and written communications with development team members, business analysts, and project management.
      • Experience in cloud architecture with big data tools and machine learning preferred.
      • Advanced dim…
      • pune, maharashtra
      • contract
      Data Architect with Data Modeller
      Must have:
      • Data Architect experience
      • Experience in data modelling using the Data Vault methodology
      • 10+ years of experience as a Data Architect and in data modelling
      Good to have:
      • Snowflake experience
      • AWS/Azure experience
      Role and experience expected:
      • Experience with implementations of Data Vault modelling, architecture, and methodology
      • Knowledge of both data lake and data warehouse technologies
      • Well versed wi…
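Since this listing centres on the Data Vault methodology, here is a minimal sketch of one of its conventions: building a hub's hash key by hashing the normalised business key (Data Vault 2.0 commonly uses MD5 or SHA-1 for this). The column names and delimiter are hypothetical.

```python
import hashlib

def hub_hash_key(*business_key_parts, delimiter="||"):
    """Deterministic hub hash key from one or more business-key columns."""
    # Normalise each part (trim, uppercase) so the same business key
    # always produces the same hub key regardless of source formatting.
    normalised = delimiter.join(str(p).strip().upper() for p in business_key_parts)
    return hashlib.md5(normalised.encode("utf-8")).hexdigest()

# Same business key, different formatting -> same hub key:
assert hub_hash_key("cust-001") == hub_hash_key(" CUST-001 ")
```

Hashing (rather than using a sequence) lets hubs, links, and satellites be loaded in parallel, since the key is computable from the business key alone.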
      • pune, maharashtra
      • contract
      Azure Data Architect with Data Modeller
      Must have:
      • Data Architect experience
      • SSIS
      • Experience in data modelling
      • 10+ years of experience as a Data Architect and in data modelling
      Good to have:
      • Azure experience
      Role and experience expected:
      • Experience implementing architecture solutions on the Azure data analytics platform
      • Knowledge of both data lake and data warehouse technologies
      • Well versed with ELT concepts and design
      • Good with Azure Data F…
      • pune, maharashtra
      • contract
      • 6+ years of technology experience
      • Spark Streaming experience is mandatory
      • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
      • Exposure to API management
      • Architectural experience with Spark, AWS, and big data (Hadoop: Cloudera, MapR, Hortonworks)
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python,…
      • pune, maharashtra
      • contract
      • 8+ years of technology experience
      • Spark Streaming experience is mandatory
      • Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
      • Exposure to API management
      • Architectural experience with Spark, AWS, and big data (Hadoop: Cloudera, MapR, Hortonworks)
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PyS…
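The Spark Streaming roles above revolve around windowed aggregation of event streams. The core idea can be sketched in plain Python as a tumbling (non-overlapping) window count; in Spark this would be a `groupBy` over a time window, and the event shapes here are hypothetical.

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds=60):
    """Count events per (key, window_start) over non-overlapping time windows."""
    counts = Counter()
    for ts, key in events:  # each event: (epoch seconds, event key)
        # Bucket the timestamp to the start of its window.
        window_start = ts - (ts % window_seconds)
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(0, "click"), (30, "click"), (61, "click"), (65, "view")]
counts = tumbling_window_counts(events, window_seconds=60)
# counts == {("click", 0): 2, ("click", 60): 1, ("view", 60): 1}
```

A real streaming engine adds what this sketch omits: incremental state, out-of-order events, and watermarks for closing windows.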
      • pune, maharashtra
      • contract
      Job Description:
      • 6+ years of relevant experience developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
      • Experience in at least one popular programming language: Python/Scala/Java
      • Strong analytic skills related to working with structured, semi-structured a…
      • pune, maharashtra
      • contract
      • 3+ years of relevant, strong experience developing data processing tasks using Spark and AWS cloud-native services
      • Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
      • Good to have: knowledge of Databricks Cloud
      • Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
      • Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
      • Expertise in at least one popular cloud pro…
      • pune, maharashtra
      • contract
      • 12+ years of experience in software development, with 5-6 years as an AWS Data Architect
      • Has worked in large teams and on complex projects
      • Prior BI, analytics, and ETL experience
      • Hands-on experience with modern analytics architecture and tools
      • Data modelling and data mart / data warehouse design
      Key roles and responsibilities:
      • Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark…
      • pune, maharashtra
      • contract
      Job Description:
      • Expert in writing Snowflake SQL queries against Snowflake
      • Developing scripts in JavaScript to extract, load, and transform data
      • Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures
      • In-depth understanding of data warehouse and ETL concepts and modelling structure principles
      • Expertise in AWS cloud-native services
      • Good to ha…
