    26 jobs found for Big Data

      • bangalore, karnataka
      • permanent
      • 1
      Expected Skillset: Good experience in Big Data testing in all phases of testing. Should have hands-on experience in HDFS, Hive, Spark, Oracle and MongoDB. Good grasp of QA and testing methodologies. Sound experience in writing SQL. Good to have: knowledge of ETL Informatica Testing. Roles and Responsibilities: Review and analyze system specifications. Report test results and enter problems and issues into the bug database. Monitor and update bug reports as ne
      • kenton
      • permanent
      • 12 months
      Roles & Responsibilities:
      1. Experience in developing web and mobile applications using HTML5, CSS3, jQuery, jQuery Mobile
      2. Knowledge of error management, network management and battery-specific device APIs
      3. Knowledge of the back-end integration process with APIs, JSON and RESTful services
      4. Work with usability experts to ascertain feasibility
      5. Work with the architect to create the app architecture
      6. Create application and protocol designs
      7. Develop the
      • hyderabad, telangana
      • permanent
      Hadoop development and implementation. Loading data from disparate data sets. Pre-processing using Hive. Designing, building, installing, configuring and supporting Hadoop. Translate complex functional and technical requirements into detailed design. Perform analysis of vast data stores and uncover insights. Maintain security and data privacy. Create scalable and high-performance web services for data tracking. High-speed querying. Acquire complete knowled
      • gurgaon, haryana
      • permanent
      What We Need. Must have:
      • Strong programming experience with Python
      • Experience working with Big Data streaming services such as Kinesis, Kafka, etc.
      • Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
      • Experience working with Hadoop and Big Data processing frameworks (Spark, Hive, NiFi, Spark Streaming, Flink, etc.)
      • Experience with SQL and SQL analytical functions; experience participating in key business,
      • bengaluru, karnataka
      • permanent
      Bachelor's or higher degree in Computer Science or a related discipline. At least 2 years of experience in data pipeline and data product design, development and delivery, and in deploying ETL/ELT solutions on Azure Data Factory. Experience with Azure-native data/big-data tools, technologies and services, including Storage Blobs,
      • hyderabad, telangana
      • permanent
      Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases (preferably PostgreSQL and MySQL). Database administration and data performance management, including the details of indices, normalisation, query optimisation and resultant execution plans. Building and optimising ‘big data’ data pipelines, architectures and data sets. Root cause analysis on intern
      • pune, maharashtra
      • permanent
      Basic qualifications:
      • Bachelor's degree in an engineering or technical field such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar
      • 3+ years of experience with data warehouse technical architectures, ETL/ELT, reporting/analytic tools and scripting
      • 4+ years of demonstrated quantitative and qualitative data experience with data modeling and ETL development
      • 3+ years of data modeling experience and proficiency in writing complex
      • pune, maharashtra
      • permanent
      Basic qualifications:
      • Bachelor's degree in an engineering or technical field such as Computer Science, Physics, Mathematics, Statistics, Engineering or similar
      • 3+ years of experience with data warehouse technical architectures, ETL/ELT, reporting/analytic tools and scripting
      • 4+ years of demonstrated quantitative and qualitative data experience with data modeling and ETL development
      • 3+ years of data modeling experience and proficiency in writing complex
      • bangalore, karnataka
      • permanent
      Required:
      • At least 5-8 years of experience
      • Experience working with SaaS and complex enterprise systems
      • Experience in building enterprise systems using Java
      • Good knowledge of databases and SQL
      • Can work independently
      • Good communication skills
      Good to have:
      • Comfortable with a range of server technologies
      • Big data analytics
      • Experience in Scala
      • bangalore, karnataka
      • permanent
      Required:
      • At least 4-8 years of experience
      • Experience working with SaaS and complex enterprise systems
      • Experience in building enterprise systems using Java
      • Good knowledge of databases and SQL
      • Can work independently
      • Good communication skills
      Good to have:
      • Comfortable with a range of server technologies
      • Big data analytics
      • Experience in Scala
      • mumbai, maharashtra
      • permanent
      8+ years of overall industry experience and a minimum of 8-10 years of experience building and deploying large-scale data processing pipelines in a production environment.
      • Technical expertise: experience building data pipelines and data-centric applications using distributed storage platforms like GCP, HDFS and SAP BW, and distributed processing platforms like Hadoop, HDInsight, Spark, Hive, Oozie, Airflow, etc.
      • Focus on excellence: has practical experience of D
      • noida
      • permanent
      Job Description: 3 to 9 years of software development experience in Core Java/Python/Scala and Big Data technologies (Spark/Hive/Hadoop). Skills required:
      • Experience working on a Hadoop distribution, with a good understanding of core concepts and best practices
      • Good experience in building/tuning Spark pipelines in Scala/Python
      • Good experience in writing complex Hive queries to derive business-critical insights
      • Good programming experience with Java/Python/Scal
      • bangalore, karnataka
      • permanent
      • Proficient in the design and development of data pipeline/integration jobs
      • Experience in Informatica PowerCenter
      • Experience in constructing logical and physical data models to build the Data Warehouse/Data Lake
      • Good experience in handling SQL, indexes, partitioning, distribution, etc.
      • Extensive experience in ETL methodology for performing Data Profiling, Data Migration, Extraction, Transformation and Loading using ETL tools
      • Good knowledge of dat
      • bangalore, karnataka
      • permanent
      • Proven track record in front-end development projects (using Angular)
      • Experience in the areas of big data, cloud technologies and microservices is an advantage
      • Experience in the areas of analytics and anomaly detection is an advantage
      • Sound knowledge of technologies such as Java, JavaScript, database systems, web protocols, CSS, HTML
      • Experience with NoSQL document databases, especially MongoDB, is an advantage
      • Experience with R
      • mumbai, maharashtra
      • permanent
      Job Specifications. Knowledge:
      • Good handle on data architecture and analytics
      • Should have a strong understanding of Data Warehouse design, BI reporting and dashboard development
      • Knowledge of visualization tools like Power BI and Tableau, and databases like Snowflake and SQL Server
      • Knowledge of big data and analytics tools like R, Python and NoSQL databases
      • Knowledge of non-profit sector systems and solutions is desirable
      Skills/Competencies: Deciding and initiatin
      • chennai, tamil nadu
      • permanent
      • 6
      Integrator (Junior Integrator): Be engaged in the end-to-end technical integration process for clients. They will represent the team at internal and external meetings to build strong working relationships with our clients and our team.
      • Support the onboarding of new clients through gathering client intelligence, mapping deliverables and supporting project planning. They will be responsible for first-class implementation and testing of analytics solutions a
      • bangalore, karnataka
      • contract
      • 12 months
      We are hiring for an organization that creates connections across the global food system to help the world thrive. Roles & Responsibilities: Assist our internal traders during pre-fixture - fetch port information, restrictions and port disbursement amounts from agents. Handle vessel certificates - keep vessel certificates up to date in Q88 and procure the latest certificates from the Master based on validity. Prepare and issue Letters of Indemnity as required. Maintain r
      • bangalore, karnataka
      • permanent
      AWS Solutions Architect/DevOps/SysOps certified. Strong practical Linux and Windows-based systems administration skills in a cloud or virtualized environment. Experience in programming languages like Java/Python. Demonstrated ability to think strategically about business, product and technical challenges. Integration of AWS cloud services with on-premises technologies from Microsoft, IBM, Oracle, HP, SAP, etc. Should have good experience with SQL and NoSQL database
      • bangalore, karnataka
      • permanent
      Mandatory Skillset:
      • 15-20 years of industry experience in engineering and IT in a technical architect role, developing large-scale systems based on complex integration solutions and server/cloud-based web applications
      • Prior extensive experience working in the capacity of solution architect and data architect, developing architectures on cloud and hybrid networks
      • Experience in architecting end-to-end IoT solutions using the Azure, AWS or Google cloud stack
      • Expe
      • bengaluru / bangalore
      • permanent
      • 0 years 0 months 0 days
      Job Description: The role requires self-sufficiency, experience working with multiple data sets, building data schemas and an affinity for financial data, and experience working with development and test teams to validate requirements and outputs. The candidate is required to support the business in integrating the new requirements of IFRS17 into their reporting solution by changing its current Analysis of Change process to move away from its spreadsheet-based ‘Right Hand Si
      • gurgaon, haryana
      • permanent
      Experience Level: minimum 12 years. Location: Mumbai/Bangalore/Delhi NCR. Functional/Technical Skills:
      • In-depth architectural, design and leadership skills
      • A thorough understanding of the technical landscape in the following areas: Java/J2EE; microservices (Spring/Spring Boot, etc.); public clouds (any one of AWS, Azure or GCP); Docker, K8s, CI/CD; and Big Data
      Working knowledge of Web: HTML5, CSS3, ES2015+
      • bangalore, karnataka
      • permanent
      About the Role: The person's main focus is on customer success, which is achieved by driving engineering resources against project goals and deliverables. What is the job like?
      • Being on the frontline of the mission, driving complex product and technology initiatives and ensuring successful and timely execution
      • Being driven by achieved business value, rather than solely by project deliverables
      • Being a hands-on leader and ensuring that technology is appli
      • chennai, tamil nadu
      • permanent
      Company: a leading shipping technology company. Location: Chennai. Experience: 5-8 years and 8-15 years. Responsibilities for DW Architect: Design and implement relational and dimensional models in Snowflake and related tools. Design, build and support stable, scalable data pipelines and ETL processes that cleanse, structure and integrate big data sets from multiple data sources into the DW and provision them to transactional systems and Business Intelligence reporting. Coll
      • bengaluru / bangalore
      • permanent
      Technical Skills & Expertise. Must have:
      • Strong experience in the design, optimization, capacity planning and architecture of a large multi-tenant messaging cluster
      • Monitoring and tuning messaging middleware platforms and data pipelines
      • Creating information solutions covering data security, data privacy, multi-tenancy, partitioning, etc., for the message broker
      • Working knowledge of storing, data modeling and processing streaming data in a message broker
      • Experience and kn
      • hyderabad, telangana
      • permanent
      Overview: Are you excited about Big data and Analytics? Would you love the opportunity to design and build data pipeline technologies that will power the next generation advanced analytics capabilities and AI/ML capabilities? Does solving complex problems motivate you? If yes, we are looking for you! We are looking for a Team Lead/Senior Data Engineer to join our Data Pipeline team and help build the technologies used to drive our advanced Analytics capabi
      • no data
      • permanent
      Applicant Submission Notification. Hello, Avinash has applied for the job posting V21.80 Requirement Testing. To view the applicant details, please navigate through this link: View Profile. Internal Submission Template: Requirement. Job Posting - Job Code: JPC - 1622; Position title: V21.80 Requirement Testing; Posted By: Pradeep Kumar; Respond By: N/A; No. of Positions: 5; Recruitment Manager: Arun R; Indus: Construction; State: New York; Country: United States; City: Rochester; Remo
