

    8 jobs found for Big data in Bengaluru / bangalore

      • bengaluru / bangalore
      • permanent
      Senior-level experience in software application development (5+ years)
      • Track record of successfully developing RCP, Eclipse and Python applications for local environments
      • Track record in front-end development projects (using Angular)
      • Experience with big data, cloud technologies and microservices is an advantage
      • Experience with analytics and anomaly detection is an advantage
      • Sound knowledge of technologies such as Java, JavaScript, database systems, web protocols, CSS and HTML
      • Experience with NoSQL and document databases, especially MongoDB, is an advantage
      • Experience developing applications in a continuous-delivery style (Git, Jenkins, Maven, etc.)
      • Fluent English is a must; German or French is an advantage
      • Previous exposure to an agile software development culture (Scrum)
      • bengaluru / bangalore
      • permanent
      Mandatory skills:
      • A development background is a must
      • Hands-on experience on at least one data engineering project in the big data space
      • Knowledge of Talend + Hive (or any relational database, with good SQL knowledge) is a must
      • Good problem-solving skills and willingness to learn other big data technologies
      • bengaluru / bangalore
      • permanent
      • Candidate must have 6 to 8 years of experience in Hive, Hadoop and Talend ETL
      • Hands-on experience on at least one data engineering project in the big data space
      • Must be very strong in SQL (or any relational database)
      • Good problem-solving skills and willingness to learn other big data technologies
      • Experience with the Vertica database is good to have
      • Experience in Azure is nice to have
      • bengaluru / bangalore
      • permanent
      • Proven track record in front-end development projects (using Angular)
      • Experience with big data, cloud technologies and microservices is an advantage
      • Experience with analytics and anomaly detection is an advantage
      • Sound knowledge of technologies such as Java, JavaScript, database systems, web protocols, CSS and HTML
      • Experience with NoSQL and document databases, especially MongoDB, is an advantage
      • Experience with RabbitMQ and Apache Kafka messaging systems is an advantage
      • Experience with Elasticsearch is an advantage
      • Experience developing applications in a continuous-delivery style (Git, Jenkins, Maven, etc.)
      • Fluent English is a must; German is an advantage
      • Previous exposure to an agile software development culture (Scrum)
      • Knowledge of the railway systems domain is an advantage
      • bengaluru / bangalore
      • permanent
      One of our clients is looking for a Data Scientist for the Bangalore location.
      Desired candidate:
      • Master's degree in Computer Science, Engineering or another quantitative field
      • Knowledge of Python
      • At least 5 years of work experience with scientific modeling, including machine learning methods
      • Knowledge of code architecture, best practices and key tools such as GitHub and AWS
      • Hands-on experience modeling chemical or electrical processes is a plus
      Responsibilities:
      • Develop a chargeback prediction model using logistic regression
      • Develop a saves prediction model: feature engineering and class-imbalance handling using SMOTE and logistic regression (see the sketch after this listing)
      • Ingest data into Snowflake from big data Parquet/ORC files in Amazon S3
      • Develop Python modules for data logging, control and data loading to be used in ETL pipelines
      • Build complex SQL queries in Amazon and Snowflake for data analysis
      • Produce data insights at various levels using user-journey data (visits to subscriptions)
      • Extract data from different APIs
      • Run statistical significance and hypothesis testing on A/B test experiments
      • Take part in hiring drives as part of the interview panel
      • Transfer knowledge to new team members within and outside the team
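      The class-imbalance step named in this listing (SMOTE combined with logistic regression) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the client's actual model; the dataset shape, class weights and parameters are assumptions.

        # Illustrative sketch only: SMOTE oversampling followed by logistic regression.
        # The synthetic data stands in for something like chargeback labels (assumption).
        from sklearn.datasets import make_classification
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import classification_report
        from sklearn.model_selection import train_test_split
        from imblearn.over_sampling import SMOTE

        # Heavily imbalanced binary classification problem (95% / 5%).
        X, y = make_classification(n_samples=5000, n_features=20,
                                   weights=[0.95, 0.05], random_state=42)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, stratify=y, test_size=0.2, random_state=42)

        # Oversample only the training split so the held-out set stays untouched.
        X_res, y_res = SMOTE(random_state=42).fit_resample(X_train, y_train)

        model = LogisticRegression(max_iter=1000)
        model.fit(X_res, y_res)

        print(classification_report(y_test, model.predict(X_test)))

      Applying SMOTE only to the training split is a deliberate choice: evaluating on a test set that contains synthetic samples would overstate model quality.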
      • bengaluru / bangalore
      • permanent
      Job description: key tasks and job requirements
      Requirement elicitation / solution design / delivery:
      • Provide insights, BI solutions/reports and recommendations for global marketing programs
      • Deliver reports, dashboards and deep analyses to regional marketing teams, including the US
      • Extract, gather and format the right level of data for marketing programs and projects
      • Integrate and standardize data sets to enable ease of analytics and visualization (both cloud and enterprise)
      • Establish measures of success (metrics and KPIs) and synthesize KPIs
      • Analyze data to understand the performance and effectiveness of digital programs (across channels such as search, digital display, social media, email and onsite)
      • Provide guidance on online tagging for measurement and perform user acceptance testing (UAT)
      • Develop statistical models to understand customer journeys and influence marketing strategies
      • Assist with A/B and multivariate content test setup, execution and measurement
      • Liaise with the US Digital Intelligence team for continuous guidance, direction and alignment
      • Interact with internal groups and agency partners that support marketing
      Skill set required:
      • 7-10 years of practical experience with Adobe Analytics / Google Analytics: setup, configuration, implementation, processing rules and troubleshooting
      • Data visualization and dashboard/reporting: Tableau / Spotfire / Power BI / QlikView
      • Ability to think creatively to solve real-world business problems
      • Strong analytical and problem-solving abilities
      • Ability to self-motivate, manage concurrent projects and work with remote teams
      • Excellent listening, written and verbal communication skills
      • Research-focused mindset with strong attention to detail, high motivation, a good work ethic and maturity
      • Experience in Adobe Audience Manager, advanced statistics, A/B testing, databases, SQL, predictive analytics, SAS, R, Tableau/Spotfire, big data, Google AdWords, personalization, segmentation and machine learning is desirable
      • bengaluru / bangalore
      • permanent
      Position: Software Development Engineer II
      Location: Mumbai, Prabhadevi
      Reporting to: Head of Engineering
      Experience: 4-6 years
      Responsibilities:
      • Participate in all phases of the software development life cycle
      • Design and implement product features in collaboration with product and tech stakeholders
      • Write well-designed, efficient and testable code
      • Identify production and non-production application issues
      • Troubleshoot production support issues post-deployment and come up with solutions as required
      • Develop, test, implement and maintain application software
      • Recommend changes to improve established Java application processes
      • Track work progress and quality through efficient review and acceptance frameworks
      • Participate in the low-level design of a project or module
      • Mentor junior team members
      • Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
      Desired skills and experience:
      • Proven hands-on software development experience (this is an individual contributor role)
      • Experience in at least four of the following eight areas: work in a start-up; at least one e-commerce project; highly scalable and distributed systems; microservices or SOA-based systems; big data and analytics products; NoSQL databases such as MongoDB or Cassandra; work in product firms; significant exposure to cloud systems such as AWS
      • Object-oriented analysis and design using common design patterns
      • Profound insight into Java and JEE internals (class loading, memory management, transaction management, etc.)
      • Proven working experience with Java 8 (OOP, collections, generics, multithreading, the concurrency framework, functional programming)
      • Experience working with the Spring Framework (Spring MVC and Spring DI)
      • Excellent knowledge of relational databases, SQL and ORM technologies (JPA 2, Hibernate)
      • Data structures and algorithms
      • Outstanding analytical and problem-solving skills
      • bengaluru / bangalore
      • permanent
      Technical skills and expertise
      Must have:
      • Strong experience in the design, optimization, capacity planning and architecture of a large multi-tenant messaging cluster
      • Monitoring and tuning of messaging middleware platforms and data pipelines
      • Ability to create information solutions covering data security, data privacy, multi-tenancy, partitioning, etc., for the message broker
      • Working knowledge of storing, modeling and processing streaming data in a message broker (see the sketch after this listing)
      • Experience and knowledge in managing and deploying a message broker anywhere (on-prem, cloud, hybrid)
      • Strong willingness and interest to quickly learn and deliver in any tech stack for the message broker, based on business requirements
      Preferred skill set (expertise in at least one technology/platform under each category is required, but not limited to those listed):
      • Message broker: Kafka, RabbitMQ
      • Cloud: AWS, Azure, GCP
      • Data platform: CDP
      Added advantage if the candidate meets any of the expectations below:
      • Experience working with containers and container orchestration platforms such as Kubernetes/OpenShift
      • Experience with real-time stream processing systems such as Spark/Storm
      • Experience with process orchestration tools such as Apache Airflow/Apache NiFi
      • Experience with databases such as SQL Server/Cassandra/MongoDB/Redis
      • Experience with object-oriented/functional scripting languages: Python, Scala, Java, .NET, etc.
      • Knowledge of version control systems (Git/Perforce), issue tracking tools (Jira) and CI/CD tools (Jenkins)
      • Knowledge of Scrum and agile planning
      Responsibilities:
      • Assemble large, complex data sets that meet functional and non-functional business requirements
      • Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
      • Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using SQL and AWS big data technologies
      • Build analytics tools that use the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
      • Work with stakeholders, including the executive, product, data and design teams, to assist with data-related technical issues and support their data infrastructure needs
      • Keep our data separated and secure across national boundaries through multiple data centers and regions
      • Passion for technology, challenges and quality, with a dynamic, forward-leaning and proactive personality
      • Excellent interpersonal skills to drive meaningful collaboration, with a strong belief in the team and in delivering together
      • Strong commitment to ownership, responsibility and the cause of the business and team by aligning with leadership and management
      • Work with data and analytics experts to strive for greater functionality in our data systems
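      The streaming-data work with a message broker described in this listing can be illustrated with a minimal Kafka sketch using the kafka-python client. The broker address, topic name and payload shape below are assumptions for illustration only, not details from the posting.

        # Illustrative sketch only: produce and consume a small JSON stream with Kafka.
        # "localhost:9092" and the topic "sensor-events" are assumed values.
        import json
        from kafka import KafkaProducer, KafkaConsumer

        producer = KafkaProducer(
            bootstrap_servers="localhost:9092",
            value_serializer=lambda v: json.dumps(v).encode("utf-8"),
        )
        for i in range(10):
            producer.send("sensor-events", {"id": i, "value": i * 0.5})
        producer.flush()  # make sure all buffered messages reach the broker

        consumer = KafkaConsumer(
            "sensor-events",
            bootstrap_servers="localhost:9092",
            auto_offset_reset="earliest",
            value_deserializer=lambda b: json.loads(b.decode("utf-8")),
            consumer_timeout_ms=5000,  # stop iterating after 5 s with no new messages
        )
        for message in consumer:
            print(message.offset, message.value)

      In practice the consuming side would more likely be a stream processor such as Spark or Storm (both named in the listing) rather than a plain loop, but the produce/consume contract with the broker is the same.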


