You will be working in the team responsible for real-time message processing using Kafka as the event streaming platform. In this team you will work together with your colleagues in a (scaled) Agile way. You will be working in a team with Developers, Testers, Technical Application Managers, Scrum Masters, a Product Owner and Analysts. The team develops, maintains and optimizes new and existing services in the area of Messaging/integration. Skills we can'
• Candidate should have the ability to work on C++ development in a Unix/Linux environment
• Candidate should be able to write and run unit test cases
• Candidate should be able to perform Git operations without any issues
• Candidate should be able to work in an Agile environment and follow Scrum Agile practices
• Candidate should be able to communicate well with the respective counterparts and concerned team members
EXPERIENCE - 12 Years
EDUCATIONAL QUALIFICATION - Any Graduate
LOCATION - Pune
JOB DESCRIPTION (Server hardware / Remarks)
• Conduct site check with customer-designated personnel
• Log the case with the vendor for installation
• Server hardware mounting and installation
• Soft configuration - RAID group (Remarks: Ok)
• Server firmware/BIOS update
• OS installation
• Standard software installation
• Server hardening and patching (new patches) (Remarks: Ok)
• Self-test
• System handover
• Day-to-day activity (Tr
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
· Must have strong big-data core knowledge & experience in programming using Spark – Python/Scala
· 5+ years of relevant strong experience working with real-time data streaming using Kafka
· Experience in solving Streaming
Data Architect with Data Modeller
Must have:
· Data Architect experience
· Experience in data modelling using the Data Vault methodology
· 10+ years of experience as a Data Architect & in Data Modelling
Good to have:
· Snowflake experience
· AWS/Azure experience
Role and Experience expected:
· Experience with implementations of the Data Vault modelling, architecture, and methodology
· Knowledge of both data lake and data warehouse technologies
· Well versed wi
· Experience in performing design, development & deployment using Azure services (Data Factory, Azure Data Lake Storage, Synapse, PolyBase, SQL)
· Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity
· Experience in creating the Technical Specification Design and Application Interface Design
· File processing – XML, CSV, Excel, ORC, Parquet file formats
· Develop batch processin
Expected technical skills:
· Stack: Java/J2EE
· Spring Boot
· AngularJS / Angular 2+ / NodeJS / JavaScript
· Agile methodology
· Elasticsearch
· Docker
· OpenShift
· VoiceXML / IntelliJ
· Git/GitLab/Jenkins
Expected functional skills:
· SaaS solution applications
Databases: MariaDB, MySQL, PostgreSQL, MongoDB
Knowledge & capacity:
· Code reviewing
· Sonar
About Maersk: A.P. Moller - Maersk is an integrated container logistics company striving to connect and simplify its customers’ supply chains. As the global leader in shipping services, the company operates in 130 countries and employs roughly 76,000 people. Our purpose is to improve life for all by integrating the world. We believe in an integrated world, one planet; connected all the way. Maersk is undergoing unprecedented transformation and growth, reth
Job Role
• Analyse data of various products/machines (forklifts, warehousing equipment, construction machinery, agricultural equipment, and aerial work platforms) and their sub-assemblies as per vehicle make, model, generation, and type differentiation for parts analysis.
• Research and determine the fast-moving items (serviceable parts) based on technical parameters of spare parts.
• Refer to various technical documents, product information and brand-specific do
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
· Must have strong big-data core knowledge & experience in programming using Spark – Python/Scala
· 3+ years of relevant strong experience working with real-time data streaming using Kafka
· Experience in solving Streaming
· 3+ years of relevant strong experience developing data processing tasks using Spark and AWS cloud-native services
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
· Good to have: knowledge of Databricks Cloud
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
· Strong analytic skills related to working with structured, semi-structured and unstructured datasets
· Expertise in at least one popular cloud pro
Job description - Candidates should have implementation project experience, and should have work experience in Pricing, ATP, Credit Management, Output Determination, the shipping process, and configuring and customizing Sales, Delivery and Billing document types, among others.
Your missions
You will be part of a team of 6 to 8 people. You will participate in the implementation of new functionalities of the offer or in its adaptation to the needs of new customers:
· Specification/design of new functionalities/expressions of customer needs
· Realization of components and associated unit tests
· Analysis/correction of anomalies detected during the test phases or in production
The technical environment and languages used in the project are
Azure Data Architect with Data Modeller
Must have:
· Data Architect experience
· SSIS
· Experience in data modelling
· 10+ years of experience as a Data Architect & in Data Modelling
Good to have:
· Azure experience
Role and Experience expected:
· Experience with implementing architecture solutions using the Azure data analytics platform
· Knowledge of both data lake and data warehouse technologies
· Well versed with ELT concepts and design
· Good with Azure Data F
Job Description
● Strong experience working on Postgres databases in a production environment.
● The DBA will be supporting the customer on a 24x7 basis.
● Should have hands-on experience with Postgres user management and maintenance.
● Should have worked on security implementation for Postgres databases.
● Has worked on maintenance tasks and has knowledge of backup and recovery.
● Must have database-related issue troubleshooting / analytical skills.
● Should have good co
One to three years of work experience in a finance background, preferably a PTP background. Skill set required: basic accounting knowledge & entry-level invoice posting work.
Candidates with an AP profile background and 1 to 3 years of experience who can join by 1st Mar '23.
Mobile Phone Programs Analyst
Location: Pune, India
Experience: 2 to 4 Years
Position Summary:
The Mobile Phone Programs Analyst is responsible for tracking the invoicing and inventory of our global mobile phone services, determining users' mobile service needs, entering orders with our mobile vendors, trouble ticket processing & resolution, and providing mobile phone operational support.
Responsibilities:
Responsible for auditing/approving mobile service provi
Position: Operations Analyst
Overview:
The vacant position is for an Operations Analyst to support the Professional Services Group (PSG) organization. Cloud Software Group (Citrix + TIBCO) PSG is the customer-facing technical organization providing presales support and post-sales services to assist customers with implementing and supporting Cloud Software Group solutions. This open position will be working under the PSG Operations team in India. The
12+ years of experience in software development, with 5-6 years of experience as an AWS Data Architect. Has worked in large teams and on complex projects. Has prior BI, analytics and ETL experience. Hands-on experience with modern analytics architecture and tools. Data modelling and data mart / data warehouse design.
Key Roles and Responsibilities:
Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark
We have a requirement for asset engineers at the Pune location for the Bits HAM team. Help in getting quality profiles, with candidates having a minimum of 1 year of experience in asset management.
· The engineer should have good communication skills and knowledge of Excel/Outlook.
· The engineer should have a minimum of 1 year of experience in asset management.
· The engineer should work from the office as per the shift time (general shift, 8:30 AM to 6 PM).
Location: Pune, Hinjewadi Phase 2
JD for Native HANA | 5+ years of experience
· Expertise in enterprise Native HANA data warehouse development projects
· Expertise in design and development of SAP HANA models such as tables and views (Attribute, Analytic, Calculation and Native views)
· Ability to develop complex SQL procedures, table functions, functions, SQLScript and XS jobs
· Experience in data model content object creation and knowledge of DU handling
· Strong SQL knowledge and experience
· Expe
6+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, Microservices Architecture
· Exposure to API management
· Architectural experience with Spark, AWS and Big Data (Hadoop, Cloudera, MapR, Hortonworks)
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python,
8+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, Microservices Architecture
· Exposure to API management
· Architectural experience with Spark, AWS and Big Data (Hadoop, Cloudera, MapR, Hortonworks)
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PyS
Job Description
● Database administration for MS SQL
● Expert in MS SQL Server; any other NoSQL DB will be an added advantage
● Basic understanding of Oracle DB, PostgreSQL and MySQL technologies
● AWS SA Associate certified
● Extensive knowledge of, and proficiency in utilizing, all features within SQL Server and MongoDB, together with a full understanding of feature differences between each version
● Analyze and optimize the operational environment; sustain
Scouting talent for a Test Automation Lead for a leading governance, risk, and compliance (GRC) advisory and technology firm in financial services, based in Pune.
Skills required:
· Minimum 5 to 10 years of hands-on test automation experience spanning mobile and cross-browser UI, integration, performance testing, etc.
· Experience architecting large-scale, reliable, and performant test automation from the ground up, including building reusable libraries, C
Overall 4+ years of experience in Azure and DevOps.
· Must be well versed with automation on cloud (both infrastructure and application deployments)
· Automation using ARM Templates and Terraform
· Must have a good understanding and experience of containers and DevOps tools and services
· Must be able to understand networking and communication on a cloud platform
· Must have a good understanding and experience of AKS/K8s
· Should be willing to work both of
Job Description:
· Expert in writing Snowflake SQL queries against Snowflake
· Developing scripts using JavaScript to extract, load, and transform data
· Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures
· In-depth understanding of data warehouse and ETL concepts and modeling structure principles
· Expertise in AWS cloud-native services
· Good to ha
· 5+ years' experience building web applications in .NET, with at least 1 year in .NET Core and 3.5 years on the .NET platform
· Strong knowledge of a .NET web framework: ASP.NET MVC, Web API, Web Pages or ASP.NET Core
· Proficient in C# and/or VB.NET, with good knowledge of their ecosystems
· Excellent critical, analytical, and problem-solving abilities
· Strong understanding of object-oriented programming and writing reusable libraries
· Familiar with various desig
Job Location: Pune
Mandatory Skills: Oracle APEX & PL/SQL
Job Description:
POSITION SUMMARY
The IT Applications Analyst, Materials Management is responsible for requirements definition, functional design, custom development and technical support within the Hexagon SmartPlant Materials application and other data integration applications. Also included in this position's responsibilities is product and lifecycle management for the Hexagon SmartPlant Materials Suite