Responsibilities:
• Write test data scripts based on ETL mapping artifacts
• Execute data scripts and perform detailed analysis of the results
• Create test strategies and test cases for applications that use ETL components
• Perform data mining and detailed data analysis on data warehousing systems
• Execute formal test plans to ensure the delivery of data-related projects
• Provide input to and support big data testing initiatives
• Define and track quality assurance metrics such as defects
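The "define and track quality assurance metrics" responsibility can be sketched with a minimal, generic example. The metric names and figures below are illustrative only, not from any real project:

```python
# Minimal sketch of tracking QA metrics for an ETL test cycle.
# All field names and numbers are illustrative.

def qa_metrics(total_cases: int, executed: int, passed: int,
               defects: int, kloc: float) -> dict:
    """Compute a few common quality assurance metrics."""
    return {
        "execution_rate": executed / total_cases,  # share of planned tests run
        "pass_rate": passed / executed,            # share of executed tests passing
        "defect_density": defects / kloc,          # defects per 1,000 lines of code
    }

metrics = qa_metrics(total_cases=200, executed=180, passed=171,
                     defects=12, kloc=24.0)
print(metrics)
```

In practice these numbers would be pulled from a test management tool per release and tracked over time rather than computed by hand.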
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
• Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
• Must have strong big data core knowledge and programming experience with Spark (Python/Scala)
• 5+ years of strong, relevant experience working with real-time data streaming using Kafka
• Experience in solving streaming
• Experience performing design, development, and deployment using Azure services (Data Factory, Azure Data Lake Storage, Synapse, PolyBase, SQL)
• Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity
• Experience creating Technical Specification Design and Application Interface Design documents
• File processing: XML, CSV, Excel, ORC, and Parquet file formats
• Develop batch processing
Job Description
● Database administration for MS SQL
● Expert in MS SQL Server; expertise in any other NoSQL database is an added advantage
● Basic understanding of Oracle DB, PostgreSQL, and MySQL technologies
● AWS Solutions Architect Associate certified
● Extensive knowledge of, and proficiency in using, all features within SQL Server and MongoDB, together with a full understanding of feature differences between versions
● Analyze and optimize the operational environment; sustain
• 4 to 5 years of experience building web applications in .NET, with at least 1 year on the .NET Core platform
• Strong knowledge of a .NET web framework: ASP.NET MVC, Web API, Web Pages, or ASP.NET Core
• Proficient in C# and/or VB.NET, with a good knowledge of their ecosystems
• Excellent critical, analytical, and problem-solving abilities
• Strong understanding of object-oriented programming and writing reusable libraries
• Familiar with various design and architectural
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
• Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
• Must have strong big data core knowledge and programming experience with Spark (Python/Scala)
• 3+ years of strong, relevant experience working with real-time data streaming using Kafka
• Experience in solving streaming
JD for Native HANA | 5+ years of experience
• Expertise in enterprise Native HANA data warehouse development projects
• Expertise in the design and development of SAP HANA models such as tables and views (attribute, analytic, calculation, and native views)
• Ability to develop complex SQL procedures, table functions, functions, SQLScript, and XS jobs
• Experience in data model content object creation and knowledge of DU handling
• Strong SQL knowledge and experience
• Expe
Exp: 6-10 years
Rate: Rs700.00 - Rs2,013.00 Hourly
Location: Pune
JD:
• Overall, 6-10 yrs. of database-related experience
• Around 3+ years of experience in cloud data platform administration (Snowflake)
• Good SQL skills and the ability to performance-tune queries
• Good to have knowledge of a BI tool; if not, the candidate should be willing to learn one as part of the assignment
• Self-motivated, fast learner, and willing to go outside of their comfort zone
• Sho
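The ability to performance-tune a query can be illustrated with a small, engine-agnostic sketch using SQLite (the table and column names here are invented): adding an index changes the query plan from a full table scan to an index search, which is the kind of before/after check a tuning exercise relies on.

```python
import sqlite3

# Toy demonstration of query tuning: inspect the plan before and after an index.
# Table and column names are illustrative only.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

def plan(sql: str) -> str:
    """Return SQLite's query plan for the given statement as one string."""
    rows = con.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(r[-1] for r in rows)  # last column holds the plan detail

query = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(query))  # full table scan of orders

con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))  # now searches via the index
```

Production engines (Snowflake, SQL Server, etc.) expose the same idea through their own EXPLAIN/query-profile facilities, though the plan format differs.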
• 3+ years of strong, relevant experience developing data processing tasks using Spark and AWS cloud-native services
• Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
• Good to have: knowledge of Databricks Cloud
• Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
• Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
• Expertise in at least one popular cloud provider
Job Description:
• 6+ years of relevant experience developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services
• Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
• Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
• Experience in at least one popular programming language: Python/Scala/Java
• Strong analytic skills related to working with structured, semi-structured, and unstructured datasets
Software Engineer
3–7 years' experience
Roles & Responsibilities
PRINCIPAL ACCOUNTABILITIES
Work with vendors and internal parties to focus on enterprise-related solutions for automation and Industry 4.0 transformation (MES/IIoT domain). The role will involve solving, developing, and ensuring continuous improvements through reports and application development. You will take direct ownership of the development of specific microservices or front-end modules as well
• 5+ years' experience building web applications in .NET, with at least 1 year in .NET Core and 3.5 years on the .NET platform
• Strong knowledge of a .NET web framework: ASP.NET MVC, Web API, Web Pages, or ASP.NET Core
• Proficient in C# and/or VB.NET, with a good knowledge of their ecosystems
• Excellent critical, analytical, and problem-solving abilities
• Strong understanding of object-oriented programming and writing reusable libraries
• Familiar with various desig
Exp: 8-10 years
Rate: Rs740.00 - Rs2,500.00 Hourly
Location: Chennai & Pune
JD:
Skills:
• Extensive experience designing, developing, and implementing Power BI reporting solutions
• Strong understanding of DAX, data modeling, and data visualization
• Experience integrating Power BI with other data sources and platforms (e.g., SQL Server, SharePoint, Excel)
• Advanced knowledge of Power BI widgets
• Knowledge of data warehousing concepts and best practices
• 6+ years of technology experience
• Spark Streaming experience is mandatory
• Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
• Exposure to API management
• Architectural experience with Spark, AWS, and big data platforms (Hadoop: Cloudera, MapR, Hortonworks)
• Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
• Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python,
• 8+ years of technology experience
• Spark Streaming experience is mandatory
• Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, microservices architecture
• Exposure to API management
• Architectural experience with Spark, AWS, and big data platforms (Hadoop: Cloudera, MapR, Hortonworks)
• Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame
• Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark
JD:
• Previous experience working on data governance/data lineage projects
• Collect requirements to map the data flows within the organization; create technical source-to-target mappings between systems, including the names of the databases/schemas/tables/physical data elements of each system as well as the transformation logic between them
• Perform gap analysis on technical lineage diagrams/mapping documents
• Perform QA testing and UAT on develop
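The source-to-target mapping work described above can be sketched as a simple data structure plus a lineage lookup. All system, table, and column names and the transformation rules here are invented for illustration:

```python
# Illustrative source-to-target mapping as used in data lineage documentation.
# Every name and transformation rule below is made up.

mappings = [
    {
        "source": "crm.public.customers.cust_nm",
        "target": "dwh.dim.customer.customer_name",
        "transformation": "TRIM then UPPER",
    },
    {
        "source": "crm.public.customers.birth_dt",
        "target": "dwh.dim.customer.birth_date",
        "transformation": "CAST VARCHAR to DATE (YYYY-MM-DD)",
    },
]

def lineage_for(target_column: str) -> list:
    """Trace a target column back to its source element(s) -- a gap-analysis aid."""
    return [m["source"] for m in mappings if m["target"] == target_column]

print(lineage_for("dwh.dim.customer.customer_name"))
```

In practice such mappings live in a governance tool or spreadsheet rather than code, but the structure (source element, target element, transformation logic) is the same, and gap analysis amounts to finding targets with empty or inconsistent lineage.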
focused on software development in the context of our application's database, i.e. taking functional and technical requirements as input for the team, with an extensive application as context and the task of providing tested deliverables accordingly. Beyond that, you and the team will also support people from other teams, e.g. those working in client projects or operations services. In this role, you will work on tuning SQL queries (including root cause analysis
Requirements
• Bachelor's degree in Computer Science/Engineering
• Core BA skills are required
• Insurance knowledge is preferred
• Data warehousing knowledge is good to have
• Experience in writing SQL queries and analysis
• 6+ years of relevant experience in business analysis
• Strong communication skills, in both persuasion and negotiation
• Ability to collaborate with a global team across multiple time zones
• Passionate, hungry, and determined
• Willingness to
Skill set required for the Java Developer position:
• Work on tasks assigned based on the needs of projects/support work
• Code applications in a clear and efficient way
• Unit testing
• Participate in project rituals and report on progress and issues
• Estimate work packages and projects
• Follow business unit standards for security and quality practices
Desired Skills
• Solid professional experience in software development with Core Java – preferably 1.8 yrs
Full Stack Engineer / Developer
3–7 years' relevant experience
Roles & Responsibilities
PRINCIPAL ACCOUNTABILITIES
Work with vendors and internal parties to focus on digitization and Industry 4.0 transformation in a manufacturing business. The role will involve assisting in identifying opportunities for digitization and automation of workflows, and developing solutions including but not limited to extensions of ERP and MES systems and applications integrated into curren
Job Description:
• Expert in writing Snowflake SQL queries against Snowflake
• Develop scripts using JavaScript to extract, load, and transform data
• Hands-on experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Python, tasks, streams, Time Travel, the optimizer, Metadata Manager, data sharing, and stored procedures
• In-depth understanding of data warehouse and ETL concepts and modeling structure principles
• Expertise in AWS cloud-native services
• Good to have
12+ years of experience in software development, with 5-6 years of experience as an AWS Data Architect. Has worked on large teams and complex projects. Prior BI, analytics, and ETL experience. Hands-on experience with modern analytics architecture and tools. Data modeling and data mart/data warehouse design.
Key Roles and Responsibilities:
• Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark
Job Description – Vulnerability Management Analyst
We are seeking an experienced Vulnerability Management Analyst with experience in vulnerability management across an enterprise. The role focuses on helping the organization look deeper and see further into the security of its environment to help improve and embed controls across the company. The role will be responsible for evaluating evidence by combining advanced data analysis and technology tools t
Job Location: Pune
Mandatory Skills: Oracle APEX & PL/SQL
Job Description:
POSITION SUMMARY
The IT Applications Analyst – Materials Management is responsible for requirements definition, functional design, custom development, and technical support within the Hexagon SmartPlant Materials application and other data integration applications. Also included in this position's responsibilities is product and lifecycle management for the Hexagon SmartPlant Materials Suite