Summary: Manager – Civil with a minimum of 15 years of experience in the construction field. Experience in handling large-scale building projects. Shall have leadership qualities. Shall have proficiency in written & verbal English communication. Proficiency in MS Project and MS Office/Excel.
THE HOUSE OF ABHINANDAN LODHA (HOABL)
We are a dynamic consumer tech brand that is disrupting land ownership by leveraging technology to make land, an age-old asset, young again. We
• Candidate should have the ability to work on C++ development in a Unix/Linux environment
• Candidate should be able to write and run unit test cases
• Candidate should be able to perform Git operations without any issues
• Candidate should be able to work in an Agile environment and follow Scrum Agile practices
• Candidate should be able to communicate well with respective counterparts and concerned team members
Experience: 8+ years
Required skill sets:
• Team lead experience
• AWS skills required (3+ years)
• Node.js required
• React required
• MongoDB required
• Go (Golang) optional
It would be a plus to have:
• GCP – a basic understanding
• Python – basic (optional)
Please consider a must-have background of Node + Java + AWS + React for the candidates. Additionally, GCP + Terraform + G
Individual Contributor role
Will be reporting to AVP – Compensation & Benefits
Will handle APAC countries excluding India
Exposure to countries – Australia, China, New Zealand, Hong Kong, Indonesia, Japan, Malaysia, Singapore, Taiwan – is necessary
Interview rounds (min. 3)
JD – Fiori App Developer
Summary: Responsible for design, development and maintenance of all the SAP UI5/Fiori applications (including OData and CDS) and other web applications that are integrated with SAP.
Responsibilities: Develop and maintain SAP UI5/Fiori applications (end-to-end, including backend). Design and develop applications consistent with the UI strategy of the company. Work with the IT solution leaders to develop wire-frame models and UI protot
One of our esteemed clients is hiring for a Big Data Architect.
Location: Mumbai
Role: Sr Solution Architect – Big Data / Data Analytics
Roles and Responsibilities:
Build, lead and grow a cross-functional team of various engineering fields to deliver Big Data / Data Analytics / AI / ML powered applications. Be responsible for the solution architecture of Big Data / Data Analytics projects using agile methodology. Understand and break down client problems, bringing
Oracle Hyperion Financial Management
Experience: 8-10 years
Responsibilities: Discovery, Design and Development
Must Have: OneStream, Hyperion, HFM, Implementation
Primary Skills:
EPM Solution: Strong hands-on experience with OneStream system design, build, installation and implementation. Oracle EPM experience is an added advantage. The primary user interface toolkit for screens, forms, and reports; create applications and/or templates to meet requirement
Our esteemed client is hiring for a Java Developer – Mumbai location.
Experience: 2-10 years
JD – Technical Skills: Java
• Sound knowledge and hands-on practice in Core Java and Object-Oriented Programming (OOP) concepts
• Skill in writing reusable Java libraries
• Solid understanding of SQL
• Experience with multithreaded development; modern approaches to concurrency, memory management, caching, networking (socket programming), distributed systems
Nice to have:
• Financia
JD: We are looking for an AWS Redshift (Developer / Data Architect) with 3-5 years of experience. Experience with parallel query optimization and execution, large-scale data analytics, and replicated data storage.
Requirements – Desired Knowledge and Skills:
• Experience with most cloud products such as Amazon AWS.
• Experience as a developer in multiple cloud technologies including AWS EC2, S3, VPC, VPN, Amazon API Gateway, AWS ECS, AWS Glue, EMR, AWS Lambda, AWS
Company Description
OpenBet leads the global gaming market, providing innovative software that powers the world’s most successful operators. We pride ourselves on product innovation and technical excellence and are constantly adding new applications and new functionality to our product suite, used by billions of end users. And, while many other companies have been downsizing, we continue to see growth across our business. Want to be part of our story?
Job Des
JD for Native HANA | 5+ years of experience
Expertise in enterprise Native HANA data warehouse development projects
Expertise in design and development of SAP HANA models such as tables and views (Attribute, Analytic, Calculation views and Native views)
Ability to develop complex SQL procedures, table functions, functions, SQLScript and XS jobs
Experience in data model content object creation and knowledge of DU handling
Strong SQL knowledge and experience
Expe
Job Description for Sr Full Stack Java Developer – Global Payments
Experience: 5 to 8 years of experience in IT building microservice-based enterprise applications, with full-stack Java/J2EE and Spring Boot experience.
Education: Degree in Science, Engineering, Technology or Commerce
People: Work collaboratively with Agile team members to deliver maximum business value. Guide and groom junior developers on the Java/J2EE technology stack.
Product Roadmap: Actively work with
Azure Data Architect with Data Modeller
Must have:
· Data Architect experience
· SSIS
· Experience in data modelling
· 10+ years of experience as a Data Architect & Data Modeller
Good to have:
· Azure experience
Role and experience expected:
· Experience with implementing architecture solutions using the Azure Data Analytics platform
· Knowledge of both data lake and data warehouse technologies
· Well versed with ELT concepts and design
· Good with Azure Data F
Experience in performing design, development & deployment using Azure services (Data Factory, Azure Data Lake Storage, Synapse, PolyBase, SQL)
Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity.
Experience in creating Technical Specification Designs and Application Interface Designs.
File processing – XML, CSV, Excel, ORC, Parquet file formats
Develop batch processin
The Visualization Engineer will be responsible for building dashboards, analyses and visuals using Power BI and Tableau.
Responsibilities:
· Build visualizations and dashboards using Power BI and Tableau; migrate reports from Tableau to Power BI
· Design logical models in support of building data models and the metadata layer of visualization tools
· Develop and unit test metadata, calculations, hierarchies and visualizations
· Develop and build supporting vi
Overall 3+ years of relevant strong experience in managing Big Data workloads on cloud.
Hands-on experience with Hadoop, Spark and Hive using Java or Python.
Very good experience in AWS technologies like S3, EMR, EC2, CloudFormation.
Good experience with the Snowflake data warehouse tool.
Commendable knowledge of Hadoop and its ecosystem components like HDFS, Hue, Sqoop, Spark, Spark Streaming, Hive, Oozie, Airflow and other components required in building end-to-end data pipe
Data Architect with Data Modeller
Must have:
· Data Architect experience
· Experience in data modelling using the Data Vault methodology
· 10+ years of experience as a Data Architect & Data Modeller
Good to have:
· Snowflake experience
· AWS/Azure experience
Role and experience expected:
· Experience with implementations of Data Vault modelling, architecture, and methodology
· Knowledge of both data lake and data warehouse technologies
· Well versed wi
Core functional responsibilities
• Sales – Prepare offers in the system against inbound requirements; follow up on and convert the offers.
• Sales – Attend inbound leads (via web & telephone) and convert them to business.
• Sales – Desk research (web search for prospects) & cold calling to identify whether they are a prospect.
• Post Order – Payment follow-ups.
• Post Order – Warranty claim coordination with Belgium & the customer.
• Coordination between various teams
Job Role
• Analyse data of various products/machines (forklifts, warehousing equipment, construction machinery, agricultural equipment, and aerial work platforms) and their sub-assemblies as per vehicle make, model, generation, and type differentiation for parts analysis.
• Research and determine the fast-moving items (serviceable parts) based on technical parameters of spare parts.
• Refer to various technical documents, product information and brand-specific do
Search for candidates who have AWS experience with project management. They should:
Have managed data projects for 4-5 years
Have knowledge of Agile/Scrum
Understand top line and bottom line
Have experience handling projects with the AWS stack
Have experience handling a team of minimum 10 people
Have experience managing P&L for their projects
This should suffice for this job ID.
Job Description
------------------------------------------
Customer communication - exp
Overall 4+ years of experience in Azure and DevOps.
· Must be well versed with automation on cloud (both infrastructure and application deployments).
· Automation using ARM Templates and Terraform.
· Must have a good understanding and experience of containers and DevOps tools and services.
· Must be able to understand networking and communication on the cloud platform.
· Must have a good understanding and experience of AKS/K
Ability to translate technical requirements into data transformation jobs.
· Build data pipelines to bring data from source systems; cleanse and transform data to support data analytics and reporting.
· Experience with developing and implementing using ETL tools like Informatica, Talend and other AWS ETL integration tools, and Talend Data Catalog.
· Strong knowledge of data warehousing and data modelling concepts.
· Strong experienc
Hands-on experience with PySpark / Java Spark / Scala Spark
Proficient understanding of distributed computing principles
Proficiency with data processing: HDFS, Hive, Spark, Scala/Python
Independent thinker, willing to engage, challenge and learn new technologies.
Understanding of the benefits of data warehousing, data architecture, data quality processes, data warehouse design and implementation, table structure, fact and dimension tables, logical and physi
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
· Must have strong big-data core knowledge & experience in programming using Spark – Python/Scala.
· 5+ years of relevant strong experience in working with real-time data streaming using Kafka.
· Experience in solving Streaming
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are as below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures.
· Must have strong big-data core knowledge & experience in programming using Spark – Python/Scala.
· 3+ years of relevant strong experience in working with real-time data streaming using Kafka.
· Experience in solving Streaming
You will be working in the team which is responsible for real-time message processing using Kafka as the event streaming platform. In this team you will work together with your colleagues in a (scaled) Agile way. You will be working in a team with Developers, Testers, Technical Application Managers, Scrum Masters, Product Owners and Analysts. The team develops, maintains and optimizes new and existing services in the area of messaging/integration.
Skills we can’
Development and marketing of customized software in the area of electronic payment transactions
You will be involved from conception through configuration to installation of the software components
Supporting our customers in all technical matters
Adaptation of the system to customer processes, as well as enhancements or new versions
Developer instructions
Creation and review of project documentation and technical specifications
Test conception and quality