Python Developer (Experience: 5 to 7 years)
Automation using Python, including building an automation framework. Build packages/products with Python templates, using templating languages such as Jinja. A data engineering background and an understanding of Kubernetes/Docker are good to have.
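To illustrate the Jinja-style templating this role calls for, here is a minimal, hypothetical sketch of rendering a template from Python with the jinja2 package; the template text and variables are placeholders, not anything specified in the posting.

```python
# Minimal Jinja2 sketch: render a template from Python.
# The template text and values below are illustrative placeholders.
from jinja2 import Template

template = Template("service: {{ name }}\nreplicas: {{ replicas }}")
rendered = template.render(name="payments-api", replicas=3)
print(rendered)
```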
· 6+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, Microservices Architecture, exposure to API Management
· Architectural experience with Spark, AWS and Big Data (Hadoop: Cloudera, MapR, Hortonworks).
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame.
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark.
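As a hedged illustration of the Spark Streaming + Kafka stack listed above, the sketch below reads a Kafka topic with PySpark Structured Streaming and prints records to the console; the broker address and topic name are assumptions, and the spark-sql-kafka connector package must be available on the cluster.

```python
# Minimal PySpark Structured Streaming sketch: consume a Kafka topic.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-streaming-sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
    .option("subscribe", "events")                         # assumed topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Write each micro-batch to the console for inspection.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```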
· 8+ years of technology experience
· Spark Streaming experience is mandatory
· Technology stack: Spark Streaming, Kafka, Spark, Flink, AWS (good to have), Java/Python/Scala, Microservices Architecture, exposure to API Management
· Architectural experience with Spark, AWS and Big Data (Hadoop: Cloudera, MapR, Hortonworks).
· Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame.
· Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark.
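For the workload-optimization point above, one common Spark SQL/DataFrame technique is broadcasting a small lookup table so the large side is not shuffled. The sketch below is illustrative only; the table paths and join key are assumptions.

```python
# Sketch: broadcast-join a small dimension table to avoid shuffling the fact table.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("spark-optimization-sketch").getOrCreate()

facts = spark.read.parquet("/data/facts")  # assumed large table
dims = spark.read.parquet("/data/dims")    # assumed small lookup table

joined = facts.join(broadcast(dims), on="dim_id", how="left")
joined.explain()  # check that BroadcastHashJoin appears in the physical plan
```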
Job Description:
Expert in writing Snowflake SQL queries against Snowflake. Developing scripts using JavaScript to Extract, Load, and Transform data. Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. In-depth understanding of Data Warehouse and ETL concepts and modeling structure principles. Expertise in AWS cloud-native services. Good to have
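A minimal sketch of querying Snowflake from Python with the snowflake-connector-python package, including a Time Travel clause of the kind mentioned above; every connection parameter and object name here is a placeholder rather than the client's actual setup.

```python
# Sketch: run a Snowflake SQL query with Time Travel from Python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder
    user="my_user",           # placeholder
    password="my_password",   # placeholder
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

cur = conn.cursor()
# Time Travel: query the table as it existed one hour (3600 seconds) ago.
cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
print(cur.fetchone())
cur.close()
conn.close()
```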
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
· Must have strong big-data core knowledge and experience in programming using Spark – Python/Scala
· 5+ years of relevant, strong experience in working with real-time data streaming using Kafka
· Experience in solving Streaming
This role is currently open for a client that delivers business excellence using cloud technologies. Details of the role are below.
· Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
· Must have strong big-data core knowledge and experience in programming using Spark – Python/Scala
· 3+ years of relevant, strong experience in working with real-time data streaming using Kafka
· Experience in solving Streaming
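To make the real-time Kafka requirement concrete, here is a small, hedged sketch using the kafka-python package to consume JSON messages; the topic name and broker address are assumptions.

```python
# Sketch: consume JSON messages from a Kafka topic with kafka-python.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "transactions",                       # assumed topic
    bootstrap_servers="localhost:9092",   # assumed broker
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    # message.value holds the deserialized JSON payload.
    print(message.offset, message.value)
```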
5+ years of relevant experience: Spark, ETL, SQL. Any one of AWS/Azure/GCP, preferably AWS. Optional: Spark Streaming.
Role & Responsibilities –
Experience in developing and optimizing ETL pipelines and big data pipelines. Must have strong big-data core knowledge and experience in programming using Spark – Python/Scala. Familiarity with DevOps frameworks – Git/Bitbucket, Jenkins, etc.
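A minimal sketch of the Spark/ETL/SQL work described above: read a CSV, transform it with Spark SQL, and write partitioned Parquet. File paths, column names and the filter condition are assumptions for illustration.

```python
# Sketch: a small extract-transform-load step in PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

raw = spark.read.option("header", True).csv("/data/raw/orders.csv")  # extract
raw.createOrReplaceTempView("orders")

cleaned = spark.sql("""
    SELECT order_id, customer_id, CAST(amount AS DOUBLE) AS amount, order_date
    FROM orders
    WHERE amount IS NOT NULL
""")  # transform

cleaned.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")  # load
```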
Responsibilities:
• Administration and support of enterprise-wide MQ messaging infrastructure on both distributed and mainframe platforms.
• Work with application development teams, providing support of middleware components.
• Troubleshoot MQ issues, working with internal client teams and external vendors.
• Adhere to the bank's middleware standards and governance, including MQ and API administration and utilization.
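When troubleshooting MQ issues from a client host, a quick put/get round trip can confirm connectivity. The sketch below uses the pymqi package; the queue manager, channel, host and queue names are placeholders, not the bank's actual configuration.

```python
# Sketch: MQ connectivity test - put a message and read it back with pymqi.
import pymqi

qmgr = pymqi.connect("QM1", "DEV.APP.SVRCONN", "mq.example.com(1414)")  # placeholders
queue = pymqi.Queue(qmgr, "DEV.QUEUE.1")                                # placeholder queue

queue.put(b"connectivity test message")
print(queue.get())  # should print the same message back

queue.close()
qmgr.disconnect()
```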
Responsibilities:
Demonstrate expertise related to cloud service provider platforms, including Microsoft Azure, AWS, and Google Cloud Platform and their embedded security, as well as multi-cloud security management technologies. Assist in the design and development of cloud security solutions based on customer requirements, evaluating business strategies and cloud architecture best practices. Manage and execute cloud security solutions across design, implementation
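As one small, hypothetical example of the embedded-security work described above, the sketch below uses boto3 to flag AWS security groups that allow inbound traffic from anywhere; the region is an assumption, and equivalent checks on Azure or GCP would use their own SDKs.

```python
# Sketch: list AWS security groups with rules open to 0.0.0.0/0.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

for group in ec2.describe_security_groups()["SecurityGroups"]:
    for permission in group.get("IpPermissions", []):
        for ip_range in permission.get("IpRanges", []):
            if ip_range.get("CidrIp") == "0.0.0.0/0":
                print("Open to the internet:", group["GroupId"], group.get("GroupName", ""))
```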
Required Skills, Knowledge, Relevant Work Experience
* 3–5 years of excellent experience in the areas of Computer Vision & Machine Learning
* Deep knowledge & experience of:
  o Image processing, Computer Vision, Machine Learning
  o Stereo camera data processing, 2D and 3D transformations
  o Mono & stereo camera calibration & imaging
  o Object detection, tracking and recognition; machine learning techniques, analysis, and algorithms
* Proficiency in:
  o Python
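For the camera-calibration item above, here is an illustrative mono-calibration sketch using OpenCV's chessboard workflow; the image directory, board size and file format are assumptions, and the sketch assumes at least one usable chessboard image is found.

```python
# Sketch: mono camera calibration from chessboard images with OpenCV.
import glob
import cv2
import numpy as np

board_size = (9, 6)  # assumed inner-corner grid of the chessboard
objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)

object_points, image_points = [], []
for path in glob.glob("calib_images/*.png"):  # assumed image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if found:
        object_points.append(objp)
        image_points.append(corners)

# Estimate intrinsics and distortion from the detected corners.
error, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, gray.shape[::-1], None, None
)
print("RMS reprojection error:", error)
```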
Software Engineer
3–7 years' experience
Roles & Responsibilities
PRINCIPAL ACCOUNTABILITIES
Work with vendors and internal parties to focus on enterprise-related solutions for automation and Industry 4.0 transformation (MES/IIoT domain). The role will involve solving, developing and ensuring continuous improvements through reports and application development. You will take direct ownership of the development of specific microservices or front-end modules as well
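To show the shape of the microservice ownership mentioned above, here is a minimal, hypothetical FastAPI service with a single endpoint; the route and payload are illustrative and not a specification of the MES/IIoT services.

```python
# Sketch: a one-endpoint microservice with FastAPI.
from fastapi import FastAPI

app = FastAPI(title="machine-status-sketch")

@app.get("/machines/{machine_id}/status")
def machine_status(machine_id: str) -> dict:
    # A real service would query MES/IIoT data sources here.
    return {"machine_id": machine_id, "status": "running"}

# Run locally with: uvicorn main:app --reload   (assuming this file is main.py)
```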
Job Description: 6+ years of relevant experience in developing big data processing tasks using PySpark/Glue/ADF/Hadoop and other cloud-native services. Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame. Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark. Experience in at least one popular programming language – Python/Scala/Java. Strong analytic skills related to working with structured, semi-structured and unstructured datasets.
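For the structured/semi-structured point above, a common PySpark pattern is reading nested JSON and flattening an array column; the file path and field names below are assumptions.

```python
# Sketch: flatten semi-structured JSON with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode

spark = SparkSession.builder.appName("semistructured-sketch").getOrCreate()

events = spark.read.json("/data/raw/events.json")  # assumed nested JSON input

flattened = events.select(
    col("user.id").alias("user_id"),     # assumed nested struct field
    explode(col("items")).alias("item")  # assumed array column
).select("user_id", "item.sku", "item.qty")

flattened.show()
```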
Full Stack Engineer / Developer
3–7 years' relevant experience
Roles & Responsibilities
PRINCIPAL ACCOUNTABILITIES
Work with vendors and internal parties to focus on digitization and Industry 4.0 transformation in a manufacturing business. The role will involve assisting in identifying opportunities for digitization and automation of workflows, and developing solutions including but not limited to extensions of ERP and MES systems and applications integrated into current
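As a purely hypothetical illustration of integrating an application with an existing system such as an ERP or MES, the sketch below calls a REST endpoint with the requests package; the URL, token and fields are invented for the example.

```python
# Sketch: pull work orders from a hypothetical MES REST API.
import requests

base_url = "https://mes.example.com/api"         # hypothetical endpoint
headers = {"Authorization": "Bearer <token>"}    # placeholder credential

response = requests.get(f"{base_url}/work-orders", headers=headers, timeout=10)
response.raise_for_status()

for order in response.json():
    print(order.get("id"), order.get("status"))
```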
3+ years of relevant, strong experience in developing data processing tasks using Spark and AWS cloud-native services. Strong knowledge of optimizing workloads developed using Spark SQL/DataFrame. Good to have: knowledge of Databricks Cloud. Proficiency with data processing: Hadoop, Hive, Spark, Scala, Python, PySpark. Strong analytic skills related to working with structured, semi-structured and unstructured datasets. Expertise in at least one popular cloud provider
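A brief sketch of the Spark-on-AWS pattern named above: read Parquet from S3, derive a date column, and write the result back partitioned by date. Bucket names and columns are assumptions, and the cluster is assumed to have S3 credentials configured.

```python
# Sketch: read from and write to S3 with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import to_date, col

spark = SparkSession.builder.appName("s3-pipeline-sketch").getOrCreate()

clicks = spark.read.parquet("s3a://example-raw-bucket/clicks/")     # assumed bucket

daily = clicks.withColumn("event_date", to_date(col("event_ts")))   # assumed timestamp column
daily.write.mode("append").partitionBy("event_date").parquet(
    "s3a://example-curated-bucket/clicks_daily/"
)
```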
Overall 3+ years of relevant, strong experience in managing big data workloads on the cloud. Hands-on experience with Hadoop, Spark and Hive using Java or Python. Very good experience in AWS technologies like S3, EMR, EC2, CloudFormation. Good experience with the Snowflake data warehouse tool. Commendable knowledge of Hadoop and its ecosystem, including HDFS, Hue, Sqoop, Spark, Spark Streaming, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines
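For the Airflow/orchestration item above, here is a hedged skeleton of an end-to-end pipeline as an Airflow 2.x DAG; the DAG id, schedule and task bodies are illustrative placeholders.

```python
# Sketch: a two-task Airflow DAG (extract -> load).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data, e.g. from S3 or a Sqoop export")  # placeholder

def load():
    print("load curated data, e.g. into Hive or Snowflake")  # placeholder

with DAG(
    dag_id="daily_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```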
Job Description – Vulnerability Management Analyst
We are seeking an experienced Vulnerability Management Analyst who has experience with vulnerability management across an enterprise. The role focuses on helping the organization look deeper and see further into the security of the environment to help improve and embed controls across the company. The role will be responsible for evaluating evidence by combining advanced data analysis and technology tools to
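As a small, hypothetical example of the data-analysis side of this role, the sketch below summarizes vulnerability scan findings by severity and asset with pandas; the CSV file and its columns are invented for the example, not a real scanner export.

```python
# Sketch: aggregate vulnerability findings by severity and asset.
import pandas as pd

findings = pd.read_csv("scan_findings.csv")  # assumed export with severity/asset columns

summary = (
    findings.groupby(["severity", "asset"])
    .size()
    .reset_index(name="open_findings")
    .sort_values("open_findings", ascending=False)
)
print(summary.head(10))
```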