Design and develop robust data integration pipelines using SnapLogic.
Implement ETL (Extract, Transform, Load) processes to integrate data from various sources.
Ensure data quality and integrity during the transfer process.
Build web services and create integrations between applications using SnapLogic, Splunk, JSON, and HTML, leveraging RESTful design principles.
Develop business-critical solutions using both client-side and server-side technologies.
Work close
Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas, etc., on AWS.
Experience in developing ETL, OLAP-based, and analytical applications.
Experience in ingesting batch and streaming data from various data sources.
Strong experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server, etc.).
Ability to quickly learn and develop expertise in existing highly complex applications and architectures.
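As an illustration of the "complex SQL" skill the posting above calls for (this sketch is not part of the original posting, and the table and column names are hypothetical), a minimal example using Python's standard library with in-memory SQLite standing in for the RDBMS, combining a CTE with a window function:

```python
import sqlite3

# In-memory SQLite stands in for Oracle/PostgreSQL/SQL Server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id INTEGER, amount REAL, event_date TEXT);
    INSERT INTO raw_events VALUES
        (1, 10.0, '2024-01-01'),
        (1, 25.0, '2024-01-02'),
        (2,  5.0, '2024-01-01');
""")

# A CTE aggregates per-user totals; a window function then ranks
# users by total spend without a second round trip.
query = """
WITH totals AS (
    SELECT user_id, SUM(amount) AS total
    FROM raw_events
    GROUP BY user_id
)
SELECT user_id, total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank
FROM totals
ORDER BY spend_rank
"""
rows = conn.execute(query).fetchall()
for user_id, total, rank in rows:
    print(user_id, total, rank)
```

The same CTE-plus-window-function pattern carries over directly to the production databases the posting names, since all of them support `WITH` clauses and `RANK() OVER (...)`.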
Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas, etc., on Azure.
Experience in developing ETL, OLAP-based, and analytical applications.
Experience in ingesting batch and streaming data from various data sources.
Strong experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server, etc.).
Ability to quickly learn and develop expertise in existing highly complex applications and architectures.