data engineer - gcp in chennai

location: chennai, tamil nadu
function: Information Technology
position type: permanent
experience: 4 to 6 years
reference number: 936722
contact: randstad india

job description

Strong Java/Python development experience along with data integration experience; should have worked with very large data volumes.
Strong communication skills and experience with Agile methodologies, ETL/ELT, data movement, and data processing.
Role Description:
Strong expertise in building data services on Google Cloud Platform (GCP)
Able to build end-to-end applications using an appropriate choice of big data GCP components such as GCS, Dataflow, Dataproc, BigQuery, Bigtable, and Pub/Sub, and open-source components such as MongoDB, Cassandra, and Kafka
Implement and support efficient, reliable data pipelines to move data from a wide variety of sources to data marts / a data lake (see the Dataflow sketch after this list)
Working experience with at least one RDBMS (e.g., Oracle, MySQL, PostgreSQL) and at least one NoSQL store (e.g., HBase, MongoDB, Cassandra)
Implement data aggregation, cleansing, and transformation layers
Ability to build data ingestion frameworks that account for access patterns, scalability, response time, and availability
Experience with big data integration and stream processing technologies such as Apache Kafka, Kafka Connect (Confluent), Apache NiFi, Flume, Sqoop, Spark, and Hive
Experience writing pub/sub APIs and developing with Kafka Streams, Kafka Connect, and KSQL (see the Kafka sketch below)
Developing new processors within Apache NiFi, establishing new data flows, and troubleshooting existing flows to the various hardware instances associated with the different data platforms
Experience with serialization formats such as JSON and/or BSON.
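
To illustrate the pipeline skills above, here is a minimal sketch of a streaming Dataflow job built with the Apache Beam Python SDK (apache-beam[gcp]): it reads JSON messages from a Pub/Sub topic and appends them to a BigQuery table. The project, region, topic, bucket, table, and schema below are hypothetical placeholders, not details from this posting.

```python
# Minimal streaming pipeline sketch: Pub/Sub -> parse JSON -> BigQuery.
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

TOPIC = "projects/my-project/topics/events"  # hypothetical topic
TABLE = "my-project:analytics.events"        # hypothetical table

def run():
    options = PipelineOptions(
        streaming=True,
        project="my-project",
        region="asia-south1",
        runner="DataflowRunner",             # use "DirectRunner" to test locally
        temp_location="gs://my-bucket/tmp",  # hypothetical bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "ParseJson" >> beam.Map(lambda raw: json.loads(raw.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```

Switching the runner to "DirectRunner" is the usual way to develop and test such a pipeline locally before deploying it to Dataflow.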
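
For the Kafka items, a minimal produce/consume round trip using the confluent-kafka Python client; the broker address, topic name, and consumer group are assumptions for illustration. (Kafka Streams and KSQL are JVM/SQL-based tools, so this sketch only shows the plain client API.)

```python
# Minimal Kafka sketch: produce one JSON event, then consume and print it.
# Broker address, topic, and group id are hypothetical.
import json

from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("raw-events", json.dumps({"user_id": "u1", "event": "click"}).encode("utf-8"))
producer.flush()  # block until the broker has acknowledged the message

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "etl-demo",
    "auto.offset.reset": "earliest",  # start from the beginning for a new group
})
consumer.subscribe(["raw-events"])
try:
    while True:
        msg = consumer.poll(1.0)      # wait up to 1 s for a record
        if msg is None:
            continue
        if msg.error():
            continue                  # skip transport-level errors in this sketch
        print(json.loads(msg.value().decode("utf-8")))
finally:
    consumer.close()
```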

Benefits
NA

Client Introduction
One of the top Oracle Marketing Cloud partners worldwide, and among the earliest and most experienced Salesforce partners in South Asia.

skills

Data Engineer - GCP

qualification

As listed under the job description above.