Roles & Responsibilities:
• Use design thinking and a consultative approach to conceive cutting-edge technology solutions for
business problems
• Understand client requirements, develop cloud strategy/architecture and solutions, and communicate
them effectively to CxOs as part of pre-sales
• Be a data leader: apply knowledge of next-generation data capabilities and explain their benefits to
organizations adopting them
• Evaluate new technologies and identify practical business cases that deliver enhanced business value
and increase operating efficiency
• Design, architect, and maintain enterprise solutions on big data analytics platforms (cloud or on-premises)
• Implement AWS/Azure/GCP services in a variety of distributed-computing enterprise environments
• Strong hands-on experience with AWS or Azure (e.g. Amazon Athena, Amazon EMR, Amazon Redshift,
Amazon Kinesis, Amazon Elasticsearch Service, AWS Glue, Azure Synapse, Azure Data Lake Storage,
Azure Data Factory, Azure Databricks)
• Strong hands-on experience with big data platforms (Cloudera or Databricks preferred)
• Strong hands-on experience integrating traditional and modern databases such as Oracle, SQL Server,
Redshift, Netezza, MongoDB, Teradata, graph databases, NoSQL databases, etc.
• Strong hands-on experience with one or more programming languages (Python, Scala, or Java)
• Deep understanding of the big data stack (e.g. Hadoop, Spark, Hive, Ranger, Atlas, YARN, Kafka,
Solr/Elasticsearch)
• Good understanding of the infrastructure underlying big data solutions (clusters, distributed
computing, storage)
• Deep understanding of various file formats (e.g. Avro, Parquet, ORC)
• Expertise in infrastructure capacity sizing and cloud-service costing to drive optimized solution architecture
• Hands-on experience building and maintaining cloud platforms through an Infrastructure-as-Code (IaC) framework
• Perform POCs on tools and provide recommendations to solve specific business needs
• Lead a team of data engineers on best practices and the approach to meeting the data engineering
needs of the solution
• Implement DevOps practices such as infrastructure as code, CI/CD components, and automated
deployments
• Excellent knowledge of architectural design patterns, performance tuning, and database and functional
design
• Excellent understanding of different data domains such as data lake, data mesh, data fabric, etc.
• Understand the existing architecture and develop plans to support growth in cloud adoption
• Mentor and inspire a data team solving problems through R&D while pushing state-of-the-art
solutions
• Liaise with colleagues and business leaders across domestic and global regions to deliver impactful
analytics projects and drive innovation at scale
Specific Competencies for the role:
• B.E./B.Tech. plus MBA or MCA (Systems/Data/Data Science/Analytics/Finance) with a good academic
background
• Minimum of 15 years of on-the-job experience in data engineering and related areas
• Proficiency in at least two cloud platforms (AWS, Azure, GCP)
• At least 5 full-lifecycle implementations, from conceptualization, design, and architecture through
delivery of the cloud solution
• Minimum of 7 years’ experience as a Data Engineer or in a similar role
• Minimum of 7 years’ experience with data, data pipelines, and analytics solutions
• At least 7 years’ experience deploying cloud solutions on large-scale projects
• At least 7 years’ experience leading or managing a data engineering team