Skills & Competencies Required:
- Extensive hands-on experience with ETL development and data integration for large-scale, complex systems, including performance tuning and optimization.
- Proven expertise in designing and architecting ETL solutions using platforms such as Synapse Analytics, Azure Data Factory, Fabric, Redshift, or Databricks.
- A solid understanding of data warehousing concepts and ETL processes.
- Advanced SQL skills, including query optimization, complex joins, and window functions.
- Expert-level proficiency in Python (including PySpark) for large-scale data manipulation, transformation, and automation.
- Experience with Azure DevOps and CI/CD processes.
- Excellent problem-solving and analytical skills.
- Experience creating post-implementation documentation.
- Excellent collaboration and communication skills, with the ability to work effectively across multidisciplinary teams and present technical concepts to non-technical stakeholders.
- Attention to detail, commitment to data quality, and a proactive approach to identifying and addressing data issues.
Requirements:
- Bachelor’s or Master’s degree in Information Technology, Computer Science, or a related field.
- 5-10 years of relevant experience.
- Strong interpersonal and analytical skills, including creativity, organizational ability, high commitment, initiative in task execution, and the ability to learn new IT concepts quickly.