<p>About Woodside Energy </p><p>We are a global energy company, providing reliable and affordable energy to help people lead better lives. Join our team at Woodside Global Solutions in Bengaluru, where talent, digital expertise, and operational excellence converge to solve complex energy challenges, accelerate change, and reimagine business capabilities to support Woodside's global operations and our role in the energy transition. </p><p>Founded in 1954, Woodside established the liquefied natural gas (LNG) industry in Australia 40 years ago and supplies customers around the globe. 70 years on, Woodside continues to be driven by a spirit of innovation and determination. </p><p>At Woodside, we know great results come from our people feeling valued, getting the support they need to reach their full potential, and working in a psychologically and physically safe work environment. We believe in nurturing talent and providing opportunities for continuous learning and career advancement. </p><p>Refer to our corporate website for more information about our different locations and projects: What We Do </p><p>About Woodside Global Solutions </p><p>Woodside Global Solutions in Bengaluru is being built as a hub of excellence to drive innovation, digital transformation, and global collaboration. </p><p>Working as one global team, the Woodside Digital team is a trusted partner driving transformation within the organisation. We are bold in our ambitions and resolute in our actions. Through cutting-edge AI, robust cyber security, and advanced data solutions, we drive innovation and influence every part of our business. </p><p>We are looking for talented professionals who are passionate about technology and eager to make a global impact, helping to shape the future of Woodside together. 
</p><p>About the role </p><p>The purpose of the Data Engineer position is to leverage technical expertise and domain knowledge to design, build, and maintain efficient and robust data solutions. The Data Engineer is responsible for building, testing, and deploying data pipelines on the EDP, and shall ensure that pipelines are developed and deployed with a secure-by-design approach, delivering robust, thoroughly tested, and maintainable solutions. </p><p>Duties & Responsibilities: </p><p>Build <br />Pipeline Development <br />Develop pipelines using the standard patterns for data pipelines and workflows, utilizing StreamSets, Kestra, dbt, and Git <br />Design and implement data storage and processing solutions employing Snowflake <br />Utilize AWS services for cloud-based platform tooling infrastructure, including but not limited to: Lambda, ECS, MSK, RDS, EC2, Secrets Manager, ALB, CloudWatch, EventBridge <br />Utilize Terraform for AWS and Azure deployments <br />Leverage and integrate APIs for data access and manipulation <br />Write Python scripts for common data processing and automation tasks <br />Leverage Platform APIs and web applications to enforce platform security <br />Development experience with Go, SQL, C#, .NET, JavaScript, shell scripts, and container platforms like Docker <br />Experience integrating with time-series source systems such as Honeywell Plant Historian Database and OSIsoft PI <br />Experience with authentication mechanisms including but not limited to OAuth 2.0, OIDC, Microsoft Entra, key-pair authentication, certificate-based authentication, and SAML-based SSO <br /> <br />Test <br />Quality Assurance <br />Create and execute comprehensive test plans to ensure the pipelines’ functionality and performance <br />Develop unit tests, integration tests, and end-to-end tests for data pipelines and workflows <br />Ensure data accuracy and consistency through rigorous testing processes <br />Leverage automated testing processes to enhance efficiency <br /> <br />Governance <br />Compliance & Risk <br />Due to the Crown Jewel nature of the enterprise data platform, Data Engineers may have access to PII, Confidential, and Most Confidential data <br />This role requires strict adherence to access processes and procedures to maintain data privacy and security <br />Identify and report any potential breaches of the Data Information and Systems Processes <br /> <br />Operate <br />Platform Maintenance <br />Monitor and manage the platform to ensure optimal performance and uptime <br />Conduct regular maintenance tasks such as updates, patches, and backups <br />Resolve any issues or incidents related to the platform in a timely manner <br />Continuously improve platform operations through automation and optimization <br />Strong experience with Windows and Unix-like operating systems <br /> <br />Security <br />Secure by Design <br />Implement security best practices throughout the pipeline development and deployment process <br />Conduct regular security reviews and vulnerability assessments <br />Ensure data encryption, access control, and other security measures are enforced <br />Use credential management platforms such as Thycotic Secret Server and AWS Secrets Manager <br /> <br />Support <br />Technical Guidance <br />Assist in troubleshooting and resolving intricate technical issues <br />Deliverables <br />Robust and scalable data pipelines with well-documented code and processes <br />Comprehensive test plans and automation scripts ensuring platform reliability <br />Regular security assessments and compliance reports <br />Technical support and guidance documentation for delivery data engineers <br />Deliver secure, robust, and maintainable data pipelines <br />Ensure high-quality and thoroughly tested data solutions <br />Maintain compliance with security standards and best practices <br />Maintain compliance with the Data Lifecycle 
Management Process <br />Maintain compliance with Data Privacy standards and best practices <br /> <br />Skills & Experience</p><p>Bachelor’s or Master’s degree in Computer Science, Data Science, Information Technology, or a related field, with a focus on data engineering or data analytics <br />Technical Expertise: Strong proficiency in programming languages such as Python, SQL, Java, or Scala for data processing and analysis <br />Data Engineering Skills: Experience with data modeling, ETL processes, data warehousing, data integration, and data pipeline development <br />Database Knowledge: Proficiency in relational databases (e.g., SQL Server, PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra) <br />Cloud Platform Experience: Working knowledge of cloud platforms such as Snowflake, AWS, Azure, or Google Cloud Platform for data storage and processing <br />Data Visualization: Experience with data visualization tools (e.g., Power BI) to create meaningful insights from data <br />Data Quality Assurance: Understanding of data quality principles, data governance, and data validation processes <br />Analytical Skills: Ability to analyze complex data sets and identify trends, patterns, and insights to drive data-driven decision-making <br />Problem-Solving Abilities: Proficiency in troubleshooting data-related issues, identifying root causes, and implementing solutions <br />Project Management Knowledge: Familiarity with project management methodologies to contribute effectively to project planning and execution <br />Communication Skills: Strong verbal and written communication skills to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders <br />Continuous Learning: Willingness to stay updated with the latest data technologies, tools, and industry trends to enhance data engineering skills <br />Experience: Prior experience in data engineering, data analytics, or related roles with a track record of successful data project delivery <br />Technical Leadership: Ability to make informed, strategic decisions that align technology with business objectives, while balancing short-term and long-term trade-offs <br />Customer Focus: Deep understanding of customer needs and how to translate them into effective technical solutions that drive business value <br />Collaboration: Encourages collaboration across teams and stakeholders, breaking down silos and ensuring alignment <br />Problem-Solving: Strong analytical and problem-solving skills, capable of addressing complex technical challenges and delivering innovative solutions <br />Innovation: Ability to lead innovation in technology while maintaining an eye on product-market fit and user experience <br />Agility: Adaptability in a fast-moving environment, with a mindset focused on delivering high-impact solutions quickly and iteratively <br /> <br />If you think you can do this job but don’t meet all the criteria, that’s OK! Please apply. At Woodside, we value people with diverse experiences and backgrounds, as they provide unique perspectives that help us innovate. <br /> <br />Recognition & Reward <br /> <br />What you can expect from us: </p><p>Commitment to your ongoing development, including on-the-job opportunities and formal programs <br />Inclusive parental leave entitlements for both parents <br />Values-led culture <br />Flexible work options <br />Generous annual leave, sick leave, and casual leave <br />Cultural and religious leave with flexible public holiday opportunities <br />A competitive remuneration package featuring performance-based incentives with uncapped Employer Provident Fund </p><p>Woodside is committed to fostering an inclusive and diverse workforce culture, which is supported by our Values. </p><p>Inclusion centres on all employees creating a climate of trust and belonging, where people feel comfortable to bring their whole self to work. 
We also offer supportive pathways for all employees to grow and develop leadership skills. <br />If you are ready to take your career to the next level and be part of a global organisation that values innovation and excellence, apply now through our careers portal or connect with our recruitment team. </p><p> </p><h3>experience</h3>10
<p>About Woodside Energy </p><p>We are a global energy company, providing reliable and affordable energy to help people lead better lives. Join our team at Woodside Global Solutions in Bengaluru, where talent, digital expertise, and operational excellence converge to solve complex energy challenges, accelerate change, and reimagine business capabilities to support Woodside's global operations and our role in the energy transition. </p><p>Founded in 1954, Woodside established the liquefied natural gas (LNG) industry in Australia 40 years ago and supplies customers around the globe. 70 years on, Woodside continues to be driven by a spirit of innovation and determination. </p><p>At Woodside, we know great results come from our people feeling valued, getting the support they need to reach their full potential, and working in a psychologically and physically safe work environment. We believe in nurturing talent and providing opportunities for continuous learning and career advancement. </p><p>Refer to our corporate website for more information about our different locations and projects: What We Do </p><p> </p><p>About Woodside Global Solutions </p><p> </p><p>Woodside Global Solutions in Bengaluru is being built as a hub of excellence to drive innovation, digital transformation, and global collaboration. </p><p>Working as one global team, the Woodside Digital team is a trusted partner driving transformation within the organisation. We are bold in our ambitions and resolute in our actions. Through cutting-edge AI, robust cyber security, and advanced data solutions, we drive innovation and influence every part of our business. </p><p>We are looking for talented professionals who are passionate about technology and eager to make a global impact, helping to shape the future of Woodside together. 
</p><p> </p><p>About the role </p><p> </p><p>The purpose of the Data Engineer position is to leverage technical expertise and domain knowledge to design, build, and maintain efficient and robust data solutions. The Data Engineer is responsible for building, testing, and deploying data pipelines on the EDP, and shall ensure that pipelines are developed and deployed with a secure-by-design approach, delivering robust, thoroughly tested, and maintainable solutions. </p><p> </p><p>Duties & Responsibilities: </p><p>Build </p><p>Pipeline Development </p><ul><li><p>Develop pipelines using the standard patterns for data pipelines and workflows, utilizing StreamSets, Kestra, dbt, and Git </p></li></ul><ul><li><p>Design and implement data storage and processing solutions employing Snowflake </p></li></ul><ul><li><p>Utilize AWS services for cloud-based platform tooling infrastructure, including but not limited to: Lambda, ECS, MSK, RDS, EC2, Secrets Manager, ALB, CloudWatch, EventBridge </p></li></ul><ul><li><p>Utilize Terraform for AWS and Azure deployments </p></li></ul><ul><li><p>Leverage and integrate APIs for data access and manipulation </p></li></ul><ul><li><p>Write Python scripts for common data processing and automation tasks </p></li></ul><ul><li><p>Leverage Platform APIs and web applications to enforce platform security </p></li></ul><ul><li><p>Development experience with Go, SQL, C#, .NET, JavaScript, shell scripts, and container platforms like Docker </p></li></ul><ul><li><p>Experience integrating with time-series source systems such as Honeywell Plant Historian Database and OSIsoft PI </p></li></ul><ul><li><p>Experience with authentication mechanisms including but not limited to OAuth 2.0, OIDC, Microsoft Entra, key-pair authentication, certificate-based authentication, and SAML-based SSO </p></li></ul><p> </p><p>Test </p><p>Quality Assurance </p><ul><li><p>Create and execute comprehensive test plans to ensure the pipelines’ functionality and performance </p></li></ul><ul><li><p>Develop unit tests, integration tests, and end-to-end tests for data pipelines and workflows </p></li></ul><ul><li><p>Ensure data accuracy and consistency through rigorous testing processes </p></li></ul><ul><li><p>Leverage automated testing processes to enhance efficiency </p></li></ul><p> </p><p>Governance </p><p>Compliance & Risk </p><ul><li><p>Due to the Crown Jewel nature of the enterprise data platform, Data Engineers may have access to PII, Confidential, and Most Confidential data </p></li></ul><ul><li><p>This role requires strict adherence to access processes and procedures to maintain data privacy and security </p></li></ul><ul><li><p>Identify and report any potential breaches of the Data Information and Systems Processes </p></li></ul><p> </p><p>Operate </p><p>Platform Maintenance </p><ul><li><p>Monitor and manage the platform to ensure optimal performance and uptime </p></li></ul><ul><li><p>Conduct regular maintenance tasks such as updates, patches, and backups </p></li></ul><ul><li><p>Resolve any issues or incidents related to the platform in a timely manner </p></li></ul><ul><li><p>Continuously improve platform operations through automation and optimization </p></li></ul><ul><li><p>Strong experience with Windows and Unix-like operating systems </p></li></ul><p> </p><p>Security </p><p>Secure by Design </p><ul><li><p>Implement security best practices throughout the pipeline development and deployment process </p></li></ul><ul><li><p>Conduct regular security reviews and vulnerability assessments </p></li></ul><ul><li><p>Ensure data encryption, access control, and other security measures are enforced </p></li></ul><ul><li><p>Use credential management platforms such as Thycotic Secret Server and AWS Secrets Manager </p></li></ul><p> </p><p>Support </p><p>Technical Guidance </p><ul><li><p>Assist in troubleshooting and resolving intricate technical issues </p></li></ul><p>Deliverables </p><ul><li><p>Robust and scalable data pipelines with well-documented code and processes </p></li></ul><ul><li><p>Comprehensive test plans and automation scripts ensuring platform reliability </p></li></ul><ul><li><p>Regular security assessments and compliance reports </p></li></ul><ul><li><p>Technical support and guidance documentation for delivery data engineers </p></li></ul><ul><li><p>Deliver secure, robust, and maintainable data pipelines </p></li></ul><ul><li><p>Ensure high-quality and thoroughly tested data solutions </p></li></ul><ul><li><p>Maintain compliance with security standards and best practices </p></li></ul><ul><li><p>Maintain compliance with the Data Lifecycle Management Process </p></li></ul><ul><li><p>Maintain compliance with Data Privacy standards and best practices </p></li></ul><p> </p><p>Skills & Experience: </p><p> </p><ul><li><p>10+ years of Data Engineering/Data Analyst experience </p></li><li><p>Bachelor’s or Master’s degree in Computer Science, Data Science, Information Technology, or a related field, with a focus on data engineering or data analytics </p></li></ul><ul><li><p>Technical Expertise: Strong proficiency in programming languages such as Python, SQL, Java, or Scala for data processing and analysis </p></li></ul><ul><li><p>Data Engineering Skills: Experience with data modeling, ETL processes, data warehousing, data integration, and data pipeline development </p></li></ul><ul><li><p>Database Knowledge: Proficiency in relational databases (e.g., SQL Server, PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra) </p></li></ul><ul><li><p>Cloud Platform Experience: Working knowledge of cloud platforms such as Snowflake, AWS, Azure, or Google Cloud Platform for data storage and processing </p></li></ul><ul><li><p>Data Visualization: Experience with data visualization tools (e.g., Power BI) to create meaningful insights from data </p></li></ul><ul><li><p>Data Quality Assurance: Understanding of data quality principles, 
data governance, and data validation processes </p></li></ul><ul><li><p>Analytical Skills: Ability to analyze complex data sets, identify trends, patterns, and insights to drive data-driven decision-making </p></li></ul><ul><li><p>Problem-Solving Abilities: Proficiency in troubleshooting data-related issues, identifying root causes, and implementing solutions </p></li></ul><ul><li><p>Project Management Knowledge: Familiarity with project management methodologies to contribute effectively to project planning and execution </p></li></ul><ul><li><p>Communication Skills: Strong verbal and written communication skills to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders </p></li></ul><ul><li><p>Continuous Learning: Willingness to stay updated with the latest data technologies, tools, and industry trends to enhance data engineering skills </p></li></ul><ul><li><p>Experience: Prior experience in data engineering, data analytics, or related roles with a track record of successful data project delivery </p></li></ul><ul><li><p>Technical Leadership: Ability to make informed, strategic decisions that align technology with business objectives, while balancing short-term and long-term trade-offs </p></li></ul><ul><li><p>Customer Focus: Deep understanding of customer needs and how to translate them into effective technical solutions that drive business value </p></li></ul><ul><li><p>Collaboration: Encourages collaboration across teams and stakeholders, breaking down silos and ensuring alignment </p></li></ul><ul><li><p>Problem-Solving: Strong analytical and problem-solving skills, capable of addressing complex technical challenges and delivering innovative solutions </p></li></ul><ul><li><p>Innovation: Ability to lead innovation in technology while maintaining an eye on product-market fit and user experience </p></li></ul><ul><li><p>Agility: Adaptability in a fast-moving environment, with a mindset focused on delivering 
high-impact solutions quickly and iteratively </p></li></ul><p> </p><p> </p><p>If you think you can do this job but don’t meet all the criteria, that’s OK! Please apply. At Woodside, we value people with diverse experiences and backgrounds, as they provide unique perspectives that help us innovate. </p><p> </p><p>Recognition & Reward: </p><p> </p><p>What you can expect from us: </p><ul><li><p>Commitment to your ongoing development, including on-the-job opportunities and formal programs </p></li></ul><ul><li><p>Inclusive parental leave entitlements for both parents </p></li></ul><ul><li><p>Values-led culture </p></li></ul><ul><li><p>Flexible work options </p></li></ul><ul><li><p>Generous annual leave, sick leave, and casual leave </p></li></ul><ul><li><p>Cultural and religious leave with flexible public holiday opportunities </p></li></ul><ul><li><p>A competitive remuneration package featuring performance-based incentives with uncapped Employer Provident Fund </p></li></ul><p> </p><p>Woodside is committed to fostering an inclusive and diverse workforce culture, which is supported by our Values. Inclusion centres on all employees creating a climate of trust and belonging, where people feel comfortable to bring their whole self to work. We also offer supportive pathways for all employees to grow and develop leadership skills. </p><p>If you are ready to take your career to the next level and be part of a global organisation that values innovation and excellence, apply now through our careers portal or connect with our recruitment team.</p><p> </p><h3>experience</h3>15
<p>About Woodside Energy </p><p> </p><p>We are a global energy company, providing reliable and affordable energy to help people lead better lives. Join our team at Woodside Global Solutions in Bengaluru, where talent, digital expertise, and operational excellence converge to solve complex energy challenges, accelerate change, and reimagine business capabilities to support Woodside's global operations and our role in the energy transition. </p><p> </p><p>Founded in 1954, Woodside established the liquefied natural gas (LNG) industry in Australia 40 years ago and supplies customers around the globe. 70 years on, Woodside continues to be driven by a spirit of innovation and determination. </p><p> </p><p>At Woodside, we know great results come from our people feeling valued, getting the support they need to reach their full potential, and working in a psychologically and physically safe work environment. We believe in nurturing talent and providing opportunities for continuous learning and career advancement. </p><p> </p><p>Refer to our corporate website for more information about our different locations and projects: What we do - Woodside Energy</p><p> </p><p>About Woodside Global Solutions </p><p> </p><p>Woodside Global Solutions in Bengaluru is being built as a hub of excellence to drive innovation, digital transformation, and global collaboration. </p><p>Working as one global team, the Woodside Digital team is a trusted partner driving transformation within the organisation. We are bold in our ambitions and resolute in our actions. Through cutting-edge AI, robust cyber security, and advanced data solutions, we drive innovation and influence every part of our business. </p><p>We are looking for talented professionals who are passionate about technology and eager to make a global impact, helping to shape the future of Woodside together. 
</p><p> </p><p>About the role </p><p> </p><p>The Data Scientist at Woodside applies domain expertise, scientific acumen, mathematical and statistical proficiency, and computing and programming skills to unlock insights from complex datasets and deliver high-impact analytical solutions that enhance decision-making, optimise operations, and unlock value across the oil and gas value chain. </p><p>The role applies scientific reasoning, statistical modelling, machine learning, and software engineering skills to turn complex datasets into actionable insights. </p><p>The Data Scientist will work across diverse business problems, ranging from time-series analytics and anomaly detection to optimisation, simulation, computer vision, and applied machine learning, while contributing to the uplift of data science tools, frameworks, and delivery practices. </p><p> </p><p>Duties & Responsibilities: </p><p> </p><ul><li>Collaborate with business users, internal stakeholders, and Digital teams to apply data science techniques that deliver value across the oil and gas value chain, understanding and shaping client requirements and translating them into data science actions </li></ul><ul><li>Contribute to data science projects end-to-end (i.e. from problem identification to delivery) </li></ul><ul><li>Evaluate and apply data science concepts and techniques (e.g. predictive modelling, statistical inference, algorithms) to problems in oil and gas exploration and production domains </li></ul><ul><li>Communicate key assumptions, uncertainties, and findings of statistical or technical analysis through reports and presentations, and translate results back into business language </li></ul><ul><li>Utilize machine learning, artificial intelligence, and data visualization techniques to identify trends, patterns, and anomalies in oil and gas data </li></ul><ul><li>Validate models using statistical and scientific methodologies, ensuring robustness, reproducibility, and interpretability </li></ul><ul><li>Develop predictive models, time-series forecasting methods, anomaly detection pipelines, clustering algorithms, and OCR or neural-network models as required </li></ul><ul><li>Collaborate with SMEs on analysis and modelling processes to solve challenging and high-impact problems in oil and gas exploration and production domains </li></ul><ul><li>Create documentation, and develop and improve data science processes and best practices to ensure the sustainability of work </li></ul><ul><li>Contribute to successful implementation of solutions, ensuring training and knowledge transfer to stakeholders </li></ul><ul><li>Deliver data science solutions from opportunity identification through to handover, including planning, experimentation, implementation, and monitoring </li></ul><p> </p><p>Skills & Experience: </p><p> </p><ul><li>Minimum of 3 years’ proven practical experience in Data Science across the end-to-end model development lifecycle, with machine learning and AI methods </li></ul><ul><li>Strong Python programming skills and experience with ML libraries (pandas, NumPy, SciPy, scikit-learn, PyTorch/TensorFlow) </li></ul><ul><li>Demonstrated expertise in time-series analysis, anomaly detection, clustering, and general applied machine learning </li></ul><ul><li>Experience developing web applications (Dash, Streamlit) and APIs (Flask, FastAPI, or Django) </li></ul><ul><li>Experience with software engineering best practices (e.g. CI/CD principles under an Agile framework, version control, reproducible research) </li></ul><ul><li>Desirable: experience as part of a multi-disciplinary team delivering Digital solutions, or experience with cloud technologies (AWS, Microsoft Azure) </li></ul><ul><li>Tertiary qualification in Computer Science, Statistics, Mathematics, Data Science, Engineering, or a similar quantitative discipline </li></ul><ul><li>Strong competency using AWS cloud services, including S3, Lambda, SageMaker, containerisation, and serverless execution </li></ul><ul><li>Strong documentation, communication, and collaboration capabilities </li></ul><ul><li>Strong problem-solving and critical thinking skills </li></ul><ul><li>Excellent documentation skills to ensure the sustainability of work </li></ul><ul><li>Strong quantitative analytics skills </li></ul><ul><li>Excellent written and verbal communication skills </li></ul><p> </p><h3>experience</h3>15