Woodside Energy adds actual robots into code pipeline stages

Before pushing updates to Karratha-based robots.

Woodside Energy has revealed for the first time the mechanics of introducing code changes to robots at its remote sites, with code tested in a lab and carpark before being pushed into production.

Robotics engineer Robert Reid told the AWS Summit that the company is using a continuous integration and continuous deployment (CI/CD) pipeline to test and deploy code changes that start in its Perth-based lab environment and finish onsite at its Pluto liquefied natural gas (LNG) facility in Karratha in Western Australia.

The company has been trialling robots since 2018, including a four-wheeler with remotely-controlled arms and sensors that patrols the Pluto plant.

“If we’re doing code development, then we’ll be doing that on a development robot out here in the lab,” he said on Wednesday.

“Once we’re happy with some of the code changes we’re making, we’ll be pushing them up to GitHub, where the CI/CD processes will kick off and build those changes into fresh Debian packages.

“We have a staging robot also out in the carpark, so once those packages have been built up into a new Docker image, we’ll pull it down onto the staging robot, and we’ll spend multiple days testing.

“Once we’re happy the robot is performing as expected, then we’ll actually push that image to the production robot, which is sitting up in Karratha right now.”
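
Woodside has not published the pipeline code itself, but the promote-through-stages flow Reid describes can be sketched in a few lines with the Docker SDK for Python. The registry address, tags and stage names below are illustrative stand-ins, not the company's actual setup.

```python
import docker  # Docker SDK for Python

REGISTRY = "example-registry.local/robotics"  # hypothetical registry
GIT_SHA = "abc1234"                           # stand-in for the commit being built


def build_and_push():
    """CI step: build a fresh image from the newly packaged code and push it."""
    client = docker.from_env()
    image, _logs = client.images.build(path=".", tag=f"{REGISTRY}/robot:{GIT_SHA}")
    client.images.push(f"{REGISTRY}/robot", tag=GIT_SHA)
    return image


def promote(stage_tag):
    """Re-tag a tested image for the next stage, e.g. 'staging' then 'production'."""
    client = docker.from_env()
    image = client.images.pull(f"{REGISTRY}/robot", tag=GIT_SHA)
    image.tag(f"{REGISTRY}/robot", tag=stage_tag)
    client.images.push(f"{REGISTRY}/robot", tag=stage_tag)


if __name__ == "__main__":
    build_and_push()        # triggered by the push to GitHub
    promote("staging")      # the carpark robot pulls :staging for multi-day testing
    promote("production")   # once signed off, the Karratha robot pulls :production
```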

Reid said that the widely used open source robotics software framework, the Robot Operating System (ROS), is the “glue that really brings the various parts of the robot together”.

“We have a range of sensors and their device drivers, and ROS allows us to take the data from each of those sensors, bring them together with a range of algorithms such as localisation, obstacle detection, navigation, and also allows us to encode the images as video, for example, so that we can push that data up to the cloud,” he said.

“We bring all of those various components together through a CI/CD pipeline that is running in AWS services.”
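
Woodside has not shared its node code, but the pattern Reid describes, with sensor device drivers publishing data that downstream algorithms consume, maps onto a very small ROS node. The sketch below assumes ROS 1 (rospy); the topic name and the logging step are illustrative only.

```python
#!/usr/bin/env python
# Minimal rospy sketch: subscribe to a camera driver's image topic and hand
# each frame on for further processing (localisation, encoding, cloud upload).
import rospy
from sensor_msgs.msg import Image


def on_frame(msg):
    # A real pipeline would feed frames to obstacle detection or a video
    # encoder here; this sketch just reports what arrived.
    rospy.loginfo("frame %dx%d, encoding=%s", msg.width, msg.height, msg.encoding)


def main():
    rospy.init_node("camera_relay")
    rospy.Subscriber("/camera/image_raw", Image, on_frame, queue_size=1)
    rospy.spin()  # ROS topics are the "glue" joining driver and algorithm nodes


if __name__ == "__main__":
    main()
```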

When deployed in the Pluto-Karratha Gas Plant (KGP), the production robot follows a predetermined path to capture data, including images, before re-docking at a ‘bot box’.
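
Woodside has not said which navigation stack drives the patrol, but in ROS terms a predetermined path is commonly a list of waypoint goals sent to a planner such as move_base. The waypoints, map frame and docking stop below are invented for illustration.

```python
#!/usr/bin/env python
# Sketch of a fixed patrol route as successive move_base goals (ROS 1).
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

# Illustrative route; the final waypoint stands in for the 'bot box' dock.
WAYPOINTS = [(10.0, 4.0), (22.5, 4.0), (22.5, -8.0), (0.0, 0.0)]


def make_goal(x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0
    return goal


def patrol():
    rospy.init_node("patrol_route")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    for x, y in WAYPOINTS:
        client.send_goal(make_goal(x, y))
        client.wait_for_result()  # capture images and other data at each stop


if __name__ == "__main__":
    patrol()
```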

The robots use 3D point clouds of the facilities, which Woodside has built in its test lab, for localisation and planning.

“For the thousands of hours that we do out in the field, you can put hundreds of thousands of hours in simulation,” head of robotics Mark Micire said.

“Our plant doesn’t change a lot, so unlike… other robots that are in very dynamic environments, we can cheat a little bit and go through and generate a point cloud ahead of time that’s good to centimetre and millimetre accuracy.”
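
The company has not detailed its localisation method, but matching a live scan against a pre-built plant point cloud is commonly done with ICP registration. The sketch below uses the open-source Open3D library; the file names and the 5cm tolerance are purely illustrative.

```python
# Sketch: align the robot's current lidar scan to the prior plant map with ICP.
import numpy as np
import open3d as o3d

prior_map = o3d.io.read_point_cloud("plant_prior_map.pcd")     # built ahead of time
live_scan = o3d.io.read_point_cloud("current_lidar_scan.pcd")  # what the robot sees now

result = o3d.pipelines.registration.registration_icp(
    live_scan, prior_map,
    max_correspondence_distance=0.05,  # ~5cm, in the spirit of the accuracy Micire cites
    init=np.eye(4),                    # in practice, start from the last known pose
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print("robot pose in map frame:\n", result.transformation)
print("fit quality:", result.fitness)
```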

The lab environment also allows Woodside to use regression tests and simulations as part of the CI/CD process to iron out issues that would otherwise only become clear in the field.

“Frankly, for a lot of this equipment, it’s lab equipment that we’re adapting to the ‘real world’, so we’re figuring out how it breaks,” Micire said.

“We’re actually searching for those data points that you’re only going to find after the thousandth hour of testing, and it’s those data points that we want to find out now.

“We want to really find them in a testing environment. That way when we’re working in a real operational environment, we’ve already scrubbed out all of those problems.”
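
Woodside has not published its test suite, but a regression gate of the kind Micire describes could be as simple as a pytest check that a simulated run stays on the known-good route. The simulator helper below is a hypothetical stand-in, hard-coded with a small offset so the example runs on its own.

```python
# Sketch of a CI regression test: fail the build if a simulated patrol drifts
# too far from the ground-truth route. run_simulated_route() is hypothetical.
import math

GROUND_TRUTH = [(0.0, 0.0), (10.0, 4.0), (22.5, 4.0)]  # known-good waypoints
MAX_DRIFT_M = 0.10                                     # illustrative tolerance


def run_simulated_route(waypoints):
    """Stand-in for a real simulator run; returns the poses actually reached."""
    return [(x + 0.02, y - 0.01) for x, y in waypoints]


def test_route_stays_on_path():
    reached = run_simulated_route(GROUND_TRUTH)
    for (gx, gy), (rx, ry) in zip(GROUND_TRUTH, reached):
        drift = math.hypot(gx - rx, gy - ry)
        assert drift < MAX_DRIFT_M, f"drifted {drift:.2f} m from waypoint ({gx}, {gy})"
```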

While robots are already being used in the field, Woodside plans to eventually get to a point where they don’t require a "safety operator" - aka a person - to follow them around.

“At the moment, when [the robots are] driving around, we have a safety operator just checking in to make sure that they’re doing everything correctly,” Reid said.

“Our goal long-term is to take that safety operator out of the field and get ourselves to a place where we can actually do remote operations.”

Micire added that further afield, the robots could also be used to “manipulate things in the environment”.

“We do see a future in which a robot is able to go and affect things in the world, as opposed to just moving through the world and monitoring it.”
