Total has been working on robotics for some years now, having launched the ARGOS Challenge in 2013 and now getting ready to put the Taurob-designed Stevie robot through its paces in a trial at the Shetland Gas Plant. We spoke with Kris Kydd, Head of Robotics at Total E&P UK.
What have been the main lessons learned to date around enabling these systems to operate in an offshore environment?
Total believes that robots have a huge amount to offer to our industry. We are pioneering their use on oil and gas sites, and the last seven years have seen progress, and lots of lessons learned! Robots offer immediate advantages such as increased safety and efficiency. In the long term, they offer us new ways of working and are limited only by our imagination.
A major lesson learned is that the performance of any autonomous wireless robot is only as good as the quality of its surrounding digital architecture. We have taken this to heart and developed a digital architecture that is device agnostic and designed to be used with any robot system or mobile device. In the early stages, we had all the autonomous processing performed on board the robots. This led to issues with the battery power running down too quickly; it also limited storage space and processing power on the robot, which we have addressed by moving non-critical processing to the cloud. A reliable communications network is also essential. Available bandwidth and latency completely dictate what you can do.
We have also developed robot mission planning capabilities that can be run from the digital twin system. Users indicate the inspection points of interest, and these are then translated into commands for the robot. It is this digital architecture that will allow us to extract maximum value from robots. Total now appreciates that using a robot to support operations does not simply mean keeping the robot on site. What is essential is learning to make the best use of it: recognizing the changes to operations that robotics implies and updating operating philosophies accordingly.
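To illustrate the kind of translation described here, the following is a minimal sketch, not Total's actual system, of how inspection points selected in a digital twin interface might be turned into an ordered list of robot commands. All names (InspectionPoint, RobotCommand, plan_mission) and the coordinate scheme are assumptions for the example.

```python
# Hypothetical sketch: translating digital-twin inspection points into robot commands.
from dataclasses import dataclass
from typing import List

@dataclass
class InspectionPoint:
    tag: str      # equipment tag held in the digital twin, e.g. a pressure gauge ID
    x: float      # site coordinates from the twin's 3D model
    y: float
    sensor: str   # "visual", "thermal", "acoustic", ...

@dataclass
class RobotCommand:
    action: str   # "navigate_to" or "capture"
    payload: dict

def plan_mission(points: List[InspectionPoint]) -> List[RobotCommand]:
    """Turn user-selected inspection points into an ordered command sequence."""
    commands: List[RobotCommand] = []
    for p in points:
        commands.append(RobotCommand("navigate_to", {"x": p.x, "y": p.y}))
        commands.append(RobotCommand("capture", {"tag": p.tag, "sensor": p.sensor}))
    return commands

# Example: two instruments picked in a mission planning app
mission = plan_mission([
    InspectionPoint("PG-1041", 12.3, 4.8, "visual"),
    InspectionPoint("TT-2207", 18.0, 9.1, "thermal"),
])
```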
Has anything changed in that period that would have been of benefit to the teams back in 2013?
Machine learning has taken off in a big way since Total launched the ARGOS Challenge. If machine learning had been as prevalent then as it is now, we would have included it from the beginning. If we'd had a digital architecture to interface with the robot, we could also have tested the end-to-end functionalities.
A digital architecture means the robot gets all the data it requires to complete a mission from a digital twin, which is our single source of truth and contains the latest site data. Then, either during or after its mission, the robot uploads its inspection data to the cloud data store, where it feeds the machine learning module. This data is then processed through machine learning algorithms. It also means making the interface for the human operator as simple as possible, first through a mission planning app and second through a dashboard for presenting the results of the mission.
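One possible shape for the post-mission flow described here (robot to cloud store, machine learning module, then dashboard) is sketched below. The function names, storage layout, and anomaly rule are illustrative assumptions only, not Total's real architecture or API.

```python
# Illustrative sketch of the mission data flow: upload, ML processing, dashboard.
import json
from pathlib import Path

CLOUD_STORE = Path("cloud_store")   # stand-in for an object-store bucket

def upload_inspection(mission_id: str, records: list) -> Path:
    """Robot-side: push raw inspection records to the cloud data store."""
    CLOUD_STORE.mkdir(exist_ok=True)
    path = CLOUD_STORE / f"{mission_id}.json"
    path.write_text(json.dumps(records))
    return path

def run_ml_module(path: Path) -> list:
    """Cloud-side: score each record; a real system would call trained models."""
    records = json.loads(path.read_text())
    return [{**r, "anomaly": r.get("reading", 0) > r.get("threshold", 1)} for r in records]

def publish_to_dashboard(results: list) -> None:
    """Operator-side: present mission results as simply as possible."""
    for r in results:
        print(f"{r['tag']}: {'ANOMALY' if r['anomaly'] else 'ok'}")

results = run_ml_module(upload_inspection("mission-001", [
    {"tag": "PG-1041", "reading": 1.4, "threshold": 1.0},
    {"tag": "TT-2207", "reading": 0.7, "threshold": 1.0},
]))
publish_to_dashboard(results)
```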
4G/LTE is also much more standard these days. It's a more recent mobile broadband standard that offers higher data-transfer rates and better connectivity. During the ARGOS Challenge we experienced black spots in the Wi-Fi network, even on small sites, because of interference caused by metallic equipment. Our latest robot design has a dual router offering both 4G and Wi-Fi capability.
Kris Kydd, Head of Robotics at Total E&P UK. Photo from Total.
You mention machine learning quite a lot. Could you explain what you mean by machine learning?
Machine learning, in this respect, enables automated inspection capabilities. The robot captures images of various types of equipment, which allows us to build a large dataset on which machine learning algorithms can be developed and tested. As the robot captures more images, the dataset grows and can be fed back into the machine learning model, allowing the algorithms to be re-tested and their predictions to become increasingly accurate.
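The retrain-as-the-dataset-grows loop described above can be sketched in a few lines. This is a hedged illustration using scikit-learn with random features standing in for image data; the feature dimension, model choice, and labels are assumptions, not Total's inspection pipeline.

```python
# Sketch: the dataset grows mission by mission, and the model is re-trained and re-tested.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X_all = np.empty((0, 16))   # accumulated image features (placeholder: 16-dimensional)
y_all = np.empty((0,))      # labels, e.g. "gauge in range" vs "out of range"

for mission in range(3):                       # each mission adds new captured images
    X_new = rng.normal(size=(50, 16))
    y_new = (X_new[:, 0] + rng.normal(scale=0.5, size=50) > 0).astype(int)
    X_all = np.vstack([X_all, X_new])          # dataset grows with every mission
    y_all = np.concatenate([y_all, y_new])

    # Re-train and re-test on the enlarged dataset
    X_tr, X_te, y_tr, y_te = train_test_split(X_all, y_all, random_state=0)
    model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print(f"mission {mission + 1}: {len(y_all)} samples, "
          f"accuracy {accuracy_score(y_te, model.predict(X_te)):.2f}")
```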
Are there any remaining technology gaps or areas where you see more advances can be made?
During the early stages, we focused more on the safety aspects of the hardware (e.g. specifying that the design had to be capable of working in a potentially explosive atmosphere, ATEX) rather than the safety aspects of the software. We worked with Saft, Total's battery specialist affiliate, to meet this challenge. However, there is no point in specifying safety aspects for the hardware if you do not have the software equivalent. Assuring the safe behavior of an autonomous robot in a complex environment is of paramount importance for acceptability. The wider workforce needs to know they can trust these robots to make the correct decisions. In order to gain that trust, those autonomous decisions need to be transparent and explainable. We are currently working on this and recognize how important it is before we can deploy at scale.
When do you expect Stevie to head up to Shetland, and what are the trials going to involve?
Robotics for oil and gas is still in its infancy, so it's very exciting for Total to be starting site acceptance testing at the Shetland Gas Plant this September. Our major objective is for the robots to successfully operate autonomously in an ATEX environment. We will test robotic fundamentals such as mobility and navigation over a range of surfaces, including gratings, gravel, and stairs. It's important for us to stress-test the interface between the robot and the digital architecture. We will also introduce two robots performing simultaneous autonomous navigation without colliding with each other.
What’s the initial goal and what’s the longer-term vision?
The path for Total is to investigate how best to further develop and extend the use of robotics. Initially, we need to make the transition from a one-off week-long deployment to a 12-month continuous deployment. We need to assess the reliability and robustness of the robot. Repeatability needs to be monitored closely: can the robot perform the same missions day in, day out? If so, we can build up datasets on how equipment potentially degrades over time and apply machine learning to that. In parallel to reliability and robustness, we want to perform missions that will bring value to the business, for example by targeting inspection tasks that need to be performed frequently.
Longer-term, we wish to remove the robot handler, the person in the immediate vicinity of the robot who is equipped with the remote emergency stop. For that, we need to prove high reliability in obstacle avoidance and collision detection. Adding manipulation functionality is also important to complement the inspection capabilities.
Is that likely to involve different types of robotics for different tasks and scenarios, e.g. greenfield sites and brownfield sites?
In the near term, Total will deploy robotics in human-engineered environments. We will work with operators on installations, building confidence and acceptance. The lessons learned from such deployments will enable "robotization" of future platform design. If we are to eventually achieve a fully unmanned platform, we will need further advances in robotics. That will require different locomotion systems capable of a wider range of tasks, all nevertheless communicating through the standard digital twin architecture.
How does robotics incorporate into Total’s broader mission?
Robotics and autonomous systems will allow Total to reduce its Scope 1 and 2 emissions by increasing efficiency and reducing the amount of personnel transport required. We are also participating in the recently approved OGTC project for accurate remote methane monitoring using beyond visual line of sight (BVLoS) unmanned aerial vehicles.