1 August 2013
An Ocean for Everyone
Last night I was standing on the bow of the R/V Thompson, gazing towards the distant horizon and a stunning sunset. I realized what a privilege it is to be on this research cruise and to take part in the effort to extend the limits of human knowledge. At the same time, I felt frustrated thinking about the fact that most people will never get the chance to be on a boat, on top of an active volcano, in the Pacific Ocean. An even smaller number of people will ever go down and observe the volcano firsthand from a submarine.
But then I realized that I do not necessarily want or need to go down in a manned submarine and perform oceanographic work; I would rather just take a casual look around. I would be perfectly fine remaining on the ship if only I had the ability to see through the vast volume of water all the way down to the seafloor. I started asking myself what this experience would be like.
Would I be able to see the whales slowly moving around?
Would I be able to see gas seeping out of the hydrothermal vents?
Would I even be able to see a major earthquake or a volcanic eruption as it happened?
I pictured a Grand Canyon full of life.
If there were a way for scientists to provide that experience to ordinary people, there would no longer be a need for long speeches with the sole purpose of convincing people that the ocean is interesting, important, and very complex. Instead, they could just go and see for themselves!
Unfortunately, we humans cannot stand on the seafloor or see through dark seawater, mainly because the high pressure would kill us and because water rapidly absorbs most light in the visible spectrum. I would argue, however, that there are certain devices that can withstand high pressure and see through even the murkiest of waters.
SOund NAvigation and Ranging, or sonar, refers to technology developed in the early 1900s in which a device sends a sound pulse towards an object and listens for the echo. The sooner the echo arrives, the shorter the distance to the object: the range is simply the speed of sound in water multiplied by half the round-trip time. In recent years, sonars have evolved into range-imaging devices that provide full 3D scans underwater.
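To make that principle concrete, here is a minimal sketch of the ranging calculation (in Python; the constant and function name are my own illustrative choices, and the real speed of sound varies with temperature, salinity, and depth):

```python
# Minimal sketch of the sonar ranging principle described above.
# The speed of sound in seawater (~1500 m/s) varies with temperature,
# salinity, and depth; the constant below is a typical approximation.

SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, approximate

def range_from_echo(round_trip_time_s: float) -> float:
    """Distance to the target, given the time between ping and echo.

    The pulse travels to the object and back, so the one-way
    distance is half the round-trip travel distance.
    """
    return SPEED_OF_SOUND_SEAWATER * round_trip_time_s / 2.0

# Example: an echo arriving 2 s after the ping puts the target ~1500 m away.
print(range_from_echo(2.0))  # 1500.0
```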
By using sonar technology, in combination with laser scanners and high definition cameras, it would be possible to completely capture a dynamically changing underwater environment. With help from the infrastructure provided by the underwater cabled observatory, this information could be streamed live from the bottom of the sea all the way back to shore within milliseconds. Having all the necessary sensor data readily accessible on shore, a competent team of software engineers and educators could start building guided augmented-reality tours of the sea.
In practice, you would show up to a science museum, put on a pair of augmented-reality goggles and a seemingly normal pair of gloves, and then enter a normal-sized room. Suddenly, the room transforms into a ship; it seems as if you are actually standing on the R/V Thompson out on the Pacific Ocean. In the blink of an eye, you are teleported down to the seafloor. The tour guide starts talking about the harsh conditions in which life can actually exist. At this point, most of the group are trying to cope with the slightly unusual experience of standing on the seafloor for the first time in their lives; meanwhile, an octopus casually swims by. The guide encourages you to touch one of the hydrothermal vents, and to your surprise it feels warm; it turns out they were not just normal gloves after all. At the tour guide's command, you turn into a giant. This gives you a perfect overview of the whole volcano and the fissure between the Pacific tectonic plate and the Juan de Fuca plate. Before returning to the science museum, the tour guide shows a three-dimensional recording of the great eruption of 2016, the first subsea eruption completely captured by humans.
This might sound like an experience far in the future. But in fact, engineers, educators, and scientists already have the fundamental technology to make this dream possible. If the right minds gathered, this amazing ocean experience could turn up at a science museum near you in just a couple of years. For the first time, there would be an ocean for everyone.
31 July 2013
Near-term opportunities in ROVs
For someone like me, a researcher in robotics, this cruise has been very interesting and has offered many opportunities to observe firsthand the extensive use of remotely operated vehicles (ROVs) in oceanography. The team operating ROPOS, the ROV onboard, has been conducting underwater operations since the early 1980s and has a long history of successful underwater installations using cutting-edge marine technology. It has been inspiring to see the complexity of the tasks they are willing to take on and solve.
In the most basic ROV setup, the operator (also called the pilot) uses a joystick to control a vehicle equipped with manipulators, observing the outcome on large TV monitors. In addition, the pilot has a number of monitors displaying navigation data as well as a variety of sensor readings. The pilots also have access to force feedback when using the manipulators, but most of them disable that functionality because they feel "it doesn't help."
With this type of technology, an ROV can be in the water for many hours without having to resurface, sometimes even for days. The pilot, however, typically only works 12-hour shifts. This allows for extensive underwater surveying and long sequences of subsea manipulation such as cable mating, equipment placement, and valve turning. What characterizes scientific underwater manipulation today is the highly customized equipment in combination with complex manipulation tasks. This stands in contrast to the oil and gas industry where operations are mostly standardized.
I believe that underwater operations will be conducted in a very different fashion in the future. Specifically, I think that fewer people will be involved in each operation. Today, one person flies the ROV and two people control one manipulator each. In addition, there is a navigator who plans transit routes and communicates with the captain of the ship. In the near future, assistive control software should make it possible to combine the tasks of flying the ROV and controlling the manipulators, even in operations involving more than two manipulators. This is already a proven concept in commercially available surgical robots, where a single surgeon controls the camera and both manipulators. The ROV navigator's role could most likely also be merged with that of the ship's navigator by implementing a system that automatically communicates the ROV's current and planned path to the bridge.
The ROV carries an ever-increasing number of sensors that provide positioning and visual sensing as well as force readings from the manipulators. Much of that data, however, goes unused. For instance, an ROV could, on the fly, use sonar to compare the bathymetry of the seafloor with previously collected data, both to detect changes and to localize the ROV itself. The same data could be used to virtually revisit a location both during and after the dive. Similar to an aircraft's autopilot, visual sensing could be combined with position sensing and improved control algorithms to automatically bring the ROV to a location specified by the navigator. This could be used to automatically latch onto a payload or to hold the vehicle at a standoff distance from an object of interest.
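As a rough illustration of that autopilot analogy, a go-to-position behavior is, at its core, a feedback loop. The sketch below (in Python; the gains, speed limit, and names are illustrative assumptions, not taken from any real ROV control stack, and a real system would also fuse sonar, visual, and inertial data and compensate for currents) commands a velocity toward a navigator-specified target:

```python
import numpy as np

KP = 0.5          # proportional gain (1/s), illustrative tuning value
MAX_SPEED = 0.3   # m/s, conservative cap for close-range maneuvering

def velocity_command(rov_position: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Velocity command (m/s) pointing from the ROV toward the target."""
    error = target - rov_position
    command = KP * error
    speed = np.linalg.norm(command)
    if speed > MAX_SPEED:
        command *= MAX_SPEED / speed  # clamp to a safe approach speed
    return command

# Example: ROV at the origin, target 2 m ahead and 1 m up.
print(velocity_command(np.zeros(3), np.array([2.0, 0.0, 1.0])))
```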
There are, however, certain tasks where a completely automated system would fail, mainly manipulation tasks where human decision-making is needed. In those cases, it is useful to introduce partial automation such as tremor reduction, dynamic motion scaling, or a concept from robotics called "virtual fixtures". A virtual fixture can be thought of as a virtual ruler that guides the pilot's motions, commonly through force feedback. Such force feedback could, for instance, guide an operator towards a safe and efficient underwater cable mating, or prevent the manipulators from unintentionally colliding with the underwater environment. In the oil and gas industry, a similar system could notify the pilot if an important step of a longer procedure were missed. All of the above could be built today using off-the-shelf technology.
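As a minimal sketch of the idea (in Python; the stiffness value and all names are illustrative assumptions), a line-shaped virtual fixture can be implemented by resisting any motion of the tool perpendicular to a guidance line:

```python
import numpy as np

STIFFNESS = 200.0  # N/m, haptic spring constant (illustrative value)

def fixture_force(tool_pos: np.ndarray,
                  line_point: np.ndarray,
                  line_dir: np.ndarray) -> np.ndarray:
    """Force pulling the tool back onto the guidance line (the 'ruler').

    The tool's offset from the line is decomposed into components along
    and perpendicular to the line; only the perpendicular part is
    resisted, so the pilot moves freely along the intended path.
    """
    d = line_dir / np.linalg.norm(line_dir)
    offset = tool_pos - line_point
    perpendicular = offset - np.dot(offset, d) * d
    return -STIFFNESS * perpendicular  # spring force toward the line

# Example: tool 5 cm off a line running along the x-axis.
print(fixture_force(np.array([0.3, 0.05, 0.0]),
                    np.zeros(3),
                    np.array([1.0, 0.0, 0.0])))  # ~[0, -10, 0] N
```

Rendered through the haptic device, this force feels like a gentle groove that the manipulator tip wants to stay in, which is exactly the guidance needed for tasks such as cable mating.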
As technology progresses, assistive control systems will be able to perform increasingly difficult tasks with less and less input from the human operator, until the human becomes mainly an observer who takes over only in exceptional cases or emergencies. This means that a single person on shore could control a whole group of vehicles executing tasks that today would require many skilled operators as well as expensive research vessels.
To summarize, I think that future human operations in remote and harsh environments will be made possible using already well-established remotely operated vehicle technology. To lower costs, virtual fixture technology will be adopted to increase both the efficiency and safety of remote operations. As virtual fixture technology becomes more sophisticated, the operator will be able to focus less on flying the ROV and more on the actual objective of each operation.
29 July 2013
It's been an exciting first week on board the Thompson, where I've learnt how to tie a proper knot, how advanced underwater robotic operations work, and how to do a CTD cast.
Sitting in the ROV control room is quite a different experience from watching the live stream at home, mainly because you also hear all the discussion and the constant problem solving among the scientists and engineers. It is a complicated operation in which weather, engineering, and science all have to be taken into account.
When not eating the delicious Thompson food or helping out with engineering tasks, I have focused on my own little project of visualizing real-time ROV operations in 3D. The figures above and to the right show the bathymetry of Axial Seamount imported into a 3D virtual environment. I also imported the ROV into this visualization so that users can observe the movements of the actual ROV relative to the seafloor in real time.
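At its core, the update loop behind such a visualization is simple: each navigation fix from the ship becomes a new transform for the ROV model placed over the bathymetry. The sketch below (in Python; the message fields and function name are illustrative assumptions, not the actual data feed) shows how a position and heading could map to such a transform:

```python
import math

def nav_to_transform(easting: float, northing: float, depth: float,
                     heading_deg: float):
    """4x4 homogeneous transform from a navigation fix.

    Heading becomes a rotation about the vertical axis; the position
    becomes the translation, with depth measured downward.
    """
    c = math.cos(math.radians(heading_deg))
    s = math.sin(math.radians(heading_deg))
    return [[c, -s, 0.0, easting],
            [s,  c, 0.0, northing],
            [0.0, 0.0, 1.0, -depth],
            [0.0, 0.0, 0.0, 1.0]]

# Each incoming fix simply replaces the ROV model's transform in the scene.
print(nav_to_transform(100.0, 250.0, 1520.0, 90.0))
```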