Longitude Sound Bytes
Ep 83: Collaborating Technically – with Andy Stanford-Clark
“It’s been the pure passion of fantastic teams, all working together, because of a very common focus.”
Can Erdogan
At the intersection of ideas and action, this is Longitude Sound Bytes, where we bring innovative insights from around the world directly to you.
My name is Can Erdogan, and I’m a Longitude fellow from Rice University. In today’s episode, we will be featuring highlights from a conversation I led with Dr. Andy Stanford-Clark, chief technology officer at IBM UK. He is also the technical leader of IBM’s team working on the Mayflower Autonomous Ship, which we explored for our imagination series to learn about the roles of the individuals and the experiences that brought it to fruition.
The ocean covers more than 70% of the Earth’s surface, yet more than 80% of it remains unexplored. As an engineering student working at the intersection of statistics and data science, I was interested to hear about the technical challenges of creating an intricate marine vessel that could gather and process data for ocean research using only an artificial intelligence based computer brain, with no ship captain. We started our conversation with Dr. Stanford-Clark’s description of the Mayflower Autonomous Ship.
.
Andy Stanford-Clark
The Mayflower Autonomous Ship is a small ship, about 15 meters (50 feet) long and six meters (20 feet) wide. It’s made of aluminum, with a trimaran design that has two wings coming out of the sides of the main hull. And the main thing is that there are no people on board, so it will sail around the oceans captained by an AI captain: software-based IBM products for machine vision, optimization, and rules-based decision processing, which together form the brain of the AI captain.
And the purpose of the ship is to explore the oceans, carrying experiments on board that will improve our understanding of the impact of climate change on the ocean and of the effect of human pollution on the ocean. The plan is to spend a long time at sea, gathering large amounts of data, and to use that data to improve our understanding of this not very well understood, massive resource that covers 70% of the Earth’s surface.
Can
How long have you been a part of it? And what led you to become the technical leader of this project?
Andy
IBM has been involved for about five years. It started almost by accident: one of my colleagues attended a conference where the designer of the ship was speaking about its physical design. He went up to the speaker and asked, how are you going to build the AI captain? And they said, we don’t really know yet. So my colleague said, IBM will help you. That’s how IBM got involved.
I’ve been involved for about the last two and a half years, mainly since IBM officially announced our partnership with ProMare, the company that is sponsoring and funding the ship. We came on board as the official technology partner, and that started a whole whirlwind of promotional activity, media interviews, podcasts, conference presentations, and so on in the run-up to the launch of the ship. There was also a lot more focus on the day-to-day requirements of how the different teams in IBM interacted with MSubs, the company that is building the Mayflower Autonomous Ship. So I took on the role of technical lead from IBM, coordinating all of that activity, but also the media front, the public face of IBM’s involvement with the Mayflower project, which is really, really exciting.
Can
Do you think IBM is going to continue this project to an extent that goes beyond what we have today? And where do you see IBM’s involvement in the MAS 400 project in the next 5 to 10 years?
Andy
We’re involved in MAS 400 for a number of reasons: not only sustainability and marine research and helping further the science of climate change, but also because we see this as the beginning of a new era for the marine industry.
400 years ago, the original Mayflower set out from Plymouth, UK to sail across the Atlantic to the US. So we really see this as the beginning of the next 400 years of the marine industry, and of challenge and innovation and exploration at sea.
And so a big part of this is that the AI Captain technology is highly transferable to other ships, from research ships like the MAS 400, through navy ships (there is a lot of interest from the military in autonomous ships and autonomous submarines), to cargo ships. We’re already talking to quite a lot of large container companies about what the future of shipping will look like. You can imagine that ships will have far fewer people on board, and the actual navigation will be done by computer systems. You might even imagine a future where there’s a human captain on board and an AI Captain acting as a second pair of eyes, looking over their shoulder and saying, look out for that iceberg over there, to avoid any nasty accidents. I have talked to some of the marine insurers, like Lloyd’s of London, and they say that in the future they might require a ship to have that kind of technology on board, just as our cars have to have ABS brakes and seat belts and airbags before we can get insurance. Ships might have to have similar equipment on board, the AI Captain as a sort of guardian angel looking over the crew’s shoulder, before they can put to sea. That will make the shipping industry a lot safer and a lot less expensive, particularly for research. It costs a huge amount to have a crew of 20 people and researchers out at sea for months on end, and you get only a tiny amount of time working on the experiments because it takes a long time to get to and from the research site. The crew have to eat food, drink water, get paid, go on vacation, and they get sick, all of which costs a huge amount and reduces the efficiency of the trip, whereas an autonomous ship can just go out into the ocean and stay out there for months on end. That’s where we see the value of this.
Can
What kind of outputs have you gathered so far with regard to advancements in marine science and research?
Andy
The original plan was that she would set sail on the 16th of September 2020, the exact 400th anniversary of the original Mayflower setting sail, to cross from Plymouth, UK to Plymouth, Massachusetts, following the same route. But because of COVID, the development and building of the ship got delayed. So we held the naming ceremony on that date, but she was still not fully ready; the AI Captain wasn’t fully functional at that stage. So we did some trials around the UK, and she finally set off to the US on the 15th of June, this summer just gone. She got about three days into the mission, just past the western tip of Ireland, so she was really facing the open ocean, when a component broke in the generator exhaust system. A really trivial mechanical part, a piece of pipe, fractured, so the exhaust gases were going to the wrong place, polluting the generator, and we wouldn’t have had enough power. Although MAS 400 is solar powered, she has a biodiesel generator as a backup, and we couldn’t rely on getting enough energy from the sun to keep her going all the time. It would have taken a very long time, months and months, to get to America, so we decided to turn back. The AI Captain technology and all the IT systems on board were working absolutely flawlessly, which really gave us great confidence in the technology. We spent the rest of the summer redesigning and replacing the generator system. She’ll be going out on local missions in the UK for the next few months, so we can make sure everything is working properly, in particular the science experiments.
Because one thing we learned, to the second part of your question, was that the experiments work great on the workbench in the laboratory, but they don’t work so well when you put them in a salty environment and shake them around relentlessly inside a ship. So we have a number of adjustments to make to some of the experiments. We gathered some good data from the video cameras on board, and we also gathered some good data from the hydrophone, which is the underwater microphone listening out for whale and dolphin songs. We found a really nice video clip of dolphins swimming alongside Mayflower on the second day, and we’ve identified the hydrophone recording from that same time, so we’re currently analyzing it to see if we can actually catch the clicks from the dolphins. We’ve got some other data as well from the hazard detection experiment, which looks at the camera images and uses a very clever system on a chip, basically a neural network on a chip, to look at those visual images and search for recognized objects such as boats and people. So we are very positive. The current plan is to defer the next US attempt until next April, because we would run into the winter storms around November, and we don’t want to head off into a big storm because there are some pretty big waves out there. So, under the current plan, we’ll be heading back to the US next spring.
Can
I’m very impressed by the subtle technological complexity behind the project, which is not easy to see when you look at a regular ship. I have several technical questions for you.
Andy
My favorites!
Can
How many people do you have on the technical team, and what kind of programming languages do you use? How do you incorporate mathematics into the programming of the different neural networks? And what were some of the technological challenges that you had to tackle, and how did you tackle them? So I think this is three questions packaged as one.
Andy
Okay, great question. So we have more than 100 people inside IBM working on different aspects of the Mayflower project.
Can
That’s impressive.
Andy
Yeah, so very few of those are full-time; most of them are doing it because we love the project, so we’re just doing evenings and weekends, grabbing a bit of time here and there to squeeze it in alongside our day jobs. But it’s probably 20 people, I think, working on it full time, officially as their day job. Most of those are working on the machine vision part of it, and there’s a lot of technology going into that, which uses deep learning. We’ve trained it on labeled, captioned images; we’ve got more than 2 million captioned images of things you might see at sea. So we’re training the model in the IBM Cloud, and then we’re downloading the trained model onto the ship itself. It’s an edge computing solution.
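To make the cloud-to-edge workflow he describes a little more concrete, here is a minimal sketch of what the on-board inference side of such a setup might look like. The model file name, input shape, and helper function are hypothetical illustrations, not the actual Mayflower code.

```python
# Minimal sketch: run a cloud-trained vision model at the edge on board the ship.
# The model file and preprocessing assumptions are hypothetical.
import numpy as np
import onnxruntime as ort  # lightweight runtime suited to on-board (edge) inference

# The model is trained in the cloud, exported, and copied onto the ship's computer.
session = ort.InferenceSession("ship_detector.onnx")   # hypothetical exported model
input_name = session.get_inputs()[0].name

def detect(frame: np.ndarray):
    """Run the detector on one preprocessed camera frame (CHW, float32)."""
    batch = np.expand_dims(frame.astype(np.float32), axis=0)  # add batch dimension
    return session.run(None, {input_name: batch})             # e.g. boxes, classes, scores

# Usage (with a real camera frame resized to the model's expected input size):
#   detections = detect(frame)
```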
The other part of it is the rules-based decision system, which actually encodes the collision regulations, the “colregs” as we call them, the rules of the road that you have to obey when you’re at sea. That has been written not as a neural network but as a rules-based system, and the nice thing about that is that we get full explainability. We can tell from the logs exactly what set of preconditions the AI captain was experiencing from its sensors: the cameras, the radar, the AIS, and the weather systems, all the different sensors.
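As a rough illustration of why a rules-based approach gives that kind of explainability, here is a minimal sketch of an explicit, loggable rule check. The rules, thresholds, and field names are invented for illustration and are not the actual colregs engine.

```python
# Minimal sketch: a rules-based decision that records the preconditions behind it.
# (Hypothetical rules and thresholds, for illustration only.)
from dataclasses import dataclass

@dataclass
class Situation:
    other_vessel_bearing_deg: float   # relative bearing of the other vessel
    other_vessel_range_nm: float      # distance in nautical miles
    we_are_give_way: bool             # fused from cameras, radar, and AIS

def decide(sit: Situation) -> tuple[str, str]:
    """Return (action, reason); the reason string is what makes the log explainable."""
    if sit.we_are_give_way and sit.other_vessel_range_nm < 2.0:
        return ("alter_course_starboard",
                f"give-way vessel within 2 nm at bearing {sit.other_vessel_bearing_deg:.0f} deg")
    if not sit.we_are_give_way and sit.other_vessel_range_nm < 0.5:
        return ("slow_down", "stand-on vessel, but range below 0.5 nm safety threshold")
    return ("hold_course", "no rule triggered")

action, reason = decide(Situation(45.0, 1.2, True))
print(action, "-", reason)   # every decision is logged together with its preconditions
```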
In terms of programming languages, it is quite a mix. We’ve got a lot of Python and Java, a lot of C, some Ruby, and some Go. I think that’s pretty much everything on board, because different people have developed different systems.
What else did you ask? Oh, some of the problems? Well, one of the biggest problems, interestingly, is how you disambiguate images from the cameras. The camera is moving left and right and up and down, and the ship you’re looking at is also moving left and right and up and down. When you’ve got two things that are both moving left and right and up and down, it’s really easy to convince the computer, or convince yourself, that the entire picture is full of ships. Convincing yourself that there is just one ship there requires some really careful image segmentation, first of all, but also data fusion from things like radar and AIS: it looks like there are 20 ships in front of us, but the radar is saying there is one and the AIS is clearly saying there is one, so let’s go with one, shall we, and base our decisions around that.
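A toy version of the fusion step he describes, reconciling an over-eager camera count with radar and AIS, might look like the following. It is a simple agreement heuristic for illustration, not IBM’s actual algorithm.

```python
# Minimal sketch: reconcile object counts from camera, radar, and AIS (illustrative only).
from collections import Counter

def fused_ship_count(camera_count: int, radar_count: int, ais_count: int) -> int:
    """Prefer agreement between independent sensors over a single noisy one."""
    votes = Counter([camera_count, radar_count, ais_count])
    count, support = votes.most_common(1)[0]
    if support >= 2:          # at least two sensors agree, so trust them
        return count
    # No agreement: camera motion artifacts tend to over-count, so be conservative.
    return min(camera_count, radar_count, ais_count)

# Camera "sees" 20 ships, radar and AIS each report one: the fused estimate is 1.
print(fused_ship_count(camera_count=20, radar_count=1, ais_count=1))
```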
And the other problem is the multi-ship problem. The collision regulations are really easy and very well defined when there’s just one other ship: it’s you and somebody else, who goes in front of whom, who gives way to whom, who slows down, who speeds up, how much space you have to give, that sort of thing. If you’ve got multiple ships in the picture, it becomes like a game of chess. You say, well, if I move out of the way of this one, I’m moving into the path of that one, so I can’t do that. Okay, so maybe I slow down a bit. Oh no, because that means I cut off this guy behind me. Hmm, maybe I just stop. But then you become a floating hazard. All these things become really complicated as soon as there are multiple players involved, and that’s really where the optimizer component of the AI captain comes in. It basically takes all possible future outcomes and says, okay, given the constraints I have, which is the best outcome? And the best outcome works rather like a Maslow hierarchy of needs. The number one rule is: don’t hit anything. All the other things, like how you get to America, how you run the experiments, how you preserve power, how you make best use of the solar panels, and so on, are secondary to not hitting anything. After that, those other things kick in as well, which is basically completing the mission.
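The hierarchy he describes is essentially a lexicographic choice: first rule out any manoeuvre that risks a collision, then pick the best of what remains by mission criteria. A toy sketch, with invented candidate manoeuvres and scoring weights:

```python
# Minimal sketch: choose a manoeuvre with safety as an absolute first constraint.
# (Candidate options, risk numbers, and weights are invented for illustration.)
candidates = [
    {"name": "hold_course",    "collision_risk": 0.90, "progress": 1.0, "power_cost": 0.1},
    {"name": "turn_starboard", "collision_risk": 0.02, "progress": 0.7, "power_cost": 0.3},
    {"name": "slow_down",      "collision_risk": 0.05, "progress": 0.3, "power_cost": 0.2},
    {"name": "stop",           "collision_risk": 0.20, "progress": 0.0, "power_cost": 0.0},  # floating hazard
]

SAFE_THRESHOLD = 0.10  # rule one: don't hit anything

def choose_manoeuvre(options):
    safe = [o for o in options if o["collision_risk"] < SAFE_THRESHOLD]
    if not safe:
        # Nothing is safe enough: fall back to the least risky option available.
        return min(options, key=lambda o: o["collision_risk"])
    # Among the safe options, trade off mission progress against power use.
    return max(safe, key=lambda o: o["progress"] - 0.5 * o["power_cost"])

print(choose_manoeuvre(candidates)["name"])   # -> "turn_starboard"
```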
Can
Wow, all of the technical complexity behind it sounds extremely challenging, but also extremely interesting. What kind of sensors do you use to gather data for so many different experiments?
Andy
Yeah, so there are two classes of sensors. One is for the situational awareness of the AI Captain itself: the cameras, the radar, the AIS, and a little local weather station. Then there is weather data coming in from The Weather Company, the weather forecasting company that IBM owns. We process the 15-minute weather forecasts in the cloud, so we can basically tell Mayflower what weather it needs to know about to factor into its assessment of where it’s going to go next. So that’s the navigational side. Then, for the experiments, each experiment has its own set of sensors. There is the hydrophone. There’s a sonar device, which points at the waves to assess their angle. There’s a chemical sensor, which measures pH, salinity, conductivity, temperature, and things like that. Then we’ve got a really cool experiment called Hypertaste, which is an electronic tongue that came from IBM Research in Zurich. It uses different ion sensors, trained with a neural network, rather like our own tongue: there’s one area for sweet, one for sour, one for bitter. We don’t have an orange juice sensor in our mouth; we just recognize the combination of activations of those different sensory areas on our tongue as orange juice. In Hypertaste, the rate of saturation of the different ion sensors, compared against the trained neural network model, tells us that the sample contains one of the chemicals we are looking for. So we take in a small sample of water and run it through the Hypertaste system, past the different ion sensors. We wait a few seconds for them to saturate, compare the response against the machine learning model, and say, right, we’ve sensed this particular chemical we’re looking for. Then we flush it through with clean water and get ready for the next sample.
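The pattern-matching step he describes, comparing the combined ion-sensor response against a trained model, could be sketched roughly as below. A nearest-centroid lookup stands in for the real neural network, and the sensor values and chemical names are entirely made up.

```python
# Minimal sketch: classify a combined ion-sensor response against reference "fingerprints".
# (Nearest-centroid stand-in for the trained neural network; all values are made up.)
import numpy as np

# Reference saturation patterns learned during training, one vector per target chemical.
fingerprints = {
    "nitrate":   np.array([0.9, 0.2, 0.1, 0.4]),
    "phosphate": np.array([0.3, 0.8, 0.2, 0.1]),
    "clean":     np.array([0.1, 0.1, 0.1, 0.1]),
}

def identify(sample_response: np.ndarray) -> str:
    """Return the chemical whose fingerprint is closest to the measured response."""
    return min(fingerprints,
               key=lambda name: np.linalg.norm(fingerprints[name] - sample_response))

# A water sample whose four ion sensors saturate close to the "nitrate" pattern.
print(identify(np.array([0.85, 0.25, 0.15, 0.35])))   # -> "nitrate"
```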
With the same sample, we also take a photo with a very special microscope, a kind of 3D microscope from the University of Plymouth that captures something a bit like a hologram image of the water. That is looking for microplastics. We’ve all seen photos of the big rafts of plastic bottles floating in the ocean; microplastics are much scarier. This is plastic from landfill that has washed down the rivers into the water, in pieces too small to see with the eye but visible with a microscope. Plankton eat those, and then they get into our food chain, so when you eat a fish, you’re eating plastic from the microplastics in the water. What we don’t know is how bad that is out in the deep ocean, because no one has ever measured it far out, in the middle of the ocean; they’ve only measured it near the edges. So one of the experiments is to assess this microplastic density.
Can
I’m really surprised that IBM has developed so many different sensors: the tasting sensor, the weather forecasting. It looks like IBM is producing technical products in a very diverse range of fields.
Andy
That is one of the many amazing things about IBM. We operate in so many different industries, so many different countries.
Can
What do you think propels ideation and implementation? What drives the team behind this project, who sometimes work voluntarily, giving up their weekends or the evenings they could spend with their families and friends?
Andy
So I guess there are two parts to it. One is that there is a long list of technical objectives that need to be met. We’ve had technical teams working for a number of years to develop the core technology that goes on board the ship, everything from machine vision, to edge computing, to the Internet of Things, to the decision system that drives it. All of that has to work in a disconnected world: you can’t assume you’ve got a high-bandwidth data connection to the cloud, because we simply haven’t, so everything has to work with partial connectivity. The real motivation for those teams has been that this is a really unusual project for a really unusual customer. It makes a great reference, a great showcase for our technology. And we think, well, if we can make it work on board a ship that is barely connected to the internet, then it will work in all sorts of other, better connected and more reliable environments. So that was part of it.
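Working with partial connectivity typically means buffering readings locally and forwarding them whenever a link happens to be available. A minimal store-and-forward sketch follows; the connectivity check and uplink function are hypothetical placeholders, not the ship’s actual IoT stack.

```python
# Minimal sketch: store-and-forward telemetry for an intermittently connected ship.
# (link_is_up() and send_to_cloud() are hypothetical placeholders.)
from collections import deque
import json, random, time

buffer = deque(maxlen=100_000)   # bounded local queue; oldest readings dropped if full

def link_is_up() -> bool:
    return random.random() < 0.2          # stand-in for a real connectivity check

def send_to_cloud(message: str) -> None:
    print("uploaded:", message)           # stand-in for a real satellite/IoT uplink

def record(reading: dict) -> None:
    """Always queue locally; never assume the uplink is available."""
    buffer.append(json.dumps(reading))

def flush() -> None:
    """Drain the queue whenever a connection happens to be available."""
    while buffer and link_is_up():
        send_to_cloud(buffer.popleft())

for t in range(5):
    record({"t": t, "salinity": 35.1})
    flush()
    time.sleep(0.1)
```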
But for everyone else working on it, it’s really been the pure passion of a fantastic team. We’ve been really lucky to work with amazing people at ProMare and MSubs and MarineAI, as well as the IBM team, all working together because of a very common focus. The idea was to get the Mayflower out to sea, heading off for America in June, and everything was focused on that date: all the things that had to come together, the whole list of things that had to be ticked off before we could go, getting all the experiments online, working and calibrated, while rapidly running out of time, with people lining up to receive the data when it came in. This is a very unusual situation. We don’t often do these kinds of moonshot projects, and when we do, it really galvanizes people.
.
Can
Today we had a deep dive into the challenges of creating the Mayflower Autonomous Ship and the technical collaboration behind it. Hearing the story of a team who worked so selflessly to protect the oceans by implementing machine vision and edge computing in a marine research vessel was an eye-opening example of using technology to help our planet. But we don’t have to be programmers or statisticians to protect our hydrosphere: we can start somewhere with simple steps. Ocean explorer and filmmaker Jacques Cousteau said that people protect what they love. So we can start by going out and exploring our oceans and learning how the hydrosphere keeps the world, and us, together.
We hope you enjoyed today’s segment. Please feel free to share your thoughts over social media and visit Longitude.site for the episode transcript. Join us next time for more unique insights on Longitude Sound Bytes.