Planetary Radio: Space Exploration, Astronomy and Science - How Perseverance drives itself around Mars
Episode Date: August 11, 2021

NASA's Perseverance is driving farther and faster than any previous Mars rover, thanks to its advanced AutoNav system. Vandi Verma, the mission's chief engineer for robotics at NASA's Jet Propulsion Laboratory, takes us inside the speedy, six-wheeled robot for a look at its marvelous mechanics and software. Vandi also describes the complex process of sample collection. There's a high-flying surprise for Bruce Betts in the space trivia contest. Discover more at https://www.planetary.org/planetary-radio/vandi-verma-perseverance-autonav-sample-collection

See omnystudio.com/listener for privacy information.
Transcript
The Smartest Robot on Mars, this week on Planetary Radio.
Welcome, I'm Matt Kaplan of the Planetary Society,
with more of the human adventure across our solar system and beyond.
Vandy Verma started loving robots as a kid and never stopped.
Now she leads robotic operations for the Mars 2020 Perseverance rover.
She'll tell us how the six-wheeled Explorer is driving across the red planet with the help of
what's called AutoNav. We'll also take a deep dive into the intricate process of sample collection
by the rover. Space shuttle orbiters are big. It's hard to believe that Bruce Betts and I overlooked
them in the What's Up Space Trivia contest, but we did. Find out what I'm talking about just before
Bruce offers a new contest in this week's episode. I'll get to headlines from our weekly newsletter,
The Downlink, in a moment. First, though, I have an invitation for you. You can join me, Bill Nye,
Bruce Betts, and others for the online premiere of a wonderful new documentary about the ongoing
mission of LightSail 2. Our watch party for Sailing the Light is set for 10 a.m. Pacific,
1700 UTC, on Saturday, August 28th. It will be free, of course, at planetary.org slash live.
Leading the headlines this week is a still-unfolding mystery on Mars.
As we publish this episode, scientists and engineers,
no doubt including this week's guest, Vandy Verma,
are working out why Perseverance came up empty-handed,
or more precisely, empty sampling-tubed,
after its first attempt to collect material from a Martian rock.
The team is confident they'll figure this out.
Wait till you hear how it all works from Vandy. Simply amazing.
Lucy is getting ready for her long journey to Jupiter.
The NASA spacecraft reached the Kennedy Space Center last week.
The launch window for this Trojan asteroid explorer opens on October 16th.
The U.S. Government Accountability Office has denied the protests by Blue Origin and Dynetics
of the decision that awarded a human lunar lander contract only to SpaceX.
Meanwhile, SpaceX for the first time stacked its Starship on top of a Super Heavy booster, creating a vehicle that was taller than the old Saturn V. This was just a mating test, and the Starship has now been removed. SpaceX is still waiting for approval of a first flight by the mammoth rocket. You'll find more, and this week much, much more about Jupiter, at
planetary.org slash downlink. My free monthly newsletter is at planetary.org slash radio news.
Vandy Verma has been driving rovers on Mars for 13 years. As you'll hear, tooling across the red
planet is not the only thing she loves about her job at NASA's Jet
Propulsion Lab. Actually, it's jobs, plural. She's not only the mission's chief engineer for robotic
operations, Vandy also serves as assistant section manager for mobility and robotic systems across
all of JPL. She was featured in a recent article about how Perseverance is using its advanced
self-driving capability to get around nearly 10 times as fast as its older sister, Curiosity.
It didn't take long during our conversation a couple of weeks ago to realize that there has
never been a machine as complex or as capable on another world. Dr. Vandi Verma, welcome to Planetary Radio,
your first appearance on the show. I also want to congratulate you on beginning
Perseverance's trek across Jezero Crater. We've all been waiting for this.
Thank you for having me. And yes, we are very excited about finally being at Jezero Crater and exploring that location.
I am only sorry that we're not talking to each other across a table or a desk at JPL,
because frankly, I really wanted to try those 3D glasses that you and the other rover drivers get to use.
Have you been doing a lot of your work from home, like so many of us?
Like a lot of people, we've had to make adjustments because of the pandemic.
Close to landing, our entire operations facility was redone, but it meant that a lot fewer people could be on lab.
So a large part of our team is remote, and we do a lot of our strategic work remotely.
But to actually drive the rovers and operate the robotic arm and sampling system, we are on
lab for that.
So I go into JPL regularly to drive the rovers.
And I wanted to add that you'll have to come back when we can have guests, and we'll be
sure to get you that experience with the 3D goggles.
Thank you.
You will hear from me.
I cannot wait because I have read how you sometimes,
when you're wearing the goggles, you just like to stare at Mars.
Putting those goggles on, and the cameras we have on this rover are so spectacular, it's a
really immersive experience. You feel like you're right there, because you can take such a panorama
and you can pan in different directions. It's sort of like you're turning your
head, and you can control where you want to look.
It really makes the terrain come alive. Jezero is an incredible location and it's so rich
in the variation and the features and the geology that it really is neat to see in 3D.
We have talked to other rover drivers over the years,
I think at least going back to Spirit and Opportunity.
But I think it's worth repeating, if not for this audience,
well, there must be one or two people out there who are still thinking
that maybe you rover drivers have a joystick or a steering wheel.
Could you give us the little speech that
I'm sure you've given a hundred times that explains how it doesn't exactly work that way?
Yeah. So one of the reasons we can't drive the rovers in real time is just the distance
between Earth and Mars. The distance is such that, at its shortest, the one-way light time for a signal to get from Earth to Mars is about
four minutes. But because Earth and Mars are both orbiting the sun and the distance
varies, it could take up to 24 minutes just for us to send a signal one way. That really isn't
sufficient for us to be able to hit the brakes and stop the rover. It also would mean that we're
sending a command and then waiting that long to send the next, so we'd make really slow progress. So what we do
is we plan the entire sol's plan. And a sol is a Martian day. And we have to think through
all of the possibilities that the rover could encounter as it's doing this drive.
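The one-way delays Vandi quotes follow directly from the Earth-Mars distance. A quick back-of-the-envelope sketch (the distances below are rough published values, not figures from the episode):

```python
# One-way light time between Earth and Mars.
# Distances are rough illustrative values, not mission data.
C_KM_PER_S = 299_792.458  # speed of light in km/s

def one_way_light_time_minutes(distance_km: float) -> float:
    """Time for a radio signal to cross the given distance, in minutes."""
    return distance_km / C_KM_PER_S / 60.0

near = one_way_light_time_minutes(54.6e6)   # ~closest approach
far = one_way_light_time_minutes(401.0e6)   # ~maximum separation
print(f"{near:.1f} to {far:.1f} minutes one way")  # roughly 3 to 22 minutes
```

The round trip doubles those numbers, which is why a "hit the brakes" command from Earth can never arrive in time.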
It might slip. The wheels
might actually slip in the sand such that it's making less forward progress and it's digging
deeper. So the odometry may show us that we've covered 10 meters, but we may only have gone nine.
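The slip she describes is just the gap between commanded and realized motion. A toy version of the bookkeeping (on the real rover, the actual distance comes from visual odometry, i.e. tracking terrain features across stereo images):

```python
def slip_fraction(wheel_odometry_m: float, visual_odometry_m: float) -> float:
    """Fraction of commanded motion lost to slip.

    wheel_odometry_m: distance the wheel rotations claim was covered.
    visual_odometry_m: distance actually covered, estimated by tracking
    terrain features across successive stereo images.
    """
    return (wheel_odometry_m - visual_odometry_m) / wheel_odometry_m

# Wheels report 10 meters, but the imagery shows we only moved 9:
print(slip_fraction(10.0, 9.0))  # 0.1, i.e. 10% slip
```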
So you have to think, if I'm trying to turn around a rock, what does that mean? So when we drive the
rovers, we send the entire sol's plan to the rovers, and then the rover takes the images and sends the data back to us.
We use those to create this 3D terrain environment in which we plan the subsequent drive.
And there are nuances to this, which also mean why we don't drive in real time.
We send the signals from Earth to Mars direct to the rovers, but they go through the Deep Space Network. There are antennas all around the world, but we also have
a lot of deep space spacecraft, and we have to share the time with them. So there is the coordination
with the Deep Space Network. On top of that, the rovers take so much data that we transmit it to orbiters around Mars, which then transmit the data back to
Earth. So there is this coordination that has to happen between when it's daylight on Mars,
because that's when we take the images and drive the rovers, when the orbiters are
flying over the rover, and when we have the Deep Space Network time. And all of this results in us
having a complex schedule in which we can actually drive the rovers. So all of this also explains why it would be a high priority
to develop a rover that is smart enough to do some of this driving on its own. And of course,
that's one of the things we were hoping to talk to you about. AutoNav, you must be thrilled to see Perseverance
beginning to find its own way across Mars
with the guidance that you folks provide
on a sol-by-sol basis.
Yes, it's been very exciting to be able to drive distances
beyond what we can see in the last images
the rover sent to us.
And that's what we get with AutoNav.
Because there's so much variability in the terrain,
beyond a certain distance the cameras can't see past the local horizon.
And the orbital images don't have enough accuracy
for us to be able to drive the rovers and avoid the hazards.
So what autonomous navigation allows us to do is
the rover itself will analyze the terrain,
detect the hazards, and find its
path around it. But it's actually a lot of fun for rover drivers, even with AutoNav, because
with all autonomous technology, especially when you're putting it on processors that are radiation
hardened for space and are computationally limited, there are certain things it can't
do very well, such as, you know, in the case of Perseverance, detect sand. And so rover drivers still have to plan the overall path
and guide AutoNav so that we tailor the drive to maximize the success for AutoNav.
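The "detect the hazards and find a path around them" step can be caricatured as search over a hazard map. A minimal grid-based sketch, under the big assumption of a flat 2D grid (the flight software evaluates candidate arcs over a full 3D terrain model, not a toy grid like this):

```python
from collections import deque

def find_path(hazard_grid, start, goal):
    """Breadth-first search over a 2D grid: 0 = traversable, 1 = hazard.
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(hazard_grid), len(hazard_grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:     # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and hazard_grid[nr][nc] == 0
                    and (nr, nc) not in came_from):
                came_from[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None

# A small rock field blocking the middle row except on the right:
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(find_path(grid, (0, 0), (2, 0)))  # routes around the rocks via the right column
```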
We'll put up a link on this week's show page at planetary.org slash radio, to a lot of great resources, including a video that actually
documents this first AutoNav roll across the red planet in kind of an S-shaped curve. I mean,
did it function the way, is it functioning the way you all hoped?
Yes. AutoNav has been doing really well. One of the things we really wanted to do was speed it up for Perseverance.
And so we have a dedicated processor, which is allowing us to run AutoNav a lot faster than we could on any previous mission.
The drive that I think you're talking about, which was the first AutoNav checkout drive,
we had identified a set of obstacles, rocks, that we wanted it to go around and actually drove
it around those. And it detected the rocks and did exactly what we had intended it to do.
So that's been great. As you initially try out technology, you really are seeing it in the
environment it was designed for. And we take these steps so that
we can make sure that some of the assumptions we had made are still valid. We do everything
we can on Earth to test it, but Mars is Mars. And Mars is hard. Didn't Curiosity have an
earlier version of AutoNav that ended up not being used much? So Curiosity also had AutoNav, in fact, as did
Spirit and Opportunity. So there's a couple of differences. The first being that we didn't have
a dedicated processor for us to do the image processing. So it was a lot slower than it is
on Perseverance. We could drive, if we did both visual odometry, which is how we drive the rover
such that we can detect slip, and AutoNav, about 20 to 30 meters an hour.
With Perseverance, we expect to be able to drive about
200 to 300 meters with AutoNav in a single sol.
We are for now just doing shorter drives, but we expect to reach that milestone.
And so that's really, that's made a huge difference.
Another thing is it's driven by science.
On Curiosity, the science team wanted to frequently stop and do science investigation. And if you are driving
shorter distances, those are distances you can already see in the imaging you have. And if your
AutoNav is so much slower than your directed driving, then you might choose to just do
directed driving. And then the last part of it is Curiosity had some wheel wear. And so on AutoNav,
when you're letting the vehicle decide, the vehicle didn't have a model of wheel wear;
it's just looking for hazards. And these small ventifact rocks, which actually weren't
large hazards, could still damage the wheels. So rover drivers would more carefully work their
way around the terrain. That has been mitigated by subsequent technology we put on the rover, but those are some of the
factors as to why AutoNav on Perseverance is going to be used a lot more than on previous missions.
Quite an improvement in velocity getting across Mars. Still, compare that to a human walking across the planet and it would seem pretty slow, but for a robot, not bad at all. And I think we've all seen those pictures of the beefed-up wheels that Perseverance has, to avoid those pockmarks that Curiosity was getting. There's one other factor that I read about, and there's a cute phrase to describe it, which maybe has to do with having that dedicated processor for handling AutoNav.
It's thinking while driving. What a concept. People should try that on the freeway.
Yes. There is, in fact, this additional capability. On previous missions, what we would do is we would take an
image, which is the image in which AutoNav is going to analyze the hazards, and we would have
the robot stop and think, as we call it, about where the hazards are, and then turn the wheels. And the
reason is because it allows us to do this in steps where you know where you're at, you're
analyzing the hazard,
and you can proceed. With thinking while driving, it's doing that autonomous processing while the wheels are turning. So you have to factor that into account in terms of the model you have
and the expected distance it's going to cover, and do the drive. But that results in a significant
speedup as well, because you're not stopping to do the terrain hazard
analysis. I also read about something called ENav. I wonder if you can contrast that with
AutoNav. Is it just a component of AutoNav, or what does that mean? Right. It is a component
of AutoNav: ENav, enhanced navigation. And AutoNav consists of the whole infrastructure of the modules that take the images, process the images, build the terrain model, all the messages that have to be passed between different flight software modules to actually turn the wheels.
And AutoNav is used as the encompassing term.
And ENav is a library that is used to do the hazard analysis and the path
evaluation. The other news that came out, actually, since we scheduled this interview,
is that we are getting closer now to Perseverance beginning to collect its first sample, which of
course is what everybody has been excited about. You know, we often on this
show call sample collection the holy grail of Mars exploration, or at least robotic Mars exploration.
And so now I guess we're getting close, but I don't think a site has been selected yet, has it?
So the science team has selected a location where we expect we will collect the first sample. So we are very excited
about this. We are actually today doing a long drive to get close enough to that location,
such that we will then need to do just a precision bump to get to precisely the spot where
we want to position the robotic arm, in the orientation we need,
because with a five-degree-of-freedom arm we are limited in where we can reach.
So we are driving close enough such that we can do that precision drive.
So it's about to start, and it's a very, very exciting part of the mission. The prime reason we're there
is to cache these samples for a subsequent return to Earth.
The process of sample collection is so much more than just, you know, rolling up,
let's roll over there and drop the drill down and pick up a sample. And I know this is not
specifically speaking your area, although you support the robotics, the mechanics of all this.
Can you talk about the complexities of this and maybe about the sampling and caching system itself, which is an absolute mechanical or maybe I should say robotic marvel?
We always say on this mission, you know, we actually have two robotic systems. One is on the outside,
the one you see, and there's an entirely second system inside the rover for sampling and caching.
It really is quite incredible. There are so many steps involved in the sampling process.
It's a finite resource. We have a finite number of tubes. And so we think very carefully about what we're going
to collect in those tubes for subsequent return to Earth. And the science team worries a lot about
diversity of sample and the value of that specific sample to bring back for subsequent return to
Earth. To do that, there are multiple steps which combine the science analysis, but also there are engineering constraints.
Because you do have this robot deploying a 2.1-meter arm with a 45-kilogram turret at the end of it, on freeform features on Mars
that we are looking at through sensors where there are inaccuracies.
And yet you have to make contact with the surface.
And both of these constraints come together in a series of steps.
Now, in addition, for the first sample we collect on a mission,
as was the case with Curiosity, there are additional steps. Because this is the very first time you are exercising the hardware on Mars
in that particular way. And subsequently,
we eliminate some of those steps as we have done the checkout and determined that we can, you know,
skip those or we've got enough information. So for this very first sampling that we're going to do
at this location, we are currently calling it the Far Pavers; once we get there, the science team will
have a name. And the names on this mission have been incredible.
Very creative.
Very, very, and meaningful. And so that's been great. And that's the case with all of the
missions, right? Naming things is taken very seriously, and there is a number of scientists
who decide why something should be named. And I
think that would be a podcast in itself. You know, that's a good idea.
I'm going to make a note of that. We've talked a lot about the names; we've never dedicated a segment
to it. So I like that. So what we do, once we have determined that we've identified a location
with high probability, we'll do a precision drive.
In fact, we do a precision drive for any location in which we want to do careful robotic arm analysis,
which I'm differentiating from the case where we may just have ended up in a drive location
and the scientists look at the image and say, wow, this is neat, and opportunistically deploy the robotic arm. But when we are going
for a dedicated reason, we do a precision drive because we are evaluating in that the robotic
arm moves that we can do to get to that target. Because we have a constrained arm, a five-degree-of-freedom robotic arm, we can't reach arbitrary positions in arbitrary orientations.
In addition, if you move the turret in certain ways, it might collide with adjacent terrain features or rocks.
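The degree-of-freedom constraint she mentions can be felt even in a toy planar arm. The sketch below checks only position reachability for a hypothetical two-link arm; the link lengths and the whole setup are illustrative, not Perseverance's actual arm geometry:

```python
import math

def position_reachable(x: float, y: float, l1: float, l2: float) -> bool:
    """Planar two-link arm with link lengths l1, l2: a point is reachable
    iff its distance from the base lies in [|l1 - l2|, l1 + l2].
    Demanding an arbitrary tool orientation at that point as well would
    require an extra joint -- the same kind of constraint that keeps a
    five-degree-of-freedom arm from reaching arbitrary poses."""
    d = math.hypot(x, y)
    return abs(l1 - l2) <= d <= l1 + l2

print(position_reachable(1.5, 0.0, 1.0, 1.0))  # True: inside the reachable annulus
print(position_reachable(2.5, 0.0, 1.0, 1.0))  # False: beyond full extension
```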
And then we have to think of all the things we're going to want to do while we're there and do them all without repositioning the rover.
And so you're not just thinking about that one first thing you're going to do, you're thinking about the entire sampling campaign that is involved just in the
first step of this, which is the precision drive. And once we've done that drive, we have to evaluate
what are the targets, what is the precise location in which we're going to select a target.
And we do this iteratively, where we might do what we call
remote science. The first is just taking images. So from the images, you can tell so much. Then we
use instruments such as the SuperCam, which has a LIBS laser on it and can study from a distance
what the composition of those potential targets is. So SuperCam will zap it with its laser and you can
get an idea of the composition of that site as well. That's right. And what's so neat about
SuperCam, you're shooting the laser so that you can study the plasma, but on Perseverance,
we even have a SuperCam microphone. So by hearing the sound of that pop, you can tell something about the
hardness and other characteristics of the rock. Based on these analyses, then, comes the next expensive
thing. When we talk about expensive, for us it's in terms of power, energy, and the time
the rover spends doing something. Those are the resources we try to manage,
because the rover is capable of so much stuff. We really have to think about what specific thing are we going to have it
do from this huge selection of possibilities. So the next thing we do in that is, after those
features have been looked at, the science team might decide, here are the few on which we are
going to do additional analysis by what we
call contact science. And some of this is close proximity science, essentially using instruments mounted
on the robotic arm to study the surface very closely. And those are SHERLOC and PIXL. And we
position these instruments very close to the surface, within a few centimeters, and get additional information now really in context.
And once we have additional information from WATSON and SHERLOC, then we can further refine precisely which target has the characteristics that the science team wants.
The steps for sampling and acquisition here are we actually abrade the surface. We have
the ability on the robotic arm on the drill to do a bit exchange. So we can switch out which bit
it's using for what purpose. So we'll abrade the surface. And that process creates a more even
surface, which allows us to take scientific measurements with the instruments
that are more precise. A lot of the instruments we send to Mars don't like dust, which really
doesn't go well with Mars, because Mars is such a dusty place. And we have on this mission
a really, really great tool to take care of that, which we call the gDRT, the gaseous dust removal tool. On previous missions,
Curiosity and the Spirit and Opportunity rovers, we've had brushes
that we can put in contact with the surface, and they brush away the dust. Here, we have nitrogen
gas, which we can puff out to clear the surface. The advantage is it really doesn't care about the roughness of the surface,
and it can be used in a lot of different contexts. We're then going to do the bit exchange
and then collect a sample core. There are so many steps when we do sampling and coring. We also have
to analyze the hardware to make sure that after a particular activity, the
hardware is as expected, especially for the very first time we do this.
So interleaved with some of this is imaging of the hardware itself.
So we may operate and we take images just to make sure that the engineering hardware
is operating as expected.
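Pulling the steps Vandi has walked through into one place, a rough and unofficial ordering of the first-sample campaign might look like this; the step names are paraphrases of the interview, not flight sequence identifiers:

```python
# Unofficial summary of the first-sample activities described above;
# step names paraphrase the interview, not actual flight sequences.
FIRST_SAMPLE_CAMPAIGN = [
    "precision drive to the workspace",
    "remote science: imaging the candidate targets",
    "SuperCam LIBS analysis (laser plus microphone) from a distance",
    "contact science with SHERLOC/WATSON and PIXL on the arm",
    "abrade the surface for a cleaner measurement patch",
    "gDRT nitrogen puff to clear dust",
    "re-measure the abraded patch in context",
    "bit exchange to the coring bit",
    "collect the core",
    "image the hardware to confirm it is as expected",
]
for i, step in enumerate(FIRST_SAMPLE_CAMPAIGN, 1):
    print(f"{i:2d}. {step}")
```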
And then once we have this very first core,
there will be a lot of excitement,
because it really is a momentous occasion.
People hear about the rover once it's landed,
But for those of us who've worked on it,
it's many, many years leading up to this day.
And so it'll be a very exciting day when we collect that first
core. And that really isn't even the end of the story, because as I was mentioning, we have this
entire second robotic system on this rover, which we call the Adaptive Caching Assembly.
The external robotic arm with the core docks with the bit carousel, which if you're looking at images of the rover, you see this big circular piece of hardware on the front of the rover.
That's the bit carousel.
It docks with that.
Inside the rover, we have an entire second robotic arm that handles
that tube, and there are various steps in it where it is characterizing that sample by taking images,
weighing it, measuring it, so that we know exactly what we have collected and are documenting it for subsequent return to Earth, so that we can characterize the sample.
This struck me. It reminded me of the process the Apollo astronauts went through. They didn't just
pick up rocks and stuff them in a bag. They had to take pictures of the location. They had to make
notes. They had to do the same kind of documentation that you're going to be doing, you and the team
will be doing remotely on Mars using this wonderful robotic equipment.
That's actually a really good analogy. Not just the sample by itself has value, but also the context in which it was taken. So all of this data
from all of the various instruments is labeled together in order for us to document,
essentially, the context of that sample. And also we have to seal the tubes, because it's going to be many years
before we actually bring them back to Earth. And we're going to cache them in multiple different
ways, but drop them to the surface where the subsequent FETCH rover is going to collect them.
And we're going to have a launch from Mars, where the samples are going to launch from Mars,
rendezvous in orbit, and then be returned
to Earth. So they're going to have quite an incredible journey. And so we have to make sure
that the sample we collect survives that and is pristine. Exactly as I was saying, a far more
complex process than somebody might think just hearing, oh, we're going to be picking up
samples on Mars. Vandy Verma and I will continue our conversation in a minute, including her team's
support of Ingenuity, the Mars helicopter. This is Planetary Radio. Hi again, everyone. It's Bruce.
Many of you know that I'm the program manager for the Planetary Society's LightSail program.
LightSail 2 made history with its launch
and deployment in 2019, and it's still sailing. It will soon be featured in the Smithsonian's new
Futures exhibition. Your support made this happen. LightSail still has much to teach us. Will you
help us sail on into our extended mission? Your gift will sustain daily operations and help us inform future solar
sailing missions like NASA's NEA Scout. When you give today, your contribution will be matched up
to $25,000 by a generous society member. Plus, when you give $100 or more, we will send you the
official LightSail 2 extended mission patch to wear with pride. Make your contribution to science and history at planetary.org slash S-A-I-L-O-N.
That's planetary.org slash sail on.
Thanks.
I'm also thinking of the software that you and others spend so much time writing,
the algorithms that you have to create to control these processes.
Is it all pretty much done from scratch, like a lot of the hardware is custom-built for this?
Or are you able to pick and choose bits of other software that can be used for this?
Or is it just starting from zero?
Or maybe I should say zeros and ones.
You're talking about something that is very near and dear to my heart. I have written different parts of the software. And in this specific series of events that will unfold, I worked on the rover collision model, which, as we are moving the
robotic arm, allows the rover to analyze trajectories so that it doesn't collide with its own body, and also so the SuperCam laser is not shot at
the rover's body. And we were talking about this broad-strokes view of all the sequence of
engineering and science events happening. But just for a sol of robotic arm activity, such as when we
took the selfie, we did hundreds and hundreds of collision checks.
We modeled the rover with up to 500 geometric shapes,
checking collisions of shapes against other shapes.
So for just a small part of these moves,
there is all this other stuff in software happening in the background.
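A sphere-based self-collision check of the general kind she describes can be sketched in a few lines. This is a minimal illustration under the assumption of sphere-only shapes; the flight model, with its hundreds of shapes and full arm trajectories, is of course far more elaborate:

```python
import math
from typing import List, Tuple

# Each shape approximated as a sphere: (x, y, z, radius), in meters.
Sphere = Tuple[float, float, float, float]

def spheres_collide(a: Sphere, b: Sphere) -> bool:
    """Two spheres intersect iff center distance < sum of radii."""
    dx, dy, dz = a[0] - b[0], a[1] - b[1], a[2] - b[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) < a[3] + b[3]

def arm_pose_is_safe(arm: List[Sphere], body: List[Sphere]) -> bool:
    """Check every arm sphere against every body sphere for one pose."""
    return not any(spheres_collide(a, b) for a in arm for b in body)

body = [(0.0, 0.0, 0.5, 0.4)]
print(arm_pose_is_safe([(1.0, 0.0, 0.5, 0.3)], body))  # True: 1.0 m apart, radii sum 0.7
print(arm_pose_is_safe([(0.5, 0.0, 0.5, 0.3)], body))  # False: spheres overlap
```

Checking a full arm move means repeating this test at many sampled points along the trajectory, which is how a single activity racks up hundreds of collision checks.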
There'll be so many people who worked on different aspects,
who all see their work finally doing what it had been designed to do.
And a lot of it may not even be visible in the forefront.
To get back to the question you were asking, we have had rovers on Mars before.
And so some of the infrastructure, and the lessons we've learned in the way we do things, we do carry from one mission to the next.
But everything for Perseverance is custom-built for Perseverance.
These missions are so unique, and our resources in terms of the computation we have are so
limited, that we really do try to optimize.
You know, when I talk to people who are software programmers, I joke that I also worked on
the terrain model for the workspace.
We literally think about loading the maximum information into a bit,
and the way we represent things, just because we are limited in memory and
we are limited in computation.
So there are really creative ways in which we program space rovers that are different from
other high-risk applications that you might have on Earth,
such as self-driving cars and others, because the computers have to be radiation hardened.
There's a lot of radiation on Mars, and only computers that have gone through
radiation hardening are available to us, but they can be very limited in their computing power.
And so the software we design is sort of custom-built
in order to be able to do these complex activities
while still working with those operating systems.
I am not a coder,
but I know that we have coders in the audience
and they are just going crazy right now,
listening to this and imagining what is happening
and that this is all part of your job.
Another piece of your job, turning away from sample collection: we recently welcomed back MiMi Aung, the
project manager for Ingenuity, the Mars helicopter. I read that you also had a part in contributing to
the wonderful success of that little flying machine, right? It was a passenger for so much of its life on Perseverance.
It's been really exciting to be in some small way associated with the Mars
helicopter Ingenuity, because it really captures, you know, some of what we try to do,
which is push the envelope of what we can do with technology on Mars. We have a prime mission we're trying to accomplish.
The risks we can take in some aspects of the mission,
we evaluate them differently
because there is a higher consequence
if it didn't go according to exactly the intent.
But in other areas,
we really want to take some greater risk
in order to push the envelope
for what we can do
in the future. And the Mars helicopter was that. And for the roboticist in me, you know, that was
just such an exciting moment to see that we could actually fly a robotic vehicle on Mars. It was
very exciting to be part of that. And my part was more in the interface between the helicopter and the rover.
We worked really closely together. Robotic operations, which is what I'm part of, consists of driving the
rover, operating the robotic arm and the sampling system, and it includes the interface to the
Ingenuity helicopter. The helicopter has its own operations team that works together with engineers on the robotic operations team,
whom we call the helicopter integration engineers,
to coordinate how we would send the commands,
and where the rover is with respect to the helicopter.
The helicopter talks to Earth via the rover.
So it talks to the rover and then we transmit the data.
Some of the parts in the early mission were just so interesting.
The helicopter, as you mentioned, was actually strapped onto the rover sideways.
And there were a whole series of steps that we had to take to have it be upright and drop
it to the surface.
We didn't really know how challenging it would be to fly on Mars.
So there were some constraints of selecting that airfield.
And it was both the helicopter and the rover teams.
In fact, since the day we landed, we met almost daily for a while to analyze the terrain to pick the airfield and the flight zone.
Once we picked that, we had to actually get the rover to that location.
But there was a series of other things the rover was doing in that time, checking out whether the wheels could turn and deploying the robotic arm and doing other checkouts. It was interleaved with
doing the flight zone selection. And then to me, one of the really interesting parts really early
on was the helicopter, while it was attached to the rover, was powered by the
rover. But as soon as we would drop it to the surface, it needed its solar panel to be able
to get charged. Otherwise, Ingenuity wasn't going to survive. We had 25 hours. And so that drive was
really critical, the drive where we drive off of the helicopter. Because if you just sat there
after dropping it off, there was risk to the helicopter. So that was a lot of fun from the
rover driving perspective. When you're trying to maximize the success of a drive, sometimes you
actually remove some of what we call the fault protection, which are ways in which it tries to
keep itself safe. Because some of those can be cases where it
thought something was an issue and it isn't. We really shaved off a lot of that margin to say,
we really don't want to end the drive unless it's a real issue with a risk to the rover.
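One way to picture "shaving off that margin" is a severity threshold: during a time-critical drive, only faults serious enough to actually endanger the rover are allowed to abort the traverse, so benign false positives don't stop it. This is a conceptual sketch under my own assumptions, not JPL flight software; the fault names and numeric scale are hypothetical.

```python
# Illustrative only: model each fault-protection check as having a
# severity, and abort a drive only when a fault meets the configured
# threshold. Raising the threshold for a critical drive is one way to
# think about "removing conservatism" as described in the interview.
from dataclasses import dataclass


@dataclass
class Fault:
    name: str
    severity: int  # 1 = informational ... 5 = real risk to the rover


def should_abort(faults, min_severity: int) -> bool:
    """Abort the drive only if any fault reaches the configured severity."""
    return any(f.severity >= min_severity for f in faults)


faults = [Fault("slip estimate high", 2), Fault("imu bias drift", 3)]

print(should_abort(faults, min_severity=2))  # nominal ops: aborts -> True
print(should_abort(faults, min_severity=5))  # critical drive: keeps going -> False
```

The trade-off is explicit: a higher threshold accepts more risk of missing a real problem in exchange for not ending the drive on a false alarm.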
And the rover really was the prime mission, so we couldn't risk the rover. But at the same time,
we didn't want to be so conservative. So that drive, we removed a lot of the conservatism to just drive so that we
could maximize the success. And seeing that drive work, and that the helicopter was alive was really
exciting. And then of course, we had to drive off to this location from which we would observe the
first flight. And so it's been really interesting. And I work with the helicopter integration engineering team, which works with the
helicopter team as we're going now into the extended mission. And now to see that helicopter
doing real work, doing its own useful, essentially research or observations of Mars and assisting the rover. It does seem to be
a realization of a dream. I also think of your use of the word fun that you made a few moments ago.
And your other job as the assistant section manager for mobility and robotics
at the Jet Propulsion Lab, where you help to manage something like 150
roboticists. I'm sure it's not all fun, but it sure sounds like a very cool professional
community to be part of. It actually is quite a lot of fun.
Good. Because in that role, so at JPL, most people do multiple things, and it's a very unique place in that way.
And in that role, we're looking to the future.
What are the technologies that will enable the missions of the future?
So you get really brilliant and creative people thinking about technologies that we will need for possible missions.
Now, you also work with the scientists very closely because it's guided by what the science interests are on other planetary bodies and otherwise in the solar system and beyond.
So that's a really interesting part of it.
And there's a connection between these because when you see, when we're driving on the surface, we're already doing this, actually, which is really interesting. As we're already still doing the prime mission and collecting the sample, we're already thinking ahead to extended missions and what we could do in the future. You start to see what is it that we could improve and do better. And that's kind of the role of the robotics technology development.
Speaking of the future, I note that you also worked on and contributed to Europa Clipper and to the effort to come up with something to land on that moon of Jupiter.
And, you know, the need for autonomous activity by those spacecraft, I imagine, is obvious
to everybody.
Do you see a bright future for our robots around the solar
system? I mean, are they just going to get smarter and smarter and more capable?
That's so neat that you looked all this information up. I'm very impressed. But yes,
I think that it becomes even more important as we go further out in the solar system,
because the communication delay just gets longer. If we want to have missions that go further out, they need to be
more and more autonomous. And there are other constraints, such as, you know, the example you
gave of the Europa lander mission that we had been looking at. And these are all, you know,
hypothetical missions, other than Clipper. Clipper is a real mission.
Right, right. Thank goodness.
Right.
But I think in some of these cases, when we're looking at possibilities, which we have to
sort of take a concept and study it carefully in order to make it a potential possibility
for scientists to be able to evaluate, there are certain missions where they're constrained
by the lifetime, because you're going to locations that are just so harsh, either thermally or
radiation-wise, that the robot, whether it's a lander or a rover or other mobility platforms,
we're looking at snake robots and various other aerial platforms, will have a lifetime on the
order of weeks. And so now you really need
to be autonomous because you can't wait for the signal from Earth to tell you what to do next,
because you're wasting precious time if you're waiting for that. And so those missions are going
to be more and more autonomous. I'm really excited about this because it really enables science,
allows us to investigate these places and study them. I just think the
future is going to be very interesting. Let me go in a slightly different direction as we wrap up
here. You know, we know, I know, kids love robots. Do you see robots, robotics, as a gateway for
young people into STEM careers? I mean, you've been fascinated by robotics and programming since you were pretty small.
Yes, I think robotics is a fantastic way to get kids excited about, you know, just the
process of science in general, because you can do something, and then you have to go and potentially
study a particular area and then apply it. You learn through that process of applying the theory
you learn, and it can start in really small ways. And, you know, JPL, right now we are still in a
pandemic transition, but we have an open house and we get thousands of people who come through there.
And I always try and volunteer at these because the kids who come through there, it's just so interesting to see how much they already know about these missions.
Sometimes they think of these robots as their friend and they'll come and give you these.
They've thought about the problems you've encountered, and they're really creative and interesting.
And I think it's a great way to get them excited about it, to feel as part of following the
story, they're problem solving along with us.
I try and do as much as I can to try and connect with students.
I always say, you know, robots and dinosaurs, you know, they're a really great way to
get kids excited about science. So boys and girls, keep it up, and you just might end up building
robots that will help us explore the solar system, and maybe someday even beyond. You know, that
statement is actually really, really almost so literally true for Perseverance, because we're collecting these samples. And it's going to be the 2030s. By the time these samples
are brought back to Earth, some of the young people who are still in school may be the ones
who are sort of analyzing and studying these samples. So I think they really are, that
generation really is part of this mission. Vandy, it has been absolutely delightful talking with you.
This is a great place for us to end,
but we won't really end
because I look forward to celebrating with you
and the rest of the team
when that first sample is safely in its little tube
packed away, at least temporarily,
inside the Perseverance rover
and then to be dropped off, to be picked up later, returned to Earth.
And who knows, maybe we will make that discovery of whether we are alone in the solar system and the universe.
Thank you, Vandy, for all of this work that you and the team do and for joining us today.
Thank you. I really appreciate you covering the work we do. It's time for What's Up on
Planetary Radio. Here is Bruce Betts, the chief scientist of the Planetary Society,
also significantly the program manager for the LightSail mission, because we're going to talk
a little bit about that as well. Welcome. Hi, Matt. As you know, because we just discussed it, we have an interesting predicament with the question
that will be answered in a few moments today, the quiz that you posed
a couple of weeks ago, that I know you are prepared to deal with.
I am prepared, thanks to you.
So prepare us for the night sky. Oh, nice segue.
Evening sky, saw it last night. Venus, as always, looking really, really bright over in the west
shortly after sunset. And then coming up in the east around sunset are Jupiter and Saturn. Jupiter, the much brighter of the two, Saturn looking
yellowish. We've got the Perseid meteor shower peaking, if you pick this up shortly after it
comes out, peaks the night of the 12th and 13th of August, with increased activity several days
before and after. The best viewing will be after midnight, as it typically is, when our side of the Earth turns into the oncoming
meteors. Also, by then the moon will have set,
leaving a darker sky. But you can see meteors before then too, so go out, check it out, relax.
Dark site will of course be better. It's typically the second best meteor shower of the year behind the Geminids.
But for Northern Hemisphere observers, it's usually a lot warmer outside for the Perseids.
We move on to this week in space history.
2005, Mars Reconnaissance Orbiter was launched with its tremendous science instruments to study Mars and still working.
It's quite a performer.
It has certainly done a lot of great work for us.
We move on to random space fact.
For some reason, it makes me feel like I had a head for the drag strip.
Yeah.
Hey, Matt, you and I,
you originally pointed this out to me a while back,
but now it's actually happened.
Two European Space Agency spacecraft
with totally different missions,
totally unrelated to Venus,
flew past Venus during the last week,
only 33 hours apart.
The Solar Orbiter spacecraft, which is, of course, studying the
Sun primarily, and the BepiColombo spacecraft that's headed to Mercury both used Venus for a
gravity assist to help them head farther into the inner solar system. Quite a random space fact.
Congratulations to the teams for both of those missions. Best of success as you head toward your final destinations.
I'll put this out to listeners.
I could not find an example of two flybys of another planet so closely timed.
I can't imagine there would have been, but if anyone knows there is, let us know.
Bet you are right about that.
It seems like something we would have heard about.
Are we ready for the contest? Oh, we're so ready for the contest. I asked, after Mir and Skylab,
what was the most massive artificial object to re-enter the Earth's atmosphere? And I meant to say uncontrolled end of mission, kind of didn't think to say that. So how'd we do, Matt?
What would people tell us? Well, like I said, I've already given Bruce a heads up about this.
We had more people come up with an alternative answer, an alternative to what Bruce had in mind, than probably we ever have before. I'd say half or
more of people who entered this one came up with a variety of space shuttles, all of them, basically,
although some people even differentiated among the different orbiters because some were heavier
than others. But that's not what you were looking for, was it? No, but depending on what
random.org found, I'm happy to take that answer because the space shuttle was indeed incredibly
massive and was more massive than anything else that we're talking about. What else did people say?
I'll get to our winner in a moment, but here is our poet laureate, Dave Fairchild, in Kansas with the
answer I think you were looking for. Cosmos 1686 was docked with Salyut 7. Round the earth they
traveled in a low earth orbit heaven. Salyut 7 had six crews that served in the arena until in 1991
she crashed in Argentina, where, according to Tourist and Zimmer in Germany,
nobody cried for it.
A little Broadway reference.
Get it? Get it?
I get it. I get it.
Salyut Evita.
Here's our winner.
And he's a first-time winner,
although I think he's been listening for quite a while.
Michael Kaspol in Germany, who, sure enough,
said Salyut 7, coupled with the Cosmos 1686
TKS spacecraft, total mass, he says, about
40,000 kilograms, thus a VNEO,
a very near-Earth object. Congratulations, Michael.
You have gotten yourself a copy
of a great book, Across the Airless Wilds, The Lunar Rover and the Triumph of the Final Moon
Landings by Earl Swift. I've read a little bit more of it since we first said that we were going
to offer this book. It is really fascinating about the development of the rovers and how much they contributed to Apollo, as we discussed recently with Andy Chaikin when we talked about Apollo 15, the first of those folks on the moon to have one of these to tool around in.
It's published by Custom House.
Excellent.
I like this approach from Kent Murley in Washington, who gave the impression that he misheard you.
He thought you wanted the third most massive "art official" object to reenter Earth's atmosphere.
And so he gave us some candidates of spacecraft that had artists aboard, including Skylab.
Skylab 2, because Alan Bean, who is quite an accomplished
artist, was on there. Kent actually revealed toward the end that he did understand what you
were after. Vlad Bogdanov in British Columbia said that Salyut 7 was up there for 8.8 years,
a record until Mir, six crews, as we heard, including Svetlana Savitskaya, the first woman to perform an EVA, an extravehicular activity, a spacewalk.
And the second woman in space.
John Guyton in Australia.
This is cute: a first stage with a weight of around 131,000 kilograms almost made it to Richard Branson's
definition of space, reaching an altitude of about 70 kilometers. Yeah, a nice ride, but not what most
people call space, we're sorry to say. From the Netherlands comes this little ditty from
Ipa van der Meer. Be a good comrade and hold your hand to your head
for a flaming streak of yellow and red.
Be sure to give a proud salute for Soviet space station Salyut.
Number seven, to be precise.
He says he loves listening to us from there in the Netherlands.
Aw.
Finally, from Gene Lewin in Washington.
When spacecraft come back from afar and reenter Earth's atmosphere... ground near Perth. The third largest would be the seventh Salyut, if we don't count the space
shuttle line. And though this, too, ended up on land, it avoided a littering fine.
Well, there's just all sorts of good information in there.
Yeah. Nice job, everybody. Thank you so much. We're ready for a brand new one of these.
I checked, and I don't think I've ever asked this. What is the tallest
mountain on Venus? Go to planetary.org slash radio contest and give us your answer. I don't know how
this never got asked in the past, but it shouldn't be too hard for you to find. Just make sure that
you find it and enter by Wednesday, August 18 at 8 a.m. Pacific time.
And here's what you might win.
It's another great book, Light in the Darkness: Black Holes, the Universe, and Us by Heino Falcke.
He's an award-winning astrophysicist.
He's the guy who actually went up to make the announcement of the first actual image of a black hole from the Event Horizon Telescope.
A fascinating book with a little bit of a spiritual angle to it as well, published by HarperOne. That'll be yours
if you're chosen by random.org, as Michael Kaspol was this week. Hey, I've got a hint for the trivia
contest that's completely not useful. The answer has been an answer to a previous trivia
contest. And in fact, that was one of my favorite pieces of trivia. Never mind, totally not useful.
Made me happy. We done, Matt? No, we're not. Because I want to say that we have this wonderful
movie premiere coming up, actually, the documentary about the mission of
LightSail 2, which you are
a prominent player in.
It's going to premiere on the 28th
of this month, August 28th, at
about 10 o'clock,
really, to be specific, 10:15 Pacific
Time, and people will be able to
watch it on the Planetary Society's
YouTube channel, also
at planetary.org slash
live. That's on the 28th. Very exciting stuff. And then you're going to be in a discussion that
I'll be moderating with the boss, our CEO, Bill Nye, and some other good folks. I'm looking
forward to that. This is going to be quite a celebration. Have you seen the film?
I have probably seen all the pieces, but I have not seen all the pieces put together.
Goody. Well, try and avoid it.
It's really nice.
It's great stuff. Yeah, it's a wonderful story. All right, now we're done.
All right, everybody, go out there, look up at the night sky, and think about metamorphosis.
Thank you, and good night.
That is Bruce Betts, the chief scientist of the Planetary Society, who before our eyes
is metamorphosizing into the program manager for the light sail mission from the Planetary Society.
He joins us every week here for What's Up.
Luckily, not a cockroach.
Planetary Radio is produced by the Planetary Society in Pasadena, California,
and is made possible by its self-driven members. Take the wheel with them at planetary.org slash
join. Mark Hilverda and Jason Davis are our associate producers. Josh Doyle composed our
theme, which is arranged and performed by Peter Schlosser. Ad astra.