Listen:
Check out all episodes on the My Favorite Mistake main page.
My guest for Episode #257 of the My Favorite Mistake podcast is Chris Lewicki, an Astrofuturist, Engineer, and Entrepreneur who is interested in developing strong, thoughtful foundations for the near-future space economy.
He’s a multi-time co-founder. He first co-founded and was CEO of Planetary Resources Inc. (PRI), which focused on the prospecting, development, and use of resources found on near-Earth asteroids. He helped acquire over $60M in investment and revenue, built a team of 80 extremely talented engineers, scientists, and business and policy leaders, and launched 3 experimental spacecraft to advance the adoption of space resources as a crucial part of humanity’s activities in space.
Prior to entering the private sector, Chris was a key member of NASA’s Mars Exploration Rovers and the Phoenix Mars Lander, serving as Flight Director for the Mars rovers Spirit and Opportunity, and as the Surface Mission Manager for Phoenix.
Chris received both bachelor’s and master’s degrees in aerospace engineering from the University of Arizona. He’s the recipient of two NASA Exceptional Achievement Medals and has an asteroid named in his honor: 13609 Lewicki.
Chris imparts lessons learned from his early days on NASA's Mars exploration projects, where a potential disaster during a rover test thrust him into the limelight as an emerging leader in the field. His poignant recounting of the incident underscores the nuanced details that contribute to the success or failure of any mission, and the critical concept of design for test (DFT).
Drawing parallels to the broader engineering community, this episode's riveting discussion reveals essential strategies used in this high-stakes industry. The implementation of mistake-proofing tactics (or "poka-yoke"), robust system performance to ensure resilience, and the introduction of redundancy in spacecraft design all contribute to an airtight spacecraft system. Learn from Chris's insights as he unravels the many considerations that go into ensuring functionality, designing for testability, and anticipating service requirements and testing needs during the initial design phases.
Questions and Topics:
- Was it a connector being reversed?
- New and innovative work: was it a design mistake not to be "designed for test"?
- Could that have been mistake-proofed in some way? It was not.
- Would they have fired you? Did you ask? Ernie or others?
- Did it take time to be able to tell the story? How long?
- What response did you get to sharing that story online?
- Bringing these lessons into the private sector as CEO?
- How many people have taken you up on your offer to share their failure stories?
- Chris's article: "My $500M Mars Rover Mistake: A Failure Story"
- Netflix documentary on the James Webb telescope
Scroll down to find:
- Video version of the episode
- How to subscribe
- Quotes
- Full transcript
Find Chris on social media:
Video of the Episode:
Quotes:
Subscribe, Follow, Support, Rate, and Review!
Please follow, rate, and review via Apple Podcasts, Podchaser, or your favorite app — that helps others find this content, and you'll be sure to get future episodes as they are released weekly. You can also financially support the show through Spotify.
You can now sign up to get new episodes via email, to make sure you don't miss an episode.
This podcast is part of the Lean Communicators network.
Other Ways to Subscribe or Follow — Apps & Email
Automated Transcript (May Contain Mistakes)
Mark Graban:
Well, hi. Welcome to My Favorite Mistake. I'm Mark Graban. Our guest today is Chris Lewicki. He is an astrophysicist, an engineer, and an entrepreneur.
Mark Graban:
He's interested in developing strong and thoughtful foundations for the near-future space economy. He's a multi-time co-founder or founder of organizations. First, he co-founded and was CEO of Planetary Resources, Inc., or PRI, which focused on the prospecting, development, and use of resources found on near-Earth asteroids. Before entering the private sector, Chris was a key member of NASA's Mars Exploration Rovers and the Phoenix Mars lander teams, serving as flight director for the Mars rovers Spirit and Opportunity and as the surface mission manager for Phoenix.
Mark Graban:
Chris has both bachelor's and master's degrees in aerospace engineering from the University of Arizona. He's the recipient of two NASA Exceptional Achievement Medals. And this is cool: he has an asteroid named in his honor, 13609 Lewicki. So Chris, welcome to the podcast.
Mark Graban:
How are you?
Chris Lewicki:
Glad to be here, Mark. One correction on the intro: not quite an astrophysicist, although I admire astrophysics.
Mark Graban:
Oh, I said that wrong.
Chris Lewicki:
Maybe astrofuturist.
Mark Graban:
That was my mistake: astrofuturist. I have a friend who's an astrophysicist who's going to come watch the total eclipse with me.
Chris Lewicki:
So excellent. That'll be an exciting day for everyone.
Mark Graban:
Are you going to be anywhere near the path of totality?
Chris Lewicki:
I'm going to travel to experience it with my family, who live right on the line of totality.
Mark Graban:
Oh, great, great. Excited about that. So, astrofuturist, my mistake there. It's fair to say you are a rocket scientist. I think you're the first, quite literally, rocket scientist that we've had on the show.
Chris Lewicki:
I'll accept that title. They make a lot of mistakes. They just hopefully make them before they show up on TV.
Mark Graban:
And I think we're going to have a chance to talk about that here in the episode of the different things that you're involved in. I'll link to Chris's website and LinkedIn profile and you can get that broader view. But what else would you highlight in terms of the things you're working on these days? Chris?
Chris Lewicki:
Well, I'm interested in the space industry. I also call myself a space industrialist. Anything that creates infrastructure and allows companies, businesses, communities to go to new frontiers. We've been doing it for all of human civilization. I think space is just another ocean to cross.
Chris Lewicki:
So with that, I'm involved with organizations like the Creative Destruction Lab, which is a startup accelerator where I help mentor startups and certainly learn from them and the other mentors, like Chris Hadfield, who was commander of the International Space Station. I'm also involved with another space explorer, Anousheh Ansari, at the XPRIZE Foundation, where we're dreaming up new prizes to create breakthroughs in a lot of different areas. The ones we're interested in for space right now relate to the problem of space junk, so we hope we can launch a new prize in that area soon. And then just lots of helping out different companies. I think I find myself in the role of helping out founders and companies more often than starting them myself these days, but I'm still always looking for the next moonshot to take.
Mark Graban:
Yeah. And we are taking, I guess we, as the US and other countries are taking more moonshots quite literally these days in the near future. Right?
Chris Lewicki:
Yeah. Very exciting. We've seen things crash on the moon, things not quite crash on the moon, but there's a lot of traffic headed that way. Many exciting things at Mars. We still have robots at the edge of the solar system. And what's exciting for me is just that more and more of it is outside of the activity of governments, though they're still certainly a big partner in that.
Chris Lewicki:
A lot of this is just slowly trending towards normal commerce and maybe even a little bit of tourism and all those normal things that make up our daily lives.
Mark Graban:
Yeah. Because recently there was a lander that landed on the moon but then tipped over. Do I have that right?
Chris Lewicki:
There were actually two that could fall under that description. The Japanese had a mission called SLIM, which was supposed to do a little bit of a belly flop, and it did too much of a belly flop and kind of ended up on its head. There are some interesting pictures from that. So that was the Japanese space agency. Then a private company, just last month, called Intuitive Machines, was the first private company and the first US entity to soft-land on the surface of the moon since 1972, which was the last time Americans did this. And they came in a little too fast.
Chris Lewicki:
So, a little bit morbid, but it kind of broke its legs at landing and tipped over, and then from that point whispered out some messages for the next few days to relay data and images. But we call that a qualified success. You made it to the surface of the moon and you were still operating. Someone probably could have walked away from that landing, maybe limped, but then they would have run out of oxygen, so it would have been over anyway.
Mark Graban:
Well, and thankfully it was an unmanned mission. Whether we're rusty or there are new technologies, I'm sure it's a learning opportunity for everybody involved, or anybody who's going to go in the future, right?
Chris Lewicki:
Yeah, absolutely. The great thing about these missions is just the change in risk profile of public-private partnerships. You know, companies want to innovate. They want to cut what might be unnecessary corners, introduce new technologies that maybe haven't been proven yet, instead of only doing things that have been proven to work. The benefit of that is you can do things faster, you can do them better, and usually you can do them for lower cost. The compromise is it doesn't always work.
Chris Lewicki:
But, you know, as Elon Musk has shown with SpaceX, which has had many exciting launches in the last year, it's only through actually trying that you figure out from Mother Nature and physics what was a good design and what needs improvement. And there's no better way to find out than to go out and try to do it, rather than guess at it. So that iterative approach to design and learning, and failing farther down the road each time, I think, is actually necessary for really hard types of unprecedented progress.
Mark Graban:
Yeah, that's very well said, and there's a lot more to talk about, but I want to get into what's usually the first question here. Of the things you've done, I'm curious if this would be a story from your time working on space missions, or as an entrepreneur, or both. Chris, what would you say is your favorite mistake?
Chris Lewicki:
I think my favorite mistake is the time I nearly killed a $500 million Mars rover.
Mark Graban:
Nearly.
Chris Lewicki:
Nearly. That's a key phrase in that sentence. And this is a story I've told a lot of times in my career, and it wasn't one I was always ready to tell. You kind of go from being embarrassed about things, or being a bit self-conscious about it, to, over time, just kind of understanding the lesson from it. So when I was in my twenties and in my first big job as a rocket scientist and aerospace engineer, I was working on the Mars rover program at the Jet Propulsion Laboratory in Pasadena. We were building the Spirit and Opportunity Mars rovers. It was February or so in 2003, just a couple of months before launch, and like two weeks before we were going to box these things up and ship them to Cape Canaveral, Florida, where they would finish their testing and go on top of the rocket. And one of the things that's special about these rovers, you can see them behind me, is they have lots and lots of moving parts.
Chris Lewicki:
And all those moving parts have motors. The motors we used at the time were brushed motors, and those brushes are things that could break. We put them through a lot of rigorous testing, and we wanted to make sure that all the brushes still worked before we sent them to Mars. And there was really only one good way to do that.
Chris Lewicki:
And the way to do that is to get a set of test leads, plug them into the right spot in the rover, and give it a bunch of juice. You'd watch what happened as the motor spun up. If the motor was healthy and everything was in working order, you'd see a real nice, clean profile as it took that energy and converted it into rotation. And if that worked, you knew you had a clean, healthy motor. If it was anything other than that, it was something to look into.
Chris Lewicki:
The dangerous part about that, of course, is we didn't exactly design the rovers to be tested in this way. Someone thought about this in the course of doing things, like, oh, we should do this. So we had to essentially write, I don't know, 50 custom little instructions: which connector, which pins, how do you take it apart, how do you put it back together? When you pull a connector apart, is it the left side or the right side?
Chris Lewicki:
Because there's 10,000 options. So you had to look at all that. My job was to write those instructions and, with one other person, go out and do those tests. So I was probably 15 or 20 tests into 100, on the second shift of a workday, probably getting into, I don't know, 15, 16, 17 hours that day.
Mark Graban:
Shipped out the next day? The time pressure.
Chris Lewicki:
Day, it was the next week or two. But, you know, you've got to fit all these different things in. And the moment of opportunity was they were going to. We were testing spirit in this case, which was the first one to launch. And they were doing a bunch of software tests on the part of Spirit.
Chris Lewicki:
And we were going to do these motor tests on the robotic arm, which we could detach while they were doing the software tests. So I set everything up, and one of the things that I needed for the test was a voltmeter, what people call a multimeter. One of the people on the floor, John, said, oh, there's one right there, grab that one. It was plugged into some stuff.
Chris Lewicki:
So I carefully removed it, checked everything out for the test, and it looked good. Everything checks out. Now we're going to actually do the test and push the button. We looked at the plot, and it was kind of like, oh, that doesn't look right. And I don't know, time dilation or out-of-body experiences or whatever it was, but it didn't feel like it took long at all for me to realize we had just tested the wrong direction.
Chris Lewicki:
We should have tested right; we tested left. So, being well trained in working in the clean room at the time, the first thing I did was tell Leo, my friend who was doing the software testing, like, hey, we had a little bit of a problem. We did the test on the wrong interface. And Leo's like, oh, yeah, well, we just lost telemetry a little bit ago. And I'm like, oh my God.
Chris Lewicki:
You know, you never want to hear that. It's never good to lose telemetry with robots.
Mark Graban:
Yeah.
Chris Lewicki:
You know, unless you were intending to do that.
Mark Graban:
Right.
Chris Lewicki:
So at this time, John, the guy who told me which multimeter to take, had caught it. Everyone's listening in on the headsets that everyone's wearing, and he started unleashing a string of profanities that was a master course in swearing. And we did an emergency shutdown of everything and kind of cleared out and had to figure out, well, what the heck just happened.
Mark Graban:
Right.
Chris Lewicki:
So I am shell-shocked at this point, right? Because, okay, I did a bad thing. Something that didn't look like it was supposed to happen, happened. We're looking at everything, and as people continue to triage the situation, someone tells me, well, write down everything that you can remember.
Mark Graban:
Yeah.
Chris Lewicki:
That led to this. You know, this is failure investigation 101.
Mark Graban:
Yeah.
Chris Lewicki:
And this is the point where like I am starting to well up in tears and everyone else is starting to move away from the scene of the crime.
Mark Graban:
Yeah.
Chris Lewicki:
And really I'm feeling pretty bad.
Mark Graban:
Yeah, I can't imagine.
Chris Lewicki:
We're kind of looking through this, and I can't remember exactly the order of events, but the people who were familiar with the details of the design and exactly what happened were like, okay, well, what you did wasn't good, but we can't figure out how you would have lost telemetry from it. It shouldn't have done that. So clearly we don't understand what happened.
Mark Graban:
Yeah.
Chris Lewicki:
And you know, when your computer has a hiccup, oftentimes it's like, oh, well, turn it off, turn it back on. It works for half-billion-dollar rovers, too. It's like, well, let's just turn it back on, and maybe it was a glitch or something.
Mark Graban:
Were probably not going to make it worse by doing that.
Chris Lewicki:
Well, the best way to figure out what happened is, like, maybe it was some glitch. We were in the process of building the thing, so it's not quite completely checked out and it's not completely assembled. So we turned it back on, and there's a little blinking light that is kind of what we call the heartbeat of the rover. It means that it's gotten through all the processes and everything's running appropriately. It takes a little bit to get to that blinking light.
Chris Lewicki:
And when it should have started blinking, it didn't. We waited a little longer. It didn't blink. There was just no telemetry, no nothing. It wasn't doing the rover things it was supposed to do.
Chris Lewicki:
And it's like, okay, that's definitely not good. So this is the point at which, okay, we broke it, and I'm the one that broke it. They're calling in managers at 11:30 at night, waking people up. And at this point, and this gets into a little bit of the lessons learned, a gentleman named Ernie bravely walked over to where I was, consoling me. He kind of puts his arm around me and pats me on the back, and he's like, remember this feeling the next time you have to sign on the dotted line for something.
Mark Graban:
Yeah.
Chris Lewicki:
You know, now it was maybe an hour later. I'd never felt that bad, and now I was at a new depth of just absolutely feeling like this was the worst mistake of my life.
Mark Graban:
Yeah.
Chris Lewicki:
So I went home, probably 1 or 2 a.m. at this point, kind of telling my wife about the experience. And my sense was, this isn't the kind of thing you keep your job after. Hundreds of millions of dollars of government asset, and it's broken. I'm going to go into work the next morning and get fired. So I did that.
Chris Lewicki:
Went into work the next morning after a bad night's sleep.
Mark Graban:
Yeah.
Chris Lewicki:
Also, I mean, it wasn't just that I had done this horrible thing. I'd been working on this project for a couple of years at this point. I was really excited about sending two rovers to Mars, right? And now, you know, we're probably going to send one rover to Mars.
Mark Graban:
There's that disappointment regardless of the cause.
Chris Lewicki:
Yeah. The tragic thing is, like, this will be remembered in history as a two-rover mission, and then, because of some idiot's mistake, it becomes a one-rover mission.
Mark Graban:
Yeah.
Chris Lewicki:
So the next morning I came into work, and we're still trying to figure out how to solve the problem, being NASA and a bunch of engineers.
Mark Graban:
Yeah.
Chris Lewicki:
And we were able to piece it together. It's like, okay, well, remember that multimeter I told you about, the one we were using in the test? Tell me more about that. Where did you get it? I got it from over there. Oh, well, you shouldn't have removed that, because that's actually needed for the test configuration.
Chris Lewicki:
And let's try powering it back up again, but we'll put the multimeter back in place, because the multimeter actually was measuring something related to the telemetry. So we plugged it back in, closed that circuit, powered it back up, and the blinky light starts blinking and glowing. And to that point in my life, it was the greatest relief I had ever experienced.
Mark Graban:
Yeah.
Chris Lewicki:
And, you know, I did not kill the rover that day.
Mark Graban:
Yeah.
Chris Lewicki:
Wow.
Mark Graban:
Oh, go ahead.
Chris Lewicki:
Well, no, there are probably more questions you can ask me at this point, but that definitely was the greatest mistake of my life, and I have drawn several lessons from it.
Mark Graban:
Yeah. And I definitely want to explore that. I do have questions. In layman's terms, I'm not an electrical engineer, not a rocket scientist, but when you say that it was being tested in the wrong direction, was it a matter of literally connecting a cable in the reverse direction, like trying to jumpstart a car and connecting to the wrong polarity terminal? Or was there a little more to it?
Chris Lewicki:
Yeah, well, it's kind of like you've got a connector, and part of that connector goes to the motor, and part of that connector goes to the battery or the computer or whatever it might be. And I picked the part that went into the computer instead of the part that went into the motor. One is designed to take a bunch of energy, and one is designed to give a bunch of energy.
Mark Graban:
Flowing the wrong direction.
Chris Lewicki:
Yeah. And in electrical engineering terms, I'll tell you a secret: all electrical circuits work on magic blue smoke, and the key is to never release the magic blue smoke, or the electronics will stop functioning. So if you've ever burned a piece of electronics and you can smell it, that's why it stopped working.
Mark Graban:
Yeah, yeah. Not a good smell, not a normal smell. I mean, I'd love to kind of talk through some of the other factors involved in this. I mean, you were doing such new, innovative work. I think it's a fascinating detail that it wasn't designed to be tested in this way.
Mark Graban:
I don't know if it's fair to call that a design mistake or just something. Okay, well, that was learned. And maybe in the future you might design it to be tested.
Chris Lewicki:
Well, yeah, you're actually hitting on a fundamental engineering design principle that this is an excellent example of, and it's called design for test. If you can't get at the thing that you need to make it work, well, then you have to do ad hoc, dangerous things like I had to do. And when we designed this, we knew, one, we were running on a really short schedule. We had three years from when the previous landers had crashed to when we had to make the next launch window, or you're talking about hundreds of millions of dollars of cost and delay, because you'd have 26 more months before you could fly again.
Mark Graban:
Waiting for a time when Mars will be...
Chris Lewicki:
Right, because you're waiting for the planets to align again.
Mark Graban:
Yeah.
Chris Lewicki:
And that happens every 26 months with going to Mars.
Mark Graban:
So our simplistic models from elementary school, it's not a bunch of concentric circles around the sun.
Chris Lewicki:
Yeah. I mean, it's kind of akin to, we put a hood on a car with a latch because we know we're going to need to get to that engine all the time. And we put a dipstick into the oil pan because measuring the level of oil is important. But you can imagine, what if we designed a car without those two things? Cars would break all the time because we can't get at the parts.
Mark Graban:
I mean, there's design for manufacturability.
Chris Lewicki:
Yeah.
Mark Graban:
Which is more of the realm I've worked in. Or you could think of, like, design for service: the fact that the dealer can fold down a little flap and plug in a diagnostic computer to access data. That's all designed in.
Chris Lewicki:
And in this particular case, it was one of those things where you're designing one of the most complex pieces of electromechanical machinery that humanity has ever made, and you realize that you need to test something two years into the design process, and you don't have time to redesign it.
Mark Graban:
Right.
Chris Lewicki:
So that was part of why I got the job: I was very familiar with all that stuff.
Mark Graban:
Yeah.
Chris Lewicki:
But, you know, checks and balances and, I would say, levels of consciousness factored into it. Another thing: even hard-charging, dream-chasing twenty-somethings have their mental health and physical limits, and I was almost certainly past them that night. And like many things, there are multiple things that lead to a problem happening. It turned out, we found in the testing, that we forgot to disconnect our test motor when we did the real test. So it actually looked like it was okay when we checked it out, because we didn't have things hooked up right.
Chris Lewicki:
So that was a procedure step that was missed. The people who were working with me didn't have access to double check the details that I had. So, like, all kinds of things that could have been done better, but it only takes one thing to mess it all up.
Mark Graban:
Yeah. And then you were describing, Chris, you wrote the procedure. Did you forget to put that step in the procedure or forget to execute that step?
Chris Lewicki:
Well, I think in this case, there was just so much going on. The ability to plug things in two different ways was something that was easy to do, and that was not in the procedure.
Mark Graban:
Yeah, yeah. So we can think of design and mistake-proofing when something's being produced at volume, especially cars, or it's...
Chris Lewicki:
Commonly called idiot-proofing.
Mark Graban:
Well, but see, you're being self-deprecating or humble. I always cringe when I hear the word idiot. I mean, clearly, you're not an idiot. And the people working in other fields who make mistakes aren't idiots. So that's why I'm trying to climb up on a soapbox here: mistake-proofing over idiot-proofing.
Mark Graban:
And, you know, even more in aviation, not aerospace, with the Boeing bolts that were apparently not reattached to the door plug: I read some commentator who was trying to be snarky or whatever, like, oh, I don't want to die in a crash because some idiot didn't tighten the bolts. That's kind of a human tendency, to want to blame somebody instead of looking at factors like fatigue, time pressure, a long day, et cetera. You know, to be fair to you.
Chris Lewicki:
Yeah. I mean, it all gets into, you know, the relationship between, you know, people and expertise and skills and process and procedure.
Mark Graban:
Yeah.
Chris Lewicki:
And, you know, we have process and procedure and checklists to anticipate and to show the right way to do things. And then the mistake-proofing kind of comes in from experience, like in the aviation industry and in spacecraft as well. I'm trying to remember the formal term for it, but it's safety wire on bolts, where in these really high-end things you've got two bolts next to each other with a piece of wire between the two of them. So there's no way that the screw can back out, because if that screw backs out, it'll actually tighten the one that's next to it.
Chris Lewicki:
And anyone walking by can look at it and see if it's a little bit off. Or, if you've ever seen a semi or a bus, they have those weird plastic tabs with the little arrows pointing, and they're all beautifully lined up, all pointing in the right direction. That's a way to make it really clear that all those bolts are tight, because none of them have loosened up. We build lots of airplanes. We drive lots of semis.
Chris Lewicki:
So we've found production ways to address these problems. We don't yet, in most places build lots of spacecraft, so we make it up as we go along.
Mark Graban:
Yeah, yeah. On the frontiers of space and the frontiers of innovation, that's a different situation than building hundreds of cars a day, you know? The innovation versus production. But one other question I wanted to ask about the aftermath: you had that stressful night, and you thought they were going to fire you.
Mark Graban:
Did you ever have a conversation with somebody, like, would they have fired you? Did you ever have kind of a mentoring conversation with Ernie or somebody? Or, okay, what would have happened if it hadn't been brought back to life?
Chris Lewicki:
Yeah, well, that wasn't actually discussed.
Mark Graban:
Okay.
Chris Lewicki:
My sense now is I probably would have kept my job, and, I don't know, either had fewer or more opportunities as a result. But the real lesson, the core lesson I took away from the entire experience: the manager of the project, and a lot of the other technical people who were involved in the design of the things I was testing, were worried about the rover still not doing what it needed to do, because we needed to test all these things. And there was a key question of, should we continue to do this really dangerous test? The ultimate decision was, yes, we definitely should continue to do this dangerous test, because the mission might not work if one of these things is broken. And the master stroke of the project manager was, okay, we're going to do these tests, and Chris will continue to do them, because we have paid for his education and he will never make this mistake again.
Chris Lewicki:
And I was just shocked. I've learned since that there are stories about IBM that are very similar to this. Maybe this is where my manager learned it from.
Mark Graban:
Yeah, I think it's supposedly about founder Thomas Watson.
Chris Lewicki:
Oh, yeah, yeah, yeah.
Mark Graban:
From IBM. Yeah.
Chris Lewicki:
But, you know, I have lived that experience now, and I know the brilliance of that decision: there is indeed no better person on the planet to make sure that that mistake never happened again. Because of those checks and balances that we should have had, that I didn't realize when I drafted the procedures, if I was going to run it 60 more times, I'm definitely going to make sure that it is the most perfect, beautiful, thorough, double-checked, absolutely correct procedure that's ever been written. And every time I ran it from that moment on, I'd get sick to my stomach, just cringing, like, did we get it right this time? And I got it right every time since then.
Chris Lewicki:
And I think the next sigh of relief, I believe, was when Spirit was on the surface of Mars and it reached out its robotic arm and needed to spin up that motor to grind a rock, and it worked. So it was all worth it.
Mark Graban:
Yeah, I imagine between the size and being able to launch it and everything, there probably wasn't an opportunity to build in a lot of redundancy. I mean, jet airliners used to have four engines. Now they generally only have two, because the reliability is better. But they're not relying on one, and they can fly a long distance if one were to go out.
Mark Graban:
I imagine that type of redundancy just wasn't possible in this kind of application.
Chris Lewicki:
Well, it's hard to design redundant actuators for joints and things. Redundancy is like, if one of your arms is not able to be used, you have another arm, and the same with eyeballs and legs and things. But you only have one heart. We didn't figure out how to make the heart failure-tolerant, so it's just very, very reliable, and we protect it well. But that's also kind of the philosophy of these things: making things redundant makes them more reliable, but it makes them less capable.
Chris Lewicki:
If you're working with a constrained set of resources, it's kind of the difference between maybe a gymnast and a football player. The gymnast might be able to do a lot more different things, but the football player can do one thing extremely well. They have different muscle mass, of course. So with spacecraft and robots and things alike, you have to figure out the best way to manage that risk. And in the case of these rovers, the way they managed it was by building two.
Chris Lewicki:
So if one didn't work, you've got another one.
Mark Graban:
Yeah. You have written this story, and this is a rare instance for the listener. I knew the story because I had read it, and that's why I reached out to Chris. I often don't know the story that a guest is gonna tell, but I've heard it in a different way here that prompts different questions than I might have planned for. You said earlier it took you a while to be able to tell the story.
Mark Graban:
Do you remember about how long? I mean, was it years? A decade?
Chris Lewicki:
Probably only a couple of years. I think it's the type of thing where it was a personal failure, but, I guess, the thing that we were doing didn't fail. So there was a degree of comfort after these two rovers had been wildly successful, and it was one of those war stories you could talk about.
Chris Lewicki:
But I think, for me, everyone is on their own path of experience. One of the other things I learned is that every person who has a responsibility like that really feels, there but for the grace of God go I. So while I had expected a lot of people to lash out at me accusatorially, or saying shoulda, woulda, coulda...
Mark Graban:
Yeah.
Chris Lewicki:
No one really did, except for a brief moment of profanity, which was probably deserved in the moment. In the moment, it was deserved, right?
Mark Graban:
It's a natural human response, even if it wasn't deserved.
Chris Lewicki:
But what I experienced was support. And I think the real thing to recognize, and this is something the chief engineer of JPL now, my friend Rob Manning, has talked about, is that there are thousands of mistakes made in the process of building these things, but it's the way that we have learned how to build and design them that allows us to work through those mistakes happening anyway. And realizing that the best thing you can do... Another friend, Lindy Elkins-Tanton, who is the principal investigator of the Psyche mission, on its way to a metal asteroid right now, once said in a NASA review that I was in, the best news is bad news delivered early enough to fix it. That's true.
Chris Lewicki:
Like, bad news doesn't get better with age. I was holding the smoking gun in the clean room when I had just probably wrecked the rover, but I immediately had to tell someone. Hiding it wasn't going to happen. I can't hide it.
Chris Lewicki:
No one's gonna benefit by hiding it. The bad feeling is not gonna get any better. So the best thing you can do when you know that there's a problem is to tell someone else about it.
Mark Graban:
Yeah.
Chris Lewicki:
Because everyone is gonna want to help you solve that problem as soon as you can.
Mark Graban:
Yeah. And it's great to hear that you did speak up about it. There are times, especially when there's fear of losing your job or worse, that people might choose not to, or to deflect, or to come up with some other plausible cause that was not related to something they did.
Mark Graban:
Maybe it was your sense of mission, pun intended, but the sense of commitment to what you were trying to accomplish, that made it necessary to speak up, even if it wasn't easy.
Chris Lewicki:
Well, I think it probably landed in its current form when I was CEO of Planetary Resources and these types of problems would happen in the team. People would come into my office nervous, sometimes crying. But my primary concern was, was anyone hurt?
Mark Graban:
Yeah.
Chris Lewicki:
Like, okay, no one's hurt. Oh, yeah. This is a solvable problem then.
Mark Graban:
Yeah.
Chris Lewicki:
So, you know, it's the Jeff Bezos idea of reversible and irreversible doors, and decisions or problems. It's just a machine. It can be fixed, and everybody is going to make mistakes. Right? It is the most human thing about us. We strive for perfection and rarely achieve it.
Mark Graban:
Right, right. And that lesson is going to continue serving you in other leadership roles, or as you're coaching and mentoring others. And I hope it serves anybody listening who's doing any sort of technical work. I mean, there are so many stories from Silicon Valley, like the person who accidentally deleted the files for one of the Toy Story sequels, I think.
Mark Graban:
And there was a backup to be found somewhere, thankfully. Or somebody deleting the production database or stories like this.
Chris Lewicki:
Yeah. When I published the story, there were lots of IT professionals who had a lot of disaster stories of their own to share, which I could imagine felt every bit as bad as my experience.
Mark Graban:
Yeah, yeah. So, in the piece, and I'll link to it in the show notes, I invite people to go and read it. And I thought it was interesting, Chris: you invited people to share their stories. But I think maybe this was from our conversation earlier, not from the article. You started asking people about failure stories during job interviews.
Chris Lewicki:
Yes, absolutely.
Mark Graban:
Can you tell us more about that?
Chris Lewicki:
So this was a process where, you know, you can only know so much about someone from their resume, and referrals are great, of course, to kind of let you know the character of people. But in our application at Planetary Resources, we'd have a brief question like: everyone who is doing anything challenging fails at something. What's something that you failed at, and what did you learn as a result? For us, it was really a way to match on culture. We wanted people who had tried hard things, but we also wanted people who were humble enough to realize that things don't always go like you would expect, and who were willing to learn from that. And what we often found is there was a type of person who had some failure, and their primary description of it was: it wasn't my fault. It was everything around me and this and that and the other thing, and I had no part in it. Which I would say sometimes was true, but maybe not the type of failure story that we're looking for. The other part was where people would see the wisdom of, I should have known, or I shouldn't have done it that way, or I should have asked for help.
Chris Lewicki:
You know, there's some introspection, and those people are just kind of on that continuous lifelong learning journey, where it's risk and reward. And people often say it's not a failure if you've learned from it, right? Just a lesson.
Mark Graban:
Yeah, lots of great variations on that quote. Or, to your point that we're all human, I've heard a variation along the lines of: making mistakes is not a choice; learning from them is. The only real failure is failing to learn from the mistake.
Chris Lewicki:
Yeah. The interesting thing about it, and maybe this is the rite of passage for everyone, is you can't get anywhere near the same emotional experience from reading about these things as from having these horrifying experiences yourself. The most surprising corner of feedback that I got after I published my article was from a team of doctors at a hospital. And of course, you want them to be perfect.
Chris Lewicki:
You want them to never make a mistake. But humans are humans wherever they do things, and in that profession, you absolutely want them learning from their mistakes and approaching things with humility. So whether it's half-billion-dollar rovers or open-heart surgery, the same elements are at play.
Mark Graban:
Yeah. I think back to the question of building the first of something, innovation versus production. I think there are different circumstances. Amy Edmondson from Harvard, her great book Right Kind of Wrong explores a lot of this, and I'll credit her for this idea. The first time someone ever did a heart transplant, surgery like that is on the frontier of learning and innovation.
Mark Graban:
And there could be something they didn't anticipate, along the lines of what you hadn't anticipated with that rover. As opposed to, I'm going to use knee replacement surgery, which has got to be pretty darn routine, or laser eye surgery. That's almost more of a production process, where, you know, you shouldn't cut into the wrong knee. So there are certain mistakes where we'd say, well, it's innovation, we need to learn from it. And then there are certain mistakes where you'd say, gosh, we should be preventing that. But if it happens, let's learn from it instead of being punitive. That's me getting on my soapbox again.
Chris Lewicki:
A little bit, I think. You know, my life is now surrounded by commercial space, and there are certainly a lot of different and exciting things going on compared to any other time in history. And there is a different type of mistake that actually relates to perfectionism. The reason why things take very long, get really expensive, and end up with antiquated technology in them is because we were striving for the perfect. In that case, the consequence is it took too long, it cost too much money, and it didn't deliver results nearly as soon as it could, because you were unwilling to take any risk at all.
Mark Graban:
Yeah. Or you might miss the market opportunity altogether. Someone else launches before you.
Chris Lewicki:
Yeah.
Mark Graban:
With software, maybe not launch, but ship before you.
Chris Lewicki:
There's a spectrum of risk-taking: how do you take the thing you're trying to do and protect your downside in making mistakes? And maybe the risk of being stuck in conference rooms and having meetings for too long is... okay, well, what's the earliest we can try something to see if it works or doesn't? We'll learn more from trying to make this robot work that day than we ever would on a phone call guessing about it. So that's the type of thing where you want a culture of calculated bets and calculated risks. And again, I'll credit Elon Musk and SpaceX. They're the best example in history on this particular topic, of having rockets exploding in mid-flight and being super excited about it, because it was such a fast and inexpensive way to learn things.
Mark Graban:
Yeah.
Chris Lewicki:
And it's a little bit harder when you're landing on Mars, just because the iteration cycles are longer. Optimizing that iteration cycle against the risk that you're taking is key.
Mark Graban:
Yeah. Because there's testing. I watched a documentary recently, on Netflix, I think, about the Hubble Space Telescope, and they were talking about the different things that could have gone wrong, from the unfolding of the mirrors... no, the James Webb Space Telescope.
Mark Graban:
About the Webb telescope, sorry. Yeah, my mistake. That's not a $500 million mistake. That's not even...
Chris Lewicki:
That's a $10 billion mistake.
Mark Graban:
I should have just said the documentary about space telescopes. But there were things they were concerned about, around, I think, the solar shield, I'm going to call it, like, the tent-type folding, or the vibration of launch, or landing on the moon when it's a hard landing. I imagine there are certain things that you just couldn't test, because it might be destructive. So you're weighing the odds.
Mark Graban:
Back to your other question, even with your testing.
Chris Lewicki:
Your test, this is another thing that, you know, I mentioned. Well, no, actually, I didn't mention this. It was a different call today. There was a lander that failed to land on Mars in 1998 called Mars Polar Lander. And what happened was they deployed the legs right before they're getting ready to touch down, and there's a touchdown sensor in the leg, so it knows when to turn off the engines.
Chris Lewicki:
And when they deployed that touchdown sensor, the touchdown sensor bounced, like, in midair, because you're deploying this thing and it kicks. But the problem was it was 600 feet up in the air when it bounced and got the signal to turn off the thrusters. And then it fell 600 feet and crumpled onto the surface of Mars.
Mark Graban:
Wow.
Chris Lewicki:
And what NASA learned from that particular failure was the idea of testing in the same way that you plan on flying the mission and, conversely, flying the mission consistent with how you tested it. That ideal is anticipating everything that's going to happen and doing your best to emulate how it's actually going to happen. Sometimes, as you mentioned, you can't do that. You can't deploy a tennis-court-sized thermal shield in microgravity without actually doing it in space. But with this lander leg, the problem was they never deployed the lander leg at the same time that they were running the sequences in the software to pretend it was landing, because they didn't want to use up the leg, so to speak, by releasing it too many times. But now, since that failure, NASA very much says: deploy everything, move all the things, do every critical thing that you can do, because you want your software and your hardware to experience as close to the final reality as possible.
Mark Graban:
And designed so it can be deployed more than once.
Chris Lewicki:
As a criterion, yes. This can apply to deploying websites and live events and all those types of things. If you're going to have 50 million people clicking on your web page, you should try to simulate 50 million people clicking on your web page, and see what happens when people are putting credit card charges through at that rate, anticipating the different things that can happen and stress-testing your system. What happens when the soccer game happens at a different time of day, and you're doing the server backup process while the soccer game is happening and the servers are overloaded? So it's forecasting all those things that might actually happen in the real world that don't happen in your development environment.
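To make Chris's stress-testing point concrete, here is a minimal load-test sketch in Python. The URL, request count, and worker count are placeholders, not anything from the episode: it concurrently fires requests at a page and reports failures and latency percentiles, the same "simulate the traffic before it happens" idea at laptop scale.

```python
# Minimal load-test sketch (hypothetical target; tune the constants to taste).
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://example.com/"  # placeholder page under test
N_REQUESTS = 200              # scale toward your real traffic forecast
N_WORKERS = 20                # concurrent simulated users

def one_request(_):
    """Fetch the page once; return (succeeded, elapsed_seconds)."""
    start = time.perf_counter()
    try:
        with urlopen(URL, timeout=10) as resp:
            resp.read()
            ok = 200 <= resp.status < 300
    except OSError:  # covers URLError, HTTPError, timeouts
        ok = False
    return ok, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=N_WORKERS) as pool:
    results = list(pool.map(one_request, range(N_REQUESTS)))

latencies = sorted(t for ok, t in results if ok)
failures = sum(1 for ok, _ in results if not ok)
print(f"failures: {failures}/{N_REQUESTS}")
if latencies:
    print(f"median latency: {statistics.median(latencies) * 1000:.0f} ms")
    print(f"p95 latency:    {latencies[int(0.95 * (len(latencies) - 1))] * 1000:.0f} ms")
```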
Mark Graban:
Well, again, we've been joined today by Chris Lewicki. His website is chrislewicki.com; I'll link to it in the show notes, and I'll link to that article. A couple quick questions here. I'll get this correct.
Mark Graban:
I won't make a mistake on this. Another term you use to describe yourself: near-futurist. What does that term mean?
Chris Lewicki:
It's the future that you can create by doing something this afternoon. I don't like to think 50 years into the future. I like to think, what am I working on right now that's going to happen in five years? So that's the type of future I like to think about.
Mark Graban:
Yeah. And that mindset seems to be related to what you listed as a motto. I'm going to just read the first part and ask you to finish the sentence: stop reading about it and... start doing it. Seems like good advice for all of us, and there are a lot of great lessons here, even for me to keep in mind, though the work I'm doing is far lower stakes. There are people listening, I'm sure, doing high-stakes work, and there are lots of great lessons.
Mark Graban:
And thank you for being willing to share your story so candidly and with all of those reflections, Chris.
Chris Lewicki:
Yeah, it's been an absolute privilege to do so, and I'm glad it all worked out in the end.
Mark Graban:
So am I. Yeah, thanks. Thanks again for being here, Chris.
Chris Lewicki:
All right, bye.
Episode Summary and More
Unprecedented Successes and Learnings in the Modern Space Industry
Chris Lewicki: A Profile in Space Entrepreneurship
Chris Lewicki's journey in the aerospace field combines expertise in engineering with pioneering entrepreneurial ventures. His early contributions as a key member of NASA's Mars Exploration Rovers positioned him as a formidable figure within the space community. Lewicki's involvement with the Mars rovers Spirit and Opportunity, as well as the Phoenix Mars lander, earned him recognition and respect, reflected in two NASA Exceptional Achievement Medals and an asteroid named in his honor: 13609 Lewicki.
Lewicki's academic pursuits culminated in earning bachelor's and master's degrees in aerospace engineering from the University of Arizona, laying the foundation for his subsequent ventures. His transition from the public sector to entrepreneurship saw him co-found Planetary Resources, Inc. (PRI), an organization dedicated to tapping into the wealth of resources found on near-Earth asteroids. PRI represented a significant step towards shaping a robust space economy, a sector Lewicki is unequivocally passionate about.
Astounding Advances: Private Sector Contributions to Space Exploration
The paradigm of space exploration is undergoing a seismic shift; private companies are emerging as key players, bridging the gap between governmental programs and commercial endeavors. Chris Lewicki's influence extends beyond the operations of single entities. As a committed industry advocate, he has aligned himself with accelerators like the Creative Destruction Lab, where he mentors space-related startups and learns along with other industry stalwarts, including ISS Commander Chris Hadfield.
His involvement with the XPRIZE Foundation alongside Anousheh Ansari highlights the convergence of innovative thought and philanthropy aimed at addressing critical challenges such as space debris. By facilitating the inception of new prizes, Lewicki contributes to inspiring solutions that could have a lasting impact on space advancement and sustainability. Constantly aiding founders and contributing strategic insight, he embodies the archetype of the space industrialist, driving the creation of infrastructure essential for spacefaring communities.
The New Horizons: Lunar Landings and the Importance of Risk-Taking
The resumption of lunar exploration missions by the US and other countries indicates significant interest and increased activity toward the Moon. This lunar renaissance, including both triumphs and setbacks, is a testament to the bold direction being taken by the space industry. With recent missions conducted by entities such as the Japanese space agency and Intuitive Machines, the challenges experienced underscore the necessity of iterative design and testing, and of embracing the reality that, in space exploration, not every outcome is predictable.
The willingness to innovate and accept calculated risks mirrors the ethos of entrepreneurial ventures on Earth. As Lewicki indicates, this approach facilitates advancement at a rapid pace, at lower cost, and with novel technology implementations. The narrative of space exploration is defined not just by successful missions but also by those moments that offer opportunities for learning and improvement. The embrace of failures as stepping stones toward mastery is an indispensable philosophy in the quest to expand humanity's reach within the cosmos.
Embracing Failure: Lessons from a Near-Miss on Mars
In an industry where precision is non-negotiable, the margin for error is virtually non-existent. Yet, it is through the near-misses that valuable lessons are learned. Recounting a personal anecdote, Chris Lewicki reflects on a moment early in his career that almost compromised a $500 million Mars rover. The incident, concerning an incorrect testing procedure, resulted in a momentary loss of telemetry—an aerospace engineer's nightmare. Nonetheless, it provided critical insights that would continue to shape his approach to problem-solving and system design in the intricate dance of interplanetary robotics.
The process of identifying, understanding, and rectifying such mistakes serves as the embodiment of advancement through experimentation. As Lewicki acknowledges, accepting and articulating one's vulnerabilities and errors is part of the maturation process within the sector. It is a reminder that behind our most advanced machines are humans, fallible yet capable of astonishing feats when they harness the learnings from their missteps.
Navigating the Complexities of Spacecraft Engineering
Spacecraft engineering stands as one of humanity's most intricate and complex endeavors, a symphony of multidisciplinary collaboration where even the minutest detail can significantly impact a mission. Chris Lewicki's brush with potential disaster during a rover test is a poignant example of the multifaceted nature of spacecraft design. It mirrors the intricate considerations that go into ensuring functionality while also reinforcing an often overlooked notion: the necessity of designing for testability. The concept of design for test (DFT) underscores how essential it is to anticipate service requirements and testing needs during the initial design phases.
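As a software-flavored illustration of design for test, consider this hypothetical sketch (all names and numbers are invented, not from the rover): a motor driver that ships with its own self-test entry point can be checked out without the ad hoc, connector-pulling procedure Chris describes in the episode.

```python
# Hypothetical design-for-test sketch: the driver exposes a built-in health
# check, so a brushed-motor checkout never requires pulling connectors apart
# and guessing which of "10,000 options" is the right half.
from dataclasses import dataclass

@dataclass
class SpinProfile:
    currents: list  # motor current samples during spin-up (amps)

class MotorDriver:
    def __init__(self, channel: int):
        self.channel = channel

    def spin_up(self) -> SpinProfile:
        # Real hardware would sample an ammeter; here we synthesize a clean
        # profile: an inrush spike that settles smoothly as the motor spins.
        return SpinProfile(currents=[2.0 / (i + 1) for i in range(10)])

    def self_test(self) -> bool:
        """Designed-in test point: a healthy brushed motor shows a current
        profile that settles monotonically after the initial inrush."""
        c = self.spin_up().currents
        return c[0] > c[-1] and all(a >= b for a, b in zip(c, c[1:]))

print("motor healthy" if MotorDriver(channel=3).self_test() else "inspect motor")
```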
Creating a spacecraft that combines robustness with diagnostic access is a delicate balance, and it calls for an ingenious approach towards the build and maintenance life cycle. In the nascent days of space exploration, spacecraft were often made as single-use implements with little or no thought to in-space servicing. However, with increasing mission complexities and longer space explorations, the idea of servicing and upgrading spacecraft in orbit or on extraterrestrial surfaces becomes crucial, thus shaping the modern design philosophy.
Mistake-Proofing: A Critical Engineering Principle
The space industry, where each component could be vital for mission success, is continuously learning from aviation and other high-stakes engineering fields in implementing mistake-proofing strategies. Such tactics are vital to designing spacecraft systems that can endure the unabashed scrutiny of space. Techniques like safety wire or indicator tabs offer visual cues that can prevent catastrophic oversights—a nod to the industry's credence in proactive safety measures.
Mistake-proofing, or "poka-yoke" in Japanese, is an influential concept derived from the manufacturing domain, and its implementation in spacecraft engineering demonstrates an evolutionary step. Whether it's ensuring bolts remain tightened on a satellite or designing connectors that prohibit erroneous assembly, space engineers incorporate these and other forms of built-in safeguards to preclude human errors.
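In software terms, the same poka-yoke idea can be sketched as follows; the class and function names are invented for illustration. Giving each half of an interface its own type means the "wrong direction" hookup from Chris's story fails immediately and loudly, rather than quietly sending test current into the flight computer.

```python
# Hypothetical poka-yoke sketch: distinct types per connector half, so plugging
# the test leads into the computer side is rejected the moment it happens.
class MotorSide:
    """Connector half designed to receive test current."""

class ComputerSide:
    """Connector half that must never receive test current."""

def apply_test_current(target, amps):
    # Runtime guard; a statically typed language could reject this at compile time.
    if not isinstance(target, MotorSide):
        raise TypeError("test leads plugged into the wrong connector half")
    print(f"spinning motor with {amps} A of test current")

apply_test_current(MotorSide(), 1.5)       # correct hookup: runs
# apply_test_current(ComputerSide(), 1.5)  # wrong hookup: raises TypeError
```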
The Human Element in Space Exploration
Chris Lewicki's narrative reveals a deeper truth about the space industry: it is as much about human courage, resilience, and learning as it is about technological prowess. His experience points to the dualities of working on the bleeding edge of innovation—where the thrill of creation coexists with the specter of errors. His candid admission of pushing beyond mental and physical limits on the night of the near-miss incident reminds us that human factors such as fatigue and stress play significant roles in operational excellence.
Recognizing human fallibility within the context of space missions compels organizations to construct a culture of support, mentorship, and understanding. The role of leadership in leveraging mistakes as learning opportunities, as exemplified by Lewicki's manager, can be an organization's strongest asset. Creating a work environment that acknowledges the inevitability of errors while fostering a relentless pursuit of excellence is instrumental in driving innovation forward.
Building on Experience: The Future of Spacecraft Testing
The lessons learned from the Mars rover incident have ramifications beyond the immediate mishap. They serve as a blueprint for future designs, procedures, and operational protocols. Comprehensive design for testability, thorough mistake-proofing, and fostering human resilience are the key takeaways for refining future missions. Spacecraft engineers can now operate with greater awareness and precision, drawing from a repository of hard-earned knowledge.
As spacecraft become more autonomous and self-reliant, the possibility of in situ repairs and adjustments emphasizes the need for modular, adaptable components. These changes herald a new age in spacecraft engineering, where maintenance and diagnostics in the vacuum of space could become as routine as automotive servicing on Earth. With these evolutionary steps, human explorers and robotic proxies alike will extend our celestial reach while securing the reliability of the craft that take us beyond our blue planet.
Embracing Redundancy in Spacecraft Design
The mention of building two rovers as a contingency plan highlights the significance of redundancy in space missions. This practice of incorporating duplicate systems, also known as redundancy engineering, ensures that if one system fails, a backup can take over, allowing the mission to continue uninterrupted. The philosophy of redundancy is deeply embedded in spacecraft design because the harsh, unforgiving environment of space rarely permits repairs. Reliability is paramount, and redundancy is a key strategy to achieve it, despite the trade-offs in capability and resource allocation.
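In software terms, the same philosophy appears as failover logic: command the primary unit, and if it faults, switch to the backup so the mission continues. A simplified, hypothetical Python sketch:

```python
class UnitFailure(Exception):
    """Raised when a hardware unit stops responding."""

class RedundantPair:
    """Runs commands on a primary unit, failing over to a backup.

    Mirrors the two-rovers idea at component scale: one spare,
    automatically swapped in when the primary faults.
    """
    def __init__(self, primary, backup):
        self.units = [primary, backup]
        self.active = 0  # index of the unit currently in use

    def execute(self, command):
        while self.active < len(self.units):
            try:
                return self.units[self.active](command)
            except UnitFailure:
                self.active += 1  # fail over to the next unit
        raise UnitFailure("all redundant units exhausted")

# Example: the primary radio faults, and the backup carries the transmission.
def primary_radio(msg):
    raise UnitFailure("primary transmitter fault")

def backup_radio(msg):
    return f"transmitted via backup: {msg}"

radios = RedundantPair(primary_radio, backup_radio)
print(radios.execute("telemetry packet 42"))
```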
Learning From Failures: A Cultural Imperative
The humility to admit a mistake and the immediate impulse to share the bad news signal a culture where learning is more valued than maintaining appearances. This is especially relevant in fields like spacecraft engineering, where the stakes are high and the complexity of systems is enormous. The response to the near-failure discussed in the transcribed conversation suggests an organizational culture that supports staff in the event of mistakes, understanding that this approach ultimately leads to better outcomes. By sharing failure stories, particularly during job interviews, organizations can build teams that are resilient, adaptive, and committed to continuous learning and improvement.
The Art of Balancing Perfectionism and Pragmatism
Perfectionism in spacecraft engineering must be counterbalanced by pragmatism to avoid spiraling costs and outdated technology. There is a fine line between striving for near-perfection to avoid catastrophic failures and the need for timely innovation and responsiveness to the mission's objectives. Critical to navigating this balance is the acceptance and strategic use of calculated risks. SpaceX's embrace of explosive failures as learning opportunities illustrates an extreme yet effective approach to innovation, where rapid iteration and learning from actual test outcomes outweigh speculative planning and exhaustive risk aversion.
Lessons for the Broader Engineering Community
The experiences within the space engineering sector offer invaluable lessons for broader engineering disciplines and industries. Whether it's medical professionals learning from errors, IT teams averting catastrophes, or space telescopes like James Webb testing the limits of engineering in space, there is a common thread: the iterative process of testing, learning, adjusting, and improving. As engineers and organizations devise systems and protocols that can weather mistakes and adapt to new information, humanity continues to advance its technological capabilities and push the boundaries of the known universe.
Simulating Real-World Scenarios to Ensure Robust System Performance
Understanding the variety of real-world conditions that a system might encounter is crucial in both space exploration and other high-stakes fields like web development and live event broadcasting. For instance, the practice of simulating an extreme number of users simultaneously accessing a website can uncover potential points of failure that would not be evident under normal testing conditions. This process of “stress testing” involves pushing a system to its limits to verify its resilience and capacity.
- Preemptive Load Testing: By simulating heavy traffic, engineers can identify bottlenecks and optimize system performance.
- Transaction Testing: Intensive assessment of payment processing under high-volume conditions ensures reliability and security during critical operations.
- Scheduled Maintenance vs. Peak Times: Testing how scheduled processes, like server backups, interact with high-traffic events can prevent system overloads.
By planning for the unexpected, engineers can better anticipate challenges, leading to more effective contingency strategies and robust systems that maintain functionality in the face of unforeseen events.
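To make the load-testing idea concrete, here is a toy Python harness (the request handler, user counts, and latencies are all invented for illustration) that ramps up concurrent simulated users and reports the worst-case response time:

```python
import concurrent.futures
import random
import time

def handle_request(user_id: int) -> float:
    """Stand-in for a real endpoint; returns a simulated latency in seconds."""
    latency = random.uniform(0.01, 0.05)
    time.sleep(latency)
    return latency

def stress_test(concurrent_users: int) -> None:
    """Fire many simulated users at once and report worst-case latency."""
    start = time.time()
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = list(pool.map(handle_request, range(concurrent_users)))
    print(f"{concurrent_users} users | wall time {time.time() - start:.2f}s "
          f"| worst latency {max(latencies) * 1000:.0f} ms")

# Ramp the load to find where the system starts to buckle.
# (Hundreds of threads is fine for a toy; real harnesses use dedicated tools.)
for users in (10, 100, 500):
    stress_test(users)
```

A real harness would target an actual endpoint and track error rates as well as latency, but the shape is the same: increase the load until the system reveals where it buckles.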
The Near Futurist Perspective in Innovation
The term “near futurist” suggests a focus on tangible, short-term outcomes that stem from current actions. It encapsulates the belief that the future is shaped by what we do today, rather than by passively predicting outcomes that lie decades away. This forward-thinking vision drives the development of technologies and systems with a horizon that is within reach, perhaps five or so years into the future. By focusing on this achievable future, innovators can create concrete plans and work towards them with clarity and purpose.
- Action-oriented Planning: Prioritizing immediate steps that can quickly lead to innovative breakthroughs.
- Responsive Strategizing: Adapting to changes and new information to make relevant progress toward near-term goals.
Embodying the near futurist approach ensures that advancements are not only conceptual but also practical, leading to actionable projects that bring about real change in a foreseeable timeframe.
The “Do It Now” Mentality for Productivity and Progress
The advice to “stop reading about it and start doing it” is not just a call to action; it's a philosophy for breaking the cycle of inaction and moving toward tangible results. This proactive mentality emphasizes the transition from theoretical understanding to practical application. Whether working on low- or high-stakes projects, this mindset can accelerate learning and lead to successful outcomes.
- Emphasizing Execution: Translating knowledge into practice to gain experience and refine skills.
- Overcoming Procrastination: Encouraging immediate action to combat the paralysis of analysis.
- Iterative Learning: Adopting a hands-on approach that embraces trial and error as part of the improvement process.
By adopting a “do it now” attitude, individuals and organizations foster a culture of dynamism and progress, where ideas are swiftly transformed into reality, propelling both personal growth and industry evolution.