Oct. 16, 2025

#528 Error Isn’t the Enemy: Eckhard Jann on What Aviation Can Teach Startups About Failure and Growth

#528 Error Isn’t the Enemy: Eckhard Jann on What Aviation Can Teach Startups About Failure and Growth

In this episode of The CTO Show with Mehmet, host Mehmet Gonullu sits down with Eckhard Jann — former commercial pilot, safety manager, author of Error One, and host of the Error One podcast — to explore how lessons from aviation safety can transform leadership, culture, and decision-making in startups.

 

Drawing from 30 years in the cockpit and years of investigating human error, Eckhard unpacks why mistakes are inevitable but manageable, how psychological safety shapes resilient teams, and why “error culture” may be the missing ingredient in modern business leadership.

 

 

👤 About the Guest

Eckhard Jann is a business consultant, author, and former commercial pilot with three decades of aviation experience. His bestselling book Error One and his podcast of the same name bring the science and psychology of human error to a global audience. Eckhard’s mission: to help leaders, founders, and teams build systems that learn before they fail.

 

 

💡 Key Takeaways

• ✈️ Aviation’s secret: Every procedure, rule, and checklist was “written in blood” — mistakes are teachers, not threats.

• 🧠 Error chains: Big failures are never caused by one mistake — they result from small, ignored signals that compound over time.

• 🗣️ Culture over blame: Teams must be empowered to talk about errors without fear; silence is the real danger.

• 🤝 Psychological safety: Great leaders invite criticism, feedback, and correction from everyone — even the youngest team member.

• 🚀 Startup relevance: Just like in aviation, startups thrive when they treat missteps as learning loops, not career-ending moments.

 

 

🎓 What You’ll Learn

• The concept of “Error One” and how to identify early warning signs in teams and organizations

• How aviation built a resilient safety culture and what startups can borrow from it

• Why blame culture kills innovation

• How psychology and systems thinking can prevent failure before it happens

• The role of human creativity in an AI-driven world where automation can’t anticipate the unexpected

 

 

🕒 Episode Highlights (Timestamps)

00:00 – Introduction and Eckhard’s journey from pilot to author

03:00 – Why aviation learned safety “written in blood”

07:00 – The inevitability of human error and how we grow from it

11:00 – The iceberg analogy: visible accidents vs. hidden small mistakes

14:00 – Spotting early signals: empowering teams to speak up

18:00 – The psychology of fear and building error-safe cultures

23:00 – What startups can learn from cockpit teamwork

30:00 – Leadership humility and feedback loops

33:00 – SpaceX vs. Boeing: two mindsets on risk and failure

35:00 – Why AI can’t replace human creativity in crisis situations

41:00 – Inside the Error One podcast and its most powerful stories

46:00 – Final reflections: separating the error from the person

 

 

🔗 Resources Mentioned

• 📘 Error One by Eckhard Jannhttps://www.amazon.com/dp/B0DM6Z16GL?dplnkId=c43308fc-bc40-4317-b5af-eebb9c49a3ea&nodl=1

• 🎧 Error One Podcasthttps://open.spotify.com/show/09uOW1cp2kQ3Qx91kjQslG?si=4OIu_CbUS-qtdk_FkePsaA

• 🌐 http://www.errorone.net/ 

Traininghttps://aviationinvestigation.com/en/willkommen-beim-aviation-investigation-training-english/

• 🔗 Connect with Eckhard on LinkedIn: https://www.linkedin.com/in/eckhardjann

 

[00:00:00] 

Mehmet: Hello and welcome back to a new episode of the CTO Show with Mehmet today. I'm very pleased joining me from Germany, Eckhard Jann who is an expert in error. I will keep this Eckhard for you to explain. What is that? He's an [00:01:00] author, he's a podcaster also like me. But of course we're gonna talk a lot about, uh, topics, which are, in my opinion, are very, uh.

You know, interesting and should be discussed, and I think we tried with some guests to focus on, but again, I love to hear different, uh, opinions from different experts like Eckhard. And without further ado, Eckhard, my first traditional question to all my guests. Tell us a little more about you, your background, your journey, and what you are currently up to, and then we can start the discussion from there.

So the floor is yours. 

Eckhard: Thank you very much. I'm, uh, really appreciate. I'm, I'm excited to be on your show right now. Uh, thank you. So, you rightly said, I, um, uh, currently work in as a business consultant. An author, uh, published a book or wrote a book during the pandemic. But in my previous professional life, I was for 30 years a pilot, a commercial pilot, and worked as a safety manager.

So, from the very early stages of my career, I, I [00:02:00] was working with people. With colleagues and investigating occurrences and trying to understand why people may make mistakes. So this is my intrinsic motivation and, um, I'm, after I published the book, which was, uh, hugely and greatly, um, receptive here in Germany.

Which won a ba basically. Also, a, a, uh, prize is the best nonfiction book. It's now available in, uh, in the US as well. But I, what I want to say is that I am also a, um, uh, the, the podcast that I'm producing, era one and Germany's also extremely successful. And this is now available in, in, in the US as well, or in English speaking podcast as well.

But, um, what really motivates me is trying to convey. The point that we are at the end of the error chain. And this is what Mo motivates me and, um, this is why I call this concept the error one. And this [00:03:00] is what I would like to talk to you and why the error one is so important for any startup, for any business leadership, for, for basically anything that is important when trying to be su successful.

Mehmet: Great. And thank you again Kar, for being here with me today. I really appreciate, you know, the time, uh, that you, you know, speci, you know, gave to me today to, to have this great discussion. Now, something traditional and because you mentioned something about coming from, from the, uh, aviation background, both as a commercial pilot and, uh, safety manager.

Um, when, when we think aviation, you know, we think about. Safety and zero error, so. Mm-hmm. Um, like what really happens in, I know maybe it's a loaded question, but, you know, how, [00:04:00] how does you know, like this error management cultures look like, uh, in, in, in aviation sector and, you know, what lessons you think other industries can learn, uh, about, you know, safety and failure?

Eckhard: Oh, that's, that's, uh, uh, that's a great question. I could answer it quickly, but I can answer it, uh, at length because there's a long history about, uh, aviation learning from mistakes and learning from errors, and I always, um, keep saying that all. Procedures that we have all, all regulations, all rules that we have, have been written with blood because there were so many accidents on the beginning of, of aviation.

Uh, when I'm looking at 120 years ago when Charles Lindbergh was flying with his bird of Saint Louis, uh, of the Atlantic, there was no in-flight entertainment. There was no catering, there was nothing on it, on, on board. And, uh, when, uh, he barely made it, we had a very, very bad. Uh, [00:05:00] uh, accident rate at that time, but aviation learned from all of these accidents and it, it evolved over the decades into a, a highly.

Safe environment right now that we're, we're, uh, we don't need to be afraid to step into any airplane as you, we just, uh, before we started the show, you just said you'd step off the airplane now. Right? I knew. Yeah. You'll be arriving safe. I don't need to worry that, uh, you know, the only question was do we make it on time?

Yes or no? But that's basically it. It's also, we are so. We are right now talking about an ultrasafe, um, in, uh, industry that we are in because we all learn from all the errors and the all mistakes that we did. And what motivates me is that don't look at the end of the error chain when the, the, the plane crashed.

And there are reasons why. Even though people make mistakes, I do mistakes every day, but we have what we call resilient systems [00:06:00] where it's permissible. To make mistakes, but you have a team and you have a system, and you have procedures in place, which prevents the error chain from developing into an accident.

And this is, I will come to back to this, uh, which, which has been called in, in, in academia, the, um. Psychology of safety or the um, uh, safe environment in any company that you need, that you're require to have a team which is open to active and passive criticism, and this is the right way to deal with errors.

Mehmet: Great. Now let me ask you a little bit about the book. So because of all this experience you accumulated over the years, you said you decided to write the book Error one, and of course I was preparing and I did a little bit of my homework, right? And you say like. Everything begins for a reason [00:07:00] and it, it has something to do with psychology and you know, the impact of the mistake.

So why do you believe errors and failures are something that we cannot, you know, avoid all the time, yet they are necessary for foundations of success. 

Eckhard: Absolutely. Um, when you're looking back, let's go 2000 years, go back Sure. When the Romans said to air as human is human, but what has changed in the last 2000 years?

Nothing. And I promise you, even for the next 2000 years, humankind will not change. In this respect. We always will be making mistakes. We are even. We can even predict errors depending on the circumstances and the situation that we are are prone. And yet when [00:08:00] learning from mistakes, this is where we are progressing as humankind.

Within the industry, as I just said, in the, in the aviation, but also personally. And when you're, when I would ask you, well, do you have, for example, you have one of your listeners might, might face this situation. You have a complicated surgery, just, uh, you know, a couple of weeks ahead and you need to decide which hospital you want to go and which surgeon you want to be operated by.

Who would you approach? Would you go to someone who said, oh yeah, I, um, I've just graduated from, uh, medical school and I, I saw the surgery on YouTube so I know exactly what I'm supposed to do. Or do you want to have someone who's been expert has been done this operation, the surgery maybe a hundred, several hundred, maybe a thousand times already.

He knows exactly where it's been critical because. Maybe on some surgery. [00:09:00] Previously it was, you know, maybe not a hundred percent what he expected, but he learned from this. And you, we want to have experienced experts who know what they're doing. How, why are they experienced? Because they made mistakes.

They made. They learn from all the errors. Maybe they learn from other people, but this, the experience is the foundation to make the sound a right decision and safe decisions. 

Mehmet: You know, I, I don't ask these questions to challenge people yet. I ask these questions to get some. Better. I would say understanding.

So because someone might be listening and saying, Hey, I understand. I get what you're saying. We always do mistakes, whether in personal lives or professional lives, but sometimes these mistakes can be. [00:10:00] Unforgettable. Like maybe even they will, they will be destructive, right. So, mm-hmm. In your experience, what's the difference between a mistake that you know, might affect the trust or might destroy reputation versus a mistake that, okay, we can say.

Yes, this is, this is a mistake. Um, this is a failure. I don't know what you want to call it. And it's good for learning for the next time. So what is the line that separates, you know, these two kinds of mistakes or errors, whatever you want to call them. 

Eckhard: Mm-hmm. Um, excellent. Just consider that you are, um, uh, look at a, an iceberg.

When you're looking at the iceberg. There's always the saying. That's just the tip of the iceberg. That's the only thing that is just, just above the water surface and everything underneath is what is invisible. And this is very good, [00:11:00] comparable to accidents. And when I'm talking about errors, these could be minor errors, and the error does not necessarily have to evolve into a huge disaster, a catastrophe, an accident, an airplane disaster, for example.

But this is always the result. Of an error chain. And there are several errors in between that when they are piling up like dominoes, a row of dominoes that they will eventually, um. Develop into an accident, and the accident is, is the tip of the iceberg. But the mistakes, those errors, they happen more way more frequently than just a single airplane crash.

And this is what we have and what we face each and every day in all of our lives. But this is beneath the water level. This is just the, the lower part. Of the iceberg, and this is called the, um, accident [00:12:00] pyramid. Uh, it's, uh, been researched that, that before an accident happens, there are prob around 15 to 30, uh, significant occurrences, uh, then 300 minor occurrences and roughly around 10 to 15,000 observed work errors.

These all contain elements of the error chain that at the, at the. Worst timing. Most of the time they develop into an accident. Um, and this is, this is, but this is a single case. This is just the tip of the iceberg. But what I'm trying to, when I'm talking to, to my, my, my clients is look for those observed work error at the lowest level of the excellent environment, the lowest part of this iceberg.

When you're looking for those, when you're identifying those error elements, then you can. Um, you are able to prevent the next accident. 

Mehmet: Perfect. I get it. Now how I do this [00:13:00] now I understand it's a chain. It's not like that big mistake that caused, let's say, a crash or maybe a, let's say maybe in a, in a business perspective, a huge, uh, PR catastrophe or whatever it was.

A results of multiple errors that or mistake that happen now, me as a leader, me as a founder of, of a new startup, how I would be able to spot these early so I don't end up with the iceberg that you are just mentioning. So are there any things I can do to spot these errors early on? 

Eckhard: Yes, that's, uh, uh, very much so, and this is, this is why it's so important to really have an ERA culture within your company that people are willing.

And are, are, are empowered to talk about those small errors. Um, when I'm working with hospital, for [00:14:00] example, we're on, um, developing those, um, um, error reporting systems when saying the single, the single simple error could be very important for us to understand, uh, how often do they occur when making a risk assessment, what is the, the po potential outcome?

And when you're identifying those. Those errors early on, then you can prevent it. But it's important that they empower your people, your employees, your colleagues, to openly talk about errors. Small occurrences, the smallest occurrence, the smallest error could be very, very important. Very, very crucial. And there are, there are plenty of accidents that happen on aviation that I'm aware of where.

Those errors, those significant errors, the era one had happened, um, multiple times before the, the day that the airplane crashed. One example is the air in terror, uh, accident in [00:15:00] France and Strassburg Air Airplane 8 3 20 flew down into Strassburg at night and the pilots mistook. A, uh, a, the, the autopilot mode, instead of ver vertical speed, they inserted flight path angle or vice versa.

They intended to have flight path angle and, uh, inadvertently put in 3,300 feet of, uh, sync rate and, uh, didn't realize, didn't recognize, and they eventually crashed on the mountains. And when the investigators asked the pilots at Aaron Tear, how could this be? Most of the pilots admitted. Yeah, we all did the mistake previously, but the error chain before did not develop into an accident because the other circumstances and surroundings, they were not lined up to, to become an error chain and mm-hmm.

Therefore, it's so crucial to tell your colleagues, to tell the pilots, for example, the interior before the accident happens. Tell us about these minute, these small [00:16:00] errors so we get a better understanding. Where do we have our problems? 

Mehmet: Now Eckhard, there is something you mentioned about culture and I, I got it right, but people in general, me included, and I would not hide it.

Uh, if I feel like all eyes and all fingers would be pointing at me psychologically, I will be in a defensive mode, especially if I'm talking about like a business perspective or maybe God forbid, like, uh, crashing an airplane or a car or whatever, or I train because, you know. I, and I think this is a human nature.

Mm-hmm. Now, you mentioned about the, you mentioned about the culture. How, how is that done? Like, I mean, should we give the people some trainings first, you know, [00:17:00] should we. Actually, you know, like in, in maybe people who are in familiar with this term in, in testing and all this. So sometime we say we create a, a sandbox, which is like an environment which is similar to the real one and then you test with it and even if everything is, is broken, you don't care.

'cause it was just a test environment, right? Even developers know this. So where to where to start? Because you know, this is something also related. In my opinion, correct me if I'm wrong to psychology, it's related. It's related also to communication because you know there's a big difference that I come to a card and say, Hey, card, like I think I made a mistake, or maybe, you know, and you might not understand what I want to say.

So, so walk me through like when you, when Sure. When you are advising a, a business about this, what are the steps? I'm very, really interested to know Absolutely. How we, we get this one by one. 

Eckhard: Absolutely. So there, there, there are two or threefold. Uh, [00:18:00] this is a, uh, um, uh, excel and question, but there's a two to threefold answer, first of all.

Sure. First of all, the first and probably the most crucial part of great a culture is what we are. What I often see at companies is that we're, look, we're not looking for the eras. But we are just observing the outcomes, the consequences. So when the airplane crashed, when the car, you know, was slitting off the runway of, of, of the, of the, uh, of the street.

Or there's a crash happened or something happened at the, or there was a, uh, malpractice at a hospital or anything like that. We're just looking at the consequences. And, and then your superiors are on the leaders and the managers are like, so what happened? And what error? Um, what was the error? But what they are right now [00:19:00] doing is they are just on simply just judging the outcome.

Not the error. And this is probably the, the most crucial part is what you're looking at is the error, uh, the, the outcome, not the error. And I'm saying, look for the error one. Where did it all start? Was it, was it fatigue? Was it, uh, stress? Was it confusion? Was it distraction? Was it complacency? Um, or was it, you know, was it maybe negligence or anything like that?

Don't only look at the, at the outcome and, uh, you know, there are so many, um, so many instances where companies are just, they fire the pilots when, or the, the people, the employees when something had happened and they, uh, because they said, oh, we must punish them. But it really is the opposite of great agriculture when you just fire them because you believe [00:20:00] that once you fire those employees.

Problem was solved, but you really didn't understand. Where did it all start from the beginning? Where was the error One, maybe you have a very, very different issue. I'll give you an example, uh, with the hospital, when, when we have a, um, a, uh, the wrong medication, the patient, uh, received the wrong medication on.

When I start, uh, working with my clients, let's, and I ask them, so what did you do? Uh, the most, most often the answer that I get, oh, yeah, we talked to the nurse and, uh, told her, uh, for the next time, you should look better, more carefully. Okay. And it, it's, it's, it's, yeah. Well, you'll, you're laughing and, um, I'm saying it's, it's so, so sad actually, because it's not that anybody is just saying, you know, they're, they're not blindfolding, they're not saying, oh, I didn't, I don't care.

Or just closing their, so what medication do we have enough on? Plenty. And what needs to, uh, we, we need to get rid of. So [00:21:00] it doesn't work the way just telling a single person look better the next time. So what I told my clients is try to figure out where did the error error chain really start? And it, in this case, it's, it did not start with the nurse because when we talked to her, she said, yeah, I, I took the right medication, at least the right packaging, but mm-hmm the insert.

Was wrong. There was the wrong, you know, the, we call it blisters. I'm not sure how it's called in English. Uh, where the, these, I got it. Yeah. Small tablets are inside our packaged vacuum, uh, packaged. It was so, she had the right packaging, but the insert, the tablets were wrong. And she, she didn't see and she didn't realize, so obviously.

The Aron chain started somewhere before that, before she took the right packaging. So, but where did it start? Was it maybe another nurse who put it on, uh, uh, [00:22:00] put it back in the wrong way? Or even worse, maybe it started with the production company. Or with a pharmacist, we don't know. And if you don't look that far, you, you still have what I call safety myopia.

You are blind to the true problems that you might have within your company. So therefore, it's so important to really go to the beginning of the error change. So, um, putting it all together is number one. We need to. Make a diff differentiations. Errors are not consequences, so don't, don't judge the consequence, just judge the error.

If everyone is doing that, you just simply can't file your employees or those poor colleagues who were at the end of the error chain because it will happen again. And then we are again at the, at the same exit pyramid, but look for the arrow one. Where did it all start? Number two is, uh, we have, uh, human factor [00:23:00] training, what's called crew resource management, where, uh, you are trained as crew members that you are, you need to be open to talk about pilots, and you must empower your team, your team by also to tell them, um, you must be.

Give me criticism. I need feedback from you. Be because I am not perfect. I, as a captain, I will make mistakes and I even need the youngest flight attendant on board who's maybe just checked out a few weeks before, um, to tell me if she absorbs something, which is unusual. And this is what's called, uh, psychological safety, where you have a team, which they are.

Willing and open to give themselves critique. So it's called active and passive criticism. Active means meme. I'm telling you, I'm observed. Something. There was something where I believe this is, uh, not the optimal way, but may, maybe there was an arrow [00:24:00] and passive criticism is that I'm not offended when my team member tells me something saying, Ecker, I observed, um, you took the wrong medication.

Or maybe you are. During the approach, you are too fast or we're not gonna make it. You are, you have to do a grower role or anything like that. I'm not certainly taking it personally. And to sum it up, there's a research been done by Amy Edmondson, a Harvard professor, and she quite clearly stated, uh, when she observed two different teams, team one and Team two, team one, they were not, they were toxic.

They were not giving, you know, feedback and criticism and the other, uh, one team B, they team two, they were. Really having this psychological safety, when you have it, when you really live by this, the outcome for the patient, the outcome for the flight will always, always be better. Always. No exception. 

Mehmet: Uh, 

Eckhard: great question, but it's a long answer.

I'm sorry about that. No, 

Mehmet: no, no, no. I [00:25:00] love long, uh, answers with detailed explanation because you know, when we were talking now, even I started to relate to. You know, the training process, which sometime me as the end user or the consumer or whatever you want to call me, maybe I don't understand what's happening in the background.

I mean, I'm talking here about, you know, the aviation. I know like a lot of people, sometime they start to feel uncomfortable. Why the plane is stopping for a long time and you know why we're not moving. Blah, blah, blah, blah. And but the reason is because of course I, you, you know, like maybe not all people are fan of this.

You've been a pilot to se, but I've been fan to understand, you know, these documentaries about a plane crashes and all this. And what I have seen, like every time they were adding, you know, to the. What you call it to the manual that, that you know, the procedures like something new based on [00:26:00] previous things that happened before.

And then you need to go over it. And this brings me to the training part. So, and the openness, and this is I think why in the aviation, you have the pilot and you have the co-pilot so the copilot can interfere and correct if maybe the main pilot forgets something. Right? True. And they talk up and they talk openly.

And then sometime I wish. Why this doesn't happen in out the outside world. I mean, in, in comp in, oh, you are still 

Eckhard: right. Absolutely. I totally agree. 

Mehmet: In companies, right? Yeah. Like why at companies, we don't have the same culture sometime. Mm-hmm. We, we see it in startups. When they are very small team, maybe up to 20 people, 25, and then all of a sudden we see this boom, you know, it's not there anymore.

Right. So, so, so we have, and this, I think, again, it's, it's part of the human nature. Uh, it's just a, a, an observation from [00:27:00] my side. I don't know if you have something to add on that 

Eckhard: Yes. Ab about, but absolutely. One, I'm 100% sure. I totally agree. Uh, even with startups, the best startups, the most successful, most frequent, frequently successful startups, they are collaborating.

They're talking maybe, maybe they are even fighting about, you know, which way should we go. But they are talking, they are collaborating and uh, and normally. Errors very often, very quickly become very obvious to everyone. So you can't earn startups and small companies. You simply can't hide them. Now the, the issue is not the error.

The issue is how. Do we deal with them? How do I, as a startup, you know, as investor or or as the founder of a startup, how do I deal with them if I believe I am the only person in the room in the company who has, is the wise this person on the only one. [00:28:00] Making the right decision and everyone just needs to follow me because I know exactly, and you just simply, um, shut up.

Those companies most often fail. The best companies are the ones who saying meme. Let's, okay, let's find what, what do you think about this? What do we, well, what, uh, maybe we should do it differently. Maybe we should, we should try this program, this programming, this software, or this approach, this marketing, this, you know, this pricing and yeah, my, I don't know.

And oh, that was, and then a week later they sit together and say, okay, no, that, that was a bad decision. Okay, let's try differently. So they are very. Agile, agile companies. This is, so, this is also an a, a, an issue. Agile companies are, are very, very quick and very evolving because they expect that errors will happen, but they learn from [00:29:00] each era.

Mehmet: You know, because you mentioned, uh, you mentioned investors. Uh, I think the best. Industry that have this culture. Very good. Are the v the venture capitalists? Mm-hmm. Um, there is a book, and just for the sake, I don't misspell anything wrong, it's called the Venture Mindset, how to Make Smarter Bets and Achieve Extraordinary Growth by Dr.

Ilia Eev. So he, he's at Stanford, so, you know, I read the book. And one of the things that attracted me is that usually venture capitalists, they don't blame themself. When they do a wrong investment, they start to dig more why they didn't pick up the right one, right? Correct, 

Eckhard: correct. So 

Mehmet: that means, so that means, you know, we act.

So they accept that mistakes would happen. Of course, they would question why wouldn't. Y we didn't pick up for example, X company or Y company or whatever. Uh, and, and I think this is back [00:30:00] to what you were mentioning about the leadership and how it's important, you know, as leader from day one to understand that, hey, we are not perfect.

Like there's, you know, and I always remind people, like, guys, like take any known company brand today and think, you know, how they started it. And there is something I showed to someone the other day. They are very young, so they didn't see it. I showed someone the first computer shipped by Apple and they were shocked.

They said, really? 

Eckhard: Yes. 

Mehmet: A, a piece of wood? And they said, yeah, guess what? It was a failure also. Like it wasn't a success as they expected and they had to go back to the drawing, uh, board and change things. And they said, oh wow, like we didn't know this. And I think this is why Eckhard is very important to spread this more.

Uh, I'm big fan. I tell people, you know, I removed the word fail from my dictionary. I don't have something called fail. [00:31:00] I call it learning, right? Mm-hmm. I call it, yeah. But. Behind, you know, be, be, you know, other than, other than, you know, uh, reframing it and seeing it in this way. How do you see companies really benefiting from adopting this, this mentality and this philosophy of accepting like other, other than of course, uh, you know, the thing that we talk, what are like some other.

Positive outcomes you've seen from, from companies or organizations that adopted this mindset? 

Eckhard: Yes. Um, it, it didn't, you know, it's not that it's been just invented, it's been there all the time. It's been maybe right named differently. It's, uh, Toyota invented the Kaizen system where it said, okay, we're learning from each mistake on trying to make an improving.

Even today when you're looking at, um, um, at, not at, uh, at aerospace and, [00:32:00] uh, spacecraft, for example, we're looking at comparing, uh, the Boeing company when they are, what was it called? I'm not sure. Even the spacecraft that they were sending out, the test flight where the old astronauts were stuck on ISS for Oh yeah, half a year.

It was, it's not Dragon called about, but but, but it was called space something. And um, and they are, they have a very, a totally different approach than SpaceX. For example, when Elon Musk, uh, you might think about him or whatever you want, but when he started a space, when he started and invested into SpaceX, he said, I know exactly that we will have, uh, rockets that will fail.

I expect that they will fail. Uh, yeah. Even with his biggest rocket right now, he says, I know they will fail. And this is, you know, it's, it's, it's. Expecting them. Even [00:33:00] calculating them. Anticipating them, and then saying, okay, once they failed, we do everything that is required to understand why. And whereas Boeing, on the totally op opposite side, they are trying, you know, to think through everything before this first, uh, rocket launch.

And, you know, they postpone, uh, their launch and time again and again, again, again, something happened. Oh, we don't know. We need to, we need to look at it before. And, uh, so a very, very different approach when you are, when you are, uh, talking about spacecraft and how they were developed. And this is for me a perfect example anticipating.

Errors that they will happen, but then dig into them really deep, try to understand where were the true issues, true errors, the true problems. Where is if you're able try to find the error one. 

Mehmet: Great. Now. [00:34:00] A question to you occurred with this age of AI and advanced technology and automation. Do you think we gonna get rid of these errors that happens, or the AI will amplify them more?

It gonna be a matter of us. 

Eckhard: Um, that's one of the most fascinating, uh, questions that I was talking to a lot of experts, um, uh, since, uh, AI became really famous, um, that I, I'm, I'm really concerned, um, if, if that is really, if AI is, is able. To cover issues, occurrences where there had never been a single or a similar occurrence before.

You know, it's, uh, from my perspective, from, from what I learned from all the experts that I was talking to, is that AI is [00:35:00] not, not the, not intelligent in a creative way, but it's, it's very smart because it could, it can. Use all available knowledge that is publicly available. From all databases, uh, you know, with Wikipedia or anything like that.

And even though that, um, you still have the issue that, uh, AI is hallucinating sometimes with its answers. But when you're looking at, um, the, um, at Apollo 11, and this is, uh, a great, great example, uh, when, when, uh, John F. Kennedy said, we'll be gonna send a man to the moon. And bring him, this is the second part is, is, is even more important and bring him back safely to earth.

Hmm. So when they were calculating and there was no limit. No limit [00:36:00] whatsoever about, uh, uh, resources. There was no limit about money. There was no limit about engineers and, and, uh, and, uh, scientists that they were using, uh, for NASA and working with NASA and all the associated companies to. To develop the rockets and the entire system in order to be able to send a man to the moon and bring him back safely.

When they did this, they of course had the issue that said, they said, how do we go about with occurrences that we can't anticipate? We don't know. 

Mehmet: Right. 

Eckhard: And even though the money wasn't a limit, it was no limit at all. No limit at all. So they were trying to calculate what's the risk and uh, how, how, what is the possibility, what something had ha could happen.

And, um. But all calculations were devastating, all calculations, uh, resulted in a, in a [00:37:00] calculation where it said it can't be successful. So they said, okay. We put it in our, our drawer and go back to not a quantitative, uh, analysis, but a qualitative, qualitative analysis and said, oh, we're trying to make sure that they will be able.

So you have a hundred thousands of people working for 10 years and trying to figure out anything that that could go wrong, have checklists for anything, have a, the, the most extensive train that you could get, get astronauts. And then what happened on the day that Apollo 11 was the, the eagle was approaching the moon surface.

What had happened? Suddenly you have an alarm error. Oh, an error. Either, I think it was, uh, 10 11 or something. Uh, 10, 12, 21, I think. And, uh, suddenly an error [00:38:00] occurred within the, um, lunar landing module and, and Houston. Hundreds of people were watching and they didn't know. They didn't know what to do. They were totally confused.

Oh, what happened? And, and finally, at the end of the day, who saved NASA's reputation? It was the, well, were the astronauts Neil Armstrong on board with both re They said, we're, we're, we continue. They, they had the permission to continue, but they saved. And NASA's reputation and saved Apollo 11. The day where was so crucial, no matter how much you were looking at it before, there will be always situations where you need humans with their creative minds to.

To deal with a situations, a situation [00:39:00] which nobody ever anticipated. And this is the exactly question. They all, I, I asked all the, all the experts, um, the I AI experts that I was talking to. Do you think that AI will be able to anticipate anything like that? For example, a Mars mission, if we're, uh, intending to send people to Mars, there will be situations and occurrences where errors will develop and happen.

Uh. Which we cannot predict, and this is why humans, we as humans are so crucial. No matter how much you invest into ai, uh, my personal opinion, um, that you need the creative mind to be able to come up with a solution for a situation which never, never had occurred before. 

Mehmet: I can't agree more because, uh. You know, a AI [00:40:00] also relies on what data we feed it, right?

So, uh, so whatever we teach the ai, the AI gonna do the same thing. So if we feed the ai, to your point about hallucination, and it's not creative in the. Way that some people, they think, so it it gonna mimic us like this is again, let's say personal, uh, view. Maybe in the future it'll, with, with a lot of more correct and accurate data, it might be able to drift a little bit and say, Hey, I think what you're trying to do here is nonsense because it doesn't follow any.

Physical or chemical or mathematical equation, and it doesn't make sense, especially if it's like something related to applied science. Of course, I'm not scientist, I'm not mathematician by any way, but you know, like this is the, the logic that that I use. But when it comes to maybe creating new or giving new ideas, yeah, of course it might give us something right.

Something wrong. It [00:41:00] depends, but yeah, I, I think, you know, the, the, the point that you just mentioned is, is, is, is important about the need for human. Action at the end. So we are the ones in control, so a hundred percent. Now, tell me more about your podcast. Like what do you discuss there? Is it solo? Do you have, uh, guests with you?

Like what are the topics you discuss there? 

Eckhard: Yes. Um, f the, um, first of all, it's uh, um, I do interview sometimes, but, uh, very often I describe occurrences and accidents that happened. I'm trying to. Uh, to, uh, have a, a story that I'm telling. It starts actually with the challenger catastrophe, 1986. Uh, that was the, that's the, uh, um, the after the first, the first episode is introduction.

Second episode is, uh, is challenger. Then you got the 10 reef accident where the two jumbo jets of PanAm and Ka uh, crashed. Yes. And it was a single, [00:42:00] very, very minute era. And, uh, and then I have, um, many others. Uh, one of the most fascinating episodes that I did on how I think, oh, uh, I interviewed the investigator in charge, the person who was in, who was responsible to investigate an air crash, which was Sioux City, the DC 10 crash that happened, and, uh, to learn from him personally, how he and, um, understood and how he, um, uh, experienced.

The entire development of the situation. Most fascinating is I never normally the investigators, the NTSB, the National Transportation Safety World, which is the investigation branch in the United States when they, when they, uh, start investigating, there're being alarmed and called after the crash. But the DC 10 Sioux City accident was very different.

They were alarmed while the airplane was still. In flight, they were fighting for [00:43:00] controls because they lost all the hydraulics and were fighting for control. And the NTSB was absolutely con convinced that they will crash and probably every, everyone will die. So they were alarmed while they were still in the air.

So it all happened real time. And talking to this person, uh, Bob Macintosh, uh, a retired, um, uh, NTSB investigator was just one of the most. Fascinating, interesting stories and, and, and, uh, interviews that I did. And, uh, this is, um, uh, available on Spotify and almost any kind of podcast, uh, uh, platform that you can look.

And then I have many others. There are, uh, Exxon Valdez and, uh, uh, deep Water Horizon. And uh, so it's not only aviation related, but ERO happen everywhere where humans are. 

Mehmet: Looks very. Uh, exciting and, you know, to listen because, you [00:44:00] know, knowing, the, knowing at that moment of, of the incident or let's say the crisis, right?

Because when you have an, an error mistake, whatever again you want to call it, so you have these moments where you have also to take decisions very fast. I mean. The other happened, we, we can't change it anymore. So now the decision process and everything that happens in maybe milliseconds, microseconds, I'm very interested to, to see the people who, who, who, who tells these stories also.

So look something which is, uh, amazing there. What other projects you're working currently on, uh, and anything that you, you might want to share with the audience. 

Eckhard: Yes. You, you realize that I'm really passionate about this issue. This, um, errors, yes. Dealing with errors and, um, one, one journalist asked me, well, Eckhard, you seem to be happy about errors.

No, no, no, I'm not, I'm not passionate about errors, but [00:45:00] I'm passionate about learning from errors and trying to help others. To prevent the next accident, to prevent the next catastrophe and disaster. Um, so I wrote the book, I translated it now in English. It's available on, on, uh, Amazon. Uh, you can listen to my podcast, uh, uh, free of charge.

And currently I'm, uh, I'm just on the last, uh, steps of having an average version of my, um, written book as an audiobook so you can listen to, to the most fascinating stories. Of my book error one as an audio book, and, uh, this will be available just shortly. So I'm excited for this and, uh, very happy about this.

And I hope to instill, you know, the fire that burns in me to, you know, to, to instill this fire and this passion about the subject with my audience. And, uh, whenever I have a presentation as, as, um, as you know, I'm a speaker as well, I'm speaking [00:46:00] for small audiences in front of small audiences or even hundreds or thousands of people, uh, yeah.

With, uh, you know, uh, employees, managers, CEOs, and, and any kind of, uh, of, uh, of people. And when I'm talking about, about us as humans, that we, humans are failable, we make mistakes, but the most, the most. One of the nicest stories that I I am telling them is the story about the tribe of Bababa. The Bababa, it's a tribe in Zambia and Africa, and how they deal with errors and when some person, some, a member of the tribe fail in a bad way, they are not chastising him or criticizing him.

They put, they bring him together. In the center village and they gather around him and then they start. [00:47:00] Cheering him up, they're saying, MeMed, you are a great guy. We love you. You're fantastic. You are right the way you are. Uh, they then this ceremony goes on as long as it takes until his, his frustration about his mistake dissipates and why I love this story because it's, it's not only because it's so extra ordinary, but what they really.

Have made A-A-A-A-A ritual is that we're dividing between the ERA and us as a person. An ERA should never define who we are as a person. And if this is the last, you know, takeaway from this podcast into interview for your listeners, then I'd be very, very happy Tribe of Pemba. They are trying to, to, you know, they are.

Make a distinction, [00:48:00] differentiation between, that's just the error, but that should not define who we are as human beings. 

Mehmet: This is, this is a priceless, uh, advice. And, you know, I'm happy you mentioned this because, uh, I think even in our education system, we need to change this because. Part where I blame society on this is mainly in schools, right?

Where they show kids, our kids and me, including when I was a small child, that, oh, you don't have to make mistakes like you should get. Uh, 10 over 10 grade all the time. And you know, this psychologically slowly, slowly start to, you know, it sits within us and we forget, to your point, our basic nature as humans, that we make mistakes.

And I'm really happy, uh, because you. The [00:49:00] way you explained the whole thing today with examples, with takeaways, really it's made me even think, you know, more deeply about this. Uh, and, and by the way, just so people, if they want to get in touch, do they find you on LinkedIn? Are there any social? Okay, great.

Eckhard: Yes. LinkedIn and everything else you can find on LinkedIn. I do provide aviation investigation training for, um, aviation investigators, safety managers, for example. But you can also look at error. one.net, uh, that's the, uh, on domain. But, uh, everything else you find me. And please anybody, uh, contact me on LinkedIn.

I'm very, very happy about any contact, any feedback, and, uh, any, any kind of, uh, interested parties who is interested in talking and, and giving interviews. You know, this is why, why I'm so passionate about, and, and really appreciate the time and, uh, being on your show. My pleasure. 

Mehmet: My pleasure. It's very obvious, ARD, uh, the, your passion is, is like, I can sense it [00:50:00] from the distance and of course all the links for the, the book, the podcast, your, uh, the website, uh, the about the training and your link it in all will be in the show notes.

So people, they don't need to go and look for that. I really appreciate the time. It was really an very, uh, engaging discussion. I learned a lot from you. Thank you. And do you remind, you, remind us. Simply, I would say it in this way that we are humans, that we do mistakes. We should not blame ourselves much.

Of course, we should take care, but it's okay. We can do mistakes and we can do errors. Yeah. So thank you very much for, for bringing this back and this is how I add my episode. This is for the audience. If you just discovered us by luck. Thank you for passing by. If, uh, you know. You liked what you listened to and you want to keep knowing about the new episodes, subscribe and share it with your friends and colleagues and if you are one of the people who keeps coming again and again, thank you very much for the support, for the encouragement, for all the things that you are doing for the podcast this year.

[00:51:00] Keeping it us up into the two, two, sorry, top 200 charts on Apple Podcasts. I can't thank you enough for this, and thank you for the support of the book launch also as well from nowhere to next. And stay tuned for some news also very soon. And as I say, always thank you very much for tuning in. We'll meet again very soon.

Thank you. Bye-bye.