• r00ty@kbin.life · 1 year ago

    The Boeing MCAS story, and the fact that they were not held accountable at all, terrifies me. Not the idea of the augmentation itself; I kinda understand they needed to fit bigger engines onto the existing airframe until they could design and certify a new one. It’s not a good solution, but I can understand the business thinking behind it.

    Here’s where it goes wrong for me.

    • Not documenting the MCAS system, in order to game the certification process so the plane wouldn’t require recertification. Adding a system that can make trim changes without informing the pilots, and with no documented way to override it, was an accident waiting to happen.
    • Worse to me is the fact that while the aircraft has two AoA sensors, MCAS only takes input from one of them. This is terrifying. With a single source, there’s no way the software can know its input is wrong. So the software would effectively try to kill people all the while thinking it’s actually doing you a favour. There’s a rough sketch of this below.
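
    To make the single-sensor problem concrete, here’s a toy sketch (every name, number, and line of logic is invented for illustration; this is not Boeing’s actual code):

    ```python
    # Toy illustration, not Boeing's actual code: names and numbers are made up.
    # With a single input source, the software cannot distinguish a failed
    # sensor from a genuine high angle-of-attack condition.

    AOA_TRIGGER_DEG = 15.0   # hypothetical activation threshold

    def single_sensor_trim(aoa_deg: float) -> float:
        """Nose-down trim command driven by ONE AoA reading."""
        if aoa_deg > AOA_TRIGGER_DEG:
            # A stuck vane reporting 74 degrees looks exactly like a real stall;
            # there is no second source to cross-check against.
            return -2.5   # made-up trim increment
        return 0.0

    print(single_sensor_trim(74.0))  # faulty reading -> nose-down trim, every cycle
    ```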

    It was a debacle that should have been investigated further. Now, it’s not really fair (although it probably is) to compare Boeing dipping its toes into more flight automation against Airbus. But modern Airbus jets use multiple sensor sources, and when those sources disagree, the aircraft reduces flight protections and informs the pilots, who are trained on the various flight-control laws that can result. Using only one sensor was a crazy decision, and I’d bet it was driven by cost.
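
    For contrast, here’s a toy sketch of the disagreement-detection idea (assumed logic only, not Airbus’s actual implementation; all names and tolerances are made up):

    ```python
    # Toy sketch of the cross-check idea (assumed logic, not Airbus's real code):
    # compare redundant readings, and on disagreement stop trusting the data,
    # degrade the automation, and alert the crew instead of acting.

    DISAGREE_TOLERANCE_DEG = 5.0   # hypothetical tolerance

    def crosschecked_aoa(readings: list[float]) -> float | None:
        """Return a usable AoA value, or None if the sources disagree."""
        if max(readings) - min(readings) > DISAGREE_TOLERANCE_DEG:
            return None   # data cannot be trusted
        return sum(readings) / len(readings)

    def alert_pilots(message: str) -> None:
        print(f"CREW ALERT: {message}")   # stand-in for a real annunciation

    def trim_command(readings: list[float]) -> float:
        aoa = crosschecked_aoa(readings)
        if aoa is None:
            alert_pilots("AOA DISAGREE - protections reduced")
            return 0.0   # degrade gracefully rather than act on bad data
        return -2.5 if aoa > 15.0 else 0.0

    print(trim_command([5.2, 74.0]))  # one bad sensor -> alert, no trim input
    ```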

    What’s going on now, though, is more a general QC/QA problem. Where I think it overlaps with the MCAS situation is that both the lack of redundancy in the MCAS sensor input and the lack of QC in general reek of ruthless cost-cutting.

  • r00ty@kbin.life · 1 year ago
        Yeah, it’s a race to the bottom. But we have strict aviation rules across the West for this very reason.

        The crash in Japan is actually an example of a failure that fits the Swiss cheese model. I think ultimately most of the blame will fall on the surviving coastguard captain, but everyone involved had a chance to stop that crash. The coastguard plane joined the runway when it shouldn’t have: mistake 1. ATC didn’t notice the warning on the monitor that would have drawn attention to this: mistake 2. The pilots didn’t see the coastguard plane on the runway: mistake 3, and this one is a tough one. Between all the bigger planes with their beacon/nav/interior lights, the runway lights, and the airport lighting, it may well have been hard to see the small plane on the runway. But it had its beacon lights on, and they had the opportunity to see it and abort the landing.

        So essentially there were three chances to stop that accident and all three were missed.

        I completely agree: designing a feature on a plane that doesn’t respect this way of thinking is not the behaviour of a responsible aviation company.