Has Boost ever been used in a regulated project (FDA, FAA)?

借酒劲吻你 asked on 2021-02-18 23:52

While posting a comment recently, I found myself remarking that, in my experience, Boost is not widely used in regulated industries (FDA, FAA). In fact, I don't know of any project that does.

2 Answers
  • 2021-02-19 00:13

    I think the best answer we can have here is "yes and no." I will try to explain why.

    Boost is a huge umbrella for many constituent libraries. Some of them depend on each other in various ways (e.g. when a higher-level library needs features provided by a lower-level part of Boost like Type Traits). This undermines the usefulness of a simple answer to the question: if three parts of Boost have been used in a regulated project, but they are different parts than the ones you want to use, knowing this is of little value. And we will never know the full answer regarding all parts, because you cannot prove a negative (and there are too many parts to ever expect a "100% yes" answer).

    Boost is (and always has been) rapidly evolving. Entirely new libraries are added all the time. ASIO is a big one that didn't exist at all until somewhat recently. This makes it even more difficult to answer the question, because over time there are parts of Boost which are young and not as well tested as others. Additionally, existing libraries sometimes go through backward-incompatible revisions (e.g. "Boost Filesystem 3" not too long ago).

    Many parts of Boost end up in projects not as a traditional dependency but by copy-pasting code from Boost, perhaps modified to taste (e.g. adding or removing support for specific compilers). Likewise, many parts of Boost end up in projects because Boost serves as a proving ground for new C++ standard library features, such as shared_ptr (C++11) and unordered_map (TR1). Some features which are part of the language today were originally part of Boost, so many people have used "Boost code" without even knowing it.

    Note that code does not somehow become safer when it transitions to official status within the language--GCC has had bugs which did not exist in the Boost equivalent implementations of the same concepts. This matters when considering practical questions like "Should we allow the use of Boost in our project or should we restrict ourselves to what the compiler vendor gives us?" If you are thinking of using a feature which has been implemented very recently by your compiler vendor (say, within the past year), you may well be better off using a third-party (e.g. Boost) implementation which is more mature.

    Finally, since it seems that the impetus for this question is to gain some reassurance that using Boost is not a bad idea for a production project: I would certainly say that in general using Boost is fine and good, with the huge caveat that you need a local expert in Boost who knows which parts of Boost should not be used in your domain. For example, Boost Spirit, Phoenix, and Wave are examples of libraries which have been in Boost for a while but which very few people truly, deeply understand. It's one thing to use library code you don't fully understand (we all do), but quite another to use code which almost no person on earth understands.

    In summary, I don't think anyone will be able to give you the reassurance that you seek that Boost is OK for safety-critical systems. You need to evaluate it on your own, the same as you need to evaluate your own compiler vendor's software, your other third-party dependencies, and the code you write yourself. I have used all four categories of software quite a lot, and in my experience Boost had fewer critical bugs than any of the others, and fewer regressions than either GCC or my own code.

  • 2021-02-19 00:29

    I would say that a lot of it comes down to how the system designer(s) are handling system safety at an architectural level.

    Triplication

    For instance, if the approach taken is triplicate redundancy with a trusted voter, then system trials/testing is going to be the major step in approving the implementation. Suppose that one of the three channels' development teams chose to use Boost. If the system as a whole passed all its test vectors, then one could argue that one didn't need to trawl through Boost itself looking for implementation errors. Obviously if all three channels had chosen to use Boost then that would be cause for concern, because a bug in Boost would then be a common-mode failure that the voter cannot mask, and the scope of the test vectors needed to rule that out becomes unmanageable.

    Triplication is a standard approach to handling the problem of using software resources like compilers, libraries and programmers, all of which are at risk of error. Boost is just another one of these. One could argue that Boost shared pointers are clearly a way of reducing the risk of programmer error, and that, on balance, is most likely to be beneficial to the system as a whole.

    Important, but Not Safety Critical

    Where it gets interesting is when triplication is not being used, and one is now in the realms of really having to trust stuff. Interestingly the way a lot of systems seem to get round the problem is to say that ultimately there is a human in control who is supervising and able to intervene in the event of system error.

    For example, the car industry has a set of programming rules called MISRA. The software for an ABS system is supposed to be written to this rule set, and the development tools are supposed to be set to enforce those rules on the source code. The idea is that this will reduce the risk of undetected bugs to an acceptable level. And because ultimately there is a driver driving the car, they can always do their own cadence braking. Thus the car industry has avoided having to have a triplicate implementation of ABS.

    They are extending the same philosophy to more complex car systems like adaptive cruise control and self-driving cars. Personally I think that such an extension is unreasonable for self-driving cars. The relevant legislation makes it clear that it is the driver's fault if such a vehicle has a crash (i.e. you are still 'driving' it), but the glossy advertising won't dwell on that important aspect.

    It's the same in the world of medical devices; there's supposed to be a nurse or someone monitoring the patient anyway, so the occasional blip is covered by that supervision. The whole situation is pretty poor anyway: whilst the software for a medical device may have been written, tested and approved, quite often these things run on embedded Windows XP. They all get networked up and end up infested with computer viruses, etc. The FDA won't let you have an auto-update system in place; only the device vendor can update it, and of course they can't ever hope to keep up. So you end up with a nicely written, well-tested piece of medical software running on top of an OS installation which has had all the world's hackers running around inside it doing god knows what. I think that the use of Boost in these circumstances is not going to add much to the overall system risk.

    So if a MISRA-compliant toolchain offered Boost as part of that toolchain, then I don't see why that would be any different from a toolchain offering a standard C library. If the toolchain vendor is certifying it, then it's no different from the situation with anything else.

    There are weaknesses with that approach. I have come across a MISRA-compliant toolchain in widespread use whose compiler turned out junk object code when all optimisations were turned on; I was actually able to verify this in the disassembly. I then took a look at the source code for the toolchain's standard C library, and it clearly wasn't itself written to the MISRA rule set, and furthermore it contained glaring, horrible bugs.

    And yet there is no regulatory block to building, testing and selling a car ABS system using this toolchain, so long as you tick the MISRA checkbox in the project settings. Adding Boost to that toolchain would hardly make matters worse.

    Safety Critical Without Triplication

    The final approach is no triplication and no human supervision. This is really hard, because you then need formal proof of the correctness of every component: the toolchain, OS, drivers, chips, etc. AFAIK it's never been done for a truly safety-critical system like nuclear reactors, flight-control avionics, or other systems that will definitely kill people if they go wrong.

    The only thing that comes close, so far as I can tell, is Green Hills' compiler suite and their INTEGRITY operating system. They can give you (for a large fee) formal testing and verification evidence for every single line of the OS, all their libraries and their compiler, everything. If one were ever to attempt a truly safety-critical system without triplication, that would be a starting point.

    I don't think they've done a C++11 toolchain yet, though I have added Boost to their toolchain and it worked just fine (it wasn't in a safety-critical system, I hasten to add).

    Conclusion

    Certainly, if outfits like Green Hills, with a well-deserved reputation for reliable and thoroughly tested toolchains, offer Boost, then I think one would be in a good position to use it in a regulated system. However, I doubt that the whole of Boost would be offered that way; they are more likely to follow the language standard.

    I also know that GCC has in the past been put through formal compiler validation testing so that it could be used in Stuff That Matters. I expect that will get repeated sooner or later for the more recent incarnations that have taken on aspects of Boost.
