You could ask the same question about lots of languages, including Eiffel, OCaml and, soon, Rust. They may all look better suited to writing an OS from scratch now. Part of the answer, IMO, is that you are looking at the problem from the wrong angle. The Unix authors did not survey a range of languages and pick the best one to write their OS in: they essentially invented C for the purpose of writing Unix. At the time, C was a very high-level language compared to what people usually wrote OSs in, namely various kinds of assembly. There were other high-level languages back then (Fortran, ALGOL, early Pascal...). They were probably all "safer" than C, but that mattered less than C's other advantages: ease of portability and the ability to mix with assembly (see the first sketch at the end of this answer).

Nowadays, the reasons why OSs are still written in C / C++ also include various "legacy" effects:

- large existing code bases are written in C;
- experienced OS programmers know and think in C;
- more man-hours have been invested in C / C++ compilers than in those of any other language.

The last point is particularly significant. The main reason "C is fast" is probably not that it is close to the machine, but that compilers are very good at optimizing it (see the second sketch below).

So much for OSs. For mission-critical applications I would be more nuanced: as you said, we are now seeing some of them written in other languages (Ada, OCaml, Erlang for distributed systems, even Coq sometimes...), and this is a good thing. That being said, the C ecosystem is still worth considering. It is much easier to write near bug-free C code now, thanks to tools like Valgrind and static analyzers (see the third sketch below). If you are serious about comparing C to other languages for mission-critical applications, your reference point must be a team that uses all of these tools. And considering that it is much easier to find experienced C programmers than Ada programmers, I wouldn't always bet on Ada to produce the best result under similar budget and time constraints.
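To make the "mix with assembly" point concrete, here is a minimal sketch using GCC-style extended inline assembly on x86-64 to read the CPU's timestamp counter. The function name `read_tsc` is mine, and it assumes a GCC-compatible compiler; the point is only that C lets you drop to assembly exactly where you need it and stay in C everywhere else.

```c
#include <stdint.h>
#include <stdio.h>

/* Read the x86-64 timestamp counter via GCC extended inline assembly.
 * RDTSC puts the low 32 bits in EAX and the high 32 bits in EDX. */
static inline uint64_t read_tsc(void)
{
    uint32_t lo, hi;
    __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
    return ((uint64_t)hi << 32) | lo;
}

int main(void)
{
    uint64_t start = read_tsc();
    /* ... work being timed ... */
    uint64_t end = read_tsc();
    printf("elapsed cycles: %llu\n", (unsigned long long)(end - start));
    return 0;
}
```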
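On the "compilers are very good at optimizing C" point, a hedged illustration: a plain loop like the one below is typically auto-vectorized into SIMD instructions by GCC or Clang at `-O3` on x86-64, with no hand-tuning at all. You can check this yourself with `gcc -O3 -S` or on godbolt.org; the exact output depends on the compiler version and target.

```c
#include <stddef.h>

/* A naive element-by-element sum. Compiled with `gcc -O3` or
 * `clang -O3` on x86-64, mainstream compilers typically turn this
 * loop into SIMD code automatically. */
long sum(const int *a, size_t n)
{
    long s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}
```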
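And for the Valgrind point, a tiny sketch of the class of bug these tools catch mechanically (the file name `bug.c` is just for the example):

```c
#include <stdlib.h>

int main(void)
{
    int *p = malloc(10 * sizeof *p);
    p[10] = 42;   /* heap buffer overrun: valid indices are 0..9 */
    free(p);
    return 0;     /* remove free(p) and Valgrind reports the leak too */
}
```

Compiling with `gcc -g bug.c -o bug` and running `valgrind ./bug` flags the out-of-bounds write (an "Invalid write of size 4" with a stack trace); remove the `free` and it reports the leak as well. Static analyzers such as clang-tidy or GCC's `-fanalyzer` pass catch many such bugs before the program even runs.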