The siloed codebases I was referring to are the verification tools they produce. They're used to prevent attacks. Each tool has one or more capabilities the others lack; if combined, they'd catch many more problems.
Examples: the KLEE test generator; combinatorial or path-based testing; CPAchecker; race detectors for concurrency; SIF information-flow control; symbolic execution; and the Why3 verifier, which commercial tools already build on.
I find that AI models are very bad at performance work: they keep guessing at how their changes affect things, or don't really understand the profiler output, so they go in circles and take forever. I've noticed this effect in surprisingly small codebases (hundreds of lines).
One thing I've found super helpful for this is converting profiling results to Markdown and feeding them back into the agent in a loop. I've done it with a bit of manual orchestration, but it could probably be automated pretty well. Specifically, pprof-rs[0] and pprof-to-md[1] have worked pretty well for me; YMMV.
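To make the idea concrete, here's a minimal sketch of the conversion step, not the actual pprof-to-md implementation: it assumes a flat "top"-style text report (samples, percent, symbol per line) and turns it into a Markdown table an agent can read. The input format and function name are my own invention for illustration.

```rust
/// Convert a flat profiler "top" report into a Markdown table.
/// Hypothetical sketch: assumes whitespace-separated `samples percent symbol` lines.
fn profile_to_markdown(report: &str) -> String {
    let mut md = String::from("| samples | % | function |\n|---|---|---|\n");
    for line in report.lines().filter(|l| !l.trim().is_empty()) {
        let mut cols = line.split_whitespace();
        let samples = cols.next().unwrap_or("?");
        let percent = cols.next().unwrap_or("?");
        // Whatever remains is the symbol name (it may itself contain spaces).
        let symbol: Vec<&str> = cols.collect();
        md.push_str(&format!("| {} | {} | {} |\n", samples, percent, symbol.join(" ")));
    }
    md
}

fn main() {
    let report = "1200 34.1% my_crate::hot_loop\n400 11.4% alloc::vec::Vec<T>::push";
    print!("{}", profile_to_markdown(report));
}
```

The loop part is then just: profile, convert, paste the table into the agent's context, apply its suggestion, repeat.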
Yes, I was being a bit terse with my language, which is why I clarified a bit in my last comment. Here's how I might have written it better:
> FGM reconstruction actually seems to have negative outcomes post-surgery. I'm surprised by this.
Surgery is essentially mutilation in the physical sense (you are cutting through healthy tissue), just not in a moral sense (the whole point is to make the body healthier). The information gathered from mapping the nerve endings in the clitoris will hopefully help surgeons perform reconstruction surgery with less damage to the body.
Rust is just a bit less than 11 years old; C++ was 13 years old when it screwed up the std::string ABI, so I think Rust has a few years yet in which to do less badly.
Obviously it's easier to provide a stable ABI for, say, &'static [T] (a reference that lives forever to an immutable slice of T) or Option<NonZeroU32> (either a nonzero 32-bit unsigned integer, or nothing) than for String (amortized-growable UTF-8 text) or File (an open file somewhere on the filesystem, whatever that means), and it will never be practical to provide some sort of "stable ABI" for arbitrary things like IntoIterator -- but that's exactly why the C++ choice was a bad idea. In practice, of course, the internal guts of things in C++ are not frozen; that would be a nightmare for maintenance teams. But in theory there should be no observable effect from such changes, and that discrepancy leads to endless bugs where a user found some obscure way to depend on what you'd hidden inside an implementation detail: the letter of the ISO document says your change is fine, but the practice of C++ development says it's a breaking change. The resulting engineering overhead at C++ vendors is made even worse by all the UB in real C++ software.
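You can see the layout distinction directly in Rust. Option<NonZeroU32> is documented to have the same size as u32 (the all-zeros bit pattern, impossible for NonZeroU32, encodes None), whereas String's three-word layout is true today but is an implementation detail, not a promise:

```rust
use std::mem::size_of;
use std::num::NonZeroU32;

fn main() {
    // Guaranteed: the zero niche lets Option<NonZeroU32> reuse u32's representation.
    assert_eq!(size_of::<Option<NonZeroU32>>(), size_of::<u32>());

    // A slice reference is a (pointer, length) pair on current platforms.
    assert_eq!(size_of::<&'static [u8]>(), 2 * size_of::<usize>());

    // String is three words (ptr, len, capacity) in today's std, but unlike the
    // cases above, that layout is an implementation detail, not a stable contract.
    assert_eq!(size_of::<String>(), 3 * size_of::<usize>());
}
```

That's the asymmetry: some types have a layout you could freeze without regret, and some only have a layout by accident of the current implementation.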
This is the real reason libc++ still shipped Quicksort as its unstable sort when Biden was President, many years after this was in theory prohibited by the ISO standard.† Fixing the sort breaks people's code, and they'd rather it was technically faulty and practically slower than have their crap code stop working.
† Tony Hoare's Quicksort on its own is worse than O(n log n) for some inputs (it degrades to O(n²)); you should use an introspective comparison sort, aka introsort, here. Those existed almost 30 years ago, but C++ only began to require them in 2011.
A lot of people here are talking about the northeastern routes, but there's another good one that's kinda worth it: San Jose to Santa Barbara on the Coast Starlight, on account of SBA being very expensive to fly through. It's about 8 hours (driving is 5-6) and comparable in price to a bus (and it's probably beating gas right now, to be honest). And the tracks run along some of the prettiest coastline in the United States, usually around sunset too.