Maybe I'm dense or naive, but I don't think there's any precedent for that. A gag order is one thing (and there are certainly places for it), but forcing someone to lie would hopefully violate the First Amendment.
It's strange that chainsaw10 is being downvoted for their comment. From the second link above: "Have courts upheld compelled false speech? No, and the cases on compelled speech have tended to rely on truth as a minimum requirement." That sounds like there is no precedent for forcing people to tell explicit lies.
Also true. But we know from the Snowden revelations and other sources that Apple has been backing up its promises. So we have at least some level of assurance that Apple is a good actor.
https://en.wikipedia.org/wiki/40-bit_encryption was the most secure thing it was legal to export. The Netscape browser, in particular, made you jump through a lot of hoops to get the 128-bit version meant for US audiences. As a result, even most Americans with internet access at the time ran the crippled international version.
Whether or not they hold the keys at present, Apple is in a position of power with regard to the iOS environment. In a technical sense it would be fairly straightforward for them to acquire the keys.
Trusting the company has nothing to do with it - they could be legally compelled to do so by a secret court, and gagged with an NSL to keep them from revealing the order. Sadly, that's the reality we now live in.
No, they can't... not without designing changes into their hardware to allow retrieving the keys from the secure enclave.
In theory Apple could modify iMessage to MITM the key distribution server and enable eavesdropping. The only way to protect against that is to provide in-person validation mechanisms so users can directly compare keys. I hope they add such a thing, not that 99.99999% of their users would ever use it.
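To make the in-person comparison idea concrete, here's a minimal sketch (hypothetical helper names, not Apple's actual API) of how fingerprint-based verification works: each party hashes the public key their device *claims* belongs to the other, and the two short fingerprints are compared face to face. A key server silently swapping in a MITM key would produce mismatched fingerprints.

```python
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """Derive a short, human-comparable fingerprint from a public key."""
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group into 4-char blocks so the fingerprint is easy to read aloud;
    # 8 blocks (128 bits of the hash) is plenty to detect a swapped key.
    return " ".join(digest[i:i + 4] for i in range(0, 32, 4))

# Alice computes the fingerprint of the key her device says is Bob's;
# Bob computes the fingerprint of his real key. They compare in person.
key_alice_sees = b"...bob's public key as delivered by the key server..."
key_bob_holds  = b"...bob's public key as delivered by the key server..."
assert fingerprint(key_alice_sees) == fingerprint(key_bob_holds)
```

This is essentially what Signal's "safety numbers" and PGP key fingerprints do; the hard part isn't the hashing, it's getting ordinary users to actually perform the comparison.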
As far as the US legal system goes you'd need positive law to enforce wiretapping requirements. Courts (as a general rule) can't issue orders to force Apple to write new code or modify their silicon design to support something the government wishes it could have. Given the way SCOTUS has been approaching cell phone privacy I'm not sure such a law would pass muster.
Laws could, however, force the company to secretly push out an update that sends your keys to be held in escrow on government servers should the need arise to decrypt your stuff.
Similarly, laws cannot force a company to divulge encrypted data if the company does not hold the encryption keys.
So as long as you trust the company, the country it's in is not relevant, at least for the situations outlined above.