> Manufacturers have never had to care about security because no Gov agency would ever mandate secure firmware.
The problem is that "secure firmware" is a relative claim. You ship something with no known bugs and then someone finds one.
What you need is not a government mandate for infallibility, it's updates. But then vendors want to stop issuing them after 3 years, meanwhile many consumers will keep using the device for 15. And "require longer support" doesn't fix it because many of the vendors will go out of business.
What you need is the ability for consumers to replace the firmware.
That solves the problem in three ways. First, when the company goes out of business you can still put a supported third party firmware on the device. Second, you can do that immediately, because the open source firmwares have a better security record than the OEMs to begin with. And third, then the device is running a widely used open source firmware instead of a custom device-specific proprietary black box, which makes it easier for the government or anyone else who is so inclined to find vulnerabilities and patch them.
> What you need is not a government mandate for infallibility, it's updates
So, we don't need an electrical code to enforce correct wiring. We just need a kind soul driving by our house to notice the company who built our house wired it up wrong. Then that kind person can inform the company of the bad wiring.
And if the company agrees it's their wiring at fault, we can wait 3 months for a fix. Then the next month another kind soul finds more bad wiring. And we just have to hope there is an army of kind strangers out there checking every building built by every company. And hope in the meantime that the building doesn't burn down.
Meanwhile, people have to live with bad wiring for years, wiring that could have been prevented entirely by an electrician following the electrical code we all already agree on.
> So, we don't need an electrical code to enforce correct wiring.
For an analogy to work, its underlying elements should have a relation to the target. Your analogy is not in the same universe. For electrical work, there is a baseline of materials and practices which is known to produce acceptable results if adhered to. For software, there isn't. (Don't tell me about the Space Shuttle. Consumer software doesn't cost tens of millions and isn't written with dedicated teams over the decades.)
The analogy does work. The house is any software provided by any vendor. The kind strangers are white hat security researchers. The people living in the house are the users.
Software absolutely has baseline materials, have you never written software before? Never used a library? Programming language? API? Protocol? Data format or specification? CPU instruction? Sorting algorithm? A standard material is just a material tested to meet a standard. A 10d nail is a 10d nail if it meets the testing specs for 10d nails (ASTM F1667). Software can be tested against a spec. It's not rocket surgery.
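To make the nail analogy concrete, here is a minimal sketch (Python, purely as an illustration) of checking an implementation against a published spec, the same way a 10d nail is checked against ASTM F1667. The "spec" here is RFC 4648, which publishes exact test vectors for Base64, so conformance is a mechanical check.

```python
import base64

# Test vectors taken verbatim from RFC 4648, section 10.
RFC_4648_VECTORS = {
    b"": b"",
    b"f": b"Zg==",
    b"fo": b"Zm8=",
    b"foo": b"Zm9v",
    b"foob": b"Zm9vYg==",
    b"fooba": b"Zm9vYmE=",
    b"foobar": b"Zm9vYmFy",
}

def conforms_to_spec(encode) -> bool:
    """Return True iff `encode` matches every published test vector."""
    return all(encode(plain) == expected
               for plain, expected in RFC_4648_VECTORS.items())

# The stdlib encoder passes; a broken one would not.
assert conforms_to_spec(base64.b64encode)
```

The same pattern (published vectors, mechanical check) exists for hashes, TLS, compression formats, and most protocols worth shipping.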
No known practices with acceptable results?? Ever heard of OWASP? SBOMs? Artifact management? OIDC? RBAC? Automated security scanning? Version control? Code signing? Provenance? Profiling? Static code analysis? Strict types? Formal proofs? Automated testing? Fuzzing? Strict programming guidelines (ex. NASA/DOD/MISRA/AUTOSAR)? These are things professionals know about and use when they want standard acceptable results.
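As a toy illustration of just one item on that list, here is a minimal fuzzing sketch. `buggy_parse` is an invented stand-in for vendor code, not anything real; the point is that a few thousand random inputs find the unhandled case automatically, with no human reviewer involved.

```python
import random
import string

def buggy_parse(line: str) -> tuple[str, str]:
    """Parse 'key=value'. Buggy: assumes '=' is always present."""
    key, value = line.split("=", 1)  # raises ValueError when '=' is absent
    return key, value

def fuzz(target, runs: int = 5000, seed: int = 0):
    """Throw random strings at `target`; return the first crashing input."""
    rng = random.Random(seed)
    alphabet = string.ascii_letters + string.digits + "="
    for _ in range(runs):
        candidate = "".join(rng.choices(alphabet, k=rng.randint(0, 12)))
        try:
            target(candidate)
        except Exception:
            return candidate  # found an input the author never anticipated
    return None

crash = fuzz(buggy_parse)
assert crash is not None        # the fuzzer finds a crashing input
assert "=" not in crash         # and it is exactly the unhandled case
```

Production fuzzers (AFL, libFuzzer, OSS-Fuzz) add coverage guidance and corpus minimization, but the principle is this simple.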
What are you talking about re: space shuttle and tens of millions? Have you actually read the coding standards for Air Force or NASA? They're simple, common-sense guidelines that any seasoned programmer would agree are good to follow if you want reliability.
I think the problem here is there's too many armchair experts saying "Can't be done" when they don't know what they're talking about, or jaded old fogeys who were on some horrible government project and decided anything done with rigor will be terrible. That's not the way it is in the trades, in medicine, in law, and those folks actually have more to think about than software engineers, and more restrictions. I think SWEs are just trying to get out of doing work and claiming it's too difficult, and the industry doesn't want to stop the free ride of lack of accountability it's had for decades.
AI is going to introduce 100x more security holes than before, so something will have to be done to improve security and reliability. We need to stop screwing around and create the software building code, before the government does it for us.
> What are you talking about re: space shuttle and tens of millions?
GP was almost certainly referring to "They Write the Right Stuff," an old article that is pretty well known in spaces like this. It discusses a process that (a) works extremely well (the engine control software was ~420 kLoC with a total of 17 bugs found in a window of 11 versions) and (b) is extremely expensive (the on-board shuttle software group had a budget of ~35 million per year in mid-90s dollars).
> The analogy does work. The house is any software provided by any vendor.
Even before we start, you immediately have a problem. When a house is built, the thing to be inspected is built in the jurisdiction requiring the inspection.
If you have some code being written in China or India and some US jurisdiction wants to require the sort of programming practices you're suggesting, is the US going to send inspectors to other countries? How do they even validate that the processes are being followed either way? And what are you proposing to do with all the existing code that was written in the past? Requiring the company to have a checklist included in their book of procedures that nobody is actually following doesn't solve anything.
The way this nominally works for building inspections is that the inspector waits until after the work is done and then inspects the work, but that's a validation of the result rather than the procedures. The equivalent for code is an audit, which is dramatically more labor intensive for the government than sending someone to have a quick look to see if the wires appear to be hooked up right, if you expect it to actually do anything.
> I think the problem here is there's too many armchair experts saying "Can't be done" when they don't know what they're talking about
There are too many armchair experts saying "if they can land a man on the moon then surely they can land a man on the sun."
> That's not the way it is in the trades, in medicine, in law, and those folks actually have more to think about than software engineers, and more restrictions
First notice that you're listing all the professions where costs are out of control and the incumbents have captured the regulators to limit supply.
On top of that, those regulations are not even effective in solving the analogous problem. For example, the ethical requirements for lawyers nominally require them to do the thing public defenders aren't provided with the ability to do, i.e. spend enough time on the case to give the client adequate representation. Public defenders are given more clients by the state than they have the resources to actually represent. Quite unsurprisingly, this utterly fails to solve the problem of indigent defendants not having adequate representation.
But that's the thing most analogous to what you're proposing. If you nominally require companies to do something they otherwise have no real incentive to do, which you have no efficient way of verifying that they've done, and provide them no additional resources to do it, you can't expect "they will now do it well" to be the result.
> I think SWEs are just trying to get out of doing work and claiming it's too difficult, and the industry doesn't want to stop the free ride of lack of accountability it's had for decades.
What makes you think the software developers are the ones objecting to it? They, and the incumbent companies trying to raise costs on smaller upstarts, are the ones trying to establish a new racket and exclude newcomers from the industry. The ones objecting are the customers, and anyone who values efficiency and efficacy.
> We need to stop screwing around and create the software building code, before the government does it for us.
"We need to stop screwing around and create the Torment Nexus, before the government does it for us."
I mean, this is still a semi-BS response in your case, even if you don't realize it.
Many of these devices have security flaws that are horrific and out of best practices by over a decade.
Just having something like "Have a bonded 3rd party security team review the source code and running router software" would solve around 95% of the stupid things they do.
> Just having something like "Have a bonded 3rd party security team review the source code and running router software" would solve around 95% of the stupid things they do.
It would certainly help, but no economically feasible amount of auditing and best practices could lead to having a warranty on that software. My thesis is that our current understanding of software is fundamentally weaker than that of practical applications of electricity, so it makes no sense to present analogies between the two.
> So, we don't need an electrical code to enforce correct wiring.
Are you familiar with how the actual electrical code works? It's a racket. The code is quite long and most inspectors don't know most of it, so only a small subset is ever actually checked, and only in the places where the person doing the work is actually pulling permits and the local inspector isn't corrupt, or lax in the areas local tradespeople have learned they can get away with. Then we purposely limit the supply of licensed electricians so that they're expensive enough that ordinary people can't afford one, so the handyman from Craigslist or whatever, who isn't even allowed to pull permits, is the one who ends up doing the work.
It only basically works because no one has the incentive to purposely burn down your house and then it only happens in the cases where the numerous violations turned out to actually matter, which is rare enough for people to not get too upset about it.
But the thing that makes it a racket is making the official process expensive on purpose, to milk the wealthy homeowners and corporations who actually use it. That same expense is what drives ordinary people to someone who charges a price they can afford, even knowing there will then be no inspection.
> Then that kind person can inform the company of the bad wiring.
The point is rather that when the homeowner discovers that their microwave outlet is heating up, they can fix it themselves or hire an independent professional to do it instead of the company that built the house (which may or may not even still exist) being the only one who can feasibly cause it to not stay like that until the house is on fire.
I mean, if you could download an update that would fix the wiring in your house, it would be much less critical that the initial installer got it right. (Still much more important than your router, though; it doesn't stop being an electrocution hazard during the un-updated period.)
Trying to make analogies from software to hardware will always fall down on that point. If you want to argue that there should be stricter security & correctness requirements for routers, maybe look more toward "here is how people actually treat them in practice" with regard to ignoring updates...?
> I mean, if you could download an update that would fix the wiring in your house, it would be much less critical that the initial installer got it right
As in my example, some random stranger needs to first find out that your "house" (the vendor's software) is wired wrong. And this needs to happen for every "house" (every piece of software). While waiting for this to be discovered, your house burns down (hackers penetrate millions of devices, or perhaps just the Microsoft SharePoint the government uses).
> What you need is the ability for consumers to replace the firmware.
I don't think that's enough. Most people aren't going to replace the firmware on their device with an open source replacement made by someone else. Now if the firmware was required to be open source, and automatic updates could be seamlessly switched over to a non-profit or government agency in the event of the company going out of business, you might have something. But there would be a lot of details to work out.
I have a PC hooked up to my TV in my living room that has been running the latest version of Kubuntu for over 18 years now. It has had many upgrades in that time but it's still the same basic hardware: A CPU, some memory, USB ports, a video card, and an ethernet port on the back.
That "genericness" is what's missing in the router space. Literally every consumer router that comes out has some super proprietary design that's meant to be replaced in its entirety in 3-4 years. Many can run Linux, sure, but how many have a replaceable/upgradable board? How many are like a PC where you can install whatever OS you want?
Sure, you can forcibly flash a new OS (e.g. OpenWRT) but that is a hack. The company lets you do that because they figure they'll get a bit more market share out of their products if they don't lock the firmware so much. The key point remains, however: They're not just hardware—even though they should be!
The world of consumer routers needs a PC-like architecture change. You can buy routers like this from companies like Banana Pi and MikroTik, but they're not marketed towards everyday consumers. Mostly because they're considered "too premium" and require too much expertise to set up.
I think there's a huge hole in the market for consumer-minded routers that run hardware like the Banana Pi R4 (which I have). When you buy it, you get the board and nothing else. It's up to you to get a case and install an OS on it (with OpenWRT, Debian, and Ubuntu being the normal options).
We need something like the Framework laptop for routers. Not from a, "it has interchangeable parts" perspective but from a marketing perspective. Normal people are buying Framework laptops because geeky friends and colleagues recommend them and they're not that much more expensive/troublesome than say, a cheap Acer/Asus laptop.
> The key point remains, however: They're not just hardware—even though they should be!
This is the most thoughtful comment I've seen on this topic. I hadn't even considered this approach, but you're right. The hardware needs to be commoditized in a way that makes the software a layer that can be replaced. Someone else said this but in a way that described flashing a third-party package as HN nerds would. That's too much effort and it won't work.
It should be as generic as PC hardware. Every router manufacturer should build devices that can run the OSes of all their competitors' devices and vice versa. Maybe some features won't work with the other company's OS cause it isn't designed for that, but overall it ought to be replaceable. "Normal people" still wouldn't flash a new OS, but making it an option is a step towards making devices more secure.
If every router could get a new OS as easily as your techy friend could install Firefox or an ad-blocker or whatever else, we'd start the long march to a real longterm solution.
You completely missed the point of what I said. I have a Linksys as a cheap backup in case my real router (Netgate / pfsense) dies. The Linksys is running OpenWRT and hopefully I'll never need to plug it in ever again.
I had to verify that OpenWRT was compatible when I bought it _to be a backup_. Re-read what I said about everything being commodity hardware that can run any other device firmware / OS.
It's not so simple. Routers, like most tech emitting and modulating an RF signal by design, are certified products. The radio frequency bands, output power, allowed channels are all tightly controlled. Allowing end-users control without restrictions over such equipment would be unsafe.
It's quite different. The transceiver in your device is mainly a low-power receiver, transmit power is limited to ~100mW at best. Meanwhile a typical AP can go up to 1W per antenna for transmit. Also, the firmware that operates the wifi stack on your network card is not open source or user-modifiable beyond firmware updates issued by the manufacturer. I suggest reading up on wifi and RF before going further.
> I suggest reading up on wifi and RF before going further.
I'd suggest neither matter in the face of how the problem is solved in the consumer cards the OP was talking about. They solve it by locking down the firmware that controls the radios.
The reality is most routers do that too. You can replace the firmware in most of them with OpenWRT or something similar. You still can't exceed regulatory limits because of the signed blobs of firmware in the radios.
Nonetheless, here we are getting comments like yours, which imply all firmware in the device must be behind a proprietary wall because a relatively small blob of firmware in them must be protected. It has its own protections. It doesn't need to be protected by the OS or the application that runs on top of it.
Yet it's in those applications where most of the vulnerabilities show up. Making them consumer replaceable would help in solving the problem. Protecting the firmware is not a good reason to not do it.
I was responding to the original post about open standards. My point is that anything with an RF transceiver will never be as open as a standard PC with replaceable components. The radio portion will always be blocked off. That relatively small blob will always limit how much control you can exert over the device.
We don't have to look far. The embedded space with Arduinos, ESP32s and even RPis is a hacker's paradise. Yet the radio stack is restricted in all of them. For instance, it's not possible to take an ESP32 board and turn its single antenna into a MIMO configuration, even if you make a custom PCB with trace antennas.
My point is that anything with an RF transceiver will never be as open as a standard PC with replaceable components. The radio portion will always be blocked off.
sure, but again, why would the RF transceiver on my desktop PC or in my laptop be any different than the one in my router?
If you make something internet connected, you must provide a lifetime warranty for security: no imports or sales (or even leases) until you have in escrow the money to pay for it.
I will allow sunsetting and removing IPv4 after 2020 (that is more than 5 years ago).
The concept of community firmware seems like a huge cop-out that allows companies to externalize costs. And it probably won't help security because 99% of devices will never get the third-party firmware installed anyway.
If they were trying to save costs they would ship the community firmware on the device to begin with because then they wouldn't have to write and maintain their own. The community welcomes them to externalize those costs onto the people with better incentives to improve the software.
What they're actually trying to do is obsolete the devices faster because then they won't add new protocols or other software-only features to older devices so you have to buy a new one, or only expose features in more expensive models that the less expensive hardware would also be capable of doing. Which is all the more reason for us to not have that.
And if they were required to allow anyone to replace the firmware then you would get companies reflashing and selling them that way from the store because the free firmware has more advertisable features. There's a reason you can go to major PC OEMs and pick between Windows, Linux and "don't even install one" and the reason is that if you give customers a choice, they generally don't want their software to be made by the OEM.
> What they're actually trying to do is obsolete the devices faster
This is exactly why. Obsoleting older devices keeps (in their eyes) the purchase treadmill running. Making a device that could be updated forever means never making another sale to that user (unless some physical failure happens, or the user wants a second one).
It could be part of dissolution of the company to mandate community firmware. But it depends on their licenses…
Anyhow, this is a common enough practice. Many companies that provide infrastructure-type software and sell to Fortune 500 companies have a clause whereby they deliver their software to their customers if they shut down.
We don't care about their licenses; that's their problem. If they need firmware with a license that allows them to redistribute it there are plenty of free ones to choose from.
And you can't wait until after they're dead to have them do something. By then they're gone or judgment proof because they're already bankrupt. Especially when you're talking about companies that aren't in the jurisdiction because you can't even make them do anything when they're already not shipping products to you anymore. It has to be from Day 1.
There was a promising design from Azure Sphere for 10 years of IoT device Linux security updates from Microsoft, even if the IoT vendor went out of business. This required a hardware design to isolate vendor userspace code from device security code, so they could be updated independently. Could be resurrected as open standard with FRAND licensing.
The main thing you need is for the lowest-level code to be open and replaceable/patchable because it's the only part which is actually specific to the device. Windows running on coreboot is a better place to be than custom Linux running on an opaque blob, because in the first case you can pretty easily get to newer Windows, vanilla Linux or anything else you want running on coreboot after the original version of Windows goes out of support, and you can update coreboot, whereas the latter often can't even get you to a newer version of Linux.
Modern coreboot depends on opaque blobs on CPU (FSP/ACM on Intel) and auxiliary processors (ME/PSP), but AMD is moving in the right direction with OpenSIL host firmware. Arm devices have their own share of firmware blobs.
A decade of security updates for routers would require stable isolation between low-level device security and IoT vendor userspace. In Sphere, the business model for 10 years of paid updates was backed by hardware isolation. Anyone know why it didn't get market traction? There was a dev board, but no products shipped.
Oh gee. Maybe because no one sane looks at an industrial product adversarially built to confine the end user and prevent them from doing anything to it and wants anything to do with it? It isn't rocket science. If I can't buy it and get a damn manual and programming tools to twiddle all the bits, I'm not adopting. Not even at gunpoint, or if you're the last supplier on Earth. I won't be held voluntarily hostage because a bunch of corporate types and bureaucrats decided to work together to normalize adversarial silicon. Multiply by everyone I know, and anyone with enough brain cells to rub together to pattern match "regulatory capture" and "capitalist rent seeking". You can call me a bore if you want.

The incentives are completely unaligned, as this place is so fond of saying. End-user adoption is built on faith in the product, and that faith rests on the ability of the technically savvy purchaser to keep the thing running, repair it, understand it, and explain it to the non-technically savvy. When I look at adversarial silicon isolating me from the hardware, I have to sound off my rocker to my non-tech-savvy friends and family to explain that yes, there really are industrial cabals out to keep you from doing things with the thing you bought.
It doesn't make any business sense, or practical sense whatsoever. Don't bother quoting regulations that demand the isolation (baseband processors and radio emission regulations) at me. Yeah. I know. I've read those too.
Get over business models that require normalized game theory, and we can talk. Until then, enjoy never having nice things catch on. Hint: your definition of "nice" (where I can't control how it works after purchase) is mutually exclusive with things I'm willing to syndicate as "nice". Nice people don't manipulate others.
> If I can't buy it and get a damn manual and programming tools to twiddle all the bits, I'm not adopting.
Hence the isolated device security hardware should be an open standard with FRAND licensing. If devices ship with a prepaid commercial license for 10 years of device security updates from BIG_CO, the default commercial baseline would be raised independent of IoT vendors. Tech-savvy users could then have the option to replace the device security layer with the OSS _or_ competing commercial stack of their choice.
It might also be illegal. I don't know about the US, but forcing a bankruptcy to avoid regulations is usually frowned upon by the court system here. So putting a product in a child dummy corp that goes poof when you want while the parent stays afloat usually puts the parent directly in the line of fire, and you are screwed either way.
It is possible to require escrow accounts (to cover the costs of fixing future security issues); these survive bankruptcy. They need to be big enough to cover the costs though; insurance can calculate this, but it isn't cheap.
The obvious problem with that is the detriment to smaller companies, but it makes a good alternative to releasing the code.
Then if you're a five person shop making routers and you publish the firmware source under a license that allows anyone to make and distribute modifications you're all set. And if you're Apple or Microsoft and you want to make a router without publishing the source code, you post the enormous bond which you have no trouble doing because you're an enormous company and you're all set.
The typical IoT company not surviving the typical lifecycle of their products shows that IoT is a seriously dorked up idea. Anybody deploying them who values security should choose products that can be updated even after the vendor is gone.
> How are you going to use a warranty from a company that no longer exists to get a security update for a product a million consumers still have?
> The typical IoT company not surviving the typical lifecycle of their products shows that IoT is a seriously dorked up idea. Anybody deploying them who values security should choose products that can be updated even after the vendor is gone.
But then many consumers value cost or other things over security, which is why you need all the devices to be able to be updated even after the vendor is gone.
> I was not talking about using warranty for this.
Then why are you talking about a warranty to begin with?
> But then many consumers value cost or other things over security, which is why you need all the devices to be able to be updated even after the vendor is gone.
This is only possible if the firmware is replaceable. Along with a practical update mechanism it also requires the possibility to create an update package. That can be achieved by using open source components, but there might be other mechanisms. For example making provisions in case of bankruptcy.
> Then why are you talking about a warranty to begin with?
I was making a comparison with warranty law, which exists to ensure a certain minimum bar for quality and longevity of products. Which is usually desired, therefore legal provisions for updateability of hardware should also be required. Note that a firmware update might well become required within the warranty period.
This is by no means a new concern. IP cams, home routers, robot vacuums, and internet-enabled fridges exist for a long time already. The warranty period was never intended to cover "smart" devices. Maybe forcing an extension of the warranty period for such devices is enough to take care of the problem.
Why not just put the onus on ISPs? 99% of users lease their router from their ISP. If updates stop after three years, looks like you're getting a complimentary service appointment to get a new router.
> What you need is the ability for consumers to replace the firmware.
> That solves the problem in three ways.
That alleviates the problem, but definitely doesn't solve it. Updates are still required, and most people will never update devices they don't directly interact with.
Which introduces new security risks. More importantly, the consumer has to configure the device to use the open source firmware and set up automatic updates, unless the manufacturer pushes the new firmware to all of its customers automatically, which seems very unlikely.
How? The device phones home to the manufacturer's servers to get new updates. Manufacturer goes out of business, servers get shut down. How does it know where to get updates now?
> Manufacturer goes out of business, servers get shut down.
Continue your chain of reasoning: DNS name becomes unmaintained, gets grabbed by open source / foundation / gov agency, pushes open source firmware update.
The government obviously cares less about citizens running firmware China can hack than it does about citizens potentially running firmware the government can't hack.
Somebody has to pay for the support. There is no free meal.
Enterprise must be able to pay for support for as long as they use devices. Solved.
I can only think of requiring the devices to be serviceable, as you say. The only way I can think of to charge the consumers, i.e. the owners, is a tax on internet connections. Then the government would somehow pay vulnerability hunters working alongside patchers, who can oversee each other.
Consumers are tricky: if you include support in the sale price, the company will grab the money and run in 3 or 5 years; and some companies will sell cheaper because they know they won't provide support.
> Somebody has to pay for the support. There is no free meal.
The problem is not that people need a free meal. The problem is that people need the ability to eat some other food when the OEM's restaurant is closed or unsatisfactory.
> Who creates and regularly keeps the firmware for the dozens and dozens of router models secure and up-to-date?
99% of the firmware is not actually device specific, and more to the point no one has to create it because it already exists and is already maintained. You don't have to write the Linux kernel from scratch for every different device.
The problem looks like this: The vendor creates an opaque blob that runs on part of the device. This is only 1% of the code that runs on the device but it's the device-specific part. Moreover, that code interacts with the kernel, but was written to assume a specific older version of the kernel which is now out of date.
Updating it to use a newer kernel requires very little work if you have the source code -- in that case much of it is just automated refactoring -- but without the code it becomes a much more arduous reverse engineering effort. Likewise, if the device-specific code has a bug and you have the source code, the cause of the problem is easier to identify and the fix is to change two lines and recompile it. But without the code just identifying the problem becomes an intensive reverse engineering task again.
So you have a community which is willing and able to do a finite amount of work. Some subset of the device owners are programmers and if they can spend two hours fixing a problem that they themselves have, it gets fixed for everyone. But if fixing the same problem takes them two months, they don't. Therefore, the solution is to do the thing that allows it to take the shorter amount of time so that it actually happens.
It would be ideal if we could come up with a way to get people paid to maintain a community firmware. However, that's a considerably harder problem than "you absolutely must allow community firmware to be flashed".
I agree. It's a harder problem and it's the more critical problem.
Businesses aren't incentivized to maintain it and hoping that the community can support it by opening it is perhaps necessary, but it's far from sufficient.
Either the business or maintainers need to be sufficiently incentivized--whether it's through funding, reputation, or something else (graduate-student torture).
I mean, the OEM would make the device upgradeable, the government would pay independent bounty hunters and patchers and push out the updates, and then consumers would pay for all of that.
> But then vendors want to stop issuing them after 3 years
Tough shit. You provide updates for the mandated amount of time, or you lose access to the market. No warnings, you're just done.
> And "require longer support" doesn't fix it because many of the vendors will go out of business.
Source code escrow plus a bond. The bond is set at a level where a third party can pay engineers to maintain the software and distribute updates for the remainder of the mandated support period. And as time passes with documented active support, the bond requirements for that device go down until the end of the support period.
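As a back-of-the-envelope sketch of how such a declining bond might work (the linear schedule and the numbers are invented for illustration, not taken from any actual regulation):

```python
def required_bond(initial_bond: float,
                  support_years: int,
                  years_supported: float) -> float:
    """Hypothetical escrow bond that declines linearly as the vendor
    demonstrates active support, reaching zero at the end of the
    mandated support period. All parameters are illustrative."""
    if years_supported >= support_years:
        return 0.0
    remaining = support_years - years_supported
    return initial_bond * remaining / support_years

# A $150k bond over a 10-year mandate, after 4 years of documented updates:
print(required_bond(150_000, 10, 4))  # -> 90000.0
```

The point of the declining schedule is that a vendor which actually ships updates gets its capital back over time, while a vendor that folds early leaves enough in escrow to fund a third party for the remaining years.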
Requiring that the customer be allowed to replace the firmware is essential, I agree, but not for this reason. That requirement, by itself, just externalizes the support costs onto open source communities. Companies that sell this sort of hardware need to put up the resources, up front, irrevocably, to ensure the cost of software maintenance is covered for the entire period.
Personally I don't buy consumer router hardware that I can't immediately flash OpenWRT on, but that option is not suitable for the general public.
How does this help? 99% of the population aren't technically minded enough. Most people just buy a wifi router, plug it in (maybe having read the instructions) and that's it. They have neither the skills nor the inclination to update firmware.
The real problem is: assuming the firmware can be updated, how do you run a nationwide update programme when the population neither cares nor has the skills to do it?
Vehicle safety standards (mandated annual safety checks like the UK MoT test) are the closest analogy I can think of - in the UK you can't insure your car without a valid MoT. If you were serious, then maybe tying ISP access to updated router firmware would be the way to go.
Customers notice higher prices at time of purchase a lot more than they notice a lack of future security updates, so good luck selling them for that price when someone else just puts an existing open source firmware on the existing hardware and sells it for the existing price.
"You ship something with no known bugs and then someone finds one."
You managed to say that with a straight face!
Let's keep this ... non-partisan. You might recall that many vendors have decided to embed static creds in firmware and only bother to patch them out when caught out.
How on earth is embedded creds in any way: "no known bugs"?
I think we are on the same side (absolutely) but please don't allow the buggers any credibility!
> How on earth is embedded creds in any way: "no known bugs"?
You misunderstand how organizational knowledge works. You see, it doesn't.
Someone embeds the credentials; someone else ships the product. The first person doesn't necessarily even still work there at that point.
Remember that time NASA sent the Mars Climate Orbiter to Mars and then crashed it on arrival because one team was working in pound-force units and the other in newtons? Literally rocket scientists.
The best we know how to do here is to keep the incentives aligned so the people who suffer the consequences of something can do something about it. And in this case the people who suffer the consequences are the consumers, not the company that may have already ceased to exist, so we need to give the consumers a good way to fix it.
"Spectrum quickly learned that far more had gone wrong than just a units conversion error. A critical flaw was a program management grown too confident and too careless, even to the point of missing opportunities to avoid the disaster.
"As reconstructed by Spectrum, ground controllers ignored a string of indications that something was seriously wrong with the craft's trajectory, over a period of weeks if not months. But managers demanded that worriers and doubters "prove something was wrong," even though classic and fundamental principles of mission safety should have demanded that they themselves, in the presence of significant doubts, properly "prove all is right" with the flight.
Plus, navigators had concerns about the trajectory, which were dismissed because they "did not follow the rules about filling out [the incident surprise and analysis procedure] form to document their concerns" - from a trajectory team which was understaffed and overworked.
> When you are building software, you build a security process, not security individuals or stuff like this happens.
You can't solve an incentive problem with process because then they lack the incentive to follow the process.
To enforce a law you need to be able to identify a violation at a point in time when you can still impose a penalty for it. When a device is first released, you don't yet know if anyone will find a vulnerability in it or if the company will stay around to update it if they do. By the time you find out if it will happen, you can't punish them for the same reason they can't provide updates: they've ceased operations and no longer exist. So that doesn't work.
> With software writers the losses occur to the end user.
Which is why the end user needs to be empowered to efficiently prevent the losses, since they're the one with the strongest incentive to do it.
>And "require longer support" doesn't fix it because many of the vendors will go out of business.
Do you mean 'out of business so they cannot provide updates'?
Because, if you mean cheap companies won't be able to provide updates and stay in business, surely that's the point. Companies would have to shim to a standardised firmware that was robust, or something, to keep costs down.
Isn't this all to protect USA business interests and ensure the Trump regime can install their own backdoor though?
> Secure software won't protect you from insecure hardware
Then what's KPTI etc.?
> which also needs to be formally verified for a secure system.
Now we just need a correct and complete theory of quantum mechanics and to do something about that Heisenberg thing.
In general formal proofs tell you if something is true given a stipulated set of assumptions. They don't tell you if one of the stipulated assumptions is wrong or can be caused to be wrong on purpose by doing something nobody had previously known to be possible.
You're being sarcastic, right? The entire concept of "guaranteed to be secure" is a fantasy.
Even EAL7 can't guarantee anything. It can only say that the tools used for verification didn't find anything wrong. I'm not saying the tools are garbage, but the tools were made by humans, and humans are fallible.