
The opinionated approach feels restrictive to me. My best recommendation for avoiding slowness, privacy violations, and other nasty things is to not include certain features, as opposed to eliminating JavaScript entirely.

For example, if we know that large SPA frameworks and/or slow websites require convention A, then simply do not include support for convention A. This is a fail-by-design approach instead of a blacklist approach.

Here are some things to block to radically increase performance:

* string parsing: innerHTML, querySelector/querySelectorAll, console.log

* allow a quota of load-time requests, then stop accepting HTTP requests until a person interacts with the page (or just break the page). If you set the quota at 10, then any page with more requests than that simply stops loading. That alone would eliminate 99% of spyware and dramatically shift user behavior.

* drop requests for JavaScript from origins other than the page's; this would improve both performance and privacy
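The quota idea above can be sketched as a simple gate that the network layer consults before issuing each request. This is a hypothetical illustration, not a real browser API; the class name, the default quota of 10, and `markInteraction` are all invented for the sketch.

```javascript
// Hypothetical request gate: allow a fixed quota of requests during page
// load, then block further requests until the user interacts with the page.
class RequestGate {
  constructor(quota = 10) {
    this.quota = quota;          // requests allowed before first interaction
    this.count = 0;              // requests issued so far
    this.userInteracted = false;
  }

  // The browser would call this on the first click, keypress, or scroll.
  markInteraction() {
    this.userInteracted = true;
  }

  // The network layer asks this before every HTTP request.
  allow() {
    if (this.userInteracted) return true;       // quota no longer applies
    if (this.count >= this.quota) return false; // page "just stops loading"
    this.count += 1;
    return true;
  }
}

// Example: a page that fires 12 requests during load.
const gate = new RequestGate(10);
const results = Array.from({ length: 12 }, () => gate.allow());
console.log(results.filter(Boolean).length); // 10 allowed, the last 2 dropped
gate.markInteraction();
console.log(gate.allow()); // true -- interaction lifts the quota
```

Note the design choice: the quota resets permanently on first interaction rather than per-request, which matches the comment's framing that pages earn further requests only once a person engages with them.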

The biggest thing to help with privacy is to not support CORS. That will do more than eliminating JavaScript.
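The cross-origin restrictions described above reduce to an origin comparison, which the standard `URL` API does directly. The function name and policy here are illustrative, not part of any browser; this is only a sketch of the check such a browser might perform.

```javascript
// Hypothetical policy: drop any script request whose origin differs from
// the origin of the page that issued it.
function allowScriptRequest(pageUrl, scriptUrl) {
  // URL.origin is scheme + host + port, so http vs https or a different
  // subdomain both count as cross-origin.
  return new URL(pageUrl).origin === new URL(scriptUrl).origin;
}

console.log(allowScriptRequest(
  'https://example.com/article',
  'https://example.com/js/app.js'
)); // true -- same origin, request proceeds

console.log(allowScriptRequest(
  'https://example.com/article',
  'https://tracker.example.net/spy.js'
)); // false -- third-party script, request dropped
```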

These things are still highly restrictive but much less so than a blacklist approach.



> string parsing: innerHTML, querySelector/querySelectorAll, console.log

I don't have numbers, but I doubt an in-browser JS implementation without these APIs would be useful on many websites. Even HN uses innerHTML.


Feels much harder to both implement and debug, though.


That completely misses the point. Privacy and performance advocates don't care how hard life is for JavaScript developers. Making JavaScript developer convenience the top priority is what produces the very slow, privacy-violating sites that browsers like this exist to avoid.


No, I mean for the developer of the web browser.

I have worked on a web browser. It's fricking hard. Remove JavaScript and DOM entirely, and it gets much easier.

Now if you start to implement select parts of the spec, it becomes even harder: when something fails, you have to work out whether it's a feature (because you chose not to support that part of the spec) or a bug (because you misunderstood or misimplemented a part you did want to implement correctly).


I agree with that about 90%. The standards do not come as a single universal package; browsers cherry-pick which features to support, and which to drop, as they release new versions. The challenge is picking features to support while ensuring they work independently and without regression. That takes the kind of testing the incumbents have spent years developing.


It's a whitelist.



