
We're still using hypertext transfer protocol, right? You need to send hypertext down the wire.

We shouldn't let one company, Google, dictate how the web works, simply because of their proprietary technological innovation.



Roads are meant for horses, right? We shouldn't let one company, Ford, dictate how the roads work.

My man, if we only used infrastructure and technology in the way it was originally intended and narrowly imagined, the world would be a dim place.


Ok, I see your point.

But have Yahoo or Bing or DuckDuckGo made the transition to crawling the web with a full JS & DOM rendering engine? I doubt it. By eschewing that compatibility we're setting a very high bar for what any competitor to Google would have to achieve.

I like Google. I just don't think it's good to have one company own a market so completely.


I agree that diversity is important, but if Google is owning the market via innovation, they deserve to own the market.

In this case, they're closing an ever widening rift between human consumable content and that which is targeted for search engines. The data that comes down the pipe can still be semantic, it's just glued together differently.

Web technology is presently outpacing the ability of indexing services. The onus is on Google's competitors now; they need to catch up. Luckily, JS rendering on the server isn't an academic problem; it's one of resource allocation.


But, have Yahoo or Bing or DuckDuckGo made the transition to be able to crawl the web with a full JS & DOM rendering engine?

They can just use PhantomJS (http://phantomjs.org/), which is free and open source.
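To make the suggestion concrete: PhantomJS exposes the fully rendered, post-JS DOM through `page.content`, and a crawler would then extract indexable text from that HTML. Below is a minimal sketch of both halves; the worker script is only shown as a string (it needs the `phantomjs` runtime to actually run), and the tag-stripping extractor is a deliberately crude illustration, not production HTML parsing.

```javascript
// What a PhantomJS worker script looks like (run as: phantomjs render.js URL).
// It loads the page, lets client-side scripts run, then prints the rendered DOM.
const PHANTOM_RENDER_SCRIPT = `
var page = require('webpage').create();
var url = require('system').args[1];
page.open(url, function (status) {
  if (status !== 'success') { phantom.exit(1); }
  // Give client-side JS a moment to build the DOM before snapshotting it.
  setTimeout(function () {
    console.log(page.content); // the post-JS HTML, not the raw source
    phantom.exit(0);
  }, 1000);
});
`;

// Once the rendered HTML comes back, the indexer mostly wants the text.
// A crude tag-stripper is enough to sketch that step:
function extractText(renderedHtml) {
  return renderedHtml
    .replace(/<script[\s\S]*?<\/script>/gi, ' ') // drop executable code
    .replace(/<[^>]+>/g, ' ')                    // drop remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}
```

The point is that nothing here is secret sauce: the rendering engine is open source, and the rest is ordinary text processing.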


They could, but I wonder what it'd take to scale it to crawling that number of pages.

I think only Bing would have the cash and resources to build that.
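The scale question is worth a back-of-envelope calculation. Every number below is an assumption for illustration (render time, pages per day, workers per machine are all made up), but it shows why this is a budget problem rather than a research problem:

```javascript
// Back-of-envelope: how many machines does a headless-rendering crawl need?
// All inputs are illustrative assumptions, not measurements.
function machinesNeeded(pagesPerDay, secondsPerRender, workersPerMachine) {
  const secondsPerDay = 24 * 60 * 60;
  const rendersPerMachinePerDay = (secondsPerDay / secondsPerRender) * workersPerMachine;
  return Math.ceil(pagesPerDay / rendersPerMachinePerDay);
}

// Say a crawler wants to re-render one billion pages a day, each render
// takes ~5 seconds, and a machine runs 50 headless instances in parallel:
const fleet = machinesNeeded(1e9, 5, 50);
// fleet comes out on the order of a thousand machines.
```

A four-figure machine count is a serious expense, which supports the point that only a well-funded competitor could afford it, but it is not beyond reach.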


I don't really like Google, and I agree.


Google is not dictating anything, so there is no need to vent your dislike of Google in that way. If anything, this releases developers from a constraint on how sites are built.

That said, I'd rather see more server-side HTML instead of client-side JS when it comes to the web. If you're developing a game, client-side JS is fine. If you're serving textual content with the odd image, please serve it in the way the web was won: HTML. Use CSS if you feel the need for some 'style', but remember that perfection is reached when there is nothing left to take away, rather than nothing more to add.
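The server-side approach the parent is advocating can be sketched in a few lines: the server turns its data into finished hypertext before anything goes down the wire, so the client needs nothing but an HTML renderer. The names here are illustrative, not from any particular framework:

```javascript
// Sketch: article data rendered into HTML on the server, so the client
// receives finished hypertext rather than JS that builds it.
function renderArticle(article) {
  // Escape the handful of characters that are significant in HTML text.
  const esc = s => s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
  return [
    '<article>',
    `  <h1>${esc(article.title)}</h1>`,
    ...article.paragraphs.map(p => `  <p>${esc(p)}</p>`),
    '</article>'
  ].join('\n');
}
```

Serving this string with a `text/html` content type is the whole job; any client, crawler, or text browser can consume it without executing anything.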


If my product is well-formatted data, why should I build a server that needs to know how to vend that data in a machine-readable visual-formatting-agnostic format (such as JSON or XML) and also a targeted-for-human-consumption format such as HTML? It's a reasonable architectural decision to build one server that knows only how to vend JSON and an associated viewer (that happens to use HTML and JavaScript for presentation purposes) that knows how to consume and render that JSON.
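The split being described looks roughly like this in practice: one server-side function that vends a single machine-readable representation, and a separate viewer that decides how a human should see it. All names below are illustrative:

```javascript
// Server side: one representation, machine-readable, formatting-agnostic.
function vendProduct(product) {
  return JSON.stringify({ name: product.name, priceCents: product.priceCents });
}

// Viewer side: consumes the JSON and independently chooses a presentation.
// A native app or a different web frontend could consume the same JSON.
function renderProduct(json) {
  const p = JSON.parse(json);
  return `<div class="product"><h2>${p.name}</h2>` +
         `<span>$${(p.priceCents / 100).toFixed(2)}</span></div>`;
}
```

The appeal is that the server never learns anything about presentation; swapping the viewer (or adding a second one) requires no server changes at all.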


Because it pushes work onto the client, and the client has an indeterminate number of applications competing for the same indeterminate speed and size of CPU and RAM. You know exactly what your situation is on the server, and you can decouple data and presentation without putting them on different sides of the wire.


The counter-point is that (a) the number of applications competing for resources is semi-determinate, constrained by the user's patience and willingness to upgrade and (b) the number of users the server must support is similarly semi-determinate, constrained by the popularity of the service, size of data a user could want to manipulate, and the desired speed of those manipulations. There's an argument to be made for the economy of scale of pushing some of the presentation work to the side of the wire where there is a user who wants to consume the data; I wouldn't want to have to use, say, Google Spreadsheets as a server-side-only product.

Note that we can reduce this line of thinking to absurdity if we substitute JavaScript with, say, video; some clients are incapable of rendering video, some clients are constrained in the size of video they can render. Do we therefore never stream video down the wire, and require the client to grab the whole video element wholesale as a single request-response? Of course not; users don't want to wait that long. So developers make an educated guess on what their clients' user-agents can do and set a sane bar for size and scope of video content. A similar sane bar can be set for richness of client-side executable code interactions.


I'm not saying that you shouldn't architect that way, I'm just giving reasons why you might not.

>semi-determinate, constrained by the user's patience and willingness to upgrade

The user can vaguely determine it, but you can't. You can't guarantee a user experience without relying on the user not to be viewing your page on a two year old cellphone that they're simultaneously listening to music on and running three other js-heavy pages.

>the number of users the server must support is similarly semi-determinate, constrained by the popularity of the service, size of data a user could want to manipulate, and the desired speed of those manipulations.

You can assign whatever arbitrary amount of resources you want to your site on the server side, and allow in whatever limited (or unlimited) number of clients/visitors that you want to, so it's completely deterministic. The amount of resources utilized (whether on the client or on the server) will likely not be very deterministic, but you can have algorithmic problems whether on a thick client, on a backend data server, or on a backend presentation server.


Well said. Too many websites these days are JS heavy.


> We shouldn't let one company, Google, dictate how the web works, simply because of their proprietary technological innovation.

What? This article is about how a constraint due to a single company (and to a lesser extent, other search engines) is being _relaxed_. Before this, single-page apps were a riskier proposition because of many sites' reliance on Google traffic. Now that we're moving closer to that technical limitation being overcome, this is one _less_ constraint "dictated" by Google that site owners have to deal with.


Not to sound like a complete troll, but, uh, how should I send this GIF to the client?



