Echo JS 0.11.0

lyschoening 3702 days ago. link parent 1 point
I was referring to Google planning to execute the JavaScript on websites it crawls in the near future. If that becomes standard, as it soon might have to, it would solve the SEO issue. The search engines are under pressure to sort it out on their side lest they become blind to a big part of the web — and yes, they will sort it out.

The first render would almost always, though not necessarily, be slower on first load. (An empty page plus a simple JSON call plus DOM injection can use less bandwidth than static HTML, but JavaScript applications are becoming quite large these days.) After that first render, though, it would generally be much faster: JSON uses far less bandwidth than HTML, and you can do much more sophisticated rendering on the client than you could afford to do on a server, at no extra server cost.
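A minimal sketch of the empty-shell approach described above; the endpoint, element id, and field names here are hypothetical illustrations, not a specific API:

```javascript
// Client-side rendering sketch: the server ships an empty shell page,
// the client fetches a small JSON payload and injects markup itself.
// '/api/page.json', '#app', and the data fields are made up for illustration.
function renderArticle(data) {
  // Keeping rendering as a pure data -> markup function makes the
  // DOM-injection step trivial (and easy to test without a browser).
  return `<article><h1>${data.title}</h1><p>${data.body}</p></article>`;
}

async function boot() {
  const res = await fetch('/api/page.json'); // JSON instead of rendered HTML
  const data = await res.json();
  document.getElementById('app').innerHTML = renderArticle(data);
}
```

After the first load, only the JSON data crosses the wire on each navigation, which is the bandwidth saving the comment is pointing at.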

Replies

bevacqua 3702 days ago. link 1 point
Nobody disputes that client-side rendering is great after the first load. But that doesn't mean you can get away without doing server-side rendering on the first load.
lyschoening 3701 days ago. link 1 point
It does depend on the use case. If your site requires authentication, you can load all the code while the user logs in; in that case, SEO is already a non-issue as well.

It also depends on your market. There are certainly markets where you have to eke out every last bit of latency. But if your market is a country where nobody has a connection slower than 10 MB/s on desktop or phone, then a 1 MB codebase plus stylesheets, compressed to ~500 KB and served over SPDY/HTTP2, can be downloaded, decompressed, and running in under 250 ms. Why would you then ever go through the extra work of rendering on the server?
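A back-of-the-envelope check of those numbers, taking "10 MB/s" literally as megabytes per second (decompression and script parse time are ignored here):

```javascript
// Rough transfer-time estimate for a ~500 KB compressed payload
// on an assumed 10 MB/s link. Numbers are from the comment above.
const payloadKB = 500;   // compressed bundle + stylesheets
const linkMBps = 10;     // assumed link speed, megabytes per second
const transferMs = (payloadKB / (linkMBps * 1024)) * 1000;
console.log(transferMs.toFixed(1)); // ~48.8 ms of pure transfer time
```

That leaves roughly 200 ms of the 250 ms budget for decompression, parsing, and executing the bundle, so the claim is at least plausible on such a link.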