victorquinn comments

victorquinn 3723 days ago. link 1 point
Very interesting. I wrote something very similar that we've had in production for about a year now, called BatchRequest.

The docs: http://batch-request.socialradar.com
On Github: https://github.com/socialradar/batch-request
The Koa version: https://github.com/socialradar/koa-batch

A few differences I noticed right off the bat:

* multifetch appears to take the list of requests to make in the URL, while BatchRequest takes them as JSON in a POST body

* It seems multifetch only supports GET requests, while BatchRequest can perform GET, POST, PUT, DELETE, and PATCH

* BatchRequest has a dependency model, so you can specify that request A runs first, then B, and only after B finishes, C, to ensure they run in order (see the sketch after this list). It doesn't appear multifetch has this capability

* BatchRequest includes validation middleware to ensure the batch request itself is valid prior to firing it off

* multifetch seems to support only JSON, while BatchRequest can handle requests of any format, including no return at all or even XML *shudder*

* multifetch seems to make requests only to the same server, while BatchRequest can also make requests to external servers if desired (using a whitelist model, of course, so as not to open a massive security hole)
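
To give a sense of the shape, here is a rough sketch of what a batched POST body with a dependency chain might look like. The field names (url, method, body, dependency) and endpoints are placeholders from memory, not the documented format, so check the docs linked above for the real thing:

    // Rough sketch only -- the field names and endpoints are assumptions,
    // not the documented batch-request API.
    // Three named requests: "feed" waits on "user", and "stats" waits on "feed".
    var request = require('request'); // the plain npm "request" module

    var batchBody = {
      user:  { method: 'GET',  url: 'https://api.example.com/users/42' },
      feed:  { method: 'GET',  url: 'https://api.example.com/users/42/feed', dependency: 'user' },
      stats: { method: 'POST', url: 'https://api.example.com/stats',
               body: { event: 'feed_viewed' }, dependency: 'feed' }
    };

    // One round trip: POST the whole batch; the response body maps each
    // name ("user", "feed", "stats") to that request's individual result.
    request.post({ url: 'https://api.example.com/batch', json: batchBody }, function (err, res, body) {
      console.log(err || body);
    });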

Interestingly, their response bodies are almost identical. Also, entirely coincidentally, both are at version 0.1.1.

Not meant as a one-upping kinda thing, just pointing out some differences. I'm glad someone else also noticed a need for something like this!

victorquinn 3723 days ago. link 2 points
I mention it in the footnote (and, disclaimer, I'm a Node expert, not a networking expert, so I may be totally wrong), but one of the main benefits of SPDY is its pipelining: it opens a connection and leaves it open, rather than forcing a TCP handshake with every request. That helps mostly with multiple requests in rapid succession, and it's one of the main reasons to deploy SPDY at all.

If I were to terminate SPDY at the load balancer and then communicate over plain jane HTTP from the load balancer to our Node servers, the pipeline would be kept open between the client device and the load balancer, which is good. But there would be a TCP handshake between the load balancer and the Node server on every incoming request, since that hop would be over normal HTTP, which would eliminate most of that benefit of SPDY.

It'd be a bit like starting with a slow internet connection (10Mbps) and a slow router (10Mbps). Your throughput is 10Mbps.

You upgrade to a blazing fast internet connection (1000Mbps) but keep your slow router (10Mbps). Your throughput is still only 10Mbps in spite of your fast connection.

But, if you can upgrade your router (1000Mbps) and pair it with your fast internet (1000Mbps), then you really get fast throughput (1000Mbps).
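
So the way to keep the full benefit, as I understand it, is to terminate SPDY in the Node process itself rather than at the load balancer. A minimal sketch with the node-spdy module (the cert paths, port, and example route are placeholders, not our actual setup):

    // Minimal sketch: terminate SPDY in Node itself so the multiplexed
    // connection reaches the app server instead of stopping at the load balancer.
    // Cert paths, port, and the example route are placeholders.
    var fs = require('fs');
    var spdy = require('spdy');       // node-spdy
    var express = require('express');

    var app = express();
    app.get('/api/ping', function (req, res) {
      res.json({ ok: true });
    });

    spdy.createServer({
      key: fs.readFileSync(__dirname + '/keys/server.key'),
      cert: fs.readFileSync(__dirname + '/keys/server.crt')
    }, app).listen(3000);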

That's my understanding anyway; again, I'm no expert. I found it a bit tricky to do things like performance metrics for SPDY because most performance and load-testing tools (like ab) only support HTTP(S), not SPDY.