Their explanation for why Go performs badly didn't make any sense to me. I'm not sure if they don't understand how goroutines work, if I don't understand how goroutines work or if I just don't understand their explanation.
Also, in the end, they didn't use the JSON payload. It would have been interesting if they had just written a static string. I'm curious how much of this is really measuring JSON [de]serialization performance.
Finally, it's worth pointing out that WebSocket is a standard. It's possible that some of these implementations follow the standard better than others. For example, WebSocket requires that a text message be valid UTF-8. Personally, I think that's a dumb requirement (and in my own websocket server implementation for Zig, I don't enforce this - if the application wants to, it can). But it's completely possible that some implementations enforce this and others don't, and that (along with every other check) could make a difference.
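To make that concrete, here's roughly what a strict server has to do on every incoming text frame (a generic Node/TypeScript sketch, not taken from any of the benchmarked libraries); a lenient implementation just skips this work:

    // RFC 6455 requires text-frame payloads to be valid UTF-8.
    // A strict server checks this and closes with code 1007 on failure;
    // a lenient one hands the bytes to the application unchecked.
    const strictUtf8 = new TextDecoder('utf-8', { fatal: true });

    function decodeTextFrame(payload: Uint8Array): string | null {
      try {
        return strictUtf8.decode(payload); // throws TypeError on invalid UTF-8
      } catch {
        return null; // caller should close the connection with close code 1007
      }
    }

Per-message validation like this is cheap per frame but not free, and at benchmark message rates it adds up.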
Interesting that https://github.com/uNetworking/uWebSockets.js (which is C++ with node bindings) outperforms the raw C++ uWebSockets implementation.
It's also interesting that https://github.com/websockets/ws does not appear in this study, given that in the Node ecosystem it is roughly 3x more likely to be used (not a perfect measurement, but ws has 28k GitHub stars vs uWebSockets' 8k).
Thanks for the free access links. I did read through a bit.
The title is misleading because exactly one implementation was chosen for each of the tested languages. They conclude “do not use Python” because the Python websockets library performs pretty poorly.
Each language is scored based on the library chosen. I have to believe there are more options for some of these languages.
As someone who is implementing an Elixir LiveView app right now, I was particularly curious to see how Elixir performed given LiveView's reliance on websockets, but Elixir didn't make the cut.
Was this published as-is to some sort of prominent CS journal? I honestly can't tell from the link. If that's the case, I'm very disappointed and would have a few choice words about the state of "academia".
Too bad that uWebSockets was used for Node, because a lot of higher-level libraries are built on top of https://www.npmjs.com/package/ws
I have a home-grown websocket library I wrote in TypeScript for Node.js. When I measured it a couple of years ago, here were my findings:
* I was able to send a little under 11x faster than I could process the messages on the receiving end. I suspected this was mostly the cost of parsing frame headers and handling the various forms of message fragmentation on the receive side (see the sketch after these lists). I also ran both send and receive operations on the same machine, which could have biased the numbers
* I was able to send messages on my hardware at 280,000 messages per second. Bun claimed, at that time, a send rate of about 780,000 messages per second. My hardware is old with DDR3 memory. I suspect faster memory would increase those numbers more than anything else, but I never validated that
* In real-world use, switching from HTTP to WebSockets for data messaging made my big application about 8x faster overall in test automation.
Other things I suspect (assumptions, not measurements):
* A WebSocket library can achieve superior performance if written in a strongly typed language that is statically compiled and without garbage collection. Bun achieved far superior numbers and is written in Zig.
* I suspect that faster memory would lower the performance gap between sending and receiving when perf testing on a single machine
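For anyone curious why receiving costs so much more than sending: here's a rough sketch of the per-frame header parsing a receiver has to do (simplified from my reading of RFC 6455, not my actual library code), while the sender mostly just prepends a header whose shape it already knows:

    // Parse a WebSocket frame header: FIN flag, opcode, masking bit,
    // 7/16/64-bit payload length, and the 4-byte client masking key.
    // Fragmented messages arrive as a text/binary frame followed by
    // continuation frames (opcode 0x0) that the receiver must reassemble.
    function parseFrameHeader(buf: Buffer) {
      const fin = (buf[0] & 0x80) !== 0;
      const opcode = buf[0] & 0x0f;         // 0x1 text, 0x2 binary, 0x0 continuation
      const masked = (buf[1] & 0x80) !== 0; // client-to-server frames are always masked
      let len = buf[1] & 0x7f;
      let offset = 2;
      if (len === 126) {
        len = buf.readUInt16BE(2);          // extended 16-bit length
        offset = 4;
      } else if (len === 127) {
        len = Number(buf.readBigUInt64BE(2)); // extended 64-bit length
        offset = 10;
      }
      const maskKey = masked ? buf.subarray(offset, offset + 4) : null;
      return { fin, opcode, len, maskKey, payloadStart: offset + (masked ? 4 : 0) };
    }

On top of this, the receiver also has to unmask the payload byte-by-byte and stitch fragments together, which is where I assume most of the send/receive gap comes from.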
(2021) I was surprised it used a deprecated Rust crate until I noticed how out of date the paper is.
I'd like to know why Elixir and Erlang were excluded.
The SSRN link doesn’t have a login-wall: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3778525
A meta comment: This paper gives an example of a "teaser abstract". It says what was done, but does not say anything about the actual results. This style is relatively common, but I find it very annoying. There was certainly enough room in the abstract to provide a concise summary of the actual results, which would both inform the reader and perhaps encourage more people to read the entire paper.
I'm surprised at how well PHP is doing here. I'm guessing they are using fibers?
Not including Swift in research like this seems like a big oversight to me.
Is this a peer-reviewed paper? It does not seem to be. At first glance, the ResearchGate URL and the way the title was formulated made me think it would be the case.
It would be interesting to see this repeated with Starlette and Granian on Python 3.13 (with its GIL and JIT changes).
The DX for websockets in Go (gorilla) is horrible, but I do not believe these numbers one bit.
If the author is reading this, I think a single repository would be more appropriate than multiple repos [1]. It would be nice to set things up so we can simply git pull, docker run, and execute the benchmarks for each language sequentially.
Something that stood out to me is the author’s conclusion that "Node.js wins." However, both the Node.js and C++ versions use the same library, uWebSockets! I suspect the actual takeaway is this:
"uWebSockets wins, and the uWebSockets authors know their library well enough that even their JavaScript wrapper outperforms my own implementation in plain C++ using the same library!" :-p
Makes me wonder if there's something different that could be done in Go to achieve better performance. Alternatively, this may highlight which language/library makes it easier to do the right thing out of the box (for example, it seems easier to use uWebSockets in Node.js than in C++). TechEmpower controversies also come to mind, where "winning" implementations often don't reflect how developers typically write code in a given language, framework, or library.
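For what it's worth, the Node.js side really is only a few lines, which probably helps people hold it correctly. Roughly (echo server from memory of the uWebSockets.js README, so check the current docs before relying on it):

    // Minimal uWebSockets.js echo server (CommonJS require; adjust to your module setup).
    const uWS = require('uWebSockets.js');

    uWS.App().ws('/*', {
      // Echo every message back, preserving the text/binary flag.
      message: (ws, message, isBinary) => {
        ws.send(message, isBinary);
      },
    }).listen(9001, (listenSocket) => {
      if (listenSocket) {
        console.log('Listening on 9001');
      }
    });

Getting equivalent behavior out of the raw C++ library means wiring up the event loop, SSL contexts, and buffer handling yourself, which leaves a lot more room to do it suboptimally.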
--
1: https://github.com/matttomasetti?tab=repositories&q=websocke...