What is the best way to send a very big string to a Node.js Express server?
I have a webpage with CodeMirror that can load files from an Express server into the editor. But what is the best method to send "the file" (actually a string, which can be really big)
back to the Express API?
The most common way is to stream the data instead of sending everything at once. I don't know how to do that on the client side, though; I'm not even sure streaming a request is possible.
I'm not an expert and I don't know much about large strings, but why is that a problem? From a networking point of view, isn't the TCP protocol supposed to split everything into packets anyway?
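For reference, a plain POST with a raised body-size limit is often enough before reaching for streaming. A minimal sketch, assuming an Express server; the /save endpoint, the 50mb limit, and CodeMirror 5's getValue() are illustrative assumptions:

// server.js -- accept a large plain-text body
const express = require('express');
const app = express();

// Raise the body-size limit; the default (~100kb) would reject big editor contents
app.use(express.text({ limit: '50mb' }));

app.post('/save', (req, res) => {
  const fileContents = req.body; // the whole string sent by the client
  // ... persist fileContents to disk or a database here ...
  res.sendStatus(204);
});

app.listen(3000);

On the client, the editor contents go out as a single text body:

// client-side: send the CodeMirror document as plain text
fetch('/save', {
  method: 'POST',
  headers: { 'Content-Type': 'text/plain' },
  body: editor.getValue(), // the current document as a string
});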
The situation:
I have a Node.js server. NGINX is configured to limit the request size to 5 MB.
I need to upload large files (~15MB) from the client running in a browser to the server.
There are 3 instances of the server running WITHOUT shared memory/file systems.
What I have done:
I used some libraries to break the files into chunks (< 5 MB) and send them to the server. After successfully sending the last chunk, the client called a server endpoint to signal completion, and then the chunks were merged. This worked when I had one instance of the server running. Due to load balancing, however, each chunk request might be handled by a different instance of the server, so the chunks may not be merged correctly.
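For reference, a minimal sketch of that chunking flow on the client; the endpoint names, the uploadId parameter, and the chunk size are illustrative assumptions:

// Slice a File/Blob into chunks below the 5 MB NGINX limit and upload them in order
const CHUNK_SIZE = 4 * 1024 * 1024;

async function uploadInChunks(file, uploadId) {
  const total = Math.ceil(file.size / CHUNK_SIZE);
  for (let index = 0; index < total; index++) {
    const chunk = file.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);
    await fetch(`/upload-chunk?uploadId=${uploadId}&index=${index}`, {
      method: 'POST',
      body: chunk,
    });
  }
  // Signal completion so the server can merge the chunks
  await fetch(`/upload-complete?uploadId=${uploadId}&total=${total}`, { method: 'POST' });
}

With load balancing, each of these requests can land on a different instance, which is exactly the problem described above.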
Solutions I have thought of:
The ultimate solution (in my opinion) would be to stream the chunks to the server in just one request, so that it is handled by a single server instance.
The Streams API is still experimental. I would like to try it but have not found a good example to follow. I have heard that the client-side Streams API is different from Node.js streams and that some extra work is needed to bridge the two.
I did some research on the Transfer-Encoding: chunked HTTP header. It is said to be good for sending large files, but I haven't found a good working example of how to achieve this.
I also thought of WebSocket (or even Socket.io) to establish a connection with one server instance and send the chunks over it. However, some reading suggested that WebSocket is not a good fit for sending large files.
The question:
How can I achieve the goal of sending large files to one server instance in the most efficient way?
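One concrete direction is the experimental fetch upload streaming mentioned above. A minimal sketch, assuming a Chromium-based browser over HTTP/2; the endpoint and file names are illustrative, and the duplex: 'half' option is mandatory for streaming request bodies. Because the whole upload is a single request, it is handled by a single server instance (NGINX's client_max_body_size would still need to be raised for this route):

// Browser: stream a large file as ONE request body (experimental)
await fetch('/upload-stream', {
  method: 'POST',
  headers: { 'Content-Type': 'application/octet-stream' },
  body: file.stream(), // ReadableStream from a File/Blob
  duplex: 'half',      // required for streaming request bodies
});

On the Node.js side the request object is already a readable stream, so it can be piped straight to disk:

// Express sketch: persist the streamed body without buffering it in memory
const fs = require('fs');
app.post('/upload-stream', (req, res) => {
  req.pipe(fs.createWriteStream('/tmp/upload.bin'))
    .on('finish', () => res.sendStatus(201));
});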
I have a C# server-side web service, but I don't want users to be able to see my requests (e.g., in the Network tab of the client's browser).
So far, I haven't found any solution on SO.
What is the best way to do this?
I think I can use a Node.js server, render my React app from it, and have Node.js forward my requests to the C# backend, like this:
React.js <--(render)-- Node.js --(send/receive APIs)--> C#
What I don't know: if I use a Node.js server, will my requests be hidden from clients?
I don't want to use reactjs.net.
If you make an HTTP request to the Node server, and make the hidden request from Node.js to the other server, that second request will not be visible to the client.
Alternatively, you can encrypt the request, although the URL and part of the encryption logic will still be exposed on the client's end.
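For reference, the proxy idea is only a few lines in Express. A minimal sketch, assuming Node 18+ (for the built-in fetch) and a hypothetical backend at https://internal-api.example.com; a ready-made middleware such as http-proxy-middleware would do the same job:

// The browser only ever sees /api/*; the C# backend stays hidden server-side
const express = require('express');
const app = express();
app.use(express.json());

app.post('/api/:action', async (req, res) => {
  // This server-to-server call never appears in the client's Network tab
  const backendRes = await fetch(`https://internal-api.example.com/${req.params.action}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(req.body),
  });
  res.status(backendRes.status).json(await backendRes.json());
});

app.listen(3000);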
I have a Node server that will serve more than 10,000 users at the same time with big data. I already tried lz-string (https://www.npmjs.org/package/lz-string), but it's not a good fit because it blocks the Node event loop.
Please answer these questions:
Is it better to compress the data on the server and then decompress it on the client instead of sending plain/JSON data?
What is the best and fastest way to compress/decompress the data?
If you are sending large chunks of text data over the internet using the HTTP protocol, there are already technologies in place to help you.
One is called HTTP compression. The HTTP specification allows several compression algorithms to be applied to the data being sent, but it requires both the server and the client to be properly configured. A standard Node.js server will not compress the data without code changes.
For bare Node.js without any frameworks or third-party modules, there is the built-in zlib module, made specifically for HTTP compression on both the server and the client.
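A minimal sketch with the built-in http and zlib modules; the file name is illustrative. Unlike lz-string, zlib's streaming classes compress chunk by chunk off the main thread, so the event loop is not blocked:

// Gzip the response only when the client advertises support for it
const http = require('http');
const zlib = require('zlib');
const fs = require('fs');

http.createServer((req, res) => {
  const raw = fs.createReadStream('big-data.json'); // illustrative payload
  if (/\bgzip\b/.test(req.headers['accept-encoding'] || '')) {
    res.writeHead(200, { 'Content-Type': 'application/json', 'Content-Encoding': 'gzip' });
    raw.pipe(zlib.createGzip()).pipe(res);
  } else {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    raw.pipe(res);
  }
}).listen(3000);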
Using Express? Then there is a compression middleware.
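With Express it is essentially two lines; the route and payload are illustrative:

// The compression middleware gzips responses transparently when the client supports it
const express = require('express');
const compression = require('compression'); // npm install compression
const app = express();

app.use(compression());
app.get('/data', (req, res) => res.json({ hello: 'world' }));
app.listen(3000);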
It might also be worth looking at using nginx as a proxy server in front of your Node.js applications. Then you can simply switch compression on in nginx, without needing to change anything in your Node.js application at all:
server {
    gzip on;
    ...
}
It really depends on the stack you are using, but the idea is the same: compress the HTTP stream itself, since the protocol already supports it.
I have a question about which path I should follow.
I want to develop a real-time online game playable in web browsers.
I would like to write the game server in C++, listening on TCP sockets. The client-side game will be written in JavaScript. The only problem is that I don't know how to make JavaScript communicate with the C++ server over raw TCP sockets. I have considered Socket.IO, but as far as I know that library has no option to simply connect to a real TCP server, push bytes through, and read incoming ones. Instead I would need some wrapper like a Node.js server, which I want to avoid.
Could anyone guide me on which path I should take?
You could make your game server itself an HTTP server. For the most part it could just serve up your static files, but when it received a WebSocket upgrade request, it could handle that however it desired.
You should take a look at websockify:
websockify is a WebSocket to TCP proxy/bridge. This allows a browser to connect to any application/server/service. Implementations in Python, C, Node.js and Ruby.
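For reference, the core of such a bridge is small. A minimal sketch in Node.js using the ws package; the host and ports are illustrative assumptions, and this is exactly the kind of wrapper mentioned above, shown only to make the mechanics concrete:

// WebSocket-to-TCP bridge: browsers speak WebSocket on :8080,
// raw bytes are relayed to the C++ game server on :9000
const net = require('net');
const WebSocket = require('ws'); // npm install ws

const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', (ws) => {
  const tcp = net.connect({ host: '127.0.0.1', port: 9000 }); // game server
  tcp.on('data', (buf) => ws.send(buf));     // TCP -> browser
  ws.on('message', (msg) => tcp.write(msg)); // browser -> TCP
  ws.on('close', () => tcp.destroy());
  tcp.on('close', () => ws.close());
  tcp.on('error', () => ws.close());
});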
I'm working on a small personal project (to learn Node), and I'm wondering what the best way is to send chunks of video data from a client to the server. Obviously, I don't want to use Node's http module, but I've never used anything other than HTTP. I know there is a net module; is this better?