I made a Node server dedicated to streaming video files.
The logic is that when a client sends a request to the Node server, the server starts transforming an AVI file into files for HLS (m3u8, ts) using spawn from child_process, streams from fs, and ffmpeg.exe.
This logic finishes in about 3 seconds after starting.
My problem is that I don't know how to run or scale the Node server efficiently using worker threads, cluster, or something like that, so the server can handle many tasks or requests quickly and stably.
I'm a frontend developer, so I don't know backend stuff well.
Please give me some advice on how I can improve this server and which modules I should use and study.
Thanks!
Related
I have a node.js application on a Windows Server and a series of cron jobs set up via cPanel on an external CentOS server. The cron jobs all target the URL of my node.js application using wget. I confirmed that the jobs are running according to the cron logs in WHM, but they are not causing the script on the other server to run as they should.
The format of the cron jobs is like this:
wget -q -O - "http://example.com/app?param=1&param=2" >/dev/null 2>&1
The result of the cron job should be the execution of the script on the other server. That script uses Puppeteer to scrape web pages, take screenshots, parse the results into RSS feeds, and create RSS-formatted XML files on the file system. Despite the cron jobs firing and targeting the right URLs, I have yet to see a single cron job result in a new image or XML file being created.
This leads me to wonder if there is something I don't know about when it comes to cPanel, cron jobs, and JavaScript. I'm thinking maybe they won't work with URLs leading to JavaScript files, but if that were true, wouldn't I have the same problem when executing the same commands via SSH? I just tried one using the WHM terminal, and it triggered the desired result.
I think the issue was due to the firewall on the IIS server. I added the remote IP address to the whitelist, and now the cron jobs work.
I am trying to create a network of nodes on my computer that can act like a client and a server. Each node should be running its own instance of server code and client code and should be able to give requests to its own instance of the server. How can I start a server of my own in Node and how can I have them all run on different ports?
Thanks
Use Docker with Docker Compose; you can run all the environments you want very easily:
https://docs.docker.com/compose/overview/
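For example, a docker-compose.yml along these lines could run several instances of the same Node app on different ports. The service names, build context, and port numbers here are made up for illustration:

```yaml
# Hypothetical compose file: three copies of the same Node app,
# each published on its own host port.
version: "3"
services:
  node1:
    build: .
    ports:
      - "3001:3000"
  node2:
    build: .
    ports:
      - "3002:3000"
  node3:
    build: .
    ports:
      - "3003:3000"
```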
I know that a node.js server caches modules, so after starting the server all files get "compiled" and changes to the code only take effect after restarting the server.
But if there are always hundreds of users online on the website, how do you make those changes (restart the server) in a way that your hundreds of clients won't notice any trouble or downtime?
Please give me some guidance and (your own) examples about (I guess) scalability, load balancing across servers, etc., so I can make an awesome, large, dynamic website with node.js too.
The best way to accomplish continuous uptime with Node.js is to have two Node servers running and proxy to them using an nginx upstream block. That way you can restart one while the other takes the traffic load, then do the same to the other Node server.
Your nginx configuration would use something similar to the below:
upstream backend {
    server backend1.example.com weight=5;
    server backend2.example.com:8080;
    server unix:/tmp/backend3;
    server backup1.example.com:8080 backup;
    server backup2.example.com:8080 backup;
}

server {
    location / {
        proxy_pass http://backend;
    }
}
For more information about nginx proxying and upstream servers, see the nginx documentation: http://nginx.org/en/docs/http/ngx_http_upstream_module.html
Here is my problem... I have a node server that multiple node terminals (Raspberry Pi) connect to. These node terminals run a series of jobs, and some of them generate files. The files are not saved on the terminal but in a MySQL blob. These terminals are managed through an interface on the server (a CRM webpage). I manage them using socket.io, and there is also Redis available. Through socket.io I can tell the terminal what file I want, but the problem I'm facing is getting the file to the requesting browser client. I can identify the browser via the socket id, but I am not sure how I am going to serve that file. Any help or suggestion would be great. Note: I'm not using any JS or node.js frameworks.
I know Windows 8 'apps' can be developed using web technologies, but I haven't been able to find out whether terminal commands can be run in the background using web technologies as an interface. I basically have MongoDB on my computer, and it takes two terminal windows open to run it. I thought it might be a neat project to see if I could write a little app that is nothing more than a button that launches both commands behind the scenes, saving me the hassle of going to the directories and running the commands manually in both terminal windows.
If you plan to launch apps via server-side JavaScript (e.g. node.js), use the child_process module.
The workflow would be that the Windows 8 GUI side just issues a request to your own local server in node.js, which then executes those commands.
Example:
var exec = require('child_process').exec;
var child = exec("insert command here", function (err, stdout, stderr) {
    if (err) console.error(err);
});
See the exec and spawn documentation for more examples.
======
Another thing you can do is create a batch (.bat) file that contains those two commands needed for your mongodb instance and put that as a shortcut in the Windows 8 Start Screen.
It depends on what kinds of commands you need to execute, and when and where. If you plan to execute commands remotely, I'd assume server-side JS would be appropriate, but if you plan to execute commands locally, I think all you need is just batch scripting.