The company I'm working at uses Perl for all "backend-related" stuff. However, we would like to have some real-time communication between server processes and the clients connected via browser.
We are also using Apache as the webserver, with mod_perl. So that is my first question: I don't see any practical way to combine a WebSocket server with that constellation. Maybe there is one I haven't found yet?
The only thing that really seems to address this topic seriously is Mojolicious. However, I'm not that experienced with it yet, so I'd be happy if someone could tell me whether I can use it in my current mod_perl environment. I assume I would also have to run it as a standalone webserver process, no?
Which brings me to my second question. What is the best practice if you have multiple Perl files running on Apache/mod_perl that each do certain things, but you want to keep all connected users informed about them? What I mean is: all these scripts are accessed via XHR, but some actions require other users to be informed. Currently, we do classic AJAX polling.
The problem I'm struggling with is that, if there is a dedicated WebSocket server running independently, all those scripts would somehow need to communicate with that process as well, right? How would one do that? Pipes? Sockets? Shared memory?
Theoretically, if I chose to go with such an independent WebSocket server solution, I could write it in any language, right? It could even be Ruby or Node. I'm just wondering whether that is the best way, or whether there is a good solution that integrates better with the existing Perl/mod_perl constructs.
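To make my idea a bit more concrete, here is a rough, untested sketch (the ports and the /notify path are made up) of what such an independent server could look like in Node, with a small local HTTP endpoint that our mod_perl scripts could POST to whenever other users need to be notified:

```javascript
// Rough sketch only - assumes Node.js with the "ws" package installed.
// The ports and the /notify path are placeholders.
const http = require('http');
const WebSocket = require('ws');

// Browsers connect here.
const wss = new WebSocket.Server({ port: 8081 });

// Small internal HTTP endpoint the mod_perl scripts could POST to.
http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/notify') {
    let body = '';
    req.on('data', chunk => { body += chunk; });
    req.on('end', () => {
      // Broadcast the event to every connected browser.
      wss.clients.forEach(client => {
        if (client.readyState === WebSocket.OPEN) client.send(body);
      });
      res.end('ok');
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8082, '127.0.0.1'); // only reachable from the local machine
```

That way the mod_perl scripts would only need to make a plain local HTTP request (e.g. with LWP) instead of dealing with pipes or shared memory - but I have no idea whether that is considered good practice, hence the question.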
TL;DR
Is it best practice to have a standalone, independent WebSocket server that communicates with the rest of your Apache/mod_perl scripts as well as with its connected clients?
You could look at the AnyEvent CPAN module:
http://metacpan.org/pod/AnyEvent
With it you can write your own standalone, event-driven WebSocket server; you can also find a lot of examples on Google or in AnyEvent's perldoc.
Related
Sorry if this question sounds kind of weird, but I haven't managed to get a satisfying answer yet.
The thing is, I am making (or planning to make) a web-browser-based game where people have their own company, trade, build, etc. The game will also include bots. Because there will be quite a lot happening, it would be good for the bots to run at all times and react to various situations.
But I don't know how to make that happen server-side. Is there a way to make a script run on a server (like a VPS)? Or is there a way to make some kind of application that runs on the server, communicates with the database, and sends back answers?
Thank you :)
Of course there is a way to do this server-side - you "just" have to create a server, using any language/framework/platform that lets you build a proper server application (this could be C++, Java, or you could even stay in the JavaScript realm by using Node.js).
You would have to provide some kind of communication channel between the web clients and your server. For that, you could use either WebSockets (TCP) or WebRTC (UDP); plain AJAX could be an option too, but I wouldn't recommend it for a persistent communication channel.
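As a very rough illustration (everything here - the port, the tick rate, the world object - is made up), such a server could be a single long-running Node.js process with a timer loop for the bots and a WebSocket connection per player:

```javascript
// Rough sketch, assuming Node.js with the "ws" package; the game logic is a placeholder.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 3000 });
const world = { tick: 0 }; // your game and bot state would live here

// The bots "run at all times": advance the world a few times per second.
setInterval(() => {
  world.tick++;
  // ...update bots, trades, buildings, persist to the database here...
  const update = JSON.stringify(world);
  wss.clients.forEach(ws => {
    if (ws.readyState === WebSocket.OPEN) ws.send(update);
  });
}, 250);

wss.on('connection', ws => {
  ws.on('message', msg => {
    // ...handle a player's action and update the world...
  });
});
```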
I've got a simple Javascript application with a JSON API. Currently it runs in the client, but I'd like to move it from the client to the server. I am accustomed to learning new platforms, but in this case, my time is very limited - so I need to find the absolute simplest way possible.
This should be an easy task, but all I'm finding are solutions that are way overcomplicated:
The application is currently hosted on an extremely basic server. Node.js is not available, and I do not have install privileges. I'll eventually move it to a different server, but I really don't know what will be available there.
Many solutions require installing and running a standalone server. Really? Just to evaluate Javascript server-side and spit out some data?
I can run Python and PHP, and I see that it's possible to call JavaScript from inside a Python or PHP script. However, the specific Python solution I've found also requires installing some Python support via pip or easy_install, so that's probably not an option. Also, this just feels overcomplicated, and I'm concerned about setting myself up for errors such as data conversion or permission problems, etc.
Any help?
@Quentin is correct. There is no way to run JavaScript on a server without a JavaScript interpreter on the server.
Node.js is not only the most robust and widely used one, it's also the simplest. It is certainly possible to write your own javascript interpreter in PHP or Python, but that would be much more complicated than using Node.js.
Try really hard to find a server solution that allows you to use Node. In the end, it's going to save you (and any other stakeholders interested in the project) a lot of time and money.
This is a question about the best way to structure an app that has both server-side and client-side needs. Forgive the length -- I am trying to be as clear as possible with my vague question.
For a standalone non-web-connected art project, I'm creating a simple browser-based app. It could best be compared to a showy semi-complicated calculator.
I want the app to take advantage of the browser's presentation abilities and run in a single non-reloading page. While I have lots of experience writing server-side apps in Perl, PHP, and Python, I am newer to client-side programming and a neophyte at JavaScript.
The app will be doing a fair bit of math, a fair bit of I/O control on the Raspberry Pi, and lots of display control.
My original thought (and comfort zone) was to write it in Python with some JS hooks, but I might need to rethink that. I'd prefer to separate the logic layer from the presentation layer, but given that the whole thing happens on a single non-reloading HTML page, it seems like JavaScript is my most reasonable choice.
I'll be running this on a Raspberry Pi, and I need to access the GPIO ports for both input and output. I understand that JavaScript will not be able to do I/O directly, so I need to turn to something that can handle AJAX-ish calls to receive and send I/O, something like Node.js or socket.io.
My principal question is this -- is there a clear best practice in choosing between these two approaches:
Writing the main logic of the app in client-side JavaScript and using server-side scripting to do I/O, or
Writing the logic of the app in a server-side language such as Python with calls to client-side Javascript to manage the presentation layer?
Both approaches require an intermediary between the client-side and server-side scripting. What would be the simplest platform or library to do this that will serve without being either total overkill or totally overwhelming for a learner?
I have never developed for the Raspberry Pi or had to access GPIO ports. But I have developed stand-alone web apps that acted like showy semi-complicated calculators.
One rather direct approach for your consideration:
Create the app as a single-page HTML5 stand-alone web app that uses AJAX to access the GPIO ports via Node.js or Python. Some thoughts on this approach, based on my experience:
jQuery is a wonderful tool for keeping DOM access and manipulation readable and manageable. It streamlines JavaScript for working with the HTML page elements.
Keep your state in the browser's local storage - using JavaScript objects and JSON makes this process amazingly simple and powerful. (One line of code can write your whole global state object to local storage as a JSON string.) Always transfer any persistent application state changes from local variables to local storage - and have a page init routine that pulls local storage back into local variables upon any browser refresh or system reboot. Constantly refresh your app as you develop to make sure state is managed the way you want. This trick will keep things stable as you progress.
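For example (the key name and the fields of the state object here are hypothetical), persisting and restoring the whole state really is one line each way:

```javascript
// Sketch: "appState" is a hypothetical storage key, "state" a global state object.
var state = { score: 0, settings: { theme: 'dark' } };

// Write the whole state object to local storage as a single JSON string.
localStorage.setItem('appState', JSON.stringify(state));

// In your page init routine, pull it back into the local variable.
var saved = localStorage.getItem('appState');
if (saved) {
  state = JSON.parse(saved);
}
```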
Using AJAX via jQuery for any I/O is very readable and reliable. Its asynchronous approach also keeps the app responsive while you perform any I/O. Error trapping and time-out handling are also easily accomplished.
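A sketch of what such a call might look like (the URL, the payload, and the updateDisplay function are made up for illustration):

```javascript
// Sketch: hypothetical endpoint and handler functions; jQuery does the heavy lifting.
$.ajax({
  url: '/gpio/read',
  dataType: 'json',
  timeout: 2000,                                    // time-out handling
  success: function (data) {
    updateDisplay(data);                            // hypothetical UI update function
  },
  error: function (xhr, status) {
    console.log('I/O request failed: ' + status);   // error trapping
  }
});
```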
For a back end, if the platform supports it, do consider Node.js. It looks like there is at least one module for your specific I/O needs: https://github.com/EnotionZ/GpiO
I have found Node to be very well supported and very easy to get started with. Also, it will keep you using JavaScript on both the front and back ends. Where this becomes most powerful is when you rely on JavaScript object literals and JSON - the two become almost interchangeable and let you pass complicated data structures to/from the back end via a few (or even just one!) object variables.
You can also keep your options open now on where you want to execute your math functions - since you can execute the exact same JavaScript functions in the browser or in the node back end.
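To make that concrete, here is a rough sketch of a tiny Node back end that exposes a pin reading as JSON. I have not used that gpio module myself, so treat its export()/value API here as an assumption based on its README:

```javascript
// Sketch only: the gpio module's export()/value API is assumed, not verified.
const http = require('http');
const gpio = require('gpio');

// Export pin 16 as an input (assumed API of the EnotionZ gpio module).
const pin16 = gpio.export(16, { direction: 'in' });

http.createServer((req, res) => {
  if (req.url === '/gpio/read') {
    // Reply with a JSON object the browser-side JavaScript can use directly.
    res.setHeader('Content-Type', 'application/json');
    res.end(JSON.stringify({ pin: 16, value: pin16.value }));
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);
```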
If you do go the route of JavaScript and an HTML5 approach - do invest time in using the browser "developer tools" that offer very powerful debugging tools and dashboards to see exactly what is going on. You can even browse all the local storage key/value pairs with ease. It's quite a nice development platform.
After some consideration, I see the following options for your situation:
Disable browser security and communicate with the GPIO directly - no standard libraries for this?
Use a JavaScript server environment with GPIO access and AJAX - adds the complexity of AJAX.
Stick with the familiar Python and use an embedded web browser - easy, if the libraries are around.
Don't add too much complexity if you're not familiar with the tooling and language.
Oh, what a nice question! I'm thinking about it right now. My approach is a little different:
In the old MVC fashion, you consider the V(iew) layer to be the rendered HTML page with JavaScript, CSS and many other things, while the M and C run on the server. Then one day I met Mr. AngularJS, and I realized: wow, some basic things may change:
AngularJS considers that the View (or the thing I believed was the view) is not actually just a view. AngularJS gave me controllers, data resources and even view templates inside that "View"; in other words, the client side itself can be a real application. So now my approach is:
The server does the "server job": reading and writing data, sending data to the client, receiving data from the client, etc.
And the client does the "client job": interacting with the user, and doing the logic processing of data BEFORE IT IS SENT, such as validating or formatting the information collected from the user, etc.
Maybe you can rethink your approach: ask yourself what logic should run on the client and what should run on the server. The client, with JavaScript, does its I/O; the server, with its server-side script, does its I/O. The server provides the resources the client needs, and the JavaScript uses those resources as the M(odel) of its own MVC. I hope that makes sense; excuse my bad English :D
Well... it sounds like you've mostly settled on:
Python Server. (Python must manage the GPIO.)
HTML/JavaScript client, to create a beautiful UI. (HTML must present the UI.)
That seems great!
You're just wondering how much work to do on each side of the client/server divide... either split should be functionally equivalent.
In short: Do most of the work in whichever language you are more productive in.
Other notes come to mind:
Writing the entire server as standalone Python is pretty straightforward.
You don't have to, but it's nice and self-contained if you serve the page content itself from it.
If you keep most of the state on the server/Python side, you can make the whole app a little more robust against page reloads (even though I know you mentioned that should never happen).
I am working on an app where I need to pass messages between a C++ application and a JavaScript web app.
Certainly I could write sockets code myself in either language and I have done this in the past when necessary.
What I would really like is a higher-level message posting or message queueing API that does a lot of the work for me. Does anyone know of such an API?
I have looked at ICE, and it doesn't appear to have Javascript bindings. I have also looked at Boost message queue, but it only caters for the C++ side of things. If necessary I might roll my own Javascript bindings for either of these technologies.
UPDATE: Sorry, I should have mentioned this before - I want to run this in a browser.
To give a more complete story what I want is a simple browser-based app that is used to configure and display logging for a C++ application.
I know there are other ways of doing this, but I am specifically interested in a high-level library, in both C++ and browser-based JavaScript, that builds a message queue on top of the sockets API (if there isn't one, then I might consider implementing it myself and writing up a Code Project article).
ALSO: I'm not bothered about portability in terms of the web browser. E.g. if there is a high-level IPC JavaScript library that only works in Chrome, I'll be happy with that.
With JavaScript I assume that you are running it in a browser? In this case your C++ application needs to provide a webserver and some kind of JSON based webservice that you can call. On the JavaScript side you just use AJAX to communicate with that webservice.
An alternative would be websockets which might be a little harder to implement on the C++ side though.
To simply answer your question: no, there is no IPC implemented in ECMAScript out of the box.
But you actually answered your question already. If you are trying to communicate with JavaScript that runs in a browser, you should indeed use (web-)socket connections to pipe data in either direction. Of course you could write a simple HTTP server in C++, but I guess that is overkill and doesn't have the capabilities of bi-directional sockets.
It's still some work to implement a WebSocket connection in C++ from scratch (the specs were in flux for a long time), but I guess there are some libraries out there already.
If you're trying to communicate with node.js, this is an almost trivial task using real sockets/pipes.
I have found a solution that meets my needs. It isn't exactly perfect but I think it works well enough.
Some people suggested using HTTP and ajax. That turned out to be a useful idea and after some prototyping I think it solves my rather basic needs.
To be more specific, I am using the Mongoose HTTP server embedded in my C++ application, and I am using the jQuery ajax function to pull data from the server. The jQuery client polls the server continuously for new data - not particularly efficient, but I think it will do the job well enough for me.
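The browser side of that polling loop is only a few lines - roughly like this (the /log URL and the display function are just examples):

```javascript
// Sketch: poll the embedded Mongoose server for new log data once a second.
function poll() {
  $.ajax({
    url: '/log',                  // example endpoint served by the C++ app
    dataType: 'json',
    success: function (entries) {
      appendLogEntries(entries);  // hypothetical display function
    },
    complete: function () {
      setTimeout(poll, 1000);     // schedule the next poll whatever the outcome
    }
  });
}
poll();
```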
Once my implementation is complete I'll write an article explaining how to do this in detail and then I'll update this answer.
You could try DBus; it has a very simple mechanism for defining, querying and using interfaces, and there are components for XPCOM- and WebKit-based browsers (for example http://sandbox.movial.com/wiki/index.php/Browser_DBus_Bridge and http://code.google.com/p/v8-dbus/). DBus is also open source and cross-platform.
For a server side or non-browser implementation how about named pipes?
Yes, it's vintage technology and the usage depends on which OS you use, but as long as your server-side JS environment has the ability to read and write files, it may work, and it fits the description of 'high-level' inter-process communication.
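If that server-side JS environment is Node.js, for instance, a named pipe can be read like an ordinary file. A rough sketch (the FIFO path is made up, and the pipe is assumed to have been created already, e.g. with mkfifo):

```javascript
// Sketch: read messages that the C++ process writes into an existing FIFO.
// Assumes the FIFO was created beforehand, e.g. `mkfifo /tmp/app.pipe`.
const fs = require('fs');

const pipe = fs.createReadStream('/tmp/app.pipe', { encoding: 'utf8' });
pipe.on('data', chunk => {
  console.log('message from the C++ side:', chunk);
});
```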
In the near future I will have to build a system with a C++ backend and a web frontend (those are the requirements). At the moment, I don't know much more about it. I think the frontend will be triggering data delivery, not the backend - so no need for Comet-like things.
Since I have little experience in this field, I'd really appreciate your comments on the design decisions I've made.
First of all, I don't like the option of generating HTML from C++.
So, the C++ backend will have to communicate with the JavaScript frontend. The simplest option I see here is AJAX. I think it should be OK, so far.
Communicating with the C++ backend through AJAX means the backend should be capable of handling HTTP. It'd be nice to separate the backend that provides the actual data from the HTTP-handling functionality.
This is where I see the place for Node.js. I've got an overview of it, and this is where all my doubts lie.
Have an HTTP-handling server on Node.js, with the 'data backend' as a Node.js module? I think it should be OK - but I'm not sure that I really need all this asynchronization, so there may be simpler options I'm not aware of. How would you build such a system?
Thanks in advance.
"All this asynchronization" is not something that Node.js works very hard to provide as an extra. It is a different view of Web serving that is easy as breathing once you understand how Node.js works.
For example, my colleagues needed a way to wrap a C++ program as a web service, but the program had a significant start-up cost, so they wanted to run just one instance of the program, running in a loop, serving all the web requests. The whole thing in Node.js took less than two screenfuls.
Wrapping a single program that is called for each request can be done in less than ten lines of Node.js. Don't think of asynchronicity as a chore - if you embrace it, Node.js is awesome.
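For illustration, something in this spirit (the ./backend binary, its argument, and the port are placeholders) is roughly what those "less than ten lines" look like:

```javascript
// Sketch: spawn the C++ backend for each request and return its stdout.
const http = require('http');
const { execFile } = require('child_process');

http.createServer((req, res) => {
  execFile('./backend', [req.url], (err, stdout) => {
    if (err) { res.statusCode = 500; res.end('backend error'); return; }
    res.setHeader('Content-Type', 'application/json');
    res.end(stdout);
  });
}).listen(8000);
```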
That said, you could go the CGI route, and do it in a bit more standard way, and the end result would be pretty much the same. This may or may not come in handy.
Did you consider the CGI/FCGI module option with nginx, Apache, etc.?
If not, then I think it makes sense to start there. Your module will handle the data/JSON requests, and the rest will be handled by the HTTP server.