The organization I'm working with uses Munin as its monitoring tool. We've written a service that streams realtime metrics which can be displayed by a JavaScript component. The operations team would prefer to show these metrics in Munin to avoid running a second system for realtime monitoring.
Is it possible and feasible to use Munin to display realtime data using JavaScript? Preferably I'd create this as a plugin, but we're also fine with modifying a Munin HTML page or similar and just adding the JavaScript component to it.
Specifying alerts/alarms when certain properties of the streams go above a threshold would be nice as well. Assuming the above is feasible, one idea would be to write an external app that reads the realtime stream and identifies when an alert should be triggered. When an error is detected, the external app could write this to a file on disk. The idea is then to write a Munin plugin that reads from this file and triggers alerts/alarms from within Munin if applicable.
Munin "polls" machines for data every five minutes. To provide your streaming data points to the central Munin server, you need to configure a Munin node on the server that streams the data, and write a shell script (probably involving curl and awk) to fetch the current values.
Creating a Munin plugin on a node is really simple: it's just a shell script that writes its data in readable form to standard out.
Setting alarms is easy too: for the values you return, set warn and critical values in the plugin's config output. Keep in mind that these warnings are also on the 5-minute schedule, so they are not "immediate".
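As a sketch (the thresholds here are made up), the warn/critical values for the load plugin example below are typically supplied via environment variables in the node's plugin configuration, which `print_warning`/`print_critical` then emit as `load.warning`/`load.critical` lines:

```
[load]
env.load_warning 5
env.load_critical 10
```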
Read up on how munin works at http://guide.munin-monitoring.org/en/latest/
Example of a simple munin plugin (stripped version of the system load plugin):
#!/bin/sh

. $MUNIN_LIBDIR/plugins/plugin.sh

if [ "$1" = "autoconf" ]; then
    echo yes
    exit 0
fi

if [ "$1" = "config" ]; then
    echo 'graph_title Load average'
    echo 'graph_args --base 1000 -l 0'
    echo 'graph_vlabel load'
    echo 'graph_scale no'
    echo 'graph_category system'
    echo 'load.label load'
    print_warning load
    print_critical load
    echo 'graph_info The load average of the machine describes how many processes are in the run-queue (scheduled to run "immediately").'
    echo 'load.info 5 minute load average'
    exit 0
fi

echo -n "load.value "
cut -f2 -d' ' < /proc/loadavg
Save the data you want to chart in a database. Write another script that builds a chart from that data, and simply update the chart with AJAX requests.
The PHP code that draws the chart can use the GD library, or it can output SVG (XML), which is what I'd suggest.
Then, as time goes on, just request the script's result via AJAX.
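A minimal sketch of that polling loop (the endpoint URL and the redraw callback are placeholders for whatever your setup uses):

```javascript
// Poll the chart-data endpoint and hand the parsed JSON to a
// callback that redraws the chart with the fresh points.
function startChartPolling(url, onData, intervalMs) {
  const timer = setInterval(async () => {
    const res = await fetch(url); // re-request the data via AJAX
    onData(await res.json());     // redraw the chart
  }, intervalMs);
  return () => clearInterval(timer); // call the returned function to stop
}
```

Usage would be something like `startChartPolling('/chart-data.php', redrawChart, 5000);`.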
That's all I know.
In my webapp the user has the option to download a file containing some data, which they do by clicking on a button. For small amounts of data the file starts downloading pretty much immediately and that shows in the browser's download area. Which is good.
For large amounts of data it can take the server a substantial amount of time to calculate the data, even before we start downloading. This is not good. I want to indicate that the calculation is in progress. However I don't want to put a "busy" indicator on my UI, because the action does not block the UI - the user should be able to do other things while the file is being prepared.
A good solution from my point of view would be to start the download process before I have finished the calculation. We always know (or can quickly calculate) the first few hundred bytes of the file. Is there a mechanism where I can have the server respond to a download request with those few bytes, thus starting the download and making the file show up in the download area, and provide the rest of the file when I have finished calculating it? I'm aware that it will look like the download is stalled, and that's not a problem.
I can make a pretty good estimate of the file size very quickly. I would prefer not to have to use a third-party package to achieve this, unless it's a very simple one. We are using Angular but happy to code raw JS if needed.
To indicate that the link points to a download on the client, the easiest way is the download attribute on the link (e.g. `<a href="/report" download="report.csv">`). The presence of the attribute tells the browser to download the resource instead of navigating away or opening a new tab; the value of the attribute is the suggested filename.
For the back-end part, after setting the correct response headers, just write the data to the output stream as it becomes available.
You asked for a general solution
1) First, in your HTML/JS you can keep the UI from being blocked by pointing the download at another browsing context; the preferred way to do this is to target a hidden IFRAME:
<!-- your link must target the iframe "downloader-iframe" -->
<a href="../your-file-generator-api/some-args?a=more-args" target="downloader-iframe">Download</a>
<!-- you don't need the iframe to be shown; note that target matches the iframe's name, not its id -->
<iframe id="downloader-iframe" name="downloader-iframe" style="display: none"></iframe>
2) Second, on your back-end you'll have to use the Content-Disposition header and, optionally, Content-Length. Be careful with the length header: if you miscalculate the file size, the download will fail. If you don't send Content-Length, you won't see the downloading progress.
3) Third, on your back-end make sure you write your bytes directly to the response; that way both the browser and the web server know that the download is "in progress".
Example for Java:
Using ServletOutputStream to write very large files in a Java servlet without memory issues
Example for C#:
Writing MemoryStream to Response Object
HOW these three steps are implemented is up to you and the frameworks and libraries you use; for example, Dojo and jQuery have great IFRAME manipulation utilities, although you can do the coding yourself. This is a jQuery sample:
Using jQuery and iFrame to Download a File
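Without jQuery, the same trick is a couple of lines of plain JS (this assumes the hidden iframe with id `downloader-iframe` from step 1):

```javascript
// Point the hidden iframe at the file-generating URL; the current
// page stays interactive while the browser handles the download.
function downloadViaIframe(url) {
  const frame = document.getElementById('downloader-iframe');
  frame.src = url; // the response must carry Content-Disposition: attachment
}
```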
Also: adding a "busy" animation is OK! You just have to make sure it doesn't block your UI, e.g. a small spinner next to the download button.
I'm making a collaborative editor for websites, much like Google Docs, but it's built for coding and development. I want to allow multiple users to edit a file at the same time, and push their changes both to the server and the other person viewing the file. How could I do this?
I can't figure out how to synchronize the data between users. The code I have right now is as follows:
AJAX in JS:
function update(f, txt){
    $.ajax({
        type: 'POST',
        data: {text: txt, file: f},
        url: "save.php",
    });
}
save.php:
<?php
// basename() prevents path traversal via the user-supplied filename
$file = basename($_POST['file']);
$contents = $_POST['text'];
file_put_contents(dirname(__FILE__) . "/preview" . "/" . $file, $contents);
You can use EventSource to stream data from server to browser, or WebSocket to stream data in both directions. Each client then receives the stream of the file continuously; see for example How to read and echo file size of uploaded file being written at server in real time without blocking at both server and client?.
plnkr provides a real time sharing option and is open source. Consider locating and reading the code for that application and adjusting for what you are trying to achieve.
I am wondering if I can set up a crontab entry that will run a PHP script from within an Apache web server (on Unix-based systems). My interest is in incorporating it with the server-side JavaScript program Node.
This would be used to 'ping' a browser to see if it is still "in the ball park" as far as when to end sessions, etc. The cron job would monitor some script-written file containing a record of AJAX-sent messages telling the server that the client is still interested in some content. It would then determine whether enough time has passed since the last AJAX request to close the session.
Possibly, if not probably, the cron script would be written as a PHP command-line interface script (or some other shell or script type?).
These are details that I have been thinking about but don't know enough about to use.
Thank you for your time and attention.
Can you use the regular system cron, and create a job that does a wget at the URL where your script resides? So you would add a job like this in your crontab:
*/5 * * * * /usr/bin/wget -O - -t 1 http://example.com/cgi-bin/my_cron_script.php
It will execute every 5 minutes and call the my_cron_script.php on your webserver.
Assuming you mean crontab?
If so, and if you have shell access to the box:
crontab -e
and add entry for the php script you want to run:
0 */4 * * * /path/to/php /full/path/to/script_to_run.php
NOTE: Use full paths to both php and your script, as php might be installed in a different directory. Use 'which php' to get the full path.
I am in a situation where I have to implement downloading of large files (up to 4 GB) over HTTP from a web server (Apache 2.4.4). I have tried several approaches, but the best solution looks to be the use of the X-SendFile module.
As I offer progress bar for file uploads, I would need to have the same feature for file downloads. So here are my questions:
Is there any way, including a workaround, to monitor file download progress?
Is there any way, including a workaround, to calculate file download transfer speed?
Is there a better way to provide efficient file downloads from a web server than the X-Sendfile module?
Is there a better file download option in general that would allow me to monitor download progress? It can be a client-side (JavaScript) or server-side (PHP) solution. Is there a particular web server that allows this?
Currently I use:
Apache 2.4.4
Ubuntu
Many thanks.
2 ideas (not verified):
First:
Instead of placing regular links to the files you want to download on your page, place links like .../download.php, which may look something like this:
<?php
// download.php file
session_start(); // if needed

$filename = $_GET['filename'];

header( 'Content-type: text/plain' ); // use any MIME you want here
header( 'Content-Disposition: attachment; filename="' . htmlspecialchars($filename) . '"' );
header( 'Pragma: no-cache' );

// of course add some error handling
$filename = 'c:/php/php.ini';
$handle = fopen($filename, 'rb');

// do not use file_get_contents as you've said files are up to 4GB - so read in chunks
while($chunk = fread($handle, 1000)) // chunk size may depend on your filesize
{
    echo $chunk;
    flush();
    // write progress info to the DB, or session variable in order to update progress bar
}
fclose($handle);
?>
This way you can keep an eye on your download process. In the meantime, write progress info to the DB/session variable and update the progress bar by reading that status, via AJAX of course, polling a script that returns the progress info.
That is very simplified, but I think it might work as you want.
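The AJAX side of that first idea could be as small as this sketch (the progress endpoint and the `{ sent, total }` response shape are assumptions about what the PHP loop records):

```javascript
// Compute percent done from the progress record the PHP loop keeps.
function percentDone(progress) {
  return Math.min(100, Math.round((progress.sent / progress.total) * 100));
}

// Poll the progress endpoint and feed the percentage to the UI.
function trackDownload(progressUrl, onPercent) {
  const timer = setInterval(async () => {
    const p = await (await fetch(progressUrl)).json();
    onPercent(percentDone(p)); // e.g. widen the progress bar
    if (p.sent >= p.total) clearInterval(timer);
  }, 500);
}
```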
Second:
Apache 2.4 has Lua language built in:
mod_lua
Creating hooks and scripts with mod_lua
I bet you can write a Lua Apache handler that monitors your download - send progress to the DB, and update the progress bar with PHP/AJAX reading the progress info from the DB.
Similarly, there are modules for Perl and even Python (but not for Windows).
I see the main problem in this: in a PHP+Apache solution, output buffering can happen in several places:
Browser <= 1 => Apache <= 2 => PHP handler <= 3 => PHP interpreter process
You need to control the first buffer, but that is impossible directly from PHP.
Possible solutions:
1) You can write your own mini daemon whose primary function is only to send files, run it on a port other than 80 (8880, for example), and process downloads and monitor the output buffer from there. Your output buffer will then be the only one, and you can control it:
Browser <= 1 => PHP interpreter process
2) You can also use mod_lua and control the output buffers directly from Apache.
3) You can also use nginx and control its output buffers using the built-in Perl module (it is stable).
4) Try the PHP built-in web server and control the PHP output buffer directly. I can't say how stable it is, sorry - but you can try. ;)
I think nginx + PHP + built-in Perl is the most stable and powerful solution, but you may choose another option not on this list. I will follow this topic and await your final solution with interest.
Reading from and writing to the database at short intervals kills performance.
I would suggest using sessions instead (incrementing the sent-byte counter inside the loop); a separate PHP file can then safely read that value and return it as JSON, which your JavaScript function/plugin can use.
In my Tomcat container I have created a JSP page in which I render a chart using pure JavaScript.
Now I want to export this JavaScript-rendered chart using a server-side exporting technology like FusionCharts. I have all the required jar files (FCExporter.jar, etc.) in my Tomcat container for this Java EE exporting feature.
I got the answer.
I created a servlet in the Tomcat container and pass the export request to it from the JS:
<chart exportEnabled="1" exportAction="Save" exportAtClient="0"
html5ExportHandler="http://localhost:8085/FusionCharts_J2EE/JSP/ExportExample/IMGExporter"
caption="Brand Winner" yAxisName="Brand Value ($ m)"
xAxisName="Brand" bgColor="F1F1F1" showValues="0" canvasBorderThickness="1"
canvasBorderColor="999999" plotFillAngle="330" plotBorderColor="999999"
showAlternateVGridColor="1" divLineAlpha="0">
In the servlet I read the SVG with request.getParameter("svg"); and generated a JPEG of the chart with the following command:
java -jar batik-rasterizer.jar -d D:\ -m image/jpeg samples/out.svg
hurray....
It is possible with some tweaking.
Please read: http://docs.fusioncharts.com/charts/contents/?exporting-image/ECPureJS.html#ownserver
[section: Setup your own server to process and export JavaScript charts ]
This tells you how to set up your own server for server side export of JS charts.
Well, you need PHP running on your server and the Batik jar downloaded, as stated in the steps listed on the above page.
Once set up, edit the export's index.php file to save the generated image to your server location.
If you do not have PHP, you will need to write Java EE code that does the same work as index.php.
OR
[added after a while]
If you'd rather not use any client-side rendering, and instead do silent server-side creation of chart images, you can follow http://fcimg.org/