I am invoking Java classes from a JavaScript engine, so I am limited in what I can do.
The code works fine, but when I invoke the JavaScript in a loop, it times out after the first invocation.
The code just downloads a ZIP from a URL and stores it to a path. Here is the full code:
var result = (function() {
    var URL = Packages.java.net.URL;
    var Files = Packages.java.nio.file.Files;
    var Paths = Packages.java.nio.file.Paths;
    var StandardCopyOption = Packages.java.nio.file.StandardCopyOption;

    var urlAddr = new URL(target_url);
    var connection = urlAddr.openConnection();
    connection.setRequestMethod("GET");
    connection.setRequestProperty("Authorization", bearer);
    connection.setReadTimeout(20 * 1000);
    connection.connect();

    Files.copy(connection.getInputStream(),
               Paths.get(file_path + new_filename + ".zip"),
               StandardCopyOption.REPLACE_EXISTING);
    return "";
}());
The error I am getting from the server:
Java Exception: java.lang.RuntimeException: java.net.SocketTimeoutException: Read timed out: during call of javax.script.ScriptEngine.eval.
Before anyone suggests it: I can't get away from the Java being called inside the JavaScript.
I thought it could be a connection being kept open.
UPDATE: for some reason this JavaScript always takes 5 minutes to run.
UPDATE2: Files.copy seems to be the issue, since it is the call that takes the 5 minutes. The file is very small, so 5 minutes is not justified.
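In case the stall really is a connection being held open, here is a sketch of the same download with the stream and connection released explicitly. It reuses the same target_url, bearer, file_path and new_filename variables as above; setConnectTimeout() and disconnect() are standard java.net.HttpURLConnection methods, called on the same duck-typed connection object the script already relies on for setRequestMethod():

var result = (function() {
    var URL = Packages.java.net.URL;
    var Files = Packages.java.nio.file.Files;
    var Paths = Packages.java.nio.file.Paths;
    var StandardCopyOption = Packages.java.nio.file.StandardCopyOption;

    var connection = new URL(target_url).openConnection();
    connection.setRequestMethod("GET");
    connection.setRequestProperty("Authorization", bearer);
    connection.setConnectTimeout(20 * 1000);  // also bound the connect phase
    connection.setReadTimeout(20 * 1000);
    connection.connect();

    var input = connection.getInputStream();
    try {
        Files.copy(input,
                   Paths.get(file_path + new_filename + ".zip"),
                   StandardCopyOption.REPLACE_EXISTING);
    } finally {
        input.close();            // release the stream
        connection.disconnect();  // do not keep the connection alive for the next loop iteration
    }
    return "";
}());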
I'm using Electron for a project of mine. I need to pass a URL between windows, which I'm doing via the query string, like this:
function openWindow(url) {
    url = encodeURIComponent(url);
    const remote = require('electron').remote;
    const BrowserWindow = remote.BrowserWindow;
    var win = new BrowserWindow({ width: 800, height: 600 });
    win.loadURL('file://' + __dirname + '/otherwindow.html?url=' + url);
}
On the receiving end (in otherwindow.html) I get the parameter this way:
var urlParam = function(name, w){
    w = w || window;
    var rx = new RegExp('[\?]' + name + '=([^\#]+)'),
        val = w.location.search.match(rx);
    return !val ? '' : val[1];
}
I call this function as:
var decoded=decodeURIComponent(urlParam('url'));
And this all works fine. It's kind of ugly right now, but it works, or so it seems. If I print the decoded URL to the console, it displays correctly, and if I open an Electron window with it, it loads the destination with no problem whatsoever.
Here's the catch: I'm using wcjs-player for this project. The destination page (otherwindow.html) contains a wcjs-player instance, which is supposed to take the decoded URL and play the media located there with .addPlaylist(); or .vlc.play();.
It works fine if I put the destination URL in a variable in the same page and then pass it as a parameter to these two functions. It even works if I apply encodeURIComponent(); and then decodeURIComponent(); to that variable, so encoding the URL is not the problem. I even tried base64 encoding for the hand-off between the pages, with no success.
So judging from all this, I reckon that the problem is not the encoding itself, but the passing between pages. My (probably wrong) theory is that the URL might get somehow very slightly altered (losing/gaining some information, special characters maybe?), which wcjs-player is not prepared to handle, but Chromium is (since there's no problem with the Electron window using the result URL).
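One quick way to test that theory, assuming both windows write to the same DevTools output, is to dump the length and character codes of the URL on each side and compare; dumpUrl is just a throwaway helper invented for this check:

// Throwaway helper: print the length and each character's code so the
// sending and receiving copies of the URL can be compared side by side.
function dumpUrl(label, url) {
    console.log(label, 'length =', url.length);
    for (var i = 0; i < url.length; i++) {
        console.log(label, i, url.charAt(i), url.charCodeAt(i));
    }
}

// In openWindow(), before encodeURIComponent: dumpUrl('sender', url);
// In otherwindow.html: dumpUrl('receiver', decoded);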
I have no idea how to fix this; I've tried all my ideas. I did a fair bit of searching, but didn't really find anything useful. I could solve it another way, but that would involve opening and processing the same file twice (in both windows), which I'm trying to avoid.
I'm pretty new to JavaScript overall, so please excuse me if I'm missing something trivial. Any help is appreciated!
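For comparison, here is a sketch of handing the URL over Electron IPC instead of the query string, under the same setup as above (remote module in use, otherwindow.html loaded into win); the channel name 'media-url' is made up for the example:

// Sending window: push the URL once the new window has finished loading.
function openWindowIpc(url) {
    const remote = require('electron').remote;
    const BrowserWindow = remote.BrowserWindow;
    var win = new BrowserWindow({ width: 800, height: 600 });
    win.loadURL('file://' + __dirname + '/otherwindow.html');
    win.webContents.on('did-finish-load', function() {
        win.webContents.send('media-url', url);  // no encoding/decoding involved
    });
}

// Receiving window (otherwindow.html): listen for the URL and hand it to the player.
require('electron').ipcRenderer.on('media-url', function(event, url) {
    console.log('received:', url);
    // e.g. player.addPlaylist(url);
});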
I am experimenting with iMacros to automate a task in Firefox. I simply want to save the current page as a MAFF archive. The JavaScript that the iMacros forum led me to is this:
// I stuck these variables in just to try something.
var doc = "http://www.traderjoes.com";
var file = "C:\\Export\\Test.maff";
var format = "MAFF";

var MafObjects = {};
Components.utils.import("resource://maf/modules/mafObjects.jsm", MafObjects);

var jobListener = {
    onJobComplete: function(aJob, aResult) {
        if (!Components.isSuccessCode(aResult)) {
            // An error occurred
        } else {
            // The save operation completed successfully
        }
    },
    onJobProgressChange: function(aJob, aWebProgress, aRequest,
                                  aCurSelfProgress, aMaxSelfProgress,
                                  aCurTotalProgress, aMaxTotalProgress) { },
    onStatusChange: function(aWebProgress, aRequest, aStatus, aMessage) { }
};
var saveJob = new MafObjects.SaveJob(jobListener);
saveJob.addJobFromDocument(doc, file, format);
saveJob.start();
I was only getting an error on line 26 because this was sample code. With the little JavaScript I know, I tried to add some variables on the lines before the code starts. The thing is, when I search for syntax examples for the method .addJobFromDocument, I don't find much, just about two results. Is this a standard JavaScript method? Usually with things from the DOM you can find a great deal of information on them.
Does anybody know a way of automating the saving of the currently open tab in Firefox as a MAFF archive and then closing the browser? iMacros was something I came across, and I was glad to see its features, but really I just want to automate, from a command line, saving a URL as a MAFF archive. The doc (that I got from the iMacros forum) also had these code snippets, but I don't have much idea how to use them. Thanks.
var fileUri = Components.classes["@mozilla.org/network/io-service;1"]
                        .getService(Components.interfaces.nsIIOService)
                        .newFileURI(file);
var persistObject = new MafObjects.MafArchivePersist(null, format);
persistObject.saveDocument(doc, fileUri);
Also:
var doc = gBrowser.contentDocument;
var file = Components.classes["@mozilla.org/file/local;1"]
                     .createInstance(Components.interfaces.nsILocalFile);
file.initWithPath("C:\\My Documents\\Test.maff");
var format = "TypeMAFF";
I have a PHP script which acts as a DNode client. Then I have a Node.js DNode server which evaluates code it receives from the PHP client and returns the DOM as HTML. However, Node.js acts strangely to me (being a Node.js newbie): it doesn't return anything, even though the string being returned is not empty. My code is below:
PHP client code using DNode-PHP library:
<?php
require(__DIR__.'/../../vendor/autoload.php');

$loop = new React\EventLoop\StreamSelectLoop();
$dnode = new DNode\DNode($loop);
$dnode->connect(7070, function($remote, $connection) {
    $js  = 'var a = document.createElement("A");';
    $js .= 'document.getElementsByTagName("body")[0].appendChild(a);';
    $remote->zing($js, function($n) use ($connection) {
        print_r($n);
        $connection->end();
    });
});
$loop->run();
?>
Node.js server code:
var dnode = require('dnode');
var jsdom = require("jsdom");
var server = dnode({
    zing: function (n, cb) {
        var document = jsdom.jsdom('<!DOCTYPE html>');
        var window = jsdom.parentWindow;
        eval(n);
        var html = jsdom.serializeDocument(document);
        // console.log(html);
        cb(html);
    }
});
server.listen(7070);
console.log() clearly outputs <!DOCTYPE html><html><head></head><body><a></a></body></html>, which is the expected result, but it never gets to the PHP client. What is strange: if I change the line cb(html); to cb('test');, PHP outputs "test". So the problem must be somewhere on the Node.js side, but I have no idea where to look.
Thanks in advance for any hints.
How are you viewing the response? Through a web browser? If so, then you're depending on whatever you're evaluating in eval(n) to change the DOM of the document... If nothing changes, then you won't end up seeing anything because you'll have an empty DOM other than the html/head/body tags. It would be worth your time confirming that you're getting an empty response back and it's not just an empty DOM.
That being said, the eval function doesn't have any context telling it that you want to execute it against the document/window you declare above. As it is, it is just executing in the context of Node itself, not on the page you are attempting to create. To fix this, try using:
window.eval(n)
If you take a look at the example "Creating a browser-like window object" on the GitHub page for jsdom (https://github.com/tmpvar/jsdom), it will give you a better idea of how exactly to use this package.
What you have above should look something like this:
var document = jsdom.jsdom("<!DOCTYPE html>");
var window = document.parentWindow;
window.eval(n);
var html = jsdom.serializeDocument(document);
cb(html);
Now you'll be executing the Javascript on the DOM you were previously creating :-)
Your problem is not in Node. When I use the server code you show in your question and try with this client code, I get the expected result:
var dnode = require("dnode");
var d = dnode();
d.on('remote', function (remote) {
    var js = 'var a = document.createElement("A");' +
             'document.getElementsByTagName("body")[0].appendChild(a);';
    remote.zing(js, function (s) {
        console.log(s);
    });
});
d.connect('localhost', '7070');
I don't do PHP so I don't know what the problem might be on that side.
I'm trying to write a KDE4 plasmoid in JavaScript, but without success.
I need to fetch some data via HTTP and display it in a Label. That works well, but I also need a regular refresh (once every 10 seconds), and that part is not working.
My code:
inLabel = new Label();
var timer = new QTimer();
var job = 0;
var fileContent = "";

function onData(job, data) {
    if (data.length > 0) {
        var content = new String(data.valueOf());
        fileContent += content;
    }
}

function onFinished(job) {
    inLabel.text = fileContent;
}

plasmoid.sizeChanged = function() {
    plasmoid.update();
}

timer.timeout.connect(getData);
timer.singleShot = false;

getData();
timer.start(10000);

function getData() {
    fileContent = "";
    job = plasmoid.getUrl("http://192.168.0.10/script.cgi");
    job.data.connect(onData);
    job.finished.connect(onFinished);
    plasmoid.update();
}
It fetches the script once and does not refresh it after 10 seconds. Where is my mistake?
It is working just fine here at least (running a recent build from git master); getData() is being called as expected. Can you see any errors in the console?
EDIT: The problem was that getUrl() explicitly sets NoReload for KIO::get(), which causes it to load data from the cache instead of forcing a reload from the server. The solution was to add a query parameter to the URL in order to force a reload.
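A minimal sketch of that workaround, reusing the getData() function from the question; the nocache parameter name is arbitrary and only serves as a cache-buster:

function getData() {
    fileContent = "";
    // A changing query parameter prevents KIO from serving the cached copy.
    var url = "http://192.168.0.10/script.cgi?nocache=" + new Date().getTime();
    job = plasmoid.getUrl(url);
    job.data.connect(onData);
    job.finished.connect(onFinished);
    plasmoid.update();
}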
I'm trying to use the BlobBuilder and FileWriter APIs in HTML to write data to a file.
My problem is that if I use the write function twice, I get an error. The following
code executes OK:
var bb = new window.WebKitBlobBuilder();
bb.append('LOREL');
outWriter.write(bb.getBlob('text/plain'));
But if I change it as follows (trying to write twice):
var bb = new window.WebKitBlobBuilder();
bb.append('LOREL');
outWriter.write(bb.getBlob('text/plain'));
bb.append('LOREL');
outWriter.write(bb.getBlob('text/plain'));
I get an error. The error code is: INVALID_STATE_ERR
Any help would be appreciated.
The issue is that FileWriter.write() is asynchronous, and you're trying to write more data to the file before the first write has completed. According to the spec, a FileException should be thrown if readyState == WRITING, which is likely what's happening in your case. You need something like:
var bb = new window.WebKitBlobBuilder();
bb.append('LOREL');
outWriter.onwrite = function(e) {
    bb.append('LOREL');
    outWriter.write(bb.getBlob('text/plain'));
};
outWriter.write(bb.getBlob('text/plain'));
Also, I hope your code snippet is just an example and you're not actually appending, writing, appending, writing. Otherwise, use one write():
var bb = new window.WebKitBlobBuilder();
bb.append('LOREL');
bb.append('LOREL');
outWriter.write(bb.getBlob('text/plain'));
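If you ever need more than two writes, here is a small sketch of serializing an arbitrary list of chunks, using the same outWriter and WebKitBlobBuilder as above; writeChunks is a made-up helper for the example:

// Write each chunk only after the previous write has finished, so write()
// is never called while the writer is still in the WRITING state.
function writeChunks(writer, chunks) {
    function next() {
        if (chunks.length === 0) return;   // all chunks written
        var bb = new window.WebKitBlobBuilder();
        bb.append(chunks.shift());
        writer.onwriteend = next;          // chain the next chunk
        writer.write(bb.getBlob('text/plain'));
    }
    next();
}

writeChunks(outWriter, ['LOREL', 'LOREL', 'LOREL']);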