Create a platform-independent path string - JavaScript

I'm using the Mozilla Add-on SDK for development and need to create a file on the local system.
Currently I use the statement below but feel it may not cover all platforms.
Running the statement on Windows 7 and Windows XP returns:
console.log(system.platform);
winnt
Running it on Linux returns:
console.log(system.platform);
linux
Is there a more reliable way to create the fullPath string, without having to check contents of system.platform?
pathToFile = Cc["#mozilla.org/file/directory_service;1"]
.getService(Ci.nsIProperties).get("Home", Ci.nsIFile).path;
if (system.platform.indexOf("win") == 0) {
fileSeparator = "\";
}else{
fileSeparator = "/";
}
fullPath=pathToFile + fileSeparator + 'myFile.txt'

Just a little modification to your code should do the trick:
var file = Cc["#mozilla.org/file/directory_service;1"]
.getService(Ci.nsIProperties).get("Home", Ci.nsIFile);
file.append("myFile.txt");
var fullPath = file.path;

I'd like to point out an alternative to @Kashif's answer.
Use FileUtils.getFile(), which is just a convenience function, essentially doing multiple .append()s, one per item in the parts array.
Cu.import("resource://gre/modules/FileUtils.jsm");
var file = FileUtils.getFile("Home", ["myFile.txt"]);
var path = file.path;
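Since the original question is about creating the file (not just building its path), here is a hedged sketch of one way to actually write it, assuming a chrome-privileged context where Cu and OS.File are available (e.g. via require("chrome") in the SDK):
Cu.import("resource://gre/modules/osfile.jsm");
// writeAtomic accepts a string when an encoding is given and returns a promise
OS.File.writeAtomic(path, "hello world", { encoding: "utf-8", tmpPath: path + ".tmp" })
  .then(() => console.log("wrote " + path),
        err => console.error(err));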

The SDK has an 'fs/path' module that has parity with Node's path API.
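For example, a hedged sketch combining it with the SDK's system module (the module ids sdk/fs/path and sdk/system are how later SDK releases expose these, and pathFor("Home") resolves the same directory-service key used above):
const path = require("sdk/fs/path");
const system = require("sdk/system");
// join() behaves like Node's path.join(), so no manual separator handling is needed
const fullPath = path.join(system.pathFor("Home"), "myFile.txt");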

Related

nodejs fs createWriteStream not working with file path prefix

I am trying to call an API, loop through an array of images, assign unique names to each image in the array, and then write them to a local directory. If I simplify the code I can write them to the root folder. I already created the subfolder manually, so it existed prior to running the function.
Here is my basic function:
const imageFolder = './img';

function downloadImage(url, filepath) {
  client.get(url, res => {
    res.pipe(fs.createWriteStream(`${imageFolder}/${filepath}`));
  });
}
...make api call
const imagesArray = generations.data.map(item => item.generation.image_path);
imagesArray.forEach(item => {
  // const fileName = uuid.v4() + '.webp'; // trying to assign unique filename with uuid
  const fileName = new Date().getTime().toString() + '.webp'; // trying to assign unique filename with date object
  downloadImage(item, fileName);
});
If I change
res.pipe(fs.createWriteStream(`${imageFolder}/${filepath}`));
to
res.pipe(fs.createWriteStream(filepath));
then it works, but it just dumps the images in the root. I wondered whether the problem was concatenating a variable with a string (fileName + '.webp'), but that works fine when writing to the root, as mentioned.
I also tried adding the path into the actual function call inside the forEach loop like so
downloadImage(item, `${imageFolder}/${fileName}`);
I did wonder about needing the __dirname variable, or whether it could be a permissions issue, but I don't see any errors.
I am assuming this is pretty straightforward.
OK, this was fairly simple in the end (and I guess I sort of knew it once I got it working). The fix was changing the call to
downloadImage(item, path.join('src', 'img', fileName));
path.join concatenates folder names and fixes issues when working across platforms (macOS, Windows, etc.), which applies here since I am testing from both Windows and Mac.
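Putting it together, a minimal sketch of the fixed version (assuming Node's built-in https module is the "client" used above, and that the src/img folder already exists, since createWriteStream does not create missing directories; the URL is a placeholder):
const fs = require('fs');
const path = require('path');
const https = require('https');

function downloadImage(url, filepath) {
  https.get(url, res => {
    res.pipe(fs.createWriteStream(filepath));
  });
}

const fileName = new Date().getTime().toString() + '.webp'; // unique name, as in the question
downloadImage('https://example.com/some-image.webp',        // placeholder URL
              path.join('src', 'img', fileName));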

How to read a WebAssembly file in JavaScript from the local file system with native tools

I am looking for a way to structure my JavaScript test so that it reads a simple binary WebAssembly (.wasm) file from the local file system, i.e. not in a browser application, and without using third-party tools like Node. So far I have found that this can be done with Node's fs object, but I do not want to load such a huge tool only to read one file.
That is, I am looking for a way to replace a Node call like this:
var file = fs.readFileSync('myTestFile.wasm');
var buffer = new Uint8Array(file).buffer;
What would that look like in JavaScript without Node and without a browser?
If you base64-encode the .wasm file then you can include it directly in the JavaScript like this:
Module.wasmBinaryFile = "data:application/wasm;base64,AGFzbQEAAAAByQ/AAWACf38Bf2A...";
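If you would rather not rely on Emscripten's Module helper, a hedged sketch of decoding such an embedded string yourself (this assumes atob() is available in your environment; some shells may need a small base64 decoder instead):
// truncated placeholder string, not a complete module
const wasmBase64 = "AGFzbQEAAAAByQ/AAWACf38Bf2A...";
const bytes = Uint8Array.from(atob(wasmBase64), c => c.charCodeAt(0));
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes), {});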
All the major JavaScript engines have a non-browser build which runs on the command line: JSC has jsc, V8 has d8, SpiderMonkey has js, and ChakraCore has ch.
Those are used by each browser vendor for testing, and inevitably we sometimes need to read ASCII or binary files. There's unfortunately not really a standard for such functionality, but I've found that this works for my purpose:
const readAsBinary = filename => {
  if (typeof process === 'object' && typeof require === 'function') {
    // Node.js: read with fs and wrap the Buffer in a Uint8Array if needed.
    const binary = require('fs').readFileSync(filename);
    return !binary.buffer ? new Uint8Array(binary) : binary;
  } else {
    // Engine shells: use readbuffer() where it exists, otherwise read(..., 'binary').
    return typeof readbuffer === 'function'
      ? new Uint8Array(readbuffer(filename))
      : read(filename, 'binary');
  }
};
const instance = new WebAssembly.Instance(new WebAssembly.Module(readAsBinary(filename)), {});
This will only work in node.js or an engine's shell, and not in a browser.
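Usage then depends on what the module exports, for example (add is a hypothetical export name, not one from the question's module):
const result = instance.exports.add(2, 3); // hypothetical exported function
console.log(result);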

How do you open Photoshop with JavaScript?

So I have a personal website, and I have a button that I want to use to open Photoshop and run a script for me. How do I do this?
It is not possible. It would pose a huge security risk to allow JavaScript to open programs on the client side.
This has been up for a while, but the solution to this in Node is to use a child process.
The child_process module is built into Node, so there is nothing extra to install.
The code to run an executable looks like this:
const exec = require("child_process").execFile;
var child = exec("Photoshop.exe", [/* add options here */], { cwd: "C:/*path to photoshop*" });
You can do a lot of useful things afterwards, like attaching event handlers:
process.on("close", code => {
console.log("process closed with code: "+ code)
})
process.on("exit", code => {
console.log("process exited with code: "+ code)
})
process.stdout.on("data", data => {
console.log(data)
})
You can read the Docs here: https://nodejs.org/api/child_process.html
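Since the button lives on a web page, the executable launch has to happen on a machine you control, typically behind a small server endpoint. A hedged sketch using Express (the route name, install path, and script path are illustrative assumptions, not values from the question):
const express = require("express");
const { execFile } = require("child_process");

const app = express();

// The page's button would POST to this (hypothetical) endpoint.
app.post("/open-photoshop", (req, res) => {
  // Passing a .jsx path as an argument typically makes Photoshop run that script.
  execFile("C:/Program Files/Adobe/Adobe Photoshop/Photoshop.exe",
           ["C:/scripts/myScript.jsx"],
           err => {
             if (err) return res.status(500).send(err.message);
             res.send("Photoshop launched");
           });
});

app.listen(3000);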
Maybe this will help.
This code opens an image in Photoshop with the help of JavaScript. You just need to place your image file into Photoshop's Samples folder, nothing more than that, and you are done.
var fileRef = new File(app.path.toString() + "/Samples/test.jpg"); // 'Samples' is a folder inside Program Files\Adobe\Adobe Photoshop CS5
//open (fileRef);
var doc = open(fileRef);
// get document name (and remove file extension);
// note: the original snippet never defines tempName, so this line is assumed
var tempName = doc.name.split(".");
var name = tempName[0];
// convert to RGB; convert to 8-bpc; merge visible
doc.changeMode(ChangeMode.RGB);
doc.bitsPerChannel = BitsPerChannelType.EIGHT;
doc.artLayers.add();
doc.mergeVisibleLayers();
// rename layer; duplicate to new document
// (newDoc is assumed to be another open document, created elsewhere in the original script)
var layer = doc.activeLayer;
layer.name = tempName[0];
layer.duplicate(newDoc, ElementPlacement.PLACEATBEGINNING);
// close imported document
doc.close(SaveOptions.DONOTSAVECHANGES);

createBlockBlob and commitBlobBlocks create empty files in BlobStorage

I'm developing a web app that can upload large files into Azure Blob Storage.
As a backend, I am using Windows Azure Mobile Services (the web app will generate content for mobile devices) in Node.js.
My client can successfully send chunks of data to the backend, and everything looks fine, but at the end the uploaded file is empty. The data upload was prepared by following this tutorial: it works perfectly when the file is small enough to be uploaded in a single request. The process fails when the file needs to be broken into chunks. It uses the ReadableStreamBuffer from the tutorial.
Can somebody help me?
Here the code:
Back-end : createBlobBlockFromStream
[...]
//Get references
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
//console.log(request.body);
var blobName = request.body.file;
var blobExt = request.body.ext;
var blockId = request.body.blockId;
var data = new Buffer(request.body.data, "base64");
var stream = new ReadableStreamBuffer(data);
var streamLen = stream.size();
var blobFull = blobName+"."+blobExt;
console.log("BlobFull: "+blobFull+"; id: "+blockId+"; len: "+streamLen+"; "+stream);
var blobService = azure.createBlobService(accountName, accountKey, host);
//console.log("blockId: "+blockId+"; container: "+container+";\nblobFull: "+blobFull+"streamLen: "+streamLen);
blobService.createBlobBlockFromStream(blockId, container, blobFull, stream, streamLen,
    function(error, response) {
        if (error) {
            request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
        } else {
            request.respond(statusCodes.OK, { message: "block created" });
        }
    });
[...]
Back-end: commitBlobBlock
[...]
var azure = require('azure');
var qs = require('querystring');
var appSettings = require('mobileservice-config').appSettings;
var accountName = appSettings.STORAGE_NAME;
var accountKey = appSettings.STORAGE_KEY;
var host = accountName + '.blob.core.windows.net';
var container = "zips";
var blobName = request.body.file;
var blobExt = request.body.ext;
var blobFull = blobName+"."+blobExt;
var blockIdList = request.body.blockList;
console.log("blobFull: "+blobFull+"; blockIdList: "+JSON.stringify(blockIdList));
var blobService = azure.createBlobService(accountName, accountKey, host);
blobService.commitBlobBlocks(container, blobFull, blockIdList, function(error, result) {
    if (error) {
        request.respond(statusCodes.INTERNAL_SERVER_ERROR, error);
    } else {
        request.respond(statusCodes.OK, result);
        blobService.listBlobBlocks(container, blobFull);
    }
});
[...]
The second method returns the correct list of blockIds, so I think that the second part of the process works fine. I think it is the first method that fails to write the data inside the blocks, as if it creates empty blocks.
In the client-side, I read the file as an ArrayBuffer, by using the FileReader JS API.
Then I convert it into a Base64-encoded string using the following code. This approach works perfectly if I create the blob in a single call, which is fine for small files.
[...]
//data contains the ArrayBuffer read by the FileReader API
var requestData = new Uint8Array(data);
var binary = "";
for (var i = 0; i < requestData.length; i++) {
    binary += String.fromCharCode(requestData[i]);
}
[...]
Any idea?
Thank you,
Ric
Which version of the Azure Storage Node.js SDK are you using? It looks like you might be using an older version; if so I would recommend upgrading to the latest (0.3.0 as of this writing). We’ve improved many areas with the new library, including blob upload; you might be hitting a bug that has already been fixed. Note that there may be breaking changes between versions.
Download the latest Node.js Module (code is also on Github)
https://www.npmjs.org/package/azure-storage
Read our blog post: Microsoft Azure Storage Client Module for Node.js v. 0.2.0 http://blogs.msdn.com/b/windowsazurestorage/archive/2014/06/26/microsoft-azure-storage-client-module-for-node-js-v-0-2-0.aspx
If that’s not the issue, can you check a Fiddler trace (or equivalent) to see if the raw data blocks are being sent to the service?
Not sure if you're still suffering from this problem, but I was experiencing the exact same thing and came across this while looking for a solution. Well, I found one and thought I'd share it.
My problem was not with how I push the blocks but with how I committed them. My little proxy server had no knowledge of prior commits; it just pushes the data it's sent and commits it. The trouble was that I wasn't providing the commit request with the previously committed blocks, so it was overwriting them with the current commit each time.
So my solution:
var opts = {
    UncommittedBlocks: [idOfJustCommittedBlock],
    CommittedBlocks: [idsOfPreviouslyCommittedBlocks]
};
blobService.commitBlobBlocks('containerName', 'blobName', opts, function(e, r) {});
For me, the bit that broke everything was the format of the opts object: I wasn't providing an array of previously committed block names. It's also worth noting that I had to base64 decode the existing block names, as:
blobService.listBlobBlocks('containerName', 'fileName', 'type IE committed', fn)
Returns an object for each block with the name being base64 encoded.
Just for completeness, here's how I push my blocks; req is from the Express route:
var blobId = blobService.getBlockId('blobName', 'lengthOfPreviouslyCommittedArray + 1 as Int');
var length = req.headers['content-length'];
blobService.createBlobBlockFromStream(blobId, 'containerName', 'blobName', req, length, fn);
Also, with the upload I had a strange issue where the content-length header caused it to break, so I had to delete it from the req.headers object.
Hope this helps and is detailed enough.
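Tying that together, a hedged sketch of a commit that preserves the previously committed blocks (the listBlobBlocks result shape and the idOfJustCommittedBlock variable are assumptions based on the description above, and may differ between SDK versions):
blobService.listBlobBlocks('containerName', 'blobName', 'committed', function(err, list) {
    if (err) return console.error(err);
    // decode the base64-encoded names of the blocks that are already committed
    var committed = list.CommittedBlocks.map(function(b) {
        return new Buffer(b.Name, 'base64').toString();
    });
    var opts = {
        UncommittedBlocks: [idOfJustCommittedBlock], // placeholder: the block just pushed
        CommittedBlocks: committed
    };
    blobService.commitBlobBlocks('containerName', 'blobName', opts, function(e, r) {
        if (e) return console.error(e);
        console.log('committed blocks', r);
    });
});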

Thunderbird extension - How to use relative paths with JavaScript?

I'm developing my own Thunderbird extension.
The extension adds an .xml file as an attachment to a Thunderbird mail (it works very well).
My only problem is that I don’t know how to use a relative path.
It looks something like that:
var file= 'C:\\...[… \\…]...\\chrome\\VHitG2.xml';
var attachments = [];
attachments.push(FileToAttachment(file));
AddAttachments(attachments);
If the extension is installed in a different path, the extension can’t work.
Does anyone know how to use relative paths?
The FileToAttachment() function doesn't do magic; it is actually very simple. I assume that you are talking about a static file that is part of your extension - it should be accessible under a URL like chrome://myextension/content/VHitG2.xml. Then you can simply create an nsIMsgAttachment instance yourself using that URL:
var attachment = Components.classes["@mozilla.org/messengercompose/attachment;1"]
                           .createInstance(Components.interfaces.nsIMsgAttachment);
attachment.url = "chrome://myextension/content/VHitG2.xml";
AddAttachments([attachment]);
Note that your extension doesn't need to be installed unpacked for this; you don't need an actual file on disk.
I used a very circuitous way to get the URL of the extension’s files:
Components.utils.import("resource://gre/modules/FileUtils.jsm");
var test1 = FileUtils.getFile("CurProcD", ["VHitG2.xml"]);
var test2 = FileUtils.getFile("CurProcD", ["VHitG.xml"]);
var file1 = test1.path.replace(/VHitG2.xml/i, "extensions\\custom-toolbar-button@example.com\\chrome\\VHitG2.xml");
var file2 = test2.path.replace(/VHitG.xml/i, "extensions\\custom-toolbar-button@example.com\\chrome\\VHitG.xml");
var attachment1 = file1.replace(/\\/g, "\\\\");
var attachment2 = file2.replace(/\\/g, "\\\\");
