Okay, I'm not completely sure how to phrase this, but I'll try my best.
I have been trying to use Node's vm module, and I want to enable Buffer support for the code running within the vm.
Here is the initial code I was using:
let vm = require('vm');
let someCode = '... some unsafe code ...';
let result = vm.runInNewContext(
someCode,
{ Buffer: Buffer }
);
However, as I quickly discovered, this hands the vm the host process's Buffer class, so if someone modifies Buffer.prototype within the vm, the change also affects Buffer outside the vm.
Therefore, after looking at the docs I tried changing the sandbox object to:
{ Buffer: require('buffer').Buffer }
After some checking, I discovered to my dismay that this doesn't work as expected either, since require somehow returns the same object:
let b = require('buffer').Buffer;
console.log(b === Buffer); // This becomes true
Is it possible to create a new Buffer class using the vm, or will I have to further safeguard by using child_process in some fancy way?
I am working on a script that runs during our build process in Jenkins right before npm install. My issue is that I need to download a JavaScript file from an external resource and read a variable from it.
unzipper.on('extract', () => {
const content = fs.readFileSync(`${outputDir}/js/en.js`, 'utf8');
eval(content); // Less smelly alternative?
if (obj) {
const str = JSON.stringify(obj);
fs.writeFileSync(`${outputDir}/public/data.json`, str);
} else {
throw 'Variable `obj` not found';
}
});
I know that "eval is evil", but any suggested alternatives I've found online don't seem to work.
I have tried different variations of new Function(obj)(), but Node seems to exit the script after (the if-case never runs).
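I suspect the reason is that code run through new Function gets its own function scope, so a var obj declared inside never becomes visible to the caller; the later bare reference to obj then throws a ReferenceError, which looks like Node exiting early. A minimal sketch:

```javascript
// `var obj` inside the Function body stays local to that function:
new Function("var obj = { a: 1 };")();
console.log(typeof obj); // 'undefined': the variable never escaped

// Returning it explicitly does work:
const extracted = new Function("var obj = { a: 1 }; return obj;")();
console.log(extracted.a); // 1
```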
Ideas?
Since Node.js provides an API to talk to the V8 runtime directly, it might be a good idea to use it. Basically, it's the API used by Node's require under the hood.
Assuming the js file in question contains the variable obj we're interested in, we do the following:
- read the code from the file
- append ; obj to the code so that it's the last expression evaluated
- pass the code to V8 for evaluation
- grab the return value (which is our obj):
const fs = require('fs'),
vm = require('vm');
const code = fs.readFileSync('path-to-js-file', 'utf8');
const obj = vm.runInNewContext(code + ';obj');
This answer is heavily based on @georg's comments, but since it helped me I'll provide it as an alternative answer.
Explanation in the comments.
let content = fs.readFileSync(`${outputDir}/js/en.js`, 'utf8');
content += '; module.exports=obj'; // Export "obj" variable
fs.writeFileSync(`${outputDir}/temp`, content); // Create a temporary file
const obj = require(`${outputDir}/temp`); // Import the variable from the temporary file
fs.unlinkSync(`${outputDir}/temp`); // Remove the temporary file
I have a pretty straightforward piece of TypeScript code that parses a specific data format; the input is a Uint8Array. I've optimized it as far as I can, but I think this rather simple parser should be able to run faster than I can make it run as JS. I wanted to try writing it in WebAssembly using AssemblyScript to make sure I'm not running into any quirks of the JavaScript engines.
As I figured out now, I can't just pass a TypedArray to Wasm and have it work automatically. As far as I understand, I can pass a pointer to the array and should be able to access this directly from Wasm without copying the array. But I can't get this to work with AssemblyScript.
The following is a minimal example that shows how I'm failing to pass an ArrayBuffer to Wasm.
The code to set up the Wasm export is mostly from the automatically generated boilerplate:
const fs = require("fs");
const compiled = new WebAssembly.Module(
fs.readFileSync(__dirname + "/build/optimized.wasm")
);
const imports = {
env: {
abort(msgPtr, filePtr, line, column) {
throw new Error(`index.ts: abort at [${line}:${column}]`);
}
}
};
Object.defineProperty(module, "exports", {
get: () => new WebAssembly.Instance(compiled, imports).exports
});
The following code invokes the WASM, index.js is the glue code above.
const m = require("./index.js");
const data = new Uint8Array([1, 2, 3, 4, 5, 6, 7, 8]);
const result = m.parse(data.buffer);
And the AssemblyScript that is compiled to WASM is the following:
import "allocator/arena";
export function parse(offset: usize): number {
return load<u8>(offset);
}
I get a "RuntimeError: memory access out of bounds" when I execute that code.
The major problem is that the errors I get back from Wasm are simply not helpful to figure this out on my own. I'm obviously missing some major aspects of how this actually works behind the scenes.
How do I actually pass a TypedArray or an ArrayBuffer from JS to Wasm using AssemblyScript?
In AssemblyScript, there are many ways to read data from memory. The quickest way to get at this data is to use a linked function in your module's function imports to pass a pointer to the data out to JavaScript.
let myData = new Float64Array(100); // have some data in AssemblyScript
// We should specify the location of our linked function
@external("env", "sendFloat64Array")
declare function sendFloat64Array(pointer: usize, length: i32): void;
/**
* The underlying array buffer has a special property called `data` which
* points to the start of the memory.
*/
sendFloat64Array(myData.buffer.data, myData.length);
Then in JavaScript, we can use the Float64Array constructor inside our linked function to read the values directly.
/**
* This is the fastest way to receive the data. Add a linked function like this.
*/
imports.env.sendFloat64Array = function sendFloat64Array(pointer, length) {
  // `data` is a live view over the wasm module's memory; no copy is made.
  var data = new Float64Array(wasmmodule.memory.buffer, pointer, length);
};
However, there is a much clearer way to obtain the data, and it involves returning a reference from AssemblyScript, and then using the AssemblyScript loader.
let myData = new Float64Array(100); // have some data in AssemblyScript
export function getData(): Float64Array {
return myData;
}
Then in JavaScript, we can use the ASUtil loader provided by AssemblyScript.
import { instantiateStreaming, ASUtil } from "assemblyscript/lib/loader";
let wasm: ASUtil = await instantiateStreaming(fetch("myAssemblyScriptModule.wasm"), imports);
let dataReference: number = wasm.getData();
let data: Float64Array = wasm.getArray(Float64Array, dataReference);
I highly recommend using the second example for code clarity reasons, unless performance is absolutely critical.
Good luck with your AssemblyScript project!
I have Jasmine tests that are run with Karma. These tests cover an object with static properties that are used to control behavior. Altering these static properties has detrimental effects on tests that do not expect them to deviate from the default. More concretely, this test:
it('honors the base64CharactersPerLine option', () => {
const testData = new Uint8Array([ 0x01, 0x02, 0x03, 0x04 ]);
const pem = new PEMObject();
pem.header = "CERTIFICATE";
pem.data = testData;
PEMObject.base64CharactersPerLine = 1; // Causes race condition
const encodedData = pem.encode();
console.info(encodedData);
console.info(encodedData.match(/^\w/g));
expect(encodedData.match(/^\w$/g).length).toBeGreaterThan(testData.length);
});
is adversely affecting this test:
it('decoding then encoding returns the original data', () => {
const pem = new PEMObject();
pem.header = "CERTIFICATE";
pem.decode(testPEM);
expect(pem.encode()).toEqual(testPEM);
});
by making the output of pem.encode() in the latter test not identical to the original input testPEM (a string).
The more general question I would like to know is: how do you prevent race conditions in Karma/Jasmine tests where static members are used to modify behavior?
Are you sure it's a race condition, and not just an issue caused by modifying an object that many tests share?
You can perhaps save the old value of the base64CharactersPerLine field before the test is run and restore it afterwards, so that other tests see the right value.
Assigning a new value to a static value in a test will affect all other tests that run later.
There are two ways to fix this. First, in a beforeEach block, cache the original value of the static property, and then reset it in an afterEach block. Like this:
let cachedValue;
beforeEach(() => cachedValue = PEMObject.base64CharactersPerLine);
afterEach(() => PEMObject.base64CharactersPerLine = cachedValue);
This will work, but will be complicated to maintain and reason about if you have lots of static properties you are manipulating during your tests.
A better approach is to switch from static properties to static methods (though this requires a change to your base code). Once you're using static methods, you can use Jasmine spies, which are cleaned up automatically after each test run. So you would do something like this:
// in your base code
PEMObject.getBase64CharactersPerLine = () => SOME_VALUE;
// in your 'honors the base64CharactersPerLine option' test
spyOn(PEMObject, 'getBase64CharactersPerLine').and.returnValue(1);
And you no longer need beforeEach/afterEach blocks.
I noticed that if I execute a JavaScript script using the mongo command, the script can treat a cursor object as if it was an array.
var conn = new Mongo('localhost:27017');
var db = conn.getDB('learn');
db.test.remove({});
db.test.insert({foo: 'bar'});
var cur = db.test.find();
print(cur[0].foo); //prints: bar
print(cur[1]); // prints: undefined
This seems like it should be beyond the capabilities of the JavaScript language, since there is no way to "overload the subscript operator". So how does this actually work?
As the documentation says, this is a special ability of the shell. It automagically converts cursor[0] into cursor.toArray()[0]. You can prove it by overriding toArray() with a print function, or by printing new Error().stack to get the call stack back. Here it is:
at DBQuery.a.toArray ((shell):1:32)
at DBQuery.arrayAccess (src/mongo/shell/query.js:290:17)
at (shell):1:2
As you can see, indexing calls arrayAccess. How? In the C++ source there is a dbQueryIndexAccess function, which calls arrayAccess.
v8::Handle<v8::Value> arrayAccess = info.This()->GetPrototype()->ToObject()->Get(
v8::String::New("arrayAccess"));
...
v8::Handle<v8::Function> f = arrayAccess.As<v8::Function>();
...
return f->Call(info.This(), 1, argv);
And here we have the code that sets the indexed-property handler to this function. WOW, the V8 API gives us the ability to add this handler!
DBQueryFT()->InstanceTemplate()->SetIndexedPropertyHandler(dbQueryIndexAccess);
... and injects it into the JS cursor class, which is originally defined in JS.
injectV8Function("DBQuery", DBQueryFT(), _global);
TL;DR: it is hacked into the C++ source code of the mongo shell.
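For comparison, modern JavaScript can emulate the same indexed access in pure JS with a Proxy (this is not what the mongo shell does; the names below are made up for illustration):

```javascript
// A stand-in "cursor" whose toArray() yields the documents:
const fakeCursor = { toArray: () => [{ foo: 'bar' }] };

// Intercept numeric property access and forward it to toArray():
const cur = new Proxy(fakeCursor, {
  get(target, prop) {
    if (typeof prop === 'string' && /^\d+$/.test(prop)) {
      return target.toArray()[Number(prop)];
    }
    return target[prop];
  }
});

console.log(cur[0].foo); // 'bar'
console.log(cur[1]);     // undefined
```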
So I got the dreaded Illegal Invocation error in Chrome. I'm using the Web Audio API (HTML5 is awesome, btw) to build a framework of audio effects and signal routing. Basically it automatically routes the nodes based on where in the array they are. But I included my own class/objects for complex effects, or modules. These have multiple nodes within them and can delegate what goes where as if it were a single input/output node.
Anyway, when the object is created it throws the error right at the point of node creation (the private one within the module). I've read about this happening a lot with console, or with loss of the inherent "this". But I don't know why it's happening here, since the variable holding the node constructor is a global variable. Any ideas?
var context, nodes;
var Delay = function(_context, _time, _feedback, _wet) {
this.type = "Delay";
this.delay = new context.createDelayNode(); //Console points error here.
this.feedback = new context.createGainNode();
this.crossfade = new context.createGain();
}
function GotStream(stream) {
context = new AudioContext();
nodes = [ new Delay(1,1,1) ]; //Start point of error
...
}
Obviously it's stripped down a bit, but that is the exact nesting of functions/variables. I should note I've tried adding .call(window) and .bind(window) to the node constructor, with no luck.
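A plain-JS analogue of what `new context.createDelayNode()` does may clarify the error: calling a method with `new` invokes it as a constructor, so `this` inside is a fresh object rather than the audio context, which is the situation Chrome reports as an illegal invocation for host objects (Host and createThing below are made-up names):

```javascript
function Host() { this.secret = 42; }
Host.prototype.createThing = function () {
  // Native Web Audio factory methods perform a check much like this one:
  if (!(this instanceof Host)) throw new TypeError('Illegal invocation');
  return { from: this.secret };
};

const ctx = new Host();
console.log(ctx.createThing().from); // 42: plain method call, `this` is ctx

try {
  new ctx.createThing(); // `new` gives the method a fresh `this`, not ctx
} catch (e) {
  console.log(e.message); // 'Illegal invocation'
}
```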