Loading polyfills conditionally & bundles - javascript

I'm writing a small library (for the first time) that uses the Resize Observer API. I want to include the polyfill for browsers that don't support it, but only load it into the final build if it's required. I'm not sure where to place the API check/module load, or whether this approach is the right way to do it.
Say I have my index file, which is the main file exported with the library. Should I add this check outside the function definition, like so:
// index.js
(async () => {
  if (!('ResizeObserver' in window)) {
    const module = await import('resize-observer-polyfill');
    window.ResizeObserver = module.default; // the polyfill is the module's default export
  }
})();

// main function exported
export default () => {
  function loadResizeObserver() {
    const ro = new ResizeObserver((entries) => {
      // ...
    });
  }
};
Is there a better way to compose something like this? How will this affect the build, and what's included if a consumer installs the package (with and without native ResizeObserver support)?
This is my first time publishing a package, and I realized I know nothing about it.
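For what it's worth, one common pattern is to expose an async entry point that awaits the polyfill check before any observer is created. A minimal sketch, assuming the consumer's bundler supports dynamic import() and that resize-observer-polyfill exposes a default export:
// index.js (sketch, not necessarily the library's final API)
async function ensureResizeObserver() {
  if (!('ResizeObserver' in window)) {
    // bundlers that understand dynamic import() split this into its own chunk
    const module = await import('resize-observer-polyfill');
    window.ResizeObserver = module.default;
  }
}

export default async function observeResize(element, callback) {
  await ensureResizeObserver();
  const ro = new ResizeObserver(callback);
  ro.observe(element);
  return ro;
}
With this shape, the polyfill ends up as a separate, lazily loaded chunk of the consumer's bundle, and browsers with native support never fetch it at runtime.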

Related

Can I allow dynamic loading of JS modules from the client side?

In my web app I want to give users the ability to author their own modules for use in the app. These are ideally written in JS on the client side.
However, I'm having trouble making classes created on the client side accessible to the app. Originally I wanted some kind of module import, but dynamic imports still require a path, which isn't accessible to the browser for security reasons. Just importing the JS via a script tag pollutes the global namespace, which isn't ideal either.
Is there some sensible way to do this? I would ideally want to import the class, e.g.
// MyClass.js, on the client side
export default class MyClass {
  myPrint() {
    console.log('ya blew it');
  }
}
And then in App.js:
import(somehow_get_this_path).then((MyClass) => { /* etc. */ });
Is getting that path possible? The current namespace-polluting method uses a file-select dialog, but that doesn't let me tell import what path the file has; all you get is a Blob. I'm pretty new to this stuff, so apologies if this question is dumb.
edit: I've tried getting an object URL using URL.createObjectURL, which gives the error:
Error: Cannot find module 'blob:null/d651a896-d568-437f-86d0-72ebcee7bc56'
If you are using webpack as a bundler, you can use magic comments to lazy-load the components.
Otherwise, you can use dynamic imports:
import('path_to_my_class').then(MyClass => {
  // Do something with MyClass
});
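For reference, the magic-comment variant looks like this (a sketch; the chunk name is arbitrary):
// webpack emits './MyClass.js' as a separate, lazily loaded chunk named "my-class"
import(/* webpackChunkName: "my-class" */ './MyClass.js').then((module) => {
  const MyClass = module.default;
  new MyClass().myPrint();
});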
Edit 1:
You can use this code to get a working local URL for the uploaded JS file. Try using this:
const path = (window.URL || window.webkitURL).createObjectURL(file);
console.log('path', path);
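Note that the resulting blob: URL will only resolve through a native dynamic import(); a bundler-rewritten import (webpack's, for instance) cannot find it, which is consistent with the "Cannot find module 'blob:...'" error above. A sketch of the full flow, assuming a hypothetical file input element:
// assumes: <input type="file" id="plugin-file"> (hypothetical id)
document.getElementById('plugin-file').addEventListener('change', (event) => {
  const file = event.target.files[0];
  const url = (window.URL || window.webkitURL).createObjectURL(file);
  // must be a native import(), not one transformed by a bundler
  import(url).then((module) => {
    const MyClass = module.default;
    new MyClass().myPrint();
  });
});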
Edit 2:
Alternatively, you can create your own object at a global level and wrap the file using an anonymous function (creating closure) such that it won't pollute the global namespace.
// you can add other methods to it according to your use case.
window.MyPlugins = {
  plugins: [],
  registerPlugin: function (plugin) {
    this.plugins.push(plugin);
  }
};
// MyFirstPlugin.js
// you can inject this code by creating a script tag.
(function (w) { // closure
  function MyFirstPlugin() {
    this.myPrint = function () {
      console.log('ya blew it');
    };
  }
  w.MyPlugins.registerPlugin(new MyFirstPlugin());
})(window);

// get the reference to the plugins in some other file
window.MyPlugins.plugins
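For example, calling the example plugin's method from that other file would look like this (a sketch using the names above):
// invoke the method registered by MyFirstPlugin.js
window.MyPlugins.plugins[0].myPrint(); // logs "ya blew it"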
Read file and use eval
<script>
  function importJS(event) {
    var input = event.target;
    var reader = new FileReader();
    reader.onload = function () {
      eval(reader.result);
      input.value = '';
    };
    reader.readAsText(input.files[0]);
  }
</script>
Select a *.js file and execute
<br>
<br>
<input type="file" onchange="importJS(event);">

Run rollup plugin when everything on build is finished

I use the experimentalCodeSplitting: true feature of rollup 0.61.2 to get nice code splitting. Because my project also contains assets, I created a plugin which copies and minifies the asset files accordingly. The problem is that the hooks I used are called for every chunk that is created, so the assets are copied and minified multiple times. The only workaround I found is to set a flag to true after everything is done. Is there a rollup hook that is called once after (or before) everything is finished, rather than on every chunk? My plugin currently looks something like the following code (I removed some parts and simplified for readability):
export default function copy(userOptions = {}) {
  const name = 'copyAndMinify';
  const files = userOptions.files || [];
  let isCopyDone = false;
  return {
    name: name,
    // also tried onwrite, ongenerate, buildEnd and generateBundle
    buildStart() {
      if (isCopyDone) {
        return;
      }
      for (let key in files) {
        const src = key;
        const dest = files[key];
        try {
          minifyFile(src, dest);
        } catch (err) {
          fatal(name, src, dest, err);
        }
      }
      isCopyDone = true;
    }
  };
}
Maybe there is a better way of doing this kind of thing, because with this implementation I always have to completely restart rollup to execute my plugin again.
The rollup site lists all the available plugin hooks.
generateBundle seems like what you'd want.
generateBundle (formerly onwrite and ongenerate) - a ( outputOptions, bundle, isWrite ) => void function hook called when bundle.generate() or bundle.write() is being executed; you can also return a Promise. bundle provides the full list of files being written or generated along with their details.
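As a rough sketch of the plugin using that hook (reusing minifyFile and fatal from the question; exact hook timing may differ across rollup versions):
export default function copy(userOptions = {}) {
  const name = 'copyAndMinify';
  const files = userOptions.files || [];
  return {
    name: name,
    // runs once per bundle.generate()/bundle.write(), not once per chunk
    generateBundle(outputOptions, bundle, isWrite) {
      for (const src of Object.keys(files)) {
        const dest = files[src];
        try {
          minifyFile(src, dest);
        } catch (err) {
          fatal(name, src, dest, err);
        }
      }
    }
  };
}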

Evaluate JS file with template strings from another file

I would like to make use of a function called executeJavaScript() from the Electron webContents API. Since it is very close to eval() I will use this in the example.
The problem:
I have a decent sized script but it is contained in a template string.
Expanding this app, the script could grow a lot as a string.
I am not sure what the best practices are for this.
I also understand that eval() is dangerous, but I am interested in the principle of my question.
Basic eval example for my question:
// Modules
const fs = require('fs');

// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';

const exampleScriptFunction = require('./exampleScriptFunction');
const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');

// using a direct template string
eval(`console.log(${EXAMPLE_1})`);

// using a function that returns the string, but this doesn't solve the neatness issue
eval(exampleScriptFunction(EXAMPLE_2));

// What I want is to just use a JS file because it is neater.
eval(`${exampleScriptFile}`);
exampleScriptFunction.js:
module.exports = function (fetchType) {
  return `console.log(${fetchType});`;
};
This lets me separate the script into a new file, but what if I have many more than one variable?
exampleScriptFile.js:
console.log(${EXAMPLE_3});
This clearly does not work, but I am just trying to show my thinking: the backticks are not present here, fs loads the file as a string, and the main file supplies the backticks.
Because I am loading this with readFileSync, I figured the ES6 template string would work. This would let me write a plain JS file with proper syntax highlighting. The issue is that the variables live in the file running the eval().
Perhaps I am completely wrong here and looking at this the wrong way; I am open to suggestions. Please do not mark me down because of my infancy in programming. I really do not know how else to ask this question. Thank you.
Assuming your source is stored in exampleScriptFile:
// polyfill
const fs = { readFileSync() { return 'console.log(`${EXAMPLE_3}`);'; } };

// CONSTANTS
const EXAMPLE_1 = 'EXAMPLE_1';
const EXAMPLE_2 = 'EXAMPLE_2';
const EXAMPLE_3 = 'EXAMPLE_3';

const exampleScriptFile = fs.readFileSync('./exampleScriptFile.js');

// What I want is to just use a JS file because it is neater.
eval(exampleScriptFile);
Update
Perhaps I wasn't clear. The ./exampleScriptFile.js should be:
console.log(`${EXAMPLE_3}`);
While what you're describing can be done with eval, as @PatrickRoberts demonstrates, that doesn't extend to executeJavaScript.
The former runs in the caller's context, while the latter triggers an IPC call to another process with the contents of the code. Presumably that process doesn't have any information about the caller's context, so the template strings can't be populated with variables defined in this context.
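A common workaround (not from the question) is to bake the caller's values into the code string before sending it, e.g. with JSON.stringify. A sketch, assuming a webContents instance is in scope:
const EXAMPLE_3 = 'EXAMPLE_3';
// serialize the value into the source text itself, since the other
// process cannot see this scope
const code = `console.log(${JSON.stringify(EXAMPLE_3)});`;
webContents.executeJavaScript(code);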
Relevant snippets from electron/lib/browser/api/web-contents.js:
WebContents.prototype.send = function (channel, ...args) {
  // ...
  return this._send(false, channel, args)
}

// ...

WebContents.prototype.executeJavaScript = function (code, hasUserGesture, callback) {
  // ...
  return asyncWebFrameMethods.call(this, requestId, 'executeJavaScript',
  // ...
}

// ...

const asyncWebFrameMethods = function (requestId, method, callback, ...args) {
  return new Promise((resolve, reject) => {
    this.send('ELECTRON_INTERNAL_RENDERER_ASYNC_WEB_FRAME_METHOD', requestId, method, args)
    // ...
  })
}
Relevant snippets from electron/atom/browser/api/atom_api_web_contents.cc
// ...
void WebContents::BuildPrototype(v8::Isolate* isolate,
                                 v8::Local<v8::FunctionTemplate> prototype) {
  prototype->SetClassName(mate::StringToV8(isolate, "WebContents"));
  mate::ObjectTemplateBuilder(isolate, prototype->PrototypeTemplate())
      // ...
      .SetMethod("_send", &WebContents::SendIPCMessage)
      // ...
}

RequireJS - using module returns undefined

I'm trying to use RequireJS in my app. I'm including the requirejs script from cdnjs like this:
<script src="//cdnjs.cloudflare.com/ajax/libs/require.js/2.1.10/require.min.js"></script>
On my page I have a button, and I register an event handler for it:
$('#btnSpeedTest').on('click', function (e) {
  require([baseUrl + 'Content/js/tools/speedtest.js'], function (speedTestModule) {
    alert(speedTestModule);
  });
});
If I watch with Fiddler, I see that speedtest.js is loaded when the button is clicked.
speedtest.js contains the following:
define('speedTestModule', function () {
  function SpeedTest(settings, startNow) {
    // basic initialization
  }
  var fn = SpeedTest.prototype;
  fn.startRequest = function (download, twoRequests) {
    // logic
  };
  return SpeedTest;
});
The alert(speedTestModule); call shows "undefined". In the RequireJS tutorial I saw, everything was in the same directory, and the file names matched the module names (which is not my case, since I'm loading RequireJS from a CDN).
I even tried to return a simple string, but it did not work. What am I missing?
Thanks
Don't use a named define. Instead of this:
define('speedTestModule', function () {
do this:
define(function () {
and let RequireJS name your module. You typically want to let r.js add names to your modules when you optimize them. There are a few cases where naming modules yourself in a define call is warranted, but those are really special cases.
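Applied to the question's speedtest.js, only the wrapper changes:
// speedtest.js: an anonymous define lets RequireJS key the module by its URL
define(function () {
  function SpeedTest(settings, startNow) {
    // basic initialization
  }
  SpeedTest.prototype.startRequest = function (download, twoRequests) {
    // logic
  };
  return SpeedTest;
});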

Load "Vanilla" Javascript Libraries into Node.js

There are some third party Javascript libraries that have some functionality I would like to use in a Node.js server. (Specifically I want to use a QuadTree javascript library that I found.) But these libraries are just straightforward .js files and not "Node.js libraries".
As such, these libraries don't follow the exports.var_name syntax that Node.js expects for its modules. As far as I understand, that means when you do module = require('module_name'); or module = require('./path/to/file.js'); you'll end up with a module with no publicly accessible functions, etc.
My question then is "How do I load an arbitrary javascript file into Node.js such that I can utilize its functionality without having to rewrite it so that it does do exports?"
I'm very new to Node.js so please let me know if there is some glaring hole in my understanding of how it works.
EDIT: Researching into things more and I now see that the module loading pattern that Node.js uses is actually part of a recently developed standard for loading Javascript libraries called CommonJS. It says this right on the module doc page for Node.js, but I missed that until now.
It may end up being that the answer to my question is "wait until your library's authors get around to writing a CommonJS interface or do it your damn self."
Here's what I think is the 'rightest' answer for this situation.
Say you have a script file called quadtree.js.
You should build a custom node_module that has this sort of directory structure...
./node_modules/quadtree/quadtree-lib/
./node_modules/quadtree/quadtree-lib/quadtree.js
./node_modules/quadtree/quadtree-lib/README
./node_modules/quadtree/quadtree-lib/some-other-crap.js
./node_modules/quadtree/index.js
Everything in your ./node_modules/quadtree/quadtree-lib/ directory comes from your 3rd-party library.
Then your ./node_modules/quadtree/index.js file will just load that library from the filesystem and do the work of exporting things properly.
var fs = require('fs');

// Read and eval library
filedata = fs.readFileSync('./node_modules/quadtree/quadtree-lib/quadtree.js', 'utf8');
eval(filedata);

/* The quadtree.js file defines a class 'QuadTree' which is all we want to export */
exports.QuadTree = QuadTree;
Now you can use your quadtree module like any other node module...
var qt = require('quadtree');
qt.QuadTree();
I like this method because there's no need to change any of the source code of your 3rd-party library, so it's easier to maintain. All you need to do on upgrade is look at their source code and ensure that you are still exporting the proper objects.
There is a much better method than using eval: the vm module.
For example, here is my execfile module, which evaluates the script at path in either context or the global context:
var vm = require("vm");
var fs = require("fs");
module.exports = function(path, context) {
context = context || {};
var data = fs.readFileSync(path);
vm.runInNewContext(data, context, path);
return context;
}
And it can be used like this:
> var execfile = require("execfile");
> // `someGlobal` will be a global variable while the script runs
> var context = execfile("example.js", { someGlobal: 42 });
> // And `getSomeGlobal` defined in the script is available on `context`:
> context.getSomeGlobal()
42
> context.someGlobal = 16
> context.getSomeGlobal()
16
Where example.js contains:
function getSomeGlobal() {
  return someGlobal;
}
The big advantage of this method is that you've got complete control over the global variables in the executed script: you can pass in custom globals (via context), and all the globals created by the script will be added to context. Debugging is also easier because syntax errors and the like will be reported with the correct file name.
The simplest way is: eval(require('fs').readFileSync('./path/to/file.js', 'utf8'));
This works great for testing in the interactive shell.
AFAIK, that is indeed how modules must be loaded.
However, instead of tacking all exported functions onto the exports object, you can also tack them onto this (what would otherwise be the global object).
So, if you want to keep the other libraries compatible, you can do this:
this.quadTree = function () {
  // the function's code
};
or, when the external library already has its own namespace, e.g. jQuery (not that you can use that in a server-side environment):
this.jQuery = jQuery;
In a non-Node environment, this would resolve to the global object, thus making it a global variable... which it already was. So it shouldn't break anything.
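In Node, top-level this in a CommonJS module refers to module.exports, so a library patched this way can be consumed like any other module. A quick sketch (the file name is hypothetical):
// `this.quadTree = ...` at the top level landed on module.exports
var quadtreeLib = require('./lib/quadtree-patched.js');
var tree = quadtreeLib.quadTree();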
Edit:
James Herdman has a nice writeup about node.js for beginners, which also mentions this.
I'm not sure if I'll actually end up using this because it's a rather hacky solution, but one way around this is to build a little mini-module importer like this...
In the file ./node_modules/vanilla.js:
var fs = require('fs');

exports.require = function (path, names_to_export) {
  filedata = fs.readFileSync(path, 'utf8');
  eval(filedata);
  exported_obj = {};
  for (i in names_to_export) {
    to_eval = 'exported_obj[names_to_export[i]] = ' + names_to_export[i] + ';';
    eval(to_eval);
  }
  return exported_obj;
};
Then when you want to use your library's functionality you'll need to manually choose which names to export.
So for a library like the file ./lib/mylibrary.js...
function Foo() { /* Do something... */ }
biz = "Blah blah";
var bar = { 'baz': 'filler' };
When you want to use its functionality in your Node.js code...
var vanilla = require('vanilla');
var mylibrary = vanilla.require('./lib/mylibrary.js', ['biz', 'Foo']);

mylibrary.Foo // <-- this is Foo()
mylibrary.biz // <-- this is "Blah blah"
mylibrary.bar // <-- this is undefined (because we didn't export it)
Don't know how well this would all work in practice though.
I was able to make it work very easily by updating their script, simply adding module.exports = where appropriate...
For example, I took their file and I copied to './libs/apprise.js'. Then where it starts with
function apprise(string, args, callback){
I assigned the function to module.exports, thus:
module.exports = function(string, args, callback){
Thus I'm able to import the library into my code like this:
window.apprise = require('./libs/apprise.js');
And I was good to go. YMMV, this was with webpack.
A simple include(filename) function with better error messages (stack, filename, etc.) for eval, in case of errors:
var fs = require('fs');

// circumvent nodejs/v8 "bug":
// https://github.com/PythonJS/PythonJS/issues/111
// http://perfectionkills.com/global-eval-what-are-the-options/
// e.g. a "function test() {}" will be undefined, but "test = function() {}" will exist
var globalEval = (function () {
  var isIndirectEvalGlobal = (function (original, Object) {
    try {
      // Does `Object` resolve to a local variable, or to a global, built-in `Object`,
      // reference to which we passed as a first argument?
      return (1, eval)('Object') === original;
    } catch (err) {
      // if indirect eval errors out (as allowed per ES3), then just bail out with `false`
      return false;
    }
  })(Object, 123);

  if (isIndirectEvalGlobal) {
    // if indirect eval executes code globally, use it
    return function (expression) {
      return (1, eval)(expression);
    };
  } else if (typeof window.execScript !== 'undefined') {
    // if `window.execScript` exists, use it
    return function (expression) {
      return window.execScript(expression);
    };
  }
  // otherwise, globalEval is `undefined` since nothing is returned
})();

function include(filename) {
  file_contents = fs.readFileSync(filename, "utf8");
  try {
    //console.log(file_contents);
    globalEval(file_contents);
  } catch (e) {
    e.fileName = filename;
    keys = ["columnNumber", "fileName", "lineNumber", "message", "name", "stack"];
    for (key in keys) {
      k = keys[key];
      console.log(k, " = ", e[k]);
    }
    fo = e;
    //throw new Error("include failed");
  }
}
But it gets even dirtier with nodejs: you need to specify this:
export NODE_MODULE_CONTEXTS=1
nodejs tmp.js
Otherwise you cannot use global variables in files included with include(...).
