I have a SPA (in Aurelia / TypeScript but that should not matter) which uses SystemJS. Let's say it runs at http://spa:5000/app.
It sometimes loads JavaScript modules like waterservice/external.js on demand from an external URL like http://otherhost:5002/fetchmodule?moduleId=waterservice.external.js. I use SystemJS.import(url) to do this and it works fine.
But when this external module wants to import another module with a simple import { OtherClass } from './other-class'; this (understandably) does not work. When loaded by the SPA, it looks for http://spa:5000/app/other-class.js. In this case I have to intercept the path/location and redirect it to http://otherhost:5002/fetchmodule?moduleId=other-class.js.
Note: The TypeScript compilation of waterservice/external.ts works fine because the TypeScript compiler can find ./other-class.ts easily. Obviously I cannot use an absolute URL in the import.
How can I intercept the module loading inside a module I am importing with SystemJS?
One approach I already tested is to add a mapping in the SystemJS configuration. If I import it like import { OtherClass } from 'other-class'; and add a mapping like "other-class": "http://otherhost:5002/fetchmodule?moduleId=other-class", it works. But if this is the right approach, how can I add mappings dynamically at runtime?
Other approaches like a generic load url interception are welcome too.
Update
I tried to intercept SystemJS as suggested by artem, like this:
var systemLoader = SystemJS;
var defaultNormalize = systemLoader.normalize;
systemLoader.normalize = function(name, parentName) {
  console.error("Intercepting", name, parentName);
  return defaultNormalize(name, parentName);
}
This should not change anything except produce some console output to see what is going on. Unfortunately it does seem to change something, as I get the error Uncaught (in promise) TypeError: this.has is not a function inside system.js.
Then I tried to add mappings with SystemJS.config({map: ...});. Surprisingly, this function works incrementally, so when I call it, it does not lose the already provided mappings. So I can do:
System.config({
  map: {
    "other-class": "http://otherhost:5002/fetchModule?moduleId=other-class.js"
  }
});
This does not work with relative paths (those starting with . or ..), but if I put the shared modules in the root it works out.
I would still prefer to intercept the loading so I can handle more scenarios, but at the moment I have no idea which has function is missing in the approach above.
how can I add mapping dynamically at runtime?
AFAIK SystemJS can be configured at any time just by calling
SystemJS.config({ map: { additional-mappings-here ... }});
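For example, a minimal sketch reusing the URLs from the question (when exactly you call config versus import is up to you):
// add the mapping at runtime, before the external module is imported
SystemJS.config({
  map: {
    "other-class": "http://otherhost:5002/fetchmodule?moduleId=other-class.js"
  }
});

SystemJS.import("http://otherhost:5002/fetchmodule?moduleId=waterservice.external.js")
  .then(function (externalModule) {
    // imports of 'other-class' inside the external module now resolve via the map
    console.log("loaded", externalModule);
  });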
If it does not work for you, you can override loader.normalize and add your own mapping from module ids to URLs there. Something along these lines:
// assuming you have one global SystemJS instance
var loader = SystemJS;
var defaultNormalize = loader.normalize;
loader.normalize = function(name, parentName) {
  if (parentName == 'your-external-module' && name == 'your-external-submodule') {
    return Promise.resolve('your-submodule-url');
  } else {
    return defaultNormalize.call(loader, name, parentName);
  }
};
I have no idea whether this will work with TypeScript or not. Also, you will have to figure out exactly what names are passed to loader.normalize in your case.
Also, if you use systemjs-builder to bundle your code, you will need to add that override to the loader used by the builder (and that's a whole other story).
Related
I have a scenario in which I want to dynamically tell a child component which image to show, based on a required image passed down by the parent, like so:
Parent:
function Social() {
  return (
    <SocialBar>
      {SocialMedias.map((media) => {
        return (
          <SocialApp
            key={uuidv4()}
            link={media.link}
            social={require(media.icon)}
          />
        );
      })}
    </SocialBar>
  );
}
Child:
function SocialApp(props) {
  return (
    <Social href={props.link} target="_blank">
      <CustomSizedImage src={props.social.default} />
    </Social>
  );
}
Which seems pretty straightforward, right? However, this causes the error below:
I've seen some posts say that this is caused because the path is not absolute. However, I tested with absolute paths and it still doesn't work. The funny thing is, when I changed the parent code to the below for testing, it worked.
New parent code:
function Social() {
  const img = require("../assets/icons/social-media/instagram.svg");
  return (
    <SocialBar>
      {SocialMedias.map((media) => {
        return (
          <SocialApp
            key={uuidv4()}
            link={media.link}
            social={img}
          />
        );
      })}
    </SocialBar>
  );
}
export default Social;
The result:
Note: I tested with EVERY image that this code could possibly load. It is not an image/image-path issue.
The goal of all this is to dynamically load the social media icons with the correct URLs, etc. What I do not understand is: why does it work when I require the image outside the return? I know I could make this work by iterating over the media objects outside the return and using the resulting array with the map index, but that doesn't seem clean. Why does it behave this way? Is there a way to make it work inside the map?
Your last code snippet works because the path is known at compile time.
I assume you use a bundler like Webpack or similar. When it compiles your project, it checks all imports (including require calls), uses them to calculate the dependency tree, and finally bundles all required files together. For images, that includes "compiling" each one into a module which simply returns the path to the image.
Usually that also involves replacing paths with module identifiers. E.g. if you ever look at built Webpack code, you'll notice that it doesn't use e.g. require('./module') but __webpack_require__(5) or something similar. It gave the module './module' the unique ID 5 in the bundled code.
With your earlier code snippets, when using a dynamic require with a non-constant string, Webpack just doesn't know that you'll request that image. Nor would it know the new unique ID to use for that.
The easiest solution is to import your images somewhere and use those references dynamically:
// images.ts
export { default as IconA } from './media/icon-a.svg';
export { default as IconB } from './media/icon-b.svg';

// somewhere else
import { IconA, IconB } from './images';
const iconPath = someCondition ? IconA : IconB;
Of course, you don't have to use the import/require mechanism at all. You can simply have your icon path be an actual path, relative to your public/ folder.
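For instance, a rough sketch of that approach (the public/ layout and the shape of the SocialMedias data are assumptions, not from the question):
// SocialMedias entries carry plain paths to files served from public/
const SocialMedias = [
  { link: "https://example.com/profile", icon: "/icons/social-media/instagram.svg" },
];

function SocialApp(props) {
  return (
    <Social href={props.link} target="_blank">
      {/* no require() and no .default: the string is already a usable URL */}
      <CustomSizedImage src={props.social} />
    </Social>
  );
}
In the parent you would then pass social={media.icon} directly inside the map.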
I'm kind of going nuts here.
I need to start writing a stand-alone, non-web application in JavaScript.
My problem is that I can run the script just fine (using Node.js), but I can't split the code I need to write across multiple files, because then I don't know how to include them.
Right now all I want is the simplest working example of including one JS file inside another so I can use the functions defined there (or a class, if I so choose).
My test.js looks like this.
const { outputText } = require("./text_module.node.js");
outputText();
While my text_module.node.js looks like this:
function outputText(){
  console.log("hello world");
}
My package.json looks like this:
{
  "type": "text_module.node.js"
}
I have tried
Adding export before the function. It tells me export is unexpected
Using this notation: import { something } from "a_path". Anything that uses this notation will get me an "Unexpected token {" error.
The current configuration simply tells me that outputText() is not a function. So I'm out of ideas.
I don't know how else to search. I've been searching for hours on this topic, and it seems that no matter where I look, some HTML code is needed to tie everything together, or some other third-party tool. Using jQuery loads things asynchronously, which is not what I need. I want it to be sequential.
Is JS just NOT supposed to be used like other scripting languages? Otherwise I can't figure out why this is so complicated. I feel like I'm missing something big here.
If you want to use const { outputText } = require("./text_module.node.js");,
export it as module.exports = { outputText: outputText }.
In your question,
Adding export before the function. It tells me export is unexpected
That's because export is an ES6 (ES modules) feature, and Node.js doesn't support all ES6 features out of the box. You can refer to this answer for more details on that.
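Put together, a minimal sketch of the two files (keeping the file names from the question) would be:
// text_module.node.js
function outputText() {
  console.log("hello world");
}

// expose the function so callers of require() can destructure it
module.exports = { outputText: outputText };

// test.js
const { outputText } = require("./text_module.node.js");
outputText(); // prints "hello world"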
I have the following problem: I have created a library whose methods live in separate files (e.g. writeLog.js) in a folder called ./lib, and I have an index.js file that scans that folder, extracts each function, and uses the file name as the function name.
example:
writeLog.js
and then I can use
let jsPackTools = require('./index');
let utils = new jsPackTools();
utils.writeLog('test');
The way I add the methods to the class is through the prototype. The folder is scanned with readdirSync() and then I take each file name and place it inside a require().
The code is the following:
fs.readdirSync(__dirname + '/lib').forEach(modules => {
  let module = modules.split('.').shift();
  jsPackTools.prototype[module] = require(`${__dirname}/lib/${module}`);
});
Everything works perfectly fine. The problem is that when I want to access my methods through the autocomplete of any code editor, the methods appear not to be accessible, even though everything works and I can use the functions.
The problem is that I have to already know each method I am going to use; none of them show up in any editor's autocomplete.
I have tried with:
const writeLog = require('./lib/writeLog');

class jsPackTools {
  get writeLog() { return writeLog.bind(this); }
}
This allows indexing perfectly; however, I haven't managed to do it dynamically.
The idea is that the class does not know what its methods are until the ./lib folder is scanned and all the methods are extracted. That way functions can be added indefinitely and they become visible when you use them.
I do not want to use transpilers. I thought of using a .d.ts file, but that requires TypeScript, and the intention is to create this library without dependencies, or with as few as possible.
I am currently developing a single page application using Cordova 3.4.0, Requirejs and Backbone. While porting my application from iPhone to iPad, I need to change some functions in some views and keep other parts intact.
To keep the changes minimal, my solution is to create a new object for each view I need to change, inherit all properties from the original view, and override only the necessary functions.
To do so, I need to configure RequireJS so that on iPad, if I require, for instance, 'user/view/edit-profile.js', it will check whether a 'user/ipad/view/edit-profile.js' file exists; if there is one, require it, otherwise require 'user/view/edit-profile.js'.
I have tried the i18n plugin, but it is not right for this situation. I am considering creating a new RequireJS plugin to do the task.
Does anyone have any suggestions for my problem?
Btw, since the required file changes dynamically according to the platform, I call it polymorphism.
You could use path fallbacks:
paths: {
  "user/view/edit-profile": ["user/ipad/view/edit-profile", "user/view/edit-profile"]
}
The above will make RequireJS try the iPad variant first. If, as you develop your application, you end up with logic too complex for fallbacks, you can use errbacks:
function onload(module) {
  // Whatever you want to do...
}

require([module_a], onload, function (err) {
  require([module_b], onload);
});
The code above will try to load a module from module_a and then from module_b. I use this kind of code to load modules with names that are computed at run time.
Since every module is a Backbone view, I came up with a solution that overrides the extend function of Backbone.View to return a modified object depending on the existence of iPad, iPhone or Android dependencies.
The solution requires that if a base view has an iPad version, it has to declare that iPad version as a dependency; the extend function then extends the existing view with the iPad view, so the iPad view inherits all properties of the base view and overrides only the necessary functions.
I name the view PolyplatformView. Each view needs to declare its module ID, for example: user/view/edit-profile
Backbone.PolyplatformView = Backbone.View.extend({});

Backbone.PolyplatformView.extend = function() {
  if (arguments[0].moduleId) {
    var extendPropsPath = arguments[0].moduleId.replace('/', '/' + constants.platform_name + '/');
    // turn user/view/edit-profile -> user/ipad/view/edit-profile
    if (require.defined(extendPropsPath)) {
      _.extend(arguments[0], require(extendPropsPath));
    }
  } else {
    console.warn('No module Id in polyplatform view -> cannot extend its parent');
  }
  var extended = Backbone.View.extend.apply(this, arguments);
  return extended;
};
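For illustration, a sketch of how a base view and its iPad counterpart could look (the module ids and properties here are only examples based on the description above):
// user/view/edit-profile.js - the base view declares its iPad version as a dependency
define('user/view/edit-profile', ['backbone', 'user/ipad/view/edit-profile'], function (Backbone) {
  return Backbone.PolyplatformView.extend({
    moduleId: 'user/view/edit-profile',
    render: function () { /* shared rendering */ return this; }
  });
});

// user/ipad/view/edit-profile.js - only the overrides
define('user/ipad/view/edit-profile', [], function () {
  return {
    render: function () { /* iPad-specific rendering */ return this; }
  };
});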
I've created a Dojo module which depends on dojox/data/JsonRestStore like this:
define("my/MyRestStore",
["dojo/_base/declare", "dojox/data/JsonRestStore"],
function(declare, JsonRestStore) {
var x = new JsonRestStore({
target: '/items',
identifier: 'id'
});
...
which is fine. But now I want to have the uncompressed version of the JsonRestStore code loaded so that I can debug it. I can't find any documentation on how to do this, but since there is a file called 'JsonRestStore.js.uncompressed.js', I changed my code to:
define("my/MyRestStore",
["dojo/_base/declare", "dojox/data/JsonRestStore.js.uncompressed"],
function(declare, JsonRestStore) {
...
thinking that might work.
I can see the JsonRestStore.js.uncompressed.js file being loaded in FireBug, but I get an error when trying to do new JsonRestStore:
JsonRestStore is not a constructor
Should this work?
Is there a way of configuring Dojo to use uncompressed versions of all modules? That's what I really want, but will settle for doing it on a per dependency basis if that's the only way.
Update
I've found a way to achieve what I want to do: rename the JsonRestStore.js.uncompressed.js file to JsonRestStore.js.
However, this seems a bit like a hacky workaround so I'd still be keen to know if there is a better way (e.g. via configuration).
You have two options
1) Create a custom build. The custom build will output a single uncompressed file that you can use for debugging. Think dojo.js.uncompressed.js, but including all the extra modules that you use.
OR
2) For a development environment, use the dojo source code. This means downloading the Dojo Toolkit SDK and referencing dojo.js from that in the development environment.
For the projects I work on, I do both. I set up the Dojo configuration so that it is dynamic and I can change which configuration I want using a query string parameter.
When I am debugging a problem, I will use the first option just to let me step through code and see what is going on. I use the second option when I am writing some significant js and don't want the overhead of the custom build to see my changes.
I describe this a bit more at
http://swingingcode.blogspot.com/2012/03/dojo-configurations.html
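A rough sketch of such a switch (the paths and the query string parameter name are assumptions; adjust them to your project layout):
// pick which Dojo build to load based on a ?debug=1 query string parameter
var useSource = /[?&]debug=1/.test(window.location.search);
var script = document.createElement('script');
script.src = useSource
  ? 'js/dojo-sdk/dojo/dojo.js'     // uncompressed SDK source for debugging
  : 'js/release/dojo/dojo.js';     // output of the custom build
document.head.appendChild(script);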
I think the reason for this is that the loader identifies its module loads by the file conventions used. The 1.7 loader is not too robust just yet; I've had similar problems until I realized how the '.' and '/' characters are separated.
It's only a qualified guess, but I believe it has to do with the interpretation of the '.' character in the class name, which is treated as a sub-namespace rather than as part of the module name.
A 'define(/* BLANK */ [ /* DEPENDENCIES */ ], ...)' - where no first string parameter is given - gets identified by the filename (basename). The returned declare also has a say, though. So, for your example with JsonRestStore, it's split/parsed as such:
toplevel = dojox
mid = data
modulename = JsonRestStore.js.uncompressed
(Fail: the module resolves as dojox.data.JsonRestStore.js.uncompressed, not dojox.data.JsonRestStore as it should.)
So, three options:
Load uncompressed classes through <script src="{{dataUrl}}/dojox/data/JsonRestStore.js.uncompressed.js"></script> and work with them in dojo.ready
I think modifying the define([], function(){}) in uncompressed.js to define("JsonRestStore", [], function() {}) would do the trick (unconfirmed)
Use the dojo/text loader, see below
Text filler needed :)
define("my/MyRestStore",
["dojo/_base/declare", "dojo/text!dojox/data/JsonRestStore.js.uncompressed.js"],
function(declare, JsonRestStore) {
...
JsonRestStore = eval(JsonRestStore);
// not 100% sure 'define' returns reference to actual class,
// if above renders invalid, try access through global reference, such as
// dojox.dat...