Randomly failing polymer web component test - javascript

Yet another attempt to marry Polymer with ES6 classes.
It almost works, except that a wct test fails randomly on a Polymer component that imports an ES6 class (via SystemJS). As far as I understand, this happens because the script containing the class definition gets loaded after mocha has executed the test. The Polymer component consists of two parts, HTML and JavaScript (so the latter can be compiled to ES5),
html:
<dom-module id="my-test">
  <template>hello</template>
  <script>
    System.import('elements/my-test.js');
  </script>
</dom-module>
javascript:
import {User} from 'elements/model/user.js';
Polymer({
  is: "my-test",
  method: function () {
    console.log("method, user=" + this.val);
  },
  ready: function () {
    this.user = new User(); // this.user is randomly undefined
  }
});
This seems to work quite stably in the browser, at least when loaded from localhost. But the only thing which ‘fixes’ the test is delaying Polymer’s ready call:
Polymer.whenReady = function (f) {
  console.log("polymer ready");
  setTimeout(f, 100); // "fix"
  //f();
}
which means at some point all this will fail in the browser too (maybe when not serving from localhost).
I’m thinking about somehow getting hold of the promises behind System.register/System.import and doing something similar to HTMLImports.whenDocumentReady, but I still don’t have a clear understanding of how all this works.
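Roughly, the idea I have in mind is something like this (an untested sketch; window.elementsLoaded is just a made-up name to hold the promise):
// In the dom-module script: keep the promise returned by System.import
// instead of discarding it.
window.elementsLoaded = System.import('elements/my-test.js');

// Then, instead of the setTimeout hack, let Polymer finish only once the
// imported class is actually available.
Polymer.whenReady = function (f) {
  (window.elementsLoaded || Promise.resolve()).then(f);
};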
So any ideas and suggestions are highly appreciated!
The sample app is on github:
git clone https://github.com/bushuyev/test_test.git
cd test_test
git checkout tags/random_fail
npm install
bower install
gulp wct
To make it succeed more often than fail - change verbose to true in wct.conf.js.
Kind of an update: How to import system js SFX library

It's possible to use Polymer, SystemJS and TypeScript (like ES6 but with added Polymer-friendly syntax) together in a very nice way, also handling HTML imports through SystemJS. That does involve a timing issue, and I've published a small shim that first waits for webcomponents.js to load and catches its ready event (before other code gets a chance to see it), then loads Polymer and finally all other components and TypeScript code. Then it dispatches the event again so Polymer finishes initializing itself.
Here's an article about combining the technologies with the mentioned solution, downloadable code and a demo.
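A very rough sketch of the shim's core idea (illustrative only, not the published code; the event handling is simplified):
// Catch WebComponentsReady before other listeners get to see it.
document.addEventListener('WebComponentsReady', function handler(e) {
  e.stopImmediatePropagation();
  document.removeEventListener('WebComponentsReady', handler, true);

  // Load the SystemJS-managed code, then re-dispatch the event so Polymer
  // can finish initializing.
  System.import('elements/my-test.js').then(function () {
    document.dispatchEvent(new CustomEvent('WebComponentsReady'));
  });
}, true);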

Related

Check to see if dynamic import is a module in JavaScript/TypeScript

I am working on a ScriptManager class for a project that was created many years ago. The original code read scripts from a database, and these scripts are different depending on the customer and installation (the application is a desktop app that uses Chrome Embedded Framework to display web pages). The code would read custom JavaScript code and eval() it, which of course is highly undesirable.
I am replacing this code with a ScriptManager class that can support dynamically inserted code, and the ScriptManager is capable of loading code as a module using JavaScript's dynamic import() command, or loading code as pure script by creating a script tag dynamically in the document.
My problem is that there are many different possible custom code blocks in the database, and not all are modules; some will be pure script until those can be converted to modules at a later time. My code can handle this as described above, but I need a way to detect if the script code from the database is a module, so I can either use the import() command or insert a script tag if it is not.
I am solving this temporarily by making sure any module script code has "export const isModule = true" and checking this after calling import(). This works, but any code that is pure script still resolves to a module object, just with no exports in it. If possible I don't want the other developers to have to remember to add isModule = true to any modules they develop in the future.
Is there a way to check that code is a module without having to do complex analysis of the code to check if there are exports in it? Since import() still returns an object and throws no errors if there are no exports, I don't know how to detect this.
UPDATE: Here are some examples of how this is intended to work:
// Not real code, pretend that function gets the string of the script.
let code = getSomeCodeFromTheDatabase();
// Save the code for later loading.
let filename = 'some-filename.js';
saveCodeToFile(code, filename);
// Attempt to dynamically import the script as a module.
let module = await import(filename);
// If it is NOT a module, load it instead as a script tag.
// This is where I need to be able to detect if the code is
// a module or pure script.
if (!module.isModule) {
  let scriptTag = document.createElement('script');
  scriptTag.src = filename;
  document.head.appendChild(scriptTag);
}
If you look at How can I tell if a particular module is a CommonJS module or an ES6 module?, you will see I answered a similar question.
So the thing is, modules are defined by the way that they resolve. The fact that module types resolve differently is not only what makes them incompatible with one another, it is also why we name them differently. Originally, transpilers like Babel & TypeScript were invented because of differences between ECMA-262 specifications, and the desire to support people who didn't have the updated specifications while still offering the newer features to those who did.
Today transpilers are still used in conceptually much the same way. They help us maintain a single code base while supporting both older specifications and newer features at the same time, but they also have the added benefit of being able to generate multiple different builds from that single code base. This is used to support different module types. In Node.js the primary module type is CJS, but the future lies in ESM, so package maintainers have opted to dual-build their projects: they use the TypeScript compiler (a.k.a. transpiler) to emit both a CJS build and an ESM build.
Now this is where things get complicated, because in this situation you cannot tell just by looking at a module whether it is CJS or ESM: you absolutely have to inspect its code, and check whether the package has more than one tsconfig.json file (it would need at least two to maintain a bi-modular build, which is becoming increasingly common).
My Suggestion to You:
Use well-documented packages. Good packages should state in their README.md what type of package/module they are, and whether the package supports more than one module type. If in doubt you can either come and ask here, or better yet, ask the maintainer by creating an issue requesting that they add that information to their README.md.
You can check that there are no exports after the import. In Chrome, import() adds an empty default export for a non-module.
function isNotModule(module) {
  return (!Object.keys(module).length) ||
    (!!module.default && typeof module.default === 'object' && !Object.keys(module.default).length);
}

import('./test.js')
  .then((module) => {
    console.log('./test.js', isNotModule(module));
  });
Maybe it's better to check the source code with a regex to see whether it contains an export, something like this:
const reg = /([^\w]|^)export((\s+)\w|(\s*{))/;
reg.test(source);
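For example, a rough sketch of how that check could drive the loading decision (fetchSource is a hypothetical helper that returns the script text):
async function loadCustomCode(filename) {
  const source = await fetchSource(filename); // hypothetical helper
  if (/([^\w]|^)export((\s+)\w|(\s*{))/.test(source)) {
    // Looks like an ES module - load it with a dynamic import.
    return import(filename);
  }
  // Plain script - fall back to a script tag.
  const tag = document.createElement('script');
  tag.src = filename;
  document.head.appendChild(tag);
}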

What happens if polyfill script fails to download?

I have a SPA (built on webpack, with babel, etc) that includes a polyfill in the index.html:
<script src="https://cdn.polyfill.io/v2/polyfill.min.js?features=Promise,Array.prototype.includes,Element.prototype.remove"></script>
One use case for the polyfill is to be able to use the Promise API on IE 10-11.
My error monitoring reported the following error from an IE 11 client:
ReferenceError: 'Promise' is undefined
So I assume that particular session failed to download the polyfill for some reason.
My question is: How should I deal with this case? Is it a scenario I should expect to happen sometimes? Is the user expected to notice the application not working properly and reload the page?
There is an error event you can attach to, which allows for more control if you are really worried. You don't usually need to handle this explicitly, though.
In this particular case you could migrate towards using babel to build a bundle with polyfills included in your scripts. This adds an additional build step to your process though.
Since you mentioned you're using webpack, it would be best to include the necessary polyfills directly in the project via an import statement (something like core-js) rather than relying on a CDN such as polyfill.io.
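A minimal sketch of that approach, assuming core-js is installed as a dependency (Element.prototype.remove would still need a separate DOM polyfill):
// In the webpack entry point (e.g. index.js): bundle only what the app needs.
import 'core-js/features/promise';
import 'core-js/features/array/includes';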
However, you could alternatively add an ID to the script element and listen to the onload and onerror events to determine whether the script (un)successfully loaded like so:
<script id="polyfillScript" src="https://cdn.polyfill.io/v2/polyfill.min.js?features=Promise,Array.prototype.includes,Element.prototype.remove"></script>
In your project index.js:
document.getElementById('polyfillScript').addEventListener('error', () => {
  alert('Failed to load polyfill!');
});
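Following the same pattern, a load listener could confirm the happy path (a sketch; like the error handler, it has to be registered before the event fires):
document.getElementById('polyfillScript').addEventListener('load', () => {
  console.log('Polyfill loaded, Promise is now safe to use.');
});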

javascript, import array from js file

I am trying to import an array from one js file into another js file (link). For example:
file1.js:
var array = ['one', 'two', 'three'];
file2.js:
import { array } from 'file1'
My IDE says that "import declarations are not supported by current JavaScript version." How can I import this array?
Thank you for any help.
As mentioned by the article you linked, import is available with ES6.
It seems you're writing ES5 JavaScript, so you'll need an ES6 transpiler such as Babel.
https://codeburst.io/es5-vs-es6-with-example-code-9901fa0136fc
Importing is only allowed in ES6, the newest version of JavaScript so to speak. Browsers are just now getting around to being able to "understand" ES6, and they won't understand it completely for a while. Currently, browsers understand the previous version of JavaScript, ES5, really well. So how can you write code that uses import if it's ES6?
Unfortunately, you need some special tools that change code that you write in ES6 to ES5 so that browsers can understand your code. This is called transpiling. A tool called Babel is by far the most popular transpiler used today. But that's not all. You'll need another tool to bundle the modules you write in ES6 syntax. Rollupjs and Webpack are the most popular tools for this task.
It can take a few weeks of reading and trying things out to learn these tools, so unfortunately I can't explain them well enough in one answer here for you to get a complete understanding. Basically what bundling does, using your example: file1.js and file2.js will be combined into one single final file bundle.js, and this is the file that you will include with <script src="bundle.js"> in your html. file2.js is called a module. How does this help you write better code? Well, we can write most of our code in modules and then just import whatever modules we need on particular pages. If I need a countdown timer on page1.html, I'll create a module which has the code needed for a countdown timer, and import the countdown timer module. If I need a countdown timer on a new page 6 months from now, I will just import the countdown timer module on this new page. Code reusability.
Webpack is more popular than Rollup, but I would recommend using Rollup to get a good overview of what a bundler actually does. We use Rollup in our enterprise application in fact, although Webpack is more widely used I think, as it does many many things beyond bundling.
Note: ES6 is not really the newest version of javascript; we already have ES7 and ES8 coming along. In fact, there are new features constantly being added to the official javascript language. The problem is that it takes time for browsers to implement (be able to "understand") these new features.
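For reference, once a transpiler/bundler is set up, the two files from the question would look roughly like this (the exact import path depends on your setup):
// file1.js - the array has to be exported before it can be imported elsewhere
export var array = ['one', 'two', 'three'];

// file2.js
import { array } from './file1.js';
console.log(array); // ['one', 'two', 'three']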
The best way I can think of to replicate the behavior in ES5 is to create an iframe with a script tag pointing to the file you want. On load you can grab the desired variable and bring it into the parent window, then destroy the iframe.
Usage would be something like this (import is a reserved word, so the helper needs a different name, e.g. importVar):
importVar('file-url', 'variable-name', function (imported) {
  // use the imported variable here
});
Implementation would be something like this:
function importVar(filePath, varName, callback) {
  // Create a hidden iframe to act as an isolated loading context.
  var iframe = document.createElement('iframe');
  iframe.style.display = 'none';
  document.body.appendChild(iframe);

  // Create a script tag, add the file path and an onload handler.
  var script = iframe.contentDocument.createElement('script');
  script.onload = function () {
    // Retrieve the variable from the iframe and trigger the callback in the parent.
    callback(iframe.contentWindow[varName]);
    document.body.removeChild(iframe);
  };
  script.src = filePath;
  iframe.contentDocument.body.appendChild(script);
}

Angular methods (constants, config, run, factory, service), Does it matter what order you call them?

I have an application built on generator-angular-fullstack and it does a great job of allowing all my angular components to live in their own separate files.
I was just running all my code through JSLint and it asked to remove 'use strict'; from my index.module.js file as it worked out that this was the global or starting file for my entire application.
I was reviewing the JSLint warning here use-the-function-form-of-use-strict
This got me thinking: how did JSLint know that index.module.js was the starting code block?
Which then got me thinking: does it matter in what order Angular startup methods are called?
Can these methods be run in any order you like, or is there an expected sequential order for these calls?
angular.module('appName')
angular.module('appName').run(function() { });
angular.module('appName').config(function() { });
angular.module('appName').service(function() { });
angular.module('appName').constant('blah', 'blah');
TL;DR - no, it doesn't matter.
The way Angular does it: when the page renders and the scripts are loaded, it registers all the components (services/config/constants...) but does not execute them.
When registration completes, Angular starts executing the application: providers --> config --> run and so on.
If you're interested in a more detailed explanation of the way Angular works under the hood, you can check out this awesome article.
*Forgot to mention that you must define your module first.
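A minimal sketch of that, assuming a module named 'appName' (the registration order below is deliberately scrambled, but execution still goes constants/providers, then config, then run):
angular.module('appName', []); // the module must be defined first

angular.module('appName').run(function (greeting) {
  console.log('run: ' + greeting); // executes last, at bootstrap
});

angular.module('appName').config(function () {
  console.log('config'); // executes before any run block
});

angular.module('appName').constant('greeting', 'hello'); // registered last, yet available everywhere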

TypeScript use typescript-require shared files

I use TypeScript to write my JavaScript files with object-oriented programming.
I want to use the node module https://npmjs.org/package/typescript-require to require my .ts files from other files.
I want to share my files between the server and the client side (browser), and that's very important. Note that the folder /shared/ doesn't mean shared between client and server, but between the game server and the web server. I use pomelo.js as the framework, that's why.
For the moment I'm not using (successfully) the typescript-require library.
I do like that:
shared/lib/message.js
var Message = require('./../classes/Message');

module.exports = {
  getNewInstance: function (message, data, status) {
    console.log(requireTs); // Global typescript-require instance
    console.log(Message);
    return new Message(message, data, status);
  }
};
This file needs Message.js to create new instances.
shared/classes/Message.ts
class Message {
  // Big stuff
}

try {
  module.exports = Message;
} catch (e) {}
At the end of the file I add this try/catch to add the class to module.exports if it exists. (It works, but it's not really a good way to do it; I would like to do better.)
If I load the file from the browser, module.exports won't exist.
So, what I did above is working. Now if I try to use the typescript-require module, I'll change some things:
shared/lib/message.js
var Message = requireTs('./../classes/Message.ts');
I use requireTs instead of require; it's a global var. Note that I'm requiring the .ts file.
shared/classes/Message.ts
export class Message{
// Big stuff
}
// remove the compatibility script at the end
Now, if I try it like this and take a look at the server console, requireTs is an object and Message is undefined in shared/lib/message.js.
I get the same if I don't use the export keyword in Message.ts. Even if I use my little script at the end, I always get an error.
But there is more: I have another class, ValidatorMessage.ts, which extends Message.ts, and it doesn't work if I use the export keyword...
Did I do something wrong? I tried several other things but nothing is working; it looks like typescript-require is not able to require .ts files.
Thank you for your help.
Looking at the typescript-require library, I see it hasn't been updated for 9 months. As it includes the lib.d.ts typing central to TypeScript (and the node.d.ts typing), and as these have progressed greatly in the past 9 months (along with needed changes due to language updates), it's probably not compatible with the latest TypeScript releases (just my assumption, I may be wrong).
Sharing modules between Node and the browser is not easy with TypeScript, as they both use very different module systems (CommonJS in Node, and typically something like RequireJS in the browser). TypeScript emits code for one or the other, depending on the --module switch given. (Note: There is a Universal Module Definition (UMD) pattern some folks use, but TypeScript doesn't support this directly).
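For illustration, a hand-rolled UMD wrapper around something like the Message class could look roughly like this (a sketch of the pattern, not something the TypeScript compiler emits for you):
(function (root, factory) {
  if (typeof module === 'object' && module.exports) {
    module.exports = factory();      // CommonJS / Node
  } else if (typeof define === 'function' && define.amd) {
    define(factory);                 // AMD / RequireJS
  } else {
    root.Message = factory();        // browser global
  }
}(this, function () {
  function Message(message, data, status) {
    this.message = message;
    this.data = data;
    this.status = status;
  }
  return Message;
}));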
What goals exactly are you trying to achieve? I may be able to offer some guidance.
I am doing the same and keep having issues whichever way I try to do things... The main problems for me are:
I write my TypeScript as namespaces and components, so there is no export. With multiple-file compilation you have to do a hack and add some _exporter.ts at the end to add the export, so that your library-output.js is importable as a module; this would require something like:
module.exports.MyRootNamespace = MyRootNamespace
If you do the above it works. However, you then get an issue when you need to reference classes from other modules (such as MyRootNamespace1.SomeClass being referenced by MyRootNamespace2.SomeOtherClass): you can reference it, but then it gets compiled into your library-output2.js file as well, so you end up with duplicate classes if you are trying to re-use TypeScript across multiple compiled targets (like having one solution in VS and multiple projects which have their own dll outputs).
Assuming you are not happy with hacking the exports and/or duplicating your references, you can just import them into the global scope, which is a hack but works... However, when you decide you want to test your code (using whatever Node.js testing framework), you will need to mock out certain things, and since the dependencies for your components may not be included via a require() call (and your module may depend on node_modules which are not really usable with global-scope hacking), it becomes difficult to satisfy dependencies and mock out specific ones; it's an all-or-nothing sort of approach.
Finally, you can try to mitigate all these problems by using a TypeScript framework such as appex, which allows you to run your TypeScript directly rather than compiling it into JS first. While it seems very good up front, it is VERY hard to debug compilation errors. This is currently my preferred way, but I have an issue where my TypeScript compiles fine via tsc but blows up with a max-stack-size exception in appex, and I am at the mercy of the project maintainer to fix this (I was not able to find the underlying issue). There are also not many of these sorts of projects out there, but they make the issue of compiling at the module/file level a moot point.
Ultimately I have had nothing but problems wrestling with TypeScript to get it to work in a way which is maintainable and testable. I am also trying to re-use some of the TypeScript components on the client side, but if you go down the npm-hack route to get your modules included, you then have to make sure your client side uses a require-compatible resource/package loader. As much as I would love to just use TypeScript for my client and server projects, it just does not seem to want to work in a nice way.
Solution here:
Inheritance TypeScript with exported class and modules
In the end I don't use typescript-require but typescript.api instead; it works well. (You have to load lib.d.ts if you use it, or else you'll get some errors in the console.)
I don't have a solution for running the script in the browser yet. (Because of the export keyword I get some errors client side.) I think adding an exports global var would avoid errors like this.
Thank you for your help, Bill.
