Using JavaScript ES6 modules requires specifying type="module" on the script tag in HTML, like:
<script src="./js/graphics.js" crossorigin type="module"></script>
This will only load if a CORS header is sent as a response header, which can only be added by a server, depending on server settings.
Am I correct to assume that this means it is impossible to have a webpage which will still work offline while using ES6 modules?
So designing an offline-first app is impossible using ES6 modules?
If you want to make a really offline-first app, it is a good idea to make one bundle with all your ES6 modules. For example, you can choose Webpack as your module bundler.
After compilation, you will get a single .js file which you can include in your page without having to think about CORS.
It is also helpful if you want to reduce page loading time, because every <script> tag makes the browser send a request to the server, so using only one bundle reduces the number of requests.
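For example, a minimal webpack.config.js for this case might look something like the following (the entry path matches the graphics.js example above; the output names are just placeholders):

// webpack.config.js - minimal sketch; adjust entry/output paths to your project
const path = require('path');

module.exports = {
  entry: './js/graphics.js',              // your ES6 module entry point
  output: {
    filename: 'bundle.js',                // single file to include in the page
    path: path.resolve(__dirname, 'dist'),
  },
  mode: 'production',
};

The page then loads the single bundle with a plain script tag, no type="module" or CORS concerns:

<script src="./dist/bundle.js"></script>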
I hope this is clear: I need to import a JS file in an HTML file, so I'm using the src attribute like this:
<script src="my/js/file/1.js">
<!-- Some JS script here -->
</script>
But there is a thing... In my JS file, on line 1, there is a require("another/file.js")... So I get an error in my browser console: require is not defined. How do I solve it?
EDIT
I'll try to be clearer:
I have 3 files: 1 HTML & 2 JS.
The script tag above is in my HTML file.
In the src file, I need to import a 2nd JS file with require("my/js/file/2.js").
And it works if I'm not using the src attribute,
but I get an error message in the console when I add the src attribute.
require is a built-in function provided by JS environments that support a couple of different kinds of modules, so how you load your JS file into a browser depends on what type of module system it is written to use.
The most likely cases are:
It is an AMD module (very unlikely in 2021), in which case you can probably load it with RequireJS
It is a CommonJS module that depends on Node.js-specific APIs, in which case it can't run in a browser; to interact with it from a browser you would need to build it into a web service and make HTTP requests to it (e.g. via Ajax). Some things that depend on Node.js-specific APIs include:
Making HTTP requests to sites which don't grant permission for browser JS to access them using CORS
Non-HTTP network requests (like direct access to a MySQL database)
Reading (or doing anything with) files from a file path expressed as a string (as opposed to reading files from a <input type="file">)
Spawning other processes
It is a CommonJS module that doesn't depend on Node.js-specific APIs, in which case you can convert it to run in a browser using a bundler tool such as Webpack or Parcel.
Find out which of those options it is before you start trying to implement one of these solutions (all of which will take some time and effort that you don't want to waste).
Reading the documentation for a module will usually tell you. If you get it from NPM and it doesn't mention being browser compatible then it is probably Node.js only.
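As a rough sketch of that last case (file names follow the question; the bundler step is only an assumption), a CommonJS module that avoids Node.js-specific APIs can be bundled for the browser:

// another/file.js - a plain CommonJS module with no Node.js-specific APIs
module.exports.greet = function (name) {
  return 'Hello ' + name;
};

// my/js/file/1.js - the entry point that requires it
var greet = require('./another/file.js').greet;
console.log(greet('world'));

Running the entry file through a bundler such as Webpack or Parcel produces a single output file that no longer calls require() at runtime, and that file is what you reference from the script tag's src attribute.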
This might be because require() is not part of the standard JavaScript API in the browser. Your code is probably written for Node.js, which is where require() is used. Also, when you use the src attribute, you might also want to include the type attribute, i.e. <script type="text/javascript">.
I'm using Meteor 1.5, FlowRouter and Blaze.
I've been working on an app which requires offline support. Unfortunately it also has several (large) areas of the app that are only available to a small subset of users, so to avoid bloating the initial JS download with content most users won't require, I'm using dynamic imports at the FlowRouter.route({action}) level.
For offline mode (in addition to handling the data, etc) I'm using a service worker to cache the JS, CSS and HTML. Unfortunately, because dynamic imports work over the websocket, it isn't possible to cache these as they are loaded.
Fortunately, the user has to notify the server of their intent to work offline (so relevant data and files, videos, etc can be downloaded), this gives the opportunity to load these dynamic imports prior to the client going offline.
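For reference, the precaching step described above looks roughly like this (cache name and asset URLs are placeholders); the trouble is that Meteor's dynamically imported modules never show up as cacheable URLs:

// sw.js - minimal sketch of precaching static assets for offline use
self.addEventListener('install', function (event) {
  event.waitUntil(
    caches.open('offline-v1').then(function (cache) {
      // JS/CSS/HTML served over HTTP can be listed and cached here;
      // modules fetched via Meteor's dynamic import() arrive over the
      // websocket, so there is no URL to add to this list.
      return cache.addAll(['/', '/app.css', '/app.js']);
    })
  );
});

self.addEventListener('fetch', function (event) {
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});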
What are my options for caching dynamic imports? What I've considered so far:
writing a simple package with all the dynamic imports loaded statically, and with {lazy: true} defined in the package.js.
requires quite a lot of restructuring
the lazy: true means the package isn't actually available via a URL; it seems it is only available as a dynamic import itself.
writing a server side "fetcher" that will take a package name as an argument and serve the content from the file system
I don't know how the client side packages are stored on the server, and if the server has access to those files at all.
using browserify (or something similar) to manually generate a static js bundle in /public which can be downloaded when the client states their intent to go offline
It's very manual, and a change to the dynamic imports could easily be missed.
Has anyone attempted this before? I know Meteor doesn't officially support service workers, but as far as I can tell, with the exception of dynamic imports, it plays nicely with them.
TL;DR: How to load CSS and JavaScript files independent of Meteor's assumptions about alphabetical order (which is far from how it works in practice).
Stackoverflow tells me this question might be subjective but I hope not.
Meteor loads files based on alphabetical order (and other rules.)
So to force it to load the CSS and JS files in the order I wanted, I had to start the file names with numbers that indicate the load order. If I have jquery.js and bootstrap.js, Meteor will load bootstrap.js before jquery.js. But bootstrap depends on jquery, so jquery must be loaded first.
In order to solve this, the options are:
1. Put the files in the public directory and manually load them. But this didn't work, as Meteor appears to be sending the files with a text/html MIME type.
2. Create a Meteor package and specify the load order from there. I find this like hitting a fly with a hammer just for loading CSS and JavaScript.
3. Put a number before every file. In the previous example, to load jquery before bootstrap, rename the files to 1.jquery.js and 2.bootstrap.js. This works and is tedious, but at least I get to load the files the way I want them to.
I am new to Meteor so I am wondering if there are recommended best practices concerning this. I was thinking of using AMD for javascript but that's limited to javascript.
It's an interesting question, and this is probably one of the pitfalls of making a Meteor app.
You've mentioned all of the usable solutions such as creating an explicit package or renaming the files.
The best way, I would think, is to use Atmosphere packages. For example, if you add bootstrap, jquery is a dependency of it, so it will always load first. Most JS libraries that involve load order are typically on Atmosphere.
The other way, if there's no Atmosphere package (though I'm not sure I would say it's tedious), is to put a number in front of the JS file name to indicate load order.
One thing to note is that when you use the /public folder the files map to /, so you can load the JS files yourself manually, in the order you want, in the root HTML file. Meteor returns the text/html MIME type as its version of a 404 file-not-found error. This method is a bit troublesome though, because the files are separated in production, and it can cause trouble if one or the other doesn't load.
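For illustration, with jquery.js and bootstrap.js copied to /public (so they map to / as described), the root HTML controls the order explicitly:

<!-- load order is explicit when the files are served from /public -->
<script src="/jquery.js" type="text/javascript"></script>
<script src="/bootstrap.js" type="text/javascript"></script>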
I noticed that some programmers use two ways of including a .js file.
1- this way where you must have the js file:
<script src="lib/jquery.js" type="text/javascript"></script>
2- and this way where you don't need the js file :
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.js" type="text/javascript"></script>
and I want to know which way is better to use.
The first option uses local files; the second option uses a CDN.
A CDN is a group of fast servers hosting several commonly used files. It is really useful for saving bandwidth and speeding up the download of your site.
However, as was mentioned, you would have problems if the end user doesn't have access to the internet.
Basically, if you expect your application to always be executed online, a CDN is a great option. If you are developing an app that could be executed offline (like a CRM for a company), then it would be better served using local files.
If the CDN is down, then your website will be broken. But it is more likely that your website is down than the CDN.
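One common middle ground, sketched here with the file paths from the question, is to try the CDN first and fall back to the local copy if it fails to load:

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.js" type="text/javascript"></script>
<script type="text/javascript">
  // If the CDN request failed (offline, blocked, or the CDN is down),
  // window.jQuery is undefined, so fall back to the local copy.
  window.jQuery || document.write('<script src="lib/jquery.js" type="text/javascript"><\/script>');
</script>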
Depends.
Method #1 means you have a local copy of the file -- you don't need to rely on an existing path to the internet (from an intranet behind a firewall, spotty internet service, etc). You take care of any caching, and making sure the file exists.
Method #2 may give you a fast planet-wide content-delivery-network (CDN).
I have used, and will continue to use, both methods... but #2 is easier.
I am working on a strategy for storing and deploying JavaScript files on Azure (ASP.NET web role)
My requirements are:
To use minified versions in production
Use original (i.e. not minified) local versions in the development environment (to simplify debugging)
Simple build/deployment process (VS2010)
Simple update process (my files will change from time-to-time)
There is a great discussion here: Visual Studio 2010: Publish minified javascript files instead of the original ones. However, this does not take into account the benefits Azure can offer, or working with multiple instances.
I am considering deploying my minified JavaScript files to blob storage and use these in the production version. These will be stored with a large max-age Cache Control for client side caching and filenames will store the version (so I can easily update). I welcome feedback on this strategy.
Thus, in development the rendered HTML would refer to a local script file, i.e.:
<script src="Scripts/myjavascript-0.0.1.js" type="text/javascript"></script>
But in Production the result should use the following to refer to a minified version.
<script src="http://myblob.blob.core.windows.net/Scripts/myjavascript-0.0.1.js" type="text/javascript"></script>
My main question though is how best to achieve automatic switching of the paths in development and production. Or would a custom handler be the normal route (and if so, how would that work – I don't want each instance to reload from the blob on each request)?
Regarding #1 & 2:
I discuss a strategy for this here. The basic idea is to use a helper function to emit the script tag. The function can construct a link to the debug files when in debug mode, and to the minified files otherwise (which also makes it easy to test locally with the minified files). The same function can handle adding a version to the path for cache invalidation, etc.
Regarding #3:
Add the minification as an after-build step. I added this to my csproj (which is just an msbuild file); it uses yui-compressor:
<Target Name="AfterBuild" Condition="'$(Configuration)' != 'Debug'">
  <!-- remove previous minified files -->
  <Exec Command="del $(ProjectDir)Styles\*-min.css" />
  <Exec Command="del $(ProjectDir)Scripts\*-min.js" />
  <!-- Minify javascript and css, unless we're in Debug -->
  <Exec Command="java -jar $(ProjectDir)..\yuicompressor\build\yuicompressor-2.4.6.jar -o .css$:-min.css --charset utf-8 $(ProjectDir)Styles\*.css" />
  <Exec Command="java -jar $(ProjectDir)..\yuicompressor\build\yuicompressor-2.4.6.jar -o .js$:-min.js --charset utf-8 $(ProjectDir)Scripts\*.js" />
</Target>
This will create minified *-min.js and *-min.css files in ~\Scripts and ~\Styles, respectively.
Warning: because of a bug in version 2.4.6 of the YUI compressor, the above won't work if there is only one .css or .js file in the directory.
Your basic plan sounds good. It will even enable you to make use of the CDN with very little effort (you just need to replace the path to your storage account with the path to the CDN).
I don't think I'd try to overthink this too much. As suggested elsewhere, a control is a good way to go. Simply have this control look up a web.config setting to get the root directory for your scripts and prepend it to the path of the script (in your local version this setting would be empty). In order to make sure that you don't have to mess around changing the config for every deploy, I'd use config transformations so it just happens automatically.
For switching the URL of the script links dynamically when running from Azure, you should put all the script blocks inside a usercontrol and use that usercontrol in all the pages. You should not put the script links directly on the aspx/master pages; instead, put them on an ascx and use the ascx. This helps keep common script links in a single file, and when you need to make a site-wide change, you just change the ascx.
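A minimal sketch of such a usercontrol (the ScriptRoot property is hypothetical and would be read from web.config in the control's code-behind; the file name is taken from the example above):

<%-- Scripts.ascx : every page includes this one control for its script links --%>
<script src="<%= ScriptRoot %>Scripts/myjavascript-0.0.1.js" type="text/javascript"></script>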
Another approach is to use my httphandler that changes the URL of the scripts from relative to absolute, in order to facilitate download of scripts from a different domain than the one the site is running from. You can of course use it to prepend the absolute URL of your Azure site.
http://omaralzabir.com/loading_static_content_in_asp_net_pages_from_different_domain_for_faster_parallel_download/
You may want to check out the Windows Azure CDN helpers project. It should do pretty much everything you are asking for. You can set in the config if you want your minified files to automatically be deployed to blob storage or stay on the web roles.
http://cdnhelpers.codeplex.com/
http://ntotten.com/2011/06/windows-azure-cdn-helpers/
http://nuget.org/List/Packages/CdnHelpers.Razor
http://nuget.org/List/Packages/CdnHelpers.ASPX