I'm using uglify-js to minify my source code. I want to remove statements such as
const moment = require('moment');
const PouchDB = require('pouchdb');
module.exports = Chart;
from the original source code. Is this possible? Or is there another compressor tool that supports this?
I use the code below in Node.js.
'use strict'
const moment = require('moment');
const PouchDB = require('pouchdb');
const defaultcachetime = 12; // hours
const VERIFIED = 3;
const UNIQUCOUNTER = 1;
var caches = {};
var cachechange = {};
function Chart(path, credentials, user){
    // ... (implementation omitted)
}
module.exports = Chart;
The output contains
"use strict";const moment=require("moment"),PouchDB=require("pouchdb") return a},module.exports=Chart;
Thank you for helping
Managed to overcome the challenge by doing the following:
1. Browserify to convert the require and import statements into code the front end can parse.
2. Minify the bundled JS code with Uglify.
Hope it helps anyone who might be facing the same challenge. A rough sketch of the two steps is below.
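A minimal sketch of that two-step pipeline using the browserify and uglify-js Node APIs (the entry file app.js and the output name are placeholders, and uglify-js v3 is assumed):
const fs = require('fs');
const browserify = require('browserify');
const UglifyJS = require('uglify-js');

// Bundle first so the require() calls are inlined, then minify the bundle.
browserify('app.js').bundle((err, buf) => {
    if (err) throw err;
    const result = UglifyJS.minify(buf.toString());
    if (result.error) throw result.error;
    fs.writeFileSync('app.min.js', result.code);
});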
I have an existing Node.js application that still uses CommonJS. It's been fine up till now, but I've run into a module that I'm not sure how to import. I was hoping there was a fast way of doing it rather than restructuring my whole app to the ES module standard.
Here is the module's import documentation:
import MetaApi, {CopyFactory} from 'metaapi.cloud-sdk';
const token = '...';
const api = new MetaApi(token);
const copyFactory = new CopyFactory(token);
I got the CopyFactory part to work by destructuring like so:
const { CopyFactory } = require('metaapi.cloud-sdk')
const copyFactory = new CopyFactory(token)
But I can't find a way to import the API part that takes the token. Is there any way of doing it?
Thanks a lot
You can do it this way,
const MetaApi = require('metaapi.cloud-sdk');
const {CopyFactory} = MetaApi;
const token = '...';
const api = new MetaApi.default(token);
const copyFactory = new CopyFactory(token);
Hopefully this will work fine.
As suggested by Bergi, adding default made it work:
const { default: MetaApi, CopyFactory } = require(…)
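For reference, a sketch of the combined usage under that interop assumption (the package ships transpiled ES modules, so the main class sits on the default property; the token value is a placeholder):
const { default: MetaApi, CopyFactory } = require('metaapi.cloud-sdk');

const token = '...'; // placeholder token
const api = new MetaApi(token);
const copyFactory = new CopyFactory(token);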
I am new to JavaScript and need the ability to create, edit and export an XML document on the server side. I have seen different options on the Internet, but they do not suit me.
It seems I found one workable option: convert my XML file to JSON, edit it, convert it back, and then export it through another plugin. But maybe there is a simpler way?
Thanks!
I recently came across a similar problem. The solution turned out to be very simple: use xml-writer.
In your project folder, first install it via the console
npm install xml-writer
Next, create a new file, import it there, and build the document:
var XMLWriter = require('xml-writer');
var xw = new XMLWriter();
xw.startDocument();
xw.startElement('root');
xw.writeAttribute('foo', 'value');
xw.text('Some content');
xw.endDocument();
console.log(xw.toString());
You can find more information in the xml-writer documentation, and at the bottom of that page you can see the different code for each item. In this way you can create, edit and export XML files. Good luck, and if something is not clear, write!
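For the export step, a minimal sketch (assuming the xw writer built above and a placeholder file name) is to write the generated markup to disk with the built-in fs module:
const fs = require('fs');
// xw is the XMLWriter instance created above.
fs.writeFileSync('./output.xml', xw.toString());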
Additional
You will also need the fs module:
const fs = require("fs")
const xmlParser = require("xml2json")
const formatXml = require("xml-formatter")
Completed code:
const fs = require("fs")
const xmlParser = require("xml2json")
const formatXml = require("xml-formatter")
var XMLWriter = require('xml-writer');
var xw = new XMLWriter();
xw.startDocument();
xw.startElement('root');
xw.startElement('man');
xw.writeElement('name', 'Sergio');
xw.writeElement('adult', 'no');
xw.endElement();
xw.startElement('item');
xw.writeElement('name', 'phone');
xw.writeElement('price', '305.77');
xw.endElement();
xw.endDocument();
// Parse the generated XML into a plain object (this is the xmlObj used below),
// then serialize it back to XML before writing it to disk.
const xmlObj = xmlParser.toJson(xw.toString(), { object: true })
const stringifiedXmlObj = JSON.stringify(xmlObj)
const finalXml = xmlParser.toXml(stringifiedXmlObj)
fs.writeFile("./datax.xml", formatXml(finalXml, { collapseContent: true }), function (err) {
    if (err) {
        console.log("Error")
    } else {
        console.log("Xml file successfully updated.")
    }
})
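To cover the editing part of the question, a hedged sketch: change a value on the parsed xmlObj from above before converting it back to XML (the new price and the output file name are arbitrary examples):
// Edit the parsed object, then serialize and format it again.
xmlObj.root.item.price = '199.99';
const editedXml = xmlParser.toXml(JSON.stringify(xmlObj));
fs.writeFileSync("./datax-edited.xml", formatXml(editedXml, { collapseContent: true }));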
I'm building a Discord bot with Node.js for my server and I have a bunch of commands for the bot. Each command is in a different file, so I end up with a lot of statements like this:
const cmd = require("../commands/cmd.js");
const kick = require("../commands/kick");
const info = require("../commands/info");
const cooldown = require("../commands/cooldown");
const help = require("../commands/help");
Is there a simpler way to do this?
Inside the commands folder, put a file called index.js.
Each time you implement a new command in a new file, require that file in index.js and add it to its exports. For example, index.js would be:
const kick = require('./kick');
const info = require('./info');
module.exports = {
kick: kick,
info: info
}
And then from any folder you can require multiple commands in one line like this:
const { kick, info } = require('../commands');
Export an object from one file instead?
const kick = require("../commands/kick");
const info = require("../commands/info");
const cooldown = require("../commands/cooldown");
const help = require("../commands/help");
const commands = {
kick,
info,
...
}
module.exports = commands;
And then:
const commands = require('mycommands')
commands.kick()
Create an index.js file inside the commands folder, and then you can export an object like this:
const kick = require("./kick");
const info = require("./info");
const cooldown = require("./cooldown");
const help = require("./help");
const command = {
kick,
info,
cooldown,
help
};
module.exports = command;
You can import and use it like this:
const {kick, info} = require('./commands');
So I'm planning to separate my functions into separate files and then import them into a single index.js, which then becomes the main exporter. I'm wondering if having something like var bcrypt = require('bcrypt') in several of my files would be slower than just having it in one file.
Here's how I'm planning to group and export in index.js
const fs = require('fs');
const path = require('path')
const modules = {}
const files = fs.readdirSync(__dirname)
files.forEach(file => {
if (file === 'index.js') return
let temp = require(path.join(__dirname, file))
for (let key in temp) {
modules[key] = temp[key]
}
});
module.exports = modules
As an example of what I mean:
file1.js
var bcrypt = require("bcrypt");
module.exports.file1test = "hi"
file2.js
var bcrypt = require("bcrypt");
module.exports.file2test = "bye"
No, it does not. Whenever a module is required for the first time, the module's code runs, assigns something to its exports, and those exports are returned. Further requires of that module simply reference those exports again. The logic is similar to this:
const importModule = (() => {
const exports = {};
return (name) => {
if (!exports[name]) exports[name] = runModule(name);
return exports[name];
};
})();
So, importing the same module multiple times is no more expensive than referencing an object multiple times.
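As a quick illustration of that caching (using a built-in module so the snippet runs anywhere):
const a = require('path'); // the module's code runs on the first require
const b = require('path'); // later requires are served from require.cache
console.log(a === b);      // true: both names point at the same exports object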
Is there a way to get the version of an external dependency in JS code, without hardcoding it?
If you wanted to get the version of a dependency such as react-native, you could do something like the following. You loop over each folder in node_modules, skip hidden entries, and add each package's name and version to an object.
const fs = require('fs');
const dirs = fs.readdirSync('node_modules');
const packages = {};
dirs.forEach(function(dir) {
    // Skip hidden entries such as .bin and scoped folders (@scope),
    // which do not have a package.json directly inside them.
    if (dir.startsWith('.') || dir.startsWith('@')) return;
    const file = 'node_modules/' + dir + '/package.json';
    // Read the manifest relative to the working directory, matching readdirSync above.
    const json = JSON.parse(fs.readFileSync(file, 'utf8'));
    packages[json.name] = json.version;
});
console.log(packages['react-native']); // will log the version
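If you only need the version of a single known dependency, a shorter alternative (not part of the answer above, just standard Node resolution) is to require that package's own package.json, which works as long as the package does not block it via an "exports" map:
const version = require('react-native/package.json').version;
console.log(version);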