How do I find the current path of the executing file when it's in a Meteor package?
I have a Meteor package that needs to dynamically include a list of assets.
The fs.readdirSync call seems to need a full path.
When running the package inside an app, things are fine, as the path to the package is easily deduced, like this:
var path = Npm.require("path");
var fs = Npm.require("fs");
// normally found inside an app/packages dir like:
var packagePath = path.join(path.resolve("."), "packages", "reactive-ace");
var srcPath = path.join(packagePath, "vendor", "ace", "src"); // files relative to package root
var files = fs.readdirSync(srcPath);
files.forEach(function (file) {
  console.log("add_file", file);
  if (file === "snippets") { return; }
  var addPath = path.join("vendor", "ace", "src", file);
  api.add_files(addPath, "client", { isAsset: true });
});
However, when I'm developing locally using $PACKAGE_DIRS, the package repo is actually somewhere else on my system, and I'm unable to deduce its location just from package.js.
I tried __dirname and require.main.filename, but both are undefined in the context of building a Meteor package.
My current hack involves parsing process.env.PACKAGE_DIRS, but that is ugly.
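For reference, the hack looks something like the sketch below; the colon-separated parsing and the hard-coded package name are just illustrations of the idea, not exact code:
var path = Npm.require("path");
// PACKAGE_DIRS may list several directories, separated by ":" on Unix-like systems
var packageDirs = (process.env.PACKAGE_DIRS || "").split(":");
// assume the package lives in the first listed directory, under its repo name
var packagePath = path.join(packageDirs[0], "reactive-ace");
var srcPath = path.join(packagePath, "vendor", "ace", "src");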
Any pointers appreciated.
I'm building and trying to deploy a packaged Electron app. For the packaging I used:
electron-packager
electron-installer-debian
electron-installer-dmg
electron-winstaller
and I'm facing a little issue where I have to store the app's data somewhere on the user's computer.
I saw that good practice is to use the folder at the path returned by the Electron method app.getPath('userData').
From the docs:
It is the directory for storing the app's configuration files, which by default is the appData directory appended with the app name:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
In my tests, sometimes this folder is not created automatically when the app is installed and other times it is, and I'm wondering whether I should create it myself or not.
Right now I'm quitting the app if this folder isn't present on the machine, with the following code:
var app = require('electron').app; // main process
var fs = require('fs');

var DatasPath = app.getPath('userData');
if (!fs.existsSync(DatasPath)) {
  process.exit();
}
So the question is:
Should I create the DatasPath folder with fs.mkdirSync(DatasPath) when it is not present, or is it bad practice to do so? And if I do create the folder, do I have to warn the user that I have just added it?
(Expanding my reply from a "comment" to an "answer")
I don't know if I'm supposed to create it or not, so I automatically make the app quit if that folder is not there.
It seems you are taking "userData" too literally. It is not an actual folder named "userData"; it is a path to where the operating system stores data for that application. Electron currently runs on three operating systems and each one does things differently. For our convenience, Electron hides those differences behind the wrapper method app.getPath(name), so the same code works on each OS.
Try this: put the line below in your main.js script:
console.log(app.getPath('userData'));
In the terminal where you launched the app you will see something like (this example is from macOS):
/Users/*********/Library/Application Support/MyCoolApp
(the "*********" will be your user account name and "MyCoolApp" the name of your app.)
UPDATED:
Run the code below in main.js and then look in the folder specified by the "userData" path:
const { app } = require('electron'); // main process
const fs = require("fs");
const path = require('path');

var datasPath = app.getPath('userData');
var data = "I am the cheese";
var filePath = path.join(datasPath, "savedData.txt");
fs.writeFileSync(filePath, data);
In pathConfig.js:
const fs = require("fs");
const path = require("path");

function getAppDataPath() {
  switch (process.platform) {
    case "darwin": {
      return path.join(process.env.HOME, "Library", "Application Support", "myApp");
    }
    case "win32": {
      return path.join(process.env.APPDATA, "myApp");
    }
    case "linux": {
      return path.join(process.env.HOME, ".myApp");
    }
    default: {
      console.log("Unsupported platform!");
      process.exit(1);
    }
  }
}
const appPath = __dirname;
const appDataPath =
  !process.env.NODE_ENV || process.env.NODE_ENV === "production"
    ? getAppDataPath() // Live Mode
    : path.join(appPath, "AppData"); // Dev Mode

if (!fs.existsSync(appDataPath)) {
  // The AppData dir doesn't exist at the expected path, so create it.
  // This is typically the case the first time the user runs the app.
  fs.mkdirSync(appDataPath);
}
In each operating system the appData folder has a different path, and the most reliable way of getting this path is by calling app.getPath('userData') in the main process.
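To make that concrete, here is a minimal sketch (not code from the question) of ensuring the userData folder exists from the main process; fs.existsSync and fs.mkdirSync are standard Node APIs:
const { app } = require('electron');
const fs = require('fs');

app.on('ready', function () {
  const dataDir = app.getPath('userData');
  if (!fs.existsSync(dataDir)) {
    fs.mkdirSync(dataDir); // create it on first run if the installer didn't
  }
  // ...create windows and read/write your config files under dataDir
});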
But there is a package that can handle this for you; it stores data in a JSON file and updates it on every change.
In my opinion this package is much better than handling everything by yourself.
Read more:
https://www.npmjs.com/package/electron-data-holder
I use FayeJS, and the latest version has been modified to use RequireJS, so there is no longer a single file to link into the browser. Instead, the structure is as follows:
/adapters
/engines
/mixins
/protocol
/transport
/util
faye_browser.js
I am using the following nodejs build script to try and end up with all the above minified into a single file:
var fs = require('fs-extra'),
    requirejs = require('requirejs');

var config = {
  baseUrl: 'htdocs/js/dev/faye/',
  name: 'faye_browser',
  out: 'htdocs/js/dev/faye/dist/faye.min.js',
  paths: {
    dist: "empty:"
  },
  findNestedDependencies: true
};

requirejs.optimize(config, function (buildResponse) {
  //buildResponse is just a text output of the modules
  //included. Load the built file for the contents.
  //Use config.out to get the optimized file contents.
  var contents = fs.readFileSync(config.out, 'utf8');
}, function (err) {
  //optimization err callback
  console.log(err);
});
The content of faye_browser.js is:
'use strict';
var constants = require('./util/constants'),
Logging = require('./mixins/logging');
var Faye = {
VERSION: constants.VERSION,
Client: require('./protocol/client'),
Scheduler: require('./protocol/scheduler')
};
Logging.wrapper = Faye;
module.exports = Faye;
As I understand it, the optimizer should pull in the required files, then if those files require other files it should pull those in too, and so on, and output a single minified faye.min.js that contains the whole lot, refactored so that no additional server-side calls are necessary.
What happens is that faye.min.js gets created, but it only contains the content of faye_browser.js; none of the other required files are included.
I have searched all over the web, and looked at a heap of different examples and none of them work for me.
What am I doing wrong here?
For anyone else trying to do this, I missed that the download page says:
The Node.js version is available through npm. This package contains a
copy of the browser client, which is served up by the Faye server when
running.
So to get it you have to pull down the code via npm and then go into the npm install dir; the browser client is in the "client" dir...
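As a sketch of how you might use that, the prebuilt client can be copied out of node_modules into the served js directory as part of a build step. The exact file name under the "client" dir is an assumption here (check your installed faye package); fs-extra is the same module used in the build script above:
var fs = require('fs-extra');
var path = require('path');

// copy Faye's prebuilt browser client out of the npm install dir
var src = path.join(__dirname, 'node_modules', 'faye', 'client', 'faye-browser.js');
var dest = path.join(__dirname, 'htdocs', 'js', 'dev', 'faye', 'faye-browser.js');
fs.copySync(src, dest);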
I'm in the process of building an npm package which will be installed globally. Is it possible to have non-code files installed alongside code files that can be referenced from code files?
For example, if my package includes someTextFile.txt and a module.js file (and my package.json includes "bin": {"someCommand":"./module.js"}) can I read the contents of someTextFile.txt into memory in module.js? How would I do that?
The following is an example of a module that loads the contents of a file (string) into the global scope.
core.js: the main module file (the entry point in package.json)
//:Understanding: module.exports
module.exports = {
  reload: (cb) => { console.log("[>] Magick reloading to memory"); ReadSpellBook(cb) }
}

//:Understanding: global object
//the following function is only accessible by the magick module
const ReadSpellBook = (cb) => {
  require('fs').readFile(__dirname + "/spellBook.txt", "utf8", (e, theSpells) => {
    if (e) { console.log("[!] The Spell Book is MISSING!\n"); cb(e) }
    else {
      console.log("[*] Reading Spell Book")
      //since we want to make the contents of .txt accessible:
      global.SpellBook = theSpells // global.SpellBook is now shared across all the code (global scope)
      cb() //callBack
    }
  })
}
//·: Initialize :.
console.log("[+] Time for some Magick!")
ReadSpellBook((e)=>e?console.log(e):console.log(SpellBook))
spellBook.txt
ᚠ ᚡ ᚢ ᚣ ᚤ ᚥ ᚦ ᚧ ᚨ ᚩ ᚪ ᚫ ᚬ ᚭ ᚮ ᚯ
ᚰ ᚱ ᚲ ᚳ ᚴ ᚵ ᚶ ᚷ ᚸ ᚹ ᚺ ᚻ ᚼ ᚽ ᚾ ᚿ
ᛀ ᛁ ᛂ ᛃ ᛄ ᛅ ᛆ ᛇ ᛈ ᛉ ᛊ ᛋ ᛌ ᛍ ᛎ ᛏ
ᛐ ᛑ ᛒ ᛓ ᛔ ᛕ ᛖ ᛗ ᛘ ᛙ ᛚ ᛛ ᛜ ᛝ ᛞ ᛟ
ᛠ ᛡ ᛢ ᛣ ᛤ ᛥ ᛦ ᛧ ᛨ ᛩ ᛪ ᛫ ᛬ ᛭ ᛮ ᛯ
If you require it from another piece of code, you will see how it prints to the console and initializes by itself.
If you want manual initialization instead, simply remove the last 3 lines (·: Initialize :.) and use reload():
const magick = require("./core.js")
magick.reload((error) => {
  if (error) { throw error }
  else {
    //now you know the SpellBook is loaded
    console.log(SpellBook.length)
  }
})
I have built some CLIs which were distributed privately, so I believe I can illuminate a bit here.
Let's say your global modules are installed in a directory called $PATH. When your package is installed on any machine, it will essentially be extracted into that directory.
When you fire up someCommand from any terminal, the module.js that was kept at $PATH will be invoked. If you initially kept the template file in the same directory as your package, then it will be present at that location, which is local to module.js.
Assuming you edit the template as a string and then want to write it to wherever the user wishes (for example the current working directory), you just have to use process.cwd() to get the path to that directory. This totally depends on how you code it.
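For illustration (the output file name here is made up), writing an edited template into the directory the user ran the command from might look like this:
const fs = require('fs');
const path = require('path');

// process.cwd() is the directory the user invoked `someCommand` from
const outPath = path.join(process.cwd(), 'generated.txt');
fs.writeFileSync(outPath, 'rendered template contents');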
In case you want to explicitly include only certain files in the npm package, use the files attribute of package.json.
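For example, a package.json along these lines (the name is illustrative; "bin" and the file names come from the question) ships both the script and the text file:
{
  "name": "my-cli",
  "version": "1.0.0",
  "bin": { "someCommand": "./module.js" },
  "files": ["module.js", "someTextFile.txt"]
}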
To specifically answer "how can my code file in the npm package locate the path to the globally installed npm folder in which it is located in a way that is guaranteed to work across OSes and is future proof?": that is very different from the template question you were trying to solve. What you're really asking here is the global path of npm modules. As a fail-safe option, use the path returned by require.main.filename within your code and keep that as a reference.
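A small sketch of that idea (someTextFile.txt is from the question; the rest is illustrative):
const path = require('path');

// require.main.filename points at the entry script (module.js), wherever npm installed it
const installDir = path.dirname(require.main.filename);
const textFilePath = path.join(installDir, 'someTextFile.txt');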
When you npm publish, it packages everything in the folder, excluding things noted in .npmignore. (If you don't have an .npmignore file, it'll dig into .gitignore. See https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) So in short, yes, you can package the text file into your module. Installing the module (locally or globally) will get the text file into place in a way you expect.
How do you find the text file once it's installed? __dirname gives you the path of the current file ... if you ask early enough. See https://nodejs.org/docs/latest/api/globals.html#globals_dirname (If you use __dirname inside a closure, it may be the path of the enclosing function.) For the near-term of "future", this doesn't look like it'll change, and will work as expected in all conditions -- whether the module is installed locally or globally, and whether others depend on the module or it's a direct install.
So let's assume the text file is in the same directory as the currently running script:
var fs = require('fs');
var path = require('path');

function runIt(cb) {
  var fullPath = path.join(__dirname, 'myfile.txt'); // absolute path to the file next to this script
  fs.readFile(fullPath, 'utf8', function (e, content) {
    if (e) {
      return cb(e);
    }
    // content now has the contents of the file
    cb(null, content); // error-first callback: no error, then the data
  });
}
module.exports = runIt;
Sweet!
I am new to Browserify and trying the following:
I created a Node server and am trying to get a package called 'openbci' running in the browser.
So I have the following file structure:
Myapp
-...
-public
--app.js
--index.html
--openBCI.js
--...
--javascript
---openBCI
----bundle.js
---...
-node_modules
--openbci
---openBCIBoard.js
--browserify
--...
My app.js file sets up the server to serve the public folder:
// app.js
var express = require('express');
var app = express();
app.use(express.static('public'));
app.listen(myPort);
Then I created the following openBCI.js:
// openBCI.js
var OpenBCIBoard = require('openbci').OpenBCIBoard;
exports.OpenBCIBoard = OpenBCIBoard;
And finally I launched the browserify command:
$ browserify public/openBCI.js > public/javascript/openBCI/bundle.js
But once it was called from my index.html file, I got an Uncaught TypeError: exists is not a function at Function.getRoot:
exports.getRoot = function getRoot (file) {
  var dir = dirname(file)
    , prev
  while (true) {
    if (dir === '.') {
      // Avoids an infinite loop in rare cases, like the REPL
      dir = process.cwd()
    }
    if (exists(join(dir, 'package.json')) || exists(join(dir, 'node_modules'))) { // <-- the line that throws
      // Found the 'package.json' file or 'node_modules' dir; we're done
      return dir
    }
    if (prev === dir) {
      // Got to the top
      throw new Error('Could not find module root given file: "' + file
        + '". Do you have a `package.json` file? ')
    }
    // Try the parent dir next
    prev = dir
    dir = join(dir, '..')
  }
}
It appears that it could not find the original path for the module.
Could you please tell me what needs to change? Or whether I have understood how Browserify works at all? :)
I notice a few things that seem strange about the code.
exists is not defined in plain JavaScript or Node. It appears to be an alias of fs.exists; is that right?
If so, fs.exists is deprecated. Per the documentation, you can achieve the same effect with fs.stat or fs.access. Note however that you should either supply a callback (preferable) or use the Sync version of these methods.
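For instance, a check equivalent to the deprecated fs.exists could look like this (the file name is just an example):
const fs = require('fs');

// asynchronous, callback-based check (preferred)
fs.access('package.json', fs.constants.F_OK, (err) => {
  console.log(err ? 'not found' : 'found');
});

// synchronous alternative
try {
  fs.statSync('package.json');
  console.log('found');
} catch (err) {
  console.log('not found');
}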
If you are trying to use file system tools in the browser you are going to run into problems because you are attempting to access the server's file system from the browser. There is a plugin, browserify-fs, that gives you an equivalent to fs in the browser. However, this seems to access the browser's local IndexedDB, not the storage on your server.
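If you really do want file-like storage in the browser, a minimal sketch with browserify-fs (which, as far as I understand it, mirrors the core fs callback API on top of IndexedDB) would be:
var fs = require('browserify-fs');

// everything here lives in the browser's IndexedDB, not on your server
fs.mkdir('/data', function () {
  fs.writeFile('/data/hello.txt', 'stored in the browser', function (err) {
    if (err) throw err;
    fs.readFile('/data/hello.txt', 'utf8', function (err, contents) {
      console.log(contents);
    });
  });
});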
I would suggest running code that relies on server-side files on the server, rather than in the browser.
I tried the npm package adm-zip 0.4.4 because the latest one, 0.4.7, doesn't work; adm-zip 0.4.4 works on Windows but not on Mac & Linux. Another problem is that I only want zip_folder to be zipped, but it zips the whole directory structure starting from folder_1. This is the code:
var admZip = require('adm-zip');

var zip = new admZip();
zip.addLocalFolder("./folder_1/folder_2/folder_3/zip_folder");
zip.writeZip("./folder_1/folder_2/folder_3/download_folder/zip_folder.zip");
All this happens on the server side. I have searched a lot and tried many npm packages to zip a folder or directory. Any suggestions or any other good approach to solve my problem?
You could also use node-archiver, which was very helpful when I was using it. First you need to create an instance of the archiver as follows:
var fs = require('fs');
var archiver = require('archiver');
var archive = archiver.create('zip', {});
var output = fs.createWriteStream(__dirname + '/zip_folder.zip');
This way you tell the archiver to zip the files or folders in the format you pass to the create method; in this case it's zip. In addition, we create a write stream to which the archiver's output will be piped. Then we use the directory method to append a directory and its files, recursively, given its dirpath:
archive.pipe(output);
archive
.directory(__dirname + '/folder_1/folder_2/folder_3/download_folder/zip_folder')
.finalize();
At the end, we need to finalize the instance which prevents further appending to the archive structure.
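If you need to know when the zip has been fully written (for example before sending it to the client), you can listen on the write stream; a brief sketch using the output stream created above:
// fires once the file descriptor has been closed and the zip is complete
output.on('close', function () {
  console.log(archive.pointer() + ' total bytes written');
});

// surface any archiver errors instead of failing silently
archive.on('error', function (err) {
  throw err;
});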
Another option is to use the bulk method like so:
archive.bulk([{
expand: true, cwd: './folder_1/folder_2/folder_3/download_folder/zip_folder/',
src: ['**/*']
}]).finalize();
Update 1
A little explanation of the ['**/*'] syntax: it will recursively include all folders (**) and all files (*).
Try to use the system's zip function:
var execFile = require('child_process').execFile;
execFile('zip', ['-r', '-j', zipName, path], function (err, stdout) {
  if (err) {
    console.log(err);
    throw err;
  }
  console.log('success');
});
Replace zipName and path with what you need.
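For example, with the folders from the question it could be called as below; note that -j junks the directory paths, so only the contents of zip_folder end up in the archive (this is a sketch, and it assumes the zip command is available on the machine):
var execFile = require('child_process').execFile;

execFile('zip', ['-r', '-j', 'zip_folder.zip', './folder_1/folder_2/folder_3/zip_folder'],
  function (err, stdout) {
    if (err) { throw err; }
    console.log('success');
  });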