The intention is to access a library of images outside the Meteor application directory, since the application will not be the only component using this library. The Meteor app, the image library, and the other components will all reside on the same machine.
How can I do this? And is there such a thing called a local content delivery network?
You need two different things at the same time.
For the first part, to get read/write access to your files outside the Meteor directory, take a look at https://stackoverflow.com/a/20330021/1064151 and http://shiggyenterprises.wordpress.com/2013/05/16/accessing-the-file-system-in-meteor/, both of which basically tell you to use fs from npm:
fs = Npm.require 'fs'
path = Npm.require 'path'

readSomeFile = () ->
  p = path.resolve './server/somefile.csv'
  data = fs.readFileSync p, 'UTF-8'
  # ...
The second part is to serve those images from an alternative web server, which you can do with any web server module from Node, or with Apache/nginx configured to use the target directory as its document root.
You can even do that from within Meteor, but Meteor is not the best web server for serving static assets.
Related
I was wondering how I would deal with relative paths within my application when using electron packager.
In my app source folder I have some JSON files and other files that I reference. When packaging, electron-packager creates the \resources\app directory and places all these files into it. This means that any relative paths I'm using during development fail in the packaged app.
I tried pre-empting this by creating the \resources\app folder in my source directory, hoping the packager would notice it and just move the files directly, but it created \resources\app\resources\app instead.
I have had success using __dirname along with upath to build paths to assets.
I like upath rather than path because it has a toUnix method which "replaces the windows \ with the unix / in all string params & results."
var imgPath = upath.toUnix(upath.join(__dirname, "assets","welcome.png"));
I am writing a framework package which I'd like to be able to auto-require modules from the main project's src/. If you are familiar with Rails, this is akin to its autoload feature.
So if your web app follows a directory convention, say src/models/my-model.js, then the framework can require the my-model module on its own. The framework, which is a dependency of the web app, only needs to know the name of the relation (i.e. "todos") in order to require the model (i.e. src/models/todo.js).
I've tried adding my web app's src directory in my web app's webpack chain, config.resolve.modules.add(path.resolve(__dirname, 'src')), but it does not seem to apply to the search paths for dependencies (I'm not sure), so my framework lib still can't find modules in my web app.
I've also (desperately) tried passing require from the web app into the dependency, and then in the dependency code calling var MyModel = this.thePassedInRequireFn("./models/" + modelName), but it errors:
Uncaught Error: Cannot find module './models/my-model'
    at MyFramework.webpackEmptyContext
Anybody have ideas how this can be done?
If the solution can be independent of the use of webpack, that would be ideal, but webpack compatibility is what is most important to me.
Here is a webpack specific answer using require.context().
https://webpack.js.org/guides/dependency-management/#require-context
In the web app create a require context. For example:
const requireModule = require.context('./models/', true);
Then pass requireModule to the framework you have as a dependency of your web app.
If your web app has a model in the file ./models/todo-item.js and the framework is given the model's name todoItem, the framework can require it using only the model name, like so:
let fileName = `./${kebabCase(modelName)}`;
let module = this.requireModule(fileName).default;
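Putting the two sides together, a minimal sketch of the framework half might look like this (kebabCase is hand-rolled here for illustration, and MyFramework/modelFor are hypothetical names; requireModule is assumed to be webpack's require.context result, or any function with the same (request) => module shape):

```javascript
// Framework-side sketch: resolve a model by name through an injected
// require context, so the framework never hard-codes the app's paths.
function kebabCase(name) {
  // "todoItem" -> "todo-item" (illustrative, not a full implementation)
  return name.replace(/([a-z0-9])([A-Z])/g, '$1-$2').toLowerCase();
}

class MyFramework {
  constructor(requireModule) {
    // requireModule is injected by the web app, e.g. from require.context()
    this.requireModule = requireModule;
  }
  modelFor(modelName) {
    const fileName = `./${kebabCase(modelName)}`;
    return this.requireModule(fileName).default;
  }
}
```

The web app side would then wire it up with something like `new MyFramework(require.context('./models/', true))`, keeping the framework itself webpack-agnostic: it only depends on being handed a require-like function.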
I am new to Angular 2 and have a bit of confusion about how Node.js and the Angular 2 framework work together and relate to each other.
I can run my app with the lite-server on localhost, but my problem is uploading the app to the hosting service.
There aren't any tutorials or guides on what to do when the app is ready, so I have been trying to make a bundle with webpack, but I have not been successful.
I know it is BAD practice to upload all the node_modules installed by npm, but am I correct in trying to make such a bundle?
Another clarification: can my app run just by uploading the HTML, CSS and JS files (including those in node_modules), or do I need to configure a host that allows Node.js to run my application?
In Angular 2, if you use TypeScript you need to transpile the web app; this transpile step puts the files in the /dist folder.
If you use ES6, you use the app from the root folder of your development directory.
If you open the index.html from your /dist folder in your browser, the Angular 2 app works.
In the index.html you have this code:
System.import('system-config.js').then(function () {
System.import('main');
}).catch(console.error.bind(console));
In the main.js of your /dist folder you have this code:
var _1 = require('./app/');
In the folder that this require points to you have, for example:
var ng_fire_component_1 = require('./ng-fire.component');
This require pulls in the principal component of the web app. With this logic, your app runs just by opening index.html, where ng-fire.component is your root component.
In Node you only need to create a web server; this web server (if you use Express) needs to serve the index.html:
router.get('/', function(req, res){
res.sendfile('yourAPPfolder/index.html');
});
and your web app runs again when you open www.yourweb.com/ or localhost:yourPort/.
For the last question: if you use a server, you only have to upload the /dist folder; it contains all the files you need.
I recommend the Angular CLI (https://cli.angular.io) for working with Angular 2. If you need another vendor file or vendor folder, you can add it in the file angular-cli-build.js, for example:
/* global require, module */
var Angular2App = require('angular-cli/lib/broccoli/angular2-app');

module.exports = function(defaults) {
  return new Angular2App(defaults, {
    vendorNpmFiles: [
      'systemjs/dist/system-polyfills.js',
      'systemjs/dist/system.src.js',
      'zone.js/dist/*.js',
      'es6-shim/es6-shim.js',
      'reflect-metadata/*.js',
      'rxjs/**/*.js',
      '@angular/**/*.js'
    ]
  });
};
GOAL: I am trying to set up a project in nodejs and webpack such that the require function can use the project directory as root, so I can require with absolute path relative to project root in both environments (isomorphic uses i.e. React server+client render).
SITUATION: In webpack you can set config.resolve.root to make this work, but in Node it's considered best practice not to override or modify the global require.
PROPOSITION 1: I can make a new global function
global.p_require
so it works in node; however, I cannot find a way to let webpack parse "p_require" into __webpack_require__ without changing the webpack source code.
PROPOSITION 2: I can make a new global variable
global.ROOT_DIR = process.cwd()
so it works in node by
require(ROOT_DIR + <relative path to root>);
however, webpack would recognize this as a dynamic require. Is there a way to make webpack parse ROOT_DIR? I have already tried the DefinePlugin, but it seems to run after require is parsed by webpack.
QUESTION
Does anyone have a solution, or face the same issue?
I'm addressing this by letting webpack do more; instead of "node and webpack", it's "webpack: client and server". I have webpack do a build for the client and a build for the server (the latter uses 'node' as its target property in config). It's easy to customize the directories webpack uses to require, so you let it do its work and create a build for node.
When rendering on the server, you just require the compiled server build. If you need to pass some stuff in from the server to the application that webpack built, wire that up in the entry point that you use for the server build -- webpack will build it as a commonJs module, so your entry point can export whatever is the most convenient interface when the server needs to render.
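As a rough sketch of that setup, a single webpack config file can export two builds, one for the browser and one targeting node; the entry file names and output layout below are assumptions, so adjust them to your project:

```javascript
// webpack.config.js sketch: one client build, one server build.
// Assumptions: entries at src/client.js and src/server.js.
const path = require('path');

const shared = {
  resolve: {
    // Let both builds resolve require() paths relative to the project src/,
    // which gives the "absolute from project root" behavior in both places.
    modules: [path.resolve(__dirname, 'src'), 'node_modules'],
  },
};

module.exports = [
  {
    ...shared,
    entry: './src/client.js',
    output: { path: path.resolve(__dirname, 'build'), filename: 'client.js' },
  },
  {
    ...shared,
    entry: './src/server.js',
    target: 'node', // build for node, not the browser
    output: {
      path: path.resolve(__dirname, 'build'),
      filename: 'server.js',
      libraryTarget: 'commonjs2', // expose the entry's exports to the server
    },
  },
];
```

With `libraryTarget: 'commonjs2'`, the server can simply `require('./build/server.js')` and call whatever interface the server entry point exports for rendering.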
I am currently making a Meteor app and am having trouble reading files from the private subdirectory. I have been following a couple different tutorials and managed to get it to work flawlessly when I run the meteor app locally. This question (Find absolute base path of the project directory) helped me come up with using process.env.PWD to access the root directory, and from there I use .join() to access the private folder and the pertinent file inside. However, when I deployed this code, the website crashes on startup. I am very confident that it is an issue with process.env.PWD, so I am wondering what the proper method of getting Meteor's root directory on a deployed app is.
//code to run on server at startup
var path = Npm.require('path')
//I also tried using the below line (which was recommended in another Stackoverflow question) to no avail
//var meteor_root = Npm.require('fs').realpathSync( process.cwd() + '/../' );
var apnagent = Meteor.require("apnagent"),
agent = new apnagent.Agent();
agent.set('cert file', path.join(process.env.PWD, "private", "certificate-file.pem"))
agent.set('key file', path.join(process.env.PWD, "private", "devkey-file.pem"))
In development mode the file structure is different than after bundling, so you should never rely on it. Particularly, you should not access your files directly like you're doing with path methods.
Loading private assets is described in this section of Meteor's documentation. It mostly boils down to this method:
Assets.getBinary("certificate-file.pem");
and its getText counterpart.
As for configuring the APN agent, see this section of the documentation. You don't have to configure the agent by passing a file path as the cert file param. Instead you may pass the raw data returned by the Assets methods directly as cert. The same holds for the key file ~ key pair and other settings.
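A minimal sketch of what that could look like, where configureAgent is a hypothetical helper and the 'cert'/'key' setting names follow the apnagent convention of passing raw data instead of file paths:

```javascript
// Sketch (Meteor server code): configure the agent with raw asset data
// rather than file paths, so it works the same after bundling.
// configureAgent is a hypothetical wrapper for testability.
function configureAgent(apnagent, Assets) {
  const agent = new apnagent.Agent();
  // Assets.getBinary reads from the app's private/ directory in any mode.
  agent.set('cert', Assets.getBinary('certificate-file.pem'));
  agent.set('key', Assets.getBinary('devkey-file.pem'));
  return agent;
}
```

The key point is that no path construction (process.env.PWD, process.cwd(), etc.) is involved, so nothing breaks when the deployed bundle's directory layout differs from development.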
As an alternative, you could deploy your files independently to the production server, in a different folder than your Meteor app, and use their absolute path. This, however, would not be possible on cloud providers like Heroku, so it's better to use assets in the intended way.