Is it possible to register custom helpers in Ghost?

I am using Ghost as an npm module following this guide.
I would like to add some custom helpers that I can leverage inside of my themes. Is there a way to do this without changing code inside the Ghost module?
This is my current code:
const ghost = require('ghost');
const path = require('path');
const hbs = require('express-hbs');
const config = path.join(__dirname, 'config.js');
const coreHelpers = {};
coreHelpers.sd_nls = require('./sd_nls');
// Register a handlebars helper for themes
function registerThemeHelper(name, fn) {
  hbs.registerHelper(name, fn);
}
registerThemeHelper('sd_nls', coreHelpers.sd_nls);
ghost({ config: config })
.then(ghostServer => ghostServer.start());
I think one possible problem is that my hbs is a new handlebars instance, not the same one used by Ghost, therefore when Ghost runs it doesn't include any helpers I've registered.
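One way to test that theory (a hedged sketch: the paths option requires Node >= 8.9, and it assumes express-hbs is resolvable from Ghost's own dependency tree, which depends on your npm layout):
// Resolve the exact express-hbs instance Ghost itself loads, so the helper
// lands on the same Handlebars environment (layout-dependent assumption).
const ghostHbsPath = require.resolve('express-hbs', {
  paths: [path.dirname(require.resolve('ghost'))],
});
const ghostHbs = require(ghostHbsPath);
ghostHbs.registerHelper('sd_nls', coreHelpers.sd_nls);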

Unfortunately, even with the most recent version this is still an open issue. I came up with my own three-file solution that takes the original Ghost Dockerfile and builds on it, adding custom helpers from a single directory.
Find it here:
https://hub.docker.com/r/activenode/ghost-docker-custom-helpers
https://github.com/activenode/ghost-docker-custom-helpers

Simple and elegant way to override local file if it exists?

Looking for an elegant and simple solution to have "local configuration override" files.
The idea is to be able to have local configuration that will not keep asking to be added to the git repository.
For that I need to include config.local.js if it exists.
I have global app configuration in config.js, with configuration like
export const config = {
  API_URL: "https://some.host",
}
and config.local.js
export const config = {
  API_URL: "https://other.address",
}
There's a .gitignore entry:
config.local.js
Difficulty:
I do not want to add a node module to the project just for this one thing. I believe there should be an elegant way to do this in one or a few lines, but I have not found any so far.
Things that I tried:
1.
try {
  const {
    config: localConfig,
  } = require('./config.local.js');
  config.API_URL = localConfig.API_URL;
} catch (e) {
  // config.local.js is missing; ignore
}
require() does not work inside a try {} block here.
2.
const requireCustomFile = require.context('./', false, /config\.local\.js$/);
requireCustomFile.keys().forEach(fileName => {
  requireCustomFile(fileName);
});
does not work.
3.
export const config = require('./config.local.js') || {default:'config = {...}'}
does not work.
4.
Using .env and setting environment variables: I need to override a whole object of configuration values, not one by one. (A hedged workaround sketch follows.)
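One workaround for that particular limitation, sketched under the assumption that a single env var can carry the whole override object as JSON (the name CONFIG_OVERRIDES is hypothetical):
// run as: CONFIG_OVERRIDES='{"API_URL":"https://other.address"}' node app.js
const overrides = process.env.CONFIG_OVERRIDES
  ? JSON.parse(process.env.CONFIG_OVERRIDES)
  : {};
// Merge every overridden key onto the defaults at once.
Object.assign(config, overrides);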
This solution uses process.argv. It is native to node as documented here and does not use .env.
It inspects the command-line values used to start the app. Since these should differ between your local and production environments, it's an easy way to switch, with no additional modules required.
command prompt to start your node app:
(This might also be in package.json and invoked via npm start, if you're using that approach.)
$ node index.js local
index.js of your node app:
var express = require('express');
var config = require('./config');

if (process.argv[2] === 'local') {
  // the 3rd argument provided at startup (2nd index) was 'local', so here we are!
  config = require('./config_local');
}
var app = express();
// rest of owl…
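As an aside: outside of a bundler, plain Node has no trouble with require() inside try/catch, so attempt 1 from the question can work once the variables line up. A hedged CommonJS sketch (not for webpack/ESM builds):
// config.js
var config = {
  API_URL: 'https://some.host',
};
try {
  // Merge every key the untracked local file defines.
  Object.assign(config, require('./config.local.js').config);
} catch (e) {
  // config.local.js is absent; keep the defaults.
}
module.exports = { config: config };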

How can I better write require() paths in the Renderer process?

I was using modules in the Renderer process with require().
/* renderer.js */
const Vue = require('vue');
const marked = require('marked');
...
There is no problem when executing "electron .", but after packaging the app fails with "no such file or directory". (Packaging is done with electron-packager.)
To work around this, I put the js files I want to require() in the same directory as renderer.js and do the following:
/* renderer.js */
const Vue = require('./vue.min.js');
const marked = require('./marked.min.js');
...
However, it is troublesome to copy the js files there. Is there a better solution?
The problem might be the path. You need to specify the path of vue and marked completely. For example:
var Vue = require("./model/vue");
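Another thing worth checking (an assumption about your setup): electron-packager prunes devDependencies from the packaged app by default, so vue and marked must be listed under dependencies in package.json for a bare require('vue') to keep working after packaging. A sketch with placeholder version ranges:
{
  "dependencies": {
    "vue": "^2.5.0",
    "marked": "^0.3.0"
  }
}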

How do I make webpack not convert process.env variables to their values during build?

I have the following in one of my project files:
const baas = process.env.DBID;
console.log('baas', baas);
If I run:
cross-env PORT=4000 NODE_ENV=production WEBPACK_CONFIG=browser_prod,server_prod webpack --colors
My server.js file looks like:
const baas = undefined;
console.log('baas', baas);
As expected. However, I want to be able to set the ID when I run the built app, not when I build it, i.e.:
DBID=someotherid node dist/server.js
So I need webpack to not convert const baas = process.env.DBID to its value at build time, but rather leave it as is, so that server.js reads its value at runtime.
How do I do this?
Note: if I manually edit the built server.js and change undefined to process.env.DBID, then the run script works and the app uses the env var at runtime, but I don't want to edit files after building.
You are using the wrong target.
By default, webpack builds the application to be run in the browser. This means it will mock native node modules and globals like path, fs, and process.
Your target is node, so there is no need to mock these.
Add this to your webpack.config.js
module.exports = {
  target: 'node'
};
https://webpack.js.org/concepts/targets/#usage
What you need is process.argv, not process.env:
// server.js
const baas = process.argv[2]; // first user-supplied argument (argv[0] is the node binary, argv[1] the script path)
console.log('baas', baas);
Then:
node dist/server.js baas_value
For convenience, you can use this module https://www.npmjs.com/package/yargs
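A hedged sketch of what that could look like with yargs' classic API (the flag name dbid is hypothetical):
// server.js
const argv = require('yargs').argv;
const baas = argv.dbid; // run as: node dist/server.js --dbid someotherid
console.log('baas', baas);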
I was able to prevent Webpack from converting process.env by accessing it indirectly like this:
const processText = "process";
const _process = global[processText];
app.listen(_process.env.PORT || 2000);
You need to get process indirectly instead of env because the process variable is defined by webpack to be something like /* provided dependency */ var process = __webpack_require__(/*! process/browser */ "process/browser");

node.js configure submodules

[EDIT]
Thanks to Stafano, who formalized my question in a better way:
-) You have a module
-) There are several files in this module
-) All these files depend on a configuration whose path is unknown to the module itself
-) This module does not do much on its own, and is meant to be used by other applications
-) These applications should inject a configuration path into the module before it can be used
So I have this module, used from another application. It's composed of several submodules and I want to configure it using a configuration object.
I already tried to inject the configuration into my submodules, but I ran into the same problem described in the original question.
For example, my module uses MongoDB (with mongoose) as a store.
// app.js
// In the config object I have the URI of the mongo instance (in order to create a connection).
var myModule = require('myModule')(config);

// myModule.js
// Files:
// myModule/index.js exposes the module's functionality.
// It is the entry point, so I create the mongoose connection here.
var mongoose = require('mongoose');
module.exports = function(config){
  var connection = mongoose.createConnection(config.store.URL);
  // I need to expose this connection to the other submodules.
}
// myModule/storeController.js contains the business logic that uses the store
// (createItem, deleteItem, get...) and requires mongoose and my models
// (stored in the models folder).
var mongoose = require('mongoose');
var Item = require('./models/item.js');

exports.createItem = function(item){
  Item.create(item, function(err, item){
    if (err) throw err;
    // ...
  });
}
// myModule/models/item.js
// In this module I need to use the connection created from the configuration.
var mongoose = require('mongoose');
var connection = /* I don't know how to get it */;

var ItemSchema = new mongoose.Schema({
  name: String
});

module.exports = mongoose.model('item', ItemSchema);
If I inject the configuration object into item.js, I can't do the module.exports of my model.
I hope this example clarifies my question; the problem is simple: exposing an object after receiving it as a parameter. (A hedged workaround sketch follows.)
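For what it's worth, one common way around this, sketched under the assumption that the connection is created in index.js: export a factory that receives the connection, instead of the model itself.
// myModule/models/item.js – hedged sketch: export a factory, not a model
var mongoose = require('mongoose');

var ItemSchema = new mongoose.Schema({
  name: String
});

module.exports = function(connection){
  // Bind the model to whichever connection the entry point created.
  return connection.model('item', ItemSchema);
};

// myModule/index.js could then pass the connection along:
// var Item = require('./models/item.js')(connection);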
[PREVIOUS]
I have a node.js application that requires a module. This module accepts a configuration file path (a JSON file).
I need to load that configuration on require and expose it to the module.
How can I achieve this behavior?
Something like:
// app.js
var myModule = require('myModule')(__dirname + '/config/myModuleCnfig.json');
// myModule.js
module.exports = function(configPath){
var config = require(configPath);
module.exports = config; // This is wrong
}
Is there another way to get the configuration path, configure the module, and share the configuration?
By "share the configuration" I mean that I want to give the other files of my module the possibility to use that configuration.
Thanks for any suggestions!
FINAL EDIT:
After many misunderstandings, your problem is finally clear to me. To summarise what's in the comments, here is the situation:
You have a module
There are several files in this module
All these files depend on a configuration whose path is unknown to the module itself
This module does not do much on its own, and is meant to be used by other applications
These applications should inject a configuration path into the module before it can be used
Since you cannot dynamically modify what a module exports, you should use another approach. As with most situations that you encounter in programming, there is not one way which is always right; much depends on your requirements and limitations.
The easiest way to do this (which I don't recommend) is to use a global variable, which you set in your myModule.js file and which will be used by the other files in your module. The biggest drawback of this approach is that you wouldn't be able to use multiple instances of the module at the same time with different configurations. Also, any other module could easily modify (deliberately or not) your configuration at any time, simply by changing the value of the global variable, so it's also a security risk. A minimal sketch follows.
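For illustration only, a minimal sketch of that discouraged global-variable approach (the name myModuleConfig is hypothetical):
// myModule.js
module.exports = function(configPath){
  // Every file in the process can now read – and overwrite – this value.
  global.myModuleConfig = require(configPath);
};

// any submodule:
// var config = global.myModuleConfig;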
A much better way, which will probably require more work on your part - depending on how many files you have - is to implement some kind of Inversion of Control (IoC). In your case, you could turn all your exports into functions that accept a config, and then initialise them by passing the active configuration after you require the module. I don't know the specifics of your implementation, so here is some sample code:
// submodule1.js
module.exports = function(config) {
  // return something that uses the configuration
}

// myModule.js
var fs = require('fs');
var submodule1 = require('./submodule1');
var submodule2 = require('./submodule2');
// ...

module.exports = function(configPath){
  var config = JSON.parse(fs.readFileSync(configPath));
  var sm1 = submodule1(config);
  var sm2 = submodule2(config);
  return /* an object that uses sm1 and sm2 */;
}
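Hypothetical usage from the consuming application:
// app.js – hedged usage sketch
var myModule = require('myModule');
var api = myModule(__dirname + '/config/myModuleCnfig.json');
// api is the object wrapping sm1 and sm2, all initialised with the same config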
If your module is quite complex, you can use an IoC library that does the binding for you. A good one could be Electrolite.
Hope this helps.
PREVIOUS ANSWER:
You can use a library called jsop:
var jsop = require('jsop');
var config = jsop('./config/myModuleCnfig.json');
If you don't want to add a dependency to this module, the linked GitHub page also has a snippet that you can use to load the json config using only native methods.
EDIT: I just realised that this module only supports node 0.11, which you are probably not using. Since you probably don't need the writing functionality, you can use the following snippet instead:
var fs = require('fs')
var config = JSON.parse(fs.readFileSync('./config/myModuleCnfig.json'))
EDIT 2:
Now I think I understand your problem better. To pass the path to the required configuration, you can do something like this:
// myModule.js
var fs = require('fs')
module.exports = function(configPath){
  var config = JSON.parse(fs.readFileSync(configPath))
  return config;
}

Class autoloader in Ember.js?

I am looking for an autoloader, similar to how they operate in other languages (e.g. http://php.net/manual/en/language.oop5.autoload.php). I merely specify the algorithm for finding the file, and it is automatically loaded into the app.
My initial thinking is a build process that scans directories and builds an index file. Is there a better way?
Here's my solution using browserify and a node.js build script, but I'm curious if anyone has a better solution:
build.js:
var glob = require("glob");
var fs = require('fs');
var path = require('path');

function buildFile(directory, build_file, suffix) {
  glob(directory, function(err, files) {
    // Start from a clean index file.
    if (fs.existsSync(build_file)) {
      fs.unlinkSync(build_file);
    }
    fs.appendFileSync(build_file, 'module.exports = {');
    files.forEach(function (file) {
      // e.g. ./routes/Application.js -> ApplicationRoute
      var key = path.basename(file, '.js') + suffix;
      var value = "require('" + file + "')";
      fs.appendFileSync(build_file, '\n  ' + key + ': ' + value + ',');
    });
    fs.appendFileSync(build_file, '\n}');
  });
};

buildFile('./controllers/*.js', './controllers.js', 'Controller');
buildFile('./routes/*.js', './routes.js', 'Route');
app.js:
var App = Ember.Application.create();
App.reopen(require('./controllers.js'));
App.reopen(require('./routes.js'));
routes.js (example output from build.js):
module.exports = {
  ApplicationRoute: require('./routes/Application.js'),
  IndexRoute: require('./routes/Index.js'),
  RecoverRoute: require('./routes/Recover.js'),
  RegisterRoute: require('./routes/Register.js'),
  UsersRoute: require('./routes/Users.js'),
  UsersNewRoute: require('./routes/UsersNew.js'),
  ValidateRoute: require('./routes/Validate.js'),
}
I use Grunt.js to watch and rebuild automatically when changes occur.
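For reference, a minimal Gruntfile sketch (assuming grunt and grunt-contrib-watch are installed; the task wiring is hypothetical):
// Gruntfile.js – hedged sketch
module.exports = function(grunt) {
  grunt.loadNpmTasks('grunt-contrib-watch');

  grunt.registerTask('build-index', function() {
    // Drop the cache so build.js re-runs on every change.
    delete require.cache[require.resolve('./build.js')];
    require('./build.js');
  });

  grunt.initConfig({
    watch: {
      scripts: {
        files: ['controllers/*.js', 'routes/*.js'],
        tasks: ['build-index']
      }
    }
  });
};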
You probably want to use something like RequireJS: http://requirejs.org/
RequireJS will allow you to specify dependencies which will be loaded as needed. You can also run the RequireJS optimizer to compile your templates and JavaScript into one file to deploy to your production servers.
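A minimal sketch of what that might look like (the module IDs are hypothetical):
// main.js – hedged RequireJS sketch
require(['routes/Application', 'routes/Index'], function(ApplicationRoute, IndexRoute) {
  var App = Ember.Application.create();
  App.reopen({
    ApplicationRoute: ApplicationRoute,
    IndexRoute: IndexRoute
  });
});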
One could use a pre-made tool like Yeoman's ember generator or ember tools. They are opinionated about the project's folder structure.
