I'm writing an app using expressjs. I have my views, as usual, in the /views folder. They cover 90% of my customers' needs, but sometimes I have to override one or another of those views to add custom-tailored features. I really wonder whether I can build a folder structure like:
*{ ...other expressjs files and folders... }*
/views
    view1.jade
    view2.jade
    view3.jade
/customerA
    view2.jade
/customerB
    view3.jade
What I'd like is to override the behaviour of expressjs' response.render() function to apply the following algorithm:
1. a customer requests a view
if /{customer_folder}/{view_name}.jade exists, then
render /{customer_folder}/{view_name}.jade
else
render /views/{view_name}.jade
Thus, for customerA, response.render('view1') will refer to /views/view1.jade while response.render('view2') will refer to /customerA/view2.jade
(those who use Appcelerator's Titanium may find this familiar)
I'd like an elegant way to implement this behaviour without the hassle of modifying expressjs' core functionality, and thus possibly getting burned when upgrading my framework. I guess it's a common problem, but I can't find any article on the web.
I would create a custom View class:
var express = require('express');
var app = express();
var View = app.get('view');

var MyView = function(name, options) {
    View.call(this, name, options);
};

MyView.prototype = Object.create(View.prototype);

MyView.prototype.lookup = function(path) {
    // `path` contains the template name to look up, so here you can perform
    // your customer-specific lookups and change `path` so that it points to
    // the correct file for the customer...
    // ...
    // when done, just call the original lookup method.
    return View.prototype.lookup.call(this, path);
};

app.set('view', MyView);
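For illustration, a filled-in version of that lookup override could look roughly like the sketch below. It assumes the per-customer folders live under the configured views root (e.g. /views/customerA/view2.jade; adjust the path building if they live elsewhere, as in the question's layout), and getCurrentCustomer is a hypothetical helper standing in for however your app identifies the current customer:

var fs = require('fs');
var path = require('path');

MyView.prototype.lookup = function (name) {
    // `this.root` is the configured views directory (it may be an array in newer Express versions).
    var root = Array.isArray(this.root) ? this.root[0] : this.root;
    var customer = getCurrentCustomer(); // hypothetical: resolve the current customer somehow

    if (customer) {
        var candidate = path.join(root, customer, name);
        if (!path.extname(candidate)) candidate += '.jade';
        // If the customer-specific template exists, use it.
        if (fs.existsSync(candidate)) {
            return candidate;
        }
    }

    // Otherwise fall back to the default lookup in /views.
    return View.prototype.lookup.call(this, name);
};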
You can hook http.ServerResponse.render.
Here's some code off the top of my head, to be used as middleware:
app.use(function (req, res, next) {
    var backup = res.render;
    res.render = function () {
        // Do your thing with the arguments array, maybe use environment variables.
        // Function.prototype.apply calls a function in the context of argument 1,
        // with argument 2 being the arguments array for the actual call.
        backup.apply(res, arguments);
    };
    next();
});
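To make that concrete for the lookup rule in the question, a rough sketch of such a middleware follows. It assumes a single views directory, that the per-customer folders live under that views root, and that the customer's folder name has already been attached to the request as req.customer by some earlier middleware (a hypothetical property):

var fs = require('fs');
var path = require('path');

app.use(function (req, res, next) {
    var backup = res.render;
    res.render = function (view) {
        var args = Array.prototype.slice.call(arguments);
        if (req.customer) {
            // If {views}/{customer}/{view}.jade exists, render that one instead.
            var candidate = path.join(app.get('views'), req.customer, view + '.jade');
            if (fs.existsSync(candidate)) {
                args[0] = path.join(req.customer, view);
            }
        }
        return backup.apply(res, args);
    };
    next();
});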
I've started to develop a desktop app with Node and Electron. It has a package which implements a connection to some API. It is structured as one base class and some derived classes, in this way:
ApiBase
    ApiAuth extends ApiBase
        ApiAuth.login()
        ApiAuth.logout()
        etc...
    ApiTasks extends ApiBase
        ApiTasks.getTaskList()
        etc...
    etc...
And now I want to make a nice and convenient way to use these classes in my app. So I need to create some entry point which will provide access to my API implementation. But I do not have much experience to get this right.
I thought about something like this:
index.js:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
apiAuth = new ApiAuth('www.sample-host.com');
apiTasks = new ApiTasks('www.sample-host.com');
module.exports = {
login: apiAuth.login,
logout: apiAuth.logout,
getTaskList: apiTasks.getTaskList,
etc...
}
somewhere in the app:
const api = require("./lib/someApi");
// need to get task list for some reason
api.getTaskList(param1, param2)
But there are some problems I see with this approach:
it is a problem to pass the host param to the constructors in index.js dynamically
I am not sure that creating these instances every time index.js is required is the right thing
So I want to know about some approaches I can use here, because I do not even know where to start researching. Thank you.
I think that you identified some of the most crucial decisions with this:
it is a problem to pass the host param to the constructors in index.js dynamically
IMO configuration and the interface are important considerations. Even though it can be refactored after the fact, an easy-to-configure and easy-to-consume interface will help drive adoption of your library. As you pointed out, the configuration is static right now and very brittle: a change to the URL will cascade to all clients and require all of them to update.
A first intuitive alternative may be to allow dynamic configuration of the current structure:
apiAuth = new ApiAuth(process.env.API_AUTH_URL || 'www.sample-host.com');
apiTasks = new ApiTasks(process.env.API_TASKS_URL || 'www.sample-host.com');
While this allows the client to dynamically configure the URL, the configuration is "implicit". IMO this is unintuitive and difficult to document. It's also not explicit and requires a client to look at the code to see the environment variables and the instantiation flow.
I would favor exposing these classes to the client directly. I would consider this approach "explicit" as it forces the client to explicitly configure/instantiate your components. I think it's like providing your clients with primitives and allowing them to compose, build, and configure them in whatever way they want:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
module.exports = {
auth: ApiAuth,
tasks: ApiTasks
}
This automatically namespaces the api behind its functions (auth|tasks) AND requires that the client instantiate the classes before using them:
const api = require("./lib/someApi");
const auth = new api.auth(process.env.SOMETHING, 'some-url');
This pulls the configuration further out in the architecture. It forces the client to decide how it wants to get the URL and explicitly instantiate the library. What if one of your clients doesn't use login/logout? This may be more flexible in that case.
I am not sure that creating these instances every time index.js is required is the right thing
If instantiation should remain hidden, another alternative would be to provide a builder function in order to encapsulate it:
const ApiAuth = require('./amazing-time-api-auth');
const ApiTasks = require('./amazing-time-api-tasks');
module.exports = {
    auth: {
        build: (url) => {
            return new ApiAuth(url);
        }
    },
    tasks: {
        build: (url) => {
            return new ApiTasks(url);
        }
    }
}
This still hides each class but allows the client to decide how it configures each one:
const api = require("./lib/someApi");
const auth = api.auth.build('my-url');
auth.login();
I want to create a really basic CRUD (sort-of) example app, to see how things work.
I want to store items (items of a shopping-list) in an array, using functions defined in my listService.js such as addItem(item), getAllItems() and so on.
My problems come when using the same module (listService.js) in different files, because it creates the array in which it stores the data multiple times, and I want it to behave like a static "global" array (but not a global variable).
listService.js looks like this:
const items = [];
function addItem (item) {
items.push(item);
}
function getItems () {
return items;
}
module.exports = {addItem, getItems};
and I want to use it in both mainWindowScript.js and addWindowScript.js: in addWindowScript.js to add the elements I want to the array, and in mainWindowScript.js to get the elements and put them in a table. (I will implement the Observer pattern later on to deal with adding to the table when needed.)
addWindowScript.js looks something like this:
const electron = require('electron');
const {ipcRenderer} = electron;
const service = require('../../service/listService.js');
const form = document.querySelector('form');
form.addEventListener('submit', submitForm);
function submitForm(e) {
    e.preventDefault();
    const item = document.querySelector("#item").value;
    service.addItem(item);
    console.log(service.getItems());
    // This prints all the items I add just fine
    // ...
}
and mainWindowScript.js like this:
const electron = require('electron');
const service = require('../../service/listService.js');
const buttonShowAll = document.querySelector("#showAllBtn")
buttonShowAll.addEventListener("click", () => {
    console.log(service.getItems());
    // This just shows an empty array, even after I add the items in the add window
});
In Java or C# or C++ or whatever, I would just create a class for each of those, and in main I'd create an instance of the service and pass a reference to it to the windows. How can I do something similar here?
When I first wrote the example (from a YouTube video) I handled this by sending messages through the ipcRenderer to the main module, and then forwarding them to the other window, but I don't want to deal with this every time there's a signal from one window to another.
ipcRenderer.send('item:add', item);
and in main
ipcMain.on('item:add', (event, item) => {
mainWindow.webContents.send('item:add', item);
})
So, to sum up, I want to do something like: require the module, use the functions wherever needed, and have only one instance of the object.
require the module, use the functions wherever needed, and have only one instance of the object.
TL;DR - no, that isn't possible.
Long version: Electron is multi-process by nature; the code you run in the main process (the Node.js side) and in the renderer (the Chromium browser) runs in different processes. So even if you require the same module file, the object created in memory is different in each process. There is no way to share an object between processes other than synchronizing it via ipc communication. There are a couple of handy synchronization modules out there, or you could write your own module to do that job, like:
module.js
if (process.type === 'browser') {
    // main process:
    // set up the object,
    // listen for changes from the renderers, update the object,
    // broadcast it to the renderers again
} else {
    // renderer process:
    // send changes to main,
    // listen for changes from main
}
but in either case you can't get away from ipc.
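As a rough illustration of that pattern for the shopping-list case, the main process could own the items array and broadcast changes to every window; the channel names ('items:add', 'items:changed') are made up for this sketch:

// main process
const { ipcMain, BrowserWindow } = require('electron');

const items = [];

ipcMain.on('items:add', (event, item) => {
    items.push(item);
    // Broadcast the updated list to every open window.
    BrowserWindow.getAllWindows().forEach((win) => {
        win.webContents.send('items:changed', items);
    });
});

// renderer process (both windows)
const { ipcRenderer } = require('electron');

ipcRenderer.send('items:add', 'milk');
ipcRenderer.on('items:changed', (event, items) => {
    console.log(items); // every window now sees the same list
});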
I've created some code using a View Composer where I am passing my Route Collection through to the front end on all views, so I can access all of my Laravel routes in Vue.js via the route names associated with them.
For example, to upload an image using a vue component, instead of passing my upload route into the Vue Component, it is listed as a part of a global variable:
var uploadRoute = _.find(globalRoutes, function(route) { return route.name == 'route-name.image.upload' });
$.post(uploadRoute, data) ... etc
My question is... is this sensible? I'm publicly publishing my entire app's routes.
Thanks
I think your hunch about exposing your entire app's routes is legit. IMO you should explicitly pick out the routes that you need. So in this case, you should only expose route-name.image.upload. You could create a tiny helper function to look up routes and output them along with the URL as JSON.
function json_routes(array $routes)
{
    $return = [];

    foreach($routes as $route)
    {
        $return[$route] = route($route);
    }

    return new \Illuminate\Support\HtmlString(json_encode($return));
}
And then, in your main view:
var routes = {{ json_routes(["route-name.image.upload"]) }};
Getting a route is simple:
routes['route-name.image.upload'];
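Tying that back to the upload example from the question, the lodash lookup is no longer needed:

var uploadRoute = routes['route-name.image.upload'];
$.post(uploadRoute, data); // etc.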
This is the most basic example I can think of. You can optimize it quite a bit. Just some ideas:
Place the routes in a central place, e.g. a config element: json_routes(config('app.json_routes'))
Build a command that generates a static .json file so that you don't iterate through the routes on each page load. Remember to re-generate when you add more routes.
Create a function instead of an object to get the route. That allows you to build in logic and gives a more Laravel-like feel in your js: function route(path){ return window.routes.hasOwnProperty(path) ? window.routes[path] : null ;}
(Advanced) Re-write Laravel's router logic and hook into the options array, allowing you to do something like Route::get('dashboard', '...', ['as'=>'dashboard', 'expose'=>true]);, then dynamically generate the aforementioned json file for all routes with the expose option.
Hello, I was wondering if there is a proper way to pass a variable or object to a layout view.
This is what I'm doing at the moment, and it works:
index: function(req, res){
res.view({ layout: 'mylayout', myvar: 'This is a view var' });
}
But on every action I have to define 'myvar' so I can use it at the layout level. What I would like to know is whether there is some kind of controller or action for layouts where I can place my logic.
Actually starting with Sails v0.10-rc5, you can use sails.config.views.locals hash to specify variables that should be provided by default to all views. So in config/views.js, something like:
{
    locals: {
        myvar: 'this is a view var'
    }
}
would add the var to every view. This is mostly useful for variables that affect the view engine; see here for more details.
You can also use a policy in Sails v0.10.x to set vars across multiple views, by altering req.options.locals. So if you created a policy /api/policies/decorate.js with:
module.exports = function(req, res, next) {
    // Default to an object if empty, or use existing locals that may have
    // been set elsewhere
    req.options.locals = req.options.locals || {};
    // Set a new local var that will be available to any controller that
    // implements the policy
    req.options.locals.myVar = Math.random();
    next();
};
and then set up your /config/policies.js with something like:
module.exports = {
'*': 'decorate'
}
then any controller action that uses res.view will have that myVar variable available in the view.
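For example (assuming the default EJS view engine; SomeController and homepage are placeholder names), a controller action covered by that policy doesn't need to do anything special:

// api/controllers/SomeController.js
module.exports = {
    index: function (req, res) {
        // myVar was set on req.options.locals by the `decorate` policy above,
        // so it is passed along to the view automatically.
        return res.view('homepage');
    }
};

and inside views/homepage.ejs you could print it with <%= myVar %>.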
It depends on the layout engine you have configured with Sails (EJS is the default). However, passing a variable to a view is commonly done like this:
index: function(req, res){
res.view({ layout: 'mylayout', myModel: modelObj });
}
You have to inject the model into every view that uses it. If you want to register the model globally, you have to use a custom middleware as described here: How to create global variables accessible in all views using Express / Node.JS?
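In Express terms (which Sails builds on), that kind of middleware is usually a small function that puts the value on res.locals before the routes run; a minimal sketch, with myModel as a placeholder name and value:

// Register before your routes so every subsequent res.render/res.view sees it.
app.use(function (req, res, next) {
    res.locals.myModel = { name: 'example' }; // placeholder value
    next();
});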
You can use config.views.locals, but I like to keep this out of the config folder and put it in the api/services directory. I call it api/services/page.js.
Then add some variables and other view helpers if you want:
module.exports = {
    title: 'test',
    appName: 'My App Name',
    getBlueprintPath: function (path) {
        return sails.config.blueprints.prefix + path;
    },
};
Then in my view I have:
<%= page.appName %>
The advantage of doing it this way is that your logic is kept inside the api directory. You can still set locals for the view and not accidentally override some variable you had previously set in config.views.locals.
So I'm writing a whole bunch of vendor-specific files in Node which all have a similar controller pattern, so it makes sense for me to cut them out and put them into a common file.
You can see my common controller file here: https://gist.github.com/081a04073656bf28f46b
Now when I use them in my multiple modules, each consecutively loaded module is overwriting the first. This is because the file is only required once and passed dynamically through to each module on load (this allows me to add extra modules and these modules are able to add their own routes, for example). You can see an example module here: https://gist.github.com/2382bf93298e0fc58599
You can see here on line 53 that I've realised we need to create a separate instance every time, so I've tried to create a new instance by copying the standardControllers object into a new object, then initialising the new object. This has zero impact on the code; it behaves in exactly the same way.
Any ideas guys? I'm in a bit of a jam with this one!
First thing I'd do is try to make things simpler and reduce coupling by invoking the single responsibility principle, et al.
http://www.codinghorror.com/blog/2007/03/curlys-law-do-one-thing.html
Put those Schemas into their own files, eg
models/client.js
models/assistant.js
models/contact.js
I've also found that embedded docs + mongoose is generally a PITA. I'd probably promote all those to top level docs.
You don't need to enclose your object's keys in quotes.
routes = {
list: function() {} // no quotes is aok
}
Also 'list' in typical REST apps is called 'index'. Anyway.
Ok, I'd break this up differently. Since you're requiring stuff from the index.js file in the middleware, they become tightly coupled, which is bad. In fact, I think I'd rewrite this whole thing so it's tidier. Sorry.
I'd probably replace your 'middleware' file with an express-resource controller
https://github.com/visionmedia/express-resource (built by author of express). This is a good framework for restful controllers, such as what you're building. The auto-loader is really sweet.
You may also want to look at: http://mcavage.github.com/node-restify/ It's new, I haven't tried it out, but I've heard good things.
Since what you're building is basically an automated mongoose-crud system, with optional overriding, I'd create an express-resource controller as your base
/controllers/base_controller.js
and it might look like
var BaseController = function() {} // BaseController constructor
BaseController.prototype.index = function() {
// copy from your middleware
}
BaseController.prototype.show = function() {
// copy from your middleware
}
BaseController.prototype.create = function() {
// copy from your middleware
}
// etc
module.exports = BaseController
Then I'd do something like:
/controllers/some_resource_controller.js
which might look something like:
var BaseController = require('./base_controller')

var NewResourceController = function() {
    // Apply BaseController constructor (i.e. call super())
    BaseController.apply(this, arguments)
}

NewResourceController.prototype = new BaseController()

NewResourceController.prototype.create = function() {
    // custom create method goes here
}

module.exports = NewResourceController
Then to use it, you can do:
var user = app.resource(myResourceName, new NewResourceController());
…inside some loop which sets myResourceName to be whatever crud you're trying to set up.
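For instance, a rough sketch of such a loop, reusing the resource names from the models above (whether each resource gets the base controller or its own customized subclass is up to you):

['client', 'assistant', 'contact'].forEach(function (name) {
    // Each resource gets its own controller instance, so nothing is shared
    // between modules the way it was with the single required object.
    app.resource(name, new NewResourceController());
});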
Here's some links for you to read:
http://tobyho.com/2011/11/11/js-object-inheritance/
http://yehudakatz.com/2011/08/12/understanding-prototypes-in-javascript/
Also, it sounds like you're not writing tests. Write tests.
http://www.codinghorror.com/blog/2006/07/i-pity-the-fool-who-doesnt-write-unit-tests.html