Assume the following scenario:
The REST client is a module with many middlewares, which are themselves modules.
Now we're trying to create a new middleware that requires the client itself, to fetch/update metadata for the client's input URL.
Testing this middleware will pull in a published version of the client from the npm registry, because the middleware has a devDependency on the client. But we want to serve our local client.
Also, the published version of the client does not contain this new middleware, so it will not allow testing the request pipeline with this middleware.
We want to initialize the client with this middleware when we're testing the middleware itself, to send a request to fetch data.
The middleware is smart enough not to request metadata for metadata, so it will skip the second call. The new flow should look like the diagram below.
Wrap the Node.js module loader to return the local client instead of the published one when the client is requested during test execution:
describe('middleware with local client', () => {
  const localClient = require('../client');
  const m = require('module');
  const orig = m._load;

  before(() => {
    // Intercept module loading: whenever 'client' is requested,
    // hand back the local build instead of the published package.
    m._load = function (name) {
      if (name === 'client') {
        return localClient;
      }
      return orig.apply(this, arguments);
    };
  });

  after(() => {
    // Restore the original loader so other suites are unaffected.
    m._load = orig;
  });

  it('test goes here', () => {...});
});
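An equivalent approach, if you'd rather not patch Module._load by hand, is the mock-require package; a minimal sketch, assuming mock-require is installed as a devDependency:
const mock = require('mock-require');
const localClient = require('../client');

describe('middleware with local client', () => {
  before(() => {
    // Every subsequent require('client') now resolves to the local copy.
    mock('client', localClient);
  });
  after(() => {
    // Undo the substitution so other suites load the published module.
    mock.stop('client');
  });
  it('test goes here', () => {...});
});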
I have a small Express script that was serving an API. I then used node-windows to make it a service and set it up to run after MongoDB kicked off. Worked like a charm; however, I had to add HTTPS support, which requires me to read the keys using fs.readFileSync. Now the process just ends after starting. If I run the file directly it works fine, so I think it has to do with the Windows service trying to access a user's files. Has anyone run into this before? Below is both the code for the file creating the API and the code used to make the service.
var https = require("https");
var fs = require("fs");

https
.createServer(
{
key: fs.readFileSync("certs/server.com.key"),
cert: fs.readFileSync("certs/server.com.crt"),
passphrase: "redacted",
},
app
)
.listen(port, function () {
console.log(
`Example app listening on port ${port}! Go to https://localhost:${port}/api`
);
});
var Service = require("node-windows").Service;
var svc = new Service({
name: "FavoritesAPI1.1",
description:
"This starts the express server for the favorites API and connects to MongoDB.",
script: "C:\\Users\\srv-qlik\\Desktop\\FavoritesAPI\\index.js",
nodeOptions: ["--harmony", "--max_old_space_size=4096"],
});
svc.on("install", function () {
svc.start();
});
svc.install();
Changed the fs file paths to full paths and it seems to be working.
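A portable way to get the same effect is to resolve the cert paths against the script's own directory instead of the service's working directory; a minimal sketch, reusing app and port from the script above:
var path = require("path");
var fs = require("fs");
var https = require("https");

// __dirname is the directory of this script, so these paths resolve
// correctly even when the Windows service starts elsewhere.
https
  .createServer(
    {
      key: fs.readFileSync(path.join(__dirname, "certs/server.com.key")),
      cert: fs.readFileSync(path.join(__dirname, "certs/server.com.crt")),
      passphrase: "redacted",
    },
    app
  )
  .listen(port);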
I have to execute an async function before starting an Express server, because I need to get a parameter to pass to the API. So I guess I can call the async function and, on the promise I get back, start my Express server (until now I have executed it with "node my_server.js").
How can I do that? How can I call my_server.js from my new JS file?
Usually the server does not get created merely by running node server.js; that just executes the JavaScript contained in server.js. The server is created when your code reaches express() (if you are creating the server via Express) or createServer (in plain Node.js).
That means you can probably do something like the following (await is only valid inside an async function, so the startup code is wrapped in one):
async function main() {
  var param = await someAsync();
  // Set the param in global context and then create your server
  var app = express();
  // or, in plain Node.js:
  // var server = http.createServer(app);
}
main();
So now, before your app starts, you will have that param and you can use it in your API.
If you are using a promise instead of async/await, just create your server in the then part of your promise; this will also make sure that your app has that param before your server starts.
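A minimal sketch of that promise variant (someAsync and param are placeholders, as above):
const express = require('express');

someAsync().then(function (param) {
  const app = express();

  // param is resolved before any route is registered
  app.get('/api', function (req, res) {
    res.send(String(param));
  });

  app.listen(3000);
});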
You can fork a new process using the child_process module in node.
const { fork } = require('child_process')
const path = require('path')

// The second argument is the array of string arguments
// handed to the child (visible there as process.argv).
const newProcess = fork(path.join(__dirname, 'path/to/jsfile'), [...otherArgs])
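Applied to this question, the parent process could resolve the parameter first and hand it to my_server.js as a command-line argument; a sketch, where getParam stands in for your async function:
// parent.js
const { fork } = require('child_process');
const path = require('path');

getParam().then(function (param) {
  // my_server.js can read the value back as process.argv[2]
  fork(path.join(__dirname, 'my_server.js'), [String(param)]);
});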
I am attempting to work with Dashing-JS, a JavaScript/Node port of Dashing.IO, a Sinatra-based framework project in Ruby. Essentially I have the core packages and dependencies for Dashing-JS configured; however, when attempting to run just the sample, I am unable to display anything other than a 404 error rather than a sample dashboard.
NodeJS CLI output is as follows:
The project is not well maintained; however, I am curious if an expert in Node might be able to shed some light on the situation. Is a path reference incorrect?
Notes:
1. server.js is referencing sample.jade.
var dashing = require('new-dashing-js').Dashing();
// Set your auth token here
//dashing.auth_token = 'YOUR_AUTH_TOKEN';
/*
dashing.protected = function(req, res, next) {
// Put any authentication code you want in here.
// This method is run before accessing any resource.
// if (true) next();
}
*/
// Set your default dashboard here
dashing.default_dashboard = 'sample';
dashing.start();
"Sample.jade" is located within the "dashboards" directory
I have installed yo angular-fullstack.
The source code of the project is here : https://github.com/DaftMonk/fullstack-demo
My API looks like this:
thing
├── index.js - Routes
├── thing.controller.js - Controller for our `thing` endpoint
├── thing.model.js - Database model
├── thing.socket.js - Register socket events
└── thing.spec.js - Test
How can I use the sockets in thing.controller.js? The socket in the clicked function doesn't work:
/**
* Using Rails-like standard naming convention for endpoints.
* GET /things -> index
* POST /things -> create
* GET /things/:id -> show
* PUT /things/:id -> update
* DELETE /things/:id -> destroy
*/
'use strict';
var _ = require('lodash');
var Thing = require('./thing.model');
[...]
exports.clicked = function(req, res) {
// Why is socket not defined?
socket.emit('test', data);
};
In my clicked function I just want to emit a socket event to the client side.
You need to inject your socket.io instance into your thing.controller instance...
thing.controller.js
module.exports = function(context) {
var controller = {};
...
controller.clicked = function(req,res){
context.io.emit('test','data');
}
...
return controller;
}
routes.js
module.exports = function(app, context) {
...
app.use('/api/things', require('./api/thing')(context));
...
app.js
...
require('./routes')(app, {io:socketio});
...
NOTE: This will emit the event to ALL listeners...
When you connect via socket.io, a channel is formed between the client and the server; it shows up as the socket on the server's socket.io connection event. When a REST call is made from angular.js to Express, there is nothing that ties that request to the socket.io connection from the browser (or any way to know it's even from the same window in the browser).
If you need to communicate via socket.io with a specific instance, then you need to rework your angular service to use socket.io instead of REST, or maintain a reference table from the browser to a given socket as part of the REST request (a rough sketch of that option follows). This is a much broader discussion, and will either be limited to a single process, or be a much larger development.
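A rough sketch of the reference-table option, assuming the browser generates a clientId and sends it both in the socket.io handshake query and as a header on each REST call (all names here are hypothetical):
// app.js - remember each client's socket, keyed by the clientId
// sent in the socket.io handshake query
var sockets = {};
socketio.on('connection', function (socket) {
  var clientId = socket.handshake.query.clientId;
  sockets[clientId] = socket;
  socket.on('disconnect', function () {
    delete sockets[clientId];
  });
});

// thing.controller.js - emit only to the caller's own socket,
// looked up via a header the browser adds to its REST calls
controller.clicked = function (req, res) {
  var socket = sockets[req.get('X-Client-Id')];
  if (socket) socket.emit('test', 'data');
  res.sendStatus(200);
};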
Towards developing against a socket.io based service, you may want to check out at least the following:
angular-socket-io component
Writing an angular.js app with socket.io
Make sure the socket.io script is included, and define the socket var to get it working:
var socket = io('http://localhost');
And finally, make sure the clicked function is getting called when the event is fired.
Hope it helps!
I have an application using a node.js backend and a require.js/backbone frontend.
My backend has a config/settings system which, depending on the environment (dev, production, beta), can do different things. I would like to propagate some of the variables to the client as well, and have them affect some template rendering (e.g. change the title or the URL of the pages).
What is the best way to achieve that?
I came up with a way to do it, and it seems to be working, but I don't think it's the smartest thing to do, and I can't figure out how to make it work with the requirejs optimizer anyway.
What I do is: on the backend I expose an /api/config endpoint (through GET), and on the client
I have the following module, config.js:
// This module loads an environment config
// from the server through an API
define(function(require) {
var cfg = require('text!/api/config');
return $.parseJSON(cfg);
});
Any page/module that needs the config will just do:
var cfg = require('config');
As I said, I am having a problem with this approach: I can't compile/optimize my client code
with the requirejs optimizer, since the /api/config file doesn't exist offline during optimization. And I am sure there are many other reasons my approach is a bad idea.
If you use a module bundler such as webpack to bundle JavaScript files for usage in a browser, you can reuse your Node.js modules in the client running in the browser. In other words, put your settings or configuration in Node.js modules, and share them between the backend and the client.
For example, you have the following settings in config.js:
Normal Node.js module: config.js
const MY_THIRD_PARTY_URL = 'https://a.third.party.url'
module.exports = { MY_THIRD_PARTY_URL }
Use the module in Node.js backend
const config = require('path-to-config.js')
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
Share it in the client
import config from 'path-to-config.js'
console.log('My third party URL: ', config.MY_THIRD_PARTY_URL)
I do the following (note that this is Jade; I have never used require.js or backbone, but as long as you can pass variables from Express into your templating language, you should be able to place JSON in data-* attributes on any element you want).
// app.js
app.get('/', function(req, res){
var bar = {
a: "b",
c: Math.floor(Math.random()*5),
};
res.locals.foo = JSON.stringify(bar);
res.render('some-jade-template');
});
// some-jade-template.jade
!!!
html
  head
    script(type="text/javascript"
      , src="http://ajax.googleapis.com/ajax/libs/jquery/1.10.2/jquery.min.js")
    script(type="text/javascript")
      $(document).ready(init);
      function init(){
        var json = $('body').attr('data-stackoverflowquestion');
        var obj = JSON.parse(json);
        console.log(obj);
      };
  body(data-stackoverflowquestion=locals.foo)
    h4 Passing data with data-* attributes example