Hapi.js Documentation generator for REST API Servers - javascript

I was looking for a tool that generates static documentation for Hapi.js routes without adding dependencies to the API server.
I was picturing a CLI tool to which I could pass my server.js as an argument, and which would create the API documentation by parsing my route definitions.
The hapi-swagger module fails here, as it adds the following dependencies to my server:
Have to define a view engine
Have to disable the minimal option of my api servers
Have to define a /documentation route (I know I can change that, but the issue persists)
If such a tool doesn't exist, what is the best alternative for creating static Swagger UI documentation files?
Thanks!

First, the lout module is officially supported by hapijs and is not deprecated. It provides an alternative to Swagger, but it does not solve your problem because it does not generate static HTML/CSS.
Now to the solution: I would add hapi-swagger, but only in development, like this (so you keep your server lightweight in staging/production)...
...
if (process.env.NODE_ENV === 'development') {
  // Register inert, vision and hapi-swagger with server.register()
}
...
...then use bootprint-swagger or something similar to generate static HTML which you can serve on a web server of your choice.
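For reference, the development-only registration could look roughly like the following. This is just a sketch: it assumes a hapi v17+ style server and that Inert, Vision and HapiSwagger are imported at the top of server.js. Once registered, hapi-swagger exposes a swagger.json endpoint that bootprint-swagger can consume.
// Sketch only: hapi v17+ registration, plugins assumed to be imported above
if (process.env.NODE_ENV === 'development') {
  await server.register([
    Inert,
    Vision,
    {
      plugin: HapiSwagger,
      options: { info: { title: 'API Documentation', version: '1.0.0' } }
    }
  ]);
}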
Hope this helps.

Related

How to share webpack result between frontend and backend?

I have two projects - frontend (Vue) and backend (Node.js) - for one web app. I'm using webpack to build the frontend. All images change their names to hashes during the build. Now I need to know some of those names on the backend side. What is the best way to share that info between frontend and backend?
I could use a simple hardcoding approach, or rely on a custom hand-written mapping and a custom script to load assets, but both of these approaches are fragile.
Webpack allows multi-build projects using target. How exactly can I achieve this with webpack?
Webpack's programmatic usage can help you here; see the webpack Node API.
// webpack/script.js
import webpack from "webpack";

const compiler = webpack({ /* your config */ }, (err, stats) => {
  // stats.toJson(options) exposes the build info, including the emitted asset names
  console.log(stats.toString());
  // make an API call to hand the stats of your choice to the backend
});
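One concrete way to hand the hashed names to the backend is to write the relevant part of the stats to a manifest file that the backend reads at startup. This is only a sketch, not part of the original answer: the manifest filename, config path and script name are assumptions. For a mapping from original source names to hashed names, a dedicated plugin such as webpack-manifest-plugin may be a better fit.
// webpack/writeManifest.js (sketch)
import webpack from "webpack";
import fs from "fs";
import config from "./webpack.config.js"; // assumed path to your webpack config

webpack(config, (err, stats) => {
  if (err || stats.hasErrors()) {
    console.error(err || stats.toString());
    return;
  }
  // stats.toJson().assets lists every emitted file, including hashed image names
  const assetNames = stats.toJson().assets.map(asset => asset.name);
  fs.writeFileSync("asset-manifest.json", JSON.stringify(assetNames, null, 2));
});
The Node backend can then read asset-manifest.json to resolve the hashed filenames instead of hardcoding them.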

Single API to load JSON file in Browser and NodeJS?

Is there an existing API or library that can be used to load a JSON file in both the browser and Node?
I'm working on a module that I intend to run both from the command-line in NodeJS, and through the browser. I'm using the latest language features common to both (and don't need to support older browsers), including class keywords and the ES6 import syntax. The class in question needs to load a series of JSON files (where the first file identifies others that need to be loaded), and my preference is to access them as-is (they are externally defined and shared with other tools).
The "import" command looks like it might work for the first JSON file, except that I don't see a way of using that to load a series of files (determined by the first file) into variables.
One option is to pass in a helper function to the class for loading files, which the root script would populate as appropriate for NodeJS or the browser.
Alternatively, my current leading idea, though still not ideal in my mind, is to define a separate module with an "async function loadFile(fn)" function that can be imported, and to set the paths such that a different version of that file loads for the browser vs NodeJS.
This seems like something that should have a native option, or that somebody else would have already written a module for, but I've yet to find either.
For Node, install the node-fetch module from npm.
Note that browser fetch can't talk directly to your filesystem -- it requires an HTTP server on the other side of the request. Node can talk to your filesystem, as well as making HTTP calls to servers.
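A minimal sketch of the node-fetch approach, assuming the JSON files are served over HTTP rather than read from disk (the URL parameter is a placeholder):
// Node-side sketch using node-fetch
import fetch from "node-fetch";

async function loadJSON(url) {
  const response = await fetch(url);
  return response.json();
}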
It sounds like as of now, there is no perfect solution here. The 'fetch' API is the most promising, but only if Node implements it some day.
In the meantime I've settled for a simple solution that works seamlessly with minimal dependencies, requiring only a little magic with my ExpressJS server paths to point the served web instance to a different version of utils.js.
Note: To use the ES-style import syntax for includes in NodeJS (v14+) you must set "type":"module" in your package.json. See https://nodejs.org/api/esm.html#esm_package_json_type_field for details. This is necessary for true shared code bases.
Module Using it (NodeJS + Browser running the same file):
import * as utils from "../utils.js";
...
var data = await utils.loadJSON(filename);
...
utils.js for browser:
async function loadJSON(fn) {
  return $.getJSON(fn); // Only because I'm using another jQuery-dependent lib
  /* Or natively, something like:
  let response = await fetch(fn);
  return response.json();
  */
}
export { loadJSON };
utils.js for NodeJS:
import * as fs from 'fs';

async function loadJSON(fn) {
  return JSON.parse(await fs.promises.readFile(fn));
}
export { loadJSON };
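The "little magic" with the ExpressJS paths mentioned above could look roughly like this. It is only a sketch: the directory layout and port are assumptions.
// server.js (sketch): serve a browser-specific utils.js, everything else as-is
import express from "express";

const app = express();

// When the browser requests /utils.js, hand it the browser variant
app.get("/utils.js", (req, res) => {
  res.sendFile("browser/utils.js", { root: process.cwd() });
});

// The shared module files and JSON data are served unchanged
app.use(express.static("shared"));

app.listen(3000);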

How can I create a shared utility to use with two different AngularJS apps (on different html pages)?

I'm working on a multi-page site using AngularJS, and I want to write a utility that can be included in more than one page. I've looked at services and providers, and all the examples I find are single-page examples. I'm not sure how to generalize this to multiple apps used on different pages.
This is what I want to have for my two different pages/apps.
in app1.js:
var app1 = angular.module('app1',['myUtil'])
app1.controller('ctrl1',function ctrl1($scope,myUtil){...})
in app2.js:
var app2 = angular.module('app2',['myUtil'])
app2.controller('ctrl2',function ctrl2($scope,myUtil){...})
in myUtil.js:
??? Provider? Service? Module?
All the examples I have found for providers and services show them as being attached to a single app. Is this possible with AngularJS? If so, what am I missing?
The answer from zero298 is a nice one, as it's a great way of organising and reusing the utility module you create.
If you want a less broad and more "codey" answer, then one way of doing it would be to have some kind of utility module that houses whatever services you want to put in it, which you can then pass in as a dependency for all apps that use it. How you import/organise the files will depend on your build process, but as a very basic example you could have a "utilsmodule" module with a "myutil" service:
myUtils.js:
angular.module('utilsmodule', []);

// Service could be in another file
angular.module('utilsmodule').service('myutil', function() {
  return {
    myUtilFunction : function() {
      return "This is from myutil";
    }
  };
});
Then in your app files you can pass in the module by name, which will give the app access to the 'myutil' service.
app1.js:
var app1 = angular.module('app1',['utilsmodule'])
app1.controller('ctrl1',function ctrl1($scope,myutil){...})
Then you would import the myUtils.js file before the app1.js file so that the "utilsmodule" module is registered with Angular before your app is created. You do the same with app2, and the utility module should be available to both.
Example Plunker
This may be a bit too broad. However, what I would suggest you do is create a library module dedicated to the feature/utility that you want to make available to your projects.
I would suggest using npm to organize all of this. Give this feature module its own package.json and add whatever code you need to make it run. In your consumer projects, add the library module as a dependency.
A good method to get this working locally (as well as quickly since you don't have to constantly push to the npm registry) is to use the npm link utility.
If your consumer projects are already npm oriented, the workflow would be as follows:
Create a new directory to contain your utility library module; let's call it my-utility
cd to the new directory
npm init to create a package.json in this library
npm link to make the library available locally
cd to any of the consumer projects
npm link my-utility so that the consumer projects create a symlink to the local folder that contains your utility module
After that is setup, depending on how your consumer projects build or use dependencies, you can use your new utility library in your top level projects.
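As a rough sketch of what the linked package could contain (the file, module and function names here are assumptions), the main file referenced in my-utility's package.json can simply register the Angular module, so a consumer only needs to load it once before declaring its own app:
// my-utility/index.js (sketch): registers the shared Angular module when loaded
angular.module('myUtil', [])
  .service('myUtil', function() {
    this.greet = function(name) {
      return 'Hello ' + name + ' from myUtil';
    };
  });

// app1.js in a consumer project (assuming a CommonJS-aware bundler)
require('my-utility');
var app1 = angular.module('app1', ['myUtil']);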

Use web-workers in Meteor clientside

I have a Meteor app. On the client side I have a heavy calculation which freezes the whole browser tab while it executes.
So I want to use Web Workers to avoid the freezing and to handle the process better (terminating it, loading/progress percentages, etc.).
To get the web workers to work, I had to include a self-written webworker.js in my package.js.
So my main question is: How do I set up web workers in a meteor app (clientside)?
First I tried the following:
Add a file via api.addFiles() with the bare option.
Package.describe({
  name: 'some:name',
  version: '0.0.1',
  // Brief, one-line summary of the package.
  summary: 'Generates PDFs',
  // URL to the Git repository containing the source code for this package.
  git: '',
  // By default, Meteor will default to using README.md for documentation.
  // To avoid submitting documentation, set this field to null.
  documentation: 'README.md'
});
Package.onUse(function(api) {
  api.versionsFrom('1.1.0.2');
  api.use(['mrt:redis#0.1.3'],
    'server',
    {weak: false, unordered: false}
  );
  api.addFiles([
    "vendor/FileSaver.js",
    "vendor/jspdf.debug.js",
    "dcPDF.js"
  ], ["client"]);
  api.addFiles([
    "server.js"
  ], ["server"]);
  api.addFiles(["pdfProgressWorker.js"], ["client"], {bare: true});
  api.export('DCPDF');
});
Meteor compresses all files in packages. To set up the workers correctly, I have to deploy webworker.js as its own JS file on the client side. The bare option does not seem to work for this case: my JS files cannot reach my webworker.js if I include the file in that way.
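For context, the client code has to reference the worker file by URL, roughly like this (a sketch; the exact URL depends on how Meteor serves the file):
// Client-side sketch: the worker file must be reachable at a stable URL,
// which the compressed package bundle does not provide
var worker = new Worker('/pdfProgressWorker.js');
worker.onmessage = function (e) {
  // e.data could carry progress percentages or the finished PDF
  console.log('progress', e.data);
};
worker.postMessage({cmd: 'start'});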
Second try:
I added my webworker.js to the /public folder.
The problem here: my webworker.js uses some external libraries which I already included in my own package. If I add webworker.js to the public folder, I have to add all my external JS libraries too, and they would be loaded on every page of my Meteor app, which slows the whole application down. Not intended, not maintainable, not Meteor-style (I think).

How can I read files from a subdirectory in a deployed Meteor app?

I am currently making a Meteor app and am having trouble reading files from the private subdirectory. I have been following a couple of different tutorials and managed to get it to work flawlessly when I run the Meteor app locally. This question (Find absolute base path of the project directory) helped me come up with using process.env.PWD to access the root directory, and from there I use .join() to access the private folder and the pertinent file inside. However, when I deployed this code, the website crashed on startup. I am very confident that it is an issue with process.env.PWD, so I am wondering what the proper method of getting Meteor's root directory on a deployed app is.
// code to run on the server at startup
var path = Npm.require('path');
// I also tried using the line below (which was recommended in another Stack Overflow question) to no avail
// var meteor_root = Npm.require('fs').realpathSync(process.cwd() + '/../');
var apnagent = Meteor.require("apnagent"),
    agent = new apnagent.Agent();

agent.set('cert file', path.join(process.env.PWD, "private", "certificate-file.pem"));
agent.set('key file', path.join(process.env.PWD, "private", "devkey-file.pem"));
In development mode the file structure is different from the one after bundling, so you should never rely on it. In particular, you should not access your files directly with path methods as you're doing.
Loading private assets is described in this section of Meteor's documentation. It mostly boils down to this method:
Assets.getBinary("certificate-file.pem");
and its getText counterpart.
As for configuring the APN agent, see this section of the documentation. You don't have to configure the agent by passing a file path as the cert file param. Instead, you may pass the raw data returned by the Assets methods directly as cert. The same holds for the key file ~ key pair and other settings.
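A sketch of what that could look like, assuming (per the apnagent documentation referenced above) that the agent accepts raw certificate and key data via cert and key settings:
// Server startup sketch: pass the raw asset contents instead of file paths
var apnagent = Meteor.require("apnagent"),
    agent = new apnagent.Agent();

agent.set('cert', Assets.getText("certificate-file.pem"));
agent.set('key', Assets.getText("devkey-file.pem"));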
As an alternative, you could upload your files to the production server independently, to a different folder than your Meteor app, and use their absolute path. This, however, would not be possible on cloud providers like Heroku, so it's better to use assets in the intended way.
