loopback upload file by storage component - javascript

I'm trying to build a server with LoopBack that can upload and download files. I followed the steps in the docs, but I couldn't understand some of the descriptions of the Storage component REST API, in particular the "Arguments: Container specification in POST body." part.
Uploading and downloading then fail. I'm not familiar with JavaScript and have only been learning Node.js for a week.

http://loopback.io/doc/en/lb2/Storage-component.html
Follow this doc. Here a container is a folder name. After following the steps in the docs, create a folder called storage inside server and create a model named container using the slc CLI. Then check it in the API explorer: you will see the file-handling routes under the container section of the explorer.
Use the code below to configure the datasource inside server/datasources.json:
"imagestorage": {
"name": "imagestorage",
"connector": "loopback-component-storage",
"provider": "filesystem",
"root": "./server/storage"
}
Don't forget to specify your own location in root.
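To round this out, here is a minimal sketch of the remaining wiring; the example container name photos is an illustrative assumption, not taken from the question. The container model is registered against the datasource in server/model-config.json, and the "Container specification in POST body" from the docs is just a JSON body naming the container you want to create:
"container": {
  "dataSource": "imagestorage",
  "public": true
}
With that in place the explorer exposes, among others, POST /api/containers with a body like {"name": "photos"} to create a container, POST /api/containers/photos/upload for a multipart file upload, and GET /api/containers/photos/download/{file} to download a file.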

Related

Is it possible to store a dotenv variable in a json file?

Using Gatsby with Ghost CMS requires a .ghost.json file containing my API keys. I'd like to push the repo to Github and don't want my keys in my repository. Hence the question: is it possible to use .env variables within json files?
By default, Gatsby looks for .env variables inside .env.development (or .env.production) when you expose them with:
require("dotenv").config({
path: `.env.${process.env.NODE_ENV}`,
})
Of course, you can change this behavior. If you want to keep your variables inside a .json file without pushing it, just add it to .gitignore and import it in the files you need (gatsby-config.js or wherever) with a require call, for example: require('../../ghost.json').
That said, I would recommend using the default configuration to avoid possible issues. You can keep your file in the repo without API keys, move the keys to a local .env file, and simply load them where you need them via process.env.YOUR_API_KEY_VARIABLE.
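For example, a minimal sketch of that setup in gatsby-config.js (the gatsby-source-ghost plugin and the variable names below are illustrative assumptions; adjust them to your project):
// gatsby-config.js
require("dotenv").config({
  path: `.env.${process.env.NODE_ENV}`,
})

module.exports = {
  plugins: [
    {
      resolve: "gatsby-source-ghost",
      options: {
        // both values come from the untracked .env.development / .env.production files
        apiUrl: process.env.GHOST_API_URL,
        contentApiKey: process.env.GHOST_CONTENT_API_KEY,
      },
    },
  ],
}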
For further information: https://www.gatsbyjs.org/docs/environment-variables/

Can I specify a config file that an executable compiled with electron-builder can access after packaging?

I'm building an Electron app where a client asks a server for information stored in a JSON file on the server. How can I compile the server app (using electron-builder or other) and then include a JSON file that the compiled executable has access to?
I've looked through the Electron and electron-builder docs but I was unable to find any relevant information.
In the end, I'd need the JSON file to be located outside of the packaged server app so that it can be freely modified by the person using it.
I appreciate any and all help!
EDIT: I have since solved my issue. Please refer to the post below explaining my solution!
After asking on the Electron Slack chatroom, I was informed that I can use the fs module from Node to reference the file's location and use electron-builder's extraResources option to have that file be moved outside the EXE after compilation.
For example, if you wanted to reference config.json, you would reference it like so in your main.js file:
const { readFileSync } = require('fs');

// read and parse the external config file at startup
const configFile = JSON.parse(readFileSync('./config.json', 'utf8'));
Then, in your package.json file, you would use extraResources to tell electron-builder what file to pull from where:
"build": {
"extraResources": [
{
"filter": ["./config.json"]
}
]
}
And of course, since filter is an array, you can continue to specify files that you'd like to remain external just by separating them with commas!
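One caveat worth hedging: in a packaged build, extraResources files are copied into the app's resources directory rather than next to the working directory, so the relative './config.json' path may only resolve during development. A sketch of a lookup that covers both cases, assuming the default extraResources target:
const path = require('path');
const { app } = require('electron');
const { readFileSync } = require('fs');

// In a packaged app, extraResources land in process.resourcesPath;
// during development, fall back to the project folder.
const configPath = app.isPackaged
  ? path.join(process.resourcesPath, 'config.json')
  : path.join(__dirname, 'config.json');

const configFile = JSON.parse(readFileSync(configPath, 'utf8'));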
I hope this helps whoever else may have been having issues with it!

Dynamic configuration variables in Javascript / React

I am writing a client/server application with the front-end UI based on React. As a back-end Unix developer, web technologies are not my forte, so this is all new to me.
I need to be able to configure the UI to point to the server's URL and also to set other preferences. The typical React approach seems to be to use .env environment variables. However, as per this link:
multiple-environments-with-react
'the “environment variables” will be baked in at build time'. This does not work for me because the application is an OEM offering sold to customers who will configure it themselves for their own environment. I do not know their server URLs at build time, so I need a way to deliver the pre-built (minified, linted, etc.) JS to them in a single archive and let them edit some sort of properties file to configure it for their needs.
What would the general JavaScript / React best practice be for this sort of use case?
Thanks,
Troy
The easiest solution for me turned out to be to include a <script> tag in my index.html. This gets minified during the npm build but it does not get bundled with the other JavaScript, so it can easily be replaced with another file at deploy time. My config.js looks like this:
window.config = {
  "title": "Application Title",
  "envName": "LocalDev",
  "URL": "localhost:8090"
}
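The file is pulled in with a plain script tag in index.html, loaded before the bundle so window.config exists when components mount. A sketch assuming a Create React App style public/ folder (%PUBLIC_URL% is CRA's placeholder and an assumption here):
<script src="%PUBLIC_URL%/config.js"></script>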
Then inside my react components they're accessible by using:
const config = window.config;
alert("Application branding title is: " + config.title);
I will now have a config.js file for each environment (config.js.dev, config.js.uat, config.js.prod, etc.) and at deployment I will link or rename the appropriate one to config.js.

How to link features files with the step definitions using Cypress io

I am currently using Cucumber with Cypress for testing. However, unlike Selenium with Cucumber/Gherkin, which lets you jump from each scenario step to its corresponding step definition by holding the Control key and clicking the scenario step, this does not happen with Cypress. The test also fails when using the syntax below:
given(/^I entered a valid client id as "([^"]*)"$/, (client_id) => {
cy.get('#bpId')
.clear()
.type(client_id);
});
Could someone help me with a way to solve these problems? I am new to Cypress.
I ran into a similar problem while setting up the project.
I created a file named .cypress-cucumber-preprocessorrc and added this line in there:
{
  "step_definitions": "cypress/integration/**/step_definitions/"
}
The recommended way to integrate cucumber and cypress is the cypress-cucumber-preprocessor. Under the hood, this module uses cosmiconfig, which allows you to specify json or yaml configuration.
The configuration file naming conventions are explained in the cosmiconfig README:
By default, Cosmiconfig will start where you tell it to start and search up the directory tree for the following:
a package.json property
a JSON or YAML, extensionless "rc file"
an "rc file" with the extensions .json, .yaml, .yml, or .js.
a .config.js CommonJS module
For example, if your module's name is "myapp", cosmiconfig will search up the directory tree for configuration in the following places:
a myapp property in package.json
a .myapprc file in JSON or YAML format
a .myapprc.json file
a .myapprc.yaml, .myapprc.yml, or .myapprc.js file
a myapp.config.js file exporting a JS object
This is why the configuration file in this answer works.
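Equivalently, the same setting can live as a cypress-cucumber-preprocessor property in package.json, and the preprocessor itself has to be registered with Cypress. A sketch of both, assuming the version in use still supports the step_definitions option and the classic cypress/plugins/index.js layout:
// package.json (excerpt)
"cypress-cucumber-preprocessor": {
  "step_definitions": "cypress/integration/**/step_definitions/"
}

// cypress/plugins/index.js: hook the preprocessor into Cypress
const cucumber = require('cypress-cucumber-preprocessor').default;

module.exports = (on, config) => {
  on('file:preprocessor', cucumber());
};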

How can I read files from a subdirectory in a deployed Meteor app?

I am currently making a Meteor app and am having trouble reading files from the private subdirectory. I have been following a couple different tutorials and managed to get it to work flawlessly when I run the meteor app locally. This question (Find absolute base path of the project directory) helped me come up with using process.env.PWD to access the root directory, and from there I use .join() to access the private folder and the pertinent file inside. However, when I deployed this code, the website crashes on startup. I am very confident that it is an issue with process.env.PWD, so I am wondering what the proper method of getting Meteor's root directory on a deployed app is.
//code to run on server at startup
var path = Npm.require('path')
//I also tried using the below line (which was recommended in another Stackoverflow question) to no avail
//var meteor_root = Npm.require('fs').realpathSync( process.cwd() + '/../' );
var apnagent = Meteor.require("apnagent"),
agent = new apnagent.Agent();
agent.set('cert file', path.join(process.env.PWD, "private", "certificate-file.pem"))
agent.set('key file', path.join(process.env.PWD, "private", "devkey-file.pem"))
In development mode the file structure is different from what it becomes after bundling, so you should never rely on it. In particular, you should not access your files directly with path methods the way you are doing.
Loading private assets is described in this section of Meteor's documentation. It mostly boils down to this method:
Assets.getBinary("certificate-file.pem");
and its getText counterpart.
As for configuring the APN agent, see this section of the documentation. You don't have to configure the agent by passing a file path as the cert file setting. Instead you may pass the raw data returned by the Assets methods directly as cert. The same holds for the key file / key pair of settings, and for the others.
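Put together, a sketch along those lines, assuming apnagent accepts raw PEM contents via its cert and key settings:
// server-side, at startup: load the bundled assets from /private
// and hand their contents to apnagent instead of file paths
var apnagent = Meteor.require("apnagent"),
    agent = new apnagent.Agent();

agent.set("cert", Assets.getText("certificate-file.pem"));
agent.set("key", Assets.getText("devkey-file.pem"));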
As an alternative, you could deploy your files to the production server independently, into a folder outside your Meteor app, and use their absolute path. This, however, would not be possible on cloud providers like Heroku, so it's better to use assets in the intended way.
