Multer destination file is not readable by ZEIT server

I got the following error: no such file or directory public/uploads/bae1774e-d6dc-454b-ba63-a4c8c53d3053.png while uploading an image to the server (Node.js, using multer) hosted on ZEIT:
const multerMultiple = {
  storage: multer.diskStorage({
    destination: (req, file, callback) => {
      callback(null, "./public/uploads"); // I think the problem is here
    },
    filename: (req, file, callback) => {
      const extension = file.mimetype.split("/")[1];
      const name = `${uuid.v4()}.${extension}`;
      callback(null, name);
    }
  })
};
The configuration above works perfectly while the Node.js server is running locally.
now.json
{
  "name": "application-name",
  "version": 2,
  "builds": [
    { "src": "index.js", "use": "@now/node-server" },
    { "src": "./public/uploads", "use": "@now/static" }
  ],
  "routes": [{ "src": "/.*", "dest": "/index.js" }],
  "env": {
    ... env here
  }
}
Source View: (screenshot of the deployment source, showing that the public directory is present)
So, any idea why I'm getting this issue when the Node.js app is hosted? Is something missing in my ZEIT Now configuration, or is it something in my code?

It looks like you're trying to upload files to your app, running as a lambda on Zeit.
Zeit runs on top of AWS Lambda. AWS Lambda, and thus Now lambdas, offer only very limited writing to the filesystem during execution, and any changes are lost when execution completes. There is no durability between lambda executions.
Instead, you'll need to write the files to some sort of durable storage, like AWS S3. Here's an example of how you might do that: https://github.com/zishon89us/node-cheat/tree/master/aws/express_multer_s3
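For reference, here is a minimal sketch of the same multer setup rewritten against S3 using the multer-s3 package (the bucket name is hypothetical, and AWS credentials are assumed to come from environment variables):

const aws = require("aws-sdk");
const multer = require("multer");
const multerS3 = require("multer-s3");
const uuid = require("uuid");

const s3 = new aws.S3(); // reads AWS credentials from the environment

const upload = multer({
  storage: multerS3({
    s3,
    bucket: "my-upload-bucket", // hypothetical bucket name
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: (req, file, callback) => {
      // keep the original uuid-based naming, but write to S3 instead of disk
      const extension = file.mimetype.split("/")[1];
      callback(null, `uploads/${uuid.v4()}.${extension}`);
    }
  })
});

// usage: app.post("/upload", upload.single("image"), (req, res) => res.json({ url: req.file.location }));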

Is this a Vercel bug? Cannot find module './model'

I have 2 files:
controller.js
model.js
I'm making an Express.js app. model.js is required inside controller.js, but I get this error when I call my API. The logs show:
ERROR Cannot find module './model'
But here is the problem: './model.js' really does exist, yet Vercel doesn't recognize it. It works fine in local development, where it is required correctly.
This is model.js:
const { nanoid } = require("nanoid");

const getDateStr = () => {
  const now = new Date();
  return `${now.getFullYear()}-${now.getMonth() + 1}-${now.getDate()}`;
};

const Purchase = ({ id_user_buyer, id_user_seller, id_product, quantity }) => ({
  id_user_buyer,
  id_user_seller,
  id_product,
  quantity,
  id_purchase: nanoid(),
  date: getDateStr(),
});

module.exports = {
  Purchase
};
And this is controller.js:
const err = require("../../../utils/error");
const { Purchase } = require("./model");
// the other modules are imported fine, so the problem is './model'
const userController = require("../user");
const productController = require("../product");
const cartController = require("../cart");

const TABLE = 'purchase';

function purchaseController(injectedStore) {
  // example code
  async function makePurchase(data) {
    const purchase = Purchase(data);
    await injectedStore.insert(TABLE, purchase);
  }

  return {
    makePurchase,
  };
}

module.exports = purchaseController;
As you can see, model.js is imported correctly inside controller.js. I don't know why Vercel says ERROR Cannot find module './model'. I say it again: it works fine in local development, but not on Vercel.
A quick fix is to copy and paste all the code of model.js inside controller.js. I tried it, deployed, and it worked.
My whole app also works fine if I just comment out the line that imports ./model, but obviously my application would then lose that functionality. So the first workaround is ugly but works; neither is a good solution. The best solution is for the file to be imported correctly.
Curious fact: I tried renaming the file and it didn't help. Importing a new file doesn't work either.
NOTE: I changed the Node.js version from 12 to 14. Could that have something to do with it?
Just in case, here is my folder structure:
root
  api
This is my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "/api/index.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    { "src": "/api/auth(.*)", "dest": "/api/index.js" },
    { "src": "/api/users(.*)", "dest": "/api/index.js" },
    { "src": "/api/products(.*)", "dest": "/api/index.js" },
    { "src": "/api/cart(.*)", "dest": "/api/index.js" },
    { "src": "/api/purchases(.*)", "dest": "/api/index.js" },
    { "src": "/api/sales(.*)", "dest": "/api/index.js" }
  ]
}
I don't know if it's a Vercel bug or a mistake on my part. My application currently works, but only by using the trick I described: putting all the code of model.js inside controller.js.
Thanks for reading the whole issue.
Well, I don't know why, but I think it was a Vercel bug. The solution to my problem was to stop using optional chaining.
In controller.js I have more code that used optional chaining; I just changed that logic and it works fine. Vercel supports optional chaining, but only with Node 14.x, and I was on Node 12.x in Vercel. I changed it to 14.x and optional chaining works fine in the other files, but not in controller.js.
So, if you use newer JavaScript features (optional chaining, the nullish coalescing operator, etc.) on Node 12.x in Vercel, that will produce errors. Switching to Node 14.x afterwards can give you bugs too.
So the best you can do is make sure from the beginning that you are on Node.js 14.x in Vercel.
See also: how to change node version in vercel
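In short: Vercel takes the Node.js version from the engines field of package.json, so pinning it looks like this (a minimal sketch):

{
  "engines": {
    "node": "14.x"
  }
}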

Firebase Cloud Functions versioning

I am working with Firebase Cloud Functions and I am trying to version my API. I am using Express, as all the tutorials suggest. However, with that solution we use Firebase Hosting instead of Cloud Functions.
Hosting: https://xxxx.firebaseapp.com
Cloud Functions: https://xxxx.cloudfunctions.net
The solution that comes closest to what I am looking for is this:
const functions = require("firebase-functions");
const express = require("express");
const cors = require("cors");

const app1 = express();
app1.use(cors({ origin: true }));

app1.get("*", (request, response) => {
  if (request.url.includes("/v1/")) {
    response.send("V1: " + request.path);
  } else {
    response.send("V2: " + request.path);
  }
});

const stats = functions.https.onRequest(app1);

module.exports = {
  stats
};
However, you can see only one function in Firebase Cloud Functions:

https://xxxx.cloudfunctions.net/stats
Also, you can handle only one kind of HTTP request (GET, POST, etc.).
What I am looking for is to have in Firebase Cloud Functions:

https://xxxx.cloudfunctions.net/stats/v1/ (including GET, POST, PUT or in another case separate functions with “/:userId” “/save”, etc)
https://xxxx.cloudfunctions.net/stats/v2/
https://xxxx.cloudfunctions.net/item/v1/
Is it possible to do it with Cloud Functions?
Each path can point to a distinct function (or Express app). You configure this in your firebase.json file, as shown below. You don't need a separate domain at all. The firebase.json below results in:
https://example.com is handled by Firebase Hosting
https://example.com/v1 is handled by a Firebase function
https://example.com/v2 is handled by a Firebase function
https://example.com/v3 is handled by a Firebase function
{
  "firestore": {
    "rules": "firestore.rules",
    "indexes": "firestore.indexes.json"
  },
  "hosting": {
    "public": "public",
    "cleanUrls": true,
    "trailingSlash": false,
    "rewrites": [
      {
        "source": "/v1",
        "function": "v1"
      },
      {
        "source": "/v2",
        "function": "v2"
      },
      {
        "source": "/v3",
        "function": "v3"
      }
    ],
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ]
  },
  "storage": {
    "rules": "storage.rules"
  },
  "functions": {
    "source": "functions"
  }
}
Finally, I solved it this way:
const express = require("express");
const functions = require("firebase-functions");

const getUserFunctions = express();

getUserFunctions.get("/stats", (request, response) => {
  ...
});

const v1 = functions.https.onRequest(getUserFunctions);
// v2 would be built the same way from a second Express app

module.exports = {
  v1,
  v2
};
So the endpoint would be like this:
https://xxxx.cloudfunctions.net/v1/stats
It's expected that you'll see only one function, with the name you've exported from your index.js. All URLs must be anchored through that one function. It looks like you're expecting both /stats/* and /items/* to route through your function, but that's not the way it works.
If you want different path prefixes, you will either need different functions to handle them, or a single common prefix for them all. For example, you could export a function called "api", then use it to handle all the different types of queries, as sketched below:
/api/stats/...
/api/items/...
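A minimal sketch of that single-prefix approach (the route bodies are placeholder assumptions):

const functions = require("firebase-functions");
const express = require("express");

const app = express();

// every route lives under the one exported function name, e.g. /api/stats
app.get("/stats", (req, res) => res.send("stats"));
app.get("/items", (req, res) => res.send("items"));

// deployed URL: https://xxxx.cloudfunctions.net/api/stats
module.exports = { api: functions.https.onRequest(app) };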
You may also want to consider using different functions (Express apps) for each version of your API, and have Firebase Hosting route the different versions to the different apps, instead of having a single app check the version at runtime.

Portable electron app is extracted in a different folder every time it opens

electron-builder version: 20.9.2
Target: windows/portable
I'm building a portable app with electron-builder and using socket.io to keep a real-time connection with a backend service, but I have an issue with the firewall. Because this is a portable app, every time it opens it is extracted into the temporary folder, into a newly generated directory, so the path to the app is different on every run. This makes the firewall think it is a different app asking for connection permissions. How can I change the extraction path used when the app runs?
(This is the screen that I get every time I run the app)
This is my socket.io configuration
const io = require("socket.io")(6524);

io.on("connect", socket => {
  socket.on("notification", data => {
    EventBus.$emit("notifications", JSON.parse(data));
  });
});
My build settings in package.json:
"build": {
  "productName": "xxx",
  "appId": "xxx.xxx.xxx",
  "directories": {
    "output": "build"
  },
  "files": [
    "dist/electron/**/*",
    "!**/node_modules/*/{CHANGELOG.md,README.md,README,readme.md,readme,test,__tests__,tests,powered-test,example,examples,*.d.ts}",
    "!**/node_modules/.bin",
    "!**/*.{o,hprof,orig,pyc,pyo,rbc}",
    "!**/._*",
    "!**/{.DS_Store,.git,.hg,.svn,CVS,RCS,SCCS,__pycache__,thumbs.db,.gitignore,.gitattributes,.editorconfig,.flowconfig,.yarn-metadata.json,.idea,appveyor.yml,.travis.yml,circle.yml,npm-debug.log,.nyc_output,yarn.lock,.yarn-integrity}",
    "!**/node_modules/search-index/si${/*}"
  ],
  "win": {
    "icon": "build/icons/myicon.ico",
    "target": "portable"
  }
},
Any idea how I could at least specify an extraction path, or make it extract into the execution folder?
BTW, I already created an issue about this in the electron-builder repo.
In version 20.40.1 they added a new configuration key, unpackDirName:
/**
* The unpack directory name in [TEMP](https://www.askvg.com/where-does-windows-store-temporary-files-and-how-to-change-temp-folder-location/) directory.
*
* Defaults to [uuid](https://github.com/segmentio/ksuid) of build (changed on each build of portable executable).
*/
readonly unpackDirName?: string
Example:
config: {
  portable: {
    unpackDirName: "0ujssxh0cECutqzMgbtXSGnjorm",
  }
}
More info #3799.
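If you configure electron-builder via package.json instead, the same option would sit under the build section; a sketch (the directory name here is just an example):

"build": {
  "win": {
    "target": "portable"
  },
  "portable": {
    "unpackDirName": "my-app"
  }
}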

Retrieve or Specify output file name in electron-builder

I am working with electron-builder programmatically to generate installation packages. So far I have this as my utility to create the installation package for the current OS type:
const packagejson = require("../package.json");
const builder = require("electron-builder");
const Platform = builder.Platform;

function buildPromise() {
  // Development package.json
  const devMetadata = packagejson.electronBuilder;
  // Application package.json
  const appMetadata = {
    name: packagejson.name,
    version: packagejson.version,
    description: packagejson.description,
    author: packagejson.author,
    productName: packagejson.productName
  };
  // Build for the current target and send back a promise
  return builder.build({
    projectDir: "./",
    devMetadata,
    appMetadata
  });
}

module.exports = {
  buildPromise,
  outputPath: packagejson.electronBuilder.directories.output
};
It pulls the needed metadata from the app's main package.json file, which contains this section (so the application package.json is empty):
...
"electronBuilder": {
  "build": {
    "productName": "Node App",
    "appId": "my.id",
    "asar": false,
    "win": {
      "iconUrl": "http://localhost:5000/images/logo-multi.ico",
      "target": "nsis"
    },
    "nsis": {
      "oneClick": false
    }
  },
  "directories": {
    "output": "electron/output",
    "app": "electron/app",
    "buildResources": "electron/buildResources"
  }
}
...
When I run the build on Windows I get a file called Node App Setup 1.0.0.exe. So far so good. But how do I actually control that final file name? Or at least retrieve it programmatically so I can read it in and respond to the client in some way? Obviously, I could piece it together from the JSON settings, but I would rather have something more definitive.
You can specify the output filename using artifactName in the build section of your package.json.
The docs say the artifact file name template supports the ${ext} macro:
${ext} macro is supported in addition to file macros.
File Macros
You can use macros in the file patterns, artifact file name patterns and publish configuration url:
${arch} — expanded to ia32, x64. If no arch, macro will be removed from your pattern with leading space, - and _ (so, you don't need to worry and can reuse pattern).
${os} — expanded to mac, linux or win according to target platform.
${name} – package.json name.
${productName} — Sanitized product name.
${version} — from package.json
${channel} — detected prerelease component from version (e.g. beta).
${env.ENV_NAME} — any environment variable.
Any property of AppInfo (e.g. buildVersion, buildNumber).
Example:
"build": {
  "appId": "com.electron.app.my",
  "artifactName": "node-app-${version}.${ext}",
  ...
},
If your package version is 1.0.0, a Windows target would output:
node-app-1.0.0.exe
At my request, the author added this in the current version (8.5.1):
https://github.com/electron-userland/electron-builder/issues/899
so now we can do:
builder.build()
  .then(paths => {
    // paths contains an array of output file paths, e.g.:
    console.log(paths[0]); // = c:/MyProject/dist/My Project Setup 1.0.0.exe
    console.log(paths[1]); // = c:/MyProject/dist/myproject-1.0.0-x86_64.AppImage
  });
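Tying the two answers together, a hedged sketch of responding to a client with the generated installer name (the Express wiring and the "./build" module path are assumptions based on the question's utility):

const path = require("path");
const express = require("express");
const { buildPromise } = require("./build"); // hypothetical path to the utility above

const app = express();

app.post("/build", (req, res) => {
  buildPromise()
    .then(paths => res.json({ installer: path.basename(paths[0]) }))
    .catch(err => res.status(500).send(err.message));
});

app.listen(5000);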

How to check file format before uploading a file in container using loopback-component-storage?

I have tried this sample code. I can successfully upload the file, but before that I have to validate the file type and allow only *.csv files.
You can use the mime node module for that:
https://www.npmjs.com/package/mime
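For instance, a quick sketch of checking a file name's MIME type with mime (this uses getType from mime v2; mime v1 called it lookup):

const mime = require("mime");

// "text/csv" for CSV files, so only these would pass validation
console.log(mime.getType("report.csv")); // "text/csv"
console.log(mime.getType("photo.png"));  // "image/png"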
You must complement the sample code as follows. Edit server/datasources.json and add to the storage datasource the allowedContentTypes field, an array of the MIME types you want to support:
...
"storage": {
  "name": "storage",
  "connector": "loopback-component-storage",
  "provider": "filesystem",
  "root": "/var/www/storage",
  "maxFileSize": "52428800",
  "allowedContentTypes": ["text/csv", "application/vnd.ms-excel"]
}
...
This step is optional, but it is good practice: the error can be handled in the callback, and you can also add a remote hook in the container model file, common/models/container.js:
module.exports = function (container) {
  container.afterRemoteError('upload', function (ctx, next) {
    // inspect ctx.error here, e.g. to return a friendlier message
    // when a file is rejected for a disallowed content type
    next();
  });
};
