I am working with Firebase Cloud Functions and I am trying to version my API. I am using Express, as all the tutorials suggest. However, with that solution the routing goes through Firebase Hosting instead of Cloud Functions.
Hosting: https://xxxx.firebaseapp.com
Cloud Functions: https://xxxx.cloudfunctions.net
The solution that comes closest to what I am looking for is this:
const functions = require("firebase-functions");
const express = require("express");
const cors = require("cors");

const app1 = express();
app1.use(cors({ origin: true }));

app1.get("*", (request, response) => {
  if (request.url.includes("/v1/")) {
    response.send("V1: " + request.path);
  } else {
    response.send("V2: " + request.path);
  }
});

const stats = functions.https.onRequest(app1);

module.exports = {
  stats
};
However, you can see only one function in Firebase Cloud Functions:
https://xxxx.cloudfunctions.net/stats
Also, this way you can only handle one kind of HTTP request (GET, POST, etc.).
What I am looking for is to have in Firebase Cloud Functions:
https://xxxx.cloudfunctions.net/stats/v1/ (handling GET, POST, PUT, or alternatively separate routes such as "/:userId", "/save", etc.)
https://xxxx.cloudfunctions.net/stats/v2/
https://xxxx.cloudfunctions.net/item/v1/
Is it possible to do it with Cloud Functions?
Each path can point to a distinct function (or Express app). You configure this in your firebase.json file as shown below. You don't need a separate domain at all. The firebase.json below results in:
https://example.com is handled by Firebase Hosting
https://example.com/v1 is handled by the v1 Firebase function
https://example.com/v2 is handled by the v2 Firebase function
https://example.com/v3 is handled by the v3 Firebase function
{
  "firestore": {
    "rules": "firestore.rules",
    "indexes": "firestore.indexes.json"
  },
  "hosting": {
    "public": "public",
    "cleanUrls": true,
    "trailingSlash": false,
    "rewrites": [
      {
        "source": "/v1",
        "function": "v1"
      },
      {
        "source": "/v2",
        "function": "v2"
      },
      {
        "source": "/v3",
        "function": "v3"
      }
    ],
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ]
  },
  "storage": {
    "rules": "storage.rules"
  },
  "functions": {
    "source": "functions"
  }
}
Finally, I solved it this way:
const functions = require("firebase-functions");
const express = require("express");

const getUserFunctions = express();
getUserFunctions.get("/stats", (request, response) => {
  ...
});

const v1 = functions.https.onRequest(getUserFunctions);
// v2 is built the same way, from its own Express app.

module.exports = {
  v1,
  v2
};
So the endpoint would be like this:
https://xxxx.cloudfunctions.net/v1/stats
It's expected that you'll see only one function with the name you've exported from your index.js. All URLs must be anchored through that one function. It looks like you're expecting that both /stats/* and /items/* will route through your function, but that's not the way it works.
If you want different path prefixes, you will either need different functions to handle them, or you will need to have a single common prefix for them all. For example, you could export a function called "api", then use it to handle all the different types of queries:
/api/stats/...
/api/items/...
You may also want to consider using different functions (Express apps) for each version of your API, and have Firebase Hosting route the different versions to the different apps, instead of having a single app check the version at runtime.
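If you do go the single-"api"-function route, the runtime version check from the question can be isolated into a small pure helper that the Express app delegates to. This is only a sketch; parseVersionedPath is a hypothetical name, not a Firebase or Express API:

```javascript
// Hypothetical helper: split a path like "/v1/stats" into a version
// prefix and the remainder of the path. Returns null when the path
// carries no version prefix.
function parseVersionedPath(path) {
  const match = /^\/(v\d+)(\/.*|$)/.exec(path);
  if (!match) return null;
  return { version: match[1], rest: match[2] || "/" };
}

// An Express app exported as a single "api" function could dispatch with:
//   const parsed = parseVersionedPath(req.path);
//   if (parsed && parsed.version === "v1") { /* v1 handlers */ }

console.log(parseVersionedPath("/v1/stats")); // { version: 'v1', rest: '/stats' }
console.log(parseVersionedPath("/stats"));    // null
```

Keeping the parsing in one pure function also makes the version logic trivial to unit-test, unlike an inline request.url.includes("/v1/") check.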
I have 2 files:
controller.js
model.js
I'm making an express.js app.
So, model.js is required inside controller.js, but I get this error when I call my API.
And the logs are:
But here is the problem: ./model.js really does exist, yet Vercel doesn't recognize it. It works fine in local development, where it is required correctly.
This is model.js:
const { nanoid } = require("nanoid");

const getDateStr = () => {
  const now = new Date();
  return `${now.getFullYear()}-${now.getMonth() + 1}-${now.getDate()}`;
};

const Purchase = ({ id_user_buyer, id_user_seller, id_product, quantity }) => ({
  id_user_buyer,
  id_user_seller,
  id_product,
  quantity,
  id_purchase: nanoid(),
  date: getDateStr(),
});

module.exports = {
  Purchase
};
And this is controller.js:
const err = require("../../../utils/error");
const { Purchase } = require("./model");
// the other modules are imported correctly, so the problem is only './model'
const userController = require("../user");
const productController = require("../product");
const cartController = require("../cart");

const TABLE = "purchase";

function purchaseController(injectedStore) {
  // example code
  async function makePurchase(data) {
    const purchase = Purchase(data);
    await injectedStore.insert(TABLE, purchase);
  }

  return {
    makePurchase,
  };
}

module.exports = purchaseController;
As you can see, model.js is imported correctly inside controller.js. I don't know why Vercel says ERROR Cannot find module './model'. Again: it works fine in local development, but not on Vercel.
A quick fix is to copy and paste all the code of model.js inside controller.js; I tried it, deployed, and it worked.
My app also works fine if I just comment out the line where I import ./model, but obviously the application would then lose that functionality. So the first workaround is ugly but works; still, neither of these is a real solution. The real solution is for the file to be imported correctly.
Curious fact: I tried renaming the file and it didn't help. Importing a new file doesn't work either.
NOTE: I changed the Node.js version from 12 to 14; could that have something to do with it?
Just in case, here is my folder structure:
root
api
This is my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "/api/index.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    { "src": "/api/auth(.*)", "dest": "/api/index.js" },
    { "src": "/api/users(.*)", "dest": "/api/index.js" },
    { "src": "/api/products(.*)", "dest": "/api/index.js" },
    { "src": "/api/cart(.*)", "dest": "/api/index.js" },
    { "src": "/api/purchases(.*)", "dest": "/api/index.js" },
    { "src": "/api/sales(.*)", "dest": "/api/index.js" }
  ]
}
I don't know if it's a Vercel bug or a mistake on my part. My application currently works, but only with the workaround I described before: putting all the code of model.js inside controller.js.
Thanks for reading the whole issue.
Well, I don't know exactly why, but I think it was a Vercel bug; the solution to my problem was to stop using optional chaining.
In controller.js I have more code and I was using optional chaining; I just changed the logic and it works fine. Vercel supports optional chaining, but only with Node 14.x, and I was using Node 12.x on Vercel. I changed it to 14.x and it worked in the other files, but not in controller.js.
So, if you are using newer JavaScript features (optional chaining, the nullish coalescing operator, etc.) with Node 12.x on Vercel, that will produce errors. Switching to Node 14.x mid-project can introduce bugs of its own.
So the best you can do is make sure from the beginning that you are on Node.js 14.x on Vercel.
how to change node version in vercel
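To see the kind of syntax that trips up a Node 12 runtime, here is a minimal sketch (the object shapes are hypothetical) comparing optional chaining with a pre-ES2020 equivalent that parses on older Node versions:

```javascript
const response = { user: { profile: { name: "Ana" } } };
const empty = {};

// ES2020 optional chaining: a SyntaxError on Node 12, fine on Node 14+.
const name1 = response.user?.profile?.name; // "Ana"
const miss1 = empty.user?.profile?.name;    // undefined, no throw

// Pre-ES2020 equivalent that also parses and runs on Node 12:
const name2 = response.user && response.user.profile && response.user.profile.name;
const miss2 = empty.user && empty.user.profile && empty.user.profile.name; // undefined
```

Because the failure on old runtimes happens at parse time, it can surface as a confusing module-load error rather than a clear syntax error, which fits the "Cannot find module" symptom described above.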
I got the following error: no such file or directory public/uploads/bae1774e-d6dc-454b-ba63-a4c8c53d3053.png while uploading an image to the server (Node.js) using multer, hosted through ZEIT:
const multer = require("multer");
const uuid = require("uuid");

const multerMultiple = {
  storage: multer.diskStorage({
    destination: (req, file, callback) => {
      callback(null, "./public/uploads"); // I think the problem is here
    },
    filename: (req, file, callback) => {
      const extension = file.mimetype.split("/")[1];
      const name = `${uuid.v4()}.${extension}`;
      callback(null, name);
    }
  })
};
The configuration above works perfectly while Node.js is running locally.
now.json
{
  "name": "application-name",
  "version": 2,
  "builds": [
    { "src": "index.js", "use": "@now/node-server" },
    { "src": "./public/uploads", "use": "@now/static" }
  ],
  "routes": [{ "src": "/.*", "dest": "/index.js" }],
  "env": {
    .../ env here
  }
}
Source View:
This means the public directory is present.
So, any idea why I'm getting this issue when Node.js is hosted? Is something missing in the ZEIT Now configuration, or is it something in my code?
It looks like you're trying to upload files to your app, running as a lambda on Zeit.
Zeit runs on top of AWS Lambda. AWS Lambda, and thus Now lambdas, offer only very limited writing to the filesystem during execution, and any changes are lost when execution completes. There is no durability between lambda executions.
Instead, you'll need to write the files to some sort of durable storage, like AWS S3. Here's an example of how you might do that: https://github.com/zishon89us/node-cheat/tree/master/aws/express_multer_s3
I currently have a Firebase app that works locally when I use localhost:5001 for calling the functions; however, when I call the deployed Cloud Functions URL directly I get ERROR: FORBIDDEN, and when I fetch it from my deployed Firebase app I get a CORS error.
The front end calls a service, which fetches the data from the backend.
const requestOptions = {
  method: 'GET',
  mode: 'cors',
  headers: new Headers({ 'Content-Type': 'application/json' })
};

return fetch("https://us-central1-polling-269dc.cloudfunctions.net/api/polls/get", requestOptions).then(handleResponse);
I've also tried leaving out the mode and headers, but it doesn't work. Fetch is being called correctly, but on the loading page of https://polling-269dc.firebaseapp.com/#/ I get this error message:
Access to fetch at 'https://us-central1-polling-269dc.cloudfunctions.net/api/polls/get' from origin 'https://polling-269dc.firebaseapp.com' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
In index.js in my functions folder, this is the function and its prerequisite imports:
const functions = require('firebase-functions');
const express = require('express');
const app = express();
const fire = require("./fire.js");

var database = fire.database();
var auth = fire.auth();

app.get("/api/polls/get/", (req, res) => {
  var ref = database.ref("polls/");
  var query = ref.orderByChild("question");
  var sum = [];
  query.once("value", (snap) => {
    snap.forEach((childSnap) => {
      sum.push(childSnap.val());
    });
    res.json(sum);
  });
});
I've also tried
app.get("https://us-central1-polling-269dc.cloudfunctions.net/api/polls/get/", (req, res) => {
Which has similar results. I suspect I'm calling the wrong URL or I need to prepare something, but I can't find information about it.
Here's my firebase.json, is something wrong here perhaps?
{
  "database": {
    "rules": "database.rules.json"
  },
  "hosting": {
    "public": "client/build",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ],
    "rewrites": [
      {
        "source": "**",
        "function": "app"
      }
    ]
  },
  "functions": {
    "predeploy": [
      "npm --prefix \"$RESOURCE_DIR\" run lint"
    ]
  }
}
Again, this works when I use localhost with firebase serve, but I'm trying to get it working for the deployed Firebase app. Thanks in advance.
Okay, so I figured out the issue: you need to include the name of the exported set of functions in the URL. My call should have been:
https://us-central1-polling-269dc.cloudfunctions.net/app/api/polls/get
instead of
https://us-central1-polling-269dc.cloudfunctions.net/api/polls/get
I hope this helps anyone who was having a similar issue!
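In other words, the public URL is the region-project host, then the exported function name, then the Express route. A small hedged helper (its name and signature are hypothetical, not a Firebase API) makes the shape explicit:

```javascript
// Hypothetical helper: build the public URL for an Express app deployed as a
// Cloud Function exported under `functionName`, for a route at `routePath`.
function functionUrl(region, projectId, functionName, routePath) {
  return `https://${region}-${projectId}.cloudfunctions.net/${functionName}${routePath}`;
}

console.log(functionUrl("us-central1", "polling-269dc", "app", "/api/polls/get"));
// → https://us-central1-polling-269dc.cloudfunctions.net/app/api/polls/get
```

The exported name ("app" here) is the segment that was missing from the failing fetch call above.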
Is it possible to use different templates for 404 and 500 errors in a kraken.js application? Below is how you enable the middleware and tell it which template to use. For my application I need different templates depending on which section of the site the user is in.
"fileNotFound": {
  "enabled": true,
  "priority": 130,
  "module": {
    "name": "kraken-js/middleware/404",
    "arguments": ["tamarin/errors/404"]
  }
},
"serverError": {
  "enabled": true,
  "priority": 140,
  "module": {
    "name": "kraken-js/middleware/500",
    "arguments": ["tamarin/errors/500"]
  }
},
I could modify the middleware myself to accept multiple templates and some sort of logic to choose which template but I'm wondering if there is another solution.
Answered here on the kraken github repo: https://github.com/krakenjs/kraken-js/issues/434
...
For route-specific 404 pages you can just use the kraken 404 middleware directly on the routes you want to have different 404 templates for. Here is how I accomplished that:
var _404 = require('kraken-js/middleware/404');

module.exports = function (router) {
  router.get("/:lang?/*", _404('tamarin/admin/404'), function (req, res) {
    ...
This is great because the 404 template I configured in config.json remains the default, and anything I want on a route-by-route basis can use the approach above.
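Framework aside, the trick is that kraken's 404 middleware is a factory: you call it with a template name and get back a handler bound to that template. A framework-free sketch of the same pattern (all names are hypothetical, with plain objects standing in for real responses):

```javascript
// Hypothetical factory mirroring the shape of kraken's 404 middleware:
// notFound(template) returns a handler bound to that template.
function notFound(template) {
  return function (req, res) {
    res.status = 404;
    res.rendered = template; // a real middleware would call res.render(template)
  };
}

// A route-specific handler overrides the app-wide default:
const defaultHandler = notFound("errors/404");
const adminHandler = notFound("admin/errors/404");

const res1 = {}; defaultHandler({ url: "/missing" }, res1);
const res2 = {}; adminHandler({ url: "/admin/missing" }, res2);
console.log(res1.rendered, res2.rendered);
```

Because the factory closes over the template name, each route can be given its own error view without touching the globally configured one.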
I'm using LoopBack 1.6 and have a local MongoDB server running for development, using the following datasource configuration:
"mongodb": {
  "defaultForType": "mongodb",
  "connector": "loopback-connector-mongodb",
  "database": "xxxdbname",
  "host": "localhost",
  "port": "27017"
},
Now I want to deploy to Heroku, but I don't know how to configure the datasource to point at the MongoLab DB, since it has a dynamically generated connection string.
From the Heroku docs:
var mongo = require('mongodb');

var mongoUri = process.env.MONGOLAB_URI ||
  process.env.MONGOHQ_URL ||
  'mongodb://localhost/mydb';

mongo.Db.connect(mongoUri, function (err, db) {
  db.collection('mydocs', function (er, collection) {
    collection.insert({ 'mykey': 'myvalue' }, { safe: true }, function (er, rs) {
    });
  });
});
So what kind of changes do I need to make to my datasource JSON to map the Heroku connection string?
This has now (as of June 27, 2014) been addressed: create a file datasources.local.js with the following content (where mongodb is your datasource name):
var mongoUri = process.env.MONGOLAB_URI ||
  process.env.MONGOHQ_URL ||
  'mongodb://localhost/mydb';

module.exports = {
  mongodb: {
    defaultForType: "mongodb",
    connector: "loopback-connector-mongodb",
    url: mongoUri
  }
};
Note: datasources.json is still required (it can be empty); the .js file overrides the configuration in the .json file.
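The heart of datasources.local.js is plain environment-variable fallback. Here is a small sketch of just that resolution logic (the env var names come from the Heroku docs quoted above; the local default is a placeholder):

```javascript
// Resolve the MongoDB URI the same way datasources.local.js does:
// first MONGOLAB_URI, then MONGOHQ_URL, then a local default.
function resolveMongoUri(env) {
  return env.MONGOLAB_URI || env.MONGOHQ_URL || "mongodb://localhost/mydb";
}

console.log(resolveMongoUri({ MONGOLAB_URI: "mongodb://u:p@host:1234/db" }));
// → mongodb://u:p@host:1234/db
console.log(resolveMongoUri({})); // → mongodb://localhost/mydb
```

Taking the environment as a parameter (instead of reading process.env directly) keeps the fallback order easy to test; in the real file you would pass process.env.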
This is a TODO for LoopBack: supporting configuration of datasources/models from environment variables and other sources. One idea is to run datasources.json through a template engine so that it can contain variables resolved at runtime.
Related to your question, LoopBack allows you to configure the datasource using a 'url' property. For example:
{
  "connector": "loopback-connector-mongodb",
  "url": "mongodb://localhost:27017/mydb"
}
As a workaround, you can write a post-deployment script for Heroku to replace the url value with process.env.MONGOLAB_URI or process.env.MONGOHQ_URL.
sed -i.bak s/MONGODB_URL/$MONGOHQ_URL/g datasources.json
Meanwhile, please open an issue at https://github.com/strongloop/loopback/issues.