Bluemix: Node.js example with sqldb service (VCAP_SERVICES credentials) - javascript

I am attempting to use the Node.js SDK runtime starter on Bluemix with the SQL Database service, and I am finding that the code excerpts from the documentation are incompatible with the current version of the Node.js starter code. It looks like the app code has been overhauled recently but the documentation on the Bluemix site has not.
I am having difficulty accessing the VCAP_SERVICES credentials using cfenv, which is what the new version of the starter app uses. On the Bluemix dashboard I have this:
{
  "sqldb": [
    {
      "name": "SQL Database-nc",
      "label": "sqldb",
      "plan": "sqldb_free",
      "credentials": {
        "port": 50000,
        "db": "SQLDB",
        "username": "user****",
        "host": "75.126.***.***",
        "hostname": "75.126.***.**",
        "jdbcurl": "jdbc:db2://75.126.***.***:50000/SQLDB",
        "uri": "db2://user*****:********#75.126.****:50000/SQLDB",
        "password": "***********"
      }
    }
  ]
}
and I am trying to access the sqldb service credentials as follows:
var appEnv = cfenv.getAppEnv();
var sqlService = appEnv.getService("sqldb");
console.log("user=" + sqlService.credentials.username);
and I have also tried this:
var appEnv = cfenv.getAppEnv();
var sqlService = appEnv.getService("SQLDB");
console.log("user=" + sqlService.credentials.username);
The app is crashing and reporting that sqlService is null. It feels like I am missing something obvious, but I could use help figuring out what that is.
Thank you for any assistance,
-Andy

Use the "name" property, "SQL Database-nc".
From the cfenv documentation...
"The spec parameter should be a regular expression, or a string which is the exact name of the service."

Related

AWS EC2 IAM Role Credentials not passed into Node.js application

I am developing a Node.js application which is deployed on an EC2 instance. I am using the AWS SDK for JavaScript; I have tried both v2 and v3. My problem is that whenever I make a call to any AWS service in my application, I get an error saying that the credentials are missing. However, according to the documentation, assigning an IAM role to the EC2 instance should enable the SDK to retrieve the credentials automatically: AWS Documentation. I believe that I have correctly added the IAM role with sufficient permissions to the EC2 instance, so I don't understand why the requests are not going through. I do not want to use environment variables, as I would then have to handle them manually in my code. Any suggestions on how to debug this issue, or thoughts on what the problem might be, are greatly appreciated.
For example, a call is made as follows:
const { CloudFormationClient, ListStacksCommand } = require("@aws-sdk/client-cloudformation");

const client = new CloudFormationClient({ region: "eu-central-1" });
const params = {
  StackStatusFilter: [
    "CREATE_IN_PROGRESS"
  ]
};
const command = new ListStacksCommand(params);
client.send(command).then(
  (data) => {
    console.log(data);
  },
  (error) => {
    console.log(error);
  }
);
The error is simply: Error: Credential is missing. It comes from the console.log(error) call above.
I have tried with multiple roles, but even with AdministratorAccess the same error occurs. For reference, the permissions are:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "*",
      "Resource": "*"
    }
  ]
}

Is this a Vercel bug? Cannot find module './model'

I have 2 files:
controller.js
model.js
I'm making an Express.js app.
model.js is required inside controller.js, but I get an error when I call my API, and the logs show: ERROR Cannot find module './model'.
But here is the problem: './model.js' really does exist; Vercel just doesn't recognize it. It works fine in local development, where it is required correctly.
This is model.js:
const { nanoid } = require("nanoid");

const getDateStr = () => {
  const now = new Date();
  return `${now.getFullYear()}-${now.getMonth() + 1}-${now.getDate()}`;
};

const Purchase = ({ id_user_buyer, id_user_seller, id_product, quantity }) => ({
  id_user_buyer,
  id_user_seller,
  id_product,
  quantity,
  id_purchase: nanoid(),
  date: getDateStr(),
});

module.exports = {
  Purchase
};
And this is controller.js
const err = require("../../../utils/error");
const { Purchase } = require("./model");
// other modules, and they are imported fine, so the problem is './model'
const userController = require("../user");
const productController = require("../product");
const cartController = require("../cart");

const TABLE = 'purchase';

function purchaseController(injectedStore) {
  // example code
  async function makePurchase(data) {
    const purchase = Purchase(data);
    await injectedStore.insert(TABLE, purchase);
  }
  return {
    makePurchase,
  };
}

module.exports = purchaseController;
As you can see, model.js is imported inside controller.js, so I don't know why Vercel says ERROR Cannot find module './model'. I say it again: it works fine in local development, but not on Vercel.
A quick fix is to copy and paste all the code of model.js into controller.js. I tried it, deployed, and it worked.
My whole app also works fine if I just comment out the line where I import ./model, but obviously my application would lose that functionality. So the first workaround is ugly but works; neither is a good solution, though. The best solution is for the file to be imported correctly.
Curious fact: I tried renaming the file and it didn't help either. It also doesn't work if I import a new file.
NOTE: I changed the Node.js version from 12 to 14; could that have something to do with it?
Just in case, here is my folder structure:
root
  api
This is my vercel.json:
{
  "version": 2,
  "builds": [
    {
      "src": "/api/index.js",
      "use": "@vercel/node"
    }
  ],
  "routes": [
    {
      "src": "/api/auth(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/users(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/products(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/cart(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/purchases(.*)",
      "dest": "/api/index.js"
    },
    {
      "src": "/api/sales(.*)",
      "dest": "/api/index.js"
    }
  ]
}
I don't know if it's a Vercel bug or a mistake on my part. My application currently works, but only through the trick I described before: putting all the code of model.js inside controller.js.
Thanks for reading the whole issue.
Well, I don't know why, but I think it was a Vercel bug; the solution to my problem was to stop using optional chaining.
In controller.js I have more code, and I was using optional chaining there; I just changed the logic and it works fine. Vercel supports optional chaining, but only with Node 14.x, and I was using Node 12.x on Vercel. So I changed it to 14.x and it worked fine in the other files, but not in controller.js.
So, if you are using newer JavaScript features, like optional chaining or the nullish coalescing operator, with version 12.x on Vercel, that will produce errors. Switching to Node 14.x afterwards can give you bugs too.
So the best you can do is make sure from the beginning that you have Node.js 14.x on Vercel.
See: how to change the Node version in Vercel.
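For reference, Vercel reads the target Node.js version from the engines field of package.json; a minimal sketch:

{
  "engines": {
    "node": "14.x"
  }
}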

Firebase Cloud Functions versioning

I am working with Firebase Cloud Functions and I am trying to version my API. I am using Express, as all the tutorials suggest. However, with that solution we end up using Firebase Hosting instead of Cloud Functions:
Hosting: https://xxxx.firebaseapp.com
Cloud Functions: https://xxxx.cloudfunctions.net
The solution which comes closest to what I am looking for is this:

const functions = require("firebase-functions");
const express = require("express");
const cors = require("cors");

const app1 = express();
app1.use(cors({ origin: true }));
app1.get("*", (request, response) => {
  if (request.url.includes("/v1/")) {
    response.send("V1: " + request.path);
  } else {
    response.send("V2: " + request.path);
  }
});

const stats = functions.https.onRequest(app1);

module.exports = {
  stats
};
However, you can see only one function in Firebase Cloud Functions:

https://xxxx.cloudfunctions.net/stats

Also, you can only handle one kind of HTTP request (GET, POST, etc.).
What I am looking for is to have in Firebase Cloud Functions:

https://xxxx.cloudfunctions.net/stats/v1/ (handling GET, POST, PUT, or alternatively separate routes such as "/:userId", "/save", etc.)
https://xxxx.cloudfunctions.net/stats/v2/
https://xxxx.cloudfunctions.net/item/v1/
Is it possible to do it with Cloud Functions?
Each path can point to a distinct function (or Express app). You configure this in your firebase.json file, as shown below; you don't need a separate domain at all. The firebase.json below results in:
https://example.com is handled by Firebase Hosting
https://example.com/v1 is handled by a Firebase Function
https://example.com/v2 is handled by a Firebase Function
https://example.com/v3 is handled by a Firebase Function
{
  "firestore": {
    "rules": "firestore.rules",
    "indexes": "firestore.indexes.json"
  },
  "hosting": {
    "public": "public",
    "cleanUrls": true,
    "trailingSlash": false,
    "rewrites": [
      {
        "source": "/v1",
        "function": "v1"
      },
      {
        "source": "/v2",
        "function": "v2"
      },
      {
        "source": "/v3",
        "function": "v3"
      }
    ],
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ]
  },
  "storage": {
    "rules": "storage.rules"
  },
  "functions": {
    "source": "functions"
  }
}
Finally, I solved it this way:

const functions = require("firebase-functions");
const express = require("express");

const getUserFunctions = express();
getUserFunctions.get("/stats", (request, response) => {
  ...
  ...
});

const v1 = functions.https.onRequest(getUserFunctions);

module.exports = {
  v1,
  v2  // v2 is built the same way, from its own Express app
};
So the endpoint would be like this:
https://xxxx.cloudfunctions.net/v1/stats
It's expected that you'll see only one function with the name you've exported from your index.js. All URLs must be anchored through that one function. It looks like you're expecting that both /stats/* and /items/* will route through your function, but that's not the way it works.
If you want different path prefixes, you will either need different functions to handle them, or you will need to have a single common prefix for them all. For example, you could export a function called "api", then use it to handle all the different types of queries:
/api/stats/...
/api/items/...
You may also want to consider using different functions (Express apps) for each different version of your api, and have Firebase Hosting route the different versions to the different app, instead of having a single app check the version at runtime.
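A minimal sketch of that single-prefix approach (the route paths are illustrative, not from the question):

const functions = require("firebase-functions");
const express = require("express");

const api = express();

// Exported as "api", so requests hit /api/...; the Express app sees the
// path with the function name already stripped.
api.get("/stats", (request, response) => response.send("stats"));
api.get("/items", (request, response) => response.send("items"));

module.exports = {
  api: functions.https.onRequest(api)
};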

Serverless invoke local does nothing

I'm trying to run a Node Lambda locally to debug it. I am using Serverless and this launch config in VS Code:
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Launch Program",
      "program": "${workspaceRoot}/node_modules/.bin/sls",
      "args": [
        "invoke",
        "local",
        "--function",
        "hello",
        "--data",
        "hello world"
      ]
    }
  ]
}
My module.exports.handler looks like this:
// CONSTANTS and getUrlData are assumed to be defined elsewhere in the module.
module.exports.handler = (event, context, callback) => {
  if (event.triggerSource === CONSTANTS.TRIGGER_SOURCE) {
    console.log("event = " + JSON.stringify(event));
    const uri = process.env.SCT_URL_BASE;
    const country = process.env.SCT_COUNTRY;
    const username = event.request.userAttributes[CONSTANTS.USER_ATTRIBUTES];
    const codeP = event.request.codeParameter;
    console.log("URI = " + uri);
    console.log("Code: " + codeP);
    getUrlData(uri, country, username, codeP);
  } else {
    context.done(null, event);
  }
};
When I run the debug configuration it does nothing. Serverless does not throw any error; I just cannot reach inside the function.
Also, there is another thing I cannot understand. The Serverless documentation says:
--function or -f The name of the function in your service that you want to invoke locally. Required.
I don't know what they are referring to here: a function that we call to run the Lambda, or the function that is called when the Lambda is invoked. In this case, the function that I am exporting is "handler", but that doesn't work either.
Thanks in advance.
I have used this approach and it works for me:
https://standardofnorms.wordpress.com/2017/12/03/locally-debugging-aws-lambdas-written-in-node-js/
The bad thing is that I would like to use Serverless rather than the lambda-local package, due to the larger Serverless community. Lambda-local works like a charm though, so I send a big hug to its creator from here.
Answers to the first question are still very welcome.
EDIT: OK, I figured this out.
It turns out that Serverless, as a framework, uses a serverless.yml file for configuration. There, I had to declare the function I am going to run with the serverless command and point it at the file containing my handler. This is my serverless.yml right now:
service: serverless-simple
frameworkVersion: ">=1.1.0 <2.0.0"

provider:
  name: aws
  runtime: nodejs4.3

functions:
  lambdaHandler:
    handler: src/customMessageLambda.handler
    events:
      - http:
          path: ping
Surely I have to research this file a little more, but I have solved my issue.
Hope this helps someone, sometime.
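Note, based on the serverless.yml above: the name passed to --function is the key under functions: (here lambdaHandler), not the exported handler name, so the launch config from the question would need its args updated along these lines:

"args": [
  "invoke",
  "local",
  "--function",
  "lambdaHandler",
  "--data",
  "hello world"
]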

How to configure StrongLoop LoopBack MongoDB datasource for deployment to Heroku

I'm using LoopBack ver. 1.6 and have a local MongoDB server running for development, using the following datasource configuration:
"mongodb": {
"defaultForType": "mongodb",
"connector": "loopback-connector-mongodb",
"database": "xxxdbname",
"host": "localhost",
"port": "27017"
},
Now I want to deploy to Heroku, but I don't know how to configure the datasource to point at the MongoLab db, since it has a dynamically generated connection string. From the Heroku docs:
var mongo = require('mongodb');
var mongoUri = process.env.MONGOLAB_URI ||
  process.env.MONGOHQ_URL ||
  'mongodb://localhost/mydb';

mongo.Db.connect(mongoUri, function (err, db) {
  db.collection('mydocs', function(er, collection) {
    collection.insert({'mykey': 'myvalue'}, {safe: true}, function(er, rs) {
    });
  });
});
So what kind of changes do I need to make to my datasource JSON to map the Heroku connection string?
This has now (as of June 27, 2014) been addressed: create a file datasources.local.js with the following content (where mongodb is your datasource name):
var mongoUri = process.env.MONGOLAB_URI ||
  process.env.MONGOHQ_URL ||
  'mongodb://localhost/mydb';

module.exports = {
  mongodb: {
    defaultForType: "mongodb",
    connector: "loopback-connector-mongodb",
    url: mongoUri
  }
};
Note: datasources.json is still required (it can be empty), and the .js file overrides the configuration in the .json file.
This is a TODO for LoopBack: supporting configuration of datasources/models from environment variables and other sources. One idea is to use a template engine to load datasources.json so that it can contain variables to be resolved at runtime.
Related to your question, LoopBack allows you to configure the datasource using a 'url' property. For example:
{
  "connector": "loopback-connector-mongodb",
  "url": "mongodb://localhost:27017/mydb"
}
As a workaround, you can write a post-deployment script for Heroku to replace the url value with process.env.MONGOLAB_URI or process.env.MONGOHQ_URL. For example (using | as the sed delimiter, since the URL itself contains slashes):
sed -i.bak "s|MONGODB_URL|$MONGOHQ_URL|g" datasources.json
Meanwhile, please open an issue at https://github.com/strongloop/loopback/issues.
