I've been trying to set up a REST API for a few days now. I've been following a great tutorial that really helped me understand a large part of how these work (sending requests, getting responses, etc.). However, it uses MongoDB and Mongoose, and I'm using MySQL. My tables and views are a bit complicated, so instead of using an ORM I decided to just use the mysql2 package and do the querying myself. I'm stuck on PATCH and PUT at the moment. Part of my front end works by sometimes sending only 1 or 2 fields that need to be updated (a PATCH, from everything I've gathered). So I used part of the MongoDB and Mongoose tutorial to take the array of objects from the request body, build an object of fields from it, and pass that into connection.query. Here's my PATCH route:
router.patch('/:txnid', (req, res, next) => { // UPDATE only the fields that are passed
  const txnid = req.params.txnid;
  const fieldsToUpdate = {};
  for (const field of req.body) {
    fieldsToUpdate[field.name] = field.value;
  }
  connection.query("UPDATE QuoteToClose SET ? WHERE qb_TxnID = '" + txnid + "'", { fieldsToUpdate }, function (error, results) {
    if (error) {
      res.status(404).json({
        message: error,
        field: fieldsToUpdate
      });
    } else {
      res.status(201).json({
        "record_count": results.length,
        "error": null,
        "response": results
      });
    }
  });
});
Sometimes I will pass 1 field, sometimes I will pass 2. In this case I'm only passing 1. I build my body in POSTMAN and send this with the PATCH request:
[
  {
    "name": "margin",
    "value": "50"
  }
]
When I run this through POSTMAN I get the error:
{
"message": {
"code": "ER_BAD_FIELD_ERROR",
"errno": 1054,
"sqlState": "42S22",
"sqlMessage": "Unknown column 'fieldsToUpdate' in 'field list'"
},
"field": {
"margin": "50"
}
}
I'm not sure why, though. I'm not using Mongoose, unfortunately, so I don't know if something depends on it that I'm missing. My body parser is set up like this:
app.use(bodyParser.urlencoded({extended: false}));
app.use(bodyParser.json());
I want to build that query dynamically instead of specifying each field (it just seems cleaner this way).
Hi, I'm using this library to build dynamic queries: https://www.npmjs.com/package/flexqp
let result = await qp.executeUpdatePromise('update user set ? where user.id = ?', [user, user.id], dbconfig);
user is an object with lots of sub-elements; the library will auto-populate the
query to e.g. update user set name = 'xxx', address = 'xxx', etc. where user.id = 1
fieldsToUpdate is already an object. If you remove the curly braces when you parameterize it, you should be good to go:
connection.query("UPDATE QuoteToClose SET ? WHERE qb_TxnID = '" + txnid + "'", fieldsToUpdate,
Also, as a side note, that string concatenation is a bad idea; you're just asking for a SQL injection attack.
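For reference, here is a minimal sketch of what the fixed route could look like, with the WHERE value parameterized as well instead of concatenated (same table and column names as above; I have not run this against your schema):
router.patch('/:txnid', (req, res) => {
  const txnid = req.params.txnid;

  // rebuild the { column: value } object from the request body
  const fieldsToUpdate = {};
  for (const field of req.body) {
    fieldsToUpdate[field.name] = field.value;
  }

  connection.query(
    'UPDATE QuoteToClose SET ? WHERE qb_TxnID = ?',
    [fieldsToUpdate, txnid], // SET ? expands the object, the second ? binds the id
    function (error, results) {
      if (error) {
        return res.status(500).json({ message: error, fields: fieldsToUpdate });
      }
      res.status(200).json({ affected_rows: results.affectedRows });
    }
  );
});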
I'm using Dynamoose to simplify my interactions with DynamoDB in a Node.js application. I'm trying to write a query using Dynamoose's Model.query function that will search a table using an index, but it seems like Dynamoose is not including all of the info required to process the query, and I'm not sure what I'm doing wrong.
Here's what the schema looks like:
const UserSchema = new dynamoose.Schema({
  "user_id": {
    "hashKey": true,
    "type": String
  },
  "email": {
    "type": String,
    "index": {
      "global": true,
      "name": "email-index"
    }
  },
  "first_name": {
    "type": String,
    "index": {
      "global": true,
      "name": "first_name-index"
    }
  },
  "last_name": {
    "type": String,
    "index": {
      "global": true,
      "name": "last_name-index"
    }
  }
})
module.exports = dynamoose.model(config.usersTable, UserSchema)
I'd like to be able to search for users by their email address, so I'm writing a query that looks like this:
Users.query("email").contains(query.email)
.using("email-index")
.all()
.exec()
.then( results => {
res.status(200).json(results)
}).catch( err => {
res.status(500).send("Error searching for users: " + err)
})
I have a global secondary index defined for the email field.
When I try to execute this query, I'm getting the following error:
Error searching for users: ValidationException: Either the KeyConditions or KeyConditionExpression parameter must be specified in the request.
Using the Dynamoose debugging output, I can see that the query winds up looking like this:
aws:dynamodb:query:request - {
  "FilterExpression": "contains (#a0, :v0)",
  "ExpressionAttributeNames": {
    "#a0": "email"
  },
  "ExpressionAttributeValues": {
    ":v0": {
      "S": "mel"
    }
  },
  "TableName": "user_qa",
  "IndexName": "email-index"
}
I note that the actual query sent to DynamoDB does not contain KeyConditions or KeyConditionExpression, as the error message indicates. What am I doing wrong that prevents this query from being written correctly such that it executes the query against the global secondary index I've added for this table?
As it turns out, calls like .contains(text) are used as filters, not query parameters. DynamoDB can't figure out if the text in the index contains the text I'm searching for without looking at every single record, which is a scan, not a query. So it doesn't make sense to try to use .contains(text) in this context, even though it's possible to call it in a chain like the one I constructed. What I ultimately needed to do to make this work is turn my call into a table scan with the .contains(text) filter:
Users.scan({ email: { contains: query.email }}).all().exec().then( ... )
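If an exact match on the email were enough, a true query against the GSI should work without a scan. A sketch along those lines, assuming the same Users model and email-index as above (untested against your table):
// Exact-match query against the global secondary index (no table scan).
Users.query("email").eq(query.email)
  .using("email-index")
  .exec()
  .then(results => {
    res.status(200).json(results)
  }).catch(err => {
    res.status(500).send("Error searching for users: " + err)
  })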
I am not too familiar with Dynamoose, but the code below will do an update on a record using Node.js and DynamoDB. Note the Key parameter below; judging by the error message you got, it seems you are missing this.
To my knowledge, you must specify a key for an update request. You can check the AWS DynamoDB docs to confirm.
var params = {
  TableName: table,
  Key: {
    "id": customerID
  },
  UpdateExpression: "set customer_name = :s, customer_address = :p, customer_phone = :u, end_date = :d",
  ExpressionAttributeValues: {
    ":s": customer_name,
    ":p": customer_address,
    ":u": customer_phone,
    ":d": end_date
  },
  ReturnValues: "UPDATED_NEW"
};
await docClient.update(params).promise();
I have been trying to figure out how to do 2FA with WebAuthn, and I have the registration part working. The details are really poorly documented, especially all of the encoding of payloads in JavaScript. I am able to register a device to a user, but I am not able to authenticate with that device. For reference, I'm using these resources:
https://github.com/cedarcode/webauthn-ruby
https://www.passwordless.dev/js/mfa.register.js
And specifically, for authentication, I'm trying to mimic this js functionality:
https://www.passwordless.dev/js/mfa.register.js
In my user model, I have a webauthn_id, and several u2f devices, each of which has a public_key and a webauthn_id.
In my Rails app, I do:
options = WebAuthn::Credential.options_for_get(allow: :webauthn_id)
session[:webauthn_options] = options
In my JavaScript, I try to mimic the js file above and I do this (it is embedded Ruby):
options = <%= raw @options.as_json.to_json %>
options.challenge = WebAuthnHelpers.coerceToArrayBuffer(options.challenge);
options.allowCredentials = options.allowCredentials.map((c) => {
  c.id = WebAuthnHelpers.coerceToArrayBuffer(c.id);
  return c;
});
navigator.credentials.get({ "publicKey": options }).then(function (credentialInfoAssertion) {
  // send assertion response back to the server
  // to proceed with the control of the credential
  alert('here');
}).catch(function (err) {
  debugger;
  console.error(err); /* THIS IS WHERE THE ERROR IS THROWN */
});
The problem is, I cannot get past navigator.credentials.get, I get this error in the javascript console:
TypeError: CredentialsContainer.get: Element of 'allowCredentials' member of PublicKeyCredentialRequestOptions can't be converted to a dictionary
Here is what options looks like at the time navigator.credentials.get is called (see the server-generated JSON in the update below).
I've tried every which way to convert my db-stored user and device variables into properly encoded and parsed JavaScript variables, but I cannot seem to get it to work. Is there anything obvious I'm doing wrong?
Thanks for any help,
Kevin
UPDATE -
Adding options json generated by the server:
"{\"challenge\":\"SSDYi4I7kRWt5wc5KjuAvgJ3dsQhjy7IPOJ0hvR5tMg\",\"timeout\":120000,\"allowCredentials\":[{\"type\":\"public-key\",\"id\":\"OUckfxGNLGGASUfGiX-1_8FzehlXh3fKvJ98tm59mVukJkKb_CGk1avnorL4sQQASVO9aGqmgn01jf629Jt0Z0SmBpDKd9sL1T5Z9loDrkLTTCIzrIRqhwPC6yrkfBFi\"},{\"type\":\"public-key\",\"id\":\"Fj5T-WPmEMTz139mY-Vo0DTfsNmjwy_mUx6jn5rUEPx-LsY51mxNYidprJ39_cHeAOieg-W12X47iJm42K0Tsixj4_Fl6KjdgYoxQtEYsNF-LPhwtoKwYsy1hZgVojp3\"}]}"
This is an example of the serialised JSON data returned by our implementation:
{
  "challenge": "MQ1S8MBSU0M2kiJqJD8wnQ",
  "timeout": 60000,
  "rpId": "identity.acme.com",
  "allowCredentials": [
    {
      "type": "public-key",
      "id": "k5Ti8dLdko1GANsBT-_NZ5L_-8j_8TnoNOYe8mUcs4o",
      "transports": [
        "internal"
      ]
    },
    {
      "type": "public-key",
      "id": "LAqkKEO99XPCQ7fsUa3stz7K76A_mE5dQwX4S3QS6jdbI9ttSn9Hu37BA31JUGXqgyhTtskL5obe6uZxitbIfA",
      "transports": [
        "usb"
      ]
    },
    {
      "type": "public-key",
      "id": "nbN3S08Wv2GElRsW9AmK70J1INEpwIywQcOl6rp_DWLm4mcQiH96TmAXSrZRHciZBENVB9rJdE94HPHbeVjtZg",
      "transports": [
        "usb"
      ]
    }
  ],
  "userVerification": "discouraged",
  "extensions": {
    "txAuthSimple": "Sign in to your ACME account",
    "exts": true,
    "uvi": true,
    "loc": true,
    "uvm": true
  }
}
This is parsed to an object and the code used to coerce those base64url encoded values is:
credentialRequestOptions.challenge = WebAuthnHelpers.coerceToArrayBuffer(credentialRequestOptions.challenge);
credentialRequestOptions.allowCredentials = credentialRequestOptions.allowCredentials.map((c) => {
  c.id = WebAuthnHelpers.coerceToArrayBuffer(c.id);
  return c;
});
Hope that helps. The JSON data is retrieved via a fetch() call, and the byte[] fields are encoded as base64url on the server side.
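If you don't want to pull in the helpers from that site, the coercion itself is just base64url decoding into an ArrayBuffer. A rough equivalent, written from scratch as an illustration (not the exact helper used by passwordless.dev):
// base64url string -> ArrayBuffer, for challenge and allowCredentials[].id
function coerceToArrayBuffer(value) {
  // restore the standard base64 alphabet and padding
  const base64 = value.replace(/-/g, "+").replace(/_/g, "/")
    .padEnd(Math.ceil(value.length / 4) * 4, "=");
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes.buffer;
}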
I am a beginner in backend development, and I have this array of movies (called movies below). I use Express.js, and I want to save the array to MongoDB. I will use MongoDB Atlas for my database. I appreciate your help.
I tried to follow the instructions on this website:
https://medium.com/@lavitr01051977/node-express-js-aea19636a500
I skipped the first steps and started from the 'Connecting to the Database' section, but it doesn't work.
var express = require('express')
var app = express()

app.get('/', (req, res) => {
  res.send('ok')
});

//movie array
const movies = [
  { title: 'Jaws', year: 1975, rating: 8 },
  { title: 'Avatar', year: 2009, rating: 7.8 },
  { title: 'Brazil', year: 1985, rating: 8 },
  { title: 'الإرهاب والكباب', year: 1992, rating: 6.2 }
]

//read the array movie
app.get('/movies/read/', (req, res) => {
  res.send({status:200, data:movies})
})
//add elements to array movies
app.get('/movies/add', (req, res) => {
  var t = req.query.title
  var y = req.query.year
  var r = req.query.rating
  if (t == undefined || y == undefined || y.length > 4 || isNaN(y)) {
    // return here so we don't fall through and send a second response below
    return res.send({status:403, error:true, message:'you cannot create a movie without providing a title and a year'})
  }
  if (r == "") {
    r = 4
  }
  movies.push({title: t, year: y, rating: r})
  res.send({status:200, data:movies})
})
//delete elements from array movies
app.get('/movies/delete/:ID', (req, res) => {
  var d = req.params.ID
  if (d > 0 && d < movies.length) {
    movies.splice(d - 1, 1)
    res.send({status:200, message: movies})
  }
  else {
    res.send({status:404, error:true, message:'the movie <ID> does not exist'})
  }
})
//update elements from array movies
app.get('/movies/update/:ID', (req, res) => {
  let c = req.params.ID
  let x = req.query.title
  let y = req.query.year
  let z = req.query.rating
  function update(a, b) {
    if (a != undefined || a == "") {
      movies[c-1][b] = a
    }
  }
  if (c > 0 && c < movies.length) {
    update(x, 'title')
    update(y, 'year')
    update(z, 'rating')
    res.send({status:200, message: movies})
  }
  else {
    res.send({status:404, error:true, message:'the movie <ID> does not exist'})
  }
})
app.listen(3000, () => console.log('listening on port 3000'))
I expect the result to be like the one described in the medium.com link I put above.
Mongoose is a framework that facilitates interacting with MongoDB. You basically never want to write all the validation, casting, and logic boilerplate on your own, so why reinvent the wheel?
And since you're a beginner, don't be afraid of frameworks. There are many useful frameworks for many areas of backend and frontend development that make life easier for you.
The article you shared is self-explanatory, but I will sum up only the database part for you (I won't go deep into your code; the rest is up to you):
1) First of all, install mongoose:
npm install mongoose
The article includes --save, which there is no need to add anymore, as "npm install saves any specified packages into dependencies by default." (ref.)
2) To be able to access and use mongoose, you need to import it the Node way, that is, with require().
const express = require('express');
const mongoose = require('mongoose');
const bodyParser = require('body-parser');

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
And what is body-parser there for?
When dealing with a database in Express, you'll sooner or later come across errors like this one, and the reason why we need a body parser after all is best explained in this answer.
Also, recent versions of Express have their own body parser now, so you can use app.use(express.json()) instead of app.use(bodyParser.json()).
Important: the body parser must be registered before your routes.
3) Use mongoose.connect(url).
The url argument is what you find in your MongoDB Atlas cluster:
Location: Clusters tab -> Connect -> Connect your application -> Node.js driver
Which gives you something like this:
mongodb+srv://<user>:<password>@<cluster>.mongodb.net/test?retryWrites=true&w=majority
Important: use the user and password of the database user you created under the Database Access tab, not your own Atlas account's user and password.
You can set up environment variables to secure sensitive and changeable data. But I prefer using a config.js for simplicity, which usually resides in the root of the app.
Not only can you secure these values (for example via .gitignore), but you can also easily modify them, since they may change from one environment to another; keeping them in one place makes them easy to find instead of hunting for them all over your app.
For the .env file approach, read this article.
Important: in case you want to put your code on GitHub or anywhere else online (which is one reason we use config.js), don't forget to add this file to .gitignore so that such sensitive data doesn't get leaked and exposed to others online.
In config.js you can do this:
exports.username = 'your user';
exports.pass = 'your pass';
exports.myCluster = "your cluster's name";
Then import them like so:
const { username, pass, myCluster } = require('./config'); // the path might be different for you
Tip: You can use back-ticks (` `) to easily insert those variables into const url through interpolation.
That is:
const url = `mongodb+srv://${username}:${pass}@${myCluster}.mongodb.net/test?retryWrites=true&w=majority`
Important: make sure to whitelist your IP on the MongoDB Atlas side, otherwise you will get a connection error.
Under security: Network Access -> IP Whitelist
You could use 0.0.0.0/0 to whitelist all IPs.
Also, when using VPN, your IP will change too.
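Putting steps 2 and 3 together, the connection part could look roughly like this (a sketch using the config.js values from above; the connection options may vary with your mongoose version):
const mongoose = require('mongoose');
const { username, pass, myCluster } = require('./config');

const url = `mongodb+srv://${username}:${pass}@${myCluster}.mongodb.net/test?retryWrites=true&w=majority`;

mongoose
  .connect(url, { useNewUrlParser: true, useUnifiedTopology: true })
  .then(() => console.log('connected to Atlas'))
  .catch(err => console.error('connection failed:', err.message));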
4) Last but not least, after connecting to the database, you need to define your schema:
const moviesSchema = new mongoose.Schema({
  title: String,
  year: Number,
  rating: Number
});
And
const Movies = mongoose.model("Movies", moviesSchema);
Tip: A common mistake many newbies make is that they forget to use new:
new mongoose.Schema({...})
If you want to create your model in a separate file (which is the best practice), you will need to modify your const Movies so:
module.exports = mongoose.model("Movies", moviesSchema);
Don't forget to add const mongoose = require('mongoose'); in that separate model file.
And wherever you want to use this model, import it like so:
const Movies = require('../models/movies'); // the path may differ for your app
The rest, my friend, is up to you. What you want to do with your database and how to use it.
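As one hedged example of "the rest", this is roughly how the movies array from your question could be persisted with the model above (insertMany is a standard mongoose method; adjust the path and error handling to your app):
const Movies = require('./models/movies'); // path depends on your app

const movies = [
  { title: 'Jaws', year: 1975, rating: 8 },
  { title: 'Avatar', year: 2009, rating: 7.8 },
  { title: 'Brazil', year: 1985, rating: 8 },
  { title: 'الإرهاب والكباب', year: 1992, rating: 6.2 }
];

// Insert all four documents in one round trip.
Movies.insertMany(movies)
  .then(docs => console.log(`saved ${docs.length} movies`))
  .catch(err => console.error(err));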
Note to others: I put a lot of time and thought into writing this. Please, if you see something wrong or think you can add something, feel free to edit and improve my answer.
I would suggest you take a look at the mongoose framework to interact with a MongoDB database using Node.js.
However, in the code you've provided, you're not interacting with any database. You would need to define a schema, and then you could save a new doc or perform any other action on your collection. Please follow a 'Get started' guide on how to do it.
Hope it helps!
I'll explain step by step. NOTE THAT THIS PROCESS SHOULD BE RUN ONLY ONCE, SO YOU SHOULD ADD THIS CODE TO A SEPARATE MODULE AND RUN IT ONCE; otherwise you will keep adding more items to the db. Let's name this module
movies.js
//you need to connect to mongodb through mongoose.
const mongoose = require("mongoose");

mongoose
  .connect("mongodb://127.0.0.1:27017/movies", { //will create movies db automatically
    useNewUrlParser: true,
    useCreateIndex: true
  })
  .catch(err => {
    console.log(err.message);
    process.exit(1);
  })
  .then(() => {
    console.log("connected");
  });

//next create a schema and model:
const movieSchema = new mongoose.Schema({
  title: String,
  year: Number,
  rating: Number
});

const Movie = mongoose.model("Movie", movieSchema);

//create movies array based on `Movie model` so u can use save() method.
const movies = [
  new Movie({ title: "Jaws", year: 1975, rating: 8 }),
  new Movie({ title: "Avatar", year: 2009, rating: 7.8 }),
  new Movie({ title: "Brazil", year: 1985, rating: 8 }),
  new Movie({ title: "الإرهاب والكباب", year: 1992, rating: 6.2 })
];

//Last step save it.
movies.map(async (p, index) => {
  await p.save((err, result) => {
    if (index === movies.length - 1) {
      console.log("DONE!");
      mongoose.disconnect();
    }
  });
});
map is an array method: it iterates over the array, and here we save each item inside it. Once every item in the array is saved, we need to disconnect from the db. The callback passed to map takes two arguments: the first is each item inside the array, the second is that item's index. Indexes start from 0, so the last item's index in the movies array is 3 while the length of the array is 4; once index === movies.length - 1 (4 - 1 = 3), we know we have saved every item in the array.
Now run the code
node movies.js
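As an aside (my own variation, not part of the answer above), the save-and-disconnect step can also be written with Promise.all, which avoids relying on the index check:
// Save all movies, then disconnect once every save has resolved.
Promise.all(movies.map(p => p.save()))
  .then(() => {
    console.log("DONE!");
    return mongoose.disconnect();
  })
  .catch(err => console.log(err.message));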
What I am trying to do
I am creating a social media app with react native and firebase. I am trying to call a function, and have that function return a list of posts from off of my server.
Problem
Using return on a firebase query gives me a hard-to-use object array:
Array [
  Object {
    "-L2mDBZ6gqY6ANJD6rg1": Object {
      //...
    },
  },
]
I don't like how there is an object inside of an object, and the whole thing is very hard to work with. I created a list inside my app named items, and when I push all of the values into it, I get an object that is much easier to work with:
Array [
  Object {
    //...
    "key": "-L2mDBZ6gqY6ANJD6rg1",
  },
]
This object is also a lot nicer to use because the key is not the name of the object, but a field inside of it.
I would just return the array I made, but it comes back as undefined.
My question
In a function, how can I return an array I created using a firebase query? (to get the objects of an array)
My Code
runQ(group){
var items = [];
//I am returning the entire firebase query...
return firebase.database().ref('posts/'+group).orderByKey().once ('value', (snap) => {
snap.forEach ( (child) => {
items.push({
//post contents
});
});
console.log(items)
//... but all I want to return is the items array. This returns undefined though.
})
}
Please let me know if I'm understanding your question correctly. So, the posts in your database are currently stored under their push IDs, and you want to return these posts in this manner:
[
  {
    "key": "-L1ELDwqJqm17iBI4UZu",
    "message": "post 1"
  },
  {
    "key": "-L1ELOuuf9hOdydnI3HU",
    "message": "post 2"
  },
  {
    "key": "-L1ELqFi7X9lm6ssOd5d",
    "message": "post 3"
  },
  {
    "key": "-L1EMH-Co64-RAQ1-AvU",
    "message": "post 4"
  }
  ...
]
Is this correct? If so, here's what you're supposed to do:
var items = [];

firebase.database().ref('posts').orderByKey().once('value', (snapshot) => {
  snapshot.forEach((child) => {
    // 'key' might not be a part of the post, if you do want to
    // include the key as well, then use this code instead
    //
    // const post = child.val();
    // const key = child.key;
    // items.push({ ...post, key });
    //
    // Otherwise, the following line is enough
    items.push(child.val());
  });
  // Then, do something with the 'items' array here
})
.catch(() => { });
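If you want your runQ function itself to hand back the array, one option is to return the promise chain and resolve it with items. A sketch along those lines, using the same refs as your code (callers would then await it or chain .then on it):
runQ(group) {
  // Return the promise so callers can do: const items = await this.runQ(group);
  return firebase.database().ref('posts/' + group).orderByKey().once('value')
    .then(snap => {
      const items = [];
      snap.forEach(child => {
        items.push({ ...child.val(), key: child.key });
      });
      return items; // this becomes the resolved value of the promise
    });
}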
Off topic here: I see that you're using firebase.database()... to fetch posts from the database. Are you using Cloud Functions, or are you fetching those posts in your app, using users' devices to do so? If it's the latter, you would probably rather use Cloud Functions and pagination to fetch posts, mainly for two reasons:
There might be too many posts to fetch at one time
This causes security issues, because you're allowing every device to connect to your database (you'd have to come up with really good security rules to keep your database safe)
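On the pagination point, a rough sketch of key-based paging with the Realtime Database (startAt is inclusive, so the page size is padded by one and the overlapping item is dropped; adjust the field names to your post structure):
// Fetch one page of posts, starting at (and including) lastKey when given.
function fetchPage(group, pageSize, lastKey) {
  let query = firebase.database().ref('posts/' + group).orderByKey();
  if (lastKey) {
    query = query.startAt(lastKey).limitToFirst(pageSize + 1);
  } else {
    query = query.limitToFirst(pageSize);
  }
  return query.once('value').then(snap => {
    const items = [];
    snap.forEach(child => items.push({ ...child.val(), key: child.key }));
    // Drop the overlapping first item when paging past the first page.
    return lastKey ? items.slice(1) : items;
  });
}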
On the server side, I get a simple JSON payload via REST with a lot of IDs, something like this:
[ {
  "_id": "5825a49dasdasdasd8417c1b6d5"
}, {
  "_id": "dfsdfsdf4960932218417c1b6d5"
}, {
  "_id": "23434344960932218417c1b6d5"
} ]
For that I have written this in main:
main.post('/..../add', Controller.addEvent);
In the controller, I want to receive the request and search MongoDB for these IDs, because I need some information about them:
exports.addEvent = function(req, res) {
var collection = db.get().collection('events');
My question is: if someone sends the given JSON to "localhost:8080/events/add", how do I handle this JSON? I need the IDs and want to search with them.
Thanks for the help!
----------ADDED------------
I am a bit further now. In my controller I have the following function:
exports.addevent = function(req, res) {
  var ids = req.body;
  console.log(ids);
}
Now I'm getting all the IDs, which I posted with Postman from Chrome.
The output in the console is:
[ { _id: '5825a49dasdasdasd8417c1b6d5' },
{ _id: 'dfsdfsdf4960932218417c1b6d5' },
{ _id: '23434344960932218417c1b6d5' } ]
How can I get each individual ID?
OK, the request body is an object (an array) and not a string. That was the error :-/
for (var i in ids) {
  console.log(ids[i]._id);
}
and it works; now I can connect to the database and search for the IDs
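For the search step itself, a sketch of how the collected IDs could be passed to MongoDB's $in operator with the native driver (assuming _id is stored as a string; if your collection stores ObjectIds you would wrap each value in ObjectId(...)):
exports.addevent = function(req, res) {
  var collection = db.get().collection('events');
  var ids = req.body.map(function(item) { return item._id; }); // plain strings from the payload

  collection.find({ _id: { $in: ids } }).toArray(function(err, docs) {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json(docs); // the matching event documents
  });
};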