I'm tasked with modifying a legacy app so that users can upload payroll adjustments in bulk. Currently they have to fill out a form and input the data item by item, hitting submit after each one. I'm giving them the ability to upload a CSV file containing tons of adjustments at once.
On the server they are inserting items into couch one by one, like this:
function wsapiPOST(req, res, next) {
  var path = req.path.substr(6)
    , url = couchPath + path + requestUtil.buildQueryString(req);

  request.post({
    url: url,
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(req.body)
  }, function (err, resp, body) {
    if (err) {
      if (resp) {
        res.writeHead(resp.statusCode);
        res.end(body);
      } else { // This would happen if the request timed out
        res.writeHead(408);
        res.end('timeout');
      }
    }
  }).pipe(res);
}
The couch URL is built dynamically.
req.body contains the properties for a single item.
I'm new to couch, so I'm not sure how to send multiple documents for insertion in a single operation. I could wrap the request.post call in a loop as is, but I imagine that wouldn't perform well.
I just need to be pointed in the right direction for bulk insertion into couch via its REST API. Thanks!
You can use the bulk document API to insert (and even update) multiple documents.
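As a rough sketch of what that looks like with the request library already used in the question: you POST a single JSON body of the form { "docs": [...] } to the database's _bulk_docs endpoint. Here bulkUrl and docs are placeholders for your own URL construction and the array of adjustment objects parsed from the CSV:
// Minimal sketch: POST all parsed CSV rows in one call to CouchDB's
// _bulk_docs endpoint. `bulkUrl` is a placeholder; it should resolve to
// http://<couch-host>/<database>/_bulk_docs for your target database.
// `docs` is assumed to be the array of adjustment objects from the CSV.
request.post({
  url: bulkUrl,
  headers: { 'content-type': 'application/json' },
  body: JSON.stringify({ docs: docs }) // CouchDB expects { "docs": [...] }
}, function (err, resp, body) {
  if (err) {
    res.writeHead(500);
    return res.end('bulk insert failed');
  }
  res.writeHead(resp.statusCode);
  res.end(body); // CouchDB returns one result object per document
});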
You can use nano - there is a bulk insert option.
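If you go the nano route, a minimal sketch looks like the following; the connection URL, database name, and the adjustments array are placeholders for your own setup:
// Sketch using nano's bulk() helper; connection URL, database name,
// and the `adjustments` array are placeholders.
var nano = require('nano')('http://localhost:5984');
var db = nano.db.use('payroll');

db.bulk({ docs: adjustments }, function (err, body) {
  if (err) return console.error(err);
  console.log('inserted ' + body.length + ' documents');
});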
In my application I read a huge amount of image data and send it all to the client:
// fs here is assumed to be require('fs').promises
const imagesPaths = await getFolderImagesRecursive(req.body.rootPath);
const dataToReturn = await Promise.all(imagesPaths.map((imagePath) => new Promise(async (resolve, reject) => {
  try {
    const imageB64 = await fs.readFile(imagePath, 'base64');
    return resolve({
      filename: imagePath,
      imageData: imageB64,
    });
  } catch {
    return reject();
  }
})));
return res.status(200).send({
  success: true,
  message: 'Successfully retrieved folder images data',
  data: dataToReturn,
});
Here is the client side:
const getFolderImages = (rootPath) => {
  return fetch('api/getFolderImages', {
    method: 'POST',
    headers: { 'Content-type': 'application/json' },
    body: JSON.stringify({ rootPath }),
  });
};
const getFolderImagesServerResponse = await getFolderImages(rootPath);
const getFolderImagesServerData = await getFolderImagesServerResponse.json();
When I send the data, the request fails because the payload is so large. Sending the data with a plain res.send(<data>) is impossible. So, then, how can I bypass this limitation, and how should I receive the data on the client side with the new process?
The answer to your problem requires some reading:
Link to the solution
One thing you probably haven’t taken full advantage of before is that a web server’s HTTP response is a stream by default.
Frameworks just make it easier for you to pass in synchronous data, which is split into chunks under the hood and sent as HTTP packets.
We are talking about huge files here; naturally, we don’t want them held in memory, at least not as one whole blob. The perfect solution for this dilemma is a stream.
We create a read stream with the help of the built-in Node package ‘fs’, then pass it to the stream-compatible response.send.
// fs is Node's built-in file system module
const fs = require('fs');

const readStream = fs.createReadStream('example.png');
return response
  .headers({
    'Content-Type': 'image/png',
    'Content-Disposition': 'attachment; filename="example.png"',
  })
  .send(readStream);
I used the Fastify web server here, but it should work similarly with Koa or Express.
There are two more configurations here: the ‘Content-Type’ and ‘Content-Disposition’ headers.
The first one indicates the type of blob we are sending chunk by chunk, so the frontend can automatically give it the right file extension.
The latter tells the browser that we are sending an attachment, not something renderable, like an HTML page or a script. This will trigger the browser’s download functionality, which is widely supported. The filename parameter is the download name of the content.
Here we are; we accomplished minimal memory stress, minimal coding, and minimal error opportunities.
One thing we haven’t mentioned yet is authentication.
Because the frontend won’t be sending an Ajax request, we can’t expect a JWT auth header to be present on the request.
Here we will take the good old cookie auth approach. Cookies are attached automatically to every request that matches the criteria set by the cookie options. More info about this in the frontend implementation part.
By default, cookies arrive as semicolon-separated key-value pairs in a single string. To simplify the parsing part, we will use Fastify’s cookie-parser plugin.
await fastifyServer.register(cookieParser);
Later in the handler method, we simply get the cookie that we are interested in and compare it to the expected value. Here I used only strings as auth-tokens; this should be replaced with some sort of hashing and comparing algorithm.
const cookies = request.cookies;
if (cookies['auth'] !== 'authenticated') {
  throw new APIError(401, 'Unauthorized'); // 401 is the conventional status for failed auth
}
That’s it. We have authentication on top of the file streaming endpoint, and everything is ready to be connected by the frontend.
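To round this out, here is a minimal frontend sketch. The '/download' endpoint name is an assumption standing in for the streaming route above; because the auth travels in a cookie, the browser attaches it automatically and no special headers are needed:
// Frontend sketch: '/download' is a placeholder for the streaming endpoint.
// Navigating to it lets the browser handle the download natively.
window.location.href = '/download';

// Or, to stay on the page, a programmatic anchor click works too;
// Content-Disposition: attachment triggers the save dialog.
const link = document.createElement('a');
link.href = '/download';
link.click();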
I'm trying to update one of my subscribers' status using the Mailchimp API 3.0, Meteor, and javascript.
Here is the js code I'm using:
request({
  uri,
  list_id,
  method: 'PUT',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'apikey (my api key)'
  },
  json,
}, function(err, res, body) {
  if (err) {
    return console.log("err:", err);
  }
  console.log("connection succeed");
  console.log("res: ", res.body);
  console.log("body: ", body);
});
with
uri = "https://us15.api.mailchimp.com/3.0/lists/" + (id of my list) + "/members/" + (md5 of my user mail);
and
json = {
  "email_address": (user mail as a string),
  "status": "unsubscribed"
};
But I always have the same output:
I20181204-18:42:12.714(8)? title: 'Member Exists',
I20181204-18:42:12.714(8)? status: 400,
I20181204-18:42:12.714(8)? detail: '(user mail address) is already a list member. Use PUT to insert or update list members.'
But I am using PUT already... The request works with POST if it's the first time I add the user. But now I can't update my user status...
Is something wrong with my request or with the way I use the API? Thanks in advance.
EDIT 1 -> trying with GET doesn't work. The request itself is correct but it has no effect on my subscriber's status. So I still need to make PUT work.
After looking at the official doc in the "Edit" tab, I found the answer!
The json required another mandatory parameter and should look like this:
json = {
  "email_address": (user mail as a string),
  "status_if_new": "unsubscribed",
  "status": "unsubscribed"
};
I know that this is an older question but I just wanted to add something in the event that it helps someone.
I was having a similar issue intermittently with most of my PUT requests working as expected and some not.
It took me a while but eventually I figured out that some of my email addresses had spaces at the end.
This error would result.
Trimming the addresses before doing anything resolved my issue.
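For example, a sketch of the idea (the md5 helper and variable names are illustrative; per Mailchimp's docs, the subscriber hash in the member URL is the MD5 of the lowercase email address):
// Sketch: normalize the address before hashing it into the member URL.
// The md5 helper and variable names here are illustrative.
const email = rawEmail.trim().toLowerCase(); // stray whitespace breaks the hash
const subscriberHash = md5(email);
const uri = "https://us15.api.mailchimp.com/3.0/lists/" + listId + "/members/" + subscriberHash;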
My goal is to fetch the status data from a UBNT radio (https://www.ubnt.com/) using an HTTP request. The web interface url is formatted as http://192.168.0.120/status.cgi. Making the request requires an authentication cookie. Using a cookie copied from the existing web interface, I am able to successfully retrieve the data.
This is my current code using the Meteor framework.
radioHost = "http://192.168.0.120";

HTTP.call("POST", radioHost + "/login.cgi", {
  headers: {
    "Content-Type": "multipart/form-data"
  },
  data: {
    username: "ubnt",
    password: "ubnt"
  }
}, (err, res) => {
  if (err) return console.log(err);
  var cookie = res.headers["set-cookie"][0];
  HTTP.call("GET", radioHost + "/status.cgi", {
    headers: {
      cookie
    }
  }, (err, res) => {
    if (err) return console.log("Error");
    console.log(res);
  });
});
The above code completes both requests successfully. However, the server responds to the first one with a faulty token (the "set-cookie" string). With the cookie copied from the existing web interface, the response is correct.
Here is a library written in Python that I believe does a similar thing. https://github.com/zmousm/ubnt-nagios-plugins
I believe my problem lies within the HTTP request and the web api not cooperating with the username and password.
Thanks in advance for any help.
A direct POST request to the login URL is not the recommended way. When you open a browser, you don't log in directly; you fetch the page first and then submit the login form.
Not simulating this behavior may break certain sites, depending on how the server works.
So, to simulate what a real user/browser would do, make a GET request first and then the POST.
Also capture any cookies from the first GET request and pass them along on the next one, as in the sketch below.
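A minimal sketch of that sequence using the same Meteor HTTP API as the question (the cookie handling is simplified to the first cookie only, and the radio may set additional session state; treat this as a starting point):
// Sketch: GET the login page first to pick up any session cookie,
// then send the login POST with that cookie attached.
HTTP.call("GET", radioHost + "/login.cgi", {}, (err, res) => {
  if (err) return console.log(err);
  var sessionCookie = res.headers["set-cookie"][0]; // simplified: first cookie only
  HTTP.call("POST", radioHost + "/login.cgi", {
    headers: {
      "Content-Type": "multipart/form-data",
      cookie: sessionCookie
    },
    data: { username: "ubnt", password: "ubnt" }
  }, (err, res) => {
    if (err) return console.log(err);
    // the authenticated cookie for /status.cgi should now be in res.headers["set-cookie"]
  });
});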
I am trying to create a POST call which takes a file (e.g. an img or pdf file) and uploads it to object storage on Bluemix. I was able to authenticate, get the token, and create the auth URL. I just need to pass the uploaded file along with the URL. But I am out of ideas on how to take the file uploaded from Postman and pass it to that URL within the POST call. Below is my code:
app.post('/uploadfile', function(req, res) {
  getAuthToken().then(function(token) {
    if (!token) {
      console.log("error");
    } else {
      var fileName = req.body.file;
      console.log("data", fileName);
      console.log(SOFTLAYER_ID_V3_AUTH_URL, "url");
      var apiUrl = SOFTLAYER_ID_V3_AUTH_URL + config.projectId + '/' + containerName + fileName;
      request({
        url: apiUrl,
        method: 'PUT',
        headers: {
          'X-Auth-Token': token
        }
      }, function(error, response, body) {
        if (!error && response.statusCode == 201) {
          res.send(response.headers);
        } else {
          console.log(error, body);
          res.send(body);
        }
      });
    }
  });
});
Can someone help here?
Since you're using Express, you should use something like:
https://www.npmjs.com/package/express-fileupload
https://github.com/mscdex/connect-busboy
https://github.com/expressjs/multer
https://github.com/andrewrk/connect-multiparty
https://github.com/mscdex/reformed
Without a body parser that handles file uploads you will not be able to get the uploaded file in the Express request handler.
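For instance, a minimal sketch with multer using in-memory storage (the 'file' field name is an assumption; it must match the key used in the Postman form-data body):
// Sketch using multer with in-memory storage; the 'file' field name
// must match the key used in the Postman form-data body.
const multer = require('multer');
const upload = multer({ storage: multer.memoryStorage() });

app.post('/uploadfile', upload.single('file'), function(req, res) {
  // req.file.buffer now holds the uploaded bytes,
  // req.file.originalname the client-side file name
  res.send('received ' + req.file.originalname);
});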
Then, you need to pass the uploaded file to the request that you're making.
For that you should use this module:
https://www.npmjs.com/package/bluemix-object-storage
There is no need to reinvent the wheel when there are tested and easy-to-use solutions available. Especially when you're dealing with sensitive information like API keys and secrets, I would not advise implementing your own solution from scratch unless you really know what you're doing. And if you really know what you're doing, then you don't need to seek advice for things like that.
Here is the official Object Storage SDK for Node.js:
https://github.com/ibm-bluemix-mobile-services/bluemix-objectstorage-serversdk-nodejs
Connect to Object Storage:
var credentials = {
  projectId: 'project-id',
  userId: 'user-id',
  password: 'password',
  region: ObjectStorage.Region.DALLAS
};

var objStorage = new ObjectStorage(credentials);
Create a container:
objStorage.createContainer('container-name')
  .then(function(container) {
    // container - the ObjectStorageContainer that was created
  })
  .catch(function(err) {
    // AuthTokenError if there was a problem refreshing authentication token
    // ServerError if any unexpected status codes were returned from the request
  });
Create a new object or update an existing one:
container.createObject('object-name', data)
  .then(function(object) {
    // object - the ObjectStorageObject that was created
  })
  .catch(function(err) {
    // TimeoutError if the request timed out
    // AuthTokenError if there was a problem refreshing authentication token
    // ServerError if any unexpected status codes were returned from the request
  });
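Putting it together with the multer middleware sketched earlier, a rough sketch of the upload route (the 'my-container' name is a placeholder, and getContainer is the container lookup helper from the SDK's README):
// Rough sketch: receive the file via the multer middleware from earlier,
// then hand its buffer to the SDK. 'my-container' is a placeholder.
app.post('/uploadfile', upload.single('file'), function(req, res) {
  objStorage.getContainer('my-container')
    .then(function(container) {
      return container.createObject(req.file.originalname, req.file.buffer);
    })
    .then(function(object) {
      res.send('uploaded ' + req.file.originalname);
    })
    .catch(function(err) {
      console.log(err);
      res.status(500).send('upload failed');
    });
});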
I might be out of my depth but I really need something to work. I think a write/read stream will solve both my issues, but I don't quite understand the syntax or what's required for it to work.
I read the stream handbook and thought I understood some of the basics, but when I try to apply it to my situation it seems to break down.
Currently I have this as the crux of my code:
function readDataTop(x) {
  console.log("Read " + x[6] + " and Sent Cached Top Half");
  jf.readFile("loadedreports/top" + x[6], 'utf8', function(err, data) {
    resT = data;
  });
}
I'm using the jsonfile plugin for Node, which basically shortens fs.writeFile and fs.readFile and saves me from constantly writing try/catch blocks.
Anyways, I want to implement a stream here, but I am unsure what would happen on my Express end and how the object would be received.
I assume that since it's a stream, Express won't do anything with the object until it has been fully received? Or would I have to write a callback to make sure the stream is complete before Express sends the object off to fulfill the ajax request?
app.get('/:report/top', function(req, res) {
  readDataTop(global[req.params.report]);
  res.header("Content-Type", "application/json; charset=utf-8");
  res.header("Cache-Control", "max-age=3600");
  res.json(resT);
  resT = 0;
});
I am hoping that if I change the read part to a stream it will alleviate two problems. The first is sometimes receiving partial json files when the browser makes the ajax call, due to the read speed of larger json objects. (This might be the callback issue I need to solve, but a stream should make it more consistent.)
Secondly, when I load this node app it needs to run 30+ file writes while it gets the data from my DB. The goal was to disconnect the browser from the db side so node acts as the db by reading and writing. This is because an old SQL server is already being bombarded by a lot of requests (stale data isn't an issue).
Any help on the syntax here?
Is there a tutorial I can see in code of someone piping a response into a write stream? (The mssql node module I use puts the SQL response into an object and I need it in JSON format.)
function getDataTop(x) {
  var connection = new sql.Connection(config, function(err) {
    var request = new sql.Request(connection);
    request.query(x[0], function(err, topres) {
      jf.writeFile("loadedreports/top" + x[6], topres, function(err) {
        if (err) {
          console.log(err);
        } else {
          console.log(x[6] + " top half was saved!");
        }
      });
    });
  });
}
Your problem is that you're not waiting for the file to load before sending the response. Use a callback:
function readDataTop(x, cb) {
  console.log('Read ' + x[6] + ' and Sent Cached Top Half');
  jf.readFile('loadedreports/top' + x[6], 'utf8', cb);
}

// ...

app.get('/:report/top', function(req, res) {
  // you should really avoid using globals like this ...
  readDataTop(global[req.params.report], function(err, obj) {
    // setting the content-type is automatically done by `res.json()`
    // cache the data here in-memory if you need to and check for its existence
    // before `readDataTop`
    res.header('Cache-Control', 'max-age=3600');
    res.json(obj);
  });
});
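If you do later want the stream-based approach the question asks about, here is a rough sketch that pipes the cached report file straight into the response instead of buffering it in memory first. It assumes the file written by getDataTop is already valid JSON, and it simplifies the global[...] lookup by using the report name directly as the file suffix:
// Rough sketch: stream the cached report file straight to the response.
// Assumes the file written by getDataTop is valid JSON; using
// req.params.report as the suffix simplifies the x[6] lookup above.
const fs = require('fs');

app.get('/:report/top', function(req, res) {
  res.header('Content-Type', 'application/json; charset=utf-8');
  res.header('Cache-Control', 'max-age=3600');
  fs.createReadStream('loadedreports/top' + req.params.report)
    .on('error', function() { res.status(500).end(); })
    .pipe(res);
});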