How to use a Node.js output in a JavaScript file - javascript

I would like to use the output of my Node.js script. This is my code:
var fs = require('fs'); // File System
var rutaImagen = 'C:/Users/smontesc/Desktop/imagenes1/'; // Location of images

fs.readdir(rutaImagen, function(err, files) {
    if (err) { throw err; }
    var imageFile = getNewestFile(files, rutaImagen);
    // process imageFile here or pass it to a function...
    console.log(imageFile);
});

function getNewestFile(files, path) {
    var out = [];
    files.forEach(function(file) {
        var stats = fs.statSync(path + "/" + file);
        if (stats.isFile()) {
            out.push({"file": file, "mtime": stats.mtime.getTime()});
        }
    });
    out.sort(function(a, b) {
        return b.mtime - a.mtime;
    });
    return (out.length > 0) ? out[0].file : "";
}
The result is printed by console.log(imageFile). I want to use that result in my JavaScript project, like:
<script>
document.write(imageFile)
</script>
All this is to get the newest file created in a directory, because I can't do that directly in client-side JS.
Thanks a lot

First, there are several fundamental things about how the client/server relationship between the browser and a web server works that we need to establish. That will then offer a framework for discussing how to solve your problem.
Images are displayed in a browser, not with document.write(), but by inserting an image tag in your document that points to the URL of a specific image.
For a web page to get some result from the server, that result either has to be embedded in the web page when it is originally fetched from the server, or the Javascript in the web page has to request the information from the server with an Ajax request. An Ajax request is an http request that the Javascript in your web page forms and sends to your server; your server receives that request and sends back a response, which the Javascript in your web page receives and can then do something with.
To implement something where your web page requests some data from your back-end, you will have to have a web server in your back-end that can respond to Ajax requests sent from the web page. You cannot just run a script on your server and magically modify a web page displayed in a browser. Without the type of structure described in the previous points, your web page has no connection at all to your server. The web page can't directly reach your server's file system, and the server can't directly touch the displayed web page.
There are a number of possible schemes for implementing this type of connection. What I think would work best would be to define an image URL that, when requested by any browser, returns the newest image in your particular directory on your server. Then, you would just embed that particular URL in your web page, and any time that image was refreshed or displayed, your server would send the newest version of that image. Your server probably also needs to make sure that the browser does not cache that URL, by setting appropriate cache headers, so that it won't mistakenly display a previously cached version of that image.
The web page could look like this:
<img src='http://mycustomdomain.com/dimages/newest'>
Then, you'd set up a web server at mycustomdomain.com that is publicly accessible (from the open internet - you choose your own domain obviously) that has access to the desired images and you'd create a route on that web server that answers to the /dimages/newest request.
Using Express as your web server framework, this could look like this:
const app = require('express')();
const fs = require('fs');
const util = require('util');
const readdir = util.promisify(fs.readdir);
const stat = util.promisify(fs.stat);

// middleware to use in routes that you don't want any caching on
function nocache(req, res, next) {
    res.header('Cache-Control', 'private, no-cache, no-store, must-revalidate, proxy-revalidate');
    res.header('Expires', '-1');
    res.header('Pragma', 'no-cache');
    next();
}

const rutaImagen = 'C:/Users/smontesc/Desktop/imagenes1/'; // Location of images

// function to find the newest image
// returns a promise that resolves with the full path of the image
// or rejects with an error
async function getNewestImage(root) {
    let files = await readdir(root);
    let results = [];
    for (const f of files) {
        const fullPath = root + "/" + f;
        const stats = await stat(fullPath);
        if (stats.isFile()) {
            results.push({file: fullPath, mtime: stats.mtime.getTime()});
        }
    }
    results.sort(function(a, b) {
        return b.mtime - a.mtime;
    });
    return (results.length > 0) ? results[0].file : "";
}

// route for fetching that image (the path comes first, then the nocache middleware)
app.get('/dimages/newest', nocache, function(req, res) {
    getNewestImage(rutaImagen).then(img => {
        res.sendFile(img, {cacheControl: false});
    }).catch(err => {
        console.log('getNewestImage() error', err);
        res.sendStatus(500);
    });
});

// start your web server
app.listen(80);

To be able to use that result in your JavaScript project, we definitely have to create an API with a particular route that responds with the imageFile. Then, in your JavaScript project, you can use XMLHttpRequest (XHR) objects or the Fetch API to talk to the server and get the result.
The core idea is that we definitely need both server-side and client-side programming to implement that functionality.
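As a rough illustration of the client side (a sketch only; the /api/newest-image route, its JSON shape and the /images/ path are assumptions, not part of the answers above), the page could ask the server for the newest file name and then insert an image tag:
<script>
  // Ask the server for the newest image name, then show it in an <img> tag
  fetch('/api/newest-image')            // hypothetical route that returns JSON like {"file": "foo.jpg"}
    .then(response => response.json())
    .then(data => {
      const img = document.createElement('img');
      img.src = '/images/' + data.file; // assumes the server also serves the image files from /images/
      document.body.appendChild(img);
    })
    .catch(err => console.error('Could not load newest image', err));
</script>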

Related

May I have a res.json and res.sendFile together in the same api endpoint in express?

I want to upload the content of an Excel file to the server in order to get its data and do some stuff...
I came up with the following code, however it seems like it is not working properly, as the following error displays in the console: Error: Can't set headers after they are sent.
The file is getting uploaded into the folder and the JSON message is being displayed... However, I do not know if I am going to face any issue in the future...
Actually I just need the Excel data, no need for the Excel file being uploaded... Maybe you could give me a workaround, guys...
const router = express.Router();

const storage = multer.diskStorage({
    destination(req, file, cb) {
        cb(null, 'uploads/');
    },
    filename(req, file, cb) {
        cb(
            null,
            `${file.fieldname}-${Date.now()}${path
                .extname(file.originalname)
                .toLowerCase()}`
        );
    },
});

const excelFilter = (req, file, cb) => {
    if (
        file.mimetype.includes('excel') ||
        file.mimetype.includes('spreadsheetml')
    ) {
        cb(null, true);
    } else {
        cb('Please upload only excel file.', false);
    }
};

const upload = multer({
    storage,
    fileFilter: excelFilter,
});

router.post('/', upload.single('file'), (req, res) => {
    var workbook = XLSX.readFile(req.file.path);
    var sheet_name_list = workbook.SheetNames;
    var xlData = XLSX.utils.sheet_to_json(workbook.Sheets[sheet_name_list[0]]);
    res.json(xlData).sendFile(`/${req.file.path}`, { root: path.resolve() });
});
May I have a res.json and res.sendFile together in the same api endpoint in express?
No, you cannot. Each of those methods sends a complete http response (including calling res.end(), which terminates the http request), and you can only send one http response for each incoming request. The particular error you're getting has to do with res.sendFile() trying to configure the response it's getting ready to send and finding that the http response object has already been used to send a response and can't be used again.
Ordinarily, if you wanted to send two different pieces of data, you would just combine them into a single Javascript object and call res.json() on the object that contains both pieces of data.
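For illustration, since the question says only the parsed Excel data is needed, a sketch of the route sending a single JSON response could look like this (it reuses the multer/XLSX setup from the question; the fs.unlink cleanup is an assumption, not part of the original code, and assumes fs is required):
router.post('/', upload.single('file'), (req, res) => {
    // Parse the uploaded workbook and send only its data as JSON
    const workbook = XLSX.readFile(req.file.path);
    const firstSheet = workbook.SheetNames[0];
    const xlData = XLSX.utils.sheet_to_json(workbook.Sheets[firstSheet]);

    // One single response per request
    res.json({ filename: req.file.filename, data: xlData });

    // Optional: remove the temporary upload since only the data is needed (assumption)
    fs.unlink(req.file.path, err => {
        if (err) console.error('could not delete temp upload', err);
    });
});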
But sending a binary file is not something you can easily put in a JSON package. You could construct a multipart response where one part was the JSON and one part was the file, or you could JSON-encode the binary data (though that's inefficient). I presume there are modules that would help you do that, but for most clients, that isn't what they are really expecting or equipped to handle.
The only way to get to a proper solution is for us to understand what client/server workflow you're trying to implement here and why you're trying to send back the same file that was just uploaded. There would normally not be a reason to do that, since the client already has that data (they just uploaded it).

node.js internal message server

I am pretty new to node.js. I am working on an app able to display NFC content on a webpage. I am using the nfc-pcsc package (https://github.com/pokusew/nfc-pcsc), and I can easily read data on the server side. Now I would just like to display the data in the webpage, but I am stuck on the logic. Here is a part of my server code:
// ### launch server for client
var http = require('http');
var html = require('fs').readFileSync(__dirname + '/custom.html');
var server = http.createServer(function(req, res) {
    res.end(html);
});
server.listen(3000, '127.0.0.1');
console.log('Server running at http://127.0.0.1:3000/');

// ### launch NFC routine ###
const { NFC } = require('nfc-pcsc');
const nfc = new NFC(); // const nfc = new NFC(minilogger); // optionally pass a logger to see internal debug logs
let readers = [];

nfc.on('reader', async reader => {
    pretty.info(`device attached`, { reader: reader.name });
    // the event is correctly displayed in the console. How to update the html here?
    readers.push(reader);
});

nfc.on('error', err => {
    pretty.error(`an error occurred`, err);
});
It seems to me that I need a res object to update the html page, but since I do not get any request from the client, how do I update the page based only on the callback from the NFC module reader? I hope my question is clear.
Thanks,
Matt
I suggest you use the Express API.
Install it with the npm CLI: npm install --save express, run from your project's root folder in your terminal.
Then, you will be able to create routes for GET, POST, PUT or DELETE.
Next, on the client side you will be able to call such a route with a GET, POST or whatever, using a promise, an Ajax request, whatever you want :)
Just understand that in order to receive or send data to your server, you need a URL, and with Express you can create your own URLs.
https://www.npmjs.com/package/express
Don't hesitate to have a look at this API, and I'm pretty sure you will find the answer to your question on your own :)
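As a rough sketch of that idea (not taken from the answer above; the route name, the lastTag variable and the use of polling are assumptions), the server could remember the last value read by the NFC callback and expose it on an Express route that the page polls:
const express = require('express');
const app = express();

let lastTag = null; // update this inside the nfc 'reader' callback, e.g. lastTag = data;

// route the browser can poll to get the most recent NFC read
app.get('/nfc/latest', (req, res) => {
    res.json({ tag: lastTag });
});

app.listen(3000, '127.0.0.1');

// client side (in custom.html): poll the route every couple of seconds
// setInterval(() => {
//     fetch('/nfc/latest')
//         .then(r => r.json())
//         .then(data => { document.getElementById('nfc').textContent = data.tag || 'no tag yet'; });
// }, 2000);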

Node.js Pipe a PDF API Response

So my scenario is: a user clicks a button on a web app, this triggers a server-side POST request to an internal (i.e. non-public) API sitting on another server in the same network, which should return a PDF to my server, which will proxy (pipe) it back to the user.
I want to just proxy the PDF body content directly to the client without creating a tmp file.
I have this code which works using the npm request module but it does not feel right:
var pdfRequest = request(requestOptions);

pdfRequest.on('error', function (err) {
    utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: ' + err, res);
});

pdfRequest.on('response', function (resp) {
    if (resp.statusCode === 200) {
        pdfRequest.pipe(res);
    } else {
        utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: RAW RESP: ' + JSON.stringify(resp), res);
    }
});
Is this the correct way to pipe the PDF response?
Notes:
I need to check the status code to conditionally handle errors; the payload for the POST is contained in requestOptions (I know this part is all correct).
I would like to keep using the request module.
I definitely do not want to be creating any temp files.
If possible I would also like to modify the Content-Disposition header to set a custom filename; I know how to do this without using pipes.
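For what it's worth, setting that header on the proxied response before piping might look like the sketch below (the filename is made up; this is an illustration of the approach, not a confirmed answer to the question):
pdfRequest.on('response', function (resp) {
    if (resp.statusCode === 200) {
        // set the download filename on our own response before piping the PDF body through
        res.setHeader('Content-Type', 'application/pdf');
        res.setHeader('Content-Disposition', 'attachment; filename="report.pdf"'); // example filename
        pdfRequest.pipe(res);
    } else {
        utils.sendErrorResponse(500, 'PROBLEM PIPING PDF DOWNLOAD: RAW RESP: ' + JSON.stringify(resp), res);
    }
});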

express node server return a value to async client call

Sorry, I tend to be a bad writer when I have not fully woken up, let me revise.
I am using expressjs with passportjs (local strategy) to manage my server and using connect-busboy to manage file uploading. I do not think passport will play a role in this.
Here is the server code for managing file uploads:
app.post('/upload', isLoggedIn, (req, res) => {
    if (req.busboy) {
        req.pipe(req.busboy);
        req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
            if (mimetype.match(/^image\//)) {
                var root = path.join(__dirname, "../public/images/");
                if (fs.existsSync(path.join(root, filename))) {
                    var name = getUnique(path.join(root, filename));
                } else {
                    var name = filename;
                }
                var ws = fs.createWriteStream(path.join(root, name), { flags: "a" });
                file.pipe(ws);
            }
        });
    }
});
As for my client page, it is used to change a JSON object which will get re-uploaded to the server as a configuration tool. When I upload a new image asynchronously, I need to get its filename to update this JSON object while working on it. For uploading from the client's end I am using dropzonejs, which did not require any configuration on my part to work.
So, in summary I upload a number of images via dropzone asynchronously, busboy and fs on my server save the file, and I would like to get the filename returned to my javascript to modify the existing JSON object.
Edit solution:
Thanks to Elliot Blackburn for pointing me in the right direction.
I added:
ws.on('close', () => {
    res.send({filename: name});
});
after file.pipe(ws); to send the response back to the client. On the client side, modify dropzone to handle the response like so:
dropzone.on('success', (file, res) => {
    console.log(res);
});
Just send it in the normal http response. It'll depend on which library you're using, but most will allow you to trigger a normal (req, res, next) Express call. From that you can access the file object and return anything you want.
Something like:
res.send({filename: name}); // name is the filename variable set earlier in the code
Once you've finished editing the file and such, you can get the name, put it into that returned object, and your client will receive that object as the response, which it can act upon.
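Putting the pieces together, a sketch of the upload route with the response wired in (based on the question's code and the edit above; isLoggedIn and getUnique come from the original snippet) might look like:
app.post('/upload', isLoggedIn, (req, res) => {
    if (req.busboy) {
        req.pipe(req.busboy);
        req.busboy.on('file', (fieldname, file, filename, encoding, mimetype) => {
            if (mimetype.match(/^image\//)) {
                const root = path.join(__dirname, "../public/images/");
                const name = fs.existsSync(path.join(root, filename))
                    ? getUnique(path.join(root, filename))
                    : filename;
                const ws = fs.createWriteStream(path.join(root, name), { flags: "a" });
                file.pipe(ws);
                // once the write stream has flushed everything, report the final name back to dropzone
                ws.on('close', () => {
                    res.send({ filename: name });
                });
            }
        });
    }
});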

NodeJS - How to get cookies from server response

I want to use Node.js as a tool for website scraping. I have already implemented a script which logs me in to the system and parses some data from the page.
The steps are defined like:
Open login page
Enter login data
Submit login form
Go to desired page
Grab and parse values from the page
Save data to file
Exit
Obviously, the problem is that my script has to log in every time, and I want to eliminate that. I want to implement some kind of cookie management system, where I can save cookies to a .txt file, and then during the next request I can load the cookies from the file and send them in the request headers.
This kind of cookie management system is not hard to implement, but the problem is how to access the cookies in Node.js. The only way I found is using the request module's response object, where you can do something like this:
request.get({
    headers: requestHeaders,
    uri: user.getLoginUrl(),
    followRedirect: true,
    jar: jar,
    maxRedirects: 10,
}, function(err, res, body) {
    if (err) {
        console.log('GET request failed here is error');
        console.log(res);
    }
    // Get cookies from response
    var responseCookies = res.headers['set-cookie'];
    var requestCookies = '';
    for (var i = 0; i < responseCookies.length; i++) {
        var oneCookie = responseCookies[i];
        oneCookie = oneCookie.split(';');
        requestCookies = requestCookies + oneCookie[0] + ';';
    }
});
Now the content of the variable requestCookies can be saved to a .txt file and loaded the next time the script is executed; this way you can avoid logging the user in every time the script runs.
Is this the right way, or is there a method which returns cookies?
NOTE: If you want to set up your request object to automatically resend received cookies on every subsequent request, use the following lines during object creation:
var request = require("request");
request = request.defaults({jar: true}); // Send cookies on every subsequent request
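As a sketch of the file-based approach described above (the cookies.txt filename is an illustrative assumption; requestCookies and requestHeaders are the variables from the snippet), persisting and reloading the cookies could look like:
var fs = require('fs');
var cookieFile = 'cookies.txt'; // illustrative filename

// after a successful login, save the cookie string built above
fs.writeFileSync(cookieFile, requestCookies, 'utf8');

// on the next run, reuse it instead of logging in again
if (fs.existsSync(cookieFile)) {
    requestHeaders['Cookie'] = fs.readFileSync(cookieFile, 'utf8');
}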
In my case, I've used the 'http' library like the following:
http.get(url, function(response) {
    variable = response.headers['set-cookie'];
});
This function gets a specific cookie value from a server response (in TypeScript):
function getResponseCookieValue(res: Response, param: string) {
    const setCookieHeader = res.headers.get('Set-Cookie');
    const parts = setCookieHeader?.match(new RegExp(`(^|, )${param}=([^;]+); `));
    const value = parts ? parts[2] : undefined;
    return value;
}
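A possible usage sketch (the URL, form variable and cookie name are placeholders; this assumes a fetch Response where Set-Cookie is actually readable, e.g. a server-side fetch, since browsers hide Set-Cookie from scripts):
// inside an async function
const res = await fetch('https://example.com/login', { method: 'POST', body: loginForm });
const sessionId = getResponseCookieValue(res, 'sessionid');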
I use Axios personally.
axios.request(options).then(function (response) {
    console.log(response.config.headers.Cookie);
}).catch(function (error) {
    console.error(error);
});
