I am creating a Discord bot (irrelevant) that sends images into the chat. The user can type the name of the image without needing to type the file extension. The problem is that the bot doesn't know what the file extension is, so it will crash if the picture is a .jpg and the program was expecting a .png. Is there a way to make the program open the file without requiring a file extension?
let image = imageName;
message.channel.send({ files: [`media/stickers/${imageName}.png`] });
Unfortunately, the extension of the filename is required. After all, file.mp4 and file.mp3 are entirely different things.
However, you can use a try/catch and a for loop to find the correct file!
I would suggest:
let extensions = [".png", ".jpg", ".gif"]; // All the extensions you can think of
for (const extension of extensions) {
    try {
        // send() returns a Promise, so it must be awaited (inside an
        // async function) or the catch below will never see the failure
        await message.channel.send({ files: [`media/stickers/${imageName}${extension}`] });
        break; // file found and sent successfully
    } catch (error) {
        // do nothing; loop again and try the next extension
    }
}
I haven't tried that before, and I am a Python programmer. But I hope you get the idea.
Using fs, specifically the promise-based version of fs, makes this quite simple:
import { readdir } from 'fs/promises';
const getFullname = async (path, target) =>
(await readdir(path))
.find(file =>
file === target || file.split('.').slice(0,-1).join('.') === target
);
try {
const actualName = await getFullname('media/stickers', imageName);
if (!actualName) {
throw new Error(`File ${imageName} not found`);
}
message.channel.send({ files: [`media/stickers/${actualName}`] });
} catch(error) {
// handle your errors here
}
You can pass in the name with or without the extension and it will be found. Note that the match is case-sensitive, so XYZ won't match xyz.jpg; that's easily changed if you need case insensitivity.
There are only a few common image extensions, like jpg, jpeg, png and gif. Maybe try to fetch the file with the best-guess extension and, if that throws an exception, try the next format.
I have a process where a client uploads a document. This document can only be a PDF, JPG or PNG file, and it must be re-uploaded once a year (it is an errors and omissions insurance policy).
I am saving this file in a container.
For deleting files from anywhere at the application, I have this function (Node):
deleteFromBlob = async function (account, accountKey, containerName, blobFolder, blobName) {
try {
const {
BlobServiceClient,
StorageSharedKeyCredential
} = require("@azure/storage-blob");
const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);
const blobServiceClient = new BlobServiceClient(
`https://${account}.blob.core.windows.net`,
sharedKeyCredential
);
const containerClient = blobServiceClient.getContainerClient(containerName);
const blockBlobClient = containerClient.getBlockBlobClient(blobFolder + '/' + blobName);
await blockBlobClient.deleteIfExists();
return true;
}
catch(e) {
return false
}
}
And this works perfectly when I know the name and extension of the file I want to delete, like "2448.pdf":
let deleteFile = await utils.deleteFromBlob(account, accountKey, "agents", "/eopolicies/", userData.agentid.toString() + ".pdf" )
But the problem I'm facing is that the function above deletes a file I already know exists; for example, if the agent ID is 2448 and he uploads "policy.pdf", I save it as "2448.pdf" for easy file identification.
The problem is if the agent uploaded a .PNG last year, a .DOC the year before, and a .PDF now. In that case, I want to delete 2448.* and keep only the latest version of the document.
So I tried changing my function to
let deleteFile = await utils.deleteFromBlob(account, accountKey, "agents", "/eopolicies/", userData.agentid.toString() + ".*" )
And of course it is not working...
I tried to find a solution, and all I found was one that lists the contents of a folder, then loops over it and deletes the specific file; but that won't work for me, since there are 37,000 EO policies in that folder.
Is there a way to delete files with a specific name, and whatever extension?
Thanks.
I've never tried using a wildcard on the extension side of the file name. However, I would iterate through the files in the directory and find the one that contains the specific string you are looking for, then delete it from there.
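For what it's worth, @azure/storage-blob can do the narrowing server-side: `listBlobsFlat` accepts a `prefix` option, so listing with the prefix `eopolicies/2448.` only returns that agent's blobs rather than all 37,000. A sketch, untested against a live storage account (`blobPrefix` and `deleteAllVersions` are my naming; the folder/ID shapes are taken from the question):

```javascript
// Build the listing prefix "<folder>/<agentId>." so only blobs named
// like 2448.pdf, 2448.png, ... are returned by the service.
function blobPrefix(blobFolder, agentId) {
  // tolerate leading/trailing slashes like the "/eopolicies/" in the question
  const folder = String(blobFolder).replace(/^\/+|\/+$/g, '');
  return `${folder}/${agentId}.`;
}

// Delete every version of the agent's document, whatever its extension.
async function deleteAllVersions(account, accountKey, containerName, blobFolder, agentId) {
  const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");
  const credential = new StorageSharedKeyCredential(account, accountKey);
  const service = new BlobServiceClient(`https://${account}.blob.core.windows.net`, credential);
  const container = service.getContainerClient(containerName);
  for await (const blob of container.listBlobsFlat({ prefix: blobPrefix(blobFolder, agentId) })) {
    await container.getBlockBlobClient(blob.name).deleteIfExists();
  }
}
```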
Could someone tell me how I can create a file using JS in the browser? Is there a way to do that without Node.js? I was trying to create a file using the Node.js fs module, but got "fs.createWriteStream is not a function".
You can use the File System Access API.
For example, according to MDN:
// store a reference to our file handle
let fileHandle;
async function getFile() {
// open file picker
[fileHandle] = await window.showOpenFilePicker();
if (fileHandle.kind === 'file') {
// run file code
} else if (fileHandle.kind === 'directory') {
// run directory code
}
}
Here is the useful link.
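For actually creating (saving) a file, the same API offers `window.showSaveFilePicker`. A minimal sketch (browser-only, must run in response to a user gesture, and only supported in some browsers such as Chromium-based ones; `saveFile` is my naming):

```javascript
async function saveFile(contents) {
  // ask the user where to save the file
  const handle = await window.showSaveFilePicker({
    suggestedName: 'untitled.txt',
  });
  // get a writable stream, write the contents, and close to commit to disk
  const writable = await handle.createWritable();
  await writable.write(contents);
  await writable.close();
}
```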
I have an API that returns data in the format of
{ fileName: string, blob: Blob }[]
I want to print all these files, so I am using
_files.forEach((_fileInfo) => {
const blobUrl = URL.createObjectURL(_fileInfo.blob);
const oWindow = window.open(blobUrl, "print");
oWindow.print();
oWindow.close();
});
This opens multiple print windows, but the preview shows blank documents.
But when I download all these files as a zip, it downloads the correct PDF files.
// add files to zip
files.forEach((_fileInfo) => {
zip.file(_fileInfo.fileName, _fileInfo.blob);
});
// download and save
return zip.generateAsync({ type: "blob" }).then((content) => {
if (content) {
return saveAs(content, name);
}
});
What could be the issue?
Is there any way to print all documents in sequence without opening multiple windows?
The PDF file takes time to load; that's why it was showing a blank document. So I used print-js to solve this issue.
how to import
import * as printJS from "print-js";
how to use
const blobUrl = URL.createObjectURL(_fileInfo.blob);
printJS(blobUrl);
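To print the files one after another rather than all at once, print-js also accepts a configuration object with an `onPrintDialogClose` callback, which can kick off the next file once the current dialog closes. A browser-only sketch I haven't verified in every browser (`printAll` is my naming):

```javascript
import printJS from "print-js";

// Print the blobs sequentially: each file starts only after the
// previous print dialog has been closed.
function printAll(files) {
  if (files.length === 0) return;
  const [first, ...rest] = files;
  const blobUrl = URL.createObjectURL(first.blob);
  printJS({
    printable: blobUrl,
    type: "pdf",
    onPrintDialogClose: () => {
      URL.revokeObjectURL(blobUrl); // free the object URL we created
      printAll(rest);               // move on to the next file
    },
  });
}
```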
In case you don't want to use printJS:
I was facing the same issue, but in my case I only had to deal with one file at a time. I solved it by putting the print function in a timeout block, so in your case this would be
setTimeout(function(){
oWindow.print();
}, 1000)
If this works, great; if not, then you might also need to set the source of the window to the blobUrl you created. This should be done after you come out of the loop.
I have a local JSON file which I intend to read/write from a Node.js Electron app. I am not sure, but I believe that instead of using readFile() and writeFile(), I should get a FileHandle to avoid repeated open and close actions.
So I've tried to grab a FileHandle from fs.promises.open(), but the problem seems to be that I am unable to get a FileHandle for an existing file without truncating it to length 0.
const { resolve } = require('path');
const fsPromises = require('fs').promises;
function init() {
// Save table name
this.path = resolve(__dirname, '..', 'data', `test.json`);
// Create/Open the json file
fsPromises
.open(this.path, 'wx+')
.then(fileHandle => {
// Grab the file handle only if the file doesn't already exist,
// because of the 'wx+' flag
this.fh = fileHandle;
})
.catch(err => {
if (err.code === 'EEXIST') {
// File exists
}
});
}
Am I doing something wrong? Are there better ways to do it?
Links:
https://nodejs.org/api/fs.html#fs_fspromises_open_path_flags_mode
https://nodejs.org/api/fs.html#fs_file_system_flags
Because JSON is a text format that has to be read or written all at once and can't easily be modified or appended to in place, you're going to have to read the whole file or write the whole file at once.
So, your simplest option will be to just use fs.promises.readFile() and fs.promises.writeFile() and let the library open the file, read/write it and close the file. Opening and closing a file in a modern OS takes advantage of disk caching, so if you're reopening a file you opened not long ago, it's not going to be a slow operation. Further, since nodejs performs these operations in secondary threads in libuv, it doesn't block the main thread of nodejs either, so it's generally not a performance issue for your server.
If you really wanted to open the file once and hold it open, you would open it for reading and writing using the r+ flag as in:
const fileHandle = await fsPromises.open(this.path, 'r+');
Reading the whole file would be simple as the new fileHandle object has a .readFile() method.
const text = await fileHandle.readFile({ encoding: 'utf8' });
For writing the whole file from an open filehandle, you would have to truncate the file, then write your bytes, then flush the write buffer to ensure the last bit of the data got to the disk and isn't sitting in a buffer.
await fileHandle.truncate(0); // clear previous contents
let {bytesWritten} = await fileHandle.write(mybuffer, 0, someLength, 0); // write new data
assert(bytesWritten === someLength);
await fileHandle.sync(); // flush buffering to disk
I read that createReadStream doesn't put the whole file into memory; instead it works with chunks. However, I have a situation where I am simultaneously writing to and reading from a file. The write finishes first, then I delete the file from disk. Somehow, the read stream was able to finish reading the whole file without any error.
Does anyone have an explanation for this? Am I wrong to think that streams don't load the file into memory?
Here's the code for writing to a file
const fs = require('fs');
const file = fs.createWriteStream('./bigFile4.txt');
function write(stream,data) {
if(!stream.write(data))
return new Promise(resolve=>stream.once('drain',resolve));
return true;
}
(async() => {
for(let i=0; i<1e6; i++) {
const res = write(file,'a')
if(res instanceof Promise)
await res;
}
write(file,'success');
})();
For Reading I used this,
const file = fs.createReadStream('bigFile4.txt')
file.on('data',(chunk)=>{
console.log(chunk.toString())
})
file.on('end',()=>{
console.log('done')
})
At least on UNIX-type OSes, if you open a file and then remove it, the file data will still be available to read until you close the file.