I need to generate smaller previews of the image files that will be uploaded, and I have to append "_preview" to the end of each file name.
Currently I'm doing this:
uploadFile.map((file) => {
  if (file.type.includes('image')) {
    console.log('Generating thumbnail for ' + file.name)
    const fileName = file.name.split('.').slice(0, -1).join('.')
    const fileExtension = file.name.split('.').pop()
    const compressedFile = new File(
      [file.slice(0, file.size, file.type)],
      fileName + '_preview.' + fileExtension,
    )
    console.log('Generated file:', compressedFile)
    convert({
      file: compressedFile,
      width: 300,
      height: 300,
      type: fileExtension,
    })
      .then((resp) => {
        uploadFile.push(resp)
      })
      .catch((error) => {
        // Error
        console.error('Error compressing ', file.name, '-', error)
      })
  }
})
The problem is that "compressedFile" is missing some fields that were present in the original file, so the convert function throws the error "File type not supported". As you can see, "type" and "webkitRelativePath" are not copied.
Can anybody suggest how I can retain all the information from the original file and just append "_preview" to the end of the file name?
I realized the File API provides an option to pass an "options" object as well, which can specify the file type. For instance:
const file = new File(["foo"], "foo.txt", {
  type: "text/plain",
});
Source: https://developer.mozilla.org/en-US/docs/Web/API/File/File
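Putting that together with the code above, a minimal sketch for the preview case would pass the original file's type (and lastModified) through the options argument; note that webkitRelativePath is read-only and, as far as I know, cannot be set through the constructor:

const compressedFile = new File(
  [file.slice(0, file.size, file.type)],
  fileName + '_preview.' + fileExtension,
  {
    // copy the metadata that the constructor allows us to set
    type: file.type,
    lastModified: file.lastModified,
  },
)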
To copy or duplicate a file in JavaScript (Node.js), you can use this code:
// copyfile.js
const fs = require('fs');

// destination will be created or overwritten by default.
// Note: backslashes in Windows paths must be escaped in string literals.
fs.copyFile('C:\\folderA\\myfile.txt', 'C:\\folderB\\myfile.txt', (err) => {
  if (err) throw err;
  console.log('File was copied to destination');
});
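If you prefer promises, a roughly equivalent sketch using fs.promises (available in Node 10+) would be:

const fsp = require('fs').promises;

async function duplicateFile(src, dest) {
  // dest will be created or overwritten by default
  await fsp.copyFile(src, dest);
  console.log('File was copied to destination');
}

duplicateFile('C:\\folderA\\myfile.txt', 'C:\\folderB\\myfile.txt')
  .catch((err) => console.error(err));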
TL;DR: Can I get the filename from a readable stream?
I have a method that downloads a file from Google Drive and saves it to a local directory. Since I'm requesting the response as a stream, I would like to save the file with the same name it has on Drive, without explicitly asking the user.
Do I have to make one extra request beforehand to get the filename from Drive? Is that the only way?
async fetchFile(fileID) {
  // Get file as stream
  const { data: gdriveFile } = await this.drive.files.get(
    {
      fileId: fileID,
      alt: "media"
    },
    {
      responseType: "stream"
    }
  )
  // FIXME: get filename from gdrive
  console.log("Saving File: " + JSON.stringify(gdriveFile))
  // fs writeStream
  // const destStream = fs.createWriteStream(
  //   path.resolve(destDir + "/" + filename || "bleh.log")
  // )
  // pipeline(gdriveFile, destStream).end(() => {
  //   return "File copied"
  // })
}
Thanks
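For reference, the extra metadata request mentioned in the question would look roughly like this with the googleapis client used above (a sketch, not tested against your setup):

async fetchFileWithName(fileID) {
  // separate request for the file's name only
  const { data: meta } = await this.drive.files.get({
    fileId: fileID,
    fields: "name"
  })
  // then request the content as a stream, as before
  const { data: gdriveFile } = await this.drive.files.get(
    { fileId: fileID, alt: "media" },
    { responseType: "stream" }
  )
  console.log("Saving File: " + meta.name)
  return { name: meta.name, stream: gdriveFile }
}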
I have the following setup, by which I send the image, from its URL, to be edited and sent back to be uploaded to S3. The problem I currently have is that the image ends up corrupted on S3, and I am wondering if there's trouble in my code that's causing the issue.
Server side:
function convertImage(inputStream) {
  return gm(inputStream)
    .contrast(-2)
    .stream();
}

app.get('/resize/:imgDetails', (req, res, next) => {
  let params = req.params.imgDetails.split('&');
  let fileName = params[0]; console.log(fileName);
  let tileType = params[1]; console.log(tileType);

  res.set('Content-Type', 'image/jpeg');

  let url = `https://${process.env.Bucket}.s3.amazonaws.com/images/${tileType}/${fileName}`;

  convertImage(request.get(url)).pipe(res);
})
Client side:
axios.get('/resize/' + fileName + '&' + tileType)
  .then(res => {
    /** PUT FILE ON AWS **/
    var img = res;
    axios.post("/sign_s3_sized", {
      fileName: fileName,
      tileType: tileType,
      ContentType: 'image/jpeg'
    })
      .then(response => {
        var returnData = response.data.data.returnData;
        var signedRequest = returnData.signedRequest;
        var url = returnData.url;
        this.setState({ url: url })
        // Put the fileType in the headers for the upload
        var options = {
          headers: {
            'Content-Type': 'image/jpeg'
          }
        };
        axios.put(signedRequest, img, options)
          .then(result => {
            this.setState({ success: true });
          }).bind(this)
          .catch(error => {
            console.log("ERROR: " + JSON.stringify(error));
          })
      })
      .catch(error => {
        console.log(JSON.stringify(error));
      })
  })
  .catch(error => console.log(error))
Before going any further, I can assure you that uploading images via this setup without convertImage() works; otherwise the image gets put on S3 corrupted.
Any pointers as to what the issue behind the image being corrupted is?
Is my understanding of streams here lacking perhaps? If so, what should I change?
Thank you!
EDIT 1:
I tried not running the image through the graphicsmagick API at all (request.get(url).pipe(res);) and the image is still corrupted.
EDIT 2:
I gave up in the end and just uploaded the file from Node.js straight to S3; it turned out to be better practice anyway.
If your end goal is to upload the image to the S3 bucket using Node.js, there is a simple way using the multer-s3 node module.
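A minimal sketch of that approach (the bucket name, route, and form field name here are placeholders, so adjust them to your own setup):

const express = require('express');
const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

const app = express();
const s3 = new aws.S3();

const upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'my-bucket',
    contentType: multerS3.AUTO_CONTENT_TYPE,
    key: (req, file, cb) => {
      // hypothetical key naming; adjust to your folder layout
      cb(null, 'images/' + Date.now().toString() + '-' + file.originalname);
    }
  })
});

// single file upload under the "image" form field
app.post('/upload', upload.single('image'), (req, res) => {
  res.json({ location: req.file.location });
});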
I am trying to upload multiple large JSON files from React Native to Node.js.
The files upload fine unless a file is larger, in which case it does not upload in one try.
I suspect that:
Since the upload code is in a for loop, the code starts an upload but does not wait for the file to finish uploading before starting to upload the next file.
Is there any way to ensure that each file gets uploaded in one go?
syncFunction() {
  var RNFS = require('react-native-fs');
  var path = RNFS.DocumentDirectoryPath + '/toBeSynced';
  RNFS.readDir(path)
    .then((success) => {
      for (let i = 0; i < success.length; i++) {
        var fileName = success[i].name
        var filePath = success[i].path
        var uploadUrl = 'http://192.168.1.15:3333/SurveyJsonFiles/GetFiles/'
        if (Platform.OS === 'android') {
          filePath = filePath.replace("file://", "")
        } else if (Platform.OS === 'ios') {
          filePath = filePath
        }
        const data = new FormData();
        data.append("files", {
          uri: filePath,
          type: 'multipart/form-data',
          name: fileName,
        });
        const config = {
          method: 'POST',
          headers: {
            'Accept': 'application/json',
          },
          body: data,
        };
        fetch(uploadUrl, config)
          .then((checkStatusAndGetJSONResponse) => {
            console.log(checkStatusAndGetJSONResponse);
            this.moveFile(filePath, fileName)
          }).catch((err) => {
            console.log(err)
          });
      }
    })
    .catch((err) => {
      console.log(err.message);
    });
}
The JSON files can be more than 50 MB depending on the data; since they contain base64 image data, the size increases as the user takes more photos.
The app creates new files whenever the user records any information. There is no error message displayed for a partial file upload.
this.moveFile() moves the synced files to another folder so that the same file does not get uploaded multiple times:
moveFile(oldpath, oldName) {
  var syncedPath = RNFS.DocumentDirectoryPath + '/syncedFiles'
  RNFS.mkdir(syncedPath)
  syncedPath = syncedPath + "/" + oldName
  RNFS.moveFile(oldpath, syncedPath)
    .then((success) => {
      console.log("files moved successfully")
    })
    .catch((err) => {
      console.log(err.message)
    });
}
It turned out the fault was on the Node.js side: nodemon was restarting the server every time a new file was found, so we just moved the uploads folder outside the scope of the project.
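As an alternative to moving the folder, nodemon can also be told to ignore it; a minimal sketch, assuming the incoming files are written to an uploads/ directory inside the project, would be a nodemon.json like:

{
  "ignore": ["uploads/*"]
}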
I am trying to upload a file from mobile to a Google bucket using Ionic 4. Although the file can be uploaded to the cloud, I am struggling to get the file properties out of the file object.
Here is my method:
async selectAFile() {
  const uploadFileDetails = {
    name: '',
    contentLength: '',
    size: '',
    type: '',
    path: '',
  };

  this.fileChooser.open().then(uri => {
    this.file.resolveLocalFilesystemUrl(uri).then(newUrl => {
      let dirPath = newUrl.nativeURL;
      const dirPathSegments = dirPath.split('/');
      dirPathSegments.pop();
      dirPath = dirPathSegments.join('/');
      (<any>window).resolveLocalFileSystemURL(
        newUrl.nativeURL,
        function(fileEntry) {
          uploadFileDetails.path = newUrl.nativeURL;
          const file: any = getFileFromFileEntry(fileEntry);
          // log 01
          console.log({ file });
          uploadFileDetails.size = file.size;
          uploadFileDetails.name = `${newUrl.name
            .split(':')
            .pop()}.${file.type.split('/').pop()}`;
          uploadFileDetails.type = file.type;

          async function getFileFromFileEntry(fileEntry) {
            try {
              return await new Promise((resolve, reject) =>
                fileEntry.file(resolve, reject)
              );
            } catch (err) {
              console.log(err);
            }
          }
        },
        function(e) {
          console.error(e);
        }
      );
    });
  });

  // here uploadFileDetails is similar to what I declared at the top ;)
  // I want this to be populated with file properties
  // console.log(uploadFileDetails.name) --> //''
  const uploadUrl = await this.getUploadUrl(uploadFileDetails);
  const response: any = this.uploadFile(
    uploadFileDetails,
    uploadUrl
  );
  response
    .then(function(success) {
      console.log({ success });
      this.presentToast('File uploaded successfully.');
      this.loadFiles();
    })
    .catch(function(error) {
      console.log({ error });
    });
}
Even though I can console.log the file in log 01, I am unable to get file properties like size, name, or type out of the resolveLocalFileSystemURL function. Basically, I am unable to populate the uploadFileDetails object. What am I doing wrong? Thank you in advance.
You actually need 4 Ionic Cordova plugins to upload a file after getting all of its metadata.
FileChooser
Opens the file picker on Android for the user to select a file, returns a file URI.
FilePath
This plugin allows you to resolve the native filesystem path for Android content URIs and is based on code in the aFileChooser library.
File
This plugin implements a File API allowing read/write access to files residing on the device.
File Transfer
This plugin allows you to upload and download files.
Getting the file's metadata.
file.resolveLocalFilesystemUrl with fileEntry.file gives you all the metadata you need, except the file name. There is a property called name in the metadata, but it always contains the value "content".
To get the human-readable file name you need FilePath. But remember, you can't use the resulting file path to retrieve metadata. For that, you need the original URI from FileChooser.
filePathUrl.substring(filePathUrl.lastIndexOf('/') + 1) is used to get only the file name from the path.
You need the nativeURL of the file in order to upload it. Using the file path returned from FilePath is not going to work.
getFileInfo(): Promise<any> {
  return this.fileChooser.open().then(fileURI => {
    return this.filePath.resolveNativePath(fileURI).then(filePathUrl => {
      return this.file
        .resolveLocalFilesystemUrl(fileURI)
        .then((fileEntry: any) => {
          return new Promise((resolve, reject) => {
            fileEntry.file(
              meta =>
                resolve({
                  nativeURL: fileEntry.nativeURL,
                  fileNameFromPath: filePathUrl.substring(filePathUrl.lastIndexOf('/') + 1),
                  ...meta,
                }),
              error => reject(error)
            );
          });
        });
    });
  });
}
Selecting a file from the file system of the mobile device.
async selectAFile() {
  this.getFileInfo()
    .then(async fileMeta => {
      // get the upload URL
      const uploadUrl = await this.getUploadUrl(fileMeta);
      const response: Promise<any> = this.uploadFile(
        fileMeta,
        uploadUrl
      );
      response
        .then(function(success) {
          // upload success message
        })
        .catch(function(error) {
          // upload error message
        });
    })
    .catch(error => {
      // something wrong with getting file information
    });
}
Uploading the selected file.
This depends on your backend implementation. This is how to use File Transfer to upload a file.
uploadFile(fileMeta, uploadUrl) {
  const options: FileUploadOptions = {
    fileKey: 'file',
    fileName: fileMeta.fileNameFromPath,
    headers: {
      'Content-Length': fileMeta.size,
      'Content-Type': fileMeta.type,
    },
    httpMethod: 'PUT',
    mimeType: fileMeta.type,
  };
  const fileTransfer: FileTransferObject = this.transfer.create();
  // upload using the file's nativeURL (the path returned by FilePath will not work)
  return fileTransfer.upload(fileMeta.nativeURL, uploadUrl, options);
}
hope it helps. :)
I have got an Express server which creates a PDF file.
I am trying to send this file to the client:
const fs = require('fs');

function download(req, res) {
  var filePath = '/../../myPdf.pdf';
  fs.readFile(__dirname + filePath, function(err, data) {
    if (err) throw new Error(err);
    console.log('yeyy, no errors :)');
    if (!data) throw new Error('Expected data, but got', data);
    console.log('got data', data);
    res.contentType('application/pdf');
    res.send(data);
  });
}
On the client I want to download it:
_handleDownloadAll = async () => {
  console.log('handle download all');
  const response = await request.get(
    `http://localhost:3000/download?accessToken=${localStorage.getItem(
      'accessToken'
    )}`
  );
  console.log(response);
};
I receive a body.text like
%PDF-1.4↵1 0 obj↵<<↵/Title (��)↵/Creator (��)↵/Producer (��Qt 5.5.1)↵
but I can't achieve a download.
How can I create a PDF from the data OR directly download it from the server?
I've got it working:
The answer was pretty simple. I just let the browser handle the download with an html anchor tag:
server:
function download(req, res) {
  const { creditor } = req.query;
  const filePath = `/../../${creditor}.pdf`;
  res.download(__dirname + filePath);
}
client:
<a href={`${BASE_URL}?accessToken=${accessToken}&creditor=${creditorId}`} download>Download</a>
The result is the string of the binary. We use base64 to convert from binary to PDF:
var buffer = Buffer.from(result['textBinary'], 'base64')
fs.writeFileSync('/path/to/my/file.pdf', buffer)
You can prompt the browser to download the file by setting the correct content-disposition header:
res.setHeader('Content-disposition', 'attachment; filename=myfile.pdf');
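Applied to the download handler from the question, that would look roughly like this (the filename in the header is just an example):

function download(req, res) {
  var filePath = '/../../myPdf.pdf';
  fs.readFile(__dirname + filePath, function(err, data) {
    if (err) throw new Error(err);
    res.contentType('application/pdf');
    // ask the browser to download the file rather than render it inline
    res.setHeader('Content-disposition', 'attachment; filename=myPdf.pdf');
    res.send(data);
  });
}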
readFile returns a Buffer, which is a wrapper around bytes. You're sending that Buffer back to the client, which is logging it to the console.
The body.text you see is to be expected.
You will need to write these bytes to a file using fs.writeFile or similar. Here's an example:
_handleDownloadAll = async () => {
  console.log('handle download all');
  const response = await request.get(
    `http://localhost:3000/download?accessToken=${localStorage.getItem(
      'accessToken'
    )}`
  );
  // load your response data into a Buffer
  let buffer = Buffer.from(response.body.text)
  // open the file in writing mode
  fs.open('/path/to/my/file.pdf', 'w', function(err, fd) {
    if (err) {
      throw 'could not open file: ' + err;
    }
    // write the contents of the buffer
    fs.write(fd, buffer, 0, buffer.length, null, function(err) {
      if (err) {
        throw 'error writing file: ' + err;
      }
      fs.close(fd, function() {
        console.log('file written successfully');
      });
    });
  });
};
You may need to experiment with the buffer encoding, it defaults to utf8.
Read this!
The other option you may want to consider is generating the PDF on the server and simply sending the client a link to where it can download this.