PSD.js in react error fs.readFileSync is not a function - javascript

I have a React project using psd.js, and my Dropzone config looks like this:
accept: function(file, done) {
    const reader = new FileReader();
    reader.onload = handleReaderLoad;
    reader.readAsDataURL(file);
    function handleReaderLoad(evt) {
        console.log(evt.target.result);
        let psdFile = PSD.fromFile(evt.target.result);
        psdFile.parse();
        console.log(psdFile);
    }
    done();
},
The error is:
Uncaught TypeError: fs.readFileSync is not a function
at Function.fromFile (init.coffee:6)
at FileReader.handleReaderLoad (index.js?03a7:153)
In my webpack config I include:
node: {
    fs: 'empty'
},
because if I don't include it, the error is that the fs module is not found.
Please help.

You should use PSD.fromEvent(evt), not PSD.fromFile.
The former reads the blob from a file input, while the latter tries to read from the filesystem, which is absent in the browser.
So I guess your code should look like this (but I'm not quite sure)
accept: function(file, done) {
    const reader = new FileReader();
    reader.onload = handleReaderLoad;
    reader.readAsDataURL(file);
    function handleReaderLoad(evt) {
        PSD.fromEvent(evt).then(function (psd) {
            // here you can access the parsed file as psd
            console.log(psd.tree().export());
            done();
        });
    }
},
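As an aside, the webpack `fs: 'empty'` setting explains the exact error text: it replaces the `fs` module with an empty stub, so every property on it is `undefined`. A minimal sketch of what the bundle ends up with:

```javascript
// What webpack's `node: { fs: 'empty' }` effectively hands the bundle:
const fs = {}; // empty stub instead of the real Node fs module

// psd.js's fromFile() then calls fs.readFileSync(path) on that stub:
console.log(typeof fs.readFileSync); // "undefined"
// hence "Uncaught TypeError: fs.readFileSync is not a function"
```

Switching to the browser-oriented entry points (fromEvent and friends) avoids touching fs entirely, so the stub never matters.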

Related

Read uploaded JSON file in Angular

I am using the <p-fileUpload> component from PrimeNG to upload a JSON file in my web app. I want to read the JSON file in the front end and change some values of a data table.
However, I have no idea how to parse the uploaded file as JSON into a TypeScript object. Any ideas?
In the HTML file:
<p-fileUpload #ratingsUpload mode="basic" name="demo[]"
              url="" accept=".json"
              styleClass="p-button-raised"
              [auto]="true" chooseLabel="Upload ratings"
              (onUpload)="onRatingsUpload($event)"></p-fileUpload>
In the TypeScript file:
onRatingsUpload(event: any) {
    console.log(event.files)
    // TODO:
    // data = JSON(event.files);
}
Edit: I can't get the event to fire. onRatingsUpload does not seem to be called...
You have to use FileReader:
const reader = new FileReader();
reader.onload = (event) => {
    try {
        var obj = JSON.parse((event.target.result) as string);
        console.log('my json:', obj);
    } catch (error) {
        console.error(error);
    }
};
// here `file` is the uploaded file, e.g. event.files[0] from onUpload
reader.readAsText(file);
You need to use FileReader:
onFileChanged(event) {
    this.selectedFile = event.target.files[0];
    const fileReader = new FileReader();
    fileReader.readAsText(this.selectedFile, "UTF-8");
    fileReader.onload = () => {
        console.log(JSON.parse(fileReader.result as string));
    }
    fileReader.onerror = (error) => {
        console.log(error);
    }
}
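The parse-with-error-handling step in both answers can be isolated into a small helper that is independent of the FileReader plumbing (the name `parseRatings` is illustrative, not part of PrimeNG):

```javascript
// Parse the text produced by FileReader.readAsText into an object,
// returning null instead of throwing on malformed JSON.
function parseRatings(text) {
  try {
    return JSON.parse(text);
  } catch (error) {
    console.error('Invalid JSON in uploaded file:', error.message);
    return null;
  }
}

// In the onload handler you would call: parseRatings(reader.result)
console.log(parseRatings('{"movie": "Up", "rating": 5}')); // { movie: 'Up', rating: 5 }
console.log(parseRatings('not json')); // null
```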

Convert file to base64 in Angular

I am new to Angular and I want to convert files into a base64 string. I tried the code below and it gives a base64 string, but when I try to decode it using an online tool it does not give the correct result. Can anyone help me? Thank you.
Can we convert PDF and Word files into base64, or just images? With my code I successfully converted images, but not any other file type.
My HTML code:
<input type="file" (change)="base($event)">
And my TS code:
base(event: { target: { files: any[]; }; }) {
    const file = event.target.files[0];
    const reader = new FileReader();
    reader.readAsDataURL(file);
    reader.onload = () => {
        console.log(reader.result);
    };
}
Your code works fine. Are you sure you are correctly getting the file object?
Below is a StackBlitz demo of the same.
Also, it is better to use a promise while reading files. You can use the function below to read a file. It takes a file as input and resolves to a base64 string.
private readBase64(file): Promise<any> {
    const reader = new FileReader();
    const future = new Promise((resolve, reject) => {
        reader.addEventListener('load', function () {
            resolve(reader.result);
        }, false);
        reader.addEventListener('error', function (event) {
            reject(event);
        }, false);
        reader.readAsDataURL(file);
    });
    return future;
}
Call this function as below:
this.readBase64(file)
    .then((data) => {
        // your code
    });
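One common reason an online decoder rejects the result: `readAsDataURL` returns a full data URL, not bare base64, so the `data:<mime>;base64,` prefix has to be stripped before decoding. A Node sketch of the shape (a `Buffer` stands in for the real file bytes):

```javascript
// readAsDataURL yields "data:<mime>;base64,<payload>"; decoders usually
// want only the <payload> after the comma.
const bytes = Buffer.from('hello pdf'); // stand-in for real file contents
const dataUrl = `data:application/pdf;base64,${bytes.toString('base64')}`;
console.log(dataUrl); // data:application/pdf;base64,aGVsbG8gcGRm

// Strip the prefix before decoding:
const base64Payload = dataUrl.split(',')[1];
console.log(Buffer.from(base64Payload, 'base64').toString()); // "hello pdf"
```

This works the same for PDF and Word files as for images; only the MIME type in the prefix changes.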

Uploading to S3 a video captured using Cordova Media Capture plugin using Javascript AWS SDK

I am facing a peculiar issue while trying to upload a video file (.mp4 and .mov) to S3, captured using cordova-plugin-media-capture#1.4.3 or picked from gallery using cordova-plugin-camera#2.4.1.
I am using javascript AWS SDK v2.3.11 and calling the .upload function of the SDK.
It only copies 15 bytes of data to S3 regardless of the actual size of the video file, and the result is non-playable.
Implementation:
Capture video:
navigator.device.capture.captureVideo(
    captureSuccess,
    captureError,
    {
        limit: 1,
        duration: 30,
        destinationType: 2,
        sourceType: 1,
        mediaType: 1
    }
);
var captureSuccess = function (mediaFiles) {
    var mediaFile = mediaFiles[0];
    var filedata = {
        Key: "videos/" + fileName,
        ContentType: mediaFile.type,
        Body: mediaFile
    };
    var aws = AWS;
    var creds = new aws.Credentials(
        AccessKeyId,
        SecretAccessKey,
        SessionToken
    );
    aws.config.credentials = creds;
    s3 = new aws.S3();
    s3.upload(
        filedata,
        {
            Bucket: bucketName
        },
        function (err, location) {
            if (!err) {
                // uploaded successfully
            } else {
                // upload failed
            }
        }
    );
}
When I convert the media file to its base64 data and upload that, the complete base64 file does get written to the S3 bucket. However, I then need to strip the prefixed filetype and base64 identifier text, and then decode the data back to binary format and save it to S3 again (from an EB Node.js service).
Another issue with this approach is that converting a video file to base64 data and holding it in the phone's RAM is prone to application crashes due to memory management on both iOS and Android. I am unable to use this mechanism for a video file of more than 5 seconds on Android, or more than 10 seconds on a 16GB iPhone 6; the application crashes beyond both these limits.
Changed Implementation with Base64:
var captureSuccess = function (mediaFiles) {
    var mediaFile = mediaFiles[0];
    var filedata = {
        Key: "videos/" + fileName,
        ContentType: mediaFile.type
    };
    var aws = AWS;
    var creds = new aws.Credentials(
        AccessKeyId,
        SecretAccessKey,
        SessionToken
    );
    aws.config.credentials = creds;
    s3 = new aws.S3();
    getBase64Data(
        mediaFile.fullPath, // tried with mediaFile.localURL as well
        function (data) {
            filedata.Body = data;
            s3.upload(
                filedata,
                {
                    Bucket: bucketName
                },
                function (err, location) {
                    if (!err) {
                        // uploaded successfully
                    } else {
                        // upload failed
                    }
                }
            ); // ending s3.upload
        }
    ); // ending getBase64Data
}
function getBase64Data(filePath, cb) {
    window.resolveLocalFileSystemURL(
        filePath,
        function (entry) {
            entry.file(
                function (file) {
                    var reader = new FileReader();
                    reader.onloadend = function (event) {
                        cb(event.target.result);
                    };
                    reader.readAsDataURL(file);
                },
                function (e) {
                    // error retrieving file object
                }
            ); // ending entry.file
        },
        function (e) {
            // error getting entry object
        }
    ); // ending resolveLocalFileSystemURL
}
The AWS S3 JavaScript SDK allows a couple of different ways to provide the file data to the upload function. According to the documentation the file data can be any of the following:
Body — (Buffer, Typed Array, Blob, String, ReadableStream) Object data.
You need to extract the data from your captured video to any of those formats before attempting to upload.
Using mediaFile.fullPath, you can read the data from the file into a Buffer or create a ReadableStream and then use that to upload the file. To extract the data from the file you can use cordova-plugin-file.
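The 15 bytes are telling: a plain object coerced to a string is `"[object Object]"`, exactly 15 characters, which is likely what gets stored when the SDK is handed a MediaFile it doesn't recognize. And if you do go the data-URL route, the payload should be decoded before upload so S3 stores real binary rather than base64 text. A sketch (the helper name is hypothetical):

```javascript
// A MediaFile is not one of the Body types the SDK accepts, so it is
// coerced to a string -- which is exactly 15 bytes long:
console.log(String({}).length); // 15, i.e. "[object Object]"

// Decode a readAsDataURL result into a Buffer before uploading:
function dataUrlToBuffer(dataUrl) {
  const base64 = dataUrl.slice(dataUrl.indexOf(',') + 1);
  return Buffer.from(base64, 'base64');
}

const sample = 'data:video/mp4;base64,' + Buffer.from('fake video bytes').toString('base64');
console.log(dataUrlToBuffer(sample).toString()); // "fake video bytes"
```

For large videos, a ReadableStream from cordova-plugin-file avoids holding the whole base64 string in memory, which addresses the crashes described above.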

Write file to disk from blob in electron application

I am creating an Electron Application in which I am recording data from webcam and desktop, at the end of the recording session, I want to save the data to a file in the background. I do not know how to write the data from a blob to a file directly. Any suggestions?
Below is my current handler for the MediaRecorder stop event.
this.mediaRecorder.onstop = (e) => {
    var blob = new Blob(this.chunks,
        { 'type': 'video/mp4; codecs=H.264' });
    var fs = require('fs');
    var fr = new FileReader();
    var data = null;
    fr.onload = () => {
        data = fr.result;
        fs.writeFile("test.mp4", data, err => {
            if (err) {
                return console.log(err);
            }
            console.log("The file was saved!");
        });
    };
    fr.readAsArrayBuffer(blob);
}
You can do it using FileReader and Buffer.
In the renderer process, send the event to the main process to save the file with the buffer:
function saveBlob(blob) {
    let reader = new FileReader()
    reader.onload = function () {
        if (reader.readyState == 2) {
            var buffer = Buffer.from(reader.result) // new Buffer() is deprecated
            ipcRenderer.send(SAVE_FILE, fileName, buffer)
            console.log(`Saving ${JSON.stringify({ fileName, size: blob.size })}`)
        }
    }
    reader.readAsArrayBuffer(blob)
}
Get back the confirmation:
ipcRenderer.on(SAVED_FILE, (event, path) => {
    console.log("Saved file " + path)
})
(SAVE_FILE and SAVED_FILE are static strings containing the event names.)
and in the main process:
ipcMain.on(SAVE_FILE, (event, path, buffer) => {
    outputFile(path, buffer, err => {
        if (err) {
            event.sender.send(ERROR, err.message)
        } else {
            event.sender.send(SAVED_FILE, path)
        }
    })
})
outputFile is from 'fs-extra'
Handling Node operations in the main process is preferred; see the Electron security suggestions.
If you do not want to use the main process, you can use 'electron-remote' to create background processes to write the file. Additionally, you can invoke ffmpeg in the background process to compress/encode the file into a different format.
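The conversion step the renderer snippet relies on is worth seeing in isolation: `FileReader.readAsArrayBuffer` yields an ArrayBuffer, and `Buffer.from` (the non-deprecated replacement for `new Buffer`) turns it into the Node Buffer that fs and IPC expect. A minimal sketch, with a Uint8Array standing in for the reader's result:

```javascript
// Convert an ArrayBuffer (what FileReader.readAsArrayBuffer yields)
// into a Node Buffer suitable for fs.writeFile or ipcRenderer.send:
const arrayBuffer = new Uint8Array([104, 105]).buffer; // stand-in for reader.result
const buffer = Buffer.from(arrayBuffer);
console.log(buffer.toString()); // "hi"
console.log(buffer.length); // 2
```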

Changing filename of a ReadStream in NodeJS

I'm using thecodingmachine/gotenberg for converting office documents to PDF files (Gotenberg uses unoconv); see its documentation.
I have the following code, written in JavaScript (using the Node.js request library), to send a request with a local file to Gotenberg:
function openFile(file, fullPath) {
    return new Promise((resolve, reject) => {
        const filePath = pathModule.join(fullPath, file);
        var formData = {
            files: fs.createReadStream(filePath),
        };
        request.post({ url: "http://docker:3000/convert/office", formData: formData }, function (err, httpResponse, body) {
            if (err) {
                reject('Upload failed!');
            } else {
                resolve(body);
            }
        });
    });
}
When I send Gotenberg a file with an English name, it works.
But when I try to send a filename with special characters (written in Hebrew: בדיקה.docx), Gotenberg fails and returns an error:
unoconv: non-zero exit code: exit status 1
This probably happens because unoconv doesn't support files with a Hebrew filename.
Is there any way to change the filename in the file's ReadStream to something like temp.docx instead of בדיקה.docx on the fly, without renaming the file on my server?
Thanks
You need to change the formData object to the following:
let formData = {
    files: {
        value: fs.createReadStream(filePath),
        options: {
            filename: 'test.docx'
        }
    }
};
Solved this issue for me :)
const FormData = require('form-data');
const form = new FormData();
form.append('file', fs.createReadStream(filepath), { filename: 'newname' });
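The same filename-override idea can be demonstrated with Node's built-in FormData and Blob (global since Node 18), without the request or form-data packages (assumption: a Node 18+ runtime):

```javascript
// The explicit third argument to append() overrides the filename the
// multipart body will carry, without renaming anything on disk:
const form = new FormData(); // global in Node 18+
form.append('files', new Blob(['fake docx bytes']), 'temp.docx');
console.log(form.get('files').name); // "temp.docx"
```

The server only ever sees the name written into the multipart part, so unoconv receives an ASCII filename regardless of what the file is called locally.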
