Custom name for blob URL file preview if downloaded - javascript

I am using a blob URL to view a file (no specific type) in a new tab. If the browser can preview the file type, the file is viewable. However, if it is not supported, the file gets downloaded. Is it possible to stop it from downloading, or to download it using a custom file name instead of the random name generated by createObjectURL?
download(image: Blob) {
  const url = window.URL.createObjectURL(image);
  window.open(url);
}

this.http.post<Blob>(**APILink**, data, { headers: {token details}, responseType: 'blob' as 'json' }).subscribe((response) => {
  this.download(response);
});
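If the goal is a custom file name rather than suppressing the download, one common workaround (a sketch, not from the asker's code; `downloadBlob` and its arguments are hypothetical names) is a temporary anchor element with a `download` attribute:

```javascript
// Sketch: force a download with a chosen filename instead of the
// random name produced by URL.createObjectURL. Browser-only code.
function downloadBlob(blob, filename) {
  const url = window.URL.createObjectURL(blob);
  const anchor = document.createElement('a');
  anchor.href = url;
  anchor.download = filename;       // e.g. 'report.pdf'
  document.body.appendChild(anchor);
  anchor.click();
  anchor.remove();
  window.URL.revokeObjectURL(url);  // free the object URL when done
}
```

Note that this always downloads; whether a file previews or downloads when opened in a tab is decided by the browser from the blob's MIME type, so to keep the preview behavior the blob needs the correct `type` when it is created.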

Related

Download Excel works in Postman but doesn't in Javascript

I'm trying to download an Excel file via AJAX (using Axios). I'm doing it this way because I need to send a JWT token to access it.
The file response I'm getting back appears to be binary. In Postman I can set the token, click "Save and download", and everything works. Now, this is my code in JS:
requestWithFullResponse({
  url: url,
  method: 'GET',
}, this.props.token, false).then((response) => {
  const responseData = response.data
  // I've tried with different types and nothing works
  // var blob = new Blob([responseData], { type: `${response.headers['content-type']};charset=utf-8` });
  // var blob = new Blob([responseData], { type: 'application/octet-stream;charset=utf-8' });
  var blob = new Blob([responseData], { type: 'application/octet-stream' });
  saveAs(blob, filename, true)
}).catch((error) => { console.log('Error downloading file -> ', error); });
That code downloads the file, but when I open it, LibreOffice says the file is corrupt. What am I missing? Is there a way to see the code Postman executes when it downloads the file?
Any kind of help would be really appreciated.
You need to set responseType: 'blob' on the request. The default response type is json, which mangles the binary body.
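A sketch of that fix using the built-in fetch API instead of the asker's requestWithFullResponse wrapper (the URL, token handling, and filename are placeholders, and saveAs is assumed to be FileSaver.js):

```javascript
// Sketch: request the raw bytes, not JSON, then hand the Blob to
// FileSaver's saveAs. Parsing the body as JSON/text is what
// corrupts the spreadsheet.
async function downloadExcel(url, token, filename) {
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!response.ok) throw new Error(`HTTP ${response.status}`);
  const blob = await response.blob(); // preserves the binary body
  saveAs(blob, filename);             // FileSaver.js global
}
```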

Getting a black image generated from a blob URL

I am using the library react-avatar-editor to create the user's profile picture. Below is the code I am using to send the cropped image to the server:
const canvasScaled = this.editor.getImageScaledToCanvas(); // a canvas object
canvasScaled.toBlob((blob) => {
  this.setState({
    preview: blob
  }, () => {
    let data = new FormData();
    const file = new File([blob], "my-image", {
      type: "image/png"
    });
    data.append("file", file);
    axios({
      method: "POST",
      data: data,
      url: "/media",
      headers: {
        'Content-Type': 'multipart/form-data'
      }
    }).then().catch()
  });
})
But the image generated at the server is a black image. I tested with many files, but the result is the same: a black image.
My backend service is written in Java, so to check whether the backend handles the image correctly, I wrote a simple file selector:
<input type="file" accept={"image/png"} onChange={onSelect}/>
<button onClick={() => {
  const data = new FormData();
  data.append("file", this.state.file);
  axios({
    method: "POST",
    data: data,
    url: "/media",
    headers: {
      'Content-Type': 'multipart/form-data'
    }
  }).then().catch()
}}>Send</button>
And yes, the server-side code works fine with the above POST request, which means I definitely have something wrong in the react-avatar-editor handling code.
Is the blob causing the issue? How is the image sent in the case of a file selector: Base64 or blob?
Update: I noticed strange behavior with one image. When I upload a particular file, it gets converted to an image that is not completely black. Something fishy is going on with the colors.
To check whether the library itself is creating the black image, I inspected canvasScaled.toDataURL("image/png") and used it in an image element; there is no issue with the created canvas, and the image renders correctly.
Try data.append("file", blob); without calling new File. According to MDN,
A File object is a specific kind of a Blob, and can be used in any context that a Blob can. In particular, FileReader, URL.createObjectURL(), createImageBitmap(), and XMLHttpRequest.send() accept both Blobs and Files.
Add this on the second line, after defining canvasScaled:
var ctx = canvasScaled.getContext("2d");
ctx.fillStyle = "white";
ctx.fillRect(0, 0, canvasScaled.width, canvasScaled.height);
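A complete sketch of that idea. Note the 'destination-over' compositing step is my addition, not part of the original answer: it paints the white fill *behind* the existing pixels, so transparent areas get a white background without covering the image that is already on the canvas:

```javascript
// Sketch: give transparent canvas pixels a white background so they
// don't come out black when the image is re-encoded server-side.
// canvasScaled is assumed to be the canvas returned by
// editor.getImageScaledToCanvas().
function addWhiteBackground(canvasScaled) {
  const ctx = canvasScaled.getContext('2d');
  // 'destination-over' draws under the current content, not over it
  ctx.globalCompositeOperation = 'destination-over';
  ctx.fillStyle = 'white';
  ctx.fillRect(0, 0, canvasScaled.width, canvasScaled.height);
  ctx.globalCompositeOperation = 'source-over'; // restore the default
  return canvasScaled;
}
```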

PDF Blob is showing nothing in new tab, using stream from backend

I used the stream method of https://github.com/barryvdh/laravel-dompdf to send the response to the front-end.
Here is the code I wrote for opening the PDF in a new tab. I'm calling stream on the backend API; it returns a response, but when I try to create a blob from it, the PDF shows nothing.
APICaller({
  method: 'get',
  responseType: "arraybuffer",
  headers: {
    'Accept': 'application/pdf'
  },
  endpoint: gep('generate/certificate?path=certificate.pdf', 'v3'),
}).then((data) => {
  var file = new Blob([data.data], { type: 'application/pdf' });
  var fileURL = URL.createObjectURL(file);
  window.open(fileURL);
});
The result is an empty PDF.
I had a similar issue with axios; it didn't work for downloading files as a Blob. Use XMLHttpRequest and do the same response handling in its load event to achieve the file download.
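A minimal sketch of that suggestion, assuming the same endpoint and headers as the APICaller call above (the URL is a placeholder):

```javascript
// Sketch: fetch the PDF with XMLHttpRequest, asking for a Blob
// directly, then open it in a new tab. Browser-only code.
function openPdf(url) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url);
  xhr.responseType = 'blob';               // no manual Blob conversion needed
  xhr.setRequestHeader('Accept', 'application/pdf');
  xhr.onload = () => {
    if (xhr.status !== 200) return;
    const fileURL = URL.createObjectURL(xhr.response);
    window.open(fileURL);                  // browser PDF viewer renders it
  };
  xhr.send();
}
```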

Posting a base64 encoded PDF file with AJAX

I'm trying to post a base64-encoded PDF file to a Zendesk file upload API endpoint but the file URL returned from the API shows that the file is corrupted.
First I receive the PDF as a base64-encoded string from a separate API call. Let's call it base64String.
If I do window.open("data:application/pdf;base64," + base64String) I can view the PDF in my browser.
Now I am trying to follow the documentation here for uploading files via the API. I can successfully complete a cURL call as shown in the example. However, the jQuery AJAX call will corrupt the PDF file.
client.request({
  url: '/api/v2/uploads.json?filename=test.pdf',
  type: 'POST',
  data: atob(base64String),
  contentType: 'application/binary'
}).then(function(data) {
  window.open(data.upload.attachment.content_url); // corrupt file
}, function(response) {
  console.log("Failed to upload file to Zendesk.");
  console.log(response);
});
Like I said, the request succeeds, but when I visit the content_url the PDF does not display. I am quite sure the file is being corrupted in the POST request.
I have tried uploading the file as a base64 string (without decoding it with atob()), among other things, with no luck.
UPDATE
I'm still not able to view the PDF after converting the base64 string to blob.
var blob = base64ToBlob(base64String);
console.log(blob); // Blob {size: 39574, type: "application/pdf"}

client.request({
  url: '/api/v2/uploads.json?filename=test.pdf',
  type: 'POST',
  data: blob,
  processData: false,
  contentType: 'application/pdf'
}).then(function(data) {
  window.open(data.upload.attachment.content_url); // corrupt file
}, function(response) {
  console.log("Failed to upload file to Zendesk.");
  console.log(response);
});

function base64ToBlob(base64String) {
  // decode the base64 payload into a binary string
  var byteString = atob(base64String);
  // write the bytes of the string to an ArrayBuffer
  var ab = new ArrayBuffer(byteString.length);
  var ia = new Uint8Array(ab);
  for (var i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }
  // write the ArrayBuffer to a blob, and you're done
  var blob = new Blob([ab], { type: 'application/pdf' });
  return blob;
}
I learned that the Zendesk app framework uses a jQuery AJAX wrapper for requests and the arraybuffer type is unsupported, so the file was getting corrupted. The app framework team has fixed the issue.

Multiform Data is not transferring properly (AngularJS to uniPaaS)

I am trying to send an Excel file from my local machine to our webservice.
In my view, I am using input type "file" to select the file, and then I send that file to this webservice call:
uploadArticles: function (file, filename) {
  var fd = new FormData();
  fd.append('UPLOADFILE', file);
  fd.append('APPNAME', 'MY_APP');
  fd.append('PRGNAME', 'UPLOAD_EXCEL');
  fd.append('SESSIONID', '12345');
  fd.append('UPDATE', 'Y');
  fd.append('UPLOADFILENAME', filename);
  fd.append('ARGUMENTS', 'SESSIONID,UPLOADFILE,UPLOADFILENAME,UPDATE');
  $http.post(MAGIC_URL, fd, {
    transformRequest: angular.identity,
    headers: { 'Content-Type': undefined }
  }).success(function (response) {
    console.log('SUCCESS! ', response);
  }).error(function (response) {
    console.log('ERROR! ', response);
  });
}
We are using uniPaaS to take the UPLOADFILE blob from ARGUMENTS and write it to a directory on our server. That part works; however, when I try to open the file itself, the blob's text has been written inside the Excel file, almost as if the file was never interpreted as binary.
Here's an example of what is inside the Excel file:
data:text/rtf;base64,e1xydGYxXGFuc2lcYW5zaWNwZzEyNTJcY29jb2FydGYxNDA0XGNvY29hc3VicnRmNDcwCntcZm9udHRibFxmMFxmc3dpc3NcZmNoYXJzZXQwIEhlbHZldGljYTt9CntcY29sb3J0Ymw7XHJlZDI1NVxncmVlbjI1NVxibHVlMjU1O30KXG1hcmdsMTQ0MFxtYXJncjE0NDBcdmlld3cxMDgwMFx2aWV3aDg0MDBcdmlld2tpbmQwClxwYXJkXHR4NzIwXHR4MTQ0MFx0eDIxNjBcdHgyODgwXHR4MzYwMFx0eDQzMjBcdHg1MDQwXHR4NTc2MFx0eDY0ODBcdHg3MjAwXHR4NzkyMFx0eDg2NDBccGFyZGlybmF0dXJhbFxwYXJ0aWdodGVuZmFjdG9yMAoKXGYwXGZzMjQgXGNmMCB0ZXN0IDEyMzR9
Am I approaching this file upload improperly?
EDIT:
I was able to resolve this. uniPaaS has the ability to transform Base64 to Blobs, but this was not working properly for whatever reason. I was able to convert this Base64 to a Blob using JavaScript, I sent the blob to uniPaaS, and the file was written properly.
For anyone else running into this issue, please see this discussion: Creating a Blob from a base64 string in JavaScript
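For reference, a self-contained sketch of that conversion (dataUriToBlob is a hypothetical helper name; it accepts either a full data URI like the one above or a bare base64 string):

```javascript
// Sketch: strip an optional data-URI prefix, decode the base64
// payload, and wrap the bytes in a Blob. Works in browsers and in
// Node 18+, where atob and Blob are globals.
function dataUriToBlob(dataUri, fallbackType = 'application/octet-stream') {
  const match = /^data:([^;,]+)?;base64,(.*)$/.exec(dataUri);
  const type = (match && match[1]) || fallbackType;
  const base64 = match ? match[2] : dataUri;   // accept bare base64 too
  const byteString = atob(base64);
  const bytes = new Uint8Array(byteString.length);
  for (let i = 0; i < byteString.length; i++) {
    bytes[i] = byteString.charCodeAt(i);
  }
  return new Blob([bytes], { type });
}
```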
