I am trying to download a PDF file from an FTP server with a jQuery AJAX request. I referred to http://www.dave-bond.com/blog/2010/01/JQuery-ajax-progress-HMTL5/.
My jQuery AJAX call is as below:
$.ajax({
    xhr: function () {
        var xhr = new window.XMLHttpRequest();
        // Download progress
        xhr.addEventListener("progress", function (evt) {
            console.log("Event: " + evt.lengthComputable);
            if (evt.lengthComputable) {
                var percentComplete = evt.loaded / evt.total;
                // Do something with download progress
                console.log(percentComplete);
            }
        }, false);
        return xhr;
    },
    type: 'POST',
    url: "Downloader.ashx",
    success: function (data) {
        // Do something success-ish
    }
});
And my C# generic handler code to download the file is as below:
public void ProcessRequest(HttpContext context)
{
    DownLoadFilesFromFTp("MyFile.pdf", "Foldername");
}

public bool DownLoadFilesFromFTp(string fileName, string ftpFolder)
{
    try
    {
        string Ftp_Host = System.Configuration.ConfigurationManager.AppSettings["Ftp_Host"];
        string Ftp_UserName = System.Configuration.ConfigurationManager.AppSettings["Ftp_UserName"];
        string Password = System.Configuration.ConfigurationManager.AppSettings["Password"];
        string downloadpath = System.Configuration.ConfigurationManager.AppSettings["downloadpath"];

        // Create the FTP request.
        string ftpurl = Ftp_Host + ftpFolder + "/" + fileName;
        FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri(ftpurl));
        reqFTP.Credentials = new NetworkCredential(Ftp_UserName, Password);
        reqFTP.KeepAlive = false;
        reqFTP.Method = WebRequestMethods.Ftp.DownloadFile;
        reqFTP.UseBinary = true;
        reqFTP.Proxy = null;
        reqFTP.UsePassive = false;

        // Fetch the response and copy it to a file on disk.
        using (FtpWebResponse response = (FtpWebResponse)reqFTP.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (FileStream writeStream = new FileStream(downloadpath + fileName, FileMode.Create))
        {
            const int Length = 2048;
            byte[] buffer = new byte[Length];
            int bytesRead = responseStream.Read(buffer, 0, Length);
            while (bytesRead > 0)
            {
                writeStream.Write(buffer, 0, bytesRead);
                bytesRead = responseStream.Read(buffer, 0, Length);
            }
        }
        return true;
    }
    catch (WebException)
    {
        return false;
    }
    catch (Exception)
    {
        return false;
    }
}
When I run the code, the file downloads to the folder without any issues, but in the AJAX call the check
if (evt.lengthComputable) {
}
never passes. When I logged evt to the console, lengthComputable always returned false, so I am unable to track the progress.
1) Is there anything wrong with the code?
2) Is there any alternative way to show a progress bar while downloading the PDF?
For the bytes uploaded, it is quite easy to show a progress bar: just monitor the xhr.upload.onprogress event. The browser knows the size of the files it has to upload and the size of the uploaded data, so it can provide the progress info.
For the bytes downloaded, it is a little more difficult, because the only thing the browser knows is the size of the bytes it is receiving.
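For illustration, a minimal sketch of the two hooks on a plain XMLHttpRequest (the handler bodies are generic):

var xhr = new XMLHttpRequest();

// Upload direction: the browser knows the total size, so evt.total is reliable.
xhr.upload.onprogress = function (evt) {
    console.log('sent ' + evt.loaded + ' of ' + evt.total);
};

// Download direction: evt.total is only known when the server declared
// a Content-Length header; otherwise evt.lengthComputable stays false.
xhr.onprogress = function (evt) {
    if (evt.lengthComputable) {
        console.log('received ' + Math.round(evt.loaded / evt.total * 100) + '%');
    }
};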
The reason evt.lengthComputable is false is that the browser doesn't know how many bytes the server is going to send in the response.
There is a solution for this: set a Content-Length header on the server, as below, so that the browser knows the total size of the bytes it is going to receive.
// prepare the response to the client. resp is the client Response
var resp = HttpContext.Current.Response;
// Add the Content-Length of the file to the headers;
// if the header is not set, evt.total stays 0 and lengthComputable stays false
resp.AddHeader("Content-Length", "lengthOfYourFile");
Your JS-side code looks fine.
I am not a C# programmer, but I observe that your C# server side downloads the file from FTP and saves it to disk on the server, yet never writes the PDF binary to the response for the JS side.
From the JS side that is a 0-byte download, and evt.lengthComputable is always false.
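To illustrate both fixes together (set Content-Length first, then actually write the file bytes to the response), here is a minimal sketch as a Node/Express handler; the same idea applies to the ASHX handler. The route and file path are placeholders:

var express = require('express');
var fs = require('fs');
var app = express();

app.post('/Downloader', function (req, res) {
    var path = 'downloads/MyFile.pdf'; // placeholder path
    var stat = fs.statSync(path);
    res.setHeader('Content-Type', 'application/pdf');
    // Without this header the browser cannot compute download progress.
    res.setHeader('Content-Length', stat.size);
    // Actually send the PDF bytes to the client.
    fs.createReadStream(path).pipe(res);
});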
I'm using the gcloud API on a Node.js web server to upload files. I'd prefer the files not be uploaded on the client side and instead be uploaded on the server. Currently, I am producing a blob on the client side, converting it to text, and passing that to the server through a POST request. All of the information gets successfully passed from the client to the server as expected. The data is also uploaded to gcloud; however, gcloud does not recognize it as a valid file, nor does my computer when I download it.
What is the best way to get the contents of the file to gcloud from the server side? I've tried using data URIs and reading the original file as text, and both produce similar issues. I've also explored piping a readFileStream from the blob on the server end, but blobs are not natively supported by Node, so I have not done so yet.
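(An aside on why the blob-to-text round-trip corrupts the file; this snippet is purely illustrative:)

// Bytes that are not valid UTF-8 get replaced with U+FFFD during decoding,
// so binary -> text -> binary is lossy.
const bytes = new Uint8Array([0xff, 0xd8, 0xff, 0xe0]); // start of a JPEG header
new Blob([bytes]).text().then((text) => {
    const back = new TextEncoder().encode(text);
    console.log(bytes.length, back.length); // 4 vs 12: the bytes changed
});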
Client Side
function readSingleFile(e, func, func2) {
    var file = e.target.files[0];
    if (!file) {
        return; // Add error msg here
    }
    var reader = new FileReader();
    reader.onload = function (e) {
        let contents = e.target.result;
        let img = document.createElement('img');
        let cvs = document.createElement('canvas');
        img.onload = () => {
            cvs.width = img.width;
            cvs.height = img.height;
            let ctx = cvs.getContext('2d');
            ctx.drawImage(img, 0, 0);
            cvs.toBlob((res) => { res.text().then((text) => { func2(text); }); }, "image/jpeg", 0.92);
        };
        img.src = contents;
        func(contents);
    };
    reader.readAsDataURL(file);
}
Server Side
function publishPrintjob(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpg');
        gcloudFile.save(dataObj.sockImageFile, function (err) {
            if (!err) {
                console.log("File uploaded!");
            }
        });
        var data = {
            date: dataObj.Date,
            email: dataObj.email,
            design: dataObj.Design,
            author: dataObj.Author,
            address: dataObj.address,
            imageKey: newElemKey,
        };
        admin.database().ref('queue/' + newElemKey).set(data);
    } catch (err) {
        console.log(err);
    }
}
Note: func simply shows the image on the client side; func2 just adds the contents to the POST object.
Uploading a file directly from the computer would be easiest using the storage.bucket(bucketName).upload() function from the Cloud Storage library. However, this takes the location of a local file, and thus will not work unless the file is transferred to the server and saved there first. That can be achieved with multipart form data; multipart upload, or uploading from a local file, are the better methods for uploading to Google Storage. A sketch of that approach is below.
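For comparison, a sketch of that multipart route, assuming the multer middleware handles the form data (the 'image' field name and 'my-bucket' are placeholders):

const express = require('express');
const multer = require('multer');
const { Storage } = require('@google-cloud/storage');

const app = express();
const upload = multer({ dest: 'uploads/' }); // multer saves the part to a temp file
const storage = new Storage();

app.post('/upload', upload.single('image'), (req, res) => {
    // req.file.path is the temp file multer wrote; hand it straight to gcloud.
    storage.bucket('my-bucket').upload(req.file.path, {
        destination: 'images/' + req.file.originalname,
    }, (err) => {
        if (err) return res.status(500).send(err.message);
        res.send('uploaded');
    });
});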
Instead of that, I solved this by first converting the image to a data URI, sending the data URI to the server in the body of a POST request, and then converting it to a buffer wrapped in a readable stream that can be piped to Google Storage.
Client
let formData = getFormData('myForm');
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (this.readyState == 4 && this.status == 200) {
        // Typical action to be performed when the document is ready:
    }
};
xhttp.open("POST", "dashboard", true);
xhttp.setRequestHeader('Content-Type', 'application/json');
xhttp.send(JSON.stringify(formData));
xhttp.onload = () => {
    console.log(JSON.parse(xhttp.response));
    // Handle server response here
};
Server
// dataObj is the body of the POST request; the property ImageFile is the data URI from readFileAsURI
function uploadImageOnServer(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key to use as filename
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpeg');
        const { Readable } = require('stream'); // should be required at the top of the file
        var string = dataObj.ImageFile;
        var regex = /^data:.+\/(.+);base64,(.*)$/;
        var matches = string.match(regex);
        var ext = matches[1];
        var data = matches[2];
        var buffer = Buffer.from(data, 'base64');
        // Create the readstream
        const readableInstanceStream = new Readable({
            read() {
                this.push(buffer);
                this.push(null);
            }
        });
        readableInstanceStream.pipe(gcloudFile.createWriteStream()) // link to gcloud storage api
            .on('error', function (err) {
                console.log('error');
            })
            .on('finish', function () {
                console.log('upload complete');
            });
    } catch (err) {
        console.log(err);
    }
}
I am doing a file upload with a POST AJAX call from the JS client side to my receiveFile.aspx server side (say, for a file size of 5 MB).
AJAX reports file progress very fast; the file is uploaded in 1-3 seconds. (I slowed down my network speed to be sure, and yes, I get progress reports nicely.)
However, after completing the POST request I have to wait about 5-10 seconds for receiveFile.aspx to complete the request and respond back.
ReceiveFile is very basic:
Private Sub ReceiveFile()
    Dim targetFileName = "uploadedFile.jpg"
    Using ms = New MemoryStream()
        Dim buffer = New Byte(4096) {}
        Dim bytesRead = -1
        Do Until bytesRead = 0
            bytesRead = Context.Request.InputStream.Read(buffer, 0, buffer.Length)
            ms.Write(buffer, 0, bytesRead)
        Loop
        Using fs = New FileStream(targetFileName, FileMode.Create)
            ms.WriteTo(fs)
        End Using
    End Using
End Sub
I have debugged this and seen that:
1) ReceiveFile does not start until the POST is completed, so IIS waits for the whole request to complete.
2) The call hangs at Context.Request.InputStream, waiting for the data to be fully loaded into the InputStream.
3) If I use Context.Request.GetBufferlessInputStream instead, it slowly processes the stream as data becomes available, but the end result is the same: it takes 5-10 seconds to get the whole stream.
This is my question: the file stream has already been uploaded to IIS. Why does it wait another 5-10 seconds, as if the file were being loaded again from the client?
Is there anything I can do to speed things up?
Edit: File Upload code
var fileName = "sampleFileName.jpg";
var reader = new FileReader();
var UploadProgress = 0;
reader.onloadend = (e) => {
    var xhr = new XMLHttpRequest();
    xhr.upload.addEventListener("progress", (e) => {
        UploadProgress = (e.loaded / e.total * 100).toFixed(0); // toFixed in standard JS
    }, false);
    xhr.onreadystatechange = () => {
        if (xhr.readyState == 4) {
            if (xhr.status != 200) {
                // fail
            } else {
                // success
            }
        }
    };
    xhr.open("POST", "URL of receiveFile.aspx", true);
    xhr.setRequestHeader("Content-Type", file.type);
    xhr.send(reader.result);
};
reader.readAsArrayBuffer(file); // file comes from a file input
I have set up in an SPA application the ability to send files to Azure Blob Storage.
To do this I used XMLHttpRequest and FormData (my users are on computers managed by my company and all have access to HTML5).
In order to manage security, the sending of each file is preceded by a call to a Web API method to obtain the shared access signature.
I forward the Content-Type of the file, as well as other information, in headers.
Everything goes well: the files are correctly sent and saved in Azure Blob Storage. But during the transfer, the image files seem to be "altered".
They are present, and I can download and read them after the download, but I cannot open them directly from an img tag.
On the other hand, if I send the same image file via Microsoft Azure Storage Explorer, there is no problem: the image is recognized in the img tag.
In both cases the content type is marked as "image/jpeg". The only noticeable difference is that the MD5 is not the same between these two uploads, even though the original file is the same.
From my findings, it seems that text is added at the beginning and the end of the file when sending via XMLHttpRequest.
Here is my code so that you can guide me.
Note 1: I use TypeScript (but a JavaScript solution will suit me) and Promises.
Note 2: I have resolved all the CORS problems.
Note 3: I'm using the Azure Storage Emulator, but I tried with the normal Azure service and the problem is the same.
Here is the text added in the image in Chrome:
------WebKitFormBoundaryKj5cK88faAwJd4av
Content-Disposition: form-data; name="file1"; filename="test.jpg"
Content-Type: image/jpeg
[image content]
------WebKitFormBoundaryKj5cK88faAwJd4av--
My Web API:
[Route(@"api/Storage/FileSas/Customers/{id:int}")]
public async Task<IHttpActionResult> GetFileSas(int id, string fileName, long? fileSize = 0, string contentType = null)
{
    if (string.IsNullOrWhiteSpace(fileName))
        this.ModelState.AddModelError("fileName", "File name is required");
    if (!fileSize.HasValue || fileSize.Value > maxFileSize)
        this.ModelState.AddModelError("fileSize", "File size exceeded");
    if (!this.ModelState.IsValid)
        return BadRequest(this.ModelState);

    var serverUrl = ConfigurationManager.AppSettings[SERVER_URL];
    var container = ConfigurationManager.AppSettings[CONTAINER_NAME];
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
    {
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-60),
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60),
    };
    // blobContainer is initialized elsewhere
    CloudBlockBlob blobFile = blobContainer.GetBlockBlobReference(Path.Combine("customers", id.ToString(), fileName));
    var exists = await blobFile.ExistsAsync();
    if (exists)
    {
        await blobFile.SnapshotAsync();
    }
    var signature = blobFile.GetSharedAccessSignature(policy);
    return Content<string>(HttpStatusCode.Created, Path.Combine(serverUrl, container, blobFile.Name + signature));
}
My TypeScript file:
context.Storage.getFileSas(customerId, file)
    .then((response: Interfaces.Result<string>) => {
        let sasUrl = response.Data;
        let formData = new FormData();
        formData.append("file1", file, file.name);
        var xhr = new XMLHttpRequest();
        xhr.upload.onprogress = (event) => {
            if (event.total > 0)
                this.Progress(event.loaded * 100 / event.total);
        };
        xhr.onloadstart = function (e) {
        };
        xhr.onloadend = (e) => {
            this.Progress(0);
        };
        xhr.open("PUT", sasUrl, true);
        xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
        xhr.setRequestHeader('Content-Type', file.type);
        xhr.setRequestHeader('x-ms-blob-content-type', file.type);
        xhr.setRequestHeader('x-ms-version', "2016-05-31");
        xhr.setRequestHeader('x-ms-meta-CustomerId', customerId);
        xhr.setRequestHeader('x-ms-meta-UserId', context.User.User.Id.toString());
        xhr.setRequestHeader('x-ms-meta-UserName', context.User.User.Name);
        xhr.send(formData);
    })
    .catch((error) => {
        console.log(error);
    });
The file comes from here:
let fileInputElement1: HTMLInputElement = <HTMLInputElement>document.getElementById("file1");
let file = fileInputElement1.files[0];
My HTML part (I'm using Knockout):
<form method="put" target="_blank" enctype="multipart/form-data">
    <input type="file" name="name" value="" id="file1" />
    <button data-bind="click:send">Send</button>
</form>
Does someone have an idea? ...
Thanks in advance.
PS: sasUrl looks like this: http://127.0.0.1:10000/devstoreaccount1/customers/65143/test.jpg?sv=2016-05-31&sr=b&sig=s0671%2BLvCZTqyNfhlCthZW8KftjKyIMAlOT1nbsnlng%3D&st=2017-03-05T11%3A38%3A22Z&se=2017-03-06T12%3A38%3A22Z&sp=r&rsct=image%2Fjpeg
Thanks to Gaurav Mantri, who pointed me in the right direction. Here are my modifications (only needed because I use TypeScript):
context.Storage.getFileSas(customerId, file)
    .then((response: Interfaces.Result<string>) => {
        let sasUrl = response.Data;
        var xhr = new XMLHttpRequest();
        xhr.upload.onprogress = (event) => {
            if (event.total > 0)
                this.Progress(event.loaded * 100 / event.total);
        };
        xhr.onloadstart = function (e) {
        };
        xhr.onloadend = (e) => {
            this.Progress(0);
        };
        let reader = new FileReader();
        reader.readAsArrayBuffer(file);
        reader.onloadend = (event) => {
            let target = <FileReader>event.target;
            if (target.readyState == reader.DONE) {
                var requestData = new Uint8Array(target.result);
                xhr.open("PUT", sasUrl, true);
                xhr.responseType = "blob";
                xhr.setRequestHeader('X-Requested-With', 'XMLHttpRequest');
                xhr.setRequestHeader('X-File-Name', file.name);
                xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
                xhr.setRequestHeader('Content-Type', file.type || 'application/octet-stream');
                xhr.setRequestHeader('x-ms-blob-content-type', file.type || 'application/octet-stream');
                xhr.setRequestHeader('x-ms-version', "2016-05-31");
                xhr.setRequestHeader('x-ms-meta-CustomerId', customerId);
                xhr.setRequestHeader('x-ms-meta-UserId', context.User.Id.toString());
                xhr.setRequestHeader('x-ms-meta-UserName', context.User.Name);
                xhr.send(requestData);
            }
        };
    })
    .catch((error) => {
        console.log(error);
    });
Now I'll start writing a Promise to wrap this functionality.
PS: I didn't find a way to mark Gaurav Mantri's comment as the answer, so I created my own.
PS 2: I'd like to give some +1 to Gaurav Mantri for the help... but I can't :/
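(A note for later readers: since a File is itself a Blob, the FileReader step can be skipped entirely; XMLHttpRequest accepts a Blob body directly. A shortened, untested sketch:)

xhr.open("PUT", sasUrl, true);
xhr.setRequestHeader('x-ms-blob-type', 'BlockBlob');
xhr.setRequestHeader('Content-Type', file.type || 'application/octet-stream');
// A File is a Blob, so it can be handed straight to send();
// the multipart envelope only appears when FormData is used.
xhr.send(file);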
I send multiple files chunked into Blobs over XHR2 to a Node.js/Express server.
How can I receive them on the server while making sure they are put together correctly, in the right order and to the right file, when multiple files are uploaded "at once"?
Following is the code (both front- and backend) I have so far, but it doesn't account for multiple uploads yet.
Frontend:
// 'files' is of type FileList, directly from the file input.
for (var i = 0, length = files.length; i < length; i++) {
    var file = files[i];
    var bytes = 51200; // 50 KB
    var size = file.size;
    var start = 0;
    var end = bytes;
    while (start < size) {
        sendBlob(file.slice(start, end), file.name, file.type);
        start = end;
        end = start + bytes;
    }
}

// sendBlob()
var sendBlob = function (data, filename, filetype) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', this.url, false);
    xhr.setRequestHeader('X_FILENAME', filename);
    xhr.setRequestHeader('Content-Type', filetype);
    xhr.send(data);
};
Backend:
app.post('/', function (req, res) {
    var body = '';
    req.on('data', function (data) {
        body += data;
    });
    req.on('end', function () {
        var filename = req.headers['x_filename'];
        var newPath = __dirname + '/upload/' + filename;
        fs.writeFile(newPath, body, function (err) {
            res.send({
                filename: filename
            });
        });
    });
});
Very small text files are stored correctly, but images always seem to get messed up and end up with a bigger file size. Bigger text files are written correctly, but for those the first chunk seems to be missing.
Your upload logic is naive. Here are some things you should do to ensure correctness (a sketch combining these points follows the list):
You have to maintain and communicate the chunk id/number between client and server so that order can be maintained:
var sendBlob = function (data, filename, filetype, chunkid)
// set the chunkid in a header or in the data
On your server you are accepting any POST request and appending it to the body. You should maintain variables for the filename and filetype and match them against the incoming request before appending:
Files[Name] = { // Create a new entry in the Files variable for each new file
    Filetype: "",
    FileSize: 0,   // size of data in buffer
    Data: "",      // buffer for storing data
    Downloaded: 0  // chunks received so far
};
Append to Data only after you have checked it against those variables. (The extra file size could be due to this.)
In fs.writeFile you should set the encoding to binary; image and video files are binary encoded, and writing them with the default utf-8 encoding may corrupt them:
fs.writeFile(newPath, body, 'binary', function (err){...});
(optional) For each chunk received, the server should send an acknowledgement back to the client, so the client knows which chunk was dropped and must be resent.
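A minimal sketch putting these points together, assuming hypothetical x_chunk_id and x_chunk_count headers are set by the client alongside x_filename, and the Express app from the question:

var fs = require('fs');
var files = {}; // one in-memory entry per filename, as in the structure above

app.post('/', function (req, res) {
    var raw = [];
    req.on('data', function (data) {
        raw.push(data); // keep raw Buffers; string concatenation corrupts binary data
    });
    req.on('end', function () {
        var name = req.headers['x_filename'];
        var id = parseInt(req.headers['x_chunk_id'], 10);
        var total = parseInt(req.headers['x_chunk_count'], 10);
        if (!files[name]) files[name] = { chunks: [], downloaded: 0 };
        var entry = files[name];
        entry.chunks[id] = Buffer.concat(raw); // index by chunk id, so arrival order doesn't matter
        entry.downloaded++;
        if (entry.downloaded === total) {
            // All chunks are in: join them in order and write once.
            // A Buffer is written byte-for-byte, so no utf-8 mangling occurs.
            fs.writeFile(__dirname + '/upload/' + name, Buffer.concat(entry.chunks), function (err) {
                delete files[name];
                res.send({ filename: name, done: true });
            });
        } else {
            res.send({ filename: name, chunk: id }); // per-chunk acknowledgement
        }
    });
});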
I have to create an image uploader for a future project (no Flash, IE10+, FF7+, etc.) that does image resizing/converting/cropping on the client side and not on the server.
So I made a JavaScript interface where the user can 'upload' their files, which get resized/cropped directly in the browser, without ever contacting the server. The performance is OK, not that good, but it works.
The end result is an array of canvas elements. The user can edit/crop the images after they get resized, so I keep them as canvases instead of converting them to JPEG (which would worsen the initial performance).
Now this works fine, but I don't know what the best way is to actually upload the finished canvas elements to the server (using an ASP.NET 4 generic handler on the server).
I have tried creating a JSON object from all elements, containing the data URL of each canvas.
The problem is, when I have 10-40 pictures, the browser starts freezing when creating the data URLs, especially for images that are larger than 2 megabytes.
// images = array of UploadImage
for (var i = 0; i < images.length; i++) {
    var data = document.getElementById('cv_' + i).toDataURL('image/jpeg'); // note: 'image/jpg' is not a valid MIME type
    images[i].data = data.substr(data.indexOf('base64') + 7);
}
Also, converting them to a JSON object (I am using json2.js) usually crashes my browser (FF7).
My object:
var UploadImage = function (pFileName, pName, pDescription) {
    this.FileName = pFileName;
    this.Name = pName;
    this.Description = pDescription;
    this.data = null;
};
The upload routine:
// images = array of UploadImage
for (var i = 0; i < images.length; i++) {
    var data = document.getElementById('cv_' + i).toDataURL('image/jpeg');
    images[i].data = data.substr(data.indexOf('base64') + 7);
}

var xhr, provider;
xhr = jQuery.ajaxSettings.xhr();
if (xhr.upload) {
    xhr.upload.addEventListener('progress', function (e) {
        console.log(Math.round((e.loaded * 100) / e.total) + '% done');
    }, false);
}
provider = function () {
    return xhr;
};

var ddd = JSON.stringify(images); // usually crashes here
$.ajax({
    type: 'POST',
    url: 'upload.ashx',
    xhr: provider,
    dataType: 'json',
    success: function (data) {
        alert('ajax success: data = ' + data);
    },
    error: function () {
        alert('ajax error');
    },
    data: ddd
});
What would be the best way to send the canvas elements to the server?
Should I send them all at once or one by one?
Uploading the files one by one is better. It requires less memory, and as soon as one file is ready to upload, the upload can be started instead of waiting while all the files are prepared.
Use FormData to send the files. It allows files to be uploaded in binary format instead of base64 encoded:
var formData = new FormData;
In Firefox, use canvas.mozGetAsFile('image.jpg') instead of canvas.toDataURL(). This avoids the unnecessary conversion from base64 to binary:
var file = canvas.mozGetAsFile('image.jpg');
formData.append('file', file);
In Chrome, use BlobBuilder to convert base64 into a blob (see the dataURItoBlob function in the answer below).
After playing around with a few things, I managed to figure this out myself.
First of all, this will convert a dataURI to a Blob:
// added for quick reference
function dataURItoBlob(dataURI) {
    // convert base64/URL-encoded data component to raw binary data held in a string
    var byteString;
    if (dataURI.split(',')[0].indexOf('base64') >= 0)
        byteString = atob(dataURI.split(',')[1]);
    else
        byteString = unescape(dataURI.split(',')[1]);

    // separate out the MIME component
    var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];

    // write the bytes of the string to a typed array
    var ia = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        ia[i] = byteString.charCodeAt(i);
    }
    return new Blob([ia], { type: mimeString });
}
(The function above comes from another question.) It can then be appended to a FormData object:
var blob = dataURItoBlob(canvas.toDataURL('image/jpeg'));
formData.append('file', blob);
And then send the formData object. I'm not sure how to do it in jQuery, but with a plain xhr object it works like so:
var xhr = new XMLHttpRequest;
xhr.open('POST', 'upload.ashx', false);
xhr.send(formData);
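(For what it's worth, browsers that implement canvas.toBlob() can skip the dataURL round-trip entirely; a small sketch, assuming toBlob support:)

canvas.toBlob(function (blob) {
    var formData = new FormData();
    formData.append('file', blob, 'image.jpg');
    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'upload.ashx', true);
    xhr.send(formData);
}, 'image/jpeg', 0.92);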
On the server you can get the files from the Files collection:
context.Request.Files[0].SaveAs(...);