save blob audio file on server with xmlhttprequest - javascript

I'm working on an audio web application where you can play along with music and record yourself. I'm working with a recorder plugin and I'm trying to save what has been recorded in a folder on a server. I get a blob file from the plugin via this javascript code:
recorder.exportWAV(function(blob) {
    var xhr = new XMLHttpRequest();
    var url = '../../audio/recordings/test.wav';
    xhr.open('POST', url, true);
    xhr.onload = function(e) {
        if (this.status == 200) {
            console.log(this.responseText);
        }
    };
    xhr.send(blob);
}, 'audio/wav');
I have never worked with this before, so I'm not sure my code is right. I get no errors, but the file is just not saved. I have been searching the internet for this, and what I have found is that a lot of people use a PHP file as the URL. Why? What PHP file are they using?
Thanks!

Related

How to request, & output JSON from local browser storage using AJAX (not jQuery)

Is it possible to request a JSON object from my browser's local storage with Ajax? It's just a simple object that I made with JS and converted into JSON.
I then stored it in local browser storage, but I'm not sure that will work, considering that it might only be possible to request from a server.
I have seen similar questions about this, but I only see examples with jQuery, not pure JavaScript and AJAX.
<p id="demo"></p>
<script>
    var info = {
        name: "Josh",
        age: 22,
        born: "New York"
    };
    var jason = JSON.stringify(info);
    localStorage.setItem("myJason", jason);
    var http = new XMLHttpRequest();
    http.open("GET", "file:///D:/HTML%20Files/Nettside%20med%20JSON%20og%20AJAX/nettside.html", true);
    http.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            document.getElementById("demo").innerHTML = this.responseText;
        }
    };
    http.send();
</script>
If you are running the HTML file from the local disk, then yes, you can access it by navigating the file structure, using "../" to go up a directory. However, if the HTML file has been loaded from a web server, the only way to access a local file is to use a file input and then read the file contents. The user must choose the file, though.
Here is an article on reading binary data from a file that the user has selected:
https://www.html5rocks.com/en/tutorials/file/dndfiles/
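Worth adding: the data in localStorage never needs an HTTP request at all. localStorage stores strings, so the object goes through JSON.stringify on the way in and JSON.parse on the way out. A small sketch using the question's own key name:

```javascript
// localStorage stores strings, so an object makes a round trip through
// JSON.stringify on the way in and JSON.parse on the way out -- no AJAX
// request is involved. In the browser the storage calls would be:
//   localStorage.setItem("myJason", JSON.stringify(info));
//   var restored = JSON.parse(localStorage.getItem("myJason"));
// The round trip itself is just:
var info = { name: "Josh", age: 22, born: "New York" };
var jason = JSON.stringify(info);   // the string that setItem would store
var restored = JSON.parse(jason);   // the object getItem + parse gives back
```

The XMLHttpRequest in the question is only needed to fetch resources over HTTP; reading back your own stored object is a synchronous local call.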

How would I convert a local image file to Base64 to POST to a custom API Endpoint?

I am working on a JavaScript library that is a rich text editor. Currently, we cannot copy + paste from a Word document because the clipboard content is a local file, and most browsers block the ability to read local files. Our workaround is to convert the local image file to base64 and, using our local API, store the image on our servers so the img has a proper, hosted source.
The problem is that this has to be handled entirely in JavaScript, which still runs in the browser, where I cannot see local files to convert them.
This is the code I am currently trying to use, but I can't seem to figure out a way to handle this entirely in JavaScript without using some kind of file loader from HTML:
function convertToBase64() {
    var xhr = new XMLHttpRequest();
    // Set up our request
    xhr.open('POST', 'http://uploadFileHere');
    // convert image to base64
    xhr.onload = function() {
        var reader = new FileReader();
        reader.onloadend = function() {
            callback(reader.result);
        }
        reader.readAsDataURL(xhr.response);
    };
    // handle the response from Image API
    xhr.onreadystatechange = function() {
        if (xhr.readyState == XMLHttpRequest.DONE && xhr.status == 200) {
            return xhr.responseText;
        }
    }
    xhr.send(base64String);
}
Is it possible to work around this issue, or do my team and I have to come up with another solution?
EDIT: Here is another rich text editor where it IS possible to copy and paste images from Word: Froala. If this is not possible, how are they doing it?
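For what it's worth, editors that accept pasted Word images generally don't read the file system at all: the browser hands the pasted image over as a File object on the paste event, and FileReader can turn that into base64. A sketch of that approach (the endpoint URL is a placeholder, and the helper name is my own, not part of any library):

```javascript
// Pulls the raw base64 payload out of a FileReader data URL,
// e.g. "data:image/png;base64,AAAA" -> "AAAA".
function base64FromDataUrl(dataUrl) {
  return dataUrl.slice(dataUrl.indexOf(',') + 1);
}

// Browser-only wiring (guarded so the snippet also parses outside a
// browser): pasted images arrive as File objects on the paste event,
// so no local-file access is needed.
if (typeof document !== 'undefined') {
  document.addEventListener('paste', function (e) {
    var items = e.clipboardData && e.clipboardData.items;
    if (!items) return;
    for (var i = 0; i < items.length; i++) {
      if (items[i].type.indexOf('image') === 0) {
        var reader = new FileReader();
        reader.onloadend = function (ev) {
          var base64 = base64FromDataUrl(ev.target.result);
          var xhr = new XMLHttpRequest();
          xhr.open('POST', 'http://uploadFileHere'); // placeholder endpoint
          xhr.send(base64);
        };
        reader.readAsDataURL(items[i].getAsFile());
      }
    }
  });
}
```

The key difference from the question's code is that the File comes from the paste event, not from an XHR response, so nothing ever has to open a path on the user's disk.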

How to upload a file in binary format using the ng-file-upload library in AngularJS?

I am uploading a file to an Amazon S3 bucket using a pre-signed URL from AngularJS. The zip file gets corrupted when I open the downloaded file. It works as expected when I upload the file from Postman in binary format.
I am using the ng-file-upload library in AngularJS. Every downloaded file contains a WebKit form boundary appended at the beginning, as follows:
If we edit out the WebKit form boundary and try to open the same file, it opens perfectly.
I experienced the same problem a week ago, and the only solution I came up with was to write my own code that sends files via an HTTP request. Feel free to use the example:
function azureChangeFn() {
    var inputElement = document.getElementById("fileSelect");
    var newFile = inputElement.files[0];
    var url = "https://YOUR_STORAGE_NAME.blob.core.windows.net/CONTAINER_NAME/" + newFile.name + "GENERATED_SharedAccessSignature_FROM_AZURE_SITE";
    var xhr = new XMLHttpRequest();
    xhr.addEventListener("progress", updateProgress);
    xhr.onreadystatechange = function() {
        if (xhr.readyState == XMLHttpRequest.DONE) {
            alert("Sent to azure storage service!");
        }
    }
    xhr.open('PUT', url, true);
    xhr.setRequestHeader("Content-type", "multipart/form-data");
    xhr.setRequestHeader("lng", "en");
    xhr.setRequestHeader("x-ms-blob-type", "BlockBlob");
    xhr.setRequestHeader("x-ms-blob-content-type", newFile.type);
    xhr.send(newFile);
}

function updateProgress(oEvent) {
    if (oEvent.lengthComputable) {
        var percentComplete = oEvent.loaded / oEvent.total;
        console.log(percentComplete);
    } else {
        alert('FAIL');
    }
}
<html>
<head>
</head>
<body>
    <label>Example without library:</label>
    <br>
    <label>Select file: <input type="file" id="fileSelect" onchange="azureChangeFn()"></label>
</body>
</html>
As for the ng-file-upload library: in its source code I found that the formData.append method, when appending a file, automatically adds the WebKit boundary. As far as I understood, FormData is meant for the POST method, whereas with PUT the whole body is taken as the file. Sadly, I was unable to upload a file to Azure storage with the POST method.
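The boundary described above is easy to see directly, since the standard FormData, Blob, and Response classes can serialize both kinds of body (this runs in any browser or Node 18+; the file name and bytes are made up for illustration):

```javascript
// A FormData POST body wraps every field in multipart boundary lines,
// which is exactly what ended up inside the downloaded zip. Sending the
// Blob itself as the body puts only its bytes on the wire, which is what
// a pre-signed S3 PUT expects.
const blob = new Blob(['zip-bytes-here'], { type: 'application/zip' });

// What the library was sending: a multipart body with boundary lines.
const fd = new FormData();
fd.append('file', blob, 'archive.zip');
new Response(fd).text().then((multipartBody) => {
  console.log(multipartBody.split('\r\n')[0]); // the "--...boundary" line
});

// What the pre-signed PUT expects: the blob's own bytes, nothing else.
new Response(blob).text().then((rawBody) => {
  console.log(rawBody);
});
```

So the corruption is not a bug in S3 or in the download: the boundary was uploaded as part of the object because the body was multipart.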

XMLHttpRequest Status of 404 With Blob URL

I'm trying to upload a blob URL generated by getUserMedia to the server.
Someone else wanted to do the same thing, and that question has been answered.
I am using the same code for the XHR request as the person who answered that question, but I'm getting a 404. I made a fiddle, and the same thing happens when you look at the console. I have a live example on my site as well, and the same thing happens.
Here is the piece of code that needs some TLC:
function XHR() {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', record.src); // 404 ???
    xhr.responseType = 'blob';
    xhr.onload = function(e) {
        if (this.status == 200) {
            // do some stuff
            result.innerText = this.response;
        }
    };
    xhr.send();
}
Everybody else seems to be able to do this, because this has been discussed before.
Exhibit A
Exhibit B
I'm using ChromeOS, 41.0.2252.2 dev. What exactly is going on, and why can't I get the blob?
I'm almost certain the media in a MediaStream isn't saved anywhere; it's just thrown away after use.
There is an API in the works for recording streams: MediaRecorder.
Only Firefox has a basic implementation of it, so it isn't usable yet.
If you're implementing this on a mobile device you can use a file input with the capture attribute.
<input type="file" accept="video/*" capture>
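For reference, basic use of the MediaRecorder API mentioned above looks like the sketch below (the function name and the fixed recording length are illustrative choices, not part of the API):

```javascript
// Records from a MediaStream for `ms` milliseconds and resolves with a
// single Blob containing the recorded media -- the thing the question's
// XHR actually needs.
function recordClip(stream, ms) {
  return new Promise(function (resolve) {
    var recorder = new MediaRecorder(stream);
    var chunks = [];
    recorder.ondataavailable = function (e) { chunks.push(e.data); };
    recorder.onstop = function () {
      resolve(new Blob(chunks, { type: recorder.mimeType }));
    };
    recorder.start();
    setTimeout(function () { recorder.stop(); }, ms);
  });
}
// Browser usage:
//   navigator.mediaDevices.getUserMedia({ audio: true })
//     .then(function (s) { return recordClip(s, 3000); })
//     .then(function (blob) { /* xhr.send(blob) */ });
```

With a real Blob in hand there is no need to GET the blob: URL back at all, which sidesteps the 404 entirely.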
function XHR() {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", record.src, true); // third argument makes the request asynchronous
    xhr.responseType = 'blob';
    xhr.onload = function(e) {
        if (this.status == 200) {
            // do some stuff
            result.innerText = this.response;
        }
    };
    xhr.send();
}
Try it now; it should work.
Look at this post:
Html5 video recording and upload?
What you are missing is the declaration of what "blob" is. The first thing this person does inside the .onload function is:
var blob = new Blob([this.response], {type: 'video/webm'});
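Putting that answer's pieces together, the whole flow reads as below; this is a sketch, and the upload endpoint and field names are placeholders:

```javascript
// Fetch the media behind a blob: URL, wrap the response in a typed Blob
// (the missing declaration the answer points out), then upload it as a
// normal multipart POST. '/upload' is a placeholder endpoint.
function fetchAndUpload(srcUrl) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', srcUrl, true);
  xhr.responseType = 'blob';
  xhr.onload = function () {
    if (this.status == 200) {
      var blob = new Blob([this.response], { type: 'video/webm' });
      var form = new FormData();
      form.append('video', blob, 'recording.webm');
      var upload = new XMLHttpRequest();
      upload.open('POST', '/upload', true);
      upload.send(form);
    }
  };
  xhr.send();
}
// fetchAndUpload(record.src);
```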

Sending tab screenshots to a remote server from a firefox addon

I have a Firefox add-on (made using the Firefox Add-on SDK) which, when instructed, takes a screenshot of the currently active tab and sends it in an AJAX request to a remote server.
The following code in the add-on script main.js is responsible for getting the thumbnail:
var tab_image_data = tabs.activeTab.getThumbnail();
tab_image_data = base64.encode(tab_image_data);
panel.port.emit("screenshot", {image_data: tab_image_data});
The image data generated by the function getThumbnail() is sent to a content script file belonging to a panel.
In the content script, the following code is responsible for sending the image data to the server:
var tab_image_data = addonmessage.image_data;
var myBlob = new Blob([tab_image_data], { "type": "text/base64data" });
var formData = new FormData();
formData.append('img', myBlob);
var xhr = new XMLHttpRequest();
xhr.open("POST", "http://xyz.abc.com/imageshot/index.php", true);
xhr.onload = function() {
    if (xhr.status == 200) {
        console.log('all done: sq');
    } else {
        console.log('Nope');
    }
};
xhr.send(formData);
Everything works fine, and the image file is created on the server in JPEG format. But when I try to view that image, Windows Photo Viewer gives a "file format not supported" error.
I tried to base64-encode the data before sending it, but that did not work either. Surprisingly, I have a Chrome extension that performs a similar function, and it works flawlessly. In both cases the same PHP script is used on the server side.
Any help regarding this matter will be highly appreciated.
Thanks in advance.
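One likely cause, assuming getThumbnail() hands back a data URL: the Blob in the content script wraps base64 text, so the server writes text bytes into a .jpeg file. Decoding the data URL into real image bytes before appending it would produce a valid image. A sketch of that conversion (the helper name and the append line are illustrative):

```javascript
// Decodes a "data:<mime>;base64,<payload>" string into a Blob of real
// bytes. atob is the browser's standard base64 decoder; wrapping the
// base64 text itself in a Blob (as the question does) would only ship
// text, which no image viewer can open.
function dataURLToBlob(dataURL) {
  var parts = dataURL.split(',');               // [header, base64 payload]
  var mime = parts[0].match(/data:(.*?);/)[1];  // e.g. "image/png"
  var binary = atob(parts[1]);                  // base64 -> byte string
  var bytes = new Uint8Array(binary.length);
  for (var i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  return new Blob([bytes], { type: mime });
}
// formData.append('img', dataURLToBlob(tab_image_data), 'shot.png');
```

This would also explain why re-encoding with base64.encode made things worse: it base64-encoded an already base64-bearing data URL.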