I'm trying to POST a Blob object from a client written in AngularJS to a Node.js server. I'm successfully receiving the message.
The client-side code is something like:
var imgData = [/*about 8K of data read from file*/];
var blob = new Blob([imgData], { type: 'application/octet-binary' });
var reader = new FileReader();
reader.addEventListener("loadend", function() {
    var request = new XMLHttpRequest();
    request.open("POST", "https://xyzabc.io");
    request.send(reader.result);
});
reader.readAsArrayBuffer(blob);
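(For what it's worth, the FileReader round-trip isn't strictly needed: XMLHttpRequest.send() accepts a Blob directly. A minimal sketch, same endpoint assumed:)

var request = new XMLHttpRequest();
request.open("POST", "https://xyzabc.io");
request.send(blob); // the Blob's bytes become the request body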
I have written a C++ addon as described in https://nodejs.org/api/addons.html. When I try to parse the arguments, the attached data is detected as a v8::Object and not a v8::ArrayBuffer.
void RunCallback(const FunctionCallbackInfo<Value>& args) {
    if (args[1]->IsObject()) {
        // this branch is taken -- not what I was expecting
    }
    if (args[1]->IsArrayBuffer()) {
        // this is what I'm looking for
    }
}
What am I doing wrong here? Is a Blob not supposed to be sent that way?
I took the client-side code from https://developer.mozilla.org/en/docs/Web/API/Blob
In my module I did it like this:
if (node::Buffer::HasInstance(args[1])) {
    // this is a Buffer
}
And it works fine for Node v4.
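For context, here is a minimal sketch of where that Buffer comes from (assuming a plain Node http server and a hypothetical addon export named runCallback). The POSTed ArrayBuffer arrives server-side as Buffer chunks, and a Buffer is a Uint8Array in V8 terms, not an ArrayBuffer, so args[1]->IsArrayBuffer() is false while node::Buffer::HasInstance() is true:

const http = require('http');
const addon = require('./build/Release/addon'); // hypothetical addon path

http.createServer((req, res) => {
    const chunks = [];
    req.on('data', (chunk) => chunks.push(chunk)); // each chunk is a Buffer
    req.on('end', () => {
        const body = Buffer.concat(chunks);
        addon.runCallback(null, body); // hypothetical export; body is what args[1] sees
        res.end('ok');
    });
}).listen(3000);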
I am creating a website and I need to make a backend POST server. I have created 90% of it, but the issue I am facing is that I need to send JSON as well as a PDF (normally sent as multipart/form-data, but that would need a different route). What I am trying to do is transform the file into a Base64 string, send it over in a request, and then restore it and write it to a file. This whole thing happens, and the PDF even arrives as scrambled data, but it is just a blank page when written to disk, even after being converted back to binary.
HTML-side JS code:
async function post(endpoint) {
    let binaryCV;
    let CV = document.getElementById("upfile").files[0];
    var reader = new FileReader();
    reader.onload = (readerEvt) => {
        var binaryString = readerEvt.target.result;
        binaryCV = binaryString;
        let xhr = new XMLHttpRequest();
        let object = {
            name: document.getElementById("ecaName").value,
            email: document.getElementById("ecaEmail").value,
            phone: document.getElementById("ecaTel").value,
            class: document.getElementById("ecaClass").value,
            institute: document.getElementById("ecaInstitute").value,
            paragraph: {
                experience: document.getElementById("ecaExp").value,
                why: document.getElementById("ecaWhyPart").value,
                changes: document.getElementById("ecaWhatChanges").value,
            },
            CV: binaryCV,
        };
        xhr.open("POST", `http://localhost:8080/apply/internship/${endpoint}`, true);
        xhr.setRequestHeader('Content-Type', 'application/json');
        xhr.send(JSON.stringify(object));
    };
    // note: readAsBinaryString does not return a Promise, so this await is a no-op
    await reader.readAsBinaryString(CV);
    /*xhr.onload = function() {
        if (this.readyState == 4 && this.status == 200) {
            document.getElementById("demo").innerHTML = xhttp.responseText;
        }
    };*/
}
Server side JS callback:
app.post('/apply/internship/econoxe', async (req, res) => {
    res.sendStatus(200);
    let CV = req.body.CV;
    fs.writeFileSync(path.join(__dirname, `../CV/${req.body.email}.pdf`), CV);
    console.log(req.body);
})
All this returns 100% blank PDFs (with a larger file size than the original, for some reason) no matter what PDF I upload.
Please help!
If you know any other way to do what I mean in one request and route, please tell!
I found the answer! I had to change
fs.writeFileSync(path.join(__dirname,`../CV/${req.body.email}.pdf`),CV)
to
fs.writeFileSync(path.join(__dirname,`../CV/${req.body.email}.pdf`),CV,"binary")
and this fixed the issue.
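For reference, the Base64 variant mentioned in the question would look roughly like this (a sketch, not tested; route and field names taken from the question, everything else is an assumption). Base64 survives JSON.stringify untouched, so no encoding argument is needed when decoding:

// Client: read the file as a data URL and send only the Base64 payload.
reader.readAsDataURL(CV);
reader.onload = (readerEvt) => {
    // result looks like "data:application/pdf;base64,JVBERi0x..."
    object.CV = readerEvt.target.result.split(',')[1];
    xhr.send(JSON.stringify(object));
};

// Server: decode the Base64 string back into raw bytes before writing.
app.post('/apply/internship/econoxe', (req, res) => {
    const buffer = Buffer.from(req.body.CV, 'base64');
    fs.writeFileSync(path.join(__dirname, `../CV/${req.body.email}.pdf`), buffer);
    res.sendStatus(200);
});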
I'm working on a REST web application that manages documents between users and uploaders. The backend is written in Java, and my Document entity contains, besides various attributes, a byte[] content. I was able to send a file created on the server side by
@GET
...
document.setContent(Files.readAllBytes(Paths.get("WEB-INF/testFile.txt")));
return Response.ok(document).build();
and retrieve it at the front end (Vue.js) through
async function download(file) {
    const fileURL = window.URL.createObjectURL(new Blob([atob(file.content)]));
    const fileLink = document.createElement("a");
    fileLink.href = fileURL;
    fileLink.setAttribute("download", `${file.name}.${file.type}`);
    document.body.appendChild(fileLink);
    fileLink.click();
    fileLink.remove();
    window.URL.revokeObjectURL(fileURL);
}
The problem is that when I try to upload a file and then download it, its content is not parsed correctly (it shows up as undefined, a Base64 string, or numbers, depending on how I try to solve it). The file is sent by a POST request and is retrieved through an input form bound to an onFileSelected function.
function onFileSelected(e) {
    var reader = new FileReader();
    reader.readAsArrayBuffer(e.target.files[0]);
    reader.onloadend = (evt) => {
        if (evt.target.readyState === FileReader.DONE) {
            var arrayBuffer = evt.target.result;
            this.file.content = new Uint8Array(arrayBuffer);
            //this.file.content = arrayBuffer;
        }
    };
}
axios.post(...,document,...)
and I have tried using atob and btoa as well before assigning the value to this.file.content. If I print the file Welcome.txt on the server it gives [B@ae3b74d, and if I use Arrays.toString(welcome.getContent()) it gives an array of numbers, but as soon as it is passed to the frontend its content becomes Base64: welcome: { ... content: IFRoaXMgaXMgYSB0ZXN0IGZpbGUhIAo...}. Any idea? Thank you a lot!
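One hedged guess at what is happening here: Jackson-style JSON mapping serializes a Java byte[] as Base64, and on the download side new Blob([atob(...)]) re-encodes the decoded string as UTF-8, which corrupts any byte above 0x7F. A minimal byte-preserving decode would look like this (the helper name is mine, not from the code above):

// Decode the Base64 string the backend produces for byte[] content
// into raw bytes before wrapping it in a Blob.
function base64ToUint8Array(b64) {
    const binary = atob(b64);
    const bytes = new Uint8Array(binary.length);
    for (let i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    return bytes;
}

const fileURL = window.URL.createObjectURL(new Blob([base64ToUint8Array(file.content)]));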
I'm trying to send audio from a client (JavaScript) to a server (Java). I take the user audio from the microphone and then make a Blob from it (and a URL for the Blob). The project is a Spring Boot project, so I am looking for a way to send it as a parameter in a method to upload it to the server.
I was hoping that it would be possible to upload the Blob to the server, but it seems to only be available locally in the browser, and since the URL for the Blob starts with "blob:" before "http" it causes problems.
I have also looked at serialization but don't seem to find a way to do that with a Blob in JS.
Right now I am just passing the blob URL between the client and the server.
Client side in JS:
// Convert the audio data into a blob
// after stopping the recording
mediaRecorder.onstop = function (ev) {
    console.log(dataArray);
    // blob of type mp3
    let audioData = new Blob(dataArray,
        { 'type': 'audio/mp3' });
    // After filling up the chunk
    // array, make it empty
    dataArray = [];
    // Create an audio url referencing
    // the blob named 'audioData'
    let audioSrc = window.URL
        .createObjectURL(audioData);
    //console.log(audioSrc);
    // Pass the audio url to the audio tag
    playAudio.src = audioSrc;
    const url = "http://localhost:8080/speech?url=" + audioSrc;
    console.log(url);
    $.get(url, function (data) {
        $("#resultat").html("transcribed tekst: " + data);
    });
}
Server in Java
@GetMapping("/speech")
public String speechToText(String url) throws IOException {
try (SpeechClient speechClient = SpeechClient.create()) {
// The path to the audio file to transcribe
String gcsUri = url;
// Builds the sync recognize request
RecognitionConfig config =
RecognitionConfig.newBuilder()
.setEncoding(RecognitionConfig.AudioEncoding.LINEAR16)
.setSampleRateHertz(16000)
.setLanguageCode("en-US")
.build();
RecognitionAudio audio = RecognitionAudio.newBuilder().setUri(gcsUri).build();
// Performs speech recognition on the audio file
RecognizeResponse response = speechClient.recognize(config, audio);
List<SpeechRecognitionResult> results = response.getResultsList();
for (SpeechRecognitionResult result : results) {
// There can be several alternative transcripts for a given chunk of speech. Just use the
// first (most likely) one here.
SpeechRecognitionAlternative alternative = result.getAlternativesList().get(0);
System.out.printf("Transcription: %s%n", alternative.getTranscript());
return alternative.getTranscript();
}
return "idk";
} catch (IOException e) {
e.printStackTrace();
return "noe ble feil";
}
}
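The blob: URL in that request only resolves inside the browser that created it, so the Speech client on the server cannot fetch it. A minimal client-side sketch of one alternative (an assumption on my part, not the author's code): send the recorded bytes themselves as multipart form data, and accept them on the Spring side as a MultipartFile. The field name "file" is a placeholder and must match the server-side parameter:

mediaRecorder.onstop = function () {
    const audioData = new Blob(dataArray, { type: 'audio/mp3' });
    dataArray = [];
    // POST the actual bytes instead of a blob: URL
    const form = new FormData();
    form.append('file', audioData, 'speech.mp3');
    fetch('http://localhost:8080/speech', { method: 'POST', body: form })
        .then((res) => res.text())
        .then((text) => $("#resultat").html("transcribed tekst: " + text));
};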
I'm using the gcloud API on a Node.js web server to upload files. I'd prefer the files not be uploaded on the client side and instead uploaded from the server. Currently, I am producing a blob on the client side, then converting it to text and passing that to the server through a POST request. All of the information gets successfully passed from the client to the server as expected. This data is also uploaded to gcloud; however, gcloud does not recognize it as a valid file, and neither does my computer when I download it.
What is the best way to get the contents of the file to gcloud from the server side? I've tried using data URIs and reading the original file as text, and both produce similar issues. I've also explored piping a readFileStream from the blob on the server end, but blobs are not natively supported by Node, so I have not done so yet.
Client Side
function readSingleFile(e, func, func2) {
    var file = e.target.files[0];
    if (!file) {
        return; // Add error msg here
    }
    var reader = new FileReader();
    reader.onload = function (e) {
        let contents = e.target.result;
        let img = document.createElement('img');
        let cvs = document.createElement('canvas');
        img.onload = () => {
            cvs.width = img.width;
            cvs.height = img.height;
            let ctx = cvs.getContext('2d');
            ctx.drawImage(img, 0, 0);
            cvs.toBlob((res) => { res.text().then((text) => { func2(text); }); }, "image/jpeg", 0.92);
        };
        img.src = contents;
        func(contents);
    };
    reader.readAsDataURL(file);
}
Server Side
function publishPrintjob(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpg');
        gcloudFile.save(dataObj.sockImageFile, function (err) {
            if (!err) {
                console.log("File uploaded!");
            }
        });
        var data = {
            date: dataObj.Date,
            email: dataObj.email,
            design: dataObj.Design,
            author: dataObj.Author,
            address: dataObj.address,
            imageKey: newElemKey,
        };
        admin.database().ref('queue/' + newElemKey).set(data);
    } catch (err) {
        console.log(err);
    }
}
Note: func simply shows the image on the client side, func2 just adds the contents to the POST object.
Uploading a file directly from the computer would be easiest using the storage.bucket(bucketName).upload() function from the Cloud Storage library. However, this takes the local path of a file, and thus will not work unless the file is first transferred to the server and saved to disk, which could be achieved with multipart form data. Using multipart form data, or uploading from a local path, are better methods for uploading to Google Storage.
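For illustration, a minimal sketch of that local-path upload, assuming @google-cloud/storage; the bucket name and paths are placeholders:

const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Uploads a file already saved on the server's disk.
async function uploadLocalFile() {
    await storage.bucket('my-bucket').upload('/tmp/image.jpg', {
        destination: 'images/image.jpg',
    });
}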
Instead, I solved this by first converting the image to a data URI, sending the data URI to the server in the body of a POST request, and then converting it to a buffer wrapped in a readable stream that can be piped to Google Storage.
Client
let formData = getFormData('myForm');
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (this.readyState == 4 && this.status == 200) {
        // Typical action to be performed when the document is ready
    }
};
xhttp.open("POST", "dashboard", true);
xhttp.setRequestHeader('Content-Type', 'application/json');
xhttp.send(JSON.stringify(formData));
xhttp.onload = () => {
    console.log(JSON.parse(xhttp.response));
    // Handle server response here
};
Server
// dataObj is the body of the POST request; the property ImageFile is the data URI produced on the client
function uploadImageOnServer(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key to use as filename
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpeg');
        const { Readable } = require('stream'); // Should be required at the top of the file
        var string = dataObj.ImageFile;
        var regex = /^data:.+\/(.+);base64,(.*)$/;
        var matches = string.match(regex);
        var ext = matches[1];
        var data = matches[2];
        var buffer = Buffer.from(data, 'base64');
        // Create the readstream
        const readableInstanceStream = new Readable({
            read() {
                this.push(buffer);
                this.push(null);
            }
        });
        readableInstanceStream.pipe(gcloudFile.createWriteStream()) // link to gcloud storage api
            .on('error', function (err) {
                console.log('error');
            })
            .on('finish', function () {
                console.log('upload complete');
            });
    } catch (err) {
        console.log(err);
    }
}
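As a side note (my suggestion, not part of the answer above): the same library's File#save also accepts a Buffer directly, which would avoid building the stream by hand:

// 'buffer' is the Buffer decoded from the data URI above.
gcloudFile.save(buffer, { contentType: 'image/jpeg' }, (err) => {
    if (err) console.log(err);
    else console.log('upload complete');
});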
I am following this and I want to upload any type of file to OneDrive. It says that it accepts a buffer for the file content, but my code below does not seem to work for every type of file. The file gets uploaded but it cannot be opened, so the contents are messed up for sure.
Using the following method, I am trying to get the body contents so that I can send them with the request.
private fileToBuffer(file: File): Observable<any> {
    return Observable.create(observer => {
        var arrayBuffer;
        var fileReader = new FileReader();
        fileReader.onload = function () {
            arrayBuffer = this.result;
            observer.next(arrayBuffer);
            observer.complete();
        };
        fileReader.readAsArrayBuffer(file);
    });
}
I did not notice that Angular 2's Http PUT takes the body as a string. So I resorted to using XHR to upload the file with its contents.
var oReq = new XMLHttpRequest();
oReq.open("PUT", url, true);
oReq.setRequestHeader("Content-Type", "text/plain");
oReq.onload = function (e) {
    console.log('done');
};
oReq.send(arrayBuffer);
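To tie the two snippets together, a hypothetical usage sketch (names assumed from the code above):

// Subscribe to the Observable and PUT the resulting ArrayBuffer.
this.fileToBuffer(file).subscribe((arrayBuffer) => {
    const oReq = new XMLHttpRequest();
    oReq.open("PUT", url, true); // 'url' as in the snippet above
    oReq.onload = () => console.log('done');
    oReq.send(arrayBuffer);
});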