Send a file with POST request using C# to Node.js server - javascript

I have a local Node.js server and I want to upload a file from a WP app to this server using a POST request. The Node.js server can parse multipart/form-data (reading data from "files" and "fields") and save the file in a local folder using the node-formidable library.
How can I send a request like this one (made with the web form) using C# for WinRT?
https://www.imageupload.co.uk/images/2015/04/27/Cattura.png
I tried this code:
public async Task<string> SendWithMultiPartForm()
{
    string servResp = "";
    using (var content = new MultipartFormDataContent(boundary))
    {
        content.Headers.Remove("Content-Type");
        content.Headers.TryAddWithoutValidation("Content-Type", "multipart/form-data; boundary=" + boundary);
        content.Add(new StreamContent(await fileToSend.OpenStreamForReadAsync()), "files", "files");
        HttpClientHandler handler = new HttpClientHandler();
        cookieContainer = new CookieContainer();
        handler.CookieContainer = cookieContainer;
        HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Post, SiteURI);
        request.Headers.ExpectContinue = false;
        request.Content = content;
        httpClient = new HttpClient(handler);
        HttpResponseMessage response = await httpClient.SendAsync(request);
        response.EnsureSuccessStatusCode();
        servResp = await response.Content.ReadAsStringAsync();
    }
    return servResp;
}
but the result is:
https://www.imageupload.co.uk/images/2015/04/27/Cattura2.png
Can anybody help me? Thanks!
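One detail worth checking in the snippet above: when the Content-Type header is removed and re-added by hand, the boundary announced in the header can end up differing from the one MultipartFormDataContent actually writes into the body, and formidable then finds no parts. A language-neutral sketch (in Python, with a made-up boundary and fake file bytes) of the wire format the header must agree with:

```python
# Sketch of the multipart/form-data wire format node-formidable expects.
# The boundary in the Content-Type header must match the body exactly.
boundary = "----exampleBoundary1234"          # hypothetical boundary value
file_bytes = b"\x89PNG fake image bytes"      # placeholder file content

body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="files"; filename="Cattura.png"\r\n'
    "Content-Type: application/octet-stream\r\n"
    "\r\n"
).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()

# The header announces the same boundary string used in the body above.
headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
print(headers["Content-Type"])
```

MultipartFormDataContent already generates a matching header and body on its own, so it is usually safer not to remove and re-add the Content-Type header at all.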

Related

How do I send compressed data from Laravel server to frontend JS (Solved)

Solved: Added the solution I used in an answer below
I have a compressed JSON file in my storage folder with path storage/app/public/data.json.gz. I am trying to send this data to my JS frontend via a fetch request. Sending the data works, but I am having trouble decompressing it back to JSON in JS so I can use it in my code. I've read that the browser might be able to decompress it automatically, but I'm not sure how to enable that. The reason I send the data compressed is that it is 130 MB of data that shrinks down to 7 MB when compressed, and I am hoping that sending less data will speed up the site for users.
Laravel route that sends compressed file
Route::get('/chunks/{index}', function ($index) {
    $path = 'public/chunks_'.$index.'.json.gz';
    if (!Storage::exists($path)) {
        abort(404);
    }
    return Storage::response($path);
});
Currently I am using the fetch API to get the data
JS Code
let chunks = await getZipFile("/chunks/0", []).then((data) => {
    return data;
});

public static async getZipFile(url: string, params: any = {}, method = "GET", headers: any = {
    "Content-Type": "application/zip",
}) {
    headers['X-Requested-With'] = 'XMLHttpRequest';
    let options: any = {
        'method': method,
        'headers': headers
    };
    url += "?" + new URLSearchParams(params).toString();
    const result = await fetch(url, options).then((response) => response);
    return result;
};
Any help would be appreciated. Currently I can retrieve the compressed data and convert it to a string with result.text(), but I have not been able to figure out how to decompress it. I tried using Zlib to decompress but got an error: Can't resolve './zlib_bindings'. So I'm looking either for a solution similar to using Zlib (or something like it) to decompress, or for a way to configure the server/browser to decompress automatically.
I ended up taking Moradnejad's answer and using zip.js; here is the updated code.
Laravel Route:
Instead of declaring a route to send the file, I used a Laravel symbolic link to get a .zip file from my public storage. (https://laravel.com/docs/9.x/filesystem#the-public-disk)
Also, in case it is helpful, here is the command I wrote to create the .zip file from the files in my storage.
public function handle()
{
    $fileNames = [];
    for ($i = 0; $i < 10000/1000; $i++) {
        array_push($fileNames, 'public/chunks_'.$i.'.json');
    }
    $this->zipFiles($fileNames, './storage/app/public', './storage/app/public/chunks.zip');
    return 0;
}

public function zipFiles($files, $path, $zipFileNameAndPath) {
    $zip = new ZipArchive;
    $zip->open($zipFileNameAndPath, ZipArchive::CREATE);
    foreach ($files as $file) {
        $zip->addFile($path.'/'.$file, $file);
    }
    $zip->close();
}
Updated JS request code; I used result.blob() to return a blob of the data.
public static zipRequest(url: string) {
    return this.getZipFile(url);
}

public static async getZipFile(url: string, params: any = {}, method = "GET", headers: any = {
    "Content-Type": "application/zip",
}) {
    headers['X-Requested-With'] = 'XMLHttpRequest';
    let options: any = {
        'method': method,
        'headers': headers
    };
    if ("GET" === method) {
        url += "?" + new URLSearchParams(params).toString();
    } else {
        // add CSRF token to POST request
        options.headers["X-CSRF-TOKEN"] = document.querySelector<HTMLElement>('meta[name="csrf-token"]')!.getAttribute('content');
        options.body = JSON.stringify(params);
    }
    const result = await fetch(url, options).then((response) => response);
    return result.blob();
};
Updated JS code to handle the blob result; I am using zip.js to get all 10 JSON files out of the .zip data and then merging them together.
import * as zip from "@zip.js/zip.js";

async function getAllChunks() {
    let chunks = await Helper.getZipFile("storage/chunks.zip", []).then(async (data) => {
        //console.log(data);
        let allChunks: any = [];
        let textReader = new zip.BlobReader(data);
        let zipReader = new zip.ZipReader(textReader);
        let entries = await zipReader.getEntries();
        for (let i = 0; i < entries.length; i++) {
            let entry = entries[i];
            if (entry.getData) {
                let textWriter = new zip.TextWriter();
                let jsonString = await entry.getData(textWriter);
                let chunks = await JSON.parse(jsonString);
                allChunks.push.apply(allChunks, chunks);
            }
        }
        return allChunks;
    });
    return chunks;
}
You're mixing two ideas. HTTP traffic can be compressed and decompressed at a level below the application: the server and browser handle compression and decompression by themselves, if enabled. See here.
What you have here is a compressed file. No frontend or ajax call would decompress it automatically for you.
Solutions:
Either enable compression for HTTP responses and depend on it to handle compression automatically. In this version you send the uncompressed file. This could be helpful.
Or use a frontend library, like 'zip.js', to decompress when you receive the compressed file.
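To make the first option concrete: if the pre-compressed bytes are served with a Content-Encoding: gzip response header (plus the real Content-Type), the browser's fetch undoes the compression transparently and response.json() works on the result. A minimal sketch of that round trip, with Python's gzip module standing in for both the server's file and the browser's decompression (the header names are the standard HTTP ones; how to attach them to Laravel's Storage::response is left as an exercise):

```python
import gzip
import json

data = [{"id": i} for i in range(1000)]  # stand-in for the chunk data
raw = json.dumps(data).encode()

# What the route would send: the pre-compressed file bytes, plus headers
# telling the browser how to undo the compression.
compressed = gzip.compress(raw)
headers = {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",  # this is what triggers transparent decompression
}

# What the browser does before handing the body to response.json():
decompressed = gzip.decompress(compressed)
assert json.loads(decompressed) == data
print(f"{len(raw)} bytes -> {len(compressed)} bytes on the wire")
```

Note that some web servers and proxies interact with Content-Encoding themselves, so it is worth verifying the header survives end to end in your deployment.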

How do I parse an image sent from Retrofit API from Android (multipart/form) in Python's Flask

I am sending my image as a part of Form Data through Retrofit API. There are no issues loading the image. I am trying to get this image in a Python Flask server.
My Python code is not responding the expected way. I have tested it with a JavaScript frontend application, and the Python server responds as expected there, so I believe the issue is in parsing the multipart/form file I receive from Android.
There are no network issues; I am able to log the requests. The detectFaces() function is not responding as expected for the same image sent through the two clients, VueJS and Android.
Any ideas will be appreciated.
Here is the android code for uploading:
private void sendImageToServer() {
    File imageFile = loadImageFromStorage(tempImagePath);
    RequestBody reqBody = RequestBody.create(MediaType.parse("image/jpeg"), imageFile);
    MultipartBody.Part partImage = MultipartBody.Part.createFormData("file", "testImage", reqBody);
    API api = RetrofitClient.getInstance().getAPI();
    Call<TestResult> upload = api.uploadImage(partImage);
    upload.enqueue(new Callback<TestResult>() {
        @Override
        public void onResponse(Call<TestResult> call, Response<TestResult> response) {
            if (response.isSuccessful()) {
                TestResult res = response.body();
                String jsonRes = new Gson().toJson(response.body());
                String result = res.getResult();
                Log.v("REST22", result);
            }
        }

        @Override
        public void onFailure(Call<TestResult> call, Throwable t) {
            Log.v("REST22", t.toString());
            Toast.makeText(MainActivity.this, t.toString(), Toast.LENGTH_SHORT).show();
        }
    });
}
Here is Python code:
@app.route('/detectFaces/', methods=['POST'])
def detectFaces():
    img = request.files.get('file')
    print('LOG', request.files)
    groupName = 'random-group-03'
    result = face.detectFaces(img, groupName)
    print('RESULT', result)
    return {'result': result[0]}
VueJs - alternate working frontend (REST client):
sendImage(img) {
    console.log(img)
    var form = new FormData();
    form.append('file', img, 'testImage')
    axios.post(this.baseUrl + 'detectFaces/?groupName=random-group-03', form)
        .then(res => {console.log(res.data); this.log = 'Detected face ids: \n ' + res.data.result});
}
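Since the same image behaves differently depending on the client, a practical next step is to capture the raw request each client sends and compare the file part's headers (per-part Content-Type, filename) and payload size; differences there are the usual suspects. A minimal stdlib sketch of such an inspection, run here on a hand-built body imitating the Retrofit part (all names are hypothetical, and this is a debugging aid, not a full multipart parser):

```python
def inspect_parts(body: bytes, boundary: str):
    """Split a multipart/form-data body and report each part's headers
    and payload size (debugging aid, not a full RFC 2046 parser)."""
    delim = ("--" + boundary).encode()
    parts = []
    for chunk in body.split(delim)[1:-1]:  # skip preamble and closing "--"
        head, _, payload = chunk.strip(b"\r\n").partition(b"\r\n\r\n")
        headers = dict(
            line.split(": ", 1)
            for line in head.decode().split("\r\n")
        )
        parts.append((headers, len(payload)))
    return parts

# Hand-built body imitating what Retrofit sends for partImage.
boundary = "XYZ"
body = (
    b"--XYZ\r\n"
    b'Content-Disposition: form-data; name="file"; filename="testImage"\r\n'
    b"Content-Type: image/jpeg\r\n\r\n"
    b"\xff\xd8\xe0fakejpegbytes\r\n"
    b"--XYZ--\r\n"
)
for part_headers, size in inspect_parts(body, boundary):
    print(part_headers, size)
```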

pinFileToIPFS API calling for S3 File URL

My file is an image link or any other file link. I was trying to call the Pinata pinFileToIPFS API (Documentation).
According to the documentation, I have to pass the path of a local file for appending. But I have an AWS URL. How can I still call the API below?
let data = new FormData();
data.append('file', fs.createReadStream('./yourfile.png'));
NOTE: I also tried this
data.append('file', s3.getObject({Bucket: myBucket, Key: myFile})
.createReadStream());
but it didn't work.
I spent 3 days on this and finally got it to work. The trick is that you will have a hard time POSTing a Stream directly to the Pinata API as it expects a File. I finally gave up and took the Stream coming from S3 and saved it to a temporary file on the server, then sent it to Pinata, then deleted the temp file. That works.
try
{
    // Copy file from S3 to IPFS
    AmazonS3.AmazonS3Utility m = new AmazonS3.AmazonS3Utility();
    Stream myfile = m.GetObjectStream(fileName);
    using (var client = new RestClient("https://api.pinata.cloud"))
    {
        // Add IPFS metadata
        JObject pinataOptions = new JObject(
            new JProperty("cidVersion", "0")
        );
        string pO = JsonConvert.SerializeObject(pinataOptions, Formatting.Indented);
        JObject pinataMetadata = new JObject(
            new JProperty("name", fileName),
            new JProperty("keyvalues",
                new JObject(
                    new JProperty("file_url", FileUrl),
                    new JProperty("description", "")
                )
            ));
        string pM = JsonConvert.SerializeObject(pinataMetadata, Formatting.Indented);
        string tempFile = AppDomain.CurrentDomain.BaseDirectory + @"temp\" + fileName;
        long fileLength;
        using (FileStream outputFileStream = new FileStream(tempFile, FileMode.Create))
        {
            myfile.CopyTo(outputFileStream);
            fileLength = outputFileStream.Length;
        }
        var request = new RestRequest("/pinning/pinFileToIPFS", Method.Post);
        request.AddQueryParameter("pinata_api_key", your_APIKey);
        request.AddQueryParameter("pinata_secret_api_key", your_SecretAPIKey);
        request.AddParameter("pinataOptions", pO);
        request.AddParameter("pinataMetadata", pM);
        request.AddHeader("Authorization", your_Pinata_JWT);
        request.AddFile("file", tempFile, fileName);
        RestResponse response = await client.ExecutePostAsync(request);
        File.Delete(tempFile);
        return response.Content;
    }
}
catch { }
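The workaround above translates to any stack: copy the remote stream into a named temporary file, hand that path to the upload call that insists on a real file, and delete the file afterwards. A sketch of just that pattern in Python, with the S3 download and Pinata upload stubbed out (both helper names are made up):

```python
import io
import os
import shutil
import tempfile

def upload_via_temp_file(source_stream, upload_from_path):
    """Copy a stream to a temp file, pass its path to an upload function
    that requires a real file, then clean up."""
    fd, tmp_path = tempfile.mkstemp(suffix=".png")
    try:
        with os.fdopen(fd, "wb") as out:
            shutil.copyfileobj(source_stream, out)
        return upload_from_path(tmp_path)  # e.g. the pinFileToIPFS call
    finally:
        os.remove(tmp_path)

# Stubbed usage: the "S3 stream" is an in-memory buffer and the "upload"
# just reports the size of the file it was given.
fake_s3_stream = io.BytesIO(b"fake image bytes")
result = upload_via_temp_file(fake_s3_stream, lambda p: os.path.getsize(p))
print(result)  # 16
```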

Java: Image upload with JavaScript - File is damaged, corrupted or too large

I am using Spring Boot as backend server and I have a JavaScript frontend.
For sending data between front- and backend I'm using the Axios library, which usually works pretty fine.
The Problem:
The image looks like this in the (Chrome) browser console:
It's a very, very long alphanumeric string, and that's what I send to the server with the following code:
static uploadFiles(files) {
    const data = new FormData();
    Object.keys(files).forEach(key => {
        data.append("files", new Blob([files[key]], { type: 'image/jpeg' }));
    });
    const url = API_URL + "uploadFiles";
    return axios.post(url, data, RestServices.getAuth({
        "Content-Type": "multipart/form-data;boundary=gc0p4Jq0M2Yt08jU534c0p"
    }));
}
I have no idea what the boundary thing does, but it worked for receiving a file in the backend...
On backend (spring) side I successfully receive an array of MultipartFiles:
@RequestMapping(value = "/uploadFiles", method = RequestMethod.POST)
@ResponseBody
public boolean uploadFiles(HttpServletRequest request, @RequestParam("files") MultipartFile[] files) throws IOException {
    String filePath = Thread.currentThread().getContextClassLoader().getResource("assets/images/").getFile();
    InputStream inputStream;
    OutputStream outputStream;
    for (MultipartFile file : files) {
        File newFile = new File(filePath + file.getOriginalFilename() + ".jpg");
        inputStream = file.getInputStream();
        if (!newFile.exists() && newFile.createNewFile()) {
            outputStream = new FileOutputStream(newFile);
            int read;
            byte[] bytes = new byte[1024];
            while ((read = inputStream.read(bytes)) != -1) {
                outputStream.write(bytes, 0, read);
            }
        }
        System.out.println(newFile.getAbsolutePath());
    }
    return true;
}
I've also tried file.transferTo(newFile); instead of the in- and output streams, which didn't work either.
After that I get the following output, which means that the image was saved successfully:
/path/to/blob.jpg
If I check the path where the file was uploaded, there is a file named blob.jpg, but if I open it, the windows photo viewer has the following problem:
I've opened the image before and after upload with notepad++:
Before upload:
I think this is a byte array, but if I open the image after upload I get exactly the output of the browser. This means it didn't get converted to a byte array (correct me if I'm wrong), and I believe that's why the image is corrupt...
My questions are:
What's the problem?
How can I fix it?
I really tried everything which crossed my mind but I ran out of ideas.
Thanks for your help! :-)
I've read the following *related* questions (but they **don't** have an answer):
[Question1][5], [Question2][6], and **many** more...
I've finally found an answer on my own!
I think the problem was that I used e.target.result (which is used to show the image on the frontend), but instead I had to use the JS File object. The standard HTML5 file input fields return those File objects (as I've read here).
The only thing I had to do then was create a FormData object, append the File object, set the FormData as the body, and set the Content-Type header; that's it!
const data = new FormData();
data.append("files", fileObject);
return axios.post(url, data, {
    "Content-Type": "multipart/form-data"
});
Those JS File objects are recognized by Java as multipart files:
@RequestMapping(value = "/uploadFiles", method = RequestMethod.POST)
@ResponseBody
public boolean uploadFiles(HttpServletRequest request, @RequestParam("files") MultipartFile[] files) {
    boolean transferSuccessful = true;
    for (MultipartFile file : files) {
        String extension = file.getOriginalFilename().substring(file.getOriginalFilename().lastIndexOf('.'));
        String newFileName = genRandomName() + extension; // set unique name when saving on server
        File newFile;
        File imageFolder = new File(imageBasePath);
        // check if parent folders exist, else create them
        if (imageFolder.exists() || imageFolder.mkdirs()) {
            while ((newFile = new File(imageFolder.getAbsolutePath() + "\\" + newFileName)).exists()) {
                newFileName = genRandomName(); // generate new name if file already exists
            }
            try {
                file.transferTo(newFile);
            } catch (IOException e) {
                e.printStackTrace();
                transferSuccessful = false;
            }
        } else {
            LOG.error("Could not create folder at " + imageFolder.getAbsolutePath());
            transferSuccessful = false;
        }
    }
    return transferSuccessful;
}
I hope this is helpful :)
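For context on why the broken version produced a text file: with readAsDataURL, FileReader's e.target.result is a base64 data URL string, so uploading it stores text rather than JPEG bytes. A small sketch (with a made-up payload) of the relationship between the two:

```python
import base64

# A made-up data URL like the one FileReader.readAsDataURL produces.
jpeg_bytes = b"\xff\xd8\xff\xe0 fake jpeg body"
data_url = "data:image/jpeg;base64," + base64.b64encode(jpeg_bytes).decode()

# What was uploaded in the broken version: the string itself (text, not JPEG).
# What the server needed: the decoded bytes after the comma.
header, _, payload = data_url.partition(",")
decoded = base64.b64decode(payload)

assert header == "data:image/jpeg;base64"
assert decoded == jpeg_bytes
```

Sending the File object directly, as in the fix above, avoids the issue entirely because the browser serializes the raw bytes into the multipart body.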

How to save an image from a JS client to Blobstore?

I want to save an image from my browser into Google Blobstore.
What I already tried:
Transform my img into a base64 string:
reader = new FileReader();
reader.readAsBinaryString(file);
Call my endpoint function to send it to Google App Engine:
var request = gapi.client.helloworldendpoints.uploadImage({'imageData': __upload.imageData, 'fileName': __upload.fileName, 'mimeType': __upload.mimeType, 'size': __upload.size});
request.execute(
    function (result) {
        console.log("Callback:");
        console.log(result);
    }
);
My EndPoint in Java looks like this:
@ApiMethod(name = "uploadImage", path = "/uploadImage", httpMethod = "POST")
public ImageUploadRequest uploadImage(@Named("imageData") byte[] imageData, @Named("fileName") String fileName, @Named("mimeType") String mimeType, @Named("size") float size) {
    return new ImageUploadRequest(imageData, fileName, mimeType, size);
}
The problem is that my endpoint seems to be unable to handle the transfer of my base64 data; I always get a 503 backend error.
What would be the best way to send the data from my js client via app engine to blobstore?
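A 503 can have many causes, but one easy factor to rule out is payload size: encoding the image as base64 for a JSON endpoint inflates the binary data by roughly a third, which matters if the backend caps request sizes. A quick sketch of the overhead:

```python
import base64

image = bytes(3_000_000)           # a 3 MB stand-in image (all zero bytes)
encoded = base64.b64encode(image)  # what actually travels in the request

overhead = len(encoded) / len(image)
print(f"raw: {len(image)} bytes, base64: {len(encoded)} bytes "
      f"(x{overhead:.2f})")
```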
UPDATE
The main problem is solved. I managed to upload the blob.
Javascript
var request = gapi.client.helloworldendpoints.uploadImage({
    'imageData': __upload.imageData,
    'fileName': __upload.fileName,
    'mimeType': __upload.mimeType,
    'size': __upload.size
});
Java Endpoint
public ImageUploadRequest uploadImage(
    Request imageData,
    @Named("fileName") String fileName,
    @Named("mimeType") String mimeType,
    @Named("size") float size
) { ... }
Request is just this:
public class Request {
    public Blob image;
}
My next problem is the following:
How can I send a MultipartRequest from my Java endpoint on GAE to my upload servlet in order to create a blob key and save the data into Blobstore, since Blobstore only accepts data sent to a servlet?
