My file is an image link or any other file link. I was trying to call the Pinata pinFileToIPFS API (Documentation).
According to the documentation, I have to pass the path of a local file to append. But I only have an AWS URL. How can I still call the API below?
let data = new FormData();
data.append('file', fs.createReadStream('./yourfile.png'));
NOTE: I also tried this:
data.append('file', s3.getObject({Bucket: myBucket, Key: myFile})
.createReadStream());
but it didn't work.
I spent three days on this and finally got it to work. The trick is that you will have a hard time POSTing a Stream directly to the Pinata API, as it expects a File. I finally gave up on streaming, took the Stream coming from S3, saved it to a temporary file on the server, sent that file to Pinata, and then deleted the temp file. That works.
try
{
    // Copy file from S3 to IPFS
    AmazonS3.AmazonS3Utility m = new AmazonS3.AmazonS3Utility();
    Stream myfile = m.GetObjectStream(fileName);
    using (var client = new RestClient("https://api.pinata.cloud"))
    {
        // Add IPFS metadata
        JObject pinataOptions = new JObject(
            new JProperty("cidVersion", "0")
        );
        string pO = JsonConvert.SerializeObject(pinataOptions, Formatting.Indented);
        JObject pinataMetadata = new JObject(
            new JProperty("name", fileName),
            new JProperty("keyvalues",
                new JObject(
                    new JProperty("file_url", FileUrl),
                    new JProperty("description", "")
                )
            ));
        string pM = JsonConvert.SerializeObject(pinataMetadata, Formatting.Indented);
        // Write the S3 stream to a temp file so it can be attached as a regular file
        string tempFile = AppDomain.CurrentDomain.BaseDirectory + @"temp\" + fileName;
        long fileLength;
        using (FileStream outputFileStream = new FileStream(tempFile, FileMode.Create))
        {
            myfile.CopyTo(outputFileStream);
            fileLength = outputFileStream.Length;
        }
        var request = new RestRequest("/pinning/pinFileToIPFS", Method.Post);
        request.AddQueryParameter("pinata_api_key", your_APIKey);
        request.AddQueryParameter("pinata_secret_api_key", your_SecretAPIKey);
        request.AddParameter("pinataOptions", pO);
        request.AddParameter("pinataMetadata", pM);
        request.AddHeader("Authorization", your_Pinata_JWT);
        request.AddFile("file", tempFile, fileName);
        RestResponse response = await client.ExecutePostAsync(request);
        File.Delete(tempFile);
        return response.Content;
    }
}
catch { }
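For comparison, the same save-to-a-temp-file approach in Node.js (the language of the original snippet) might look roughly like this. This is a minimal sketch, assuming the AWS SDK v2 S3 client plus the axios and form-data packages; the function name and parameters are placeholders, not Pinata's SDK:
const fs = require('fs');
const os = require('os');
const path = require('path');
const axios = require('axios');
const FormData = require('form-data');

async function pinS3ObjectToIpfs(s3, bucket, key, pinataJwt) {
    // 1. Copy the S3 object to a temporary local file.
    const tempFile = path.join(os.tmpdir(), path.basename(key));
    await new Promise((resolve, reject) => {
        s3.getObject({ Bucket: bucket, Key: key })
            .createReadStream()
            .pipe(fs.createWriteStream(tempFile))
            .on('finish', resolve)
            .on('error', reject);
    });

    // 2. Send the temp file to Pinata as multipart form data.
    const data = new FormData();
    data.append('file', fs.createReadStream(tempFile));
    const res = await axios.post('https://api.pinata.cloud/pinning/pinFileToIPFS', data, {
        headers: { ...data.getHeaders(), Authorization: `Bearer ${pinataJwt}` },
        maxBodyLength: Infinity,
    });

    // 3. Clean up the temp file and return Pinata's response.
    fs.unlinkSync(tempFile);
    return res.data;
}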
I am trying to send a PDF file from JavaScript to a REST WCF service.
The service expects a byte array, with the following signature.
The tricky part is the byte array parameter; all the other parameters are working fine.
[OperationContract]
[WebInvoke(UriTemplate = "rest/{sessionToken}/ImportNewTemplate?commit={commit}&createApplication={createApplication}&templateName={templateName}&option={option}")]
[CloudMethod(Group = "02. Templates", Description = "Import a new template in the platform.", HelpFile = "ListPaperTemplate.aspx")]
[CloudParameter(Name = "sessionToken", Description = "session token", HelpFile = "ServiceAPIDoc.aspx?q=sessionToken")]
[CloudParameter(Name = "createApplication", Description = "Create a standalone application linked to this template.")]
[CloudParameter(Name = "commit", Description = "Commit the upload ? if true, the template will be imported, else the return just allow you to preview template description.")]
[CloudParameter(Name = "templateName", Description = "Name of the new template. Only valid for single pdf upload. If the files are zipped, the file name in the zip will be used instead")]
[CloudParameter(Name = "templateFile", Description = "Can be a PDF file, or a zip file containing a flat pdf + xml definition", HelpFile = "ServiceAPIDoc.aspx?q=templateFile")]
CloudObjects.TemplateImportation ImportNewTemplate(string sessionToken, bool commit, bool createApplication, byte[] templateFile, string templateName, string option);
This is what I use on the JavaScript end to send the PDF file:
const file = e.target.files[0];
// Encode the file using the FileReader API
const reader = new FileReader();
var fileByteArray = [];
reader.onloadend = async (e) => {
    const arrayBuffer = e.target.result,
        array = new Uint8Array(arrayBuffer);
    for (const a of array) {
        console.log(a);
        fileByteArray.push(a);
    }
    let ret = await dispatch('createTemplate', { name: this.newForm.name, pdf: fileByteArray, save: false });
    await this.$store.dispatch('hideLoadingScreen')
    // Logs data:<type>;base64,wL2dvYWwgbW9yZ...
};
reader.onerror = async () => {
    await this.$store.dispatch('hideLoadingScreen')
}
reader.onabort = async () => {
    await this.$store.dispatch('hideLoadingScreen')
}
await this.$store.dispatch('showLoadingScreen');
reader.readAsArrayBuffer(file);
And here is the code that sends it to the REST service:
let url = `${getters.getServiceUrl}ImportNewTemplate?templateName=${name}&commit=${save || true}`
const xhr = new XMLHttpRequest;
xhr.open("POST", url, false);
xhr.setRequestHeader('Content-Type', 'application/json');
let response = await xhr.send(pdf);
However, every time, I get an error from the service when it tries to deserialise the byte array.
The exception message is 'There was an error deserializing the object of type System.Byte[]. End element 'root' from namespace '' expected.
I have tried a lot of alternatives but nothing works.
Any suggestions are welcome!
Thanks
For those interested, the trick was to wrap the byte array in JSON.stringify before sending it.
So: xhr.send(JSON.stringify(pdf))
does the trick.
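Put together, the send step would look something like this (a sketch based on the question's own code, keeping its synchronous XHR):
let url = `${getters.getServiceUrl}ImportNewTemplate?templateName=${name}&commit=${save || true}`;
const xhr = new XMLHttpRequest();
xhr.open("POST", url, false);
xhr.setRequestHeader('Content-Type', 'application/json');
// Serialize the plain array so WCF can deserialize it as System.Byte[].
xhr.send(JSON.stringify(pdf));
// With a synchronous request the response is available right after send().
const response = xhr.responseText;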
I want to retrieve the content of a password-protected PDF as a File object using the PDF.js library.
I tried to get the raw data from the promise and convert it to a File, but the file is still encrypted.
Here is a code sample:
const loadingTask = pdfjsLib.getDocument(file);
loadingTask.onPassword = (callback, reason) => {
    callback(password);
};
loadingTask.promise.then(async (pdfDocument) => {
    // Here I tried to retrieve data from `pdfDocument`, but it is still encrypted
    const data = await pdfDocument.getData(); // The data is still encrypted
    const blob = new Blob([data]);
    const pdfFile = new File([blob], 'name', { type: 'application/pdf' });
});
Is there a way with the PDF.js library to get the decrypted PDF as a File object?
Thanks in advance.
I'm building an app for my friend and I need to record audio and store it on a server. I have successfully done it with text and images, but I can't make it work for audio.
I am probably missing something when creating the file (the file is created but not playable). There has to be a problem with the data conversion from the JS Blob to the actual file.
I managed to get the audio blob and even play it back in JS. But when I create a file from the blob, it can't be played (I also tried saving it locally, with the same outcome: I got the file but could not play it). Saving it in different formats (wav, mp3) didn't help either. So the problem has to be in the conversion from the blob to the actual file. With text and images it was straightforward and the files were created from the blob just by saving them under a filename, but I guess that with audio it isn't that simple.
My understanding is that I have some binary data (a JS Blob) that can be saved as a file, but that with audio there has to be some special conversion or encoding so the output file works and can be played.
Here is the frontend code (I am using `this` with some of the variables because it's part of a Vue component):
this.mediaRecorder.addEventListener("stop", () => {
    // tried to save it as WAV with the same result: got the file, but couldn't play it
    this.audioBlob = new Blob(this.audioChunks, { 'type': 'audio/mpeg-3' })
    // debugging - playing back the sound in the browser works fine
    const audioUrl = URL.createObjectURL(this.audioBlob);
    const audio = new Audio(audioUrl);
    audio.play();
    // adding the blob to the request
    let filename = this.$store.state.counter + "-" + this.$store.state.step
    const formData = new FormData();
    formData.append('file', this.audioBlob, `${filename}.mp3`);
    const config = {
        headers: { 'content-type': 'multipart/form-data' }
    }
    // sending it to my Flask API (xxx is the name of the folder it gets saved to on the server)
    this.$axios.post('http://localhost:5000/api/v1/file/upload/xxx', formData, config)
})
Here is my endpoint on the server:
@app.route('/api/v1/file/upload/<test_id>', methods=['POST'])
def upload_file(test_id):
    uploaded_file = request.files['file']
    filename = secure_filename(uploaded_file.filename)
    if filename != '':
        uploaded_file.save(os.path.join(app.config['UPLOAD_PATH'], test_id, filename))
    return jsonify({'message': 'file saved'})
Here is the whole recording code snippet
navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
        this.mediaRecorder = new MediaRecorder(stream);
        // audio.srcObject = stream
        this.mediaRecorder.start();
        this.mediaRecorder.addEventListener("dataavailable", event => {
            this.audioChunks.push(event.data)
        })
        this.mediaRecorder.addEventListener("stop", () => {
            this.audioBlob = new Blob(this.audioChunks, { 'type': 'audio/mpeg-3' })
            // debugging - playing back the sound in the browser works fine
            const audioUrl = URL.createObjectURL(this.audioBlob);
            const audio = new Audio(audioUrl);
            audio.play();
            // adding the blob to the request
            let filename = this.$store.state.counter + "-" + this.$store.state.step
            const formData = new FormData();
            formData.append('file', this.audioBlob, `${filename}.mp3`);
            const config = {
                headers: { 'content-type': 'multipart/form-data' }
            }
            // sending it to my Flask API (xxx is the name of the folder it gets saved to on the server)
            this.$axios.post('http://localhost:5000/api/v1/file/upload/xxx', formData, config)
        })
    })
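One thing that may be worth checking (a hedged suggestion, not a tested fix): MediaRecorder records into a container the browser chooses, commonly WebM or Ogg rather than MP3, and it reports that choice through its mimeType property. Building the Blob and the filename from the reported type, instead of forcing audio/mpeg-3 and an .mp3 extension, at least makes the saved bytes and the file label agree:
this.mediaRecorder.addEventListener("stop", () => {
    // Use whatever format the browser actually recorded ('audio/webm' is just a fallback guess).
    const mimeType = this.mediaRecorder.mimeType || 'audio/webm';
    this.audioBlob = new Blob(this.audioChunks, { type: mimeType });
    const ext = mimeType.includes('ogg') ? 'ogg' : 'webm';
    const filename = `${this.$store.state.counter}-${this.$store.state.step}.${ext}`;
    const formData = new FormData();
    formData.append('file', this.audioBlob, filename);
    this.$axios.post('http://localhost:5000/api/v1/file/upload/xxx', formData);
})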
I'm using the gcloud API on a Node.js web server to upload files. I'd prefer that the files not be uploaded on the client side but instead on the server. Currently, I produce a blob on the client side, convert it to text, and pass that to the server through a POST request. All of the information gets successfully passed from the client to the server as expected, and the data is uploaded to gcloud. However, gcloud does not recognize it as a valid file, and neither does my computer when I download it.
What is the best way to get the contents of the file to gcloud from the server side? I've tried using data URIs and reading the original file as text, and both produce similar issues. I've also explored piping a readable stream from the blob on the server end, but blobs are not natively supported by Node, so I have not done so yet.
Client Side
function readSingleFile(e, func, func2) {
    var file = e.target.files[0];
    if (!file) {
        return; // Add error msg here
    }
    var reader = new FileReader();
    reader.onload = function (e) {
        let contents = e.target.result;
        let img = document.createElement('img')
        let cvs = document.createElement('canvas');
        img.onload = () => {
            cvs.width = img.width;
            cvs.height = img.height;
            let ctx = cvs.getContext('2d');
            ctx.drawImage(img, 0, 0);
            cvs.toBlob((res) => { res.text().then((text) => { func2(text) }) }, "image/jpeg", 0.92);
        }
        img.src = contents;
        func(contents);
    }
    reader.readAsDataURL(file);
}
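For context, readSingleFile would typically be wired to a file input's change event. A tiny usage sketch, where the element id and the two callbacks are hypothetical placeholders (func shows the image and func2 adds the contents to the POST object, as the note after the server code explains):
// Hypothetical wiring: showPreview displays the image, attachToPost stores the
// JPEG text in the object that will be POSTed to the server.
document.getElementById('fileInput').addEventListener('change', (e) => {
    readSingleFile(e, showPreview, attachToPost);
});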
Server Side
function publishPrintjob(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpg');
        gcloudFile.save(dataObj.sockImageFile, function (err) {
            if (!err) {
                console.log("File Uploaded!")
            }
        });
        var data = {
            date: dataObj.Date,
            email: dataObj.email,
            design: dataObj.Design,
            author: dataObj.Author,
            address: dataObj.address,
            imageKey: newElemKey,
        }
        admin.database().ref('queue/' + newElemKey).set(data);
    } catch (err) {
        console.log(err)
    }
}
Note: func simply shows the image on the client side, func2 just adds the contents to the POST object.
Uploading a file directly from the computer would be easiest using the storage.bucket(bucketName).upload() function from the Cloud Storage library. However, that takes the local path of a file, so it will not work unless the file is first transferred to the server and saved there, which could be achieved with multipart form data. Using multipart uploads, or uploading from a local file, are the better methods for uploading to Google Storage.
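For reference, that multipart route might look roughly like this. A minimal sketch, assuming an Express server with the multer package and the same storage bucket object used below; the route path and field name are placeholders:
const express = require('express');
const multer = require('multer');

const app = express();
// Keep the uploaded file in memory so its bytes can be handed straight to gcloud.
const upload = multer({ storage: multer.memoryStorage() });

app.post('/upload', upload.single('image'), async (req, res) => {
    try {
        // req.file.buffer holds the raw bytes of the uploaded file.
        const gcloudFile = storage.file('images/' + req.file.originalname);
        await gcloudFile.save(req.file.buffer, { contentType: req.file.mimetype });
        res.json({ message: 'uploaded' });
    } catch (err) {
        console.log(err);
        res.status(500).json({ error: 'upload failed' });
    }
});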
Instead, I solved this by first converting the image to a data URI, sending the data URI to the server in the body of a POST request, and then converting it to a buffer wrapped in a readable stream that can be piped to Google Storage.
Client
let formData = getFormData('myForm');
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function () {
    if (this.readyState == 4 && this.status == 200) {
        // Typical action to be performed when the document is ready:
    }
};
xhttp.open("POST", "dashboard", true);
xhttp.setRequestHeader('Content-Type', 'application/json');
xhttp.send(JSON.stringify(formData));
xhttp.onload = () => {
    console.log(JSON.parse(xhttp.response))
    // Handle server response here
};
Server
// dataObj is the body of the POST request; the property imageFile is the URI from readFileAsURI
function uploadImageOnServer(dataObj) {
    try {
        var newElemKey = database.ref().child('queue').push().key; // Get random key to use as filename
        // Create a new blob in the bucket and upload the file data.
        const gcloudFile = storage.file('images/' + newElemKey + '.jpeg');
        const { Readable } = require('stream'); // Should be required at the top of the file
        var string = dataObj.ImageFile;
        var regex = /^data:.+\/(.+);base64,(.*)$/;
        var matches = string.match(regex);
        var ext = matches[1];
        var data = matches[2];
        var buffer = Buffer.from(data, 'base64');
        // Create the read stream
        const readableInstanceStream = new Readable({
            read() {
                this.push(buffer);
                this.push(null);
            }
        });
        readableInstanceStream.pipe(gcloudFile.createWriteStream()) // link to gcloud storage api
            .on('error', function (err) {
                console.log('error')
            })
            .on('finish', function () {
                console.log('upload complete')
            });
    } catch (err) {
        console.log(err)
    }
}
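Side note: if I remember the client library correctly, File#save also accepts a Buffer directly, which would avoid hand-rolling the Readable. A sketch under that assumption, reusing gcloudFile, buffer and ext from above:
// Alternative: hand the Buffer straight to the library instead of piping a stream.
gcloudFile.save(buffer, { contentType: 'image/' + ext })
    .then(() => console.log('upload complete'))
    .catch((err) => console.log(err));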
I am using the following code to send a list of files to the backend:
var formdata = new FormData();
if (fileObjectList.length > 0) {
    Object.keys(fileObjectList).forEach(i => {
        formdata.append('file' + i, fileObjectList[i]);
    });
}
formdata.append('requestModel', JSON.stringify(request));
req.open("POST", 'controller');
req.send(formdata);
The controller converts the files to base64 data.
To send the data via email, we have to attach the content as base64,
which I then send back to the controller as a file object.
You can use JSZip to add the files to a zip archive and send the whole thing as base64 in a single request. Check the link below for more information: jszip
var zip = new JSZip();
var formdata = new FormData();
if (fileObjectList.length > 0) {
    Object.keys(fileObjectList).forEach(i => {
        // JSZip accepts a File/Blob (or a buffer / base64 string) as the content
        zip.file(fileObjectList[i].name, fileObjectList[i]);
    });
}
var zipcomplete = await zip.generateAsync({
    type: "base64",
    compression: "DEFLATE"
});
formdata.append('fileDataZip', zipcomplete);
formdata.append('requestModel', JSON.stringify(request));
req.open("POST", 'controller');
req.send(formdata);
In C#, use the code below to save the base64 file:
System.IO.File.WriteAllBytes("/fileDataZip.zip", Convert.FromBase64String(fileDataZip));
In Node.js, use the code below to save the base64 file:
require("fs").writeFile("fileDataZip.zip", fileDataZip, 'base64', (err) => { if (err) console.log(err); });