React-native-fetch-blob downloading file - javascript

I just want my app to download a file from the server using react-native-fetch-blob. The problem is: where is the file stored? I just console.log the callback from react-native-fetch-blob and got this object:
(screenshot: the react-native-fetch-blob callback object)
This is my code:
alert("downloading");
RNFetchBlob
.config({
useDownloadManager : true,
fileCache : true
})
.fetch('GET', 'http://fontawesome.io/assets/font-awesome-4.7.0.zip', {})
.then((res) => {
console.log(res);
alert("Download");
alert('The file saved to ', res.path());
})
Any solution?

To download a file directly with rn-fetch-blob, you need to set fileCache to true.
By the way, react-native-fetch-blob is no longer maintained; use rn-fetch-blob instead.
From the documentation on downloading a file directly:
RNFetchBlob
  .config({
    // add this option that makes response data to be stored as a file,
    // this is much more performant.
    fileCache : true,
  })
  .fetch('GET', 'http://www.example.com/file/example.zip', {
    //some headers ..
  })
  .then((res) => {
    // the temp file path
    console.log('The file saved to ', res.path())
  })

function downloadFile(url, fileName) {
  const { config, fs } = RNFetchBlob;
  const downloads = fs.dirs.DownloadDir;
  return config({
    // add this option that makes response data to be stored as a file,
    // this is much more performant.
    fileCache : true,
    addAndroidDownloads : {
      useDownloadManager : true,
      notification : false,
      path: downloads + '/' + fileName + '.pdf',
    }
  })
  .fetch('GET', url);
}
Use this answer; it should work.
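For example, a call could look like this (the URL and file name below are placeholders, and the error handling is only a sketch):
// Hypothetical usage of the downloadFile helper above.
downloadFile('http://www.example.com/file/example.pdf', 'example')
  .then((res) => {
    // With addAndroidDownloads, the file ends up in the device's Downloads folder.
    console.log('The file saved to ', res.path());
  })
  .catch((err) => {
    console.log('Download failed', err);
  });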

I am using it and it works perfectly.
You just need to get the path like this:
var filePath = res.path();
This is where your file is stored.
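Note that with only fileCache (no Download Manager), that path points at a temporary cache file. A minimal sketch, assuming you want to copy it to the Android Downloads directory with rn-fetch-blob's fs helpers (the target name is just an example):
// Sketch: copy the cached file to the Downloads directory (Android).
const { fs } = RNFetchBlob;
const target = fs.dirs.DownloadDir + '/example.zip'; // example target name

fs.cp(res.path(), target)
  .then(() => console.log('Copied to', target))
  .catch((err) => console.log('Copy failed', err));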

Related

Laravel 9 and Javascript: how to download a file returned from Storage::download()

DISCLAIMER: Before creating this question, I checked several similar questions here on Stack Overflow and also the Laravel docs.
Context
Laravel 9 full-stack
No JS framework on the front-end, which means I'm using vanilla JS
The folders on Storage are set up like this:
storage
  app
    public
      folder1
        folder1A
        folder1B
        folder1C
        etc
The files stored in each folder1X are in .pdf format, and I don't know their names.
No folders are empty, and none contain invalid or corrupted files.
The problem
I have a FileController.php to download files that are inside a folder1X/ directory. The method to download it is as follows:
public function downloadFileFromStorage(Request $request): mixed
{
    $dirpath = $request->dirpath; // dirpath = public/folder1/folder1X.
    $files = Storage::allFiles($dirpath);
    return response()->download(storage_path('app\\' . $files[0]));
}
(Note: dirpath is sent in an axios request by the client, and is also fetched from the database in a previous request.)
My JavaScript client needs to enable the download of this file. The download is triggered by clicking a button, which calls downloadPDF(dirpath). It works as follows:
function downloadPDF(dirpath) {
  axios.post('/download-pdf-file', { dirpath })
    .then(
      success => {
        const url = success.data
        const a = document.createElement('a')
        a.download = 'file.pdf'
        a.href = url
        a.click()
      },
      error => {
        console.log(error.response)
      }
    )
}
But when I run this function, I get an about:blank#blocked error.
Attempts
Changed the a HTML DOM approach to window.open(url) on the client;
Changed response() to Storage::download($files[0], 'file-name.pdf'), and with this I also tried using a Blob on the client as follows:
success => {
  const blob = new Blob([success.data], { type: 'application/pdf' })
  const fileURL = URL.createObjectURL(blob)
  window.openURL(fileURL)
},
Also mixed Blob with the a HTML DOM approach;
Changed storage_path argument to /app/public/ before concatenating to $files[0].
UPDATE
Following tips from @BenGooding and @cengsemihsahin, I changed the files to the following:
JS
// FileDownload is imported on a require() at the code beginning
function downloadPDF(dirpath) {
  axios({
    url: '/download-pdf-file',
    method: 'GET',
    responseType: 'blob',
    options: {
      body: { dirpath }
    }
  }).then(
    success => {
      FileDownload(success.data, 'nota-fiscal.pdf')
    }
  )
}
PHP:
public function downloadFileFromStorage(Request $request): mixed
{
    $dirpath = $request->dirpath; // dirpath = public/folder1/folder1X.
    $files = Storage::allFiles($dirpath);
    return Storage::download($files[0], 'filename.pdf');
}
and now it downloads a corrupted PDF that can't be opened.
Finally found the issue, and it was here:
axios({
  url: '/download-pdf-file',
  method: 'GET',
  responseType: 'blob',
  options: {           // here
    body: { dirpath }  // here
  }
})
Laravel's Request arrow operator -> can't fetch a GET body sent through options (at least not in the $request->key fashion; see more about it here), which made me download a corrupted file: nothing was being fetched on the Laravel side because it never received a path at all.
Here is the solution I came up with:
As I want to get a file from a route that doesn't change except for the 1X in folder1X, I process the path I already have and send the 1X as a GET query parameter:
let folderNumber = dirpath.split('/')
// take the last path segment ('folder1X') and keep only the '1X' part
folderNumber = folderNumber[folderNumber.length - 1].replace('folder', '')
axios({
  url: '/download-pdf-file?folderNumber=' + folderNumber,
  method: 'GET',
  responseType: 'blob'
}).then(
  success => {
    FileDownload(success.data, 'file-name.pdf')
  }
)
This way I don't pass the whole path to the back-end, and it's possible to get folderNumber by using $request->query():
public function downloadFileFromStorage(Request $request): mixed
{
    $folderNumber = $request->query('folderNumber');
    $folderPath = '/public/folder1/folder' . $folderNumber . '/';
    $files = Storage::allFiles($folderPath);
    return Storage::download($files[0], 'file-name.pdf');
}
In a nutshell:
To download files, use GET requests;
To send arguments within GET requests, use query parameters and fetch them with $request->query('keyname') (or find out another way. Good luck!);
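For completeness, here is a minimal client-side sketch of the same flow, assuming the /download-pdf-file route and folderNumber query parameter above; it turns the blob response into a browser download with a temporary link instead of the FileDownload helper, and the file name is just an example:
function downloadPDF(folderNumber) {
  axios({
    url: '/download-pdf-file?folderNumber=' + encodeURIComponent(folderNumber),
    method: 'GET',
    responseType: 'blob'
  }).then(success => {
    // Wrap the blob in an object URL and trigger the download via a temporary link.
    const blobUrl = URL.createObjectURL(success.data)
    const a = document.createElement('a')
    a.href = blobUrl
    a.download = 'file-name.pdf' // example name
    document.body.appendChild(a)
    a.click()
    a.remove()
    URL.revokeObjectURL(blobUrl)
  })
}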

Evaporate JS - add is undefined

I can't seem to figure out what the problem is. I'm trying to use EvaporateJS to upload files to S3, I'm also using React. Here is what my code looks like:
useEffect(() => {
  Evaporate.create({
    aws_key: AWS_ACCESS_KEY,
    bucket: S3_BUCKET,
    awsRegion: 'us-west-1', // s3 region
    signerUrl: '/api/videos/signv4_auth',
    awsSignatureVersion: '4',
    computeContentMd5: true,
    cloudfront: true,
    cryptoMd5Method: (data) => {
      return AWS.util.crypto.md5(data, 'base64');
    },
    cryptoHexEncodedHash256: (data) => {
      return AWS.util.crypto.sha256(data, 'hex');
    }
  }).then(evaporate => {
    console.log(evaporate);
    // evaporate.add(); // showing as not a function
  });
}, []);
But I get an error message: evaporate.add is not a function. When I inspect the evaporate variable that's passed to then, it doesn't contain the add function, nor some of the other functions mentioned in the documentation. I'm not sure why it's not working; any help would be highly appreciated.
(Screenshots in the original question: the console output of evaporate and the error message.)

Cypress, response body as BLOB instead of JSON, but JSON in chrome devtools

I've been struggling with a behaviour of Cypress that I don't understand, and I need help.
When I set up a route and wait for the request, I can see that the response body is a Blob, while in Chrome DevTools the response body arrives as JSON, as it does in the application. I have Content-Type set to application/vnd.api+json. Cypress version is 3.7.0. I also disabled fetch, because Cypress has problems with it (see the Cypress documentation on cy.wait).
cy.server();
cy.route('POST', '**/services').as('postService');
cy.get('[data-cy=AddServices_submit]').click();
cy.wait('@postService').then((xhr) => {
  // xhr.response.body is a Blob
  // xhr.responseBody is a Blob
})
I found a similar question on Stack Overflow, but it wasn't helpful for me.
Has anyone had similar problems with the response arriving as a Blob?
Any help would be great; if you need more information, feel free to ask. Thanks.
EDIT
I have a workaround for this problem if anyone needs one, but the problem still occurs:
cy.wait('@postService').then(async (xhr) => {
  const response = await new Response(xhr.responseBody).text();
  const jsonResponse = JSON.parse(response);
  // jsonResponse is real JSON
});
I had the same problem, and it was solved by adding the Cypress fetch polyfill as described here.
In case the link becomes unavailable, I copy the content here:
In the directory cypress/support/, add this code to the file hooks.js:
// Cypress does not support listening to the fetch method
// Therefore, as a workaround we polyfill `fetch` with traditional XHR which
// are supported. See: https://github.com/cypress-io/cypress/issues/687
enableFetchWorkaround();

// private helpers
function enableFetchWorkaround() {
  let polyfill;

  before(() => {
    console.info('Load fetch XHR polyfill')
    cy.readFile('./cypress/support/polyfills/unfetch.umd.js').then((content) => {
      polyfill = content
    })
  });

  Cypress.on('window:before:load', (win) => {
    delete win.fetch;
    // since the application code does not ship with a polyfill
    // load a polyfilled "fetch" from the test
    win.eval(polyfill);
    win.fetch = win.unfetch;
  })
}
In the directory cypress/support/, import hooks.js in the file index.js:
import './hooks'
In the directory cypress/support/, add a polyfills directory and put a file unfetch.umd.js there with this code:
// cypress/support/polyfills/unfetch.umd.js
// Version: 4.1.0
// from: https://unpkg.com/unfetch/dist/unfetch.umd.js
!function(e,n){"object"==typeof exports&&"undefined"!=typeof module?module.exports=n():"function"==typeof define&&define.amd?define(n):e.unfetch=n()}(this,function(){return function(e,n){return n=n||{},new Promise(function(t,o){var r=new XMLHttpRequest,s=[],u=[],i={},f=function(){return{ok:2==(r.status/100|0),statusText:r.statusText,status:r.status,url:r.responseURL,text:function(){return Promise.resolve(r.responseText)},json:function(){return Promise.resolve(JSON.parse(r.responseText))},blob:function(){return Promise.resolve(new Blob([r.response]))},clone:f,headers:{keys:function(){return s},entries:function(){return u},get:function(e){return i[e.toLowerCase()]},has:function(e){return e.toLowerCase()in i}}}};for(var a in r.open(n.method||"get",e,!0),r.onload=function(){r.getAllResponseHeaders().replace(/^(.*?):[^\S\n]*([\s\S]*?)$/gm,function(e,n,t){s.push(n=n.toLowerCase()),u.push([n,t]),i[n]=i[n]?i[n]+","+t:t}),t(f())},r.onerror=o,r.withCredentials="include"==n.credentials,n.headers)r.setRequestHeader(a,n.headers[a]);r.send(n.body||null)})}});
So, it worked for me
Same problem here...
I can get the data as JSON when I use cy.request(), but I can't when I use an alias with cy.wait().
Could you try this as a workaround?
const setBodyAsJson = async (xhr) => ({
  ...xhr,
  body: JSON.parse(String.fromCharCode.apply(null, new Uint8Array(await xhr.response.body.arrayBuffer())))
})

cy.server();
cy.route('POST', '**/services').as('postService');
cy.get('[data-cy=AddServices_submit]').click();
cy.wait('@postService').then(setBodyAsJson).then((res) => {
  // res should contain the body as JSON
})
This does not explain why, but in case your response.body is a Blob and responseBody is null, you can use this to read it:
cy.wait('@postService', TIMEOUT)
  .its('response.body')
  .then(body => {
    return new Promise(done => {
      const reader = new FileReader();
      reader.onload = function() {
        done(JSON.parse(this.result));
      };
      reader.readAsText(body);
    });
  })
  .then(object => {
    expect(typeof object).to.equal('object')
  });
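If several tests need this, the same FileReader idea can be wrapped in a small helper; this is only a sketch, the blobToJson name is mine, and it assumes the '@postService' alias from above:
// Sketch of a reusable helper: read a Blob body and parse it as JSON.
const blobToJson = (blob) =>
  new Cypress.Promise((resolve) => {
    const reader = new FileReader();
    reader.onload = () => resolve(JSON.parse(reader.result));
    reader.readAsText(blob);
  });

// Usage: returning the promise from .then() makes Cypress wait for it.
cy.wait('@postService')
  .its('response.body')
  .then(blobToJson)
  .then((json) => {
    expect(json).to.be.an('object');
  });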

ENCODING_ERR Javascript Blob with Ionic file plugin

Can you tell me why this code is not working?
Note: file is the native File plugin.
var blob = new Blob(["This is my blob content"], { type: "text/plain" });
this.file.writeFile(this.file.dataDirectory, 'myletter.txt', blob, { replace: true })
  .then(() => {
    // code
  })
  .catch((err) => {
    console.error(err); // it comes to here
  });
It gives this exception:
FileError
  code: 5
  message: "ENCODING_ERR"
  __proto__: Object
I have found the issue: it was due to the path this.file.dataDirectory.
Solution: use this.file.externalApplicationStorageDirectory instead.
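Applied to the snippet above, the write call would look roughly like this (just the original code with the directory swapped, as a sketch):
var blob = new Blob(["This is my blob content"], { type: "text/plain" });

// Write to the external application storage directory instead of dataDirectory.
this.file.writeFile(this.file.externalApplicationStorageDirectory, 'myletter.txt', blob, { replace: true })
  .then(() => {
    console.log('File written successfully');
  })
  .catch((err) => {
    console.error(err);
  });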

Using logstash and elasticseach

I'm actually using node-bunyan to manage log information through Elasticsearch and Logstash, and I'm facing a problem.
My log file has the information I need and fills up nicely.
The problem is that Elasticsearch doesn't find anything at
http://localhost:9200/logstash-*/
I get an empty object, and so I can't deliver my logs to Kibana.
Here's my Logstash conf file:
input {
  file {
    type => "nextgen-app"
    path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
    codec => "json"
  }
}

output {
  elasticsearch {
    host => "localhost"
    protocol => "http"
  }
}
And my JS code:
log = bunyan.createLogger({
  name: 'myapp',
  streams: [
    {
      level: 'info',
      path: './app/logs/nextgen-info-log.log'
    },
    {
      level: 'error',
      path: './app/logs/nextgen-error-log.log'
    }
  ]
})

router.all('*', (req, res, next) => {
  log.info(req.url)
  log.info(req.method)
  next()
})
NB: the logs are written to the log files correctly. The problem is between Logstash and Elasticsearch :-/
EDIT: querying http://localhost:9200/logstash-*/ gives me "{}", an empty JSON object.
Thanks in advance.
Here is how we managed to fix this and other problems with Logstash not processing files correctly on Windows:
Install the ruby-filewatch patch as explained here:
logstash + elasticsearch : reloads the same data
Properly configure the Logstash input plugin:
input {
  file {
    path => ["C:/Path/To/Logs/Directory/*.log"]
    codec => json { }
    sincedb_path => ["C:/Path/To/Config/Dir/sincedb"]
    start_position => "beginning"
  }
}
...
"sincedb" keeps track of your log files length, so it should have one line per log file; if not, then there's something else wrong.
Hope this helps.
Your output section looks incomplete. Here's the list of output parameters: http://logstash.net/docs/1.4.2/outputs/elasticsearch
Please try:
input {
  file {
    type => "nextgen-app"
    path => [ "F:\NextGen-dev\RestApi\app\logs\*.log" ]
    codec => "json"
  }
}

output {
  elasticsearch {
    host => "localhost"
    port => 9200
    protocol => "http"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
Alternatively, you can try the transport protocol:
output {
  elasticsearch {
    host => "localhost"
    port => 9300
    protocol => "transport"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
I also recommend using Kibana as a data viewer. You can download it at https://www.elastic.co/downloads/kibana
