How to effectively open a downloaded file in an external application using expo-linking - javascript

I'm downloading a file from a URI and trying to open it in another application. The file is a simple .xlsx spreadsheet, and I'm testing on an Android device. The code below is a reproducible example:
import React, { useCallback } from 'react'
import { Button, View } from 'react-native'
import * as Linking from 'expo-linking'
import * as FileSystem from 'expo-file-system'

const App = () => {
  const download = useCallback(async () => {
    const downloadUri = 'http://url.to/my/spreadsheet.xlsx'
    const localPath = `${FileSystem.cacheDirectory}spreadsheet.xlsx`
    try {
      // Download the file, resolve it to a content:// URI, then hand it off
      const { uri } = await FileSystem.downloadAsync(downloadUri, localPath)
      const contentURL = await FileSystem.getContentUriAsync(uri)
      await Linking.openURL(contentURL)
    } catch (err) {
      console.log(err)
    }
  }, [])

  return (
    <View>
      <Button title="Press Me!" onPress={download} />
    </View>
  )
}

export default App
No error is displayed in the console log; however, when Linking attempts to open the file, it fails with the error "Google Sheets was unable to open your Spreadsheet". The same thing happens when I try to open other file types, such as:
.docx with Google Docs
.pptx with Google Slides
.pdf with Drive
... and so on
Any idea what could be causing this?
NOTE
I verified that the file returned by the URL is not corrupted using:
curl -X GET http://url.to/my/spreadsheet.xlsx -o spreadsheet.xlsx (run from within the device); the downloaded file opens normally in Google Sheets.
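A commonly suggested workaround on Android is to open the content:// URI through an explicit VIEW intent that grants the receiving app read permission, since Linking.openURL does not attach any permission flags. Below is a minimal sketch assuming the expo-intent-launcher package; the flag value and the .xlsx MIME type are assumptions spelled out in the comments:

import * as IntentLauncher from 'expo-intent-launcher'

// Sketch: open a content:// URI via an explicit VIEW intent.
// flags: 1 is Android's Intent.FLAG_GRANT_READ_URI_PERMISSION,
// which lets the target app actually read the shared URI.
const openDownloadedFile = async (localUri) => {
  const contentURL = await FileSystem.getContentUriAsync(localUri)
  await IntentLauncher.startActivityAsync('android.intent.action.VIEW', {
    data: contentURL,
    flags: 1,
    type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', // .xlsx
  })
}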

Related

Problem with file input size (video upload to Cloudinary through Netlify serverless functions)

I'm having issues uploading video files to Cloudinary in my React app, deployed on Netlify. The app has a React page with a form that sends the data to my API layer. The API layer handles the HTTP requests to my Netlify functions (using Axios), and the serverless function then calls the Cloudinary Node API to store the video file. The problem happens when passing the data from my API layer to the serverless function: I get "Error: Stream body too big", because the video exceeds the payload limit of Netlify functions (6 MB). Do I have to compress the file? Is it okay to do it like this (frontend page -> api.js -> serverless function)? Thanks for all the help you guys provide every day, it has helped me a lot!
Files that I have and the error:
page.jsx
const formHandler = async (formValues) => {
  try {
    ...
    const res = await addExercise(formValues);
    ...
  } catch (error) {
    console.log("error ", error);
  }
};
api.js
import { instance } from "./instance";
...
export const addExercise = async (data) => {
  try {
    const reader = new FileReader();
    reader.readAsDataURL(data.exerciseVideo[0]);
    reader.onloadend = async () => {
      const cloudVideo = reader.result;
      const cloudinaryResponse = await instance.post("upload-exerciseVideo", { cloudVideo });
      ...
    };
  } catch (error) {
    console.log("error", error);
  }
};
serverless function (upload-exerciseVideo.js)
import cloudinary from '../src/config/cloudinary';
export const handler = async (event, context) => {
  try {
    const data = JSON.parse(event.body);
    const { cloudVideo } = data;
    const cloudinaryRequest = await cloudinary.uploader
      .upload(cloudVideo, {
        resource_type: "video",
        upload_preset: "z9qxfc6q"
      });
    ...
  } catch (error) {
    console.log('error', error);
    return {
      statusCode: 400,
      body: JSON.stringify(error)
    };
  }
};
Error: (screenshot omitted; the logged message is "Error: Stream body too big")
Netlify serverless functions are built on top of AWS Lambda, so there are hard limits on the size of the payload and on the amount of time the code may run. You didn't mention the size of your video, but videos take longer to upload, and even if you are within the 1 GB size limit, you may be exceeding the 10-second processing limit. Your video has likely already been compressed, so compression is not a viable option, and decompressing it in the serverless function would probably exceed the time limit. https://www.netlify.com/blog/intro-to-serverless-function.
If you're uploading a large file, like a video, from front-end code, consider using the Upload Widget with an unsigned preset. Here's a link to a code sandbox showing how to create and use the upload widget in React: https://codesandbox.io/s/cld-uw-uymkb. You will need to add your Cloudinary cloud name and an unsigned preset to make this work. You'll find instructions for creating unsigned presets here: https://cloudinary.com/documentation/upload_presets
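For reference, a minimal sketch of wiring the Upload Widget into a React component. It assumes the widget script (https://upload-widget.cloudinary.com/global/all.js) is already loaded in index.html, and "demo-cloud" and "unsigned_preset" are placeholders for your own cloud name and unsigned preset:

import { useEffect, useRef } from "react";

// Sketch: upload straight from the browser to Cloudinary, bypassing
// the Netlify function and its 6 MB payload limit entirely.
const VideoUploader = () => {
  const widgetRef = useRef(null);

  useEffect(() => {
    widgetRef.current = window.cloudinary.createUploadWidget(
      {
        cloudName: "demo-cloud",         // placeholder
        uploadPreset: "unsigned_preset", // placeholder: an unsigned preset
        resourceType: "video",
      },
      (error, result) => {
        if (!error && result && result.event === "success") {
          console.log("Uploaded to:", result.info.secure_url);
        }
      }
    );
  }, []);

  return <button onClick={() => widgetRef.current.open()}>Upload video</button>;
};

export default VideoUploader;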

How to use pdfjs in a Google Cloud Function to accept a PDF read from Dropbox as fileBinary?

Context
I am using Dropbox and the PDF.js library inside a Google Cloud Function.
What I'm doing
Inside my functions folder I run:
npm i --save pdfjs-dist
Then I download a PDF's contents from Dropbox (this works):
exports.readAProgram = functions.https.onRequest(async (req, res) => {
  var dbx = new Dropbox.Dropbox({ accessToken: ACCESS_TOKEN });
  dbx.filesDownload({ path: "/full/path/20220702.pdf" })
    .then(function (response) {
      console.log('response', response);
      res.json(response.result.fileBinary);
    })
    .catch(function (error) {
      // console.error(error);
      res.json({ "error-1": error });
    });
});
This logs the raw fileBinary contents (output screenshots omitted).
Note
I do not know what exactly a fileBinary is, because:
in the official doc (https://www.dropbox.com/developers/documentation/http/documentation#files-download) I cannot see fileBinary
in the official download example they use a different method (https://github.com/dropbox/dropbox-sdk-js/blob/main/examples/javascript/download/index.html)
Next step: pass data to PDF.js.getDocument
I'm looking at the source code, because the official API doc is useless here.
See here: https://github.com/mozilla/pdf.js/blob/master/src/display/api.js#L232
The getDocument function accepts
string|URL|TypedArray|PDFDataRangeTransport|DocumentInitParameters
Question
How can I convert my Dropbox fileBinary structure into something acceptable to PDFJS.getDocument?
I tried
dbx.filesDownload({ path: "/full/path/20220702.pdf" })
  .then(function (response) {
    var loadingTask = PDFJS.getDocument(response.result.fileBinary)
      .then(function (pdf) {
        console.log("OK !!!!");
        res.json(response.result.fileBinary);
      })
      .catch(function (error) {
        console.log("error");
        res.json({ "error_2": error });
      });
  });
But I got this on the console:
> C:\laragon\www\test-pdf-dropbox\functions\node_modules\pdfjs-dist\build\pdf.js:2240
> data: structuredClone(obj, transfers)
> ^
>
> ReferenceError: structuredClone is not defined
> at LoopbackPort.postMessage (C:\laragon\www\test-pdf-dropbox\functions\node_modules\pdfjs-dist\build\pdf.js:2240:13)
> at MessageHandler.sendWithPromise (C:\laragon\www\test-pdf-dropbox\functions\node_modules\pdfjs-dist\build\pdf.js:8555:19)
> at _fetchDocument (C:\laragon\www\test-pdf-dropbox\functions\node_modules\pdfjs-dist\build\pdf.js:1356:48)
> at C:\laragon\www\test-pdf-dropbox\functions\node_modules\pdfjs-dist\build\pdf.js:1302:29
> at processTicksAndRejections (node:internal/process/task_queues:96:5)
I solved it.
First: use the legacy dist of PDFJS.
Instead of using
const PDFJS = require("pdfjs-dist");
I now do
const PDFJS = require("pdfjs-dist/legacy/build/pdf.js");
The npm package is the same, pdfjs-dist.
Then: use PDFJS in this way:
var pdf = PDFJS.getDocument(new Uint8Array(response.result.fileBinary)).promise
  .then(function (pdf) {
    console.log("Read the PDF !!!!", pdf);
    res.json({ done: true });
  });
Note
fileBinary can be passed to PDFJS using new Uint8Array
I appended .promise before the .then
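As a usage sketch, once getDocument resolves, the standard pdfjs-dist calls can read page contents; numPages, getPage, and getTextContent are all part of the documented API:

const PDFJS = require("pdfjs-dist/legacy/build/pdf.js");

// Sketch: load the Dropbox fileBinary and extract the text of page 1.
async function readPdf(fileBinary) {
  const pdf = await PDFJS.getDocument(new Uint8Array(fileBinary)).promise;
  console.log("Pages:", pdf.numPages);
  const page = await pdf.getPage(1); // pages are 1-indexed
  const content = await page.getTextContent();
  return content.items.map((item) => item.str).join(" ");
}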

Problem displaying PDF in React from Uint8Array (react-pdf)

I'm trying to build a PDF viewer in React. I've already built the backend to upload the files and fetch the buffer data, but I'm having some problems with react-pdf, as I can't seem to serve the right type of data.
This is the code I've written:
const [data, setData] = useState();

useEffect(() => {
  fetch("http://localhost:5000/files/pdf-test.pdf")
    .then((res) => res.text())
    .then((res) => setData(new Buffer(res, "binary")));
});

return (
  <>
    <Document file={{ data: data }} />
  </>
);
This is one of the few tries I've made. In this one the backend serves the binary data of the file, and console logging in the last .then shows that I'm serving the Document a Uint8Array, which is the recommended data format (console screenshot omitted).
Apart from that attempt, I've also tried it with binary data and an ArrayBuffer, changing both the backend and frontend code, but still no results.
The error I get is: (error screenshot omitted)
TL;DR: I'm trying to display PDF files in React with react-pdf using their buffer data, but I can't manage to do it. I have already built a backend to upload files (express-fileupload) and store them so that their data can be fetched.
Thank you for helping, and I'm open to other approaches to the problem.
For CRA (create-react-app) you need to configure the worker in order to view your PDF file.
import { Document, Page, pdfjs } from 'react-pdf';
Then configure it like this:
pdfjs.GlobalWorkerOptions.workerSrc = `//cdnjs.cloudflare.com/ajax/libs/pdf.js/${pdfjs.version}/pdf.worker.js`;
Usage:
const [uint8Arr, setUint8Arr] = useState();

function getUint8Array() {
  let reader = new FileReader();
  // reader.readAsDataURL(selectedFile); // base64 variant
  reader.readAsArrayBuffer(selectedFile);
  reader.onloadend = async (e) => {
    if (e.target?.result instanceof ArrayBuffer) {
      const uint8Array = new Uint8Array(e.target.result);
      setUint8Arr(uint8Array); // <-- store the bytes in state
      // more callbacks(file.name, Buffer.from(new Uint8Array(target.result)));
    }
  };
}

<Document file={{
  data: uint8Arr // <-- pass the Uint8Array to react-pdf
}} onLoadSuccess={() => console.log('SUCCESS LOAD')}>
  <Page pageNumber={1} />
</Document>
Hope that this will fix your issue.
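One more observation, beyond the worker configuration: in the question's fetch code, res.text() decodes the PDF bytes as text and corrupts them, and new Buffer(...) is deprecated in the browser anyway. A minimal sketch of fetching the body as an ArrayBuffer instead, so the bytes reach react-pdf intact:

const [data, setData] = useState();

useEffect(() => {
  fetch("http://localhost:5000/files/pdf-test.pdf")
    .then((res) => res.arrayBuffer()) // keep the body binary; .text() corrupts it
    .then((buf) => setData(new Uint8Array(buf)));
}, []); // empty deps: fetch once instead of on every render

return <Document file={{ data }} />;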

Dynamic import of Javascript module from stream

Goal: To support dynamic loading of Javascript modules contingent on some security or defined user role requirement such that even if the name of the module is identified in dev tools, it cannot be successfully imported via the console.
A JavaScript module can be easily uploaded to a cloud storage service like Firebase (#AskFirebase), and the code can be conditionally retrieved using a Firebase Cloud Function firebase.functions().httpsCallable("ghost"), based on the presence of a custom claim or similar test.
// Imports assumed so the snippet is self-contained
import * as functions from 'firebase-functions';
import { Storage } from '@google-cloud/storage';

export const ghost = functions.https.onCall(async (data, context) => {
  if (context.auth.token.restrictedAccess !== true) {
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
  }
  const storage = new Storage();
  const bucketName = 'bucket-name.appspot.com';
  const srcFilename = 'RestrictedChunk.chunk.js';

  // Downloads the file and returns its source as a string
  const response = await storage
    .bucket(bucketName)
    .file(srcFilename)
    .download();
  const code = String.fromCharCode.apply(String, response[0]);
  return { source: code };
});
In the end, what I want to do...
...is take a webpack'ed React component, put it in the cloud, conditionally download it to the client after a server-side security check, and import() it into the user's client environment and render it.
Storing the Javascript in the cloud and conditionally downloading to the client are easy. Once I have the webpack'ed code in the client, I can use Function(downloadedRestrictedComponent) to add it to the user's environment much as one would use import('./RestrictedComponent') but what I can't figure out is how to get the default export from the component so I can actually render the thing.
import(pathToComponent) returns the loaded module, and as far as I know there is no option to pass import() a string or a stream, just a path to the module. And Function(downloadedComponent) will add the downloaded code into the client environment but I don't know how to access the module's export(s) to render the dynamically loaded React components.
Is there any way to dynamically import a Javascript module from a downloaded stream?
Edit to add: Thanks for the reply. I'm not familiar with the nuances of Blobs and URL.createObjectURL. Any idea why this would result in a "not found" error?
const ghost = firebase.functions().httpsCallable("ghost");

const LoadableRestricted = Loadable({
  // loader: () => import(/* webpackChunkName: "Restricted" */ "./Restricted"),
  loader: async () => {
    const ghostContents = await ghost();
    console.log(ghostContents);
    const myBlob = new Blob([ghostContents.data.source], {
      type: "application/javascript"
    });
    console.log(myBlob);
    const myURL = URL.createObjectURL(myBlob);
    console.log(myURL);
    return import(myURL);
  },
  render(loaded, props) {
    console.log(loaded);
    let Component = loaded.Restricted;
    return <Component {...props} />;
  },
  loading: Loading,
  delay: 2000
});
Read the contents of the module file/stream into a Blob, then use URL.createObjectURL() to create a dynamic URL for the Blob. Now use import as you suggested above:
import(myBlobURL).then(module => { /* do something with module */ });
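A minimal sketch of that approach (helper name hypothetical). If the project is bundled with webpack, the webpackIgnore magic comment stops the bundler from trying to resolve the URL at build time, which is one plausible reason the Loadable attempt above fails with "not found":

// Hypothetical helper: import a module from downloaded source text.
async function importFromSource(source) {
  const blob = new Blob([source], { type: "application/javascript" });
  const url = URL.createObjectURL(blob);
  try {
    // webpackIgnore leaves this as a native dynamic import at runtime
    return await import(/* webpackIgnore: true */ url);
  } finally {
    URL.revokeObjectURL(url); // the blob URL is no longer needed once loaded
  }
}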
You can try using React.lazy:
import React, { lazy, Suspense, useState } from 'react';

const Example = () => {
  const [userAuthenticated, setUserAuthenticated] = useState(true);

  if (userAuthenticated) {
    const RestrictedComponent = lazy(() => import('./RestrictedComponent'));
    return (
      <div>
        <Suspense fallback={<div><p>Loading...</p></div>}>
          <RestrictedComponent />
        </Suspense>
      </div>
    );
  }

  return (
    <div>
      <h1>404</h1>
      <p>Restricted</p>
    </div>
  );
};

export default Example;

Expo FileSystem.moveAsync location is not movable?

I'm working on a React Native app using the Expo API. The app basically takes a picture, then crops it using ImageEditor.cropImage, and finally copies the picture from the cache to another location. The code:
takePicture = async function () {
  if (this.camera) {
    this.camera.takePictureAsync().then(async (data) => {
      const cropdata = {
        offset: { x: 0, y: 0 },
        size: { width: 100, height: 100 },
      };
      await ImageEditor.cropImage(
        data.uri,
        cropdata,
        async (uri) => {
          FileSystem.moveAsync({
            from: uri,
            to: `${FileSystem.documentDirectory}photos/Photo_${this.state.photoId}.jpg`,
          }).then(() => {
            this.setState({
              photoId: this.state.photoId + 1,
            });
            Vibration.vibrate();
          });
        },
        (error) => {
          console.log(error);
        }
      );
    });
  }
};
But the following error is shown:
[Unhandled promise rejection: Error: Location
'file:///data/user/0/host.exp.exponent/cache/ReactNative_cropped_image_574763720.jpg'
isn't movable.]
any idea?
Expo’s FileSystem module can copy/move/etc. files that were previously saved in the app’s scope (for example via ImagePicker or Asset.loadAsync). ImageEditor is core React Native functionality and it saves your image to a file that is outside Expo’s scope, so FileSystem cannot perform actions on that file. More on this can be found here:
https://forums.expo.io/t/where-does-camera-takepictureasync-save-photos/6475/7
So instead of using ImageEditor.cropImage() one should use Expo's ImageManipulator.
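A minimal sketch of the cropping step with expo-image-manipulator, reusing the crop rectangle from the question (the save options are assumptions); manipulateAsync writes its output inside Expo's sandbox, so FileSystem.moveAsync can then move it:

import * as ImageManipulator from 'expo-image-manipulator';
import * as FileSystem from 'expo-file-system';

// Sketch: crop with expo-image-manipulator so the result stays in Expo's scope.
const cropAndMove = async (uri, photoId) => {
  const result = await ImageManipulator.manipulateAsync(
    uri,
    [{ crop: { originX: 0, originY: 0, width: 100, height: 100 } }],
    { compress: 1, format: ImageManipulator.SaveFormat.JPEG } // assumed options
  );
  await FileSystem.moveAsync({
    from: result.uri,
    to: `${FileSystem.documentDirectory}photos/Photo_${photoId}.jpg`,
  });
};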
