Goal: To support dynamic loading of JavaScript modules contingent on a security or user-role requirement, such that even if the name of the module is identified in dev tools, it cannot be successfully imported via the console.
A JavaScript module can easily be uploaded to a cloud storage service like Firebase (#AskFirebase), and the code can be conditionally retrieved with a callable Cloud Function, firebase.functions().httpsCallable("ghost"), based on the presence of a custom claim or similar test.
import * as functions from 'firebase-functions';
import { Storage } from '@google-cloud/storage';

export const ghost = functions.https.onCall(async (data, context) => {
  // Reject callers that lack the restrictedAccess custom claim
  if (context.auth?.token?.restrictedAccess !== true) {
    throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
  }
  const storage = new Storage();
  const bucketName = 'bucket-name.appspot.com';
  const srcFilename = 'RestrictedChunk.chunk.js';
  // Download the restricted chunk and return its source as a string
  const [contents] = await storage
    .bucket(bucketName)
    .file(srcFilename)
    .download();
  return { source: contents.toString('utf8') };
});
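For reference, the restrictedAccess custom claim checked above would typically be granted from trusted server code with the Admin SDK; a minimal sketch (the helper and its uid parameter are illustrative, not part of the original setup):
const admin = require('firebase-admin');
admin.initializeApp();
// Illustrative helper: grant the claim to one user. The client must refresh its
// ID token before the claim becomes visible to the onCall check above.
async function grantRestrictedAccess(uid) {
  await admin.auth().setCustomUserClaims(uid, { restrictedAccess: true });
}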
In the end, what I want to do...
...is take a webpack'ed React component, put it in the cloud, conditionally download it to the client after a server-side security check, and import() it into the user's client environment and render it.
Storing the JavaScript in the cloud and conditionally downloading it to the client are easy. Once I have the webpack'ed code in the client, I can use Function(downloadedRestrictedComponent) to add it to the user's environment, much as one would use import('./RestrictedComponent'), but what I can't figure out is how to get the default export from the component so I can actually render it.
import(pathToComponent) returns the loaded module, and as far as I know there is no option to pass import() a string or a stream, just a path to the module. Function(downloadedComponent) will add the downloaded code to the client environment, but I don't know how to access the module's export(s) to render the dynamically loaded React components.
Is there any way to dynamically import a JavaScript module from a downloaded stream?
Edit to add: Thanks for the reply. I'm not familiar with the nuances of Blobs and URL.createObjectURL. Any idea why this would come back as not found?
const ghost = firebase.functions().httpsCallable("ghost");
const LoadableRestricted = Loadable({
  // loader: () => import(/* webpackChunkName: "Restricted" */ "./Restricted"),
  loader: async () => {
    const ghostContents = await ghost();
    console.log(ghostContents);
    const myBlob = new Blob([ghostContents.data.source], {
      type: "application/javascript"
    });
    console.log(myBlob);
    const myURL = URL.createObjectURL(myBlob);
    console.log(myURL);
    return import(myURL);
  },
  render(loaded, props) {
    console.log(loaded);
    let Component = loaded.Restricted;
    return <Component {...props} />;
  },
  loading: Loading,
  delay: 2000
});
Read the contents of the module file/stream into a Blob. Then use URL.createObjectURL() to create a dynamic URL to the Blob. Now use import as you suggested above:
import(myBlobURL).then(module => { /* do something with module */ });
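A slightly fuller sketch of that flow, written as the body of an async loader, assuming the downloaded chunk is an ES module whose default export is the React component (and that the client bundle is built with webpack, in which case the dynamic import needs the webpackIgnore comment so the bundler leaves it alone):
const ghostContents = await ghost(); // the httpsCallable from the question
const blob = new Blob([ghostContents.data.source], { type: "application/javascript" });
const url = URL.createObjectURL(blob);
const module = await import(/* webpackIgnore: true */ url); // native dynamic import of the blob URL
URL.revokeObjectURL(url); // the object URL is no longer needed once the module has loaded
const RestrictedComponent = module.default; // render <RestrictedComponent /> as usual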
You can try using React.lazy:
import React, { lazy, Suspense, useState } from 'react';

// Define the lazy component once, outside the component body, so it isn't
// re-created (and re-mounted) on every render.
const RestrictedComponent = lazy(() => import('./RestrictedComponent'));

const Example = () => {
  const [userAuthenticated, setUserAuthenticated] = useState(true);

  if (userAuthenticated) {
    return (
      <div>
        <Suspense fallback={<div><p>Loading...</p></div>}>
          <RestrictedComponent />
        </Suspense>
      </div>
    );
  }

  return (
    <div>
      <h1>404</h1>
      <p>Restricted</p>
    </div>
  );
};

export default Example;
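To tie this back to the original goal of loading the chunk only after the server-side check, the same lazy wrapper can sit on top of the blob-URL import instead of a bundled path. A rough sketch, again assuming the downloaded chunk is an ES module with the component as its default export:
const RestrictedComponent = lazy(async () => {
  const ghostContents = await ghost(); // the callable that performs the server-side check
  const blob = new Blob([ghostContents.data.source], { type: "application/javascript" });
  return import(/* webpackIgnore: true */ URL.createObjectURL(blob)); // lazy expects a module with a default export
});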
Related
I use Next.js 13 Server Components to query MongoDB in the component
The data is rendered to the page.
If something changes in the database and I refresh the page, the changes should also be reflected in the page.
This also works, but only in development mode (npm run dev).
But when I start the server with npm run start, new database data is not reflected on the page; the data is fetched only once.
With npm run build the route is marked as Static. Is this correct?
/lib/mongodb.js
import { MongoClient } from 'mongodb'
if (!process.env.MONGODB_URI) {
  throw new Error('Invalid/Missing environment variable: "MONGODB_URI"')
}
const uri = process.env.MONGODB_URI
const options = {}
let client
let clientPromise
console.log("DEVELOPMENT MODE MONGODB")
// In development mode, use a global variable so that the value
// is preserved across module reloads caused by HMR (Hot Module Replacement).
if (!global._mongoClientPromise) {
  client = new MongoClient(uri, options)
  global._mongoClientPromise = client.connect()
}
clientPromise = global._mongoClientPromise
// Export a module-scoped MongoClient promise. By doing this in a
// separate module, the client can be shared across functions.
export default clientPromise
tickets.js
This component runs on the server
import clientPromise from "../../../lib/mongodb";
export default async function Tickets() {
  const client = await clientPromise;
  const db = client.db('sample_mflix');
  const collection = db.collection('contact');
  const result = await collection.find().toArray();
  console.log("Database fetched");
  return (
    <>
      {result.map(el => {
        return (
          <div key={el._id}>
            {el.name}
            {el.email}
            {el.message}
          </div>
        );
      })}
    </>
  );
}
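For context on the Static label: when Next.js 13 detects no dynamic inputs in an app-router page, npm run build prerenders it once and npm run start keeps serving that prerendered output, which matches the behavior described above. If the page is meant to query MongoDB on every request, a route segment config along these lines (a sketch, assuming the page lives in the app directory) opts it out of static rendering:
// exported from the page file alongside the Tickets component
export const dynamic = 'force-dynamic'; // render on every request instead of at build time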
The idea is as follows:
Images/documents are stored privately on the server
A logged-in user on the frontend clicks a button, which sends an axios request to the backend to get an aggregated result of ModelA from TableA and its associated attachment file list from TableB
For each ModelA, numerous requests are made to an endpoint to fetch images, which are returned as \Symfony\Component\HttpFoundation\StreamedResponse via Storage::download($request->file_name)
This works in the sense that files are returned.
Note - I tried attaching all files to the response in step 2, but this didn't work, so I added the extra step to get the file list and then fetch individual files based on that list. This might kill the webserver if the number of requests becomes too high, so I would appreciate any advice on a different approach.
The problem
How do I display the files in React, and is this the right approach at all, considering the potential performance issues noted above?
I've tried the following:
Creating an octet-stream URL with FileReader, but the images wouldn't display and all had the same URL despite await being used on the reader.readAsDataURL(blob) call:
const { email, name, message, files } = props
const [previews, setPreviews] = useState<string[]>([])
const { attachments } = useAttachment(files)
useEffect(() => {
  const p = previews
  files && attachments?.forEach(async filename => {
    const reader = new FileReader()
    reader.onloadend = () => {
      p.push(reader.result as string)
      setPreviews(p)
    }
    const blob = new Blob([filename])
    await reader.readAsDataURL(blob)
  })
}, [files, attachments, previews])
Creating src attributes with URL.createObjectURL(), but these, although generated and unique, wouldn't display when used in an <img /> tag:
useEffect(() => {
  const p = previews
  files && attachments?.forEach(filename => {
    const blob = new Blob([filename])
    const src = URL.createObjectURL(blob)
    p.push(src)
    setPreviews(p)
  })
}, [files, attachments, previews])
Results example:
<img src="blob:http://127.0.0.1:8000/791f5efb-1b4e-4474-a4b6-d7b14b881c28" class="chakra-image css-0">
<img src="blob:http://127.0.0.1:8000/3d93449e-175d-49af-9a7e-61de3669817c" class="chakra-image css-0">
Here's the useAttachment hook:
import { useEffect, useState } from 'react'
import { api } from '#utils/useAxios'

const useAttachment = (files: any[] | undefined) => {
  const [attachments, setAttachments] = useState<any[]>([])

  const handleRequest = async (data: FormData) => {
    await api().post('api/attachment', data).then(resp => {
      const attach = attachments
      attach.push(resp)
      setAttachments(attach)
    })
  }

  useEffect(() => {
    if (files) {
      files.forEach(async att => {
        const formData = new FormData()
        formData.append('file_name', att.file_name)
        await handleRequest(formData)
      })
    }
  }, [files, attachments])

  return { attachments }
}

export default useAttachment
Try Storage::response(). This is the same as Storage::download(), except that it sets the Content-Disposition header to inline instead of attachment.
This tells the browser to display the file instead of downloading it. See the MDN docs on Content-Disposition.
Then you can use it as the src for an <img/>.
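A rough sketch of the React side, assuming the attachment endpoint is exposed as a GET route that returns Storage::response() (the route and query parameter names here are placeholders, not the original API):
// Because Content-Disposition is inline, the browser renders the response directly,
// so the endpoint URL itself can be used as the image source.
const AttachmentImage = ({ fileName }: { fileName: string }) => (
  <img src={`/api/attachment?file_name=${encodeURIComponent(fileName)}`} alt={fileName} />
)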
Solved it by sending the files in a single response, encoded with base64_encode(Storage::get('filename')). Then, on the frontend, it was as simple as:
const base64string = 'stringReturned'
<img src={`data:image/png;base64,${base64string}`} />
I'm trying to use a brand new feature released in Next.js v12.1: https://deepinder.me/nextjs-on-demand-isr. The API itself works fine and I can reach it, but in exchange it returns a 500 error that says res.unstable_revalidate is not a function. It does not work in either dev (next server && next dev) or production (next build && next start).
This is the api endpoint:
// ./api/misc/revalidate
const revalidateCache = async (_, res) => {
  console.log(res.unstable_revalidate, 'TEST REVALIDATE'); // res.unstable_revalidate is undefined here :(
  try {
    await res.unstable_revalidate('/');
    return res.json({ revalidated: true });
  } catch (err) {
    return res.status(500).send(`Error revalidating: ${err}`);
  }
};
export default revalidateCache;
This is the invoke:
// ./apps/client/services/server
const getRevalidate = async () => {
  await fetch('/api/misc/revalidate');
};
export default getRevalidate;
View layer that I call from:
// .src/pages/index.js
// ...some code here
const HomePage = ({ page, legacy }) => {
  const handleClick = () => {
    getRevalidate();
  };
  return (
    <div className={styles.homeRoot}>
      <button onClick={handleClick}>REVALIDATE</button>
    </div>
  );
};
UPD:
I use express to handle the API abstraction.
import express from 'express';
import revalidateCacheApi from './api/misc/revalidate';
export default app => {
  // ...some code here
  app.use('/api/misc/revalidate', revalidateCacheApi);
};
NVM. It was an issue with my local server. I use an advanced setup with two independent instances (:3000 and :4000) running in memory.
The way I designed the API above, it was supposed to be called on the :4000 server, which is in fact an Express server (and obviously does not have the Next.js internal API to purge the cache).
So I moved the call to pages/api/revalidate on the :3000 server.
Works fine:
// ./src/pages/server/revalidate.js
const getRevalidate = async () => {
  await fetch('/api/revalidate');
};
export default getRevalidate;
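For completeness, the revalidation handler itself has to live under pages/api on the Next.js instance so that res carries the revalidation helper; a minimal sketch based on the handler from the question:
// ./src/pages/api/revalidate.js (served by the :3000 Next.js server)
const revalidateCache = async (_req, res) => {
  try {
    await res.unstable_revalidate('/');
    return res.json({ revalidated: true });
  } catch (err) {
    return res.status(500).send(`Error revalidating: ${err}`);
  }
};
export default revalidateCache;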
My plugin, env.js:
export default async (_ctx, inject) => {
  const resp = await fetch('/config.json')
  const result = await resp.json()
  inject('env', result)
  // eslint-disable-next-line no-console
  console.log('env injected', result)
  return result
}
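For reference, whatever the plugin injects is available in components as this.$env (and from the console as this.$nuxt.$env), e.g.:
// any component
export default {
  mounted() {
    // eslint-disable-next-line no-console
    console.log(this.$env) // the value injected by the plugin above
  }
}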
Then the idea was to use its data inside nuxt.config.js to inject into publicRuntimeConfig:
import env from './plugins/env.js'
publicRuntimeConfig: {
  test: env,
},
Then in the browser console I'm checking it:
this.$nuxt.$config
It shows me the imported plugin module instead of a value, though this.$nuxt.$env shows the correct values.
What's wrong?
UPDATE 1
Tried Tony's suggestion:
// nuxt.config.js
import axios from 'axios'
export default async () => {
  const resp = await axios.get('/config.json')
  const config = resp.data
  return {
    publicRuntimeConfig: {
      config
    }
  }
}
It cannot fetch config.json, but if I point it at an external resource such as "https://api.openbrewerydb.org/breweries", it does work.
The intention of this question is to have a config.json where a user could simply change variable values (in the compiled code) and switch endpoints without a rebuild.
In nuxt.config.js, your env variable is a JavaScript module, where the default export is the function intended to be automatically run by Nuxt in a plugin's context. Importing the plugin script does not automatically execute that function. Even if you manually ran that function, it wouldn't make sense to use an injected prop as a runtime config because the data is already available as an injected prop.
If you just want to expose config.json as a runtime config instead of an injected prop, move the code from the plugin into an async configuration:
// nuxt.config.js
export default async () => {
  const resp = await fetch('/config.json')
  const config = await resp.json()
  return {
    publicRuntimeConfig: {
      keycloak: config
    }
  }
}
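One likely reason the relative fetch failed in the update above is that nuxt.config.js is evaluated in Node, where a relative URL like /config.json has no origin to resolve against (which is why only the absolute external URL worked). A possible workaround, sketched under the assumption that config.json ships in the project's static directory, is to read the file from disk when the config is evaluated:
// nuxt.config.js
import { promises as fs } from 'fs'
export default async () => {
  // read config.json when the server starts; edited values are picked up on restart without a rebuild
  const config = JSON.parse(await fs.readFile('./static/config.json', 'utf8'))
  return {
    publicRuntimeConfig: {
      keycloak: config
    }
  }
}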
I'm trying to consume a JSON response from an Express-style server in Firebase's Cloud Functions, but I'm unable to present the response in the DOM. The response path and status (200) are good, but the response data I'm getting in the browser is my entire index.html page, not the JSON data.
Here's my basic set up in the cloud functions:
app.get("/tester", (req, res) => {
res.json({ serverData: "Hello" });
});
exports.app = functions.https.onRequest(app);
and my React FE code to consume it:
function App() {
  let I;
  const onClick = () => {
    axios.get("/tester").then(res => {
      I = res.data.serverData;
      console.log(I);
    });
  };
  return (
    <div>
      <button onClick={onClick}>click</button>
      <div>{I}</div>
    </div>
  );
}
Like I said above, the response data I'm getting in the dev tools is just the barebones index.html page, not the text I want to receive. How can I map this data to the DOM?
You need I to be defined as state within your component. The easiest way to do that is to use React's useState hook. First, add the import:
import React, { useState } from 'react';
Then declare I using the useState hook and, after retrieving the data, use the associated setter to set the new value:
function App() {
  const [I, setI] = useState('');
  const onClick = () => {
    axios.get("/tester").then(res => {
      setI(res.data.serverData);
      console.log(I); // note: state updates are asynchronous, so this may still log the previous value
    });
  };
  return (
    <div>
      <button onClick={onClick}>click</button>
      <div>{I}</div>
    </div>
  );
}