Passing Firestore (v9) instance across packages - javascript

I'm trying to move some Firestore operations into a separate package so that they can be imported and reused in different web apps. I'm building a monorepo with several packages, and I'm using Firebase v9 in the following example:
From packageA I'm defining and exporting a getPosts(db) function that takes a Firestore instance and returns some posts from the given database:
// in 'packageA'
import { collection, getDocs, Firestore } from 'firebase/firestore';

export const getPosts = async (db: Firestore) => {
  console.log('Passed in db: ', db); // This correctly prints the passed-in Firestore object
  try {
    const postsCollection = collection(db, 'posts'); // This function will throw
    const querySnapshot = await getDocs(postsCollection);
    return querySnapshot.docs.map((doc) => doc.data());
  } catch (e) {
    console.error('Error reading posts: ', e);
  }
};
In a web app I'm initialising the Firebase app and exporting the Firestore instance
// firebase.js in 'web-app-1'
import { initializeApp } from 'firebase/app';
import { getFirestore } from 'firebase/firestore';
const firebaseConfig = { /* my Firebase config */ };
export const app = initializeApp(firebaseConfig);
export const db = getFirestore(app);
Then I'm trying to use the getPosts function from the package in a component...
// App.js in 'web-app-1'
import { db } from './firebase.js';
import { getPosts } from 'packageA';

let posts;

async function loadPosts() {
  try {
    posts = await getPosts(db);
  } catch (e) {
    console.error(e);
  }
}

loadPosts(); // throws an error
but I get the following error from the collection(db, 'posts') call
Error reading posts: Expected first argument to collection() to be a CollectionReference, a DocumentReference or FirebaseFirestore
even though the passed-in database is correctly printed in the console (from the getPosts function).
Note: If I copy the whole getPosts function and use it directly in the web app (i.e. without importing it from another package) then it works and correctly fetches the posts.

This looks like a bug with version 9: the method appears to be treating the passed-in instance as a Firebase Realtime Database rather than Firestore, which is why collection() rejects it. It seems to lose track of the fact that it's Firestore when the function is called from the package, so I would report this to Firebase support directly, because the way the package is being built seems to be the main issue.

I've been looking around a bit more and found this answer to a similar question, which solved my problem too.
Basically, what I had to do was declare Firebase as a peerDependency in packageA and not include it in the package's bundle; the web apps that consume packageA include Firebase as a regular dependency, so only one copy of Firestore ends up in the app. (The error occurs when collection() receives a Firestore instance created by a different copy of the firebase package than the one performing the internal type check.)
So the package.json files look as follows
In the utility package
{
  "name": "packageA",
  "peerDependencies": {
    "firebase": "^9.6.3"
  }
}
and then in the web apps
{
  "name": "web-app-1",
  "dependencies": {
    "firebase": "^9.6.3"
  }
}
This approach also makes sense for my use case, as the web app that initialises the Firebase app – and only that web app – will include it in its bundle. I can imagine, however, that in some other use cases this is not a possible solution.
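As a side note (not part of the original answer), the same idea has to hold at the build step: the utility package's bundler should treat Firebase as external rather than bundling its own copy. Below is a minimal sketch, assuming packageA happened to be built with Rollup; the bundler choice and file paths are illustrative, not taken from the original setup.

// rollup.config.js in 'packageA' (hypothetical build configuration)
export default {
  input: 'src/index.js',
  output: { file: 'dist/index.js', format: 'esm' },
  // Treat every 'firebase/*' import as external so the consuming web app's
  // copy of Firebase is used at runtime instead of a bundled duplicate.
  external: (id) => id === 'firebase' || id.startsWith('firebase/'),
};

Most bundlers have an equivalent option (for example, externals in webpack).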
Nevertheless, I have submitted my issue to Firebase support as suggested, and here is their answer:
We have received some similar cases and we are already working to solve this. However, it can take a while due to the workload of the engineering team. Please be patient.

I am currently experiencing the same problem. The workaround is importing the files directly via the bundler.
Keep in mind this is not optimal, because I have to install the packages in the native project again, so it requires some manual maintenance.
Project structure
apps
  native
  web
packages
  utils
This ensures that my app uses the firebase instance and package that is inside native/node_modules/
Metro.config.js
const { getDefaultConfig } = require("@expo/metro-config");
const path = require("path");

const projectRoot = __dirname;
const workspaceRoot = path.resolve(__dirname, "../..");

const config = getDefaultConfig(__dirname);

const extraNodeModules = {
  '@aim/utils': path.resolve(__dirname, '../../packages/utils'),
};

const watchFolders = [
  path.resolve(__dirname, '../../packages/utils'),
];

config.watchFolders = [workspaceRoot];
config.resolver.nodeModulesPath = [
  path.resolve(projectRoot, "node_modules"),
  path.resolve(workspaceRoot, "node_modules"),
];

module.exports = {
  transformer: {
    getTransformOptions: async () => ({
      transform: {
        experimentalImportSupport: false,
        inlineRequires: false,
      },
    }),
  },
  resolver: {
    extraNodeModules: new Proxy(extraNodeModules, {
      // redirects dependencies referenced from common/ to local node_modules
      get: (target, name) =>
        name in target ? target[name] : path.join(process.cwd(), `node_modules/${name}`),
    }),
  },
  watchFolders,
};
// module.exports = config;
getting types to work (native)
tsconfig.json
{
  "compilerOptions": {
    "allowSyntheticDefaultImports": true,
    "jsx": "react-native",
    "lib": ["dom", "esnext"],
    "moduleResolution": "node",
    "noEmit": true,
    "skipLibCheck": true,
    "resolveJsonModule": true,
    "strict": true,
    "baseUrl": ".",
    "paths": {
      "@aim/utils/*": ["../../packages/utils/*"]
    }
  }
}

Related

How to config testcafe basic configuration on ES modules

The TestCafe documentation ("TypeScript and CoffeeScript") says that no additional settings are needed to use ES modules when writing tests; however, it's not clear how to set up the TestCafe configuration file if the project uses ES modules, for example to write global hooks, because it looks like you have only two options for configuring TestCafe globally: .json and CommonJS.
I need authorization before each test in the project, and I have this function for that:
import { Role, Selector, t } from 'testcafe';

export const user = Role('http://localhost:3000/login', async t => {
  await t
    .typeText(Selector('#loginInput'), 'Login')
    .typeText(Selector('#passwordInput'), 'Password')
    .click(Selector('button').withAttribute('data-testid', 'submitButton'));
});
And I have tried this in the .testcaferc.js file:
import { user } from './src/testing/utilities/loginUser';

export default {
  hooks: {
    testRun: {
      before: async () => {
        await t.useRole(user);
      },
    },
  },
};
To summarize: how can I write a global hook for TestCafe if my project is using ES modules?
There is an exception for the config file. You should use CommonJS syntax as described in the documentation.
const { user } = require('./src/testing/utilities/loginUser');

module.exports = {
  hooks: {
    testRun: {
      before: async () => {
        await t.useRole(user);
      },
    },
  },
};

Using firebase emulator with react-fire package

Following this answer, I am setting up the Firebase emulator using react-fire with the following code:
const preloadSDKs = (firebaseApp: firebase.app.App) => {
  return Promise.all([
    preloadFirestore({
      firebaseApp,
      setup: firestore => {
        return firestore().useEmulator('localhost', 8080);
      },
    }),
  ]);
};

interface IAppProps {}

const App: React.FunctionComponent<IAppProps> = (props) => {
  const firebaseApp = useFirebaseApp();
  preloadSDKs(firebaseApp).then(() => Promise.resolve());
However, I am a bit confused as to how I can add some conditional code that makes sure I use the emulator in development and my cloud Firestore in a production build.
Can anyone help me configure this?
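No answer is quoted here, but a minimal sketch of one common approach is to gate the emulator call behind an environment check. It mirrors the snippet above; the process.env.NODE_ENV check is an assumption about the build tooling (Create React App and similar setups define it), so adapt it to whatever flag your build provides.

// Hypothetical variant of the snippet above: only use the emulator in development.
const preloadSDKs = (firebaseApp: firebase.app.App) => {
  return Promise.all([
    preloadFirestore({
      firebaseApp,
      setup: firestore => {
        // In a production build this branch is skipped, so the app
        // talks to Cloud Firestore directly.
        if (process.env.NODE_ENV === 'development') {
          return firestore().useEmulator('localhost', 8080);
        }
      },
    }),
  ]);
};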

How to test yargs application using Jest (Javascript/Typescript)

I already asked the question on the Jest repository here, and also pushed a sample application here to reproduce the behavior. But for the sake of completeness, here's the full story:
Essentially it's like this (./parsers.ts):
import yargs from "yargs";

export const parser = yargs
  .strict(true)
  .help()
  .commandDir("cmds")
  .demandCommand(1)
  .recommendCommands();
And in the cmds folder, there's remote.ts:
import { Argv } from "yargs";

export const command = "remote <command>";
export const describe = "Manage set of tracked repos";
export const handler = (yargs: Argv<any>) => {};
export const builder = (yargs: Argv<any>) => {
  return yargs
    .commandDir("remote_cmds")
    .demandCommand(1, 1)
    .recommendCommands();
};
And then there's add.ts:
import { Argv } from "yargs";

export const command = "add <name> <url>";
export const handler = (yargs: Argv<any>): void => {};
export const describe = "Add remote named <name> for repo at url <url>";
export const builder = (yargs: Argv<any>): Argv => {
  return yargs.demandCommand(0, 0);
};
Now I've got two more files:
// index.ts
import { parser } from "./parsers";
import { Arguments } from "yargs";

parser.parse("remote add foo", (err, argv, output) => {
  console.log("parsed argv: %s", JSON.stringify(argv));
  if (err) console.log("ERROR\n" + err);
  if (output) console.log("OUTPUT\n" + output);
});
When I run this, it fails, rightly so, because the remote add command expects two arguments. And if I pass correct input, it gives correct output, meaning everything works just fine.
// parsers.test.ts
import { Arguments } from "yargs";
import { parser } from "./parsers";

describe("remote", () => {
  test("add", async () => {
    const argv = parser.parse("remote add foo", (err, argv, output) => {
      console.log(JSON.stringify(argv));
      if (err) console.log("ERROR\n" + err);
      if (output) console.log("OUTPUT\n" + output);
    });
    expect(argv.name).toEqual("foo");
  });
});
Also the Jest configuration is:
module.exports = {
  transform: {
    "^.+\\.ts?$": "ts-jest",
  },
  testEnvironment: "node",
  testRegex: "./src/.*\\.(test|spec)?\\.(ts|ts)$",
  moduleFileExtensions: ["ts", "tsx", "js", "jsx", "json", "node"],
  roots: ["<rootDir>/src"],
};
But when I run the above test, the parse doesn't fail at all, as if the parser has no configuration. (Interestingly, the assertion fails because foo is not extracted as a property of argv, which again shows that the parser didn't pick up the configuration inside the cmds folder.)
Not sure if it's a bug or a feature, but while testing yargs parsers, something is interfering with the parser configuration so that nothing from the command directories gets loaded into the parser.
How can I test my parser using Jest? Thanks.

error sending request for url (https://deno.land/std/encoding/csv.ts) (os error 10013)

Unable to import the stdlib files from deno.land to local cache on running mod.ts.
error: error sending request for url (https://deno.land/std/encoding/csv.ts): error trying to connect: tcp connect error: An attempt was made to access
a socket in a way forbidden by its access permissions. (os error 10013)
Imported from "file:///C:/Current_Tasks/Deno/Kepler/mod.ts:3"
Is there anything additional that needs to be enabled to import these files?
import { join } from "https://deno.land/std/path/mod.ts";
import { BufReader } from "https://deno.land/std/io/bufio.ts";
import { parse } from "https://deno.land/std/encoding/csv.ts";

async function loadPlanetsData() {
  const path = join(".", "test.csv");
  const file = await Deno.open(path);
  const bufReader = new BufReader(file);
  const result = await parse(bufReader, {
    header: true,
    comment: "#",
  });
  Deno.close(file.rid);
  console.log(result);
}

await loadPlanetsData();
Update: Used
deno run --allow-read mod.ts
While running this file, you need to give Deno read access.
Deno is secure by default. Therefore, unless you specifically enable it, a Deno module has no file, network, or environment access, for example. Access to security-sensitive areas or functions requires permissions to be granted to the Deno process on the command line.
For the following example, mod.ts has been granted read-only access to the file system. It cannot write to it, or perform any other security-sensitive functions.
deno run --allow-read mod.ts
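As an aside (not part of the original answer), a script can also inspect its own permission state at runtime via the Deno.permissions API, which makes the required flag easier to discover. A small sketch:

// Query whether read access to the current directory has been granted (e.g. via --allow-read).
const readStatus = await Deno.permissions.query({ name: "read", path: "." });

if (readStatus.state !== "granted") {
  console.error("Missing read permission. Run with: deno run --allow-read mod.ts");
  Deno.exit(1);
}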

nextjs route middleware for authentication

I'm trying to figure out an appropriate way of doing authentication, which I know is a touchy subject on the GitHub issue page.
My authentication is simple. I store a JWT token in the session. I send it to a different server for approval. If I get back true, we keep going; if I get back false, it clears the session and sends them to the main page.
In my server.js file I have the following (note: I am using the example from the Next.js Learn tutorial and just adding isAuthenticated):
function isAuthenticated(req, res, next) {
  // checks go here
  // if (req.user.authenticated)
  //   return next();

  // IF A USER ISN'T LOGGED IN, THEN REDIRECT THEM SOMEWHERE
  res.redirect('/');
}

server.get('/p/:id', isAuthenticated, (req, res) => {
  const actualPage = '/post';
  const queryParams = { id: req.params.id };
  app.render(req, res, actualPage, queryParams);
});
This works as designed: if I refresh the page /p/123, it will redirect to /. However, if I go there via a next/link href, it doesn't, which I believe is because at that point it's not using Express but Next's client-side routing.
Is there a way I can bake in a check for every single next/link navigation that doesn't go through Express, so that I can make sure the user is logged in?
Tim from the Next.js chat helped me solve this. The solution can be found here, but I will quote him so you all can see:
You can do the check in _app.js getInitialProps and redirect like this
Example of how to use it
_app.js documentation
I've also created an example skeleton template you can take a look at.
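The linked example and skeleton are not reproduced above, so here is a minimal sketch of the pattern Tim describes: check authentication in _app.js getInitialProps and redirect. The isAuthenticated helper below is a placeholder for whatever check the app actually uses (e.g. validating the JWT stored in the session); the redirect target '/' matches the question.

// _app.js (sketch; isAuthenticated is a placeholder for your own check)
import App from 'next/app';
import Router from 'next/router';

// Placeholder check: replace with real JWT/session validation.
const isAuthenticated = (ctx) =>
  Boolean(ctx.req ? ctx.req.headers.cookie : document.cookie);

const redirectTo = (ctx, location) => {
  if (ctx.res) {
    // Server-side navigation (full page load): send an HTTP redirect.
    ctx.res.writeHead(302, { Location: location });
    ctx.res.end();
  } else {
    // Client-side navigation (next/link): use the router.
    Router.push(location);
  }
};

class MyApp extends App {
  static async getInitialProps({ Component, ctx }) {
    if (ctx.pathname !== '/' && !isAuthenticated(ctx)) {
      redirectTo(ctx, '/');
      return { pageProps: {} };
    }
    const pageProps = Component.getInitialProps
      ? await Component.getInitialProps(ctx)
      : {};
    return { pageProps };
  }
}

export default MyApp;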
--
EDIT July 2021 - WARNING: This is an outdated solution and has not been confirmed to work with the latest versions of next.js. Use skeleton template at your own risk.
Edit: Updated answer for Next 12.2+
Note: The content below is copied from the official blog post, since SO generally discourages links that can become stale/dead over time.
https://nextjs.org/blog/next-12-2#middleware-stable
Middleware is now stable in 12.2 and has an improved API based on feedback from users.
// middleware.ts
import { NextRequest, NextResponse } from 'next/server';

// If the incoming request has the "beta" cookie
// then we'll rewrite the request to /beta
export function middleware(req: NextRequest) {
  const isInBeta = JSON.parse(req.cookies.get('beta') || 'false');
  req.nextUrl.pathname = isInBeta ? '/beta' : '/';
  return NextResponse.rewrite(req.nextUrl);
}

// Supports both a single value or an array of matches
export const config = {
  matcher: '/',
};
Migration guide
https://nextjs.org/docs/messages/middleware-upgrade-guide
Breaking changes
No Nested Middleware
No Response Body
Cookies API Revamped
New User-Agent Helper
No More Page Match Data
Executing Middleware on Internal Next.js Requests
How to upgrade
You should declare one single Middleware file in your application, which should be located next to the pages directory and named without an _ prefix. Your Middleware file can still have either a .ts or .js extension.
Middleware will be invoked for every route in the app, and a custom matcher can be used to define matching filters. The following is an example for a Middleware that triggers for /about/* and /dashboard/:path*, the custom matcher is defined in an exported config object:
// middleware.ts
import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'

export function middleware(request: NextRequest) {
  return NextResponse.rewrite(new URL('/about-2', request.url))
}

// Supports both a single string value or an array of matchers
export const config = {
  matcher: ['/about/:path*', '/dashboard/:path*'],
}
Edit: Outdated answer for next > 12 and < 12.2
With the release of Next.js 12, there's now beta support for middleware using Vercel Edge Functions.
https://nextjs.org/blog/next-12#introducing-middleware
Middleware uses a strict runtime that supports standard Web APIs like fetch. This works out of the box using next start, as well as on Edge platforms like Vercel, which use Edge Functions.
To use Middleware in Next.js, you can create a file pages/_middleware.js. In this example, we use the standard Web API Response (MDN):
// pages/_middleware.js
export function middleware(req, ev) {
  return new Response('Hello, world!')
}
JWT Authentication example
https://github.com/vercel/examples/tree/main/edge-functions/jwt-authentication
in next.config.js:
const withTM = require('@vercel/edge-functions-ui/transpile')()

module.exports = withTM()
in pages/_middleware.js:
import { NextRequest, NextResponse } from 'next/server'
import { setUserCookie } from '@lib/auth'

export function middleware(req: NextRequest) {
  // Add the user token to the response
  return setUserCookie(req, NextResponse.next())
}
in pages/api/_middleware.js:
import type { NextRequest } from 'next/server'
import { nanoid } from 'nanoid'
import { verifyAuth } from '@lib/auth'
import { jsonResponse } from '@lib/utils'

export async function middleware(req: NextRequest) {
  const url = req.nextUrl

  if (url.searchParams.has('edge')) {
    const resOrPayload = await verifyAuth(req)

    return resOrPayload instanceof Response
      ? resOrPayload
      : jsonResponse(200, { nanoid: nanoid(), jwtID: resOrPayload.jti })
  }
}
in pages/api/index.js:
import type { NextApiRequest, NextApiResponse } from 'next'
import { verify, JwtPayload } from 'jsonwebtoken'
import { nanoid } from 'nanoid'
import { USER_TOKEN, JWT_SECRET_KEY } from '@lib/constants'

export default async function handler(
  req: NextApiRequest,
  res: NextApiResponse
) {
  if (req.method !== 'GET') {
    return res.status(405).json({
      error: { message: 'Method not allowed' },
    })
  }
  try {
    const token = req.cookies[USER_TOKEN]
    const payload = verify(token, JWT_SECRET_KEY) as JwtPayload
    res.status(200).json({ nanoid: nanoid(), jwtID: payload.jti })
  } catch (err) {
    res.status(401).json({ error: { message: 'Your token has expired.' } })
  }
}
There is no built-in middleware for API routes in Next.js, but there are HOCs, which you can use to connect to the database, select the user, etc.:
https://hoangvvo.com/blog/nextjs-middleware
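As an illustration of that HOC approach, here is a minimal sketch (not taken from the linked post; the withAuth name, the cookie name, and the JWT_SECRET environment variable are assumptions):

// lib/with-auth.ts (hypothetical helper)
import type { NextApiRequest, NextApiResponse, NextApiHandler } from 'next';
import { verify } from 'jsonwebtoken';

// Wraps an API route handler and rejects requests without a valid JWT cookie.
export const withAuth = (handler: NextApiHandler): NextApiHandler =>
  async (req: NextApiRequest, res: NextApiResponse) => {
    try {
      const token = req.cookies['token'] ?? ''; // cookie name is an assumption
      const user = verify(token, process.env.JWT_SECRET as string);
      (req as any).user = user; // make the decoded payload available to the handler
      return handler(req, res);
    } catch {
      return res.status(401).json({ error: { message: 'Not authenticated' } });
    }
  };

// Usage in pages/api/secret.ts:
// export default withAuth((req, res) => res.status(200).json({ ok: true }));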
