I'm working on standing up a simple Relay/GraphQL app using Node.
At the root level, I have a connection called 'notes' that paginates all notes in the database. On my user object, I have a notes connection that paginates all notes that user has created.
const rootNotesId = ConnectionHandler.getConnectionID('client:root', 'RootNotesConnection_notes');
const userNotesId = ConnectionHandler.getConnectionID(queryData?.me?.id, 'UserNotesConnection_notes');
const rootNotes = usePaginationFragment(graphql `
fragment NotesRoot_notes on Query @refetchable(queryName: "NotesRootQuery") {
notes(first: $count, after: $cursor) @connection(key: "RootNotesConnection_notes") {
edges {
node {
...Note_note
}
}
}
}
`, queryData);
const userNotes = usePaginationFragment(graphql`
fragment NotesUser_notes on Query @refetchable(queryName: "NotesUserQuery") {
me {
id
notes(first: $count, after: $cursor) @connection(key: "UserNotesConnection_notes") {
edges {
node {
...Note_note
}
}
}
}
}
`, queryData);
How do I add a note to, or delete a note from, both connections at once client-side? I have two different edge types, and I thought this code would work:
const [commit, isInFlight] = useMutation(graphql `
mutation NotesCreateMutation($input: createNoteInput!) {
createNote(input: $input) {
noteEdge {
cursor,
node {
id
user {
username
}
content
}
}
}
}
`);
const handleButtonClick = () => {
if (!isInFlight) {
commit({
variables: {
input: {
content: newNoteInput
}
},
updater: store => {
const rootCon = ConnectionHandler.getConnection(store.get('client:root'), 'RootNotesConnection_notes');
const userCon = ConnectionHandler.getConnection(store.get(userId), 'UserNotesConnection_notes');
const payload = store.getRootField('createNote');
const newEdge = payload.getLinkedRecord('noteEdge');
const newNote = newEdge.getLinkedRecord('node');
debugger;
const newRootEdge = ConnectionHandler.createEdge(store, rootCon, newNote, 'QueryNotesEdge');
const newUserEdge = ConnectionHandler.createEdge(store, userCon, newNote, 'UserNotesEdge');
ConnectionHandler.insertEdgeAfter(rootCon, newRootEdge);
ConnectionHandler.insertEdgeAfter(userCon, newUserEdge);
}
});
setNewNoteInput('');
}
}
The only thing I can find in my debugging is that the cursor never gets set for the new edge. Stepping through this code in the debugger shows that all variables before newRootEdge resolve just fine.
This worked. Thanks to this thread for a good workaround: https://github.com/facebook/relay/issues/2761
const handleButtonClick = () => {
if (!isInFlight) {
commit({
variables: {
input: {
content: newNoteInput
}
},
updater: store => {
const rootCon = ConnectionHandler.getConnection(store.get('client:root'), 'RootNotesConnection_notes');
const payload = store.getRootField('createNote');
const newRootEdge = payload.getLinkedRecord('noteEdge');
const prevRootEdges = rootCon.getLinkedRecords('edges');
const nextRootEdges = [...prevRootEdges, newRootEdge];
rootCon.setLinkedRecords(nextRootEdges, 'edges');
const userCon = ConnectionHandler.getConnection(store.get(userId), 'UserNotesConnection_notes');
const newNote = newRootEdge.getLinkedRecord('node');
const newUserEdge = ConnectionHandler.createEdge(store, userCon, newNote, 'UserNotesEdge');
newUserEdge.setValue(newRootEdge.getValue('cursor'), 'cursor');
const prevUserEdges = userCon.getLinkedRecords('edges');
const nextUserEdges = [...prevUserEdges, newUserEdge];
userCon.setLinkedRecords(nextUserEdges, 'edges');
}
});
setNewNoteInput('');
}
}
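For the delete case, both connections can be cleaned up in the delete mutation's updater with ConnectionHandler.deleteNode. Here is a minimal sketch that slots into a commit({ ... }) call like the ones above; 'deleteNote' and 'deletedNoteId' are assumed names for the mutation payload, not fields from the schema shown here:
updater: store => {
  const rootCon = ConnectionHandler.getConnection(store.get('client:root'), 'RootNotesConnection_notes');
  const userCon = ConnectionHandler.getConnection(store.get(userId), 'UserNotesConnection_notes');
  // 'deleteNote'/'deletedNoteId' are placeholders for whatever your delete payload actually returns
  const payload = store.getRootField('deleteNote');
  if (!payload) return;
  const deletedId = payload.getValue('deletedNoteId');
  // drop the matching edge from both connections
  if (rootCon) ConnectionHandler.deleteNode(rootCon, deletedId);
  if (userCon) ConnectionHandler.deleteNode(userCon, deletedId);
  // and remove the note record itself from the store
  store.delete(deletedId);
}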
Related
I'm trying to create an anti-crash function, but I got stuck because the channel event does not give me the author. How can I get the author another way?
I tried to hook into AuditLogEvent, but it didn't work.
My code:
const { AuditLogEvent } = require('discord.js')
const usersMap = new Map();
const LIMIT = 3;
const TIMES = 10000
bot.rest.on('channelDelete', async channel => {
const fetchedLogs = await channel.guild.fetchAuditLogs({
limit: 1,
type: AuditLogEvent.ChannelDelete,
})
const deletionLog = fetchedLogs.entries.first();
const { executor, target } = deletionLog
if(channel.guild.id != "940990129307263046") return
if(usersMap.has(executor.id)) {
const userData = usersMap.get(executor.id);
const { lastDelete, timer } = userData;
let deleteCount = userData.deleteCount;
const tim = channel.createdTimestamp - lastDelete.createdTimestamp
if(tim > TIMES) {
usersMap.delete(executor.id)
} else {
++deleteCount;
if(parseInt(deleteCount) === LIMIT) {
executor.ban()
}
}
}
})
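For reference, the pattern in the discord.js v14 guide registers the listener on the client itself (not on bot.rest) and bans through the guild's member manager, since a User object has no ban() method. A minimal sketch, assuming discord.js v14 and that bot is your Client instance:
const { Events, AuditLogEvent } = require('discord.js');

bot.on(Events.ChannelDelete, async (channel) => {
  // fetch the most recent channel-delete audit log entry to find out who deleted the channel
  const fetchedLogs = await channel.guild.fetchAuditLogs({
    limit: 1,
    type: AuditLogEvent.ChannelDelete,
  });
  const deletionLog = fetchedLogs.entries.first();
  if (!deletionLog) return;
  const { executor } = deletionLog;
  // ban via the guild's member manager; a User cannot be banned directly
  await channel.guild.members.ban(executor.id, { reason: 'anti-crash: mass channel deletion' });
});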
I am initializing a Node.js app with crucial data the app needs in order to work, loaded from a database in index.ts.
index.ts
import { getInitialData } from './initData';
export let APP_DATA: AppData;
export const initializeAppData = async () => {
try {
APP_DATA = (await getInitialData()) as AppData;
if (process.env.NODE_ENV !== 'test') {
initializeMongoose();
startServer();
}
} catch (error) {
console.log(error);
}
};
initData.ts
let dbName: string = 'initialData';
if (process.env.NODE_ENV === 'test') {
dbName = 'testDb';
}
const uri = `${process.env.MONGODB_URI}/?maxPoolSize=20&w=majority`;
export async function getInitialData() {
const client = new MongoClient(uri);
try {
await client.connect();
const database = client.db(dbName);
const configCursor = database
.collection('config')
.find({}, { projection: { _id: 0 } });
const config = await configCursor.toArray();
const aaoCursor = database
.collection('aao')
.find({}, { projection: { _id: 0 } });
const aao = await aaoCursor.toArray();
return { config, aao };
} catch (err) {
// log the error instead of silently swallowing it
console.log(err);
} finally {
await client.close();
}
}
I'm using this data in another file by importing it there.
missionCreateHandler
import { APP_DATA } from '../index';
export const addMissionResources = (
alarmKeyword: AlarmKeyword,
newMission: MissionDocument
) => {
const alarmKeywordObject = APP_DATA?.aao.find(
(el) => Object.keys(el)[0] === alarmKeyword
);
const resourceCommand = Object.values(alarmKeywordObject!);
resourceCommand.forEach((el) => {
Object.entries(el).forEach(([key, value]) => {
for (let ii = 1; ii <= value; ii++) {
newMission.resources?.push({
initialType: key,
status: 'unarranged',
});
}
});
});
};
I'm setting up mongodb-memory-server in globalSetup.ts for Jest and copying the relevant data into the database from JSON files.
globalSetup.ts
export = async function globalSetup() {
const instance = await MongoMemoryServer.create({
instance: { dbName: 'testDb' },
});
const uri = instance.getUri();
(global as any).__MONGOINSTANCE = instance;
process.env.MONGODB_URI = uri.slice(0, uri.lastIndexOf('/'));
process.env.JWT_SECRET = 'testSECRET';
const client = new MongoClient(
`${process.env.MONGODB_URI}/?maxPoolSize=20&w=majority`
);
try {
await client.connect();
const database = client.db('testDb');
await database.createCollection('aao');
// @ts-ignore
await database.collection('aao').insertMany(aao['default']);
} catch (error) {
console.log(error);
} finally {
await client.close();
}
};
missionCreateHandler.test.ts
test('it adds the correct mission resources to the array', async () => {
const newMission = await Mission.create({
address: {
street: 'test',
houseNr: 23,
},
alarmKeyword: 'R1',
});
const expected = {
initialType: 'rtw',
status: 'unarranged',
};
addMissionResources('R1', newMission);
expect(newMission.resources[0].initialType).toEqual(expected.initialType);
expect(newMission.resources[0].status).toEqual(expected.status);
});
When running the test, I get a 'TypeError: Cannot convert undefined or null to object at Function.values ()'. So it seems that the APP_DATA object is not set. I checked that the mongodb-memory-server is set up correctly and fed with the needed data.
When I hardcode the content of APP_DATA in index.ts, the test runs without problems.
So my questions are: What is the best practice for setting up initial data in a Node.js app, and where should it be stored (a global object, or a simple variable imported in the files where needed)? How can the test run successfully, or is my code just untestable?
Thank you!
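One way to get APP_DATA populated for the test is to call the exported initializeAppData() before the tests run, for example in a beforeAll hook. A minimal sketch (the import path is an assumption about your folder layout; in the test environment the NODE_ENV check in index.ts already skips mongoose and the server):
import { initializeAppData } from '../index';

beforeAll(async () => {
  // fills APP_DATA from the in-memory MongoDB seeded in globalSetup.ts
  await initializeAppData();
});
This keeps the global-object approach but makes the moment of initialization explicit, which is usually what makes module-level state hard to test.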
I have a page with a list of story objects that displays all my stories from an array. I also have a detail page that displays an individual story.
I want to click a link on any given story in the list and have it navigate to that individual story. I want to use _id as the dynamic part of the URL, as shown in the GraphQL below.
My Graphql
export const listAllStories = () => {
const query = gql`
query StoryEntries($size: Int) {
storyEntries(_size: $size) {
data {
_id
_ts
name
premises{
data{
_id
content
}
}
createdAt
}
}
}
`
return graphQLClient
.request(query, { size: 999 })
.then(({ storyEntries: { data } }) => data)
}
IN MY PAGES API I HAVE
export default async function handler(req, res) {
const handlers = {
GET: async () => {
const storyEntries = await listAllStories()
res.json(storyEntries)
},
}
if (!handlers[req.method]) {
return res.status(405).end()
}
await handlers[req.method]()
}
ON THE STORY LIST PAGE I HAVE
const ENTRIES_PATH = '/api/entries/allStories'
const useEntriesFlow = ({ initialEntries }) => {
const { data: entries } = useSWR(ENTRIES_PATH, {
initialData: initialEntries,
})
const EntryItem = ({ entry }) => (
<>
{entries?.map((entry) => (
<div key={entry._id}>
{entry.name}
<Link href="/story/[storyId]" as={`/story/${entry._id}`}>
<a>Go</a>
</Link>
</div>
))}
</>
)
export const getStaticProps = async () => ({
props: {
initialEntries: await listAllStories(),
},
revalidate: 1,
})
This is fine and works.
AND THEN ON THE DETAIL PAGE FOR EACH INDIVIDUAL STORY [storyId].js I HAVE
export default function Story({story}) {
const router = useRouter()
const storyId = router.query.storyId
return(
<>
<h5>hello {story._id}</h5>
</>
)
}
export const getStaticPaths = async () => {
const res = await fetch(`${server}/api/entries/allStories/`);
const { data } = await res.json();
const paths = data.map(story => {
return {
params: { id: story._id.toString() }
}
// trying to get the _id from each story
})
return {
paths,
fallback: false
}
}
export const getStaticProps = async (context) => {
const { storyId } = context.query; // Your dynamic page is [storyId].js
const server = "http://localhost:3000";
const res = await fetch(`${server}/api/entries/allStories/${storyId}`);
// trying to get the params._id from each story
console.log(res)
const { data } = await res.json();
return {
props: { story: data }
}
}
ERROR
TypeError: Cannot read properties of undefined (reading 'map')
QUESTION
All I want to do is click on any story link, then it takes me to the details page, via the _id. I have tried a few things but I'm doing something (or some things) wrong.
Any help will be greatly appreciated.
EDIT AFTER: THE ERROR I'M GETTING. I'm not able to map my results in getStaticPaths.
export const getStaticProps = async (context) => {
const { storyId } = context.query; // Your dynamic page is [storyId].js
const server = "YOUR SERVER VARIABLE";
const res = await fetch(`${server}/api/entries/allStories/${storyId}`);
// trying to get the params._id from each story
const { data } = await res.json();
return {
props: { story: data }
}
}
Uncomment these lines:
const router = useRouter()
const storyId = router.query.storyId
// some helpful links
// https://nextjs.org/docs/basic-features/data-fetching#the-paths-key-required
// https://stackoverflow.com/questions/65783199/error-getstaticpaths-is-required-for-dynamic-ssg-pages-and-is-missing-for-xxx
export const getStaticPaths = async () => {
const server = "http://localhost:3000";
const data = await fetch(`${server}/api/entries/allStories/`).then(res => res.json() )
const paths = data.map(({_id}) => ({
params: { storyId: _id },
}))
return {
paths,
fallback: false
}
}
export const getStaticProps = async (context) => {
const storyId = context.params.storyId; // Your dynamic page is [storyId].js
const server = "http://localhost:3000";
// const res = await fetch(`${server}/api/entries/allStories/${storyId}`);
// trying to get the params._id from each story
// single api call (here)
const res = await fetch(`${server}/api/entries/allStories/`);
// removing const { data } because the data will be returned when calling res.json()
const data = await res.json();
// instead of the calling the single api (just a fix not recommended to access [0] directly )
return {
props: { story: data.filter(story => story._id === storyId)[0] }
}
}
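As the comments above hint, a cleaner long-term fix than fetching the whole list and filtering in getStaticProps would be a dedicated single-story API route. A minimal sketch; the file location and the import path for listAllStories are assumptions about your project layout:
// pages/api/entries/allStories/[storyId].js (hypothetical route matching the URL used above)
import { listAllStories } from '../../../../graphql/api' // adjust to wherever listAllStories lives

export default async function handler(req, res) {
  if (req.method !== 'GET') return res.status(405).end()
  const stories = await listAllStories()
  const story = stories.find((s) => s._id === req.query.storyId)
  if (!story) return res.status(404).end()
  res.json(story)
}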
acquisitions table

id  acquisition_id  acquiring_id  price
1   c:11            c:4           1000000000
2   c:134           c:4           2300000000

objects table

id     name
c:11   Instagram
c:134  Oculus VR
c:4    Facebook

results table

parent company    acquired startup    price
Facebook          Instagram           1000000000
Facebook          Oculus VR           2300000000
I have a types.ts file which looks like this:
export interface Acquisition {
id: string
acquisition_id: string
acquiring_id: string
price: string
}
export interface Acquisition2 {
id: string
acquisition_id: string
parent_company: string
price: string
}
export interface Object {
id: string
name: string
}
And my index.ts file looks like:
const fs = require('fs')
const fastcsv = require('fast-csv')
const neatCsv = require('neat-csv')
import { Acquisition, Acquisition2, Object } from './types'
const resultsCSV: Array<{
parent_company?: string
acquired_startup?: string
price?: string
}> = []
const writeResultsCSV = (csv: typeof resultsCSV) => {
fastcsv
.write(csv, { headers: true })
.pipe(fs.createWriteStream('./output/acquisitions.csv')) // create `output` folder manually
}
const firstRun = async () => {
const acquisitions: Acquisition[] = await neatCsv(
fs.createReadStream('./sample/acquisitions.csv')
)
const objects: Object[] = await neatCsv(
fs.createReadStream('./sample/objects.csv')
)
for (let object of objects) {
for (let acquisition of acquisitions) {
if (object.id === acquisition.acquiring_id) {
const data = {
id: acquisition.id,
acquisition_id: acquisition.acquisition_id,
price: acquisition.price,
parent_company: object.name,
}
resultsCSV.push(data)
}
}
}
writeResultsCSV(resultsCSV)
}
const secondRun = async () => {
const acquisitions: Acquisition2[] = await neatCsv(
fs.createReadStream('./output/acquisitions.csv')
)
const objects: Object[] = await neatCsv(
fs.createReadStream('./sample/objects.csv')
)
for (let object of objects) {
for (let acquisition of acquisitions) {
if (object.id === acquisition.acquisition_id) {
const data = {
parent_company: acquisition.parent_company,
acquired_startup: object.name,
price: acquisition.price,
}
resultsCSV.push(data)
}
}
}
writeResultsCSV(resultsCSV)
}
const main = () => {
// firstRun()
secondRun()
}
main()
Now this works, but I have to comment out secondRun() on my first run and comment out firstRun() on my second run.
How can I do this in a single operation? This is probably very simple, but I have been banging my head against it for the last 2 days :(
Here's the GitHub repo if anyone wants to take a stab → https://github.com/deadcoder0904/combine-csv-startup-acquisitions-data
Someone sent a pull request on my repo and I found a simpler solution:
const fs = require('fs')
const fastcsv = require('fast-csv')
const neatCsv = require('neat-csv')
import { Acquisition, Object } from './types'
const resultsCSV: Array<{
parent_company?: string
acquired_startup?: string
price?: string
}> = []
const writeResultsCSV = (csv: typeof resultsCSV) => {
fastcsv
.write(csv, { headers: true })
.pipe(fs.createWriteStream('./output/acquisitions.csv')) // create `output` folder manually
}
const main = async () => {
const acquisitions: Acquisition[] = await neatCsv(
fs.createReadStream('./sample/acquisitions.csv')
)
const objects: Object[] = await neatCsv(
fs.createReadStream('./sample/objects.csv')
)
acquisitions.forEach((acquisition) => {
const parent_company = objects.find((o) => o.id == acquisition.acquiring_id)
const acquired_startup = objects.find(
(o) => o.id == acquisition.acquisition_id
)
const data = {
parent_company: parent_company.name,
acquired_startup: acquired_startup.name,
price: acquisition.price,
}
resultsCSV.push(data)
})
// write the CSV once, after all rows have been collected
writeResultsCSV(resultsCSV)
}
main()
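A small design note on the solution above: the two find calls scan the objects array once per acquisition. Building a Map keyed by id makes each lookup constant time; here is a sketch of main() with that change (it relies on the same requires, resultsCSV and writeResultsCSV defined above):
const main = async () => {
  const acquisitions: Acquisition[] = await neatCsv(
    fs.createReadStream('./sample/acquisitions.csv')
  )
  const objects: Object[] = await neatCsv(
    fs.createReadStream('./sample/objects.csv')
  )
  // index company names by id once, so each acquisition becomes a constant-time lookup
  const namesById = new Map(objects.map((o): [string, string] => [o.id, o.name]))
  acquisitions.forEach((acquisition) => {
    resultsCSV.push({
      parent_company: namesById.get(acquisition.acquiring_id),
      acquired_startup: namesById.get(acquisition.acquisition_id),
      price: acquisition.price,
    })
  })
  // write the CSV once, after all rows have been collected
  writeResultsCSV(resultsCSV)
}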
I have several JavaScript files in which I create enums. For example:
sources.enum.js
const enumUtils = require('../enum.utils');
const EmailAddressesSourceType = enumUtils.createEnum([
['DIRECTORY', 'directory'],
['FILE', 'file'],
['ARRAY', 'array']
]);
module.exports = { EmailAddressesSourceType };
The enum.utils.js file just does the simple job of creating an enum from an array:
class EnumUtils {
constructor() { }
// This method takes an array of [key, value] pairs and converts it into a frozen object (an enum-like object).
createEnum(mapItems) {
if (!mapItems || mapItems.length <= 0) {
throw new Error(`No array received: ${mapItems} (1000000)`);
}
const mapList = new Map([...mapItems]);
const symbolMap = {};
mapList.forEach((value, key) => { symbolMap[key] = value; });
return Object.freeze(symbolMap);
}
}
const enumUtils = new EnumUtils();
module.exports = enumUtils;
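For context, a quick usage example of the utility (the Color values here are made up for illustration, and the require path depends on where the calling file lives):
const enumUtils = require('./enum.utils');

const Color = enumUtils.createEnum([
  ['RED', 'red'],
  ['GREEN', 'green']
]);

console.log(Color.RED);              // 'red'
console.log(Object.isFrozen(Color)); // true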
Now, since I have 5-6 js files with enums, I want to avoid 'const enumUtils = require('../enum.utils');' in each of them and do it all together in an index.js file, something like this:
const { EmailAddressStatus, EmailAddressType, SendEmailStepName } = require('./files/emailAddress.enum');
const { Placeholder } = require('./files/placeholder.enum');
const { EmailAddressesSourceType } = require('./files/sources.enum');
const { Mode, Status, Method } = require('./files/system.enum');
const { StatusIcon, Color, ColorCode } = require('./files/text.enum');
const createEnum = (mapItems) => {
if (!mapItems || mapItems.length <= 0) {
throw new Error(`No array received: ${mapItems} (1000000)`);
}
const mapList = new Map([...mapItems]);
const symbolMap = {};
mapList.forEach((value, key) => { symbolMap[key] = value; });
return Object.freeze(symbolMap);
};
module.exports = {
createEnum(Color), createEnum(ColorCode), createEnum(EmailAddressStatus), createEnum(EmailAddressType), createEnum(EmailAddressesSourceType),
createEnum(Method), createEnum(Mode), createEnum(Placeholder), createEnum(SendEmailStepName), createEnum(Status), createEnum(StatusIcon)
};
But there is a compilation error in:
module.exports = {
createEnum(Color), createEnum(ColorCode), createEnum(EmailAddressStatus), createEnum(EmailAddressType), createEnum(EmailAddressesSourceType),
createEnum(Method), createEnum(Mode), createEnum(Placeholder), createEnum(SendEmailStepName), createEnum(Status), createEnum(StatusIcon)
};
My question is: is there a workaround that would let me remove the 'const enumUtils = require('../enum.utils');' line from each of the enum js files?
Thanks!
UPDATE 1
The error I'm getting is this:
The current state of the file (before I tried to refactor), which works OK:
index.js
const { EmailAddressStatus, EmailAddressType, SendEmailStepName } = require('./files/emailAddress.enum');
const { Placeholder } = require('./files/placeholder.enum');
const { EmailAddressesSourceType } = require('./files/sources.enum');
const { Mode, Status, Method } = require('./files/system.enum');
const { StatusIcon, Color, ColorCode } = require('./files/text.enum');
module.exports = {
Color, ColorCode, EmailAddressStatus, EmailAddressType, EmailAddressesSourceType,
Method, Mode, Placeholder, SendEmailStepName, Status, StatusIcon
};
guy-incognito solved the issue for me. Now it works like a charm. Thanks, man!
const { EmailAddressStatus, EmailAddressType, SendEmailStepName } = require('./files/emailAddress.enum');
const { Placeholder } = require('./files/placeholder.enum');
const { EmailAddressesSourceType } = require('./files/sources.enum');
const { Mode, Status, Method } = require('./files/system.enum');
const { StatusIcon, Color, ColorCode } = require('./files/text.enum');
const createEnum = (mapItems) => {
if (!mapItems || mapItems.length <= 0) {
throw new Error(`No array received: ${mapItems} (1000000)`);
}
const mapList = new Map([...mapItems]);
const symbolMap = {};
mapList.forEach((value, key) => { symbolMap[key] = value; });
return Object.freeze(symbolMap);
};
module.exports = {
Color: createEnum(Color),
ColorCode: createEnum(ColorCode),
EmailAddressStatus: createEnum(EmailAddressStatus),
EmailAddressType: createEnum(EmailAddressType),
EmailAddressesSourceType: createEnum(EmailAddressesSourceType),
Method: createEnum(Method),
Mode: createEnum(Mode),
Placeholder: createEnum(Placeholder),
SendEmailStepName: createEnum(SendEmailStepName),
Status: createEnum(Status),
StatusIcon: createEnum(StatusIcon)
};
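Note that for this index.js to make sense, each *.enum file presumably now exports the raw [key, value] arrays instead of calling createEnum itself. A minimal sketch of sources.enum.js under that assumption:
// sources.enum.js - now exports the raw pairs; index.js turns them into frozen enums
const EmailAddressesSourceType = [
  ['DIRECTORY', 'directory'],
  ['FILE', 'file'],
  ['ARRAY', 'array']
];

module.exports = { EmailAddressesSourceType };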