I ran into a problem which I can't solve.
I'm making an app where, on the first page, I need to choose one of two machines. There are two buttons on the page, and when one of them is clicked I make a POST to /machineChoose, passing the id of the selected machine. Then I need to change the config.js file, which holds all the params needed for the rest of the app.
const config = {
  machineName: "Machine",
  ...
So in my code I need to change machineName. Right now I use the fs module to read and then write the file, but the problem is that I can't change the name more than once. After restarting the app I'm able to change the name once, but when I then try to choose the second machine, nothing happens.
router.post("/machineChoose", async (req, res) => {
console.log(req.body.machineChoose);
if (req.body.machineChoose == 1) {
machineX = "Machine1";
} else {
machineX = "Machine2";
}
console.log(machineX);
fs.readFile('./config.js', 'utf-8', function (err,data){
if (err){
console.log(err);
}
var result = data.replace(config.machineName,machineX);
fs.writeFileSync('./config.js', result, 'utf-8', function(err){
if (err) return console.log(err);
});
});
return res.send("")
})
Any idea how to solve it?
After writing to the file you need to reload the config object: the in-memory config still holds the previous state, so further calls to data.replace(config.machineName, machineX) find nothing to replace, because config.machineName is still "Machine".
I would do something like this (although you should consider using a real database):
router.post("/machineChoose", async (req, res) => {
const chosenMachine = req.body.machineChoose == 1 ? "Machine1" : "Machine2";
const config = await readConfig();
config.machineName = chosenMachine;
await writeConfig(config);
res.status(204).end();
});
async function writeConfig(currentConfig) {
try {
await fs.promises.writeFile("./config.json", JSON.stringify(currentConfig));
} catch (e) {
console.log("Could not write config file", e)
throw e;
}
}
async function readConfig() {
try {
const rawConfig = await fs.promises.readFile("./config.json", {encoding: 'utf-8'});
return JSON.parse(rawConfig);
} catch (e) {
console.log("Could not read config file", e)
throw e;
}
}
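To make the rest of the app pick up the new value, read config.json wherever it is needed instead of importing a static config.js once at startup. A small sketch of a hypothetical route reusing readConfig from above (the /status endpoint is only an illustration):

router.get("/status", async (req, res) => {
  // re-read the file on each request so the latest machine choice is used
  const config = await readConfig();
  res.send(`Currently selected: ${config.machineName}`);
});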
I need to load and interpret Parquet files from an S3 bucket using node.js. I've already tried parquetjs-lite and other npm libraries I could find, but none of them seems to interpret date-time fields correctly. So I'm trying AWS's own SDK instead, in the belief that it should be able to deserialize its own Parquet format correctly -- the objects were originally written from SageMaker.
The way to go about it, apparently, is to use the JS version of
https://docs.aws.amazon.com/AmazonS3/latest/API/API_SelectObjectContent.html
but the documentation for that is horrifically out of date (it's referring to the 2006 API, https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#selectObjectContent-property). Likewise, the example they show in their blog post doesn't work either (data.Payload is neither a ReadableStream nor iterable).
I've already tried the responses in
Javascript - Read parquet data (with snappy compression) from AWS s3 bucket. Neither of them works: the first uses
node-parquet, which doesn't currently compile, and the second uses parquetjs-lite (which doesn't work, see above).
So my question is, how is SelectObjectContent supposed to work nowadays, i.e., using aws-sdk v3?
import {
  S3Client, ListBucketsCommand, GetObjectCommand,
  SelectObjectContentCommand
} from "@aws-sdk/client-s3";

const REGION = "us-west-2";
const s3Client = new S3Client({ region: REGION });

const params = {
  Bucket: "my-bucket-name",
  Key: "mykey",
  ExpressionType: 'SQL',
  Expression: 'SELECT created_at FROM S3Object',
  InputSerialization: {
    Parquet: {}
  },
  OutputSerialization: {
    CSV: {}
  }
};
const run = async () => {
  try {
    const data = await s3Client.send(new SelectObjectContentCommand(params));
    console.log("Success", data);
    const eventStream = data.Payload;
    // Read events as they are available
    eventStream.on('data', (event) => { // <--- This fails
      if (event.Records) {
        // event.Records.Payload is a buffer containing
        // a single record, partial records, or multiple records
        process.stdout.write(event.Records.Payload.toString());
      } else if (event.Stats) {
        console.log(`Processed ${event.Stats.Details.BytesProcessed} bytes`);
      } else if (event.End) {
        console.log('SelectObjectContent completed');
      }
    });
    // Handle errors encountered during the API call
    eventStream.on('error', (err) => {
      switch (err.name) {
        // Check against specific error codes that need custom handling
      }
    });
    eventStream.on('end', () => {
      // Finished receiving events from S3
    });
  } catch (err) {
    console.log("Error", err);
  }
};
run();
The console.log shows data.Payload as:
Payload: {
[Symbol(Symbol.asyncIterator)]: [AsyncGeneratorFunction: [Symbol.asyncIterator]]
}
What should I do with that?
I was stuck on this exact same issue for quite some time. It looks like the best option now is to append a .promise() to it.
So far, I've made progress using the following (sorry, this is incomplete but should at least enable you to read data):
// Wrapped in a function here (the name is only illustrative) so that `records`
// and the trailing return have a home; this uses the v2-style client.
async function selectRecords(s3, params3) {
  const records = [];
  try {
    const s3Data = await s3.selectObjectContent(params3).promise();
    // using 'any' here temporarily, but will need to address type issues
    const events: any = s3Data.Payload;
    for await (const event of events) {
      try {
        if (event?.Records) {
          if (event?.Records?.Payload) {
            const record = decodeURIComponent(event.Records.Payload.toString().replace(/\+|\t/g, ' '));
            records.push(record);
          } else {
            console.log('skipped event, payload: ', event?.Records?.Payload);
          }
        } else if (event.Stats) {
          console.log(`Processed ${event.Stats.Details.BytesProcessed} bytes`);
        } else if (event.End) {
          console.log('SelectObjectContent completed');
        }
      } catch (err) {
        if (err instanceof TypeError) {
          console.log('error in events: ', err);
          throw err;
        }
      }
    }
  } catch (err) {
    console.log('error fetching data: ', err);
    throw err;
  }
  console.log("final records: ", records);
  return records;
}
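To answer the aws-sdk v3 part of the question directly: in v3 there is no .promise() and data.Payload is not an event emitter; it is an async iterable (that is what the Symbol.asyncIterator in the logged output means), so it can be consumed with for await...of. A rough sketch reusing s3Client, params and SelectObjectContentCommand from the question (untested against your bucket, so treat it as a starting point):

const run = async () => {
  const data = await s3Client.send(new SelectObjectContentCommand(params));
  let csv = "";
  // data.Payload is an AsyncIterable of events, not a stream with .on()
  for await (const event of data.Payload) {
    if (event.Records?.Payload) {
      // Records.Payload is a Uint8Array chunk of the CSV output
      csv += Buffer.from(event.Records.Payload).toString("utf-8");
    } else if (event.Stats) {
      console.log(`Processed ${event.Stats.Details.BytesProcessed} bytes`);
    } else if (event.End) {
      console.log("SelectObjectContent completed");
    }
  }
  console.log(csv);
};
run().catch(console.error);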
I've been making a project recently and I basically need to check for new text in a text file.
My code was this:
const fs = require('fs');

fs.watch('./file.txt', (event, filename) => {
  fs.readFile('./file.txt', (err, data) => {
    if (err) throw err;
    data = JSON.parse(data);
    console.log(data);
  });
});
It worked great. However, sometimes I must delete this file for whatever reason, and then my code crashes too!
Any idea on how to handle this? Thank you for your answers.
Node's built-in fs module doesn't support detecting file deletion very well. There is a workaround using a package called nsfw, which wraps a native library and provides much better support for deletion detection.
The API is a bit odd but it is a solid package nonetheless.
Here is an example of what you're attempting to do using nsfw.
const nsfw = require("nsfw");
const path = require("path");
const fs = require("fs");

const file = path.join(__dirname, "file.txt");
let watcher;

nsfw(
  file,
  ([event, ...restEvents]) => {
    switch (event.action) {
      case nsfw.actions.DELETED: {
        watcher.stop();
        return; // or handle this however you need to..
      }
      default: {
        fs.readFile(file, (err, data) => {
          if (err) throw err;
          try {
            data = JSON.parse(data);
            console.log(data);
          } catch (error) {
            console.error(error);
          }
        });
      }
    }
  }
)
  .then((w) => {
    watcher = w;
    watcher.start();
  });
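If you would rather stick with the built-in fs module, you can at least stop the crash by guarding the read. The caveat (and the reason nsfw is nicer) is that on most platforms the fs.watch watcher goes stale once the watched file is deleted, so you would still need to re-register it when the file reappears. A rough sketch along those lines:

const fs = require('fs');

fs.watch('./file.txt', (eventType) => {
  // The file may have been deleted between the event and the read.
  if (!fs.existsSync('./file.txt')) {
    console.log('file.txt was removed, skipping read');
    return;
  }
  fs.readFile('./file.txt', (err, data) => {
    if (err) return console.error(err); // e.g. ENOENT if deleted mid-read
    try {
      console.log(JSON.parse(data));
    } catch (parseErr) {
      console.error(parseErr); // file may be mid-write / not valid JSON yet
    }
  });
});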
I'm trying to set up a NodeJS application with a GraphiQL API and a MySQL database connection. All of it seems to work until I try to make the data fetched from the database available to GraphQL so it can do something with it.
Here we have the app.js file, which is the starting point of the backend. Assume that all the imports and declarations are valid.
app.use('/api', graphql({
  schema: buildSchema(schema),
  rootValue: resolvers,
  graphiql: true
}));
The rootValue is as follows.
const resolvers = {
  regions: () => {
    var a = manager.getRegions();
    console.log("a: " + a);
    return a;
  }
};
The manager object (I keep it in case I want to change the database type in the future):
const manager = {
  getRegions: function () {
    console.log("getRegions entered...");
    return processQuery("regions");
  }
};
Finally, in the MySQL script we have:
const processQuery = (query) => {
  var res = null;
  switch (query) {
    case 'regions':
    default:
      db.query(SELECT_REGIONS_QUERY, (err, rows) => {
        if (err) {
          throw err;
        } else {
          res = JSON.stringify(rows);
          console.log("Stringify: " + res);
        }
      });
  }
  return res;
}
I've read numerous pages and even Stack Overflow posts about Promises, callback functions and async/await, but none of them (at least in the code attempts I made) seem to make the printout under rootValue be printed last...
I saw an implementation by Academind that uses MongoDB instead, and he doesn't seem to have to care about this issue. Any ideas on how to solve this? Thank you!
What you can do is make processQuery asynchronous and actually wait for db.query to finish. Since db.query is callback-based, awaiting it directly has no effect; wrap the call in a Promise and await that instead.
const processQuery = async (query) => {
  switch (query) {
    case 'regions':
    default: {
      // db.query is callback-based, so wrap it in a Promise that await can actually wait on
      const rows = await new Promise((resolve, reject) => {
        db.query(SELECT_REGIONS_QUERY, (err, rows) => {
          if (err) {
            reject(err);
          } else {
            resolve(rows);
          }
        });
      });
      console.log("Stringify: " + JSON.stringify(rows));
      // return the rows themselves; GraphQL serializes them for the response
      return rows;
    }
  }
}
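Once processQuery (and therefore manager.getRegions) returns a Promise, the resolver can simply await it or return it directly; GraphQL resolves promises returned from rootValue functions before sending the response. A small sketch of the adjusted resolver from the question, with the log moved after the await so it prints last, as you expected:

const resolvers = {
  regions: async () => {
    const a = await manager.getRegions(); // wait for the MySQL rows
    console.log("a: " + JSON.stringify(a)); // printed after "Stringify: ..."
    return a;
  }
};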
I've been working on a Node project that involves fetching some data from BigQuery. Everything has been fine so far; I have my credential.json file (from BigQuery) and the project works as expected.
However, I want to implement a new feature in the project that involves fetching another set of data from BigQuery. I have an entirely different credential.json file for this new dataset, but my project seems to recognize only the initial credential.json file (even though I named them differently).
Here's a snippet of how I linked my first credential.json file:
function createCredentials() {
  try {
    const encodedCredentials = process.env.GOOGLE_AUTH_KEY;
    if (typeof encodedCredentials === 'string' && encodedCredentials.length > 0) {
      const google_auth = atob(encodedCredentials);
      if (!fs.existsSync('credentials.json')) {
        fs.writeFile("credentials.json", google_auth, function (err) {
          if (err) console.log(err);
          console.log("Successfully Written to File.");
        });
      }
    }
  } catch (error) {
    logger.warn(`Ensure that the environment variable for GOOGLE_AUTH_KEY is set correctly: the full error is given here: ${error.message}`);
    process.kill(process.pid, 'SIGTERM');
  }
}
Is there a way to fuse my two credential.json files together? If not, how can I separately declare which credential.json file to use?
If not, how can I separately declare which credential.json file to use?
What I would do is create a function which is the exit point to BigQuery, and pass it an identifier saying which credential to generate; that credential is then used when calling BigQuery.
The code below assumes you change this:
function createCredentials(){
  try{
    const encodedCredentials = process.env.GOOGLE_AUTH_KEY;
To this:
function createCredentials(auth){
  try{
    const encodedCredentials = auth;
And you can use it like this:
import BigQuery from '@google-cloud/bigquery';
import { GoogApi } from "../apiManager"; // Private code to get Token from client DB

if (!global._babelPolyfill) {
  var a = require("babel-polyfill");
}

describe('Check routing', async () => {
  it('Test stack ', async (done, auth) => {
    // Fetch client Auth from local Database
    // Replace the 2 values below with real values
    const tableName = "myTest";
    const dataset = "myDataset";
    try {
      const bigquery = new BigQuery({
        projectId: `myProject`,
        // createCredentials(auth) is assumed to write the credentials file
        // for this client and return its path
        keyFilename: createCredentials(auth)
      });
      await bigquery.createDataset(dataset)
        .then(args => {
          console.log(`Create dataset, result is: ${args}`);
        })
        .catch(err => {
          console.log(`Error in the process: ${err.message}`);
        });
    } catch (err) {
      console.log("err", err);
    }
  });
});
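Building on that idea, here is a minimal sketch of a createCredentials variant that writes one file per credential and returns its path, so two BigQuery clients can exist side by side. The file names and the second environment variable GOOGLE_AUTH_KEY_2 are assumptions for illustration; with the current @google-cloud/bigquery package you could also skip the file entirely and pass the decoded JSON via the client's credentials option.

import fs from 'fs';
import { BigQuery } from '@google-cloud/bigquery';

// Decode one base64-encoded service-account key and persist it under its own name.
function createCredentials(encodedCredentials, fileName) {
  const googleAuth = Buffer.from(encodedCredentials, 'base64').toString('utf-8');
  if (!fs.existsSync(fileName)) {
    fs.writeFileSync(fileName, googleAuth);
  }
  return fileName;
}

// Two independent clients, each pointing at its own key file.
const bigqueryA = new BigQuery({
  keyFilename: createCredentials(process.env.GOOGLE_AUTH_KEY, 'credentials-a.json'),
});
const bigqueryB = new BigQuery({
  keyFilename: createCredentials(process.env.GOOGLE_AUTH_KEY_2, 'credentials-b.json'),
});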
I am trying to read the contents of a specific path. For that purpose I used the following code:
code1:
const contentsOfPersonalFolder = fs.readdirSync(rootPathToPersonal);
but I know in advance that I do not have permission to read some of the contents that will be returned by the previous line of code.
To check whether or not I have permission to read a file, I would use the following code:
code2:
try {
  fs.accessSync(path, fs.constants.R_OK);
  logger.info('The directory: ', path, 'can be read');
} catch (err) {
  logger.error('The directory: ', path, 'cannot be read due to inaccessibility');
}
The problem now is that code1 returns an array of all the files available in the specified path, and if one of these files is not accessible due to read protection, the program will throw.
What I want to achieve is to iterate through all the files available in the path from code1, check each item using the code in code2, and then do some logic if the file is readable and something else if it is not.
Please let me know how to achieve that.
You could use fs.access to check the user's permissions:
https://nodejs.org/api/fs.html#fs_fs_access_path_mode_callback
const fs = require('fs');
const path = require('path');

const testFolder = './tests/';

fs.readdir(testFolder, (err, files) => {
  if (err) throw err;
  files.forEach(file => {
    console.log(file);
    // readdir returns bare file names, so join them with the folder path
    const filePath = path.join(testFolder, file);
    fs.access(filePath, fs.constants.R_OK, (err) => {
      if (err) {
        console.error("file is not readable");
        return;
      }
      // do your reading operations
    });
  });
});
const fs = require('fs');
const path = require('path');

const isAvailableToRead = file => {
  try {
    fs.accessSync(file, fs.constants.R_OK);
    return true;
  } catch (err) {
    return false;
  }
}

const readDirectory = dirPath => {
  const files = fs.readdirSync(dirPath);
  files.forEach(file => {
    // build the full path, since readdirSync returns names only
    const fullPath = path.join(dirPath, file);
    if (isAvailableToRead(fullPath)) {
      console.log(`Do some logic ${fullPath}`);
    }
  });
}

readDirectory(__dirname);
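If the surrounding code is already promise-based, the same check can be written with fs.promises; a sketch equivalent to the synchronous version above:

const fs = require('fs');
const path = require('path');

async function readDirectory(dirPath) {
  const files = await fs.promises.readdir(dirPath);
  for (const file of files) {
    const fullPath = path.join(dirPath, file);
    try {
      await fs.promises.access(fullPath, fs.constants.R_OK);
      console.log(`Do some logic ${fullPath}`); // readable
    } catch (err) {
      console.log(`Skipping unreadable file ${fullPath}`);
    }
  }
}

readDirectory(__dirname).catch(console.error);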