Why doesn't Node detect my recently created file - javascript

I have a Node.js script that subscribes to a notification service and runs a number of things when a push notification is received. However, the service sometimes sends multiple notifications for the same event, so to avoid duplicate work I made a basic semaphore to block the other tasks.
The problem is that Node still continues with execution despite the fact that I can see the file created on disk. I've tried a few different solutions, but I think the problem comes from my lack of experience with the JS execution model; there's something I don't know about how it works that prevents my solution from working. How do I fix this?
const fse = require('fs-extra');

// notification handler callback
function handleRequest(data) {
  try {
    var semaphore = fse.readJsonSync(__dirname + '/' + objectId);
    console.log('task already running, stopping');
    return;
  } catch (err) {
    // semaphore doesn't exist, ok to proceed
    console.log('starting new task');
    fse.writeJson(__dirname + '/' + objectId, {objectId: objectId})
      .then(stepOne).catch(rejectPromise)
      .then(resp => stepTwo(resp, data)).catch(rejectPromise)
      .then(resp => stepThree(resp, extra)).catch(rejectPromise)
      .then(resp => stepFour(resp, argument)).catch(rejectPromise)
      .then(sleep(20000))
      .then(resp => releaseLock(objectId))
      .catch(resp => rejectionHandler(resp));
  }
}

function releaseLock(objectId) {
  return fse.remove(__dirname + '/' + objectId);
}
Other things I've tried:
- Creating the file in a separate function that returns a promise: same outcome.
- Using the Sync method to write the file, but then I'm unable to chain the promises.
- Waiting synchronously after file creation: no effect.

There is no need to create an external file to maintain locks. You can do something like this instead, which will also give you a performance boost (fewer I/O operations).
const fse = require('fs-extra');

class NamedLocks {
  constructor() {
    this._pid = {};
  }

  acquire(pid) {
    if (this._pid[pid]) {
      // process is locked
      // handle it
      return Promise.reject();
    }
    this._pid[pid] = true;
    return Promise.resolve();
  }

  release(pid) {
    delete this._pid[pid];
  }
}

const userLocks = new NamedLocks();

// notification handler callback
function handleRequest(data) {
  userLocks.acquire(objectId)
    .then(() => {
      // lock acquired, ok to proceed
      console.log('starting new task');
      return fse.writeJson(__dirname + '/' + objectId, { objectId: objectId })
        .then(stepOne).catch(rejectPromise)
        .then(resp => stepTwo(resp, data)).catch(rejectPromise)
        .then(resp => stepThree(resp, extra)).catch(rejectPromise)
        .then(resp => stepFour(resp, argument)).catch(rejectPromise)
        .then(sleep(20000))
        .then(resp => userLocks.release(objectId))
        .catch(resp => rejectionHandler(resp));
    })
    .catch(() => {
      // handle lock-exists condition here
    });
}
With this approach you simply ask for a lock: if the lock already exists, handle that in the catch handler; otherwise do your work and release the lock when you are done.
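As an aside on why the original file-based check does not stop the duplicates: fse.writeJson is asynchronous, so a second notification that arrives before the write has completed will not find the file yet. If an on-disk lock is still wanted (for example to survive a process restart), a minimal sketch of an atomic variant using fs.openSync with the 'wx' flag might look like this (the helper names are illustrative, not part of the original code):

const fs = require('fs');
const path = require('path');

// Try to create the lock file atomically; the 'wx' flag makes open fail
// with EEXIST if the file already exists, so check-and-create is one step.
function tryAcquireFileLock(objectId) {
  const lockPath = path.join(__dirname, String(objectId));
  try {
    fs.closeSync(fs.openSync(lockPath, 'wx'));
    return true;                              // lock acquired
  } catch (err) {
    if (err.code === 'EEXIST') return false;  // another task holds the lock
    throw err;                                // unexpected error
  }
}

function releaseFileLock(objectId) {
  fs.unlinkSync(path.join(__dirname, String(objectId)));
}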

Related

Can I build a WebWorker that executes arbitrary Javascript code?

I'd like to build a layer of abstraction over the WebWorker API that would allow (1) executing an arbitrary function over a webworker, and (2) wrapping the interaction in a Promise. At a high level, this would look something like this:
function bake() {
  ... // expensive calculation
  return 'mmmm, pizza'
}

async function handlePizzaButtonClick() {
  const pizza = await workIt(bake)
  eat(pizza)
}
(Obviously, methods with arguments could be added without much difficulty.)
My first cut at workIt looks like this:
async function workIt<T>(f: () => T): Promise<T> {
  const worker: Worker = new Worker('./unicorn.js') // no such worker, yet
  worker.postMessage(f)
  return new Promise<T>((resolve, reject) => {
    worker.onmessage = ({data}: MessageEvent) => resolve(data)
    worker.onerror = ({error}: ErrorEvent) => reject(error)
  })
}
This fails because functions are not structured-cloneable and thus can't be passed in worker messages. (The Promise wrapper part works fine.)
There are various options for serializing Javascript functions, some scarier than others. But before I go that route, am I missing something here? Is there another way to leverage a WebWorker (or anything that executes in a separate thread) to run arbitrary Javascript?
I thought an example would be useful in addition to my comment, so here's a basic (no error handling, etc.), self-contained example which loads the worker from an object URL:
Meta: I'm not posting it in a runnable code snippet view because the rendered iframe runs at a different origin (https://stacksnippets.net at the time I write this answer; see the snippet output), which prevents it from succeeding: in Chrome, I receive the error message "Refused to cross-origin redirects of the top-level worker script."
Anyway, you can just copy the text contents, paste it into your dev tools JS console right on this page, and execute it to see that it works. And, of course, it will work in a normal module in a same-origin context.
console.log(new URL(window.location.href).origin);

// Example candidate function:
// - pure
// - uses only syntax which is legal in worker module scope
async function get100LesserRandoms () {
  // If `getRandomAsync` were defined outside the function,
  // then this function would no longer be pure (it would be a closure)
  // and `getRandomAsync` would need to be a function accessible from
  // the scope of the `message` event handler within the worker
  // else a `ReferenceError` would be thrown upon invocation
  const getRandomAsync = () => Promise.resolve(Math.random());
  const result = [];
  while (result.length < 100) {
    const n = await getRandomAsync();
    if (n < 0.5) result.push(n);
  }
  return result;
}

const workerModuleText =
  `self.addEventListener('message', async ({data: {id, fn}}) => self.postMessage({id, value: await eval(\`(\${fn})\`)()}));`;

const workerModuleSpecifier = URL.createObjectURL(
  new Blob([workerModuleText], {type: 'text/javascript'}),
);

const worker = new Worker(workerModuleSpecifier, {type: 'module'});

worker.addEventListener('message', ({data: {id, value}}) => {
  worker.dispatchEvent(new CustomEvent(id, {detail: value}));
});

function notOnMyThread (fn) {
  return new Promise(resolve => {
    const id = window.crypto.randomUUID();
    worker.addEventListener(id, ({detail}) => resolve(detail), {once: true});
    worker.postMessage({id, fn: fn.toString()});
  });
}

async function main () {
  const lesserRandoms = await notOnMyThread(get100LesserRandoms);
  console.log(lesserRandoms);
}

main();

How to execute promises in order?

I can't make my code run in order. I need the connection test to come first, and then the functions to be resolved in order so they form a text string that will be sent in a tweet with an NPM package. (This is not my real code, it is a simplified example.)
I've tried many things and my brain is on fire
// Test DB connection
db.authenticate()
  .then(() => {
    const server = http.createServer(app)
    server.listen(config.port, () => {
      console.log(`http://localhost:${config.port}`)
    })
    reload(app)
  })
  .catch(err => {
    console.log(`Error: ${err}`)
  })

// Functions
resumen.man = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is good.'
    })
}

resumen.man1 = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is bad.'
    })
}

resumen.man2 = (numRoom) => {
  const registries = Registries.findOne({})
    .then((registries) => {
      return registries.name + ' is big.'
    })
}
// Execute resumen.man(1) first and save text in $varStringMultiLine ?
// Execute resumen.man1(1) later and save text in the same $varStringMultiLine ?
// Execute resumen.man2(1) last and save text in the same $varStringMultiLine ?
sendTweet($varStringMultiLine)
Thanx.
As commented by @Barmar and @some, you could chain the promises with .then or use async / await. I would recommend the latter, since .then chaining gets unwieldy fast.
This is a really good explanation for async / await: https://javascript.info/async-await
Basically, you can use
await db.authenticate();
to halt the code and not execute the next line before the promise is resolved. However, so that you don't freeze the whole execution, this itself needs to happen inside an async function, which runs asynchronously and returns a promise.
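A minimal sketch of that idea, reusing the names from the question and assuming each resumen.man* function is changed to return its promise (i.e. return Registries.findOne({}).then(...)):

async function buildAndSendTweet(numRoom) {
  // 1. Wait for the DB connection test before anything else
  await db.authenticate();

  // 2. Resolve each function in order, appending to the same string
  let text = '';
  text += await resumen.man(numRoom);
  text += '\n' + await resumen.man1(numRoom);
  text += '\n' + await resumen.man2(numRoom);

  // 3. Send the tweet only after all three lines are ready
  await sendTweet(text);
}

buildAndSendTweet(1).catch(err => console.log(`Error: ${err}`));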

How to remove event emitter from particular function node js

I have a common event emitter.
var events = require('events');
var eventEmitter = new events.EventEmitter();
which emits events like pause, resume, cancel.
I listen to these events in my function, but this function is called inside a for loop.
let func = () => {
  // some action runs async;
  eventEmitter.on("pause", () => {
    // some action;
  });
  eventEmitter.on("resume", () => {
    // some action;
  });
  eventEmitter.on("cancel", () => {
    // some action;
  });
  return 0;
}

for (let i = 0; i < anyNumber; i++) {
  func();
}
EDIT: my real intention is to read files in a directory recursively and upload them to an S3 bucket; since there is no official method to upload a whole directory, I achieved it this way.
The for loop mentioned above is actually a fs.readdir; for the sake of simplicity I described it as a for loop.
In func() I have the S3 upload function (multipart upload). When the pause button is clicked I need to pause the upload (which means letting the part that is already uploading finish, and stopping the next part from being uploaded).
Resume means the part uploads continue, and cancel means I cancel the multipart upload.
This is my exact case.
const readdirp = require('readdirp');

readdirp('.', {fileFilter: '*.js', alwaysStat: true})
  .on('data', (entry) => {
    const {path, stats: {size}} = entry;
    s3Fileupload(path)
  })
  .on('warn', error => console.error('non-fatal error', error))
  .on('error', error => console.error('fatal error', error))
  .on('end', () => console.log('done'));
Can you help me out now?
EDIT 1:
let func = () => {
  let stream = es.map((data, next) => {
    queue.defer(function(details, done) {
      _this.s3MultiUpload(JSON.parse(details), options, done, details, next);
    }, data);
  });
}

let stream = readdirp(path)
stream.pipe(this.func());
Maybe the d3-queue I am using here is causing the memory leak? I am pushing functions onto it the whole time while reading the directory.
To remove a listener, call eventEmitter.removeListener(event, listener).
You need to keep a reference to every listener you attach.
Alternatively you can simply call eventEmitter.removeAllListeners() if the emitter is not used elsewhere.
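For reference, a minimal sketch of keeping such a reference around so the exact same function can be removed later:

const events = require('events');
const eventEmitter = new events.EventEmitter();

// Keep a reference to the handler; removeListener needs the same function object
const onPause = () => { /* some action */ };
eventEmitter.on('pause', onPause);

// ...later, once the events are no longer needed:
eventEmitter.removeListener('pause', onPause);
// or, if the emitter is not used elsewhere:
// eventEmitter.removeAllListeners();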
If I do this, I can't listen to that event after that right?
You are right. You need to wait till you no longer need the events, then remove them.
Ideally, you do not want to attach too many listeners.
Instead of increasing the limit, do all the work in a single event callback.
Here is how I would do it:
// Create an event emitter
const events = require('events');
const eventEmitter = new events.EventEmitter();

// Build a list of tasks to run
let tasks = [];
let tasksDone = 0;

for (let file of files) {
  // Each task can be paused, resumed, canceled
  let task = {
    pause: () => {/* TODO */},
    resume: () => {/* TODO */},
    cancel: () => {/* TODO */},
    start: async () => {
      // Do work
      // Send a signal when task is done
      eventEmitter.emit('done');
    }
  };
  tasks.push(task);
}

// Store the listeners
let listeners = [
  ['pause', () => {
    tasks.forEach(task => task.pause());
  }],
  ['resume', () => {
    tasks.forEach(task => task.resume());
  }],
  ['cancel', () => {
    tasks.forEach(task => task.cancel());
  }],
  ['done', () => {
    tasksDone++;
    if (tasksDone === tasks.length) {
      // All work done
      // Remove listeners
      listeners.forEach(([event, callback]) => {
        eventEmitter.removeListener(event, callback);
      });
    }
  }],
];

// Attach listeners
listeners.forEach(([event, callback]) => {
  eventEmitter.on(event, callback);
});

// Start tasks
tasks.forEach(task => task.start());
With all that said, your application may be crashing for other reasons.
If you open too many files at the same time, or use too much memory, your application can crash before the tasks are done. It goes without saying that you should also make sure the files are closed, and so on.
I would recommend starting by queuing the tasks and doing them one at a time.
If you need more throughput, write a scheduler to make sure you don't consume too many resources at a time.
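As a rough illustration of that recommendation, here is a minimal sketch of processing uploads one at a time (uploadFile and processSerially are illustrative names, not part of the original code):

// Process items strictly one at a time: each upload is awaited before
// the next one starts, which bounds open file handles and memory use.
async function processSerially(files, uploadFile) {
  const results = [];
  for (const file of files) {
    results.push(await uploadFile(file));
  }
  return results;
}

// Hypothetical usage with the s3Fileupload function from the question:
// processSerially(allFiles, (f) => s3Fileupload(f)).then(() => console.log('done'));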
Lastly, for a Node program, you can attach the Chrome debugger to find out why memory is not freed. If the issue persists, you can find out exactly what is holding onto memory.

Asynchronous React-Native

Problem: asynchronous code forces the whole source code around it to become asynchronous as well.
Example:
// global scope
let _variableDefinedInParentScope

WriteConfiguration(__params) {
  // destructuring.
  let { data, config } = __params
  // local variables.
  let _fileName, _configuration, _write, _read
  // variable assignment.
  _fileName = config
  _configuration = data
  // if dataset and fileName are not empty.
  if (!_.isEmpty(_configuration) && !_.isEmpty(_fileName)) {
    // create a path you want to write to
    // :warning: on iOS, you cannot write into `RNFS.MainBundlePath`,
    // but `RNFS.DocumentDirectoryPath` exists on both platforms and is writable
    _fileName = Fs.DocumentDirectoryPath + ' /' + _fileName;
    // get file data and return.
    return Fs.readDir(_fileName).then((__data) => {
      console.error(__data)
      // if data is not empty.
      if (!_.isEmpty(__data)) {
        // return data if found.
        return __data
      } else {
        // write the file
        return Fs.writeFile(_fileName, data, 'utf8')
          .then((success) => {
            // on successful file write.
            return success
          })
          .catch((err) => {
            // report failure.
            console.error(err.message);
          })
      }
    })
    .catch((err) => {
      // report failure.
      console.error(err.message)
    })
  }
} // write configuration to json file.
The following are the ways of promise handling I know:
.then((__onAccept) => {}, (__onReject) => {})
async function (__promise) { await WriteConfiguration() }
.then((__onAccept) => { _variableDefinedInParentScope = __onAccept })
As far as I know the third one is useless, since I never get the value back: resolving the promise takes time, and reading that variable before the promise has resolved returns undefined.
React-native
In React Native almost every part of the code is synchronous, while the file-writing modules are asynchronous, and this is causing trouble for me.
What I want
I want to return a value from asynchronous code to synchronous code, without any asynchronous chain.
Your answer is quite simple: use await and async.
EXAMPLE:
async mainFunction() {
  // will wait for asyncroFunction to finish!!
  await asyncroFunction()
}

async asyncroFunction() {
}
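Note that await only looks synchronous inside the async function; the caller still receives a promise, so the value never becomes available to purely synchronous code. A minimal sketch of consuming WriteConfiguration from the question (loadConfiguration and the argument values are illustrative, not part of the original code):

// Hypothetical caller; assumes WriteConfiguration returns the file data as above.
async function loadConfiguration() {
  const configuration = await WriteConfiguration({
    data: { theme: 'dark' },   // placeholder data
    config: 'app.json'         // placeholder file name
  });
  // Here the resolved value can be used like a normal variable
  console.log(configuration);
  return configuration;
}

// Outside of an async function you are back to promises:
loadConfiguration().then((configuration) => {
  // use the configuration here
});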

How to tell when a stream with pipes finished draining

I have a stream that is composed of a chain of pipes.
I am using the event-stream package to create the building blocks of the pipes.
The code gets a file from S3, unzips it, parses it, and sends the data to some async function.
I am trying to get the promise resolved when it finished handling that file.
How can I be sure that the all the chain has finished draining?
My current solution looks like this.
It looks bad, and I still think there is a possibility that resolve() will be called while there are still data chunks in the gzReader, for example.
Thanks.
const inputStream = this.s3client.getObject(params).createReadStream()

inputStream.on('end', () => {
  console.log("Finished handling file " + fileKey)
  let stopInterval = setInterval(() => {
    if (counter == 0) {
      resolve(this.eventsSent)
      clearInterval(stopInterval)
    }
  }, 300)
})

const gzReader = zlib.createGunzip();

inputStream
  .pipe(gzReader)
  .pipe(es.split())
  .pipe(es.parse())
  .pipe(es.mapSync(data => {
    counter++
    this.eventsSent.add(data.data)
    asyncFunc(this.destinationStream, data.data)
      .then(() => {
        counter--
      })
      .catch((e) => {
        counter--
        console.error('Failed sending event ' + data.data + e)
      })
  }))
Because you never initialize counter, it is zero, and after the first 300 ms your function resolves (which can happen before your pipes have started working and increased the counter).
So don't use setInterval ;) You don't need it.
Also there is no need to use mapSync if you already call an async function in it. Just use map and pass the data and the callback (https://github.com/dominictarr/event-stream#map-asyncfunction). Don't forget to call the callback in your async function!
Add a last step in your pipe: wait(callback) (https://github.com/dominictarr/event-stream#wait-callback)
There you can resolve.
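A rough sketch of what that rewrite could look like, wrapped in a promise (handleFile is an illustrative name; asyncFunc, es, and the streams are taken from the question):

const zlib = require('zlib');
const es = require('event-stream');

function handleFile(inputStream, destinationStream, eventsSent) {
  return new Promise((resolve, reject) => {
    inputStream
      .pipe(zlib.createGunzip())
      .pipe(es.split())
      .pipe(es.parse())
      .pipe(es.map((data, callback) => {
        eventsSent.add(data.data)
        asyncFunc(destinationStream, data.data)
          .then(() => callback(null, data))   // signal that this chunk is done
          .catch((e) => {
            console.error('Failed sending event ' + data.data + e)
            callback(null, data)              // keep the stream flowing despite the error
          })
      }))
      .pipe(es.wait((err) => err ? reject(err) : resolve(eventsSent)))
  })
}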
