I am implementing a caching layer in Node.js with MongoDB, using Redis. I am fairly new to Redis, so I am having trouble trying to automatically clear the cache after a given time. The error I am getting:
ReplyError: ERR wrong number of arguments for 'hset' command
This is my code block
const exec = mongoose.Query.prototype.exec; // keep a reference to the original exec

mongoose.Query.prototype.exec = async function () {
  const key = JSON.stringify(
    Object.assign({}, this.getQuery(), { collection: this.mongooseCollection.name })
  );
  const cachedValue = await client.hget(this.hashKey, key);
  if (cachedValue) {
    const parsedDoc = JSON.parse(cachedValue);
    return Array.isArray(parsedDoc)
      ? parsedDoc.map((doc) => new this.model(doc))
      : new this.model(parsedDoc);
  }
  const result = await exec.apply(this, arguments);
  client.hset(this.hashKey, key, JSON.stringify(result), 'EX', 10);
  return result;
};
Redis HSET (before Redis 4.0) accepts exactly three arguments: the key, one field, and one value, so the extra 'EX', 10 arguments are what trigger the wrong-number-of-arguments error. Note that hash fields cannot carry their own TTL, so passing 'EX' to HSET or HMSET would at best be stored as just another field/value pair. Write the field, then expire the whole hash key instead.
Reference:
https://redis.io/commands/hset
https://redis.io/commands/expire
client.hset(this.hashKey, key, JSON.stringify(result));
client.expire(this.hashKey, 10);
should work.
I have a collection like this in my Firebase Realtime Database:
I need to delete the first element (the one that finishes with Wt6J) from server side using firebase-admin.
I have this simple code:
const deleteNotification = () => {
const key1 = 'FhN6Ntw8gyPLwJYVzcHy0E8Wq5z2';
const key2 = '-MzGhZ2psGLivIfTWt6J';
const notsRef = db.ref(`notifications/${key1}/${key2}`);
notsRef.remove();
};
This doesn't work. What method should I use to delete a specific field? How do you think I can do it?
I would use await in a try/catch block. await does not start another thread; it suspends the async function until the promise settles.
As people said above - your cloud function is likely being killed before the remove actually happens.
const deleteNotification = async () => {
  try {
    const key1 = 'FhN6Ntw8gyPLwJYVzcHy0E8Wq5z2';
    const key2 = '-MzGhZ2psGLivIfTWt6J';
    const notsRef = db.ref(`notifications/${key1}/${key2}`);
    await notsRef.remove();
    console.log('removed record successfully.'); // only log success once remove resolves
  } catch (err) {
    console.log('failed to remove record.');
    console.log(err);
  }
};
So I'm working on a project where I'm making a call to a database to retrieve the data stored there. This data comes back as an array. Here is the code:
const allLogins = await Login.find().sort("name");
const token = req.header("x-auth-token");
const user = jwt.verify(token, config.get("jwtPrivateKey"));
const logins = allLogins
.filter((login) => login.userId === user._id)
.map((login) => {
login.password = decrypt(login.password);
});
If I add a console.log after the decrypt has run, I can see that it completed correctly. The issue is that if I console.log(logins), it says it is an array of two items that are both undefined. If instead I run it like this...
const allLogins = await Login.find().sort("name");
const token = req.header("x-auth-token");
const user = jwt.verify(token, config.get("jwtPrivateKey"));
let logins = allLogins.filter((login) => login.userId === user._id);
logins.map((login) => {
login.password = decrypt(login.password);
});
Then it works as it should. I'm not sure why the first set of code doesn't work and why the second set does work.
Any help would be appreciated!
Basics:
array.filter - accepts a callback; the callback returns a boolean (whether the element matches our criteria)
array.map - accepts a callback; the callback returns the transformed element
In the second working example:
logins.map((login) => {
  // note: logins is iterated, but the result is not assigned back to logins,
  // so mutating login here still works
  login.password = decrypt(login.password); // map's callback should return the data
  + return login; // adding this return makes the code work
});
Now coming to the first example:
const logins = allLogins
.filter((login) => login.userId === user._id)
.map((login) => {
login.password = decrypt(login.password);
+ return login; // this will fix the issue
});
const { promisify } = require('util');
const index = require('../../../index');
//get caching data in redis
const redisGet = async(key) => {
const getAsync = promisify(index.clientRedis.get).bind(index.clientRedis);
const value = await getAsync(key);
return value;
}
module.exports = redisGet;
My "redisGet" function returns the right value the first time, but on later calls it only returns null, although the cached data should still exist.
const cachingData = await redisGet('key');
// first time: cachingData = <right value>
// later times: cachingData = null
How can I fix it?
Hope for solutions. Thanks all!
You might be storing the key with a TTL value.
After the TTL is up the key is expired and null is returned if you try to fetch the key.
ref: https://redis.io/commands/expire/
So I have been trying to make a system with "memory". I used a JSON file for this, but it never refreshes the file. I looked it up and found a function like this:
function requireUncached(module) {
delete require.cache[require.resolve(module)];
return require(module);
}
but they didn't show any syntax. Do I only put it at the top instead of const whatever = require(file)? Do I do it in every function it needs to be refreshed in? I have no idea. The reason I need this is so that it is all automatic and I don't have to do node . every time.
Warning: using a Proxy together with fs.writeFileSync() is slow.
Here's a convenient function that uses a Proxy to automatically write changes back to disk whenever the object in memory is updated:
const fs = require('fs');
const storage = (path, encoding = 'utf8', space = null) => {
const object = JSON.parse(fs.readFileSync(path, encoding));
let immediate = null;
const scheduleWriteFile = () => {
clearImmediate(immediate);
immediate = setImmediate(() => {
fs.writeFileSync(path, JSON.stringify(object, null, space), encoding);
});
};
const handler = {
get(target, property) {
const value = Reflect.get(target, property);
if (Object(value) === value) {
const descriptor = Reflect.getOwnPropertyDescriptor(target, property);
if (descriptor === undefined || descriptor.configurable || descriptor.writable) {
return new Proxy(value, handler);
}
}
return value;
},
...Object.fromEntries(['defineProperty', 'deleteProperty', 'set'].map(
(key) => [key, (...args) => {
const result = Reflect[key](...args);
if (result) {
scheduleWriteFile();
}
return result;
}]
))
};
return new Proxy(object, handler);
};
Example usage:
const settings = storage('config.json', 'utf8', 2);
...
// automatically schedules call to
// fs.writeFileSync('config.json', JSON.stringify(settings, null, 2), 'utf8')
// after updating object
settings.users[user.id].banned = true;
The advantage of using setImmediate() is that if multiple properties are updated synchronously on settings, it only schedules one call to fs.writeFileSync(), and it will occur after the event loop processes currently pending I/O events.
Because the proxy object recurses, you can treat settings exactly as a normal object, keep variable references to object or array properties, and read primitive values from it as usual.
The only restriction is that the JSON file must begin with { or [ in order for the object to be allowed as the target of a Proxy.
I am facing a problem cloning a mongoose query object. JavaScript copies one object into another by reference, but in my project there is a scenario where I need to copy one object into another by value.
var query=domain.User.find({
deleted: false,
role: role
})
var query1=query;
I have a scenario where a change in the query object must not be reflected in query1. I googled and tried so many ways to clone the object, but it doesn't work. The query object is used in another function for pagination, and the query1 object is used for a count query.
1. I tried Object.clone(query1), but got the error Object.clone is not a function.
2. I used Object.assign(query1), but it doesn't work either.
3. I tried many other ways. Can anybody help me solve this problem?
An alternative solution using the merge method:
const query = domain.User.find({
deleted: false,
role: role
}).skip(10).limit(10)
const countQuery = query.model.find().merge(query).skip(0).limit(0)
const [users, count] = await Promise.all([query, countQuery.count()])
You are trying to clone a cursor, but that is not the right approach; you probably just need to create another one,
like this:
var buildQuery = function() {
return domain.User.find({
deleted: false,
role: role
});
};
var query = buildQuery();
var query1 = buildQuery();
This works for me:
const qc = sourceQuery.toConstructor();
const clonedQuery = new qc();
This code works in a pagination function where sourceQuery is passed as a parameter and I don't know which model is used. It also works with aggregations and complex queries.
public async paging(
query: mongoose.DocumentQuery<mongoose.Document[], mongoose.Document>,
params,
transformer: any = null
) {
let page = Number(params.page);
if (!page) page = 1;
let page_size = Number(params.count);
if (!page_size) page_size = 100;
const qc = query.toConstructor();
const cq = new qc();
return cq.countDocuments().exec()
.then(async (total) => {
const s = params.sort;
if (s) {
query.sort(s);
}
query.limit(page_size);
query.skip(page_size * (page - 1));
let results = await query.exec();
if (transformer) {
results = await Promise.all(results.map((i) => transformer(i)));
}
const r = new DtoCollection();
r.pages = Math.ceil(total / page_size);
r.total = total;
(r.results as any) = results;
return r;
});
}
Sergii Stotskyi's answer works just fine and is very elegant, except that count is deprecated.
countDocuments or estimatedDocumentCount should be used instead.
However, this causes the error "the limit must be positive". We can work around this by setting the limit to a large integer.
const query = domain.User.find({
deleted: false,
role: role
}).skip(10).limit(10)
const countQuery = query.model.find().merge(query).skip(0).limit(Number.MAX_SAFE_INTEGER)
const [users, count] = await Promise.all([query, countQuery.countDocuments()])
Since mongoose v6 you can use Query.prototype.clone
E.g. for your code snippet:
const query = domain.User.find({
deleted: false,
role: role
})
const query1 = query.clone();