IndexedDB cursor wait for response? - javascript

I have an application that uses an IndexedDB database. It stores records locally; if a record could not be passed to the server, for example due to a network error, its "synced" flag is set to false. Now I want to create a function that uploads the items with "synced": false to the server.
The idea is to iterate through the db and, every time there is an entry with synced set to false, upload it to the server and, on success, update the value in the local (IndexedDB) db.
The problem I'm having is that I can't see how to make the cursor operation wait for the response of the server (REST API) and then act accordingly (update the local value "synced": false -> "synced": true). The update to the server itself works fine.
Here is a working Stackblitz that shows the full code on the Homepage:
https://stackblitz.com/edit/ionic-thu3eg
Output is in the console.
Here is an example that shows the actual problem:
update() {
  var callstring: string;
  let nowTime = new Date().toISOString();
  let nowTimes = dateFns.format(nowTime, 'YYYY-MM-DD HH:mm:ss');
  var headers = new Headers();
  headers.append("Accept", "application/json");
  headers.append("Content-Type", "application/json");
  let options = new RequestOptions({ headers: headers });
  const dbPromise = openDb('testdb', 1);

  dbPromise.then(async db => {
    const tx = db.transaction('scans', 'readwrite');
    tx.objectStore('scans').iterateCursor(async cursor => {
      if (!cursor) return;
      if (cursor.value.synced == 'false') {
        console.log(cursor.value);
        await new Promise((resolve, reject) => {
          this.http.get('https://jsonplaceholder.typicode.com/posts/1')
            .map(res => res.json())
            .subscribe(res => {
              console.log('response : ', res.id);
              resolve(res)
              return callstring = res.id;
            }, error => {
              reject(error)
            });
        });
        if (callstring == '1') {
          console.log('do something : ', callstring);
          cursor.value.synced = 'true';
          cursor.value.time = nowTimes;
          cursor.update(cursor.value);
        } else {
          console.log('sorry too late');
        }
      }
      await cursor.continue();
    });
    await tx.complete.then(() => console.log('done cursoring'));
  }).then(() => {
    this.toastCtrl.create({
      message: "DB updated",
      duration: 2000,
      position: 'top',
      cssClass: "toast-mess"
    }).present();
  });
}
So if callstring equals '1', it should update the "false" value to "true" in the IndexedDB store.
How can I solve this?
Thanks
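One thing worth knowing here: an IndexedDB transaction auto-commits as soon as control returns to the event loop with no pending requests, so awaiting the HTTP call inside the iterateCursor callback lets the 'readwrite' transaction close before cursor.update() ever runs. A common workaround (a sketch, not code from this post: the syncPending name, the toPromise() conversion, and the assumption that the 'scans' store uses an in-line key are all mine) is to collect the unsynced records first, do the network calls with no transaction open, and write the updates back in fresh transactions:
async syncPending() {
  const db = await openDb('testdb', 1);

  // Pass 1: collect unsynced records while the short-lived read transaction is open.
  const pending = [];
  const readTx = db.transaction('scans', 'readonly');
  readTx.objectStore('scans').iterateCursor(cursor => {
    if (!cursor) return;
    if (cursor.value.synced == 'false') pending.push(cursor.value);
    cursor.continue();
  });
  await readTx.complete;

  // Pass 2: the slow network work happens with no IndexedDB transaction open.
  for (const record of pending) {
    const res = await this.http.get('https://jsonplaceholder.typicode.com/posts/1')
      .map(r => r.json())
      .toPromise();
    if (String(res.id) === '1') {
      record.synced = 'true';
      record.time = dateFns.format(new Date().toISOString(), 'YYYY-MM-DD HH:mm:ss');
      // Pass 3: a fresh, short transaction just for this write.
      const writeTx = db.transaction('scans', 'readwrite');
      writeTx.objectStore('scans').put(record); // assumes 'scans' has an in-line key
      await writeTx.complete;
    }
  }
}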

Related

How to solve the data loss issue in Event Emitters in NodeJS? MicroServices

The code below is just for testing purposes; if it works, I want to implement the same thing in the real project with microservices. I have 4 microservices on one server, and the communication between the microservices is through RabbitMQ.
My flow:
Microservice 1 (sends data from here through RabbitMQ) =====> Microservice 2 (receives the data here from RabbitMQ and resends the same data back to) ====> Microservice 1.
Microservice 1 has two files:
Emitter.controller.js
recievedAuthController.js
Microservice 2 has one file:
receivedSettingsController.js
Now, here is the flow:
Emitter.controller.js (M1) sends data to receivedSettingsController.js (M2), and receivedSettingsController.js (M2) sends the same data back to recievedAuthController.js (M1), which has an event emitter (emitter_test). This emitter emits the data, which is listened for in Emitter.controller.js (M1). After this we send the HTTP response.
First we store all the responses in a Map, with the "key" being a MongoDB ObjectId for each request (just for uniqueness) and the "value" being the response of each request.
let emit_response_list_map = new Map();
Finally we look up the response in the Map and send the response back.
My Issue:
If I hit it with 100 requests through JMeter, the whole flow works, but the event emitter in recievedAuthController.js (M1) sometimes emits duplicate values, i.e. the ObjectID we are sending throughout the flow appears 2-3 times in the listener in Emitter.controller.js (M1), and data loss happens.
Note: the listener receives all 100 emits, but among those 100 some of the ObjectIDs repeat; they are not all unique. How can I solve this data loss issue with event emitters?
My Code Below
Microservice 1:
File: Emitter.controller.js
var common = require('../config/event_emitter');
var commonEmitter = common.commonEmitter;
const { sendMessage } = require('../config/rabbitmq/emit');
const connection = require('../config/rabbitmq/connection');
var mongoose = require('mongoose');
const { parse, stringify } = require('flatted');

let emit_response_list_map = new Map();

exports.emitOnceTesting = async (req, res, next) => {
  let count = mongoose.Types.ObjectId();
  emit_response_list_map.set(count.toString(), res);
  const rabbit_conn = await connection.connect();
  const emit_channel = await rabbit_conn.createChannel();
  let method = "emit_r";
  let dataObj = { method, count };
  let dataPacket = { exchange: 'device_model', key: "emit.key", data: dataObj };
  emit_channel.assertExchange('device_model', 'topic', {
    durable: false
  });
  sendMessage(emit_channel, dataPacket);
  console.log(`============================== [[SEND] => [FROM] auth_ms : emit_controller: ${method} [TO] settings_ms : receivedDeviceController: emit_controller: ${method}] =======================================, ${dataPacket.data.count}`);
  commonEmitter.on('emitter_test', async (data) => {
    try {
      console.log("______________emitter_test_______________", data.index);
      commonEmitter.removeAllListeners('emitter_test');
      await emit_channel.close();
      const res = emit_response_list_map.get(data.index);
      res.status(200).json({ status: true });
    } catch (err) {
      console.log(err);
    }
  });
}
Microservice 2: File: receivedSettingsController.js
let emitTesting_q1 = await device_channel_sett_ms.assertQueue('emit1test_r', { exclusive: true });
device_channel_sett_ms.bindQueue(emitTesting_q1.queue, 'device_model', 'emit.key');
device_channel_sett_ms.consume(emitTesting_q1.queue, async (msg) => {
    try {
      // #ts-ignore
      let data = parse(msg.content);
      console.log('============================== [[RECEIVED] => [HERE] settings_ms : receivedDeviceController: emitTesting [FROM] coin_zone_ms : receivedDeviceController: emitTesting] =======================================', data); //res.json({ status: true, success: data });
      let dataPacketAck = { exchange: 'device_model', key: 'emit.key.ack', data };
      await sendMessage(device_channel_sett_ms, dataPacketAck);
      console.log('============================== [[SEND-ACK] => [FROM] settings_ms : receivedDeviceController: emitTesting [FROM] coin_zone_ms : receivedDeviceController:emitTesting ] =======================================', data);
    } catch (err) {
      console.log('exception in getGeoFenceSetting : getGeoFenceDetails -----------', err);
    }
  },
  {
    noAck: true
  }
);
Microservice 1: File: recievedAuthController.js
let emit_key_ack = await device_channel_auth_ms.assertQueue('', { exclusive: true });
device_channel_auth_ms.bindQueue(emit_key_ack.queue, 'device_model', 'emit.key.ack');
device_channel_auth_ms.consume(emit_key_ack.queue, async (msg) => {
  // #ts-ignore
  let data = parse(msg.content);
  console.log("___________________________________ack________________", data.countack);
  if (data.method === "emit_r") {
    commonEmitter.emit('emitter_test', data);
  } else if (data.method === "") {
  } else if (data.method === "") {
  }
});
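An observation on the code above (not from the original post): every request registers a brand-new 'emitter_test' listener inside emitOnceTesting, and removeAllListeners('emitter_test') then also removes the listeners belonging to other in-flight requests, so under 100 parallel JMeter requests some acks reach a listener created for a different request (duplicates) while others find no listener at all (loss). A minimal sketch of one way around that, assuming the RabbitMQ wiring stays exactly as it is, is to listen once per request on an event name that carries the request's ObjectId:
// Sketch only: the per-id event name ('emitter_test:<id>') is an assumption,
// not part of the original design. Same requires as in Emitter.controller.js above.
exports.emitOnceTesting = async (req, res, next) => {
  const count = mongoose.Types.ObjectId().toString();

  // One listener per request, scoped to this request's id and removed after it fires.
  commonEmitter.once(`emitter_test:${count}`, (data) => {
    res.status(200).json({ status: true });
  });

  // ...publish { method: 'emit_r', count } through RabbitMQ exactly as before...
};

// And in recievedAuthController.js, emit only to that request's event:
// commonEmitter.emit(`emitter_test:${data.index}`, data);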

Firebase Firestore writes only working on the first attempt of a fresh build

I've been building an app with Firebase & React Native, primarily using Firestore. I started to use Firestore and it's been great, but for some reason writing to Firestore only works on the first attempt (when I remove the app, rebuild, and perform my write).
I tried to do the exact same thing except write to Firestore and everything works as expected.
I am also receiving no error!
Here is what I am doing:
export const addBrandProduct = (postObj) => {
  return () => {
    firebase
      .firestore()
      .collection('brandProducts')
      .add(postObj)
      .then((docRef) => {
        console.log("Document written with ID: ", docRef.id);
        Actions.categories();
      })
      .catch(error => {
        console.error("Error adding document: ", error);
      });
  };
};
For more of a reference, here is my component code that calls addBrandProduct()
onUploadImages = () => {
  let photo =
    Platform.OS === 'ios'
      ? this.state.images.map(img => img.uri.replace('file://', ''))
      : this.state.images.map(img => img.uri);

  photo.forEach((image, i) => {
    const sessionId = new Date().getTime();
    const Blob = RNFetchBlob.polyfill.Blob;
    const fs = RNFetchBlob.fs;
    window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
    window.Blob = Blob;
    let uploadBlob = null;
    let mime = 'image/jpg';
    const imageRef = firebase
      .storage()
      .ref('brandProducts/')
      .child(`${this.props.userData.uid}`)
      .child(`${sessionId}-${i}`);

    fs.readFile(image, 'base64')
      .then(data => {
        return Blob.build(data, {type: `${mime};BASE64`});
      })
      .then(blob => {
        uploadBlob = blob;
        return imageRef.put(blob, {contentType: mime});
      })
      .then(() => {
        uploadBlob.close();
        return imageRef.getDownloadURL();
      })
      .then(url => {
        // if this is the last uploaded image, post data to db
        if (i === this.state.images.length - 1) {
          const urls = {
            ...this.state.urls,
            [i]: url,
          };
          const postObj = {
            ...this.state.postObj,
            urls: urls,
          };
          this.props.addBrandProduct(postObj);
        } else {
          this.setState({
            urls: {
              ...this.state.urls,
              [i]: url,
            },
          });
        }
      })
      .catch(error => {
        console.log(error);
      });
  });
};
Basically, I am uploading a maximum of 3 images along with some data for them. In order to ensure they are all uploaded before adding the post data (writing to Firestore), I am using a forEach, and when the last upload completes, I call the action that writes the post data.
Edit:
Hmm, addBrandProduct is a function that creates another function. So when you call this.props.addBrandProduct(postObj), nothing is sent to Firestore; you just create a new function that should then be called.
Maybe you can take this stuff out and call Firebase directly, ensuring that everything works, and then go back to the Redux way if you still want to use it. I also made it parallelized instead of sequential. Hope it helps; it's hard to find the real problem when it can come from anywhere.
onUploadImages = () => {
  let photo = Platform.OS === 'ios'
    ? this.state.images.map(img => img.uri.replace('file://', ''))
    : this.state.images.map(img => img.uri);

  Promise.all(photo.map((image, i) => {
    const sessionId = new Date().getTime();
    const Blob = RNFetchBlob.polyfill.Blob;
    // This is kind of useless
    // const fs = RNFetchBlob.fs;
    // This is not used
    // window.XMLHttpRequest = RNFetchBlob.polyfill.XMLHttpRequest;
    // This is not advised
    // window.Blob = Blob;
    let uploadBlob = null;
    let mime = 'image/jpg';
    const imageRef = firebase
      .storage()
      .ref('brandProducts/')
      .child(`${this.props.userData.uid}`)
      .child(`${sessionId}-${i}`);
    return RNFetchBlob.fs.readFile(image, 'base64')
      .then(data => {
        return RNFetchBlob.polyfill.Blob.build(data, {type: `${mime};BASE64`});
      })
      .then(blob => {
        uploadBlob = blob;
        return imageRef.put(blob, {contentType: mime});
      })
      .then(() => {
        uploadBlob.close();
        return imageRef.getDownloadURL();
      });
  }))
    .then(results => {
      // results is, here, [urlFromFirst, urlFromSecond, ...]
      const urls = { ...this.state.urls };
      results.forEach((r, i) => urls[i] = r);
      const postObj = {
        ...this.state.postObj,
        urls
      };
      return firebase
        .firestore()
        .collection('brandProducts')
        .add(postObj);
    })
    .then(docRef => {
      console.log("Document written with ID: ", docRef.id);
    })
    .catch(error => {
      console.error(error);
    });
};
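If you do go back to the Redux way afterwards, keep in mind that the function returned by addBrandProduct only runs when something dispatches it. A minimal sketch of that wiring, assuming redux-thunk and react-redux (the mapDispatchToProps shape, the import path, and the MyComponent name are assumptions, not code from the question):
import { connect } from 'react-redux';
import { addBrandProduct } from '../actions'; // path is an assumption

const mapDispatchToProps = dispatch => ({
  // Dispatching the thunk makes redux-thunk call the inner function returned by addBrandProduct.
  addBrandProduct: postObj => dispatch(addBrandProduct(postObj)),
});

export default connect(null, mapDispatchToProps)(MyComponent);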

Cannot read property "responseText" of undefined

I'm having an issue with my JSON.parse.
After I changed the API call from request to requestPromise.get, I receive an error:
TypeError: Cannot read property 'responseText' of undefined
Packages:
Node-Schedule
Request-Promise
CoinMarketCap API
Is this a problem with my data in r1, or do I simply not understand what is wrong with the code now? I also tried changing responseText to responseXML, but that doesn't work for me either.
I'm probably missing some logic in the request, but I'm curious why this error appears even though this part of the code was working before.
Problem area of code
const j = schedule.scheduleJob('* * * * * *', () => {
  requestPromise.get({
    uri: 'http://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest?CMC_PRO_API_KEY=API-KEY-HERE',
    json: true,
    resolveWithFullResponse: true
  }).then(r1 => {
    const x1 = JSON.parse(r1.target.responseText);
    const BTCdata = x1.data.find(d => d.symbol === 'BTC').quote.USD
      .volume_24h; // creating a variable to store a BTC request from API
    console.log(BTCdata);
    // Saving to database
    saveToDatabase(BTCdata);
  }).catch(err => {
    console.log(err);
  });
});
Full code
var requestPromise = require('request-promise');
const { MongoClient } = require('mongodb');
const schedule = require('node-schedule');
var XMLHttpRequest = require("xmlhttprequest").XMLHttpRequest;

const saveToDatabase = function(BTCdata) {
  const url = 'mongodb+srv://username:password@cluster0-1kunr.mongodb.net/<dbname>?retryWrites=true&w=majority';
  MongoClient.connect(url, { useNewUrlParser: true, useUnifiedTopology: true }, (err, db) => {
    if (err) throw err;
    const dbo = db.db('Crypto');
    const myobj = { Name: 'BTC', Volume: BTCdata };
    dbo.collection('Crypto-Values').insertOne(myobj, (error, res) => {
      if (error) throw error;
      console.log('1 document inserted');
      db.close();
    });
  });
};

function requestPromise(method, url) {
  return new Promise(((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.open(method, url);
    xhr.onload = resolve;
    xhr.onerror = reject;
    xhr.send();
  }));
}

const j = schedule.scheduleJob('* * * * * *', () => {
  requestPromise.get({
    uri: 'http://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest?CMC_PRO_API_KEY=API-KEY-HERE',
    json: true,
    resolveWithFullResponse: true
  }).then(r1 => {
    const x1 = JSON.parse(r1.target.responseText);
    const BTCdata = x1.data.find(d => d.symbol === 'BTC').quote.USD
      .volume_24h; // creating a variable to store a BTC request from API
    console.log(BTCdata);
    // Saving to database
    saveToDatabase(BTCdata);
  }).catch(err => {
    console.log(err);
  });
});
You have 2 mistakes here:
In your requestPromise() call, the response will already be parsed JSON since you used json: true, so there is no need to additionally JSON.parse() it.
The response itself (r1 in your case) has no target key. Instead, you should use r1.body to get the body of the response and then do whatever you need with the data from the body.
const j = schedule.scheduleJob('* * * * * *', () => {
  requestPromise.get({
    uri: 'http://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest?CMC_PRO_API_KEY=API-KEY-HERE',
    json: true,
    resolveWithFullResponse: true,
  }).then(r1 => {
    // r1.body will contain response from API
    const responseData = r1.body;
    // do something with responseData
  }).catch(err => {
    console.log(err);
  });
});
To get a better idea of what you will get in your r1.body object, refer to the official documentation or simply put that API URL with a proper key in your browser; you should see the JSON output there.
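Tying that back to the original goal, the lookup from the question would then run against r1.body instead of a parsed responseText (field names follow the question's code):
requestPromise.get({
  uri: 'http://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest?CMC_PRO_API_KEY=API-KEY-HERE',
  json: true,
  resolveWithFullResponse: true,
}).then(r1 => {
  // r1.body is already parsed JSON, so there is no JSON.parse() and no .target here
  const BTCdata = r1.body.data.find(d => d.symbol === 'BTC').quote.USD.volume_24h;
  console.log(BTCdata);
  saveToDatabase(BTCdata);
}).catch(err => {
  console.log(err);
});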

Close Event Triggers Before Data Events on File Stream

I've got a script that adds JSON data from a file to a DynamoDB table. The script uses the "fs" module to open a read stream to the JSON file and retrieve the data line by line. As the data is returned, it's inserted into a DynamoDB table. When the operation ends, an execution summary is printed with the number of records processed, successfully inserted, and unsuccessfully inserted. The problem is that the summary executes before the file has been completely processed, so the numbers are wrong.
The script...
ddb_table_has_records(table_name, (err, dat) => {
  if (dat.Count === 0 || force) {
    const transformStream = JSONStream.parse("*");
    const inputStream = fs.createReadStream(import_file);
    let record_position = 0;
    let count_imported_successful = 0;
    let count_imported_fail = 0;

    inputStream.pipe(transformStream).on("data", (Item) => {
      const params = {
        TableName: table_name,
        Item
      }
      ddb_client.put(params, (err, data) => {
        ++record_position;
        if (err) {
          console.error("Unable to add mapping for record " + record_position + ", error = " + err);
          ++count_imported_fail;
        } else {
          console.log("PutItem succeeded " + record_position);
          ++count_imported_successful;
        }
      });
    }).on("close", () => {
      console.log("=".repeat(70));
      console.log(`'Completed: ${import_file}' has been loaded into '${table_name}'.`);
      console.log(`  Record Count: ${record_position}`);
      console.log(`  Imported Record Count: ${count_imported_successful}`);
      console.log(`  Rejected Record Count: ${count_imported_fail}`);
    });
  } else {
    console.log("=".repeat(70));
    console.log(`Completed: Skipping import of '${import_file}' into '${table_name}'.`);
  };
});
When this runs, it looks like the following
PS C:\> node --max-old-space-size=8192 .\try.js 'foo' 'us-west-2' 'development' '.\data.json' true
Target Profile: development
Target Region: us-west-2
Target Table: foo
Source File: .\data.json
Force Import: true
Confirming Table's State...
======================================================================
'Completed: .\data.json' has been loaded into 'foo'.
Record Count: 0
Imported Record Count: 0
Rejected Record Count: 0
PutItem succeeded 1
PutItem succeeded 2
PutItem succeeded 3
PutItem succeeded 4
...
The portion of the code that prints the record counts runs before the inserts complete, so the imported and rejected numbers are always wrong. It looks like the file stream closes while inserts are still occurring. I've tried changing from the "close" event to the "end" event; same result.
Test this script with the following call...
node --max-old-space-size=8192 .\data.load.js 'foo' 'us-west-1' 'dev' '.\foo.default.json' true
Here is the content for the script I ultimately used...
'use strict'

if (process.argv.length < 6) {
  throw new Error('Please pass the table-name, aws-Region, aws-Profile, and file-path to the script.');
}

let [, , TableName, Region, Profile, ImportFile, Force] = process.argv;

process.env.AWS_SDK_LOAD_CONFIG = true;
process.env.AWS_PROFILE = Profile;
Force = typeof(Force) !== 'undefined' ? Force : false;

const AWS = require('aws-sdk');
const fs = require('fs');
const JSONStream = require('JSONStream');

AWS.config.update({ region: Region });
const ddbc = new AWS.DynamoDB.DocumentClient();

console.log('Target Profile: ', Profile);
console.log('Target Region: ', Region);
console.log('Target Table: ', TableName);
console.log('Source File: ', ImportFile);
console.log('Force Import: ', Force);

// Returns the number of records in a specified table
const ddb_table_has_items = (TableName) => {
  return new Promise((resolve, reject) => {
    const ddb_query_parameters = { TableName, Select: 'COUNT' }
    ddbc.scan(ddb_query_parameters, (error, data) => {
      (error) ? reject(error) : resolve(data);
    });
  });
}

const ddb_table_upsert_items = (TableName, Item) => {
  return new Promise((resolve, reject) => {
    const ddb_insert_payload = { TableName, Item };
    ddbc.put(ddb_insert_payload, (error, data) => {
      (error) ? reject(error) : resolve(data);
    });
  });
}

const ddb_bulk_load = (TableName, ImportFile) => {
  return new Promise((resolve, reject) => {
    let count_succeeded = 0;
    let count_failed = 0;
    let count_attempted = 0;
    let inserts = [];
    const json_stream = JSONStream.parse("*");
    const source_data_stream = fs.createReadStream(ImportFile);
    const ddb_source_item = source_data_stream.pipe(json_stream);
    ddb_source_item.on("data", (source_data_item) => {
      count_attempted++;
      let ddb_insert = ddb_table_upsert_items(TableName, source_data_item)
        .then((data) => count_succeeded++)
        .catch((error) => count_failed++);
      inserts.push(ddb_insert);
    });
    ddb_source_item.on("end", () => {
      Promise.all(inserts)
        .then(() => {
          resolve({ count_succeeded, count_failed, count_attempted });
        })
        .catch((error) => {
          console.log(error);
          reject(error);
        });
    });
    ddb_source_item.on("error", (error) => {
      reject(error);
    });
  });
}

(async () => {
  try {
    let proceed_with_import = false;
    if (Force.toString().toLowerCase() === 'true') {
      proceed_with_import = true;
    } else {
      const table_scan = await ddb_table_has_items(TableName);
      proceed_with_import = (table_scan.Count === 0);
    }
    if (proceed_with_import) {
      let ddb_inserts = await ddb_bulk_load(TableName, ImportFile);
      console.log("=".repeat(75));
      console.log("Completed: '%s' has been loaded into '%s'.", ImportFile, TableName);
      console.log(" Insert Attempted: %s", ddb_inserts.count_attempted);
      console.log(" Insert Succeeded: %s", ddb_inserts.count_succeeded);
      console.log(" Insert Failed   : %s", ddb_inserts.count_failed);
    }
  } catch (error) {
    console.log(error);
  }
})();
Wrapping each insert in a promise, pushing the insert promises into an array, and using Promise.all on that array did the trick. I execute the Promise.all once we're finished reading from the file, i.e. once the "end" event is emitted on the ddb_source_item stream.
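Stripped of the DynamoDB specifics, the pattern is small enough to restate on its own (stream and writeItem are placeholders here, not names from the script):
const pending = [];
stream.on('data', item => {
  // Each write becomes a promise; catching per item keeps a single failure
  // from rejecting the final Promise.all.
  pending.push(writeItem(item).catch(err => console.error(err)));
});
stream.on('end', () => {
  Promise.all(pending).then(() => {
    console.log(`all ${pending.length} writes finished`);
  });
});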

Can't figure out why my app.get is being run twice?

I have an app.get route with quite a bit of logic inside it. Everything works great, aside from some of the logic being called twice for some reason. I noticed that when I was saving something to my db, it would save two rows.
So I put a console.log in that area and sure enough it was logging it twice.
Any reason why this is happening?
app.get('/shopify/callback', (req, res) => {
  const { shop, hmac, code, state } = req.query;
  const stateCookie = cookie.parse(req.headers.cookie).state;

  if (state !== stateCookie) {
    return res.status(403).send('Request origin cannot be verified');
  }

  if (shop && hmac && code) {
    // DONE: Validate request is from Shopify
    const map = Object.assign({}, req.query);
    delete map['signature'];
    delete map['hmac'];

    const message = querystring.stringify(map);
    const providedHmac = Buffer.from(hmac, 'utf-8');
    const generatedHash = Buffer.from(
      crypto
        .createHmac('sha256', config.oauth.client_secret)
        .update(message)
        .digest('hex'),
      'utf-8'
    );
    let hashEquals = false;

    try {
      hashEquals = crypto.timingSafeEqual(generatedHash, providedHmac)
    } catch (e) {
      hashEquals = false;
    };

    if (!hashEquals) {
      return res.status(400).send('HMAC validation failed');
    }

    // DONE: Exchange temporary code for a permanent access token
    const accessTokenRequestUrl = 'https://' + shop + '/admin/oauth/access_token';
    const accessTokenPayload = {
      client_id: config.oauth.api_key,
      client_secret: config.oauth.client_secret,
      code,
    };

    request.post(accessTokenRequestUrl, { json: accessTokenPayload })
      .then((accessTokenResponse) => {
        const accessToken = accessTokenResponse.access_token;
        // DONE: Use access token to make API call to 'shop' endpoint
        const shopRequestUrl = 'https://' + shop + '/admin/shop.json';
        const shopRequestHeaders = {
          'X-Shopify-Access-Token': accessToken,
        }
        request.get(shopRequestUrl, { headers: shopRequestHeaders })
          .then((shopResponse) => {
            const response = JSON.parse(shopResponse);
            const shopData = response.shop;
            console.log('BEING CALLED TWICE...')
            res.render('pages/brand_signup', {
              shop: shopData.name
            })
          })
          .catch((error) => {
            res.status(error.statusCode).send(error.error.error_description);
          });
      })
      .catch((error) => {
        res.status(error.statusCode).send(error.error.error_description);
      });
  } else {
    res.status(400).send('Required parameters missing');
  }
});
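One way to narrow this down (a debugging sketch, not a fix): log every hit on the callback route before the handler runs, so you can tell whether Express actually receives two separate requests or one request somehow executes the handler body twice.
// Register this before the app.get('/shopify/callback', ...) handler.
app.use('/shopify/callback', (req, res, next) => {
  console.log(new Date().toISOString(), req.method, req.originalUrl,
    'ua:', req.headers['user-agent']);
  next();
});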
