In my site:
Users have many Activities
Each Activity has encoded_polyline data
I display these encoded_polylines on a map
I want to use IndexedDB (via Dexie) as an in-browser cache so that they don't need to re-download their full Activity set every time they view their map. I've never used IndexedDB before, so I don't know if I'm doing anything silly or overlooking any edge cases.
Here's a high-level description of what I think the overall process is:
Figure out what exists on the server
Remove anything that is present in IndexedDB but is not present on the server
Figure out what exists in IndexedDB
Request only the data missing in IndexedDB
Store the new data in IndexedDB
Query all of the data out of IndexedDB
Throughout all of this, we need to be focusing on this user. A person might view many people's pages, and therefore have a copy of many people's data in IndexedDB. So the queries to the server and IndexedDB need to be aware of which User ID is being referenced.
Here's the English Language version of what I decided to do:
Collect all of this User's Activity IDs from the server
Remove anything in IndexedDB that shouldn't be there (stuff deleted from the site that might still exist in IndexedDB)
Collect all of this User's Activity IDs from IndexedDB
Filter out anything that's present in both IndexedDB and the server
If there are no new encoded_polylines to retrieve then putItemsFromIndexeddbOnMap (described below)
If there are new encoded_polylines to retrieve: retrieve those from the server, then store them in IndexedDB, then putItemsFromIndexeddbOnMap
For putItemsFromIndexeddbOnMap:
Get all of this user's encoded_polylines from IndexedDB
Push that data into an array
Display that array of data on the map
Here's the JavaScript code that does what I've explained above (with some ERB because this JavaScript is embedded in a Rails view):
var db = new Dexie("db_name");
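// "id" is the primary key; "user_id" is a secondary index used to scope queries per user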
db.version(1).stores({ activities: "id,user_id" });
db.open();
// get this user's activity IDs from the server
fetch('/users/' + <%= @user.id %> + '/activity_ids.json', { credentials: 'same-origin' }
).then(response => { return response.json() }
).then(activityIdsJson => {
// remove items from IndexedDB for this user that are not in activityIdsJson
// this keeps data that was deleted in the site from sticking around in IndexedDB
db.activities
.where('id')
.noneOf(activityIdsJson)
.and(function(item) { return item.user_id === <%= @user.id %> })
.keys()
.then(removeActivityIds => {
return db.activities.bulkDelete(removeActivityIds);
}).then(() => {
// get this user's existing activity IDs out of IndexedDB
// (chained after the delete so we never read rows that are about to be removed)
db.activities.where({user_id: <%= @user.id %>}).primaryKeys(function(primaryKeys) {
// filter out the activity IDs that are already in IndexedDB
var neededIds = activityIdsJson.filter((id) => !primaryKeys.includes(id));
if(Array.isArray(neededIds) && neededIds.length === 0) {
// we do not need to request any new data so query IndexedDB directly
putItemsFromIndexeddbOnMap();
} else if(Array.isArray(neededIds)) {
if(neededIds.length === activityIdsJson.length) {
// every ID is missing locally, so we need all data; do not pass the `only` param
neededIds = [];
}
// get new data (encoded_polylines for display on the map)
fetch('/users/' + <%= @user.id %> + '/encoded_polylines.json?only=' + neededIds, { credentials: 'same-origin' }
).then(response => { return response.json() }
).then(newEncodedPolylinesJson => {
// store the new encoded_polylines in IndexedDB
db.activities.bulkPut(newEncodedPolylinesJson).then(_unused => {
// pull all encoded_polylines out of IndexedDB
putItemsFromIndexeddbOnMap();
});
});
}
});
});
});
function putItemsFromIndexeddbOnMap() {
var featureCollection = [];
db.activities.where({user_id: <%= @user.id %>}).each(activity => {
featureCollection.push({
type: 'Feature',
geometry: polyline.toGeoJSON(activity['encoded_polyline'])
});
}).then(function() {
// if there are any polylines, add them to the map
if(featureCollection.length > 0) {
if(map.isStyleLoaded()) {
// map has fully loaded so add polylines to the map
addPolylineLayer(featureCollection);
} else {
// map is still loading, so wait for that to complete
map.on('style.load', function() { addPolylineLayer(featureCollection); });
}
}
}).catch(error => {
console.error(error.stack || error);
});
}
function addPolylineLayer(data) {
map.addSource('polylineCollection', {
type: 'geojson',
data: {
type: 'FeatureCollection',
features: data
}
});
map.addLayer({
id: 'polylineCollection',
type: 'line',
source: 'polylineCollection'
});
}
...Am I doing it right?
I am new to IndexedDB and service workers and am having a very difficult time understanding how to turn these into a functional application. I've done extensive reading on both, but even the "complete" examples don't incorporate the two.
I am tasked with creating an application that will allow users to work offline. The first time they connect to the site, I want to pull specific information from the database and store it in IndexedDB. When they go offline, I need to use that data to display information on the page. Certain interactions will cause the data to update, to be synced later once an internet connection is re-established. At a high level, I understand how this works.
It is my understanding that we cannot call functions from the serviceworker.js file due to the asynchronous nature of service workers. Additionally, serviceworker.js cannot directly update the DOM. However, the examples I have seen create and manage the IndexedDB data within serviceworker.js.
So let's say I have a file:
<!-- index.html -->
<html>
<body>
Hello <span id="name"></span>
</body>
</html>
And a serviceworker.js:
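// NOTE: createDB() below uses the "idb" promise-wrapper library, which a
// service worker must load explicitly (the path here is an assumption):
importScripts('idb.js');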
var CACHE_NAME = 'my-cache-v1';
var urlsToCache = [
'/'
// More to be added later
];
self.addEventListener('install', function(event) {
// Perform install steps
event.waitUntil(
caches.open(CACHE_NAME)
.then(function(cache) {
console.log('Opened cache');
return cache.addAll(urlsToCache);
})
);
});
self.addEventListener('activate', function(event) {
event.waitUntil(
createDB() //Use this function to create or open the database
);
});
self.addEventListener('fetch', function(event) {
event.respondWith(
caches.match(event.request)
.then(function(response) {
// Cache hit - return response
if (response) {
return response;
}
return fetch(event.request).then(
function(response) {
// Check if we received a valid response
if(!response || response.status !== 200 || response.type !== 'basic') {
return response;
}
var responseToCache = response.clone();
caches.open(CACHE_NAME)
.then(function(cache) {
cache.put(event.request, responseToCache);
});
return response;
}
);
})
);
});
function createDB() {
// return the promise so that event.waitUntil() in the activate handler can wait on it
return idb.open('mydata', 1, function(upgradeDB) {
var store = upgradeDB.createObjectStore('user', {
keyPath: 'id'
});
store.put({id: 1, name: 'John Doe'}); //This can be updated with an AJAX call to the database later
});
}
How do I now update the element "name" with the value for key = 1 from the "user" objectstore in the "mydata" database?
Depending on your use case, you've got several options:
You don't need the service worker. Just pull your data from iDB directly from the page; the DOM has access to iDB (see the sketch after this list).
Set a template for your index.html. At the activate step in the service worker, pre-render the page with the value from iDB and cache it.
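For option 1, a minimal sketch (my illustration, using the same idb library as the question and the 'mydata'/'user' store created in createDB() above) would run in a regular page script, not in serviceworker.js:
// page script: read user 1 from IndexedDB and update the DOM
idb.open('mydata', 1).then(function(db) {
var tx = db.transaction('user', 'readonly');
return tx.objectStore('user').get(1);
}).then(function(user) {
if (user) {
document.getElementById('name').textContent = user.name;
}
});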
I am working on my first Firebase project using AngularFire2. Below is the overall design of my learning project.
Users uploads photos and it's stored in the Firebase storage as images.
The uploaded photos are listed in the homepage sorted based on timestamp.
Below is the structure I started with, but I am having difficulty doing joins. I should be able to get the user details for each upload and to sort the uploads by timestamp.
User:
- Name
- Email
- Avatar
Uploads:
- ImageURL
- User ID
- Time
I have read a few blogs about de-normalising the data structure. For my scenario, how best can I re-model my database structure?
Any example for creating some sample data in the new proposed solution will be great for my understanding.
Once the image upload is done, I am calling the below code to create an entry in the database.
addUpload(image: any): firebase.Promise<any> {
return firebase.database().ref('/userUploads').push({
user: firebase.auth().currentUser.uid,
image: image,
time: new Date().getTime()
});
}
I am trying to join the 2 entities as below, but I am not sure how to do it efficiently and correctly.
getUploads(): any {
const rootDef = this.db.database.ref();
const uploads = rootDef.child('userUploads').orderByChild('time');
uploads.on('child_added',snap => {
let userRef =rootDef.child('userProfile/' + snap.child('user').val());
userRef.once('value').then(userSnap => {
???? HOW TO HANDLE HERE
});
});
return ?????;
}
I would like to get a final list having all upload details and its corresponding user data for each upload.
This type of join will always be tricky if you write it from scratch. But I'll try to walk you through it. I'm using this JSON for my answer:
{
uploads: {
Upload1: {
uid: "uid1",
url: "https://firebase.com"
},
Upload2: {
uid: "uid2",
url: "https://google.com"
}
},
users: {
uid1: {
name: "Purus"
},
uid2: {
name: "Frank"
}
}
}
We're taking a three-stepped approach here:
Load the data from uploads
Load the users for that data from users
Join the user data to the upload data
1. Load the data from uploads
Your code is trying to return a value. Since the data is loaded from Firebase asynchronously, it won't be available yet when your return statement executes. That gives you two options:
Pass in a callback to getUploads() that you then call when the data has loaded.
Return a promise from getUploads() that resolves when the data has loaded.
I'm going to use promises here, since the code is already difficult enough.
function getUploads() {
return ref.child("uploads").once("value").then((snap) => {
return snap.val();
});
}
This should be fairly readable: we load all uploads and, once they are loaded, we return the value.
getUploads().then((uploads) => console.log(uploads));
Will print:
{
Upload1: {
uid: "uid1",
url: "https://firebase.com"
},
Upload2: {
uid: "uid2",
url: "https://google.com"
}
}
2. Load the users for that data from users
Now in the next step, we're going to be loading the user for each upload. For this step we're not returning the uploads anymore, just the user node for each upload:
function getUploads() {
return ref.child("uploads").once("value").then((snap) => {
var promises = [];
snap.forEach((uploadSnap) => {
promises.push(
ref.child("users").child(uploadSnap.val().uid).once("value")
);
});
return Promise.all(promises).then((userSnaps) => {
return userSnaps.map((userSnap) => userSnap.val());
});
});
}
You can see that we loop over the uploads and create a promise for loading the user for that upload. Then we return Promise.all(), which ensures its then() only gets called once all users are loaded.
Now calling
getUploads().then((uploads) => console.log(uploads));
Prints:
[{
name: "Purus"
}, {
name: "Frank"
}]
So we get an array of users, one for each upload. Note that if the same user had posted multiple uploads, you'd get that user multiple times in this array. In a real production app you'd want to de-duplicate the users. But this answer is already getting long enough, so I'm leaving that as an exercise for the reader...
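If you do want to de-duplicate, here is one possible sketch (my illustration, not part of the original answer; ref is the same root reference used above): collect the distinct uids first, load each user once, and keep the results in a map keyed by uid.
function getUniqueUsers(snap) {
// gather each distinct uid from the uploads snapshot
var uids = {};
snap.forEach((uploadSnap) => { uids[uploadSnap.val().uid] = true; });
// load every distinct user exactly once
var promises = Object.keys(uids).map((uid) =>
ref.child("users").child(uid).once("value").then((userSnap) => [uid, userSnap.val()])
);
return Promise.all(promises).then((pairs) => {
var usersByUid = {};
pairs.forEach((pair) => { usersByUid[pair[0]] = pair[1]; });
return usersByUid;
});
}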
3. Join the user data to the upload data
The final step is to take the data from the two previous steps and join it together.
function getUploads() {
return ref.child("uploads").once("value").then((snap) => {
var promises = [];
snap.forEach((uploadSnap) => {
promises.push(
ref.child("users").child(uploadSnap.val().uid).once("value")
);
});
return Promise.all(promises).then((userSnaps) => {
var uploads = [];
var i=0;
snap.forEach((uploadSnap) => {
var upload = uploadSnap.val();
upload.username = userSnaps[i++].val().name;
uploads.push(upload);
});
return uploads;
});
});
}
You'll see we added a then() to the Promise.all() call, which gets invoked after all users have loaded. At that point we have both the users and their uploads, so we can join them together. And since we loaded the users in the same order as the uploads, we can just join them by their index (i). Once you de-duplicate the users this will be a bit trickier.
Now if you call the code with:
getUploads().then((uploads) => console.log(uploads));
It prints:
[{
uid: "uid1",
url: "https://firebase.com",
username: "Purus"
}, {
uid: "uid2",
url: "https://google.com",
username: "Frank"
}]
The array of uploads with the name of the user who created that upload.
The working code for each step is in https://jsbin.com/noyemof/edit?js,console
I did the following based on Frank's answer and it works. I am not sure if this is the best way of dealing with a large amount of data.
getUploads() {
return new Promise((resolve, reject) => {
const rootDef = this.db.database.ref();
const uploadsRef = rootDef.child('userUploads').orderByChild('time');
const userRef = rootDef.child("userProfile");
uploadsRef.once("value").then((uploadSnaps) => {
var promises = [];
uploadSnaps.forEach((uploadSnap) => {
var upload = uploadSnap.val();
promises.push(
userRef.child(upload.user).once("value").then((userSnap) => {
upload.displayName = userSnap.val().displayName;
upload.avatar = userSnap.val().avatar;
return upload;
})
);
});
// resolve only after every user lookup has completed
Promise.all(promises).then(resolve, reject);
}, reject);
});
}
Right now I am replicating my entire device database over to my remote database.
Once that is complete, I grab all my data that is not older than 1 month from my remote database, using a filter, and bring it to my device.
FILTER
{
_id: '_design/filters',
"filters": {
"device": function(doc, req) {
if(doc.type == "document" || doc.type == "signature") {
if(doc.created >= req.query.date) return true;
else return false;
}
else return true;
}
}
}
REPLICATION
device_db.replicate.to(remote_db)
.on('complete', function () {
device_db.replicate.from(remote_db, {
filter: "filters/device",
query_params: { "date": (Math.floor(Date.now() / 1000)-2419200) }
})
.on('complete', function () {
console.log("localtoRemoteSync replicate.to success");
callback(true);
});
});
My question:
I want to be able to periodically delete data from my device that is older than 3 months (data old enough that I already know it has been synced).
But when I delete it from my device and then replicate back to my remote_db, I don't want it to be deleted there too.
How can I delete specific data on my device without that deletion carrying over when I replicate?
FILTERS
Here, we have 2 filters:
noDeleted: this filter doesn't push _deleted documents.
device: a filter to get only the latest data.
{
_id: '_design/filters',
"filters": {
"device": function(doc, req) {
if (doc.type == "document" || doc.type == "signature") {
if (doc.created >= req.query.date) return true;
else return false;
}
return true;
},
"noDeleted": function(doc, req) {
//Document _deleted won't pass through this filter.
//If we delete the document locally, the delete won't be replicated to the remote DB
return !doc._deleted;
}
}
}
REPLICATION
device_db.replicate.to(remote_db, {
filter: "filters/noDeleted"
})
.on('complete', function() {
device_db.replicate.from(remote_db, {
filter: "filters/device",
query_params: { "date": (Math.floor(Date.now() / 1000) - 2419200) }
})
.on('complete', function() {
console.log("localtoRemoteSync replicate.to success");
callback(true);
});
});
Workflow
You push all your documents without pushing the deleted documents.
You get all the updates for the latest data
You delete your old documents
You could either query the remote DB to get the ids of the documents that are too old and delete them locally. Note that the documents will still be there as _deleted. To completely remove them, a compaction will be required.
You could also totally destroy your local database after step 1 and start from scratch.
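As a rough sketch of step 3 (my illustration, not from the original answer; it assumes the doc.created Unix timestamp used in the question's filter), you could mark old local docs as _deleted. Because the push replication uses the noDeleted filter, those deletions never reach the server:
var cutoff = Math.floor(Date.now() / 1000) - (3 * 30 * 24 * 60 * 60); // roughly 3 months
device_db.allDocs({ include_docs: true }).then(function(result) {
var oldDocs = result.rows
.map(function(row) { return row.doc; })
.filter(function(doc) { return doc.created && doc.created < cutoff; })
.map(function(doc) { return { _id: doc._id, _rev: doc._rev, _deleted: true }; });
// bulkDocs with _deleted flags deletes the docs locally
return device_db.bulkDocs(oldDocs);
});
A compaction afterwards will reclaim the space, as noted above.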
Add one-way filtered replication. For anything you need back on the server, however, you will need to use a put request with the server's _rev.
For example
Replicate from server to client, adding a filter mechanism like transfer:true to the docs you want to replicate:
db.replicate.from(remoteDB, {
live: true,
retry: true,
selector: {transfer:true}// or any other type of selector
});
To delete a doc on the client, set transfer to false, then delete it on the client. It won't meet your filter criteria, so it won't replicate.
Anything you want to put back to the server use a put request instead of replicate.
If you want the document back on the client just set transfer to true in the doc.
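One hedged sketch of that flow (docId and the db/remoteDB handles are assumed names; this reading flips the flag on the server copy first, so the pull replication stops matching the doc):
remoteDB.get(docId).then(function(doc) {
doc.transfer = false; // server copy no longer matches the {transfer: true} selector
return remoteDB.put(doc);
}).then(function() {
return db.get(docId);
}).then(function(localDoc) {
return db.remove(localDoc); // delete locally; the one-way pull never propagates it
});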
With my node.js app, I'm getting my JSON data from a spreadsheet API.
It basically returns JSON like the following.
{
"status":200,
"success":true,
"result":[
{
"Dribbble":"a",
"Behance":"",
"Blog":"http://blog.invisionapp.com/reimagine-web-design-process/",
"Youtube":"",
"Vimeo":""
},
{
"Dribbble":"",
"Behance":"",
"Blog":"http://creative.mailchimp.com/paint-drips/?_ga=1.32574201.612462484.1431430487",
"Youtube":"",
"Vimeo":""
}
]
}
It's just dummy data for now, but one thing that is certain is that I need to process the values (blog URLs) under Blog differently. For each blog URL I need to get its Open Graph data, so I'm using a module called open-graph-scraper
With data.js I'm getting the whole JSON, and it's available in the route index.js as data. Then I'm processing this data by checking the Blog column. If it's a match, I loop the values (blog URLs) through the open-graph-scraper module.
This will give me open graph data of each blog url like the following example JSON.
{
data:
{ success: 'true',
ogImage: 'http://davidwalsh.name/wp-content/themes/punky/images/logo.png',
ogTitle: 'David Walsh - JavaScript, HTML5 Consultant',
ogUrl: 'http://davidwalsh.name/',
ogSiteName: 'David Walsh Blog',
ogDescription: 'David Walsh Blog features tutorials about MooTools, jQuery, Dojo, JavaScript, PHP, CSS, HTML5, MySQL, and more!' },
success: true
}
So my goal is to pass this blog JSON separately from the main JSON and put it in the render as a separate object, so the view has two separate JSON objects available. But I'm not sure if my approach with getBlogData is correct.
I'm not even sure if processing data like this is a good thing to do in a router file. I would appreciate some direction.
index.js
var ogs = require('open-graph-scraper');
var data = require('../lib/data.js');
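// assumed boilerplate for this routes file (not shown in the original post):
var express = require('express');
var router = express.Router();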
data( function(data) {
var getBlogData = function (callback) {
var blogURL = [];
if (data.length > 0) {
var columnsIn = data[0];
for(var key in columnsIn) {
if (key === 'Blog') {
for(var i = 0; i < data.length; i++) {
blogURL.push(data[i][key]);
}
}
}
};
ogs({
url: blogURL
}, function(er, res) {
console.log(er, res);
callback(res);
});
}
getBlogData( function (blogData) {
// I want to make this blogData available in the render below,
// but I don't know how
});
router.get('/', function(req, res, next) {
res.render('index', {
title: 'Express',
data: data
});
});
});
data.js (my module that gets JSON data)
module.exports = function(callback) {
var request = require("request")
var url = "http://sheetsu.com/apis/94dc0db4"
request({
url: url,
json: true
}, function (error, response, body) {
if (!error && response.statusCode === 200) {
var results = body["result"];
callback(results)
}
})
}
The problem you'll have is that if you run getBlogData asynchronously (and you should; you don't want the client waiting around for all that data to return), res.render will already have been called by the time the data arrives. Since you can't call res.render again, two options come to mind:
You could query for individual blog data from the client (see the sketch after this list). This will result in more back-and-forth between client and server, but is a good strategy if you have a lot of entries in your initial data and only want to display a small number.
You could use websockets to send the data to the client as you retrieve it. Look up something like express.io for an easy way to do this.
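For option 1, a rough sketch (the /blog-data route and renderBlogList are placeholders of mine, not from the original post): expose the OG lookup as its own endpoint and let the client fetch it after the initial render.
// server side, next to the existing '/' route:
router.get('/blog-data', function(req, res) {
getBlogData(function(blogData) {
res.json(blogData);
});
});
// client side, after the page loads:
// fetch('/blog-data').then(function(r) { return r.json(); }).then(renderBlogList);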
I'm working on a simple restaurant menu, and I need to filter the list of dishes according to the category we are currently in. The problem is the strange behaviour of the this.store.filter(...) method: it doesn't return anything...
I want to use it like this:
App.DishRoute = Ember.Route.extend({
model: function (param) {
return this.store.filter('dish', function(dish) {
return dish.get('category_id') == param.category_id;
});
}
});
but for test purposes I'm using this.store.filter('dish', function() {return true;}); in my example here: http://jsbin.com/AcoHeNA/43/.
Please review the code and tell me what am I doing wrong or show me the way I should filter the data.
store.filter doesn't query the server; it just filters the data already loaded in the store. In your case, because you never load any data, the filter returns an empty result. You can fix it by calling this.store.find('dish'); to load the data from the server, so any filter('dish', ...) will be updated.
App.DishRoute = Ember.Route.extend({
model: function (param) {
console.log(param.dish_id);
// pre load the data
this.store.find('dish');
// filter the prefetched data; the filter result updates as new data is loaded
return this.store.filter('dish', function(){return true;});
}
});
This is the updated jsbin http://jsbin.com/akeJOTah/1/edit
This is an overview of the most used store methods:
store.find('dish') => send a request with /dishes
store.find('dish', 1) => send a request with /dishes/1
store.find('dish', { name: 'some' }) => send a request with /dishes?name=some
store.filter('dish', function() { ... }) => client-side filtering; doesn't send a request, just filters the data present in the store
store.filter('dish', { foo: 'bar' }, function() { ... }) => runs the find with a query (item 3) and performs client-side filtering
store.all('dish') => doesn't send a request; just gets all the data loaded in the store
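For example, item 5 above might look like this (a minimal sketch; the category_id field is taken from the question):
// queries the server with category_id=2, then keeps a live client-side filter
this.store.filter('dish', { category_id: 2 }, function (dish) {
return dish.get('category_id') === 2;
});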