How can I automatically update a view using Node.js?

I have a cache and a cron job for pulling/receiving data and saving it. I want the view to update itself automatically. For testing it should update every 5 seconds; this way I can see when the data is pulled, saved to the cache, and finally appears in the view.
I found out that socket.io can help me, but I haven't found an example suited to my purpose. Can somebody help me?
Here is a snippet of my code from app.js:
var dashboardData = "";
//----fired when cache has changed
myCache.on( "set", function( key, value ){
stats = JSON.stringify(myCache.getStats());
stats = JSON.parse(stats);
console.log(stats.keys);
var content = JSON.stringify(value);
content = JSON.parse(content);
dashboardData = content;
});
//----load dashboard view
app.post('/dashboard', function(req, res) {
projectName = req.body.selectProjectName;
if(dashboardData == ""){
var content = "";
dashboard.dashboard(req,res, projectName, content);
} else {
dashboard.dashboard(req,res, projectName, dashboardData.variable);
}
});

My current solution: I've added a jQuery script with an asynchronous AJAX call that updates the view every 5 seconds.
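If you want the server to push updates instead of the client polling, a minimal socket.io sketch could look like the following. It assumes the existing Express app and node-cache instance (myCache) from above, that the dashboard page loads the socket.io client script, and that the event name 'dashboardUpdate' is freely chosen:

// app.js – attach socket.io to the HTTP server (a sketch, not a drop-in replacement)
var http = require('http').createServer(app);
var io = require('socket.io')(http);

// push the new data to every connected dashboard whenever the cache changes
myCache.on("set", function(key, value) {
    dashboardData = value;
    io.emit('dashboardUpdate', value); // 'dashboardUpdate' is an arbitrary event name
});

http.listen(3000);

On the client, the view listens for that event instead of polling every 5 seconds:

// dashboard view – requires <script src="/socket.io/socket.io.js"></script>
var socket = io();
socket.on('dashboardUpdate', function(data) {
    // re-render the relevant part of the view with the new data
    $('#dashboard').text(JSON.stringify(data));
});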

Related

When showing a new growl in AngularJS, clear the old ones from the view

I am working with growl.info() in AngularJS and I have a question: how can I check whether a growl already exists in the view (on screen) before adding a new one? If a new one is about to be shown, the previous one must be removed from the screen. The code in the controller is this:
$scope.showInfo = function () {
    var info = "test";
    growl.info(message.replace("{0}", info), {
        ttl: 50000
    });
};
Note that the ttl is important too: if no new growl is shown, the first one must stay visible for a long period. Thank you in advance!
First we add a public variable:
$scope.growlMessage = null;
and then, before assigning the new one, we check whether it already has a value (and if so, destroy it):
$scope.showInfo = function () {
    if ($scope.growlMessage != null) {
        $scope.growlMessage.destroy();
    }
    var info = "test";
    $scope.growlMessage = growl.info(message.replace("{0}", info), {
        ttl: 50000
    });
};

Write an upload's metadata with information stored in the Firebase DB

I've been hitting my head against the wall on this for about 2 hours and I think I've just lost sight of the problem a bit.
I have an incremental field saved as "index" whose value increases by 1 when a file upload starts.
I am able to query the database and pull the value for index to the console and receive the updated value.
I can't for the life of me work out how to insert the value I've created, and subsequently logged to the console (it definitely doesn't need to be logged; I just did this to prove to myself I wasn't going insane), into the upload's metadata at the next stage of the script. I have tried everything I can think of and watched about an hour of YouTube videos, and I can safely say beyond a shadow of a doubt that I could turn my app into a running counter of people's file uploads, but I can't add it to their upload metadata!
Help me, Stack Overflow, you're my only hope!
The code below hopefully outlines the issue: the query result goes into the variable indexRef, but the actual info I need is in the nested variable "key", which is just the data snapshot value. This seems like it should be so easy.
var indexRef = firebase.database().ref('index');
indexRef.once('value')
    .then(function(snapshot) {
        var key = snapshot.val();
        console.log(key);
    });

var imagekey = firebase.database().ref('images/').push().key;
var downloadURL = uploadTask.snapshot.downloadURL;
var updates = {};
var postData = {
    url: downloadURL,
    score: 1500,
    index: indexRef,
    user: user.uid
};
updates['/images/' + imagekey] = postData;
firebase.database().ref().update(updates);
Thanks in advance, and I apologise if the answer to this is trivial and I've wasted someone's time!
Remember that the then method returns a promise: https://firebase.googleblog.com/2016/01/keeping-our-promises-and-callbacks_76.html
var indexRef = firebase.database().ref('index');
// Declare variables outside of the block to access them within
var imagekey = firebase.database().ref('images/').push().key;
var downloadURL = uploadTask.snapshot.downloadURL;
var updates = {};

indexRef.once('value')
    .then(function(snapshot) {
        var key = snapshot.val();
        // Return the key variable for use in the next then()
        return key;
    })
    .then(function(key) {
        // You can now access the key variable in here
        var postData = {
            url: downloadURL,
            score: 1500,
            index: key,
            user: user.uid
        };
        updates['/images/' + imagekey] = postData;
        firebase.database().ref().update(updates);
    });
Hope this helps you

Export data from Google AppMaker Datasource automatically

Does anyone know how we can generate a report from the data in a datasource in Google AppMaker automatically (e.g. generate the report at 12 a.m.), instead of manually clicking "export data" in deployments every time a user needs the report?
I have seen something similar in Exporting data out of Google AppMaker, but no one has tried to answer that one either.
I'd really appreciate it if anyone knows how to solve this :)
This can be achieved by using Installable Triggers.
Say, for example, you have a model with student data that has three fields: name (string), age (number) and grade (number). On the server script you can write something like this:
//define function to do the data export
function dataExport() {
    //create a spreadsheet to populate with data
    var fileName = "Students List " + new Date(); //define file name
    var newExport = SpreadsheetApp.create(fileName); //create new spreadsheet
    var header = ["Name", "Age", "Grade"]; //define header
    newExport.appendRow(header); //append header to spreadsheet
    //get all student records
    var ds = app.models.students.newQuery();
    var allStudents = ds.run();
    for (var i = 0; i < allStudents.length; i++) {
        //get each student's data
        var student = allStudents[i];
        var studentName = student.name;
        var studentAge = student.age;
        var studentGrade = student.grade;
        var newRow = [studentName, studentAge, studentGrade]; //save student data in a row
        newExport.appendRow(newRow); //append student data row to spreadsheet
    }
    console.log("Finished Exporting Student Data");
}
//invoke this function to set up the auto export
function exportData() {
    //check if there is an existing trigger for this process
    var existingTrigger = PropertiesService.getScriptProperties().getProperty("autoExportTrigger");
    //if the trigger already exists, inform the user about it
    if (existingTrigger) {
        return "Auto export is already set";
    } else { //if the trigger does not exist, continue to set the trigger to auto export data
        //runs the script every day at 1am in the time zone specified
        var newTrigger = ScriptApp.newTrigger('dataExport')
            .timeBased()
            .atHour(1)
            .everyDays(1)
            .inTimezone("America/Chicago")
            .create();
        var triggerId = newTrigger.getUniqueId();
        if (triggerId) {
            PropertiesService.getScriptProperties().setProperty("autoExportTrigger", triggerId);
            return "Auto export has been set successfully!";
        } else {
            return "Failed to set auto export. Try again please";
        }
    }
}
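To set this up from the app itself, you would call exportData from a client script, for instance in the onClick handler of a button. This is only a sketch; the page name (Dashboard) and the label widget (ResultLabel) are assumptions of mine, not part of the original setup:

//client script – onClick handler of a "Set up auto export" button
google.script.run
    .withSuccessHandler(function(message) {
        //show the message returned by exportData(); ResultLabel is a hypothetical label widget
        app.pages.Dashboard.descendants.ResultLabel.text = message;
    })
    .exportData();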
Then, to delete/stop the auto export in case you need to, you can also write the following in the server script:
function deleteTrigger() {
    //get the current auto export trigger id
    var triggerId = PropertiesService.getScriptProperties().getProperty("autoExportTrigger");
    //get all triggers
    var allTriggers = ScriptApp.getProjectTriggers();
    //loop over all triggers
    for (var i = 0; i < allTriggers.length; i++) {
        if (allTriggers[i].getUniqueId() === triggerId) {
            //if the current trigger is the correct one, delete it
            ScriptApp.deleteTrigger(allTriggers[i]);
            break;
        } else {
            //otherwise delete the other triggers found as well
            ScriptApp.deleteTrigger(allTriggers[i]);
        }
    }
    PropertiesService.getScriptProperties().deleteProperty("autoExportTrigger");
    return "Auto export has been cancelled";
}
You can check the demo app right here.
The reference to the script properties service is here.
The reference to the Time Zones list is here.
I hope this helps!
It seems that you are looking for daily database backups. The App Maker team recommends migrating apps to Cloud SQL if you haven't done so already. Once you start using Cloud SQL as your data backend, you can configure backups through the Google Cloud Console:
https://cloud.google.com/sql/docs/mysql/backup-recovery/backups

Meteor JS PubSub based on action

I need your help or a suggestion regarding my refresh function. I have a button called refresh; when it is clicked, it should refresh (re-sort) the data based on the createdAt field. I have been battling for days trying to get this right by resubscribing, and I am not sure whether that is the correct way or not.
Is there a correct way to resubscribe to, or re-sort, a collection on the client when the button is clicked? Thanks a lot.
Yes, you can do this with the following steps:
Pass the sorting type (asc or desc) into the router query.
Update the subscription's sorting on the server (a sketch of this publication is shown after the code sample below).
You also need to update the sort of your client-side find() calls, because when the data does not change, or only a few documents are updated by your re-subscription, the oldest data would otherwise still come first.
You can subscribe or re-subscribe to a collection at either the router level or the template level. If you are using Flow Router, re-subscribing will not work, simply because Flow Router is not reactive. I prefer to subscribe at the template level, using an Iron Router query.
Here is the code sample:
Template.templateName.onRendered(function() {
    this.autorun(function() {
        var sort = {};
        if (!Router.current().params.query || Router.current().params.query.sortType == 1) {
            sort.createdAt = 1;
        } else {
            sort.createdAt = -1;
        }
        //You can use this handle to show/hide a loader.
        var handle = Meteor.subscribe('subscriptionName', sort);
    });
});
Template.templateName.helpers({
    'data': function() {
        var sort = {};
        if (!Router.current().params.query || Router.current().params.query.sortType == 1) {
            sort.createdAt = 1;
        } else {
            sort.createdAt = -1;
        }
        return collection.find({}, {sort: sort});
    }
});
Template.templateName.events({
    'click .refresh': function() {
        var sortType = value; //get the value -1 or 1 from the HTML
        Router.go('routeName', {}, {query: {sortType: sortType}});
    }
});
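For completeness, here is a minimal sketch of the matching server-side publication that accepts the sort option (step 2 above). The publication name 'subscriptionName' and the collection variable are taken from the snippets above; the validation line is my own addition:

// server – publication used by Meteor.subscribe('subscriptionName', sort)
Meteor.publish('subscriptionName', function(sort) {
    check(sort, Object); // from the check package; validate what the client sends
    return collection.find({}, {sort: sort});
});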

How to create multiple object stores in IndexedDB

I don't know if I'm right or wrong, but as far as I know I can't create a version change transaction manually. The only way to trigger one is by changing the version number when opening the IndexedDB connection. If this is correct, the new objectStore in Example1 and Example2 will never be created?
Example1
function createObjectStore(name) {
    var request2 = indexedDB.open("existingDB");
    request2.onupgradeneeded = function() {
        var db = request2.result;
        var store = db.createObjectStore(name);
    };
}
Example2
function createObjectStore(name) {
    var request2 = indexedDB.open("existingDB");
    request2.onsuccess = function() {
        var db = request2.result;
        var store = db.createObjectStore(name);
    };
}
Example3 - This should work:
function createObjectStore(name) {
    var request2 = indexedDB.open("existingDB", 2);
    request2.onupgradeneeded = function() {
        var db = request2.result;
        var store = db.createObjectStore(name);
    };
}
If I want to create multiple objectStores in one database, how can I get/fetch the database version before opening the database?
So, is there a way to automate this process of getting the database version number?
Is there any other way to create an objectStore other than using the onupgradeneeded event handler?
Please help. Thanks a lot.
Edit:
Here is the same problem that I have:
https://groups.google.com/a/chromium.org/forum/#!topic/chromium-html5/0rfvwVdSlAs
You need to open the database to check its current version, then open it again with version + 1 to trigger the upgrade.
Here is the sample code:
function CreateObjectStore(dbName, storeName) {
    var request = indexedDB.open(dbName);
    request.onsuccess = function (e) {
        var database = e.target.result;
        var version = parseInt(database.version);
        database.close();
        var secondRequest = indexedDB.open(dbName, version + 1);
        secondRequest.onupgradeneeded = function (e) {
            var database = e.target.result;
            var objectStore = database.createObjectStore(storeName, {
                keyPath: 'id'
            });
        };
        secondRequest.onsuccess = function (e) {
            e.target.result.close();
        };
    };
}
The only way you can create an object store is in the onupgradeneeded event. You need a versionchange transaction to be able to change the schema, and the only way of getting a versionchange transaction is through an onupgradeneeded event.
The only way to trigger the onupgradeneeded event is by opening the database with a higher version than its current version. The best way to do this is to keep a constant with the current version of the database schema you need to work with; every time you need to change the schema, you increase this number. Then, in the onupgradeneeded event, you can retrieve the version the user currently has. With this, you can decide which upgrade path you need to follow to get to the latest database schema.
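As an illustration, here is a minimal sketch of that versioned-upgrade pattern. The DB_VERSION constant and the store names are assumptions for the example, not part of the question's schema:

var DB_VERSION = 3; // increase this constant every time the schema changes

var request = indexedDB.open("existingDB", DB_VERSION);
request.onupgradeneeded = function (e) {
    var db = e.target.result;
    // e.oldVersion is the version the user currently has,
    // so only the missing upgrade steps are applied
    if (e.oldVersion < 1) {
        db.createObjectStore("store1");
    }
    if (e.oldVersion < 2) {
        db.createObjectStore("store2");
    }
    if (e.oldVersion < 3) {
        db.createObjectStore("store3");
    }
};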
I hope this answers your question.
