Client-side routing. How does it work? - javascript

I need a client-side routing solution to work with a Chrome app. I've researched several and crossroads.js seems like a good fit. When I include it in my HTML file, it doesn't seem to work; that is, if I use code like
crossroads.addRoute('/news/{id}', function(id){
    alert(id);
});
crossroads.parse('/news/123');
the page alerts '123', but if I type '/news/321' in the browser's URL bar, it performs the browser's default action instead of alerting '321'. What am I doing wrong? (Also, I realize the title is broad, but I believe the difficulties I'm having with crossroads.js are more general than crossroads.js in particular; it is given as an example.)

Use Hasher (by the same author) as well.
The documentation on the Crossroads page tells you that you need to use Hasher, because it is what monitors the window.location hash.
So you also need to include Hasher and initialise it; then you can hook your Crossroads routes into Hasher so it starts monitoring for those particular routes.
//setup crossroads
crossroads.addRoute('foo');
crossroads.addRoute('lorem/ipsum');
crossroads.routed.add(console.log, console); //log all routes
//setup hasher
hasher.initialized.add(crossroads.parse, crossroads); //parse initial hash
hasher.changed.add(crossroads.parse, crossroads); //parse hash changes
hasher.init(); //start listening for history change
//update URL fragment generating new history record
hasher.setHash('lorem/ipsum');
http://millermedeiros.github.com/crossroads.js/

The parse command tells Crossroads to look at the given string and perform an action based on it.
So in the case of crossroads.parse('/news/123'); it will always parse /news/123.
Since you want Crossroads to parse what you have in the browser's address bar, you need to pass that value to the parse method:
crossroads.parse(document.location.pathname);
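If you don't want to pull in a hash library, a minimal sketch of the same idea with a plain hashchange listener (assuming hash-based URLs such as #/news/123, which is an assumption about your setup):
// Minimal sketch, assuming routes live in the URL hash, e.g. #/news/123
function parseHash() {
    // strip the leading '#' so crossroads sees '/news/123'
    crossroads.parse(window.location.hash.replace(/^#/, ''));
}
window.addEventListener('hashchange', parseHash);
parseHash(); // also parse the URL the page was loaded with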

Related

A new Cookie is added instead of replacing existing one

I just finished localizing my web application, using Spring Boot configuration as a base.
@Bean
public LocaleResolver localeResolver() {
    return new CookieLocaleResolver();
}
Due to a requirement, users are supposed to be able to change the locale/language of the website by pressing a button. Said function is implemented with a little bit of JS and a cookie.
<script>
    function updateCookie(lang) {
        let name = "org.springframework.web.servlet.i18n.CookieLocaleResolver.LOCALE"
        document.cookie = name + "=" + lang
        location.reload()
    }
</script>
<a onclick="updateCookie('de')" class="flag-icon flag-icon-de mr-2"></a>
The idea is to update said cookie on click of a button and use it throughout the whole application. This works fine until I try to call a specific endpoint in my application.
In order to debug my application I use:
window.onload = function () {
    alert(document.cookie)
}
Now to my problem:
When user-testing the application, this is the alert feedback:
org.springframework.web.servlet.i18n.CookieLocaleResolver.LOCALE=de
Switching to other pages, refreshing, changing the language, etc. properly updates the cookie with a different value.
When calling a specific endpoint though, I get the following alert:
org.springframework.web.servlet.i18n.CookieLocaleResolver.LOCALE=de;
org.springframework.web.servlet.i18n.CookieLocaleResolver.LOCALE=fr
Instead of resetting/changing the existing cookie, a new one is added with the value 'de;'. A seemingly random semicolon is added.
This doesn't happen with endpoints using similar logic and almost identical implementation.
There is no further logic outside the little bit of JS code I've posted and I'm not touching the cookie in the backend.
Unfortunately I'm out of ideas. Any tips/help would be appreciated.
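One possible lead (an assumption on my part, not something confirmed in this thread): document.cookie defaults a cookie's path to the current page's path, so the cookie written by the script above and the cookie written by the server for / can coexist as two separate entries. Pinning the path explicitly when writing would look like:
function updateCookie(lang) {
    let name = "org.springframework.web.servlet.i18n.CookieLocaleResolver.LOCALE"
    // explicit path so the same cookie entry is overwritten from any page
    document.cookie = name + "=" + lang + ";path=/"
    location.reload()
}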

(JavaScript API 1.3 for Office) Set Value of Custom Properties

My client has decided to migrate to Office 2016 and porting portions of a business process to that client requires us to offer a replacement to the Document Information Panel, which is no longer available. The Backstage file information area isn't considered a sufficient user experience for the users in question, so we're endeavoring to replace the DIP with a Task Pane app.
This example: https://www.youtube.com/watch?v=LVGqpns0oT8&feature=share shows that the idea is, at least in theory, possible. We considered buying this app but can't find sufficient information to do so.
So we set about attempting to replicate the functionality we need in the DIP. It appears that we can successfully set Document Properties of standard types, such as strings, which looks something like this:
Word.run(function (context) {
    var properties = context.document.properties;
    context.load(properties);
    return context.sync().then(function () {
        properties.title = properties.title + " Additional Title Text"; // once the sync goes off, this works.
        return context.sync();
    });
});
However, when we try to update a document property that is, for example, a Managed Metadata property defined by a SharePoint content type, the value in the proxy object loads and remains changed, but it seems to break its relationship with the actual document property. The code below demonstrates:
Word.run(function (context) {
    var properties = context.document.properties;
    var customProperties = properties.customProperties;
    context.load(properties);
    context.load(customProperties);
    return context.sync().then(function () {
        var managedMetadataProperty = customProperties.getItem('MngdMetadata');
        properties.title = properties.title + " Additional Title Text"; // once the sync goes off, this works.
        context.load(managedMetadataProperty);
        return context.sync().then(function () {
            console.log(managedMetadataProperty.value); // let's say this looks like "10;#Label 1|64d2cd3d-57d4-4c23-9603-866d54ee74f1"
            managedMetadataProperty.value = "11;#Label 2|cc3d57d4-4c23-72d4-3031-238b9100f52g";
            return context.sync(); // now the value in the JavaScript object for managedMetadataProperty is updated, but the value in the document does not change.
        });
    });
});
The Managed Metadata document property never changes in the Word UI, nor does the change push back to SharePoint. Say we save and close the document after making the update, then re-open it: the property value has not visibly changed; however, when we load the proxy object with context.load(), the value that's available reflects the changes we made on the last run.
I'm unclear about why this would be. It seems that to circumvent this, I would need to make a call back to SharePoint to update the relevant field, but I don't know how I would instruct Word to refresh with the new information from SharePoint.
That's a great question.
The custom properties API gives you access to some built-in properties as well as custom properties. SharePoint-related properties do NOT fall into this category from the API perspective (and the same is true in VBA/VSTO/COM). To access those you need to use the CustomXmlParts functionality. Here is a good example of how to use it in the JavaScript API.
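For instance, a minimal sketch of reading such a part through the shared Office.js custom XML API (the namespace below is the standard SharePoint property-bag namespace; treat it as an assumption for your particular document):
// Sketch: read the SharePoint property bag via the shared Office.js API
Office.context.document.customXmlParts.getByNamespaceAsync(
    "http://schemas.microsoft.com/office/2006/metadata/properties",
    function (result) {
        if (result.status === Office.AsyncResultStatus.Succeeded && result.value.length > 0) {
            result.value[0].getXmlAsync(function (xmlResult) {
                console.log(xmlResult.value); // raw XML holding the SharePoint-managed properties
            });
        }
    }
);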
Also, FYI, the team is working right now on a feature to enable the DIP again. I don't have concrete dates or a commitment, but you might get this functionality again out of the box soon.
Have you tried customPropertyCollection.add(key, value)?
It will replace existing key/value pairs in the customPropertyCollection.
Here is the documentation: customPropertiesCollection
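For illustration, a minimal sketch of that approach, reusing the value from the question (whether Word then round-trips the change to SharePoint is exactly what the question is about):
Word.run(function (context) {
    var customProperties = context.document.properties.customProperties;
    // add() overwrites an existing custom property that has the same key
    customProperties.add('MngdMetadata', "11;#Label 2|cc3d57d4-4c23-72d4-3031-238b9100f52g");
    return context.sync();
});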

NodeJS, SocketIO and Express logic context build

I've read a lot about Express / Socket.IO, and it's crazy how rarely you find an example other than a "Hello" transmitted directly from app.js. The problem is that it doesn't work like that in the real world... I'm desperate over a logic problem which seems far away from what the web gives me, which is why I wanted to point this out. I'm sure asking will be the solution! :)
I'm refactoring my app (because there were many mistakes, like using the global scope to store libs, etc.). Let's say I've got a huge system based on Socket.IO and NodeJS. There's a loader in app.js which starts the socket system.
When someone joins the app, it require()s another module: this initializes many socket.on() listeners, which are loaded dynamically from /*_socket.js files in a folder. Each function in those modules represents a socket listener, which makes it way easier to call from the front-end. It might look like this:
// Will call `user_socket.js` and method `try_to_signin(some params)`
Queries.emit_socket('user.try_to_signin', {some params});
The system itself works really well. But there's a catch: the module that loads all those files and understands what the front-end has sent also transmits libraries linked to req/res (sessions, cookies, others...) and must do so, because the called methods are the core of the app and very often need those libraries.
In the previous example we obviously need to check if the user isn't already logged-in.
// The *_socket.js file looks like this:
var $h = require(__ROOT__ + '/api/helpers');
module.exports = function($s, $w) {
    var user_process = require(__ROOT__ + '/api/processes/user_process')($s, $w);
    return {
        my_method_called: function(reference, params, callback) {
            // Stuff using $s, $w, etc.
        }
    };
};
// And it's called this way:
// $s = services (a big object)
// $w = workers (a big object depending on $s)
// They are linked with the req/res from the page when they are instantiated
controller_instance = require('../sockets/' + controller_name + '_socket')($s, $w);

// After some processes ...
socket_io.on(socket_listener, function (datas, callback) {
    // Will call the correct function, etc.
    $w.queries.handle_socket($w, controller_name, method_name, datas);
});
The good news: basically, it works.
The bad news: every time I refresh the page, the listeners double themselves, because they are registered in a loop that runs on page load.
(A screenshot followed here showing the same log output repeated; it should have been one line.)
So I should put all the socket.on('connection', ...) stuff outside the page load, which means registering it when the server starts... Yes, but I also need the req/res data to be able to load the libraries, and I only get that when the page is loaded!
It's a programming logic problem. I know I did something wrong but I don't know where to go now; I've got this big system which "basically" works, but there's something like a paradox in the way I built it, and I can't figure out how to resolve it... I've been stuck for a couple of hours.
How can I refactor so that a socket.on() call can get hold of the current libraries that depend on req/res? Is there a trick? Should I think about completely changing the way I did it?
Also, is there another way to do what I want to do ?
Thank you everyone !
NOTE: If I didn't explain well or if you want more code, just tell me :)
EDIT - SOLUTION: As seen in the answer below, we can use sockets.once() instead of sockets.on(); there's also the sockets.removeAllListeners() solution, which is less clean.
Try as below:
io.sockets.once('connection', function (socket) {
    io.sockets.emit('new-data', {
        channel: 'stdout',
        value: data // `data` comes from the surrounding scope in the original example
    });
});
Use once instead of on.
This problem is similar to the one in the following link:
https://stackoverflow.com/questions/25601064/multiple-socket-io-connections-on-page-refresh/25601075#25601075
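For completeness, the removeAllListeners() variant mentioned in the question's edit might look like this (a sketch; the event name is taken from the question and the handler body is hypothetical):
io.sockets.on('connection', function (socket) {
    // drop handlers left over from a previous page load before re-registering
    socket.removeAllListeners('user.try_to_signin');
    socket.on('user.try_to_signin', function (datas, callback) {
        // handle the sign-in attempt here
    });
});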

Node.js changing exports on the fly

Changing exports.X in a function seems not to work...
I want to be able to load settings from a file and access them in Node.js. I have this working currently; however, the clients connecting to my Node application can edit what's in the settings file. Unfortunately, as it stands, the Node application has to be restarted for the changes to take effect. Is there a way I can reload module.exports on the fly?
EDIT:
The settings file is literally a JSON string.
My settings module is require()d in almost every single file, and there are a lot of files... so reloading it on a per-file basis is out of the question. I do, however, know precisely when someone makes a change to the settings.
If you are using require to load the settings and only referencing the settings from one module, then doing something along the lines of:
delete require.cache[require.resolve(filename)];
will work for you.
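As a sketch, that can be wrapped in a small helper (the name reloadSettings is mine, not a Node API):
// Hypothetical helper: drop the cached copy and re-require the file
function reloadSettings(filename) {
    delete require.cache[require.resolve(filename)];
    return require(filename);
}

var settings = reloadSettings('./settings.json');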
If, on the other hand, multiple modules will be referencing these settings, that approach can become a bit unwieldy and open you up to unforeseen bugs. For example, if any of the modules are holding on to a reference to the required settings file, they would each need to somehow learn that the settings had changed and update their references.
To alleviate (not completely solve) the caching issue, you can build your settings interface so that consumers must access the settings object via a function, and/or require that properties are accessed via functions. Even with this model, someone may still decide to cache a setting, causing an obscure failure later down the road.
Using the simplest approach of a single getter for the settings object would look something like this:
var settings = require('./settings.json');
// ... watch for changes and reload by invalidating node's cache
module.exports = function() { return settings; }
Usage:
var settings = require('./path/to/settings');
settings().foo;
There are several libraries that do settings. Depending on your needs, I'm partial to nconf.
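For example, a minimal nconf sketch (the file name and key are assumptions):
var nconf = require('nconf');

// file-backed store; get() reads from the loaded configuration hierarchy
nconf.file({ file: './settings.json' });
var mySetting = nconf.get('mySetting');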
I'd set up a file watcher here that checks for changes of a JSON file dynamically. It is not recommended practice to change a JS script once the app is running.
Something like:
var _ = require("lodash");
var fs = require("fs");

var result = {};
fs.watch('my-settings.json', function (event, filename) {
    // `filename` is not provided on every platform, so read the watched file directly
    fs.readFile('my-settings.json', function (err, data) {
        if (err) {
            // your error catching
        }
        _.extend(result, JSON.parse(data));
    });
});
module.exports = result;
Now, this comes with lots of caveats. First, fs.watch is not supported on all platforms.
http://nodejs.org/api/fs.html#fs_fs_watch_filename_options_listener
Second, it's really awkward to change a property like this. The expectation is generally that the exports of a module do not mutate. I'd instead recommend exposing a method whose result can change based on the state of the file, i.e. a getter for the resulting data.
Third, a file watcher can be expensive, memory-wise.
This is better code, IMHO:
var fs = require("fs");

var filename = 'my-settings.json';
var lastModified;
var mySetting;

module.exports = {
    getSettingAsync: function (callback) {
        fs.stat(filename, function (err, stat) {
            if (err) {
                // your error catching
            }
            // compare timestamps as numbers; two Date objects are never equal by reference
            if (lastModified && stat.mtime.getTime() === lastModified) {
                callback(mySetting);
            } else {
                fs.readFile(filename, function (err, data) {
                    if (err) {
                        // your error catching
                    }
                    // this assumes that your data is always correct
                    lastModified = stat.mtime.getTime();
                    mySetting = JSON.parse(data).mySetting;
                    callback(mySetting);
                });
            }
        });
    }
};
In this case, we both check a JSON file and expose the result as an async method. You could just as easily change the code to use the sync versions if need be and return the value instead of invoking the callback. This version checks when the file was last changed, which is cheaper than reading the whole file every time; it only reads the file if it is newer, and saves you the need to use a potentially buggy file watcher.
By the way, I've not tested this code and it may contain errors as is, but the concept is sound.
But perhaps the more salient question: why not just store that value in the database?

Trying to save open MS Access documents from JScript

I was hoping to save all open MS Access documents via a JScript run from the Windows Script Host.
So far I was able to obtain the MS Access object by calling:
var objAccess = GetObject('', "Access.Application");
But now I'm stumped. If it were MS Word, I'd enumerate all open documents in the .Documents property and call the Documents.Item(n).SaveAs() method on each of them.
But how do you save-as all open documents in MS Access?
After you have your object variable set to an Access application instance with GetObject, use its Quit method with the acQuitSaveAll option (value = 1). Not sure about JScript; in VBScript, I can do it like this.
Dim objAccess
Set objAccess = GetObject(,"Access.Application")
WScript.Echo objAccess.CurrentDb.Name
objAccess.Quit(1) ' acQuitSaveAll
Set objAccess = Nothing
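For the JScript side, a rough, untested port of the same calls might look like the following; note that JScript cannot omit GetObject's first argument the way VBScript can, so attaching to the already-running instance (rather than creating a new one, per the note below) is the part to verify:
// JScript sketch for Windows Script Host (untested)
var objAccess = GetObject("", "Access.Application"); // caveat: may create a new instance, see note below
objAccess.Quit(1); // 1 = acQuitSaveAll
objAccess = null;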
Note: when I used GetObject as in your example, objAccess was a new Access application instance rather than a reference to the instance which was running previously. So, with the GetObject line like this ...
Set objAccess = GetObject('', "Access.Application")
... the WScript.Echo line threw an error at CurrentDb.Name (because there was no database open in that new Access application instance).
This approach will save any changes to database objects (tables, forms, reports, etc.) which were in design mode but not saved. However if a user has any unsaved changes to data in a form, those changes will be discarded despite the acQuitSaveAll option. It seems that option only applies to objects, not data.
Edit: If that approach is not satisfactory, you can do something more sophisticated with VBA in your Access applications, as @Remou mentioned in his comment. An example is KickEmOff from Arvin Meyer. He also offers a sample database which demonstrates that code in action.
Edit 2: @Remou's comment got me thinking that acQuitSaveNone (value = 2) should be safer than acQuitSaveAll ... the unsaved object changes would be discarded, but at least you would be less likely to save an object in a non-functional state.
