SignalR: hosting client scripts for a JavaScript client on the server - javascript

I have a SignalR (OWIN self-hosted) server hub working with a .NET client. Now I'm preparing to write a web client. I can see that the hub scripts are served at http://localhost:10102/signalr/hubs, but I cannot see Scripts/jquery-.min.js or Scripts/jquery.signalR-.min.js.
I assume those scripts are not served by the server hub (but are added to the solution by NuGet by default) - am I right, or am I missing something?
Is there a way to reference those scripts directly from the server (rather than copying and hosting them on the JavaScript client side)?

General:
https://learn.microsoft.com/en-us/aspnet/signalr/overview/guide-to-the-api/hubs-api-guide-javascript-client:
A JavaScript client requires references to jQuery and the SignalR core
JavaScript file. The jQuery version must be 1.6.4 or major later
versions, such as 1.7.2, 1.8.2, or 1.9.1. If you decide to use the
generated proxy, you also need a reference to the SignalR generated
proxy JavaScript file. The following example shows what the references
might look like in an HTML page that uses the generated proxy.
You only have to add the following scripts to your index.html (pay attention to the versions):
<script src="Scripts/jquery-1.10.2.min.js"></script>
<script src="Scripts/jquery.signalR-2.1.0.min.js"></script>
<script src="signalr/hubs"></script>
Serve these files from the server:
Create a directory in the server project where you place these JS files.
Configure your server so that it serves these files. To do that, add app.UseFileServer(); to the Configure(...) method in your Startup class. (See details about serving static files: https://learn.microsoft.com/en-us/aspnet/core/fundamentals/static-files)
Add the required script references in the client. Here is an example (change the addresses and script file to match your files and your server address):
<script type="text/javascript" src="http://localhost:10310/scripts/signalr-clientES5-1.0.0-alpha2-final.js"></script>

Actually, you don't need those scripts to implement the SignalR service (the backend part). To make a connection between the client index.html and your service, you need some kind of client library that works with SignalR to establish the connection.
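For illustration, here is a minimal sketch of what that client-side connection can look like with the jQuery SignalR client and the generated proxy (the hub and method names below are placeholders, not from the question):
// sketch only: "myHub", "addMessage" and "send" are hypothetical names
$.connection.hub.url = "http://localhost:10102/signalr";
var proxy = $.connection.myHub;

// client-side callback the server can invoke
proxy.client.addMessage = function (message) {
    console.log(message);
};

$.connection.hub.start().done(function () {
    proxy.server.send("hello from the browser"); // call a hub method on the server
});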

Here's the EXACT solution I came up with, based on the answers in this thread:
Static files (such as additional JavaScript files) can be served from the same host with the configuration below. The scripts will be available at http://{yourhost}/scripts/{scriptName} when placed inside the \Scripts folder of the solution ('Copy To Output Directory' has to be set to 'Copy if newer' for each of the files).
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Branch the pipeline here for requests that start with "/signalr"
        app.Map("/signalr", map =>
        {
            // Setup the CORS middleware to run before SignalR.
            // By default this will allow all origins. You can
            // configure the set of origins and/or http verbs by
            // providing a cors options with a different policy.
            map.UseCors(CorsOptions.AllowAll);
            var hubConfiguration = new HubConfiguration
            {
                // You can enable JSONP by uncommenting the line below.
                // JSONP requests are insecure but some older browsers (and some
                // versions of IE) require JSONP to work cross domain
                // EnableJSONP = true
            };
            // Run the SignalR pipeline. We're not using MapSignalR
            // since this branch already runs under the "/signalr"
            // path.
            map.RunSignalR(hubConfiguration);
        });

        // Serve the JavaScript libraries required by the client as static content from the Scripts folder
        app.UseStaticFiles(new StaticFileOptions()
        {
            RequestPath = new PathString("/scripts"),
            FileSystem = new PhysicalFileSystem(@".\Scripts"),
        });
    }
}
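With that configuration in place, a web client hosted elsewhere can reference everything straight from the self-hosted server, along these lines (the port comes from the question; the exact file names and versions depend on which files you copied into \Scripts):
<script src="http://localhost:10102/scripts/jquery-1.10.2.min.js"></script>
<script src="http://localhost:10102/scripts/jquery.signalR-2.1.0.min.js"></script>
<script src="http://localhost:10102/signalr/hubs"></script>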

Related

Bokeh: change the endpoints for /static

Is there a way to change the endpoint that static JS files get served up from in Bokeh?
I have a number of bokeh dashboards that are accessed from behind a load balancer
Any request to https://myloadbalancer/{dashboard_name} is sent to the load balancer
The load balancer then routes the requests to the correct dashboard server based on the value of /{dashboard_name}
For any given dashboard, bokeh then attempts to access static javascript via https://myloadbalancer/static
For this to work, I need to create a new Bokeh server just to serve up static files and then configure the load balancer to route requests to https://myloadbalancer/static to the new server
This approach is fine, until you start getting different javascript dependencies in the different dashboards
Does anyone know of a way to change the /static path of a Bokeh dashboard, so that, for example, it reads static files from https://myloadbalancer/{dashboard_name}/static?
You can run your bokeh serve command with the option --prefix <base_path>. This will prepend <base_path> to every resource requested from the Bokeh (Tornado) web server; this also applies to /static resources.
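For example (a sketch; substitute your own app directory and prefix):
bokeh serve my_dashboard --prefix /my_dashboard
With that, Bokeh requests its resources, including /static, under /my_dashboard/..., which is what the load-balancer routing described in the question needs.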
Here you can find the official Bokeh documentation page.
Kind regards

WebPack: Accessing Node Environment Variables in client code

I'm working on a project using Webpack to package my client side JavaScript and CSS. It launches a server so I can do hot reloading and other neat tricks. So when I'm debugging my application, the webpack server is running at localhost:3000. I am also using nodemon to launch another web server to host my API calls. It obviously can't run on the same port, so I have to launch it on port 3002.
I have set a Node environment variable that tells my API what port it should host on. I need to somehow gain access to that same environment variable in my client script so my AJAX calls know what port they need to be calling.
Before I started using webpack, I was hosting my API and my client code on the same port and I could just make API calls like 'controller/action'. Now that they are hosted on essentially two different domains, my API calls need to use a fully qualified URL including the port, i.e. 'host:port/controller/action'. I understand that I'll also need to configure CORS on my API server as well.
So I need to gain access to the environment variables from my client code so I can determine how to form the API calls in Dev versus Production environments.
Maybe a webpack devServer proxy would be worth pursuing.
devServer: {
  // ...
  proxy: {
    '*/controller/*': {
      target: 'http://localhost:3002'
    }
  }
}
The client would remain blissfully unaware of the differences between development/production.
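For example, the client keeps making the same relative call in both environments (the endpoint below is hypothetical and its path has to match the proxy pattern configured above):
// in development webpack-dev-server forwards matching requests to http://localhost:3002;
// in production the same relative URL hits the API hosted on the same origin
fetch('/controller/action')
  .then(function (response) { return response.json(); })
  .then(function (data) { console.log(data); });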

Ruby on Rails & service worker

I'm trying to use a service worker in my Ruby on Rails application.
I need to use some erb features in my app/javascripts/service-worker.js.erb file. The service worker registration script looks like this:
var registerServiceWorker = function() {
  navigator.serviceWorker.register(
    '<%= asset_path('service-worker.js') %>',
    { scope: '/assets/' }
  )
  .then(function() {
    console.info('Service worker successfully registered');
  })
  .catch(function(error) {
    console.warn('Cannot register service worker. Error = ', error);
  });
}
This does not work; the promise here never resolves:
navigator.serviceWorker.ready.then
I also tried ./ and / scopes but I got this error:
DOMException: Failed to register a ServiceWorker: The path of the
provided scope ('/') is not under the max scope allowed ('/assets/').
Adjust the scope, move the Service Worker script, or use the
Service-Worker-Allowed HTTP header to allow the scope.
If I move my service-worker.js to the public folder, remove the .erb extension and change scope to ./, everything works great, but I have no template engine there.
Any suggestions?
There's now a gem, serviceworker-rails, that will allow you to proxy requests for service worker scripts in the asset pipeline. Because of the way browsers register service worker scripts, it's best to avoid caching them.
Sprockets will fingerprint assets and Rails (or your webserver) typically serves the compiled files with aggressive caching headers from the /assets directory.
What we want is the ability to customize service worker paths and response headers and still get Sprockets preprocessing, i.e. CoffeeScript/ES2015 to JavaScript transpilation.
The serviceworker-rails gem inserts middleware into your Rails stack that will proxy a request for /serviceworker.js (for example) to the corresponding fingerprinted, compiled asset in production. In development, you get the auto-reloading behavior you would expect. You can set up multiple service workers at different scopes as well.
https://github.com/rossta/serviceworker-rails
For security reasons, you can't register a ServiceWorker at a higher scope than the one it was served from.
If you really need the template engine, you may try to dynamically load a JS file from a file in your /public folder (How do I include a JavaScript file in another JavaScript file?). Currently the Service-Worker-Allowed HTTP header is not yet implemented in Firefox (https://bugzilla.mozilla.org/show_bug.cgi?id=1130101).
I'm running into this problem now as well. Based on my research, I think the proper way for a Rails application to handle service workers is to serve the service worker javascript file normally using the Asset pipeline, scope the service worker using the scope option (like you have done), and then use the Service-Worker-Allowed HTTP header to explicitly allow the new scope.
Unfortunately, at the moment there are several barriers to implementing this method:
Rails 4 (apparently) does not allow adding HTTP headers to assets it serves, other than the cache-control header. To work around this you could add a Rack plugin as detailed in this S.O. answer; or, in production, if you are using an asset server such as Nginx, you can get around this by having Nginx add the HTTP header. That doesn't help during development, though.
Rails 5 does allow adding HTTP headers to assets, however, as detailed in this blog post.
Browser support for service workers and the Service-Worker-Allowed HTTP header is currently poor.
Just serve it from the root path. For example, I put a route
get 'service-worker(.:format)', to: 'core#service_worker'
Then just put in your controller (core_controller.rb in my example)
def service_worker
end
Then create app/views/core/service_worker.js.erb and put your JS there (including any <%= stuff %>).
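The client-side registration then just points at that route (a sketch; the path matches the route above, and the scope assumes the worker is served from the root):
navigator.serviceWorker.register('/service-worker.js', { scope: '/' })
  .then(function () {
    console.info('Service worker successfully registered');
  })
  .catch(function (error) {
    console.warn('Cannot register service worker. Error = ', error);
  });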
I recently wrote an article about how to do that entirely with Webpacker. Please find my guide to a Progressive Web App with Rails.
In short:
add the webpacker-pwa npm package with yarn add webpacker-pwa and edit environment.js with:
const { resolve } = require('path');
const { config, environment, Environment } = require('@rails/webpacker');
const WebpackerPwa = require('webpacker-pwa');
new WebpackerPwa(config, environment);
module.exports = environment;
This will allow you to use service workers. Please check the README of the project as well: https://github.com/coorasse/webpacker-pwa/tree/master/npm_package
Leave the service worker file in whatever directory your project structure imposes, set the scope option to /, and add the Service-Worker-Allowed HTTP header to the response for your service worker file.
In ASP.Net, I added the following to web.config file:
<location path="assets/serviceWorker.js">
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <add name="Service-Worker-Allowed" value="/" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</location>
And register the worker by:
navigator.serviceWorker.register('/assets/serviceWorker.js', { scope: '/' })
  .then(function (registration) {
    console.log('Service worker registered successfully');
  })
  .catch(function (e) {
    console.error('Error during service worker registration:', e);
  });
Tested on Firefox and Chrome.
Refer to the examples in the Service Worker specification.

Making signalr generated proxy hubs url dynamic

I have a solution which needs to connect, using CORS, to a SignalR-exposed service.
The address where the SignalR service is hosted could change over time, so it's not
suitable to have the classic script tag
<script src="http://server:8888/signalr/hubs" type="text/javascript"></script>
but it would be fantastic if there were a way to reference the above URL dynamically from JavaScript, without the static script tag.
Suggestions would be great!
Do the following in your JS file:
$.getScript('http://server:8888/signalr/hubs', function () {
  // Set the hubs URL for the connection
  $.connection.hub.url = 'http://server:8888/signalr';
  var hub = $.connection.yourHub; // yourHub is the name of the hub on the server side

  // wire up SignalR
  // start hub
  $.connection.hub.start().done(function () {
    // once we're done with SignalR init we can wire up our other stuff
  });
});
You could put your settings in a config file:
config.json:
{
  "signalr_url": "http://server:8888/signalr"
}
Then load the config:
$.getJSON('config.json', function(config) {
  $.getScript(config.signalr_url, function() {
    // when the script has loaded
  });
});
Edit:
After looking at the SignalR documentation, I think the following would be more suitable.
First include the needed scripts:
<script src="Scripts/jquery-1.6.4.min.js" type="text/javascript"></script>
<script src="Scripts/jquery.signalR.min.js" type="text/javascript"></script>
Now you can just call $.connection(...) with any URL. So, applying the above, it could look like this:
$.getJSON('config.json', function(config) {
  $.connection(config.signalr_url);
});
I have this same scenario, and after some research we've decided to ship the generated proxy as a physical script, as described below:
How to create a physical file for the SignalR generated proxy
As an alternative to the dynamically generated proxy, you can create a
physical file that has the proxy code and reference that file. You
might want to do that for control over caching or bundling behavior,
or to get IntelliSense when you are coding calls to server methods.
To create a proxy file, perform the following steps:
Install the Microsoft.AspNet.SignalR.Utils NuGet package.
Open a command prompt and browse to the tools folder that contains the
SignalR.exe file. The tools folder is at the following location:
[your solution
folder]\packages\Microsoft.AspNet.SignalR.Utils.2.1.0\tools
Enter the following command:
signalr ghp /path:[path to the .dll that contains your Hub class]
The path to your .dll is typically the bin folder in your project
folder.
This command creates a file named server.js in the same folder as
signalr.exe.
Put the server.js file in an appropriate folder in your project,
rename it as appropriate for your application, and add a reference to
it in place of the "signalr/hubs" reference.
link: How to create a physical file for the SignalR generated proxy
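With the physical proxy file in place, the script references could look like this (a sketch; the file name and folder depend on where you put the generated file):
<script src="Scripts/jquery-1.6.4.min.js" type="text/javascript"></script>
<script src="Scripts/jquery.signalR.min.js" type="text/javascript"></script>
<script src="Scripts/server.js" type="text/javascript"></script>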
As long as we have the proxy, you can just refer to the hub URL like this:
$.connection.hub.url = "http://your-url/signalr";
// open connection
$.connection.hub.start()
  .done(function () {
    // do your stuff
  })
  .fail(function () { alert('unable to connect'); });

socket.io.js not loading

I know there are a bunch of questions on this already, but none have answered it for me, and plus mine is slightly different.
I'm starting the socket.io server in node using this:
var io = require('socket.io').listen(8000);
My terminal says everything is ok:
info - socket.io started
Now I am trying to load the .js file in my client-side browser using this URL:
http://<hostname>:8000/socket.io/socket.io.js
I don't get a 404, it just hangs forever. I've also used a network utility to ping port 8000, and it seems to be open fine.
I installed node and socket.io just yesterday, so they should be the latest versions. Can anyone shed any light on this? Thanks!
Turns out the reason I could never request the .js file was that my company network blocks all ports except the usual ones (80, 21, etc.), so I would never be able to communicate with port 8000.
Use express.js. Place the socket.io file in the public/javascripts folder and add this line to your HTML:
<script src="/javascripts/socket.io.js"></script>
I think this is the best way. When you request http://<hostname>:8000/socket.io/socket.io.js,
node tries to find a folder named socket.io in your project's public folder, and the file socket.io.js inside it.
If you don't want to use express.js, you should catch the request and try to load a file when no routes match the request (which is essentially what express does), because node doesn't know what to do with requests that don't match any routes on your server.
And I recommend using the socket.io.min.js file (it's smaller, and it's in the folder node_modules\socket.io\node_modules\socket.io-client\dist).
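A minimal sketch of the Express side of that suggestion (assuming the file was copied into public/javascripts as described above):
var express = require('express');
var http = require('http');

var app = express();
// everything under ./public is served as-is, so /javascripts/socket.io.js
// resolves to public/javascripts/socket.io.js
app.use(express.static(__dirname + '/public'));

var server = http.createServer(app);
var io = require('socket.io').listen(server); // attach socket.io to the same HTTP server
server.listen(8000);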
You have to start an http/https server to access it via http/https. Simply starting a socket.io server won't do. Do the following:
var http = require('http');
var app = http.createServer(),
io = require('socket.io').listen(app);
app.listen(7000, "0.0.0.0");
Then I can access the file http://localhost:7000/socket.io/socket.io.js
socket.io uses the WebSocket protocol (ws://). See the Wikipedia page.
You need to get at least 3 pieces working together (see the sketch after this list).
Serve some HTML (/index.html will do just fine) so there's a web page. This file should contain the socket.io client <script> tag. For this you need the http server portion of the starter examples. You are missing this and that's why browsing to your server just hangs.
Serve the socket.io client. Socket.io will do this for you automatically when you pass in your http server function to it. You don't need full express as this can be done with just node's http module as per the first example on the socket.io docs.
Some javascript to actually do something with the socket. This can be the content of a <script> tag in index.html or a separate file. If it's a separate file, you need to set up your http server to actually serve it.
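Here is a minimal sketch that combines the three pieces using only node's http module and socket.io (the file name and event name are placeholders):
var http = require('http');
var fs = require('fs');

// 1. serve the page that contains the socket.io client <script> tag
var server = http.createServer(function (req, res) {
  fs.readFile(__dirname + '/index.html', function (err, data) {
    if (err) { res.writeHead(500); return res.end('Error loading index.html'); }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(data);
  });
});

// 2. attaching socket.io to the http server makes it serve /socket.io/socket.io.js itself
var io = require('socket.io').listen(server);

// 3. do something with the socket
io.sockets.on('connection', function (socket) {
  socket.emit('greeting', { hello: 'world' });
});

server.listen(8000);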
