JavaScript Web Worker Messages [duplicate]

This question already has answers here:
How to allow Web Workers to receive new data while it still performing computation?
(5 answers)
Closed 2 years ago.
I need to have a JavaScript web worker that maintains its own internal state (not just replies to messages). It also needs to be able to do a computation until it is told to stop (by a message being sent). Something like the following:
// worker.js
initialState = () => {...}
updateState = (state) => {...}
updateStateWithMessage = (state, message) => {...}
state = initialState()
state = updateStateWithMessage(state, self.getmessage())
while (true) {
  while (!self.hasmessage()) {
    state = updateState(state)
  }
  state = updateStateWithMessage(state, self.getmessage())
  self.postmessage(state)
}
//main.js
worker = new Worker("worker.js")
worker.onmessage = (event) => console.log(event.data)
onClick() {
  worker.postMessage("Here is some data.")
}
I couldn't think of a way to implement this with a single self.onmessage callback in the worker (which is how I have seen most examples of Web Workers operating), since it relies on maintaining its own internal state. So I invented the fictitious self.getmessage and self.hasmessage functions. Can anyone show me how to actually implement this, or something similar?
Thanks in advance!

I'm not really sure if I understand your question.
If you want your web worker to be stateful and perform different tasks according to the message the main thread sent to it, you will need to implement your own messaging system: establish some kind of convention for your messages and some way to distinguish between the incoming messages, both in the main thread and in the web worker. Basically you need to implement a very lightweight version of Redux.
For example, I wrote a Three.js app where I render a bitmap in an OffscreenCanvas in a web worker.
First, I established a convention for the messages exchanged between main thread and worker. All my messages have a string representing the action the message is all about, a payload containing some optional data, and a source stating who sent the message (i.e. the main thread or the web worker).
In my app, the main thread can send 2 types of messages to the worker, while the worker can send 3 types of messages.
// Actions sent by main thread to the web worker
export const MainThreadAction = Object.freeze({
  INIT_WORKER_STATE: "initialize-worker-state",
  REQUEST_BITMAPS: "request-bitmaps",
});
// Actions sent by the web worker to the main thread
export const WorkerAction = Object.freeze({
  BITMAPS: "bitmaps",
  NOTIFY: "notify",
  TERMINATE_ME: "terminate-me",
});
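A message following this convention might look something like this when posted from the main thread (the payload contents and the source string here are just an illustration, not the exact values from my app):
worker.postMessage({
  action: MainThreadAction.REQUEST_BITMAPS,
  payload: { width: 256, height: 256 },
  source: "main-thread",
});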
My web worker is stateful, so in my worker.js file I have this code:
// internal state of this web worker
const state = {};
The worker can discern the messages sent by the main thread with a simple switch statement, handle the message, and respond to the main thread with postMessage:
// worker.js
const onMessage = event => {
  const label = `[${event.data.source} --> ${NAME}] - ${event.data.action}`;
  console.log(`%c${label}`, style);
  switch (event.data.action) {
    case MainThreadAction.INIT_WORKER_STATE:
      // handle action and postMessage
      break;
    case MainThreadAction.REQUEST_BITMAPS: {
      // handle action and postMessage
      break;
    }
    default: {
      console.warn(`${NAME} received a message that it does not handle`, event);
    }
  }
};
onmessage = onMessage;
The code running in the main thread also uses a switch statement to distinguish between the messages sent by the worker:
// main.js
worker.onmessage = event => {
  switch (event.data.action) {
    case WorkerAction.BITMAPS: {
      // handle action
      break;
    }
    case WorkerAction.TERMINATE_ME: {
      worker.terminate();
      console.warn(`${NAME} terminated ${event.data.source}`);
      break;
    }
    case WorkerAction.NOTIFY: {
      // handle action
      break;
    }
    default: {
      console.warn(`${NAME} received a message that it does not handle`, event);
    }
  }
};
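Applied to the original question, a minimal sketch of a stateful worker that keeps computing until a message tells it to stop could look like this (the "init"/"stop" action names and the setTimeout-based chunking are my assumptions, reusing the asker's initialState/updateState functions):
// worker.js
let state = null;
let running = false;

const tick = () => {
  if (!running) return;
  state = updateState(state);
  // Yield back to the event loop so incoming messages get a chance to be handled
  setTimeout(tick, 0);
};

onmessage = event => {
  switch (event.data.action) {
    case "init":
      state = initialState();
      running = true;
      tick();
      break;
    case "stop":
      running = false;
      postMessage(state);
      break;
    default:
      console.warn("Received a message that is not handled", event);
  }
};
The key point is that the worker never blocks in a while (true) loop; it schedules small chunks of work so the onmessage callback can run between them.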

Related

Can't get custom push notification event working in PWA (Firebase)

I've been searching for a few hours on how to get my custom push notification working. Here is how I've set up my project: no front-end framework, a Node.js/Express.js back-end, Firebase Cloud Messaging (FCM) as push manager and a custom service worker. I am currently hosting the app on localhost and I have HTTPS set up and a manifest.json that contains the minimum amount of fields to get started. The manifest.json contains a start_url field that points to /index.html, the landing page for the app. The app is bundled with webpack v. 4.
Back-end
On the back-end, I have set up the Firebase Admin SDK in a specific router and I send a custom notification object a bit like the following to the FCM server:
let fcMessage = {
  data: {
    title: 'Foo',
    tag: 'url to view that opens bar.html'
  }
};
When an interesting event occurs on the back-end, it retrieves a list of users that contains the FCM registration tokens and sends the notification to the FCM servers. This part works great.
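A sketch of how that send might look with the Admin SDK (the actual code isn't shown here; the initialized admin instance and the token list are assumed to come from elsewhere in the router):
const admin = require('firebase-admin');

async function notifyUsers(registrationTokens) {
  const fcMessage = {
    data: {
      title: 'Foo',
      tag: 'url to view that opens bar.html'
    }
  };
  // sendToDevice accepts a single registration token or an array of tokens
  const response = await admin.messaging().sendToDevice(registrationTokens, fcMessage);
  console.log(`${response.successCount} messages sent successfully`);
}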
Front-end
I have two service workers on the front-end. Inside my front-end index.js, I register a custom service worker named sw.js in the domain root and tell firebase to use that service worker like so:
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('./sw.js')
    .then(registration => {
      messaging.useServiceWorker(registration);
    })
    .catch(err => console.error(`OOps! ${err}`));
}
FCM and its credentials are set up and the user can subscribe to push notifications. I won't show that code here since it works and I don't believe it is the issue.
Now on to the service workers themselves. I have a firebase-messaging-sw.js file at the root of my domain. It contains the following code:
importScripts('https://www.gstatic.com/firebasejs/4.8.1/firebase-app.js');
importScripts('https://www.gstatic.com/firebasejs/4.8.1/firebase-messaging.js');
firebase.initializeApp(configuration);
const messaging = firebase.messaging();
Configuration is just a placeholder for all of the creds. Again that stuff works.
What I want to do is to NOT use the FCM push notification and instead create my own push notification that contains a url to a view that the user can click on and go to that view. The following code almost works and is a hack I found on another site:
class CustomPushEvent extends Event {
  constructor(data) {
    super('push');
    Object.assign(this, data);
    this.custom = true;
  }
}

self.addEventListener('push', (e) => {
  console.log('[Service Worker] heard a push ', e);
  // Skip if event is our own custom event
  if (e.custom) return;
  // Keep old event data to override
  let oldData = e.data;
  // Create a new event to dispatch
  let newEvent = new CustomPushEvent({
    data: {
      json() {
        let newData = oldData.json();
        newData._notification = newData.notification;
        delete newData.notification;
        return newData;
      },
    },
    waitUntil: e.waitUntil.bind(e),
  });
  // Stop event propagation
  e.stopImmediatePropagation();
  // Dispatch the new wrapped event
  dispatchEvent(newEvent);
});
messaging.setBackgroundMessageHandler(function (payload) {
  if (payload.hasOwnProperty('_notification')) {
    return self.registration.showNotification(payload.data.title, {
      body: payload.data.text,
      actions: [
        {
          action: `${payload.data.tag}`,
          title: 'Go to link'
        }
      ]
    });
  } else {
    return;
  }
});
self.addEventListener('notificationclick', function (e) {
  console.log('CLICK');
  e.notification.close();
  e.waitUntil(clients.matchAll({ type: 'window' })
    .then(function (clientList) {
      console.log('client List ', clientList);
      const cLng = clientList.length;
      if (clientList.length > 0) {
        for (let i = 0; i < cLng; i++) {
          const client = clientList[i];
          if (client.url === '/' && 'focus' in client) {
            return client.focus();
          }
        }
      } else {
        console.log('no clients ', e.action);
        clients.openWindow(e.action)
          .then(client => {
            console.log('client ', client);
            return client.navigate(e.action);
          })
          .catch(err => console.error(`[Service Worker] cannot open client : ${err} `));
      }
    }));
});
The hack is meant to capture a push event and the FCM default notification payload and instead serve that payload through a custom one made via the Notification API.
The code above works great but ONLY if I put it in the firebase-messaging-sw.js file. That's not what I really want to do: I want to put it in the sw.js file instead but when I do, the sw.js cannot hear any push events and instead I get the default FCM push notification. I've also tried importing the entire firebase-messaging-sw scripts into the custom service worker and it still won't hear the message events.
Why do I want to use it in my service worker instead of the Firebase one? It's to be able to open the app on the view passed into the 'tag' field on the notification's body. If I use the Firebase service worker, it tells me that it's not the active registered service worker and though the app does open in a new window, it only opens on /index.html.
Some minor observations I've made: the clients array is always empty when the last bit of code is added to the firebase-messaging-sw.js file. The custom service worker is installed properly because it handles the app shell cache and listens to all of the other events normally. The firebase-messaging-sw service worker is also installed properly.
After much pain and aggravation, I figured out what the problem was. It was a combination of the architecture of the app (that is, a traditional Multi-Page App) and a badly-formed url in the custom service worker, sw.js, like so:
sw.js
self.addEventListener('fetch', e => {
  // in this app, if a fetch event url calls the back-end, it contains api and we
  // treat it differently from the app shell
  if (!e.request.url.includes('api')) {
    switch (e.request.url) {
      case `${endpoint}/bar`: // <-- this is the problem right here
        matcher(e, '/views/bar.html');
        break;
      case `${endpoint}/bar.js`:
        matcher(e, '/scripts/bar.js');
        break;
      case `${endpoint}/index.js`:
        matcher(e, '/index.js');
        break;
      case `${endpoint}/manifest.json`:
        matcher(e, '/manifest.json');
        break;
      case `${endpoint}/baz/`:
        matcher(e, '/views/bar.html');
        break;
      case `${endpoint}/baz.js`:
        matcher(e, '/scripts/bar.js');
        break;
      default:
        console.log('default');
        matcher(e, '/index.html');
    }
  }
});
matcher is the function that matches the request url with a file on the server: if the file is already in the cache it responds with the cached copy, otherwise it fetches the file from the server.
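The real matcher isn't shown here, but a cache-first sketch of it might look like this (the caching details are simplified):
function matcher(event, path) {
  event.respondWith(
    // Serve the cached file if we have it, otherwise go to the network
    caches.match(path).then(cached => cached || fetch(path))
  );
}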
Every time the user clicks on the notification, it's supposed to take him/her to the 'bar' html view. In the switch it must be:
case `${endpoint}/bar/`:
and not
case `${endpoint}/bar`:
Even though the message-related code is still in the firebase-messaging-sw.js file, what happens is that a new WindowClient is created when the browser is in the background. That WindowClient is under the control of sw.js, not firebase-messaging-sw.js. As a result, when the window is opened, sw.js intercepts the call and takes over from firebase-messaging-sw.js.

Writing middleware for event handlers

Use case:
I have to handle several events which require an "available client". So in each event handler I first have to try to get an available client. If there is no client available I'll respond with a "Service unavailable" message. Right now I've implemented that requirement like this:
public constructor(consumer: RpcConsumer) {
  consumer.on('requestA', this.onRequestA);
}

private onRequestA = async (msg: RpcConsumerMessage) => {
  const client: RequestClient = this.getClient(msg);
  if (client == null) {
    return;
  }
  msg.reply(await client.getResponseA());
}

private getClient(msg: RpcConsumerMessage): RequestClient {
  const client: RequestClient = this.clientManager.getClient();
  if (client == null) {
    const err: Error = new Error('Currently there is no client available to process this request');
    msg.reply(undefined, MessageStatus.ServiceUnavailable, err);
    return;
  }
  return client;
}
The problem:
I don't want to check for an available client in all event handlers again and again. Instead I thought a middleware would fit this use case perfectly: it would check for an available client and pass on the client instance if there is one. If there is no client available it would respond with the error message.
The question:
How would I write such a middleware for this case?
Build a curried method for this:
private withClient(cb: (client: RequestClient) => string | Promise<string>) {
  return async (msg: RpcConsumerMessage) => {
    const client: RequestClient = this.clientManager.getClient();
    if (client == null) {
      const err: Error = new Error('Currently there is no client available to process this request');
      msg.reply(undefined, MessageStatus.ServiceUnavailable, err);
      return;
    }
    msg.reply(await cb(client));
  };
}
So you can use it as:
private onRequestA = this.withClient(client => client.getResponseA());
If I understand correctly I don't think you actually NEED middleware, although you might choose to go that route.
You can just have a module that is in charge of finding a client and serving one up if it is available. This would look something like this:
let _client;

module.exports = {
  getClient
};

function getClient() {
  return _client;
}

function setClient() {
  // Your logic to find an available client.
  // Run this whenever a client disconnects (if there is an event for that)
  // or until you are connected to a client.
  _client = client; // When you find that client, assign it to `_client`. It will be returned every time someone calls getClient().
}
The advantage here is that once you find a client, the module will serve up that same client until you are disconnected from it. The trick then is just making sure that you are always trying to connect to a client when you are disconnected, even when there are no requests. I hope this makes sense.
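For example, the module could keep retrying while no client is available (just a sketch of the idea; the 5-second interval is arbitrary):
// inside the module
setInterval(() => {
  if (_client == null) {
    setClient(); // keep trying to find an available client
  }
}, 5000);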

How to read Client.postMessage before the page loaded?

I have a service worker that emits Client.postMessage during fetch when a cached resource has changed. I'm using this to notify the user that they might want to refresh.
My problem is that when the active page resource is changed and the service worker emits that message, the page hasn't loaded yet so no javascript can receive the message.
Is there a better way to handle cases like this rather than using waitUntil to pause a few seconds before emitting the message?
Another option would be to write to IndexedDB from the service worker, and then read it when the page loads for the first time, before you establish your message listener.
Using the idb-keyval library for simplicity's sake, this could look like:
// In your service worker:
importScripts('https://unpkg.com/idb-keyval@2.3.0/idb-keyval.js');

async function notifyOfUpdates(urls) {
  const clients = await self.clients.matchAll();
  for (const client of clients) {
    client.postMessage({
      // Structure your message however you'd like:
      type: 'update',
      urls,
    });
  }

  // Read whatever's currently saved in IDB...
  const updatedURLsInIDB = (await idbKeyval.get('updated-urls')) || [];
  // ...append to the end of the list...
  updatedURLsInIDB.push(...urls);
  // ...and write the updated list to IDB.
  await idbKeyval.set('updated-urls', updatedURLsInIDB);
}
// In your web page:
<script src="https://unpkg.com/idb-keyval@2.3.0/idb-keyval.js"></script>
<script>
  async function listenForUrlUpdates() {
    const updatedURLsInIDB = await idbKeyval.get('updated-urls');
    // Do something with updatedURLsInIDB...

    // Clear out the list now that we've read it:
    await idbKeyval.delete('updated-urls');

    // Listen for ongoing updates:
    navigator.serviceWorker.addEventListener('message', event => {
      if (event.data.type === 'update') {
        const updatedUrls = event.data.urls;
        // Do something with updatedUrls
      }
    });
  }

  listenForUrlUpdates();
</script>

Websocket timing out too soon

I've set up a standard Phoenix websocket/channel environment but I am not using the socket.js provided - I have my own (very simple) code that connects to the channels and topics. However, I can't get the socket to persist beyond a minute or so. Is there any way to define the timeout for sockets? I don't have any special configuration on the Phoenix side (all standard, as per the documentation).
My javascript code is as follows:
const ws = new WebSocket(sock_url);

ws.onmessage = (msg) => {
  const { payload, event } = JSON.parse(msg.data);
  if (!event.startsWith("phx_")) {
    onMessage(payload.body);
  }
};

ws.onclose = (event) => {
  onClose(event.code, event.reason);
};

ws.onopen = () => {
  ws.send(JSON.stringify({
    topic: `users_socket:${user_id}`,
    event: "phx_join",
    payload: {},
    ref: '1'
  }));
};
Update: I ended up using the socket.js file that comes with Phoenix as everyone suggested - it just does everything I need. Thanks to everyone who answered :)
I am developing a project with Websockets (using Go, not Phoenix or Elixir) and I've had the same disconnection problems, which I've managed to solve (at least it has not been timing out since) by "pinging" the websocket, i.e. sending a message at specific intervals.
Perhaps you could have something like this in your Javascript.
ws.onopen = () => {
  ws.send(/** YOUR CODE */);
  // Send a ping event every 10 seconds
  setInterval(() => ws.send(JSON.stringify({ event: "ping" })), 10000);
};
And handle this new event type accordingly server-side. Also, you can monitor the onclose event and, depending on the reason, re-open the connection; you can find a list of such event codes in the Mozilla docs.
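For instance, a reconnect on unexpected closure might look roughly like this (a sketch only; connect is assumed to be a function that recreates the WebSocket and re-attaches the handlers):
ws.onclose = (event) => {
  // 1000 is a normal closure; treat anything else as unexpected here
  if (event.code !== 1000) {
    setTimeout(connect, 1000); // retry after a short delay
  }
};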
The Phoenix backend expects a ping every 30 seconds. You can re-configure the timeout like so:
defmodule UserSocket do
  use Phoenix.Socket

  ## Transports
  transport :websocket, Phoenix.Transports.WebSocket,
    timeout: 300_000, # 5 minutes
    transport_log: :debug

  ...
end
If you do not care about the timeout you can set it very high; the code above sets it to 5 minutes.
In general, phoenix.js will implement all of this for you. It is a very small lib; you will likely find that, in the end, you have re-implemented everything it does, plus a bunch of things you got wrong :-)
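For comparison, the same join with the bundled phoenix.js client is only a few lines (a rough sketch; the endpoint path and the "new_msg" event name are assumptions, not taken from the question):
import { Socket } from "phoenix";

const socket = new Socket("/socket", { params: { user_id } });
socket.connect(); // heartbeats and reconnection are handled for you

const channel = socket.channel(`users_socket:${user_id}`, {});
channel.on("new_msg", payload => onMessage(payload.body));
channel.join()
  .receive("ok", () => console.log("joined"))
  .receive("error", resp => console.error("unable to join", resp));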

NodeJS Event Emitter Blocking Issue

I have a node application handling some ZeroMQ events coming from another application utilizing the Node-ZMQ bindings found here: https://github.com/JustinTulloss/zeromq.node
The issue I am running into is one of the operations from an event takes a long time to process and this appears to be blocking any other event from being processed during this time. Although the application is not currently clustered, doing so would only afford a few more threads and doesn't really solve the issue. I am wondering if there is a way of allowing for these async calls to not block other incoming requests while they process, and how I might go about implementing them.
Here is a highly condensed/contrived code example of what I am doing currently:
var zmq = require('zmq');
var zmqResponder = zmq.socket('rep');
var Client = require('node-rest-client').Client;
var client = new Client();
var Q = require('q');

zmqResponder.on('message', function (msg, data) {
  var parsed = JSON.parse(msg);
  logging.info('ZMQ Request received: ' + parsed.event);
  switch (parsed.event) {
    case 'create':
      // Typically short running process, not an issue
      break;
    case 'update':
      // Long running process, this is the issue
      serverRequest().then(function (response) {
        zmqResponder.send(JSON.stringify(response));
      });
      break;
  }
});

function serverRequest() {
  var deferred = Q.defer();
  client.get(function (data, response) {
    if (response.statusCode !== 200) {
      deferred.reject(data.data);
    } else {
      deferred.resolve(data.data);
    }
  });
  return deferred.promise;
}
EDIT** Here's a gist of the code: https://gist.github.com/battlecow/cd0c2233e9f197ec0049
I think, through the comment thread, I've identified your issue. REQ/REP has a strict synchronous message order guarantee... You must receive-send-receive-send-etc. REQ must start with send and REP must start with receive. So, you're only processing one message at a time because the socket types you've chosen enforce that.
If you were using a different, non-event-driven language, you'd likely get an error telling you what you'd done wrong when you tried to send or receive twice in a row, but node lets you do it and just queues the subsequent messages until it's their turn in the message order.
You want to change REQ/REP to DEALER/ROUTER and it'll work the way you expect. You'll have to change your logic slightly for the ROUTER socket to get it to send appropriately, but everything else should work the same.
Rough example code, using the relevant portions of the posted gist:
var zmqResponder = zmq.socket('router');

zmqResponder.on('message', function (peer_id, msg) {
  // with a ROUTER socket, the first frame is the sending peer's identity
  var parsed = JSON.parse(msg);
  switch (parsed.event) {
    case 'create':
      // build parsedResponse, then...
      zmqResponder.send([peer_id, JSON.stringify(parsedResponse)]);
      break;
  }
});

zmqResponder.bind('tcp://*:5668', function (err) {
  if (err) {
    logging.error(err);
  } else {
    logging.info("ZMQ awaiting orders on port 5668");
  }
});
... you need to grab the peer_id (or whatever you want to call it; in ZMQ nomenclature it's the socket ID of the socket you're sending from, think of it as an "address" of sorts) from the first frame of the message you receive, and then send it as the first frame of the message you send back.
By the way, I just noticed in your gist you are both connect()-ing and bind()-ing on the same socket (zmq.js lines 52 & 143, respectively). Don't do that. Inferring from other clues, you just want to bind() on this side of the process.
