Run web extension's service worker on browser open (Chrome) - javascript

I am building a Chrome extension that checks whether an external API's response has changed. This should happen periodically (e.g. every 10 minutes) and in the background. The plan is to have a service worker fire these requests and replace the extension's icon when a change in the response is detected. I have working code that does exactly that, but I am unable to persist the service worker and make it run on browser load (i.e. the moment the window opens). I managed to use the message API, but that requires the user to click the extension button to open it, and only then does the extension run continuously in the background.
This is my service worker code:
const browser = self.chrome || self.browser;

let communicationPort;

self.addEventListener("message", async (event) => {
  if (event.data && event.data.type === 'CHECK_FOR_NEW_RESOURCES') {
    communicationPort = event.ports[0];
    const compareStructures = setInterval(async () => {
      const currentStructure = await getStructure(event.data.baseURL + 'webservice/rest/server.php', event.data.token);
      const { curr, newResources } = findDifferences(event.data.structure.base, currentStructure);
      if (newResources > 0) {
        communicationPort.postMessage({
          different: true,
          structure: curr,
          newResources,
          time: new Date().toISOString()
        });
        browser.action.setIcon({ path: { '48': event.data.newResourcesIcon } });
        clearInterval(compareStructures);
      } else {
        communicationPort.postMessage({ different: false, time: new Date().toISOString() });
        browser.action.setIcon({ path: { '48': event.data.noNewResourcesIcon } });
      }
    }, 900000);
  }
});

const getStructure = async (url, token) => {
  // Uses fetch() to get the resources
  ...
};

const findDifferences = (newStructure, oldStructure) => {
  ...
};
If this is not possible, what are the viable options to achieve my desired design? Could reverting to Manifest V2 help?
Hopefully my description makes sense. I think this should be possible, as I have seen extensions send notifications when the browser is opened.
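For what it's worth, the usual MV3 approach is not to keep the worker alive at all: Chrome terminates an idle extension service worker after roughly 30 seconds, so a long setInterval never fires. The chrome.alarms API (which needs "alarms" in the manifest's "permissions") wakes the worker on a schedule, and chrome.runtime.onStartup fires when the browser launches. A minimal sketch, with checkForChanges() standing in for the getStructure/findDifferences logic above:

// background.js (MV3) — requires "alarms" in the manifest's "permissions".
const ALARM_NAME = 'compare-structures';

async function checkForChanges() {
  // ... fetch the API, diff against the stored structure, update the icon ...
}

// Fires when Chrome starts, so no click on the popup is needed.
chrome.runtime.onStartup.addListener(() => {
  chrome.alarms.create(ALARM_NAME, { periodInMinutes: 10 });
});
// Also create the alarm on install/update, since alarms are cleared then.
chrome.runtime.onInstalled.addListener(() => {
  chrome.alarms.create(ALARM_NAME, { periodInMinutes: 10 });
});

// The alarm wakes the service worker even after Chrome has terminated it.
chrome.alarms.onAlarm.addListener((alarm) => {
  if (alarm.name === ALARM_NAME) checkForChanges();
});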

Related

Delayed read performance when using navigator.serial for serial communication

I've been trying out the Web Serial API in Chrome (https://web.dev/serial/) to do some basic communication with an Arduino board. However, I've noticed quite a substantial delay when reading data from the serial port. The same issue is present in some demos, but not all.
For instance, the WebSerial demo linked towards the bottom has a near-instantaneous read, while the Serial Terminal example results in a read delay (note that the write is triggered the moment a character is entered on the keyboard).
WebSerial being open source allows me to check for differences against my own implementation; however, I am seeing performance much like the second example.
As for the relevant code:
this.port = await navigator.serial.requestPort({ filters });
await this.port.open({ baudRate: 115200, bufferSize: 255, dataBits: 8, flowControl: 'none', parity: 'none', stopBits: 1 });
this.open = true;
this.monitor();
private monitor = async () => {
  const dataEndFlag = new Uint8Array([4, 3]);
  while (this.open && this.port?.readable) {
    const reader = this.port.readable.getReader();
    try {
      let data: Uint8Array = new Uint8Array([]);
      while (this.open) {
        const { value, done } = await reader.read();
        if (done) {
          this.open = false;
          break;
        }
        if (value) {
          data = Uint8Array.of(...data, ...value);
        }
        // A frame ends with EOT (4) followed by ETX (3).
        if (data.slice(-2).every((val, idx) => val === dataEndFlag[idx])) {
          const decoded = this.decoder.decode(data);
          this.messages.push(decoded);
          data = new Uint8Array([]);
        }
      }
    } catch {
      // Reading failed, e.g. the device was disconnected.
    } finally {
      // Release the lock so the stream can be read again on the next pass.
      reader.releaseLock();
    }
  }
}
public write = async (data: string) => {
  if (this.port?.writable) {
    const writer = this.port.writable.getWriter();
    await writer.write(this.encoder.encode(data));
    writer.releaseLock();
  }
}
The equivalent WebSerial code can be found here; this is pretty much an exact replica. From what I can observe, it seems to hang at await reader.read(); for a brief period of time.
This occurs on both a Windows 10 device and a macOS Monterey device. The specific hardware is an Arduino Pro Micro connected to a USB port.
Has anyone experienced this same scenario?
Update: I did some additional testing with more verbose logging. It seems that the time between the write and the read is exactly 1 second every time.
The delay may result from SerialEvent() in your Arduino sketch: set Serial.setTimeout(1);.
That means a 1-millisecond timeout instead of the default 1000 milliseconds, which matches the exactly-one-second gap you measured.

Service worker no longer works as expected in Chrome

So, like a thousand times before, I updated my service worker file version and, to check for errors, opened the Chrome developer tools. And what do I see... fetching works in a new way... an ERROR?
Fetch finished loading: GET "https://www.example.com/favicon.ico", etc., some more CSS and images, and then something I don't recognize: Fetch failed loading: GET "https://www.example.com/" (the last line of the console log).
Why does it need to request the domain's top root every time?
I then checked the headers (DevTools - Network - Headers), because the network status is (failed):
Request URL: https://www.example.com/
Referrer Policy: unsafe-url
There is pretty much no header info at all, and no content either.
If I enable navigation preload, it shows an extra error in red (The service worker navigation preload request failed with a network error: net::ERR_INTERNET_DISCONNECTED), so I have disabled preload for now; the service worker still works despite this error.
I had just updated to PHP 8.0, so maybe that was doing something, but after rolling back to the old version nothing changed. Maybe my server started blocking some sort of request, but that is unlikely; it looks more like a bad request from the Chrome service worker.
Perhaps Chrome uses that last request to check some sort of offline capability; I do display an offline page on fetch errors, if that has anything to do with this.
Anyway, despite the problems/errors described above, the service worker works as it should.
Here is the SW code example:
const OFFLINE_VERSION = 1;
var filevers = 'xxxx';
const CACHE_NAME = 'offline' + filevers;
// Customize this with a different URL if needed.
const OFFLINE_URL = 'offlineurl.php';
const OFFLINE_URL_ALL = [
  'stylesheet' + filevers + '.css',
  'offlineurl.php',
  'favicon.ico',
  'img/logo.png'
  // Setting {cache: 'reload'} in the new request ensures that the response
  // isn't fulfilled from the HTTP cache; i.e., it will come from the network.
].map(url => new Request(url, { cache: 'reload', credentials: 'include' }));

self.addEventListener('install', (event) => {
  event.waitUntil((async () => {
    const cache = await caches.open(CACHE_NAME);
    await cache.addAll(OFFLINE_URL_ALL);
  })());
});

self.addEventListener('activate', (event) => {
  event.waitUntil((async () => {
    // Enable navigation preload if it's supported.
    // See https://developers.google.com/web/updates/2017/02/navigation-preload
    // removed for now
  })());
  // Tell the active service worker to take control of the page immediately.
  self.clients.claim();
});
self.addEventListener('fetch', (event) => {
  // We only want to call event.respondWith() for the request types we cache.
  const destination = event.request.destination;
  if (destination == "style" || destination == "script" || destination == "document" || destination == "image" || destination == "font") {
    event.respondWith((async () => {
      try {
        const cache = await caches.open(CACHE_NAME);
        const cachedResponse = await cache.match(event.request);
        if (cachedResponse) {
          return cachedResponse;
        } else {
          // First, try to use the navigation preload response if it's supported.
          // removed for now
          const networkResponse = await fetch(event.request);
          return networkResponse;
        }
      } catch (error) {
        // catch is only triggered if an exception is thrown, which is likely
        // due to a network error. If fetch() returns a valid HTTP response
        // with a response code in the 4xx or 5xx range, the catch() will NOT
        // be called.
        if (event.request.mode === 'navigate') {
          const cache = await caches.open(CACHE_NAME);
          const cachedResponse = await cache.match(OFFLINE_URL);
          return cachedResponse;
        }
      }
    })());
  }
});
Any suggestions as to what may cause this error?
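One thing worth noting about the handler above (my own observation, not a confirmed diagnosis): if fetch() throws for a request whose mode is not 'navigate', the async function falls through and resolves with undefined, and event.respondWith(undefined) is exactly what DevTools reports as "Fetch failed loading". A sketch of a fallback branch that always resolves with a real Response:

self.addEventListener('fetch', (event) => {
  if (event.request.destination !== 'document') return;
  event.respondWith((async () => {
    try {
      return await fetch(event.request);
    } catch (error) {
      const cache = await caches.open(CACHE_NAME);
      const cachedResponse = await cache.match(OFFLINE_URL);
      // Never resolve respondWith() with undefined; fall back to a
      // synthetic error response when the offline page isn't cached.
      return cachedResponse || new Response('Offline', { status: 503, statusText: 'Service Unavailable' });
    }
  })());
});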

Problems using oauth with Electron

I'm attempting to obtain an OAuth token using the implicit grant flow in my Electron app. The issue I'm having is when the OAuth service (in this case Twitch) redirects my Electron app to the redirect URI with the token in it. When redirected, the BrowserWindow seems to crash (error below). I've tried listening to multiple events provided by the BrowserWindow, but none of them seem to trigger before the crash. I've followed multiple guides on how to make OAuth work within Electron, but none of them seem to actually work. If anybody has had any success with this, I'd very much appreciate a solution. Thanks.
Error message after being redirected
UnhandledPromiseRejectionWarning: Error: ERR_CONNECTION_REFUSED (-102) loading (redirect uri with token in it)
Code
const redirect = 'https://localhost/';
const authApp = new AuthApp(cid, redirect);

function handleAuthApp(url: string) {
  const urlD = new URL(url);
  console.log(urlD);
  authApp.reset();
}

// Event that will trigger the AuthWindow to appear
ipcMain.on('get-auth', async (event, arg: any) => {
  const window = await authApp.getWindow();
  window.show();
  window.on('close', () => {
    authApp.reset();
    console.log('closed');
  });
  // These events seem to never trigger
  window.webContents.on('will-navigate', function(event, newUrl) {
    console.log(`Navigate: ${newUrl}`);
    handleAuthApp(newUrl);
  });
  window.webContents.on('will-redirect', function(event, newUrl) {
    console.log(`Redirect: ${newUrl}`);
    handleAuthApp(newUrl);
  });
  const filter = {
    urls: [redirect + '*']
  };
  const { session } = window.webContents;
  session.webRequest.onBeforeRedirect(filter, details => {
    const url = details.url;
    console.log(url);
    event.returnValue = url;
    window.close();
  });
});
It turned out I was awaiting the URL load before the 'will-navigate' events were even registered, so the BrowserWindow would "crash" before the events could fire.
I'm dumb.
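A minimal sketch of that ordering fix (authUrl is a placeholder for the Twitch authorize URL; redirect and handleAuthApp come from the code above, and I'm assuming the original getWindow() performed the loadURL internally): attach the navigation listeners before starting the load, so the redirect to https://localhost/ is intercepted before Chromium tries, and fails, to connect to it.

import { BrowserWindow } from 'electron';

const authWindow = new BrowserWindow({ show: false });

// 1. Register listeners BEFORE starting any navigation.
authWindow.webContents.on('will-redirect', (e, newUrl) => {
  if (newUrl.startsWith(redirect)) {
    e.preventDefault();        // don't let Chromium connect to localhost
    handleAuthApp(newUrl);     // read the token from the URL fragment
    authWindow.close();
  }
});

// 2. Only now load the OAuth page. loadURL() returns a promise that
// rejects with ERR_CONNECTION_REFUSED if the redirect slips through.
authWindow.show();
authWindow.loadURL(authUrl).catch(() => { /* redirect was intercepted */ });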

Does changing the extension from .ff to .ff.js improve speed?

Hey! I recently made a small framework to build my small webpages and projects on top of.
Today I'm reading a lot of articles about tricks to speed things up, and I'm interested in improving XHR speed.
I've been reading that some file extensions usually get cached by default and others don't.
I use a special filename.ff extension in my framework to know which files to fetch when accessing a resource.
As a live example: https://bugs.stringmanolo.ga/#projects/fastframework is downloaded from https://github.com/StringManolo/bugWriteups/blob/master/projects/fastframework/fastframework.ff via XHR when you click the fastframework link on https://bugs.stringmanolo.ga/#projects.
My question is:
If I change the extension from fastframework.ff to fastframework.ff.js, does the file get cached by the browser and then download faster? Will it also work offline, or is it already cached? Or is changing the framework code to use .ff.js not going to make a difference at all?
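For context (my note, not part of the original question): browsers decide HTTP caching from response headers such as Cache-Control and ETag, not from the file extension, so a rename alone shouldn't change caching behavior. A quick way to check what the server actually sends (assuming the raw.githubusercontent.com URL for the file linked above):

// Inspect the caching headers the server returns for a .ff resource.
fetch('https://raw.githubusercontent.com/StringManolo/bugWriteups/master/projects/fastframework/fastframework.ff')
  .then(res => {
    console.log('cache-control:', res.headers.get('cache-control'));
    console.log('etag:', res.headers.get('etag'));
  });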
I finally solved it in a better way, using service workers and the Cache API.
I'll leave the code I used here, in case it's helpful to someone in the future.
ff.js (ff is an object: ff = {})
/*** Cache Service Workers Code */
ff.cache = {};
ff.cache.resources = [];

ff.cache.start = function(swName, ttl) {
  let tl = localStorage.cacheTTL;
  if (+tl) {
    const now = new Date();
    // The TTL has expired: drop the cache so it gets rebuilt next time.
    if (now.getTime() > +localStorage.cacheTTL) {
      localStorage.cacheTTL = 0;
      caches.delete("cachev1").then(function() {
      });
    }
  } else {
    navigator.serviceWorker.register(swName, {
      scope: './'
    })
    .then(function(reg) {
      // Pre-cache every listed resource, then record when the cache expires.
      caches.open("cachev1")
      .then(function(cache) {
        cache.addAll(ff.cache.resources)
        .then(function() {
          localStorage.cacheTTL = +(new Date().getTime()) + +ttl;
        });
      });
    })
    .catch(function(err) {
    });
  }
};

ff.cache.clean = function() {
  caches.delete("cachev1").then(function() {
  });
};
/* End Cache Service Workers Code ***/
cache.js (this is the service worker intercepting the requests)
self.addEventListener('fetch', (e) => {
  e.respondWith(caches.match(e.request).then((response) => {
    // Serve from the cache when possible; fall back to the network.
    if (response)
      return response;
    else
      return fetch(e.request);
  }));
});
main.js (this is the main file included into the index.html file)
ff.cache.resources = [
  "./logs/dev/historylogs.ff",
  "./blogEntries/xss/xss1.ff",
  "./blogEntries/xss/w3schoolsxss1.ff",
  "./blogEntries/csrf/w3schoolscsrf1.ff",
  "./projects/fastframework/fastframework.ff",
  "./projects/jex/jex.ff",
  "./ff.js",
  "./main.js",
  "./main.css",
  "./index.html",
  "./resources/w3schoolspayload.png",
  "./resources/w3schoolsxsslanscape.png",
  "./resources/w3schoolsxss.png"
];

ff.cache.start("./cache.js", 604800000);
/* 604800000 milliseconds equals 1 week */
You can test it live at https://bugs.stringmanolo.ga/index.html; the site is hosted from the GitHub repo in case you need to see more code.

Websocket progress pauses when switching tabs on browser (client)

I'm facing quite an extraordinary side effect I've never faced before. Let me describe it thoroughly:
The client (browser) has an input tag where files are chosen. On clicking a specific button, an action is emitted and those files are transmitted to the server over WebSockets (using the Socket.io library).
// Starts an upload after clicking on an upload button.
async upload() {
  if (this.state.fileList.length === 0) { return; }
  // Dispatch a start uploading action.
  this.props.startUp();
  // We map original files from the state's file list.
  let originFiles = _.map(this.state.fileList, (file: any) => {
    return file.originFileObj;
  });
  // Send files over the socket streams to the server.
  await ApiService.upload(originFiles, () => {
    // Update the state whenever a file has been uploaded.
    this.setState({ ...this.state, finishedFiles: this.state.finishedFiles + 1 });
  });
  this.clearItemList();
  // Dispatch an end uploading action.
  this.props.endUp();
}
This function is called whenever the button is clicked. As you can see, there is an API service that gets called with that file list and streams those files to the server over the sockets.
import * as io from 'socket.io-client';
import * as ios from 'socket.io-stream';

export function upload(data: any[], onUpload?: () => void): Promise<any> {
  return new Promise((res, rej) => {
    const up = io.connect("ws://localhost:8000/");
    // Right after we connect.
    up.on('connect', () => {
      // Connect to the upload socket point.
      up.emit('upload', {});
      // Whenever we receive an 'ok' status, we send files over the wire.
      up.on('ok', async () => {
        if (data.length === 0) {
          // If we sent all the files, notify the server to end.
          up.emit('end', {});
          up.disconnect();
          // Resolve this promise.
          res();
        } else {
          // Otherwise, emit a 'data' action that streams the next file.
          let blob = data.pop();
          let stream = ios.createStream();
          ios(up).emit('data', stream, { size: blob.size });
          ios.createBlobReadStream(blob, { highWaterMark: 500000 }).pipe(stream);
          // Callback for the onUpload event (optional parameter).
          onUpload?.();
        }
      });
      up.on('error', () => {
        rej();
      });
    });
  });
}
Everything works well until I switch tabs on the client (browser), at which point the progress pauses. After I switch back to the client tab, the progress automatically resumes.
My colleague presumed this might be a problem with the browser itself, which stops the files from being piped whenever the tab loses focus.
Any ideas on how to solve this problem and/or tweak the code a bit would be much appreciated.
Thank you.
It doesn't pause the stream, it only slows it down: background tabs get their JavaScript timers throttled. Check this option in Chrome: "--disable-background-timer-throttling".
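To confirm timer throttling is the cause (my own diagnostic sketch, not from the answer above), paste something like this into the page's console; in a backgrounded Chrome tab the callback is typically clamped to about one run per second:

// Log how long each interval actually takes; expect ~100 ms in the
// foreground and ~1000 ms once the tab is backgrounded.
let last = performance.now();
setInterval(() => {
  const now = performance.now();
  console.log(`interval fired after ${Math.round(now - last)} ms`);
  last = now;
}, 100);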
