How to download styles via Chrome debugger API - javascript

I need to get styles with their exact, not computed, values for my extension. I can use document.styleSheets in some cases, but when the CSS is hosted on another domain I get a CORS error. I found a way to get those styles with the help of the chrome.debugger API, but I'm having difficulties with the implementation:
chrome.debugger.attach(debuggeeId, "1.3", () => {
  chrome.debugger.sendCommand(debuggeeId, "Page.enable", null, (r) => {
    chrome.debugger.sendCommand(debuggeeId, "Page.getResourceTree", null, (res) => {
      // get style URLs from resourceTree object
      const cssResources = getCSSResources(res.frameTree.resources);
      for (let url of cssResources) {
        chrome.debugger.sendCommand(debuggeeId, "Page.getResourceContent", {frameId: toString(tabId), url: url}, (resp) => {
          console.log(resp) // returns undefined
        })
      }
    })
  })
})
For some reason I am getting undefined from Page.getResourceContent. Just to clarify: am I getting undefined because of CORS (does it also apply here?), or because of an incorrect request to the chrome.debugger API?
The code below causes the same result: no data is returned from the request.

The only problem I see is that frameId is the internal id of the frame, which you can get from res.frameTree.frame.id; it is not related to tabId.
You might also want to process all frames recursively, at least the same-origin ones, and use modern syntax:
chrome.debugger.attach(debuggeeId, '1.3', async () => {
  const send = (cmd, params = null) =>
    new Promise(resolve =>
      chrome.debugger.sendCommand(debuggeeId, cmd, params, resolve));
  await send('Page.enable');
  const {frameTree} = await send('Page.getResourceTree');
  const frameQueue = [frameTree];
  const results = [];
  for (const {frame, childFrames, resources} of frameQueue) {
    frameQueue.push(...(childFrames || [])); // childFrames may be absent for leaf frames
    const frameId = frame.id;
    for (const {url, type} of resources) {
      if (type === 'Stylesheet') {
        results.push({
          url,
          frameUrl: frame.url,
          ...await send('Page.getResourceContent', {frameId, url}),
        });
      }
    }
  }
  console.log(results);
});
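For completeness, here is one way the debuggeeId used above might be built; the tabId lookup via chrome.tabs.query is an assumption on my part, not part of the original answer, and the extension needs the "debugger" permission:
// Hypothetical setup: attach the debugger to the currently active tab.
chrome.tabs.query({active: true, currentWindow: true}, ([tab]) => {
  const debuggeeId = {tabId: tab.id};
  chrome.debugger.attach(debuggeeId, '1.3', async () => {
    // ...run the Page.enable / Page.getResourceTree commands from the snippet above...
  });
});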

Related

React async fetch with 503 Service Unavailable

I am trying to fetch data for 20 random users from 'https://randomuser.me/api', and in some cases I get a 503 Service Unavailable error.
I tried to resolve the problem with this kind of code:
async function fetchDriver() {
  const driver = await fetch(driverUrl)
    .then((response) => {
      if (response.status === 200) {
        return response.json();
      }
      if (response.status === 503) {
        return fetchDriver();
      }
    })
    .then((data) => console.log(data))
    .catch((error) => {
      console.log(error);
      fetchDriver();
    });
  return driver;
}
do {
  const geoPosition = fetchGeoPosition();
  const phoneNumber = fetchPhoneNumber();
  const licenseNumber = fetchLicense(3);
  const speed = fetchSpeed(60, 200);
  const driver = fetchDriver();
  const first = driver.name.first;
  const last = driver.name.last;
  fetchedCars.push({
    licenseNumber,
    first,
    last,
    phoneNumber,
    geoPosition,
    speed,
    favorite: false,
    more: false,
  });
} while (fetchedCars.length < 20);
But console.log(data) still sometimes shows undefined, and I just don't know what to do about it.
The Random User API has a rate limiter. Use this to fetch multiple users in one request (for example, 20 users):
https://randomuser.me/api/?results=20
Check the documentation here.
This GitHub thread mentions the rate limiting and the solution I've included.
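As a rough sketch of that approach, assuming the same fetchedCars array from the question (the field mapping from the API response is illustrative):
// Fetch 20 users in a single request instead of 20 separate requests.
async function fetchDrivers(count = 20) {
  const response = await fetch(`https://randomuser.me/api/?results=${count}`);
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  const { results } = await response.json();
  return results.map((user) => ({
    first: user.name.first,
    last: user.name.last,
    phoneNumber: user.phone,
    favorite: false,
    more: false,
  }));
}

fetchDrivers(20).then((drivers) => fetchedCars.push(...drivers));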

Electron BrowserWindow cannot get response when debugger is attached

I'm writing an Electron app which creates a BrowserWindow. I want to capture a few requests sent to a server, and I also want the responses for those requests. Using the Electron WebRequest API I can't get responses, so I searched the web and found that I can attach a debugger programmatically.
I attach the debugger using the code below and get almost all responses correctly. But for one bigger request I can't get the response. I get an error:
Error: No resource with given identifier found
If I launch DevTools and navigate to that request I also can't get the response: Failed to load response data. If I comment out the code below, the response in DevTools shows correctly.
Note that this happens only for one specific request, which returns a response of about 1 MB. For all other requests I can get the response using getResponseData().
const dbg = win.webContents.debugger

var getResponseData = async (reqId) => {
  const res = await dbg.sendCommand("Network.getResponseBody", {requestId: reqId});
  return res.body
}

try {
  dbg.attach('1.3')
  dbg.sendCommand('Network.enable')
} catch (err) {
  console.log('Debugger attach failed : ', err)
}

dbg.on('detach', async (event, reason) => {
  console.log('Debugger detached due to : ', reason)
})

dbg.on('message', (e, m, p) => {
  if (m === 'Network.requestWillBeSent') {
    if (p.request.url === someURL) {
      const j = JSON.parse(p.request.postData)
      console.log("req " + p.requestId)
      global.webReqs[p.requestId] = { reqData: j }
    }
  } else if (m === 'Network.loadingFinished') {
    if (p.requestId in global.webReqs) {
      console.log("res " + p.requestId)
      getResponseData(p.requestId).then(res => {
        console.log(res.slice(0, 60))
      }).catch(err => {
        console.error(err)
      })
    }
  }
});
Short update
The event stack for this particular request is as follows, where 13548.212 is simply the requestId:
Network.requestWillBeSentExtraInfo 13548.212
Network.requestWillBeSent 13548.212
Network.responseReceivedExtraInfo 13548.212
Network.responseReceived 13548.212
Network.dataReceived 13548.212 [repeated 135 times]
...
Network.loadingFinished 13548.212
Looks like I found a solution. It's rather a workaround, but it works. Instead of Network.getResponseBody I used the Fetch domain (https://chromedevtools.github.io/devtools-protocol/tot/Fetch).
To use it, you need to subscribe to responses matching a pattern. Then you can react to Fetch.requestPaused events; while a request is paused you have direct access to the request and indirect access to the response. To get the response, call Fetch.getResponseBody with the proper requestId. Also remember to send Fetch.continueRequest, as
The request is paused until the client responds with one of continueRequest, failRequest or fulfillRequest
https://chromedevtools.github.io/devtools-protocol/tot/Fetch/#event-requestPaused
dbg.sendCommand('Fetch.enable', {
  patterns: [
    { urlPattern: interestingURLpattern, requestStage: "Response" }
  ]
})

var getResponseJson = async (requestId) => {
  const res = await dbg.sendCommand("Fetch.getResponseBody", {requestId: requestId})
  return JSON.parse(res.base64Encoded ? Buffer.from(res.body, 'base64').toString() : res.body)
}

dbg.on('message', async (e, m, p) => { // callback must be async to allow await below
  if (m === 'Fetch.requestPaused') {
    var reqJson = JSON.parse(p.request.postData)
    var resJson = await getResponseJson(p.requestId)
    // ...
    await dbg.sendCommand("Fetch.continueRequest", {requestId: p.requestId})
  }
});

Making a distinction between file not present and access denied while accessing s3 object via Javascript

I have inherited the following code. It is part of a CI/CD pipeline. It tries to get an object called "changes" from a bucket and does something with it. If it is able to grab the object, it sends a success message back to the pipeline. If it fails to grab the file for whatever reason, it sends a failure message back to CodePipeline.
This "changes" file is made in a previous step of the CodePipeline. However, sometimes it is valid for this file NOT to exist (i.e. when there IS no change).
Currently, the code makes no distinction between the file simply not existing and the code failing to get it for some other reason (access denied, etc.).
Desired:
I would like to send a success message back to CodePipeline if the file is simply not there.
If there is an access issue, then the current outcome of "failure" would still be valid.
Any help is greatly appreciated. Unfortunately I am not good enough with JavaScript to have any ideas to try.
RELEVANT PARTS OF THE CODE
const AWS = require("aws-sdk");
const s3 = new AWS.S3();
const lambda = new AWS.Lambda();
const codePipeline = new AWS.CodePipeline();
// GET THESE FROM ENV Variables
const {
API_SOURCE_S3_BUCKET: s3Bucket,
ENV: env
} = process.env;
const jobSuccess = (CodePipeline, params) => {
return new Promise((resolve, reject) => {
CodePipeline.putJobSuccessResult(params, (err, data) => {
if (err) { reject(err); }
else { resolve(data); }
});
});
};
const jobFailure = (CodePipeline, params) => {
return new Promise((resolve, reject) => {
CodePipeline.putJobFailureResult(params, (err, data) => {
if (err) { reject(err); }
else { resolve(data); }
});
});
};
// MAIN CALLER FUNCTION. STARTING POINT
exports.handler = async (event, context, callback) => {
try {
// WHAT IS IN changes file in S3
let changesFile = await getObject(s3, s3Bucket, `lambda/${version}/changes`);
let changes = changesFile.trim().split("\n");
console.log("List of Changes");
console.log(changes);
let params = { jobId };
let jobSuccessResponse = await jobSuccess(codePipeline, params);
context.succeed("Job Success");
}
catch (exception) {
let message = "Job Failure (General)";
let failureParams = {
jobId,
failureDetails: {
message: JSON.stringify(message),
type: "JobFailed",
externalExecutionId: context.invokeid
}
};
let jobFailureResponse = await jobFailure(codePipeline, failureParams);
console.log(message, exception);
context.fail(`${message}: ${exception}`);
}
};
S3 should return an error code in the exception. The ones you care about are:
AccessDenied - Access Denied
NoSuchKey - The specified key does not exist.
So in your catch block you should be able to check exception.code to see if it matches one of these two.
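A minimal sketch of what that catch block could look like, reusing the question's jobSuccess/jobFailure helpers. This assumes the getObject wrapper rethrows the original AWS SDK v2 error (which carries the S3 error code on its code property); the success message text is illustrative:
catch (exception) {
  if (exception.code === "NoSuchKey") {
    // No "changes" file simply means there is nothing to change: report success.
    await jobSuccess(codePipeline, { jobId });
    context.succeed("Job Success (no changes file)");
    return;
  }
  // Anything else (e.g. AccessDenied) is still a real failure.
  let message = "Job Failure (General)";
  let failureParams = {
    jobId,
    failureDetails: {
      message: JSON.stringify(message),
      type: "JobFailed",
      externalExecutionId: context.invokeid
    }
  };
  await jobFailure(codePipeline, failureParams);
  console.log(message, exception);
  context.fail(`${message}: ${exception}`);
}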

Firebase cloud functions onCreate painfully slow to update database

I am having a slightly odd issue, and due to the lack of errors, I am not exactly sure what I am doing wrong. What I am trying to do is, on an onCreate event, make an API call and then update a field in the database if the field is not set to null. Based on my Cloud Functions console logs, I can see the API call returning OK and everything working properly, but the update only happens after about 2-5 minutes. A few times, it didn't update even after 15 minutes. What is causing such a slow update?
I have ruled out the gaxios call as the bottleneck, based on the function logs and local testing.
Some context: I am on the Firebase Blaze plan to allow for egress, and my dataset isn't really big. I am using gaxios because it is already part of the firebase-functions npm install.
The code is:
const functions = require('firebase-functions');
const { request } = require('gaxios');
const { parse } = require('url');

exports.getGithubReadme = functions.firestore.document('readmes/{name}').onCreate((snapshot, context) => {
  const toolName = context.params.name;
  console.log(toolName);
  const { name, description, site } = snapshot.data();
  console.log(name, description, site);
  const parsedUrl = parse(site);
  console.log(parsedUrl);
  if (description) return;
  if (parsedUrl.hostname === 'github.com') {
    let githubUrl = `https://api.github.com/repos${parsedUrl.path}/readme`;
    request({
      method : 'GET',
      url : githubUrl
    })
      .then((res) => {
        let { content } = res.data;
        return snapshot.ref.update({ description: content });
      })
      .catch((error) => {
        console.log(error);
        return null;
      });
  }
  return null;
});
When you execute an asynchronous operation (i.e. request() in your case) in a background-triggered Cloud Function, you must return a promise, so that the Cloud Function waits for this promise to resolve before terminating.
This is very well explained in the official Firebase video series (Learning Cloud Functions for Firebase). In particular, watch the three videos titled "Learn JavaScript Promises" (Parts 2 & 3 especially focus on background-triggered Cloud Functions, but it is worth watching Part 1 first).
So you should adapt your code as follows, returning the promise returned by request():
const functions = require('firebase-functions');
const { request } = require('gaxios');
const { parse } = require('url');

exports.getGithubReadme = functions.firestore.document('readmes/{name}').onCreate((snapshot, context) => {
  const toolName = context.params.name;
  console.log(toolName);
  const { name, description, site } = snapshot.data();
  console.log(name, description, site);
  const parsedUrl = parse(site);
  console.log(parsedUrl);
  if (description) return null;
  if (parsedUrl.hostname === 'github.com') {
    let githubUrl = `https://api.github.com/repos${parsedUrl.path}/readme`;
    return request({
      method: 'GET',
      url: githubUrl
    })
      .then((res) => {
        let { content } = res.data;
        return snapshot.ref.update({ description: content });
      })
      .catch((error) => {
        console.log(error);
        return null;
      });
  } else {
    return null;
  }
});
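For readability, the same logic could also be written with async/await; this is just a sketch equivalent to the corrected version above, reusing the same functions, parse and request imports:
exports.getGithubReadme = functions.firestore.document('readmes/{name}').onCreate(async (snapshot, context) => {
  const { description, site } = snapshot.data();
  if (description) return null;
  const parsedUrl = parse(site);
  if (parsedUrl.hostname !== 'github.com') return null;
  try {
    // Awaiting here keeps the function alive until the Firestore update completes.
    const res = await request({
      method: 'GET',
      url: `https://api.github.com/repos${parsedUrl.path}/readme`
    });
    return snapshot.ref.update({ description: res.data.content });
  } catch (error) {
    console.log(error);
    return null;
  }
});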

Streaming JSON data to React results in unexpected end of JSON input

I'm trying to stream a lot of data from a NodeJS server that fetches the data from Mongo and sends it to React. Since it's quite a lot of data, I've decided to stream it from the server and display it in React as soon as it comes in. Here's a slightly simplified version of what I've got on the server:
const getQuery = async (req, res) => {
  const { body } = req;
  const query = mongoQueries.buildFindQuery(body);
  res.set({ 'Content-Type': 'application/octet-stream' });
  Log.find(query).cursor()
    .on('data', (doc) => {
      console.log(doc);
      const data = JSON.stringify(doc);
      res.write(`${data}\r\n`);
    })
    .on('end', () => {
      console.log('Data retrieved.');
      res.end();
    });
};
Here's the React part:
fetch(url, { // this fetch fires the getQuery function on the backend
  method: "POST",
  body: JSON.stringify(object),
  headers: {
    "Content-Type": "application/json",
  }
})
  .then(response => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();
    const pump = () =>
      reader.read().then(({ done, value }) => {
        if (done) return this.postEndHandler();
        console.log(value.length); // !!!
        const decoded = decoder.decode(value);
        this.display(decoded);
        return pump();
      });
    return pump();
  })
  .catch(err => {
    console.error(err);
    toast.error(err.message);
  });
}

display(chunk) {
  const { data } = this.state;
  try {
    const parsedChunk = chunk.split('\r\n').slice(0, -1);
    parsedChunk.forEach(e => data.push(JSON.parse(e)));
    return this.setState({data});
  } catch (err) {
    throw err;
  }
}
It's a 50/50 whether it completes with no issues or fails on React's side of things. When it fails, it's always because of an incomplete JSON object in parsedChunk.forEach. I did some digging, and it turns out that every time it fails, the console.log that I marked with three exclamation marks shows 65536. I'm 100% certain it's got something to do with my streams implementation and that I'm not queuing the chunks correctly, but I'm not sure whether I should fix it client-side or server-side. Any help would be greatly appreciated.
Instead of implementing your own NDJSON-like streaming JSON protocol, which is basically what you are doing here (with all the pitfalls of dividing the stream into chunks and packets, which is not always under your control), take a look at some of the existing tools that were created to do exactly what you need, e.g.:
http://oboejs.com/
http://ndjson.org/
https://www.npmjs.com/package/stream-json
https://www.npmjs.com/package/JSONStream
https://www.npmjs.com/package/clarinet
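For reference, the underlying problem is that a single read() chunk (65536 bytes here) can end in the middle of a JSON line; these libraries buffer the incomplete tail for you. A minimal hand-rolled sketch of that buffering, written as a replacement for the .then(response => ...) handler in the question (the names are the question's own; the buffering logic is an illustration, not any library's API):
.then(response => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffered = ''; // holds any partial line left over from the previous chunk
  const pump = () =>
    reader.read().then(({ done, value }) => {
      if (done) {
        if (buffered) this.display(`${buffered}\r\n`); // flush any trailing line
        return this.postEndHandler();
      }
      buffered += decoder.decode(value, { stream: true });
      const lastBreak = buffered.lastIndexOf('\r\n');
      if (lastBreak !== -1) {
        // Everything up to and including the last line break is complete JSON lines.
        this.display(buffered.slice(0, lastBreak + 2));
        buffered = buffered.slice(lastBreak + 2);
      }
      return pump();
    });
  return pump();
})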
