This might be a special case:
I want to read from a queue (AWS SQS). This is done by making a call that waits a few seconds for messages and then resolves, and calling it again and again in a loop for as long as you want to process that queue (it checks a flag each time).
This means I have a consume function that keeps running as long as the app is active, or until the queue is unflagged.
I also have a subscribe function for subscribing to a queue, which is supposed to resolve as soon as it knows the consumer is able to connect to the queue, even though this function calls the consumer, which keeps running and does not return until the queue is unflagged.
This gives me some challenges. Do you have any tips on how to solve this with modern JS and async/await? Keep in mind this code runs in a React web app, not in Node.js.
I basically just want the await subscribe(QUEUE) call (which comes from the GUI) to resolve as soon as it's sure it can read from that queue. But if it cannot, I want it to throw an error that propagates to the origin of the subscribe call, which means I would have to await consume(QUEUE), right?
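In other words, something like this rough sketch is what I'm after (receiveOnce, consumeForever and reportQueueError are just placeholder names for what I describe above):
async function subscribe(queueUrl: QueueURL) {
  // One probing poll: if this throws, the error propagates to `await subscribe(...)`
  await receiveOnce(queueUrl)
  // Then keep consuming in the background without blocking the caller
  consumeForever(queueUrl).catch(reportQueueError)
}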
Update:
Some untested draft code has been added (I don't want to spend more time making it work if I'm not taking the right approach). I thought about passing success and failure callbacks to the consuming function, so it can report success as soon as it gets the first valid (but possibly empty) response from the queue, which makes it store the queue URL as a subscription, and unsubscribe if the queue poll fails.
Since I'm setting up several queue consumers, they should not block anything but just work in the background.
let subscribedQueueURLs = []
async function consumeQueue(
url: QueueURL,
success: () => mixed,
failure: (error: Error) => mixed
) {
const sqs = new AWS.SQS()
const params = {
QueueUrl: url,
WaitTimeSeconds: 20,
}
try {
do {
// eslint-disable-next-line no-await-in-loop
const receivedData = await sqs.receiveMessage(params).promise()
if (!subscribedQueueURLs.includes(url)) {
success()
}
// eslint-disable-next-line no-restricted-syntax
for (const message of receivedData.Messages || []) { // SQS may omit Messages when the poll returns empty
console.log({ message })
// eslint-disable-next-line no-await-in-loop
eventHandler && (await eventHandler.message(message, url))
const deleteParams = {
QueueUrl: url,
ReceiptHandle: message.ReceiptHandle,
}
// eslint-disable-next-line no-await-in-loop
const deleteResult = await sqs.deleteMessage(deleteParams).promise()
console.log({ deleteResult })
}
} while (subscribedQueueURLs.includes(url))
} catch (error) {
failure(error)
}
}
export const subscribe = async (entityType: EntityType, entityId: EntityId) => {
const url = generateQueueURL(entityType, entityId)
consumeQueue(
url,
() => {
subscribedQueueURLs.push(url)
eventHandler && eventHandler.subscribe(url)
},
error => {
console.error(error)
unsubscribe(entityType, entityId)
}
)
}
I ended up solving it like this; maybe not the most elegant solution, though...
let eventHandler: ?EventHandler
let awsOptions: ?AWSOptions
let subscribedQueueUrls = []
let sqs = null
let sns = null
export function setup(handler: EventHandler) {
eventHandler = handler
}
export async function login(
{ awsKey, awsSecret, awsRegion }: AWSCredentials,
autoReconnect: boolean
) {
const credentials = new AWS.Credentials(awsKey, awsSecret)
AWS.config.update({ region: awsRegion, credentials })
sqs = new AWS.SQS({ apiVersion: '2012-11-05' })
sns = new AWS.SNS({ apiVersion: '2010-03-31' })
const sts = new AWS.STS({ apiVersion: '2011-06-15' })
const { Account } = await sts.getCallerIdentity().promise()
awsOptions = { accountId: Account, region: awsRegion }
eventHandler && eventHandler.login({ awsRegion, awsKey, awsSecret }, autoReconnect)
}
async function handleQueueMessages(messages, queueUrl) {
if (!sqs) {
throw new Error(
'Attempt to subscribe before SQS client is ready (i.e. authenticated).'
)
}
// eslint-disable-next-line no-restricted-syntax
for (const message of messages || []) { // SQS may omit Messages when the poll returns empty
if (!eventHandler) {
return
}
// eslint-disable-next-line no-await-in-loop
await eventHandler.message({
content: message,
queueUrl,
timestamp: new Date().toISOString(),
})
const deleteParams = {
QueueUrl: queueUrl,
ReceiptHandle: message.ReceiptHandle,
}
// eslint-disable-next-line no-await-in-loop
await sqs.deleteMessage(deleteParams).promise()
}
}
export async function subscribe(queueUrl: QueueUrl) {
if (!sqs) {
throw new Error(
'Attempt to subscribe before SQS client is ready (i.e. authenticated).'
)
}
const initialParams = {
QueueUrl: queueUrl,
WaitTimeSeconds: 0,
MessageAttributeNames: ['All'],
AttributeNames: ['All'],
}
const longPollParams = {
...initialParams,
WaitTimeSeconds: 20,
}
// Attempt to consume the queue, and handle any pending messages.
const firstResponse = await sqs.receiveMessage(initialParams).promise()
if (!subscribedQueueUrls.includes(queueUrl)) {
subscribedQueueUrls.push(queueUrl)
eventHandler && eventHandler.subscribe(queueUrl)
}
handleQueueMessages(firstResponse.Messages, queueUrl)
// Keep on polling the queue afterwards.
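// Note: setImmediate is a Node.js API, not a standard browser one; in a React
// web app this relies on the bundler providing a polyfill (setTimeout(fn, 0)
// would be the browser-native alternative).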
setImmediate(async () => {
if (!sqs) {
throw new Error(
'Attempt to subscribe before SQS client is ready (i.e. authenticated).'
)
}
try {
do {
// eslint-disable-next-line no-await-in-loop
const received = await sqs.receiveMessage(longPollParams).promise()
handleQueueMessages(received.Messages, queueUrl)
} while (sqs && subscribedQueueUrls.includes(queueUrl))
} catch (error) {
eventHandler && eventHandler.disconnect()
throw error
}
})
}
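With this in place, the GUI-side caller can simply await subscribe(...) and handle connection problems from the first poll, roughly like this (handleSubscribeClick is just an illustrative name):
async function handleSubscribeClick(queueUrl: QueueUrl) {
  try {
    // Resolves once the first receiveMessage call succeeds
    await subscribe(queueUrl)
    console.log('Subscribed to', queueUrl)
  } catch (error) {
    // The first poll failed, so the subscription was never established
    console.error('Could not subscribe to', queueUrl, error)
  }
}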
Related
I am an automation QA engineer testing an app. According to the test, after the button is clicked I need to get the response body returned from the server, like the one shown in DevTools' Network tab. I have tried multiple Java and Python code examples found here and tried to translate them to JavaScript, but nothing worked for me. I've been trying something like this:
try {
const url = 'http://someUrl';
const driver = await new Builder().forBrowser('chrome').build();
const cdpConnection = await driver.createCDPConnection('page');
await cdpConnection.execute('Network.responseReceived()', response => {
// Network.getResponseBody(), etc.
const res = response.getResponse();
console.log(res);
});
await driver.get(url);
await driver.quit();
} catch (error) {
console.log(error);
}
Network.responseReceived is an event, so you have to listen for messages on the underlying CDP connection.
wsConnection.on('message', message => {
let data = JSON.parse(message);
if (data.method === 'Network.loadingFinished') {
// ... load response body here
}
});
I use the Network.loadingFinished event instead of Network.responseReceived, because by then the response is completely loaded.
The problem is that the CDPConnection class is not properly implemented yet (CDPConnection.js#L18): execute doesn't return a promise. The communication is message-based, and although it attaches a message ID that could later be used to match the response message from the WebSocket, it doesn't handle that response message (webdriver.js#L1239).
Until that is implemented, you can use a custom CDPConnection class. Here is a TypeScript implementation.
let ID = 0;
type TAwaiter = {
id: number
resolve: (value: any) => void
reject: (reason?: any) => void
};
export class BiDiCDPConnection {
private requests: Map<number, TAwaiter> = new Map();
constructor(private wsConnection, private sessionId: string) {
wsConnection.on('message', this.onMessage.bind(this));
wsConnection.on('close', this.onClose.bind(this));
wsConnection.on('error', this.rejectAll.bind(this));
}
execute <T = any> (method, params, onMessageSent: (err) => any = null): Promise<T> {
let message = {
sessionId: this.sessionId,
method,
params,
id: ++ID,
};
let listener = {
id: message.id,
resolve: null,
reject: null,
};
let promise = new Promise<T>((resolve, reject) => {
listener.resolve = resolve;
listener.reject = reject;
});
this.requests.set(listener.id, listener);
this.wsConnection.send(JSON.stringify(message), onMessageSent);
return promise;
}
private onMessage (message: Buffer) {
let params = JSON.parse(message.toString());
let { id, result } = params;
if (id != null && this.requests.has(id)) {
this.requests.get(id)?.resolve?.(result);
this.requests.delete(id);
}
}
private onClose () {
this.rejectAll(new Error(`CDPConnection: The underlying connection was closed`));
}
private rejectAll(error: Error) {
let awaiters = this.requests.values();
this.requests = new Map();
for (let awaiter of awaiters) {
awaiter.reject(error);
}
}
}
Then you initialize the class and use it for your calls after creating the inner CDP connection, since createCDPConnection establishes the WebSocket connection.
const cdpConnection = await driver.createCDPConnection('page');
const wsConnection = driver._wsConnection;
const bidiCdpConnection = new BiDiCDPConnection(wsConnection, driver.sessionId);
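// Note: Network.* events are only delivered after the Network domain has been
// enabled (a Network.enable command), if that has not already happened elsewhere.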
wsConnection.on('message', async message => {
let data = JSON.parse(message);
if (data.method === 'Network.loadingFinished') {
let response = await bidiCdpConnection.execute('Network.getResponseBody', {
requestId: data.params.requestId,
});
console.log(response)
}
});
I use this to monitor (selenium-query/BrowserNetworkMonitor.ts) and intercept (selenium-query/BrowserNetworkInterceptor.ts) requests. You can take and modify those classes for your initial needs.
I am close to getting this to work... but not quite there. See my code here: https://github.com/SeleniumHQ/seleniumhq.github.io/issues/1155
If anyone can figure out the last step I'm missing, that'd be so amazing.
i.e.
let test = await cdpConnection.execute('Fetch.getResponseBody', {
requestId: obj.params.requestId,
});
console.log(test); // ------> THIS RETURNS UNDEFINED !!!!
Summary:
I've built a Chrome extension that reaches out to an external API to fetch some data. Sometimes that data returns quickly, sometimes it takes around 4 seconds. I'm often doing about 5-10 requests in rapid succession (this is a scraping tool).
Previously, a lot of requests were dropped because the Manifest V3 service worker shuts down at random. I thought I had resolved that; then I realized there was a race condition because local storage doesn't have a proper queue.
Current error: even with all these fixes, requests are still being dropped. The external API returns the correct data successfully, but it seems like the extension never receives it. Hoping someone can point me in the right direction.
Relevant code is attached; I imagine it will help someone dealing with these queue and service worker issues.
Local Storage queue
let writing: Map<string, Promise<any>> = new Map();
let updateUnsynchronized = async (ks: string[], f: Function) => {
let m = await new Promise((resolve, reject) => {
chrome.storage.local.get(ks, res => {
let m = {};
for (let k of ks) {
m[k] = res[k];
}
maybeResolveLocalStorage(resolve, reject, m);
});
});
// Guaranteed to have not changed in the meantime
let updated = await new Promise((resolve, reject) => {
let updateMap = f(m);
chrome.storage.local.set(updateMap, () => {
maybeResolveLocalStorage(resolve, reject, updateMap);
});
});
console.log(ks, 'Updated', updated);
return updated;
};
export async function update(ks: string[], f: Function) {
let ret = null;
// Global lock for now
await navigator.locks.request('global-storage-lock', async lock => {
ret = await updateUnsynchronized(ks, f);
});
return ret;
}
Here's the main function
export async function appendStoredScrapes(
scrape: any,
fromHTTPResponse: boolean
) {
let updated = await update(['urlType', 'scrapes'], storage => {
const urlType = storage.urlType;
const scrapes = storage.scrapes;
const {url} = scrape;
if (fromHTTPResponse) {
// We want to make sure that the url type at time of scrape, not time of return, is used
scrapes[url] = {...scrapes[url], ...scrape};
} else {
scrapes[url] = {...scrapes[url], ...scrape, urlType};
}
return {scrapes};
});
chrome.action.setBadgeText({text: `${Object.keys(updated['scrapes']).length}`});
}
Keeping the service worker alive
let defaultKeepAliveInterval = 20000;
// To avoid GC
let channel;
// To be run in content scripts
export function contentKeepAlive(name : string) {
channel = chrome.runtime.connect({ name });
channel.onDisconnect.addListener(() => contentKeepAlive(name));
channel.onMessage.addListener(msg => { });
}
let deleteTimer = (chan : any) => {
if (chan._timer) {
clearTimeout(chan._timer);
delete chan._timer;
}
}
let backgroundForceReconnect = (chan : chrome.runtime.Port) => {
deleteTimer(chan);
chan.disconnect();
}
// To be run in background scripts
export function backgroundKeepAlive(name : string) {
chrome.runtime.onConnect.addListener(chan => {
if (chan.name === name) {
channel = chan;
channel.onMessage.addListener((msg, chan) => { });
channel.onDisconnect.addListener(deleteTimer);
channel._timer = setTimeout(backgroundForceReconnect, defaultKeepAliveInterval, channel);
}
});
}
// "Always call sendResponse() in your chrome.runtime.onMessage listener even if you don't need
// the response. This is a bug in MV3." — https://stackoverflow.com/questions/66618136/persistent-service-worker-in-chrome-extension
export function defaultSendResponse (sendResponse : Function) {
sendResponse({ farewell: 'goodbye' });
}
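(Not shown above: the content script presumably calls the matching helper once, using the same channel name as background.ts below.)
// In a content script: open the keep-alive port; the name must match the one
// passed to backgroundKeepAlive() in background.ts
contentKeepAlive('extension-background');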
Relevant parts of background.ts
backgroundKeepAlive('extension-background');
let listen = async (request, sender, sendResponse) => {
try {
if (request.message === 'SEND_URL_DETAIL') {
const {url, website, urlType} = request;
await appendStoredScrapes({url}, false);
let data = await fetchPageData(url, website, urlType);
console.log(data, url, 'fetch data returned background');
await appendStoredScrapes(data, true);
defaultSendResponse(sendResponse);
} else if (request.message === 'KEEPALIVE') {
sendResponse({isAlive: true});
} else {
defaultSendResponse(sendResponse);
}
} catch (e) {
console.error('background listener error', e);
}
};
chrome.runtime.onMessage.addListener(function (request, sender, sendResponse) {
listen(request, sender, sendResponse);
});
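One thing worth double-checking here (not covered above): a chrome.runtime.onMessage listener that calls sendResponse asynchronously has to return true, otherwise Chrome closes the message channel as soon as the listener returns and the response is dropped. A minimal sketch of that change:
chrome.runtime.onMessage.addListener((request, sender, sendResponse) => {
  listen(request, sender, sendResponse);
  // Returning true keeps the sendResponse channel open until the async
  // listen() call eventually invokes it.
  return true;
});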
My download code relies on listening for events to determine when to fire callbacks, and whether the promise it's wrapped in should be resolved or rejected:
async function downloadMtgJsonZip() {
const path = Path.resolve(__dirname, 'resources', fileName);
const writer = Fs.createWriteStream(path);
console.info('...connecting...');
const { data, headers } = await axios({
url,
method: 'GET',
responseType: 'stream',
});
return new Promise((resolve, reject) => {
const timeout = 20000;
const timer = setTimeout(() => {
console.log('timed out'); // debug log
writer.close();
reject(new Error(`Promise timed out after ${timeout} ms`));
}, timeout);
let error = null;
const totalLength = headers['content-length'];
const progressBar = getProgressBar(totalLength);
console.info('...starting download...');
// set up data and writer listeners
data.on('data', (chunk) => progressBar.tick(chunk.length));
data.on('error', (err) => { // added this to see if it would be triggered - it is not
console.log(`did a data error: ${error}`);
error = err;
clearTimeout(timer);
writer.close();
reject(err);
});
writer.on('error', (err) => {
console.log(`did a writer error: ${error}`);
error = err;
clearTimeout(timer);
writer.close();
reject(err);
});
writer.on('close', () => {
const now = new Date();
console.log(`close called: ${now}`);
console.log(`error is: ${error}`);
console.info(
`Completed in ${(now.getTime() - progressBar.start) / 1000} seconds`,
);
clearTimeout(timer);
console.log(`time cleared: ${timer}`);
if (!error) resolve(true);
// no need to call the reject here, as it will have been called in the
// 'error' stream;
});
// finally call data.pipe with our writer
data.pipe(writer);
});
}
I had some issues writing my tests, but I managed to get something that worked, despite it feeling slightly messy, based on this advice:
Here is my test, with the relevant bits of my setup:
describe('fetchData', () => {
let dataChunkFn;
let dataErrorFn;
let dataOnFn;
let writerCloseFn;
let writerErrorFn;
let writerOnFn;
let pipeHandler;
beforeEach(() => {
// I've left all the mocking in place,
// to give an idea of what I've set up
const mockWriterEventHandlers = {};
const mockDataEventHandlers = {};
dataChunkFn = jest.fn((chunk) => mockDataEventHandlers.data(chunk));
dataErrorFn = jest.fn((chunk) => mockDataEventHandlers.error(chunk));
dataOnFn = jest.fn((e, cb) => {
mockDataEventHandlers[e] = cb;
});
writerCloseFn = jest.fn(() => mockWriterEventHandlers.close());
writerErrorFn = jest.fn(() => mockWriterEventHandlers.error());
writerOnFn = jest.fn((e, cb) => {
mockWriterEventHandlers[e] = cb;
});
const getMockData = (pipe) => ({
status: 200,
data: {
pipe,
on: dataOnFn,
},
headers: { 'content-length': 100 },
});
axios.mockImplementationOnce(() => getMockData(pipeHandler));
fs.createWriteStream.mockImplementationOnce(() => ({
on: writerOnFn,
close: writerCloseFn,
}));
jest.spyOn(console, 'info').mockImplementation(() => {});
jest.spyOn(console, 'log').mockImplementation(() => {});
});
it.only('handles errors from the writer', async (done) => {
console.log('writer error');
expect.assertions(1);
pipeHandler = (writer) => writer.emit('error', new Error('bang'));
try {
await downloadMtgJsonZip();
done.fail('ran without error');
} catch (exception) {
// expect(dataErrorFn).toHaveBeenCalled(); // neither of these are called
expect(writerErrorFn).toHaveBeenCalled();
}
});
});
I would have expected that when data.pipe(writer) ran and the writer emitted a new error, it would have triggered at least one of the error listeners.
The code itself runs as expected, and it even handles the timeout (which I initially set too low), but this last test doesn't pass.
As I commented above, neither of those functions is called, so the expect.assertions(1) line fails the test.
It's possible I need to fundamentally change how I've written the tests, but I'm not sure how I would do that.
Why doesn't that last test pass?
When the code invokes data.pipe(writer), it runs the pipeHandler function defined in the test. That function takes the given writer object and calls writer.emit(...). I believe the issue is that the writer object being passed in is the one mocked out for fs.createWriteStream(), which doesn't have an emit method defined, so nothing happens in response to that call. It is likely throwing an error, which you may be able to see in your catch block.
I believe what you want is to invoke the handlers saved by writerOnFn. One way to do so would be to add a property named emit to the object returned by your mock of fs.createWriteStream, defined as a function that invokes the appropriate handler from mockWriterEventHandlers. I haven't tested this code, but it would look something like the following:
const writerEmitFn = (event, arg) => {
mockWriterEventHandlers[event](arg);
}
fs.createWriteStream.mockImplementationOnce(() => ({
on: writerOnFn,
close: writerCloseFn,
emit: writerEmitFn,
}));
My guess is that Jest is gobbling up the error.
In order to keep running when exceptions occur, Jest could be guarding the code so your try/catch never sees the throw.
You could try asserting that an error was thrown using Jest's own API.
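For example, something along these lines (a sketch using Jest's rejects matcher, reusing the pipeHandler setup from the test above and assuming the writer mock's missing emit is addressed):
it('handles errors from the writer', async () => {
  pipeHandler = (writer) => writer.emit('error', new Error('bang'));
  // `rejects` awaits the promise and asserts that it rejects,
  // so no try/catch or manual assertion counting is needed
  await expect(downloadMtgJsonZip()).rejects.toThrow('bang');
});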
There is an SSE endpoint that shares a subscription if a consumer with the same key is already subscribed. If there is an active subscription, the data is already being polled for another client.
The problem is that the outer subscription never seems to catch the error and delegate it to the router in order to close the connection with the client: polling stops, but the connection stays active.
I think the issue is how I start the subscription that is to be shared, but I can't currently think of another way to resolve this.
Router (SSE) / outer subscription:
...
const clientId = Date.now();
const newClient = {
id: clientId,
res,
};
clients.push(newClient);
const sub = subscriptionService.listenToChanges(req.context, categoryIds).subscribe({
next: (data) => {
if (JSON.stringify(data) !== '{}') {
newClient.res.write(`data: ${JSON.stringify(data)}\n\n`);
} else {
newClient.res.write(': poke...\n\n');
}
},
error: () => {
// we never get here...
next(new InternalError());
clients = clients.filter((c) => c.id !== clientId);
res.end();
},
complete: () => {
res.end();
clients = clients.filter((c) => c.id !== clientId);
},
});
req.on('close', () => {
subscriptionService.stopListening(req.context);
sub.unsubscribe();
clients = clients.filter((c) => c.id !== clientId);
});
...
SubscriptionService
...
@trace()
public listenToChanges(ctx: Context, ids: string[]): Observable<{ [key: string]: Data }> {
const key = ctx.user?.email || ClientTypeKey.Anon;
try {
if (this.pool$[key]) {
return this.pool$[key];
}
this.poolSource[key] = new BehaviorSubject<{ [p: string]: Data }>({});
this.pool$[key] = this.poolSource[key].asObservable();
this.fetchData(ctx, ids);
return this.pool$[key].pipe(
catchError((e) => throwError(e)), // we never get here...
);
} catch (e) {
throw new Error(`Subscription Service Listen returned an error: "${e}"`);
}
}
...
private fetchData(ctx: Context, ids: string[]): void {
const key = ctx.user?.email || ClientTypeKey.Anon;
const sub = this.service.getData(ctx, ids)
.pipe(
catchError((e) => throwError(e)),
).subscribe(
(r) => this.poolSource[key].next(r),
(e) => throwError(e), // last time the error is caught
);
this.subscriptions[key] = sub;
}
...
Polling Service
...
@trace()
public getData(ctx: Context, ids: string[]): Observable<{[key: string]: Data}> {
try {
const key = ctx.user?.email || ClientTypeKey.Anon;
const pollingInterval = config.get('services.pollingInterval') || 10000;
return interval(pollingInterval).pipe(
startWith(0),
switchMap(() => this.getConfig(ctx, !!this.cachedData[key])),
map((r) => this.getUpdatedData(ctx, r.data, ids)),
catchError((e) => throwError(e)),
);
} catch (e) {
throw new Error(`Get Data returned an error: "${e}"`);
}
}
...
throwError doesn't actually throw an error, but rather creates an observable that emits an error.
From the docs:
[throwError] Creates an observable that will create an error instance and push it to the consumer as an error immediately upon subscription.
This is why using it inside subscribe does not work as intended. You should simply throw:
.subscribe(
(r) => this.poolSource[key].next(r),
(e) => { throw new Error(e); }
);
It seems like you have some unnecessary complexity in the way you are calling fetchData() in order to subscribe and push the result into a BehaviorSubject. I don't know all your requirements, but it seems like maybe you don't need the BehaviorSubject at all.
Instead of subscribing in fetchData(), you could simply return the observable and add it to your pool$ map, or maybe even get rid of fetchData() altogether:
public listenToChanges(ctx: Context, ids: string[]): Observable<{ [key: string]: Data }> {
const key = ctx.user?.email || ClientTypeKey.Anon;
try {
if (!this.pool$[key]) {
this.pool$[key] = this.service.getData(ctx, ids).pipe(
catchError((e) => throwError(e))
);
}
return this.pool$[key];
} catch (e) {
throw new Error(`Subscription Service Listen returned an error: "${e}"`);
}
}
Notes:
with the above simplification, maybe you no longer need the outer try/catch
this isn't a complete solution and may require some tweaks in other places in your code. I just wanted to point out what seems like unnecessary complexity.
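A further thought, not part of the answer above: if the intent is for several SSE clients with the same key to share one underlying polling subscription, an RxJS multicasting operator such as share() (imported from 'rxjs/operators') on the cached observable might achieve that without the BehaviorSubject. A rough sketch, reusing the types and services from the question:
public listenToChanges(ctx: Context, ids: string[]): Observable<{ [key: string]: Data }> {
  const key = ctx.user?.email || ClientTypeKey.Anon;
  if (!this.pool$[key]) {
    // share() multicasts getData(), so every subscriber for this key reuses
    // a single polling interval instead of starting its own
    this.pool$[key] = this.service.getData(ctx, ids).pipe(share());
  }
  return this.pool$[key];
}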
Not being able to get the following test to pass...
Sorry for putting up a lot of code....
I was able to get some other click events working, but I am stuck with this one at the moment.
Getting the following message:
"expect(jest.fn()).toHaveBeenCalled()
expected mock function to have been called."
Here is the click event in the render method:
<button
className={!this.state.name || !this.state.label || this.state.valueStore === null ? `add-custom-field-button disabled` : `add-custom-field-button`}
id="test-addclick"
onClick={() => {this.onAddClick()}}
>
Create Field
</button>
Here is the onAddClick method:
onAddClick = () => {
let obj = this.props.selectedFormJSON;
this.addValueAttribute().then(()=>{
obj.FORM_COLUMN.push(
{
Control: this.state.control,
CreateBy: this.props.user.userId,
Datatype: this.state.datatype,
Form_UID: this.props.selectedFormJSON.Form_UID,
Help: this.state.help,
ValueStore: this.state.valueStore
}
)
this.props.updateSelectedFormJSON(obj);
if(!this.props.isNewForm && this.state.valueStore) {
this.props.patchForm().then((res)=>{
if(res.Forms[0] && (res.Forms[0].Code === '200' || res.Forms[0].Code===200)) {
toast(<div>Attribute Added Successfully!</div>, {type: toast.TYPE.SUCCESS,position: toast.POSITION.TOP_LEFT})
} else {
toast(<div>Failed to Add Attribute!</div>, {type: toast.TYPE.ERROR,position: toast.POSITION.TOP_LEFT})
}
});
} else if(this.state.valueStore) {
this.props.postForm().then((res)=>{
if(res.Forms[0] && (res.Forms[0].Code === '201' || res.Forms[0].Code===201)) {
toast(<div>Attribute Added Successfully!</div>, {type: toast.TYPE.SUCCESS,position: toast.POSITION.TOP_LEFT})
} else {
toast(<div>Failed to Add Attribute!</div>, {type: toast.TYPE.ERROR,position: toast.POSITION.TOP_LEFT})
}
})
}
this.props.closeModal();
})
}
addValueAttribute = () => {
return new Promise((resolve, reject) => {
if(this.state.valueStore) {
let {valueTables, valueDatatypes, service} = this.state;
let body = {
Name: this.state.name,
TableName: this.props.selectedFormJSON.Entity,
Datatype: this.state.datatype,
ChangeType: 'I'
}
fetch(service.URL+'/VALUE_ATTRIBUTE', { headers: {
'Content-Type': 'application/json',
'Ocp-Apim-Subscription-Key': service.subscription_key,
},
method: 'POST',
credentials: 'include',
body: JSON.stringify(body),
})
.then((res) => {
res.status === 201 && resolve();
})
.catch(() => {
reject();
})
} else {
//Not a value attr
resolve()
}
})
}
Here is how I am trying to test it, using Jest/Enzyme. I have been using the same setup for some other click events and it has been working, but I am unable to figure out this one:
it("should call onAddClick", async () => { // use an async test method
baseProps.closeModal.mockClear();
baseProps.updateSelectedFormJSON.mockClear();
const instance = wrapper.instance();
const spy = jest.spyOn(instance, 'addValueAttribute'); // spy on addValueAttribute...
spy.mockImplementation(() => Promise.resolve()) // give any callbacks queued in PromiseJobs a chance to run
wrapper.find('#test-addclick').at(0).simulate('click'); // simulate click
expect(baseProps.updateSelectedFormJSON).toHaveBeenCalled(); // SUCCESS
expect(baseProps.closeModal).toHaveBeenCalled(); // SUCCESS
});
addValueAttribute is expensive so you will want to mock it to resolve immediately.
addValueAttribute is a class field so you will need to mock it using the component instance.
When onAddClick is called it will call this.addValueAttribute which will be mocked to immediately return. This will cause the Promise callback in then to get added to the PromiseJobs queue. Jobs in this queue run after the current message completes and before the next message begins.
This means that the callback that calls this.props.updateSelectedFormJSON and this.props.closeModal is queued in the PromiseJobs queue when the click handler returns and the test continues.
At this point you need to pause your test to give the callback queued in PromiseJobs a chance to run. The easiest way to do that is to make your test function async and call await Promise.resolve(); which will essentially queue the rest of the test at the end of the PromiseJobs queue and allow any jobs already in the queue to run first.
Putting it all together, here is a simplified version of your code with a working test:
import * as React from 'react';
import { shallow } from 'enzyme';
class Comp extends React.Component {
onAddClick = () => {
this.addValueAttribute().then(() => {
this.props.updateSelectedFormJSON();
this.props.closeModal();
})
}
addValueAttribute = () => {
return new Promise((resolve) => {
setTimeout(resolve, 100000); // does something computationally expensive
});
}
render() {
return (<button onClick={this.onAddClick}>Create Field</button>);
}
}
it("should call onAddClick", async () => { // use an async test method
const props = {
updateSelectedFormJSON: jest.fn(),
closeModal: jest.fn()
}
const wrapper = shallow(<Comp {...props} />);
const instance = wrapper.instance();
const spy = jest.spyOn(instance, 'addValueAttribute'); // spy on addValueAttribute...
spy.mockResolvedValue(); // ...and mock it to immediately resolve
wrapper
.find('button')
.at(0)
.simulate('click'); // simulate click
await Promise.resolve(); // give any callbacks queued in PromiseJobs a chance to run
expect(props.updateSelectedFormJSON).toHaveBeenCalled(); // SUCCESS
expect(props.closeModal).toHaveBeenCalled(); // SUCCESS
});
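Applied back to the original test, the change would presumably look something like this (untested sketch, based on the setup shown in the question): mock addValueAttribute to resolve, then await a resolved promise after the simulated click so the queued then() callback runs before the assertions.
it('should call onAddClick', async () => {
  baseProps.closeModal.mockClear();
  baseProps.updateSelectedFormJSON.mockClear();
  const instance = wrapper.instance();
  const spy = jest.spyOn(instance, 'addValueAttribute');
  spy.mockResolvedValue(undefined);                        // skip the real fetch
  wrapper.find('#test-addclick').at(0).simulate('click');
  await Promise.resolve();                                 // let the queued then() callback run
  expect(baseProps.updateSelectedFormJSON).toHaveBeenCalled();
  expect(baseProps.closeModal).toHaveBeenCalled();
});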