Test not looping in protractor - javascript

I have a test that I'm trying to fix. It needs to click on each result and verify the price and name match from the initial result with the final individual result. Currently my code will click on only the first image but will not navigate back to the results page to try the next result. I tried to remove the first() as my understanding is that method only takes the very first element and ignores the rest. Sadly that didn't work. What am I missing?
tester.it('links to the correct product details page when a result is clicked', () => {
  const $offer = element.all(by.css('#main-results .catalog-offer')).first();
  const offerResultsText = [];
  let offerResultsPrice;
  return Promise.all([
    $offer.all(by.css('.offer-name .name-part')).map(($namePart) =>
      $namePart.getText().then((partText) => offerResultsText.push(partText))
    ),
    $offer
      .element(by.css('.price'))
      .getText()
      .then((offerPrice) => (offerResultsPrice = offerPrice)),
  ])
    .then($offer.element(by.tagName('a')).click)
    .then(() =>
      browser.wait(
        protractor.ExpectedConditions.presenceOf(
          element(by.css('#recently-purchased-details'))
        ),
        5000
      )
    )
    .then(() =>
      expect(element(by.css('.details .subtotal > span')).getText()).to.eventually.equal(
        offerResultsPrice
      )
    )
    .then(() => {
      return offerResultsText.map((sourceString) => {
        console.log(sourceString);
        return expect(
          element(by.css('.details .setting .info')).getText()
        ).to.eventually.contains(sourceString);
      });
    });
});

Figured out what I was doing wrong. I needed to remove the return and then use our internal method afterward to loop through the result URLs. It looked like this in the end...
expect(testerUtils.getPageId()).to.eventually.equal('Recently Purchased Engagement Ring Details');
testerUtils.go(testContext.url);
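For the underlying goal of visiting every result, one framework-agnostic pattern is to resolve the result count first and then chain one visit per index, so each navigation fully finishes before the next begins. A minimal sketch, where `visit(i)` is a hypothetical stand-in for "click result i, verify it, navigate back":

```javascript
// Minimal sketch: run one async visit per index, strictly in sequence.
// `visit` is a placeholder for "click result i, verify, navigate back".
function forEachIndexSequentially(count, visit) {
  let chain = Promise.resolve();
  const visited = [];
  for (let i = 0; i < count; i++) {
    // each step is chained onto the previous one, so visits never overlap
    chain = chain.then(() => visit(i)).then((result) => visited.push(result));
  }
  return chain.then(() => visited);
}

// Usage: a fake visit that just resolves a label for each index.
forEachIndexSequentially(3, (i) => Promise.resolve(`result ${i}`))
  .then((visited) => console.log(visited)); // ['result 0', 'result 1', 'result 2']
```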


Compare two session IDs in Cypress

I am new to automation and coding in general and I would like to compare two session ID values with the following steps:
Get first value right after logging in
Refresh page
Get second value and make an assertion.
I made a custom command in order to simplify things:
Cypress.Commands.add('getSessionId', () => {
  let sessionId
  cy.getCookie('development')
    .its('value').then(($value) => {
      sessionId = String($value)
    })
})
I want the test script to look something like this:
let firstSessionId = cy.getSessionId()
cy.reload()
let secondSessionId = cy.getSessionId()
expect(firstSessionId).to.eq(secondSessionId)
There are two problems with this:
I cannot access the values as strings in this scenario
The expect runs before getting the IDs (I guess because of the asynchronous nature of Cypress?)
I would appreciate any hint about what I'm doing wrong. Thanks
This is the simplest way to perform the test, no need for a custom command in this case.
cy.getCookie('development').its('value')
  .then(sessionId1 => {
    cy.reload()
    cy.getCookie('development').its('value')
      .then(sessionId2 => {
        expect(sessionId1).to.eq(sessionId2)
      })
  })
If you want a custom command for other reasons,
Cypress.Commands.add('getSessionId', () => {
  cy.getCookie('development').its('value') // last command is returned
})
cy.getSessionId().then(sessionId1 => {
  cy.reload()
  cy.getSessionId().then(sessionId2 => {
    expect(sessionId1).to.eq(sessionId2)
  })
})
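The timing problem from the question (the expect running before the IDs exist) is not specific to Cypress; any deferred queue behaves this way. A plain-JavaScript sketch of the pitfall, with `Promise.resolve().then(...)` standing in for Cypress's command queue:

```javascript
// Sketch: assigning from inside a deferred callback, then reading synchronously.
let sessionId;
Promise.resolve().then(() => {
  sessionId = 'abc123'; // runs later, after the current call stack finishes
});
console.log(sessionId); // undefined — the callback has not run yet

// Reading inside .then() waits for the value, mirroring the fix above.
Promise.resolve()
  .then(() => 'abc123')
  .then((value) => console.log(value)); // abc123
```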
You can return the value from the custom command like this:
Cypress.Commands.add('getSessionId', () => {
  cy.getCookie('development')
    .its('value')
    .then((val) => {
      return cy.wrap(val)
    })
})
Then in your test, you can do this:
// Get first session id
cy.getSessionId().then((sessionId1) => {
  cy.wrap(sessionId1).as('sessionId1')
})
// Refresh page
cy.reload()
// Get second session id
cy.getSessionId().then((sessionId2) => {
  cy.wrap(sessionId2).as('sessionId2')
})
// Assert both (aliases are read back with the @ prefix)
cy.get('@sessionId1').then((sessionId1) => {
  cy.get('@sessionId2').then((sessionId2) => {
    expect(sessionId1).to.eq(sessionId2)
  })
})

How to stay within 2 GET requests per second with Axios (Shopify API)

I have about 650 products and each product has a lot of additional information relating to it being stored in metafields. I need all the metafield info to be stored in an array so I can filter through certain bits of info and display it to the user.
In order to get all the metafield data, you need to make one API call per product using the product ID, like so: /admin/products/#productid#/metafields.json
So what I have done is get all the product IDs, then run a 'for in' loop and make one call at a time. The problem is I run into a '429 error' because I end up making more than 2 requests per second. Is there any way to get around this, like with some sort of queuing system?
let products = []
let requestOne = `/admin/products.json?page=1&limit=250`
let requestTwo = `/admin/products.json?page=2&limit=250`
let requestThree = `/admin/products.json?page=3&limit=250`
// allProducts will hold an array with all products
let allProducts
let allMetaFields = []
let merge
$(document).ready(function () {
  axios
    .all([
      axios.get(`${requestOne}`),
      axios.get(`${requestTwo}`),
      axios.get(`${requestThree}`),
    ])
    .then(
      axios.spread((firstResponse, secondResponse, thirdResponse) => {
        products.push(
          firstResponse.data.products,
          secondResponse.data.products,
          thirdResponse.data.products
        )
      })
    )
    .then(() => {
      // all 3 responses into one array
      allProducts = [].concat.apply([], products)
    })
    .then(function () {
      for (const element in allProducts) {
        axios
          .get(
            `/admin/products/${allProducts[element].id}/metafields.json`
          )
          .then(function (response) {
            let metafieldsResponse = response.data.metafields
            allMetaFields.push(metafieldsResponse)
          })
      }
    })
    .then(function () {
      console.log("allProducts: " + allProducts)
      console.log("allMetaFields: " + allMetaFields)
    })
    .catch((error) => console.log(error))
})
When you hit 429 error, check for Retry-After header and wait for the number of seconds specified there.
You can also use X-Shopify-Shop-Api-Call-Limit header in each response to understand how many requests left until you exceed the bucket size limit.
See more details here: REST Admin API rate limits
By the way, you're using page-based pagination which is deprecated and will become unavailable soon.
Use cursor-based pagination instead.
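One simple way to honor the request bucket is to run the metafield calls sequentially with a fixed delay between them, instead of firing the whole loop at once. A minimal sketch; the 600 ms spacing is an assumption to stay under ~2 requests/second, and should be tuned against the X-Shopify-Shop-Api-Call-Limit header:

```javascript
// Minimal sketch: run request-producing functions one at a time,
// waiting `msBetween` between calls to stay under ~2 requests/second.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function throttledAll(tasks, msBetween = 600) {
  const results = [];
  for (const task of tasks) {
    results.push(await task()); // each task returns a promise, e.g. an axios.get
    await delay(msBetween);
  }
  return results;
}

// Usage with the metafield calls from the question (hypothetical wiring):
// const tasks = allProducts.map((p) => () =>
//   axios.get(`/admin/products/${p.id}/metafields.json`));
// throttledAll(tasks).then((responses) => { /* collect metafields */ });
```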

How do I convert each line's elements into an array so I can select only one element, like data[2]

Here is my code; I have received data from Firebase Storage.
listRef.listAll().then((res) => {
  res.prefixes.forEach((folderRef) => {
    // All the prefixes under listRef.
    // You may call listAll() recursively on them.
  });
  res.items.forEach((itemRef, index) => {
    itemRef.getDownloadURL().then((url) => {
      console.log(`${index} ${url}`)
    })
  });
})
Here is my output result:
0 https://myfirebaseapp.com/videos/nice.mp4
1 https://myfirebaseapp.com/videos/bad.mp4
2 https://myfirebaseapp.com/videos/funny.mp4 [ I want only this element instead of whole list ]
3 https://myfirebaseapp.com/videos/good.mp4
4 https://myfirebaseapp.com/videos/sad.mp4
You can use the find() method on your array. Here's an example:
var items = [
  'https://myfirebaseapp.com/videos/nice.mp4',
  'https://myfirebaseapp.com/videos/bad.mp4',
  'https://myfirebaseapp.com/videos/funny.mp4',
  'https://myfirebaseapp.com/videos/good.mp4',
  'https://myfirebaseapp.com/videos/sad.mp4'
]
var funny = items.find(x => x.endsWith('funny.mp4'));
console.log(funny);
For your code, it might look something like this:
listRef.listAll().then((res) => {
  ...
  // Find the string that ends with 'funny.mp4'.
  var funnyItem = res.items.find(x => x.fullPath.endsWith('funny.mp4'));
  if (funnyItem) {
    // The item we want was found. Do something with it...
    funnyItem.getDownloadURL().then((url) => {
      console.log(`Fetching ${url}...`);
    });
  }
})
The above example will work if we don't know the location of funny.mp4. If you know for sure that the location of the item you want is always going to be 2, then you could get away with doing this:
listRef.listAll().then((res) => {
  ...
  res.items[2].getDownloadURL().then((url) => {
    console.log(`Fetching ${url}...`);
  });
})
If you really just need a collection of the download URLs (hard to tell from your question), then you can project your items array with the map function like this:
listRef.listAll().then(async (res) => {
  ...
  var urls = await Promise.all(res.items.map(x => x.getDownloadURL()));
  console.log(`Fetching ${urls[2]}...`);
})
Keep in mind that this will invoke the awaitable getDownloadURL() method on every item, which is probably undesired.
You should put an if statement inside the forEach, checking on the index (if you generically want the 3rd element of the list) or on the name of the video (if you want that specific video).
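The if-inside-forEach approach can be sketched like this, using plain URL strings in place of Firebase itemRefs:

```javascript
// Sketch: pick one element by index inside forEach
// (plain strings stand in for Firebase itemRefs here).
const urls = [
  'https://myfirebaseapp.com/videos/nice.mp4',
  'https://myfirebaseapp.com/videos/bad.mp4',
  'https://myfirebaseapp.com/videos/funny.mp4',
  'https://myfirebaseapp.com/videos/good.mp4',
  'https://myfirebaseapp.com/videos/sad.mp4',
];
let picked;
urls.forEach((url, index) => {
  if (index === 2) { // or: if (url.endsWith('funny.mp4'))
    picked = url;
  }
});
console.log(picked); // https://myfirebaseapp.com/videos/funny.mp4
```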

Why is my effect running several times after the action is called?

I have this effect that requests several values to retrieve a product from a service. After dispatch, REQUEST_PRODUCTS is called one time as expected, but when I try to go to another location in the routing, this.apiMarketServices is called several times; this triggers the router navigation, which redirects back to the previous page. The action REQUEST_PRODUCTS is dispatched one time. Why is this effect called several times?
Do I need to add some kind of stop to the effect in order to avoid the calls after GetSuccess or GetFailed is returned?
@Effect()
requestProductsFromMarket = this.actions$
  .ofType(REQUEST_PRODUCTS)
  .withLatestFrom(this.store)
  .switchMap(([action, store]) => {
    const id = store.product.id;
    return this.savedProducts.getProduct(id, 'store');
  })
  .switchMap(_ => this.stateService.getMarketId())
  .switchMap(({ marketId }) =>
    this.apiMarketServices.get(MARKETS_PROFILES + marketId)
  )
  .withLatestFrom(this.store)
  .map(([r, store]) => {
    const ser = r.data.map(s => s.legId);
    const storSer = store.product.serIds;
    if (storSer.every(s => ser.includes(s))) {
      this.router.navigate([
        `/products/edit/${store.products.id}`
      ]);
      return GetSuccess;
    } else {
      return GetFailed;
    }
  })
  .catch(() => of(GetQueryFailed));
The solution to the defect is related to an Observable. While debugging, "this.apiMarketServices.get(MARKETS_PROFILES + marketId)" was called several times, so I suspected this service was the cause of the defect:
.switchMap(({ marketId }) =>
  this.apiMarketServices.get(MARKETS_PROFILES + marketId)
)
But the real cause was the stateService: this BehaviorSubject was updated with next() in other parts of the app.
.switchMap(_ => this.stateService.getMarketId())
In order to avoid those calls, I created a function to retrieve the current value from the BehaviorSubject.
getCurrentMarketId(): ClientData {
  return this.currentMarket.value; // BehaviorSubject
}
After adding this function to the effect, the call happens once per dispatched action.
...
.switchMap(([action, store]) => {
  const id = store.product.id;
  return this.savedProducts.getProduct(id, 'store');
})
.map(_ => this.stateService.getCurrentMarketId())
.switchMap(({ marketId }) =>
  this.apiMarketServices.get(MARKETS_PROFILES + marketId)
)
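The difference between subscribing to a BehaviorSubject (which re-fires on every next()) and reading its current value once can be illustrated with a minimal stand-in class; this is not the real RxJS implementation, just a sketch of the behavior:

```javascript
// Minimal stand-in for a BehaviorSubject (not the real RxJS class) showing
// why subscribing re-fires on every next() while reading .value does not.
class ValueHolder {
  constructor(initial) {
    this.value = initial;
    this.subscribers = [];
  }
  next(v) {
    this.value = v;
    this.subscribers.forEach((fn) => fn(v)); // every subscriber re-fires
  }
  subscribe(fn) {
    this.subscribers.push(fn);
    fn(this.value); // BehaviorSubject-style: emit current value immediately
  }
}

const market = new ValueHolder({ marketId: 1 });
let emissions = 0;
market.subscribe(() => emissions++); // fires once now...
market.next({ marketId: 2 });        // ...and again on every next() elsewhere
console.log(emissions);              // 2 — the "called several times" behavior
console.log(market.value.marketId);  // 2 — reading .value triggers nothing
```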

How many requests can Node-Express fire off at once?

I have a script that is pulling 25,000 records from AWS Athena, which is basically a PrestoDB relational SQL database. Let's say that I'm generating a request for each one of these records, which means I have to make 25,000 requests to Athena; then when the data comes back I have to make 25,000 requests to my Redis cluster.
What would be the ideal amount of requests to make at one time from node to Athena?
The reason I ask is because I tried to do this by creating an array of 25,000 promises and then calling Promise.all(promiseArray) on it, but the app just hung forever.
So I decided instead to fire off 1 at a time and use recursion to splice the first index out and then pass the remaining records to the calling function after the promise has been resolved.
The problem with this is that it takes forever. I took about an hour break and came back and there were 23,000 records remaining.
I tried to google how many requests Node and Athena can handle at once, but I came up with nothing. I'm hoping someone might know something about this and be able to share it with me.
Thank you.
Here is my code just for reference:
As a side note, what I would like to do differently is, instead of sending one request at a time, send 4, 5, 6, 7 or 8 at a time depending on how fast they execute.
Also, how would a Node cluster affect the performance of something like this?
exports.storeDomainTrends = () => {
  return new Promise((resolve, reject) => {
    athenaClient.execute(`SELECT DISTINCT the_column from "the_db"."the_table"`,
      (err, data) => {
        var getAndStoreDomainData = (records) => {
          if (records.length) {
            return new Promise((resolve, reject) => {
              var record = records.splice(0, 1)[0]
              athenaClient.execute(`
                SELECT
                  field,
                  field,
                  field,
                  SUM(field) as field
                FROM "the_db"."the_table"
                WHERE the_field IN ('Month') AND the_field = '` + record.domain_name + `'
                GROUP BY the_field, the_field, the_field
              `, (err, domainTrend) => {
                if (err) {
                  console.log(err)
                  reject(err)
                }
                redisClient.set(('Some String' + domainTrend[0].domain_name), JSON.stringify(domainTrend))
                resolve(domainTrend);
              })
            })
            .then(res => {
              getAndStoreDomainData(records);
            })
          }
        }
        getAndStoreDomainData(data);
      })
  })
}
Using the lib your code could look something like this:
const Fail = function (reason) { this.reason = reason; };
const isFail = x => (x && x.constructor) === Fail;
const distinctDomains = () =>
  new Promise(
    (resolve, reject) =>
      athenaClient.execute(
        `SELECT DISTINCT domain_name from "endpoint_dm"."bd_mb3_global_endpoints"`,
        (err, data) =>
          (err)
            ? reject(err)
            : resolve(data)
      )
  );
const domainDetails = domain_name =>
  new Promise(
    (resolve, reject) =>
      athenaClient.execute(
        `SELECT
          timeframe_end_date,
          agg_type,
          domain_name,
          SUM(endpoint_count) as endpoint_count
        FROM "endpoint_dm"."bd_mb3_global_endpoints"
        WHERE agg_type IN ('Month') AND domain_name = '${domain_name}'
        GROUP BY timeframe_end_date, agg_type, domain_name`,
        (err, domainTrend) =>
          (err)
            ? reject(err)
            : resolve(domainTrend)
      )
  );
const redisSet = keyValue =>
  new Promise(
    (resolve, reject) =>
      redisClient.set(
        keyValue,
        (err, res) =>
          (err)
            ? reject(err)
            : resolve(res)
      )
  );
const process = batchSize => limitFn => resolveValue => domains =>
  Promise.all(
    domains.slice(0, batchSize)
      .map(// map domains to promises
        domain =>
          // maximum 5 active connections
          limitFn(domainName => domainDetails(domainName))(domain.domain_name)
            .then(
              domainTrend =>
                // the redis client documentation makes no sense whatsoever
                //   https://redis.io/commands/set
                //   no mention of a callback
                //   https://github.com/NodeRedis/node_redis
                //   mentions a callback; since we need the return value
                //   and it's best to do it async, we use callback-to-promise
                redisSet([
                  `Endpoint Profiles - Checkin Trend by Domain - Monthly - ${domainTrend[0].domain_name}`,
                  JSON.stringify(domainTrend)
                ])
            )
            .then(
              redisReply => {
                // here is where things get unpredictable: set is documented as
                //   a synchronous function returning "OK" or a function that
                //   takes a callback, with no mention of what that callback
                //   receives as response. You should try with one or two records
                //   to finish this by reverse engineering, because the
                //   documentation fails 100% here and cannot be relied upon.
                console.log("bad documentation of redis client... reply is:", redisReply);
                return (redisReply === "OK")
                  ? domain
                  : Promise.reject(`Redis reply not OK:${redisReply}`)
              }
            )
            .catch(// catch failed, save error and domain of failed item
              e =>
                new Fail([e, domain])
            )
      )
  ).then(
    results => {
      console.log(`got ${batchSize} results`);
      const left = domains.slice(batchSize);
      if (left.length === 0) { // nothing left
        return resolveValue.concat(results);
      }
      // recursively call process until done
      return process(batchSize)(limitFn)(resolveValue.concat(results))(left)
    }
  );
const max5 = lib.throttle(5); // max 5 active connections to athena
distinctDomains() // you may want to limit the results to 50 for testing
  // you may want to limit batch size to 10 for testing
  .then(process(1000)(max5)([])) // we have 25000 domains here
  .then(
    results => { // have 25000 results
      const successes = results.filter(x => !isFail(x));
      // array of failed items; a failed item has a .reason property
      //   that is an array of 2 items: [the error, domain]
      const failed = results.filter(isFail);
    }
  )
You should figure out what the redis client does; I tried to work it out from the documentation, but I may as well ask my goldfish. Once you've reverse-engineered the client behavior, it is best to try with a small batch size to see if there are any errors. You have to import lib to use it; you can find it here.
I was able to take what Kevin B said and find a much quicker way to query the data. What I did was change the query so that I could get the trend for all domains from Athena. I ordered it by domain_name and then consumed it as a Node stream so that I could separate out each domain name into its own JSON as the data was coming in.
Anyways this is what I ended up with.
exports.storeDomainTrends = () => {
  return new Promise((resolve, reject) => {
    var streamObj = athenaClient.execute(`
      SELECT field,
        field,
        field,
        SUM(field) AS field
      FROM "db"."table"
      WHERE field IN ('Month')
      GROUP BY field, field, field
      ORDER BY field desc`).toStream();
    var data = [];
    streamObj.on('data', (record) => {
      if (!data.length || record.field === data[0].field) {
        data.push(record)
      } else if (data[0].field !== record.field) {
        redisClient.set(('Key'), JSON.stringify(data))
        data = [record]
      }
    })
    streamObj.on('end', resolve);
    streamObj.on('error', reject);
  })
  .then()
}
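To send a handful of requests at a time rather than one (the 4-8 in flight mentioned in the question), a small worker-pool helper is enough; a minimal sketch with no external library, where `worker` is a hypothetical stand-in for one Athena query:

```javascript
// Minimal sketch: process `items` with at most `limit` workers in flight.
async function runWithConcurrency(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  async function runner() {
    while (next < items.length) {
      const i = next++; // claim the next unprocessed index synchronously
      results[i] = await worker(items[i]);
    }
  }
  // start up to `limit` runners; each pulls items until none remain
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => runner()
  );
  await Promise.all(workers);
  return results;
}

// Usage: each worker call could wrap athenaClient.execute for one domain.
// runWithConcurrency(domains, 5, (d) => queryDomainTrend(d)).then(...);
```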
