JavaScript: how to build a loop whose iterations take a variable amount of time

I could use some advice from a JS jedi.
Situation: I have an array of user IDs and need to call a social network function to post on each user's wall. So I need to launch the wall-post function, listen for its answer, and only then continue iterating my loop.
The code I have written manages to post only to the last person in the array, or only to the first, or to nobody at all, among other equally useful behaviours :s . I need your help, jedis.
function showPost() {
  for (i in selFriends) { // selFriends - array with user_ids
    alert(i + ' ' + selFriends[i]); // checkpoint
    VK.api('wall.post', {
      owner_id: selFriends[i], // element
      message: htmlres,        // content
      attachment: photoID      // id of the media content
    }, function(receive) {
      showPost();
    });
    return;
  }
}
This code needs to be fixed, because right now it just keeps running wall.post for the first element of the array.
By the way, the VK.api method works much like Facebook's UI 'feed' method.
Thank you very much :)

I'm no jedi, but are you trying to run the first index in your selFriends array through VK.api, and call the next index in the callback? And then repeat for each element in the selFriends array?
function postId(index) {
  if (index === selFriends.length)
    return;
  VK.api('wall.post', {
    owner_id: selFriends[index], // element
    message: htmlres,            // content
    attachment: photoID          // id of the media content
  }, function(receive) {
    postId(index + 1);
  });
}
And then
function showPost() {
  postId(0);
}
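A more general way to express the same "wait for each call before starting the next" pattern is to chain Promises. The helper below is a sketch; `postToWall` in the commented-out usage is a hypothetical wrapper around VK.api's callback style, not part of the VK SDK.

```javascript
// A generic helper (sketch): run an async operation for each item strictly
// in order, waiting for each call to finish before starting the next.
function runSequentially(items, asyncFn) {
  return items.reduce(
    (chain, item) => chain.then(() => asyncFn(item)),
    Promise.resolve()
  );
}

// Hypothetical application to the question: wrap VK.api's callback style
// in a Promise, then feed the wrapper to runSequentially.
// function postToWall(userId) {
//   return new Promise(function(resolve) {
//     VK.api('wall.post', {
//       owner_id: userId,
//       message: htmlres,
//       attachment: photoID
//     }, resolve);
//   });
// }
// runSequentially(selFriends, postToWall);
```

This avoids the explicit recursion of postId while keeping the same one-at-a-time behaviour.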

Related

How to get the last item from every firebase object and put it into an array

Folks, I'm using React Native Firebase and I want to achieve what you can see in the attached screenshot:
there's a messages key, it has threads underneath, and each thread has multiple messages. I want to get the last item from every thread and put it into an array. How can I achieve this? I'm trying to use forEach but not getting my desired result.
Here is my code:
messagesRef.on('value', snapshot => {
  let messsagesFB = [];
  snapshot.forEach(element => {
    messsagesFB = [...messsagesFB, element.val()];
  });
});
I know my code will push every element, but I want to push only the last item from each thread.
For your database reference messagesRef, try adding .orderByKey().limitToLast(1) to the query, in which case a loop is not needed:
messagesRef.orderByKey().limitToLast(1).on('value', snapshot => {
  // TODO: Add to array
});
There's also the usual trick of catching the last iteration of a forEach loop. Note that Firebase's snapshot.forEach passes only the child snapshot (no index or array arguments), so convert the value to an array first. In your case, you could simply do:
const values = Object.values(snapshot.val());
values.forEach((element, idx) => {
  if (idx === values.length - 1) {
    messsagesFB.push(element);
  }
});
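Since the goal is the last message of every thread, the whole extraction can be done in one pure function. This is a sketch that assumes snapshot.val() yields an object of threads, each keyed by chronologically ordered push IDs (so insertion order is message order); `lastMessagePerThread` is a name invented here for illustration.

```javascript
// A sketch of extracting the last message of each thread in one pass,
// assuming each thread's value is an object whose key order is chronological.
function lastMessagePerThread(threadsVal) {
  return Object.values(threadsVal).map(function(thread) {
    var messages = Object.values(thread);
    return messages[messages.length - 1];
  });
}
```

Inside `messagesRef.on('value', …)` one would then call `lastMessagePerThread(snapshot.val())` to build the array directly.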

Better performance when saving large JSON file to MySQL

I have an issue. So, here's my story:
I have a 30 GB file (JSON) of all Reddit posts in a specific timeframe.
I will not insert all values of each post into the table.
I have followed this series, where the author coded what I'm trying to do in Python.
I tried to follow along in Node.js, but when I test it, it's way too slow: it inserts one row every 5 seconds, and with 500,000+ Reddit posts that would literally take years.
So here's an example of what I'm doing:
var readStream = fs.createReadStream(location)
oboe(readStream)
  .done(async function(post) {
    let { parent_id, body, created_utc, score, subreddit } = post;
    let comment_id = post.name;
    // Checks if there is a comment with the comment id of this post's parent id in the table
    getParent(parent_id, function(parent_data) {
      // Checks if there is a comment with the same parent id, and then checks which one has a higher score
      getExistingCommentScore(parent_id, function(existingScore) {
        // other code above but it isn't relevant for my question
        // this function adds the query I made to a table
        addToTransaction()
      })
    })
  })
Basically, what that does is start a read stream and pass it on to a module called oboe, which gives me back parsed JSON.
Then it checks whether the parent is already saved in the database, and then whether there is an existing comment with the same parent id.
I need to use both functions in order to get the data I need (keeping only the "best" comment).
This is roughly what addToTransaction looks like:
function addToTransaction(query) {
  // adds the query to a list, then checks if the length of that list is 1000 or more
  if (length >= 1000) {
    connection.beginTransaction(function(err) {
      if (err) throw new Error(err);
      for (var n = 0; n < transactions.length; n++) {
        let thisQuery = transactions[n];
        connection.query(thisQuery, function(err) {
          if (err) throw new Error(err);
        })
      }
      connection.commit();
    })
  }
}
What addToTransaction does is take the queries I made and push them into an array, check that array's length, and then begin a new transaction, execute all those queries in a for loop, and finally commit (to save).
The problem is, it's so slow that the callback function I made doesn't even get called.
My question (finally) is: is there any way I could improve the performance?
(If you're wondering why I am doing this, it is because I'm trying to create a chatbot.)
I know I've posted a lot, but I tried to give you as much information as I could so you'd have a better chance of helping me. I appreciate any answers, and I will answer any questions you have.
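One common fix for this kind of slowdown is to batch many rows into a single multi-row INSERT rather than issuing one query per row. The sketch below builds such a statement; the table and column names are assumptions for illustration, not taken from the question's schema.

```javascript
// A sketch of batching rows into a single multi-row INSERT; one statement
// with many VALUES tuples is usually far faster than one query per row.
// Table and column names here are illustrative assumptions.
function buildBatchInsert(rows) {
  const placeholders = rows.map(() => '(?, ?, ?, ?, ?, ?)').join(', ');
  const sql =
    'INSERT INTO comments ' +
    '(comment_id, parent_id, body, created_utc, score, subreddit) ' +
    'VALUES ' + placeholders;
  const values = [];
  for (const r of rows) {
    values.push(r.comment_id, r.parent_id, r.body,
                r.created_utc, r.score, r.subreddit);
  }
  return { sql, values };
}
```

With the mysql driver, `connection.query(sql, values, callback)` would then run once per 1000-row batch instead of 1000 times.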

Execute function only after display is fully rendered from ng-repeat, not when ng-repeat reaches last index

I have a list being generated from ng-repeat, rendering a component tag on each iteration, and I'm giving each iteration a unique id using the $index value.
That looks like this:
<div ng-if="$ctrl.myArr.length > 0" ng-repeat="obj in $ctrl.myArr">
<myCustomComponentTag id="L1-{{$index}}" obj="obj"></myCustomComponentTag>
</div>
I need to run some jQuery on these unique ids, which populate just fine: the view correctly shows the tags with ids #L1-0, #L1-1, #L1-2, and so on.
The snag is that the function running my jQuery executes before the view has fully loaded and before the id values are actually populated with $index.
My jQuery searches for $(`#L1-${i}`) in a loop. If I store `#L1-${i}` in a string variable and output its value, it returns '#L1-0'.
But '#L1-0' does not exist yet, as the view has not been fully populated: when I break inside my function, the ids on the elements still read L1-{{$index}}.
I've read in several places to try this directive, but it is still not working. It seems that just because ng-repeat has reached its last element does not mean the view has rendered and my ids are fully loaded and in place.
How can I execute my jQuery function only after the view is fully populated with my data?
Edit:
This is the jQuery function
jqFn = () => {
  if (this.myArr.length > 0) {
    for (var i: number = 0; i < this.myArr.length; i++) {
      $(`#L1-${i}`).css({
        /*
          Add various styles here
        */
      });
    }
  }
};
Perhaps you can try something like this.
Disclaimer: I found it in one of the comments on the link you referenced in the post.
The solution I used for this was a recursive function that sets a timeout and searches for the id of the last object in the repeat statement until it exists. The id cannot exist until the dynamic data from the $watch has fully loaded, so this guarantees all the data is accessible.
The code looked similar to this:
setListener = (el, cb) => {
  if ($(el).length) {
    cb();
  } else {
    setTimeout(() => {
      this.setListener(el, cb);
    }, 500);
  }
};
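The same poll-until-present idea can be written framework-free, which also makes it easy to test. This is a sketch; `waitFor` is a name invented here, and the predicate/callback wiring to the question's code is an assumption.

```javascript
// Retry on a timer until a predicate says the target exists, then run the
// callback once. A framework-free sketch of the setListener pattern above.
function waitFor(existsFn, cb, interval) {
  if (existsFn()) {
    cb();
  } else {
    setTimeout(function() {
      waitFor(existsFn, cb, interval);
    }, interval || 100);
  }
}
```

In the question's case, the predicate would be something like `() => $('#L1-' + (this.myArr.length - 1)).length > 0` and the callback would be jqFn.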

Keeping Count of Child Objects

I am trying to come up with a solution to keep a count of a node with many child nodes. I thought of just keeping a field and incrementing it as items are added to the parent node.
My one concern is multiple users adding to the node at the same time: is there a way I could safely increment without worrying about overwriting the count if other users increment it at the same time?
Thanks to @FrankVanPuffelen for pointing me in the right direction. How exactly would you go about calling it for a simple counter? Here's what I wrote up, but it doesn't seem to be working the way I expected:
var ref = firebase.database().ref('Counter');

export function toggleStar(postRef) {
  postRef.transaction(function(post) {
    if (post) {
      post++;
    } else {
      post = 0;
    }
    return post;
  });
}
// then to call it:
toggleStar(ref);
I tried to keep it minimal so it could help someone else trying to implement a counter system. The field Counter in this case is just the spot where I would like to store it. I added a case so that if the value was false or null it would be set to 0.
EDIT 2:
Also did this:
export function toggleStar(postRef) {
  postRef.transaction(function(post) {
    if (post) {
      post.go++;
    } else {
      post = {};
      post.go = 0;
    }
    return post;
  });
}
And called it with the same method above. This does appear to be working. However, I am worried that this isn't accomplishing the process in the right way, so I just want to be sure: I don't want to overwrite other users' data and end up with inaccurate numbers.
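One subtlety in the first snippet above is that a stored value of 0 is falsy, so every run resets the counter to 0 instead of incrementing it. A minimal sketch of the fix, treating a missing value as 0 and always returning current + 1 (the `incrementCounter` name is invented here for illustration):

```javascript
// A minimal counter transaction (sketch): treat a missing value as 0 and
// always return current + 1, so the first call stores 1 rather than
// resetting to 0 on every run.
function incrementCounter(counterRef) {
  return counterRef.transaction(function(current) {
    return (current || 0) + 1;
  });
}
```

Because transactions retry automatically on conflict, concurrent increments from several users won't overwrite each other.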

Rxjs - Consume API output and re-query when cache is empty

I'm trying to implement a version of this intro to RxJS (fiddle here) that, instead of picking a random object from a returned API array, consumes a backthrottled stream of objects from the returned API array.
Here's the portion of the code that produces a controlled Observable from the API response (full fiddle here):
var responseStream = requestStream.flatMap(function(requestUrl) {
  return Rx.Observable.fromPromise(fetch(requestUrl));
}).flatMap(function(response) {
  return Rx.Observable.fromPromise(response.json());
}).flatMap(function(json) {
  return Rx.Observable.from(json);
}).controlled();
I just dump each emitted user to console.log, and use a click event stream to trigger the request() call on the controlled Observable:
responseStream.subscribe(function(user) {
  console.log(user);
});

refreshClickStream.subscribe(function(res) {
  responseStream.request(1);
});
There are about 50 user objects returned from the GitHub API, and I'd like to backthrottle-consume them one per click (as seen above). However, once I'm fresh out of user objects, I'd like to send another call to requestStream to fetch another API page, replenish the responseStream, and continue providing user objects to console.log on each click. What would be the RxJS-friendly way to do this?
I'd do it similarly to the article's example with combineLatest(), although I wonder if there's an easier way than mine.
I'm requesting only 3 items, and working with 3 items is hardcoded, so you'll want to modify this. I thought about making it universal, but that would require using a Subject and would make it much more complicated, so I stayed with this simple example.
Also, I'm using concatMap() to trigger fetching more data; just clicking the link triggers only the combineLatest(), which emits another item from the array.
See live demo: https://jsfiddle.net/h3bwwjaz/12/
var refreshButton = document.querySelector('#main');
var refreshClickStream = Rx.Observable.fromEvent(refreshButton, 'click')
  .startWith(0)
  .scan(function(acc, val, index) {
    return index;
  });

var usersStream = refreshClickStream
  .filter(function(index) {
    return index % 3 === 0;
  })
  .concatMap(function() {
    var randomOffset = Math.floor(Math.random() * 500);
    var url = 'https://api.github.com/users?since=' + randomOffset + '&per_page=3';
    return Rx.Observable.fromPromise(fetch(url))
      .flatMap(function(response) {
        return Rx.Observable.fromPromise(response.json());
      });
  })
  .combineLatest(refreshClickStream, function(responseArray, index) {
    return responseArray[index % 3];
  })
  .distinct();

usersStream.subscribe(function(user) {
  console.log(user);
});
I use refreshClickStream twice:
to emit the next item in the array in combineLatest()
to check whether we've reached the end of the array and need to make another request (that's the filter() operator).
At the end, distinct() is required because every click where index % 3 === 0 in fact triggers two emissions: the first from downloading the data, and a second directly from combineLatest() that we want to ignore, because we don't want to iterate over the same data again. Thanks to distinct(), the duplicate is ignored and only new values are passed on.
I tried to figure out a method that doesn't use distinct(), but I couldn't find one.
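The underlying "hand out one cached item per request and refetch when the cache is empty" pattern can also be sketched without RxJS, which may clarify what the stream code above is doing. This is a sketch; `makeUserQueue` and `fetchPage` are names invented here, with `fetchPage` standing in for the GitHub API call.

```javascript
// Hand out one cached user per call; refetch a page only when the cache is
// empty. A framework-free sketch of the consume-and-replenish pattern above.
function makeUserQueue(fetchPage) {
  var cache = [];
  return function next() {
    if (cache.length > 0) {
      return Promise.resolve(cache.shift());
    }
    return fetchPage().then(function(page) {
      cache = page.slice();
      return cache.shift();
    });
  };
}
```

Each click handler would then simply call `next().then(user => console.log(user))`.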
