User profiles on GitHub display the number of contributions a user has made in the past year.
I'd like to display this information on my website, preferably without having to introduce any kind of backend. I've looked for it in several places. Nothing is listed in /users/(me)/. Short of scraping the page source, is there a way to retrieve this information? I don't see anything documented...
Have you seen the GitHub API? Using it, you can run the request from your JS file without implementing any backend.
For your purpose, I think you could use the following request: https://api.github.com/users/yourUserName/events. That gives you as a result a list of events related to yourUserName, e.g.
[
{
"id": "xxxxxxx",
"type": "PushEvent",
"actor": {.......},
"repo": {......},
"payload": {.......},
"public": true,
"created_at": "2016-11-17T21:33:15Z"
},
.....
]
You should go through the list and keep only the events whose type is PushEvent and whose created_at falls within the last year.
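As a rough sketch (yourUserName is a placeholder, and the endpoint is paginated, so a single request only covers the most recent events), the filtering could look like this:
fetch('https://api.github.com/users/yourUserName/events')
  .then(res => res.json())
  .then(events => {
    const oneYearAgo = new Date();
    oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
    // keep only pushes from the last year
    const pushes = events.filter(e =>
      e.type === 'PushEvent' && new Date(e.created_at) > oneYearAgo
    );
    console.log(pushes.length + ' push events in the last year');
  });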
I hope this can help you!
Update: that endpoint only gives results for the last 3 months, so another possibility is to check the user's repositories (users/username/repos) and after that check the commits in each of them (repos/username/repoName/commits).
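A rough, untested sketch of that fallback (it ignores pagination, forks, and empty repositories, all of which you would need to handle for an accurate count):
const user = 'yourUserName';
const since = new Date();
since.setFullYear(since.getFullYear() - 1);

fetch('https://api.github.com/users/' + user + '/repos')
  .then(res => res.json())
  .then(repos => Promise.all(repos.map(repo =>
    fetch('https://api.github.com/repos/' + user + '/' + repo.name +
          '/commits?author=' + user + '&since=' + since.toISOString())
      .then(res => res.json())
      // an empty repository returns an error object instead of an array
      .then(commits => Array.isArray(commits) ? commits.length : 0)
  )))
  .then(counts => {
    const total = counts.reduce((a, b) => a + b, 0);
    console.log('commits in the last year: ' + total);
  });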
I've settled on parsing the HTML of my profile page using PHP. I begin by downloading the HTML for my profile page:
$githubProfile = file_get_contents('https://github.com/controversial');
Next, I find the position of the phrase "contributions in the last year" on the page. The whitespace between "contributions" and "in the last year" is inconsistent, so I'm using a regex to match any amount of whitespace.
$contributionsIndex = preg_match(
'/contributions\s+in the last year/',
$githubProfile,
$matches,
PREG_OFFSET_CAPTURE
);
$index = $matches[0][1];
Finally, I iterate backwards from this point as long as the character is either a digit or a comma within the number. Once this condition becomes false, I know I've found the beginning of the number:
$endIndex = $index; // position of the 'c' in "contributions"
// Step backwards over digits and thousands-separator commas
while (is_numeric($githubProfile[$index-2]) || $githubProfile[$index-2] == ',')
    $index--;
// The number starts at $index-1 and its length is $endIndex-$index
$contributionsCount = substr($githubProfile, $index-1, $endIndex-$index);
Then I echo the number out (with the commas stripped):
echo(str_replace(',', '', $contributionsCount));
Finally, I use an AJAX call from my page's JavaScript to get the value from the PHP script.
function get(url, callback) {
/* eslint-disable no-console */
const xhr = new XMLHttpRequest();
xhr.open('GET', url);
xhr.onload = e => (callback || console.log)(e.target.responseText);
xhr.onerror = () => console.log('error');
xhr.send();
/* eslint-enable no-console */
}
// Fill in GitHub contributions count
get('contributions-count.php', (resp) => {
document.getElementById('gh-contributions-count').innerText = resp;
});
I'm sure this will be accomplishable more cleanly in the future with GitHub's GraphQL API, but that's still in preview stages.
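For reference, a hedged sketch of what that might look like with the GraphQL API (the field names are my assumption from the preview schema, and the endpoint requires an authenticated token, so it cannot run from a backend-free page; YOUR_TOKEN is a placeholder):
const query = `query {
  user(login: "controversial") {
    contributionsCollection {
      contributionCalendar { totalContributions }
    }
  }
}`;

fetch('https://api.github.com/graphql', {
  method: 'POST',
  headers: { 'Authorization': 'bearer YOUR_TOKEN' },
  body: JSON.stringify({ query: query })
})
  .then(res => res.json())
  .then(json => console.log(
    json.data.user.contributionsCollection.contributionCalendar.totalContributions
  ));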
It's also possible to parse the SVG calendar data with an XML parser and then sum all the data-count entries. To avoid CORS issues, you can use a proxy like urlreq living at http://urlreq.appspot.com/req:
const user = 'controversial';
fetch('https://urlreq.appspot.com/req?method=GET&url=https://github.com/users/' + user + '/contributions')
.then(function(response) {
return response.text();
})
.then(function(text) {
var xmlDoc = new DOMParser().parseFromString(text, 'text/xml');
var nodes = xmlDoc.getElementsByTagName('rect');
var count = 0;
for (var i = 0; i < nodes.length; i++) {
count += parseInt(nodes[i].getAttribute('data-count'));
}
console.log('contributions count : ' + count);
})
.catch(function(error) {
console.log('Request failed', error)
});
Another example using the command line with curl and xmlstarlet can be found here.
The project aims to study a new social media platform:
https://booyah.live/
My needs are:
1 - Collect data from the profiles that follow a specific profile.
2 - Have my account use this data to follow the collected profiles.
3 - Among other possible options, also unfollow the profiles I follow.
The problem found in the current script:
The profile data is, in theory, being collected and the script runs all the way to the end, but for a reason I can't pin down, instead of following all the collected profiles it only follows the base profile.
For example:
I want to follow all 250 profiles that follow the ID 123456
I activate the booyahGetAccounts(123456); script
In theory the end result would be my account following 250 profiles
But the actual end result is that I follow only the 123456 profile, so the count of people I'm following is 1
Complete Project Script:
const csrf = 'MY_CSRF_TOKEN';
async function booyahGetAccounts(uid, type = 'followers', follow = 1) {
if (typeof uid !== 'undefined' && !isNaN(uid)) {
const loggedInUserID = window.localStorage?.loggedUID;
if (uid === 0) uid = loggedInUserID;
const unfollow = follow === -1;
if (unfollow) follow = 1;
if (loggedInUserID) {
if (csrf) {
async function getUserData(uid) {
const response = await fetch(`https://booyah.live/api/v3/users/${uid}`),
data = await response.json();
return data.user;
}
const loggedInUserData = await getUserData(loggedInUserID),
targetUserData = await getUserData(uid),
followUser = uid => fetch(`https://booyah.live/api/v3/users/${loggedInUserID}/followings`, { method: (unfollow ? 'DELETE' : 'POST'), headers: { 'X-CSRF-Token': csrf }, body: JSON.stringify({ followee_uid: uid, source: 43 }) }),
logSep = (data = '', usePad = 0) => typeof data === 'string' && usePad ? console.log((data ? data + ' ' : '').padEnd(50, '━')) : console.log('━'.repeat(50),data,'━'.repeat(50));
async function getList(uid, type, follow) {
const isLoggedInUser = uid === loggedInUserID;
if (isLoggedInUser && follow && !unfollow && type === 'followings') {
follow = 0;
console.warn('You already follow your followings. `follow` mode switched to `false`. Followings will be retrieved instead of followed.');
}
const userData = await getUserData(uid),
totalCount = userData[type.slice(0,-1)+'_count'] || 0,
totalCountStrLength = totalCount.toString().length;
if (totalCount) {
let userIDsLength = 0;
const userIDs = [],
nickname = userData.nickname,
nicknameStr = `${nickname ? ` of ${nickname}'s ${type}` : ''}`,
alreadyFollowedStr = uid => `User ID ${uid} already followed by ${loggedInUserData.nickname} (Account #${loggedInUserID})`;
async function followerFetch(cursor = 0) {
const fetched = [];
await fetch(`https://booyah.live/api/v3/users/${uid}/${type}?cursor=${cursor}&count=100`).then(res => res.json()).then(data => {
const list = data[type.slice(0,-1)+'_list'];
if (list?.length) fetched.push(...list.map(e => e.uid));
if (fetched.length) {
userIDs.push(...fetched);
userIDsLength += fetched.length;
if (follow) followUser(uid);
console.log(`${userIDsLength.toString().padStart(totalCountStrLength)} (${(userIDsLength / totalCount * 100).toFixed(4)}%)${nicknameStr} ${follow ? 'followed' : 'retrieved'}`);
if (fetched.length === 100) {
followerFetch(data.cursor);
} else {
console.log(`END REACHED. ${userIDsLength} accounts ${follow ? 'followed' : 'retrieved'}.`);
if (!follow) logSep(targetList);
}
}
});
}
await followerFetch();
return userIDs;
} else {
console.log(`This account has no ${type}.`);
}
}
logSep(`${follow ? 'Following' : 'Retrieving'} ${targetUserData.nickname}'s ${type}`, 1);
const targetList = await getList(uid, type, follow);
} else {
console.error('Missing CSRF token. Retrieve your CSRF token from the Network tab in your inspector by clicking into the Network tab item named "bug-report-claims" and then scrolling down in the associated details window to where you see "x-csrf-token". Copy its value and store it into a variable named "csrf" which this function will reference when you execute it.');
}
} else {
console.error('You do not appear to be logged in. Please log in and try again.');
}
} else {
console.error('UID not passed. Pass the UID of the profile you are targeting to this function.');
}
}
This question is a continuation of the answer from this link:
Collect the full list of buttons to follow without having to scroll the page (DevTools Google Chrome)
Since I can't offer more bounty on that question, I created this one to offer the new bounty to anyone who can fix the bug and make the script work.
Account on the Booyah website to use for tests:
Access via Google:
User: teststackoverflowbooyah#gmail.com
Password: quartodemilha
I have to admit that your code is really hard to read; it took me less time to rewrite everything from scratch.
Given that we need a piece of code that can be cut and pasted into a web browser's JavaScript console and that has to store some data (i.e. the expiration of followings and the permanent followings), a few considerations are in order.
We can treat the expiration of followings as volatile data: something that, if lost, can simply be reset to one day after the moment we lost it. window.localStorage is a perfect candidate for storing this kind of data. If we change web browser, the only drawback is that we lose the expirations and can tolerate resetting them to one day after the switch.
To store the list of permanent followings, on the other hand, we need storage that survives a change of web browser. The best idea that came to my mind is to create an alternative account that follows the users we never want to stop following. In my code I used uid 3186068 (a random user); once you have created your own alternative account, just replace the first line of the code block with its uid.
Another thing we need to take care of is error handling: the APIs can always return errors. The approach I chose is to write myFetch, which retries the same call a few times in case of errors; if the error persists, we are probably facing a temporary booyah.live outage and just need to retry a bit later.
To provide a comfortable interface, the code block gathers the uid from window.location: to follow the followers of a user, just cut and paste the code block into a tab opened on their profile. For example, I run the code from a tab open on https://booyah.live/studio/123456?source=44.
Last, to unfollow users, the clean function is first called 5 minutes after we paste the code (so it does not conflict with the calls that follow followers) and then runs again one hour after it finishes its job. It accesses localStorage in an atomic way, so you can have many instances running simultaneously in different tabs of the same browser without worrying about it. The only thing to keep in mind is that when window.location changes, all the JavaScript events in the tab are reset; so I suggest keeping a tab open on the home page, pasting the code block there, and forgetting about that tab: it will be the tab responsible for unfollowing users. Then open other tabs to do what you need; when you reach a user whose followers you want to follow, paste the block into that tab, wait until the job is finished, and continue using the tab normally.
// The account we use to store followings
const followingsUID = 3186068;
// Gather the loggedUID from window.localStorage
const { loggedUID } = window.localStorage;
// Gather the CSRF-Token from the cookies
const csrf = document.cookie.split("; ").reduce((ret, _) => (_.startsWith("session_key=") ? _.substr(12) : ret), null);
// APIs could have errors, let's do some retries
async function myFetch(url, options, attempt = 0) {
try {
const res = await fetch("https://booyah.live/api/v3/" + url, options);
const ret = await res.json();
return ret;
} catch(e) {
// After too many consecutive errors, let's abort: we need to retry later
if(attempt === 3) throw e;
return myFetch(url, options, attempt + 1);
}
}
function expire(uid, add = true) {
const { followingsExpire } = window.localStorage;
let expires = {};
try {
// Get and parse followingsExpire from localStorage
expires = JSON.parse(followingsExpire);
} catch(e) {
// In case of error (ex. new browsers) simply init to empty
window.localStorage.followingsExpire = "{}";
}
if(! uid) return expires;
// Set expire after 1 day
if(add) expires[uid] = new Date().getTime() + 3600 * 24 * 1000;
else delete expires[uid];
window.localStorage.followingsExpire = JSON.stringify(expires);
}
async function clean() {
try {
const expires = expire();
const now = new Date().getTime();
for(const uid in expires) {
if(expires[uid] < now) {
await followUser(parseInt(uid), false);
expire(uid, false);
}
}
} catch(e) {}
// Repeat clean in an hour
window.setTimeout(clean, 3600 * 1000);
}
async function fetchFollow(uid, type = "followers", from = 0) {
const { cursor, follower_list, following_list } = await myFetch(`users/${uid}/${type}?cursor=${from}&count=50`);
const got = (type === "followers" ? follower_list : following_list).map(_ => _.uid);
const others = cursor ? await fetchFollow(uid, type, cursor) : [];
return [...got, ...others];
}
async function followUser(uid, follow = true) {
console.log(`${follow ? "F" : "Unf"}ollowing ${uid}...`);
return myFetch(`users/${loggedUID}/followings`, {
method: follow ? "POST" : "DELETE",
headers: { "X-CSRF-Token": csrf },
body: JSON.stringify({ followee_uid: uid, source: 43 })
});
}
async function doAll() {
if(! loggedUID) throw new Error("Can't get 'loggedUID' from localStorage: try to login again");
if(! csrf) throw new Error("Can't get session token from cookies: try to login again");
console.log("Fetching current followings...");
const currentFollowings = await fetchFollow(loggedUID, "followings");
console.log("Fetching permanent followings...");
const permanentFollowings = await fetchFollow(followingsUID, "followings");
console.log("Syncing permanent followings...");
for(const uid of permanentFollowings) {
expire(uid, false);
if(currentFollowings.indexOf(uid) === -1) {
await followUser(uid);
currentFollowings.push(uid);
}
}
// Sync followingsExpire in localStorage
for(const uid of currentFollowings) if(permanentFollowings.indexOf(uid) === -1) expire(uid);
// Call first clean task in 5 minutes
window.setTimeout(clean, 300 * 1000);
// Gather uid from window.location
const match = /\/studio\/(\d+)/.exec(window.location.pathname);
if(match) {
console.log("Fetching this user followers...");
const followings = await fetchFollow(parseInt(match[1]));
for(const uid of followings) {
if(currentFollowings.indexOf(uid) === -1) {
await followUser(uid);
expire(uid);
}
}
}
return "Done";
}
await doAll();
The problem: I strongly suspect a booyah.live API bug
To test my code I run it from https://booyah.live/studio/123456?source=44.
If I run it multiple times I keep getting the following output:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Following 1801775...
Following 143823...
Following 137017...
Fetching this user followers...
Following 16884042...
Following 16166724...
There is a bug somewhere! The expected output for subsequent executions in the same tab would be:
Fetching current followings...
Fetching permanent followings...
Syncing permanent followings...
Fetching this user followers...
After searching for the bug in my code without success, I checked the booyah.live APIs: if I navigate to the following URLs (the uids are the ones the code keeps following in subsequent executions)
https://booyah.live/studio/1801775
https://booyah.live/studio/143823
https://booyah.live/studio/137017
https://booyah.live/studio/16884042
https://booyah.live/studio/16166724
I can clearly see that I follow them, but if I navigate to https://booyah.live/following (the list of users I follow) I can't find them, not even if I scroll the page to the end.
Since I make exactly the same calls the website does, I strongly suspect the bug is in the booyah.live APIs, specifically in the way they handle the cursor parameter.
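To double check that suspicion from the console, a small sketch like this (reusing myFetch and loggedUID from the block above, and assuming the API keeps returning a non-zero cursor until the list is exhausted) walks the followings pages and reports whether a given uid ever shows up:
async function isFollowed(uid) {
  let cursor = 0;
  do {
    const page = await myFetch(`users/${loggedUID}/followings?cursor=${cursor}&count=50`);
    if ((page.following_list || []).some(u => u.uid === uid)) return true;
    cursor = page.cursor;
  } while (cursor);
  return false;
}

// Example: check one of the uids the script keeps re-following
isFollowed(1801775).then(found =>
  console.log(found ? 'present in followings list' : 'missing from followings list'));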
I suggest you open a support ticket with the booyah.live support team. You could use the test account you provided us; I have already given you the details needed to do that. ;)
I'm trying to sign a BlockCypher transaction on the Bitcoin testnet using bitcoinjs as described here, but I keep getting the error:
{"error": "Couldn't deserialize request: invalid character 'x' in literal true (expecting 'r')"}
I have searched around and can find no documentation on what the problem is. Below is the code I'm using to try to sign the transaction.
var bitcoin = require("bitcoinjs-lib");
var buffer = require('buffer');
var keys = new bitcoin.ECPair.fromWIF('cMvPQZiG5mLARSjxbBwMxKwzhTHaxgpTsXB6ymx7SGAeYUqF8HAT', bitcoin.networks.testnet);
const publicKey = keys.publicKey;
console.log(keys.publicKey.toString("hex"));
var newtx = {
inputs: [{addresses: ['ms9ySK54aEC2ykDviet9jo4GZE6GxEZMzf']}],
outputs: [{addresses: ['msWccFYm5PPCn6TNPbNEnprA4hydPGadBN'], value: 1000}]
};
// calling the new endpoint, same as above
$.post('https://api.blockcypher.com/v1/btc/test3/txs/new', JSON.stringify(newtx))
.then(function(tmptx) {
// signing each of the hex-encoded string required to finalize the transaction
tmptx.pubkeys = [];
tmptx.signatures = tmptx.tosign.map(function(tosign, n) {
tmptx.pubkeys.push(keys.publicKey.toString("hex"));
return keys.sign(new buffer.Buffer(tosign, "hex")).toString("hex");
});
// sending back the transaction with all the signatures to broadcast
$.post('https://api.blockcypher.com/v1/btc/test3/txs/send', tmptx).then(function(finaltx) {
console.log(finaltx);
}).catch(function (response) {
console.log(response.responseText);
});
}).catch(function (response) {
console.log(response.responseText);
});
It seems the problem is this line: return keys.sign(new buffer.Buffer(tosign, "hex")).toString("hex"); but I'm not sure what is wrong with it.
This question was discussed and answered here; this post and this one in particular are worth looking into.
As far as I understand, a corresponding issue was opened in the BlockCypher repo. Although it is still open as of this date, the respective API description in the current BlockCypher JS docs contains an altered version of the line
return keys.sign(new buffer.Buffer(tosign, "hex")).toString("hex");
with a toDER() conversion applied before toString(), so it now looks like this:
return keys.sign(new buffer.Buffer(tosign, "hex")).toDER().toString("hex");
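Applied to the snippet from the question, the signing callback would then look roughly like this (same flow, only the toDER() call added):
tmptx.signatures = tmptx.tosign.map(function (tosign, n) {
  tmptx.pubkeys.push(keys.publicKey.toString("hex"));
  // toDER() converts the signature object into the DER encoding BlockCypher expects
  return keys.sign(new buffer.Buffer(tosign, "hex")).toDER().toString("hex");
});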
I'm using Chrome v67 and FileSaver.js.
This code works in FF & Edge but not in Chrome.
var ajaxSettings = {
url: "my/api/method",
data: JSON.stringify(myItems),
success: function (data) {
if (data) {
var dateToAppend = moment().format("YYYYMMDD-HHmmss");
createPdf("pdfDoc_" + dateToAppend + ".pdf", data[0]);
saveFile("txtDoc_" + dateToAppend + ".txt", data[1]);
//sleep(2000).then(() => {
// saveFile("txtDoc_" + dateToAppend + ".txt", data[1]);
//});
}
}
}
//function sleep(time) {
//return new Promise((resolve) => setTimeout(resolve, time));
//}
function createPdf(filename, pdfBytes) {
if (navigator.appVersion.indexOf("MSIE 9") > -1) {
window.open("data:application/pdf;base64," + encodeURIComponent(pdfBytes));
}
else {
saveFile(filename, pdfBytes);
}
}
function saveFile(fileName, data) {
var decodedByte64 = atob(data);
var byteVals = new Array(decodedByte64.length);
for (var i = 0; i < decodedByte64.length; i++) {
byteVals[i] = decodedByte64.charCodeAt(i);
}
var byte8bitArray = new Uint8Array(byteVals);
var blob = new Blob([byte8bitArray]);
saveAs(blob, fileName); //FileSaver.js
}
The result of calling the API is an array with 2 byte arrays in it.
The byte arrays are the documents.
If I run this code, as a user would, the first document gets "downloaded" but the second does not. Attempting to do this a second time without refreshing the page results in no documents being "downloaded". The word "download" is in quotes because the data has already been downloaded; what I'm really trying to do is generate the documents from the byte arrays.
This is the strange bit ... if I open the console and place a breakpoint on the "saveFile" call and immediately hit continue when the debugger lands on the breakpoint then all is well with the world and the 2 documents get downloaded.
I initially thought it was a timing issue so I put a 2 second delay on this to see if that was it but it wasn't. The only thing I've managed to get working is the breakpoint which I'm obviously not going to be able to convince the users to start doing no matter how much I want them to.
Any help or pointers are much appreciated
You probably faced a message at the top left corner of your browser stating
https://yoursite.com wants to
Download multiple files
[Block] [Allow]
If you don't have it anymore, it's probably because you clicked on "Block".
You can manage this restriction in chrome://settings/content/automaticDownloads.
I'm working on a CSV parsing web application, which collects data and then uses it to draw a plot graph. So far it works nicely, but unfortunately it takes some time to parse the CSV files with papaparse, even though they are only about 3MB.
So it would be nice to have some kind of progress shown, when "papa" is working. I could go for the cheap hidden div, showing "I'm working", but would prefer the use of <progress>.
Unfortunately the bar only gets updated AFTER Papa has finished its work. So I tried to get into web workers and use a worker file to calculate progress, and also set worker: true in Papa Parse's configuration. Still to no avail.
The configuration used (with a step function) is as follows:
var papaConfig =
{
header: true,
dynamicTyping: true,
worker: true,
step: function (row) {
if (gotHeaders == false) {
for (k in row.data[0]) {
if (k != "Time" && k != "Date" && k != " Time" && k != " ") {
header.push(k);
var obj = {};
obj.label = k;
obj.data = [];
flotData.push(obj);
gotHeaders = true;
}
}
}
tempDate = row.data[0]["Date"];
tempTime = row.data[0][" Time"];
var tD = tempDate.split(".");
var tT = tempTime.split(":");
tT[0] = tT[0].replace(" ", "");
dateTime = new Date(tD[2], tD[1] - 1, tD[0], tT[0], tT[1], tT[2]);
var encoded = $.toJSON(row.data[0]);
for (j = 0; j < header.length; j++) {
var value = $.evalJSON(encoded)[header[j]]
flotData[j].data.push([dateTime, value]);
}
w.postMessage({ state: row.meta.cursor, size: size });
},
complete: Done,
}
The worker setup on the main page:
var w = new Worker("js/workers.js");
w.onmessage = function (event) {
$("#progBar").val(event.data);
};
and the called worker is:
onmessage = function(e) {
var progress = e.data.state;
var size = e.data.size;
var newPercent = Math.round(progress / size * 100);
postMessage(newPercent);
}
The progress bar is updated, but only after the CSV file is parsed and the site is set up with the data; so the worker is called, but its answer is only handled after parsing. Papa Parse also seems to be called in a worker, or so it seems when checking the calls in the browser's debugging tools, but the site is still unresponsive until all the data shows up.
Can anyone point me to what I have done wrong, or where to adjust the code, to get a working progress bar? I guess this would also deepen my understanding of web workers.
You could use the FileReader API to read the file as text, split the string by "\n" and then count the length of the returned array. This is then your size variable for the calculation of percentage.
You can then pass the file string to Papa (you do not need to reread directly from the file) and pass the number of rows (the size variable) to your worker. (I am unfamiliar with workers and so am unsure how you do this.)
Obviously this only works accurately if there are no embedded line breaks inside the CSV file (e.g. where a string is spread over several lines), as these will count as extra rows, so you will not make it to 100%. Not a fatal error, but it may look strange to the user if it always seems to finish before 100%.
Here is some sample code to give you ideas.
var size = 0;
function loadFile(){
var files = document.getElementById("file").files; //load file from file input
var file = files[0];
var reader = new FileReader();
reader.readAsText(file);
reader.onload = function(event){
var csv = event.target.result; //the string version of your csv.
var csvArray = csv.split("\n");
size = csvArray.length;
console.log(size); //returns the number of rows in your file.
Papa.parse(csv, papaConfig); //Send the csv string to Papa for parsing.
};
}
I haven't used Papa Parse with workers before, but a few things pop up after playing with it for a bit:
It does not seem to expect you to interact directly with the worker
It expects you to either want the entire final result, or the individual items
Using a web worker makes providing a JS Fiddle infeasible, but here's some HTML that demonstrates the second point:
<html>
<head>
<script src="papaparse.js"></script>
</head>
<body>
<div id="step">
</div>
<div id="result">
</div>
<script type="application/javascript">
var papaConfig = {
header: true,
worker: true,
step: function (row) {
var stepDiv = document.getElementById('step');
stepDiv.appendChild(document.createTextNode('Step received: ' + JSON.stringify(row)));
stepDiv.appendChild(document.createElement('hr'));
},
complete: function (result) {
var resultDiv = document.getElementById('result');
resultDiv.appendChild(document.createElement('hr'));
resultDiv.appendChild(document.createTextNode('Complete received: ' + JSON.stringify(result)))
resultDiv.appendChild(document.createElement('hr'));
}
};
var data = 'Column 1,Column 2,Column 3,Column 4 \n\
1-1,1-2,1-3,1-4 \n\
2-1,2-2,2-3,2-4 \n\
3-1,3-2,3-3,3-4 \n\
4,5,6,7';
Papa.parse(data, papaConfig);
</script>
</body>
</html>
If you run this locally, you'll see you get a line for each of the four rows of the CSV data, but the call to the complete callback gets undefined. Something like:
Step received: {"data":[{"Column 1":"1-1",...
Step received: {"data":[{"Column 1":"2-1",...
Step received: {"data":[{"Column 1":"3-1",...
Step received: {"data":[{"Column 1":"4","...
Complete received: undefined
However if you remove or comment out the step function, you will get a single line for all four results:
Complete received: {"data":[{"Column 1":"1-1",...
Note also that Papa Parse uses a streaming concept to support the step callback whether or not a worker is used. This means you won't directly know how many items you are parsing, so calculating the percent complete is not possible unless you can determine the total length separately.
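If you do obtain a total separately (for instance the character count of the CSV string read via FileReader, as in the other answer), a hedged sketch of driving the <progress> element straight from the step callback, without a worker of your own, could look like this:
var progBar = document.getElementById('progBar'); // <progress id="progBar" max="100"></progress>

function parseWithProgress(csv) {            // csv: the file contents as a string
  var total = csv.length;                    // denominator for the percentage
  Papa.parse(csv, {
    header: true,
    worker: true,                            // Papa runs the parsing in its own worker
    step: function (row) {
      // row.meta.cursor is the character offset Papa has reached so far
      progBar.value = Math.round(row.meta.cursor / total * 100);
    },
    complete: function () {
      progBar.value = 100;
    }
  });
}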
I'm working on a search box for my site and want to build it from scratch (I know I probably shouldn't, but it's a learning exercise as much as anything).
I am receiving a string from a text box and using the jQuery autocomplete function to display any results from Parse.
So far this works OK but when I type in a string that includes all the words in my database, but not in the correct order, I get no results.
e.g. if I search for "New article" it shows the article, whereas if I search for "article New" it doesn't.
I am retrieving the data with:
...
var searchString = document.getElementById("tags").value; // Get the search string
var str = searchString.split(" "); // Split it up into an array
var Articles = Parse.Object.extend("Articles");
var query = new Parse.Query(Articles);
query.containedIn("title", str); // This part doesn't work.
// query.contains("title", searchString); // I was using this and it works OK.
query.find({
...
EDIT:
Using the method suggested by bond below my cloud code is:
Parse.Cloud.define('searchArticles', function(request, response) {
console.log("About to request search terms"); // This never appears in cloud code logs
var input = request.params.searchTerms;
input = _.uniq(input);
input = _.filter(list, function(w) { return w.match(/^\w+$/); });
var searchQuery = new Parse.Query("Articles");
searchQuery.containedIn("titleArray", input);
searchQuery.find().then(function(articles) {
console.log("Success");
}, function(err) {
console.log("Failure");
});
});
and the function i'm using to call this is:
$("#searchBox").keyup(function() {
var availableArticles = [];
var searchString = document.getElementById("tags").value;
var str = searchString.split(" ");
Parse.Cloud.run('searchArticles', {"searchTerms":str}, {
success: function(articles) {
alert("Successful");
},
error: function(error) {
alert("Unsuccessful");
}
});
Note that none of the console.log calls are ever written, and the alerts in the Parse.Cloud.run callbacks never appear. The cloud code is running, as it shows up in the Parse cloud code logs.
For search functions you may use a separate array column, say keyWords, where you save the keywords from the title column in the beforeSave function of the Articles class. Then you can call a separate search function to search as you type. Something like this:
Parse.Cloud.beforeSave("Articles", function(request, response) {
// set/update Articles keywords
});
Parse.Cloud.define("searchArticles", function(request, response) {
var input = request.params.query; //
// optimizing input
input = _.uniq(input);
input = _.filter(list, function(w) { return w.match(/^\w+$/); });
var searchQuery = new Parse.Query("AddressInfo");
searchQuery.containsAll("keyWords", input);
searchQuery.find().then(function(articles) {
// return success response
}, function(err) {
// handle error
});
});
You may call searchArticles from the client. Running queries like containedIn in Cloud Code rather than on the client will give you search results faster.
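For completeness, one possible (illustrative, untested) way the beforeSave hook could populate keyWords from the title, assuming underscore is available via require:
var _ = require('underscore');

Parse.Cloud.beforeSave("Articles", function(request, response) {
  var title = request.object.get("title") || "";
  // lowercase the title, keep only word characters, and de-duplicate
  var keyWords = _.uniq(title.toLowerCase().match(/\w+/g) || []);
  request.object.set("keyWords", keyWords);
  response.success();
});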