How do I limit the result attributes coming back from algolia? - javascript

When I use algolia's instantsearch, the url that I hit returns all the attributes for the object that is hit. I have multiple different types of users and I don't want to just display something like a user's email to the entire world. Is there a way to query algolia differently so that I can limit the returned resultset before it comes to the page?
My current idea is to funnel everything through our own backend, but I don't like the idea of limiting the speed of my search results to my own server's response time.
Here's an example of the algolia hit that returns all of my keys:
https://identifier-dsn.algolia.net/1/indexes/localhost_users/query?x-algolia-agent=Algolia%20for%20vanilla%20JavaScript%203.18.0&x-algolia-application-id=identifier&x-algolia-api-key=secret_letters

For better control over what kind of data is returned, you can configure the attributesToRetrieve and attributesToHighlight settings of your index. Take a look at the Algolia documentation for attributesToRetrieve.
Edit: Also, use unretrievableAttributes if you don't want someone with a search-only API key to get access to certain attributes.
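As a rough sketch with the JavaScript API client (the index name, keys, and attribute names below are placeholders; setSettings needs an admin key, so run it from a backend script, never the browser):

// Sketch: configure the index so searches only return safe attributes.
var client = algoliasearch('YOUR_APP_ID', 'ADMIN_API_KEY'); // admin key: keep server-side
var index = client.initIndex('localhost_users');

index.setSettings({
  attributesToRetrieve: ['name', 'avatar_url'],  // hypothetical attribute names
  attributesToHighlight: ['name'],
  unretrievableAttributes: ['email']             // hidden even from search-only API keys
}, function (err) {
  if (err) console.error(err);
});

After this, your existing instantsearch queries keep working unchanged, but the responses only contain the listed attributes.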

Related

Will Product listing having filter, sort & search using Ajax breach REST principles?

I am building a library for listing products in a web application. It has to have filter, search and sort features. I have a web service that, when called with filter, search and sort parameters, can return the result set with all those parameters applied. If a page number is passed along with the number of products per page, it can return that specific page as well. It looks very suitable to populate the data through AJAX on the client side using this web service. However, the page will lose all the parameters (filter, search & sort) when the user clicks the back button and returns, and the page will display the default list of products, as the URL will remain the same as below irrespective of the filter, page, search or sort parameters:
<domain>/productlist
To retain them, I have to save these in sessionStorage or some other such mechanism. Will this be a violation of REST principles? Do I have to avoid AJAX and always pass the parameters in the URL for the actions to be repeatable and abide by REST principles, like
<domain>/productlist?filter=f1f2f3&search=apple&sort=price&order=1&page=3&items=10?
I may be misunderstanding REST as well, as I am a bit new to this, so I would like to understand it better and arrive at a proper, compliant design.
To retain them, there are better approaches than putting those params into session storage. One approach is to push the search params into the browser history via window.history.pushState on every AJAX request; when the user goes back to the previous page, all you have to do is check whether the URL params are filled or empty and fetch the data according to those params.
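A minimal sketch of that idea (fetchResults below stands in for whatever AJAX call you already make; the parameter names are illustrative):

// Keep the current filter/search/sort params in the URL so back/forward restores them.
function applySearch(params) {
  var query = new URLSearchParams(params).toString();
  // Update the address bar without reloading the page
  window.history.pushState(params, '', window.location.pathname + '?' + query);
  fetchResults(params); // placeholder for your existing AJAX call
}

// When the user presses back/forward, re-read the params and fetch again
window.addEventListener('popstate', function () {
  var urlParams = Object.fromEntries(new URLSearchParams(window.location.search));
  fetchResults(urlParams);
});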
REST is a concept describing how you should handle requests across the frontend and backend.
AJAX is a method of fetching data from the backend.
They can coexist, so you can use AJAX and at the same time abide by the statelessness of REST.

How can I make live search in my site like wikipedia?

I want to add a search bar to my site so that users can search it and get instant results through live search, like Wikipedia. How can this be done?
What you want is an AJAX request that queries a PHP file, searches for the keyword in your database, and returns the topics to be displayed as search predictions. Here is a good example that you can try out:
https://www.w3schools.com/php/php_ajax_livesearch.asp
In a nutshell, you need an endpoint that you can call with the current input each time it changes. That endpoint then returns search results which you can render for the user.
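A minimal sketch of that idea, assuming a /search endpoint that returns a JSON array of {title, url} objects (both the endpoint and the response shape are placeholders):

// Debounced live search: wait briefly after the last keystroke, then query the endpoint.
var input = document.querySelector('#search');
var resultsList = document.querySelector('#results');
var timer;

input.addEventListener('input', function () {
  clearTimeout(timer);
  timer = setTimeout(function () {
    var q = input.value.trim();
    if (!q) { resultsList.innerHTML = ''; return; }
    fetch('/search?q=' + encodeURIComponent(q))   // assumed endpoint returning JSON
      .then(function (response) { return response.json(); })
      .then(function (results) {
        resultsList.innerHTML = results
          .map(function (r) { return '<li><a href="' + r.url + '">' + r.title + '</a></li>'; })
          .join('');
      });
  }, 250); // 250 ms debounce so you don't query on every keystroke
});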

How to paginate a search result using same query with different skip and limit values

Some of you might argue that this is a question for programmers.stackexchange.com but, having read through the Help Center of Stack Overflow, I believe this is a specific programming problem and I am more likely to get a response here.
I have a webapp that uses ExpressJS with a Neo4j database as the backend. I have a search screen where I would like to use the power of Neo4j's relationships. The search screen accepts one or more values (e.g. manufacture year, fuel type, gearbox, etc.) and then makes a POST request to ExpressJS, where I construct a Cypher query from the parameters of the POST request, as shown in this example:
MATCH
(v:VEHICLE),(v)-[:VGEARBOX_IS]->(:VGBOX{type:'MANUAL'}),
(v)-[:VCONDITION_IS]->(:VCONDITION{condition:'USED'})
WITH DISTINCT v
WHERE v.manufacture_year = 1990
MATCH (v)-[r]->(info)
RETURN v AS vehicle, COLLECT({type:type(r), data:info}) AS details
Let's say that running the above query returns three vehicles and their properties.
If there are more than 20 vehicles in the result, I want to paginate it. I know how that works: we make use of SKIP and LIMIT, as shown below:
MATCH
(v:VEHICLE)
OPTIONAL MATCH (v)-[r:VFUEL_TYPE|:VGEARBOX_IS|:VHAVING_COLOR|...]->(info)
RETURN
v.vehicle_id AS vehicle_id,
v.engine_size AS engine_size,
v.published_date AS published_date,
COUNT(v) AS count,
COLLECT({type:type(r), data:info}) as data
ORDER BY v.published_date DESC
SKIP 20
LIMIT 16
Here is the workflow:
User navigates to search screen which is a form with POST method and various input fields.
User selects some options based on which he/she wish to search.
User then submits the form, which makes a post request to the server.
This request is handled by a route, which uses the parameters of the request to construct the Cypher query shown above. It runs the Cypher against the Neo4j database and receives the result.
Let's assume there are 200 vehicles that match the search. I then want to display only 20 of those results and provide next/previous buttons.
When the user is done seeing the first 20 and wants to see the next 20, I have to re-run the same query the user submitted initially, but with a SKIP value of 20 (the SKIP value keeps incrementing by 20 as the user navigates to the next page, and decrementing by 20 as the user moves to the previous page).
My question is: what is the best approach to saving the search request (or the Cypher generated by the original request) so that when the user clicks the next/previous page, I re-run the original search Cypher query with a different SKIP value? I don't want to make a fresh POST request every time the user goes to the next/previous page. This problem could be resolved in the following ways, but I am not sure which is more performance-friendly:
Every time the user clicks next or previous page, I make a new POST request with the preserved values of the original request and rebuild the Cypher query (POSTs can be costly - I want to avoid this, so if this is the better option, please explain why).
I store the original Cypher query in Redis, and whenever the user clicks next or previous I retrieve the query specific to that user from Redis (I need to handle this either via a cookie, the session, or some sort of hidden UUID), supply the new value for SKIP, and re-run it (somehow I have to handle when I should delete this entry from Redis - deletion should happen when the user changes their search or abandons the page/site).
I store the query in the session (the user does not have to be logged in) or some other temporary storage (other than Redis) that provides fast access (not sure whether that is safe and efficient).
I am sure somebody has come across this issue and resolved it in an efficient manner, which is why I post the question here. Please advise how I can best resolve this problem.
As far as performance goes, the first thing that you should absolutely do is use Cypher parameters. This is a way to separate your query string from your dynamic data. It has the advantage that you are guarded against injection attacks, and it is also more performant because, if your query string doesn't change, Neo4j can cache a query plan for your query and use it over and over again. With parameters, your first query would look like this:
MATCH
(v:VEHICLE),(v)-[:VGEARBOX_IS]->(:VGBOX{type: {vgearbox_type}}),
(v)-[:VCONDITION_IS]->(:VCONDITION{condition: {vcondition}})
WITH DISTINCT v
WHERE v.manufacture_year = {manufacture_year}
MATCH (v)-[r]->(info)
RETURN v AS vehicle, COLLECT({type:type(r), data:info}) AS details
SKIP ({page} - 1) * {per_page}
LIMIT {per_page}
Your JavaScript library for Neo4j should allow you to pass down a separate parameters object. Here is what the object would look like represented in JSON:
{
  "vgearbox_type": "MANUAL",
  "vcondition": "USED",
  "manufacture_year": 1990,
  "page": 1,
  "per_page": 20
}
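For example, with the official neo4j-driver package you could run the query and that object together roughly like this (connection details are placeholders, and note that recent Neo4j versions use $param in place of the older {param} syntax shown above):

// Sketch: passing the parameters object alongside the query from Node.
const neo4j = require('neo4j-driver');
const driver = neo4j.driver('bolt://localhost:7687',
  neo4j.auth.basic('neo4j', 'password'));          // placeholder credentials

async function searchVehicles(params) {
  const session = driver.session();
  try {
    const result = await session.run(
      `MATCH (v:VEHICLE),
             (v)-[:VGEARBOX_IS]->(:VGBOX {type: $vgearbox_type}),
             (v)-[:VCONDITION_IS]->(:VCONDITION {condition: $vcondition})
       WITH DISTINCT v
       WHERE v.manufacture_year = $manufacture_year
       MATCH (v)-[r]->(info)
       RETURN v AS vehicle, COLLECT({type: type(r), data: info}) AS details
       SKIP $skip LIMIT $limit`,
      {
        vgearbox_type: params.vgearbox_type,
        vcondition: params.vcondition,
        manufacture_year: neo4j.int(params.manufacture_year),
        skip: neo4j.int((params.page - 1) * params.per_page),  // SKIP/LIMIT must be integers
        limit: neo4j.int(params.per_page)
      }
    );
    return result.records.map(function (record) { return record.get('vehicle'); });
  } finally {
    await session.close();
  }
}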
I don't really see much of a problem with making a fresh query to the database from Node each time. You should benchmark how long it actually takes to see if it really is a problem.
If it is something that you want to address with caching, it depends on your server setup. If the Node app and the DB are on the same machine or very close to each other, it's probably not important. Otherwise you could use Redis to cache based on a key which is a composite of the values that you are querying for. If you are thinking of caching on a per-user basis, you could even use the browser's local storage, but are users often re-visiting the same pages over and over? How static is your data, and does the data vary from user to user?
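A rough sketch of that Redis idea, caching one page of results under a key built from the search values themselves (the key format, the 5-minute TTL, and runQuery are all placeholders):

// Cache a page of results keyed by the exact search values, so repeat requests
// (e.g. the user flipping back to a page they already saw) skip the database.
const { createClient } = require('redis');
const redis = createClient();

async function cachedSearch(params, runQuery) {
  if (!redis.isOpen) await redis.connect();

  const key = 'search:' + JSON.stringify([
    params.vgearbox_type, params.vcondition,
    params.manufacture_year, params.page, params.per_page
  ]);

  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);                    // cache hit: no database round trip

  const results = await runQuery(params);             // e.g. the searchVehicles() sketch above
  await redis.set(key, JSON.stringify(results), { EX: 300 }); // expire after 5 minutes
  return results;
}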

Take jobs from jobvite with javascript

Is there any way to take the list of available positions for a company from JobVite using javascript (I would prefer if it returned JSON)?
I would like to take 5 random open positions and display them in a recruiting region on the website I am working on.
I can confirm that Jobvite DOES have an API, and it returns results in JSON! You need to submit a request to obtain an API key. (Look in the Category dropdown menu)
http://recruiting.jobvite.com/support/customer
Yes. You'll need an API key and secret as blastronaut points out. Then hit this URL:
https://api.jobvite.com/v1/jobFeed?api=KEY&sc=SECRET&companyId=COMPANYID
The API documentation is here: Jobvite Services API PDF
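A hedged sketch of pulling the feed and showing five random openings; the response shape (a requisitions array with title/detailUrl fields) is an assumption, so check the documentation above for the real field names. In practice you will probably want to proxy the request through your own server so the secret never reaches the browser (and to avoid CORS issues):

// Fetch the job feed, shuffle it, and render five random openings.
async function showRandomJobs() {
  var url = 'https://api.jobvite.com/v1/jobFeed?api=KEY&sc=SECRET&companyId=COMPANYID';
  var response = await fetch(url);
  var feed = await response.json();
  var jobs = feed.requisitions || [];            // assumed property name

  // Fisher-Yates shuffle, then keep the first five
  for (var i = jobs.length - 1; i > 0; i--) {
    var j = Math.floor(Math.random() * (i + 1));
    var tmp = jobs[i]; jobs[i] = jobs[j]; jobs[j] = tmp;
  }

  document.querySelector('#recruiting').innerHTML = jobs.slice(0, 5)
    .map(function (job) {                        // assumed field names
      return '<li><a href="' + job.detailUrl + '">' + job.title + '</a></li>';
    })
    .join('');
}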
Well, if they have no API, I guess you're going to need to use cURL or something similar, and then your PHP could return JSON-encoded results?
Failing that, you might check out:
https://github.com/dylang/jobvite

Shall I use javascript for page submission?

I am working on a big site, and the site has a search module. Searching is done using a lot of user-submitted values, so for pagination I must pass all this data to the next page, and appending the values to the URL makes the URL very big.
So how can I solve this issue? I am planning to use a JavaScript-based page submission (POST) with all the values in hidden fields, and then read all the values on the next page.
Will it cause any problems? Or should I use a database to keep the search criteria?
I would create a server-side object, possibly with a database backend, which is updated by the different pages.
In my opinion it is the clearest and easiest solution. Passing parameters from page to page, whether by POST, JavaScript or cookie, will work too, but it's more of a quirk in my experience.
Also, if a search query is so complex that it needs multiple pages to create it, it might be helpful for the user to have all the data stored on the server so they can change it more easily by switching back and forth between the different pages.
I would store all the search criteria in some kind of session store on the server when the initial search is triggered.
For pagination I would retrieve the criteria from the session store and then just show the appropriate results. I would also append some kind of key to the pagination links (so this would be the only hidden POST field) under which the search criteria can be found.
Even though the session is per user, you might have several search windows open within the same session, and you don't want to mess them up with the pagination.
In order to make a reliable search with pagination, we need to do a bit more than normal.
We need to handle the following cases.
Once the search is done, the user may use the browser's back and forward buttons. If you are doing a form submission on every page, that would be an overload. Also, if the user presses the browser's refresh button, it will unnecessarily warn them that data is being resubmitted.
Searching on a large database with lots of criteria is costly. Hence, optimization is important.
So:
Do not submit data on every page change.
Do not store data in a cookie (this is not secure and not even reliable).
For a large database with a complex query, cache the result in the session.
If you need very up-to-date, real-time results, ignore point (3) and try doing a partial search for every page.
Thus, for your case, you can do the following:
When the user searches for the first time, make the form POST its data to a search page.
This search page will store the search query in the session and generate a unique id for it.
Now render the result page. The result page will be passed the search id (generated in point 2) and the page number, for example result.aspx?searchId=5372947645&page=2
The result page will pick up the query from the session using the searchId and then provide results based on the page number sent (a rough sketch follows below).
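A sketch of that flow with Express and express-session (route names, the searchId format, and runSearch are illustrative placeholders, not a prescribed implementation):

// 1-2: the initial POST stores the criteria in the session under a unique id.
// 3-4: the result page looks the criteria up by searchId and paginates by page number.
const express = require('express');
const session = require('express-session');
const crypto = require('crypto');

const app = express();
app.use(express.urlencoded({ extended: true }));
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));

app.post('/search', (req, res) => {
  const searchId = crypto.randomBytes(8).toString('hex');
  req.session.searches = req.session.searches || {};
  req.session.searches[searchId] = req.body;                 // the submitted criteria
  res.redirect('/results?searchId=' + searchId + '&page=1');
});

app.get('/results', async (req, res) => {
  const { searchId, page = 1 } = req.query;
  const criteria = (req.session.searches || {})[searchId];
  if (!criteria) return res.redirect('/');                   // unknown or expired searchId
  const results = await runSearch(criteria, Number(page));   // placeholder query layer
  res.render('results', { results, searchId, page: Number(page) });
});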
Using hidden fields and the POST method should be fine too, as long as you are able to read them correctly on the next page.
To supplement Sarfraz's answer...
It's not necessary to use JavaScript to make a POST:
<form action="destination_url" method="POST">
...
</form>
