REST Calls from JavaScript vs. ASP.NET Controller REST Calls - javascript

I am new to the REST world. I am writing an ASP.NET MVC application. My requirement is to make a few REST calls from the client. I can either make these REST calls from JavaScript, or I can make them from the C# code in the Controller. Which method is recommended? According to my understanding, the Controller runs on the web server and the JavaScript runs in the browser, so is there any performance degradation if the REST calls are made from the web server?
Can someone suggest the general practice around this? Are there any security gotchas?
Thanks

Let us consider the pros and cons of doing this server-side.
Pros:
You can do other processing on the data using the power of the server.
You are not subject to cross-domain limitations as you would be with AJAX.
Generally you do not have to worry about your server being able to access the resource, whereas on the client you are at the mercy of the user's network restrictions, firewalls, etc.
More control over your HTTP request/response lifecycle.
Cons:
You will have to consume more bandwidth sending the resulting data down to the client.
You may have to do more work to leverage good caching practices.
You are dependent on having certain server-side libraries/framework elements.
Now, although we have a bigger list of pros than cons, in the majority of cases you will still want to do this on the client, because the issue of double-handling the data is actually a very big one and will cost you both time and money.
The only reasons to actually do it server-side are if you need to do extensive processing on the data, or if you cannot circumvent CORS (cross-domain) restrictions.
If you are just doing something simple like displaying the information on a webpage, then client-side is preferable.
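For the simple display case, a minimal client-side sketch might look like the following (the endpoint URL and element id are hypothetical placeholders, and it assumes a browser with the fetch API):

    // Fetch JSON directly from the REST endpoint and render it in the page.
    fetch('https://api.example.com/items')
      .then(function (response) {
        if (!response.ok) {
          throw new Error('Request failed: ' + response.status);
        }
        return response.json();
      })
      .then(function (items) {
        // Simple display: append each item's name to a list element.
        var list = document.getElementById('item-list');
        items.forEach(function (item) {
          var li = document.createElement('li');
          li.textContent = item.name;
          list.appendChild(li);
        });
      })
      .catch(function (err) {
        console.error(err);
      });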

It strongly depends on your situation. If you simply display this data on your page without any further actions, you can get it from JavaScript. If you want to work with this data, transform it, or join it with other data, I recommend doing those operations on the server, and therefore getting the data on the server too.

Related

Client Side vs Server Side When Getting Data from an API

I'm retrieving data from a movie API.
Now I can either do this on the client side (jQuery) or make the HTTP GET call on the server side (Node.js).
Is there a best practice in doing this? Is one option faster than the other?
(I'm leaning towards server side, as I can hide my API key, but I'm interested to know for certain.)
Is there a best practice in doing this?
Not a general one
Is one option faster than the other?
Doing it server side allows results to be cached and shared between multiple clients. This might be faster.
Doing it server side allows the client to make one fewer HTTP requests. This might be faster.
Doing it client side allows it to be redone without reloading the whole page. This might be faster.
Doing it client side means it comes from a different computer, which might be nearer to or further from the server the request is being made to. This might be faster.
Concerning hiding your API key: if you are using Node.js, you can make the key-bearing call from the server and keep the key there, so regardless of whether the rest of your logic runs client side or server side, you have control over not exposing your API key.
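As a sketch of that idea (assuming a Node.js 18+ server with Express and the built-in fetch; the route, upstream URL, and MOVIE_API_KEY environment variable are hypothetical placeholders), the browser calls your own endpoint and the key never leaves the server:

    const express = require('express');
    const app = express();

    // The browser calls /api/movies; the key is added server-side only.
    app.get('/api/movies', async (req, res) => {
      try {
        const url = 'https://api.example-movies.com/search' +
          '?q=' + encodeURIComponent(req.query.q || '') +
          '&apikey=' + process.env.MOVIE_API_KEY;
        const upstream = await fetch(url);
        const data = await upstream.json();
        res.json(data); // this response could also be cached and shared
      } catch (err) {
        res.status(502).json({ error: 'Upstream request failed' });
      }
    });

    app.listen(3000);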
Concerning performance, I would propose checking out different opinions on the internet on the topic "Client Side vs Server Side Rendering". There are a lot of articles related to performance. This is one of them. Hope it helps.

How to Stream data from WebAPI method to a JavaScript client

I have a service method written in ASP.NET Web API: http://diningphilospher.azurewebsites.net/api/dining?i=12
and a JavaScript client gets the response and visualizes it here.
But the nature of the Dining Philosophers problem is that I never know when the deadlock or starvation will happen. So instead of a single request/response, I would like to stream the data from the service method and have the client-side JavaScript read the data (JSON, I assume) asynchronously. Currently several posts direct me towards changing the default buffer limit in Web API so you get streaming-like behavior.
What other (easy or efficient) ways exist to achieve the above behavior?
You can return PushStreamContent from ASP.NET Web API and use the Server-Sent Events (SSE) JavaScript API on the client side. Check out the Push Content section in Henrik's blog. Also, see Strathweb. One thing I'm not sure about in the latter implementation is the use of ConcurrentQueue. Henrik's implementation uses ConcurrentDictionary, which allows you to remove the StreamWriter object from the dictionary for clients who drop out; that would be difficult to implement using ConcurrentQueue, in my opinion.
Also, the Strathweb implementation uses KO. If you don't want to use KO, you don't have to; the SSE JavaScript API has nothing to do with KO.
BTW, SSE is not supported in IE 9 or earlier.
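On the client side, a minimal sketch with the browser's EventSource API might look like this (the /api/dining/stream endpoint is a hypothetical placeholder for a Web API action that returns PushStreamContent with the text/event-stream content type):

    // Open a long-lived SSE connection; the server pushes one event per update.
    var source = new EventSource('/api/dining/stream');

    source.onmessage = function (e) {
      // Each SSE message is assumed to carry one JSON payload of philosopher state.
      var state = JSON.parse(e.data);
      console.log('philosopher update', state);
    };

    source.onerror = function () {
      // EventSource reconnects automatically; log so dropped connections are visible.
      console.warn('SSE connection lost, retrying...');
    };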
Another thing to consider is the scale-out option. Load balancing will be problematic, in the sense that there is a chance the load will not be uniformly distributed, since clients are tied to the server (or web role) they hit first.

How does WebSockets scale compared to standard HTTP?

How does a site programmed using TCP (that is, someone on the site is connected to the server and exchanging information via TCP) scale compared to just serving information via AJAX? Say the information exchanged is the same.
Trying to clarify: I'm asking specifically about scale. I've read that keeping thousands of TCP connections open is resource demanding (which resources?) compared to just serving information statically, and I want to know whether this is correct.
WebSockets is a technology that allows the server to push notifications to the client. AJAX, on the other hand, is a pull technology, meaning that the client sends requests to the server.
So, for example, if you had an application which needed to receive notifications from the server at regular intervals and update its UI, WebSockets are a much better fit. With AJAX you would have to hammer your server with requests at regular intervals to see whether some state changed on the server. With WebSockets, it is the server that notifies the client of events happening on the server, and this happens over a single, long-lived connection.
So I guess it really depends on the type of application you are developing, but WebSockets and AJAX are two different technologies solving different kinds of problems. Which one to choose depends on your scenario.
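To make the contrast concrete, here is a minimal sketch of the two styles (the URLs and the updateUI function are hypothetical placeholders):

    // AJAX polling: the client must keep asking whether anything changed.
    setInterval(function () {
      fetch('/api/state')
        .then(function (r) { return r.json(); })
        .then(function (state) { updateUI(state); });
    }, 5000); // one request every 5 seconds, even when nothing has changed

    // WebSocket: the server pushes only when there is something new.
    var socket = new WebSocket('wss://example.com/state');
    socket.onmessage = function (e) {
      updateUI(JSON.parse(e.data)); // fires only when the server sends an update
    };

    function updateUI(state) {
      // Placeholder for whatever rendering the application does.
      console.log(state);
    }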
WebSockets are not a one-for-one replacement for AJAX; they offer substantially different features. WebSockets offer the ability to push data to the client, whereas AJAX works by the client requesting data and receiving a response.
The purpose of WebSockets is to provide a low-latency, bi-directional, full-duplex and long-running connection between a browser and server. WebSockets opens up possibilities with browser applications that were previously unavailable using HTTP or AJAX.
However, there is certainly an overlap in purpose between WebSockets and AJAX. For example, when the browser wants to be notified of server events (i.e. push), both AJAX and WebSockets are viable options. If your application needs low-latency push events, then this is a factor in favor of WebSockets, which would definitely scale better in this scenario. On the other hand, if you need to work with existing frameworks and deployed technologies (OAuth, RESTful APIs, proxies, etc.), then AJAX is preferable.
If you don't need the specific benefits that WebSockets provide, then it's probably a better idea to stick with existing techniques like AJAX, because this allows you to re-use and integrate with an existing ecosystem of tools, technologies, security mechanisms, and knowledge bases that have been developed over the last 7 years.
But overall, WebSockets will outperform AJAX by a significant factor.
I don't think there's any difference in scalability between WebSockets and standard TCP connections. A WebSocket is an upgrade of a static one-way pipe into a duplex one; the physical resources are exactly the same.
The main advantage of WebSockets is that they run over port 80, so they avoid most firewall problems, but you have to connect over standard HTTP first.
Here's a good page that clearly shows the benefits of the WebSocket API compared to Ajax long polling (especially on a large scale): http://www.websocket.org/quantum.html
It basically comes down to the fact that once the initial HTTP handshake is established, data can go back and forth much more quickly because the header overhead is greatly reduced (this is what most people refer to as bidirectional communication).
As a side note, if you only need to push data from the server on a regular basis and don't need to make many client-initiated requests, then using HTML5 Server-Sent Events with occasional AJAX requests from the client might be just what you need, and it is much easier to implement than the WebSocket API.

Securing AJAX API

I have an API (1) on top of which I have built a web application with its own AJAX API (2). The reason for this is to avoid exposing the source API.
However, the web application uses AJAX (through jQuery) to get new data from its AJAX API; the data retrieved is currently in XML.
Lately I have secured the main API (1) with an authorization algorithm. However, I would like to secure the web application as well so that it cannot be parsed. Currently it is being parsed to get the hash used to call the AJAX API, which returns XML.
My question: how can I improve the security and decrease the possibility of others being able to parse my web application?
The only ideas I have are to stop sending XML and send HTML instead, or to use Flash (which is not an option).
I understand that since the site is public and no login can be implemented, it can be hard to refuse access to bots (non-legitimate users). Also, Flash is not an option... it never is ;)
edit
The Web Application I am referring to: https://bikemap.appified.net/
This is somewhat of an odd request; you wish to lock down a system that your own web application depends on to work. This is almost always a recipe for disaster.
Web applications should always expect to be bypassed, so the real security must come from the server: tarpitting, session tokens, throttling, etc.
If that's already in place, I don't see any reason why you should jump through hoops in your own web application to give robots a tougher time ... unless you really want to separate humans from robots ;-)
One way to reduce the refactoring pain on your side is to wrap the $.ajax function in a piece of code that signs the outgoing requests (or somehow adds fields to them), then minify/obfuscate that code and hope it won't get decoded too quickly.
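As a rough sketch of that wrapper idea (the header name and the "signature" scheme are hypothetical placeholders; real enforcement still belongs on the server):

    (function ($) {
      var originalAjax = $.ajax;

      // Replace $.ajax with a wrapper that adds a signature header to every call.
      $.ajax = function (url, options) {
        // Support both $.ajax(options) and $.ajax(url, options) call styles.
        if (typeof url === 'object') {
          options = url;
        } else {
          options = options || {};
          options.url = url;
        }
        options.headers = $.extend({}, options.headers, {
          'X-Request-Sig': computeSignature(options.url || '', Date.now())
        });
        return originalAjax.call($, options);
      };

      function computeSignature(url, ts) {
        // Deliberately obscure client-side "signature"; minify/obfuscate this part.
        return btoa(url + '|' + ts);
      }
    })(jQuery);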

How to restrict access to my web service?

I have http://example.com/index.html, which from within the HTML uses JavaScript (XMLHttpRequest) to call a web service at http://example.com/json/?a=...&b=...
The web service returns a JSON array of information to index.html, which is then displayed on index.html.
Since anyone can view the source code for index.html and see how I'm calling the JSON web service (http://example.com/json/), how do I prevent people from calling my JSON web service directly?
Since the web service is essentially an open read into my database, I don't want people to abuse it and start fetching data directly from the web service, DoSing my server, fetching more information than they should, etc.
UPDATE:
Is there no way to limit requests to http://example.com/json/ so that they can only come from the same server (IP) and from the URL http://example.com/index.html?
Meaning, can't http://example.com/json/ detect that the requester is ($_SERVER['REQUEST_URI'] == http://example.com/index.html) and allow only that?
There is no easy way to prevent that. If your service isn't extremely popular, and thus not a likely target for denial-of-service attacks, I wouldn't bother.
One thing which comes to mind is using disposable tokens (valid for 10 or 100 requests).
Another (naive) approach is checking that the X-Requested-With header exists in the request, but of course that can be easily faked. So my advice is: do nothing unless the problem is real.
One more idea: hash calculation (hashcash). The idea is to require the client to perform a rather expensive calculation for every request, while validating the calculation on the server side is cheap. For a single request the overhead is very small, but for, say, 1000 requests it may take a significant amount of CPU time. I have no idea whether hashcash has been used to prevent DoSing of web services; some years ago it was proposed as an anti-spam measure, but it never became popular.
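A minimal sketch of that hash-calculation idea in Node.js (the difficulty value and challenge format are made up for illustration):

    const crypto = require('crypto');

    function sha256(text) {
      return crypto.createHash('sha256').update(text).digest('hex');
    }

    // Client side (expensive): brute-force a nonce for the given challenge.
    function solve(challenge, difficulty) {
      const prefix = '0'.repeat(difficulty);
      let nonce = 0;
      while (!sha256(challenge + ':' + nonce).startsWith(prefix)) {
        nonce++;
      }
      return nonce;
    }

    // Server side (cheap): a single hash verifies the client's work.
    function verify(challenge, nonce, difficulty) {
      return sha256(challenge + ':' + nonce).startsWith('0'.repeat(difficulty));
    }

    const challenge = crypto.randomBytes(8).toString('hex');
    const nonce = solve(challenge, 4);        // costs the client some CPU time
    console.log(verify(challenge, nonce, 4)); // true, verified in microseconds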
The answer is really simple: use CSRF protection. http://en.wikipedia.org/wiki/Cross-site_request_forgery
Simply put, when the user comes to your site (index.php), put this in the session:
CSRF=(RANDOM_HASH)
Then make the JSON request as example.com/json.php?hash=$_SESSION['CSRF']
And in json.php, check whether $_GET['hash'] matches $_SESSION['CSRF'].
Simple as that...
It's a server-side solution!
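The steps above are sketched with PHP sessions; the same idea as a minimal Node.js/Express sketch using the express-session middleware (route names and the session secret are hypothetical placeholders) looks like this:

    const express = require('express');
    const session = require('express-session');
    const crypto = require('crypto');

    const app = express();
    app.use(session({ secret: 'change-me', resave: false, saveUninitialized: true }));

    // Index page: issue a per-session token that the page embeds for its JS.
    app.get('/', (req, res) => {
      req.session.csrf = crypto.randomBytes(16).toString('hex');
      res.send('<script>var CSRF = "' + req.session.csrf + '";</script>');
    });

    // JSON endpoint: only answer requests carrying the matching token.
    app.get('/json', (req, res) => {
      if (!req.session.csrf || req.query.hash !== req.session.csrf) {
        return res.status(403).json({ error: 'forbidden' });
      }
      res.json({ data: 'your protected payload here' });
    });

    app.listen(3000);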
I would keep track of the IP addresses making the requests. If you ever saw a large number of requests coming from the same IP, you could block it or offer a CAPTCHA.
There is a wide array of things you can do to add security to your service. Check this out.
You can't properly secure a web-service that is callable from client-side javascript.
You can try security-through-obscurity techniques like JavaScript obfuscation, but it won't stop someone who is motivated.
