Suggestion: Single-page application architecture issue - JavaScript

I have written a web app (a single-page application) that uses only frontend technologies (Vue.js). When I compile it, it ultimately generates static web pages (only HTML and JS), and I can run the app anywhere just by opening the index page. The SPA consumes a REST API secured with OAuth, making Ajax calls directly from the browser to the REST API endpoints.
The problem is that my lead developer says the SPA must be backed by a server-side service, for example Node.js or Apache, and that the backend should make the calls to the REST APIs rather than the browser making direct Ajax calls (frontend JS Ajax). My SPA runs anywhere and works perfectly in browsers even without any server.
My question is: do I really need to serve and run my SPA from a web server? What are the reasons for making a plain HTML/JS SPA server-powered?
Also, is it acceptable practice to simply write an app in JS and HTML (pure front end), upload it to a server, and point a domain name at that HTML/JS web app, which then consumes remote REST APIs?
I have a remote REST API provider; please suggest the best way to write an SPA that consumes those remote APIs.
Thanks in advance for clearing up my doubts.

There are several reasons you might set up a back-end service, for example:
Hiding the REST API endpoints (and any credentials needed to call them)
Adding your own caching, throttling, failover, etc. in front of the REST API endpoints
Overriding or controlling REST API requests and responses
You can still ship a pure HTML+JS SPA, but adding a back-end service gives you additional options that are not possible to achieve on the front end alone.
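For illustration, here is a minimal sketch of such a back-end proxy in Express (the /api route, REMOTE_API_BASE, and API_TOKEN are hypothetical names, and Node 18+ is assumed for the global fetch API):

```js
const express = require('express');
const app = express();

const REMOTE_API_BASE = 'https://api.example.com'; // the remote REST API provider
const API_TOKEN = process.env.API_TOKEN;           // stays server-side, never shipped to the browser

// Serve the compiled SPA (index.html, JS bundles) as static files.
app.use(express.static('dist'));

// Forward /api/* calls to the remote provider, adding the secret credential.
app.get('/api/*', async (req, res) => {
  try {
    const upstream = await fetch(REMOTE_API_BASE + req.path.replace(/^\/api/, ''), {
      headers: { Authorization: `Bearer ${API_TOKEN}` },
    });
    res.status(upstream.status).json(await upstream.json());
  } catch (err) {
    // A natural hook for your own error handling, caching, or failover.
    res.status(502).json({ error: 'Upstream request failed' });
  }
});

app.listen(3000);
```

The browser then calls /api/... on its own origin, so the real endpoint URLs and the credential never reach the client.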

Related

Will a separate backend server with Next.js frontend affect the benefits of Next.js (SSR, SEO friendly, etc)?

Currently I have a backend Express server that is used by our mobile apps (iOS and Android). I want to use Next.js for the web version of the same application, for the benefits it is praised for (SSR, SEO, etc.). I don't want to rewrite our current backend API into the Next.js server; I just want to use Next.js as the frontend while it sends requests to our current server (ideally keeping the backend code in a single server). I've done a bit of digging and it's entirely possible, but will this render the benefits of Next.js useless, so that I might as well just use CRA?
I want the application to be as SEO-friendly and efficient for users as possible, and I'm wondering which option is best in this case.
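For what it's worth, a separate backend does not by itself disable SSR: Next.js can fetch from it on the server before rendering, so crawlers still receive full HTML. A minimal sketch of the architecture described above (the /api/posts endpoint and port are hypothetical):

```js
// pages/posts.js — a hypothetical Next.js page that server-renders data
// fetched from a separate Express backend.
export async function getServerSideProps() {
  const res = await fetch('http://localhost:4000/api/posts'); // your existing Express server
  const posts = await res.json();
  return { props: { posts } };
}

export default function Posts({ posts }) {
  return (
    <ul>
      {posts.map((p) => (
        <li key={p.id}>{p.title}</li>
      ))}
    </ul>
  );
}
```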

Single Page Application Web crawlers and SEO

I have created my blog as a single-page application using the Mithril framework on the front end, with a REST API and Django on the backend for queries. Since everything is rendered with JavaScript, when crawlers hit my blog all they see is an empty page; to make matters worse, whenever I share a post on social media, all Facebook sees is an empty page rather than the post content and title.
I was thinking of looking at the user agents and, whenever the User-Agent belongs to a crawler, feeding it a rendered version of the page, but I'm having problems implementing that approach.
What is the best practice for making a single-page app that uses a REST API and Django on the backend SEO-friendly for web crawlers?
I'm doing this on a project right now, and I would really recommend doing it with Node instead of Python, like this:
https://isomorphic-mithril.mvlabs.it/en/
You might want to look into server-side rendering of the pages that crawlers visit.
Here is a good article on Client Side vs Server Side rendering.
I hadn't heard of Mithril before, but you might find plugins that do this for you, for example:
https://github.com/MithrilJS/mithril-node-render
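As a rough sketch of what that plugin's usage looks like (based on its README at the time of writing; check the current API before relying on this):

```js
// Render a Mithril component to an HTML string on the server.
const m = require('mithril/hyperscript');
const render = require('mithril-node-render');

const Post = {
  view: (vnode) => m('article', m('h1', vnode.attrs.title)),
};

render(m(Post, { title: 'Hello crawlers' })).then((html) => {
  console.log(html); // e.g. '<article><h1>Hello crawlers</h1></article>'
});
```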
This might help you: https://github.com/sharjeel619/SPA-SEO
The above example is built with Node/Express, but you can apply the same logic with your Django server.
Logic
A browser requests your single-page application from the server, which loads it from a single index.html file.
You program some intermediary server code which intercepts the client request and determines whether it came from a browser or from a social crawler bot.
If the request came from a crawler bot, make an API call to your back-end server, gather the data you need, fill that data into the HTML meta tags, and return those tags as a string to the client.
If the request didn't come from a crawler bot, simply return the index.html file from the build or dist folder of your single-page application.
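A minimal Express sketch of that logic might look like this (the bot list, the route, and the Django API URL are illustrative, and Node 18+ is assumed for the global fetch API):

```js
const express = require('express');
const path = require('path');
const app = express();

// Very rough User-Agent check; real-world bot lists are longer.
const BOT_RE = /facebookexternalhit|twitterbot|googlebot|linkedinbot|slackbot/i;

app.get('/post/:id', async (req, res) => {
  if (BOT_RE.test(req.get('User-Agent') || '')) {
    // Crawler: fetch the post from the (hypothetical) Django API and
    // return a bare HTML shell whose meta tags describe the content.
    const post = await fetch(`http://localhost:8000/api/posts/${req.params.id}`)
      .then((r) => r.json());
    res.send(`<!doctype html><html><head>
      <title>${post.title}</title>
      <meta property="og:title" content="${post.title}" />
      <meta property="og:description" content="${post.summary}" />
    </head><body></body></html>`);
  } else {
    // Regular browser: serve the SPA bundle as usual.
    res.sendFile(path.join(__dirname, 'dist', 'index.html'));
  }
});

app.listen(3000);
```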

Node.js back-end code structure

I'm quite new to web applications and have decided to create a single-page web app hosted on Heroku.
My understanding of this web app is as follows:
The client side (AngularJS) has an input text box; once the button is pressed, it calls a server-side endpoint.
The server (Node.js) uses the data from the client to call an external API (e.g. the Imgur API), which returns JSON.
The server processes the JSON and responds to the client with the relevant information.
The client uses the server's response to render the user interface.
Main Concerns
Best practices for calling external APIs: should I have an API wrapper class with custom methods, each returning the result of a specific external API call?
How should I handle HTTP error responses? I understand that Node.js is async by nature and that all HTTP calls are made asynchronously. If there are multiple responses, errors or successes, how do I handle them all without writing a custom set of .error() and .success() handlers for each call?
Furthermore
I cannot seem to find good reference material for a simple Node.js back end like the one I described. Please point me to some if you know of any.
I recommend looking at this Scotch.io article for creating a Single Page MEAN Application:
Setting Up a Single Page MEAN Application Starter Kit
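On the wrapper question: a small promise-based client goes a long way, since promises let you centralize error handling instead of attaching per-call success and error callbacks. A minimal sketch (the ImgurClient name and route are illustrative; Node 18+ is assumed for the global fetch API):

```js
// A thin wrapper around an external API. Each method returns a promise,
// so all callers can share one error-handling path.
class ImgurClient {
  constructor(clientId) {
    this.clientId = clientId;
    this.base = 'https://api.imgur.com/3';
  }

  async request(path) {
    const res = await fetch(this.base + path, {
      headers: { Authorization: `Client-ID ${this.clientId}` },
    });
    if (!res.ok) throw new Error(`Imgur ${res.status} for ${path}`);
    return res.json();
  }

  getImage(id) {
    return this.request(`/image/${id}`);
  }
}

// Express route: a single catch handles any failure in the chain.
const express = require('express');
const app = express();
const imgur = new ImgurClient(process.env.IMGUR_CLIENT_ID);

app.get('/image/:id', (req, res) => {
  imgur.getImage(req.params.id)
    .then((data) => res.json(data))
    .catch((err) => res.status(502).json({ error: err.message }));
});

app.listen(process.env.PORT || 3000);
```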

Pure JavaScript front-end connecting to a BaaS (can it be done?)

I am quite determined to build a pure JavaScript front-end (using JS and GWT) that connects to a back-end on a separate server using Ajax. My concern is security.
What could be a solution for a pure front-end application?
Take, for example, a site with user-generated content:
Seen from the perspective of an app that requires the user to log in before granting access, OAuth can take over: the app is authenticated properly, and access to any content is based on the authorization given.
The problem is here: an application that allows anonymous users to view user-generated content without logging in gives OAuth no chance to take place.
Connecting to a BaaS:
There will be no Java middleware to store the application key for BaaS access (e.g. Kinvey).
Even if obfuscated, the application key can easily be snooped from the HTTP requests.
So what could be a solution for a pure JavaScript front-end connecting to a BaaS or an independent backend, in terms of securing application keys? And how can the BaaS or independent backend decide whether to serve data to the requesting client (even if it's a web app), given that the request is not from the same domain?
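There is no way to keep a key secret in code delivered to the browser, so the usual workaround is a tiny server component that holds the key and hands out short-lived, narrowly scoped tokens, even to anonymous visitors. A hypothetical sketch using the jsonwebtoken package (the read-only scope and expiry are illustrative, and the backend or BaaS would need to verify these tokens):

```js
// Minimal token-vending endpoint: the master key stays on the server;
// anonymous browsers only ever see short-lived, read-only tokens.
const express = require('express');
const jwt = require('jsonwebtoken');
const app = express();

const SIGNING_SECRET = process.env.SIGNING_SECRET; // shared with / verified by the backend

app.get('/anon-token', (req, res) => {
  const token = jwt.sign(
    { scope: 'read:public' },  // anonymous users get read-only access
    SIGNING_SECRET,
    { expiresIn: '15m' }       // short-lived, so a leaked token has little value
  );
  res.json({ token });
});

app.listen(3000);
```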

Secure way of persisting an auth token in a single-page JS application

Our scenario:
Our solution consists of an MVC app which serves a single-page JavaScript application, and an ASP.NET Web API app which is intended both as a standalone API and as the source of data for the SPA.
We have set everything up so that the two apps share auth tokens and membership, so if we are logged in to the SPA, the same FormsAuthentication cookie also grants us access to the API.
This works fine if you make API requests from the browser address bar, but not through Ajax. We have followed examples of setting up basic authentication using Thinktecture, and if we hard-code a username/password as an authentication header for our Ajax calls, this works fine as well.
My question, however, is: what is the correct way of persisting these details on the client side? Our only real solution so far would be to send down the Base64-encoded username/password as part of the initial load of the SPA and then pull it out when needed. That seems insecure, however.
So basically, we're just wondering what the 'correct' approach is in this situation: are we close, or is there another approach that we have overlooked?
Thanks!
We're using the session token support from Thinktecture.IdentityModel and then making the token available to the client via a dynamically generated script.
Full details at http://ben.onfabrik.com/posts/dog-fooding-our-api-authentication
I also published a sample application demonstrating these concepts at https://github.com/benfoster/ApiDogFood.
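On the client, the idea is that the dynamically generated script exposes the session token to the page, and every Ajax call then sends it as a header, so no username/password ever needs to be persisted in the browser. A minimal jQuery sketch (window.apiToken is a hypothetical name for whatever the generated script sets, and the header scheme depends on how the server is configured):

```js
// Attach the session token to every outgoing Ajax request.
$.ajaxSetup({
  beforeSend: function (xhr) {
    if (window.apiToken) {
      xhr.setRequestHeader('Authorization', 'Session ' + window.apiToken);
    }
  }
});

// Any subsequent call now carries the token automatically.
$.getJSON('/api/orders', function (orders) {
  console.log(orders);
});
```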
