Let's say I want to create a ToDo list using Angular. I have a REST API that stores the items in a DB and provides basic operations. Now, when I wanted to connect my Angular app to the REST API, I found two ways to do so following some tutorials online:
1. Data gets handled in the backend only: a service gets created that has a getAllTodos function. This function gets attached directly to the scope (e.g. to use it in ng-repeat):
var getAllTodos = function() {
    // Todo: Cache http request
    return $http...;
};

var addTodo = function(todo) {
    // Todo: Clear cache of getAllTodos
    $http...
};
2. Data gets handled in the frontend too: there is an init function that initializes the todos variable in the service.
var todos = [];

var init = function() {
    $http...
    todos = // result of $http;
};

var getAllTodos = function() {
    return todos;
};

var addTodo = function(todo) {
    $http...
    todos.push(todo);
};
I've seen both ways in several tutorials, but I'm wondering which would be the best way. The first one is used in many tutorials where the author has in mind from the start to attach it to a REST API. The second one is often used when the author at first wants to create the functionality in the frontend and only later wants to store the data permanently using a backend.
Both ways have their advantages and disadvantages. The first one reduces code duplication between frontend and backend; the second one allows faster operations, because they can be handled in the frontend first and the backend can be informed about the changes afterwards.
EDIT: For me, the frontend is the AngularJS client and the backend is the REST API on the server.
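For what it's worth, the caching mentioned in the TODOs of approach 1 could be sketched roughly like this in AngularJS, assuming the service has both $http and $cacheFactory injected and a hypothetical /api/todos endpoint; $http supports a cache option out of the box:
var getAllTodos = function() {
    // cache: true stores the response in $http's default cache,
    // so repeated calls don't hit the REST API again
    return $http.get('/api/todos', { cache: true });
};

var addTodo = function(todo) {
    return $http.post('/api/todos', todo).then(function(response) {
        // invalidate the cached GET so the next getAllTodos() is fresh
        $cacheFactory.get('$http').remove('/api/todos');
        return response.data;
    });
};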
Separation of Frontend and Backend is often done for security reasons. You can locate the Backend on a separate machine and then restrict access to that machine to only calls originating from the Frontend. The theory is that if the Frontend is compromised, the Backend has a lower risk factor. In reality, if someone has compromised any machine on your network, then the entire network is at risk on one level or another.
Another reason for a Backend/Frontend separation would be to provide database access through the Backend to multiple frontend clients. You have a single Backend with access to the DB and either multiple copies of the Frontend or different Frontends that access the Backend.
Your final design needs to take into account the possible security risks and also deployment and versioning. With the multiple-tier approach you can deploy individual Frontends without having to drop the Backend, and you can also "relocate" parts of the application without downtime. The more flexible the design of your application, the more complicated deployment may be. The needs of your application will depend on whether you are writing a simple blog or a large enterprise application.
You need both frontend and backend functionality: in the frontend you prepare the data being sent, and in the backend you handle the requests made to the server.
I am sorry for how I have framed the question in the title, but I have started programming very recently, so once again, I am really sorry.
I am developing a project with React.js as my front-end and Node.js as my backend. I have been successful in doing some basic test API calls to confirm the connection between the two, but now, how am I supposed to actually process different actions? For example, when a user is logging in, I need to first check whether they are an existing user or not and sign them up if they are not, delete a user account, change a username, etc.
I tried very hard to look for relevant articles, but all I can find are basic "one-time" API calls. What am I supposed to do for an entire batch of operations? From what I have understood, the process goes from sending a request from React to getting it processed in Node.js, like this:
react js ======(request for operation)======> node js
node js processes the operation
node js ======(sends a response back)======> react js
Please correct me if there are any mistakes in my question...
Thanks a lot!
This question is really broad, so I'm going to focus in on this part at a high level:
I tried very hard to look for relevant articles but all I can find are basic "one-time" api calls, what am I supposed to do for an entire batch of operations?
I'm guessing when you talk about requests and APIs you're talking about HTTP requests and REST APIs, which are fundamentally "stateless" kinds of things; HTTP requests are self-contained, and the backend doesn't have to keep track of any application state between requests in order to speak HTTP.
But backend applications usually do maintain state, often in the form of databases. For example, many REST APIs expose CRUD (create, read, update, and delete) operations for important business entities — in fact, Stack Overflow probably does exactly this for answers. When I post this answer, my browser will probably send some kind of a "create" request, and when I inevitably notice and fix a typo I will send some kind of "update" request. When you load it in your browser, you will send a "read" request which will get the text of the answer from Stack Overflow's database. Stack Overflow's backend keeps track of my answer between the requests, which makes it possible to send it to many users as part of many different requests.
Your frontend will have to do something similar if your project does things which involve multiple interdependent requests — for example, you would need to keep track of whether the user is authenticated or not in order to show them the login screen, and you would need to hold onto their authentication token to send along with subsequent requests.
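For instance, here is a hedged sketch of holding onto a token on the client and sending it with later requests; the /api/login and /api/profile endpoints and the { token: ... } response shape are made-up assumptions:
// keep the token around after logging in (a variable here; localStorage is another option)
let authToken = null;

async function login(email, password) {
  const res = await fetch('/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email, password })
  });
  const data = await res.json();
  authToken = data.token; // assumes the backend responds with { token: "..." }
}

async function getProfile() {
  // subsequent requests send the token along so the backend knows who is asking
  const res = await fetch('/api/profile', {
    headers: { Authorization: 'Bearer ' + authToken }
  });
  return res.json();
}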
Edit: Let's walk through a specific example. As a user, I want to be able to create and read notes. Each note should have a unique ID and a string for its text. On the backend, we'll have a set of CRUD endpoints:
var express = require("express");
var router = express.Router();

class Note {
  constructor(id, text) {
    this.id = id;
    this.text = text;
  }
}

// This is a quick and dirty in-memory database!
var notes = {};

router.put("/note/:id", function(req, res) {
  notes[req.params.id] = new Note(req.params.id, "hello"); // TODO: accept text from the request body as well
  res.send("");
});

router.get("/note/:id", function(req, res) {
  res.send(notes[req.params.id].text);
});

module.exports = router;
Then on the client side, one could create a note and then read it like this:
// this request will create the note
fetch("http://localhost:9000/note/42", { method: 'PUT', body: 'hello note!' })
  // this request will read the note, but only after the first one has completed!
  .then(() => fetch("http://localhost:9000/note/42"))
  .then(res => res.text())
  .then(res => console.log(res));
I'm wondering if there's a JS web framework with a transparent/autogenerated data layer, to save time so I don't have to write all of that code myself.
Let's say I'm writing a React blog app that communicates with the server using a REST API.
I need to write two pieces of the data layer: the server side, and its client on the client side.
Something like the code below:
1. The business logic itself resides on the server side:
class BlogAPI {
async getPosts() {
const records = await db.query('select * from posts')
return convertRecordsIntoPostObjects(records)
}
}
2. Exposing the business logic as a REST API and writing its client on the client side:
// REST API, this code also resides on the server side.
httpServer.get('/posts', () => blogApi.getPosts())
// And writing a client for it, that code resides on the client side.
class BlogAPIClient {
async getPosts() {
return http.get('http://server-api-endpoint.com/posts')
}
}
I'm wondering if there are web frameworks where part 2 is somehow magically autogenerated, so I can save time and write only the business logic.
P.S. Don't mention GraphQL, it's not what I'm asking.
Meteor looks like what you are looking for: https://www.meteor.com/tutorials/react/collections . You may also wish to take a look at Parse Server: https://docs.parseplatform.org/parse-server/guide/#getting-started (and maybe also Firebase, but it's neither customizable nor self-hosted/open source).
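If you would rather stay on plain Express and hand-roll the "magic", here is a rough sketch of what generating part 2 from part 1 could look like; everything here (the /rpc/ prefix, exposeApi, apiClient, and the assumption that app.use(express.json()) is registered) is hypothetical:
// Server side: turn every method of an API object into a POST endpoint.
function exposeApi(app, api) {
  const proto = Object.getPrototypeOf(api);
  for (const name of Object.getOwnPropertyNames(proto)) {
    if (name === 'constructor') continue;
    app.post('/rpc/' + name, async (req, res) => {
      // req.body.args requires app.use(express.json()) to be set up
      res.json(await api[name](...(req.body.args || [])));
    });
  }
}

// Client side: a Proxy turns blogApi.getPosts() into POST /rpc/getPosts.
function apiClient(baseUrl) {
  return new Proxy({}, {
    get: (_, name) => (...args) =>
      fetch(baseUrl + '/rpc/' + name, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ args })
      }).then(res => res.json())
  });
}

// Usage: exposeApi(app, new BlogAPI()) on the server,
// const blogApi = apiClient('http://server-api-endpoint.com') on the client.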
I have a Java Spring application with a Tomcat server that listens on a Kafka topic. I want to display all messages in real time on the web page. Therefore, when a Kafka message arrives in the backend, I want to see it on my web page. I don't know a good approach for pushing Kafka messages directly to the front-end and displaying them on the web page. Could someone help me with a solution and some examples? Thanks!
I have implemented a system like this in Java for my last employer, albeit not with Spring/Tomcat. It was consuming messages from Kafka and serving them on a web socket to be displayed in the browser. The approach I followed was to use akka-stream-kafka and akka-http for web-socket support. The benefit of that is both are based on akka-streams which makes it an easy fit for streaming data.
While you can embed akka-http in your Spring app running inside Tomcat, it may not feel like the most natural choice any more, as the Spring framework already has its own support for both Kafka and WebSockets. However, if you're not familiar with either, then jumping on the akka approach may be easiest, and the core logic goes along these lines (I can't share the code from work, so I have just put this together from the examples in the docs, not tested):
public Route createRoute(ActorSystem system) {
    return path("ws", () -> {
        ConsumerSettings<byte[], String> consumerSettings =
            ConsumerSettings.create(system, new ByteArrayDeserializer(), new StringDeserializer())
                .withBootstrapServers("localhost:9092")
                // A random group id so that each client gets all messages. To be able to resume
                // from where a client left off in case of disconnects, you can generate it on the
                // client side instead and pass it in the request.
                .withGroupId(UUID.randomUUID().toString())
                .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        return handleWebSocketMessages(
            Flow.fromSinkAndSourceCoupled(
                Sink.ignore(),
                Consumer.committableSource(consumerSettings, Subscriptions.topics("topic1"))
                    .map(msg -> TextMessage.create(msg.record().value()))
            )
        );
    });
}
To expose this route you can follow the minimalistic example, the only difference being that the route you define needs the ActorSystem:
final Http http = Http.get(system);
final ActorMaterializer materializer = ActorMaterializer.create(system);
final Flow<HttpRequest, HttpResponse, NotUsed> routeFlow = createRoute(system).flow(system, materializer);
final CompletionStage<ServerBinding> binding = http.bindAndHandle(routeFlow,
ConnectHttp.toHost("localhost", 8080), materializer);
Once you have your messages published to the WebSocket, the front-end code will of course depend on your UI framework of choice; the simplest code to consume WS messages from JavaScript is:
this.connection = new WebSocket('ws://url-to-your-ws-endpoint');
this.connection.onmessage = evt => {
    // display the message
};
To easily display the message in the UI, you want the format to be something convenient, like JSON. If your Kafka messages are not JSON already, that's where the Deserializers in the first snippet come in: you can convert the message to a convenient JSON string in the Deserializer, or do it later on in the .map() called on the Source object.
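For example, if the server ends up pushing JSON strings, the onmessage handler above might look roughly like this; the payload field and the messages element are placeholders:
this.connection.onmessage = evt => {
    // evt.data is the JSON string built on the server side
    const message = JSON.parse(evt.data);
    // append it to some list in the page (element id is hypothetical)
    const li = document.createElement('li');
    li.textContent = message.payload; // field name is an assumption
    document.getElementById('messages').appendChild(li);
};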
Alternatively, if polling is an option, you can also consider using the off-the-shelf Kafka REST Proxy; then you only need to build the front-end.
Throughout my prior experience with web development, I've always been afloat on the LAMP stack. Recently I've become infatuated with the MEAN stack, and have built a couple of neat little applications. Moving onward, I heard Meteor mentioned in an IRC chat and decided to jump on the bandwagon.
For the past week, I've been trying to set up an application within Meteor. I start planning concise structures and documenting control flow, but when it gets to the actual application, I start to lose myself during development. My standing problem is that I don't truly understand isomorphic JavaScript.
Using Meteor, when exactly is server code executed? Server code is beyond the scope of the client, as it should be for security purposes, but then how is it actually isomorphic? I know of mongo-client, and I find it pretty nifty. Though how exactly should I be structuring an application within Meteor?
Let's say I'm creating a controller for "users", and this controller will be loaded into two possible solutions.
./server/app.js
$scope.controller = { };
if( !$scope.session.exists )
{
$scope.controller =
{
authenticate: function(email, password) {
// authenticate credentials and create session
},
create: function(form) {
// create new user
}
};
}
else
{
$scope.controller =
{
end: function(call) {
try {
call();
} catch(err) {
// do nothing, optional callback
}
// end user session
}
};
}
How will the above code work in coherence with my client code? If I create a simple login form
./client/login.html
<template name="login">
<form>
<input type="text" name="email" placeholder="Email Address">
<input type="password" name="password" placeholder="Password">
<button type="submit">Login</button>
</form>
</template>
./client/login.js
Template.login.events({
// handle the form submission
'submit form': function(event) {
// stop the form from submitting
event.preventDefault();
// ???
// event.target.email.value
// event.target.password.value
}
});
How will this template event make use of $scope.controller.authenticate? Well, it can't, because authenticate is exclusively on the server. Hence, I would need to move the authentication process to the client, which wouldn't make any sense, and which makes my brain go numb at the idea of isomorphic JavaScript.
I know these questions are somewhat vague, but to put it short and simple, what is the extent of the relationship between server sided and client sided code within Meteor?
Foreword: It seems that you are using Angular on the backend, but Meteor is supposed to be self-sufficient out of the box (you can of course integrate other frameworks like Angular, but it is sometimes hacky and not required).
In Meteor the "communication" between client and server works in two ways:
DB reactivity: Depending on your allow/deny rules for each Collection, the client can directly interact with the client-side DB (miniMongo). Such interactions (delete, update, etc.) will be replicated to the server DB (and validated against your allow/deny rules for security). Be careful: any app comes with the insecure package installed, which allows total control of the DB from the client (it is there for prototyping purposes, but this can be confusing).
Methods (best way IMO, and complementary): This is the part where Meteor may be truly isomorphic. Methods can be declared on the server only, to be accessed from the client, or on both client and server (in a file located outside /client and /server, for example). In the first case, the client calls the method and waits for the answer from the server in the Meteor.call callback; in the second case, the client calls the method but it runs on both client and server, with the server actually doing the job and the client simulating the method in order to compensate for the latency between client and server (and give the user the impression of instantaneity). If the result on the server is different (i.e. different data after modification), the client's method result will be invalidated and reverted.
Methods work like below (example without latency compensation), and I think it is what you are currently looking for:
//SERVER
Meteor.methods({
doSomething: function (params) {
//method logic
},
doSomethingElse: function (params) {
//method logic
}
});
//CLIENT
Meteor.call('doSomething', params, function (error, result) {
//get the error or the result of the call
});
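As a hedged sketch of the latency-compensated variant described above, the same method can live in shared code (a file outside /client and /server) so the client simulates it while the server does the authoritative work; the Tasks collection and the method name here are made up:
// Shared code, loaded on both client and server
Tasks = new Mongo.Collection('tasks');

Meteor.methods({
  'tasks.addTask': function (text) {
    // Runs on the client as a simulation (instant UI feedback) and on the
    // server as the authoritative version; if the results differ, the
    // client's simulated result is invalidated and reverted.
    Tasks.insert({ text: text, createdAt: new Date() });
  }
});

// Client
Meteor.call('tasks.addTask', 'Buy milk', function (error, result) {
  // usually nothing to do here: the UI already updated via the simulation
});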
You have to remember that Meteor is really about data synchronization through publish/subscribe. A lot of what you do inside methods will be translated into a DB modification and then synchronized to the client through reactivity (provided that the data is published to the client).
So depending on your publications/subscriptions, you may not need to handle server calls and results like in a usual application; i.e. you may call the method and do nothing else, as the data will be automatically synchronized through a publication.
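As an illustration of that last point, a publication plus a subscription is often all the "result handling" you need (the names here are the same hypothetical ones as above):
// Server: publish the collection
Meteor.publish('tasks', function () {
  return Tasks.find();
});

// Client: subscribe once; anything a method changes on the server
// shows up automatically in the local minimongo copy
Meteor.subscribe('tasks');

Template.taskList.helpers({
  tasks: function () {
    return Tasks.find({}, { sort: { createdAt: -1 } });
  }
});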
I just read this post, and I do understand what the difference is. But still, in my head I have the question: can/should I use both in the same app/website? Say I want AngularJS to fetch content and update my page, connecting to a REST API and all of that. But on top of that I also want a realtime chat, or to trigger events on other clients when there is an update or a message is received.
Does Angular support that, or do I need to use something like Socket.io to trigger those events? Does it make sense to use both?
It would be great if someone could help me or point me to some good reading about whether there is a purpose for using both of them together.
Hope I'm clear enough. Thank you for any help.
JavaScript supports WebSocket, so you don't need an additional client-side framework to use it. Please take a look at the $connection service declared in this WebSocket-based AngularJS application.
Basically you can listen for messages:
$connection.listen(function (msg) { return msg.type == "CreatedTerminalEvent"; },
function (msg) {
addTerminal(msg);
$scope.$$phase || $scope.$apply();
});
Listen once (great for request/response):
$connection.listenOnce(function (data) {
return data.correlationId && data.correlationId == crrId;
}).then(function (data) {
$rootScope.addAlert({ msg: "Console " + data.terminalType + " created", type: "success" });
});
And send messages:
$connection.send({
type: "TerminalInputRequest",
input: cmd,
terminalId: $scope.terminalId,
correlationId: $connection.nextCorrelationId()
});
Usually, since a WebSocket connection is bidirectional and well supported, you can also use it for getting data from the server in a request/response model. You can have the two models:
Publisher/Subscriber: Where the client declares its interest in some topics and set handlers for messages with that topic, and then the server publish (or push) messages whenever it sees fit.
Request/response: Where the client sends a message with a requestID (or correlationId), and listen for a single response for that requestId.
Still, you can have both if you want, and use REST for getting data, and WebSocket for getting updates.
On the server side, you may need to use Socket.io or whatever server-side framework you prefer in order to have a backend with WebSocket support.
As noted in the answer in your linked post, Angular does not currently have built-in support for Websockets. So, you would need to directly use the Websockets API, or use an additional library like Socket.io.
However, to answer your question of whether you should use both a REST API and WebSockets in a single Angular application: there is no reason you can't use standard XmlHttpRequest requests for interacting with a REST API (using $http or another data-layer library such as BreezeJS) for certain functionality in various parts of the application, and also use WebSockets for another part (e.g. real-time chat).
Angular is designed to assist with handling this type of scenario. A typical solution would be to create one or more controllers to handle the application functionality and update your page, and then to create separate Services or Factories that encapsulate the data management for each of your data endpoints (i.e. the REST API and the realtime chat server), which are then injected into the Controllers.
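A rough sketch of that layout might look like the following; the module name, endpoint URL, and WebSocket URL are all placeholders:
// Data service for the REST API
angular.module('app').factory('todoData', ['$http', function ($http) {
  return {
    getAll: function () {
      return $http.get('/api/todos').then(function (res) { return res.data; });
    }
  };
}]);

// Data service wrapping a raw WebSocket for the realtime chat
angular.module('app').factory('chatSocket', ['$rootScope', function ($rootScope) {
  var socket = new WebSocket('ws://example.com/chat');
  return {
    onMessage: function (handler) {
      socket.onmessage = function (evt) {
        // wrap in $apply so Angular picks up the change
        $rootScope.$apply(function () { handler(JSON.parse(evt.data)); });
      };
    },
    send: function (msg) { socket.send(JSON.stringify(msg)); }
  };
}]);

// Both services injected into one controller
angular.module('app').controller('MainCtrl', ['$scope', 'todoData', 'chatSocket',
  function ($scope, todoData, chatSocket) {
    $scope.messages = [];
    todoData.getAll().then(function (todos) { $scope.todos = todos; });
    chatSocket.onMessage(function (msg) { $scope.messages.push(msg); });
  }]);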
There is a great deal of information available on using angular services/factories for managing data connections. If you're looking for a resource to help guide you on how to build an Angular application and where data services would fit in, I would recommend checking out John Papa's AngularJS Styleguide, which includes a section on Data Services.
For more information about factories and services, you can check out AngularJS : When to use service instead of factory