I'm wondering if there's a JS web framework with a transparent/autogenerated data layer, to save time so I don't have to write all of that code myself.
Let's say I'm writing a React blog app that communicates with the server using a REST API.
I need to write two pieces of the data layer: the server-side part, and its client counterpart on the client side.
Something like code below:
1. The business logic itself, which resides on the server side:
class BlogAPI {
  async getPosts() {
    const records = await db.query('select * from posts')
    return convertRecordsIntoPostObjects(records)
  }
}
2. Exposing the business logic as a REST API and writing its client on the client side:
// REST API, this code also resides on the server side.
httpServer.get('/posts', () => blogApi.getPosts())
// And writing a client for it, that code resides on the client side.
class BlogAPIClient {
  async getPosts() {
    return http.get('http://server-api-endpoint.com/posts')
  }
}
I'm wondering if there are web frameworks where part 2 is somehow magically autogenerated, so I can save time and write only the business logic.
P.S. Don't mention GraphQL, it's not what I'm asking.
Meteor looks like what you are looking for: https://www.meteor.com/tutorials/react/collections . You may also wish to take a look at Parse Server https://docs.parseplatform.org/parse-server/guide/#getting-started (and maybe also Firebase, though it's neither customizable nor self-hosted/open-source).
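To give an idea of what this looks like in Meteor, the data layer is shared between client and server: you define a collection once, publish it on the server and subscribe from the client, and there is no hand-written REST client at all. A minimal sketch (Posts and the 'posts' publication name are just placeholders):
Posts = new Mongo.Collection('posts');

if (Meteor.isServer) {
  // the server decides what data to publish
  Meteor.publish('posts', function () {
    return Posts.find();
  });
}

if (Meteor.isClient) {
  // the client subscribes; no REST client to write
  Meteor.subscribe('posts');
  // reactive query against the synced client-side cache (Minimongo)
  const posts = Posts.find().fetch();
}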
Related
I have a Java Spring application with a Tomcat server that listens on a Kafka topic. I want to display all messages on a web page in real time, so when a Kafka message arrives in the backend I want to see it on my web page. I don't know a good approach for pushing Kafka messages directly to the front end and displaying them on the web page. Could someone help me with a solution and some examples? Thanks!
I have implemented a system like this in Java for my last employer, albeit not with Spring/Tomcat. It was consuming messages from Kafka and serving them on a web socket to be displayed in the browser. The approach I followed was to use akka-stream-kafka and akka-http for web-socket support. The benefit of that is both are based on akka-streams which makes it an easy fit for streaming data.
While you can embed akka-http in your Spring app running inside Tomcat, it may no longer feel like the most natural choice, as Spring already has its own support for both Kafka and WebSockets. However, if you're not familiar with either, then the akka approach may be easiest, and the core logic goes along these lines (I can't share the code from work, so I've just put this together from the examples in the docs, not tested):
public Route createRoute(ActorSystem system) {
  return path("ws", () -> {
    ConsumerSettings<byte[], String> consumerSettings =
        ConsumerSettings.create(system, new ByteArrayDeserializer(), new StringDeserializer())
            .withBootstrapServers("localhost:9092")
            // a fresh group id so that each client gets all messages; to be able to resume
            // from where a client left off in case of disconnects, generate it on the client
            // side and pass it in the request instead
            .withGroupId(UUID.randomUUID().toString())
            .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

    return handleWebSocketMessages(
        Flow.fromSinkAndSourceCoupled(
            Sink.ignore(),
            Consumer.committableSource(consumerSettings, Subscriptions.topics("topic1"))
                .map(msg -> TextMessage.create(msg.record().value()))
        )
    );
  });
}
To expose this route you can follow the minimalistic example, the only difference being that the route you define needs the ActorSystem:
final Http http = Http.get(system);
final ActorMaterializer materializer = ActorMaterializer.create(system);
final Flow<HttpRequest, HttpResponse, NotUsed> routeFlow = createRoute(system).flow(system, materializer);
final CompletionStage<ServerBinding> binding = http.bindAndHandle(routeFlow,
ConnectHttp.toHost("localhost", 8080), materializer);
Once you have your messages published to the WebSocket, the front-end code will of course depend on your UI framework of choice; the simplest code to consume WS messages from JavaScript is:
this.connection = new WebSocket('ws://url-to-your-ws-endpoint');
this.connection.onmessage = evt => {
  // display the message
};
To easily display the message in the UI, you want the format to be something convenient, like JSON. If your Kafka messages are not JSON already, that's where the deserializers in the first snippet come in: you can convert the payload to a convenient JSON string in the deserializer, or do it later in the .map() called on the Source.
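For example, assuming the payload arriving on the socket is a JSON string, the handler above might parse it and append it to a list (a rough sketch; the messages element id is a placeholder):
this.connection.onmessage = evt => {
  const msg = JSON.parse(evt.data);                    // assumes the payload is JSON
  const li = document.createElement('li');
  li.textContent = JSON.stringify(msg);                // render however fits your UI
  document.getElementById('messages').appendChild(li); // assumes a <ul id="messages"> exists
};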
Alternatively, if polling is an option you can also consider using the off-the-shelf Kafka Rest Proxy, then you only need to build the front-end.
I'm using Next.js, and I have a custom server using Express. I have a page that requires some data from the database.
getInitialProps(), when running on the server, could just grab the data from the database and return it, without any problems.
However, getInitialProps() can also run on the client side (when the user initially requests a different page, then navigates to this one). In that case, since I'm on the client side, I obviously can't just fetch the data from the database - I have to use AJAX to talk to the server and ask it to retrieve it for me.
Of course, this also means that I have to define a new Express route on the server to handle this request, which will contain exactly the same code as the server-side part of getInitialProps(), which is very undesirable.
What's the best way to handle this?
getInitialProps() always receives the request and response as parameters which are only set on the server:
static async getInitialProps({ req }) {
  if (req) {
    // called on server
  } else {
    // called on client
  }
}
https://github.com/zeit/next.js#fetching-data-and-component-lifecycle
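Concretely, the branch might look roughly like this (a sketch, not an official pattern; db and /api/posts are hypothetical placeholders for your own data access code and Express route):
static async getInitialProps({ req }) {
  if (req) {
    // on the server: query the database directly
    const posts = await db.getPosts(); // db is a hypothetical data-access helper
    return { posts };
  }
  // on the client: go through the HTTP API instead
  const res = await fetch('/api/posts'); // hypothetical Express route
  return { posts: await res.json() };
}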
Since no good solution seemed to exist, I have created and published a library to provide a simple and elegant solution to this problem: next-express.
In your getInitialProps you should be making an HTTP request to a new Express route that has your logic for fetching from the database. That logic should never live in the UI layer.
This route should then be called regardless of whether you are on the client or on the server - you don't need to do any code branching.
Make an API distinct from your Next.js app. Think of the Next.js app as a frontend client that happens to render pages on the server.
With time, new solutions come around.
Next.js has introduced a new method, getServerSideProps, primarily for such use cases.
getServerSideProps only runs on the server side and never runs in the browser.
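A minimal sketch of what that can look like (db is a placeholder for your own data access, and the post fields are made up):
// pages/posts.js
export async function getServerSideProps() {
  // runs on the server for every request, never in the browser
  const posts = await db.getPosts(); // hypothetical data-access helper
  return { props: { posts } };
}

export default function PostsPage({ posts }) {
  return <ul>{posts.map(p => <li key={p.id}>{p.title}</li>)}</ul>;
}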
For me, the quickest way I found is to get the data from __NEXT_DATA__
MyApp.getInitialProps = async (): Promise<AppCustomProps> => {
  const isInBrowser = typeof window !== 'undefined';
  if (isInBrowser) {
    const appCustomPropsString =
      document.getElementById('__NEXT_DATA__')?.innerHTML;
    if (!appCustomPropsString) {
      throw new Error(`__NEXT_DATA__ script was not found`);
    }
    const appCustomProps = JSON.parse(appCustomPropsString).props;
    return appCustomProps;
  }
  // server side, where I actually fetch the data from db/cms and return it
};
I'm an HTML5 developer with mainly JavaScript experience. I'm starting to learn the backend using Node.js. I don't have a particular example for this question/requirement. I'd like to call a backend function with JavaScript, but I'm not sure how. I already researched events and such for Node.js, but I'm still not sure how to use them.
Communicating with Node.js is like communicating with any other server-side technology: you need to set up some form of API. What kind you need depends on your use case. That's a different topic, but as a hint: if you need persistent connections go with WebSockets, and if you just need occasional connections go with REST. Here is an example of calling a Node function using a REST API and Express.
var express = require('express');
var app = express();

app.post('/api/foo', foo);

function foo(req, res) {
  res.send('hello world');
}

app.listen(3000);
From the frontend you can post to this REST endpoint like so.
$.post("/api/foo", function(data) {
console.log( "Foo function result:", data );
});
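If you'd rather not use jQuery, the same call with the built-in fetch API looks roughly like this:
fetch('/api/foo', { method: 'POST' })
  .then(res => res.text())
  .then(data => console.log('Foo function result:', data));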
If you're just starting with Node.js, don't worry about WebSockets just yet.
You're going to want to create a REST API (most likely) depending on what you're trying to accomplish. You can put that REST API behind some kind of authentication if desired.
A REST API is going to have endpoints for creating/deleting/updating and getting (finding) a document, like a given user.
My recommendation is to work backwards from something that's already working. Clone this app locally and check out the controllers to see examples of how this application interacts with creating users.
https://github.com/sahat/hackathon-starter
Once you create a controller that returns data when a client hits an endpoint (like http://localhost:3000/user/create), you'll want to create some HTML that interacts with that endpoint through a form element. Or you can interact with the endpoint from JavaScript using a library like jQuery.
Let me know if that makes sense to you. Definitely a good starting point is to clone that app and work backwards from there.
Can I suggest trying api-mount? It basically allows calling the API as simple functions without having to think about AJAX requests, fetch, Express, etc. Basically, on the server you do:
const ApiMount = apiMountFactory()
ApiMount.exposeApi(api)
"api" is basically an object of methods/functions that you are willing to call from your web application.
On the web application you then do this:
const api = mountApi({baseUrl: 'http://your-server.com:3000'})
Having done that you can call your API simply like this:
const result = await api.yourApiMethod()
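Here the server-side api object could be as simple as a plain object of async functions (a hypothetical sketch; yourApiMethod and db are placeholders):
const api = {
  async yourApiMethod() {
    // ordinary server-side logic; db stands in for your own data access
    return db.query('select * from posts');
  }
};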
Try it out. Hope it helps.
Let's say I want to create a to-do list using Angular. I have a REST API that stores the items in a DB and provides basic operations. Now, when I want to connect my Angular app to the REST API, I've found two ways to do so following some tutorials online:
1. Data gets handled in the backend only: a service gets created that has a getAllTodos function. This function gets attached directly to the scope (e.g. to use it in ng-repeat):
var getAllTodos = function() {
  //Todo: Cache http request
  return $http...;
}

var addTodo = function(todo) {
  //Todo: Clear cache of getAllTodos
  $http...
}
2. Data gets handled in the frontend too. There is an init function that initializes the todos variable in the service.
var todos = [];

var init = function() {
  $http...
  todos = //result of $http;
};

var getAllTodos = function() {
  return todos;
};

var addTodo = function(todo) {
  $http...
  todos.push(todo);
}
I've seen both ways in several tutorials, but I'm wondering which would be the best way? The first one is used in many tutorials where the author has a REST API in mind from the start. The second one is often used when the author first wants to build the functionality in the frontend and only later wants to store data permanently using a backend.
Both ways have their advantages and disadvantages. The first one reduces code duplication between frontend and backend; the second one allows faster operations because changes can be handled in the frontend first and the backend informed afterwards.
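Or maybe a combination of both? Roughly something like this: a local cache with optimistic updates, with the backend informed afterwards (a sketch; /api/todos and the module variable app are placeholders):
app.factory('todoService', function($http) {
  var todos = [];

  return {
    load: function() {
      return $http.get('/api/todos').then(function(res) {
        angular.copy(res.data, todos); // keep the same array reference for bindings
        return todos;
      });
    },
    getAllTodos: function() {
      return todos;
    },
    addTodo: function(todo) {
      todos.push(todo); // optimistic update
      return $http.post('/api/todos', todo).catch(function() {
        todos.splice(todos.indexOf(todo), 1); // roll back if the server rejects it
      });
    }
  };
});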
//EDIT: The frontend is the AngularJS client for me, the backend is the REST API on the server.
Separation of Frontend and Backend is often done for security reasons. You can locate Backend on a separate machine and then restrict access to that machine to only calls originating from the Frontend. The theory is that if the Frontend is compromised, the Backend has a lower risk factor. In reality if someone has compromised any machine on your network then the entire network is at risk on one level or another.
Another reason for a Backend/Frontend separation would be to provide database access through the Backend to multiple frontend clients. You have a single Backend with access to the DB and either multiple copies of the Frontend or different Frontends that access the Backend.
Your final design needs to take into account the possible security risks and also deployment and versioning. With the multiple-tier approach you can deploy individual Frontends without having to drop the Backend, and you can also "relocate" parts of the application without downtime. The more flexible the design of your application, the more complicated deployment may be. The needs of your application will depend on whether you are writing a simple blog or a large enterprise application.
You need both frontend and backend functionality: in the frontend you prepare the data that is sent, and in the backend you handle the request on the server.
Throughout my prior experience with web development, I've always been afloat on the LAMP stack. Recently I've become infatuated with the MEAN stack and have built a couple of neat little applications. Moving onward, I heard Meteor mentioned in an IRC chat and decided to jump on the bandwagon.
For the past week, I've been trying to set up an application within Meteor. I start planning concise structures and documenting control flow, but when it gets to the actual application, I lose myself during development. My standing problem is that I don't truly understand isomorphic JavaScript.
Using Meteor, when exactly is server code executed? Server code is beyond the scope of the client, as it should be for security purposes, but then how is it actually isomorphic? I know of mongo-client, and I find it pretty nifty. Though how exactly should I be structuring an application within Meteor?
Let's say I'm creating a controller for "users", and this controller will be loaded into two possible solutions.
./server/app.js
$scope.controller = { };

if( !$scope.session.exists )
{
  $scope.controller =
  {
    authenticate: function(email, password) {
      // authenticate credentials and create session
    },
    create: function(form) {
      // create new user
    }
  };
}
else
{
  $scope.controller =
  {
    end: function(call) {
      try {
        call();
      } catch(err) {
        // do nothing, optional callback
      }
      // end user session
    }
  };
}
How will the above code work together with my client code? Say I create a simple login form:
./client/login.html
<template name="login">
<form>
<input type="text" name="email" placeholder="Email Address">
<input type="password" name="password" placeholder="Password">
<button type="submit">Login</button>
</form>
</template>
./client/login.js
Template.login.events({
  // handle the form submission
  'submit form': function(event) {
    // stop the form from submitting
    event.preventDefault();
    // ???
    // event.target.email.value
    // event.target.password.value
  }
});
How will this template event make use of $scope.controller.authenticate? Well, it can't, because authenticate lives exclusively on the server. Hence I would need to move the authentication process to the client, which wouldn't make any sense, and that's what makes my brain go numb at the ideology of isomorphic JavaScript.
I know these questions are somewhat vague, but to put it short and simple: what is the extent of the relationship between server-side and client-side code within Meteor?
Foreword: it seems that you are using Angular on the backend, but Meteor is supposed to be self-sufficient out of the box (you can of course integrate other frameworks like Angular, but it is sometimes hacky and not required).
In Meteor the "communication" between client and server works in two ways:
DB reactivity: depending on your allow/deny rules for each Collection, the client can directly interact with the client-side DB (Minimongo). Such interactions (delete, update, etc.) are replicated to the server DB and validated against your allow/deny rules for security. Be careful: any new app comes with the insecure package installed, which allows total control of the DB from the client (handy for prototyping, but this can be confusing).
Methods (the best way IMO, and complementary): this is the part where Meteor may be truly isomorphic. Methods can be declared on the server only and accessed from the client, or on both client and server (in a file located outside /client and /server, for example). In the first case the client calls the method and waits for the answer from the server in the Meteor.call callback. In the second case the client calls the method, but the method runs on both client and server: the server actually does the job while the client simulates the method in order to compensate for the latency between client and server (and give the user the impression of instantaneity). If the result on the server turns out to be different (i.e. different data after the modification), the client's simulated result is invalidated and reverted.
Methods work like below (example without latency compensation), and I think it is what you are currently looking for:
//SERVER
Meteor.methods({
  doSomething: function (params) {
    //method logic
  },
  doSomethingElse: function (params) {
    //method logic
  }
});

//CLIENT
Meteor.call('doSomething', params, function (error, result) {
  //get the error or the result of the call
});
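For the latency-compensated variant, you declare the method in shared code (outside /client and /server) so both sides run it; a rough sketch, where Tasks is a placeholder collection:
// shared code - runs as a simulation on the client and for real on the server
Meteor.methods({
  addTask: function (text) {
    // the client writes to Minimongo immediately, the server writes to Mongo;
    // Meteor reconciles any difference once the server result arrives
    Tasks.insert({ text: text, createdAt: new Date() });
  }
});

// the client call looks exactly the same
Meteor.call('addTask', 'buy milk');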
You have to remember that Meteor is really about data synchronization through publish/subscribe. A lot of what you do inside methods will translate into a DB modification that is then synchronized to the client through reactivity (provided that data is published to the client).
So depending on your publications/subscriptions, you may not need to handle server calls and results like in a usual application; i.e. you may call the method and do nothing else, as the data will be automatically synchronized through a publication.