Well, I'm kinda stuck on a (maybe simple) problem. I would like to use a database / cloud server with a NestJS API, with the main purpose of storing images. It's my first time experimenting with such a concept, and I would like to know, first of all, the 'best' solution. For example, is it OK to use something like MongoDB or Firebase Storage? If not, what would be the ideal solution to follow? Is it even good to use Nest for such a thing?
More specifically (how I think / plan to approach the project):
I would like to use a NoSQL database (preferably MongoDB) to store some data about a certain object, and I want a photo to be part of that object as well. I'm pretty sure I can't mix these types of data (storing strings/numbers/etc. AND images), so maybe I could store a reference to another DB (collection) / cloud server and retrieve the image from there.
Is this practical? How is this concept supposed to work?
So far I've managed to store images with the help of Multer in a local /uploads folder, but I don't really like this solution. After a certain number of images are posted, my server will (probably) run slower and slower (right?).
Sorry for the silly question, but I've been stuck here for a few days... :(
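Something like this is what I have in mind, if that helps to picture it (the field names and URL are just made-up examples):

    // One MongoDB document: ordinary fields plus a *reference* to the image,
    // which would live somewhere else (another collection or a cloud bucket).
    {
      name: "Some object",
      description: "Data about the object",
      imageUrl: "https://example-bucket.example.com/images/abc123.jpg"
    }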
The way you're doing it with the filesystem is perfectly fine. Under the hood, tools like MongoDB are implemented on top of a file system anyway. Using the server's file system can pose scalability issues when you need to spread application load over multiple servers, but for a single-server solution it will have the highest performance due to minimal I/O overhead.
I recommend using an object storage system (like Amazon S3 or a GCP bucket) if you want to take scalability seriously. All your files will be stored on a single managed system designed to scale, and you'll be able to treat each of your servers as stateless (and therefore provision several and guarantee you can access all files no matter which server handles each request).
These would be the two "industry standard" solutions. Not to say you cannot do this with Mongo or Firebase, but they probably aren't as good. The most important thing, however, is to avoid storing files in a relational database; as long as you aren't doing that, you'll be fine.
I'd say you're overthinking this. Stick with the file system until you know you have actual scalability issues. NestJS is highly modular: you can implement your app so that other modules don't "know" the underlying implementation of how you store files, and then comfortably write your code with images stored on the file system now and move later if needed.
Lastly, NestJS is just a framework for writing HTTP servers. It isn't going to be any worse or better than any other HTTP server out there.
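As a rough sketch of what that looks like with the AWS SDK v3 (the bucket, region, and URL format here are made-up examples, not anything your project defines):

    // Upload a buffer to S3 and return a location you can store in your DB.
    const { S3Client, PutObjectCommand } = require('@aws-sdk/client-s3');

    const s3 = new S3Client({ region: 'us-east-1' });

    async function saveImage(key, buffer) {
      await s3.send(new PutObjectCommand({
        Bucket: 'my-app-images', // hypothetical bucket name
        Key: key,
        Body: buffer,
      }));
      return `https://my-app-images.s3.amazonaws.com/${key}`;
    }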
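A minimal sketch of that idea in plain Node (the class and method names are mine, not a NestJS API):

    // The rest of the app only ever calls save()/get(), so swapping this
    // filesystem implementation for an S3-backed one later touches one module.
    const fs = require('fs/promises');
    const path = require('path');

    class LocalFileStorage {
      constructor(baseDir) {
        this.baseDir = baseDir;
      }

      async save(name, buffer) {
        await fs.writeFile(path.join(this.baseDir, name), buffer);
        return `/uploads/${name}`; // the reference you'd store in MongoDB
      }

      async get(name) {
        return fs.readFile(path.join(this.baseDir, name));
      }
    }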
Related
I'm trying to build an authentication system for a website that is supposed to be hosted on an intranet. The guy told me he wanted the credentials of the users (max 10 users) to be stored in a single file with some kind of protection. Do you have suggestions? I really tried to find information; maybe I'm not searching for the right thing, but everybody uses MongoDB in this kind of situation, which is not stored locally.
Be gentle please... I'm trying to do it in JavaScript.
When you mentioned a single file on the server, I thought of SQLite. As per the project's description:
The complete state of an SQLite database is usually contained in a single file on disk called the "main database file".
But passwords - even on an intranet server - should never be stored in plain text. And one-way hashing where the source code is also present does not sound like a good idea.
However, SQLCipher seems to solve both of those issues. As a starter, you might want to set up a brief tester with plain SQLite (here is one simple tutorial).
From there you can look at a sample like this one, which shows how SQLCipher can be used with a JavaScript engine like Node.js.
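A brief tester might look something like this, assuming the better-sqlite3 package (any SQLite binding would do):

    // A single-file user store with plain SQLite -- no encryption yet, so
    // treat this only as a stepping stone toward SQLCipher.
    const Database = require('better-sqlite3');

    const db = new Database('users.db'); // the single "main database file"

    db.prepare(
      'CREATE TABLE IF NOT EXISTS users (name TEXT PRIMARY KEY, hash TEXT)'
    ).run();

    // Store a password *hash* (e.g. from bcrypt), never the plain password.
    const hash = '$2b$10$...'; // placeholder, truncated
    db.prepare('INSERT OR REPLACE INTO users (name, hash) VALUES (?, ?)')
      .run('alice', hash);

    console.log(db.prepare('SELECT * FROM users WHERE name = ?').get('alice'));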
I've tried it like that to keep it easy.
The JavaScript code:
SOOOOO, I ended up using nedb... VOILA!
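Roughly what that looks like, in case it helps anyone (a sketch; I'm hashing with bcrypt since nedb itself won't protect the passwords for you):

    // nedb keeps everything in one local file (users.db) -- no server needed.
    const Datastore = require('nedb');
    const bcrypt = require('bcrypt');

    const users = new Datastore({ filename: 'users.db', autoload: true });

    async function addUser(name, password) {
      const hash = await bcrypt.hash(password, 10);
      users.insert({ name, hash });
    }

    function checkUser(name, password) {
      return new Promise((resolve, reject) => {
        users.findOne({ name }, (err, doc) => {
          if (err) return reject(err);
          if (!doc) return resolve(false);
          resolve(bcrypt.compare(password, doc.hash));
        });
      });
    }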
My boss asked me to find a way to completely decouple our front-end application from the back-end in the local environment. Currently I'm the sole developer for both our back-end software and the front-end, so using Docker I'm able to mimic a production environment and work on both projects separately (we don't render on the server side). His idea is to mock literally everything, so that in theory you wouldn't need the back-end software to develop the front-end.
Two of the (more reasonable) solutions I've thought of are:
Mocking all of the network requests on the front end, so that mock functions run instead of real network requests.
The problem with this approach is that it is not persistent: all of the data is randomly generated on every request, and in a system so oriented around forms, tables, and lists, I feel that getting back the data you're expecting after a form submission is a must.
And in order to persist data, every request would have to go through some sort of data store (MobX, Redux, etc.), and even then, if the page refreshes, the data is gone.
Spinning up an Express server and a DB on top of Docker along with Webpack, and mimicking the production server's requests and responses using DB seeders; this way the front-end data is persistent.
Obviously, this approach would generate plenty of work, and in order to make sure the Express server correctly mimics the original back-end software, it too will need unit tests and mock requests.
While mocking the data is great for unit tests, this doesn't seem to me like the right way to do front-end development with such a small team. Is there a good approach to achieving this that I can't come up with or find? Or is this an exercise in poor decoupling strategy?
What you are looking for is a Mock API. There are plenty of packages for it that let you define example requests and responses in a JSON format. A lot of these also handle persisting data for a short amount of time.
From a strategy perspective, using these can actually make a lot of sense for automating end-to-end tests, which shouldn't rely on a production API. Whether it's the right use of developer time in a one-man team depends on the long-term perspective, of course ;-)
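If you'd rather hand-roll it, a tiny Express version of the same idea (the endpoints and seed data are made up) shows why this beats per-request random mocks: the seeded data persists for the life of the process:

    const express = require('express');
    const app = express();
    app.use(express.json());

    // Seed data stands in for the real back-end's database.
    const users = [{ id: 1, name: 'Ada' }];

    app.get('/api/users', (req, res) => res.json(users));

    app.post('/api/users', (req, res) => {
      const user = { id: users.length + 1, ...req.body };
      users.push(user); // visible to later requests, unlike one-off mocks
      res.status(201).json(user);
    });

    app.listen(3001, () => console.log('mock API on :3001'));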
I have a Node.js app on Heroku that checks for emails, and if an email matches certain criteria, it replies.
Because Heroku restarts the dyno every so often, I made a file that saves the IDs of the emails (a small array) I've already checked (so it doesn't reply twice to the same email). But silly me, Heroku resets that file too, so no changes I make are saved.
Is there a way to persist the changes the app makes to the file?
Or do you know of a better way to do what I want?
Heroku enforces this behavior (the local file deletion stuff) because it is a best practice. Writing files locally doesn't scale well, and can lead to some odd edge-case behaviors when you have multiple processes on the same VM all performing file I/O operations.
What you should use instead is either a database, a cache (like Redis), or even just write your file directly to a file storage service like Amazon S3.
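For your specific case, a Redis set is probably the smallest change. A sketch with the node-redis client (the key name is made up; REDIS_URL is what a Heroku Redis add-on typically provides):

    const { createClient } = require('redis');

    const client = createClient({ url: process.env.REDIS_URL });

    // SADD returns 1 if the member was newly added, 0 if it was already there,
    // so one round trip both records the id and tells you whether it's new.
    async function alreadyReplied(emailId) {
      const added = await client.sAdd('answered-email-ids', emailId);
      return added === 0;
    }

    async function main() {
      await client.connect();
      // ...fetch emails, then for each one:
      // if (!(await alreadyReplied(email.id))) { /* send the reply */ }
    }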
I realize it sounds annoying that you have to do these extra things even for a simple use case like what you're doing here -- but Heroku's platform is geared around enforcing best practices to help people build scalable, reliable software.
If you're looking for a way to do stuff like this without the extra hassle, you might want to consider just purchasing a small VPS server from another company where you can have direct control over processes, disk, etc.
I have the following scenario: I'm building a publication lookup tool so users can find documents through a search field and filters. Right now we're working with a small budget, so all the data is stored in a JSON file (~60 records). If the project is successful, we will have a server with a database and a couple of thousand records.
I want to develop the whole lookup solution using Breeze, so that later I won't have to make many modifications. The problem is that I can't find information about querying a JSON file directly (without a server).
Do you think that this is possible?
Actually, it is possible. But I can't think of a way that is as simple as setting up a simple server. That's like falling off a log with Visual Studio. Maybe you're coming from a different environment? I'd like to know. Even there, it's usually pretty easy to spin something up with some kind of HTTP API that can return JSON.
If you only have 60 records, I'm guessing this is a prototype that you're trying to stand up in a hurry. You're in such a hurry that you don't even want to use a server... which is kind of odd, because you need something to serve the HTML, CSS, and JavaScript files, right?
You could do it with Node.js / Express very easily; it's almost as simple as setting up an Express route that reads and returns the JSON file. But that still involves a server running somewhere (the client's own machine?), and you'd have to learn some elementary Node.js.
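That route really is about this small (the file and endpoint names are placeholders):

    // Serve the static assets and the JSON data from one tiny Express app.
    const express = require('express');
    const path = require('path');
    const app = express();

    app.use(express.static('public')); // your HTML/CSS/JS files

    app.get('/api/publications', (req, res) => {
      res.sendFile(path.join(__dirname, 'publications.json'));
    });

    app.listen(3000);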
You can do it entirely with HTML and JS script files and no server other than the file system.
Off the top of my head, I think I'd begin by writing a custom Breeze ajax adapter that is actually a mock: no matter what you ask of it, it returns the JSON data in its entirety.
You'd call this once at application start to load the entities into an EntityManager cache, then make all subsequent queries local queries. You can set the EntityManager's default query strategy so that every query runs against the local cache.
No matter what you do, you'll have to define metadata to describe the entity types in your JSON data. I'm guessing you only have one type so that should be simple and quick.
You'll also have to do something to tell Breeze what kind of entity you're querying. Adding .toType('Foo'); to the end of your queries may be sufficient. You can always delve into the JsonResultsAdapter if you need something fancier at a lower level of the stack.
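Pulling those pieces together, very roughly (an untested sketch; 'Foo'/'Foos' are placeholders, and it assumes the mock adapter and metadata described above are already registered):

    // Default every query to the local cache so nothing goes over the wire.
    const manager = new breeze.EntityManager({
      serviceName: 'none', // no real server behind this
      queryOptions: new breeze.QueryOptions({
        fetchStrategy: breeze.FetchStrategy.FromLocalCache
      })
    });

    // After the one-time load has filled the cache:
    const query = breeze.EntityQuery.from('Foos')
      .toType('Foo')
      .where('title', 'contains', 'history');

    const results = manager.executeQueryLocally(query); // synchronous, no HTTP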
None of this is hard. But none of it is Breeze 101 either. You're not following what we've thought of as a typical application development path. Maybe we're missing something. I'll be curious to see if people can relate to your situation.
I've made a server in now.js, and with about 80 users online it gets slow and sometimes people get disconnected. I've heard that I have to change the worker count, but how do I do that? And is it even the solution? Or maybe there's other advice.
Since you mentioned writing log data to a file, and it's getting large, make sure you're using Node's async file I/O so it's not blocking -- the fs methods with optional callbacks. Better yet, creating a write stream is the way to go (Node is great for its async file streaming capabilities).
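A minimal sketch of the write-stream approach (the file name is arbitrary):

    const fs = require('fs');

    // Open once in append mode; write() buffers internally and never blocks
    // the event loop the way a *Sync call would.
    const log = fs.createWriteStream('app.log', { flags: 'a' });

    function logLine(msg) {
      log.write(`${new Date().toISOString()} ${msg}\n`);
    }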
You may have hit a scaling issue, but 80 users seems low to me.
Are you sure you're not doing any kind of logic on your server side that could be blocking? Any math or anything else that takes too much time?
If you do have a scaling issue, you may need to scale your app horizontally.
To do so, you would use something like Node's cluster module to have multiple workers handling the load, with a Redis or Mongo instance for the shared data (it might also be possible to do this with messaging between workers in node cluster).
I've not pushed now.js that far yet, so I don't know how it would behave in such a situation.
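The cluster part might look roughly like this (a sketch; './server' stands in for however your now.js app starts):

    const cluster = require('cluster');
    const os = require('os');

    if (cluster.isMaster) {
      // One worker per CPU core.
      for (let i = 0; i < os.cpus().length; i++) {
        cluster.fork();
      }
      cluster.on('exit', () => cluster.fork()); // replace crashed workers
    } else {
      // Each worker runs the server; shared state (who's online, etc.) has to
      // live in Redis/Mongo, not in per-process memory.
      require('./server'); // hypothetical entry point
    }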