Angular - how to test Internet upload speed without backend? - javascript

I want to upload a file into the folder from which my Angular app is served while running on localhost. I'm not able to find any solution that doesn't use a backend.
For example, I just want to upload an image file, and that file should be copied into a specified folder of the project. This should be done only with Angular, without using any backend script or hitting any API endpoint.

Depending on your webhost, you can make your assets-folder accessible via FTP.
Making an FTP call from JavaScript (Angular is JavaScript) isn't that difficult, and there are plenty of examples and questions about it on the internet (like this)
Why you wouldn't do that:
The credentials for your FTP connection will be accessible in the compiled JavaScript code. With a little bit of effort, anyone can find them.
Each gate you open through the webhost's firewall is an extra vulnerability. That's why everybody will recommend you add an API endpoint for uploading files, so that you keep control over what may be uploaded.
Edit:
Reading your question again and all the sub-answers, I think I've figured out that you are building a native-like app with no backend, just an Angular single-page front-end application. And I can understand why (you can run this on every platform in an application that supports JavaScript), but the problem you are encountering is only the first of a whole series.
If this is the case, I wouldn't call it uploading, as you would store it locally.
But the good news is that you have localStorage at your disposal to store temporary data on the client's disk. It isn't a very large space, but it is something...
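For illustration, a minimal sketch of stashing a selected file in localStorage. The element id and storage key are made up, and browsers typically cap localStorage at a few megabytes, so this only works for small files:

```typescript
// Minimal sketch: persist a user-selected file in localStorage as a data URL.
// The '#file' selector and 'uploaded-image' key are illustrative assumptions.
function storeFileLocally(file: File): void {
  const reader = new FileReader();
  reader.onload = () => {
    try {
      // localStorage only holds strings and is capped at a few MB in most
      // browsers, so this is only suitable for small files like thumbnails.
      localStorage.setItem('uploaded-image', reader.result as string);
    } catch (e) {
      console.error('File too large for localStorage', e);
    }
  };
  reader.readAsDataURL(file);
}

// Usage: wire it to a plain <input type="file"> change event.
const fileInput = document.querySelector<HTMLInputElement>('#file');
fileInput?.addEventListener('change', () => {
  const file = fileInput.files?.[0];
  if (file) {
    storeFileLocally(file);
  }
});
```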

The assets folder is one of the statically served folders of the Angular app. It is located on the server so you can't add files to it without hitting the server (HTTP server, API, or whatever else...).
Even when running your app on localhost, there's a web server under the hood, so it behaves exactly the same as a deployed application, and you can't add files to the assets folder via the Angular app.
I don't know what exactly you want to do with your uploaded files, but:
If you want to use them on the client side only, and within one user session, then you can just store the file in a JavaScript variable and do what you want with it (see the sketch after this list)
If you want to share them across users, or across user sessions, then you need to store them on the server, and you can't bypass an API or some HTTP server configuration
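A minimal sketch of the first option, assuming a plain Angular component whose names are made up for the example:

```typescript
// Sketch: keep the selected file in a component field and use it purely on
// the client. Component selector and method names are illustrative only.
import { Component } from '@angular/core';

@Component({
  selector: 'app-file-picker',
  template: `<input type="file" (change)="onFileSelected($event)">`,
})
export class FilePickerComponent {
  selectedFile: File | null = null;

  onFileSelected(event: Event): void {
    const input = event.target as HTMLInputElement;
    this.selectedFile = input.files?.[0] ?? null;
    // The file now lives in memory for this user session only;
    // it disappears on reload and is never sent to the server.
  }
}
```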

Based on your clarification in one of your comments:
I'm trying to develop a small speed test application in which user can upload any file from his system to check upload and download speed.
The only way to avoid having your own backend is to use a 3rd-party API.
There are some dedicated speed test websites, which also provide API access. E.g.:
https://myspeed.today
http://www.speedtest.net
https://speedof.me/api.html
Some more: https://duckduckgo.com/?q=free+speedtest+api
Note that many of these APIs are paid services.
Also, I've been able to find this library https://github.com/ddsol/speedtest.net, which might indicate that speedtest.net has some kind of free API tier. But this is up to you to investigate.
This question might also be of help, as it shows using speedtest.net in React Native: Using speedtest.net api with React Native

You can use a third-party library such as ng-speed-test: an Angular library that downloads an image hosted on a third-party server (i.e. GitHub) to test internet speed.
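The underlying idea is simple enough to sketch by hand: time how long it takes to fetch a file whose size you already know. The URL and byte size below are placeholders, and note that upload speed can't be measured this way without some endpoint willing to receive the bytes, which is why the third-party APIs above exist:

```typescript
// Rough sketch: estimate download speed by timing the fetch of a file whose
// size is known in advance. The URL and size used here are placeholders.
async function estimateDownloadMbps(url: string, sizeInBytes: number): Promise<number> {
  const start = performance.now();
  // Cache-busting query param so the browser doesn't serve the file from cache.
  const response = await fetch(`${url}?t=${Date.now()}`, { cache: 'no-store' });
  await response.arrayBuffer(); // wait until the whole body has arrived
  const seconds = (performance.now() - start) / 1000;
  const bits = sizeInBytes * 8;
  return bits / seconds / 1_000_000; // megabits per second
}

// Usage (hypothetical test file of ~500 KB):
estimateDownloadMbps('https://example.com/test-image.jpg', 500_000)
  .then(mbps => console.log(`~${mbps.toFixed(1)} Mbps down`));
```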

Related

How to fetch data as .csv file from client with node.js application

I have written a game in JavaScript with the p5.js library. Now I want to host the game on a server to conduct a survey on a service like Amazon Mechanical Turk. Ideally the clients receive a URL to the game and play it while in-game actions are tracked, stored in node.js or on the server, and exported as a .csv file once they are done playing. After they finish the game, the .csv file should be sent automatically to a location that I can then access. I have zero experience in server hosting or similar topics.
So a couple questions arise:
Is a hosting service like Heroku suitable for hosting the game?
Do I need to use node.js to make this happen?
Which of those two would extract the data and store it to a csv? And where is the file stored?
How do I get or access the .csv file afterwards?
Any alternative takes to solve the problem?
Thanks a lot in advance!
github repository: https://github.com/luuuucaaa/schaeffers-charade
game on github pages: https://luuuucaaa.github.io/schaeffers-charade/
If I were you, I would do it like below:
Host
Since your project is basically static HTML & JavaScript content,
AWS S3's static hosting would be sufficient (also, the current GitHub Pages setup is another option if you just want to host it).
Hosting in a node.js environment is also possible using webpack serving, but it requires additional work (though if you need other npm packages to generate the .csv file, you need webpack anyway to bundle the JS file and attach it to the HTML).
Data Storing
Two approaches are worth considering.
The first is to store it on the filesystem: generate the .csv via a JS script within your app and save it where the app is hosted (if you go with S3, you can access it afterwards, but I'm not sure whether it can write objects from a client-side script).
The second is to post the data to another API endpoint (for example, building an API Gateway on AWS that triggers a Lambda, which stores it on S3); see the sketch below.
These are merely examples and I don't know exactly what you want to achieve, but take them into consideration. Good luck. Cool game BTW.
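If you go with the second approach, the front-end side could look roughly like this; the endpoint URL and the shape of a tracked action are assumptions for the example, not something from the question:

```typescript
// Sketch of the second approach: build a CSV string from the tracked in-game
// actions and POST it to an API endpoint (the URL below is hypothetical).
interface GameAction {
  timestamp: number;
  player: string;
  action: string;
}

function toCsv(actions: GameAction[]): string {
  const header = 'timestamp,player,action';
  const rows = actions.map(a => `${a.timestamp},${a.player},${a.action}`);
  return [header, ...rows].join('\n');
}

async function submitResults(actions: GameAction[]): Promise<void> {
  await fetch('https://your-api.example.com/results', {
    method: 'POST',
    headers: { 'Content-Type': 'text/csv' },
    body: toCsv(actions),
  });
}
```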

Is there a way to avoid SignalR when uploading files in Blazor Server-Side

I have a Blazor App (server-side) that is hosted in Azure as an App Service. I use Azure SignalR Service to be able to scale the number of users connected at the same time. I also have a page where you can upload files; the files might be large, even 1GB and more. I'm using Tewr.Blazor.FileReader (NuGet) to read the file in chunks and upload it to an API that saves the file to the server. But I was wondering if it's possible to avoid the SignalR part in some way when uploading the file, because for a 2GB file, Azure SignalR breaks each message into 2KB chunks, which means I hit the limit of 1M messages/day/unit just for uploading one file.
Let me know all the ways I can approach this problem; currently I have a few solutions in mind:
Have separate SignalR App hosted as App Service instead of using Azure SignalR
iframe that page to ASP.NET Core App that only handles the transfer of the files. (Not sure if this will trigger SignalR traffic)
If you use standard DOM events (e.g. onchange=someJavaScriptFunction) then the browser will run JavaScript in response to that event.
If you set the appropriate JS script on your UI elements then you can have the JS make an HTTP request to your API. This will avoid SignalR completely.
You could upload the file to a Web API controller. You can even add the controller in the same project.
I think this approach is used by the DevExpress Blazor FileUpload.
But it should be easy to implement it yourself. You need some javascript that POSTs the file content to your Web API controller. This blog entry could help you.
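As a rough sketch of that JavaScript, assuming a hypothetical /api/upload route on the same host:

```typescript
// Minimal sketch: send the selected file straight to a Web API controller
// with fetch + FormData, bypassing the Blazor SignalR circuit entirely.
// The '/api/upload' route is an assumption, not taken from the question.
async function uploadFile(input: HTMLInputElement): Promise<void> {
  const file = input.files?.[0];
  if (!file) return;

  const form = new FormData();
  form.append('file', file, file.name);

  const response = await fetch('/api/upload', { method: 'POST', body: form });
  if (!response.ok) {
    throw new Error(`Upload failed: ${response.status}`);
  }
}
```

For multi-gigabyte files you would likely still want to upload in chunks, but those chunks then travel over plain HTTP requests instead of counting against the SignalR message quota.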

Javascript frontend configuration at runtime

I have a vue.js SPA which I want to deploy on different servers without recompiling the frontend for each deploy. The SPA connects to a backend, with a url yet unknown to the spa. Is there a way I can dynamically tell the frontend at runtime where the backend is?
Many articles and forum threads suggest to just use different config files for different environments and build them in at build time, but that is not an option for me because I simply don't know where the frontend/backend will be deployed when building it.
EDIT: The project is an open source project, so I can't really make an assumption about how people will deploy it. I've always kind of "assumed" it would be deployed on a separate sub domain, with the frontend being reachable at / and the backend with a proxy at /api, because that's how I set up my server.
However, I've seen people deploying the api at a totally different sub domain (sometimes even with different ports) than the frontend or on a sub path or even a mixture between the two.
Things I've considered so far:
Putting the config in a conf.js which would then expose the backend url via window.config.backendUrl or similar and load that file in my index.html from a <script> tag
Slightly similar to 1.: Put the config in a config.json and making a fetch request to it once the application loaded, then exposing the result of it in window.config.backendUrl
Inserting a <script>window.config.backendUrl = 'http://api.example.com'</script> in my index.html.
Serving the frontend with a custom made web server (based on express or similar) which parses either env or a different config file and then creates the <script> tag from 3. dynamically
Always "assuming" where the backend will be with some kind of list to work up, like "First look at /api then at ./api then at api.current-host.com etc."
Bundling the frontend with the backend - that way I would always "know" where the backend relative to the frontend is.
Yet all of these options seem a bit hacky to me; I think there has to be a better way.
My favourite is the third option because it is the best trade off between configurability and performance IMHO.
If I was in the same situation I would have considered the following 2 approaches:
Deploying a JSON file with the same name but with different contents (sketched below) - the frontend can always fetch its configuration by making an AJAX call to /config.json, but the file's contents depend on where you deploy and are generated during the deployment step
Using a single API endpoint with a fixed/constant URL (completely separate from your backends) - so that frontend always calls this API endpoint to get its configuration at startup and the actual URL of the corresponding backend for its further operation.
Basically (2) is just the dynamic version of the static configuration in (1).
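A minimal sketch of option 1, assuming Vue 3 and a config.json containing a single backendUrl field (both assumptions for the example):

```typescript
// Sketch: fetch /config.json before mounting the app, so the backend URL can
// differ per deployment without rebuilding the bundle. Config shape and file
// name are assumptions.
import { createApp } from 'vue';
import App from './App.vue';

interface RuntimeConfig {
  backendUrl: string;
}

async function bootstrap(): Promise<void> {
  const response = await fetch('/config.json', { cache: 'no-store' });
  const config: RuntimeConfig = await response.json();

  const app = createApp(App);
  // Make the config available to every component via injection.
  app.provide('runtimeConfig', config);
  app.mount('#app');
}

bootstrap();
```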

Image upload to Heroku App

I am currently investigating a way to upload images to a Heroku repo where I have a Python application that takes in the images, has them classified, and saves the results in a .csv file.
The Images can be selected for upload via a website that uses Javascript and HTML.
My Question now is, how would I best enable the upload from the website to the Heroku App?
Bearing in mind that the Frontend is currently running on my local machine and that I want to use Heroku as a Backend to take in either Images or Strings.
Will I need an SSH-connection to a separate Web server? Will I need to use Amazon S3?
Not looking for a complete Solution to my problem per se, but if someone could point me in the right direction as to what I will need to solve my problem that would be great.
You could upload an image to Heroku; however, there are two problems with that:
The Heroku router times out requests after 30 seconds, which means that if your users have a spotty connection and/or huge files, the upload will fail.
Heroku's ephemeral filesystem means that you must process the file in your web process, because workers run on different dynos and don't have access to your web dyno's filesystem. So that's another strike against the 30-second timeout.
Your best bet is to have your users upload their files directly to S3 from their browsers. We had a good experience with the filestack.com JS widget, but there are other ways.
Your page will then ping your backend with this newly uploaded file's s3 url. The backend will launch an asynchronous job using Heroku worker to process it.
This neatly solves all issues with timeouts and blocking your web dynos.
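For illustration, a common shape of that flow, with all endpoint paths below being assumptions rather than something from the answer: ask the backend for a presigned S3 URL, PUT the file straight to S3, then notify the backend.

```typescript
// Sketch of the direct-to-S3 pattern: the backend hands out a short-lived
// presigned PUT URL, the browser uploads straight to S3, then tells the
// backend where the object landed. Endpoint paths are hypothetical.
async function uploadDirectToS3(file: File): Promise<void> {
  // 1. Ask the backend for a presigned URL (small, fast request).
  const presignResponse = await fetch(`/api/presign?filename=${encodeURIComponent(file.name)}`);
  const { uploadUrl, objectUrl } = await presignResponse.json();

  // 2. Upload the file bytes directly to S3; Heroku never sees them.
  await fetch(uploadUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file,
  });

  // 3. Tell the backend the file is ready so a worker can classify it.
  await fetch('/api/images', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: objectUrl }),
  });
}
```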

Pure Frontend CMS

I'm currently redeveloping/designing a department website at my university, but the internal organization that handles the servers/current CMS system is being ridiculously uncooperative. I can't get access to the templates of the current CMS and I can't develop my own templates for the system, so I'm trying to move away from it (eZ Publish). But I also can't get admin access to a server, so I'm unable to install PHP/MySQL to get WordPress up and running.
Basically, all I have access to right now is a public/ folder. I'm considering writing a pure frontend app with backbone or something, but my boss wants the option of the dept heads to edit information. I want to avoid rolling together my own custom CMS if possible, so I was wondering if anyone knows of a pure front-end CMS manager that doesn't require a server language and server database.
That's a hard one. First, you need a way to persist the data, and that's not possible from the front end alone. If the server exposes a RESTful interface it may be possible, but otherwise I don't think this is an option, because no front-end CMS will be able to communicate with the existing server-side CMS.
Host a WordPress somewhere else and install the JSON API plugin. Enable CORS on the remote server. Build your frontend with static files that interact with the service (see the sketch below). Host them in that public/ directory.
The admin/post-editing tasks will have to be done on the backend host, but hopefully you'll be able to get a subdomain from your university to point there.
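For illustration, the static front-end side could fetch content roughly like this. The example assumes the standard WordPress REST API route (the older JSON API plugin mentioned above exposes different paths), and the host name is made up:

```typescript
// Sketch: pull posts from a remotely hosted WordPress over its JSON API and
// render them client-side. Assumes the /wp-json/wp/v2 REST API routes.
interface WpPost {
  title: { rendered: string };
  content: { rendered: string };
}

async function loadPosts(wordpressBase: string): Promise<WpPost[]> {
  const response = await fetch(`${wordpressBase}/wp-json/wp/v2/posts`);
  if (!response.ok) {
    throw new Error(`WordPress API error: ${response.status}`);
  }
  return response.json();
}

// Usage (hypothetical WordPress host):
loadPosts('https://blog.example.edu').then(posts => {
  for (const post of posts) {
    console.log(post.title.rendered);
  }
});
```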
Another approach to this problem is static website generators.
You can host the content and data for those in a university-internal or other code repository and push the generated content to the site.
To allow your department heads to edit information, find a site generator that lets you separate structure and layout from content, so that they can edit simple text files to add content, hosted on shared storage they can all access.
