How to pipe a file upload without dependencies like Multer?

I'm using Express and I would like to pipe a POST request coming from the client directly to a writable stream to Google Storage, in order to upload the file without storing it in memory or on disk first. I did some research and found various examples, which seem to work. However, all the solutions I found use dependencies like multer or busboy. For learning purposes and to avoid unnecessary overhead, I'm interested in how to achieve this without those kinds of dependencies. I've been searching for a while but unfortunately couldn't find any examples. Thanks for your time.

Related

Automatically create a CSV file with Javascript data on Github

I'm working on a quiz with HTML/JS hosted on GitHub, which will be dedicated to my classmates.
I would like to be able to read everyone's answers, so I thought about creating a text or CSV file with their answers that would be saved in a specific directory of the GitHub project.
But I'm a beginner and I don't know if that's possible. I've seen approaches that use PHP or Node.js with FileSaver.js, but I haven't managed to get any of them working, because I would like it to be automatic rather than asking the user to download their answers.
If someone knows how to do this, or can explain why it's impossible and what to do instead, that would be great.
Thanks! ;)
Unless you want to make every person using the quiz a contributor to your GitHub project (which would require that they sign up for GitHub accounts and tell you their account names so you can manually grant them permission), and then use the API to read the CSV file, modify it, and commit the change (resolving any merge conflicts caused by race conditions), this is not possible. And even if you were willing to do all that, it would be among the most complex approaches you could take.
If you want to store and aggregate data submitted by visitors to a website, then write some server-side code (using whatever language and frameworks you like; PHP and Node.js are both options) and use a web hosting service designed to support it. GitHub Pages is designed only for static pages and doesn't support any form of server-side programming.
Once you store the data in a file, just use git commands to commit and push it.

How to Upload Large files in Javascript in a Single Connection?

I am trying to write some HTML/JS code which will facilitate uploading large files (multi-GB) to a remote server. Previously we had been using a Flash uploader which uploaded a given file in a single network request. The Flash would open a network connection, read a chunk of the file into memory, write that chunk to the network connection, grab the next chunk, write it to the network, and so on until the entire file was uploaded. It was done this way because most web browsers will attempt to read an entire file into memory before attempting to upload it. When dealing with multi-GB files, this essentially crashes the client system because it uses all of the client's memory.

Now we are having issues with Flash, so it needs to go, and we want to replace it without modifying the existing server-side code.
A few google searches for jquery uploaders reveals that there are plenty of libraries which support "chunking" but they "chunk" over multiple requests. We do not want to chunk a file over multiple network requests, we merely want the JS to read the file in chunks as it writes the file to a single network connection.
Anybody know a library which can do this out of the box?
We are not opposed to modifying an existing library if need be. Anyone have a snippet that resembles the below pseudo-code that I may be able to retrofit into a library?
connection = fopen(...);
fputs("123", connection);
... some unrelated code ...
fputs("456", connection);
fclose(connection);
(excuse my use of C functions in pseudo JS code ... I know that is not how you do it in JS, I am merely demonstrating at a low-level the flow for how I want to write to the network connection before closing it)
NOTE: We are not trying to "modernize" or extensively improve this project -- we are not trying to redo it. We have some old code that has sat here for years, and we want to make as few changes to the server-side code as possible. I have more important projects to modernize and make more efficient -- this one just needs to work. Please don't advise me to implement "proper" file chunking on the server side -- that was my suggestion, and if it had been taken, the task would have been assigned to a different developer. It's out of my control now; this is a client-side-only fix, please!
Thanks, sorry for any headache!
You could try binaryjs. I haven't looked into the internals but I know it supports manually setting the chunk size. Maybe you can even set it to Infinity.
Specifically you could try:
var client = new BinaryClient('example.com', { chunkSize: Number.POSITIVE_INFINITY });
client.send('data...');
Note: binaryjs is a NodeJS server library, and a browser-compatible client library.
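For what it's worth, in current browsers the constraint described in the question -- a single connection, with the file read in chunks on the client -- is how XMLHttpRequest already behaves when you hand it a File object: the browser streams the file from disk rather than loading it all into memory. A sketch (the endpoint URL and input selector are made up):

```javascript
// A File passed to XMLHttpRequest.send is streamed from disk by the
// browser, so the whole file is never read into memory at once.
const input = document.querySelector('input[type="file"]');
input.addEventListener('change', () => {
  const file = input.files[0];
  const xhr = new XMLHttpRequest();
  xhr.open('POST', '/upload');
  xhr.upload.onprogress = (e) =>
    console.log(`sent ${e.loaded} of ${e.total} bytes`);
  xhr.onload = () => console.log('upload finished with status', xhr.status);
  xhr.send(file); // one request, streamed in chunks by the browser
});
```

If the existing server side expects a `multipart/form-data` body (as Flash uploaders' backends often do), wrap the file in a FormData instead (`const fd = new FormData(); fd.append('file', file); xhr.send(fd);`) -- still a single request.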

Saving a file from a chrome packaged app

I have a Chrome packaged app that has taken me a while to get my head around but I finally have it working. However I have now come across another problem.
Is it possible to save a variable from my app into a text file that's placed in my app/file directory?
I have looked over the chrome.fileSystem api but I don't really understand it.
I could be completely wrong and maybe you can't save files to the file directory?
Any examples or tutorials on this would be great!
Thanks!
I would suggest storing the file's contents in chrome.storage and then dynamically loading and saving the contents of your css file from there instead of using the filesystem. To me, this would be much easier to accomplish.
chrome.storage sounds like a good fit all right (though I'm not too sure about your specific needs in accessing the CSS outside the app). Its main use case is saving state and storing configuration settings/variables.
We have a code lab that walks through all kinds of Chrome apps stuff, including chrome.storage, here: http://developer.chrome.com/trunk/apps/app_codelab5_data.html
There's also the new syncFileSystem API, though it really doesn't seem to fit your needs. It provides document/file-level synchronization: getting access to a user's files on their hard drive, saving them in the app, and syncing them.
But sure, if you are curious, here are the docs on working with the Chrome File System and sync File System APIs: http://developer.chrome.com/trunk/apps/app_storage.html
We're also working on a doc that explains the key concepts around storing data in the cloud (specifically for packaged apps). Hoping to have an early cut in trunk over the next week or so.
Keep us posted on how you get on!
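For reference, saving and restoring a value with chrome.storage.local looks roughly like this (the key name and value are made up):

```javascript
// Save a value under a key, then read it back. Both set and get take
// callbacks; 'cssContents' is an illustrative key.
chrome.storage.local.set({ cssContents: 'body { color: red; }' }, function () {
  chrome.storage.local.get('cssContents', function (items) {
    console.log(items.cssContents);
  });
});
```

This only runs inside a Chrome packaged app (the `chrome.storage` API is not available to ordinary web pages).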

Send Information inside a .png image then extract it through javascript?

After searching around in Google for a while I have not had any luck or guidance in my question.
I want to load a website using JavaScript/Ajax in order to reduce the number of requests the client needs to make to the server. My goal is to embed/encode data within an image so that the client only needs to request that one image through an Ajax call; it would then be decoded to recover the JS, CSS, and other files needed, which would be inserted into the DOM.
If I can get the above to work then I could have a lot of flexibility on how my webapp is loaded and be able to notify the user how close the webapp is to being ready for viewing.
Currently my problem is that I cannot find how I would encode the data within an image.
Even if this is not the way to go about serving up a webapp, my curiosity is getting the best of me and I would just really like to do this.
Any guidance or pointers would be greatly appreciated!
Also: I am learning Python, so if you know of a Python module that I could play with, that would be cool. Currently I'm playing with the pypng module to see if this could be done.
To be frank. Don't do that.
The brightest minds on earth use other methods to keep the number of requests and response times down. The most common technique for minimizing the number of requests is called bundling. In short, you copy-and-paste all the JS files after each other into one big JS file, and all the CSS files into one big CSS file. This way you need to download only two files, one JS and one CSS. Doing better than that is usually not worth the trouble.
To further keep response times down, you usually minify your JS and CSS files. This is a process where all whitespace, comments, etc. are removed and internal variable names are made as short as possible.
Finally, you can serve both the JS and CSS files gzipped to further reduce the transfer size.
There are many tools out there that do both bundling and minification for you. Google and pick one that suits your other tooling.

Image sharing on deployd

I've recently begun experimenting with Deployd. It is (kind of) similar to Meteor.
This may be an amateurish question, but what happens if my collection consists of images?
How will I upload them to MongoDB via the Deployd dashboard?
I created a module for deployd to upload files (images included).
https://github.com/NicolasRitouet/dpd-fileupload
It lets you store the files in a local folder and in a collection to query them.
The only real way to use the Collection resource type to do this right now would be to base64-encode the image and store it as a string property. There are some limitations and performance issues with base64 images, though. Alternatively, @dallonf has created an Amazon S3 resource to make it easy to integrate Deployd apps with S3: http://docs.deployd.com/docs/using-modules/official/s3.md
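The base64 workaround looks roughly like this in Node.js (the four bytes stand in for real image data, and the document shape is illustrative):

```javascript
// Encode image bytes as a base64 string so they can be stored as a
// plain string property in a collection document.
const imageBytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // stand-in for real PNG data

const doc = {
  name: 'photo.png',
  contentType: 'image/png',
  data: imageBytes.toString('base64'),
};

// In the browser, the stored string can be displayed directly:
// img.src = 'data:' + doc.contentType + ';base64,' + doc.data;
console.log(doc.data);
```

Note that base64 inflates the payload by about a third, which is one of the performance issues mentioned above.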
There have been a lot of requests for storing binary files in collections, and hopefully someone (core committer or otherwise) can work on this after the forthcoming deployd release which includes significant improvements to the module API. This Github issue is worth watching: https://github.com/deployd/deployd/issues/106
