Force the browser to download videos from third party website - javascript

I have this problem. I have a website that contains URLs to videos hosted on other websites (not on my servers). What I want is that when the user clicks any of these links, the video on the remote website is downloaded. What happens now is that when the user clicks a link, the video opens and plays in the browser instead of being downloaded. I wrote code in ASP.NET and C# that can force the browser to download the video, but the problem is that my server then has to handle the download: the data passes through my server to the client's browser instead of going from the third-party site to the client directly.
This has two problems:
it wastes my server's resources, because all the data has to be processed by the server
it increases my server's inbound and outbound bandwidth, and since the files are all videos this would be very costly.
What I want to know now: is there a way to force the browser to download the file directly from the third party, without passing the data through my server, using JavaScript, jQuery, or any other client-side technique?

The HTML5 download attribute could help here. I haven't tested it personally, but this blog post says that adding download should force a download on the browser side.
Example code (the URL here is only a placeholder):

<a href="http://example.com/videos/video.mp4" download>Download</a>
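If you would rather trigger it from script (the question mentions JavaScript/jQuery), a minimal sketch is below; the URL and filename are placeholders. One caveat worth knowing: current browsers generally honour the download attribute only for same-origin, blob: or data: URLs, so a file hosted on a third-party site may still open inline unless that site serves it with suitable headers.

// Minimal sketch: trigger the download from JavaScript, e.g. in a click handler.
// The URL and filename passed in are placeholders, not values from the question.
function forceDownload(videoUrl, filename) {
  var a = document.createElement('a');
  a.href = videoUrl;                     // e.g. 'http://example.com/videos/clip.mp4'
  a.setAttribute('download', filename);  // hint the browser to save instead of play
  document.body.appendChild(a);
  a.click();                             // no data passes through your own server
  a.remove();
}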

Related

Is it possible to programmatically cancel/stop a started download from JavaScript?

Our application implements a custom download manager.
So when a user starts downloading a resource, he/she has the option to cancel it through a UI button in the app, which sends an event to the server to cancel the download (the connection is closed, etc.).
But Chrome recently added auto-resume of failed downloads, so the same request gets resumed by the Chrome download manager: Chrome cannot tell that the user/server intentionally cancelled the HTTP connection, so it simply retries it.
Currently we start downloads using an IFrame, but once the browser has begun the download, changing the IFrame's src attribute has no effect. I've tried using <a href='...' download> instead, but the issue is the same.
Maybe there's some API at least for Chrome? (There's actually such for Chrome extensions but this is not the case)
You cannot control downloads that are managed by the browser's download manager.
If you want to control the download, you need to fetch the file in JavaScript yourself, e.g. by using the Streams API.
Once you have downloaded the data, you can hand the resulting file to the browser's download manager.
Each of those tasks can be solved in different ways.
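As a rough illustration of that approach (not a drop-in replacement for your download manager), a cancellable fetch could look roughly like this; the #cancel-button selector and filename are assumptions:

// Sketch: download via fetch + the Streams API so cancellation stays in our hands,
// then hand the finished Blob to the browser's download manager.
const controller = new AbortController();

async function controlledDownload(url, filename) {
  const response = await fetch(url, { signal: controller.signal });
  const reader = response.body.getReader();   // Streams API: read chunk by chunk
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();  // rejects with AbortError on cancel
    if (done) break;
    chunks.push(value);                           // progress reporting could go here
  }
  const blobUrl = URL.createObjectURL(new Blob(chunks));
  const a = document.createElement('a');
  a.href = blobUrl;
  a.download = filename;    // the download manager only ever sees a finished local blob
  a.click();
  URL.revokeObjectURL(blobUrl);
}

// The cancel button aborts the in-flight request; there is nothing for Chrome to auto-resume.
document.querySelector('#cancel-button').addEventListener('click', () => controller.abort());

The trade-off is that the whole file sits in memory before the browser's download manager ever sees it.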
Could you do something like adding a unique key to the URL for each download request? When the user cancels the download, you invalidate the key so that the URL cannot be used again to download that resource, which also prevents Chrome from being able to resume the download.
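A sketch of how that could look from the client side, assuming hypothetical /download-token and /download-cancel endpoints on your server:

// Hypothetical endpoints for illustration only.
async function startDownload(resourceId) {
  // The server issues a one-time key bound to this request.
  const { url, key } = await (await fetch('/download-token?resource=' + resourceId)).json();
  const frame = document.createElement('iframe');
  frame.style.display = 'none';
  frame.src = url;                  // e.g. https://example.com/file.bin?key=...
  document.body.appendChild(frame);
  return key;
}

async function cancelDownload(key) {
  // The server invalidates the key, so Chrome's auto-resume gets a 403 instead of more bytes.
  await fetch('/download-cancel', { method: 'POST', body: JSON.stringify({ key }) });
}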

Best practices for loading videos on a web page?

I'd like to understand how AirBnb is able to load a 20MB background video file so fast on their homepage. After inspecting their homepage on WebPageTest, I noticed that the video did not show up in any of the downloaded resources, which helps it score so high. When I tried this tactic, loading the video asynchronously via AJAX, the video still shows up on WebPageTest as a downloaded resource, just after the DOM loads. So I'm really not sure how AirBnb makes this work. Does anyone have an idea?
AirBnb isn't doing anything special here. They're simply starting playback using progressive download, which means playback starts while the video is still downloading.
On their CDN, they have uploaded some fairly large MP4 files with two important characteristics:
The indexing information (MOOV atom) has been moved to the beginning of the MP4 file
The video is encoded in a format and codec that your browser supports
Because of these characteristics, all the site has to do is tell your browser to begin playing the source URL, and it will do the right thing: it makes a web request to the CDN and begins downloading the file. As soon as enough data has been transferred to start playback, it does so.
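In markup terms that is little more than a plain video element pointing at a fast-start MP4 (the MOOV-atom relocation in point 1 is what encoders commonly call "fast start"). A minimal sketch, with a placeholder CDN URL:

<!-- Minimal sketch of a progressive-download background video; the URL is a placeholder. -->
<video autoplay muted loop playsinline preload="auto">
  <source src="https://cdn.example.com/homepage-background.mp4" type="video/mp4">
</video>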
Finally, I can't say for sure why WebPageTest doesn't show you the video MP4s that are driving the video, but they are certainly there, and the URLs look like https://a0.muscache.com/airbnb/static/Xxxxx-X1-1.mp4. I suspect they're looking at your User Agent to decide which file to send you, and are not sending any video at all to bots like Google and WebPageTest.
You're not getting the real story through WebPageTest. Instead of relying on a third party to evaluate the page in their environment, you should watch the traffic you are actually being sent using Fiddler or the Network tab on Chrome Developer Tools.

Amazon CloudFront - Restrict MP3 to play on specific website

I've been trying to figure out how to get MP3 files in an Amazon S3 bucket, paired with CloudFront, to stream directly on my site without letting anyone grab the source URL of the MP3s by viewing the page source and then sharing or leeching the link.
Right now I am using an HTML5 MP3 playlist from mediaelement.js, and the MP3 file is always visible in the source code. That's fine, but I only want the MP3 to play on my specific website; if the link is copied from the source and opened in a different browser, it should show an access-restricted error.
I tried updating the CloudFront policy to expire 30 seconds after the page load, but that ultimately prevents the files from playing once the 30 seconds are up if the user didn't start one of the tracks before the expiration.
Is there another way to do this without putting a time expiration on the cloudfront links?
I think this is what you are looking for: http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-signed-urls-overview.html
Basically, you can vend URLs dynamically from your service, and CloudFront will validate the signature. You can also set a fairly short expiration time to avoid wide distribution of your URL, and restrict which IP addresses may access it (see the Custom Policies section in the referenced document).
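For example, a minimal Node sketch of vending a signed URL, assuming the @aws-sdk/cloudfront-signer package (the distribution domain, key-pair ID, and key path are placeholders; check the package's current API):

// Sketch only: vend a short-lived CloudFront signed URL from your back-end.
const { getSignedUrl } = require('@aws-sdk/cloudfront-signer');   // assumed package
const fs = require('fs');

function signedMp3Url(track) {
  return getSignedUrl({
    url: `https://dxxxxxxxxxxxx.cloudfront.net/audio/${track}.mp3`,  // placeholder distribution
    keyPairId: 'KXXXXXXXXXXXXX',                                     // placeholder key-pair ID
    privateKey: fs.readFileSync('/path/to/private_key.pem', 'utf8'),
    dateLessThan: new Date(Date.now() + 5 * 60 * 1000).toISOString() // expires in ~5 minutes
  });
}

You would hand this URL to the player instead of the raw CloudFront link, so the URL visible in the page source stops working on its own within minutes.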
It's also possible with just a few lines of back-end code (a private method). I prefer to use a free-tier EC2 instance and configure the environment to handle streamable content, delivering everything that way; it lets you serve a restricted page to anyone who tries to leech your MP3 files or download them with a tool like IDM.
Example: Grooveshark.com
However, there are still other methods, like Owain's answer.
Unfortunately, you can't. MediaElements.js may be hosted on your site, but it's being run on the user's computer. So although it looks like they are playing an MP3 via your site, they are actually just downloading a URL from your site and playing it using code running on their computer.
You could write server-side code that goes off to S3, retrieves the MP3, and returns it as if it were a file hosted on your own server, but that still wouldn't stop people from copying the link, unless some sort of session check were performed before returning the file to ensure they're logged in to your site.
But that would mean you can't make use of CloudFront. That's the compromise: distribute your MP3 via a CDN and improve download performance by serving the file from an edge location closer to your users, or rely on server-side security to ensure your intellectual property isn't rehosted by unscrupulous third parties.
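If you did go the proxy route from the previous paragraph, a rough sketch might look like this; Express, a session middleware, @aws-sdk/client-s3, and the bucket and route names are all assumptions:

// Sketch: stream the MP3 from S3 only for logged-in sessions; this bypasses CloudFront entirely.
const express = require('express');
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');

const app = express();
const s3 = new S3Client({ region: 'us-east-1' });             // placeholder region

app.get('/stream/:track', async (req, res) => {
  if (!req.session || !req.session.userId) {                  // assumes some session middleware
    return res.status(403).send('Access restricted');
  }
  const object = await s3.send(new GetObjectCommand({
    Bucket: 'my-audio-bucket',                                 // placeholder bucket
    Key: `mp3/${req.params.track}.mp3`
  }));
  res.set('Content-Type', 'audio/mpeg');
  object.Body.pipe(res);                                       // Node readable stream from the SDK
});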

Upload image by URL in single-page application with Canvas and File API

We have a single-page application (Rails back-end, Ember.js front-end) where we're currently moving away from a server-side image uploader to a client-side image uploader. We previously used the Carrierwave gem to do the resizing and upload to S3 on the server. Now we want to do the resizing (using HTML5 Canvas and the File API) and upload to S3 directly on the client.
This works well when the user selects an image from his computer. It's definitely way faster for the user and puts less load on the server.
However, our users have gotten used to the "Upload by URL" functionality that we offered. It works like the "Search by image" functionality on Google Image Search. Instead of selecting a file from his computer, the user pastes a URL to an image.
Because of the same-origin policy, we cannot use Canvas API on an external image (it becomes tainted, see https://developer.mozilla.org/en-US/docs/Web/API/Canvas_API/Tutorial/Using_images#Using_images_from_other_domains).
I see two possible solutions:
In order to be able to upload to S3 directly from the client, we need to generate a pre-signed key on the server. We could pass the URL to the image in that same request, download it on the server while we generate the pre-signed key, and put the image as base64 payload in the response.
Use a proxy on our domain and use it to bypass the SOP. So access the image on the client as https://www.mydomain.com/?link=http://www.link.to/image/selected/by/user.jpg.
My questions are:
Do you know any other way to bypass the same-origin policy to provide an "Upload by URL" functionality?
Which solution do you think is best?
How hard is it to set up 2)? I have no experience with setting up proxies. FWIW, we host our application on Heroku.
I hope the situation I described is clear enough.
Thank you!
Yoran
Yes, you could force your clients to download the other-domain image to their local drive and then upload that local copy themselves.
"Best" is subjective and relative to your configuration. The traditional workaround is your option #2: bounce the image off your server. Really, all you're doing is having your server download the image and re-serve it to the client. If you're expecting a huge volume of images, forcing clients to download their own images might be better than gumming up your server with "cleaning" their images.
How hard is it to set up? Fairly easy... after all, you're just having some server code pull a remote image and save it to a specified server directory (a rough sketch follows the list below). The only modestly hard parts are:
Make sure the server never interprets one of those client-supplied URLs as something executable (viruses!)
Clear the new directory often so the server is not weighed down with images fetched for clients
Set limits on the size and quantity of images a client can make your server fetch (denial-of-service attack!).
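Under those constraints, a rough sketch of the option #2 proxy (Node/Express and global fetch from Node 18+ are assumed; the route name and 10 MB limit are arbitrary):

// Sketch: same-origin image proxy so the client-side Canvas is not tainted.
const express = require('express');
const app = express();

app.get('/image-proxy', async (req, res) => {
  const target = req.query.link;
  // Point 1: never treat the client-supplied URL as anything but a plain http(s) fetch.
  if (!/^https?:\/\//i.test(target || '')) return res.status(400).send('Bad URL');

  const upstream = await fetch(target);
  const type = upstream.headers.get('content-type') || '';
  const size = Number(upstream.headers.get('content-length') || 0);
  // Point 3: reject non-images and oversized files.
  if (!type.startsWith('image/') || size > 10 * 1024 * 1024) {
    return res.status(415).send('Not an acceptable image');
  }
  res.set('Content-Type', type);
  // Served from your own domain, so the Canvas can read the pixels without tainting.
  res.send(Buffer.from(await upstream.arrayBuffer()));
});

app.listen(3000);

This version re-serves the bytes directly instead of saving them to a directory, which sidesteps point 2 at the cost of holding each image in memory briefly.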

Download multiple files from remote server with a single user confirmation

I have a web page containing a list of picture URLs (it can be more than 1,000 items), and I want a button the user can click to download all of the files to the local hard drive.
The download process should ask the user for a directory to save the files in and then download all files to that directory (if possible, creating subdirectories inside it). This should happen with a single user confirmation for the whole download process, avoiding the browser's save dialog for each file.
Is there a way to do that? I am aware that I can't use the standard HTTP download mechanism for this and will have to write some kind of control to do the job. The page is written in ASP.NET.
Downloading the files to the server, packing them, and sending them to the user is not possible. The download has to originate from the client machine.
You should update your question to include the requirements from your comment, because they make a huge difference. If the server cannot retrieve the files because it doesn't have the right permissions, your only option is to run the code on the client side. There are several options for doing this, mostly depending on your clients and your coding skills:
Flash (Not sure about the security aspect of writing to the local file system, though)
Java Webstart (Disadvantage: Clients need to have the Java runtime installed)
Browser plugin/extension (Disadvantage: You can only support a subset of browsers and the page will not be immediately usable, as the plugin or extension needs to be installed first)
In all cases, you will have to write a tool that retrieves the URL list from your server and downloads the files.
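One more recent option, not among the three listed above: Chromium-based browsers expose the File System Access API, which matches the "single confirmation" requirement fairly closely. A minimal, hedged sketch (the remote hosts must allow CORS for the fetches to succeed):

// Sketch: one directory prompt, then every file is written silently into it.
// Requires a Chromium-based browser and CORS-enabled image hosts.
async function downloadAll(urls) {
  const dir = await window.showDirectoryPicker();          // the single user confirmation
  for (const url of urls) {
    const name = new URL(url).pathname.split('/').pop() || 'file';
    const response = await fetch(url);
    const handle = await dir.getFileHandle(name, { create: true });
    const writable = await handle.createWritable();
    await response.body.pipeTo(writable);                  // streams to disk and closes the file
  }
}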
