Disclaimer: I haven't done web programming for ages and am not even sure what or where to search.
Intro
Everyone's familiar with the concept of downloading files from websites: you click a link on a webpage, the server receives the request containing the URL, and it responds with the file data, appropriately packaged with the content type indicated and so on.
Problem
Now, I'd like the same experience, except the data is generated entirely on the client side, without any requests going back to the server. I know I can generate all the data on the client and even dynamically change the viewed page using the DOM. But I'm not sure about embedding this data in the page in a downloadable way, whether it's possible at all, and how to do it. Is it possible, e.g., in HTML+JavaScript? If it is, will it work in the major browsers such as IE, FF, and Chrome? Will it need HTML5? Or am I doomed to serving the data from the server or using other technologies (maybe Flash)?
You can base64-encode the content into the href attribute of an anchor (a) tag. See:
http://webreflection.blogspot.com/2011/08/html5-how-to-create-downloads-on-fly.html
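For illustration, here is a minimal sketch of that idea in plain JavaScript; the function name and link text are my own, and note that btoa() only handles ASCII/Latin-1 content directly:

function offerDownload(text) {
  // Build an anchor whose href embeds the generated data as base64.
  var a = document.createElement('a');
  a.href = 'data:application/octet-stream;base64,' + btoa(text);
  a.textContent = 'Download generated data';
  document.body.appendChild(a);
}

offerDownload('hello, client-side world');

Clicking the resulting link downloads the data without any server round trip, though the browser picks the filename.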
Related
I want to show a website preview on a link, similar to what Facebook does when a user posts a link. My question has been asked before in the following link, but I am going to ask for specific information about my solutions. I have two solutions for showing a webpage preview: 1. server-side HTML processing, and 2. client-side HTML processing.
1. Server-side HTML processing
I used System.Net.WebClient().DownloadString(url) to retrieve the web page data on the server side, and I tried to extract the most important information from the page. But in most cases the main part of the page is loaded by JavaScript, so I do not have access to that information.
Another server-side option is to work with the WebBrowser and WebDocument objects. Because I haven't worked with these libraries and don't know how much they would affect web server performance, I present this approach only for discussion. So: is there any server-side HTML grabber that fetches all HTML data, including HTML loaded by JavaScript?
2. Client-side HTML processing
The simplest client-side approach is to use the iframe tag, but it has the following two problems:
a. I cannot access the innerHTML of the frame for links on other domains.
b. I cannot load HTTPS webpages such as Dropbox and Facebook in the iframe because of an "X-Frame-Options" error.
My question is: is there any other client-side solution for retrieving dynamic HTML source (loaded by JavaScript) from third-party webpages (usually HTTPS)? Or can I solve the above problems with some trick?
I guess the server-side approach would be the most viable option. On the client side, you can use proxy services that work around the cross-domain limitation, for example crossorigin.
To generate a preview similar to the one Facebook provides, you need to get the Open Graph information for the target page. Libraries for processing Open Graph data are available for multiple platforms; OpenGraph-Net could be used on the .NET platform.
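If you end up with the page's HTML on the client (via a proxy, as above), a hedged sketch of extracting the Open Graph tags could look like this; DOMParser availability is assumed:

function extractOpenGraph(html) {
  // Parse the fetched HTML into a detached document.
  var doc = new DOMParser().parseFromString(html, 'text/html');
  var og = {};
  var metas = doc.querySelectorAll('meta[property^="og:"]');
  for (var i = 0; i < metas.length; i++) {
    og[metas[i].getAttribute('property')] = metas[i].getAttribute('content');
  }
  return og; // e.g. { "og:title": "...", "og:image": "..." }
}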
I have a web page with an swf file and an HTML button: when I click the button I want to save (download to my disk) the current image my swf file is showing (it is a sort of image gallery).
It works perfectly when the button is inside my swf, but it fails when, through ExternalInterface, I call the method that saves the image from JavaScript.
I verified the JS-AS communication (it's ok) and I know that FileReference.save() only works when triggered by a user event. Probably, the click on an HTML button is not considered a user event.
Aside from changing the whole approach (e.g., moving some code to the server side, sending the image to the server, then downloading it...), is there any way to simulate a user event? Any other solution or tip is appreciated.
NB: I would use a Flash button, but the HTML one is required.
Solution (or not as the case may be)
Flash based
Currently I would say your best bet is to stick with your button operating from within Flash. If you need to separate the button from your main Flash runtime, you could try doing what you are doing with two Flash embeds and communicate between them using LocalConnection. I wouldn't recommend this, however, as LocalConnection is a pain to get working well, and there is no guarantee that you won't run up against security sandbox problems working across two instances.
Server-side based
You could implement a save system that sends the image data back to a server and forms an actual URL that your front end could request. This would allow you to specify whatever you wanted for the download. The downsides are that it requires a server (so it won't work for offline apps), and it involves quite a lot of hassle, sending the image data one way only to pull it down again later...
I've gone into more detail about this here:
Canvas Image to an Image file
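A sketch of the client half of that round trip, assuming a hypothetical /save endpoint that accepts base64 image data and responds with a plain-text download URL:

function saveViaServer(base64Png) {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', '/save', true);
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.onload = function () {
    // Navigate to the URL the server created; the response headers there
    // can force a download with a proper filename.
    window.location = xhr.responseText;
  };
  xhr.send('image=' + encodeURIComponent(base64Png));
}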
HTML5 based
Currently I wouldn't recommend the data URL download I suggested in my comment, because it is not a complete solution yet. On the plus side, I'd keep an eye on what the top browsers are implementing, because that answer could change shortly.
Workings
Basically I just tried to implement an image download via a data URI (thinking this would be the best solution for your problem), which all works fine; plus, you could quite happily derive the Base64 data you need from your BitmapData object. However, the problem is that there is no way to specify a filename along with the download, so you end up with rather ugly filenames that don't even have the correct extension.
<a href="data:image/octet-stream;base64,...">Click to Download File</a>
After researching a bit, it seems there is no workable workaround for this, but there is a specification, ready to be implemented, that would help:
<a download="filename.png" href="data:image/octet-stream;...">Download File</a>
The download attribute is designed for precisely the problem I mention above and would allow naming the download. Unfortunately, I can't find a browser that implements it yet...
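A quick way to test for it as support arrives (a sketch; expect false in today's browsers, per the above):

// true once the browser exposes the download attribute on anchors
var supportsDownload = 'download' in document.createElement('a');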
References
About the download attribute of an a tag
More about the download attribute of an a tag
Stack Overflow: suggest a file name when using a data URI
Stack Overflow: force download an image using JavaScript
I have a web app (Sencha/PhoneGap) that includes a feature allowing users to click buttons that link to Wikipedia articles. This obviously works fine when the device has internet access, but I get numerous requests to make the app work offline too. To accomplish this, I'd like to give the user the option to download the linked articles/webpages for offline access. When the device does not have internet access, the app would instead display the saved version (which might be stale/out-of-date, but is better than nothing). What are possible ways to accomplish this task?
My first thought was to somehow use the HTML manifest to cache the pages in the phone's browser, which sounds possible in the Android browser, but iOS apparently has a 5 MB browser cache limit - too small.
My next thought was to save the needed HTML and associated files and bundle them up inside the app. But this seems a rather cumbersome approach: the app becomes much larger than it needs to be, and the webpages are stale as of the date the app was installed.
Using JavaScript, is it possible to download webpages, which I could then save (on the SD card, for example) for access later?
Or is there a more elegant approach?
If anyone could point me in the right direction it would be much appreciated.
In pure JavaScript you can make an Ajax request to download a page, then use FileWriter to write the responseText to a file on the file system. However, that won't help you when it comes to images; you'll need to use the FileTransfer.download() command to get the binary image files.
If I were you I'd (see the sketch after this list):
Use Ajax to download the HTML.
Parse the HTML looking for images.
Use FileTransfer.download to get the images.
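A rough sketch of those three steps, assuming the PhoneGap/Cordova FileTransfer plugin is available; the URL, the directory path, and DOMParser support in the webview are all assumptions:

function savePageForOffline(pageUrl, saveDir) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', pageUrl, true);
  xhr.onload = function () {
    var html = xhr.responseText;
    // Step 2: parse the HTML and collect image URLs.
    var doc = new DOMParser().parseFromString(html, 'text/html');
    var imgs = doc.getElementsByTagName('img');
    for (var i = 0; i < imgs.length; i++) {
      var src = imgs[i].src;
      var name = src.substring(src.lastIndexOf('/') + 1);
      // Step 3: FileTransfer.download fetches the binary image data.
      new FileTransfer().download(
        src,
        saveDir + name,
        function (entry) { console.log('saved ' + entry.fullPath); },
        function (err) { console.log('download failed', err); }
      );
    }
    // Persisting the HTML itself would use FileWriter, as noted above
    // (obtaining the FileEntry via requestFileSystem is omitted here).
  };
  xhr.send();
}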
I want to create a web crawler/spider that iteratively fetches all the links on a webpage, including JavaScript-based (Ajax) links, catalogs all of the objects on the page, and builds and maintains a site hierarchy. My questions are:
Which language/technology would be better (for fetching JavaScript-based links)?
Are there any open-source tools for this?
Thanks
Brajesh
You can automate the browser. For example, have a look at http://watir.com/
Fetching Ajax links is something that even the search giants haven't fully accomplished yet. That is because Ajax links are dynamic: the request and the response both vary greatly depending on the user's actions. That's probably why SEF-AJAX (Search Engine Friendly AJAX) is now being developed. It is a technique that makes a website completely indexable to search engines and, when visited by a web browser, acts as a web application. For reference, you may check this link: http://nixova.com
No offence, but I don't see any way of tracking Ajax links. That's where my knowledge ends. :)
You can do it with PHP, simple_html_dom, and Java. Let the PHP crawler copy the pages to your local machine or web server, open them with a Java application (a JPane or something), mark all the text as focused, and grab it. Send it to your database or wherever you want to store it. Track all a tags, or tags with an onclick or mouseover attribute. Check what happens when you call a link again: if the size or MD5 hash of the source HTML (the document returned from the server) is different, you know it's an effective link and can grab it.
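The hash-comparison idea could be sketched like this in Node.js (the answer proposes PHP/Java; this is only an illustration, with an in-memory object standing in for the database):

var crypto = require('crypto');
var seen = {};

function isEffectiveLink(url, html) {
  var hash = crypto.createHash('md5').update(html).digest('hex');
  var changed = (url in seen) && seen[url] !== hash;
  seen[url] = hash; // remember the latest version of this document
  return changed;   // true when the same URL produced different content
}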
I am taking a text file from the user, posting that file back to the browser using Ajax, storing the content in the DB, and then showing the content back on the user's page using the jQuery post response.
Now I want something like this:
Read the text file from the user's computer using JavaScript, display the content, and when the user submits the page, save the values.
This can't be done in pure JS for security reasons. You would need to have the user upload the file to your server and fetch the contents back through Ajax.
If you use Flash or Java, you should be able to gain direct access to the file. If you speak Flash/ActionScript, maybe SWFUpload's source code (especially the new client-side resizing functions) can serve as an inspiration.
Update: This blog entry should help. Read and write local files with Flash Player 10
Update: To elaborate on the "upload and fetch" approach: if you do the uploading in an iframe, you could even have the upload script simply output the text file's contents. Because the iframe belongs to your domain, you will be able to retrieve its contents via JavaScript when the upload has finished. As long as you send a content-type: application/text header, it should be fairly safe from malicious attacks.
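A sketch of the read-back step, where "uploadFrame" is a hypothetical same-domain iframe that the upload form targets, and the server is assumed to echo the file back as text:

function readUploadedText() {
  var frame = document.getElementById('uploadFrame');
  // Same origin, so the parent page may read the iframe's document.
  return frame.contentWindow.document.body.textContent;
}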
If you're OK with supporting only Firefox 3.6, check out https://developer.mozilla.org/en/Using_files_from_web_applications; otherwise you should use Flash, Java, or Silverlight for this.
You won't be able to read a file on the user's computer due to security restrictions.
Reading client files in JavaScript is possible with the new File API available in modern browsers. Check this site and its code: http://www.readfileonline.com/
However, before reading file contents in JavaScript, the user must explicitly select the files it is allowed to read. This is a security feature of the standard.
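A minimal File API sketch (the element ids are placeholders): the user picks a file through an input element, and FileReader reads it entirely on the client:

document.getElementById('picker').addEventListener('change', function (e) {
  var file = e.target.files[0]; // the file the user selected
  if (!file) return;
  var reader = new FileReader();
  reader.onload = function (evt) {
    // evt.target.result holds the file's text, ready to display or submit.
    document.getElementById('preview').textContent = evt.target.result;
  };
  reader.readAsText(file); // asynchronous read
});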