Canvas Image to an Image file - javascript

So I have a canvas on which the user signs. Now, instead of converting it to a base64 string, I simply want to save it as an image itself. What's the easiest way to do that in HTML5?

You can easily do that this way (specifying the format as png in this case):
var img = canvas.toDataURL("image/png");
You can specify different image formats.
Take a look at this answer.
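For completeness, toDataURL also takes an optional quality argument for lossy formats; a quick hedged example:
// toDataURL(type, quality): the quality value (0..1) only applies to lossy formats like image/jpeg.
// The 0.85 below is just an arbitrary example.
var pngData  = canvas.toDataURL("image/png");
var jpegData = canvas.toDataURL("image/jpeg", 0.85);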

I've answered a similar question here:
Simulating user event
Assuming you are saving locally
You can go the route of creating an image from a Data URL, but then saving it is the trickier part that currently isn't very nice using HTML5. It's hopefully going to get better soon, if browsers incorporate the download attribute of the a tag.
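For reference, a minimal sketch of that download-attribute approach in browsers that support it (the canvas id and filename here are made up for the example):
var canvas = document.getElementById("signature"); // "signature" is a made-up id
var link = document.createElement("a");
link.href = canvas.toDataURL("image/png");
link.download = "signature.png";   // suggested filename; only honoured by supporting browsers
document.body.appendChild(link);   // some browsers need the element in the DOM before click()
link.click();
document.body.removeChild(link);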
Obviously, if you have higher permissions than a standard webpage - e.g. you are designing a browser plugin - then there are other options...
If I were to implement something like this myself, at the moment, I would concede to using a Flash plugin to handle the save to the local computer.
Assuming you are saving remotely
By the sounds of it you aren't saving to a server, but if you were, this is quite easy: just POST the base64 information to a script written in a server-side scripting language (e.g. PHP) and get it to write the data directly as binary to a file. Obviously you have to make certain you do this securely; you don't want just any binary data written to your server's filesystem.
Best of both worlds
If you've got the development time, the best method to get a canvas image saved locally - without Flash - is to create a server-side script that, instead of saving the data to your server, writes the Base64 information you send it straight back as a realised binary image file. That way you can create a form that POSTs your Base64 data to a new tab; the request is evaluated by the server-side script, and the binary image is returned... at which point the browser asks the user where they wish to save the image.
You'll need to define the correct headers to force the image to download (rather than display in-browser). A simple way to do this is to set the server-side script's Content-Type header to 'application/octet-stream'; there are other headers worth setting too, best researched separately (e.g. Content-Disposition, which controls the suggested filename).
reflect.php
<?php
/// a simple example..
if ( array_key_exists('data', $_POST) && ($data = $_POST['data']) ) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="canvas.png"'); /// example filename - forces a download
    echo base64_decode( $data );
    exit;
}
and the form...
<form action="reflect.php" method="post" target="_blank">
<input name="data" type="hidden" value=" ... Base64 put here with JS ... ">
</form>
(The whole form should be created dynamically and submitted automatically with JavaScript)
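A rough sketch of doing that, assuming a canvas variable is already in scope and posting to the reflect.php script above:
var dataUrl = canvas.toDataURL("image/png");
var base64  = dataUrl.split(",")[1];   // strip the "data:image/png;base64," prefix
var form = document.createElement("form");
form.action = "reflect.php";
form.method = "post";
form.target = "_blank";
var input = document.createElement("input");
input.type = "hidden";
input.name = "data";
input.value = base64;
form.appendChild(input);
document.body.appendChild(form);       // the form must be in the document to submit
form.submit();
document.body.removeChild(form);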
Improving the user experience
There are ways to avoid a new tab being created, but you'd have to research them to make sure they don't cause cross-browser problems. For example, you could post your form data into a hidden iframe (which keeps the process out of sight), or post the data directly in the current window (...and hope that all the browsers receive the correct request and open a download rather than replacing your page content - most modern browsers should handle this).
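As an illustration of the hidden-iframe variation (the names here are illustrative only), the form from the earlier sketch could be pointed at a hidden iframe instead of _blank:
var iframe = document.createElement("iframe");
iframe.name = "download_frame";
iframe.style.display = "none";
document.body.appendChild(iframe);
form.target = "download_frame";   // instead of "_blank"; the download prompt should still appear
form.submit();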
Improving security
With regards to a PHP script that automatically returns binary data, you should keep access to this script secured by a one-time-use key / authentication token or something similar, and keep a limit on how much Base64 data you are willing to accept. It might not seem like it poses a security risk - as you are not modifying your server in any way with what the user sends - but the dodgy people of this world could take your script and use it to send download requests to other users... which, if downloaded (and it turned out to be an unwanted trojan or virus), would make your server complicit in providing the dodgy file.
All in all
Due to the effort required to get a simple thing like an image saved to the desktop, I wouldn't blame you for doing the following:
Embed the image in the page (after taking your snapshot from canvas) and ask the user to right click and Save as...
Hopefully future things will make this situation better...

Related

Store data from image in a cookie

I am trying to store an image in a cookie, but I don't know if this is the best way to keep a user's data when he refreshes the page.
Well, I am using the ngCookies module to do that. I receive an image from the server as a base64 string (the contentType and the data), and then I store it in the cookies:
$cookies.contentType = value.image.contentType;
$cookies.data = value.image.data;
To build my URL I do this:
vm.url = "data:"+vm.value.image.contentType+";base64,"+vm.value.image.data;
And I insert the URL in my page, using an img tag:
<img src="{{ctrl.url}}" style="width:200px;height:200px">
My problem is: when I refresh my page, $cookies.contentType remains, but value.data doesn't stay stored in $cookies.data anymore. I think this value is too big to store in a cookie. Am I using cookies correctly? Is there another way to do this?
I'd appreciate it if anyone can help me.
Using cookies for client side storage is generally considered to be less than ideal. Cookies are automatically sent back with every request (if it matches domain/path/security restrictions). Even if the cookie storage limit could handle this I can't imagine you would want to send this back with any future request. LocalStorage and similar technologies were developed in part to avoid this issue.
That said, the HTTP cookie spec originally stated that a cookie needed to accommodate at least 4096 bytes, which was then generally interpreted as a maximum size of approximately 4 KB. Each browser handles this a bit differently, and there are plenty of places to read up on it.
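As an illustration of the localStorage route mentioned above, a minimal sketch reusing the names from the question (the keys are made up; localStorage is per-origin and, unlike cookies, not sent with every request):
localStorage.setItem("imageContentType", value.image.contentType);
localStorage.setItem("imageData", value.image.data);
// later, after a refresh:
vm.url = "data:" + localStorage.getItem("imageContentType") + ";base64," + localStorage.getItem("imageData");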

File Reading in PHP

I want to access the URL of the file which the user selects through the pop-up file navigation window. My browse button tag is:
<input type="file" id="loadFile"/>
I can access the file URL in JavaScript, but I'm not sure how to do it on the back end in PHP.
You have to have the correct enctype (multipart/form-data) on the form.
Then you utilize the $_FILES superglobal.
This is covered extensively in the PHP Manual regarding uploads.
Assuming the input is given name="loadFile", the original filename is available in $_FILES['loadFile']['name'].
Since it seems that you actually want a way to have the user provide a URL to a file, the way to handle that is to simply implement a text input, accept the URL there, and process the URL on the server using an HTTP client that fetches and stores the file on the user's behalf.
For years people have been using the curl extension, which is fast and highly functional. There are also a number of HTTP client libraries written in PHP, like Guzzle.

Refresh image or clear cache

Is there any way (server- or client-side) to force the browser to pull a new version of a file (an image) from the server? The image in question is otherwise cached for a long time. I know I can append a random number, for instance, to the URL of the image, but this is not acceptable in this situation. I need the image to be refreshed from the exact same URL.
What I'm doing: a YouTube-like portal where users upload videos. Each video has a thumbnail which is shown on various pages of the portal. A user can, at any time, change the thumbnail (he can select from three generated thumbnails). So when this happens (a new image overwrites the 'original' image), I want to refresh the video's thumbnail so that the owner (I don't care if other users see the old thumbnail) will see the new thumbnail no matter where it is shown.
I'm afraid this can't be done but I'm asking here just to be sure.
Update: I'm using nginx and PHP on the server side.
You could use ETags on your thumbnails. This would prevent the transmission of the actual thumbnail data if it hasn't changed (i.e. still has the same hash). However, you would still face the client's HTTP requests checking whether the ETag has changed (normally answered with HTTP 304).
But combined with a rather short freshness threshold (say a couple of minutes), you could achieve a tradeoff between caching and freshness while still conserving resources. If you need absolute freshness, you might have to stick to ETags alone though. If you create a clever hash function, you could handle the ETag requests on your frontend load balancer (or at least near it), which makes them rather cheap.
Edit: Add alternative from my other comment.
An alternative could be to use added request parameters to force a re-fetch when the resource changes, as suggested in another answer. A variation of that scheme (which is used by many Rails applications) is to append the timestamp of the last change (or some kind of hash) as a parameter to the file, so that it only changes when the file actually does change. Something like this, or one of the above methods, is actually the only way to be really sure there are no unnecessary cache-validation requests while always serving the freshest resource.
Add a GET parameter at the end of the filename, such as:
example.jpg?refresh=yesplease
You could also force that image to refresh on each visit by using a rand() param.
In PHP:
example.jpg?refresh=<?php echo rand(1,999); ?>
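And if you only need to force the owner's browser to re-fetch after they pick a new thumbnail, a small hedged sketch in JavaScript (the element id and path are made up for the example):
var img = document.getElementById("thumbnail");     // made-up id
img.src = "thumbs/video123.jpg?v=" + Date.now();     // new query value => fresh request for the same image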

With JS, jQuery, how do I save an AJAX response to a (text) file?

It seems like this question is asked periodically and the common response is "You shouldn't do that with AJAX anyway. Just set the window location to the file."
But I'm trying to request a file that doesn't actually exist out on the server anywhere. It's dynamically generated (by a Django view) given the GET/POST context parameters. The file I want to retrieve via AJAX, and then save to the client machine, is a text file (csv).
I can currently get the text to the client machine (and can verify this by seeing it in logging or an alert) but cannot then figure out how to save this text to a file inside of the AJAX success callback fn.
Essentially, is this possible? Is it something JS can do - that is, open a file-save dialog for "files" that are actually AJAX response text?
From the browser's point of view, it doesn't matter if the file exists or not, it's just a resource on a server that it's requesting. I think you're going to need to do some version of "Just set the window location to the file". If you set the content type in the header to something that the browser doesn't recognize, I believe it will ask the user if they want to save it.
As others mentioned, you can't do it only with JavaScript.
IMO the best option would be the Flash 10+ FileReference API.
There are some good JavaScript wrapper libraries like Downloadify that provide a JavaScript API to access those methods.
Give a look to this demo.
This isn't something JavaScript (and therefore jQuery or any other JS framework) is allowed to do, for security reasons. You may be able to do what you want with Flash or another route, but not JavaScript. Bear in mind Flash has its own slew of security restrictions for this as well.
(Yes, IE can do this via an ActiveX object, but I'm not counting that as a "solution" here)
Basically, no. JavaScript can't save anything to the local machine due to security restrictions. Your best bet may be a signed applet that the user can trust to write the file, or putting the text in a textarea that they can easily copy and paste into a new file.
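For what it's worth, in browsers that support both Blob and the download attribute (mentioned earlier in this thread), a hedged sketch of saving the AJAX response text client-side; csvText and the filename are assumptions for the example:
var blob = new Blob([csvText], { type: "text/csv" }); // csvText = the AJAX response text
var link = document.createElement("a");
link.href = URL.createObjectURL(blob);
link.download = "export.csv";                          // hypothetical filename
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
URL.revokeObjectURL(link.href);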
Could you not use the PHP rename() function for this, instead of just JavaScript? Call a PHP file and pass the name of the file you want to copy, along with where it should go, as parameters?
I have the same problem. You can try this
<button id="Save">Save</button>
<img src="MakeThumbnail.ashx?Image=1.jpg" id="imgCrop">
$("#Save").click(function (e) {
url = $("#imgCrop").attr("src")+"&Action=Save"
e.preventDefault(); //stop the browser from following
window.location.href = url;
});

How far can I go with JavaScript?

I need to do as much as possible on the client side. In more detail, I would like to use JavaScript to code an interface (which displays information to the user and which accepts and processes responses from the user). I would like to use the web server just to take a data file from it and then send a modified data file back. In this respect I would like to know if the following is possible in JavaScript:
1. Can JavaScript read the content of an external web page? In other words, on my local machine I run JavaScript which reads the content of a given web page.
2. Can JavaScript process values filled in an HTML form? In other words, I use HTML and JavaScript to generate an HTML form. The user is supposed to fill in the form and press a "Submit" button. Then the data should be sent to the original HTML file (not to a web server), and processed by JavaScript.
3. In the very end JavaScript will generate a local data file and I want to send this file to a PHP web server. Can I do that with JavaScript?
4. Can I initiate the execution of a local program from JavaScript? To be more specific, the local program is written in Python.
I will appreciate any comments and answers.
1. It could technically, but can't in reality due to the same-origin policy. This applies to both reading and writing external content. The best you can do is load an iframe with a different domain's page in it - but you can't access it programmatically. You can work around this in IE, see Andy E's answer.
2. Yes for the first part; mmmm, not really for the second part - you can submit a form to an HTML page and read GET arguments using JavaScript, but it's very limited (recommended maximum size of data around 1024 bytes). You should probably have all the intelligence on one page.
3. You can generate a file locally for the user to download using Downloadify. Generating a file and uploading it to a server won't be possible without user interaction. Generating data and sending it to a server as POST data should be possible, though (see the sketch after this list).
4. This is very, very difficult. Due to security restrictions, in most browsers, it's mostly not possible without installing an extension or similar. Your best bet might be Internet Explorer's proprietary scripting languages (WScript, VBScript) in conjunction with the "security zones" model, but I doubt whether the execution of local files is possible even there nowadays.
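As referenced in point 3, a rough sketch of sending client-generated data to a PHP server as POST data with a plain XMLHttpRequest (the endpoint name and the generatedData variable are made up):
var xhr = new XMLHttpRequest();
xhr.open("POST", "save_data.php", true);                 // hypothetical endpoint
xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // server acknowledged the upload
    }
};
xhr.send("data=" + encodeURIComponent(generatedData));   // generatedData built client-side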
Using Internet Explorer with a local file, you can do some of what you're trying to do:
It's true that pages are limited by the same origin policy (see Pekka's link). But this can be worked around in IE using the WinHttpRequest COM interface.
As Pekka mentioned, the best you can manage is GET requests (using window.location.search). POST request variables are completely unobtainable.
You can use the COM interface for FileSystemObject to read & write local text files.
You can use the WScript.Shell interface's Exec method to execute a local program.
So just about everything you asked is attainable, if you're willing to use Internet Explorer. The COM interfaces will require explicit permission to run (a la the yellow alert bar that appears). You could also look at creating a Windows Desktop Gadget (Vista or Win 7) or a HTML Application (HTA) to achieve your goal.
Failing all that, turn your computer into a real server using XAMPP and write your pages in PHP.
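To make the IE-only COM route above concrete, a very rough sketch; it only runs in Internet Explorer (typically from a local file or an HTA) after the user accepts the ActiveX permission prompt, and the paths and program are examples only:
var fso = new ActiveXObject("Scripting.FileSystemObject");
var file = fso.CreateTextFile("C:\\temp\\output.txt", true);  // true = overwrite if it exists
file.WriteLine("data generated in the page");
file.Close();
var shell = new ActiveXObject("WScript.Shell");
shell.Run("notepad.exe C:\\temp\\output.txt");                // launch a local program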
I see what you want to do. The best thing is the following:
Choose a JavaScript library (e.g. jQuery, Dojo, YUI, etc.); I use jQuery. This will take some of the load off you.
Instead of saving the form data in a local file, store it in local variables, process it, and send it to the server (for further processing like adding to / updating a database) using an XMLHttpRequest; when the web service returns data, process that data and update the DOM.
Here is a sample:
--this is the DOM
Name: <input type='text' id='name' />
<a href='javascript:void(0)' onClick='submit()'>Submit Form</a>
<br>
<div id='target'></div>
--this is the JS
function submit()
{
    var _name = $('#name').val(); // collect the text box's value
    // now validate it or do anything else you want
    callWebservice(_name, _suc, _err);
    // the callWebservice fn has to be created by you; it is where you send this data
    // it does the XMLHttpRequest etc. for you
    // you have to create it yourself
}
// this fn is called when data is successfully returned from the server
function _suc(data)
{
    // the web service has returned data successfully
    // data = data from the server; in this case it might be "Hello user Name" (name = the value from the input box)
    // update the target div with this data (manipulate the DOM with the new data)
    $('#target').html(data);
}
function _err()
{
    // call this fn when an error occurs on the server
}
// In reality most of the work is done using JSON. I have shown you the basic idea of how to use JS to manipulate the DOM, call services, and do the rest. This way we avoid page reloads and the new data is visible to the viewer.
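For illustration, one hedged way callWebservice() could be filled in with jQuery's $.ajax (the endpoint URL and parameter names are placeholders):
function callWebservice(name, successFn, errorFn)
{
    $.ajax({
        url: "/your/webservice/endpoint",   // placeholder URL
        type: "POST",
        data: { name: name },
        dataType: "json",
        success: function (response) { successFn(response); },
        error: function () { errorFn(); }
    });
}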
I would answer by saying there's a lot you can do, but then in a comment to the OP you say, "I would like to program a group game."
And so my answer becomes: only do on the client side what you are able and willing to double-check on the server side. Never trust the client!
You also say you do not want to do your job twice - but if you are going to do things on the client side, you will have to do it twice, or else be subject to rampant cheating.
We had the same question when we started our project. In the end we moved everything we could to the JS side. Here's our stack:
The backend receives and sends JSON data exclusively. We use Erlang, but Python would be the same. It handles the authentication/security and the storage.
The frontend is HTML+CSS for the visual elements and JS for the logic. A JS template engine converts the JSON into HTML. We've built PURE, but there are plenty of others available. MVC can be overkill on the browser side, but IMO using a template engine is the minimum separation you should have.
The response time is amazing. Once the page and the JS/CSS are loaded (fresh or from the cache), only the data crosses the network for each request.
