Edit file on server from a webpage - javascript

What I want to do is to have a set of editable Excel files on my webpage:
I give links to what for the user represents an Excel file
With a click, the user's default program for editing Excel files (say, MS Excel) should open
After finishing editing, the file should be uploaded to my server transparently for the user, and next time the user visits my page, they should see their edited file and be able to edit it again
What I have considered:
JavaScript Excel-like grid. However, I did not find a JavaScript library with sufficient features, such as easily moving rows (any advice on a good JavaScript Excel component?)
Saving to Dropbox / Google Docs /... using their APIs. However, it requires the user to have an account, and it will probably require me to manage users' Dropbox passwords (and not all users will want to share passwords with me). Also, I will need to have interfaces to Google Drive, Microsoft OneDrive, and who knows how many other services.
Allow the user to download the file and rely on the user to upload it back again. However, this is too complicated for the user, and the users will forget to upload the files, which means losing their edits. Any way of uploading the file automatically upon closing?
A macro in my Excel files that would contact my server before exiting. However, this requires the user to enable macros (security alarm) and may be unreliable if the connection breaks. I did not evaluate whether this is technically possible.
Or what is the best / simplest way to achieve this?
(I know how to generate Excel files and how to open them from the webpage; my problem is how to get the user's edits back to the server transparently for the user.)

I think the easiest way to do this ("get the user's edits back to the server transparently for the user") is to use AJAX (JS) requests to PHP scripts.
AJAX is great for doing things in the background (asynchronously), but it can't write files on the server by itself; that's what the server-side script is for. Just add an event listener in JS (an onchange or onblur, perhaps) and send an AJAX request every time the user edits the file.
PHP is a great server-side scripting language, and you can edit files with it.
EDIT: Example (on request)
Assuming that the Excel file is stored in a string in a <textarea> for simplicity (for now), you can set a listener to get the data from it (in jQuery), and send an AJAX request:
HTML:
<textarea id="excel"></textarea>
JS:
$("#excel").change(function() {
    var excelFile = $(this).val();
    $.ajax({
        url: "updateFile.php",
        method: "post",
        data: { data: excelFile }
    });
});
PHP (updateFile.php):
<?php
$data = $_POST["data"];
$file = fopen("FILENAME.xlsx", "w+");
fwrite($file, $data);
fclose($file);
?>

Related

Acrobat Javascript: passing a UTM parameter from URL into a PDF button

I have a PDF that has a button with field name ctaButton.
ctaButton currently has a url pointing to https://mywebsite.com.
I want to host the PDF on my server at https://mywebsite.com/hosted.pdf.
And when I send someone a link to the PDF, I want to attach a UTM_term parameter ?utm_term=customer1 and then have the PDF read this parameter and update the ctaButton url to https://mywebsite.com/?utm_term=customer1.
I've been messing around with the Javascript actions in Acrobat for a couple of hours trying to make this happen. Any help greatly appreciated.
You can get the full url to the document using...
var myURL = this.URL;
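Building on that, a rough document-level sketch (untested; it assumes the field really is named ctaButton and uses the Doc.URL property, Field.setAction and app.launchURL from the Acrobat JavaScript API, and it only runs inside Acrobat/Reader, not in browser PDF viewers):

// Read utm_term from the URL the PDF was opened from and point the
// ctaButton action at the tagged landing page.
var match = this.URL.match(/[?&]utm_term=([^&#]*)/);
if (match) {
    var target = "https://mywebsite.com/?utm_term=" + match[1];
    this.getField("ctaButton").setAction("MouseUp",
        "app.launchURL('" + target + "', true);");
}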
"this" in Acrobat JavaScript is the document context.
I did hours of research and came to this conclusion – Javascript in Acrobat is like trying to code in 1985 AND browsers will not execute whatever code you come up with.
So I used this workaround:
When I send the PDF to someone, I send it as a link with a base64 encoded stringified JSON package that contains a bunch of tracking data but importantly, the name of the file to access as well as utm parameters specific to the recipient
The link hits a server handler (NodeJS) that extracts the encoded JSON package, and uses the data in the package to serve up an HTML redirect page pointing to the right PDF file
Importantly, the HTML page also saves the JSON package to the browser's localStorage... this comes in handy in subsequent steps
The PDF file opens in browser (it doesn't have to, could be opened on desktop) and the call to action link has a link to a get request handler
The get request handler serves up ANOTHER redirect page
This second redirect page accesses the browser's local storage, looks for the utm parameters I set for that user, and then redirects to the sale page, with nice utm parameters attached
So to sum up, you don't add the utm parameters to the call to action link in the PDF (because that would make the world too easy to live in) and instead you do all these acrobatics (no pun intended) to attach utm parameters in the link clicks (via JSON strings saved in localStorage) during the process (i.e. when user opens email to extract file via link, and then when user clicks call to action in the PDF).
Any questions or clarifications please let me know in the comments and I will do my best to address.
Caveats
Only works if the user uses the same browser in all steps (e.g. if Susan opens the email in Safari, saves the PDF, then clicks the call to action in the PDF, and the link opens in Chrome, utm parameters will not be passed).
Assumes the browser is modern and has localStorage.
UPDATE: I came across another solution. It's a bit more convoluted; the flow is outlined below.
Porky.io is a JavaScript extension for Adobe InDesign. So the flow is:
send Porky.io the customer data you need (e.g. utm's for links)
Porky.io generates PDF from a template you provide with the customer data you provided
Listen for a new file save from Porky
Do something with the file (e.g. email it to customer)
I believe you need to run an instance of Windows somewhere in the cloud (e.g. on Azure) to run InDesign with Porky.io, unless you want to rely on your laptop.
My project's not big enough yet to warrant setting this up... but it's a good alternative if I need to make my current solution more robust.

File Reading in PHP

I want to access the URL of the file which the user selects through a pop-up file directory navigation window. My browse button tag is:
<input type="file" id="loadFile"/>
I can access the file URL in JavaScript, but I'm not sure how to do it in PHP on the back end.
You have to have the correct enctype on the form (multipart/form-data for file uploads).
With that in place, you use the $_FILES superglobal on the server.
This is covered extensively in the PHP Manual's section on handling file uploads.
The original filename is available in $_FILES['loadFile']['name']. Note that the key comes from the input's name attribute, so the input needs name="loadFile" as well as the id.
Since it seems that you actually want the user to provide a URL to a file, the way to handle that is to implement a text input, accept the URL there, and process it on the server using an HTTP client that fetches and stores the file on the user's behalf.
For years people have been using the curl extension, which is fast and highly functional. There are also a number of HTTP client libraries written in PHP, like Guzzle.

Creating a smart remote upload script with jQuery AJAX and PHP

I'm currently creating an image hosting script and so far so good. I've used several plugins to create the local uploading process with drag & drop + AJAX, which works totally fine. Now I've moved to the part where I need to create the remote uploading process with jQuery AJAX and a PHP script to handle the whole thing.
How it's gonna work
My thinking is like this: there is a big box in the middle of the page that accepts the URLs to be remote uploaded. Once valid URL(s) are pasted into the text area, they will be immediately sent to the server-side script via jQuery AJAX. It's bound to a keyup event.
This is what it looks like: http://i.imgur.com/NhkLKii.png.
The "HERE COME THE URLS" part is already a text area - So that part's already done.
Where I need help
The issue with this whole situation is: once there are valid URLs pasted into the text area, they must immediately be converted to some sort of box which also shows an uploading progress. Something that looks like this (copied from the local uploading part): http://i.imgur.com/q7RyDmb.png
It was easy to implement the progress indicator for the local uploading, since it was a feature offered by the plugin I used, but I don't know how to indicate the progress of remote uploading, which is being made totally from scratch.
So this is how I've imagined the logic to flow:
User pastes some URLs into the text area
There is a client-side check to validate the pasted URLs
Validated URLs are sent to upload.php on keyup (?)
URLs are being processed
While the upload goes on, we show the users the progress in the knob (?)
PHP script finishes the process and returns back the uploaded URLs
I update the page in the AJAX success callback to display the uploaded files
So, the two process flows marked with (?) are unclear to me - I don't know how to achieve those...
What I have tried
Well, I didn't just come here and ask you to do everything for me, but I've come across a dead end and I don't know how to continue. What I've done so far is collect the URLs from the text area, and if there are multiple URLs separated by a line break (\n), I simply use split to get an array of pasted text and then use another function inside the loop to validate whether they are URLs. If there is no line break detected inside the text area value, then I simply check the one line that was provided. In each case, I send the whole text area to the PHP script, because I don't know how to get rid of the invalid URLs in jQuery. I've created a function called debug() in PHP which stores anything into a debug.log file and this is what I'm getting (in one try) when I paste something into the text area:
https://www.google.com/https://www.google.com/
I paste https://www.google.com/ once in the text area, but it gets logged twice on the PHP side and I can't determine why.
This is what my jQuery looks like:
// Remote upload
var char_start = 10;
var index = 0;
var urls = $('.remote-area');
var val_ary = [];
urls.keyup(function(){
if (urls.val().length >= char_start)
{
var has_lbrs = /\r|\n/i.test(urls.val());
val_ary = urls.val().split('\n');
if (has_lbrs)
{
for (var i = 0; i < val_ary.length; i++)
{
if (!validate_url(val_ary[i]))
{
val_ary.splice(i, 1);
continue;
}
}
$.ajax({
type: 'POST',
url: 'upload.php',
data: {
upload_type: 'remote', // Used to determine the upload type in PHP
urls: val_ary, // Sending the whole array here
},
});
}
else
{
if (!validate_url(urls.val()))
{
// Display an error here
return;
}
$.ajax({
type: 'POST',
url: 'upload.php',
data: {
upload_type: 'remote', // Used to determine the upload type in PHP
urls: urls.val(), // Sending what's in the text area
},
});
}
}
});
The questions
So the final questions are:
How do I send my information correctly to the PHP script (only valid URLs) and have it in a form my PHP script can process?
How do I indicate the progress of the upload?
If I was unclear anywhere in my question, please let me know and I'll try to re-explain.
Thank you.
Updates
09/12/2013
I think I have managed to solve the double-sending issue where my AJAX would send the same information twice to the PHP script. What I did was code in a delayed anonymous function that sends the text area content to the PHP script after the user stops typing for 2 seconds. Once the user starts typing again, the timer resets, and a new AJAX request will be made after the next pause. So, I'm assuming that this issue has been solved. I'll come back to it if anything strange occurs.
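A minimal sketch of that kind of delay (illustrative names only, not my exact code; the full version is in the pastebin below):

var typing_timer = null;
var done_typing_delay = 2000; // 2 seconds

urls.keyup(function () {
    clearTimeout(typing_timer);               // any keystroke cancels the pending send
    typing_timer = setTimeout(function () {
        send_urls();                          // stands in for the collect/validate/$.ajax logic
    }, done_typing_delay);
});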
Now I'm still left with the progress indicators part. I'd appreciate your thoughts on that one.
My new code: http://pastebin.com/SaFSLeE9
What you're looking for in terms of communicating progress back and forth is "pushing". That refers to the technique of server sending data to the client, rather than the other way around, which is the standard HTTP way of doing things.
You've got plenty of options available, as described in the explanatory Wikipedia article, though perhaps more relevant to this topic would be Comet. What happens is you trigger an $.ajax call just like the one you have now, but you set a very long timeout. That essentially gives the server a "channel" to send data back to the page whenever it's available.
So what you need is a PHP script on the server that is capable of handling long polling and will send data back to the page as the upload progress changes (probably in array form for multiple uploads). This article should get you started with some jQuery code. Just remember that this request doesn't go to upload.php. It goes to a different script that deals solely with upload percentages and only returns data when it is available; it doesn't return immediately like other scripts do, and the Ajax call will happily wait for the data.
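To give an idea of the client side, a rough jQuery sketch of that polling channel could look like this (progress.php and updateKnob are placeholder names for your own percentage script and UI update, not an existing API):

function pollProgress() {
    $.ajax({
        url: 'progress.php',   // the separate script that blocks until new percentages exist
        dataType: 'json',
        timeout: 60000         // let the request stay open for up to a minute
    }).done(function (progress) {
        // e.g. progress = { "http://example.com/a.jpg": 42, ... }
        $.each(progress, function (url, percent) {
            updateKnob(url, percent);   // your own UI update for each knob
        });
    }).always(function () {
        pollProgress();        // immediately re-open the channel
    });
}
pollProgress();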
Also, don't separate your code like that with has_lbrs. One line or many are not distinct cases, one line is just an edge case of many lines. You're duplicating the code unnecessarily. What does the else case do that would break in the general case? Further, the "error handling" in the else case is misleading. The only error reporting you do is if there is only one line and it's wrong. What if you have two lines and they're both wrong? Your code will happily send an empty array to upload.php.
This is why I think you shouldn't separate your code like that, because then you'll split logic and not even notice it.
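For example, a sketch of the unified version (keeping your validate_url and upload.php names) might be:

var val_ary = $.map(urls.val().split('\n'), function (line) {
    line = $.trim(line);
    return validate_url(line) ? line : null;   // null entries are dropped by $.map
});

if (val_ary.length) {
    $.ajax({
        type: 'POST',
        url: 'upload.php',
        data: { upload_type: 'remote', urls: val_ary }
    });
} else {
    // report "no valid URLs" here, whether one line was pasted or many
}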
In my opinion, the best way is to call your cURL script with AJAX and use it to upload your files to the remote server. You need ajax.js, curl.php, and index.php (whatever names you want) on your app server, and image.php and class.image.php (whatever names you want) on your remote server.
Steps that I did for my app
1) I am going to upload an image from my index.php file. It will call the curl.php file using ajax.js, and the cURL file will check the file's extension and so on (for your app's security, decide what you want to allow users to upload).
2) The curl file will then upload the file to your predefined temporary folder with the default file name.
3) If the move_uploaded_file function (which I used in my script) runs successfully, you can call your cURL function to send your data as a POST request to your remote server, where image.php will receive the post and process it further. You can keep your class in image.php, or you can create two PHP files on your remote server, as you prefer.
4) In your class file, you should check once again that the file is an image (or whatever you want to allow) for better security. If the file is good, rename it and move it into a folder if you want to.
5) Add the file's new name and folder name to your database using a remote database connection, so cURL can show you the result on the same page.
Now, why cURL? I prefer cURL because you can add a secret key or API key to the communication to make it more secure, with if/else conditions. The file on your remote server that receives the posts will only process them if the key matches (if api == 'yourKey'); otherwise it won't, and nobody will be able to send images to your server with bots and so on.
I don't know whether my answer is going to help you or not (probably my method is lengthy or not good for your app), but try to Google cURL and you will understand what I am trying to say. Hope you like it and understand it. If you have any doubt, you can ask me any time.

Canvas Image to an Image file

So I have a canvas on which the user signs; now, instead of converting it to a base64 string, I simply want to save it as an image itself. What's the easiest way to do it in HTML5?
You can easily do that this way (specifying the format as png in this case):
var img = canvas.toDataURL("image/png");
You can specify different image formats.
Take a look at this answer.
I've answered a similar question here:
Simulating user event
Assuming you are saving locally
You can go the route of creating an image from a Data URL, but then saving it is the trickier part that currently isn't very nice using HTML5. It's hopefully going to get better soon, if browsers incorporate the download attribute of the a tag.
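For what it's worth, in browsers that do support the download attribute the idea looks roughly like this (a sketch, using the canvas from the question and a made-up filename):

var link = document.createElement('a');
link.href = canvas.toDataURL('image/png');
link.download = 'signature.png';      // suggested filename
document.body.appendChild(link);      // some browsers want it in the DOM before click()
link.click();
document.body.removeChild(link);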
Obviously if you have higher permissions than a standard webpage... i.e. you are designing a browser plugin - then there are other options...
If I were to implement something like this myself, at the moment, I would concede to using a flash plugin to handle the save to the local computer.
Assuming you are saving remotely
By the sounds of it you aren't saving to a server, but if so this is quite easy by just POSTing the base64 information to a script written in a server-side scripting language (e.g. PHP) and getting that to write the data directly as binary to a file. Obviously you have to make certain you do this securely, however; you don't want just any binary data written to your server's filesystem.
Best of both worlds
If you've got the development time, the best method to get a canvas image saved locally - without Flash - is to create a server-side script that instead of saving the data to your server actually writes the Base64 information you send it directly back as a realised binary image file. That way you can create a form that posts your Base64 data to a new tab, this request is evaluated by the server-side, and the binary image is returned... at which point the browser asks the user where they wish to save their image.
You'll need to define the correct headers to force an image to download (rather than display in-browser). A simple change to force this is to set the server-side script's Content-type header to 'image/octet-stream'... there are other header options to set which would be best researched (e.g. headers that control the filename and so forth).
reflect.php
<?php
/// a simple example..
if ( array_key_exists('data', $_POST) && ($data = $_POST['data']) ) {
header('Content-type: image/octet-stream');
echo base64_decode( $data );
exit;
}
and the form...
<form action="reflect.php" method="post" target="_blank">
<input name="data" type="hidden" value=" ... Base64 put here with JS ... ">
</form>
(The whole form should be created dynamically and submitted automatically with JavaScript)
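Something along these lines, assuming a canvas variable and the reflect.php script above; note that the data:image/png;base64, prefix is stripped so the script receives only the raw Base64:

var base64 = canvas.toDataURL('image/png').replace(/^data:image\/png;base64,/, '');

var form = document.createElement('form');
form.action = 'reflect.php';
form.method = 'post';
form.target = '_blank';

var input = document.createElement('input');
input.type = 'hidden';
input.name = 'data';
input.value = base64;

form.appendChild(input);
document.body.appendChild(form);   // must be in the document before submitting
form.submit();
document.body.removeChild(form);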
Improving the user experience
There are ways to avoid a new tab being created, but you'd have to research to make sure these other methods don't cause cross-browser problems... for example you could post your form data to an iframe (which would keep the process hidden), or just post the data directly on the current window (...and hope that all the browsers receive the correct request and open a download rather than replace your page content - most modern browsers should handle this).
Improving security
With regards to a PHP script that automatically returns binary data, you should keep the access to this script secured by a one-time-use key / authentication token or something similar, and keep a limit on how much Base64 data you are willing to accept. It might not seem like it poses a security risk - as you are not modifying your server in any way with what the user sends - but the dodgy people of this world could take your script and use it to send download requests to other users... which, if downloaded (and turned out to be unwanted trojans or viruses), would make your server complicit in providing the dodgy file.
All in all
Due to the effort required to get a simple thing like an image saved to the desktop, I wouldn't blame you for doing the following:
Embed the image in the page (after taking your snapshot from canvas) and ask the user to right click and Save as...
Hopefully future things will make this situation better...

How far can I go with JavaScript?

I need to do as much as possible on the client side. In more detail, I would like to use JavaScript to code an interface (which displays information to the user and which accepts and processes responses from the user). I would like to use the web server just to take a data file from there and then send a modified data file back. In this respect I would like to know if the following is possible in JavaScript:
Can JavaScript read the content of an external web page? In other words, on my local machine I run JavaScript which reads the content of a given web page.
Can JavaScript process values filled in an HTML form? In other words, I use HTML and JavaScript to generate an HTML form. The user is supposed to fill in the form and press a "Submit" button. Then the data should be sent to the original HTML file (not to a web server). Then this data should be processed by JavaScript.
In the very end JavaScript will generate a local data file and I want to send this file to a PHP web server. Can I do that with JavaScript?
Can I initiate the execution of a local program from JavaScript? To be more specific, the local program is written in Python.
I will appreciate any comments and answers.
It could technically, but can't in reality due to the same origin policy. This applies to both reading and writing external content. The best you can do is load an iframe with a different domain's page in it - but you can't access it programmatically. You can work around this in IE, see Andy E's answer.
Yes for the first part, mmmm not really for the second part - you can submit a form to an HTML page and read GET arguments using JavaScript, but it's very limited (recommended maximum size of data around 1024 bytes). You should probably have all the intelligence on one page.
You can generate a file locally for the user to download using Downloadify. Generating a file and uploading it to a server won't be possible without user interaction. Generating data and sending it to a server as POST data should be possible, though.
This is very, very difficult. Due to security restrictions, in most browsers, it's mostly not possible without installing an extension or similar. Your best bet might be Internet Explorer's proprietary scripting languages (WScript, VBScript) in conjunction with the "security zones" model, but I doubt whether the execution of local files is possible even there nowadays.
Using Internet Explorer with a local file, you can do some of what you're trying to do:
It's true that pages are limited by the same origin policy (see Pekka's link). But this can be worked around in IE using the WinHttpRequest COM interface.
As Pekka mentioned, the best you can manage is GET requests (using window.location.search). POST request variables are completely unobtainable.
You can use the COM interface for FileSystemObject to read & write local text files.
You can use the WScript.Shell interface's Exec method to execute a local program.
So just about everything you asked is attainable, if you're willing to use Internet Explorer. The COM interfaces will require explicit permission to run (a la the yellow alert bar that appears). You could also look at creating a Windows Desktop Gadget (Vista or Win 7) or a HTML Application (HTA) to achieve your goal.
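As a rough illustration of the FileSystemObject and WScript.Shell points above (IE only, running from a local file, with the security prompts accepted; the paths here are made up):

var fso  = new ActiveXObject("Scripting.FileSystemObject");
var file = fso.OpenTextFile("C:\\data\\input.txt", 1);   // 1 = ForReading
var text = file.ReadAll();
file.Close();

var shell = new ActiveXObject("WScript.Shell");
shell.Exec("python C:\\scripts\\process.py");            // launch the local Python program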
Failing all that, turn your computer into a real server using XAMPP and write your pages in PHP.
See, I got what you want to do.
The best thing is to do the following:
Choose a JavaScript library (e.g. jQuery, Dojo, YUI, etc.); I use jQuery. This will decrease some of your load.
Instead of saving the form data in a local file, store it in local variables, process it, and send it to the server (for further processing like adding/updating the database, etc.) using XMLHttpRequest; when the web service returns data, process that data and update the DOM.
I am showing you a sample.
--this is dom
Name:<input type='text' id='name' />
<a href='javascript:void(0)' onClick='submit()'>Submit Form</a>
<br>
<div id='target'></div>
--this is js
function submit()
{
    var _name = $('#name').val(); // collect the text box's data
    // now validate it or do anything you want
    callWebservice(_name, _suc, _err);
    // the callWebservice fn above has to be created by you; it sends this data
    // to the server and does the XMLHttpRequest etc. for you
}
// call this fn when data is successfully returned from the server
function _suc(data)
{
    // the web service has returned data successfully
    // data = data from the server, in this case e.g. "Hello user Name" (name = what was filled in the input box)
    // update this data in the target div (manipulate the DOM with the new data)
    $('#target').html(data);
}
function _err()
{
    // call this fn when an error occurs on the server
}
// In reality most of the work is done using JSON. I have shown you the basic idea of how to use JS to manipulate the DOM, call services, and do the rest. This way we avoid page reloads and the new data is visible to the viewer.
I would answer saying there's a lot you can do, but then in the comment to the OP, you say "I would like to program a group game."
And so, my answer becomes only do on the client side what you are able and willing to double check on the server side. Never Trust the Client!
And I do not want to do my job twice.
If you are going to do things on the client side, you will have to do it twice, or else be subject to rampant cheating.
We had the same question when we started our project. In the end we moved everything we could to the JS side. Here's our stack:
The backend receives and sends JSON data exclusively. We use Erlang, but Python would be the same. It handles the authentication/security and the storage.
The frontend is in HTML+CSS for the visual elements and JS for the logic. A JS template engine converts the JSON into HTML. We've built PURE, but there are plenty of others available. MVC can be overkill on the browser side, but IMO using a template engine is the minimum separation you can do.
The response time is amazing. Once the page and the JS/CSS are loaded (fresh or from the cache), only the data crosses the network for each request.
