Got a really interesting (for me) problem.
I have the dropzone.js plugin installed, and now I need to put some files there... from PHP.
What I am trying to do:
A PHP script detects that there are some files (in a directory) that were uploaded earlier (for example, a few days ago). I know the names of these files.
After that, I have to pass these files to my JavaScript, which will add them to Dropzone so the user can see the files he uploaded earlier.
And all of this using Ajax.
I understand what to do in step 1 (I can find those files). But how do I pass them to JS and then add them to Dropzone?
Or am I thinking about this wrong? Please help.
Dropzone has a wiki page explaining that.
Here is how I've recently done it by getting file URLs from a REST API:
$.get('http://api.to.return.files', function(data) {
    $(data.photos).each(function(i, photo) {
        var mockFile = { name: photo.name, size: photo.size, accepted: true, id: photo.id };
        myDropzone.emit("addedfile", mockFile);
        myDropzone.emit("thumbnail", mockFile, photo.url);
        myDropzone.emit("complete", mockFile);
        myDropzone.files.push(mockFile);
    });
});
If you already have the file URLs in your script, use them instead of the API response in my example.
I don't know if this is a duplicate question, but I have searched and couldn't find a solution for this.
I am a newbie with cPanel, and I recently uploaded my project to it. There is a part of my website where I load a folder of images through jQuery AJAX. This worked perfectly on the local XAMPP server, but on the live server it keeps giving a 404 error, which means the files are not being found by the AJAX script. For security reasons I am not going to share the links right now, but I will explain the full procedure.
These are the locations of those folders. The scripts are in the js folder, but obviously they are included in the index page. Anyway, let's move on:
var svgFolder = "img/svg/";
var productImagesFolder = "img/ImagesForProducts/";
The following are the AJAX scripts that I am using to load the images from these folders:
$.ajax({
    url: svgFolder,
    success: function (data) {
        $(data).find("a").attr("href", function (i, val) {
            if (val.match(/\.(jpe?g|svg)$/)) {
                $(".svg-shapesDiv").append("<img src='" + svgFolder + val + "' id='svg-shapes' loading='lazy'>");
            }
        });
    }
});
$.ajax({
    url: productImagesFolder,
    success: function (data) {
        $(data).find("a").attr("href", function (i, val) {
            if (val.match(/\.(jpe?g|jpg)$/)) {
                $("#avatarlist").append("<img style='cursor:pointer;' class='img-polaroid' src='" + productImagesFolder + val + "' loading='lazy'>");
            }
        });
    }
});
All of this works fine on the localhost server, but for some reason when I uploaded it to cPanel it stopped working.
I tried hard-coding the img tags like this:
<img src='img/svg/file.svg' id='svg-shapes' loading='lazy'>
<img src='img/ImagesForProducts/file.png' id='svg-shapes' loading='lazy'>
Things I tried
And this works fine, so I think the AJAX is not resolving the address. I also tried to open an image directly in the browser, like domainname.com/img/svg/file.svg, and it works fine as well. I also tried giving AJAX the path like domainname.com/img/svg/file.svg, but it doesn't work. I checked the file-name capitalization etc., and everything is correct.
If this is a stupid question then I am sorry, but I don't know what I am doing wrong, and I am also new to cPanel and live hosting.
Based on the response to my comment, it sounds as though your XAMPP has "indexes" enabled by default. Please see here: https://httpd.apache.org/docs/2.4/mod/mod_autoindex.html
It may be that on your shared webhosting they are disabled by default, and you would need to enable them for those two directories. As you are using cPanel, please see here: https://docs.cpanel.net/cpanel/advanced/indexes/82/ but this can also be achieved by adding a .htaccess file containing Options +Indexes to the two folders.
The trouble with relying on indexes this way is that different servers can return slightly different HTML. Your XAMPP server returns HTML links (your JavaScript searches for anchor tags and reads the href from them), but the shared server may not return links at all; it may just return the file names. Your JavaScript also has to parse that HTML, search all the links, and extract each href.
I would therefore recommend writing a PHP script that gathers the relevant files and returns only those, in JSON format. That is much easier for the JavaScript to parse and use, and you have full control over what is returned, whether it is on your XAMPP server or other hosting. You can call this script whatever you want and place it wherever you want. You could even have one script that accepts query parameters from your AJAX call, so it knows which folder to look in and which types of files to gather from it. This also has the advantage of keeping all the other files in those folders hidden from prying eyes.
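For example, the client side of that JSON approach could look like the sketch below. The endpoint name list-images.php, its folder parameter, and the JSON shape (a plain array of file names) are illustrative assumptions, not existing code:

```javascript
// Build <img> tags from a JSON array of file names, e.g. ["a.svg", "b.jpg"].
// A pure function like this is easy to test and reuse for both folders.
function buildImageTags(folder, names) {
    return names
        .filter(function (name) { return /\.(jpe?g|png|svg)$/i.test(name); })
        .map(function (name) {
            return "<img src='" + folder + name + "' loading='lazy'>";
        });
}
```

On the page you would then fetch the list and append the tags, e.g. `$.getJSON("list-images.php", { folder: "svg" }, function (names) { $(".svg-shapesDiv").append(buildImageTags("img/svg/", names).join("")); });`, with the hypothetical list-images.php reading the folder server-side (PHP's scandir() plus json_encode() would do).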
I face the following problem:
When a user uploads a file with the HTML file input and I then want to receive the file path itself, I only get C:/fakepath/filename.txt, for example.
I understand that browsers hide the exact path of the file for security reasons. So I was wondering if it is even possible, with some hack, some way in .NET, or with an additional jQuery/JS plugin, to get the full path of the file.
Why?
We don't want to upload the file itself to our server filesystem, nor to the database. We just want to store the local path in the database, so that when the same user opens the site, he can click on that path and his local file system opens.
Any suggestions or recommendations for this approach?
If this is really not possible, as suggested in
How to resolve the C:\fakepath?
How To Get Real Path Of A File Using Jquery
then we would need to come up with a different idea, I guess. But since some of those answers are really old, I thought maybe there is a solution by now. Thanks, everyone.
As my goal was to make the uploaded file name visible to the end user and then send it via the PHP mail() function, all I did to resolve this was the following, in your JS file.
Old function:
var fileuploadinit = function(){
    $('#career_resume').change(function(){
        var pathwithfilename = $('#career_resume').val();
        $('.uploadedfile').html("Uploaded File Name :" + pathwithfilename).css({
            'display':'block'
        });
    });
};
Corrected function:
var fileuploadinit = function(){
    $('#career_resume').change(function(){
        var pathwithfilename = $('#career_resume').val();
        var filename = pathwithfilename.substring(12); // strip the leading "C:\fakepath\" (12 characters)
        $('.uploadedfile').html("Uploaded File Name :" + filename).css({
            'display':'block'
        });
    });
};
$(document).ready(function () {
    fileuploadinit();
});
Old result:
Uploaded File Name :C:\fakepath\Coverpage.pdf
New result:
Uploaded File Name :Coverpage.pdf
Hope it helps :)
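One caveat about the substring(12) call above: it only works because "C:\fakepath\" happens to be exactly 12 characters long. A small sketch of a more robust alternative (my own addition, not part of the original answer), which strips everything up to the last slash or backslash:

```javascript
// Return only the file name, dropping any leading path such as the
// "C:\fakepath\" prefix that browsers prepend to a file input's value.
function stripFakePath(value) {
    return value.replace(/^.*[\\\/]/, "");
}
```

For example, `stripFakePath("C:\\fakepath\\Coverpage.pdf")` returns "Coverpage.pdf", and values without any leading path come back unchanged.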
You can't do it.
And if you find a way, it's a big security vulnerability that the browser manufacturers will fix as soon as it's discovered.
You'll need your own code running outside the browser sandbox to do this, since browsers are designed NOT to allow it.
I mean something ugly like ActiveX, Flash, a COM object, a custom browser extension, or some other fancy security breach that can open its own OpenFileDialog and insert the value into your input field.
After searching for the past couple of days, I've found nearly 30 different people asking this same question, and I haven't found an answer. A couple reported that they found a solution but did not provide it, so I'm hoping someone can answer it here.
Using blueimp jQuery File Upload, how can you upload multiple files directly to Amazon S3?
Problem: S3 accepts only one file per request.
Solution: Use blueimp jQuery File Upload to send separate requests for each file.
Roadblock: I cannot figure out how to make blueimp jQuery File Upload do this.
Solution Attempt
The guide is here: https://github.com/blueimp/jQuery-File-Upload/wiki/Upload-directly-to-S3
S3 currently requires more form fields than the ones shown in the guide. Here's a trimmed-down version of my code:
$(':file').each(function(i, el) {
    var fileInput = $(el);
    var form = fileInput.parents('form:first');
    fileInput.fileupload({
        forceIframeTransport: true,
        autoUpload: true,
        singleFileUploads: true, //default anyway
        add: function(event, data) {
            var files = data.files || [];
            var nextInQueue = files[0]; //this is a queue, not a list
            console.log('nextInQueue:', nextInQueue);
            if (!nextInQueue) return;
            var fileData = {name: nextInQueue.name, mime: nextInQueue.type, size: nextInQueue.size};
            $.ajax({
                url: '/s3stuff',
                type: 'POST',
                dataType: 'json',
                data: {'file': fileData},
                async: false,
                success: function(res) {
                    form.find('input[name="key"]').val(res.key);
                    form.find('input[name="AWSAccessKeyId"]').val(res.AWSAccessKeyId);
                    form.find('input[name="policy"]').val(res.policy);
                    form.find('input[name="signature"]').val(res.signature);
                    form.find('input[name="acl"]').val(res.acl);
                    form.find('input[name="success_action_status"]').val(res.success_action_status);
                    form.find('input[name="Content-Type"]').val(nextInQueue.type);
                    form.attr('action', res.url);
                    data.submit(); //why is this submitting all files at once?
                }
            });
        },
        fail: function(err) {
            console.log('err:', err.stack);
        }
    });
});
Error
When I try to upload a single file, it works great! But when I try to upload multiple files, S3 returns this 400 error:
<?xml version="1.0" encoding="UTF-8"?>
<Error>
    <Code>InvalidArgument</Code>
    <Message>POST requires exactly one file upload per request.</Message>
    <ArgumentName>file</ArgumentName>
    <ArgumentValue>2</ArgumentValue>
</Error>
Analysis
The blueimp jQuery File Upload documentation says this (under add):
If the singleFileUploads option is enabled (which is the default), the add callback will be called once for each file in the selection for XHR file uploads, with a data.files array length of one, as each file is uploaded individually.
This is where I'm stuck:
If "each file is uploaded individually" to S3, why does S3 say otherwise?
Also, regardless of the number of files, the add function runs only once. (I verified this with the console.log statement.) I see two likely reasons for this:
The plugin stops further submissions after one fails.
The plugin is not submitting files individually (for whatever reason).
Is my code incorrect, am I missing an option in the plugin, or does the plugin simply not support multiple direct uploads to S3?
Update
Here's an html file for quick testing:
http://pastebin.com/mUBgr4MP
Author of jQuery File Upload here.
The reason the files are not submitted individually is the forceIframeTransport option, which is set to true in your example.
The documentation for the singleFileUploads option states the following:
By default, each file of a selection is uploaded using an individual request for XHR type uploads.
While the documentation for the forceIframeTransport option states the following:
Set this option to true to force iframe transport uploads, even if the browser is capable of XHR file uploads.
So, the solution is to enable CORS on the S3 bucket and to not enable the forceIframeTransport option.
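For reference, enabling CORS on the bucket means attaching a CORS configuration roughly like the following (a minimal sketch; in production, replace the wildcard AllowedOrigin with your site's actual origin):

```xml
<CORSConfiguration>
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
```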
By the way, most of the integration examples are user contributions (including the S3 upload guides).
Here's another one which uses the S3 CORS feature and might help you out (I haven't tested it, though):
http://pjambet.github.io/blog/direct-upload-to-s3
I just found Fine Uploader today, after having searched for a JavaScript uploader that also supports posting files to Amazon S3. I read the documentation as much as I could and searched this site, but I don't think there's anything about this specifically.
As a user of wikis and Markdown (it's ubiquitous: here, on GitHub, in our internal ERP database, and so on), I'd like to be able to easily copy-paste a "syntax complete" string after a file is uploaded, because that would really make documentation creation easier.
The workflow I envision:
make screenshots locally
drag the screenshots or other files to the Fine Uploader upload area on a page
files in the session are uploaded and presented in the result list, with a button at its top allowing the user to copy the Markdown syntax for one or more files to the clipboard.
Then I can paste the result into whatever textarea I want. Something like:
![This is image 1](http://mys3_url.tdl/path/to/this_is_image_1.png)
![This is image 2](http://mys3_url.tdl/path/to/this_is_image_2.png)
[Link to a PDF](http://mys3_url.tdl/path/to/this_is_my_pdf_1.pdf)
For bonus points, I'd like to add an icon to represent the non-image file type, to its left. Something like:
![](http://url.tdl/path/to/icon.png)[Link to a PDF](http://mys3_url.tdl/path/to/this_is_my_pdf_1.pdf)
I imagine there's a way to do this, from a cursory look at the events and API methods, but would you be so kind as to point me at the events or API methods of interest?
Please advise. If this is the wrong place for this and it needs to be posted on your GitHub, I will do so. Let me know, please.
Thank you for your assistance in advance.
Kind regards
Rick
It sounds like you are simply looking for a way to easily retrieve the S3 URL of each uploaded file. This can be done by having your server return the URL of the file in its response to the upload-success POST request sent by Fine Uploader. Fine Uploader will pass the response (assumed to be JSON) to your onComplete event handler.
For example, say your server returns the following response to an upload-success POST: {"url": "http://mys3_url.tdl/path/to/this_is_image_1.png"}. You can access this response in your onComplete event handler like this:
var uploader = new qq.s3.FineUploader({
    ...
    callbacks: {
        onComplete: function(id, name, responseJSON) {
            var urlOfFile = responseJSON.url;
            ...
        }
    }
});
At this point, you can do whatever you please with these URLs.
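From there, producing the Markdown strings described in the question is a small step. A sketch (the helper name and the extension list are my own assumptions, not part of Fine Uploader's API):

```javascript
// Turn an uploaded file's name and URL into Markdown: image files become
// inline images, everything else becomes a plain link.
function toMarkdown(name, url) {
    if (/\.(png|jpe?g|gif|svg)$/i.test(url)) {
        return "![" + name + "](" + url + ")";
    }
    return "[" + name + "](" + url + ")";
}
```

Inside onComplete you could collect `toMarkdown(name, responseJSON.url)` for each completed file and join the results with newlines for a copy-to-clipboard button.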
Before you start tutting, this isn't the usual 'can't load my files from the server' post...
I want to give users the option to see files already on the server in a Bootstrap modal, then allow them to select given files. On selection, I want to close the modal and send the files to Dropzone to load in.
I'm sure mockFile is the way to go; I'm just not sure where to start.
How do I pass image URLs to Dropzone programmatically? I don't think I want Dropzone to re-initialise, because if users click 'browse files' more than once, they will lose previous images.
I hope I have explained myself OK. I can't see an 'addFiles' option, and I am not sure how to pass mock files after the Dropzone has been loaded.
Any ideas?
var mockfile = { name: fileName, size: fileSize };
dropZoneObject.options.addedfile.call(dropZoneObject, mockfile);
dropZoneObject.options.thumbnail.call(dropZoneObject, mockfile, fileImageURL);
Of course, you replace fileName, fileSize, and fileImageURL, and add as many files as you have on the server.