I have an application written in JavaScript and HTML, to be delivered with about 500 short (18 MB) videos on two physical discs. I make an AJAX request to check that a file exists before displaying it; if it does not, I prompt the user to insert the other disc.
video.innerHTML = "<p class=\"no-video\">Working...</p>";
$.ajax({
    url: "movies/" + num + ".mp4",
    type: "HEAD",
    success: function () { showVideo(num); },
    error: function () { video.innerHTML = "<p class=\"no-video\">Please insert the other disk and click ok</p>"; }
});
This works fine in Firefox, but in IE it takes about a minute to figure out that the file is actually there (if the file is missing, the check is fast). I assume this is because IE does not respect type: "HEAD", but even so it should not take that long to load an 18 MB file from DVD.
I'll have to test more browsers next.
Does anyone have any suggestions?
(would prefer not to have to re-load the HTML when the disc is swapped)
Interesting thought about using HEAD with the file system. Another solution is to use some sort of file that acts as a table of contents for what's on the disc.
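A minimal sketch of that table-of-contents idea, assuming each disc ships with a small manifest file listing the video numbers it contains (the manifest.json name, its format, and the checkDisc helper are assumptions, not part of the original setup):

// Sketch: look the number up in a per-disc manifest instead of probing the
// video file itself. "manifest.json", its { videos: [...] } shape and this
// helper are assumptions.
function checkDisc(num) {
    video.innerHTML = "<p class=\"no-video\">Working...</p>";
    $.getJSON("manifest.json", function (toc) {
        if ($.inArray(num, toc.videos) !== -1) {
            showVideo(num);
        } else {
            video.innerHTML = "<p class=\"no-video\">Please insert the other disc and click OK</p>";
        }
    }).fail(function () {
        video.innerHTML = "<p class=\"no-video\">Please insert the other disc and click OK</p>";
    });
}

Because the manifest is tiny, the check stays fast regardless of how each browser handles HEAD requests against the DVD.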
I don't know if this is a duplicate question, but I have searched and couldn't find a solution for this.
I am a newbie with cPanel and I recently uploaded my project to it. There is a part of my website where I load a folder of images through jQuery AJAX. This worked perfectly on the local XAMPP server, but on the live server it keeps giving a 404 error, which means the files are not being found by the AJAX script. For security reasons I am not going to share the links right now, but I will explain the full procedure.
These are the locations of those folders. The scripts are in the js folder, but of course they are included in the index page. Anyway, let's move on:
var svgFolder = "img/svg/";
var productImagesFolder = "img/ImagesForProducts/";
The following are the AJAX scripts I am using to load the images from these folders:
$.ajax({
    url: svgFolder,
    success: function (data) {
        $(data).find("a").attr("href", function (i, val) {
            if (val.match(/\.(jpe?g|svg)$/)) {
                $(".svg-shapesDiv").append("<img src='" + svgFolder + val + "' id='svg-shapes' loading='lazy'>");
            }
        });
    }
});

$.ajax({
    url: productImagesFolder,
    success: function (data) {
        $(data).find("a").attr("href", function (i, val) {
            if (val.match(/\.(jpe?g|jpg)$/)) {
                $("#avatarlist").append("<img style='cursor:pointer;' class='img-polaroid' src='" + productImagesFolder + val + "' loading='lazy'>");
            }
        });
    }
});
All of this works fine on the localhost server, but for some reason when I uploaded it to cPanel it stopped working.
Things I tried
I tried hard-coding the img tags like this:
<img src='img/svg/file.svg' id='svg-shapes' loading='lazy'>
<img src='img/ImagesForProducts/file.png' id='svg-shapes' loading='lazy'>
And this works fine, so I think the AJAX is not figuring out the address. I also tried requesting an image directly in the browser, like this: domainname.com/img/svg/file.svg, and it works fine as well. I also tried giving the AJAX call the full path, like domainname.com/img/svg/file.svg, but that doesn't work. I checked the file capitalization etc. and everything is correct.
If this is a stupid question then I am sorry, but I don't know what I am doing wrong, and I am new to cPanel and live hosting.
Based on the response to my comment, it sounds as though your XAMPP has "indexes" enabled by default. Please see here: https://httpd.apache.org/docs/2.4/mod/mod_autoindex.html
It may be that on your shared web hosting they are disabled by default, and you would need to enable them for those two directories. As you are using cPanel, please see here: https://docs.cpanel.net/cpanel/advanced/indexes/82/ (this can also be achieved by adding a .htaccess file containing Options +Indexes to the two folders).
The trouble with relying on indexes this way is that different servers can return slightly different HTML. Your XAMPP server returns HTML links (your JavaScript searches for anchor tags and gets the href from there), but the shared server may not return links; it may just return the file names. Also, with this HTML being returned, your JavaScript has to parse that HTML, search all the links, and extract each href.
I would therefore recommend writing a PHP script that gathers the relevant files and returns only those, in JSON format. That is much easier for the JavaScript to parse and use, and you have full control of what is returned, whether it runs on your XAMPP server or on other hosting. You can call this script whatever you want and place it wherever you want. You could even have one script that accepts query parameters from your AJAX call and from those knows which folder to look into and what types of files to gather from it. This also has the advantage of keeping all other files in those folders hidden from prying eyes.
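As a sketch of the client side of that approach, assuming a hypothetical listImages.php script that takes a folder parameter and returns a JSON array of file names (the script name, its parameter, and the response shape are all assumptions):

// Sketch: ask a (hypothetical) server-side script for the file names in a
// folder and append an <img> for each one. "listImages.php", its "folder"
// parameter and the plain-array JSON response are assumptions.
var svgFolder = "img/svg/";

$.getJSON("listImages.php", { folder: "svg" }, function (files) {
    $.each(files, function (i, name) {
        $(".svg-shapesDiv").append("<img src='" + svgFolder + name + "' loading='lazy'>");
    });
});

The same script can serve the product-images folder by passing a different folder value, and the JavaScript no longer depends on how any particular server formats its directory listings.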
So I'm making a browser-based game, and in order to create an account for the game I have the JS file call a PHP file (via POST) to write an XML file.
This works: I get the file in cPanel, in the right directory, with the right content, meaning I can open it, but only in cPanel. When I try to access it via the browser I get a 404, but only for about 30 minutes; then it will just magically start working.
This same PHP file is called later on in the game to update XML files, and the same thing happens. I can confirm that the PHP works exactly as it should, because I can see that the file/directory is perfect.
Here's the interesting bit: if I create or update an XML file manually, it works instantly. It's only the XMLs created by the PHP file that take forever to load.
It's like the server doesn't realize that there was a change on it, until half an hour after the fact. That is, unless I did it manually.
My PHP:
<?php
// Write the posted content to whatever file name the client supplied.
$filename = $_POST['fileTo'];
$newfile = fopen($filename, "w") or die('Can not open');
$string = $_POST['stuff'];
fwrite($newfile, $string) or die('Could not write');
fclose($newfile);
?>
My AJAX call:
$.ajax({
    type: 'GET',
    url: writeDirect,
    dataType: 'xml',
    cache: false,
    success: function (result) {
    },
    error: function (error) {
        $.post('PHP/Accounts/creatAcc.php', { fileTo: userWrite, stuff: writeStuff }, function () {
            signIn(userATFS, passCe);
        });
    }
});
Update:
I've decided to access the game's directory directly from the browser. This gets even more interesting.
The first thing I did was create a new account called testFile; I get the standard error on the GET because the game can't access the newly created account.
Then I opened the directory in Chrome, and this is interesting:
The index clearly shows that testFile.xml exists
Then I try clicking on it, but this is where it breaks
The images 404 despite the file clearly existing
And no, changing the permissions on testFile.xml did not change anything.
I believe I've found the answer: I think it was just that server being weird. I was using x10 Basic, and after switching over to another service it now works.
I am using Angular and ASP.NET Web API to allow users to download files that are generated on the server.
HTML Markup for download link:
<img src="/content/images/table_excel.png">
<a ng-click="exportToExcel(report.Id)">Excel Model</a>
<a id="report_{{report.Id}}" target="_self"></a>
The last anchor tag is there to serve as a placeholder for an automatic click event. The visible anchor calls the exportToExcel method to initiate the call to the server and begin creating the file.
$scope.exportToExcel = function (reportId) {
    reportService.excelExport(reportId, function (result) {
        var url = "/files/report_" + reportId + "/" + result.data.Model.fileName;
        var dLink = document.getElementById("report_" + reportId);
        dLink.href = url;
        dLink.setAttribute('download', result.data.Model.fileName);
        dLink.click();
    });
};
The Web API code creates an Excel file. The file on the server is about 279 KB, but when it is downloaded on the client it is only 7 KB. My first thought was that the automatic click might be happening before the file is completely written, so I added a 10-second $timeout around the click event as a test. It failed with the same result.
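For reference, the delayed-click test looked roughly like this (a sketch; it assumes $timeout is injected into the controller, which the snippet above does not show):

// Sketch of the test described above: delay the automatic click by 10 seconds
// in case the server has not finished writing the file. Assumes $timeout is
// injected into the controller alongside $scope.
$scope.exportToExcel = function (reportId) {
    reportService.excelExport(reportId, function (result) {
        var dLink = document.getElementById("report_" + reportId);
        dLink.href = "/files/report_" + reportId + "/" + result.data.Model.fileName;
        dLink.setAttribute('download', result.data.Model.fileName);
        $timeout(function () {
            dLink.click();
        }, 10000);
    });
};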
This seems to only be happening on our remote QA server. On my local development server I always get the entire file back. I am at a loss as to why this might be happening. We have similar functionality where files are constructed from a database blob and saved to the local disk for download. The same method is employed for the client side download and that seems to work fine. I am wondering if anyone else has run into a similar issue.
Update
After the comment by SilentTremmor, we think it may actually be IIS or some sort of server issue. Originally we didn't think it could be, but after some digging it may be. It seems the instance of the client code is only allowing 7 KB of data to be downloaded; it doesn't matter what we try to download, the result is always the same.
It turns out the API application was writing the file to a different instance of our application. The client code had no idea and was trying to download a file that did not exist. So the file behind the download link was effectively empty, hence the small file size.
I am trying to add some emails, taken from an input box, to a .txt file on my web server. Here is the code:
email = document.getElementById("mail").value;

$.ajax({
    url: 'maillist.txt',
    dataType: 'text',
    type: 'PUT',
    data: email + '; ',
    success: function (data) {
        alert('Should have worked, yay!');
    }
});
but that doesn't work on any browser. :(
I have tried classic JavaScript methods, but that was a no-go as well...
I need either a PUT or POST method, in either jQuery or plain JS, that works on Internet Explorer 8 and up as well as Firefox and Chrome. Emails should appear in the text file as
email1@cooldomain.com; email2@cooldomain.com; .....
Just so it works with our in-house VBA Macro. :)
Also, could there be a method for dropping data into XML files (i.e. creating a new XML entry from form data)? And is it possible to upload a file from the client side to the server side using jQuery? I would need users to fill out forms, drop their data into an XML file, and link a file they choose to that entry. That way they could add things to the XML themselves and the new entries would show up on the webpage.
Kind of "reddit"- or "4chan"-like, if you know the references.
Thanks for your time, really appreciated!
You can't write from a browser directly to a text file on the server. You need some sort of code on the server side that will receive the HTTP PUT (or POST) and persist the data to a file local to the server.
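For instance, a minimal sketch of the client side, assuming a hypothetical save-email.php script on the server that appends the posted value to maillist.txt (the script name and its email parameter are assumptions, not existing endpoints):

// Sketch: send the address to a (hypothetical) server-side script instead of
// trying to PUT to the .txt file itself. "save-email.php" and its "email"
// parameter are assumptions; that script would do the actual file append.
var email = document.getElementById("mail").value;

$.post('save-email.php', { email: email })
    .done(function () {
        alert('Email saved.');
    })
    .fail(function () {
        alert('Could not save email.');
    });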
So I've been researching this for a couple days and haven't come up with anything conclusive. I'm trying to create a (very) rudimentary liveblogging setup because I don't want to pay for something like CoverItLive. My process is: Local HTML file > Cloud storage (Dropbox/Drive/etc) > iframe on content page. All that works, and with some CSS even looks pretty nice despite the less-than-awesome approach.
But here's the thing: the liveblog itself is made up of an HTML table, and I have to manually copy/paste the code for a new row, fill in the timestamp, write the new message, and save the document (which then syncs with the cloud and shows up in the iframe). To simplify the process I've made another HTML file which I intend to run locally and use to add entries to the table automatically. At the moment it's just a bunch of input boxes and some javascript to automate the timestamp and write the table row from the input data.
Code, as it stands now: http://jsfiddle.net/LukeLC/999bH/
What I'm looking to do from here is find a way to somehow export the generated table data to another .html file on my hard drive. So far I've managed to get this code...
if (document.documentElement && document.documentElement.innerHTML) {
    var a = document.getElementById("tblive").innerHTML;
    a = a.replace(/</g, '&lt;');
    var w = window.open();
    w.document.open();
    w.document.write('<pre><tblive>\n' + a + '\n</tblive></pre>');
    w.document.close();
}
...to open just the generated table code in a new window, and sure, I can save the source from there, but the whole point is to eliminate steps like that from the process.
How can I tell the page to save the generated code to a separate .html file when I click on the 'submit' button? Again, all of this happens locally, not on a server.
I'm not very good with JavaScript (and maybe a different language will be necessary), but any help is much appreciated.
I suppose you could do something like this:
var myHTMLDoc = "<html><head><title>mydoc</title></head><body>This is a test page</body></html>";
var uri = "data:application/octet-stream;base64,"+btoa(myHTMLDoc);
document.location = uri;
BTW, btoa might not be cross-browser; I think modern browsers all have it, but older versions of IE don't. AFAIK base64 isn't even needed; you might be able to get away with
var uri = "data:application/octet-stream,"+myHTMLDoc;
The drawback with this is that you can't set the filename when it gets saved.
You can't do this with JavaScript alone, but you can have an HTML5 link that opens the save dialogue:
<a href="pageToDownload.html" download>Download</a>
You could add some smarts to automate it on the processed page after the POST.
fiddle : http://jsfiddle.net/ghQ9M/
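As a client-only variation on the same idea (a sketch, not what the fiddle shows: it combines the download attribute with the table already generated in the page, using a Blob object URL; the downloadTable name and liveblog.html filename are made up):

// Sketch: wrap the generated table in a minimal HTML document and trigger a
// download via a temporary <a download> link. "liveblog.html" is arbitrary.
function downloadTable() {
    var html = "<!DOCTYPE html>\n<html><body>\n" +
        document.getElementById("tblive").outerHTML +
        "\n</body></html>";
    var blob = new Blob([html], { type: "text/html" });
    var link = document.createElement("a");
    link.href = URL.createObjectURL(blob);
    link.download = "liveblog.html";
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
    URL.revokeObjectURL(link.href);
}

This also sidesteps the filename limitation of the data URI approach mentioned above.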
Simple answer: you can't.
JavaScript is restricted from performing such operations for security reasons.
The best way to accomplish that would be to call a server page that writes the new file on the server, then from JavaScript perform a POST request to that server page, passing the data you want written to the new file.
If you want the user to save the page to their own file system, that is a different problem, and the best approach would be to notify the user / ask them to save the page; that page could be your new window, as you are already doing with w.open().
Let me give you a demonstration:
//assuming you know jQuery or are willing to use it :)
var html = $("#tblive").html().replace(/</g, '&lt;');

//generating your download button
$.post('generate_page.php', { content: html })
    .done(function (data) {
        var filename = data;
        //inject some html to allow the user to navigate to the new page (example)
        $('#tblive').parent().append(
            '<a href="' + filename + '">Check your Dynamic Page!</a>');
        // data here is the response from the server, so you can return
        // your new dynamic page's file name here,
        // and maybe do some window.location = "new page";
    });
On the server side, something like this:
<?php
if ($_REQUEST["content"]) {
    $pagename = uniqid("page_", true) . '.html';
    file_put_contents($pagename, $_REQUEST["content"]);
    echo $pagename;
}
?>
Some notes: I haven't tested the example, but it works in theory. The effort to implement it should be minimal, assuming this solves your problem.
A server-based solution:
You'll need to set up a server (or your PC) to serve your HTML page with headers that tell your browser to download the page instead of rendering the HTML markup. If you want to do this on your local machine, you can use software such as WAMP (or MAMP for Mac, or LAMP for Linux), which is basically a web server in a .exe. It's a lot of hassle, but it'll work.
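To illustrate the headers involved (this answer suggests an Apache stack like WAMP; the Node.js sketch below is only an assumed illustration of the Content-Disposition idea, with the liveblog.html name made up):

// Sketch of the "download instead of render" headers using Node's built-in
// http module. WAMP/Apache would set the same headers via its own config;
// the file name and port here are arbitrary.
var http = require("http");
var fs = require("fs");

http.createServer(function (req, res) {
    res.writeHead(200, {
        "Content-Type": "application/octet-stream",
        "Content-Disposition": "attachment; filename=\"liveblog.html\""
    });
    fs.createReadStream("liveblog.html").pipe(res);
}).listen(8080);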