The idea is to make a widget or "chunk" of a website that can be inserted into someone else's webpage by adding a script tag pointing to my JavaScript file hosted on Amazon S3, plus a div that I will insert content into. I also uploaded the CSS and HTML files to Amazon, but when I try to fetch them from the JavaScript I get errors. I tried variations of the code below and got various errors, most recently a 403 Forbidden. I made the files public on Amazon too. Please let me know if you have a suggestion/solution!
var css_link = $("<link>", {
    rel: "stylesheet",
    type: "text/css",
    href: "https://s3.amazonaws.com/lawkickstas/lawkick.css"
});
css_link.appendTo('head');

var jsonp_url = "https://s3.amazonaws.com/lawkickstas/lawkick_html.js";
$.ajax({
    url: jsonp_url,
    dataType: 'jsonp',
    success: function (dataWeGotViaJsonp) {
        console.log(dataWeGotViaJsonp);
    }
});
Just a guess from the limited info we have on the error: S3 requires a specific setting when its files are accessed from another domain.
You might want to check this: Enabling Cross-Origin Resource Sharing - Amazon Simple Storage Service
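To make that concrete, a minimal CORS configuration for the bucket would look something like the following (the wildcard origin is just for illustration; in practice you would restrict it to the domains that will embed the widget). It goes in the bucket's permissions under the CORS section:

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET"],
        "AllowedOrigins": ["*"],
        "ExposeHeaders": []
    }
]
```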
Related
I'm using Apify to crawl about a hundred pages, and I want to download the HTML files of all the pages I visit into a Dropbox folder. How do I specify this in my jQuery Ajax code?
Sorry in advance, I am quite new to JavaScript and everything web-related.
I've already tried to follow these tutorials: https://forum.apify.com/t/data-download-from-within-crawler/48 and https://www.dropbox.com/developers/documentation/http/documentation#files-save_url; however, I am only able to download the HTML file of the second page I visit. I know that my crawler works perfectly fine and visits all the sites it needs to, as I am getting the results I need from those pages, so the problem seems to be that I am not specifying that I want to download all the HTML files. How do I do this?
(In my actual code I have written in the correct OAuth token; I just don't want it to be available online for everyone to see.)
var html = $('html').html();
var url = "https://content.dropboxapi.com/2/files/upload";
$.ajax({
    url: url,
    contentType: "application/octet-stream",
    headers: {
        "Authorization": 'Bearer ' + 'My OAuth token',
        "Dropbox-API-Arg": "{\"mode\":\"add\",\"path\":\"/a.txt\"}"
    },
    type: 'POST',
    data: html,
    autoRename: true,
    max_results: 1000
});
What I am getting out of this is one file saved as a.txt in my Dropbox, which is what I wanted, except that this file contains only one page's HTML, not the HTML of all the pages my crawler visited.
This code is the first thing my crawler runs for every new page it visits.
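One thing to note: because the upload always targets the fixed path "/a.txt", every page's upload overwrites the previous one. A minimal sketch of one way around this is to derive a distinct file name from the page URL (the sanitising rules below are my own assumption, not anything Dropbox requires):

```javascript
// Hypothetical helper: build a distinct Dropbox path for each crawled page
// so the uploads don't all land on "/a.txt".
function pathForPage(pageUrl) {
    // Strip the scheme and replace awkward characters with underscores.
    var safe = pageUrl.replace(/^https?:\/\//, '')
                      .replace(/[^a-zA-Z0-9.-]/g, '_');
    return '/' + safe + '.html';
}

// Then, in the Ajax call, replace the fixed path:
// "Dropbox-API-Arg": JSON.stringify({ mode: "add", path: pathForPage(window.location.href) })
```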
I'm trying to use Ajax to load all the pictures inside one local folder onto my HTML page. The code references this question. The files run on a (Tomcat 8.5) server in Eclipse, and I open the URL in Google Chrome. The Ajax call then fails according to the console:
GET /Users/jiaqni/.../WebContent/upload 404 ()
Any idea what I did wrong? The relative path "dir='upload/';" doesn't work either. Thanks guys!
<script>
$(document).ready(function () {
    console.log("Image appending...");
    var dir = "/Users/jiaqni/.../WebContent/upload/";
    // Note the doubled backslashes: "\." in a string literal is just "."
    var regexp = new RegExp("\\.png|\\.jpg|\\.jpeg");
    $.ajax({
        url: dir,
        success: function (data) {
            // List all .png .jpg .jpeg file names in the page
            console.log("Success!");
            $(data).find("a").filter(function () {
                return regexp.test($(this).text());
            }).each(function () {
                var filename = this.href.replace(window.location, "");
                ...
            });
        }
    });
});
</script>
.htaccess was added to the folder /User/.../upload/ to ensure it's browsable. And without Ajax, <img src="upload/xxx.jpeg"/> does display an image from that folder.
I am guessing that the URL in question refers to a local resource on your computer.
Unfortunately, this is not possible: browsers (e.g., Google Chrome) usually prevent you from doing so, due to the privacy and security issues that allowing it would raise.
You should put your files on your web server (e.g., Apache, nginx, etc.) and adjust the URL of the AJAX request accordingly.
Good luck.
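To make the answer concrete: if the upload folder were deployed inside the web app (e.g., kept under WebContent/upload/ so Tomcat serves it with the rest of the application), the Ajax URL becomes context-relative instead of an absolute filesystem path. A sketch, assuming directory listings are enabled on the server:

```javascript
// Request the folder through the server, not through the filesystem.
// "upload/" resolves against the page's own URL once Tomcat serves the folder.
var dir = "upload/";
// $.ajax({ url: dir, success: function (data) { /* parse the listing as before */ } });
```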
I am trying to log data to my console but I keep getting an error. What might be wrong with the code? I'm actually trying to display the HTML content on a page. Here is my snippet:
$(function () {
    $.ajax({
        url: '//en.wikipedia.org/w/api.php?action=parse&format=json&page=pizza&prop=text&section=0&contentmodel=wikitext&formatversion=2',
        dataType: 'jsonp',
        success: function (data) {
            console.log(data);
        }
    });
});
net::ERR_FILE_NOT_FOUND error message
Your URL starts with // so it uses whatever scheme the URL of the HTML document the JS is running in uses.
If you load the HTML document over HTTP then you try to access Wikipedia over HTTP.
If you load the HTML document over HTTPS then you try to access Wikipedia over HTTPS.
If you load the HTML from a local file, then you try to access Wikipedia from your own hard disk … where it doesn't exist.
Use an absolute URL or, better, test your webpages on a web server.
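In other words, pinning the scheme makes the request independent of how the page itself was loaded. A sketch of the fixed call:

```javascript
// Absolute URL with an explicit scheme, so it also works when the page
// is opened from a local file (file://).
var apiUrl = 'https://en.wikipedia.org/w/api.php' +
    '?action=parse&format=json&page=pizza&prop=text&section=0' +
    '&contentmodel=wikitext&formatversion=2';
// $.ajax({ url: apiUrl, dataType: 'jsonp', success: function (data) { console.log(data); } });
```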
I am trying to add some emails, taken from an input box, to a .txt file on my web server. Here is the code:
email = document.getElementById("mail").value;
$.ajax({
    url: 'maillist.txt',
    dataType: 'text',
    type: 'PUT',
    data: email + '; ',
    success: function (data) {
        alert('Should have worked, yay!');
    }
});
but that doesn't work in any browser. :(
I have tried classic JavaScript methods, but that was a no-go as well...
I need either a PUT or a POST method, in either jQuery or plain JS, that works on Internet Explorer 8 and up as well as Firefox and Chrome. Emails should appear in the text file as
email1#cooldomain.com; email2#cooldomain.com; .....
just so it works with our in-house VBA macro. :)
Also, could there be a method for dropping data into XML files (i.e., creating a new XML entry from form data)? And is it possible to upload a file from the client side to the server side using jQuery? I would need users to fill out forms, drop their data into an XML file, and link a file they choose with that. That way they could add things to the XML themselves, and those would show up brand new on the webpage.
Kind of "reddit" or "4chan" like, if you know the references.
Thanks for your time, really appreciated!
You can't post from a browser to a text file on the server side. You need to have some sort of code on the server side that will receive the HTTP PUT, and persist the data to a file local to the server.
I want to download an image file (a JPEG image) to the user's file system when the user selects a photo and clicks a button. So far I have searched and found this link and also this.
I saw in one blog that Downloadify, when used with JSZip, can enable this feature, but it didn't elaborate any further. Does anyone know how to download the images? I have the links to the images, and I just want the user to download one to his system rather than query the server again.
Can anyone provide me with an example please?
Use the HTML5 download attribute.
Optionally, you can set the file name as the attribute's value.
<a href="/images/image-name.jpg" download="new-image-name.jpg">
You can load the image into a canvas element, get the data URL of the canvas, and open a new window with the data URL as the source. Take a look at this example: https://gist.github.com/1875132
Finally I did it. For anyone who may need this in the future, here is how I did it using jQuery:
jQuery.ajax({
    url: 'http://localhost:8080/yourwebsite/servlet?img=' + document.getElementById(id).alt,
    //data: myData,
    type: 'GET',
    crossDomain: true,
    dataType: 'jsonp'
    // success: function() { alert("Success"); },
    // error: function() { alert('Failed!'); },
    // beforeSend: setHeader
});
In doing this I came across the problem of cross-domain HTTP requests, which are usually blocked by most websites unless you follow some lengthy process. So I made it simpler: I call a GET method on my servlet and pass it the URL of the image, and the servlet downloads the image. This is much easier to do (and even easier than this if you are on the same domain), but that didn't meet my requirements, and this code worked for me :)