Altering the DOM with the Play! Framework - javascript

Hello again StackOverflow.
I've been tasked with modifying a website that runs on Scala's Play! framework and Twitter Bootstrap. I've hit a roadblock concerning altering the DOM. I need to accomplish the following:
(The page in question takes user input and passes the server a Form which, if valid, writes the mapped Data in the Form to a database.)
Have the user choose a category from a drop-down. This particular drop-down has nothing to do with the Form.
Based on their choice, query the database for all objects of a certain type that relate to the chosen category via a foreign key.
Alter the DOM (that is, show without reloading the page) to display those objects for the user to select them. Their selections are added to the Form.
Submit the Form, write to the database, etc.
Questions:
Is this a good way to go about what I'm trying to accomplish?
If so, is there a way to alter the DOM via Scala/Play HTML templates without reloading the page?
If that's not possible, what ilk of manually written Javascript is necessary?
Admissions:
I have very little experience with web development other than Play.
I have very little experience with Javascript.
Resources I've been looking at:
This SO post
Play docs on Javascript routing
Scala.js
Thank you!

For anyone who might come upon this in future, the short answer is Javascript.
Long answer:
To do any AJAX work, you'll need a method like the following in your top-level Controller to set up Javascript routing:
def javascriptRoutes = Action { implicit request =>
  Ok(
    Routes.javascriptRouter("jsRoutes")(
      routes.javascript.SomeOtherController.someMethod // someMethod returns a JsValue!
    )
  ).as("text/javascript")
}
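That action also needs an entry in conf/routes so the template below can reverse-route to it. Something along these lines should work; the path itself is arbitrary:
GET     /javascriptRoutes           controllers.ApplicationController.javascriptRoutes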
Then in the HTML template (*.scala.html) which will contain some AJAXy Javascript:
<script type="text/javascript" src="#routes.ApplicationController.javascriptRoutes"></script>
And finally in your actual JS file (assuming you're using jQuery):
$("someSelector").click(function() {
// Notice that this matches the method name that exists in Scala!
// Make sure to pass `someMethod` what it needs.
var req = jsRoutes.controllers.SomeOtherController.someMethod(...)
$.ajax({
url: req.url,
type: req.type,
success: function(json) {
// DOM manipulation, etc., here.
},
error: function(xhr, status, errorThrown) {
console.log( "Error: " + errorThrown );
console.log( "Status: " + status );
console.dir( xhr );
}
}); // ajax
}); // handler
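To round out step 3 of the question (showing the returned objects so the user can add them to the Form), the success callback above could delegate to something like the sketch below. The container id, the id/name fields on the returned JSON, and the checkbox markup are all assumptions for illustration, not part of the original answer; the input name has to match whatever your Play Form mapping expects for repeated values.
// Rough sketch of the DOM manipulation: render each returned object as a checkbox
// inside an (assumed) empty <div id="categoryObjects"> that sits inside the form.
function showCategoryObjects(json) {
    var container = $("#categoryObjects").empty();
    $.each(json, function(i, obj) {
        container.append(
            '<label><input type="checkbox" name="selectedObjects[]" value="' +
            obj.id + '"> ' + obj.name + '</label><br>'
        );
    });
}
// In the $.ajax options above: success: showCategoryObjects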

Related

Save HTML DOM to file on server

I have an html5 web page that allows users to drag-n-drop objects between divs. After a user has moved objects around, I would like to save the current DOM to a file on my web server.
I know I can get the current HTML DOM using javascript but of course, I cannot save to a file on my server using javascript. So I thought about passing the html to a PHP page to do the "save" function, but I cannot figure out how to get the html passed to a PHP page. I've tried sending it as an argument in the URL with URI encoding, but the PHP page is not properly getting the entire string from the URL.
Should this approach work? If so, what am I missing to get the html string passed correctly to a PHP page? Or should I be using some other method?
AJAX is the way to go here. If you are not familiar with AJAX, please google it and learn it well; any modern web app needs AJAX integration in some way.
Here is how you can use JavaScript to communicate with the server.
Please note I'm using jQuery:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.2.0/jquery.min.js"></script>
<script>
$(document).ready(function(){
$(".draggableDivs").mouseup(function(event){
var documentStructure = ''; // whatever js you use to get document structure
var d = {"document_structure": documentStructure};
$.ajax({
url: "test.php", //Your url both relative and fixed path will work
type: "POST", // you need post not get because you are sending a lot of data
data: d,
success: function(response) {
alert('saved');
},
error: function (XMLHttpRequest, textStatus, errorThrown) {
alert(errorThrown);
}
});
});
});
</script>
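The documentStructure placeholder above is just whatever serialization of the DOM you want to persist. One simple option, assuming the drag-and-drop happens inside a wrapper element (the #dragArea id here is hypothetical):
// Capture the current markup of the drag-and-drop area as a string.
var documentStructure = document.getElementById("dragArea").outerHTML;
// or, with jQuery, just the inner markup:
// var documentStructure = $("#dragArea").html();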
On the server you would then run your PHP and save the data.
When you are done you can respond with a JSON object if needed; if not, just exit.
Another approach you can use:
After the drag and drop completes, use JavaScript to show a button to save the DOM.
On the button's click event, capture the current DOM in a variable.
Use AJAX to transfer the captured DOM to a PHP file (the AJAX endpoint).
In that PHP file, save it to the database.

Get Twitter Feed as JSON without authentication

I wrote a small JavaScript a couple of years ago that grabbed a user's (mine) most recent tweet and then parsed it out for display, including links, date, etc.
It used this json call to retrieve the tweets and it no longer works.
http://twitter.com/statuses/user_timeline/radfan.json
It now returns the error:
{"errors":[{"message":"Sorry, that page does not exist","code":34}]}
I have looked at using the API version (below), but it requires authentication, which I would rather avoid since this is just to display my latest tweet on my website, and it's public anyway on my profile page:
http://api.twitter.com/1/statuses/radfan.json
I haven't kept up with Twitter's API changes as I no longer really work with it, is there a way round this problem or is it no longer possible?
Previously the Search API was the only Twitter API that didn't require some form of OAuth. Now it does require auth.
Twitter's Search API came from a third-party acquisition; they rarely support it and seem unenthused that it even exists. On top of that, the payload has many limitations, including but not limited to a severely reduced set of key:value pairs in the JSON or XML file you get back.
When I heard this, I was shocked. I spent a LONG time figuring out how to use the least amount of code to do a simple GET request (like displaying a timeline).
I decided to go the OAuth route to be able to ensure a relevant payload. You need a server-side language to do this. JavaScript is visible to end users, and thus it's a bad idea to include the necessary keys and secrets in a .js file.
I didn't want to use a big library, so the answer for me was PHP and help from @Rivers' answer here. The answer below it by @lackovic10 describes how to include queries in your authentication.
I hope this helps others save time thinking about how to go about using Twitter's API with the new OAuth requirement.
You can access and scrape Twitter via advanced search without being logged in:
https://twitter.com/search-advanced
GET request
When performing a basic search request you get:
https://twitter.com/search?q=Babylon%205&src=typd
q (our query encoded)
src (assumed to be the source of the query, i.e. typed)
By default, Twitter returns the top 25 results, but if you click on All you can get the realtime tweets:
https://twitter.com/search?f=realtime&q=Babylon%205&src=typd
JSON contents
More Tweets are loaded on the page via AJAX:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd&include_available_features=1&include_entities=1&last_note_ts=85&max_position=TWEET-553069642609344512-553159310448918528-BD1UO2FFu9QAAAAAAAAETAAAAAcAAAASAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
Use max_position to request the next tweets
The following json array returns all you need to scrape the contents:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd
has_more_items (bool)
items_html (html)
max_position (key)
refresh_cursor (key)
DOM elements
Here is a list of DOM elements you can use for extraction:
The author's Twitter handle
div.original-tweet[data-tweet-id]
The name of the author
div.original-tweet[data-name]
The user ID of the author
div.original-tweet[data-user-id]
Timestamp of the post
span._timestamp[data-time]
Timestamp of the post in ms
span._timestamp[data-time-ms]
Text of the tweet
p.tweet-text
Number of retweets
span.ProfileTweet-action--retweet > span.ProfileTweet-actionCount[data-tweet-stat-count]
Number of favorites
span.ProfileTweet-action--favorite > span.ProfileTweet-actionCount[data-tweet-stat-count]
Resources
https://code.recuweb.com/2015/scraping-tweets-directly-from-twitter-without-authentication/
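Putting the JSON endpoint and the selectors above together, a client-side sketch of the scraping flow might look like the following. Treat it purely as an illustration of the approach described in this answer: the endpoint is unofficial and has since changed, and a browser would also hit cross-origin restrictions, so in practice you would run something like this server-side or through a proxy.
// Sketch: fetch one page of realtime search results and pull the tweet text out of items_html.
$.getJSON('https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd', function (data) {
    var page = $('<div></div>').html(data.items_html);   // items_html (html), as listed above
    page.find('p.tweet-text').each(function () {
        console.log($(this).text());                     // text of each tweet
    });
    // Pass data.max_position back as &max_position=... to request the next tweets.
});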
If you're still looking for unauthenticated tweets in JSON, this should work:
https://github.com/cosmocatalano/tweet-2-json
As you can see in the documentation, using the REST API you'll need OAuth tokens in order to do this. Luckily, we can use the Search API (which doesn't use OAuth) with the from:[USERNAME] operator.
Example:
http://search.twitter.com/search.json?q=from:marcofolio
Will give you a JSON object with tweets from that user, where
object.results[0]
will give you the last tweet.
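As a minimal sketch of that (note the unauthenticated search.twitter.com endpoint has since been retired, so this is only an illustration of the approach; the fiddle in the last answer below does essentially the same thing):
$.getJSON('http://search.twitter.com/search.json?q=from:marcofolio&callback=?', function (data) {
    var lastTweet = data.results[0];   // the last tweet, as described above
    console.log(lastTweet.text);       // its text
});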
Here is a quick hack (really a hack, and it should be used with caution as it's not future-proof) which uses http://anyorigin.com to scrape the Twitter site for the latest tweets.
http://codepen.io/JonOlick/pen/XJaXBd
It works by using anyorigin (you have to pay to use it) to grab the HTML. It then parses the HTML using jQuery to extract the relevant tweets.
Tweets on the mobile site use a div with the class .tweet-text, so this is pretty painless.
The relevant code looks like this:
$.getJSON('http://anyorigin.com/get?url=mobile.twitter.com/JonOlick&callback=?', function(data){
    // Remap ... utf8 encoding to ascii.
    var bar = data.contents;
    bar = bar.replace(/…/g, '...');
    var el = $('<div></div>');
    el.html(bar);
    // Change all links to point back at twitter
    $('.twitter-atreply', el).each(function(i){
        $(this).attr('href', "https://twitter.com" + $(this).attr('href'));
    });
    // For all tweets
    $('.tweet-text', el).each(function(i){
        // We only care about the first 4 tweets
        if(i < 4) {
            var foo = $(this).html();
            $('#test').html($('#test').html() + "<div class=ProfileTweet><div class=ProfileTweet-contents>" + foo + "</div></div><br>");
        }
    });
});
You can use a Twitter API wrapper, such as TweetJS.com, which offers a limited set of the Twitter API's functionality but does not require authentication. It's called like this:
TweetJs.ListTweetsOnUserTimeline("PetrucciMusic",
    function (data) {
        console.log(data);
    });
You can use the Twitter API v1 to get the tweets without using OAuth. For example: this link returns @jack's last 100 tweets.
The timeline documentation is here.
The method "GET statuses/user_timeline" need a user Authentification like you can see on the official documentation :
You can use the search method "GET search" wich not require authentification.
You have a code for starting here : http://jsfiddle.net/73L4c/6/
function searchTwitter(query) {
    $.ajax({
        url: 'http://search.twitter.com/search.json?' + jQuery.param(query),
        dataType: 'jsonp',
        success: function(data) {
            var tweets = $('#tweets');
            tweets.html('');
            for (res in data['results']) {
                tweets.append('<div>' + data['results'][res]['from_user'] + ' wrote: <p>' + data['results'][res]['text'] + '</p></div><br />');
            }
        }
    });
}

$(document).ready(function() {
    $('#submit').click(function() {
        var params = {
            q: $('#query').val(),
            rpp: 5
        };
        // alert(jQuery.param(params));
        searchTwitter(params);
    });
});

Auto populate text_fields based on selected item from another collection_select in Rails 3

Hello people
I'm trying to figure this out, but I still can't do it.
I have a Rails 3 app; I'm working with invoices and payments. In the form for payments I have a collection_select where I display all the invoice numbers (extracted from a Postgres database), and what I'm trying to do is, when I select an invoice, auto-populate other text_fields (provider, address, etc.) in the same form without reloading the page.
I know I should use AJAX, JS, jQuery, but I'm a beginner in these languages, so I don't know how or where to start.
Hope you can help me... thanks
What you are going to want to do is route an AJAX call to a controller, which will respond with JSON containing the information. You will then use jQuery to populate the different fields.
In your routes:
get "invoice/:id/get_json", :controller=>"invoice", :action=>"get_json"
In your invoice_controller:
def get_json
  invoice = Invoice.find(params[:id]) # :id comes from the route above
  render :text => invoice.to_json
end
In your invoice model (if the default to_json method is not sufficient):
def to_json
  json = "{"
  json += "id:'#{self.id}'"
  json += ",date_created:'#{self.date}'"
  # ... add other data you want to have here later
  json += "}"
end
In your JavaScript file:
$("#invoice_selecter").change(function(){ // called when the selected value changes
    $.get("/invoice/" + $(this).val() + "/get_json", function(data, status, xhr){ // AJAX call to the invoice route set up above
        data = eval("(" + data + ")"); // turn the response text into a javascript object (wrapped in parens so eval treats it as an object literal)
        $("#field_1").val(data.date_created); // set the value of the fields to the data returned
        ...
    });
});
You are probably going to run into a few issues. I would highly recommend installing Firebug if you are not on Google Chrome, and if you are, make sure you are using the developer tools (open them by right-clicking and choosing Inspect Element). Through these you should be able to monitor the AJAX request and whether or not it succeeded.

Convert many GET values to AJAX functionality

I have built a calendar in PHP. It can currently be controlled by GET values from the URL. Now I want the calendar to be managed and displayed using AJAX instead, so that the page does not need to be reloaded.
How do I best do this with AJAX? More specifically, I wonder what to do with all the GET values; there are quite a few. The only solution I can come up with is each link in the calendar having an onclick statement passing a great many attributes (the GET attributes), which feels like the wrong way.
Please help me.
Edit: How should this code be changed to make it work?
$('a.cal_update').bind("click", function (event)
{
    event.preventDefault();
    var update_url = $(this).attr("href");
    $.ajax({
        type : "GET"
        , dataType : 'json'
        , url : update_url
        , async : false
        , success : function(data)
        {
            $('#calendar').html(data.html);
        }
    });
    return false;
});
Keep the existing links and forms, build on things that work
You have existing views of the data. Keep the same data but add additional views that provide it in a clean data format (such as JSON) instead of a document format (like HTML). Add a query string parameter or HTTP header that you use to decide which view to return.
Use a library (such as YUI 3, jQuery, etc) to bind event handlers to your existing links and forms to override the normal activation functionality and replace it with an Ajax call to the alternative view.
Use pushState to keep your URLs bookmarkable.
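Tying that back to the asker's snippet above, the missing pieces are a parameter that selects the JSON view and the pushState call. A rough sketch follows; the ajax=1 parameter name is an assumption, so use whatever your server actually checks to decide which view to render.
$('a.cal_update').bind("click", function (event) {
    event.preventDefault();
    var update_url = $(this).attr("href");                 // keeps all of the existing GET values
    $.getJSON(update_url, { ajax: 1 }, function (data) {   // ajax=1 asks the server for the JSON view
        $('#calendar').html(data.html);                    // swap in the new calendar markup
        history.pushState(null, '', update_url);           // keep the URL bookmarkable
    });
});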
You can return a JSON string from the server and handle it with Ajax on the client side.

Capture all links including form submission

I am wondering how to capture all links on a page using jQuery. The idea being similar to Facebook. In Facebook, if you click on a link it captures the link and loads the same link using ajax. Only when you open a link in new tab etc. will it load the page using regular call.
Any clue on how to achieve this kind of functionality? I'm sure capturing links should not be a problem, but what about capturing form submissions, submitting the entire form data via AJAX, and then displaying the results?
Is there any plugin which already exists?
Thank you for your time.
Alec,
You can definitely do this.
I have a form that is handled in just this way. It uses the jQuery Form Plugin kgiannakakis mentioned. The example JavaScript below shows how it might work.
$("form").ajaxForm({
beforeSubmit: function(){
//optional: startup a throbber to indicate form is being processed
var _valid = true;
var _msg = '';
//optional: validation code goes here. Example below checks all input
//elements with rel attribute set to required to make sure they are not empty
$(":input [rel='required']").each(function(i){
if (this.value == '') {
_valid = false;
_msg += this.name + " may not be empty.\n";
$(this).addClass("error");
}
});
alert(_msg);
return _valid;
},
success: function(response){
//success here means that the HTTP response code indicated success
//process response: example assumes JSON response
$("body").prepend('<div id="message" class="' + response.status + '"></div>');
$("#message").text(response.message).fadeIn("slow", function(){
$(this).fadeOut("slow").remove();
});
}
});
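For the "capture every link and form on the page" half of the question, a delegated-handler sketch might look like the following. The #content container and the same-origin check are my assumptions, and this ignores the history and validation caveats raised in the last answer below.
// Sketch: intercept same-origin link clicks and form submissions (jQuery 1.7+)
// and load the result into a hypothetical #content container instead of navigating.
$(document).on('click', 'a', function (event) {
    var href = $(this).attr('href');
    if (!href || this.hostname !== location.hostname) {
        return;                        // let external links behave normally
    }
    event.preventDefault();
    $('#content').load(href);          // fetch the target page via Ajax and inject it
});

$(document).on('submit', 'form', function (event) {
    event.preventDefault();
    var $form = $(this);
    $.ajax({
        url: $form.attr('action'),
        type: $form.attr('method') || 'GET',
        data: $form.serialize(),       // send the whole form via Ajax
        success: function (html) {
            $('#content').html(html);  // show the result without a full page load
        }
    });
});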
The Form Plugin can transform a regular form into an Ajax one:
$("#myForm").ajaxForm(
    {beforeSubmit: validate, success: showResponse} );
It would be difficult, however, to do what you want for an arbitrary form. What if the form uses validation, or is submitted by Ajax to begin with? The same applies to links: what if there are JavaScript navigation scripts (window.location = url)? If you don't have full control of the page, it will be difficult to do what you want.
Pages like Facebook usually have each event and each form coded separately, since the server-side endpoints are usually set up for a single operation or group of operations. I doubt there is a clean way to convert a page with just a plug-in, and if there is, I see a lot of overhead.
You can do it by hand, but again, that's an abuse of Ajax. This isn't Flash, and by using Ajax for all server communication you run into a lot of problems:
Lack of history tracking.
Watching out for concurrent events and the results of thereof.
Communicating to the user that the page is changing.
Users with javascript turned off.
And much more...
