How to Call Google Apps Script from Web Page - javascript

Have searched high and low for this. I have a web page of basic HTML/CSS/JS. I want users to be able to visit the page and, upon opening it, have a call made to a google script I made which takes information from a spreadsheet and displays some of it on the page. I am hoping I don't have to do any fancy setup like in Google's tutorials, because none of them were helpful to me.
My Webpage ----> Google Script ----> Google Spreadsheet
My Webpage <---- Google Script <---- Google Spreadsheet
Users should be able to select an item shown on the webpage (item populated from spreadsheet) and click a button which will allow users to enter a new page with a URL derived from the selected item.
This is essentially a chat room program where the chat rooms are stored on a spreadsheet. I want users to be able to create a new chat room as well which should update the google spreadsheet.

Look into using GET parameters. Here's a previous question on the topic: https://stackoverflow.com/a/14736926/2048063
You can access the parameters passed by GET in your doGet(e) function using e.parameter. If you call http://script.google......./exec?method=doSomething, then
function doGet(e) {
  Logger.log(e.parameter.method);
}
doSomething will be written to the log, in this case.
Returning data from the script can be done using the ContentService, which lets you serve JSON (which I recommend). JSON is the easiest format (in my opinion) to produce on the GAS end and to consume on the client end.
The initial "populate list" call would look something like this. I will write it in jQuery because I feel that is very clean.
var SCRIPT_URL = "http://script.google.com/[....PUT YOUR SCRIPT URL HERE....]/exec";

$(document).ready(function() {
  $.getJSON(SCRIPT_URL + "?callback=?",
    {method: "populate_list"},
    function (data) {
      alert(JSON.stringify(data));
    });
});
And the corresponding GAS that produces this.
function doGet(e) {
  if (e.parameter.method == "populate_list") {
    var v = {cat: true, dog: false, meow: [1, 2, 3, 4, 5, 6, 4]}; // could be any value that you want to return
    return ContentService.createTextOutput(e.parameter.callback + "(" + JSON.stringify(v) + ")")
      .setMimeType(ContentService.MimeType.JAVASCRIPT);
  }
}
This method is called JSONP, and it is supported by jQuery. jQuery recognizes it when you put the ?callback=? after your URL. It wraps your output in a callback function, which allows that function to be run on your site with the data as an argument. In this case, the callback function is the one defined in the line that reads function (data) {.
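The hard-coded object above is just a placeholder; to populate the list from your spreadsheet you would read the sheet inside that same branch before serializing. A minimal sketch, assuming a spreadsheet ID of your own and a "Rooms" sheet whose first column holds the room names under a header row:
function doGet(e) {
  if (e.parameter.method == "populate_list") {
    // Assumption: substitute your own spreadsheet ID and sheet name.
    var sheet = SpreadsheetApp.openById("YOUR_SPREADSHEET_ID").getSheetByName("Rooms");
    var rows = sheet.getDataRange().getValues();
    // Drop the header row and keep the first column (the room name).
    var rooms = rows.slice(1).map(function(row) { return row[0]; });
    // Wrap in the JSONP callback exactly as above.
    return ContentService.createTextOutput(e.parameter.callback + "(" + JSON.stringify({rooms: rooms}) + ")")
      .setMimeType(ContentService.MimeType.JAVASCRIPT);
  }
}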

Related

Javascript file for multiple html pages

I am a beginner in Javascript. I am currently working on a Phonegap app. I am stuck because I have 4 html pages for the signup process, and I have to pass the input values from all of them to a single js file so that, at the end, all the data can be POSTed to a server URL. I have also read on many sites that using the same js file for every page of your site is recommended to speed up the site. So I have two problems to solve. I searched on many sites but could not find an accurate answer.
I need to pass 4 html page's input value to single js file.
I have to make single js file for both sign-in and sign-up.
My code for the JS file is:
var firstName="";
var lastName="";
var email="";
var password="";
var retypePassword="";
var gender="";
var DOB="";
var institute="";
var course="";
var branch="";
var semester="";
var teachers = [];
function signUpStarting() {
  alert(firstName + " " + lastName + " " + email + " " + password + " " + retypePassword + " " + gender + " " + DOB + " " + institute + " " + course + " " + branch + " " + semester + " " + teachers.join(","));
}
function signUp1() {
  firstName = $("#first_name").val().trim();
  lastName = $("#last_name").val().trim();
  email = $("#email").val().trim();
  password = $("#password").val();
  retypePassword = $("#retype_password").val();
  alert(firstName + " " + lastName + " " + email + " " + password + " " + retypePassword);
}
function signUp2() {
  gender = $('#gender').find(":selected").text();
  DOB = $('#DOB').val();
  alert(gender + " " + DOB);
}
function signUp3() {
  institute = $('#institute').find(":selected").text();
  course = $('#course').find(":selected").text();
  branch = $('#branch').find(":selected").text();
  semester = $('#semester').find(":selected").text();
  alert(institute + " " + course + " " + branch + " " + semester);
}
function signUp4() {
  $(":checkbox").each(function() {
    if ($(this).is(':checked')) {
      teachers.push($('label[for="' + this.id + '"]').text());
    }
  });
  signUpStarting();
}
In the html pages I am calling the JS functions for each page:
On first page:
<a onclick="signUp1()" href="register-two.html">continue</a>
On second page:
<a onclick="signUp2()" href="register-three.html">continue</a>
On third page:
<a onclick="signUp3()" href="register-four.html">continue</a>
On fourth page:
<a onclick="signUp4()">continue</a>
On each transition from one page to the next I have set an alert in JS, and I am getting the alert with the correct values. But after clicking the continue button on the fourth html page, control passes to the main signup function, and the alert in signUpStarting() shows only the fourth page's values; the other variables are empty.
I don't understand how to keep the variable values without using localStorage or cookies and without POSTing all the data to the server. And I think this would be easier if I knew how to wire all the html pages of my site to a single JS file.
Please help me!
I don't understand how to keep the variable values without using localStorage or cookies and without POSTing all the data to the server. And I think this would be easier if I knew how to wire all the html pages of my site to a single JS file.
This is exactly right. You cannot store data in memory between page loads in a web browser environment because all javascript variables are naturally destroyed when the browser navigates away from the page to a new page (even if they use the same javascript on both pages). Thus, you have to save it somewhere with more permanence: localStorage, cookies, or on the server via POST or GET.
What I would recommend is scrapping the four different html pages and simply using one html page that changes dynamically as the user fills in data. This way the browser will not eliminate data before you are ready to POST it to the server.
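If you do keep the separate pages, the values have to be written to persistent storage before each navigation and read back at the end. A minimal sketch using localStorage (reusing the element IDs from the question; the POST endpoint is a made-up placeholder):
// On page 1, before navigating to register-two.html
function signUp1() {
  localStorage.setItem("firstName", $("#first_name").val().trim());
  localStorage.setItem("lastName", $("#last_name").val().trim());
  localStorage.setItem("email", $("#email").val().trim());
  localStorage.setItem("password", $("#password").val());
}

// On the final page, read everything back and POST it
function signUpStarting() {
  var data = {
    firstName: localStorage.getItem("firstName"),
    lastName: localStorage.getItem("lastName"),
    email: localStorage.getItem("email"),
    password: localStorage.getItem("password")
  };
  // Hypothetical endpoint; replace with your real server URL.
  $.post("https://example.com/signup", data, function(response) {
    alert("Signed up: " + JSON.stringify(response));
  });
}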

HTML Function displaying as plain text

I have been working on this for quite some time, and have basically been teaching myself HTML, so I apologize if the code is sloppy or if this is a simple fix. Here is what I am attempting to do, and the problem I am running into:
Take Google Form responses, generate an email based on those responses, and dynamically email a certain person in my organization based on the location response (this part is done and working, just added for context). Then create a survey response that sends info back to the original responder, sent from the administrator the form went to. This is the js that I have running, and it works when run in the google project:
function getid() {
  var spreadsheet = SpreadsheetApp.openByUrl('https://docs.google.com/a/raytownschools.org/spreadsheets/d/1YWHu_yKn5bqq63x1A4e4-vBUtZANj-xjeF07IBpHP64/edit?usp=sharing');
  SpreadsheetApp.setActiveSheet(spreadsheet.getSheets()[0]);
  var sheet = spreadsheet.getActiveSheet();
  var lastRow = sheet.getLastRow();
}
When I attempt to run that in my HTML code and insert it into the element, the code is simply inserted as raw text. The HTML isn't running the function or returning the data that it should (and does return when run outside the HTML code as a js app).
I can post the full HTML code if that would be helpful. Hopefully someone on here can help me out.
What you have there is a Javascript function. There are no functions in HTML; HTML is a markup language.
You must add that function inside Javascript tags like this:
<script type="text/javascript">
  function getid() {
    var spreadsheet = SpreadsheetApp.openByUrl('https://docs.google.com/a/raytownschools.org/spreadsheets/d/1YWHu_yKn5bqq63x1A4e4-vBUtZANj-xjeF07IBpHP64/edit?usp=sharing');
    SpreadsheetApp.setActiveSheet(spreadsheet.getSheets()[0]);
    var sheet = spreadsheet.getActiveSheet();
    var lastRow = sheet.getLastRow();
    return lastRow; // return a value so the alert below has something to show
  }

  alert( getid() );
</script>
Take a look here for how to use javascript.
Edit
It seems the code you're trying to execute is Google Apps Script. You must run it inside the Google Apps Script environment, because that API is not available to regular websites. Here is a running example with your code.
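If the goal is to get that value onto an ordinary web page, one option (not shown in the original answer) is to deploy the Apps Script as a web app whose doGet returns the value, and fetch it with JSONP exactly as in the first answer on this page. A rough sketch, assuming getid() is changed to return lastRow and that #result is a hypothetical element on your page:
// Server side (Apps Script): wrap getid() in a doGet and return JSONP,
// as in the first answer on this page. Assumes getid() ends with "return lastRow;".
function doGet(e) {
  var lastRow = getid();
  return ContentService.createTextOutput(e.parameter.callback + "(" + JSON.stringify({lastRow: lastRow}) + ")")
    .setMimeType(ContentService.MimeType.JAVASCRIPT);
}

// Client side (your HTML page): fetch the value and write it into the page.
var SCRIPT_URL = "https://script.google.com/macros/s/[....YOUR WEB APP ID....]/exec";
$.getJSON(SCRIPT_URL + "?callback=?", function(data) {
  $('#result').text('Last row: ' + data.lastRow);
});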

Where is the Google Analytics pixel in my DOM?

How can I identify, using JavaScript, that a Google Analytics pixel (or any pixel for that matter) has been sent, and that it contains the URL parameters I'm looking for?
I thought, since it's a tracking pixel, I could look for it in the DOM, but it doesn't look like it's ever inserted.
Can someone think of a way to analyze the network request made by google using javascript (not a chrome extension)?
something like
document.whenGooglePixelIsSentDoReallyCoolStuff(function(requestUrl){
});
A few things:
1) The tracking beacons aren't always pixels. Sometimes they're XHR and sometimes they use navigator.sendBeacon depending on the situation and/or your tracker's transport setting, so if you're just looking for pixels you could be looking in the wrong place.
2) You don't need to add an image to the DOM to get it to send the request. Simply doing document.createElement('img').src = "path/to/image.gif" is sufficient.
3) You don't need to use a Chrome extension to debug Google Analytics, you can simply load the debug version of the script instead of the regular version.
4) If you really don't want to use the debug version of Google Analytics and want to track what is sent programmatically, you can override the sendHitTask and intercept hits before they're sent.
Update (7/21/2015)
You've changed how your question is worded, so I'll answer the new wording by saying you should follow the suggestion I give in #4 above. Here's some code that would work with your hypothetical whenGooglePixelIsSentDoReallyCoolStuff function:
document.whenGooglePixelIsSentDoReallyCoolStuff = function(callback) {
  // Pass the `ga` queue method a function to get access to the default
  // tracker object created via `ga('create', 'UA-XXXX-Y', ...)`.
  ga(function(tracker) {
    // Grab a reference to the default `sendHitTask` function.
    var originalSendHitTask = tracker.get('sendHitTask');

    // Override the `sendHitTask` to call the passed callback.
    tracker.set('sendHitTask', function(model) {
      // When the `sendHitTask` runs, get the hit payload,
      // which is formatted as a URL query string.
      var requestUrl = model.get('hitPayload');

      // Invoke the callback passed to `whenGooglePixelIsSentDoReallyCoolStuff`.
      // If the callback returns `false`, don't send the hit. This allows you
      // to programmatically do something else based on the contents of the
      // request URL.
      if (callback(requestUrl)) {
        originalSendHitTask(model);
      }
    });
  });
};
Note that you'd have to run this function after creating your tracker, but prior to sending your first hit. In other words, you'd have to run it between the following two lines of code:
ga('create', 'UA-XXXX-Y', 'auto');
ga('send', 'pageview');
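For example, the wiring could look like this (the dp=... check is just a made-up illustration of inspecting the payload):
ga('create', 'UA-XXXX-Y', 'auto');

// Register the interceptor before the first hit is sent.
document.whenGooglePixelIsSentDoReallyCoolStuff(function(requestUrl) {
  // Do really cool stuff with the payload, e.g. look for a parameter.
  if (requestUrl.indexOf('dp=%2Fcheckout') > -1) {
    console.log('Hit for the checkout page:', requestUrl);
  }
  return true; // return true so the hit is still sent to Google
});

ga('send', 'pageview');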

Accessing headers of a google spreadsheet using HTTP GET and Google App Script

I'm trying to return the header row of a Google spreadsheet using doGet() in a Google Apps Script that's running as a WebApp. I'm using an HTML form to send the GET request to the WebApp, and it's all working except I don't know how to return the headers to my javascript. I'll post my code:
HTML:
<form id="getForm" method="get" action="My URL for WebApp">
  <label for="sheetGetID">SheetID</label>
  <input type="text" name="sheetGetID" id="sheetGetID" value="">
  <button class="ui-btn" onclick='submitGET()'>Submit</button>
</form>
Javascript:
function submitGET() {
  var headers = $("#getForm").submit();
  alert(headers);
}
Google App Script:
function doGet(e) {
  // Trying To: Get headers from sheetID and then return to app, then have correct labels for the inputs, then use POST to post.
  var ss = SpreadsheetApp.openById(ScriptProperties.getProperty('active'));
  var sheet = ss.getSheetByName(e.parameter["sheetGetID"]);
  // Return the header row, A1 through the last column
  var headers = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
  return ContentService.createTextOutput(JSON.stringify(headers))
    .setMimeType(ContentService.MimeType.JSON);
}
I'm getting a JSON object returned but it's just a text output. My question is how would/could I get the JSON returned and stored as the headers variable?
The return value of doGet must be HTML.
Build another html page and call HtmlService.createTemplateFromFile('newPag.html').evaluate().
Inside that page, use the template tags and put your server-side code there to manipulate the json object. This way you will create a good look and feel and maintainable code.
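A rough sketch of that approach (getHeaders is a hypothetical helper doing the same range read as the doGet above, and the template file is assumed to print the value with <?= ?> scriptlets):
// Sketch: doGet returns an evaluated HTML template instead of raw text.
function doGet(e) {
  // 'newPag' refers to an HTML file in the Apps Script project.
  var template = HtmlService.createTemplateFromFile('newPag');
  // Expose the headers to the template; inside newPag.html you could
  // print them with something like <?= headers.join(', ') ?>.
  template.headers = getHeaders(e.parameter.sheetGetID); // hypothetical helper
  return template.evaluate();
}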
I got this to work a while ago; I forgot to post the answer, so here it is in case anyone else needs it.
You need to output the data as a JSON object, like in the API demo. You also need to append "?prefix=?" to the url when you're doing a $.getJSON() call. The =? part tells jQuery to treat the response as JSONP and substitute a generated callback name.
If anyone has trouble with this, just comment on this and I'll post all the code I used.
So on your client end (I'm using jQuery Mobile; I'm not sure how to do it without it) you would do something like:
function submitGET() {
  var sheetID = $("#sheetGetID").val();
  $.getJSON("https://script.google.com/macros/s/YOUR_KEY_GOES_HERE/exec?prefix=?",
    { sheetGetID: sheetID },
    function(results) {
      var fields = results.split(",");
      // Do something with fields
    }
  );
}
Where #sheetGetID is the textbox where the user can enter the sheet id for headers.
Note the ?prefix=? appended to the URL; that part is how jQuery knows to make a JSONP request, and it is necessary. The URL is your deployed WebApp.
On the Google App Script side, ie Server side, you'd have something like:
function doGet(request) {
  var ss = SpreadsheetApp.openById(ScriptProperties.getProperty('active'));
  var sheet = ss.getSheetByName(request.parameter["sheetGetID"]);
  // Return the header row, A1 through the last column
  var headers = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
  var result = headers.join();
  var content = request.parameters.prefix + '(' + JSON.stringify(result) + ')';
  return ContentService.createTextOutput(content)
    .setMimeType(ContentService.MimeType.JSON);
}
If you have any questions about how the spreadsheet part works, there's plenty of documentation on Google's APIs. doGet() is called when you use $.getJSON(), and the return from the G.A.S. needs to be JSON. Most of this is covered in Google's documentation; some of it I found watching Google Developers Live on YouTube. If you are trying to do more, I highly recommend checking those sources out.
If you have any more questions about what's being called or which parameters are used, you can find it easily enough on Google.

Get Twitter Feed as JSON without authentication

I wrote a small JavaScript a couple of years ago that grabbed a user's (mine) most recent tweet and then parsed it out for display, including links, date etc.
It used this json call to retrieve the tweets, and it no longer works.
http://twitter.com/statuses/user_timeline/radfan.json
It now returns the error:
{"errors":[{"message":"Sorry, that page does not exist","code":34}]}
I have looked at using the API version (code below), but it requires authentication, which I would rather avoid since this is just to display my latest tweet on my website, and it is public on my profile page anyway:
http://api.twitter.com/1/statuses/radfan.json
I haven't kept up with Twitter's API changes as I no longer really work with it. Is there a way around this problem, or is it no longer possible?
Previously the Search API was the only Twitter API that didn't require some form of OAuth. Now it does require auth.
Twitter's Search API came from a third-party acquisition - they rarely support it and seem unenthused that it even exists. On top of that, the payload is limited, including but not limited to a severely reduced set of key:value pairs in the JSON or XML you get back.
When I heard this, I was shocked. I spent a LONG time figuring out how to use the least amount of code to do a simple GET request (like displaying a timeline).
I decided to go the OAuth route to be able to ensure a relevant payload. You need a server-side language to do this. JavaScript is visible to end users, and thus it's a bad idea to include the necessary keys and secrets in a .js file.
I didn't want to use a big library, so the answer for me was PHP, with help from @Rivers' answer here. The answer below it by @lackovic10 describes how to include queries in your authentication.
I hope this helps others save time thinking about how to go about using Twitter's API with the new OAuth requirement.
You can access and scrape Twitter via advanced search without being logged in:
https://twitter.com/search-advanced
GET request
When performing a basic search request you get:
https://twitter.com/search?q=Babylon%205&src=typd
q (our query encoded)
src (assumed to be the source of the query, i.e. typed)
By default, Twitter returns the top 25 results, but if you click on All you can get the realtime tweets:
https://twitter.com/search?f=realtime&q=Babylon%205&src=typd
JSON contents
More Tweets are loaded on the page via AJAX:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd&include_available_features=1&include_entities=1&last_note_ts=85&max_position=TWEET-553069642609344512-553159310448918528-BD1UO2FFu9QAAAAAAAAETAAAAAcAAAASAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
Use max_position to request the next tweets
The following json array returns all you need to scrape the contents:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd
has_more_items (bool)
items_html (html)
max_position (key)
refresh_cursor (key)
DOM elements
Here is a list of DOM elements you can use to extract the data:
The author's twitter handle
div.original-tweet[data-tweet-id]
The name of the author
div.original-tweet[data-name]
The user ID of the author
div.original-tweet[data-user-id]
Timestamp of the post
span._timestamp[data-time]
Timestamp of the post in ms
span._timestamp[data-time-ms]
Text of Tweet
p.tweet-text
Number of Retweets
span.ProfileTweet-action--retweet > span.ProfileTweet-actionCount[data-tweet-stat-count]
Number of Favorites
span.ProfileTweet-action--favorite > span.ProfileTweet-actionCount[data-tweet-stat-count]
Resources
https://code.recuweb.com/2015/scraping-tweets-directly-from-twitter-without-authentication/
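A rough sketch of putting those pieces together with jQuery (this is an undocumented endpoint that Twitter can change at any time, and because it sends no CORS headers you would need a same-origin context or a proxy):
var SEARCH_URL = "https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd";

function loadTweets(maxPosition) {
  // max_position is only added for the follow-up "next page" requests
  var url = maxPosition ? SEARCH_URL + "&max_position=" + encodeURIComponent(maxPosition) : SEARCH_URL;
  $.getJSON(url, function(data) {
    // items_html holds the rendered tweets; parse it and use the
    // selectors listed above to pull out the fields
    var items = $('<div></div>').html(data.items_html);
    items.find('p.tweet-text').each(function() {
      console.log($(this).text());
    });
    // keep paging while the endpoint reports more items
    if (data.has_more_items) {
      loadTweets(data.max_position);
    }
  });
}

loadTweets();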
If you're still looking for unauthenticated tweets in JSON, this should work:
https://github.com/cosmocatalano/tweet-2-json
As you can see in the documentation, using the REST API you'll need OAuth tokens to do this. Luckily, we can use the Search API (which doesn't use OAuth) with the from:[USERNAME] operator.
Example:
http://search.twitter.com/search.json?q=from:marcofolio
Will give you a JSON object with tweets from that user, where
object.results[0]
will give you the last tweet.
Here is a quick hack (really a hack; use it with caution as it's not future proof) which uses http://anyorigin.com to scrape the twitter site for the latest tweets.
http://codepen.io/JonOlick/pen/XJaXBd
It works by using anyorigin (you have to pay to use it) to grab the HTML. It then parses the HTML using jquery to extract out the relevant tweets.
Tweets on the mobile site use a div with the class .tweet-text, so this is pretty painless.
The relevant code looks like this:
$.getJSON('http://anyorigin.com/get?url=mobile.twitter.com/JonOlick&callback=?', function(data) {
  // Remap the utf8 ellipsis character (…) to an ascii '...'
  var bar = data.contents;
  bar = bar.replace(/…/g, '...');
  var el = $('<div></div>');
  el.html(bar);

  // Change all links to point back at twitter
  $('.twitter-atreply', el).each(function(i) {
    $(this).attr('href', "https://twitter.com" + $(this).attr('href'));
  });

  // For all tweets
  $('.tweet-text', el).each(function(i) {
    // We only care about the first 4 tweets
    if (i < 4) {
      var foo = $(this).html();
      $('#test').html($('#test').html() + "<div class=ProfileTweet><div class=ProfileTweet-contents>" + foo + "</div></div><br>");
    }
  });
});
You can use a Twitter API wrapper, such as TweetJS.com, which offers a limited set of the Twitter API's functionality but does not require authentication. It's called like this:
TweetJs.ListTweetsOnUserTimeline("PetrucciMusic",
  function (data) {
    console.log(data);
  });
You can use the twitter api v1 to fetch the tweets without using OAuth. For example: this link returns @jack's last 100 tweets.
The timeline documentation is here.
The method "GET statuses/user_timeline" need a user Authentification like you can see on the official documentation :
You can use the search method "GET search" wich not require authentification.
You have some code to start from here: http://jsfiddle.net/73L4c/6/
function searchTwitter(query) {
  $.ajax({
    url: 'http://search.twitter.com/search.json?' + jQuery.param(query),
    dataType: 'jsonp',
    success: function(data) {
      var tweets = $('#tweets');
      tweets.html('');
      for (res in data['results']) {
        tweets.append('<div>' + data['results'][res]['from_user'] + ' wrote: <p>' + data['results'][res]['text'] + '</p></div><br />');
      }
    }
  });
}

$(document).ready(function() {
  $('#submit').click(function() {
    var params = {
      q: $('#query').val(),
      rpp: 5
    };
    // alert(jQuery.param(params));
    searchTwitter(params);
  });
});