Simple web GUI to capture user data - javascript

I need to develop a simple web page that accepts user information (name, age, birthdate, etc.) and saves the data to a CSV or text file on the server. I currently use Google Sheets, but I need something that's more customizable and that does some simple error checking. Are there any open source frameworks out there that I can use to put something together in a couple of hours? I have a mechanical engineering background and I'm not too familiar with web technologies. Any pointers will be greatly appreciated.

jQuery / JavaScript:
function saveUserData(url, data) {
    var csv = [],  // the array of key=value pairs
        del = "="; // the key/value "delimiter" or "separator"
    // URI-encode a value and escape commas so they can't collide
    // with the comma that joins the pairs below
    function handle(string) {
        return encodeURI(string).replace(/,/g, "%2c");
    }
    for (var property in data) {
        csv.push(property + del + handle(data[property]));
    }
    // e.g. "name=John Doe,dob=Jan 1%2c 2000"
    // "url" is the URL to send the data to;
    // the "data" key will likely be determined by the server you're using;
    // "json" makes "returned_data" a parsed JSON object (if the server returns JSON)
    $.post(url, { data: csv.toString() }, function (returned_data) {
        // use returned_data here if you need it later
    }, "json");
}
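A call might look like this (illustrative only; the /save endpoint and field values are placeholders, not part of the original answer):

saveUserData("/save", {
    name: "John Doe",
    age: 42,
    dob: "Jan 1, 2000"
});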

Tesla88,
In your case, you need to write a server-side script. Below are 3 links that will show you how to open, write, and close a file.
That's as far as I'll elaborate in my answer; you need to show us that you actually did the work, and if you run into issues, you can ask here again.
I hope the above helps.
-Anthony
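For readers landing here later, a minimal sketch of such a server-side script (an assumption on my part, using Node.js with Express rather than whatever the links covered; the /save path and submissions.csv name are placeholders):

// Append each POSTed "data" string as one CSV row on the server.
// fs.appendFile opens, writes, and closes the file in one call.
const express = require("express");
const fs = require("fs");

const app = express();
app.use(express.urlencoded({ extended: false }));

app.post("/save", (req, res) => {
    const row = req.body.data || "";
    fs.appendFile("submissions.csv", row + "\n", (err) => {
        if (err) {
            res.status(500).json({ ok: false });
        } else {
            res.json({ ok: true });
        }
    });
});

app.listen(8080);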

Related

Post JSON data in a page, from a URL (dynamic data)

I'm trying to post some mailing campaign statistics on a webpage of a website I'm building with WordPress. I need the statistics to update on their own as the data changes, so instead of parsing a JSON string directly I want to use a URL, so the data is continually fetched from the Mailify (the mailing service I'm using) server.
I have some Pascal programming experience but I'm new to JavaScript, and I've been struggling a lot with this particular problem. Because I'm getting desperate, I figured I'd try asking for help.
I've figured out how to request the data from the URL (although I forgot how I did it last time, and right now I'm not able to get past the pop-up that requests the credentials), and also how to post on the page using JS, which would be something basic. But I can't figure out how to fetch the data from the JSON in real time. I've seen so many different examples, but they're all a little different and not really similar to this situation.
Some useful data:
API Key: 8RdZ1SkxTeWGlk2T049KOw
Account ID: 5c7650c911ce626a6371c1b5
Mailing campaign ID: TZi4oEiSRDCqciu-vGV98A
JSON URL: https://mailifyapis.com/v1/reports/TZi4oEiSRDCqciu-vGV98A
JSON URL I'm using to try and include the key and ID: https://mailifyapis.com/v1/reports/TZi4oEiSRDCqciu-vGV98A&accountId=5c7650c911ce626a6371c1b5&apiKey=8RdZ1SkxTeWGlk2T049KOw (It's still giving me a pop-up)
This is the Mailify documentation page: http://developers.sarbacane.com/#statistiques (have to translate with Google, they have no translation of their own)
I've tried understanding how curl works but I have to install some stuff and I thought I could get around the problem without it.
Here's what I have tried (I've just been probing the syntax, mostly):
<script>
<p id="demo"></p>
var text = 'https://mailifyapis.com/v1/reports/TZi4oEiSRDCqciu-vGV98A&accountId=%225c7650c911ce626a6371c1b5%22&apiKey=%22w_S31uj2RIGHVENnRbD4xw%22';
var obj = JSON.parse(text);
obj.opens = eval("(" + obj.clicks + ")");
document.getElementById("demo").innerHTML = obj.opens + ", " + obj.clicks();
</script>
I realize it doesn't work like this but that was my last attempt. I can't figure out what I should be doing.
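One hedged sketch of a direction that might work (assumptions on my part, not from the thread: the credentials pop-up suggests the API expects HTTP Basic auth, the first query parameter after the path should start with ? rather than &, and the field names opens and clicks are guesses taken from the attempt above):

var url = 'https://mailifyapis.com/v1/reports/TZi4oEiSRDCqciu-vGV98A';
var accountId = '5c7650c911ce626a6371c1b5';
var apiKey = '8RdZ1SkxTeWGlk2T049KOw';

// Fetch the report JSON with Basic auth credentials, then
// write two fields into the page.
fetch(url, {
    headers: { 'Authorization': 'Basic ' + btoa(accountId + ':' + apiKey) }
})
    .then(function (response) { return response.json(); })
    .then(function (obj) {
        // assumes a <p id="demo"></p> element in the page body
        // (HTML elements can't live inside a <script> block)
        document.getElementById('demo').textContent = obj.opens + ', ' + obj.clicks;
    })
    .catch(function (err) { console.error(err); });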

Searching data using the YouTube API

Extremely new at this, but I wanted to try to build something. I decided on a simple landing page that would assist in pulling YouTube videos. I got my API key and got to building. I ran into a few issues: I found out that the API can't be called from a local HTML file, so I worked out another way to test, but now I have another issue.
I have the following code set to run when a button is pressed, submitting the text in the form as the variable topic.
function search() {
    var hdi = "how do i ";
    var request = gapi.client.youtube.search.list({
        part: 'id',
        q: hdi + topic
    });
    request.execute(onSearchResponse);
}

function onSearchResponse(response) {
    // JSON.stringify's second argument is a replacer (function or
    // array), so pass null; 2 sets the indentation
    var responseString = JSON.stringify(response, null, 2);
    console.log(responseString);
}
But I cannot seem to get the response I'm looking for. The JSON data I get back seems to correspond to a medley of search engines (Yahoo, DuckDuckGo, etc.) instead, not the JSON data for the YouTube search.
Here is a CodePen, if that helps. I have some code commented out that I'm waiting to work on until I get this part implemented. Thanks for any help you can give me!
https://codepen.io/billsobill/pen/OvxNON
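For what it's worth, a minimal sketch of the client setup (not part of the original post; 'YOUR_API_KEY' is a placeholder): search.list only works after the Google API client is loaded and keyed, and only when the page is served over HTTP(S) rather than opened from a file:// URL.

// Wired up via:
// <script src="https://apis.google.com/js/client.js?onload=onClientLoad"></script>
function onClientLoad() {
    gapi.client.setApiKey('YOUR_API_KEY'); // placeholder key
    gapi.client.load('youtube', 'v3', function () {
        // The YouTube client is ready; only now is it safe to call search()
    });
}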

Listen for new URLs of ads posted to a site

for (var i = 3848450; i > 3848400; i--) {
    var query = {
            url: 'http://classifieds.rennug.com/classifieds/viewad.cgi?adindex=' + i,
            type: 'html',
            selector: 'tr',
            extract: 'text'
        },
        uriQuery = encodeURIComponent(JSON.stringify(query)),
        request = 'http://127.0.0.1:8888//?q=' + uriQuery + '&callback=?';

    jQuery.getJSON(request, function (data) {
        var datastring = data[0].results;
        var datasplit = datastring.toString().split('Sign');
        $('#inner-content').append(datasplit[0]);
    });
}
I want to listen for new URLs of ads that are posted, without writing some kind of arbitrary code that takes up a lot of memory looping through candidate URLs, as my code listed above does; I can do that, but it seems redundant. I'm using noodle.js to get the info from the pages. Now I would like a way to listen for new URLs instead of looping through every possible URL from a to z. Since I don't know z, it's a safe bet I'll be using an if statement, but how would one go about incorporating this nth URL without ending up with undefined iterations? I'm still learning and find this place has many helpful people. This is simply a fun project I'm doing to learn something new.
If I understand you correctly, you want an external thing to inform your JavaScript when there's a new URL or new JSON data.
Unfortunately the web is not built for servers to contact clients, with one exception to my knowledge: WebSockets.
You already seem to have a local server, so you meet the requirements, and Node has WebSocket packages ready for use (WebSockets are also available in the browser). To use noodle.js with WebSockets you'd have to require the package and set up the WebSocket to send data to your client, along the lines of the sketch below.
Other than pointing you in that direction, I don't think I can do better than the internet at giving you a tutorial. Hope this helps, have fun! Also, thanks for telling me about noodle, that thing is awesome!
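A minimal sketch of that direction (my assumptions, not the answerer's code: the "ws" npm package on the server side, port 8889 as a placeholder; how new ad URLs are detected is left to the scraper):

// Server side: push each newly seen ad URL to every connected browser.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8889 });

function broadcastUrl(url) {
    wss.clients.forEach(function (client) {
        if (client.readyState === WebSocket.OPEN) {
            client.send(JSON.stringify({ url: url }));
        }
    });
}

// Browser side (plain browser WebSocket API):
//   var socket = new WebSocket('ws://127.0.0.1:8889');
//   socket.onmessage = function (event) {
//       var ad = JSON.parse(event.data);
//       $('#inner-content').append('<a href="' + ad.url + '">' + ad.url + '</a>');
//   };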

Get Twitter Feed as JSON without authentication

I wrote a small JavaScript a couple of years ago that grabbed a user's (mine) most recent tweet and then parsed it out for display, including links, date, etc.
It used this JSON call to retrieve the tweets, but it no longer works:
http://twitter.com/statuses/user_timeline/radfan.json
It now returns the error:
{"errors":[{"message":"Sorry, that page does not exist","code":34}]}
I have looked at using the API version (code below), but this requires authentication, which I would rather avoid since it is just to display my latest tweet on my website, and that's public anyway on my profile page:
http://api.twitter.com/1/statuses/radfan.json
I haven't kept up with Twitter's API changes as I no longer really work with it. Is there a way around this problem, or is it no longer possible?
Previously the Search API was the only Twitter API that didn't require some form of OAuth. Now it does require auth.
Twitter's Search API came from a third-party acquisition; they rarely support it and are seemingly unenthused that it even exists. On top of that, there are many limitations to the payload, including but not limited to a severely reduced set of key:value pairs in the JSON or XML file you get back.
When I heard this, I was shocked. I had spent a LONG time figuring out how to use the least amount of code to do a simple GET request (like displaying a timeline).
I decided to go the OAuth route to be able to ensure a relevant payload. You need a server-side language to do this: JavaScript is visible to end users, and thus it's a bad idea to include the necessary keys and secrets in a .js file.
I didn't want to use a big library, so the answer for me was PHP, with help from @Rivers' answer here. The answer below it by @lackovic10 describes how to include queries in your authentication.
I hope this helps others save time thinking about how to go about using Twitter's API with the new OAuth requirement.
You can access and scrape Twitter via advanced search without being logged in:
https://twitter.com/search-advanced
GET request
When performing a basic search request you get:
https://twitter.com/search?q=Babylon%205&src=typd
q (our query, URL-encoded)
src (assumed to be the source of the query, i.e. typed)
By default, Twitter returns the top 25 results, but if you click on All you can get the realtime tweets:
https://twitter.com/search?f=realtime&q=Babylon%205&src=typd
JSON contents
More Tweets are loaded on the page via AJAX:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd&include_available_features=1&include_entities=1&last_note_ts=85&max_position=TWEET-553069642609344512-553159310448918528-BD1UO2FFu9QAAAAAAAAETAAAAAcAAAASAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA
Use max_position to request the next tweets
The following JSON response contains all you need to scrape the contents:
https://twitter.com/i/search/timeline?f=realtime&q=Babylon%205&src=typd
has_more_items (bool)
items_html (html)
max_position (key)
refresh_cursor (key)
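A minimal sketch of how those fields fit together (illustrative only; this private endpoint has long since been removed, the query parameters mirror the URLs above, and '#results' is a hypothetical container element):

function loadMore(maxPosition) {
    $.getJSON('https://twitter.com/i/search/timeline', {
        f: 'realtime',
        q: 'Babylon 5',
        src: 'typd',
        max_position: maxPosition
    }, function (json) {
        $('#results').append(json.items_html); // rendered tweet HTML
        if (json.has_more_items) {
            loadMore(json.max_position);       // cursor for the next page
        }
    });
}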
DOM elements
Here is a list of DOM elements you can use to extract the data:
The author's Twitter handle
div.original-tweet[data-tweet-id]
The name of the author
div.original-tweet[data-name]
The user ID of the author
div.original-tweet[data-user-id]
Timestamp of the post
span._timestamp[data-time]
Timestamp of the post in ms
span._timestamp[data-time-ms]
Text of the tweet
p.tweet-text
Number of retweets
span.ProfileTweet-action--retweet > span.ProfileTweet-actionCount[data-tweet-stat-count]
Number of favorites
span.ProfileTweet-action--favorite > span.ProfileTweet-actionCount[data-tweet-stat-count]
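A short illustrative snippet tying those selectors to the items_html fragment (the json variable is assumed to hold a response from the timeline URL above; not part of the original answer):

// Parse the rendered tweet HTML and pull out two of the fields listed
var el = $('<div></div>').html(json.items_html);
// Tweet text, per the p.tweet-text selector
el.find('p.tweet-text').each(function () {
    console.log($(this).text());
});
// Unix timestamp of each post, per span._timestamp[data-time]
el.find('span._timestamp').each(function () {
    console.log($(this).attr('data-time'));
});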
Resources
https://code.recuweb.com/2015/scraping-tweets-directly-from-twitter-without-authentication/
If you're still looking for unauthenticated tweets in JSON, this should work:
https://github.com/cosmocatalano/tweet-2-json
As you can see in the documentation, using the REST API you'll need OAuth tokens in order to do this. Luckily, we can use the Search API (which doesn't use OAuth) with the from:[USERNAME] operator.
Example:
http://search.twitter.com/search.json?q=from:marcofolio
Will give you a JSON object with tweets from that user, where
object.results[0]
will give you the last tweet.
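As an illustrative sketch of that call (shown for historical completeness only; this unauthenticated endpoint has since been retired):

// JSONP request against the old Search API; results[0] is the
// most recent tweet, per the answer above
$.getJSON('http://search.twitter.com/search.json?q=from:marcofolio&callback=?', function (data) {
    var lastTweet = data.results[0];
    console.log(lastTweet.text);
});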
Here is a quick hack (really a hack; it should be used with caution as it's not future-proof) which uses http://anyorigin.com to scrape the Twitter site for the latest tweets.
http://codepen.io/JonOlick/pen/XJaXBd
It works by using anyorigin (you have to pay to use it) to grab the HTML. It then parses the HTML using jQuery to extract the relevant tweets.
Tweets on the mobile site use a div with the class .tweet-text, so this is pretty painless.
The relevant code looks like this:
$.getJSON('http://anyorigin.com/get?url=mobile.twitter.com/JonOlick&callback=?', function (data) {
    // Remap "…" UTF-8 encoding to ASCII
    var bar = data.contents;
    bar = bar.replace(/…/g, '...');
    var el = $('<div></div>');
    el.html(bar);
    // Change all links to point back at Twitter
    $('.twitter-atreply', el).each(function (i) {
        $(this).attr('href', 'https://twitter.com' + $(this).attr('href'));
    });
    // For all tweets
    $('.tweet-text', el).each(function (i) {
        // We only care about the first 4 tweets
        if (i < 4) {
            var foo = $(this).html();
            $('#test').html($('#test').html() + '<div class=ProfileTweet><div class=ProfileTweet-contents>' + foo + '</div></div><br>');
        }
    });
});
You can use a Twitter API wrapper, such as TweetJS.com, which offers a limited subset of the Twitter API's functionality but does not require authentication. It's called like this:
TweetJs.ListTweetsOnUserTimeline("PetrucciMusic",
    function (data) {
        console.log(data);
    });
You can use the Twitter API v1 to get the tweets without using OAuth. For example: this link returns @jack's last 100 tweets.
The timeline documentation is here.
The method "GET statuses/user_timeline" need a user Authentification like you can see on the official documentation :
You can use the search method "GET search" wich not require authentification.
You have a code for starting here : http://jsfiddle.net/73L4c/6/
function searchTwitter(query) {
    $.ajax({
        url: 'http://search.twitter.com/search.json?' + jQuery.param(query),
        dataType: 'jsonp',
        success: function (data) {
            var tweets = $('#tweets');
            tweets.html('');
            for (var res in data['results']) {
                tweets.append('<div>' + data['results'][res]['from_user'] + ' wrote: <p>' + data['results'][res]['text'] + '</p></div><br />');
            }
        }
    });
}

$(document).ready(function () {
    $('#submit').click(function () {
        var params = {
            q: $('#query').val(),
            rpp: 5
        };
        // alert(jQuery.param(params));
        searchTwitter(params);
    });
});

Local HTML 5 database usable in Mac Dashboard widgets?

I'm trying to use HTML 5's local database feature on a Mac Dashboard widget.
In Dashcode, I'm programming the following JavaScript:
if (window.openDatabase) {
    database = openDatabase("MyDB", "1.0", "Sample DB", 1000);
    if (database) {
        // ...database code here...
    }
}
Unfortunately, the database variable always remains null after the call to the openDatabase method. I'm starting to think that local databases are not supported in widgets...
Any ideas?
/pom
No, you will not be able to do the above. And even if you could, you would not be able to distribute the widget without distributing the database, assuming it was MySQL or SQLite. (I'm not sure what you mean by HTML 5's local DB.)
There are a number of ways around this:
You can add a data source, which can be a JSON file, an XML file, or an RSS feed. To do this with JSON, for example, you would write a page on a server in PHP or something that accesses a database, so that when the URL is called the result is a JSON string. Take the JSON string, parse it, and use it in the widget. This will let you get data but not save it.
Another way would be to use the user preferences. This allows you to save and retrieve data in the individual widget.
For example:
var preferenceKey = "key"; // replace with the key for a preference
var preferenceValue = "value"; // replace with a preference to save
// Preference code
widget.setPreferenceForKey(preferenceValue, preferenceKey);
You can then retrieve it with
var preferenceForKey = "key"; // replace with the key for a preference
// Preference code
preferenceForKey = widget.preferenceForKey(preferenceForKey);
The external call (you could also use REST) will let you read in any amount of data, and the preferences will let you save data for later reuse that will survive logouts and shutdowns.
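A minimal sketch combining the two ideas (the URL is a placeholder of my own; widget.setPreferenceForKey is the Dashboard preferences API shown above):

// Fetch JSON from a hypothetical server endpoint and cache the raw
// response in the widget's preferences for reuse after a restart.
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://example.com/data.json", true); // placeholder URL
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var data = JSON.parse(xhr.responseText);
        widget.setPreferenceForKey(xhr.responseText, "cachedData");
        // ...use "data" in the widget here...
    }
};
xhr.send();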
The Apple site has a lot of information about widgets, as well as tutorials that are worth working through.
Hope this helps.
