I have a video-based project in which I want to implement a Likes feature: each video has a hyperlink showing its total like count, and when a user clicks that hyperlink it is hidden and replaced by the text "Liked" together with the video's total count.
I have written this code in JavaScript with Ajax, but the main problem is that if a user likes 5 videos in one session, the database is hit 5 times. Is there a more efficient way to implement it?
<div id="status${video.id}"> Like- </div><a id="like1${video.id}" style="color:#ffffff;">${video.likesCount}</a>
function callLike(id)
{
    // Note: the hard-coded '300' looks like a test/placeholder value; the element id
    // must match the anchor id "like1" + id from the markup above.
    document.getElementById("like1" + id).innerHTML = '300';
    var postData = '?Id=' + id;
    var url = protocol + '//' + host + '/xxx/getLike' + postData;
    // alert("url:"+url);
    var req;
    if (window.XMLHttpRequest) {
        req = new XMLHttpRequest();
    } else if (window.ActiveXObject) {
        req = new ActiveXObject("Microsoft.XMLHTTP");
    }
    // Hand the request object and the video id to the handler via a closure,
    // so likesres() no longer relies on globals.
    req.onreadystatechange = function() { likesres(req, id); };
    req.open("POST", url, true);
    req.send(null);
}
function likesres(req, id)
{
    if (req.readyState == 4) {
        if (req.status == 200) {
            var response = req.responseText;
            document.getElementById("like1" + id).innerHTML = response;   // updated like count from the server
            document.getElementById("status" + id).innerHTML = 'Liked--'; // swap the label to "Liked"
        }
    }
}
You need a better project architecture if you want to make this project scalable.
A starting point would be to implement an in-memory queue in your server-side code, plus a processing mechanism for it. Every time a user likes a video, that like is added to the queue, and every 2 minutes the queue is processed and all of its contents are written to the database in one go.
This way, you control how many times the DB is hit.
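For illustration, a minimal sketch of that batching idea, assuming a Node.js/Express backend (the original backend may differ; writeLikesToDatabase() and the in-memory cachedCounts map are hypothetical stand-ins):

var express = require('express');
var app = express();

var pendingLikes = {};   // videoId -> likes received since the last flush
var cachedCounts = {};   // videoId -> last known total (hypothetical warm cache)

app.post('/xxx/getLike', function (req, res) {
    var id = req.query.Id;
    pendingLikes[id] = (pendingLikes[id] || 0) + 1;                   // queue in memory, no DB hit
    res.send(String((cachedCounts[id] || 0) + pendingLikes[id]));     // reply with the new total
});

// Every 2 minutes, write everything queued so far to the DB in one batch.
setInterval(function () {
    var batch = pendingLikes;
    pendingLikes = {};
    Object.keys(batch).forEach(function (id) {
        cachedCounts[id] = (cachedCounts[id] || 0) + batch[id];
    });
    writeLikesToDatabase(batch);   // hypothetical helper: one round trip for the whole batch
}, 2 * 60 * 1000);

app.listen(3000);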
However, it can add some delay for users if you make the processing interval too long.
Hope this helps and gives you an idea of what needs to be done.
I'm currently building a website that uses AJAX to dynamically update sections of the page without a refresh. However, when I change the file that AJAX reads, the website sometimes takes minutes to update, even though the file is fetched about once per second. While looking into the issue I found that I can turn caching off using the developer tools, which then allows the website to update at the appropriate speed.
Here's the code I am using:
var path = "Path of the json file I am reading";
var state;
var xhttp = new XMLHttpRequest();
xhttp.onreadystatechange = function() {
    if (this.readyState == 4 && this.status == 200) {
        state = JSON.parse(this.responseText);
    }
};
xhttp.open("GET", path, true);
xhttp.send();
I've been looking for a while now, and the only advice I can find about the cache is to use the developer tools to turn it off. Is there any way I can implement some code that automatically tells the browser not to cache the file being read?
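One common workaround (a sketch, not from the original post) is to append a throwaway query parameter so every request has a unique URL and therefore bypasses the cache:

var path = "Path of the json file I am reading";
var state;

function refresh() {
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            state = JSON.parse(this.responseText);
        }
    };
    // A unique query string per request makes the browser skip its cache.
    xhttp.open("GET", path + "?_=" + Date.now(), true);
    xhttp.send();
}

Alternatively, the server can send Cache-Control: no-cache headers for that file, which fixes it for every client without changing the URL.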
I am working on a dashboard web page that has 5 different square boxes showing the results of heavy queries, usually querying months of data at a time.
Each query could take from 3 to 7 seconds to execute and return data.
What I am trying to achieve from the UI perspective is to run those queries in the background and leave the UI free to redirect to another page.
My first attempt was a jQuery.ajax() call, and the problem there is the following:
Let's say I have 5 queries to run. I make the 5 ajax calls, and in this example the browser is waiting on query #3, which takes 5 seconds. The user clicks a link, and my expectation is for the browser to redirect right away to Google; instead it waits for query #3 to finish, then ignores queries 4 and 5 (which is good) and goes to Google.
But since query #3 takes 5 seconds, the user had to wait 5 extra seconds for a piece of data he didn't use.
I changed my implementation to web workers and the result is the same.
I am posting a simplified version of the code to express my point.
I executed this code in Chrome, and the result I am getting is that the redirect doesn't happen until the sleep(10) seconds finish.
I want to redirect as soon as the user clicks the link.
webWorkerTest.html
<script>
    if (window.Worker) {
        var myWorker = new Worker("worker.js");
        var message = { goToTheServer: true };
        myWorker.postMessage(message);
        myWorker.onmessage = function(e) {
            console.log(e.data.result);
        };
    }
</script>
<a href="https://www.google.com">Google</a>
worker.js
this.onmessage = function(e) {
    if (e.data.goToTheServer != undefined) {
        var ajaxResult = goToTheServer(
            function(resultado) {
                this.postMessage({ result: resultado });
            }
        );
    }
};
function goToTheServer(callback) {
    var xmlhttp = new XMLHttpRequest();
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == XMLHttpRequest.DONE) {
            if (xmlhttp.status == 200) {
                callback(xmlhttp.responseText);
            }
        }
    };
    xmlhttp.open("GET", "ajax.php", true);
    xmlhttp.send();
}
ajax.php
<?php
sleep(10); //heavy query
echo 1984;
I have a little HTML5 game on my website that executes a Javascript function every time the game ends. The function is in an external script:
SubmitScore (gets called by the game script):
function ONLINE_submitScore(strName,intMs) {
intMs = Math.round(intMs);
result = SQLCommand("online2.php?act=submit&name="+strName+"&score="+intMs);
return result;
}
SQLCommand (the next function called):
function SQLCommand(url){
    var ajax = AjaxCaller();
    if (ajax == false)
        alert("AjaxCaller() failed!");
    ajax.open("GET", url, true);
    ajax.onreadystatechange = function(){
        if (ajax.readyState == 4){
            if (ajax.status == 200){
                return ajax.responseText;
            }
        }
    }
    ajax.send(null);
}
AjaxCaller (the final function called):
function AjaxCaller(){
    var xmlhttp = false;
    try{
        xmlhttp = new ActiveXObject("Msxml2.XMLHTTP");
    }catch(e){
        try{
            xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
        }catch(E){
            xmlhttp = false;
        }
    }
    if(!xmlhttp && typeof XMLHttpRequest != 'undefined'){
        xmlhttp = new XMLHttpRequest();
    }
    return xmlhttp;
}
The problem that I've encountered is that someone can easily use the developer console in Chrome or Firefox and execute the Javascript ONLINE_submitScore function to enter whatever score they please. What can I do to prevent this? Using a server-side password doesn't work because it's easy to see the POST request to read the password client-side.
If you don't have a login system with one-way encrypted (hashed) passwords, there's no way to prevent anyone from submitting any score they want, as many times as they want. At some point your high-score board is just an open pipe to your database, and anyone can spoof any value into it. By adding a login system and password you can limit the number of times a user tries to add a score, but you really have no way to verify it. Yes, maybe you could write a crazy verification scheme that happens within your game and then gets replayed and checked on the backend (I don't know how your game works), but if someone wants to, they can still probably fake a score.
[FWIW, casinos work by running all results on the backend, but casual/action mobile apps just don't work that way; the game takes place on the user's phone. Just obfuscate and make it harder for them to figure out how to spoof your system.]
[Also, a good starting point would be to not expose a super-well-laid-out PHP endpoint that I can hit from my browser to add a high score. Consider encoding the submission as part of a big gnarly random payload you send up and then decoding it on the PHP side, or something.]
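For illustration only, a toy sketch of that encoding idea (the XOR key and field layout are made up and would need a matching decoder in online2.php; this raises the bar for casual tampering but is not real security):

function encodeSubmission(strName, intMs) {
    var payload = strName + "|" + Math.round(intMs) + "|" + Date.now();
    var key = "s3kr1t";                         // hypothetical shared key, also known to the PHP side
    var scrambled = "";
    for (var i = 0; i < payload.length; i++) {
        // Simple XOR scramble; trivially reversible, only meant to deter casual tampering.
        scrambled += String.fromCharCode(payload.charCodeAt(i) ^ key.charCodeAt(i % key.length));
    }
    return encodeURIComponent(btoa(scrambled)); // base64, then URL-safe
}

// Usage (replacing the readable query string in ONLINE_submitScore):
// SQLCommand("online2.php?act=submit&blob=" + encodeSubmission(strName, intMs));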
You could use a CSRF token in your form.
Here is an example: http://www.wikihow.com/Prevent-Cross-Site-Request-Forgery-(CSRF)-Attacks-in-PHP
I am using the following Ajax function format:
var xmlhttp;
function addAddress(str)
{
    if (window.XMLHttpRequest)
    {
        xmlhttp = new XMLHttpRequest();
    }
    else
    {
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function()
    {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
        {
            //specific selection text
            document.getElementById('info').innerHTML = xmlhttp.responseText;
        }
    }
    var addAddress = "add";
    xmlhttp.open("POST", "sys.php", true);
    xmlhttp.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    var queryString = "&addAddress=" + addAddress;
    xmlhttp.send(queryString);
}
function GetXmlHttpObject()
{
    if (window.XMLHttpRequest)
    {
        return new XMLHttpRequest();
    }
    if (window.ActiveXObject)
    {
        return new ActiveXObject("Microsoft.XMLHTTP");
    }
    return null;
}
Up until now, all of my Ajax functions, like the one above, have been running fine. However, now the function only works sometimes: sometimes I have to click the onclick event a couple of times to execute the function, or the function just hangs and then executes after about 4 minutes.
I tested parts of the function and found that the issue lies somewhere around:
if (xmlhttp.readyState == 4 && xmlhttp.status == 200)
{
    alert(xmlhttp.status);
    //specific selection text
    document.getElementById('info').innerHTML = xmlhttp.responseText;
}
When the function works, I can alert(xmlhttp.status) and get 200. However, when it's not working, the alert box doesn't even trigger. In fact, nothing happens, not even an error.
Could this be a server issue? I am kind of thinking my website got hacked, but I cannot find any issues except that the Ajax functions are not executing properly.
Lastly, I do not get this problem on my localhost, it's only happening on the live website.
Any help will be greatly appreciated.
Thanks
First just confirm that the addAddress function is actually being called when you click the button or control that should trigger it.
Just a simple alert in the first line like this would work:
function addAddress(str)
{
alert('addAddress has been called!')
....
}
If you don't get the alert, make sure there isn't a JavaScript error on the page that is preventing the function from running. In Firefox, for example, you can press CTRL+SHIFT+J to open the error console.
If that part is working, try putting the URL for the ajax request directly into your browser and diagnose it that way.
It looks like you are requesting this URL with ajax:
sys.php?addAddress=(address goes here)
Check that the page will load directly in your browser. If not, the problem is not the ajax request, but something with the sys.php page itself - which you can then drill down on.
Hope that helps!
This wasn't the answer I was expecting, but I ended up having my web host (GoDaddy) change servers, and that resolved the problem. For almost a year, I was running IIS7 with PHP. Since I had never run into any problems, I just continued using that server. After the Ajax latency issue and not being able to figure out a solution, I figured I would just switch over to Apache. After the change, everything started running smoothly again.
I am thinking maybe there was a software update that I was not notified about. Or maybe my website was getting hit with a DDoS, which was decreasing the performance of my Ajax requests. Lastly, maybe someone got into IIS and changed a setting. I don't know; all I know is that the minute the server was changed over to Apache, the website started running normally again.
Thanks for everyone's suggestions.
Please help!! I'm seeing a lot of mixed results when I search for this, none of which apply here. My goal is basically to turn a randomly and dynamically populated list of songs into a playlist, so that each song plays one after the other and the iframe for YouTube or SoundCloud is ajaxed in, in sequence.
I have a simple list of songs that is populated from the YouTube and SoundCloud APIs and output to an unordered list. As each song in the list is loaded into the browser, its anchor tag is given an id.
//List Item
echo "<a id=\"" . $i . "\" href=\"javascript:void(0)\" onclick=\"play_clicked('youtube',".$i.",".$song_count.")\">
<li class = \"song\">";
The first song in the list gets id 0, the second, 1 and so on. The media id of each song is also "pushed" onto a javascript array one at a time as they load, so that the id of the song's anchor tag corresponds to the key in the array where the song's media id is held.
echo
'<script type="text/javascript">
track_id_array.push("'.$vid_id.'");
</script>';
I've created a javascript function that is called when a song is clicked:
function play_clicked(api_type,clicked_key,song_count)
The function receives parameters for the api type (soundcloud or youtube), the anchor id (which is also the array key where the clicked media id is held), and how many songs are in the list. There is a for loop to iterate through the array of media ids:
for (var i=clicked_key; i<=song_count; i++){
So I am starting the loop at whatever the id of the clicked song was, and the goal is to continue to iterate through the songs that follow after the clicked song. First I check if the media id exists in the array:
if(window.track_id_array[i])
If this exists, it should hold the media's id - example youtube id - 6_8ZZtL6qmM. Then I check whether it is a soundcloud or youtube song, and depending on which, I send the media id with ajax to a php script that will embed the id into the html5 iframe embed widget for either soundcloud or youtube, something like this:
$vid_id = $_GET['id'];
//Youtube player embed
echo '<iframe width="498" height="280" src="http://www.youtube.com/embed/'.$vid_id. '?autoplay=1" frameborder="0" allowfullscreen></iframe>
';
and then I return this html to a div inside my main page, and the song will play in the corresponding widget. The way I had imagined accomplishing my goal of a playlist is to start with the clicked song's id/key in the loop, retrieve the media id for that key from the array, check the api type, make the proper ajax call, set a timeout for the length of the song, and then have the loop continue to the next key in the array after the song was finished playing, so that it would start the process over with the next song in the list, or id in the array.
JavaScript is really not my strong suit, and I hate that I have to use client-side code for this. My question is: is the method I described here possible, or am I going about this the wrong way? I only want to make the ajax calls one after the other in the loop, so they won't be happening at the same time. Here is my whole function, and I am getting some weird results: it plays the last song in the list and skips over all others. Any explanation as to why? Again, I am really not great with JavaScript, and help is MUCH appreciated!!
//play clicked track and those following
function play_clicked(api_type, clicked_key, song_count){
    function showPlayTrack() {
        if (xhr.readyState == 4) {
            if (xhr.status == 200) {
                var outLink = xhr.responseText;
            }
            else {
                var outLink = "There was a problem with the request" + xhr.status;
            }
        }
        var vis = parent.document.getElementById("play_content");
        vis.innerHTML = outLink;
        setTimeout(300000);
    }
    for (var i = clicked_key; i <= song_count; i++){
        if (window.track_id_array[i]){
            if (api_type == "scloud"){
                var soundcloud_id = window.track_id_array[i];
                if (window.XMLHttpRequest) {
                    xhr = new XMLHttpRequest();
                }
                else {
                    if (window.ActiveXObject) {
                        try {
                            xhr = new ActiveXObject("Microsoft.XMLHTTP");
                        }
                        catch (e) {
                        }
                    }
                }
                if (xhr) {
                    xhr.onreadystatechange = showPlayTrack;
                    xhr.open("GET", "getsoundcloud.php?streamurl=" + soundcloud_id, true);
                }
                else {
                    alert("Sorry, but I couldn't create an XMLHttpRequest");
                }
            } else if (api_type == "youtube"){
                var vid_id = window.track_id_array[i];
                if (window.XMLHttpRequest) {
                    xhr = new XMLHttpRequest();
                }
                else {
                    if (window.ActiveXObject) {
                        try {
                            xhr = new ActiveXObject("Microsoft.XMLHTTP");
                        }
                        catch (e) {
                        }
                    }
                }
                if (xhr) {
                    xhr.onreadystatechange = showPlayTrack;
                    xhr.open("GET", "getyoutube.php?streamurl=" + vid_id, true);
                    xhr.send(null);
                }
                else {
                    alert("Sorry, but I couldn't create an XMLHttpRequest");
                }
            }
        }
    }
}
Okay, so I solved the problem by using recursive function calls. On click I call the play function, which reads the media id, checks the api type, increments the key, and calls the ajax to get the iframe. The success function for the ajax then returns the iframe to the appropriate div, waits the length of the song, and then calls the initial play function again with the incremented key as the param. There are still some small things to work out, like adding or subtracting time if the user seeks within the media, but for the most part this is a great playlist solution!
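A rough sketch of that chained approach, for illustration (api_type_array and getTrackDuration() are assumed helpers, not the poster's actual code; the endpoints and the play_content div follow the question's setup):

function playFrom(key) {
    if (!track_id_array[key]) return;                      // no more songs in the list

    var mediaId = track_id_array[key];
    var endpoint = (api_type_array[key] == "scloud")       // api_type_array: assumed per-track api types
        ? "getsoundcloud.php?streamurl=" + mediaId
        : "getyoutube.php?streamurl=" + mediaId;

    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
            document.getElementById("play_content").innerHTML = xhr.responseText;  // load the iframe widget
            setTimeout(function () {
                playFrom(key + 1);                          // after this track finishes, play the next one
            }, getTrackDuration(key));                      // getTrackDuration: hypothetical helper returning ms
        }
    };
    xhr.open("GET", endpoint, true);
    xhr.send();
}

Chaining the next request from the previous one's success handler is what keeps the ajax calls strictly sequential, instead of firing them all at once from a for loop.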