How can I make a PlayStation 3 gamer card? - javascript

I want to make a PS3 card like this one:
http://gamercards.exophase.com/2516.png
using the data from the PlayStation Network site, this one:
http://us.playstation.com/publictrophy/index.htm?onlinename=bruno_shady
but I want to make something different, something that shows only the person's name and the games that the person has, to post on a forum that I frequent...
How can I do it, or what do I need to do it?
Something simple would be nice, and it should update itself every day or so, or even every two days, etc.

You would need a server-side language to do that, like PHP.
You will need:
A way to fetch the data from the PlayStation website. This can be done quite easily using the cURL library and regular expressions.
A way to compose the final image. You should have one or more base image(s); load the base image with an image library like GD in PHP and draw the necessary text onto it.
A cache system. It is not strictly necessary, but your server will waste too many resources otherwise. The idea is to save the generated image to disk on the first request and serve that saved copy on every subsequent request, refreshing it every few hours or days. A rough sketch of the fetch-and-cache part is shown below.
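The answer above describes a PHP (cURL + GD) approach; purely as an illustrative sketch, the same fetch-and-cache idea in Node.js-style JavaScript might look roughly like this. The regular expression and cache location are guesses, and the image-drawing step is only indicated in a comment:

// Rough sketch of the fetch and cache steps (the answer suggests PHP with cURL and GD).
var http = require('http');
var fs = require('fs');

var PROFILE_URL = 'http://us.playstation.com/publictrophy/index.htm?onlinename=bruno_shady';
var CACHE_FILE = 'card-data.json';            // hypothetical cache location
var MAX_AGE_MS = 24 * 60 * 60 * 1000;         // regenerate at most once a day

function getProfile(callback) {
    // Cache: serve the stored copy if it is still fresh.
    if (fs.existsSync(CACHE_FILE) &&
        Date.now() - fs.statSync(CACHE_FILE).mtimeMs < MAX_AGE_MS) {
        return callback(JSON.parse(fs.readFileSync(CACHE_FILE, 'utf8')));
    }
    // Fetch: download the public profile page and pull data out with a RegExp.
    http.get(PROFILE_URL, function (res) {
        var html = '';
        res.on('data', function (chunk) { html += chunk; });
        res.on('end', function () {
            var match = html.match(/onlinename[^>]*>([^<]+)</i); // guessed pattern
            var data = { name: match ? match[1] : 'unknown' };
            fs.writeFileSync(CACHE_FILE, JSON.stringify(data));
            // Compose: here you would draw data.name (and the game list) onto a
            // base card image with an image library such as GD (PHP) or node-canvas.
            callback(data);
        });
    });
}

getProfile(function (data) { console.log(data); });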

Related

Architecture for getting and storing API request data

This is more of an architectural question. An external platform has product and price information for, let's say, books, and there is an API available to get this information.
What I've read is that it should be possible to create a function in JavaScript and hook it into the page on my own website where I want to show the data. This would mean that an API call is made for every page request. Since the requested information changes at most once a day, this does not sound like the most efficient solution.
Can someone advise a better solution? Something in the direction of a similar PHP or JavaScript function that makes the request in the background, schedules an update and imports the data into MySQL? If so, what language would be most common?
I need the solution for a Joomla/PHP/MySQL environment.
Here's a simple idea - fetch and store results from the API (the ones you think aren't going to change within a day), either on disk or in the database, and later use these stored results instead of what you would otherwise have fetched from the API.
Since storing anything in frontend JS across page reloads isn't easy, you need to make use of PHP for that. Based on what's given, you seem to have two ways of calling the API:
via the frontend JS (no-go)
via your PHP backend (good-to-go)
Now, you need to make sure your results are synced every (say) 24 hours.
Add a snippet to your PHP code that keeps a variable $lastUpdated (or something similar) alongside the stored results, holding the timestamp of the last refresh as a persisted, "static" value (not a fresh call to time() on every request). Then add a couple of statements that refresh the stored results whenever the current time is at least 24 hours greater than $lastUpdated, followed by updating $lastUpdated to the current time; a sketch of this check is shown below.
This should give you what you need with one API call per day.
PS: I'm not an expert in PHP, but you can surely figure out the datetime stuff.
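The answer above talks in terms of PHP; purely as an illustration of the same timestamp check, here is a rough JavaScript (Node.js) sketch that keeps the cached data and the $lastUpdated equivalent in a JSON file (in a Joomla setup it would more likely live in a MySQL table; the file name and function names are made up):

var fs = require('fs');

var CACHE_FILE = 'books-cache.json';          // hypothetical cache location
var ONE_DAY_MS = 24 * 60 * 60 * 1000;

function getBooks(fetchFromApi, callback) {
    if (fs.existsSync(CACHE_FILE)) {
        var cached = JSON.parse(fs.readFileSync(CACHE_FILE, 'utf8'));
        // Only hit the external API again if the stored copy is older than a day.
        if (Date.now() - cached.lastUpdated < ONE_DAY_MS) {
            return callback(cached.data);
        }
    }
    fetchFromApi(function (data) {            // at most one real API call per day
        fs.writeFileSync(CACHE_FILE, JSON.stringify({ lastUpdated: Date.now(), data: data }));
        callback(data);
    });
}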
It sounds like you need a cache, and you're not the first person to run into that problem - so you probably don't need to reinvent the wheel and build your own.
Look into something like Redis. There's an article on it available here as well: https://www.compose.com/articles/api-caching-with-redis-and-nodejs/
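As a very rough illustration of the get-or-set caching pattern that article describes, using the node-redis client from Node.js (the key name and expiry are arbitrary):

const { createClient } = require('redis');

async function getCachedBooks(fetchFromApi) {
    const client = createClient();
    await client.connect();

    const cached = await client.get('books');                 // arbitrary cache key
    if (cached) {
        await client.quit();
        return JSON.parse(cached);
    }
    const data = await fetchFromApi();                         // cache miss: call the real API
    await client.set('books', JSON.stringify(data), { EX: 24 * 60 * 60 }); // expire after a day
    await client.quit();
    return data;
}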

Is preloading data into session variables with AJAX upon login a good idea?

I've created a subscription-based system that deals with a large data-set. In its first iteration, it had semi-complicated joins that would execute, based on user-set filters, on every 'data view' page. Each query would fetch anywhere from a few kilobytes to several megabytes depending on the filter range. I decided this was unacceptable and so learned about APC (I had heard about its data-store features).
I moved all of the strings out of the queries into an APC preload routine that fires upon first login. In the same routine, I am running the "full set" join query to get all of the possible IDs for the data set into a $_SESSION variable. The entire set is anywhere from 100-800Kb, depending on what data the customer is subscribed to.
I convert this set into a JSON array and shuffle the data around dynamically when the user changes the filters. In creating the system I wanted it to seem as if the user was moving around lots of data very quickly, with minimal page loading (AJAX + APC when string representations are needed), as they played with the filters.
My multipart question is, is it possible for the user to effectively "cancel" the initial cache/query routine by surfing to another page after the first login? If so, can I move this process to an AJAX page for preloading, or does this carry the same problem? Or, am I just going about all of this in the wrong way? I came up with the idea on my own and I'm worried that I've created an unusable monster.
Also, I've been warned that my questions suck and I'm in danger of being banned. Every question I've asked has come from a position of intelligent wonder, written as well as I knew how at the time, and so it's really aggravating when an outsider votes me down without intelligent criticism. Just tell me what I did wrong and I will quickly fix the problem. Bichis.

Using HTML5, how can I send a simple string so I can see it?

So I have a game written in HTML5, which is all fine and dandy. At the end of the game there is a score the player receives. I want my game to send me how many people have a score of at least 100. I need this statistic to balance the game accordingly.
I was thinking I could make a .txt file in a Dropbox account and make the game edit the file.
Now please don't tell me about security issues; I am aware that players can change their score in the JavaScript console and send false data.
How do I go about solving this simple task? I just need a way to tell how many people are getting a score of 100 or above from my game.
Thanks !!!
You will probably need to use a server-side language such as PHP to write to a remote file.
You will want to write something simple in JavaScript so that when the score is >100 (or whatever your threshold is) you make a simple (probably AJAX) request to that server. For example:
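A minimal sketch (the endpoint URL is made up; the server-side script behind it would just append to a file or increment a counter):

// Fire a one-off report when the player's final score crosses the threshold.
if (score >= 100) {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/report-high-score.php');   // hypothetical endpoint on your server
    xhr.setRequestHeader('Content-Type', 'application/json');
    xhr.send(JSON.stringify({ score: score }));
}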
While plenty of people are telling you to create a server-side app, if you do not already have a server, there are simpler ways to go. For example, I believe Google Spreadsheets has an API, or even better you could use something like Parse Data, which is a free database-as-a-service with a simple JavaScript API intended for just this purpose.
On the other hand, if you DO have a server but don't feel like setting up a full programming language to respond to API calls (which is NOT terribly hard, by the way), you could use the server log. Whenever the user hits 100 points, for example, create a little invisible <img src="http://yoursite.com/i-got-over-100" height="1" width="1" style="position: absolute; left: -10000px" /> and then just check your server logs for how many times i-got-over-100 was hit.
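Hooking that into the game is only a few lines; something like this (a guard flag is added here so the pixel is only requested once per play):

// Append the invisible tracking pixel the first time the score passes 100.
if (score >= 100 && !window.reported100) {
    window.reported100 = true;
    var img = new Image(1, 1);
    img.src = 'http://yoursite.com/i-got-over-100';   // the hit shows up in your access log
    img.style.cssText = 'position: absolute; left: -10000px';
    document.body.appendChild(img);
}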

Load a few thumbnails with one HTTP request

Suppose I display a few dozen thumbnails across a few web pages (10 thumbnails per page). I would like to load them as quickly as possible.
Does it make sense to fetch several thumbnails with one HTTP request (one thumbnail is ~10 KB)? How would you suggest doing it with JavaScript?
You can, but you need to jump through a few hoops:
1) Base-64 encode the images on the server as a single file.
2) Send them to the client as a single request blob, via AJAX.
3) Decode the images back into pieces.
4) Use Data-URIs to insert them into the DOM.
...not really worth it.
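If you did want to try it anyway, here is a rough sketch of steps 2-4 on the client (the endpoint name and response shape are made up; the server would be responsible for step 1):

// One AJAX request returns a JSON array of base64-encoded thumbnails,
// which are turned into data URIs and inserted into the DOM.
fetch('/thumbnails?page=1')
    .then(function (res) { return res.json(); })       // e.g. { thumbs: ["iVBORw0KG...", ...] }
    .then(function (body) {
        body.thumbs.forEach(function (b64) {
            var img = document.createElement('img');
            img.src = 'data:image/png;base64,' + b64;  // step 4: data URI into the DOM
            document.getElementById('gallery').appendChild(img);
        });
    });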
Regarding network performance, it really does make sense.
You could, for example, pack a predefined number of thumbnails together into a single image.
On the client side you can treat that image using the "CSS sprite" technique
(http://www.w3schools.com/css/css_image_sprites.asp); a small example is below.
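A minimal sketch of the sprite idea, assuming 100x100 thumbnails packed side by side in one combined image (the sprite URL and tile size are made up):

// Show the n-th thumbnail from the combined sprite by offsetting the background.
function showThumb(container, spriteUrl, index) {
    var el = document.createElement('div');
    el.style.width = '100px';
    el.style.height = '100px';
    el.style.backgroundImage = 'url(' + spriteUrl + ')';
    el.style.backgroundPosition = (-index * 100) + 'px 0';   // shift to the n-th tile
    container.appendChild(el);
}

// e.g. showThumb(document.getElementById('gallery'), 'thumbs-sprite.png', 3);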
If it's important for you to send the images as fast as possible I would consider sending them as a sprite. Unfortunately this may be somewhat difficult on the back end if the provided images may vary. If they are static and the same for every user it is way easier as you can manually prepare the images and the front end code to display the correct image parts.
In combination with the sprite approach it would also be useful to enable progressive/interlaced loading in order to deliver visible results as fast as possible.

Alternative to creating large client side Javascript array of objects via JSON file?

I have a website that contains graphs which display employee activity records. There are tiers of data (ie: region -> state -> office -> manager -> employee -> activity record) and each time you click the graph it drills down a level to get to display more specific information. The highest level (region) requires me to load ~1000 objects into an array and the lowest level is ~500,000 objects. I am populating the graphs via a JSON formatted text file using:
$.ajax({
    url: 'data/jsondata.txt',
    dataType: 'json',
    success: function (data) {
        largeArray = data.employeeRecords;
    }
});
Is there an alternative method I could use without hindering response time/performance? I am caught up in the thought that I must pre-load all of the data client-side, otherwise there will be lag if I need to fetch it on a user click. If anyone can point me to best practices, and maybe even explain what is considered "TOO MUCH" client-side data, I'd appreciate it.
FYI, I'm restricted to using an old web server, and if I want to do anything server-side I'd be limited to Classic ASP; otherwise it has to be client-side. Thank you!
If your server responds quickly
In this case, you can probably simply load data on demand when a user clicks. The server is quick, so why bother trying to be smarter for no gain.
If the server is quick, but not quick enough, then you might be able to preload the next level while drawing the first. E.g. if you have just rendered the graph at the "office" level, silently preload the "manager" (next level down) data while the user is still reacting to the screen update. A sketch of the load-on-demand approach is below.
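A rough sketch of loading a single drill-down level on demand with jQuery (the endpoint and parameter names are made up; with Classic ASP on the server it would return just the records for the clicked node):

function loadLevel(level, parentId, onDone) {
    $.ajax({
        url: 'data/level.asp',                 // hypothetical Classic ASP endpoint
        data: { level: level, parent: parentId },
        dataType: 'json',
        success: function (data) {
            onDone(data.records);              // chart only the level that was requested
        }
    });
}

// When the user clicks a bar at the "office" level, fetch only the managers under it:
// loadLevel('manager', clickedOfficeId, drawChart);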
If the server is too slow for data on demand
In this case you probably need to model exactly where it is slow and address that. There are several things in play here, and your question doesn't exactly say which.
1) Is the server slow to query the database? If yes, fix it; there is little you can do client-side to solve this.
2) Is the server slow to package the data for transmission? Harder to fix; is the server big enough?
3) Is network transmission slow? Hmmm, then you need to send less data or get users onto faster bandwidth.
4) Is browser unpack time slow (i.e. the delay decoding the data before your script can chart it)? Change how you package the data, or send less data at a time, such as in chunks.
5) Can browsers handle 500,000 objects? You should be able to just monitor the memory of the browser you are using, and there are opinions both ways on this. It will really depend on your target users' browsers/hardware.
You might like to look at this question, What is the most efficient way of sending data for a very large playlist over http?, as it shows an alternative way of sending and handling data which I've found to be much quicker for step 4 above. Of course, at 500k objects you will no longer be able to use localStorage, but I've been experimenting with downloading millions of array elements and it works OK (still WIP). I don't use jQuery, so I'm not sure how usable this is either.
Best practice? Sorry cannot help with that part of the question.
