How to disable video after render? - javascript

I am attempting to implement the Instagram-like feature that destroys videos (stories) after 24 hrs, but in my case it's just the duration of the video. Using this example I am trying to allow users to only view a story once, meaning no replay. Even upon refreshing the page, it should not load. I tried localStorage but couldn't get it to work.

I don't think you can write a foolproof solution to this without some server-side intervention.
Yet, if you think a localStorage-based frontend solution is enough for you (which I don't recommend), you can:
Maintain a sorted list of MD5 hashes of the videos in your localStorage.
When your React app loads (in componentDidMount), load that list into memory. Now you can search this ordered list for the current video's MD5 hash to decide whether or not to play the video.
When you play a video, update the list of MD5 hashes both in memory and in localStorage.
If your server removes the videos after 24 hours or some interval like that, you should store the timestamp in localStorage too so you can clean it up, or it will keep growing in size.
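A minimal sketch of that approach, assuming an md5() helper is available (for example from the blueimp-md5 package) and using hypothetical storage key names:

var WATCHED_KEY = 'watchedHashes';   // hypothetical key names
var SAVED_AT_KEY = 'watchedSavedAt';
var EXPIRY_MS = 24 * 60 * 60 * 1000; // match your server's 24 hour window

function loadWatched() {
    // throw the whole list away once it is older than the expiry window
    var savedAt = Number(localStorage.getItem(SAVED_AT_KEY) || 0);
    if (Date.now() - savedAt > EXPIRY_MS) {
        localStorage.removeItem(WATCHED_KEY);
        return [];
    }
    return JSON.parse(localStorage.getItem(WATCHED_KEY) || '[]');
}

function hasWatched(videoUrl) {
    return loadWatched().indexOf(md5(videoUrl)) !== -1;
}

function markWatched(videoUrl) {
    var watched = loadWatched();
    watched.push(md5(videoUrl));
    watched.sort(); // keep the list sorted, as described above
    localStorage.setItem(WATCHED_KEY, JSON.stringify(watched));
    localStorage.setItem(SAVED_AT_KEY, String(Date.now()));
}

In componentDidMount you would call hasWatched(videoUrl) before rendering the player, and markWatched(videoUrl) once playback starts.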

You can't truly control that video. Once it's sent, it's sent and viewable. But, you can come close.
First, continue with your UI mechanism like you plan to with LocalStorage or similar. This is to prevent normal users from viewing the video over and over again.
Next, use Encrypted Media Extensions to implement DRM. This makes it harder to get at the raw media, and gives you more control over the timing of things. Note that this is a huge hassle, so make sure it's worth it before you go down this road.
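For reference, the entry point into EME looks roughly like the sketch below. The key system, codec string, and element id are placeholders, and the actual license exchange (handling the 'encrypted' event and creating key sessions) is omitted, since it depends entirely on your DRM provider:

var config = [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{ contentType: 'video/mp4; codecs="avc1.42E01E"' }]
}];

navigator.requestMediaKeySystemAccess('org.w3.clearkey', config)
    .then(function (keySystemAccess) {
        return keySystemAccess.createMediaKeys();
    })
    .then(function (mediaKeys) {
        // attach the MediaKeys object to the video element
        return document.getElementById('myVid').setMediaKeys(mediaKeys);
    })
    .catch(function (err) {
        console.error('EME is not available or the key system was rejected', err);
    });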

Related

Architecture: Javascript event tracking in browser

I want to make a custom tracking system for web events. I have looked into multiple pre-existing systems, but I want something terribly simple yet very accurate.
I want to be able to track the following:
Page view event
Time on that page
or:
Video started playing event
Time of video watched
My initial thought was a simple piece of JavaScript reporting back to the server, but what happens if the user closes the window? How do I know they stopped viewing? And how can I get accurate measurements down to 1/10th of a second? So I thought of a websocket solution, as it knows when a user has disconnected. I ended up with Socket.io, but I want to make sure there is no better or smarter way to achieve this.
How would you approach this challenge? What is the smartest way to engineer this?
A Websocket connection which reports back to the server frequently was my first thought as well, but if you send 10 messages every second, even that might be too much for a websocket, especially when connectivity isn't top-notch.
Since the server doesn't require the information absolutely immediately, consider batching requests instead: save/update the information in Local Storage every 0.1 seconds, but don't send it to the server then. Instead, every 30 or 60 seconds, or on page load, take the current data in Local Storage, send it to the server, and clear Local Storage so that the next request a minute from now doesn't send duplicate data.
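A minimal sketch of that batching pattern, with the storage key and the '/track' endpoint as assumptions:

var STORAGE_KEY = 'pendingEvents'; // assumed key name

function recordSample(sample) {
    var pending = JSON.parse(localStorage.getItem(STORAGE_KEY) || '[]');
    pending.push(sample);
    localStorage.setItem(STORAGE_KEY, JSON.stringify(pending));
}

// sample every 100 ms for 1/10th of a second accuracy
setInterval(function () {
    recordSample({ page: location.pathname, t: Date.now() });
}, 100);

// flush every 30 seconds; you can also call flush() on page load,
// so samples left over from a closed tab still make it to the server
function flush() {
    var pending = localStorage.getItem(STORAGE_KEY);
    if (!pending || pending === '[]') return;
    localStorage.removeItem(STORAGE_KEY);
    navigator.sendBeacon('/track', pending); // '/track' is a placeholder endpoint
}
setInterval(flush, 30000);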

Using API more efficiently

I am trying to load my playlist onto my website from Spotify by using their API and a few lines of AJAX code, but that data is loaded every single time I refresh the page, which hurts speed and UX. Is there any solution to this problem?
P.S.: a more complex problem: I was going to request the data once after a period of time and have a server-side Python application I wrote modify the HTML files stored on the server whenever the playlist changes (for example, when I add or remove songs from my playlist via Spotify). The purpose is not only to free the browser from requesting the data but also to get the chance to sort that data by publication date, rating, or artist for a better client-side layout. So my questions are:
Is it secure to let server-side applications modify the HTML files automatically (which means really saving the changes to the server-side HTML files)? It is a very long process, and I don't want to either leave that annoying work to the client side, which bothers the users, or update my website manually, which bothers me.
Are those things even possible?
Try using HTML5 localStorage to store your playlist in the client's browser along with a timestamp. Once the timestamp is a certain period old, any loaded page can reload your playlist if necessary.
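For example, a minimal sketch of that timestamp check, where the key name, the one-hour window, and the fetchFromSpotify/render callbacks stand in for whatever your code actually uses:

var CACHE_KEY = 'playlistCache';   // assumed key name
var MAX_AGE_MS = 60 * 60 * 1000;   // one hour; pick whatever period suits you

function loadPlaylist(fetchFromSpotify, render) {
    var cached = JSON.parse(localStorage.getItem(CACHE_KEY) || 'null');
    if (cached && (Date.now() - cached.savedAt) < MAX_AGE_MS) {
        render(cached.playlist);   // still fresh, skip the API call entirely
        return;
    }
    fetchFromSpotify(function (playlist) {  // your existing AJAX request
        localStorage.setItem(CACHE_KEY, JSON.stringify({ savedAt: Date.now(), playlist: playlist }));
        render(playlist);
    });
}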

Caching client side code results across pages

In our application, we are painting the navigation component using JavaScript/jQuery, and because of authorization this involves complex logic.
The navigation component is required on almost all authenticated pages, so whenever the user navigates from one page to another, the complex logic is repeated on every page.
I am sure that under particular conditions the results of these complex calculations will not change for a certain period, so I feel recalculation is unnecessary under those conditions.
So I want to store/cache the results on the browser/client side. One solution I can think of is creating a cookie with the results.
I need suggestions on whether this is a good approach. If not, what else can I do here?
If you can rely on modern browsers, HTML5 web storage options are a good bet.
http://www.html5rocks.com/en/features/storage
Quote from above
There are several reasons to use client-side storage. First, you can make your app work when the user is offline, possibly sync'ing data back once the network is connected again. Second, it's a performance booster; you can show a large corpus of data as soon as the user clicks on to your site, instead of waiting for it to download again. Third, it's an easier programming model, with no server infrastructure required. Of course, the data is more vulnerable and the user can't access it from multiple clients, so you should only use it for non-critical data, in particular cached versions of data that's also "in the cloud". See "Offline": What does it mean and why should I care? for a general discussion of offline technologies, of which client-side storage is one component.
if (typeof(Storage) !== "undefined") {
    // this will store and retrieve a key / value for the browser session
    sessionStorage.setItem('your_key', 'your_value');
    sessionStorage.getItem('your_key');
    // this will store and retrieve a key / value permanently for the domain
    localStorage.setItem('your_key', 'your_value');
    localStorage.getItem('your_key');
}
You can also try HTML5 Local Storage or Web SQL, which give you more options, although Web SQL support is much more limited than Local Storage. Have a look at this: http://diveintohtml5.info/storage.html
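As a rough sketch of how that could look for the navigation case above, where the key name, the 15-minute expiry, and the buildNavigation() function are placeholders for whatever your application actually uses:

function getNavigation(buildNavigation) {
    var TTL_MS = 15 * 60 * 1000; // assumed: results stay valid for 15 minutes
    var cached = JSON.parse(sessionStorage.getItem('navCache') || 'null');
    if (cached && (Date.now() - cached.savedAt) < TTL_MS) {
        return cached.html;       // reuse the cached result on every page
    }
    var html = buildNavigation(); // run the expensive authorization logic once
    sessionStorage.setItem('navCache', JSON.stringify({ savedAt: Date.now(), html: html }));
    return html;
}

Unlike a cookie, web storage is not sent to the server with every request, which is one reason it is usually preferred for this kind of caching.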

Save HTML5 video currentTime before user leaves or closes page

I would like to save the position of HTML5 video's currentTime to the database when user leaves a web page. It seems like window.onbeforeunload is not a reliable way to do it (not to mention it gives an undesirable popup window!). Is there a better way to do this?
I can't think of anything other than saving the position to the server periodically. But that seems wasteful resource/bandwidth wise. Netflix seems to do a good job at remembering your last viewed position. How would they be able to achieve that reliably? Low-level server-side C/C++ code maybe?
There are various ways to save such things:
Locally (for the same domain)
beforeunload
This gives you the possibility to cancel the unload event if you desire. However, the popup is optional.
unload
This event can't be cancelled but you still have full access to all nodes and JavaScript variables. So you can do final cleanup/save then if canceling is not wanted. You can use whichever saving method you like, e.g. document.cookie or window.localStorage.
window.onunload = function () {
    // note: the property is window.localStorage (capital S), and the key should be a plain string
    window.localStorage["myVideoCurrentTime"] = document.getElementById("myVid").currentTime;
};
If you can live with the fact that you can only read those cookies and localStorage values when the user comes back to your site, I think this approach would be perfect. You simply save the current time, and on the user's next visit you'll get the updated information.
On the Backend
If you really need to save that information to your backend you can try some other things.
DB polling
Depending on your accuracy needs, you could send an AJAX request every 10-20 seconds to update the playback time. If you just include the video id and the current time, the request is so small it shouldn't have an effect on performance. However, keep in mind that if you have lots of domain cookies it might increase the size of the requests tremendously, and your 500-byte payload might come with a 5 kB header.
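A rough sketch of that polling approach, where the endpoint, the video id, and the 15-second interval are assumptions:

var video = document.getElementById('myVid');

setInterval(function () {
    if (video.paused || video.ended) return; // nothing new to report
    // keep the payload tiny: just the video id and the current position
    fetch('/api/playback-position', {        // placeholder endpoint
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ videoId: 'myVid', currentTime: video.currentTime })
    });
}, 15000);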

How to performance tune AJAX heavy application

I'm developing a kind of image/profile search application that is based almost exclusively on AJAX. The main page basically displays profile images and allows the user to filter/search and paginate through them.
Pagination works when the user scrolls, so the interface has to be very, very fast. There will be only 6 (maybe 9, but definitely not more) images displayed on the main page, so users will scroll a lot. I'm currently using a very simple JS cache to store the results of all requests in case the user decides to go back ... in that case, I simply pull everything out of the cache instead of querying the server.
Client cache
One option that I thought of is to pre-load, say, 10 pages ahead and store them in the cache.
But my biggest issue is filtering/searching, since that completely changes the type of query that goes to the server. My filters aren't very complex, only around 6-7 string/number/enum attributes.
Now if I wanted to do all the filtering in the cache, I would have to duplicate all the search logic and fetch all the data from the server (not just the data I'm displaying), so I could filter the results on client side.
This raises a question: should I make the cache somehow persistent? Store it in a cookie maybe?
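For reference, the kind of simple keyed cache described above might look like the following sketch; the query serialization and the '/profiles.json' endpoint are assumptions:

var pageCache = {}; // in-memory cache keyed by the exact filter set and page

function cacheKey(filters, page) {
    return JSON.stringify(filters) + '|' + page;
}

function loadPage(filters, page, render) {
    var key = cacheKey(filters, page);
    if (pageCache[key]) {
        render(pageCache[key]); // cache hit: no request at all
        return;
    }
    $.getJSON('/profiles.json', $.extend({}, filters, { page: page }), function (data) {
        pageCache[key] = data;
        render(data);
    });
}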
Server cache?
One suggestion might be to use memcached on the server and just store everything there. I'm definitely going to cache away all the results I can, but that doesn't save the server from handling loads and loads of AJAX requests.
I'm developing this application on Rails 3, and even though I love it, I wouldn't say it's the fastest thing in the world. Another option this gives me is to create a separate Rack/Sinatra application to handle only the AJAX requests. By this I mean requests from the main query, not all AJAX.
S3 for images?
A big part of this application is images, even though they're mostly small thumbnails (unless the user wants to display them bigger).
At the moment, I don't have problems with bandwidth. My VPS host provides me with 200GB, which should be more than enough (I hope). The problem is loading speed. Would it help if I uploaded all the images to S3 and loaded them from there, or is this worth doing only for larger files? I'm going to load a lot of 100x150px images, which are generally under 50kB.
Have you looked at SlickGrid? It has an interesting idea of only building the list as users scroll down and then removing elements as users scroll back out of that range.
