I am in the process of translating my Chrome extension into a Firefox extension. I am having a problem though.
My extension requires stored data to be available both in the content script and in the popup page. I have no problem with this in my Chrome extension: I just use chrome.storage to save and retrieve data, and I can use it in both my content and popup scripts with ease.
With Firefox, I'm having a hard time figuring out exactly what I have to do differently. I gather that I can't use chrome.storage and instead have to use the
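For reference, on the Chrome side it's roughly something like this (the key names are just examples):
// From either the content script or the popup:
chrome.storage.local.set({ settings: { theme: "dark" } });
chrome.storage.local.get("settings", function (items) {
  console.log(items.settings);   // the same data is visible on both sides
});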
const storage = require("sdk/simple-storage").storage;
thing, but I need to use this in both the content script and the script for the popup page. I researched and found I can't use the require function more than once, so my question is, would I be able to share the variable between the popup script and the content script? I need the storage from both sides and there isn't any other way I can see to make the extension work.
Thanks.
You use message-passing to have the content script and the main add-on communicate with each other.
Two possible approaches:
Send the data down in advance if it isn't too large / doesn't affect too many tabs, and also push down updates as they happen. This gives superior read latency at the potential cost of a larger memory footprint and more expensive writes.
Request individual data items on demand. This is better for frequently written or large items of data, but comes at the cost of higher latency per request. (Both approaches are sketched below.)
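For example, here is a rough sketch using the SDK's page-mod and port messaging (the event names and the myData key are made up for illustration):

// main.js - only the main add-on touches simple-storage
const { PageMod } = require("sdk/page-mod");
const data = require("sdk/self").data;
const storage = require("sdk/simple-storage").storage;

PageMod({
  include: "*",
  contentScriptFile: data.url("content.js"),
  onAttach: function (worker) {
    // approach 1: push the data down as soon as the content script attaches
    worker.port.emit("storage-data", storage.myData);
    // approach 2: answer individual requests on demand
    worker.port.on("get-item", function (key) {
      worker.port.emit("item-value", { key: key, value: storage.myData[key] });
    });
  }
});

// content.js - talks to the main add-on through self.port
self.port.on("storage-data", function (data) {
  // use the pushed-down copy of the stored data here
});
self.port.emit("get-item", "someKey");   // pull a single item on demand
self.port.on("item-value", function (msg) {
  // msg.key / msg.value
});

If your popup is an sdk/panel, the same pattern applies: the panel's content script talks to main.js over its port, and only main.js reads or writes simple-storage.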
You might also want to look at WebExtensions. The storage API is not yet usable from content scripts there either, but it probably will be in the future.
Related
I have a web page that references and initializes multiple instances of the same ASP.NET generic user control.
What I want to do is cache/store the entire contents (HTML) of those controls somewhere on the client using the jQuery detach() method.
The localStorage solution does not fit here, as it has a limit of 5MB, which is too low for my needs.
For now, I am using a global variable, more specifically a JavaScript key-value array, to store the data.
What do you think of this solution? Will I notice any lag or performance issues in the browser? Is there any limit on that global variable?
Also, is there a better way to implement such a task?
For cross-browser compatibility, you can try an AJAX call that pulls in your massive data incrementally and cache the call (stored as JSON/JSONP). jQuery has a cache mechanism, but the meat of the implementation is going to be in the headers of the page you call. Specifically, you're going to want to add Expires, Last-Modified, and Cache-Control headers on the pages you AJAX in.
Then you'll want to pull in the data asynchronously and do the appropriate UI manipulation (if needed).
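On the client side, that might look roughly like this (the endpoint and target element are placeholders):
// Fetch the heavy markup once and let the browser's HTTP cache handle repeats.
// cache: true (the default for GET) stops jQuery from appending a cache-busting parameter;
// the real win comes from the Expires / Last-Modified / Cache-Control headers the server sends.
$.ajax({
  url: "/controls/heavy-fragment",   // placeholder endpoint
  type: "GET",
  dataType: "html",
  cache: true
}).done(function (html) {
  $("#container").html(html);        // placeholder target element
});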
You don't want to store massive data in a single variable, since it's going to take longer for the JavaScript engine to work through it.
localStorage: still an edge technology, implemented differently across vendors, and not backwards compatible (although there are JavaScript libs that help mitigate this)
Cookies: not big enough
On-page JSON or JS variable: you lose abstraction and increase initial page weight (which is going to be unsatisfactory if you're on mobile)
Whatever implementation you choose, I would run some simple benchmark performance tests so you have metrics to back up your code.
This will cause browser lag and multiple issues. You can pretty much guarantee that a mobile browser isn't going to work in this scenario because no sensible mobile browser is going to let you download and store 5MB+ in the LocalStorage object. It is a really bad idea to put 5MB+ of HTML into the DOM of any browser and not expect any kind of performance issue.
If you're not concerned about mobile, then look at IndexedDB. It allows a greater amount of storage and it persists even after the session is closed. It is fairly well supported in recent Chrome and Firefox browsers, but requires IE10 or higher.
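If you go that route, a minimal sketch might look like this (the database, store, and record names are made up):
var request = indexedDB.open("fragmentCache", 1);

request.onupgradeneeded = function (event) {
  // runs only when the database is created or upgraded
  event.target.result.createObjectStore("fragments", { keyPath: "id" });
};

request.onsuccess = function (event) {
  var db = event.target.result;
  var tx = db.transaction("fragments", "readwrite");
  // store the detached control's markup under an id of your choosing
  tx.objectStore("fragments").put({ id: "control-1", html: "<div>...</div>" });

  // later: read it back
  db.transaction("fragments").objectStore("fragments").get("control-1").onsuccess = function (e) {
    console.log(e.target.result && e.target.result.html);
  };
};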
Are there any performance differences between loading a formatted element through an iframe and retrieving JSON data via an AJAX call and letting JavaScript put it into the HTML design? I tend to look at how major websites do it, and I noticed that eBay uses a lot of iframes. On the homepage there are about four iframes; one of them is obvious though, since it's an advertisement.
Also, within an iframe I can't access the parent's JavaScript files although it's on the same domain, so within the iframe I have to load the .js file again. I wonder whether this is a technical issue or a safety precaution in terms of XSS, which would still be weird because it's on the same domain... An example is the jQuery .js distribution file: I have to load it in both the parent and the iframe. Would browsers use the parent's cached version of it or download the whole .js file again?
This is actually a number of questions.
First I'll address overall speed:
Short answer: it depends. There are a number of factors here.
Ajax method - probably faster to load the data from the server, slower to display client-side.
IFrame method - probably slower to load from server, faster to display client side.
I would think the trend towards using iframes for ads is more to do with security concerns and overall design requirements.
Within the iframe, if it is loaded from the same domain as the parent, you should be able to do parent.$ or parent.jQuery. See this question.
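For example (same-origin frames only):
// Inside the iframe: reuse the parent page's copy of jQuery instead of loading it again.
var $ = window.parent.jQuery;
// The parent's jQuery is bound to the parent document, so pass this frame's document
// as the context when you want to select elements inside the iframe.
$("#element-in-iframe", document).hide();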
Caching introduces a whole extra layer into this. Caching will probably happen, though it really depends on browser settings or proxy settings even.
Well, it completely depends on how and from where the data/HTML is coming.
If the HTML content is coming from a CDN or from another domain, and you just want to display that content with all of its working functionality, then an iframe is a good fit.
Whereas if you want to load the data fast and with good performance, I think AJAX will be preferable.
I am interested in making a website that flashes through a visitor's entire web history when they visit. I plan on using JavaScript to grab the history on each visitor's computer and animate through it with varying speed depending on how much history they have. My thought was to use history.length to determine the length of the visitor's history, and then use history.go() to navigate -1, -2, -3, etc. through the entire web history. I recognize that load times would be HUGE, but right now I am just trying to think through the concept. This related question seems like what I would use as the basis of my code; however, I don't understand why they say that this method would not work. I am a student who is very new to JavaScript.
Do you guys have any knowledge of whether or not this will work, or any ideas on ways to achieve my idea?
You can call history.go() once. That's about as far as you'll get. The reason is simple: once you're on the previous page, your JavaScript is gone. Iframes won't work either, because you can't execute your own JS in an iframe that holds a page from another domain. Read about the same-origin policy for more info on that.
The only real solution I can think of is a browser extension. The reason that'll work is due to the fact that your JS can persist across multiple sites. You'd probably just need a userscript in each page that does the following:
check a variable to see if the functionality is enabled
if it is, call history.go(-1) after a timeout (to control the speed)
I'm most familiar with Chrome so I'm imagining a browserAction to enable/disable the script and a content script that does the redirect. Other potential options include Greasemonkey (Firefox), Tampermonkey (Chrome), Personalized Web (Chrome) scripts
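As a very rough Greasemonkey/Tampermonkey-style sketch (the walkEnabled flag is made up and would be toggled elsewhere, e.g. from your browserAction via GM_setValue):
// ==UserScript==
// @name     history-flipbook (sketch)
// @match    *://*/*
// @grant    GM_getValue
// ==/UserScript==
(function () {
  // GM_getValue persists across sites, unlike localStorage, which is per-origin
  if (GM_getValue("walkEnabled", false)) {
    setTimeout(function () {
      history.go(-1);   // the script runs again on the previous page and keeps stepping back
    }, 1500);           // the timeout controls the playback speed
  }
})();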
As stated in the question you linked to, JavaScript and/or the DOM do not give you access to the entire browser history, since that would be a severe privacy violation. Imagine going to a site and having it be able to know every site you ever visited in that browser.
This would potentially give the site access to:
Sessions you are still logged into on other sites (if they store the session key in the URL, as some sites do)
Insight into what kind of activities you perform (are you a moderator on site X?)
Enormous amounts of data on what you are interested in.
This is not something that standards bodies or browser manufacturers thought users would be interested in sharing with everybody. That's why there isn't an API to walk through the browser's entire history.
#sachleen has already provided a very nice in-depth answer on how you can get around this limitation for individual browsers if you want to build this application. For the sake of completeness I'll simply mention the key term: "browser extension". :-)
I'm currently building a portfolio-website for an architect that has a whole lot of images on its pages.
Navigation is done with history.js (=AJAX). In order to save loading time and make the whole thing more "snappy", I wrote a script that crawls the page body for links to other pages and fetches these automatically in the background. So far, it works like a charm.
It basically keeps a queue array that holds all the links. A setTimeout() function works through them and fetches each page using jQuery's $.ajax(). The resulting HTML is stored in a JavaScript object.
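In rough, stripped-down form the idea is something like this (selectors and timing simplified):
var queue = [];   // links found in the page body
var cache = {};   // url -> fetched HTML

$("body a[href]").each(function () {
  queue.push(this.href);
});

function fetchNext() {
  if (queue.length === 0) return;            // done
  var url = queue.shift();
  $.ajax({ url: url, dataType: "html" })
    .done(function (html) { cache[url] = html; })
    .always(function () {
      setTimeout(fetchNext, 500);            // work through the queue one page at a time
    });
}

fetchNext();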
Now, here's my question:
What are possible problems that might occur when using this on different machines/browsers/operation systems?
I'm thinking about:
max. JavaScript object/variable size (the fetched HTML is stored in a JavaScript object)
possible performance problems
max. number of asynchronous requests?
… anything you can think of?
Thanks a lot in advance,
a hobby programmer
Although it might be a good idea to cache the whole website on the client side, there are a lot of things that can cause issues:
Memory
Unnecessary load on the webserver
Loading unneeded pages into memory
Some users have a data limit on their internet connection, so loading the entire website is not smart in those cases
Once the user navigates away or refreshes, the entire "cache" is gone
What I would do is first try to optimize the server side.
Add a bunch of caching mechanisms on the way from the database to the user; the "Expires" header can really help you.
And if that doesn't help, I would then think about caching some pages (which ones is for you to decide) in the offline cache (see HTML5 Offline Features).
That way you are safe even on a page reload, keep memory usage to a minimum, and only load what you need.
PS: Don't try to reinvent stuff that the browser already has :P
You should queue the async requests, and only launch one at a time.
And since you're storing everything in variables, at some point you (the browser) may consume too much memory, and the whole thing can become very slow. I suggest you limit the size of your cache to a certain number of pages.
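A small self-contained sketch of that kind of cap (the limit is arbitrary):
var pageCache = {};
var pageOrder = [];    // URLs in insertion order
var MAX_PAGES = 20;    // arbitrary limit

function cachePage(url, html) {
  if (!(url in pageCache)) pageOrder.push(url);
  pageCache[url] = html;
  while (pageOrder.length > MAX_PAGES) {
    delete pageCache[pageOrder.shift()];   // evict the oldest page first
  }
}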
You can also try not to store the fetched content at all - just fetch it and throw it away. The browser will still cache the fetched pages and images in its internal storage, so subsequent loads will be much faster (provided, of course, the ajax library does not forcibly disable the cache, e.g. by using POST).
Just curious about this: is it possible to cache JavaScript? That is, to minimize the client CPU needed to recalculate some logic each time I refresh the browser?
Take the Google JavaScript map for example. When I embed the map on my page, is there any cache mechanism that I can use on my page?
JavaScript execution will occur on each page load. One alternative is to change how you call the JavaScript by first checking whether the value has already been calculated before executing the calculation. To accomplish this, you need to store the calculated value in some form of state, such as the session, the URL as a query string parameter, or a cookie. This ensures that the first time the page loads, the value is calculated and stored; on each subsequent page load, the value is pulled from state rather than re-calculated.
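For example, a minimal sketch using sessionStorage as the state store (computeExpensiveValue() stands in for your own logic):
var cached = sessionStorage.getItem("expensiveValue");
if (cached === null) {
  cached = String(computeExpensiveValue());        // run the calculation only on the first load
  sessionStorage.setItem("expensiveValue", cached);
}
// use `cached` on every subsequent page load instead of recalculating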
The client can cache the .js file locally (preventing it from being downloaded again), but the operations that file performs are performed on every load.
As for Google Maps, it needs to perform its operations to display the map. Other than letting the client cache the .js file (thus saving the download), there is not much you can do.
The best you can do is limit the amount of processing the client needs to do, or, if the result of your processing is scalar (strings, numbers, arrays of those), you can store it in a cookie for later use. DOM manipulation is done on every load.
Do all your heavy processing on the server when possible.
You can't cache the result of compiling the JavaScript, but you can avoid loading parts of your application logic until they're needed - that is, at the moment you need some logic, add a new <script> tag through the DOM for the functionality you need.
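Something along these lines (the file name is a placeholder):
function loadModule(src, onLoad) {
  var script = document.createElement("script");   // add a new <script> tag through the DOM
  script.src = src;
  script.onload = onLoad;
  document.head.appendChild(script);
}

loadModule("/js/feature.js", function () {
  // the feature's functions are available from here on
});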
I think you're talking about the images loaded by the JavaScript from Google's servers?
There is a huge grid of images for each detail level and it doesn't logistically make sense to cache these. After a few minutes of scrolling around in google maps, you'd have enough images to fill your hard drive several times over!
Some browsers don't handle JavaScript as well as others. Firefox is temporarily lagging behind, but both Google Chrome and Safari are extremely fast. Safari is worth a download because its developer tools will show you exactly what is taking so long to happen.