We have a web app that runs inside Facebook (i.e. running in an iframe on a different domain). If a Safari user has Cookies and Website Data set to the default, "Allow from websites I visit", the data we store via localStorage.setItem acts like sessionStorage, i.e. it is not available beyond the user's current session (that is, after the user closes the tab). If we change the setting to "Always allow", it works fine, just like in Chrome, IE, etc.
As a test, we've tried navigating the browser directly to our app's domain (https://ourappname.appspot.com), and it works fine there. At that point it should also truly count as a visited website, yet when we go back to the game within Facebook, the problem still exists.
Note that the setItem call succeeds; it's just that getItem doesn't return anything in a subsequent session. (So it's not like Private Browsing, where the setItem call itself fails with a quota-exceeded error.)
What do we need to do to support Safari so that our app, running within Facebook, can use localStorage as intended where the data will survive between sessions?
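For reference, here's a minimal sketch of what we're observing (the key name is just for illustration):

// Session 1, inside the Facebook iframe, under "Allow from websites I visit":
localStorage.setItem("gameState", JSON.stringify({ level: 3 })); // no exception
localStorage.getItem("gameState"); // returns the JSON string, as expected

// Session 2, after closing the tab and reopening the app in Facebook:
localStorage.getItem("gameState"); // null, as if sessionStorage had been used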
It's either a bug in Safari or a security feature.
You are visiting Facebook, not your website. Your app is in an iframe, which means letting it read any data from the browser would violate the security model. Imagine if a competitor's site could read data it did not set; that would constitute an information leak.

Safari is doing its job well in that regard.

Ideally, in "Allow from websites I visit" mode, no browser should let iframes set data in localStorage, even if every domain has its own storage sandbox.

What's troubling me is why they even let you write to localStorage from an iframe at all (in your "Allow from websites I visit" mode). That might actually be a bug: one that enables information-spoofing attacks.

I think it's because security exceptions were dropped from localStorage for requests with a not-same-party origin. So Safari may not throw an error but instead let the write silently fail (in some cases). That's probably why your setItem call appears to succeed.

At this point, with the given information, I suspect you are out of luck, because Safari's developers are following the standard to the letter.
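(If it helps to narrow this down, here's a small sketch for probing the storage; the key name is made up. Note it can only prove persistence within the current session, which is exactly the part that already works for you:)

function probeLocalStorage() {
  var key = "__storage_probe__"; // hypothetical key, purely for testing
  try {
    localStorage.setItem(key, "1");
    var ok = localStorage.getItem(key) === "1"; // read-back within this session
    localStorage.removeItem(key);
    return ok;
  } catch (e) {
    // e.g. a quota-exceeded error in Safari's Private Browsing
    return false;
  }
}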
I'm still waiting on a reply from Apple, but it's safe to say we're stuck with this behavior. So Anubhav's answer is accurate, but we still needed a solution.
So as a workaround, we created new endpoints on our server for persisting/restoring game state. We only use this for Safari; for all other browsers we still persist the game state in localStorage.

There is a slight performance penalty for the user, and a slight server cost. Not a sexy solution, but now our Facebook canvas app supports Safari.
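Here is roughly the shape of that fallback; the endpoint path, key name, and Safari check below are placeholders, not our actual code:

// Hypothetical endpoint and detection; adjust for your own backend.
var isSafari = /^((?!chrome|android).)*safari/i.test(navigator.userAgent);

function saveState(state) {
  if (isSafari) {
    // Persist on the server instead of in localStorage.
    return fetch("/api/state", { // "/api/state" is a placeholder path
      method: "POST",
      credentials: "include",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(state)
    });
  }
  localStorage.setItem("gameState", JSON.stringify(state));
  return Promise.resolve();
}

function loadState() {
  if (isSafari) {
    return fetch("/api/state", { credentials: "include" })
      .then(function (r) { return r.json(); });
  }
  return Promise.resolve(JSON.parse(localStorage.getItem("gameState") || "null"));
}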
I have been stuck on this for a couple of weeks now; it is a follow-on from the SO question Delphi REST Debugger Returns Error 429 Too Many Requests but Browser Returns JSON as Expected.

I want to get the content of a URL response using the TNetHTTPRequest and TNetHTTPClient components, but I was continually getting 429 "Too Many Requests" errors. Using Firefox's Inspect Element to look at the network and storage tabs, I discovered that I needed to receive cookies and then send those cookies with my request. Unfortunately, one of the cookies essential to the website content seems to depend (I think) on the execution of JavaScript. I went back to first principles and dropped a TWebBrowser on a form (VCL), and sure enough the browser shows a JavaScript error, "Expected Identifier".

When I use TWebBrowser in FMX it does not throw an error; it simply does not return the website contents at all and remains blank. I need FMX as I will be targeting a cross-platform mobile environment.
The URL is https://shop.coles.com.au/a/national/home
I use Delphi Community Edition 10.3.3 Rio.
The URL returns perfectly in commercial browsers (Firefox, Safari, Chrome) and even in CEF4Delphi. Unfortunately, I can't use CEF as I need cross-platform support.

I would like to know how to get the website content returned to the browser (or, even better, to the NetHTTPClient) without script errors, and how to access the browser's current cookies.
Any help will be most appreciated.
Thanks,
John.
URL returns perfectly in commercial browsers ... without script errors and how to access the browser's current cookies
If you inspect the network traffic (F12 > Network, then request your URL) or use uMatrix (to block everything that doesn't belong to the domain by default), you'll see the JS makes at least one XHR to amazonaws.com. Your HTTP transfer alone (as done by TNetHTTP*) works fine, and you get the same resource that every internet browser gets.

However, you don't then do anything with what you got (in contrast to the internet browser, which also automatically parses the HTML, sees the JS resources, and executes them). TWebBrowser does not do what you take for granted, most likely due to security settings (try to get an error console in there, preferably via F12 again). You need to do the same: parse the HTML resource for JS URIs, request those, and execute what you get, while still providing the same cookie environment.

For executing JS you could use Chakra, mORMot, or BESEN. It's challenging at first, but the more you understand about HTTP (including cookies) and a JS engine, the more you'll see why "things work" in one situation and not in another. There's a reason an internet browser is very complex software and not just a downloader.
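To make the moving parts concrete, here is a rough sketch of the steps, written in JavaScript (Node 18+) only because that's the language the page itself executes; the Delphi version would do the same with TNetHTTPClient plus a JS engine. The regex and cookie handling are deliberately naive:

// Sketch: reproduce what a browser does before any content appears.
async function fetchLikeABrowser(url) {
  const page = await fetch(url); // 1. get the HTML; note any Set-Cookie headers
  const cookies = page.headers.get("set-cookie") || "";
  const html = await page.text();

  // 2. find external script URIs in the HTML (naive regex for the sketch)
  const scriptSrcs = [...html.matchAll(/<script[^>]+src="([^"]+)"/g)]
    .map(function (m) { return new URL(m[1], url).href; });

  // 3. request each script with the same cookie environment; a real browser
  //    would now *execute* them, which is the part that needs a JS engine
  //    (Chakra, mORMot, BESEN, ...) and may set further cookies.
  for (const src of scriptSrcs) {
    await fetch(src, { headers: { cookie: cookies } });
  }
  return html;
}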
As per this, forcing IE11 Quirks mode when using TWebBrowser might already cure your problem:
TBrowserEmulationAdjuster.SetBrowserEmulationDWORD(TBrowserEmulationAdjuster.IE11_Quirks);
I've been playing around with the chrome.storage.sync API as part of a Google Chrome extension that I'm building.
The API makes it clear that if you sign in to the Chrome browser with your Google account and use chrome.storage.sync.set then all the data that is set will be accessible next time you sign in to a Chrome browser with the same Google account and use chrome.storage.sync.get.
What the API doesn't make particularly clear is how chrome.storage.sync behaves when not signed in to the Chrome browser.
From my experiments it appears that, when not signed in to the Chrome browser, chrome.storage.sync.set and chrome.storage.local.set save to different places.
It says in the API:
When Chrome is offline, Chrome stores the data locally. The next time the browser is online, Chrome syncs the data. Even if a user disables syncing, storage.sync will still work. In this case, it will behave identically to storage.local.
It appears that the place where Chrome "stores the [synced] data locally" is different from where chrome.storage.local.set stores it. Can anyone confirm whether this is true?
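For reference, my experiment boils down to something like this, run while signed out of Chrome (the key name is arbitrary):

// In an extension with the "storage" permission, signed out of Chrome:
chrome.storage.sync.set({ probe: "sync-value" }, function () {
  chrome.storage.local.get("probe", function (items) {
    console.log(items.probe); // undefined: local never sees the sync write
  });
  chrome.storage.sync.get("probe", function (items) {
    console.log(items.probe); // "sync-value": still readable via sync
  });
});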
When chrome.storage.sync is unable to connect to the internet, it stores data in a separate, temporary place locally. It does not store it in chrome.storage.local, and the data is still accessed through chrome.storage.sync.

The confusion seems to be in the language: chrome.storage.local is a separate location, and chrome.storage.sync will behave LIKE chrome.storage.local. It does not use the same storage location.
Actually, the Chrome documentation states this clearly at https://developer.chrome.com/extensions/storage:
When Chrome is offline, Chrome stores the data locally. The next time the browser is online, Chrome syncs the data. Even if a user disables syncing, storage.sync will still work. In this case, it will behave identically to storage.local.
I am working on a website for a women's shelter, and they want a "panic button" that automatically takes you to another site. This is pretty common, but I also need it to automatically clear the cache and history, so the abuser can't hit the "back" button or check the history to see what they were looking at before being interrupted. Any ideas?
I think the answer is "this can't be done as website functionality" unless the user installs a browser plugin. For example, here's what the Mozilla Developer Network (MDN) says about this:
For security reasons .... there is no way to clear the session history or to disable the back/forward navigation from unprivileged code. The closest available solution is the location.replace() method, which replaces the current item of the session history with the provided URL.
"Unprivileged" basically means any JavaScript that a website might run in the browser.
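So the closest you can get from ordinary page script is something like this (the target URL is just an example):

// Replaces the *current* history entry, so "back" won't return to this page,
// but it does NOT clear the rest of the session history.
function panic() {
  location.replace("https://weather.example.com"); // any innocuous site
}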
Now a "panic button" plugin / add-on would be able to do this kind of thing, but:
The user has to install it. (Simple for a moderately tech-savvy person ...)
If someone looks at the browser, it will be apparent that it has been installed.
There is also the issue of how people can decide whether to trust a plugin like this to be properly implemented and not contain nasty stuff.
A better idea would be to educate the user in using the browser's incognito / private-browsing mode. However, that still leaves traces on the user's computer (depending on the browser, among other things), and in external network logging, etc.
Ran into something lovely, and intermittent, while trying to write cookies today in an iframe via JavaScript.
So say I assign a new cookie:
document.cookie = "key=value;";
Reading said cookie back returns an empty string:

document.cookie // ""
I've tried this across both IE9 and IE10, and it works for some users and not others, on the same browser versions.

Furthermore, it seems to be machine-specific. We're on an enterprise network, and a user can log into one machine and be okay, while at another station we'll see this behavior.

I've thought about some Group Policy setting, but that last point has me at something of a loss.
Edit: some more details.
Cookies are allowed on these machines.
This isn't a session/persistent cookie mix-up; reading document.cookie literally returns "", even right after the assignment.
Here's a big one I missed: I'm trying to set these from within an iframe. And there doesn't seem to be an issue with P3P headers, as there are machines that view it just fine, on the same browser versions.
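To be explicit, the repro on an affected machine is simply:

document.cookie = "key=value; path=/";
document.cookie // "" on affected machines, "key=value" elsewhere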
I was experiencing the same problem and found that the IE users who could not take the cookie had Protected Mode enabled:
Internet Options/Security/Enable Protected Mode (uncheck)
I have two copies of IE7 with the same exact security settings and same exact builds, on two different machines, both running WinXP. In my application, my cookie headers are properly sent to the server in one copy of IE; in the other, no cookies are sent at all.
What are some points to troubleshoot in this scenario?
Try Fiddler to trace what's happening. It's more appropriate (and simpler) than Wireshark for this purpose.
http://www.fiddlertool.com/fiddler/
You may want to get something like Wireshark to see what is being sent across the wire. Firebug has some net utilities like this, but your problem seems specific to IE. Still, trying another browser couldn't hurt when troubleshooting this issue.
Other items to look for are the advanced properties of the IE installation and the zone that the website is in.
Be careful which links you are accessing. It took me almost a day to discover why the same browser sometimes sent the session cookies and sometimes didn't.

Accessing the page via http://www.example.com will create different cookies than http://example.com (without the 'www'), because the browser sees them as two different hosts. :)
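If that's the culprit, setting the cookie's domain explicitly makes it valid for both hosts (example.com stands in for your real domain):

// A cookie scoped to ".example.com" is sent for both
// http://example.com and http://www.example.com
document.cookie = "key=value; domain=.example.com; path=/";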
Also be careful about your browsers' settings; you should make sure they are identical.