Problem with Document expired after pressing "back button" in browser - javascript

We are using an e-commerce script which is encoded with ionCube. In the product catalog we have filters, which are sent via the POST method. Because the script is encoded, I cannot switch them to GET, even if I wanted to.
When a user opens a product details page by choosing one of the filtered products in the catalog and then goes back via the browser's back button, he gets "Document expired". After clicking the refresh button, the correct catalog page is shown with all the filters that were chosen.
We tried to set this on server:
ini_set('session.cache_limiter','public');
It helps with the problem above, but it corrupts the cart page - everything goes crazy.
I tried many scripts found on Stack Overflow and elsewhere on the net, but none of them work.
Please note that I also cannot modify the PHP, because of ionCube: whenever I try to add anything to index.php, I get a corruption notice after the page reloads.
Any solution?

Related

WooCommerce - Firefox must send information that will repeat any action

This annoying message pops up every time you refresh a page after adding a product to the cart on that page, and also when you fail to save changes in the user account details and then refresh the page.
I've already read the solutions to this problem in other similar questions, such as:
Change your request type from POST to GET
Change the reload call from window.location.reload(); to window.location = window.location;
Redirect users to another page
It seems like such changes would have to be made directly in Firefox (which is not what I want), except for redirecting users to the shopping cart (which I'm not considering at this point unless there is no other solution).
I want to fix the problem for every WooCommerce user who browses the store with Firefox. So, is there any way (other than redirecting) to prevent WooCommerce from triggering this warning message in Firefox?

Popup displaying on refresh

I have a form, with code to show a popup when I press a create/edit link. Now when I do a page refresh, I get the browser's resubmission warning popup.
I have managed to stop the popup from appearing when Retry is pressed, by handling it in the code-behind of my aspx, but when Cancel is pressed, the page blinks (I guess it renders again?) and the popup is shown.
It doesn't go back to the server. It just runs the JavaScript function that displays the popup, and shows it.
It should be noted at this point that this popup is just a <div> which can be shown or hidden. The default state of this <div> is hidden.
Please help me solve this issue, and also explain why it is happening. I haven't been able to find anything on the internet explaining it.
When submitting a form, content may be sent with either POST or GET.
Sending with GET appends the values to the address of the page you are requesting. It could look like this:
www.domain.tld/page?value1=apple&value2=banana
Sending with POST sends the values in the body of the request, hidden from the address bar, which the server receives.
Clicking "Retry" will reload the page, resubmitting the information currently held in the POST body. Clicking Cancel should display the address you are heading to without resending the POST content.
I hope this answers your question. If not, is there any way for you to show the piece of code that handles the POST data?
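For illustration, the two ways the same form data travels can be sketched in Python (using the placeholder domain from the example above):

```python
from urllib.parse import urlencode

# The same form data, sent two ways. The domain is the placeholder
# from the example above, not a real site.
params = {"value1": "apple", "value2": "banana"}

# GET: values are appended to the URL as a query string, so they are
# visible in the address bar and a reload is harmless.
get_url = "http://www.domain.tld/page?" + urlencode(params)

# POST: the same values travel in the request body instead, which is
# why reloading the page makes the browser ask before resending them.
post_body = urlencode(params).encode("ascii")

print(get_url)
print(post_body)
```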
The browser saves the form data when you submit it, and when you refresh the page, it attempts to send that data again. The popup is the browser warning you that this is about to happen. The warning matters because the form could be on a shopping site, where resending the data would result in accidentally buying the same things multiple times.
To fix this, you can redirect to another page once the form has been submitted (the Post/Redirect/Get pattern), or you can add code to reset the form so the data won't be sent again.
We should follow a best practice to solve this problem; better have a look at this. When you press the cancel button, it simply loads the previous page and the values are persisted.
My understanding so far is that when you press the cancel button, the values for the page are taken from the browser's cache. I cleared the cache to test this theory. The cache isn't just storing the values of the page but also the last server response received. In my case, the last server response was to show the popup by calling my JavaScript function with the required values, which is exactly what it did.
My workaround was to make the closing button a server command as well, so that the final response would be to hide the popup.
Please do let me know if there is something wrong in this explanation.

Python - How to scrape multiple dynamically updated forms / webpages?

I've been trying to scrape a dynamically updated website. Each webpage contains hundreds of rows, and the website has thousands of pages in total (each page is accessed by clicking a "next" button or a number at the bottom of the page, just like at the bottom of a Google search results page).
While I've been able to scrape the pages successfully, I've had trouble getting 100% accuracy in my results, mainly because the pages are dynamically updated with JavaScript. When a user logs in to their account, the system moves them back to the very top of the first row of the first page. So, for example, if I were about to scrape page 101 while still on page 100, and a user on page 101 logged in to their account, I would miss that user's info. Given the volume of activity, this can be quite problematic.
I tried running my automation during the wee hours, but realized there were users worldwide, so that was a fail. I also can't scrape pages in parallel, because the forms are accessed/uploaded through JavaScript and I've had to use Selenium to click through one page at a time. (There's no unique URL per page; I've also looked through my browser's Network tab, but no variable changes when I click to another page.) I also tried accessing the API following the instructions here, but the link I was able to obtain only returns the information on the current page, so it's no different from what I can already access through the HTML source.
What are my options? Is there someway I can catch all the information at once so that I don't risk missing any information?
I know there will be people asking for the URL, but unfortunately I can't give it away. Even if I did, I couldn't give away the username and password. I'm a beginner at web-scraping, so any help is really appreciated!
If you've got no problem hitting the pages as many times as you want, and the information never disappears, just cycle through all the pages as fast as you can, over and over again. In Selenium you can control multiple tabs and/or browsers simultaneously, all using the same cookies, to make your scraping faster.
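The sweep-and-merge idea can be sketched independently of Selenium (which just drives the clicking). Here fetch_page is a hypothetical stand-in for however you navigate to page n and read its rows; every name in this sketch is illustrative:

```python
# Sweep the paginated listing several times and merge the results,
# keyed by a stable row id. A row that shifted pages mid-sweep (e.g.
# because a user logged in) is missed on one pass but caught on a
# later one, and the id keeps it from being counted twice.


def sweep_all(fetch_page, num_pages, passes=3):
    seen = {}
    for _ in range(passes):
        for page in range(1, num_pages + 1):
            # fetch_page(page) must yield (row_id, row) pairs.
            for row_id, row in fetch_page(page):
                seen[row_id] = row  # last write wins; ids deduplicate
    return seen
```

More passes trade time for a lower chance of missing a row that happened to move while you were mid-sweep.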

Is it possible to prevent Oracle APEX from submitting a page until a user clicks a refresh button on that page?

I have just started working with Oracle APEX and would like users to be able to download reports from my application. The problem is that a number of my reports have a large number of rows. Each time a user clicks on a page tab, the page is resubmitted and the queries for the reports are executed again. This causes long delays and is becoming frustrating for the users!
Is it possible to stop APEX from resubmitting the page until the user clicks a refresh button, or to stop the report queries from executing every time the user clicks on a page tab?
To prevent submitting, you can change the page template. Open the page properties and, in the Shared Components section, find Templates. Next to the word Page you will see a link to its template. Follow this link, then find the section Standard Tab Attributes. In the field Current Tab you will see something like this:
<li>#TAB_LABEL##TAB_INLINE_EDIT#</li>
Change this value to:
<li><a class="active">#TAB_LABEL#</a>#TAB_INLINE_EDIT#</li>
After that, the active item in the menu will be displayed as static text, not as a link.
All pages with this page template will get this behavior. If you don't want to change the behavior of all pages, make a copy of the template before changing it, change the copy, and select the new template in the properties of the relevant pages.
Have you tried conditions? I'm pretty new to APEX too; I had a similar problem, and what I did was put conditions on the buttons and regions.
After that I got a nice result. Hope it helps you.
Good luck

How does Twitter display my profile instantly?

Context
I noticed that on Twitter, the profile page is displayed in different ways depending on how it is reached:
By clicking the profile link in the menu, the DOM and the latest tweets are loaded, and the page is displayed in ~4 seconds. Every time.
Using the keyboard shortcut GP (or the link on the left), the page is displayed instantly.
Details
I noticed that the profile must have been displayed recently for GP to display the page instantly.
After closing and reopening the browser, the profile must be displayed again before GP will display the page instantly.
Investigation
So at first I thought Twitter might use a server-side session variable to store the data. Then I discovered a use of localStorage in the Twitter source code. I confess DOM storage is unfamiliar to me, and the Twitter JavaScript code is unreadable, so I'm not sure whether they use localStorage to store the profile.
Question
Any hypotheses, info, or links about Twitter's DOM storage / session storage?
This is an interesting question, so I went to twitter, and did some investigation myself.
Clicking on my profile name, the link is handled with AJAX. I see my timeline getting downloaded, but the page itself was already loaded in advance, so my information was also already downloaded.
By clicking the link on the left, or with GP, you just display the page that is already loaded (hidden, or held in a JavaScript object in memory). It simply shows your already-downloaded profile and downloads the feed (JSON) by AJAX.
The URL will change from https://twitter.com/#!/ to https://twitter.com/#!/wraldpyk (in my case).
When you click your profile in the menu (top right) you go to https://twitter.com/wraldpyk. This re-opens the page and downloads everything. Note that you get redirected to https://twitter.com/#!/wraldpyk, and meanwhile your timeline also gets downloaded (open Firebug and watch the images and feeds being downloaded).
As far as I can tell, no local storage (other than in-memory JavaScript objects, which everyone uses) is involved. All data is downloaded afresh on each new page load.
The same thing happens when you type gh while on your profile (and likewise with all the other shortcuts).
