I'm looking into creating a web wrapper for an existing web app. Obviously I want to make it as quick as possible.
Is it possible to host the JS files locally, instead of having to download them from the server, without altering the existing web app?
Using a WebViewClient you can prevent the JavaScript from being loaded from the web server (edit: only in API level 11 and higher, unfortunately). Alternatively, you can disable JavaScript, load the page, then enable JavaScript again. After the page is loaded you can modify the DOM using javascript: URLs to load the scripts from a local URL (something like file:///android_asset, off the top of my head).
You can also change the cache strategy of the WebView so that it never re-fetches anything it has already fetched once before, which might also be what you want in this case. This is configured via http://developer.android.com/reference/android/webkit/WebSettings.html; here you could set the cache mode to LOAD_CACHE_ELSE_NETWORK.
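As a rough illustration of that last approach (assuming you bundle a copy of the script in the app's assets folder; the file name here is just an example), the snippet you run in the page - for instance via a javascript: URL passed to loadUrl() once the page has finished loading - could look something like this:

    // Illustrative only: add a <script> tag pointing at a copy bundled with the app.
    // Assumes the library was copied into the assets folder as jquery.min.js.
    var s = document.createElement('script');
    s.src = 'file:///android_asset/jquery.min.js';
    document.getElementsByTagName('head')[0].appendChild(s);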
I have a legacy web application that we are not allowed to modify yet. We need to add a new function to the application in the short term. We have been told that we may modify the webpage with any local scripts we want but we have to wait 4 months before they will unlock the application.
So my goal is to create a webpage locally, click on that local HTML file and have it open the URL for the legacy application, and then inject the new JavaScript function into the application.
On "your" page, use an iFrame to "import" the page you cannot edit, on your page add whatever modifications you need/want.
If there is no server-side scripting on the page, then copy the page source into your page and add whatever you want to it. It is difficult to give you a focused answer without access to, or more information about, the actual legacy page.
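A minimal sketch of that iframe approach (the URL and labels below are examples, not taken from your application):

    // The wrapper page embeds the legacy app and adds its own controls around it.
    var frame = document.createElement('iframe');
    frame.src = 'https://legacy.example.com/app';   // hypothetical legacy app URL
    frame.style.cssText = 'width:100%; height:90vh; border:0';

    var button = document.createElement('button');
    button.textContent = 'New feature';
    button.addEventListener('click', function () {
        // Your new JavaScript function lives on the wrapper page, not in the legacy app.
        alert('New functionality running alongside the legacy app');
    });

    document.body.appendChild(button);
    document.body.appendChild(frame);

Note that reading or changing the framed document (frame.contentDocument) only works when the wrapper page is served from the same origin as the legacy application.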
It can't be done directly, since browsers prevent cross-site scripting: injecting JS from your local machine will fail with same-origin errors. The only workaround I know of is to use the developer tools and open the console; there you can type your JavaScript and run it directly.
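For example (the selector and function name below are purely illustrative), once the legacy page is open you could paste something like this into the console and run it:

    // Define and immediately run your new function directly in the console.
    function highlightOverdueRows() {
        document.querySelectorAll('tr.overdue').forEach(function (row) {
            row.style.background = 'yellow';
        });
    }
    highlightOverdueRows();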
I'm working on a mobile app with Cordova. When the user starts up the app, I'd like to kick off a non-blocking function to load data from the server. This function can take up to a few seconds.
I'm using the leecrossley/cordova-plugin-background-task plugin. It works fine if I stay on the page that kicked off the function. If I change pages, it stops the function.
Any thoughts?
Jon
Sounds as if you are experiencing web-view throttling!
Cordova uses the Chromium web engine, so it usually follows all of the performance settings implemented in the Chrome browser.
You can read some more about the throttling issue here:
https://thenextweb.com/apps/2017/01/26/chrome-throttle-background-tabs-google/#.tnw_WIKDX2EX
The solution to your problem is to create a main page that functions as the core of your application. This main page will always stay open, thus saving you from scripts being stopped when a new page is loaded.
For app pages, I recommend either:
creating a separate .html file per app page, then loading those external pages into your main app page via an iframe; the src of the iframe can be updated via JavaScript. (The downside to this approach is that you will need to write additional JavaScript, in the main page, to monitor and control the events that happen inside your iframes.)
or
building a very big single-page application, wrapping all of your app pages in divs, then creating a JavaScript menu function that manages which page is displayed and which pages are hidden (a rough sketch follows this answer). This may result in a massive .html page, but this method will allow you to run any number of non-blocking scripts the device can handle at once. <-- This is the method I have been using for over three years; I also add some iframes to include special page modules when needed. CSS and JavaScript can be loaded from external files.
I believe most Cordova developers actually use this single-page method!
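A minimal sketch of how that single-page toggling could look (the class and id names are examples only):

    // Every "page" is a div with class "app-page". Because the document itself is never
    // reloaded, background scripts keep running while the user switches pages.
    function showPage(pageId) {
        var pages = document.querySelectorAll('.app-page');
        for (var i = 0; i < pages.length; i++) {
            pages[i].style.display = 'none';
        }
        document.getElementById(pageId).style.display = 'block';
    }

    // e.g. a menu button could call: showPage('settingsPage');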
I've got this setup:
A single-page app that generates HTML content using JavaScript. There is no visible HTML for non-JS users.
History.js (pushState) for handling URLs without hashbangs. So, the app on "domain.com" can load the dynamic content of "page-id" and update the URL to "domain.com/page-id". Also, direct URLs work nicely via JavaScript this way.
The problem is that Google cannot execute JavaScript this way. So essentially, as far as Google knows, there is no content whatsoever.
I was thinking of serving cached content to search bots only. So, when a search bot hits "domain.com/page-id", it loads cached content, but if a user loads the same page, they see the normal (JavaScript-injected) content.
A proposed solution for this is using hashbangs, so Google can automatically convert those URLs to alternative URLs with an "escaped_fragment" string. On the server side, I could then redirect those alternative URLs to cached content. As I won't use hashbangs, this doesn't work.
Theoretically I have everything in place. I can generate a sitemap.xml and I can generate cached HTML content, but one piece of the puzzle is missing.
My question, I guess, is this: how can I filter out search bot access, so I can serve those bots the cached pages, while serving my users the normal JS enabled app?
One idea was parsing the "HTTP_USER_AGENT" string in .htaccess for any bots, but is this even possible and not considered cloaking? Are there other, smarter ways?
updates the URL to "domain.com/page-id". Also, direct URLs work nicely via JavaScript this way.
That's your problem. The direct URLs aren't supposed to work via JavaScript. The server is supposed to generate the content.
Once whatever page the client has requested is loaded, JavaScript can take over. If JavaScript isn't available (e.g. because it is a search engine bot) then you should have regular links / forms that will continue to work (if JS is available, then you would bind to click/submit events and override the default behaviour).
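A rough sketch of that pattern (the 'a.internal' selector and '#content' container are hypothetical names, not something from your app):

    // Progressive enhancement: plain links still work if this script never runs.
    document.addEventListener('click', function (e) {
        var link = e.target.closest('a.internal');
        if (!link || !window.history.pushState) {
            return; // let the browser do a normal full-page navigation
        }
        e.preventDefault();
        // Assumes the server returns the content for the same URL it serves to non-JS clients.
        fetch(link.href)
            .then(function (res) { return res.text(); })
            .then(function (html) {
                document.getElementById('content').innerHTML = html;
                history.pushState(null, '', link.href);
            });
    });

You would also listen for the popstate event so the back/forward buttons restore the right content.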
A proposed solution for this is using hashbangs
Hashbangs are an awful solution. pushState is the fix for hashbangs, and you are using it already - you just need to use it properly.
how can I filter out search bot access
You don't need to. Use progressive enhancement / unobtrusive JavaScript instead.
I am monitoring browser events such as when a new tab is created. My extension needs to display these browser events in the new tab page.
To make versioning easier I would like the extension to be as dumb as possible. That is, all it needs to do is tell me that a tab has been created, and I need to be able to tell the extension to switch to a tab. That way I do not have to worry about what extension versions people have installed.
The new tab page so far is a redirect to my single-page app hosted on my server.
My options seem to be:
Using custom events to send messages between the content script and embedding page: http://code.google.com/chrome/extensions/content_scripts.html#host-page-communication
This seems like a security risk, as the page's JavaScript will also have access to the DOM and hence to the messages I am exchanging. (A rough sketch of this message flow follows the list.)
Loading the HTML from the server into an iframe, pulling the application JS from the server and injecting it into the iframe as a content script. This allows the app's JS to have full access to the Chrome extension API, which is what I need.
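For illustration, the page-to-content-script channel in option 1 is commonly built on window.postMessage (the message shape and names below are made up), and, as noted above, any script running in the page can send or read these messages:

    // In the embedding page (served from your server):
    window.postMessage({ type: 'FROM_PAGE', action: 'switchTab', tabId: 42 }, '*');

    // In the content script injected by the extension:
    window.addEventListener('message', function (event) {
        // Only accept messages posted by this page itself.
        if (event.source !== window || !event.data || event.data.type !== 'FROM_PAGE') {
            return;
        }
        // Forward to the extension's background page, which can use the tabs API.
        chrome.runtime.sendMessage({ action: event.data.action, tabId: event.data.tabId });
    });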
Another consideration is that my project is currently using RequireJS. For option 2, it seems I won't be able to use this.
Can anyone recommend the preferred option keeping in mind the security risks of option 1?
Will I be able to use RequireJS with option 2?
Is there another way to achieve this?
I have a web app written in ASP.NET MVC 3.0. There are some largish scripts (jQuery, jQuery UI) which I want to ensure are cached for best performance. In the Chrome Developer Tools Network tab the scripts always take around 1.5 seconds to be received when a page is loaded. I would assume if they are cached this would be near instant.
Is there any way to ensure the JavaScript is being cached, and how can I tell whether it is or isn't?
For jQuery in particular it is better to use someone else's CDN - you will not have to serve this content from your server, AND caching is properly done by someone else. See http://docs.jquery.com/Downloading_jQuery for recommended CDNs.
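If you do use a CDN, a common pattern is to fall back to a copy hosted on your own server when the CDN is unreachable (the local path below is a hypothetical example):

    // Place this right after the CDN <script> tag for jQuery.
    if (!window.jQuery) {
        document.write('<script src="/Scripts/jquery.min.js"><\/script>');
    }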
For files that you have to host yourself, make sure you set the correct caching headers.
For static content you need to rely on the server (likely IIS in the ASP.NET case) to set the correct headers - see http://support.microsoft.com/kb/247404 for some details, and search for "iis cache control" for more links.
For dynamic content, choose the appropriate values for the OutputCache attribute, or set the headers yourself (e.g. see http://www.asp.net/mvc/tutorials/improving-performance-with-output-caching-cs ).