I am developing a WebView-based Android application. The application is very responsive and fast on all major Android versions (2.3 onwards), but it is very slow on the Amazon Kindle Fire. One reason for this sluggish behavior is that it reads a large number of XML files (40-50) to load the content of a single HTML page, and there are also hundreds of images per page.
One solution I can think of is to read and parse the XML files using native Android APIs (in a thread) and then incorporate the parsed XML into the HTML page. Could anybody please tell me how to use a Java object in JavaScript? Any help would be highly appreciated.
For your XML parsing slowness
Add a JS -> Java interface so the parsing happens in Java, and move the XML data-fetch operations into Java as well.
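On Android the standard mechanism for this is WebView.addJavascriptInterface. A minimal sketch of the JavaScript side; XmlBridge, parseXml and buildPage are illustrative names, and the Java side is assumed to register the object and return a JSON string:

    // Assumes the Activity has registered a bridge object, e.g.:
    //   webView.addJavascriptInterface(new XmlBridge(), "XmlBridge");
    // From the page's point of view the call is synchronous, and since the
    // bridge only passes strings/primitives, JSON keeps the marshalling simple.
    var json = window.XmlBridge.parseXml('content/page1.xml');
    var data = JSON.parse(json);
    buildPage(data); // placeholder for your existing page-building code

The heavy parsing can still happen on a Java worker thread up front, with the bridge method simply handing back the cached result.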
For the slowness from displaying too many images
Display only a selective set of images as the user scrolls. Rendering all the images at once puts a heavy load on the browser. So use Lazy Load (a jQuery plugin) - http://www.appelsiini.net/projects/lazyload - to load images only when the user scrolls them into view.
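A minimal usage sketch; the selector, placeholder image and threshold are illustrative. The plugin expects the real URL in a data-original attribute and a small placeholder in src:

    // Markup: <img class="lazy" src="placeholder.gif" data-original="photo-001.jpg">
    $('img.lazy').lazyload({
      threshold: 200,   // start loading 200px before the image enters the viewport
      effect: 'fadeIn'  // fade images in instead of popping in
    });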
I've recently been scraping some JS-driven pages. As far as I know there are two ways of loading content: statically (ready-to-use HTML pages) and dynamically (building the HTML in place from raw data). I know about XHR, and I've successfully intercepted some requests.
But now I've run into something strange - the site loads its content dynamically after the page has fully loaded, yet there are no XHRs. How can that be?
My guess is that the inner JS files are making some hidden requests (which transfer the data) and building the page from the responses.
What should I do?
P.S. I'm not interested in Selenium-based solutions - they are well-known, but slow and inefficient.
P.P.S. I'm a back-end developer mostly, so I'm not familiar with JS.
Nowadays you do not need Selenium for scraping any more. The Chrome browser can now be used in headless mode, and you can then run your scraping script after the page has fully loaded.
There is a simple guide:
https://developers.google.com/web/updates/2017/04/headless-chrome
There is a Node.js library for driving it (chrome-remote-interface), but the downside is that I could not find a Python one.
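A minimal sketch with chrome-remote-interface, assuming Chrome was started separately with --headless --remote-debugging-port=9222; the URL is a placeholder:

    const CDP = require('chrome-remote-interface');

    CDP(async (client) => {
      const { Page, Runtime } = client;
      await Page.enable();
      await Page.navigate({ url: 'https://example.com' });
      await Page.loadEventFired();              // wait until the page has fully loaded
      const { result } = await Runtime.evaluate({
        expression: 'document.documentElement.outerHTML'
      });
      console.log(result.value);                // the HTML after the JS has run
      await client.close();
    }).on('error', (err) => console.error(err));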
This is the problem we have. We have a SPA, and some of its pages generate reports. Those pages allow users to download graphics generated by the SPA. The download can be a Word document (docxtemplater), a PDF (pdfmake) or plain image files (jszip and download.js). All of this is done by the SPA (in the browser). There are two problems: as users add more data, the download takes longer to render, and we have seen some incompatibilities across browser/OS versions. The main issue comes from generating images from the DOM. We are using html2canvas (slow) and dom-to-image (fast, but currently only working in Chrome for us - we had some issues when trying to generate images from some Google Charts).
What we are looking for now is to move all of this to the server so that we have a controlled environment and can support IE, Safari, Firefox and Chrome. The idea is to generate all these files on the server.
I've read we could accomplish this by using phantomjs, phantomcloud or nightmare. Our backend is REST using Java. So, before I dig into this, I'd like to know if anyone has experienced a similar issue or can point me in the right direction.
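To illustrate the PhantomJS route mentioned above: a minimal render script that a Java REST backend could invoke as a subprocess (e.g. via ProcessBuilder). The viewport size and the settle timeout are guesses you would tune for your charts:

    // render.js - usage: phantomjs render.js <url> <output.png>
    var system = require('system');
    var page = require('webpage').create();
    var url = system.args[1];
    var out = system.args[2];

    page.viewportSize = { width: 1200, height: 800 };
    page.open(url, function (status) {
      if (status !== 'success') { phantom.exit(1); }
      // give the SPA a moment to draw its charts; tune for your app
      window.setTimeout(function () {
        page.render(out);   // writes a PNG of the fully rendered page
        phantom.exit(0);
      }, 2000);
    });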
I am working on an application which loads XML files and displays the data in SVG line charts. This application needs to be fast.
The problem is that the implementation has already started and is completely based on JavaScript. I was told that this app needs to load and parse multiple very big XML files (up to 80 MB) and that it needs to be very stable. Since JS is not able to parse such big XML files in one piece, they are loaded line by line and parsed as strings. Once parsed, the data is displayed in SVG charts, which is another problem: there can be a lot of charts with a lot of data, so sometimes the browser cannot handle it.
My idea was to do it on the server side, using SAX parsers and lazy loading for the charts, but I cannot, because the people who are going to use it want to use it offline.
So now I would like to know whether such a thing can be done with JS only, or whether I have to use a server or a desktop app.
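For what it's worth, the line-by-line approach can be kept memory-bounded entirely on the client by reading the file in chunks. A sketch, assuming the XML is available as a File/Blob (e.g. from an <input type="file"> in the offline app); handleLine and the chunk size are placeholders:

    // Reads the file 1 MB at a time so the whole 80 MB never sits in memory.
    // Note: slicing can split multi-byte characters at chunk boundaries;
    // this is fine for ASCII-only data, otherwise split on a safe boundary.
    function parseInChunks(file, handleLine, done) {
      var CHUNK = 1024 * 1024;
      var offset = 0;
      var carry = '';                 // incomplete last line of the previous chunk
      var reader = new FileReader();
      reader.onload = function () {
        var lines = (carry + reader.result).split('\n');
        carry = lines.pop();          // may be a partial line; finish it next round
        for (var i = 0; i < lines.length; i++) { handleLine(lines[i]); }
        offset += CHUNK;
        if (offset < file.size) {
          reader.readAsText(file.slice(offset, offset + CHUNK));
        } else {
          if (carry) { handleLine(carry); }
          done();
        }
      };
      reader.readAsText(file.slice(offset, offset + CHUNK));
    }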
I am trying to improve the performance of my web application.
It is a Java-based web application deployed on an Amazon cloud server with JBoss and Apache.
There is one page in the application that takes 13-14 seconds to open. It carries so much functionality that 100+ HTTP requests are executed at page load time. The JS & CSS files take too much time to load.
So I moved all of the JavaScript code from my JSP page into new JS files and minified the JS & CSS files. Still, there is not much difference in the page load time.
There is Dojo data on this page as well, which takes time to load.
Is there any other approach I should try to tune this page?
Can something be done at the JBoss or Apache level?
Apply caching for images and other static assets.
Use a CDN for any external libraries you use (like jQuery).
Use a loader like RequireJS to optimize your CSS and JS files. You can concatenate (merge) multiple JS files into one, which reduces the number of HTTP requests the browser makes when it encounters a JS or CSS dependency. (As @Ken Franqueiro mentions in the comment section, Dojo already has a mechanism for this.)
Optimize your images and use appropriate dimensions. Do not ship a full-size image if you just intend to display it in a 10x10 container. Use sprites if possible.
Show a message/loader to give the user some sense of progress. This will minimize user restlessness.
If some data takes too long to load, show the page first and load the data afterwards (see the sketch after this list). This too gives the user a sense of progress.
If the response is very big, compress your response data. Be careful, though, that the browsers your application supports can handle the compressed data by default, or add a custom mechanism to decompress it.
Use profiling tools like the Chrome Developer Tools or Firebug for Mozilla.
Take a snapshot of your network traffic and check where the bottleneck is.
In Chrome, press F12 and select the Network tab.
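A minimal sketch of the show-the-page-first idea above; the URL, the element ids and renderReport are placeholders:

    // Render the page shell immediately, then pull the heavy data via AJAX,
    // so the user sees progress instead of a long blank load.
    $(function () {
      $('#report').append('<p class="loader">Loading report...</p>');
      $.getJSON('/api/report-data', function (data) {
        renderReport(data);   // placeholder for the existing rendering code
        $('.loader').remove();
      });
    });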
I'm developing an app for mobile devices using jQuery Mobile and PhoneGap (to access the internal functions of the device). I have a big XML file with anywhere from 10 to more than a thousand nodes. Basically I need to display each node on its own page. In jQuery Mobile it's common practice to build one HTML file that contains all pages.
I'm playing with the idea of building something like this:
<div data-role="page" id="1">JS template filled with content of node1 from xml and linking to page 2.</div>
<div data-role="page" id="2">JS template filled with content of node2 from xml and linking to page 3.</div>
If I create such a huge page by parsing the XML file with JavaScript, I'm concerned that the app will be tremendously slow. In my experience browsers do not handle very long pages well. The XML file can contain text as well as links to images, video and audio files.
I tried to render the XML file via XSLT on the client side, but ran into the problem that most Android browsers do not support XSL (at least mine does not).
I need to find a client-side solution, because the app should also work without an internet connection.
Can you point me in the right direction? Maybe I am wrong and it's no big deal if the main HTML file contains a couple of hundred pages? I also had the idea of fetching the pages one by one from the XML, since I never display more than one page at a time. But I'm not sure how to keep track of the nodes I have already displayed to the user. The XML file contains no auto-incrementing identifiers, so it would be difficult to access specific nodes in order via XPath - or am I missing something?
Thank you!
An XML file with a thousand nodes is pretty small by most people's standards (and one with 10 nodes is minuscule), so it's hard to see what your real problem is.
Although some mobile browsers do not have native XSLT support, you could try Saxon-CE, which offers an XSLT 2.0 implementation that runs in any JavaScript-enabled browser.
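As for keeping track of nodes without identifiers: document order itself can serve as the key. A sketch that parses the XML once, keeps the nodes in an array indexed by position, and builds each jQuery Mobile page only when it is first visited (the element names item and text stand in for your real schema, and xmlText is assumed to hold the file's contents):

    // Parse once; the array index plays the role of the missing id.
    var nodes = $($.parseXML(xmlText)).find('item').toArray();

    function showPage(i) {
      var id = 'page-' + i;
      if (!document.getElementById(id)) {   // build each page only once
        var page = $('<div data-role="page" id="' + id + '">' +
                     '<div data-role="content"></div>' +
                     '<a href="#" class="next">Next</a></div>');
        page.find('[data-role=content]').text($(nodes[i]).find('text').text());
        page.find('.next').on('click', function (e) {
          e.preventDefault();
          if (i + 1 < nodes.length) { showPage(i + 1); }
        });
        $('body').append(page);
      }
      $.mobile.changePage('#' + id);
    }

    showPage(0); // start at the first node

Only the pages the user has actually visited ever exist in the DOM, so the document never grows to hundreds of pages; already-visited pages could also be removed on pagehide if memory becomes a concern.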