Browser crashes while continuously drawing a path on a Google Map? - javascript

I am drawing a real-time path on a Google Map; the data arrives continuously. After some time the browser crashes. I tried removing
data from the array, but the problem is that the whole path gets removed and I have to redraw it again. My requirement is to reduce the data load on the browser without tampering with the
current path; of course the old path can be removed (i.e. the part up to some distance from the start point could be dropped after removing the data from the array), similar to pagination:
load only the data for the visible area and draw it on the map. Any guidance is much appreciated. Thanks.

This looks like a memory leak. Start by profiling the tab with the Chrome DevTools to see what is happening with memory and objects:
Identifying a Memory Problem with the DevTools Timeline
Also, please create something like a CodePen or a similar page where we can look at your situation and your JavaScript code.
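If the goal is just to cap how much path data the browser holds, one option is to keep the polyline's path as a bounded window and drop the oldest points as new ones arrive; mutating the path's MVCArray in place updates the drawn polyline without redrawing it. A minimal sketch, assuming a Google Maps Polyline and a hypothetical onNewPosition hook and MAX_POINTS limit (neither is from the question):

const MAX_POINTS = 5000; // hypothetical cap on how many points stay on the map

const polyline = new google.maps.Polyline({
    map: map,
    path: []
});

// Called whenever a new position arrives from the real-time feed (hypothetical hook)
function onNewPosition(lat, lng) {
    const path = polyline.getPath(); // a google.maps.MVCArray
    path.push(new google.maps.LatLng(lat, lng));

    // Trim from the start so the visible, most recent part of the path stays untouched
    while (path.getLength() > MAX_POINTS) {
        path.removeAt(0);
    }
}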

Related

Workaround for bug on Chromium WebView about high load causing partial rendering on canvas

I found this issue [1] on the newest WebView of Chrome. To reproduce it you need three things:
WebView has been updated to the version from 15.11.2021
Hardware acceleration in AndroidManifest is set to true
The WebView is given a heavy rendering load; then you start to see missing content on screen.
My technology stack uses an OpenLayers map inside the app, which is actually a small website packaged with Cordova, the component that uses the WebView.
By heavy use I mean the map has to have many layers of map data turned on for rendering, and I have to make some maneuvers moving the map so that it starts loading the next rendered content before the previous one is completed.
The first time I open the map after installation, it works fine under heavy load. The issue starts when I kill the app, log in again, and make the heavy movements while the app tries to render for the first time. After that the map no longer renders content as a whole; some tiles and map menus come in only partially.
The behavior continues even when I turn off some map layers, and when very few of them remain, the issue settles down. Interestingly, there is no single load threshold at which the issue is triggered or settles down; at least the threshold differs depending on the direction.
There is a warning message, that comes to logcat only when the issue is present:
chromium: [ERROR:tile_manager.cc(821)] WARNING: tile memory limits exceeded, some content may not draw
and it appears to come from the same lines of code from which the 15.11.2021 version update removed some "useless data" [2]. I am not 100% sure whether these two things are related, but I have found that the versions before that particular update work just fine with my app (I removed the WebView updates in Google Play).
There are currently three workarounds:
Do not use map layers that are heavy enough to trigger the issue (bad for our app's user experience, since it hides crucial information from the user).
Remove the WebView updates via the Play Store (I tested this personally; overnight the system updated back to the broken version, and it is hard to communicate this to end users).
Do not use hardware acceleration (other parts of the app become unusable; behavior that is too slow makes end users angry).
As you can see, none of these works well. I believe this same behavior is seen by other heavy WebView users too, so I opened this question with one main question:
While my bug report makes its way through Google, what can I do in the meantime to overcome the issue?
Sources:
[1] https://bugs.chromium.org/p/chromium/issues/detail?id=1274482
[2] https://chromium.googlesource.com/chromium/src/+/a0994781ea666d9b630b6478206ac429ed8444d1%5E%21/#F0

dash.js reloads the same segment many times

I'm using the dash.js reference player and I'm trying to step frame by frame through a video by simply pausing and calling seek() with the appropriate timestamp on each step.
But there is an annoying delay after each step, because the player seems to keep reloading the same segment again and again.
The screenshot shows the requests:
The first step caused segment 19 to be (re-)loaded. After some idle time, the next 3 segments were loaded, too. Then I did 5 steps in a row, each of which resulted in a request for segment 19.
Is there any way to make dash.js cache a segment while I'm stepping through its content?
This would be an interesting read for your issue, even though it seems Ahmed Basil has already answered it well: DASH Monitoring.
I would recommend using an SDN-based script with P4/OpenFlow to run your player with the segmented video; the reference player automatically records all the segments that are loaded into the player and sends them to the client.
If you use a tool like Wireshark, you will be able to see all the segments used in the experimental run. To answer your question, the player usually reloads the same segment when there is an interruption, whether network-related or physical.
It could also be trying to change the segment root folder because of the player's preferences and the resolution in use; it will always reload a segment when that condition appears or when it is trying to change resolution due to an interruption. To conclude, there is a way to see which segments dash.js keeps using while it's playing a file: simply have Wireshark monitor your player and you will see all the segments used, as shown in the image here.
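For reference, the stepping approach described in the question might look roughly like this. This is only a sketch assuming the dash.js MediaPlayer API; the manifest URL and frame rate are placeholders:

const video = document.querySelector('video');
const player = dashjs.MediaPlayer().create();
player.initialize(video, 'https://example.com/manifest.mpd', false); // placeholder manifest URL

const FPS = 25; // assumed frame rate of the stream

function stepOneFrame() {
    player.pause();
    // Each step seeks one frame ahead; with the behaviour described above,
    // every such seek can trigger another request for the segment containing this time.
    player.seek(player.time() + 1 / FPS);
}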

MapBox event when all Tiles are loaded

I'm using the Mapbox GL JS API to manipulate a Mapbox map. Right before I upload my result (a canvas.toDataURL) to the server over HTTP, I need to resize my map (to a bigger resolution) and then use fitBounds to get back to the original points. After fitBounds fires, it takes the map a while to load all the new tiles. Only after that can I actually perform the upload. Right now, though, I don't know whether there's an event capable of telling me that all tiles are loaded.
I've tried all possible load functions and events in the API. There are a few issues on the GitHub project, but they're at least a year old by now and there's been no update. Halfway through 2015 they started talking about adding an Idle event, but I can't seem to find any new documentation of it anywhere.
Has anyone found a way to make the code wait for the map to load? Or has any information regarding an update on this feature?
I doubt it matters much, but I'm working in an angular.js app.
We just added a Map#areTilesLoaded check which sounds like what you're looking for. That should go out in the next release (v0.37.0). In the meantime, the following should work.
map.on('sourcedata', (e) => {
    if (map.loaded()) {
        // all tiles are loaded
        // turn off the sourcedata listener if it's no longer needed
        map.off('sourcedata');
    }
});
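Once areTilesLoaded is available, the upload flow from the question could be wired up roughly like this. This is only a sketch: originalBounds and sendToServer are placeholders, and capturing the canvas with toDataURL generally requires the map to have been created with preserveDrawingBuffer: true.

map.fitBounds(originalBounds); // back to the original points after resizing

function uploadWhenTilesReady() {
    if (map.areTilesLoaded()) {
        map.off('sourcedata', uploadWhenTilesReady);
        const dataUrl = map.getCanvas().toDataURL();
        sendToServer(dataUrl); // placeholder for the HTTP upload
    }
}
map.on('sourcedata', uploadWhenTilesReady);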

Using large JSON files with d3.js causes massive performance hits/crashes

So I currently have a massive JSON file that is about 90 MB in size and about three-quarters of a million lines. I am trying to create a graph from it using the d3.json command. d3.json successfully produces the data and I can render the graph, but there is one node in my tree that probably has in excess of 500 children. This causes Chrome to crash, while Firefox grinds to a halt but doesn't crash, giving me the opportunity to close the node and regain performance.
According to this Stack Overflow article (d3 — Progressively draw a large dataset), I could progressively draw the dataset. Can this be done for JSON with more intelligent splicing? However, wouldn't the end result be the same as in Firefox?
Is there any way that I could create a paging system for the child nodes? Is there a viable solution here other than simply not displaying that many child nodes?
Thanks in advance.
I determined that the issue came from the animation and drawing done by d3, so I ended up creating pseudo-folders within the JSON to minimize the number of nodes displayed.
So, instead of trying to expand 26154 nodes at once, I expand 104 folders, each containing roughly 250 nodes.
Hope this helps anyone encountering the same problem.
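A minimal sketch of the pseudo-folder idea, assuming each node's children is a plain array (CHUNK_SIZE and the folder naming are made up for illustration):

const CHUNK_SIZE = 250; // roughly how many children per pseudo-folder

function groupIntoFolders(node) {
    if (!node.children || node.children.length <= CHUNK_SIZE) {
        return node;
    }
    const folders = [];
    for (let i = 0; i < node.children.length; i += CHUNK_SIZE) {
        folders.push({
            name: node.name + ' [' + i + '-' + Math.min(i + CHUNK_SIZE, node.children.length) + ']',
            children: node.children.slice(i, i + CHUNK_SIZE)
        });
    }
    node.children = folders;
    return node;
}

Only the folder nodes are rendered when the parent is expanded; the real children are drawn only when one of the folders is opened.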

Screen-capture like DOM-capture with javascript?

I'm working on a web app and I would like to get some first-hand experience of how our users actually use our software. This is my idea:
*Use JavaScript to save the HTML DOM and cursor position, possibly only the changes to the DOM to reduce the amount of data.
*Save it to the server along with the browser the user used.
*Write a JavaScript routine that updates the DOM according to the recording, plus an image that replicates the mouse movements, in the corresponding browser.
Has this ever been done before?
Would this work in most cases?
As circle73 said, you can use HTML5 to do this via canvas; however, I don't think that would track the mouse position. You could write a JavaScript function to track the mouse coordinates every x seconds; you'd just have to time it with the screen captures so you can match the mouse movements up with the captured frames.
Your other option would be to do this via an ActiveX control, as answered here: Take a screenshot of a webpage with JavaScript?
I would approach this with the following high-level strategy:
Use jQuery mouseover to record the user's mouse positions on the page. Store these positions (x,y coordinates) locally. Send a structured request to your server with these coordinates.
Use a browser automation framework like Selenium to "play" the stored coordinates. You can use the same path as your user, only in development, in order to see what he saw. For example:
void mouseMove(WebElement toElement, long xOffset, long yOffset)
This moves (from the current location) to new coordinates. There's more info here.
Take screenshots of the page with Selenium WebDriver. More info here.
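A minimal sketch of the recording half of this strategy, assuming jQuery is available; the /track endpoint, the 5-second flush interval, and the use of mousemove instead of mouseover are illustrative choices, not part of the answer:

const positions = [];

$(document).on('mousemove', function (e) {
    positions.push({ x: e.pageX, y: e.pageY, t: Date.now() });
});

// Periodically send the buffered coordinates plus the browser in use to the server
setInterval(function () {
    if (positions.length === 0) return;
    fetch('/track', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
            userAgent: navigator.userAgent,
            positions: positions.splice(0, positions.length)
        })
    });
}, 5000);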
