So... I'm creating a small game, which already works on my computer.
Now I want to port it to mobile devices. Everything runs fine, except when I use the right/left buttons, which I implemented as divs with ontouchstart and ontouchend.
The problem is that the Android browser pauses my JavaScript/canvas rendering while I'm holding down anything on the screen. When I release the buttons, my player object jumps to the new location.
tl;dr: How can I disable the Android browser behaviour that pauses my rendering while anything is held down on the screen?
Thanks in advance!
(Samsung G Note 2, Android 4.3)
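A minimal sketch of the kind of fix that usually helps here, assuming the buttons are plain divs (the ids and the global player object below are placeholders, not your actual code): calling preventDefault() in the touch handlers keeps the old Android browser from treating the hold as the start of a scroll or zoom gesture, which is what tends to stall timers and rendering.

```
// Sketch only: element ids and the `player` object are assumptions.
var leftButton = document.getElementById('btn-left');
var rightButton = document.getElementById('btn-right');

function press(direction) {
  return function (e) {
    e.preventDefault();              // keep the browser from starting a scroll/zoom
    player.moveDirection = direction;
  };
}

function release(e) {
  e.preventDefault();
  player.moveDirection = 0;
}

leftButton.addEventListener('touchstart', press(-1), false);
leftButton.addEventListener('touchend', release, false);
rightButton.addEventListener('touchstart', press(1), false);
rightButton.addEventListener('touchend', release, false);
```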
I have tested the map on an iPad and thought it worked so far; now I discover that there is a problem on PCs with multi-touch screens.
The problem seems to be that Leaflet decides internally which events are used; I couldn't define touchstart as the event type.
When I look at this demo:
https://leafletjs.com/examples/quick-start/
the following difference shows up between an iPad and a Windows 10 multi-touch screen: if I keep a finger on the screen, whether on the map or outside it, I can still touch the map on the iPad, but not on the desktop PC; there the screen is blocked.
This obviously must not happen, because it is completely normal for a user of a public information system to, for example, rest one hand on the screen and touch with the other.
How can I fix this issue?
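One way to narrow this down is to log which event model the browser actually delivers to the map container: on Windows touch screens, Chrome and Edge expose Pointer Events, which Leaflet prefers over plain touch events when they are available. A rough debugging sketch, assuming the usual #map container div:

```
var container = document.getElementById('map'); // the usual Leaflet container id

['pointerdown', 'pointerup', 'touchstart', 'touchend'].forEach(function (type) {
  container.addEventListener(type, function (e) {
    // log in the capture phase so Leaflet's own handlers can't swallow the event first
    console.log(type, e.pointerId !== undefined ? 'pointerId ' + e.pointerId : '');
  }, true);
});
```

If only pointer events show up while the second finger is down, the blocking happens before Leaflet's touch handling is ever involved.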
I am building a website from scratch and I am using the following code from W3Schools to implement an image comparison slider: https://www.w3schools.com/howto/howto_js_image_comparison.asp
I implemented it on my website successfully, and it works just fine on desktop browsers.
But unfortunately, it does not work too well on some mobile devices.
If you try it on a Samsung Tab 3 or any other Samsung device (other brands as well), once you "touch" the slider it does not move; only if you "touch" again in a different position does the slider jump to that position.
It behaves more like clicking here and there than like a slider.
I tried debugging with JavaScript (not an expert here) but I didn't manage to find anything conclusive.
I saw that both "mouse" and "touch" events are managed...
You can test it directly by opening the above link and emulating a Galaxy Tab 3 in the browser's developer tools.
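In case it helps: the usual culprit on touch devices is that the page starts scrolling as soon as the finger moves, so the slider only ever sees the initial touchstart. A sketch of the kind of change that often fixes it, using the handler names from the W3Schools example (slideReady/slideMove) as assumptions:

```
// Inside compareImages(img) from the W3Schools example (names are assumptions):
slider.addEventListener("touchstart", function (e) {
  e.preventDefault();                 // stop the browser from starting a scroll
  slideReady(e);
}, { passive: false });

// Register the move handler as non-passive so preventDefault() can work during the drag.
window.addEventListener("touchmove", slideMove, { passive: false });
```

Adding touch-action: none to the slider element in CSS has a similar effect on browsers that support it.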
We are working on a webpage that is currently having issues with split screen resizing.
When we change the orientation of the page on a mobile browser or resize the window on a desktop browser, we can run the required updates fine, because we listen for the resize and orientationchange events that fire in either case.
However, we just realized that with split screen on Android (and possibly split screen on iOS tablets), no resize events fire for the window, so we can't properly update our elements when split screen is activated or the user moves the split.
Is this something we can work around or am I just completely missing some functionality that I should be using?
I have looked around online and haven't found anything related so far that I could use from our JS.
Thanks! Any pointers would be greatly appreciated.
Information about the multi-window feature in Android - it doesn't cover any way to detect it from a web view, though; it looks at it from an Android app developer's perspective.
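If no event fires at all, one fallback is simply to poll the viewport size and treat any change as a resize. A hedged sketch, where handleViewportChange stands in for whatever update routine you already have:

```
function handleViewportChange(width, height) {
  // placeholder: call your existing layout update here
  console.log('viewport is now ' + width + 'x' + height);
}

(function watchViewport(onChange, intervalMs) {
  var lastWidth = window.innerWidth;
  var lastHeight = window.innerHeight;
  setInterval(function () {
    var w = window.innerWidth;
    var h = window.innerHeight;
    if (w !== lastWidth || h !== lastHeight) {
      lastWidth = w;
      lastHeight = h;
      onChange(w, h);
    }
  }, intervalMs);
})(handleViewportChange, 250);
```

Polling is crude, but it catches the divider being dragged even when the platform never reports a resize.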
I am building a web application for use by visually impaired users to explore SVG diagrams via a tactile printout. In order to allow this I need to calibrate the tactile printout to the image on the device. Thus I need to be able to receive the x and y coordinates of a click when using a screen reader, specifically VoiceOver for iOS.
Using TalkBack, I would tap and hold until I hear an audible click and then use a second finger to double tap, which sends clicks to the application itself at that position.
When using VoiceOver, I understand how to send swipes by tapping and holding until a triple bell and then swiping. However, I cannot for the life of me find out how to send a positioned click.
On a side note, I am using Hammer.js.
Any ideas?
Thanks
As far as I can tell, this functionality was supplanted by the interactive drag-and-drop support added to VoiceOver in iOS 11.
I have this Tetris game I programmed with the intention of learning a bit more JavaScript: elcodedocle.github.io/tetrominos
I can play it in most tablet/smartphone browsers, but on my Android 2.3.6 stock browser (Samsung Galaxy Ace ST5830) it has two problems:
Zoom gestures are not fully canceled by the user-scalable=no viewport property: double-tap and two-finger zooming still work. Sometimes.
The canvas freezes, also sometimes (I'm going mad trying to determine the cause: how on earth do you debug a web app running in an Android browser?). I'm guessing a swipe or drag event is being triggered that shouldn't be, so it's somehow related to the above. Tapping outside the canvas makes it work again.
I'm using KineticJS to manipulate the canvas and bind the touch events, on top of jQuery UI for the dialogs and jQuery (not jQuery Mobile).
Any suggestions/ideas?
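A minimal sketch of a blunt workaround, assuming the freeze really does come from the stock browser's default scroll/zoom handling grabbing the touch (the container id below is an assumption, not from the game's code): cancel the default gesture handling yourself. Note this disables page scrolling, which is only acceptable if the game fills the screen.

```
// Blanket gesture suppression for the old stock browser (a blunt workaround).
document.addEventListener('touchmove', function (e) {
  e.preventDefault();   // no scrolling or pinch-zoom while touching the page
}, false);

var stage = document.getElementById('container'); // the KineticJS stage container id (assumption)
stage.addEventListener('touchend', function (e) {
  e.preventDefault();   // blunts double-tap zoom over the canvas; KineticJS still gets the event
}, false);
```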
The problem is the device's processing speed: every device has its own. Canvas animations are based on JavaScript's setInterval and setTimeout methods, which run only as fast as the device allows; that's why canvas games are sometimes laggy on handheld devices.
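One common mitigation (a generic sketch, not specific to this game) is to drive the loop with requestAnimationFrame where available and scale movement by the elapsed time, so slow devices drop frames instead of slowing the game down. The update and draw functions below are placeholders for the game's own logic.

```
var lastTime = null;

function frame(now) {
  if (lastTime !== null) {
    var dt = (now - lastTime) / 1000;  // seconds since the previous frame
    update(dt);                        // advance the game state by dt, not by one fixed tick
    draw();
  }
  lastTime = now;
  requestAnimationFrame(frame);
}

// Older stock browsers may need a setTimeout-based fallback for requestAnimationFrame.
requestAnimationFrame(frame);
```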