Why might dragging SVG elements via TouchEvent on iPhone iOS be laggy? - javascript

I have created a web interface where the user can drag and drop SVG elements on screen. I am struggling with the performance of moving the SVGs via touch events on iPhone iOS using the webkit engine.
Everything is fine on desktop browsers and on Android phones that I could get hold of, but iOS on iPhones shows very bad performance (seems fine on iOS on one iPad that I could get hold of, but it sometimes leaves some traces of the SVG after moving).
There seems to be a delay before the touchstart event kicks in after touching the device and a delay before the touchend event is triggered after releasing the touch: An audio sample (already loaded) that is supposed to play after picking up or dropping the element plays with a delay of ~1.5 seconds. The touchmove event seems to be handled smoothly though - no delay with moving the SVG (after touchstart has ended).
I have already checked iOS Delay between touchstart and touchmove? - but the site that's linked to doesn't help me. I fail to get the scroll event on any element (window, document, svgElement) - and even if I did, I wouldn't know how this could help me.
I assumed that the issue might be related to the size of the base64-encoded background image that the SVGs are using, but reducing that size even dramatically didn't help.
I read about some 300-350ms delay that iOS might have if there's no "fast tap" mode set, but a) the delay between touching/releasing the screen and playing the audio is longer than 350ms (rather 1.5 seconds) and b) playing with the touch-action CSS property did not help. (Eliminate 300ms delay on click events in mobile Safari)
I am really not sure if I am doing anything wrong (very well possible!) or if the webkit engine on (iPhone) iOS is simply so bad (compared to e.g. Blink on Android, which runs flawlessly) that it cannot handle rendering/moving SVGs? Testing this is particularly iffy, because Browserstack doesn't issue TouchEvents properly and I never succeeded in hooking up the single physical iOS device that I have (a 2015 iPod Touch) to my Linux machine for remote debugging (while it's very simple for Android on Chromium). I'd really be grateful for hints!
Each SVG roughly follows this pattern (some attributes like viewBox, stroke-width etc. omitted):
<svg>
  <defs>
    <pattern id="SOME_ID">
      <image href="data:SOME_BASE64_ENCODED_IMAGE" />
    </pattern>
  </defs>
  <path fill="url(#SOME_ID)" d="SOME_SIMPLE_PATH"></path>
  <path d="SOME_OTHER_SIMPLE_PATH"></path>
</svg>
The SVGs can be moved by MouseEvent or TouchEvent using the following logic:
// this.svgElement is the DOM element within the class
this.svgElement.addEventListener('touchstart', this.handleMoveStarted, false);
this.svgElement.addEventListener('mousedown', this.handleMoveStarted, false);

// Keep track of the start position and add move/end listeners
handleMoveStarted(event) {
  event.preventDefault();
  event.stopPropagation();
  if (event.type === 'touchstart') {
    this.moveInitialX = event.touches[0].clientX;
    this.moveInitialY = event.touches[0].clientY;
    this.svgElement.addEventListener('touchmove', this.handleMoved, false);
    this.svgElement.addEventListener('touchend', this.handleMoveEnded, false);
  }
  else {
    // Same principle with event.clientX/Y for MouseEvent
  }
  // Callback to play audio here
}

// Compute the delta position and update the element's position
handleMoved(event) {
  event.preventDefault();
  event.stopPropagation();
  let deltaX = 0;
  let deltaY = 0;
  if (event.type === 'touchmove') {
    deltaX = this.moveInitialX - event.touches[0].clientX;
    deltaY = this.moveInitialY - event.touches[0].clientY;
    this.moveInitialX = event.touches[0].clientX;
    this.moveInitialY = event.touches[0].clientY;
  }
  else {
    // Same principle with event.clientX/Y for MouseEvent
  }
  this.svgElement.style.left = `${parseFloat(this.svgElement.style.left) - deltaX}px`;
  this.svgElement.style.top = `${parseFloat(this.svgElement.style.top) - deltaY}px`;
}

// Used to remove the listeners on touchend/mouseup
handleMoveEnded(event) {
  event.preventDefault();
  event.stopPropagation();
  this.svgElement.removeEventListener('mousemove', this.handleMoved);
  this.svgElement.removeEventListener('touchmove', this.handleMoved);
  this.svgElement.removeEventListener('mouseup', this.handleMoveEnded);
  this.svgElement.removeEventListener('touchend', this.handleMoveEnded);
  // Callback to play audio here
}
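For context, the handler methods are bound to the instance so that this refers to the class inside the callbacks; roughly like this (simplified sketch of the constructor):
// Sketch of the assumed setup: bind the handlers so that `this` refers
// to the class instance inside the event callbacks.
constructor(svgElement) {
  this.svgElement = svgElement;
  this.handleMoveStarted = this.handleMoveStarted.bind(this);
  this.handleMoved = this.handleMoved.bind(this);
  this.handleMoveEnded = this.handleMoveEnded.bind(this);
}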

I have investigated this issue some more, and it turns out that it's not the SVG dragging that's causing the huge delay on iOS, but the callbacks that follow. It seems that iOS has quite some trouble playing plain HTML5 audio in a timely (real-time) fashion, and I'll have to switch to some other solution (HTML 5 audio .play() delay on mobile).
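For reference, the direction I'm looking at is the Web Audio API, where the sample is decoded into a buffer up front and played from the touch handlers; a rough sketch (the path and names are placeholders):
// Decode the sample once up front into an AudioBuffer (placeholder path).
const audioContext = new (window.AudioContext || window.webkitAudioContext)();
let pickupBuffer = null;
fetch('/sounds/pickup.mp3')
  .then(response => response.arrayBuffer())
  .then(data => audioContext.decodeAudioData(data))
  .then(buffer => { pickupBuffer = buffer; });

// Called from handleMoveStarted / handleMoveEnded instead of <audio>.play().
function playPickupSound() {
  if (!pickupBuffer) return;
  // iOS keeps the context suspended until a user gesture; resume inside the handler.
  if (audioContext.state === 'suspended') {
    audioContext.resume();
  }
  const source = audioContext.createBufferSource();
  source.buffer = pickupBuffer;
  source.connect(audioContext.destination);
  source.start(0);
}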

Related

Android browsers long hold disable/ mouse up after 1second of mousedown

I am trying to build a one touch game on HTML5 canvas. It's a running game made with the help of this tutorial:
http://blog.sklambert.com/html5-game-tutorial-game-ui-canvas-vs-dom/
I changed the existing controls from the space bar to a mouse click. It works perfectly across all platforms except mobile browsers on Android devices.
In Android devices, the touch makes the user jump. If there is a long hold in the touch, the user keeps jumping even when the touch is released. This problem does not happen in iPhones or iPads or desktops.
Can I write a JavaScript function where a mousedown lasting more than a certain number of seconds is cut off? Something like:
if(mousedown for 1sec)
mouseup;
Let me know if you can think of another approach.
You can use touch events rather than mouse events on touch-enabled devices. Refer: https://developer.mozilla.org/en-US/docs/Web/API/Touch_events/Using_Touch_Events
function is_touch_device() {
  /* Function code taken from http://stackoverflow.com/a/4819886/3946520 */
  return 'ontouchstart' in window    // works on most browsers
      || navigator.maxTouchPoints;   // works on IE10/11 and Surface
}

if (is_touch_device()) {
  canvas.addEventListener('touchstart', handleTouchStart, false);
  canvas.addEventListener('touchend', handleTouchEnd, false);
}
else {
  // Bind mouse events
}

function handleTouchStart(e) {
  // This code runs when the user touches the canvas, i.e. on touch start
}

function handleTouchEnd(e) {
  // This code runs when the user removes the finger from the canvas, i.e. on touch end
}
Also note that there can be a scenario where the user puts two or more fingers on the canvas. Each of them will fire a 'touchstart' event, so you'll have to handle that.
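For example, a minimal sketch that only tracks the first finger (reusing the handlers above):
var activeTouchId = null;

function handleTouchStart(e) {
  // Only track the first finger; ignore any additional touches.
  if (activeTouchId !== null) return;
  activeTouchId = e.changedTouches[0].identifier;
  // ... trigger the jump here ...
}

function handleTouchEnd(e) {
  // Only react when the tracked finger is lifted.
  for (var i = 0; i < e.changedTouches.length; i++) {
    if (e.changedTouches[i].identifier === activeTouchId) {
      activeTouchId = null;
      // ... end the jump here ...
    }
  }
}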
You can refer to http://www.javascriptkit.com/javatutors/touchevents.shtml for a good tutorial on touch events.

Javascript support for touch without blocking browsers pinch support

I have a little web gallery that I added swipe navigation to for mobile browsers. I did it with pretty simple touchstart/touchmove/touchend event tracking.
The problem is that when I try to pinch-zoom the browser window, it fails if any finger starts in the element I added the touch event handlers to, presumably because of the calls to preventDefault.
Is there a way I can track the touch events for navigating my images without blocking the zoom in and out feature of the browser? I don't mind blocking single finger scrolling if it's over my element, but I want to allow the pinch zooming.
Here's the code:
function addDragHandlers(eventDivId) {
  var startX, endX;
  var slides = $('#' + eventDivId);
  slides.bind('touchstart', function(e) {
    e.preventDefault();
    startX = e.pageX;
    endX = startX;
  });
  slides.bind('touchmove', function(e) {
    e.preventDefault();
    endX = e.pageX;
  });
  slides.bind('touchend', function(e) {
    e.preventDefault();
    if (endX - startX < 0) {
      // go to next image
    } else if (endX - startX > 0) {
      // go to previous image
    } else {
      // do click action
    }
  });
}
What you want is gesturestart; the gesture start, change and end events work the same way as the touch events but fire for more than one finger. From there you can add a listener and call:
event.stopPropagation();
to keep the preventDefault() calls from interfering with the pinch.
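A rough sketch of that idea (iOS/WebKit-only gesture events; gestureActive is a made-up flag name and slides is the same jQuery element as in the question):
var gestureActive = false; // made-up flag: true while 2+ fingers are down

slides.bind('gesturestart', function(e) { gestureActive = true; });
slides.bind('gestureend', function(e) { gestureActive = false; });

slides.bind('touchmove', function(e) {
  if (gestureActive) return; // let the browser handle the pinch zoom
  e.preventDefault();        // single-finger swipe: track it ourselves
  endX = e.pageX;
});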
I wanted this to work on Android, so I didn't want to use gesture events, which I have read exist only on iOS, though I have no Android device to test that claim.
I then looked at a bunch of javascript gesture libraries which try to make gesture support easy. Unfortunately none of them worked well enough for my purposes.
So I ended up using touch handlers on the body to track all the touches on the page and then a custom heuristic to determine whether to use the touch to navigate the gallery or to use it to scroll/pinch the page.
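Roughly along these lines (a simplified sketch of the heuristic, not my exact code; the #gallery id is a placeholder and plain DOM listeners are used here):
// Track all touches on the page; only claim single-finger swipes that
// started on the gallery element, otherwise let the browser handle
// scrolling and pinch zoom.
var swipeStartX = null;

document.body.addEventListener('touchstart', function(e) {
  var onGallery = e.target.closest && e.target.closest('#gallery'); // placeholder id
  swipeStartX = (e.touches.length === 1 && onGallery) ? e.touches[0].pageX : null;
}, false);

document.body.addEventListener('touchmove', function(e) {
  if (swipeStartX === null || e.touches.length > 1) {
    swipeStartX = null; // a second finger joined: give the gesture back to the browser
    return;             // no preventDefault, so pinch zoom keeps working
  }
  e.preventDefault();   // single-finger swipe over the gallery: we handle it
  var deltaX = e.touches[0].pageX - swipeStartX;
  // ... navigate to the next/previous image based on deltaX ...
}, false);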
It's still not perfect. If you start by touching my gallery element and then touching outside it the pinch doesn't work.
As an aside, the "TouchSwipe" jquery library has an incredibly well-designed and flexible API, but with the configuration I needed, tracking only horizontal swipes on my element, it was too quirky under iOS6 and hasn't been updated for a few weeks. If you are looking into this sort of challenge and it's a few months from now I'd recommend checking the updates for that plugin.

How to detect mobile Safari global orientation rotation animation

Hi, I'm trying to trigger an event when mobile Safari rotates to a different orientation. I am aware of the orientationchange event, but it is not sufficient because it fires after the rotation animation has played and the new orientation is set. I have an element that I need to hide before or during the animation.
I'm trying to capture the state before the orientation has changed, particularly before the animation plays. I've tried applying events like webkitAnimationStart and animationstart to the window, document and document.body, and none of them seem to be triggered. Hoping I'm overlooking something.
This is a problem occurring in almost every mobile browser as far as I have seen, and there is no straightforward solution for it.
A semi-official suggestion from the Chrome team, posted on their blog under Unlock screen on device orientation change, is to use deviceorientation and simulate what the browser does internally to figure out the orientation of the device:
var previousDeviceOrientation, currentDeviceOrientation;

window.addEventListener('deviceorientation', function onDeviceOrientationChange(event) {
  // event.beta represents a front-to-back motion of the device and
  // event.gamma a left-to-right motion.
  if (Math.abs(event.gamma) > 10 || Math.abs(event.beta) < 10) {
    previousDeviceOrientation = currentDeviceOrientation;
    currentDeviceOrientation = 'landscape';
    return;
  }
  if (Math.abs(event.gamma) < 10 || Math.abs(event.beta) > 10) {
    previousDeviceOrientation = currentDeviceOrientation;
    currentDeviceOrientation = 'portrait'; // keep the tracked state in sync
    // When the device is rotated back to portrait, let's unlock screen orientation.
    if (previousDeviceOrientation == 'landscape') {
      screen.orientation.unlock();
      window.removeEventListener('deviceorientation', onDeviceOrientationChange);
    }
  }
});
The particular use case the Chrome team used this code for is to get the device's orientation after using screen.orientation.lock (which disables orientation change events).
This can be generalized as a substitute for orientation change events giving you a slight time-advantage before the animation kicks in.
The tricky part is figuring out the right angle range for which the browser decides to switch orientations (you don't want to start your animation when the browser doesn't actually switch orientations).
One way to solve this is to take complete control over orientation changes using screen.orientation.lock where essentially you set the threshold and lock the orientation accordingly.
However since the world isn't perfect, screen.orientation.lock only works in fullscreen mode or in standalone web-apps... If you intend your app to be a fullscreen experience or a standalone web-app then you're in luck.
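If you are, a minimal sketch would be to enter fullscreen on a user gesture and then lock the orientation (the go-fullscreen button id is made up here; error handling kept minimal):
// Sketch: lock the orientation after entering fullscreen (must start from a user gesture).
document.getElementById('go-fullscreen').addEventListener('click', function() {
  document.documentElement.requestFullscreen().then(function() {
    return screen.orientation.lock('portrait');
  }).then(function() {
    // Orientation is now under your control; combine this with the
    // deviceorientation listener above to decide when to hide the element.
  }).catch(function(err) {
    console.warn('Fullscreen or orientation lock not available:', err);
  });
});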

Wait To Get Touch Moves, Why?

I am working on a touch-based JS application; I've studied the Flex and Royal sliders as examples. I noticed that both sliders act similarly when handling the touchmove event:
var started, touched;

el.bind('touchstart', function(e) {
  started = Number(new Date());
  // Get pageX and pageY etc...
});

el.bind('touchmove', function(e) {
  touched = Number(new Date());
  if (touched - started > 500) {
    // Handle touch moves etc...
  }
});
My JS app works seamlessly without this, but why do they need to do it? Why do they wait 500 ms before handling move data?
I believe this is some kind of sensitivity setting. You only want to register a touch-move (drag) event if the user has been moving his or her finger across the device for at least 500ms (in this example).
This can be useful to differentiate between taps and drags. Otherwise, if the user slightly moves his/her finger while tapping, for example, a button, the app would also register a drag. As some controls accept both events, this could lead to erroneous behaviour.
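In other words, something roughly like this (a sketch; the 500 ms threshold comes from the snippet in the question):
var started;

el.bind('touchstart', function(e) {
  started = Date.now();
});

el.bind('touchmove', function(e) {
  // Ignore movement during the first 500 ms so a slightly wobbly tap
  // is not registered as a drag.
  if (Date.now() - started > 500) {
    // Handle the drag here.
  }
});

el.bind('touchend', function(e) {
  if (Date.now() - started <= 500) {
    // Short contact with little movement: treat it as a tap.
  }
});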

Is there a way to catch the zoomed-out-max event in IE10 Metro?

In Internet Explorer 10 Metro Style I want to catch the event that happens when the user zooms out to full view using a pinch gesture.
I can do it either with JavaScript running in the page or with C++ code running in the IE address space.
It's a hack, but it works: the maximum zoom-out is around 0.85 of the normal size, so we check whether the user has zoomed out beyond 0.87:
window.addEventListener('resize', function () {
  if (document.documentElement.clientHeight / window.innerHeight <= 0.87) {
    // this will run more than once while the user
    // is zooming out close to the maximum level
  }
}, false);
It won't work when zooming is disabled (mobile websites).
