Using the JavaScript mousewheel event on OSX seems to be unusable for precise stepped scrolling.
Any idea how to fix this example for OSX:
http://jsfiddle.net/daslicht/Qbq4k/
The issue is that on OSX multiple mousewheel events get dispatched.
I only need to touch the wheel of my Logitech Anywhere MX mouse and 3 or more events get dispatched.
When I try to scroll the list with the touchpad, precise scrolling is impossible due to the excessive mousewheel events.
On Windows, scrolling the list works like a charm, since only ONE event gets dispatched for each mousewheel step
(even in hyperscrolling mode of the Anywhere mouse).
I have even tried to block the incoming events or to lower the threshold, but nothing worked as nicely as on Windows:
http://jsfiddle.net/daslicht/Qbq4k/ // blocking events after the first incoming one for n ms
How do you guys deal with the MouseWheel on OSX?
if (osx) {
    mouse = false; // ?
}
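For reference, here is a minimal sketch of the "block events after the first one for n ms" idea mentioned above; the element id, item height, and the 100 ms window are assumptions for illustration, not values from the original fiddle:

var list = document.getElementById('list'); // assumed element id
var ITEM_HEIGHT = 40;                       // assumed row height in px
var wheelBlocked = false;

function scrollStep(direction) {
    // move exactly one item per accepted wheel event
    list.scrollTop += direction * ITEM_HEIGHT;
}

list.addEventListener('wheel', function (event) {
    event.preventDefault();

    // OSX dispatches a burst of wheel events for one physical wheel step,
    // so ignore everything for a short window after the first one
    if (wheelBlocked) { return; }
    wheelBlocked = true;
    setTimeout(function () { wheelBlocked = false; }, 100); // assumed 100 ms window

    scrollStep(event.deltaY > 0 ? 1 : -1);
}, { passive: false });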
Related
I have an application that runs in two browser windows (on two screens), and node-webkit helps me manage them both and lets them share memory.
In the application I have the ability to drag elements from one screen to the other (between the browsers). At first my idea was to listen for the pointerdown event and, when it triggers, to start listening in its callback for pointermove and pointerup on the document of both windows. That was when I realized that Chrome has some interesting behaviour when a pointerdown is initiated.
From what it seems, after the pointerdown all further events happen only in the originating browser window, even when the pointer leaves its boundaries. For example, the pointermove event will have negative clientX values when the pointer exits the screen on the left side, or values way higher than the screen width when it leaves the screen on the right.
I also tested some other events with regular Chrome tabs, using
monitorEvents(document, EVENT_NAME)
and the results were the same: without a pointerdown, both browser windows were catching pointermove, pointerup and pointerenter events, but after a pointerdown in one of the windows, the other window stopped getting all the previously mentioned events until a pointerup occurred.
I even tried to cheat Chrome by doing something like:
let pointerDownCallback = (event) => {
    // synthesize a pointerup on the document that initiated the drag
    currentScreenDocument.dispatchEvent(new Event('pointerup'));
    // then listen for the real pointerup / pointermove on both documents
    firstScreenDocument.addEventListener('pointerup', pointerUpCallback);
    secondScreenDocument.addEventListener('pointerup', pointerUpCallback);
    firstScreenDocument.addEventListener('pointermove', pointerMoveCallback);
    secondScreenDocument.addEventListener('pointermove', pointerMoveCallback);
};

element.addEventListener('pointerdown', pointerDownCallback);
Chrome got the pointerup event, but it didn't stop the pointerdown behaviour.
I still haven't found a way to disable this kind of behaviour, and I have tried lots of things, such as playing with blur and focus.
For now, what I'm doing is modifying the event that triggers in one screen to match the positions of the other screen, and triggering a service that performs the pointerup callback action in the correct screen.
let pointerUpCallback = (event) => {
    // I'm getting the screen by using clientX and clientY of the event
    // and by knowing the sizes of each screen
    let screen = getEventScreen(event);
    if (screen !== pointerDownOriginScreen) {
        // modify the event's position to match the correct screen
        event.screenX = ....
        event.screenY = ....
    }
    // get the service that will handle the pointerup in the correct screen
    let service = getServiceForScreen(screen);
    service.handlePointerUp(screen);
};
I have a similar thing for pointermove; most of my other actions are also based on the positions I get from the event and on the screen I know the event should actually happen in.
This is a pretty dirty way to handle the problem, so my question is: is there a way to handle this issue in a more elegant and natural way?
What I'm trying to achieve is to make it possible to drag an element to a different position, including into another Chrome window (which is of course part of my application). To do that I implemented a custom drag behaviour in my system. The idea of this drag is to minimize the element I start dragging and replace it with a smaller element that represents its heading (similar to the heading of a regular Chrome tab, but without seeing the content of the tab). That heading should follow the mouse cursor (even when I'm switching screens), and when I release the drag, the element should maximize back to its original size and content.
I added a pointerdown event handler to the element that minimizes it, saves its content, and adds move and up event listeners on the document of both Chrome windows.
The idea was that the heading would follow the cursor via pointermove, even when it moves to the other screen, because then the document of the other Chrome window would be the one firing the event. The same idea applied for pointerup, so when the drag ends, I'd know in which screen it happened because that window would fire the pointerup event.
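Roughly, the intended setup looked like the following sketch; the helpers minimizeElement, createHeadingElement and restoreElement, and the firstScreenDocument / secondScreenDocument references to the two windows' documents, are placeholder names of mine, not from the real code:

element.addEventListener('pointerdown', (event) => {
    // shrink the dragged element and show a small heading in its place
    minimizeElement(element);
    const heading = createHeadingElement(element);

    const onMove = (e) => {
        // keep the heading under the cursor, whichever window fires the event
        heading.style.left = e.clientX + 'px';
        heading.style.top = e.clientY + 'px';
    };
    const onUp = (e) => {
        // restore the element in the window where the drag ended
        restoreElement(element, e);
        [firstScreenDocument, secondScreenDocument].forEach((doc) => {
            doc.removeEventListener('pointermove', onMove);
            doc.removeEventListener('pointerup', onUp);
        });
    };

    // listen on both windows' documents, expecting whichever window
    // is under the cursor to fire the events
    [firstScreenDocument, secondScreenDocument].forEach((doc) => {
        doc.addEventListener('pointermove', onMove);
        doc.addEventListener('pointerup', onUp);
    });
});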
But that isn't the case, because Chrome doesn't allow events to fire in another window once you have initiated a pointerdown in your window. All the following events will only trigger for the origin window of the pointerdown event.
You can reproduce that behaviour by opening 2 Chrome windows and writing the following in the DevTools console of both windows:
monitorEvents(document, 'pointermove')
Now, when you move the cursor from one window to another, pointermove events fire in each window's console. But when you click on one window, hold the pointer down, and start moving it around (over the other window too), you'll notice that the pointermove events fire only for the pointerdown origin window. That is the case for other events as well.
On the site I am building, there is an effect where the top navigation "unlocks" from being a fixed element when you scroll past a certain point. It works very smoothly on my computer. However, on iPad or iPhone there is a problem with the scroll event, which looks like this:
$(window).on('scroll', function(){...});
If you flick to scroll the screen, the scrolling continues on its own, and the event doesn't fire until the scrolling comes to a stop. If you move your finger to scroll, the event doesn't fire until you let go. In other words, it does not fire as you move (i.e., scroll) the screen.
Is there some way I can capture both the movement of the user's finger as the screen is scrolled, and also the "inertia" movement as it is happening? If my script ran when those events happen, the screen would be updated along the way, and it would all happen smoothly like it does on my computer.
I assume this has something to do with iOS. I don't have an Android device to test with... not sure if it is also an issue there or not.
Would appreciate any advice.
Thank you!
You could try using the touchmove event instead for mobile users. That way the code runs when they move their finger instead of after they let go.
$(document).on('touchmove', function(){...});
More info here: https://developer.mozilla.org/en-US/docs/Web/Events/touchmove
Like intelligentbean said, you could listen for the "touchmove" event; you could also use touchstart and touchend if you want to do anything special before or after the touch happens.
Also, the touch data is available in the jQuery handler, but it's not on the event passed as the parameter of the listener function; it's on one of its properties, originalEvent:
$(document).on('touchmove', function(e){
    var touchEvent = e.originalEvent.touches[0]; // the first touch point, with properties such as pageX and pageY
});
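Putting the two answers together, here is a rough sketch of how the nav "unlock" check could be driven from both events; the #nav selector, the unlockPoint value and the class name are assumptions for illustration, not taken from the original site:

var unlockPoint = 300; // assumed pixel offset where the nav should unlock

function updateNav() {
    var top = $(window).scrollTop();
    // toggle the class that releases the nav from its fixed position
    $('#nav').toggleClass('unlocked', top > unlockPoint);
}

// fires continuously on desktop, but only at the end of a scroll on iOS
$(window).on('scroll', updateNav);

// fires while the finger is moving on touch devices
$(document).on('touchmove', updateNav);

Note that on older iOS versions this may still not update during the momentum ("inertia") phase, since script execution is paused there until the scroll settles.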
I have a web app running in the Google Chrome browser that runs fine on Windows 7 with a touchscreen; however, on a Windows 8 tablet the touch events don't work the same way.
It is much more sensitive to any movement of the touch which often results in a touchcancel event being fired instead of a touchend.
I'd like to map all touchcancel events to be a touchend. The app uses Hammer JS as I hoped it would cover situations like this. Does anybody have a method to fix this?
I got a much better experience by changing the Hammer defaults, especially the tap settings, to allow a larger movement threshold for a tap event, which then translates through to a touchend.
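As a rough illustration of that kind of tuning (the exact option names depend on the Hammer version; this sketch assumes the Hammer 2.x recognizer API, and the values are guesses rather than the ones actually used):

var hammertime = new Hammer(element);

// allow more finger movement and a longer press before a tap is rejected,
// so slightly shaky touches still end up as a tap / touchend
hammertime.get('tap').set({
    threshold: 10, // max movement in px during the tap
    time: 400      // max press time in ms
});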
Go to any online multitouch JavaScript demo page; here are some:
MDN (same as jsfiddle.net/Darbicus/z3Xdx/10/) or this. I can only post a maximum of 2 links, but every online multitouch demo I could find shows the same behaviour described here.
Put one finger on the canvas and don't move or release it. Now put another finger on the canvas and try to draw some shape by moving it (be careful not to move the first finger). It doesn't draw the shape; the touchmove event is not firing for the 2nd touch! Don't release any finger yet. Try to move only the first finger. Now you get touchmove events for both fingers at once, and all events work fine (immediately) for both touches from then on.
I tested it on 2 different tablets with Android 4.2.2. On both tablets I tested it first with Chrome 31 and 32 and then with Firefox 26, always with the same result.
Why is the touchmove event not firing for the second touch if the first touch hasn't moved yet? How can I solve this?
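For context, the kind of per-touch drawing these demo pages do in their touchmove handler looks roughly like the sketch below (simplified; the canvas id and the lastPositions bookkeeping are placeholders of mine). The second finger described above never reaches this handler until the first finger moves:

var canvas = document.getElementById('canvas'); // assumed id
var ctx = canvas.getContext('2d');
var lastPositions = {}; // last known point per touch identifier

function touchPos(touch) {
    var rect = canvas.getBoundingClientRect();
    return { x: touch.clientX - rect.left, y: touch.clientY - rect.top };
}

canvas.addEventListener('touchstart', function (e) {
    e.preventDefault(); // keep the browser from treating the touch as scroll/zoom
    for (var i = 0; i < e.changedTouches.length; i++) {
        var t = e.changedTouches[i];
        lastPositions[t.identifier] = touchPos(t);
    }
});

canvas.addEventListener('touchmove', function (e) {
    e.preventDefault();
    // draw a segment for every touch point that moved
    for (var i = 0; i < e.changedTouches.length; i++) {
        var t = e.changedTouches[i];
        var last = lastPositions[t.identifier];
        if (!last) { continue; }
        var pos = touchPos(t);
        ctx.beginPath();
        ctx.moveTo(last.x, last.y);
        ctx.lineTo(pos.x, pos.y);
        ctx.stroke();
        lastPositions[t.identifier] = pos;
    }
});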
Google fixed this behaviour in the latest Chrome beta, version 33 (https://src.chromium.org/viewvc/chrome?revision=244063&view=revision).
Firefox and Opera still have this problem.
When doing multiple touches in a UIWebView, I am unable to get a touchstart when I do the following:
Put two fingers on the screen. (This fires gesturestart.)
Keep one of the fingers still, and lift the other. (This fires gestureend.)
Put the lifted finger on the screen again without moving either finger. (Nothing fires; I would say touchstart and gesturestart should fire here.)
touchstart and gesturestart fire as soon as one of the fingers is moved.
It seems to me that this must be a bug in UIWebView, and that it should be reported?
Is there a possible workaround?
Since it is in UIWebView, it is generally also present in every iOS browser (Safari).
You can see the issue here: http://porsager.com/uiwebviewbug (only for devices with touch events)
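A minimal way to observe the missing event, along the lines of that test page, is simply logging the relevant touch and gesture events; the element ids here are placeholders of mine:

var target = document.getElementById('touch-area'); // assumed test element
var log = document.getElementById('log');            // assumed output element

['touchstart', 'touchend', 'gesturestart', 'gestureend'].forEach(function (name) {
    target.addEventListener(name, function (e) {
        // in the third step above, no touchstart / gesturestart line appears
        // even though a finger was just placed back on the screen
        var count = e.touches ? e.touches.length : '-';
        log.textContent += name + ' (touches: ' + count + ')\n';
    });
});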