
[draft] Add support for system-provided gestures #1544

Draft · wants to merge 3 commits into base: master

Conversation

@gormster commented Oct 9, 2023

Issue description

See #1541

Solution description

This adds a new hardware event type, GestureEvent, alongside ButtonEvent and PointerEvent. GestureEvent mostly follows the pattern of PointerEvent, but is distinct in some important ways that make the two incompatible. For starters, a gesture event has the idea of a gesture phase, since it tracks a single gesture over time.

The fired events are named after the gesture type and phase; e.g. for the MAGNIFICATION gesture type, we have magnification_began, magnification, magnification_ended and magnification_cancelled.
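As an illustration of that naming pattern, here's a small sketch. The names and helper below are hypothetical, not code from this PR:

```python
from enum import Enum

class GesturePhase(Enum):
    BEGAN = "began"
    CHANGED = "changed"   # the "changed" phase fires as the bare gesture name
    ENDED = "ended"
    CANCELLED = "cancelled"

def gesture_event_name(gesture_type: str, phase: GesturePhase) -> str:
    """Build the thrown event name from the gesture type and its phase."""
    if phase is GesturePhase.CHANGED:
        return gesture_type
    return f"{gesture_type}_{phase.value}"

# For the MAGNIFICATION gesture type:
names = [gesture_event_name("magnification", p) for p in GesturePhase]
# → ["magnification_began", "magnification",
#    "magnification_ended", "magnification_cancelled"]
```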

That being said, this is a pretty 1:1 interpretation of gestures on Apple platforms (UIGestureRecognizer on iOS exposes pretty much the same pattern, though I haven't added an implementation for it). I haven't looked into how Windows or Linux provide this information, though I suspect it'll be pretty similar?

There's also the possibility of using the system-provided gesture recognisers e.g. NSGestureRecognizer, UIGestureRecognizer, Microsoft.UI.Input.GestureRecognizer. These allow for some more complex behaviours like gesture priority and exclusion. However, I think at that point we're maybe approaching the edge of what Panda is for…

TODO:

  • Add implementation for GestureType::SWIPE on macOS
  • Add some test cases (help please? how to do this?)
  • Figure out if it's even possible to detect this feature being enabled at the system level

Checklist

I have done my best to ensure that…

  • …I have familiarized myself with the CONTRIBUTING.md file
  • …this change follows the coding style and design patterns of the codebase
  • …I own the intellectual property rights to this code
  • …the intent of this change is clearly explained
  • …existing uses of the Panda3D API are not broken
  • …the changed code is adequately covered by the test suite, where possible

Commits:

  • So far it just gets to the MouseWatcher and logs a message
  • also move gesture data to union
  • also remove logging thing from mousewatcher
@gormster (Author) commented Oct 9, 2023

wrt feature detection:

on a Mac with a trackpad, you can enable and disable individual gestures in System Settings. If you've got a Magic Mouse, you can also map some of the gestures on the mouse surface to the system gestures. I kind of assumed there must be some API to see whether these gestures are enabled, but if there is, I can't find it. Admittedly that doesn't mean much, because Cocoa is a gigantic, decades-old API and it could very well be in there somewhere.

I'm posting about it on the dev forums now, but I'm not sure it'll get anywhere.

@rdb (Member) commented Oct 9, 2023

This seems, at first glance, like a good approach to handle system-provided gestures.

For keyboard events, normally we use a ButtonThrower, rather than throwing directly at the MouseWatcher. Maybe we want a GestureThrower?

Do we need support in the GUI system? If so, we need MouseWatcher to pass the gestures to the MouseWatcherRegion, at least for two-finger scroll (or is that not a gesture?).

What do we do with pointer events? Are they suppressed when gestures are detected?

Checking other platforms is a good idea. I don't know about other platforms, but Android has GestureDetector:
https://developer.android.com/reference/android/view/GestureDetector

I suspect that on some platforms we might need to provide our own gesture detection. I suppose we could create a data graph node for that at some point (that takes in pointer events and emits gesture events and filtered pointer events). But that's out of scope here and this approach is still compatible with that.
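As a sketch of what such a data graph node might compute when doing its own gesture detection, here's a minimal pinch/magnification measure derived from two tracked pointer positions. This helper is hypothetical, not part of this PR or of Panda3D:

```python
import math

def magnification_delta(prev_points, cur_points):
    """Given two tracked touch points before and after an update, return the
    relative change in the distance between them: a crude pinch measure.
    Positive means the points moved apart (zoom in); negative, together."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    d0 = dist(*prev_points)
    d1 = dist(*cur_points)
    return (d1 - d0) / d0 if d0 else 0.0

# Two fingers moving from 1 unit apart to 2 units apart:
delta = magnification_delta(((0, 0), (1, 0)), ((0, 0), (2, 0)))
# → 1.0
```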

@gormster (Author) commented

I wasn't super sure if regions made sense for gestures, but on reflection I think so: gestures are recognised at the view level; they're sent to the view the cursor is over when the gesture starts, and keep getting sent to that view until the gesture ends. If you've got, say, a minimap, you definitely want a zoom gesture in the minimap to zoom the minimap.

I made a little toy app to mess with this stuff, and I found some weird behaviour that I think is worth learning from. If you start a zoom gesture in one view, then move the cursor to a subview and start a rotate gesture, the subview now receives all gesture events, including magnify events, and the original view never gets the gesture-ended event. This feels like a bug? Even though it's very unlikely to be hit (you need to be using a mouse and a trackpad simultaneously), I think the sensible thing to say is that a gesture of a given type is locked to a specific region until it hits the ENDED or CANCELLED phase, just like we do with mouse clicks and _preferred_button_down_region. (If it has a BEGAN phase, that is… SWIPE events only have a CHANGED phase, no beginning or end.)
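The locking rule described above could be sketched like this. All names here are made up for illustration; this is not the actual MouseWatcher code:

```python
from enum import Enum, auto

class Phase(Enum):
    BEGAN = auto()
    CHANGED = auto()
    ENDED = auto()
    CANCELLED = auto()

class GestureDispatcher:
    """Routes gesture events, locking each gesture type to the region it
    began in until the gesture ends or is cancelled."""

    def __init__(self):
        self._locked = {}  # gesture type -> locked region

    def dispatch(self, gesture_type, phase, region_under_cursor):
        # Lock onto the region where the gesture began. Gestures without a
        # BEGAN phase (like SWIPE) are never locked and simply go to
        # whichever region is under the cursor.
        if phase is Phase.BEGAN:
            self._locked[gesture_type] = region_under_cursor
        target = self._locked.get(gesture_type, region_under_cursor)
        # Release the lock once the gesture finishes.
        if phase in (Phase.ENDED, Phase.CANCELLED):
            self._locked.pop(gesture_type, None)
        return target
```

With this rule, a magnification that began over one region keeps going to that region even if the cursor wanders into a subview mid-gesture.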

Re: GestureThrower – I was confused by the two totally separate code paths for handling button presses… I think I see now: one is for the GUI and one is for the scene. I'm not sure I understand why, though. I get that you don't want a key press that was consumed by the UI to be forwarded on to the scene, but I'm not sure what the point of ButtonThrower is when MouseWatcher seems to implement pretty much the same functionality. If there's no region for it to throw to, can't it just drop the event parameter in throw_event_pattern? What am I missing here?

Re: Two finger scroll – at the moment this is handled as a MouseWheel button, and as far as I can tell is not counted as a "gesture" by macOS. NSPanGestureRecognizer isn't triggered by it. I think MouseWheel being a button is… not great? But that's a different issue.

@rdb rdb marked this pull request as draft October 10, 2023 06:25
@rdb (Member) commented Oct 10, 2023

I agree that when you start a gesture it should be locked to that region.

Yes, MouseWatcher is designed to filter out events that are meant for the GUI system. As for the separation between ButtonThrower and MouseWatcher: I am not privy to the rationale for the design decisions, but I imagine the designers thought it good to separate these duties. That perhaps allows for more flexibility, such as obviating the need for MouseWatcher in a GUI-less application, creating multiple ButtonThrowers with different prefixes, or other data graph configurations I can't imagine. Note that there also isn't a MouseWatcher for other input devices such as gamepads (those directly get a ButtonThrower attached), but those don't generate gestures anyway. I don't know what the pattern events on MouseWatcher are used for.

We ought to fix the two-finger scroll being a wheel event. I think there ought to be a scroll event with delta values. I don't know if that should be considered a gesture or not (for macOS, these events probably predated the gesture system), but as you say, it's probably out of scope here.
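As a rough illustration of what a delta-based scroll event might carry, compared to discrete wheel button presses. This is a hypothetical type, not an existing Panda3D API:

```python
from dataclasses import dataclass

@dataclass
class ScrollEvent:
    """Hypothetical scroll event carrying continuous deltas, rather than
    discrete wheel_up/wheel_down button presses."""
    delta_x: float
    delta_y: float
    precise: bool  # True for trackpads/continuous devices, False for notched wheels

# One notch of a legacy mouse wheel could be reported as a unit scroll:
legacy_click = ScrollEvent(delta_x=0.0, delta_y=1.0, precise=False)
```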
