flowkey/UIKit-cross-platform

Android: Improve touch handling

ephemer opened this issue · 2 comments

SDL's touch handling makes it impossible to get accurate timestamps on events, which breaks our velocity scrolling and causes other weird behaviour.
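To make that concrete: velocity scrolling boils down to distance travelled divided by time elapsed between consecutive touch events, so if SDL stamps events with the time they were queued rather than the time they actually happened, the time delta collapses (or jumps to a whole frame interval) and the estimate is useless. A minimal illustration, not code from this repo:

func velocity(from a: (x: CGFloat, timestamp: Double), to b: (x: CGFloat, timestamp: Double)) -> CGFloat {
    let dt = b.timestamp - a.timestamp
    guard dt > 0 else { return 0 } // queue-time stamps can make two events share a timestamp
    return (b.x - a.x) / CGFloat(dt) // points per second
}

// With real hardware timestamps: velocity(from: (x: 0, timestamp: 0.000), to: (x: 8, timestamp: 0.008)) == 1000
// With timestamps applied when the events are pumped, dt is often 0 or one whole frame, so the estimate is garbage.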

Since we ship our own custom/core/SDL_android.c, we can override the entry point and do our own processing of the touch events, e.g. attaching correct timestamps and device/finger IDs, and avoiding unnecessary unit conversions.

If we implement this ourselves in Swift, we will be part-way to being independent of SDL.

Some more thoughts on this:

SDLOnTouchListener.kt contains the onTouch function called by the Android system on incoming touch events. This in turn calls the native onNativeTouch function, which is currently implemented in custom/core/SDL_android.c.

The easiest way to improve the current behaviour would be to remove the onNativeTouch definition from SDL_android.c (and SDL_android.h) and implement it directly in Swift:

var event = SDL_Event(tfinger:
    SDL_TouchFingerEvent(
        type: SDL_FINGERUP.rawValue, // or SDL_FINGERDOWN / SDL_FINGERMOTION
        timestamp: timestampFromAndroidSystem, // ensure this is in ms (SDL's own timestamps come from SDL_GetTicks())
        touchId: 0, // I think this is the "Touch Device ID", which should always be 0, but check this
        fingerId: fingerIdFromAndroidSystem,
        x: xFromAndroidSystem,
        y: yFromAndroidSystem,
        dx: 0, // Unfortunately incorrect, but we calculate these
        dy: 0, // ourselves in our UIGestureRecognizers anyway
        pressure: pressureFromAndroidSystem
    )
)

// add the event to SDL's event stack
// don't use SDL_PushEvent because it overrides `event.timestamp` with its own:
SDL_PeepEvents(&event, 1, SDL_ADDEVENT, 0, 0)
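One detail worth pinning down is the timestamp itself: Android's MotionEvent.eventTime is in milliseconds since boot, while SDL stamps events with SDL_GetTicks(), i.e. milliseconds since SDL was initialised. Something like the following could bridge the two clocks (the function name and the one-off offset capture are mine, not existing code):

import SDL // or however the SDL C symbols are exposed to Swift here

private var androidToSDLTicksOffset: UInt32?

func sdlTimestamp(fromAndroidEventTimeMs eventTimeMs: Int64) -> UInt32 {
    let androidMs = UInt32(truncatingIfNeeded: eventTimeMs)
    if androidToSDLTicksOffset == nil {
        // measure the gap between the two clocks once, on the first event we see
        androidToSDLTicksOffset = androidMs &- SDL_GetTicks()
    }
    return androidMs &- (androidToSDLTicksOffset ?? 0)
}

That keeps the relative spacing of the Android timestamps (which is what the velocity maths cares about) while staying on SDL's tick base.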

That would allow us to keep using handleSDLEvents in our render loop as we have until now, which means the least amount of code change and keeps us compatible with the existing Mac code. To do this though, we'd need to update our switch statement to check for touch events too: until now we've been relying on the mouse events SDL simulates based on the touch events. The problem there is that the simulated mouse events always have a fingerId of 0, which means we get weird behaviour if there's more than one touch at the same time.

In UIApplication+handleSDLEvents.swift it's possible something like this will work (incomplete!):

extension UIEvent {
    static func from(_ event: SDL_Event) -> UIEvent? {
        switch SDL_EventType(event.type) {
        case SDL_FINGERDOWN, SDL_FINGERMOTION, SDL_FINGERUP:
            // TODO: try to get an existing/active event and update its touch / add a touch to it
            // CODE MISSING

            // ELSE: no existing event was found:
            let touch = UITouch(
                touchId: Int(event.tfinger.fingerId),
                at: CGPoint(
                    x: CGFloat(event.tfinger.x),
                    y: CGFloat(event.tfinger.y)
                ),
                timestamp: event.timestampInSeconds
            )
            return UIEvent(touch: touch)
        case SDL_MOUSEBUTTONDOWN:
            let touch = UITouch(
                touchId: 0,
                at: CGPoint(x: CGFloat(event.button.x), y: CGFloat(event.button.y)),
                timestamp: event.timestampInSeconds
            )
            return UIEvent(touch: touch)
        default:
            return nil
        }
    }
}
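For the "CODE MISSING" part, one option is to keep the active touches in a dictionary keyed by SDL's fingerId, so that FINGERMOTION and FINGERUP events update the UITouch that the corresponding FINGERDOWN created. This is only a sketch: activeTouches and updateLocation(_:timestamp:) are hypothetical names, only UITouch(touchId:at:timestamp:) appears above, so the real UITouch would need an equivalent way to update an existing touch.

private var activeTouches = [Int64: UITouch]() // keyed by SDL fingerId

func existingOrNewTouch(for tfinger: SDL_TouchFingerEvent, timestampInSeconds: Double) -> UITouch {
    let fingerId = tfinger.fingerId
    let position = CGPoint(x: CGFloat(tfinger.x), y: CGFloat(tfinger.y))

    func makeTouch() -> UITouch {
        return UITouch(touchId: Int(fingerId), at: position, timestamp: timestampInSeconds)
    }

    switch SDL_EventType(tfinger.type) {
    case SDL_FINGERDOWN:
        let touch = makeTouch()
        activeTouches[fingerId] = touch
        return touch
    case SDL_FINGERUP:
        // the finger lifted: update its touch one last time and forget it
        let touch = activeTouches.removeValue(forKey: fingerId) ?? makeTouch()
        touch.updateLocation(position, timestamp: timestampInSeconds) // hypothetical
        return touch
    default: // SDL_FINGERMOTION
        let touch = activeTouches[fingerId] ?? makeTouch()
        touch.updateLocation(position, timestamp: timestampInSeconds) // hypothetical
        activeTouches[fingerId] = touch
        return touch
    }
}

A FINGERDOWN could then wrap the returned touch in a new UIEvent, while motion and up events would be attached to whatever UIEvent that touch already belongs to.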

And then in the handleSDLEvents switch itself (UIEvent.from returns an optional, so unwrap it):

case SDL_FINGERDOWN, SDL_FINGERMOTION, SDL_FINGERUP, SDL_MOUSEBUTTONDOWN:
    if let event = UIEvent.from(e) {
        sendEvent(event)
    }
cshg commented

performance issues still to be fixed here