rust-windowing/raw-window-handle

Decouple display and window handles

kchibisov opened this issue · 17 comments

When a user wants to deal with platform-specific code, most of the time they need the raw display, e.g. wl_display. However, RawWindowHandle provides both the display and the window together.

I'd suggest logically splitting them into RawDisplayHandle and RawWindowHandle, where RawDisplayHandle would only hold a reference to the display, and RawWindowHandle would hold only the window, like wl_surface, or stay the way it is right now.

The issue I'm having is that I want the display on every platform, but I don't want a surface, since the surface isn't created yet.

For example, winit could implement raw_display_handle on the event loop, so I can pass it to e.g. glutin.

From what I understand, this would make sense for WaylandHandle.display, XcbHandle.connection, and XlibHandle.connection? I'm not sure if there's something analogous on other platforms.

For something like Android/macOS it would make sense to have plain Android, Macos, etc. variants, since on those platforms you don't have a separate display object. There could also be a Gbm(device) variant.
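Roughly, such a split could look like the sketch below; the variant and field names are purely illustrative, not the crate's actual API:

```rust
// Hypothetical sketch of the proposed split; names and fields are
// illustrative only, not the crate's actual API.
use core::ffi::c_void;

/// Connection to the display server, available before any window exists.
#[derive(Copy, Clone, Debug)]
enum RawDisplayHandle {
    /// *mut wl_display
    Wayland { display: *mut c_void },
    /// Xlib Display* connection.
    Xlib { display: *mut c_void },
    /// xcb_connection_t* connection.
    Xcb { connection: *mut c_void },
    /// Platforms with a single, implicit display server carry no data.
    Windows,
    AppKit,
    Android,
    /// gbm_device* for KMS/DRM setups.
    Gbm { device: *mut c_void },
}

/// Per-window data, produced only once a window/surface actually exists.
#[derive(Copy, Clone, Debug)]
enum RawWindowHandle {
    /// *mut wl_surface
    Wayland { surface: *mut c_void },
    Xlib { window: u64 },
    Xcb { window: u32 },
    Win32 { hwnd: *mut c_void, hinstance: *mut c_void },
    // ... remaining platforms much like today's RawWindowHandle.
}
```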

So it'll work the same way it works right now; it just prevents me from creating a RawDisplay right inside glutin.

> So it'll work the same way it works right now; it just prevents me from creating a RawDisplay right inside glutin.

Alright. I'm not familiar enough with glutin to understand why this would be useful, but I'm assuming this has something to do with decoupling glutin from winit?

> Alright. I'm not familiar enough with glutin to understand why this would be useful, but I'm assuming this has something to do with decoupling glutin from winit?

To create an EGLDisplay or GLXDisplay you need a raw display handle, nothing more. That display then lets you create objects such as EGLContext, EGLConfig, EGLSurface, etc. The RawWindowHandle is only required for an EGLSurface, and not even for all surfaces, since e.g. pbuffers don't need it at all.
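As a rough illustration, here is an FFI sketch (assuming a Wayland session and that libwayland-client/libEGL are available to link against; this is not glutin's actual code) showing that only the native display pointer is involved, with no window or surface in sight:

```rust
// Rough FFI sketch: creating an EGLDisplay needs only the native display
// connection (here a wl_display), no window or surface at all.
use std::os::raw::{c_char, c_int, c_uint, c_void};

const EGL_PLATFORM_WAYLAND_KHR: c_uint = 0x31D8;

#[link(name = "wayland-client")]
extern "C" {
    fn wl_display_connect(name: *const c_char) -> *mut c_void;
}

#[link(name = "EGL")]
extern "C" {
    // EGL 1.5: EGLDisplay eglGetPlatformDisplay(EGLenum, void *, const EGLAttrib *)
    fn eglGetPlatformDisplay(
        platform: c_uint,
        native_display: *mut c_void,
        attrib_list: *const isize,
    ) -> *mut c_void;
    fn eglInitialize(display: *mut c_void, major: *mut c_int, minor: *mut c_int) -> c_uint;
}

fn main() {
    unsafe {
        let wl_display = wl_display_connect(std::ptr::null());
        assert!(!wl_display.is_null(), "no Wayland compositor available");

        // This is all a glutin-style "Display" object would need up front.
        let egl_display =
            eglGetPlatformDisplay(EGL_PLATFORM_WAYLAND_KHR, wl_display, std::ptr::null());
        let (mut major, mut minor) = (0, 0);
        assert_ne!(eglInitialize(egl_display, &mut major, &mut minor), 0);
        println!("initialized EGL {major}.{minor} without any window handle");
    }
}
```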

I'd assume the same applies to Vulkan or e.g. softbuffer (not the current impl of softbuffer in particular).

> I'd assume the same applies to Vulkan

I don't recall there being any concept of a "display" connection being needed to do this. As long as you can enumerate a Vulkan device, you can create/allocate a texture for render-target purposes.

You'll need both to create a surface on a window though:

https://github.com/ash-rs/ash/blob/master/ash-window/src/lib.rs

Perhaps a "display" handle may be relevant for VK_EXT_physical_device_drm to associate a physical Vulkan device with its DRM handle.

Well, don't you need a wl_display for the Wayland platform with Vulkan? I'd assume you'd need one.

@kchibisov I don't think you need any of that to create a Vulkan device and do anything like compute or offscreen rendering. As the linked ash-window crate shows, a wl_display and wl_surface are needed to create a VkSurfaceKHR, though.
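For illustration, a rough sketch assuming the ash crate (whose exact API differs between versions): enumerating physical devices and doing headless work needs no display or window handle at all; only VkSurfaceKHR creation would.

```rust
// Rough sketch (assuming the `ash` crate): enumerating physical devices
// involves no windowing-system handles whatsoever.
use ash::vk;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load the Vulkan loader; no display-server connection is involved.
    let entry = unsafe { ash::Entry::load()? };
    let app_info = vk::ApplicationInfo {
        api_version: vk::API_VERSION_1_0,
        ..Default::default()
    };
    let create_info = vk::InstanceCreateInfo {
        p_application_info: &app_info,
        ..Default::default()
    };
    let instance = unsafe { entry.create_instance(&create_info, None)? };

    // Compute or offscreen rendering can start from here; a VkSurfaceKHR
    // (and therefore a wl_display/wl_surface pair on Wayland) is only
    // needed for presenting to a window.
    let devices = unsafe { instance.enumerate_physical_devices()? };
    println!("found {} physical device(s)", devices.len());

    unsafe { instance.destroy_instance(None) };
    Ok(())
}
```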

That's interesting. Though for EGL you need a wl_display to create the EGLDisplay, and the wl_surface is required for eglCreatePlatformWindowSurface (but not the wl_display, since the EGLDisplay is used there).

In general, the wl_display could be used to initialize stuff that doesn't require a window, like setting up a Wayland queue to handle events.

@kchibisov Yes, that surprised me too. It seems the EGL API builds on top of an active windowing system (though it should be possible to run it headless?), whereas Vulkan only looks at the physical GPUs provided by the system and can access them without any windowing system at all.

I do agree that it'd be nice to have/use it this way.

This entire topic completely doesn't make sense to me as a Win32 programmer because with Win32 it's just the HWND and that's it.

However, I'd like to say that I'm open to changes for other platforms if people that know about those platforms think it's appropriate.

I'll add that the goal of the crate has always been for there to be some sort of common data/trait for talking about the OS handle so that [window lib] and [graphics api lib] can share info without ever directly depending on each other, but that doesn't necessarily mean we only have a single data value. It certainly might be appropriate for some platforms to have more than one "handle" that can be asked for. This is the part I don't know about, and where I'll leave it up to others who know about those APIs.

> This entire topic completely doesn't make sense to me as a Win32 programmer because with Win32 it's just the HWND and that's it.

Yeah, because you have only one possible display server? On macOS and Android it's the same from what I can see; however, Wayland, X11, and XCB each have their own way to connect to the system display server, since, you know, there are multiple of them (Mesa should work with Xorg and Wayland at the same time, and it must distinguish them somehow). So something like RawDisplay will be required for glutin shortly (I already have one internally, so I can upstream it later on once I'm done with the rest of the platforms).

@Lokathor Doesn't Windows have this with HINSTANCE (used in ash-window) as well? It's been a super long time, but I thought that was application/process/message-pump wide?

Ah, interesting comparison.

Yes, there's an HINSTANCE type. It's occasionally needed, but 99.99% of the time what you're supposed to pass to functions asking for an HINSTANCE is the instance for your own process. You can get a copy of this value at any time with GetModuleHandleW(NULL).
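A minimal sketch, assuming the windows-sys crate (not part of raw-window-handle itself):

```rust
// Minimal sketch (assuming the `windows-sys` crate, Windows only): the
// process-wide HINSTANCE is available at any time, with no window around.
#[cfg(windows)]
fn process_hinstance() -> windows_sys::Win32::Foundation::HMODULE {
    use windows_sys::Win32::System::LibraryLoader::GetModuleHandleW;
    // Passing a null module name returns the handle of the calling process'
    // own executable, which is what most APIs asking for an HINSTANCE expect.
    unsafe { GetModuleHandleW(core::ptr::null()) }
}
```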

Each HWND is associated with an HINSTANCE. You can get the instance for a window handle with GetWindowLongPtrW(hWnd, GWLP_HINSTANCE), and relatively few APIs need the instance, so often you might not store a window's instance yourself at all (just look it up if you happen to need it).

Each HWND has its own event loop. The event loop for a window is similar to an MPSC channel: other threads, even other processes, can put events into a window's queue, but then only the thread that initialized the window can pull from the window's message queue and handle the messages.

Just checking quickly, it looks like:

  • GL uses an HDC (which is a thing you can get for a window) to make a Context.
  • VK uses an HWND + HINSTANCE pair to make a Surface.
  • I dunno what DX uses; I'm an open source ninny.

If a "display server" is something like "the system has multiple display servers available, and i get to pick what display server i want to make a window for", that doesn't really map to any windows API that I know about.

Like kchibisov said, there's just the one windowing system on Windows, and you get what you get.

@Lokathor Thanks for the detailed insight!

So yes, if @kchibisov has a use-case for needing the display-server connection without having a surface/window yet - even if it's just for EGL - I think this is a good change to make. And Vulkan users may benefit when utilizing VK_EXT_physical_device_drm.

I am a fan of this, especially for cases where the display isn't easily clonable. My only concern: how would you stop, say, a Wayland display server from being used with, say, an XCB window? That feels like a larger surface that could possibly panic.

> My only concern: how would you stop, say, a Wayland display server from being used with, say, an XCB window? That feels like a larger surface that could possibly panic.

Well, the API can prevent it in the first place. And if you've got a Wayland display, you're unlikely to get an XCB window, since you don't have XCB around?

The API is already unsafe to begin with, since you basically pass pointers around. In general, applications can check whether the right thing is being passed. It's not really a problem when we're doing low-level stuff.

The main issue I'm trying to solve here is that I need a display before I even try to create a surface; I don't have a window at all in glutin when I query for X11 visuals with GLX, for example. Only later will I use that visual to create the X11 window :/
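For reference, a rough FFI sketch of that ordering (assuming an X11 session and that libX11/libGL are available to link against; an illustration, not glutin's actual code): GLX framebuffer configs are queried against a bare Display*, and only afterwards would the chosen visual be used to create the window.

```rust
// Rough FFI sketch: GLX visual/config selection needs only a Display*;
// the X11 window is created later from the chosen config's visual.
use std::os::raw::{c_char, c_int, c_void};

type Display = c_void; // opaque Xlib display connection
type GLXFBConfig = *mut c_void;

#[link(name = "X11")]
extern "C" {
    fn XOpenDisplay(name: *const c_char) -> *mut Display;
    fn XDefaultScreen(display: *mut Display) -> c_int;
    fn XFree(data: *mut c_void) -> c_int;
}

#[link(name = "GL")]
extern "C" {
    fn glXChooseFBConfig(
        display: *mut Display,
        screen: c_int,
        attrib_list: *const c_int,
        nelements: *mut c_int,
    ) -> *mut GLXFBConfig;
}

fn main() {
    unsafe {
        // Only the display connection exists at this point; no window yet.
        let display = XOpenDisplay(std::ptr::null());
        assert!(!display.is_null(), "no X11 display available");

        let mut count = 0;
        let attribs = [0]; // None-terminated attribute list
        let configs =
            glXChooseFBConfig(display, XDefaultScreen(display), attribs.as_ptr(), &mut count);
        println!("GLX offered {count} framebuffer configs");

        if !configs.is_null() {
            XFree(configs.cast());
        }
    }
}
```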

If you have an idea of how to make them safer to use, I'm all ears, but take into account that I should be able to have a display without a window.