narzoul/DDrawCompat

COMPETITIONS AT ROSEMOND HILL colour issue

BEENNath58 opened this issue · 21 comments

This game is a part of the "Let's Ride! The Rosemond Hill Collection".

The colours in the game render incorrectly, regardless of whether it's run natively or with DDrawCompat.
[screenshot: rosemondhill]

Can you please look into the issue? Seems somewhat new to me

I tested it on my Intel GPU machine, and the problem doesn't occur there. Looks like an NVIDIA (+AMD?) problem.

Also, @ghotik tried setting the "preferred" GPU to Nvidia (because Intel has no issue), but it didn't work; to my knowledge, that setting has no effect unless the application uses DirectX 9 or higher (correct me if that's wrong).

Luckily, I see the same difference between my Intel and NVIDIA GPUs. A quick look at the resource formats used in the DDrawCompat logs reveals some key differences.

These lines are missing on Intel:
Using resource format: D3DDDIFMT_X8L8V8U8, texture, managed
Using resource format: D3DDDIFMT_X8L8V8U8, mipmap texture, vidmem

These 2 lines on NVIDIA:
Using resource format: D3DDDIFMT_X8L8V8U8, mipmap texture, managed
Using resource format: D3DDDIFMT_X8L8V8U8, texture, vidmem
are replaced by these 2 lines on Intel:
Using resource format: D3DDDIFMT_X8R8G8B8, mipmap texture, managed
Using resource format: D3DDDIFMT_X8R8G8B8, mipmap texture, vidmem

The D3DDDIFMT_X8L8V8U8 format is not supported on Intel. As far as I know, this format would be used for bump mapping. But somehow I doubt the game wants to use it for that, especially if it uses X8R8G8B8 as a replacement, which would make no sense for bump mapping. Maybe it just tries to find any 24-bit format, happens to find that bump-map format first, doesn't check whether it's actually an RGB format, and uses it by mistake.
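Purely as an illustration of that guess (this is not the game's actual code), a format-picking callback with that kind of bug might look like the following sketch, where only the bit count is checked and the DDPF_RGB flag never is:

#include <d3d.h> // DirectDraw / Direct3D 7 declarations (header availability depends on the SDK)

// Hypothetical sketch of a buggy EnumTextureFormats callback that can pick up
// an X8L8V8U8 bump-map format by accident.
HRESULT CALLBACK pickFirstWideFormat(LPDDPIXELFORMAT pf, LPVOID context)
{
    auto chosen = static_cast<DDPIXELFORMAT*>(context);
    if (pf->dwRGBBitCount >= 24)      // bug: no (pf->dwFlags & DDPF_RGB) check
    {
        *chosen = *pf;
        return D3DENUMRET_CANCEL;     // take the first match and stop
    }
    return D3DENUMRET_OK;             // keep enumerating
}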

Anyway, removing bump map texture formats from enumeration fixes it. In DDrawCompat, you can use the SupportedTextureFormats=rgb, argb setting.

I just remembered that another game, Roller Coaster Mania 1, has similar issues. It turns out the same solution works for that too. I guess they use the same engine.

There is more: I also had to add the line
VertexBufferMemoryType=vidmem
otherwise I get a mess when selecting the rider.
[screenshot: rider-mess]

What GPU is that? I can't reproduce it on Intel, AMD, or NVIDIA. It looks as if your vertex buffer gets corrupted by some invalid write or buffer overflow elsewhere. If your GPU has dedicated memory, that is likely why putting the vertex buffer in video memory protects it better. I can try to add some code to detect this case, for testing purposes.

GTX 950.

What GPU is that? I can't reproduce it with either Intel

I can reproduce it on Ivy Bridge Intel HD Graphics

I can reproduce it on Ivy Bridge Intel HD Graphics

And is it fixed by VertexBufferMemoryType=vidmem? I think for integrated GPUs it should make no difference, since both are ultimately in system memory. But this makes it even more likely that it's some kind of memory corruption then.

And is it fixed by VertexBufferMemoryType=vidmem

That's correct. The vidmem setting works on the Intel iGPU, while the NVIDIA 1050 Ti works with both the vidmem and sysmem settings.

Hi Narzoul, thanks so much for the invaluable help.
I am trying to replicate the fix in DxWnd, and I think the EnumTextureFormats method could be a good place.
I'd like to ask whether you also made the patch at the d3dumddi level. I tried to browse your code and found some references to the EnumTextureFormats method, but I couldn't track down any d3dumddi callback able to do the same thing in a deeper, more generalized way. Can you tell me whether this is possible and, if so, which callback should be hooked?

For the UMD, that would be the GetCaps function from D3DDDI_ADAPTERFUNCS, with the pData->Type parameter set to either D3DDDICAPS_GETFORMATCOUNT or D3DDDICAPS_GETFORMATDATA. Both together describe the supported formats. Be aware that on some runtimes (e.g. DX9) and driver versions, the same format may be returned multiple times with different capabilities, so each instance would have to be filtered.
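For what it's worth, a rough sketch of that kind of filtering, assuming the adapter's GetCaps is already hooked (names, headers and the duplicate-overwrite trick below are illustrative, not DDrawCompat's actual code):

#include <d3dumddi.h> // UMD DDI declarations; FORMATOP may need additional headers depending on the SDK/WDK

static PFND3DDDI_GETCAPS g_origGetCaps = nullptr; // hypothetical pointer to the original GetCaps

HRESULT APIENTRY getCapsHook(HANDLE hAdapter, const D3DDDIARG_GETCAPS* pData)
{
    HRESULT result = g_origGetCaps(hAdapter, pData);
    if (SUCCEEDED(result) && D3DDDICAPS_GETFORMATDATA == pData->Type)
    {
        auto formatOps = static_cast<FORMATOP*>(pData->pData);
        const UINT count = pData->DataSize / sizeof(FORMATOP);
        for (UINT i = 0; i < count; ++i)
        {
            if (D3DDDIFMT_X8L8V8U8 == formatOps[i].Format)
            {
                // Overwrite the excluded format with a copy of another entry, so the
                // count reported via D3DDDICAPS_GETFORMATCOUNT stays consistent.
                // (Edge cases, e.g. the excluded format being the first entry, are ignored here.)
                formatOps[i] = formatOps[0];
            }
        }
    }
    return result;
}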

In DDrawCompat, I used EnumTextureFormats for the filtering, because doing it at the UMD level would also prevent me from creating helper surfaces with the excluded formats.
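For reference, a minimal sketch of that kind of EnumTextureFormats filtering (simplified, not the actual DDrawCompat implementation) could wrap the application's callback and hide the bump-map formats:

#include <d3d.h> // Direct3D 7 / DirectDraw declarations (header availability depends on the SDK)

struct EnumFilterContext
{
    LPD3DENUMPIXELFORMATSCALLBACK appCallback;
    LPVOID appContext;
};

// Wrapper callback: hide bump-map formats from the application, pass the rest through.
HRESULT CALLBACK skipBumpFormats(LPDDPIXELFORMAT pf, LPVOID context)
{
    auto& ctx = *static_cast<EnumFilterContext*>(context);
    if (pf->dwFlags & (DDPF_BUMPDUDV | DDPF_BUMPLUMINANCE))
    {
        return D3DENUMRET_OK; // skip this format, but keep enumerating
    }
    return ctx.appCallback(pf, ctx.appContext);
}

// Hypothetical hook for IDirect3DDevice7::EnumTextureFormats;
// g_origEnumTextureFormats would point to the original vtable entry.
HRESULT (STDMETHODCALLTYPE* g_origEnumTextureFormats)(
    IDirect3DDevice7*, LPD3DENUMPIXELFORMATSCALLBACK, LPVOID) = nullptr;

HRESULT STDMETHODCALLTYPE enumTextureFormatsHook(
    IDirect3DDevice7* This, LPD3DENUMPIXELFORMATSCALLBACK callback, LPVOID context)
{
    EnumFilterContext ctx = { callback, context };
    return g_origEnumTextureFormats(This, skipBumpFormats, &ctx);
}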

@huh02 @BEENNath58 For the graphics corruption, please try to reproduce it with this version (without VertexBufferMemoryType=vidmem):
ddraw.zip (diff.txt compared to v0.5.4)

In my case, the game creates these 4 vertex buffers at this point:

11:43:44.472 VertexBuffer: &039B4200 Address: &0BDC0920 Size: 320000 AllocationBase: &0BD70000 BaseAddress: &0BDC0000 RegionSize: 3862528
11:43:45.722 VertexBuffer: &039B4460 Address: &0BE10E40 Size: 280000 AllocationBase: &0BD70000 BaseAddress: &0BE10000 RegionSize: 3534848
11:43:45.724 VertexBuffer: &039B3B80 Address: &1082A8A0 Size: 1224 AllocationBase: &10720000 BaseAddress: &1082A000 RegionSize: 1081344
11:43:45.724 VertexBuffer: &039B3020 Address: &1091AD40 Size: 42696 AllocationBase: &10720000 BaseAddress: &1091A000 RegionSize: 98304

The size=42696 one is the rider's model, which is a static model that is only locked/unlocked once. I added write-protection to most of its memory pages after the unlock. Hopefully, this will crash your games before the model gets corrupted. If this happens, please also enable the LogLevel=debug and CrashDump=full settings, and share both the log and dmp files. Most likely, though, the corruption comes from within Rosemond.exe, and I won't have time to reverse engineer what causes it.
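For the curious, the idea is roughly the following sketch (simplified, not the exact code in the linked build): after the final unlock, query the buffer's system-memory region and mark the whole pages inside its data range read-only, so any stray write from elsewhere faults immediately instead of silently corrupting the model.

#include <windows.h>

// Sketch: write-protect the whole pages covered by a vertex buffer's data after
// the final unlock. Partially covered pages at the edges stay writable so that
// unrelated data sharing those pages isn't affected.
void protectVertexBufferData(void* data, SIZE_T size)
{
    MEMORY_BASIC_INFORMATION mbi = {};
    VirtualQuery(data, &mbi, sizeof(mbi)); // yields fields like those logged above (AllocationBase, BaseAddress, RegionSize)

    SYSTEM_INFO si = {};
    GetSystemInfo(&si);
    const UINT_PTR pageSize = si.dwPageSize;

    UINT_PTR begin = (reinterpret_cast<UINT_PTR>(data) + pageSize - 1) & ~(pageSize - 1);
    UINT_PTR end = (reinterpret_cast<UINT_PTR>(data) + size) & ~(pageSize - 1);
    if (begin < end)
    {
        DWORD oldProtect = 0;
        VirtualProtect(reinterpret_cast<void*>(begin), end - begin, PAGE_READONLY, &oldProtect);
    }
}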

PS: I just realized that SupportedTextureFormats=argb also removes the RGB formats without alpha, so the correct setting should be SupportedTextureFormats=rgb, argb. I updated my previous comment about it as well.

Game crashed.
Rename the .zip extension to .7z.
logCompetitions at Rosemond Hill.zip

Well, that didn't crash where I expected. Actually that crash is caused by a debug logging bug in DDrawCompat, since some struct fields are not present on Windows 7. I just removed the logging of those fields for now, until I figure out a better solution.
Please try again with this version:
ddraw.zip (diff.txt compared to v0.5.4)

For me, if I set the vertex buffer to sysmem, it crashes on the rider screen. When set to vidmem, it renders correctly and runs.

With the previous build, the rendering would be corrupted with sysmem, but it wouldn't crash.

New logs here.
Rename 001.zip and 002.zip to 7z.001 and 7z.002.
2Competitions at Rosemond Hill.001.zip
2Competitions at Rosemond Hill.002.zip

Thanks! It looks like it might not be heap corruption after all. The difference in your logs compared to mine is that in your case, the IDirect3DVertexBuffer7Vtbl::Optimize method actually does something. It destroys the original vertex buffer and creates a new one, presumably with the optimized vertex content. But either it does something incorrectly, or there is some other interaction I'm not aware of. For me, the method simply returns 0 without touching the original vertex buffer, unless I use video memory, in which case it fails when trying to create the new buffer.

I tried to compare the Windows 7 and 11 implementations of d3dim700.dll in IDA, and it looks like the Optimize implementation may have been nuked in newer Windows versions. Probably not a big deal, because I doubt it had much (if any) effect on modern hardware.
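Skipping that step essentially means making the hooked Optimize a no-op; a minimal sketch of the idea (illustrative only, not the exact change in the build linked below):

#include <d3d.h> // IDirect3DVertexBuffer7 declarations (header availability depends on the SDK)

// Sketch: report success from the hooked IDirect3DVertexBuffer7::Optimize without
// destroying and recreating the vertex buffer, mirroring what newer d3dim700.dll
// versions appear to do anyway. The hook wiring is assumed to exist elsewhere.
HRESULT STDMETHODCALLTYPE optimizeHook(
    IDirect3DVertexBuffer7* /*This*/, IDirect3DDevice7* /*device*/, DWORD /*flags*/)
{
    return D3D_OK;
}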

Anyway, I opted to try to fix it by removing the Optimize step in your case too. See if this works:
ddraw.zip (diff.txt compared to v0.5.4)

Unfortunately, the game crashes again on the Choose Rider screen. Here are the logs.
Delete the .zip extension.
3Competitions at Rosemond Hill.7z.001.zip
3Competitions at Rosemond Hill.7z.002.zip

Oh, sorry, I think I attached the previous version again by accident. I updated the download link in my previous post with the correct version, please download it again!

OK, this one works fine, no crashes, perfect.

I tested it on my Intel HD Graphics on Win7, and it works fine with both vidmem and sysmem.