Some reverse-engineering of the Level Editor and TXD stuff

Anything to do with Drakan level editing and modifications, post it here! Also this is the place to tell us about your new levels and get player feedback.
User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

Here's a bunch of things I uncovered while digging around in the Editor:

--------------------------------------------

First, let's clear up some confusion:

Code: Select all

Flags: (...) // 0x01 this has a separate alpha map attached after the image data
This does NOT work - not for any possible combination of pixel formats!
Although the Level Editor places this data correctly (*) when importing alpha maps, the Riot Engine is incapable of actually using it due to some unknown bug (probably somewhere in the rendering code).

(*): Importing alpha maps works correctly, now that I've fixed it. Previously, the alpha channel data was also getting placed into the correct location, but was getting partially mangled in the process due to a programming error in the byte-order-flipping routine; this caused subtle distortion of the imported alpha maps.

So the purpose of this functionality:
if flags & 1:
    // Unknown interpretation.
    // This data might need to be appended to the above, or maybe not.
    // Or maybe this data is corrupt and unusable.
    uint8[uint32] separate_alphamap_data;
...is thoroughly moot, unless someone first fixes the underlying bug in the Engine.
However... the Level Editor uses this exact data when the "Edit :arrow: Convert" option is selected on a non-integrated alpha map.
This is the very data that is added when "Add :arrow: Alpha map" is selected; the Editor always adds it as a separate alpha map, thus effectively requiring an additional conversion step to be performed by the user.

So for practical purposes, the best current practice for any future clones of the Riot Engine (and/or clones of the Level Editor) would be to throw a warning message whenever a loaded TXD has any entries with a non-integrated alpha map, prompting the user to convert these to the integrated format (since otherwise they will remain non-functional in the original Engine).
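That warning logic can be sketched in a few lines of Python. Note that the flag constant names are my own labels for the bits quoted above, not identifiers from the actual tools:

```python
# Flag bits as described in this post (names are mine, not from the original tools).
FLAG_SEPARATE_ALPHA = 0x01    # separate alpha map appended after the image data
FLAG_INTEGRATED_ALPHA = 0x02  # alpha channel integrated into the pixel data

def check_txd_entry(name: str, flags: int) -> list[str]:
    """Return warnings for one TXD entry, per the practice suggested above."""
    warnings = []
    # Only a separate alpha map with no integrated one is a problem;
    # the original Engine will silently ignore the separate data.
    if flags & FLAG_SEPARATE_ALPHA and not flags & FLAG_INTEGRATED_ALPHA:
        warnings.append(
            f"{name}: separate (non-integrated) alpha map - "
            "the original Engine will ignore it; convert to integrated format"
        )
    return warnings
```

A clone's TXD loader would call this per entry while reading the container and surface the collected warnings to the user.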

This is because currently the only way to have working alpha channels in textures is by integrating the alpha map with the surface pixels:

Code: Select all

Flags: (...) // 0x02 an alpha channel is integrated into the image data
--------------------------------------------

NOTE: The critical parameter that determines whether a texture's alpha map works (or not) is the FLAGS parameter in the TXD data.

For 32-bit textures, changing FLAGS.1 to False disables the (integrated) alpha channel with no in-game side effects. However, it also causes strange behavior in the Editor, which then thinks that the integrated alpha map is actually a completely separate sub-texture (it isn't!); this allows some inappropriate operations to be performed on it (don't!), likely causing the Editor to crash and/or corrupt unrelated data in the process (untested!).

With all the other texture types, changing FLAGS.1 to False will cause colorspace corruption, due to the different meaning of the bitfields in the pixel data after alpha channel integration.

--------------------------------------------

For 32-bit textures, the byte order for the pixels' component colors in the .TXD files is as follows: BGRA (8:8:8+8).
Despite the Editor's vehement assertions to the contrary, the Riot Engine can handle 32-bit textures perfectly fine - including the alpha channel!

As proof, I've attached a database file containing a single 32x32 32-bit RGBA texture, which has been hexed in manually as a proof of concept.

Regardless of the pixel format, the texture pixel data is stored in the exact same pixel layout as in the BMP files - that is, flipped vertically. So the image data in memory starts at the bottom left corner, and ends at the upper right.
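Putting the two facts above together (BGRA byte order plus bottom-up row order), a converter from conventional top-down RGBA data to the stored layout would look roughly like this. This is a pure-Python sketch assuming 32-bit pixels and no row padding:

```python
def rgba_to_txd_layout(pixels: bytes, width: int, height: int) -> bytes:
    """Convert top-down RGBA8888 pixel data into the bottom-up BGRA
    layout described above. Sketch only; assumes no row padding."""
    assert len(pixels) == width * height * 4
    out = bytearray(len(pixels))
    row_bytes = width * 4
    for y in range(height):
        src_row = pixels[y * row_bytes:(y + 1) * row_bytes]
        # Rows are stored flipped vertically, like in a BMP file.
        dst_off = (height - 1 - y) * row_bytes
        for x in range(0, row_bytes, 4):
            r, g, b, a = src_row[x:x + 4]
            # Per-pixel component order in the .TXD is B, G, R, A.
            out[dst_off + x:dst_off + x + 4] = bytes((b, g, r, a))
    return bytes(out)
```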

--------------------------------------------

The .BMP import codepath:
(Note: Function arguments are listed in reverse order of how they are pushed onto the stack - that is, the same way that ollydbg counts them; "Called from" applies only to this particular codepath!)

When the "Add texture" option in the Databases window menu is clicked, the first thing that happens is that the TextureImporter function is called, from what is presumably the main loop for the Databases window:
loop TextureManager (Level_Ed.00426860)
Active at all times when the Level Editor's window is in focus,
Appears to oversee stuff related to displaying and manipulating textures in the Editor,

Notes:
- clicking on "Add texture(s)" :arrow: case 111 of switch 004268A7 :arrow: Level_Ed.00426D8C,
- bails out of the loop if no color palette is loaded (spawning the error message window first),
- otherwise branches to 00426D8C and continues from there,
- calls some other code which does a lot of (irrelevant?) things, including a heap allocation call; it seems that this is where the pointer to texture descriptors originates from,
- calls function TextureImporter,
- checks if the width of the loaded texture is one of the permitted values from a list of (32, 64, 128, 256, 512, 1024, 2048),
- checks the number of bits per pixel of the loaded bitmap,
- if it's 24 or 32-bit, calls BitmapBitnessConverter,
(...)
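The width check from the notes above is trivial to mirror; the list of permitted values is taken straight from those notes (whether the height is validated the same way isn't stated here):

```python
# Permitted texture widths, per the Editor's import check noted above.
PERMITTED_WIDTHS = (32, 64, 128, 256, 512, 1024, 2048)

def width_is_valid(width: int) -> bool:
    """True if this width would pass the Editor's import check."""
    return width in PERMITTED_WIDTHS
```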
function TextureImporter (Level_Ed.004231D0)
Called from: Level_Ed.00426860 +600h
Arg_ECX: Pointer to a location where the texture metadata (width, height, BPP, etc.) should be written to,
Arg1: Unknown (seems to be important, whatever it is - but it sure ain't a valid pointer!),
Arg2: Pointer to the full file path for the import, stored on the stack (always 0 when called from Level_Ed.00426860 +600h),
Arg3: Pointer to memory block containing database metadata; specifically, it points to the beginning of the currently loaded palette data (taken verbatim from the .PAL file, minus the header),
Arg4: Unknown (always 0FFFFFFFFh when called from Level_Ed.00426E60),
Arg5: Unknown (always 0 when called from Level_Ed.00426E60),


Notes:
- sometimes gets called a second time after (if) the user presses the Cancel button in the file open dialog (can't replicate reliably!),
- reserves a lot of space on the stack (over 256 bytes) - this is the buffer for the file path returned by FileOpenDlgHandler,
- pushes a whole bunch of other crap onto the stack,
- if no pointer to the file path was specified, calls function FileOpenDlgHandler; otherwise it uses the path provided,
- checks for .BMP extension (again!),
- if all is in order, calls function BMP_Reader then returns.
function FileOpenDlgHandler (Level_Ed.0046A2A0)
Called from: function TextureImporter (Level_Ed.004231D0) +0A1h
Arguments: Unknown
(Appears to take no arguments? The last thing pushed onto the stack before it's called is another pointer to some empty memory space...?)

Notes:
- gets the current working directory from somewhere (the stack?)
- spawns the "Open file..." dialog, opened in the current working directory,
- checks if the selected file has .BMP extension,
- if successful, returns the full path to the selected file on the stack, as well as a return code of 0 in EAX.
function BMP_Reader (Level_Ed.004233F0)
Called from: function TextureImporter (Level_Ed.004231D0) +1FFh
Arg_ECX: Pointer to where the new texture metadata struct needs to be written to,
Arg1: Unknown (seems to be important, whatever it is - but it sure ain't a valid pointer!),
Arg2: Pointer to the location of the full ASCII file path on the stack,
Arg3: Pointer to memory block containing database metadata; specifically, it points to the beginning of the currently loaded palette data (taken verbatim from the .PAL file, minus the header),
Arg4: Unknown (appears to always be 0FFFFFFFFh),
Arg5: Unknown (appears to always be 0 if importing a base texture; 1 if importing an alpha map),

Notes:
- Arg1, 3, 4 & 5 appear to be passed verbatim by the calling function,
- also checks if the file actually exists, and bails out if it doesn't,
- that seems redundant, because the calling function also performs its own checks to that effect,
- then tries to open the file (Kernel32.OpenFile),
- if successful, calls function BMP_HeaderParser,
- if successful, writes some data to a struct in process memory (at the location pointed to by the 3rd pointer on the stack after returning from function BMP_HeaderParser; see description below),
- calculates the size of the pixel data (width in bytes * height in pixels) and allocates this much memory (Kernel32.GlobalAlloc),
- puts the handle of the allocated memory onto the stack,
- locks the memory with Kernel32.GlobalLock,
- copies the entire pixel data from the .BMP file verbatim into the newly allocated memory,
- if a 24-bit BMP, flips the byte order of the individual pixels from BGR to RGB,
- unlocks the allocated memory with Kernel32.GlobalUnlock,
- puts the handle to said memory into the same struct as the other data,
- closes the .BMP file with Kernel32._lclose,
- calls function TextureDefLoader,
- returns 0 in EAX if successful (or 1 if not?),

NOTE: the struct listed below is the texture metadata used by the Editor for all of its operations; hexing it causes the relevant values to change in real time (such as the presence or absence of alpha channel).
Data written to the struct in memory (all DWORDs):
- horizontal width of the texture data, in pixels,
- vertical height of the texture data, in pixels,
- horizontal width of the texture data, in bytes,
- # of bits per pixel of the base texture,
- # of bits per pixel of the alpha channel data,
- unknown DWORD value (always 0FFFFFFFFh if importing a .BMP...?),
- unknown DWORD value (always 0 if importing a .BMP...?),
- unknown DWORD value (always 0 if importing a .BMP...?),
- unknown DWORD value,
- handle to the allocated memory buffer containing the flipped pixel byte data (effectively, a pointer to the pointer),
- handle to the allocated memory buffer containing the separate alpha channel pixel byte data (valid if FLAGS.0 = 1),
- unknown DWORD value (always 0 if importing a .BMP...?),
- unknown BYTE value (usually 0Ah, sometimes just 0),
- BYTE value containing the texture flags,
- unknown WORD value (always 0?),
- unknown DWORD value (always 0 if importing a .BMP...?),
- unknown DWORD value (maybe a pointer?),
- pointer to the "fake palette data" (if it exists).
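Reading the field list above as a packed struct gives a 64-byte (40h) record, which happens to line up with the "offset 40h" mentioned for TextureDefLoader below. The field names here are my descriptive guesses, and the exact packing is an assumption:

```python
import struct
from typing import NamedTuple

# Layout inferred from the field list above; 12 DWORDs, 2 BYTEs, 1 WORD,
# then 3 more DWORDs = 64 bytes (40h) total. Packing is an assumption.
_FMT = "<12I2BH3I"

class TextureMeta(NamedTuple):
    width_px: int
    height_px: int
    width_bytes: int
    bpp_base: int
    bpp_alpha: int
    unk1: int            # always 0FFFFFFFFh when importing a .BMP?
    unk2: int
    unk3: int
    unk4: int
    pixel_handle: int    # handle to flipped pixel data buffer
    alpha_handle: int    # valid if FLAGS.0 = 1
    unk5: int
    unk_byte: int        # usually 0Ah, sometimes 0
    flags: int           # the texture flags byte
    unk_word: int
    unk6: int
    unk7: int            # maybe a pointer?
    fake_palette_ptr: int

def parse_texture_meta(blob: bytes) -> TextureMeta:
    """Decode one texture metadata record from raw process memory."""
    return TextureMeta(*struct.unpack_from(_FMT, blob))
```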
function BMP_HeaderParser (Level_Ed.00423830)
Called from: function BMP_Reader (Level_Ed.004233F0) +0C6h
Arg1: File handle to the BMP file opened by the calling function,
Arg2: Pointer to a lot of empty space on the stack, reserved by the calling function (roughly 2.7kB; the exact value is constant and very specific),

Notes:
- seeks to the beginning of the BMP file (Kernel32._lseek),
- reads the first 14 bytes (the BMP descriptor; Kernel32._lread),
- checks if that's actually a real, valid BMP file,
- then loads the next 40 bytes of the BMP header and checks if read was successful,
- compares the length of the header (read from the BMP file) with 0Ch (Why? Is it something related to .PCX import?),
- checks the # of bits per pixel of the .BMP,
- calculates some values which it then places in a struct on the stack,
- reads the offset to the start of .BMP pixel data and ._lseek's to that point before returning,

Return values on the stack:
- return address + the 2 arguments (cleaned up by RETN 8 ),
- pointer to ASCII string "bmp"
- pointer to ASCII string "pcc"
- pointer to some empty memory space in the block containing the .DEF data for the currently loaded databases (!NEEDS FURTHER INVESTIGATION!),
- (unknown DWORD value, but not any valid pointer; also needs further investigation),
- 0 at return time (later the handle of the allocated memory for the BMP pixel data gets placed here),
- file handle to opened BMP file (yes, it's a duplicate of the handle that BMP_HeaderParser had been called with),
- 0 at return time (later the pointer to the allocated memory for the BMP pixel data gets placed here),
- horizontal width of the texture data, in pixels,
- vertical height of the texture data, in pixels,
- horizontal width of the texture data, in bytes,
- # of bits per pixel,
- unknown DWORD value (always 0 if importing a .BMP),
- unknown DWORD value (always 0FFFFFFFFh if importing a .BMP),
- A LOT of zeros (the rest of the stack space reserved by the calling function was not used at all)!
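The header checks above correspond to the standard BMP layout: a 14-byte file header followed by a 40-byte BITMAPINFOHEADER. A minimal Python equivalent of those checks (the 0Ch comparison presumably guards against the old 12-byte OS/2 core header, but that is a guess):

```python
import struct

def parse_bmp_headers(data: bytes):
    """Mirror the checks above: validate the 14-byte file header and the
    40-byte info header. Returns (width, height, bpp, pixel_offset)."""
    magic, _file_size, _res1, _res2, pixel_offset = struct.unpack_from("<2sIHHI", data, 0)
    if magic != b"BM":
        raise ValueError("not a valid BMP file")
    header_size, width, height, _planes, bpp = struct.unpack_from("<IiiHH", data, 14)
    if header_size != 40:  # the Editor also compares against 0Ch (OS/2 header?)
        raise ValueError(f"unexpected info header size {header_size}")
    # The real code then _lseek's to pixel_offset before returning.
    return width, height, bpp, pixel_offset
```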
function TextureDefLoader (Level_Ed.00469ED0)
Called from: function BMP_Reader (Level_Ed.004233F0) +3DFh
Arg1: Pointer to the location of the full ASCII file path on the stack,
Arg2: Pointer to where the data should be written to (immediately following the previous texture metadata struct :arrow: offset 40h),
Arg3: Unknown (always 100h when called from Level_Ed.004233F0),

Notes:
- does not seem to do anything of real importance beyond just writing the full pathname to the location specified,
- there's some extra code for handling cases such as files being on network drives, but in any case this data is only used for the .DEF file texture descriptors.
function BitmapBitnessConverter (Level_Ed.00421BB0)
Called from: Level_Ed.00426860 +677h
Arg_ECX: Pointer to a location where the texture metadata (width, height, BPP, etc.) should be read from (NOTE: this pertains to the source bitmap!),
Arg1: Pointer to memory block containing database metadata; specifically, it points to the beginning of the currently loaded palette data (taken verbatim from the .PAL file, minus the header) - note that this appears to always be 0 when exporting 16-bit bitmaps!,
Arg2: Number of bits per pixel of the target bitmap,

Notes:
- compares the BPP of the source and target bitmaps,
- if they match, returns 0 and does nothing else,
- otherwise
(...)
- if the bitness of the source bitmap is lower than the target's, it allocates the right amount of memory, and deallocates the previous buffer after completing the conversion,
- updates the struct on the stack with the handle to the new buffer, if applicable,
- the value it writes at Arg_EAX+24h is a pointer to the memory buffer (since it's GMEM_FIXED)!
(...)
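One conversion this function has to perform when a 24-bit BMP is imported is padding each pixel out to 32 bits. A hedged sketch of that step (the actual in-place buffer handling and the fill value used for the alpha byte are not confirmed):

```python
def bgr24_to_bgra32(pixels: bytes, fill_alpha: int = 0xFF) -> bytes:
    """Expand 24-bit pixel data to 32-bit by appending an alpha byte to
    each pixel. Illustrative only; the converter's internals are unknown."""
    assert len(pixels) % 3 == 0
    out = bytearray()
    for i in range(0, len(pixels), 3):
        # Keep the three color bytes as-is, append the alpha fill byte.
        out += pixels[i:i + 3] + bytes((fill_alpha,))
    return bytes(out)
```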

--------------------------------------------

The texture export codepath:
function TextureExporter (Level_Ed.00422DF0)
Called from: Level_Ed.004274A5 (if exporting texture manually from the menu)
Arg_ECX: Pointer to a location where the texture metadata (width, height, BPP, etc.) should be read from,
Arg1: Handle to a window (but which - the calling one, presumably?),
Arg2: Pointer to the path+filename for the texture being exported (always 0 when exporting manually from the menu),

Notes:
- checks if Arg2 is nonzero,
- if it's 0, brings up the dialog prompting the user to select a location that the exported texture should be saved to,
- if Arg2 was nonzero, loads the path from the pointer provided and checks if the extension is .bmp,
- again checks if the extension is .bmp,
- if yes, calls function TextureExportHandler; otherwise it bails out (showing an error message if the extension was .pcx instead).
function TextureExportHandler (Level_Ed.00422F90)
Called from: Level_Ed.00422DF0 +185h
Arg_ECX: Pointer to a location where the texture metadata (width, height, BPP, etc.) should be read from,
Arg1: Handle to a window (but which - the calling one, presumably?),
Arg2: Pointer to the path+filename for the texture being exported,

Notes:
- checks if the texture has an integrated alpha channel (FLAGS.1 = True),
- bails out with an error message if that's the case,
- otherwise, calls function TextureExportMemAllocator (which creates a temporary copy of the source texture),
- then calls function BitmapBitnessConverter (which allocates more memory if required for the conversion result to fit),
- bails out with an error message if the texture bitness conversion failed for any reason,
- otherwise, creates a new file using Kernel32.CreateFileA,
- bails out with an error message if the file creation attempt failed,
- calculates the values that need to be written to the .BMP header,
- writes the .BMP file header in 2 separate operations, using Kernel32.WriteFile,
- locks the export pixel data buffer using Kernel32.GlobalLock,
- copies the data from the source to destination buffer, flipping the order of the color bytes,
- writes the contents of the destination buffer to the .BMP file, using Kernel32.WriteFile,
- unlocks the export pixel data buffer using Kernel32.GlobalUnlock,
- closes the opened file with Kernel32.CloseHandle,
- calls function TextureExportMemDeallocator,
- returns 0 if the file was exported successfully.
function TextureExportMemAllocator (Level_Ed.004214F0)
Arg_ECX: Pointer to some space on the stack where the struct for the new temporary (converted for export) texture should be written to (NOTE: this struct is 240h bytes in size!)
Arg1: Pointer to a location where the texture metadata (for the texture being exported!) should be read from (NOTE: this is the pointer to the source texture!),

Notes:
- the newly allocated buffer is the same size as the source texture,
- the pixel data is then copied verbatim,
- the value it writes at Arg_EAX+24h is a handle to the memory buffer (not a pointer)!
function TextureExportMemDeallocator (Level_Ed.004214F0)
Arg_ECX: Pointer to a location where the texture metadata should be read from (NOTE: when exporting a texture, this is the pointer to the temporary texture that got converted to the output .BMP pixel format!)

Notes:
- reads the texture metadata and deallocates the buffers used for the primary (and alpha, if it exists) texture pixel data,
- if this is a texture that is being deleted from a database, additionally calls some DirectDraw function to deal with that.
Attachments
Sample database with 32-bit texture.zip
(1.69 KiB) Downloaded 997 times

UCyborg
Dragon
Posts: 433
Joined: Sun Jul 07, 2013 7:24 pm
Location: Slovenia

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by UCyborg »

Wow, works in the editor and its 3D view! The game complains about unsupported 32-bit texture, so we still have some loose ends. When I messed with this, it only seemed to work in editor's 2D view. So the thing working in editor's 3D is new to me.

Edit: works in-game if Drakan.exe is modified to not bail out; skipping the check at 0x436FBD. Or perhaps it would be better to add 32 (and 24?) to the check, so it still bails out if some bogus value makes it there - similar to how it was done for texture sizes.
"When a human being takes his life in depression, this is a natural death of spiritual causes. The modern barbarity of 'saving' the suicidal is based on a hair-raising misapprehension of the nature of existence." - Peter Wessel Zapffe

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

UCyborg wrote:So the thing working in editor's 3D is new to me.
One critical thing that I had to do is to set the texture "pitch" value correctly in the TXD. With a wrong value, the preview in the Databases window was only slightly wrong, but in the 3D View it was totally messed up.

UCyborg wrote: Edit: works in-game if Drakan.exe is modified to not bail out
I take it that we can expect this change in next AiO patch then? :D

Admittedly, I didn't actually test it in-game, only in the Editor. But since both use essentially the same engine core, I jumped to the conclusion that it should also work OK in-game.

It's interesting that they would add an extra check for that though, if the Editor already disallows importing them.
Especially since they actually work fine!

UCyborg wrote:Or perhaps it would be better to add 32 (and 24?)
Does it even support 24-bit textures? I haven't done any testing in that direction, because there is no good reason.
32-bit format is universal: it can accept a 24-bit texture verbatim AND it's also capable of storing alpha information on top of that, with no loss of quality.

Arguably using 24-bit (if it's even supported!) would reduce the database sizes by up to 1/4th (assuming no textures have alpha channels), but at present it doesn't sound like a very compelling argument; not when I can be throwing hundreds of MBs around with reckless abandon.

Zalasus
Whelp
Posts: 18
Joined: Mon Jan 29, 2018 6:50 pm
Location: Germany

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Zalasus »

Mechanist wrote: Sun Aug 19, 2018 12:33 pm ...is thoroughly moot, unless someone first fixes the underlying bug in the Engine.
Just as an interesting side note, if you haven't noticed: in my early attempts at reversing the alpha maps, I found that this appended data always seemed to have the same length as the compressed pixels. The data was always seemingly random, even for all-black or all-white alpha maps, which would suggest some sort of compression - but there never was any zLib header to be found. Might have been coincidence or me being stupid, but that confusion, and the fact that no in-game texture seemed to use alpha maps, were reason enough for me not to investigate the issue further.

Mechanist wrote: Sun Aug 19, 2018 12:33 pm For 32-bit textures, the byte order for the pixels' component colors in the .TXD files is as follows: BGRA (8:8:8+8).
Could you elaborate on how you found that out? I initially assumed the byte order was GBAR (I know, probably just me being stupid), which turned out wrong after some testing. However, it now seems to me like the byte order is RGBA (just like the 24-bit format with the alpha byte slapped on at the end, not adjusted for any byte order). At least that is what gives me correct results when loading 32-bit textures made by the editor in OpenDrakan. This of course involves no alpha whatsoever, as right now I only have the version of the editor that refuses to add alpha channels to 32-bit textures.

Mechanist wrote: Sun Aug 19, 2018 2:37 pm Does it even support 24-bit textures? I haven't done any testing in that direction, because there is no good reason.
32-bit format is universal: it can accept a 24-bit texture verbatim AND it's also capable of storing alpha information on top of that, with no loss of quality.
In the Riot Engine, trying to create layers with 24 bit textures (as well as 32 bit) only yielded glitching black holes for me. I didn't test it with models, however.

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

Zalasus wrote:the fact no in-game texture seemed to use alpha maps
What. There are several of those, even in the stock game.
Arokh's wings, as well as his fireballs (Resources\Texture510) - those are the 2 that I can name right off the bat.
There are several other textures with transparency, but I don't remember what they were.
However... they all use the "integrated" alpha maps!

Zalasus wrote: The data always was seemingly random even for all black or all white alpha maps, which would suggest some sort of compression, but there never was any zLib header to be found.
Interesting. I never investigated this in any detail, because it didn't work for squat. Assumed it was just an engine limitation.

Quick testing shows that this data is perfectly valid IF the texture is saved in the .TXD with NO compression!
Subsequently reloading the .TXD and converting the alpha produces the expected result.
Of course even though the data is valid, the alpha channel still isn't rendered at all until it's converted; seems that the Engine always ignores non-integrated alpha data.

However, as you have stated, the compressed data is corrupted and unusable - attempting an alpha conversion on a texture that had been saved (with separate alpha) in compressed format creates only a glitchy square when that texture is subsequently applied to a layer.

The Editor is full of bugs, so it wouldn't surprise me in the slightest to find that it calls the compression routine when it's not supposed to, or calls it with the wrong parameters.

Anyway... I think it's perfectly safe to ignore the "separate alphamap data" in any case.
The Engine clearly doesn't want any part of it, valid or not. Storing it in the .TXDs is a total waste of space.
The Editor doesn't allow removing any existing alpha maps, even separate ones.

In my new fixes for the editor's BMP import functionality, I'll probably just skip the step of having separate alphamap data entirely and have it integrate it into the pixel data during import (showing the pixel format selection dialog for non-32bit base textures).

BTW, I found the root cause of the alpha import corruption bug... I was wondering WHERE it was getting the wrong "byte width" value from, and I found it yesterday.
It's the "byte width" of the existing ("base") texture (usually 16-bit)!
Which is totally wrong, since the subsequent code expects the "byte width" of the .BMP being loaded from disk instead! (And that's what my fix does.)

The reason it didn't corrupt regular textures is because in that case, there was no preexisting (and likely incorrect) width value to use. It's as simple as that...
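In other words, the fix amounts to deriving the row pitch from the file being imported, rather than reusing the base texture's value. For a BMP, the standard pitch calculation looks like this (whether the Editor pads TXD rows the same way is not confirmed):

```python
def bmp_row_pitch(width_px: int, bpp: int) -> int:
    """Row pitch ("byte width") of a BMP row: bits rounded up to a 4-byte
    boundary, as the BMP format requires. Per the fix described above, this
    must be computed from the file being imported, not taken from the
    preexisting base texture."""
    return ((width_px * bpp + 31) // 32) * 4
```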

Zalasus wrote:BGRA (...) Could you elaborate how you found out about that?
The simplest and most obvious way.
After I successfully hexed in a 32-bit texture, I started messing with the individual bytes in the .TXD, which is how I rapidly arrived at that conclusion.
NOTE: This is the byte format the Riot Engine accepts! Not what the Editor outputs.

IIRC, converting existing 16-bit textures to 32-bit in the Editor caused the colors to be mangled as seen in the 3D View.

Zalasus wrote: At least that is what gives me correct results when loading 32 bit textures made by the editor in OpenDrakan.
Ahh, that explains it then. The existing routines for importing/converting 32-bit textures in the Level Editor are totally broken, it would appear. Unsurprising, given that they are not officially supported anyway.

Obviously, in this case it's important to recreate the Engine's behavior, not the Editor's. So BGRA it is.
I guess it's flipped due to endianness, since ARGB (or RGBA) would make the most sense, but whatever.

Zalasus wrote: This of course involves no alpha whatsoever as right now I only have the version of the editor that refuses to add alpha channels to 32-bit textures.
Wait, did you make your own Level Editor? Or are you talking about the existing one?
If you mean the original one, then again the same reason applies: 32bit = unsupported. End of story.

Zalasus wrote: In the Riot Engine, trying to create layers with 24 bit textures (as well as 32 bit) only yielded glitching black holes for me. I didn't test it with models, however.
Interesting, because as you can see here, 32-bit textures work perfectly fine - as long as they are NOT created by the Editor in its current shape!
Admittedly I didn't test them with models either (not yet, anyway), only with layers - but I suppose they should work OK regardless.

Zalasus
Whelp
Posts: 18
Joined: Mon Jan 29, 2018 6:50 pm
Location: Germany

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Zalasus »

Mechanist wrote: Mon Aug 20, 2018 4:06 am What. There are several of those, even in the stock game.
(...)
However... they all use the "integrated" alpha maps!
Sorry, that's exactly what I meant. All transparent textures I found use integrated alpha channels. I've just adopted the editor's convention of calling the latter "alpha channels", and those alpha channels that exist as a separate bitplane "alpha maps".

Mechanist wrote: Mon Aug 20, 2018 4:06 am NOTE: This is the byte format the Riot Engine accepts! Not what the Editor outputs.
(...)
Obviously, in this case it's important to recreate the Engine's behavior, not the Editor's. So BGRA it is.
Ah, okay, now I see my mistake. I totally agree that what the engine expects is important here. Also, BGRA makes sense as it is the byte format most commonly used by Direct3D back then, iirc. This way the engine can simply pipe the data from the .txd to the texture buffer and be done with it.

Mechanist wrote: Mon Aug 20, 2018 4:06 am Wait, did you make your own Level Editor? Or are you talking about the existing one?
If you mean the original one, then again the same reason applies: 32bit = unsupported. End of story.
No, I was talking about the existing one. I haven't found any way to import 32 bit textures into the engine except by using the 'convert' function, which I think we now have officially established as broken for 32 bit textures :D

Okay, while importing 32 bit BMPs yields 16 bit textures, adding alternate textures to existing ones from 32 bit BMPs actually yields 32 bit texture records. But since I have no clue what these "alternate textures" are about, I have yet to verify whether those are actually any good.

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

Zalasus wrote: All transparent textures I found use integrated alpha channels.
Yes, and I suppose this is the case precisely because the alternative doesn't work in the Riot Engine.

That, and probably also because these hypothetical "separate alpha maps" use more disk space, memory, and presumably also incur additional processing overhead.

Zalasus wrote: This way the engine can simply pipe the data from the .txd to the texture buffer and be done with it.
Ok, that's one mystery cleared up then :)

Zalasus wrote: But since I have no clue what these "alternate textures" are about,
Haven't messed around with those yet, but aren't the alternate textures used for things like moving water, etc. via the "blended material" approach?

// As these textures are unused and can generally be considered unsupported by the engine
Just you wait :D

// FIXME: the byte order created by the editor's convert function is RBGA, the one expected by the engine seems to be BGRA.
// As these textures are unused and can generally be considered unsupported by the engine, I won't bother with
// this issue right now.
zdr >> blue
    >> red
    >> green
    >> alpha;
Wait, what? This doesn't match either description? :?

UCyborg
Dragon
Posts: 433
Joined: Sun Jul 07, 2013 7:24 pm
Location: Slovenia

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by UCyborg »

locks the memory with Kernel32.GlobalLock

Another oddity: locking and unlocking memory has had no meaning since 32-bit Windows became a thing. It's just a relic from the 16-bit Windows days, preserved mostly for compatibility - see this.

Though the then-new DirectDraw does implement the concept of (un)locking (video) memory to directly access surface data.
"When a human being takes his life in depression, this is a natural death of spiritual causes. The modern barbarity of 'saving' the suicidal is based on a hair-raising misapprehension of the nature of existence." - Peter Wessel Zapffe

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

Even more strangely - it unlocks the memory eventually, BUT then reads the unlocked memory later on. WTF?
So it's not even consistent with itself!

EDIT: updated the OP with more info about the routines involved.

Further notes:
  • Bypassing the check at 00426EC6 causes the texture to be loaded into memory with the same bitness as in the source .BMP,
  • 24-bit textures do not appear to work - they look OK in the preview, but in the 3D view all I get is a fully transparent square (despite the texture in question having no alpha channel!),
  • Loading 32-bit textures this way causes the byte order to be totally wrong (as expected); moreover, it's not even consistent from attempt to attempt,
  • It is possible to load 32-bit textures with integrated alpha in one step (by setting FLAGS.1 = True right after parsing the .BMP data), but that causes the wrong name to be used for the alpha channel due to the memory being in an inconsistent state.
So the "safe" way to load 32-bit textures would be to:
  • Add new bitness check at 00426EC6,
  • Treat 24-bit imports as equivalent to 32-bit textures with no alpha channel (corrected for byte format),
  • In any case, the imported texture needs to have its byte format adjusted to the 32-bit BGRA format expected by the Engine,
  • For 32-bit imports, check if the texture has alpha channel data; if not, the statement above applies; if yes, perform these extra steps:
  • Call function BMP_Reader again with the same filename but with Arg5 = 1; this will import the same file as separate alphamap data (which we don't care about anyway), but more importantly it puts the required memory values into a consistent state,
  • Clear FLAGS.0 and set FLAGS.1 in the texture metadata struct, this "converts" the texture to integrated alpha (which it actually was, right from the start) without touching any of the pixel data (tested by manually hexing the right values in - works as expected),
  • Also set the value for alpha bits per pixel to 8, so that the alpha channel will work properly and the correct .DEF descriptors will be generated when the database gets saved to disk (tested by manually hexing the right values in - works as expected),
  • ???
  • Profit...?
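The two conversion steps above (expanding 24-bit data to 32-bit BGRA, then flipping the flag bits) can be sketched roughly like this in Python. To be clear, this is all assumption on my part: the function and flag names are made up, the bit 0 = separate alpha map meaning comes from the OP, and bit 1 = integrated alpha is just my reading of FLAGS.1:

```python
def bgr24_to_bgra32(pixels: bytes) -> bytes:
    """Expand tightly packed 24-bit BGR pixel data to 32-bit BGRA with a
    fully opaque alpha channel (0xFF is my choice for "no alpha"), the
    byte format the Engine reportedly expects."""
    out = bytearray()
    for i in range(0, len(pixels), 3):
        out += pixels[i:i + 3]   # copy B, G, R verbatim
        out.append(0xFF)         # append opaque alpha
    return bytes(out)

# Hypothetical flag bits matching the FLAGS.0 / FLAGS.1 naming above:
FLAG_SEPARATE_ALPHA = 0x01    # FLAGS.0 - separate alpha map (per the OP)
FLAG_INTEGRATED_ALPHA = 0x02  # FLAGS.1 - integrated alpha (my assumption)

def mark_integrated(flags: int) -> int:
    """Clear FLAGS.0 and set FLAGS.1, mirroring the "convert to
    integrated alpha" step described above."""
    return (flags & ~FLAG_SEPARATE_ALPHA) | FLAG_INTEGRATED_ALPHA
```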

Zalasus
Whelp
Posts: 18
Joined: Mon Jan 29, 2018 6:50 pm
Location: Germany

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Zalasus »

Mechanist wrote: Mon Aug 20, 2018 2:42 pm Wait, what? This doesn't match either description? :?
This whole 32-bit texture business seems cursed. The fact that the Surreal guys didn't get it right and that it took me three commits to remove all errors from 8 lines of code should be proof enough.

Fixed it now :roll: Thanks for pointing out my error.

UCyborg
Dragon
Posts: 433
Joined: Sun Jul 07, 2013 7:24 pm
Location: Slovenia

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by UCyborg »

Mechanist wrote: Even more strangely - it unlocks the memory eventually, BUT then reads the unlocked memory later on. WTF?
So it's not even consistent with itself!

This would have caused chaos if the program had ever actually been made for pre-Windows 95 systems. We were already bitten by similar programming mistakes with this game in the past. More on GlobalLock.

Mechanist wrote: 24-bit textures do not appear to work - they look OK in the preview, but in the 3D view all I get is a fully transparent square (despite the texture in question having no alpha channel!)

If I bypass the check at 0x426EC6 or just make it call that converter function, if BMP contents were full of red, they turn into full blue in 2D preview. That was the check I messed with back then, thanks for pointing it out!

Zalasus wrote: This whole 32-bit texture business seems cursed. The fact that the Surreal guys didn't get it right and that it took me three commits to remove all errors from 8 lines of code should be proof enough.

I guess they got it right in their later games. They said they fixed tons of issues with the Drakan: TAG iteration of the engine.

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

UCyborg wrote: Tue Aug 21, 2018 8:46 pm If I bypass the check at 0x426EC6 or just make it call that converter function, if BMP contents were full of red, they turn into full blue in 2D preview. That was the check I messed with back then, thanks for pointing it out!
Not quite sure we're on the same page here...
What I meant was that I bypassed checks in the Editor that cause 24-bit BMPs to be converted to 16-bit. So it took the pixel data verbatim from the .BMP file, then flipped the byte order (as it always does when importing 24-bit BMPs).

So the preview colors were OK (because of the flipped byte order), but it still wouldn't work in the 3D view; 24-bit textures appear to be simply unsupported by the Riot Engine.
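For 24-bit data, the byte-order flip amounts to swapping the first and third byte of every pixel. A minimal sketch, assuming tightly packed pixel data (the function name is mine, not the Editor's):

```python
def flip_bgr_to_rgb(pixels: bytes) -> bytes:
    """Swap the byte order of each 24-bit pixel (BGR as stored in a
    .BMP file <-> RGB), mirroring what the Editor does when importing
    24-bit BMPs. The swap is symmetric, so applying it twice returns
    the original data."""
    out = bytearray(pixels)
    for i in range(0, len(out), 3):
        out[i], out[i + 2] = out[i + 2], out[i]
    return bytes(out)
```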

UCyborg
Dragon
Posts: 433
Joined: Sun Jul 07, 2013 7:24 pm
Location: Slovenia

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by UCyborg »

Sorry, I accidentally omitted an important part of the sentence. What I wanted to say is that I tried to make it call the function after the check at 0x426EC6, but with the existing argument of 0x10 (16) changed to 0x18 (24). I guess that told it to convert the 24-bit BMP I had to 24-bit, which doesn't make any sense. :mrgreen:

User avatar
Mechanist
Dragon
Posts: 303
Joined: Wed Mar 07, 2018 7:27 pm
Location: Poland

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by Mechanist »

Actually some of these functions in the Level Editor are very well written.

E.g. the texture import codepath correctly handles importing 32-bit BMPs: it flips the byte order to BGRA without touching the alpha bytes, even though 32-bit imports are 100% unsupported.

The "texture bitness converter" function also has sanity checking for most (if not all) possible values of input and output bitness, bailing out in cases such as you just described. Crazy.
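For what it's worth, such sanity checking might look something like this. This is purely a hypothetical sketch - the names, the set of valid bit depths, and the exact checks are my guesses, not decompiled Editor code:

```python
def convert_bitness(data: bytes, src_bits: int, dst_bits: int) -> bytes:
    """Hypothetical sketch of the kind of validation the Editor's
    bitness converter apparently performs before touching pixel data:
    reject unknown bit depths and nonsensical no-op requests (such as
    the "convert 24-bit to 24-bit" case described above)."""
    valid = {8, 16, 24, 32}
    if src_bits not in valid or dst_bits not in valid:
        raise ValueError(f"unsupported bitness: {src_bits} -> {dst_bits}")
    if src_bits == dst_bits:
        # bail out instead of "converting" to the same format
        raise ValueError(f"no-op conversion requested: {src_bits} -> {dst_bits}")
    # the per-format pixel conversion itself is omitted in this sketch
    raise NotImplementedError("conversion body not sketched here")
```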

UCyborg
Dragon
Posts: 433
Joined: Sun Jul 07, 2013 7:24 pm
Location: Slovenia

Re: Some reverse-engineering of the Level Editor and TXD stuff

Post by UCyborg »

Good to know!
