color depth support



DarkDragon
01-04-2017, 03:28 AM
ZC has command-line flags that let you choose the color depth. How exactly does this work? All of ZC's (heavy) use of palettes assumes an 8-bit color depth, right?

If I try to set the color depth to anything other than 8 (e.g., -32bit for my desktop's native 32-bit depth), I get artifacts and crashes. Are these flags actually supposed to work?

Gleeok
01-04-2017, 05:26 AM
They are, in fact, supposed to maybe or maybe not work, or do different and unexpected things on different platforms (On Windows XP they actually work minus the wrong GUI colors), and that's why they are undocumented and no one knows they exist. :P

I can't remember exactly why; just that it was the Windows 7 era and there were multiple gfx issues no one knew how to fix.

Allegro actually converts all the 8-bit graphics to 32-bit internally before displaying them.
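
For what it's worth, here's a minimal Allegro 4 sketch (illustrative, not ZC's actual code) of that kind of conversion: the game renders into an 8-bit memory bitmap, and blit() converts it to the 32-bit screen format on the way out, translating indices through the current palette.

#include <allegro.h>

int main()
{
    allegro_init();
    install_keyboard();

    set_color_depth(32);                                  // 32-bit display mode
    set_gfx_mode(GFX_AUTODETECT_WINDOWED, 640, 480, 0, 0);
    set_palette(default_palette);                         // palette used for 8-bit sources

    BITMAP* backbuf = create_bitmap_ex(8, 640, 480);      // 8-bit, paletted
    clear_to_color(backbuf, 0);
    /* ... game rendering into backbuf using palette indices ... */

    // blit() between bitmaps of different color depths converts pixel
    // formats automatically, looking each index up in the palette.
    blit(backbuf, screen, 0, 0, 0, 0, 640, 480);

    readkey();
    destroy_bitmap(backbuf);
    return 0;
}
END_OF_MAIN()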

ZoriaRPG
01-04-2017, 09:32 AM
They are, in fact, supposed to maybe or maybe not work, or do different and unexpected things on different platforms (On Windows XP they actually work minus the wrong GUI colors), and that's why they are undocumented and no one knows they exist. :P

I can't remember exactly why; just that it was the Windows 7 era and there were multiple gfx issues no one knew how to fix.

Allegro actually converts all the 8-bit graphics to 32-bit internally before displaying them.

I thought that they may've been something for Linux, which is what I mentioned to DD on Skype. If they truly do nothing, we can rid ourselves of them. Until we support native 24-bit color or higher, they're superfluous.

SUCCESSOR
01-04-2017, 09:53 PM
I always wondered what it would take to make ZC work in higher color depths. Is that a major-overhaul sort of deal? I know nothing about the workings of Allegro.

DarkDragon
01-04-2017, 10:06 PM
Oh yes, it's a major overhaul ;)

In short, ZC currently makes heavy use of color palettes to implement CSets, some of the animations, etc. The palettes are implemented as true bitmap palettes: each bitmap is stored using 8-bit colors where each "color" is an index into the palette.

Of course in true color each pixel stores an R, G, B value representing its color, rather than an index into a color palette. This means that to implement CSets in true color, you either need to:
- precompute several different true-color versions of each sprite, one per CSet, and swap between them as its CSet changes; or
- write code to recolor sprites on the fly as the CSet changes.

It's doable, but it's not a minor change.
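
Here's a rough, hypothetical sketch of the second option. The names are invented, and I'm assuming each CSet is a 16-color bank inside a 256-entry master palette, purely for illustration:

#include <array>
#include <cstdint>
#include <vector>

struct RGBA { std::uint8_t r, g, b, a; };

// 256-entry master palette; CSet n occupies entries [n*16, n*16 + 16).
using Palette = std::array<RGBA, 256>;

// Recolor an 8-bit sprite on the fly for a given CSet (assumed 0-15).
// Each source pixel holds a 0-15 index local to its CSet.
std::vector<RGBA> recolor_sprite(const std::vector<std::uint8_t>& pixels,
                                 const Palette& pal, int cset)
{
    std::vector<RGBA> out;
    out.reserve(pixels.size());
    for (std::uint8_t p : pixels)
        out.push_back(pal[cset * 16 + (p & 0x0F)]); // truecolor lookup
    return out;
}

The first option is the same lookup done once per CSet ahead of time, trading memory for the per-frame conversion cost.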

SUCCESSOR
01-05-2017, 12:45 AM
That's about what I figured for true color. I'm guessing there's no 16-bit implementation of palettes. ZC is the only software I've used that uses palettes. Guess I should have been into game development in the late '90s.

Gleeok
01-05-2017, 02:25 AM
Palettes shouldn't be too hard (in theory) to move to glsl (you jimmy-jack and flatten the ZC palette map into a 256xN 2D RGBA texture, then send down vertex colors as some kind of offset data for the single-channel sprite texture, whose values are indices into the actual color map, etc.). It's probably just the wonkiness of the entire CSet system ZC uses that's sure to give some headaches to whoever has to rewrite the palette stuff.


[edit] I just realized how funny that sounds starting the sentence with "Palettes shouldn't be too hard to move to glsl." :P
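
Concretely, the lookup Gleeok describes might look something like the fragment shader below. This is a sketch under my own assumptions: indexTex, paletteTex, and paletteRow are invented names, and the ZC palette map is assumed flattened into a 256 x N RGBA texture with one row per palette state.

// Shader source embedded in C++, as it might ship alongside the engine code.
static const char* kPaletteFragShader = R"glsl(
    uniform sampler2D indexTex;   // sprite pixels: palette indices in the red channel
    uniform sampler2D paletteTex; // 256 x N flattened palette map
    uniform float paletteRow;     // which of the N palette rows to sample (0..1)
    varying vec2 texCoord;

    void main()
    {
        // The 8-bit index arrives normalized to 0..1; rescale to 0..255
        // and sample the center of the matching palette texel.
        float index = texture2D(indexTex, texCoord).r * 255.0;
        gl_FragColor = texture2D(paletteTex,
                                 vec2((index + 0.5) / 256.0, paletteRow));
    }
)glsl";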

SUCCESSOR
01-05-2017, 09:14 AM
I have quite a bit of experience moving pallets.