I've been using 16-bit palettes with gsKit and had no problems; PCSX2 also displayed the images just fine. Just note that with 16-bit palettes, if you scale anything that is bilinear filtered, the transparency can sometimes look a little odd (depending on the parameters).
Originally Posted by SP193
Anyway, just set gsGlobal->Test->AREF, gsGlobal->Test->ATST and gsGlobal->Test->AFAIL, then invoke gsKit_set_texa and gsKit_set_primalpha with the parameters you want. I've been using the values that simulated PSX-style palettes and transparency effects most accurately; you will most likely need something else.
As for the clut memory: if the texture is 4-bit, just copy 32 bytes to the clut buffer. If the texture is 8-bit and you have a clut consisting of 256 consecutive 16-bit entries, then copy them in the following order:
... or, equivalently, simply swap the middle two 16-byte blocks of every 64 bytes:
int i;
for (i = 0; i < 8; i++) {
    memcpy(&clbuffer[i*4*16 + 0*16], &clsrc[i*4*16 + 0*16], 16);
    memcpy(&clbuffer[i*4*16 + 1*16], &clsrc[i*4*16 + 2*16], 16);
    memcpy(&clbuffer[i*4*16 + 2*16], &clsrc[i*4*16 + 1*16], 16);
    memcpy(&clbuffer[i*4*16 + 3*16], &clsrc[i*4*16 + 3*16], 16);
}
Then just send clbuffer to VRAM as GS_PSM_CT16 (a 16x16 transfer for the 8-bit clut, 8x2 for the 4-bit one).
PS. I'm writing this from memory, but it seems right.
Don't all the glyphs share the exact same clut? If that's the case, then why worry whether the clut is 32-bit or 16-bit? You'd be saving 512 bytes at most... not really a big deal (unless I'm wrong and each glyph uses its own, different palette - I've never actually used fontm at all).