[HELP] Xmem decompression error with MC360 chunk data
Posted: Sat Apr 03, 2021 5:22 pm
The data can be decompressed using the official XDK XMemDecompress API, but it fails with MSPACK_ERR_DECRUNCH when using the QuickBMS implementation on Linux or Mac, which uses libmspack's lzx_decompress with a window size of 17 and a modified read function from UEViewer.
And the file is: https://www.mediafire.com/file/5od3897o ... k.dat/file
The file does not use native compression and has a standard 0xFF header.
(libmspack impl: https://github.com/kyz/libmspack/blob/m ... ack/lzxd.c)
(UEViewer modified sys_read function https://github.com/gildor2/UEViewer/blo ... on.cpp#L90)
The script used is the following:
Code:

endian big
comtype xmemdecompress
get ZSize long
get Size long
math ZSize &= 0x7fffffff
savepos Offset  # Offset was never set in the posted script; record the position right after the two header fields
clog result Offset ZSize Size
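For reference, the two header fields the script reads can be parsed the same way in Python. This is only a sketch mirroring the script above; the meaning of the masked top bit of ZSize is an assumption on my part:

```python
import struct

def parse_header(data: bytes):
    """Parse the 8-byte header the QuickBMS script reads:
    big-endian compressed size (ZSize) and uncompressed size (Size).
    The top bit of ZSize is masked off, mirroring `math ZSize &= 0x7fffffff`;
    it presumably carries a flag (assumption), not part of the size."""
    zsize, size = struct.unpack(">II", data[:8])
    zsize &= 0x7FFFFFFF  # strip the flag bit, as in the script
    return zsize, size
```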
For clarification, all other types of files, such as savegames, decompress perfectly, but all chunk data that I've tried fails. I'm going to try to debug it, or at least identify the problematic LZX blocks, but I thought posting here would be a good idea in case someone has a clue about what might be happening.
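One way to narrow down the failing block is to walk the XMEM block headers before handing anything to lzx_decompress. A sketch, assuming the framing that UEViewer's modified read function implements (each block starts with a 2-byte big-endian compressed size, and a leading 0xFF marker byte instead means an explicit 2-byte uncompressed size precedes the compressed size, with 0x8000 as the default frame size otherwise) — treat the exact layout as an assumption:

```python
def iter_xmem_blocks(data: bytes):
    """Yield (offset, uncompressed_size, compressed_size) per block.
    Framing assumed from UEViewer's mspack read callback: a 0xFF marker
    byte means an explicit 2-byte uncompressed size follows; otherwise
    the uncompressed size defaults to the LZX frame size of 0x8000."""
    pos = 0
    while pos < len(data):
        if data[pos] == 0xFF:
            usize = int.from_bytes(data[pos + 1:pos + 3], "big")
            csize = int.from_bytes(data[pos + 3:pos + 5], "big")
            pos += 5
        else:
            usize = 0x8000
            csize = int.from_bytes(data[pos:pos + 2], "big")
            pos += 2
        if csize == 0:
            break  # zero size: padding / end of stream
        yield pos, usize, csize
        pos += csize
```

Printing the offsets and sizes from such a walk should show whether the block headers themselves go out of sync (which would point at a framing difference rather than an LZX bug).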
(Crossposting from ZenHax: https://zenhax.com/viewtopic.php?f=9&t=15062)