On the CELT front
Jul. 30th, 2010 07:11 pm

I mentioned in my previous post that much technical work was done while at the IETF meeting. First, it's always good to have other people looking at your code, and meeting face to face is the best way to actually explain your code to others. The first thing that happened while Tim was looking at my code was that he found much simpler (closed-form) ways to compute probability distributions I had been computing iteratively. The next thing that happened was that, while I was trying to explain some bit allocation detail to him, I just couldn't figure out why there was a division by two in the bit allocation of the band split. The explanation was simple: we just shouldn't be dividing by two. That resulted in an easy (though small) increase in quality.
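To give a flavour of the kind of simplification this is, here's a toy example (a truncated geometric distribution, not the actual distribution used in CELT) where an iteratively accumulated normaliser collapses into a closed form:

#include <math.h>
#include <stdio.h>

/* Hypothetical example: probability of symbol k under a truncated
   geometric distribution with ratio r over n symbols.  This is NOT
   the distribution CELT uses; it only illustrates how an iterative
   accumulation can be replaced by a closed-form expression. */

/* Iterative version: accumulate the normaliser term by term. */
static double prob_iterative(int k, int n, double r)
{
   double norm = 0, p = 1;
   int i;
   for (i = 0; i < n; i++)
   {
      norm += p;
      p *= r;
   }
   return pow(r, k)/norm;
}

/* Closed-form version: the normaliser is a geometric series,
   sum_{i=0}^{n-1} r^i = (1 - r^n)/(1 - r). */
static double prob_closed_form(int k, int n, double r)
{
   return pow(r, k)*(1 - r)/(1 - pow(r, n));
}

int main(void)
{
   int k;
   for (k = 0; k < 8; k++)
      printf("%d: %f %f\n", k, prob_iterative(k, 8, 0.7),
             prob_closed_form(k, 8, 0.7));
   return 0;
}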
Another CELT-related topic we were finally able to investigate in more depth is the allocation of bits between the fine energy (gain) and the PVQ codebook (shape). There was a mismatch between the code and the theoretical analysis we had. After running the actual calculations on (Laplacian) random data, Tim found that the results matched the theory almost perfectly. The only problem is that PQevalAudio (an objective quality measurement tool) disagrees with the theory about what the optimal allocation is, and it's very hard to tell which one is really optimal just by listening, so this is still not fully resolved.
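For the curious, the kind of calculation involved looks roughly like the sketch below: draw Laplacian-distributed band data and split it into a gain and a unit-norm shape, which is what the fine energy bits and the PVQ codebook respectively have to encode. This is only a minimal sketch of the setup (the band size and the unit Laplacian scale are arbitrary), not Tim's actual analysis code.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define N 16  /* band size, arbitrary for this sketch */

/* Draw one sample from a unit-scale Laplacian via the inverse CDF. */
static double laplacian(void)
{
   double u = (rand()+1.0)/(RAND_MAX+2.0) - 0.5;
   return (u < 0 ? 1 : -1)*log(1 - 2*fabs(u));
}

int main(void)
{
   double x[N], shape[N], gain = 0;
   int i;
   /* Generate one random band and split it into gain (energy) and
      shape (unit-norm vector): the gain is what the fine energy bits
      refine, the shape is what the PVQ codebook encodes. */
   for (i = 0; i < N; i++)
   {
      x[i] = laplacian();
      gain += x[i]*x[i];
   }
   gain = sqrt(gain);
   for (i = 0; i < N; i++)
      shape[i] = x[i]/gain;
   printf("gain = %f\n", gain);
   /* A real experiment would now quantise the gain with g bits and the
      shape with the remaining bits, then measure the distortion for
      each way of splitting the bit budget. */
   return 0;
}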
The last thing we worked on with Tim that's still ongoing is optimising the pdfs used by the range coder for coarse energy encoding. There may be a few bits to be saved there, so it's worth trying.
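To give an idea of what "optimising the pdfs" means in practice, here's a hypothetical sketch: gather a histogram of the symbols the coarse energy coder actually produces and turn it into the kind of cumulative frequency table a range coder works from. The symbol count, table total and helper names are made up for the example; this is not CELT's actual training code.

#include <stdio.h>

#define NSYMS 32     /* number of coarse energy symbols (arbitrary here) */
#define TOTAL 256    /* total frequency count the coder expects (assumed) */

/* Turn raw symbol counts into a cumulative frequency table, giving every
   symbol a count of at least 1 so it always remains codable. */
static void counts_to_cdf(const unsigned counts[NSYMS], unsigned cdf[NSYMS+1])
{
   unsigned long long sum = 0;
   unsigned i;
   for (i = 0; i < NSYMS; i++)
      sum += counts[i];
   cdf[0] = 0;
   for (i = 0; i < NSYMS; i++)
   {
      /* One count reserved per symbol, the rest shared in proportion
         to how often the symbol was observed in the training data. */
      unsigned freq = 1;
      if (sum > 0)
         freq += (unsigned)((unsigned long long)counts[i]*(TOTAL - NSYMS)/sum);
      cdf[i+1] = cdf[i] + freq;
   }
   /* A real implementation would also spread the leftover
      TOTAL - cdf[NSYMS] counts over the most probable symbols. */
}

int main(void)
{
   /* Toy residual histogram, peaked at small values (made up for the example). */
   unsigned counts[NSYMS] = {90, 60, 40, 25, 15, 10, 6, 4, 2, 1};
   unsigned cdf[NSYMS+1];
   unsigned i;
   counts_to_cdf(counts, cdf);
   for (i = 0; i <= NSYMS; i++)
      printf("%u ", cdf[i]);
   printf("\n");
   return 0;
}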