I have a digital recording damaged by an incorrect codec connection. It seems like some butthead attached a 32-bit PCM output to the 16-bit input of an AAC codec, or something like that.
The sine wave in the samples is still pretty clean, with the inversion points clearly visible even after the DCT.
Unfortunately, the inversion (a bit-depth overflow wraparound) is itself neither clipping nor silence.
I need some kind of (maybe algebraic) filter that correlates the inverted-back samples with the expected signal and applies the back-transform to those samples or regions where necessary; a rough sketch of what I mean is below.
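To make the idea concrete, here is a rough Python/NumPy sketch of the kind of "unwrap" filter I have in mind. The function name, threshold, and full-scale value are just placeholders I made up, and it assumes the true waveform moves slowly compared to the 16-bit range, so a near-full-scale jump between neighbouring samples marks a wraparound:

```python
import numpy as np

def unwrap_overflow(samples, full_scale=65536, jump_threshold=0.6):
    """Undo 16-bit wraparound overflow by tracking large sample-to-sample jumps.

    samples        -- 1-D array of the damaged signal, in raw 16-bit units
    full_scale     -- wrap period (2**16 for 16-bit integers)
    jump_threshold -- fraction of full_scale a jump must exceed to count as a wrap
    """
    out = samples.astype(np.float64).copy()
    offset = 0.0
    for i in range(1, len(out)):
        step = float(samples[i]) - float(samples[i - 1])
        if step < -jump_threshold * full_scale:
            # Large negative jump: the original value wrapped downward, shift it back up.
            offset += full_scale
        elif step > jump_threshold * full_scale:
            # Large positive jump: the original value wrapped upward, shift it back down.
            offset -= full_scale
        out[i] = samples[i] + offset
    return out
```

The result would then need to be rescaled back into a legal range before writing it out again, and a real tool would have to handle repeated wraps and isolated damaged regions more carefully than this toy loop does.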
Is it possible in Adobe software?