Motor Daddy
Valued Senior Member
Suppose I have a bunch of different FLAC files, all having 44,100 Hz sample rate, and 16 bits per sample, for each channel (2 channels).
So 44,100 x 16 x 2 = 1,411,200 bits per second.
The problem is that different 44,100 Hz / 16-bit files have different bitrates, even though they all claim to be 44,100 Hz at 16 bits per sample.
So it isn't true that FLAC is lossless, and it is not bit perfect. What gives? How can they claim lossless when it is clearly lossy?
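For reference, here is a minimal sketch of the arithmetic above, plus a hypothetical illustration of how a file's on-disk bitrate is usually computed (the 3-minute duration and the file size are made-up numbers, not from any real file):

```python
# Raw (uncompressed) PCM bitrate for CD-quality stereo audio.
sample_rate_hz = 44_100    # samples per second, per channel
bits_per_sample = 16
channels = 2

raw_bitrate = sample_rate_hz * bits_per_sample * channels
print(raw_bitrate)  # 1411200 bits per second

# Hypothetical example: the "bitrate" a player reports for a file
# is typically just (file size in bits) / (duration in seconds),
# so two files with identical sample rate and bit depth can still
# report different numbers if their on-disk sizes differ.
duration_s = 180                # assumed 3-minute track
file_size_bytes = 20_000_000    # assumed on-disk size
reported_bitrate = file_size_bytes * 8 / duration_s
print(round(reported_bitrate))
```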