A recent letter presented a novel neural-network learning rule, BAR (the boundary adaptation rule), which was shown to converge to a scalar quantizer with equiprobable outputs. Such quantizers will be called maximum-entropy quantizers (MEQs). It is interesting that such a simple rule can produce these quantizers. Its practical usefulness is limited, however, by two factors: first, there are more efficient algorithms which yield better results; and second, MEQs are unsuitable for many quantization tasks, as discussed below.
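To make the notion of an MEQ concrete, the following sketch (not the BAR rule itself, whose update equations are given in the original letter) constructs an equiprobable scalar quantizer directly from empirical quantiles of the input distribution; the choice of a Gaussian source and eight levels is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(size=100_000)  # illustrative input distribution

K = 8  # number of quantizer output levels (arbitrary choice)

# A maximum-entropy quantizer places its K-1 decision boundaries at the
# (k/K)-quantiles of the input, so each output cell has probability 1/K
# and the output entropy attains its maximum, log2(K) bits.
boundaries = np.quantile(samples, np.arange(1, K) / K)

# Quantize: the index of the cell into which each sample falls.
indices = np.searchsorted(boundaries, samples)

# Each of the K cells should receive roughly 1/K of the samples.
counts = np.bincount(indices, minlength=K)
print(counts / counts.sum())
```

Note that such a quantizer maximizes output entropy but does not, in general, minimize mean-squared distortion, which is one reason MEQs are unsuitable for many quantization tasks.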
IEEE Transactions on Neural Networks, vol. 7, no. 1, pp. 254-256, Jan. 1996.