Novel Locally Adaptive Method for Compression of Visual Data

Lina Karam (Inventor)

Research output: Patent


Driven by a growing demand for transmission and storage of visual data over media with limited capacity, increasing efforts have been made to improve compression techniques for visual information. Most of the existing methods for image coding are designed to minimize distortion. However, these methods fail to guarantee preservation of good perceptual quality in the reconstructed images and may result in visually annoying artifacts.

True perceptual quantization requires computing and making use of image-dependent, locally-varying masking thresholds. However, the main problem in using a locally-adaptive perceptual quantization strategy is that these locally-varying masking thresholds are needed both for encoding and decoding. This, in turn, would require sending or storing a large amount of side information, resulting in a significant increase in bit rate. Existing and recently developed "perceptual-based" compression methods attempt to avoid this problem by giving up or significantly restricting the local adaptation. These methods fail to exploit the large dynamic range of the available masking, resulting in over-coding of some image components or unnecessary visible artifacts.

Researchers at Arizona State University have developed a quantization scheme for visual data that does not require any additional side information. It uses a relatively simple perceptual model and exploits characteristics of perceptual masking for natural images in estimating the available masking. This estimation is based on already-quantized data, so the decoder can reproduce it exactly without receiving the thresholds. It has been demonstrated that this method results in superior performance compared to non-locally-adaptive schemes, which do not fully adapt to local changes in available masking.
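The key idea above, deriving the locally-varying quantization step from data both sides already share, can be sketched as follows. This is a minimal illustration, not the patented method: the masking estimate here is simply the standard deviation of a causal neighbourhood of already-reconstructed samples (a stand-in for a real perceptual model), and the names `masking_step`, `encode`, `decode`, `base_step`, and `gain` are all hypothetical. Because the step is computed only from previously quantized values, the decoder repeats the identical computation and no side information is transmitted.

```python
import numpy as np

def masking_step(recon, i, j, base_step=8.0, gain=0.5):
    # Estimate available masking from already-reconstructed (causal)
    # neighbours only: the row above and the pixel to the left.
    # Activity is measured as the std-dev of those samples; busier
    # regions tolerate a coarser quantization step.
    above = recon[max(i - 1, 0):i, max(j - 1, 0):j + 2]
    left = recon[i:i + 1, max(j - 1, 0):j]
    vals = np.concatenate([above.ravel(), left.ravel()])
    if vals.size == 0:
        return base_step  # no causal context yet (top-left corner)
    return base_step * (1.0 + gain * np.std(vals) / 128.0)

def encode(image, base_step=8.0):
    # Quantize in raster order, maintaining the same reconstruction
    # the decoder will build, so both sides derive identical steps.
    h, w = image.shape
    indices = np.zeros((h, w), dtype=np.int64)
    recon = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            step = masking_step(recon, i, j, base_step)
            q = int(round(image[i, j] / step))
            indices[i, j] = q
            recon[i, j] = q * step  # decoder-side reconstruction
    return indices

def decode(indices, base_step=8.0):
    # Mirror the encoder: recompute each step from the reconstruction
    # built so far; no thresholds are read from the bitstream.
    h, w = indices.shape
    recon = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            step = masking_step(recon, i, j, base_step)
            recon[i, j] = indices[i, j] * step
    return recon
```

Because `encode` updates `recon` with the same quantized values `decode` will produce, the two loops stay in lockstep, which is the property that removes the need to signal the locally-varying thresholds.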
Original language: English (US)
State: Published - Jan 1 1900


