JPEG History

Up until the 1980s, regional and national telephone switches operated by large telephone companies were analogue. Managing the subscribers' telephone lines required screwdrivers and/or soldering irons. At that time the Integrated Services Digital Network (ISDN) was about to be introduced in connection with digital switches that would make the management much easier. This new technology did, however, require massive investments in communication infrastructure everywhere.

How could such investments be financed? It would be hard to ask telephone customers to pay more for the same speech quality they had known since the 1870s. The incentive for customers to reach deeper into their pockets was thus thought to be value-added services on top of plain telephone calls. One such service could be the transmission of high-quality photographic images. Very crude images were already in use in the form of teledata, but they never really caught on.

At the time a high-quality photographic image was considered to be a color still image with the same resolution as one frame on a color TV set: 720 pixels across on 576 lines with 16 bits per pixel. The 16 bits per pixel were distributed as 8 bits for the luminance known from black-and-white TV and 8 bits for each of the two chrominance (color) components. The 16 bits for color were, however, shared by two neighbouring pixels, yielding on average 8 bits of color per pixel. In total one such image, consisting of 414,720 pixels, therefore contained 829,440 bytes of data.

Subscribers to an ISDN telephone line would get two data channels of 64 kbit/s and one signalling channel of 16 kbit/s. Only the data channels could be used for transmitting digital sound and photographic images. Using one of these channels to download an image with the resolution described above would take more than 100 seconds. If you stopped talking during the download, both channels could in principle be used, bringing the download time down to just under one minute. That was not considered terribly attractive.

It was clear that for such a service to be a success, the download time would have to be brought down to just a few seconds. This would, however, require that the amount of data in the images be reduced by a factor of 20 or more. In other words - the images would have to be compressed. Furthermore, it had to be possible to decompress the data stream in real time as it arrived over an ISDN line.
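A back-of-envelope calculation makes these figures concrete. The following minimal Python sketch uses only the numbers quoted above; the five-second download target is an assumption standing in for "a few seconds".

    # Raw size of one TV-resolution image, as described above.
    width, height = 720, 576            # pixels across x lines
    bits_per_pixel = 16                 # 8 bits luminance + 8 bits shared chrominance
    raw_bits = width * height * bits_per_pixel
    print(raw_bits // 8, "bytes")       # 829440 bytes

    # Download times over ISDN data channels.
    channel = 64_000                    # one ISDN data channel, in bit/s
    print(raw_bits / channel)           # about 104 s on one channel
    print(raw_bits / (2 * channel))     # about 52 s using both channels

    # Compression factor needed to reach a few seconds (assumed here to be 5 s).
    target_seconds = 5
    print(raw_bits / (channel * target_seconds))   # about 21, i.e. a factor of 20 or more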

A number of large telecommunication companies across Europe decided to cooperate on developing methods of image compression to do the job. This joint research project was in part financed by the Commission of the European Communities, a precursor of the present European Union, as part of its European Strategic Program on Research in Information Technology (ESPRIT). The project - number 563 under ESPRIT - was named Photovideotex Image Compression Algorithm, or PICA for short.

Research on image compression had been done earlier, but without much success. Different approaches had been investigated - e.g. vector quantization and transform coding using various transforms. At the outset of the project the various methods were distributed among the PICA partners for possible refinement. It quickly became clear, however, that transform coding looked most promising.

The optimal transform is the Karhunen-Loève Transform (KLT). The KLT is, however, computationally intensive, far more than could realistically be handled by the computers available in the late 1980s. Various other transforms were investigated, but the Discrete Cosine Transform (DCT) was clearly the one closest to the KLT in concentrating the image energy.

The DCT converts pixel values to amplitudes of two-dimensional cosine functions. Instead of storing and transmitting the pixel values, the amplitudes are stored and transmitted. Whereas all pixels are equally important, that is not the case for the amplitudes. The low-frequency cosine functions are by far the most important, whereas the high-frequency functions represent details in the image that are much harder for the human eye to distinguish. Very often the amplitudes of these high-frequency functions can be discarded altogether without ruining the visual appearance of the image. In some images, however, the details are so prominent that they must be retained. The way to decide which amplitudes to retain and which to throw away is the process of quantization. After quantization, an all-important entropy coding of the quantized amplitudes takes place to reduce the amount of data to be stored and transmitted.

The three-step compression process DCT => Quantization => Entropy coding ensures that every image going through the process comes out with the same quality. Simple images without many details are compressed a lot, while images with lots of visually important details are compressed less. The algorithm thus adapts to the image in question and was therefore named Adaptive Discrete Cosine Transform, or ADCT for short.
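To make the three steps concrete, here is a minimal illustrative sketch in Python (using NumPy) of the DCT and quantization steps on a single 8x8 block. The uniform quantization step and the smooth test block are assumptions chosen for illustration; the entropy-coding step is only hinted at by counting the zero-valued amplitudes it would exploit.

    import numpy as np

    N = 8
    # Orthonormal 8x8 DCT-II matrix: C @ block @ C.T gives the two-dimensional DCT.
    k = np.arange(N)
    C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
    C[0, :] = np.sqrt(1.0 / N)

    def compress_block(block, q):
        """DCT followed by quantization of one 8x8 block of pixel values."""
        amplitudes = C @ (block - 128.0) @ C.T       # amplitudes of the 2-D cosine functions
        return np.round(amplitudes / q).astype(int)  # quantization: small amplitudes become 0

    def decompress_block(quantized, q):
        """Dequantization followed by the inverse DCT."""
        return C.T @ (quantized * q) @ C + 128.0

    # A coarse uniform quantization step, chosen for illustration; JPEG instead uses an
    # 8x8 table with larger steps for the less visible high-frequency amplitudes.
    q = 16
    block = np.tile(np.linspace(50, 200, N), (N, 1))    # a smooth 8x8 test block
    quantized = compress_block(block, q)
    print(np.count_nonzero(quantized), "of 64 amplitudes remain after quantization")
    # The many zero amplitudes are what the subsequent entropy coding exploits.
    print(np.abs(decompress_block(quantized, q) - block).max())  # worst-case error in grey levels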

While the PICA project was working on image compression primarily for telecommunication purposes, ISO and CCITT became interested in standardizing image compression in general. Research labs from around the world were invited to present their algorithms to ISO/CCITT. Ten image compression algorithms were presented, including the ADCT from Europe. At a selection meeting in Copenhagen in June 1987 the three most promising algorithms were selected for further improvement: the Adaptive Binary Arithmetic Coding (ABAC) from IBM, the Block Separated Progressive Coding (BSPC) from Japan, and the ADCT from Europe.

During the next six months the teams behind the three algorithms worked very hard to improve their methods. Five standard images were distributed to the teams, and it was decided that each team should compress these images from 16 bits per pixel (bpp) down to 2.25 bpp, 0.75 bpp, 0.25 bpp and 0.08 bpp. To make sure that the algorithms were not tuned to the specific standard images, computer code with the algorithms was sent to ISO before the teams received the images.
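Assuming the TV-frame resolution quoted earlier, these target bit rates translate into the following compressed sizes; the short Python sketch below works them out.

    pixels = 720 * 576                       # 414720 pixels per image (assumed resolution)
    for bpp in (2.25, 0.75, 0.25, 0.08):
        size_bytes = pixels * bpp / 8
        print(f"{bpp:5.2f} bpp -> {size_bytes:9.0f} bytes "
              f"({100 * bpp / 16:.1f}% of the original 16 bpp)")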

Not only did the teams have to compress the images to the agreed sizes; they also had to demonstrate that the compressed data streams could be decompressed on a 25 MHz IBM PC in real time as they arrived over a 64 kbit/s communication line.

In a final selection meeting in Copenhagen in January 1988, subjective blind testing of the standard images compressed at the different compression levels took place.


[Photo: Blind test and final results, January 1988, Copenhagen]
It turned out that the ADCT performed best at all compression levels, with excellent quality at 0.75 bpp (less than 5% of the original amount of data) and results indistinguishable from the original at 2.25 bpp. Furthermore, the ADCT was the only algorithm shown to be decompressible in real time as required. It was therefore decided that the Adaptive Discrete Cosine Transform algorithm should form the basis for the coming international standard for image compression - JPEG.

Following the meeting in January 1988 a lot of work took place in defining and developing in detail the other modes of JPEG, such as the progressive, hierarchical and lossless modes.

During this work the standard was drafted and sent out for testing - was it implementable and unambiguous?

Finally in 1994 the standard was officially released:

ISO/IEC 10918-1:1994
Information technology — Digital compression and coding of continuous-tone still images: Requirements and guidelines

We have seen literally thousands of standards: bad ones, good ones, but only a very few that one may call a “Standardization Miracle” because of their overwhelming market dominance. Some web standards, such as ECMAScript (JavaScript) for web scripting or HTML, belong to that category. The first JPEG standard (informally JPEG-1, formally ITU-T T.81 | ISO/IEC 10918-1) belongs to those very few as well.