Media Quality Tests
12. Measuring the quality of recorded media
Review Pages
2. View Page Description
3. Submission of Tests
4. Submit Tests detailled description
5. List of Tests conducted by a particular reader
6. Empty
7. An example of viewing a test
8. An example of submitting a test
9. Security constraints
10. Csv file format
11. Some suggestions for the proper submission
12. Measuring the quality of recorded media
13. Frequently Asked Questions
14. Glossary of Terms
15. Programming decisions
16. APPENDIX 1. UmDoctor Pro II
17. APPENDIX 2. KProbe
18. APPENDIX 3. CdSpeed
19. APPENDIX 4. PlexTools Professional
Measuring the quality of recorded media
General Considerations
There are many factors that influence the quality of a particular medium recorded on a particular recorder drive, and there are also many ways to identify and quantify these factors.
Different media are based on different chemicals, and differences in the manufacturing process followed during the making and packaging of a disc can largely influence the resulting media quality when the discs are tested after recording. For example, the so-called yellow dyes (phthalocyanine) used by some manufacturers require less laser power for recording, and consequently a different recording strategy on the part of the recording drive’s firmware, for obtaining optimum quality. On the other hand, discs based on the same chemicals, when produced by different manufacturers, have been found to produce large deviations in quality. (We won’t name any of them here, but it is a good idea for readers to experiment for themselves on the quality tests page! :)
Even drives from the same manufacturer, of the same model and firmware, may produce different quality measurements. This is not strange in itself; all sectors of traditional industry have learned to live with such discrepancies for nearly a hundred years now, and have long relied on statistics for dealing with these phenomena. A large number of samples, with "quality" measurements averaged over the number of units produced, offers an acceptable measure of quality in most cases.
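The statistical approach just described amounts to simple descriptive statistics over a batch of sample discs. The sketch below illustrates the idea with hypothetical per-disc average C1 error counts (the values are invented for illustration, not real measurements):

```python
import statistics

# Hypothetical average C1 errors/second measured on ten discs
# from the same batch, recorded on the same drive.
c1_averages = [1.8, 2.3, 1.5, 2.9, 2.1, 1.7, 3.2, 2.0, 2.4, 1.9]

# The batch mean gives an acceptable overall quality figure;
# the standard deviation shows the unit-to-unit discrepancy
# that industry has "learned to live with".
mean_c1 = statistics.mean(c1_averages)
stdev_c1 = statistics.stdev(c1_averages)

print(f"mean C1/s: {mean_c1:.2f}, std dev: {stdev_c1:.2f}")
```

With more samples, the mean stabilizes and becomes a fairer basis for comparing one batch of media (or one drive) against another.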
There are many indicators of quality when dealing with CDs and DVDs in the lab. These are based on both mechanical and optical measurements of the surface of a recorded or pressed disc. Even unrecorded discs can be measured with specific tests to identify imperfections in the ATIP or in the disc geometry itself. All these measurements are of some importance to industry engineers for locating manufacturing process flaws, or when working towards disc and recorder improvements in research and development groups.
Most of the above measures are "analogue" in nature: they offer some sort of numeric value tracking changes in a geometric, optical, or otherwise physical property of the medium, the recorder, or some other aspect of a process. An example of such a measure is the focus/tracking deviation of the spiral of a disc with respect to the laser diode beam. These measures are of particular importance for drive and production machine calibration, and for developing better encoding and recording algorithms and strategies, but in the end they seem to be of lesser importance to end-user requirements. This might sound like a bold statement, but actually it is not. Please read on.
The ultimate measure of "quality" for a particular user is whether or not a particular disc is playable on his or her player under most circumstances. Consequently, when it comes to pure user understanding of quality, what becomes of prominent importance is not some analogue measure in general, but the number of reading errors the drive encounters per second during playback.
These errors are either of minor importance to overall disc readability, or they constitute a major and thus decisive factor of reading quality. In the case of compact discs, the former kind are defined as C1 errors and are only indicative of the quality of the disc. The latter kind are called C2 errors and, in the case of audio CDs, when encountered in sufficient consecutive numbers, will most probably result in audible clicks during playback. When encountered in large quantities over a very small amount of time, they will most probably be heard as glitches, or not heard at all (muted music due to the consumer player’s built-in algorithms).
This outcome depends on the type of player and the playback method. For example, consumer players embody better C2 error-concealment algorithms than PC drives. PC drives, on the other hand, bypass the built-in error concealment/smoothing algorithms and produce audible glitches when playback is done digitally (ripping, or digital extraction to an external D/A decoder/booster or an included sound card)!
As far as data CDs (and PC backup admins) are concerned, there is also a third layer of error correction, applied only to data discs. The errors produced at this level are characterized as L3 (layer-3) errors. Encountering such an error even once per disc usually means that at least one backup file is corrupted.
In the case of our tests here, we do not have to deal with such errors at all. We already consider the existence of even isolated C2 errors a major event, and the absence of errors of this type already eliminates any possibility of L3 errors.
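The acceptance criterion described here (a single C2 error already disqualifies a disc, while C1 errors are merely indicative) can be expressed as a small helper. The function name and the per-second sample data below are hypothetical, chosen only to illustrate the rule:

```python
def disc_passes(c2_per_second):
    """A disc is acceptable only when the scan shows no C2 errors at all.

    C1 errors are not checked: they only indicate relative quality
    and do not disqualify a disc on their own.
    """
    return all(c2 == 0 for c2 in c2_per_second)

# Hypothetical scans: per-second C2 error counts.
clean_scan = [0, 0, 0, 0, 0]
bad_scan = [0, 0, 1, 0, 0]  # a single C2 error already fails the disc

print(disc_passes(clean_scan))  # True
print(disc_passes(bad_scan))    # False
```

Since a disc with zero C2 errors cannot produce L3 errors either, this one check is sufficient for the tests discussed here.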
Implementation details
In the case of our tests, we restrict ourselves to only those types of tests that are easily conducted by the average user (or, at least, by those users somewhat more technically inclined). This was the first requirement on our design board. Secondly, a number of other factors influenced our decisions and practices: each test should be technically well recognized, widely accepted, and easily implementable with products offered on the market at reasonable prices. Since most professional equipment is out of the reach of the everyday user’s budget, we restricted our tests to only those offered by some program developers on market drives.
(We can very easily include professional tests if we decide to do so, and we have already included many tests performed during our regular drive reviews using our own expert equipment. In the future, all tests and reviews of new hardware will also appear in this "Media Quality Tests" section.)
Having expressed the above requirements, we must state that, thus far, we are aware of only four such programs: PlexTools Professional, CdSpeed, UMDoctor and KProbe. It is important to point out that not all of these programs work on every available drive. In particular, PlexTools Professional works only with the Plextor Premium (523252), CdSpeed works with drives based on the Plextor Premium chipset and the MediaTek chipsets, UMDoctor requires a Sanyo chipset with an accompanying firmware, and KProbe is restricted to drives based on the MediaTek chipset and firmwares.
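The program-to-chipset constraints listed above can be summarized in a small lookup table. This is only a sketch of the compatibility rules as stated in this section; the table and the helper function are our own illustration, not part of any of the four programs:

```python
# Compatibility as described in the text: each program works only
# with drives based on certain chipsets.
COMPATIBILITY = {
    "PlexTools Professional": ["Plextor Premium"],
    "CdSpeed": ["Plextor Premium", "MediaTek"],
    "UMDoctor": ["Sanyo"],
    "KProbe": ["MediaTek"],
}

def programs_for(chipset):
    """Return the test programs usable with a drive of the given chipset."""
    return sorted(p for p, chips in COMPATIBILITY.items() if chipset in chips)

print(programs_for("MediaTek"))  # ['CdSpeed', 'KProbe']
```

A reader can use such a table to check, before buying a drive, which of the quality-test programs it could run.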
We have thus restricted our prospective reader drives to only a few dozen CD and DVD readers. Still, we have left open the possibility of using an arbitrary recorder. This is certainly good news, as any user can find out the best disc for his recorder, although he will not be able to submit any tests himself!