DVD Media Format Compatibility Tests
6. Test Results - Conclusion
Compatibility Measures
Having defined the parameters of our tests and declared the testing conditions and methodology as far as writing is concerned, we had to determine the relevant steps during playback. Our experience has shown that the compatibility of a recorded disc has many aspects. Unfortunately, each of these corresponds one-to-one to a degree of anger the user feels when problematic playback is encountered :(
We could easily have defined a black/white condition: either a disc plays on a unit as expected or it does not. We felt we had to be more descriptive. We decided to define in detail the problems we encountered, trying to develop a systematic and uniform way of writing them down. We could then rate the severity of each error we encountered and turn it into a meaningful number. It is very important to point out that the methodology we followed for rating the compatibility issues is by no means restrictive or cumbersome. As we explain later in this section, we could still offer black/white assessments or proceed to more deliberate estimations, the path we chose to follow in our description here.
Player/Drive disc recognition
Our testers always cheered when encountering a disc that could not be recognized by a particular player. Indeed, they had one less test to carry out; they would finish their "job" earlier and much more easily. In this case there is little to be done except to try again. If the problem persisted, we had to shut down the drive and start all over again; we had to be 100% sure that what we had found was actually correct. In some cases the players seemed to "hang", and a reboot was again necessary. We encountered a very few cases where a disc could be recognized on the 5th or even the 10th attempt. In these cases we repeated the whole recognition procedure many more times until we reached a "statistically" sound result. If recognition was that difficult, it was most probable that the casual drive and disc user would have quit already. Even if he had not already thrown away the drive, he would certainly have done so with the disc itself!
So, whenever our disc recognition tests report a disc as not recognized, one of the previously described cases applies.
Playback quality test
We tested the playback of each disc at prescribed points: one near the start of the disc, one in the middle and one at the end. At each of these points, the disc was allowed to play for a maximum of 90 seconds. If the disc played OK, we moved on to the next point. In many cases there were visual artifacts, playback skipping or even drive "hangs". In these cases the tests were repeated in order to be absolutely sure about the problem encountered. In some cases it was just an MPEG-2 incompatibility of the player with respect to the video/audio stream we used. If a problem emerged only once and we were not able to subsequently reproduce it, it was not considered a problem at all.
The reader should be aware that after completing the full tests, we repeated the tests once more on only those discs that had initially been considered problematic. We were thus able to completely verify all erroneous playbacks. As a rule, we followed the same methodology for both the fast forward/backward and the disc recognition parts of our tests.
Fast forward and backward tests
Between the prescribed playback points described in the previous paragraph, each disc was watched in full for errors during fast forward and fast backward at the maximum allowable speed. In some cases the drive hung; in other cases there was unacceptable skipping. We finally rated only those problems that could easily be reproduced each time. Sporadic problems were attributed to "statistical" errors and were thus ignored.
Defining a measure of compatibility
This is indeed the most interesting part of our tests. We recorded each test electronically and developed a small database project to handle both the tests and their interpretation.
We used declarative referential integrity to ensure a high degree of accuracy during form insertion into our database. An additional auditor checked everything one by one, both on paper and in the database itself. Each test outcome was classified in our system as one of 3 possible values: "OK" and "NO", meaning the obvious cases of full compatibility in the particular test being done (recognition/playback/FF/RW) or none at all, and "PROBLEM", meaning the equally obvious problematic cases.
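To illustrate, here is a minimal sketch of what such a constrained schema could look like, using SQLite from Python. All table and column names are hypothetical, not those of the original project; the point is only how declarative constraints catch bad insertions:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when this is on

# Hypothetical schema: each test row must reference an existing disc and
# player, and the outcome column accepts only the three agreed values.
con.executescript("""
CREATE TABLE disc   (disc_id   INTEGER PRIMARY KEY, label TEXT NOT NULL);
CREATE TABLE player (player_id INTEGER PRIMARY KEY, model TEXT NOT NULL);
CREATE TABLE test (
    test_id   INTEGER PRIMARY KEY,
    disc_id   INTEGER NOT NULL REFERENCES disc(disc_id),
    player_id INTEGER NOT NULL REFERENCES player(player_id),
    stage     TEXT NOT NULL CHECK (stage IN ('RECOGNITION', 'PLAYBACK', 'FF/RW')),
    outcome   TEXT NOT NULL CHECK (outcome IN ('OK', 'PROBLEM', 'NO'))
);
""")

con.execute("INSERT INTO disc VALUES (1, 'Brand-X DVD-R 4x')")
con.execute("INSERT INTO player VALUES (1, 'Standalone player A')")
con.execute("INSERT INTO test VALUES (1, 1, 1, 'PLAYBACK', 'PROBLEM')")  # accepted

# A typo such as 'PROBLM', or a reference to a non-existent disc, raises
# sqlite3.IntegrityError instead of silently entering the data.
```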
We next rated each outcome according to the following "point system": NO was set equal to zero (0), PROBLEM equal to 1 and OK equal to 3.
We could have characterized each result with additional outcomes such as "Problem Small", "Problem Large", and so on, but then the final outcome would have depended subjectively on our testers. So we stuck with the initial 3-outcome classification of the tests.
According to this system, the best rating a test outcome could achieve was 9 = 3 + 3 + 3, that is, 3 points for perfect recognition, 3 points for perfect playback and another 3 points for flawless FF/RW. If a disc was not recognized, it was rated 0 in all cases, even if in 1 out of, say, 20 tries it could play back OK.
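A compact sketch of this point system (our own illustration of the rules just described, not the reviewers' actual code):

```python
# Points awarded per outcome, as defined above: NO = 0, PROBLEM = 1, OK = 3.
POINTS = {"NO": 0, "PROBLEM": 1, "OK": 3}

def disc_score(recognition: str, playback: str, ffrw: str) -> int:
    """Total compatibility score for one disc on one player (0..9)."""
    # Rule from the text: a disc that is not recognized scores 0 overall,
    # regardless of any occasional lucky playback.
    if recognition == "NO":
        return 0
    return POINTS[recognition] + POINTS[playback] + POINTS[ffrw]

assert disc_score("OK", "OK", "OK") == 9      # best possible rating
assert disc_score("NO", "OK", "OK") == 0      # not recognized -> 0 everywhere
assert disc_score("OK", "PROBLEM", "OK") == 7
```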
Having defined the measure of compatibility as explained above, we developed an online analytical processing (OLAP) "cube" for interpreting the parameters and the measure. From this multidimensional analysis it was very easy for us to get fixed, or even live, spreadsheet results in Excel.
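As a rough stand-in for such a cube, the same kind of slicing can be sketched with pandas, assuming a flat export of the test records (all column names and values here are hypothetical):

```python
import pandas as pd

# Hypothetical flat export of the test database: one row per disc/player test.
results = pd.DataFrame({
    "media":  ["DVD-R", "DVD-R", "DVD+R", "DVD+R"],
    "player": ["A", "B", "A", "B"],
    "score":  [9, 7, 9, 0],
})

# Pivot the scores into a player-by-media grid, much like slicing a cube;
# the resulting table can be written straight to an Excel sheet.
cube = results.pivot_table(index="player", columns="media",
                           values="score", aggfunc="mean")
print(cube)
# cube.to_excel("compatibility.xlsx")  # requires openpyxl
```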
Some issues of compatibility estimation
Fortunately, we were able to foresee the implications of defining a generally acceptable measure of compatibility, and early enough in our efforts we took special precautions so that the measure could easily be adjusted to whatever common sense would finally lead us to decide.
We immediately realized that a three-level scale of playability seemed a little confusing to our audience. A user usually wishes for a definite YES or NO; intermediate degrees are always open to subjective judgment.
Next we had to decide how problematic a "problem" had to be in order to be considered a "true" NO. Things were easy for disc recognition: if more than 80% of the tried insertions led to errors, the disc was considered non-recognized. Consequently, both "Playback" and "FF/RW" were marked NO as well. Subjective as the decision is in the other cases, we chose to follow our hearts closely. If we felt that the number of successive tries on a particular disc would be enough to fray the nerves of a potential consumer/user, we stopped and wrote a big NO on the sheet we used for recording the results. Otherwise, we kept trying until we had enough evidence that the average consumer would consider the use of the disc only a very slight annoyance.
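The recognition rule, at least, is mechanical enough to sketch in code (the 80% cut-off is the one stated above; the function name and inputs are ours):

```python
def recognition_verdict(attempts: list[bool]) -> str:
    """Classify disc recognition from repeated insertion attempts.

    Per the rule above: if more than 80% of the tried insertions fail,
    the disc counts as non-recognized, which drags Playback and FF/RW
    down to NO as well.
    """
    failures = attempts.count(False)
    return "NO" if failures / len(attempts) > 0.8 else "OK"

# 9 failed insertions out of 10 -> 90% failure rate -> NO across the board
assert recognition_verdict([False] * 9 + [True]) == "NO"
assert recognition_verdict([True] * 8 + [False] * 2) == "OK"
```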
Our final decisions
In the end we easily adjusted our measure as follows: a value of zero ("0") indicates no playback and a value of one ("1") full compatibility.
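One plausible reading of this adjustment is simply normalizing the 0-9 point total onto the unit interval; the sketch below follows that reading, since the exact mapping is not spelled out:

```python
def compatibility_index(score: int) -> float:
    """Map the 0..9 point total onto the final 0..1 scale.

    0.0 means no playback at all, 1.0 full compatibility; intermediate
    totals land in between. (Our interpretation of the adjustment; the
    original text only fixes the two endpoints.)
    """
    return score / 9

assert compatibility_index(0) == 0.0  # not recognized / no playback
assert compatibility_index(9) == 1.0  # flawless on all three tests
```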