With the growing interest in video content delivery over networked environments, automatic quality assessment of scalable videos is becoming an important issue. This paper presents a study comparing the performance of objective metrics developed for quality assessment of scalable videos. A database containing video sequences produced by the scalable extension of H.264/AVC, together with their corresponding subjective ratings, is used for benchmarking. We aim to investigate the maximal capabilities of the metrics by optimizing their parameters on the database. Experimental results show that the metrics outperform the peak signal-to-noise ratio (PSNR) and that a correlation coefficient higher than 0.9 can be achieved, but there is still room for improvement before reliable objective quality assessment is possible.
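As background for the quantities mentioned above, the following is a minimal sketch (not part of the paper itself) of how PSNR and the Pearson linear correlation coefficient between objective scores and subjective ratings are commonly computed; the sample score values are purely illustrative.

```python
import numpy as np

def psnr(reference, distorted, max_val=255.0):
    """Peak signal-to-noise ratio between two frames (higher = closer)."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

def pearson(x, y):
    """Pearson linear correlation coefficient between objective scores
    and subjective ratings (values close to 1 indicate good prediction)."""
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    xc, yc = x - x.mean(), y - y.mean()
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc ** 2) * np.sum(yc ** 2)))

# Toy benchmark: hypothetical objective scores vs. subjective ratings (MOS).
objective = [30.2, 33.5, 28.1, 36.0, 31.7]  # e.g. PSNR in dB (made up)
mos = [2.9, 3.6, 2.4, 4.2, 3.1]             # mean opinion scores (made up)
print(round(pearson(objective, mos), 3))
```

In benchmarking studies of this kind, the correlation coefficient between the metric's outputs and the subjective ratings is the usual figure of merit, which is why a value above 0.9 is highlighted in the results.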