Evaluating the perceptual quality of 360° stitched panoramas is challenging because no pristine reference exists and the distortions are localized along stitching seams rather than distributed globally. Conventional full-reference (FR) and no-reference (NR) image quality assessment (IQA) methods are designed mainly for compression or projection artifacts and thus fail to address stitching-specific degradations such as luminance inconsistency and detail loss.
We propose GC360IQ, a perception-referenced framework that assesses stitching quality based on the human visual expectation of luminance continuity and structural fidelity. A dedicated stitched-image database is constructed to isolate blending-induced luminance inconsistency and detail loss while minimizing geometric misalignment. These two perceptual dimensions are modeled by two CNN branches driven by gradient and structural features, respectively, whose fused representation predicts overall perceptual quality.
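The dual-branch intuition can be illustrated with a minimal, non-CNN sketch (all function names, features, and fusion weights below are hypothetical illustrations, not the paper's method): one branch scores luminance discontinuity across a stitching seam via the pixel jump at the seam, the other scores structural agreement of gradients in strips flanking the seam, and a weighted fusion yields a scalar quality estimate.

```python
import numpy as np

def luminance_discontinuity(img, seam_col):
    # Mean absolute luminance jump across the seam column
    # (hypothetical stand-in for the gradient-driven branch).
    left = img[:, seam_col - 1].astype(float)
    right = img[:, seam_col].astype(float)
    return float(np.mean(np.abs(left - right)))

def structural_fidelity(img, seam_col, k=8):
    # Normalized correlation of horizontal gradients in strips of
    # width k on either side of the seam (crude structure measure).
    a = np.gradient(img[:, seam_col - k:seam_col].astype(float), axis=1).ravel()
    b = np.gradient(img[:, seam_col:seam_col + k].astype(float), axis=1).ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    # Two featureless (constant) strips are treated as perfectly matched.
    return float(a @ b / denom) if denom > 0 else 1.0

def fused_quality(img, seam_col, w=(0.5, 0.5)):
    # Weighted fusion of the two branch outputs; weights are illustrative.
    # Higher structural agreement and lower luminance jump -> higher score.
    lum = luminance_discontinuity(img, seam_col)
    struct = structural_fidelity(img, seam_col)
    return w[0] * struct - w[1] * (lum / 255.0)
```

A seam-free (uniform) panorama strip scores higher under this sketch than one with an abrupt luminance step at the seam, mirroring the expectation of luminance continuity that the framework encodes.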
Subjective experiments with 30 participants viewing through a head-mounted display (HMD) show that GC360IQ achieves significantly higher correlation with mean opinion scores than existing FR and NR IQA methods. The framework offers an interpretable, stitching-aware solution and provides guidance for perceptually optimized blending.