Recent advances in high-performance graphics and physics engines (e.g., Unreal Engine) have popularized simulators for safety-critical system testing, yet credible validation of these toolchains is essential for reliable outcomes. This paper introduces a novel methodology for validating simulation toolchains that combines principles from SAE and UNECE frameworks with iterative validation cycles to accommodate evolving safety-critical requirements. We demonstrate the approach through a case study evaluating the color fidelity of an Unreal Engine-based perception toolchain for safety-critical applications such as human and obstacle detection. Comparative tests of real and simulated camera outputs show that Unreal Engine’s camera model achieves ΔE < 4 under controlled lighting, closely matching the reference colors, whereas complex real-world lighting and seasonal variations can introduce perceivable color discrepancies. Our iterative methodology enables progressive refinements that reduce ΔE variations and establishes, for assessors, critical traceability links among evolving system requirements, toolchain modifications, and validation evidence. The resulting framework provides a verifiable chain of evidence from initial discrepancies to compliance, bridging the gap between adaptive development and certification needs.
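For orientation, ΔE here presumably denotes a CIELAB color difference; the abstract does not state which variant is used (e.g., CIE76 or CIEDE2000). Assuming the simplest CIE76 formulation, it is the Euclidean distance between two colors in CIELAB space:

\[
\Delta E^{*}_{ab} = \sqrt{\left(L^{*}_{2}-L^{*}_{1}\right)^{2} + \left(a^{*}_{2}-a^{*}_{1}\right)^{2} + \left(b^{*}_{2}-b^{*}_{1}\right)^{2}}
\]

Under this convention, a ΔE around 2 is commonly cited as roughly the threshold of perceptibility for an average observer, which situates the reported ΔE < 4 result near the edge of humanly noticeable color deviation.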