RPCS3 Forums

Full Version: Automatic Frame Interpolation when below 60 fps: is it possible? (ie: TruMotion, etc)
Is it possible to implement some kind of fast frame interpolation technique in RPCS3? I know it would add one frame of lag, but it would help get God of War 3 to a constant 60+ fps.

I know real frames are always better, but I saw this frame interpolation thing working on my LG TV from 10 years ago (it has this TruMotion feature) when playing Horizon Zero Dawn on a PS4, and it was so smooth, so nice. If it works on a 10-year-old TV, why wouldn't it work on a 2022 laptop with an RTX 3070?

Before someone starts flaming me, check out these links and see that I'm not talking complete nonsense:

LucasArts' 60FPS Force Unleashed II tech demo | Eurogamer.net

AND's | rtfrucvg (intercon.ru)
QCOM_frame_extrapolation

OpenGL ES Extension #333
Requires OpenGL ES 2.0

    Frame extrapolation is the process of producing a new, future frame
    based on the contents of two previously rendered frames. It may be
    used to produce high frame rate display updates without incurring the
    full cost of traditional rendering at the higher framerate.

    This extension adds support for frame extrapolation in OpenGL ES by
    adding a function which takes three textures. The first two are used
    in sequence as the source frames, from which the extrapolated frame
    is derived. The extrapolated frame is stored in the third texture.

https://registry.khronos.org/OpenGL/exte...lation.txt
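For anyone curious how that extension is actually driven, here's a rough sketch (not RPCS3 code). It assumes the ExtrapolateTex2DQCOM entry point described in the spec linked above; the texture handles and the 0.5 scale factor are just placeholders for illustration.

```cpp
// Hedged sketch: driving QCOM_frame_extrapolation once per refresh.
// Assumes the ExtrapolateTex2DQCOM entry point from the spec linked above;
// frame_n_minus_1, frame_n and predicted are placeholder GL ES texture handles.
#include <EGL/egl.h>
#include <GLES2/gl2.h>

typedef void (*PFNGLEXTRAPOLATETEX2DQCOMPROC)(GLuint src1, GLuint src2,
                                              GLuint output, GLfloat scaleFactor);

void extrapolate_next_frame(GLuint frame_n_minus_1, GLuint frame_n, GLuint predicted)
{
    static PFNGLEXTRAPOLATETEX2DQCOMPROC glExtrapolateTex2DQCOM =
        reinterpret_cast<PFNGLEXTRAPOLATETEX2DQCOMPROC>(
            eglGetProcAddress("glExtrapolateTex2DQCOM"));
    if (!glExtrapolateTex2DQCOM)
        return; // extension not available, present frames normally

    // Use the motion implied by frame N-1 -> frame N to predict a frame
    // half an interval into the future (scaleFactor = 0.5), then display
    // 'predicted' at the next refresh instead of rendering a real frame.
    glExtrapolateTex2DQCOM(frame_n_minus_1, frame_n, predicted, 0.5f);
}
```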

Why not?
We don't use OpenGL ES + this is a Qualcomm extension made for smartphone platforms.
(08-06-2022, 12:37 PM)Ani Wrote: We don't use OpenGL ES + this is a Qualcomm extension made for smartphone platforms.

Got it. But it's just another idea on how to make it possible.

I'm trying to find something that doesn't involve RPCS3 itself, like a ReShade shader. I'm learning that even though it's a fantastic idea, an FPS upscaler instead of a resolution upscaler, no one has implemented it for game rendering, only for video editing. That's sad.

It would be awesome if it were an option in Nvidia/AMD/Intel drivers, though. Even better if it could be done by the iGPU while the discrete GPU takes care of the real rendering, something like a TruMotion co-processor.
"Optical Flow SDK 3.0 enables DirectX 12 applications to use the NVIDIA Optical Flow engine. The computed optical flow can be used to increase frame rate in games and videos for smoother experience or in object tracking. To increase the frame rate, Frame Rate Up Conversion (FRUC) techniques are used by inserting interpolated frames between original frames. Interpolation algorithms use the flow between frame pair(s) to generate the intermediate frame."

https://developer.nvidia.com/blog/whats-...w-sdk-3-0/
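To make the FRUC idea concrete, here's a minimal CPU sketch of the same concept. This is not the Optical Flow SDK API, just OpenCV's Farneback optical flow plus a half-step warp; the frame names are placeholders.

```cpp
// Minimal CPU illustration of optical-flow frame interpolation (FRUC).
// Not the NVIDIA Optical Flow SDK API; just OpenCV, to show the idea:
// estimate per-pixel motion between two rendered frames, then warp one of
// them halfway along that motion to synthesize an in-between frame.
#include <opencv2/opencv.hpp>

cv::Mat interpolate_midpoint(const cv::Mat& prevBGR, const cv::Mat& nextBGR)
{
    cv::Mat prevGray, nextGray, flow;
    cv::cvtColor(prevBGR, prevGray, cv::COLOR_BGR2GRAY);
    cv::cvtColor(nextBGR, nextGray, cv::COLOR_BGR2GRAY);

    // Dense optical flow from prev -> next (the part Turing/Ampere offload to hardware).
    cv::calcOpticalFlowFarneback(prevGray, nextGray, flow,
                                 0.5, 3, 15, 3, 5, 1.2, 0);

    // Backward-warp approximation: sample prev at (x, y) - 0.5 * flow(x, y).
    cv::Mat mapX(flow.size(), CV_32F), mapY(flow.size(), CV_32F);
    for (int y = 0; y < flow.rows; ++y)
        for (int x = 0; x < flow.cols; ++x)
        {
            const cv::Point2f f = flow.at<cv::Point2f>(y, x);
            mapX.at<float>(y, x) = x - 0.5f * f.x;
            mapY.at<float>(y, x) = y - 0.5f * f.y;
        }

    cv::Mat mid;
    cv::remap(prevBGR, mid, mapX, mapY, cv::INTER_LINEAR);
    return mid; // displayed between prev and next to double the apparent frame rate
}
```

In a real implementation the flow estimation and warp would run on the GPU (that's what the dedicated optical flow engine does), but the principle is the same: one interpolated frame inserted between every two rendered frames doubles the apparent frame rate at the cost of one frame of latency.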


"Optical flow can also be used very effectively for interpolating or extrapolating the video frames in real-time. This can be useful in improving the smoothness of video playback, generating slow-motion videos or reducing the apparent latency in VR experience, as used by Oculus (details). Optical Flow functionality in Turing and Ampere GPUs accelerates these use-cases by offloading the intensive flow vector computation to a dedicated hardware engine on the GPU silicon, thereby freeing up GPU and CPU cycles for other tasks. This functionality in hardware is independent of CUDA cores.."

https://developer.nvidia.com/opticalflow-sdk
We don't use DirectX + we don't implement one-vendor or one-OS features; it needs to be a standard.
I don't think there is a way to do what you propose
Nvidia just went there and used these ideas + Optical Flow to make DLSS 3.0...

Introducing NVIDIA DLSS 3 | GeForce News | NVIDIA
Temporal solutions like DLSS 1/2/3, FSR 2.0 and XeSS require motion vectors from the game engine, which an emulator cannot provide.
And that's not even a standard; it's a proprietary solution for one vendor.