
I’m Embarrassed I Didn’t Think of This.. - Asynchronous Reprojection


Comments • 3,231

  • Linus Tech Tips
    Linus Tech Tips  2 months ago +369

    Thanks to Ridge for sponsoring today's video! Save up to 40% off and get free worldwide shipping until Dec. 22nd at www.ridge.com/LINUS

    • mrfofo
      mrfofo  2 months ago

      You should do a video about Xbox Live 1.0. It's back thanks to Insignia, and from a tech standpoint, re-engineering Xbox Live to work again on the original Xbox is an interesting feat.

    • Radbug
      Radbug  2 months ago +1

      Why has no one mentioned how async proj could improve actual 144 or 240 fps rates? Because it does...

    • Solstice Projekt
      Solstice Projekt  2 months ago

      When was the last time gamers asked for a compromised form of rendering, you ask? ... When they wanted DLSS, of course. DLSS is *exactly* that.

    • 0ops
      0ops  2 months ago

      I'd love to see this with ReShade

  • Nabalazs
    Nabalazs  2 months ago +1519

    I am so happy that Philip managed to get the message THIS far out. I do fear that this tech might have issues with particles and moving objects and the like, but when you mentioned that we could use DLSS to ONLY FILL IN THE GAPS, my jaw dropped. That's so genius! I really hope that this is one of those missed-opportunity oversights in gaming, and there isn't some major issue behind it not being adopted yet.

    • Xfy123
      Xfy123  a month ago

      @ANtiKz That's async compute, completely different from what they are talking about in the video.

    • ANtiKz
      ANtiKz  2 months ago

      @Nikephor my apologies with the "exact same setup", I believe I was referring to enabling ASYNC and DLSS as the other guy stated. Still, ASYNC has been an option for around a year. You can add the setting under the system settings of a game's config on Lutris

    • jubuttib
      jubuttib  2 months ago +1

      There are obvious caveats with scenes that have a lot of movement in them for this particular implementation, but remember that this kind of stuff has been done in VR games for a while now; there are many ways to improve it beyond what's shown here.

    • TheCustom FHD
      TheCustom FHD  2 months ago +1

      Or you render a slightly higher-res image, crop into the middle, and use the outside pixels as filler. You could even render the outside stuff less accurately, like only every 4th pixel, and then approximate the rest.

    • yes yes
      yes yes  2 months ago +1

      I was pretty sure your display had to support async reprojection or something. Async reprojection is better than having low framerates, but not better than having normal high frame rates. VR headsets have different kinds of motion interpolation I think? And some outright don't support it.

  • Pillowmancer
    Pillowmancer  2 months ago +188

    Not mentioned in the video: you can render frames at a slightly higher FOV and resolution than the screen, so that there's some information "behind" the monitor corner.
    It won't save you from turning 180 degrees, but it will fix most of the pop-in for a very slight hit to performance
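A rough way to size that overscan margin, sketched in Python (the function names and all numbers here are hypothetical, not from any shipping implementation): given the display's horizontal FOV and the largest camera yaw expected between two real frames, widen the rendered FOV and scale the pixel width to keep the same angular detail.

```python
import math

def overscan_fov(display_fov_deg, max_yaw_deg):
    # FOV to render so that up to max_yaw_deg of camera yaw since the
    # last real frame still finds valid pixels at both screen edges.
    return display_fov_deg + 2 * max_yaw_deg

def overscan_width(display_px, display_fov_deg, render_fov_deg):
    # Pixel width that keeps the same angular pixel density for a
    # perspective projection (width scales with tan(fov / 2)).
    half_d = math.radians(display_fov_deg) / 2
    half_r = math.radians(render_fov_deg) / 2
    return math.ceil(display_px * math.tan(half_r) / math.tan(half_d))
```

For a 90° display FOV with 5° of slack per side this renders 100°, about 2290 pixels wide for a 1920-wide display: roughly a 19% wider frame, in line with the "very slight hit" the comment expects.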

    • bedro
      bedro  15 days ago +1

      @Ben Hur That's what foveated rendering is.

    • Ben Hur
      Ben Hur  a month ago +4

      You could even render the stuff outside the screen at an even lower resolution; you wouldn't notice much since you'd only see it while in motion...

    • Right this way, sir
      Right this way, sir  a month ago +9

      @Just Some Dinosaur Careful... that's a [Lvl. 163] PC Master-Supremacist, the bane of mobile, console, and VR gamers.....

    • Just Some Dinosaur
      Just Some Dinosaur  a month ago +15

      @L Y R I L L Brainlet take

    • L Y R I L L
      L Y R I L L  a month ago

      @Martin Krauser asmh

  • TheTrainMaster15
    TheTrainMaster15  2 months ago +250

    Philip is revolutionising the way we think about gaming and game dev with nothing but common sense

    • TheTrainMaster15
      TheTrainMaster15  2 months ago +50

      @Neurotik51 using VR technology with conventional monitors? I haven't heard of that before

    • Neurotik51
      Neurotik51  2 months ago +5

      what? nothing here is new

  • Blapman007
    Blapman007  2 months ago +1275

    2kliksphilip and LTT is a crossover I never knew I needed. Make it happen.

    • Noob
      Noob  19 days ago

      @morfgo nobody here realizes this comment is referring to Valve ignoring 3kliks, not LTT.
      There was a controversy where Valve used a bugfix (map fix? idk) that 3kliks made without crediting him in the CS:GO patch notes.

    • Piotr Kowalski
      Piotr Kowalski  a month ago

      @morfgo They mentioned his username around 10 times during the video. Isn't that crediting?

    • W K
      W K  a month ago

      they both love Counter-Strike, it makes sense

    • CLEARRTC
      CLEARRTC  a month ago

      @morfgo did you not watch the video? They talked about him several times.

    • JustSomeRandomAsianDude
      JustSomeRandomAsianDude  2 months ago +2

      @morfgo you should delete your comment before you get ratioed lol

  • DezzyDayy
    DezzyDayy  2 months ago +404

    I was actually thinking about writing an injector to apply this to existing games a few years ago, when I saw the effect on the HoloLens. A few limitations, though: camera movement with a static scene can look near perfect, but if an animated object moves, depth reprojection cannot fix it properly; you would need motion vectors to guess where objects will go, and that will cause artifacts near object edges.
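A minimal single-pixel sketch of the depth reprojection described here, assuming a pinhole camera and a pure yaw change (the function and the intrinsics fx, fy, cx, cy are illustrative only; a real injector would warp the whole depth buffer on the GPU):

```python
import math

def reproject_pixel(u, v, depth, yaw_delta_rad, fx, fy, cx, cy):
    # Unproject the pixel to a view-space point using its stored depth.
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    z = depth
    # Rotate the point into the new camera frame (yaw only, for brevity).
    c, s = math.cos(yaw_delta_rad), math.sin(yaw_delta_rad)
    xr, zr = c * x - s * z, s * x + c * z
    if zr <= 0:  # behind the new camera: no data to show here
        return None
    # Project back to the screen at the new orientation.
    return (fx * xr / zr + cx, fy * y / zr + cy)
```

With a zero yaw delta every pixel maps to itself; as the delta grows, pixels slide across the screen, and anything that lands off-screen (or had no source pixel) is exactly the gap that needs filling.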

    • Just Some Dinosaur
      Just Some Dinosaur  a month ago

      @ayaya This isn't true. Game objects' positions and movements aren't random. They have set values and formulas that could be co-opted into the prediction algorithm. The problem is that it would be really costly to check every frame. This would have to be built into the game itself in a very clever way.

    • Nurse Xóchitl
      Nurse Xóchitl  a month ago

      @Shin A lower FOV might help with that, and since I already use a lower FOV... it might not be too bad.

    • jabadahut50
      jabadahut50  a month ago

      I wonder if it can be done with ReShade. Would be fantastic for the Steam Deck.

    • TheBaldrickk
      TheBaldrickk  a month ago

      @Krille k ... Pardon?

    • Krille k
      Krille k  a month ago

      yeah.. this is the reason for video RAM. if you render from a center point, all the unused 10+ GB of RAM on the GPU, which all modern GPUs have totally unused... can hold old w.a.s.d permutations while working on new vector permutations. it's not a problem. it should be able to tell the new delta/angle to render, using floating-point operations.

  • tubehellcat
    tubehellcat  2 months ago +4110

    "He owns a display" - that's gotta hunt him for ever like the "you're fired" for Colton 😂
    Love it 😁

    • Filcayra72
      Filcayra72  a month ago

      do you really mean hunt? or did you mean to say haunt?

    • WhiteG60
      WhiteG60  a month ago

      @Faremir Oh hi Mark.

    • Roberto Lopez
      Roberto Lopez  2 months ago

      lmaoo, fr!!

    • Patrick Tierney
      Patrick Tierney  2 months ago +1

      This came up a lot in that early-adopter monitor firmware video. It's a great display though. Happy to have one myself.

    • Ahmed Anssaien Plays
      Ahmed Anssaien Plays  2 months ago

      @EndstyleGG Yeah, crazy, right? xD

  • Mateusz Rogalski
    Mateusz Rogalski  2 months ago +38

    Just think about that: you can see the difference in a YouTube video! Granted it's 60 FPS, but it's still compressed video streamed from YouTube. I can only imagine how much of a difference you can see live, running it yourself. This makes it even more amazing!

  • Sonic Tekeno
    Sonic Tekeno  2 months ago +24

    An LTT video at 60 fps?! My god, the little animations they put in, like the outro card, look so good 👍

    • MrPaxio
      MrPaxio  a month ago +2

      yeah, doesn't the dummy know 60 fps is better than 8K uploads

  • Czllvm
    Czllvm  a month ago +9

    THIS IS INSANE. I already use this on Assetto Corsa in VR, so I play at 120 Hz but it renders 60 fps. Such a light-bulb moment at the start. I really wish this would catch on, because I've already seen first-hand how great it is

  • Samudec
    Samudec  2 months ago +4

    It's huge for low/mid-range setups to make games more responsive, but it's also nice for high-end machines, because you'd completely negate the impact of 1% lows and feel like you're always at your average

  • Rodrigo Teles
    Rodrigo Teles  2 months ago +941

    Plouffe's "He owns a display" gag is always going to crack me up.

    • Solo-Ion
      Solo-Ion  21 days ago

      Mark Rathgeber

    • I lick the insides of microwave popcorn packages
      I lick the insides of microwave popcorn packages  2 months ago +16

      @Lu It's a joke. In one of the previous videos there was a throwaway comment about him being the display guy and he goes "I'm the display guy here. I do all the reviews of displays, I own a display..."

    • 666Tomato666
      666Tomato666  2 months ago +5

      @Lu but his display is not a regular display... It's Plouffe's!

    • tobi
      tobi  2 months ago +7

      @Lu he bought the Alienware mini-LED one and he's proud that he was one of the first to get it, and now it's a meme

    • Kyle Reeping
      Kyle Reeping  2 months ago +36

      @Lu But his display is.... *special*

  • Chris C
    Chris C  a month ago +6

    One thing I wondered about when I first saw that video is whether the PERCEIVED improvement is good enough that you could lose a couple more frames in exchange for rendering a bit further outside the actual FOV, but at a really low resolution. Basically like a really wide foveated rendering. It would mean the warp would have a little more wiggle room before things started having to stretch.

  • David Baker Sound
    David Baker Sound  2 months ago +9

    I've never understood why this hasn't been done before. I've thought it should be done since 2016, when I got my VR headset. Like you said, extremely obvious!

  • Jon H.
    Jon H.  2 months ago +298

    You might be able to hide a lot of the edge warping by basically implementing overscan, where the game renders at a resolution that's 5-10% higher than the display resolution but crops the view to the display resolution. In theory it should be only a very minor frame-rate hit, since you're just adding a relatively thin border of extra resolution.
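The cost of that border is easy to estimate with a back-of-the-envelope sketch (not from the video): if both dimensions are rendered `scale` times larger and the frame is cropped back, the extra pixel cost is scale**2 - 1.

```python
def overscan_cost(scale):
    # Fractional increase in pixels rendered when both dimensions are
    # scaled by `scale` and the result is cropped back to the display.
    return scale * scale - 1.0
```

So a 5% overscan per dimension costs about 10% more pixels, and 10% costs about 21%: small, but not free.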

    • M33f3r
      M33f3r  2 months ago +7

      @Batuhan Çokmar Yeah, that sounds awesome and a much closer approximation to how we actually see the world.

    • Corey Davis
      Corey Davis  2 months ago

      I'm glad I wasn't the only one thinking this

    • Wiggy!
      Wiggy!  2 months ago +1

      Yes, definitely. Surprised they don't do this.

    • Batuhan Çokmar
      Batuhan Çokmar  2 months ago +60

      @Carl O That assumes you'd need the same resolution for the overscan. If the game is rendered at 45° FOV at 1440p, render an overscanned area between 45° and 90° FOV at 360p. You don't need a lot of detail, just something to make valid guesstimates within that motion blur until a proper frame fills up the screen.

    • AB
      AB  2 months ago +30

      The magic combo there would be foveated rendering alongside the async reprojection with overscan. The games it would make sense for will inevitably be a case-by-case thing, but the performance gains would be massive.

  • Altefier
    Altefier  2 months ago +2

    Welp, I think I already know what this feels like without watching this video, because I've played games that had unlocked framerates while character animations were stuck at 30 fps. I think it's pretty much that. Yeah, it will feel good, but look a little disappointing.

    • tpodole
      tpodole  a month ago

      yeah, my comparison was very laggy Gmod physics when props are trying to summon a kraken, all while you can still walk around it. In some games it will not help, but there are so many that I care about where it would.

  • happysmash27
    happysmash27  a month ago +3

    Asynchronous reprojection is great in VRChat (in PCVR on my relatively old PC), where I usually get 15 fps and often far less than that! I don't really mind the black borders much in that case, especially since the rendered view usually extends a bit beyond my FOV, so they only appear when things are going _extremely_ slow, like a stutter or any other time I'm getting over 0.25 seconds per frame. So perhaps another way to make the black bars less obvious would be to simply increase the FOV of the rendered frames a little so that there is more margin. It would lower frame rates, but it might be worth it in any case where the frame rates would be terrible anyway.

  • Alexander Crocker
    Alexander Crocker  2 months ago +1

    Imagine async reprojection on the Steam Deck, where Valve already has the software from SteamVR! Battery savings while it feels like 60 fps!

  • ToasterTom9737
    ToasterTom9737  2 months ago +9

    I stumbled upon 2kliksphilip's channels when I was researching how to make maps in Hammer. So glad you guys have mentioned him in multiple videos now!

  • JokerZappie
    JokerZappie  2 months ago +1

    Here's an idea to improve this: have the game asynchronously render at like 5% (or even 1%) of the original resolution at the refresh rate of the monitor, and use those frames to fill the gaps that can't be rendered at full resolution yet (assuming the full resolution cannot be rendered as fast as the monitor's refresh rate).
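One way to read this idea is as per-pixel compositing: wherever the warp of the last full-resolution frame left a hole, fall back to a cheap low-resolution frame that does run at display rate. A toy one-dimensional sketch (hypothetical; pixels are plain numbers and `None` marks a hole):

```python
def composite(warped_full, upscaled_low):
    # Prefer the reprojected full-res pixel; fall back to the low-res
    # frame rendered at the monitor's refresh rate where the warp left
    # no data (e.g. disoccluded regions and screen edges).
    return [f if f is not None else l
            for f, l in zip(warped_full, upscaled_low)]
```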

  • afroninjaen
    afroninjaen  2 months ago +822

    I always had a feeling that tech like this is the real future of gaming / VR performance, and not just raw RTX 4090 performance.

    • MrPaxio
      MrPaxio  a month ago

      and it has already existed on Oculus for a while, so that future is already in the past

    • MrPaxio
      MrPaxio  a month ago

      BuT yOu NeEd ThAt 4090 tO mInE cRyPtOoOoOoOo

    • zeroa69
      zeroa69  2 months ago

      @Catmato yeah, but it's not that special anymore. It's PhysX all over again. Eventually they'll stop dicking around with the marketing schemes and it'll be a standard feature in all GPUs.

    • Thezuule
      Thezuule  2 months ago

      @VLPR part of the magic will be cloud streaming. NVIDIA's CloudXR and Omniverse stuff will make cloud-based XR a reality sooner than most realize.

    • Catmato
      Catmato  2 months ago

      No way bro, raytracing is the future. Just ask Hardware Unboxed.

  • sa art
    sa art  2 months ago +7

    Now that's a public-interest video! Raising awareness of this technique will certainly go a long way, especially in open source. I hope manufacturers don't shy away from it for fear that it would diminish interest in their high-end GPUs.

    • D Nitz
      D Nitz  a month ago

      What are you talking about? They could make an AAA game run at 8K 240 fps without a 6-slot GPU

  • richfiles π
    richfiles π  a month ago

    FINALLY! I've been saying for years that the retina and brain process visual stimuli asynchronously, allowing us to perceive framerates >100 FPS. This kind of compromise is an _EXCEPTIONAL_ means to fill in the space between frames with something "close enough" so that our eyes and brain get the smoothed-out experience. Reality is asynchronous. _WE_ are asynchronous... Why shouldn't a computer and its video output take advantage of asynchronicity as well!

  • Brook Richardson
    Brook Richardson  2 months ago +1

    We need AI upscaling, AI frame generation and asynchronous edge detection built into a monitor. Everything would instantly look better with no load on the computer. And then support for component video would make retro gamers very happy.

  • Dddsasul
    Dddsasul  2 months ago +1

    Can you please test this on low-end hardware? So far I get the feeling it only works on devices that already have some spare performance, because either way the GPU has to render something, even if it's less taxing. Maybe find some hardware that can push only ~40 fps and try it on that. I'd really like to see the effect of it.

  • Hildebert
    Hildebert  2 months ago +308

    I know Philip will see this and I know he will feel awesome.
    You have come a long way, Philip. I am proud to have been part of your community since your first tutorial videos.

    • jo_kil
      jo_kil  2 months ago +2

      @Charlie more like 14 lol

    • Fargoth
      Fargoth  2 months ago +1

      kliki boy i love you

    • TuRmIx
      TuRmIx  2 months ago +3

      Love him. His tutorials laid the foundation for my environment-artist gamedev job.

    • Charlie
      Charlie  2 months ago +9

      Here's to Philip, love his videos on all 3 of his channels

  • Lucas Carracedo
    Lucas Carracedo  2 months ago +1

    Sadly this is useful only in limited scenarios. Objects moving even relatively fast, particles, volumetrics and anything motion-blurred or defocused in any way will almost always make this useless.
    That's why resolution upscaling became the main way to achieve better performance. In theory this could still help, but engines would need to isolate moving objects and effects to make it work in more scenarios, and that's hard to automate given the number of them you usually have on screen. It's the same problem as with the frame interpolation in DLSS 3. Artifacts will always be there. And they are noticeable.

  • Radosław Orłowski
    Radosław Orłowski  2 months ago +1

    I commonly lock my fps to something like 20-30 fps as I play RTS, and it keeps my laptop quiet enough that it's not waking people.
    This would be very nice if they would render, say, 75° while the FOV is 70°, or something like that. If you get just a touch more than is currently on-screen, it would be unnoticeable.
    All people complained about was the corners, where the tech made a lot of assumptions about what's there. With just a bit of extra render around the edges, I don't think they would notice anything changing at all up to 15.

  • Deus_nsf
    Deus_nsf  a month ago

    See, this is what I expect from DLSS 3: no increase in input lag.
    I also wonder if you could render more on the sides and then crop the image to give a native-res frame; that would eliminate border artefacts better, like they sometimes do to eliminate SSR border artefacts.

  • Eden Thompson
    Eden Thompson  2 months ago +4

    I'm very interested in using a form of overscan where you're viewing a cropped-in frame of the whole rendered image, so when you're panning your screen around it doesn't have the issue of stretching, unless you pan outside of the rendered frame.

    • DaVince
      DaVince  a month ago

      Well, that's as simple as rendering a little more outside of the screen area.

  • Egwene22
    Egwene22  2 months ago +250

    This is probably my favorite type of video from LTT. Highlighting and explaining interesting technology is fascinating.

    • thebaum64
      thebaum64  2 months ago

      it's up there for sure

    • TheWend
      TheWend  2 months ago

      oh wait, time for another balls-to-the-wall computer build! only the third this week. /s
      But for real, they've been doing a great job of not doing what I just said

  • wiggenvan
    wiggenvan  a month ago +1

    I would be very curious whether a hybrid solution would be possible, such as, in an FPS game, synchronously drawing the environment but asynchronously drawing the players? I'm sure there are some limitations involved with that, but it does sound intriguing

  • Virtual Party Center
    Virtual Party Center  a month ago +5

    They need this on the Steam Deck ASAP. It would make it last much longer with more demanding titles later on

  • Sir Henry
    Sir Henry  2 months ago

    This is so, so incredible! I hope this will be the next-gen image helper in all upcoming and older games!

  • Rurou
    Rurou  2 months ago +4

    This kind of thing is what I've been thinking about ever since motion interpolation became common in TVs, plus the fact that I'm familiar with 3D (not real-time rendering, only offline). My thought was that having motion data, depth, etc. should be enough for a kind of in-game motion interpolation, except not really interpolation but extrapolation of future frames. Even without taking controller input into account, just creating that extra frame from the previous frame's data should give that extra visual smoothness (you'd end up with roughly the same latency as the original FPS). And since it would work directly within the game, we could account for controller input, AI, physics, etc. to create the fake frames with an actual latency benefit: the game engine runs at double the rendering FPS so the extra data can be used to generate the fake frames.
    For the screen-edge problem, the simple fix is to overscan the rendering (or just zoom the rendered image a bit) so the game has extra data to work with. Related to this is the main problem with motion interpolation and frame generation in general: disocclusion, where something hidden in the previous frame becomes visible in the current one. How can the game fill that gap when there is no data for it? Nvidia, I believe, uses AI to fill those gaps, and even with AI it can look terrible. But as people using DLSS 3 have noted, you don't really see it in motion, which is actually good news for non-AI solutions: if people don't notice those defects in motion, a non-AI fill (a simple warp or something) should be good enough in most situations. You also wouldn't need the optical flow accelerator; Nvidia uses optical flow to get motion data for elements that aren't in the game's motion vectors (like shadow movement), but most people probably won't notice if a shadow just moves with the surface it's cast on for those in-between fake frames.
    For a more advanced application, I'm thinking of a hybrid approach where most things are rendered at half the FPS and the other half reuses previous frame data to lessen the rendering burden. Unlike motion interpolation or frame generation, this approach would still render the in-between frames, just more cheaply: render the disoccluded parts, and maybe decouple the screen-space effects and shadows so they run at full FPS. The game would alternate between high-cost and low-cost frames.
    When I first thought about this, AI wasn't a thing, so I didn't include it in the process. Now that it is, some steps could probably be done better with AI; for the disocclusion problem, for example, rather than rendering the disoccluded parts normally, you could render them with flat textures as a simple guide for the AI to match to the surrounding image, which might be faster.
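The disocclusion problem discussed in this comment shows up even in one dimension: scatter a scanline forward along its motion vectors and holes open up where nothing lands. A toy sketch (integer shifts stand in for motion vectors; a real renderer would resolve collisions by depth and fill holes far more cleverly, e.g. with the AI inpainting mentioned above):

```python
def warp_scanline(colors, shifts, fill=0):
    # Scatter each pixel to its extrapolated position; later writes
    # simply overwrite earlier ones in this toy (no depth test).
    n = len(colors)
    out = [None] * n
    for i, (c, s) in enumerate(zip(colors, shifts)):
        j = i + s
        if 0 <= j < n:
            out[j] = c
    # Fill disoccluded holes with the nearest pixel to the left,
    # a crude stand-in for the "simple warp or something" fallback.
    last = fill
    for j in range(n):
        if out[j] is None:
            out[j] = last
        else:
            last = out[j]
    return out
```

With zero shifts the line passes through unchanged; shifting pixels right opens a hole at the left edge that the fill pass papers over.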

    • Tenchi
      Tenchi  a month ago

      Interpolation into the future is called extrapolation

  • Mauro Merconchini
    Mauro Merconchini  2 months ago +926

    I'm so happy Phil put a spotlight on this concept, and I'm even happier that a channel like LTT is carrying that torch forward.

    • Daniel Armstrong
      Daniel Armstrong  2 months ago +11

      I tried to build something like that demo a few years ago, but I was trying to use motion vectors + depth to reproject my rendered frame, which I never got to work correctly. In my engine I rendered a viewport larger than the screen to handle the issue with the blackness at the edges, and then I was going to use tier 2 variable rate shading to lower the render cost of the parts beyond the screen bounds. But VRS was not supported in any way in my build of MonoGame, which is what my engine was built upon, so that was another killer for the project.
      I am so glad that Phil popularised the idea, and it's awesome that someone else managed to get something like this working. How he did it in one day I will never know; I spent like 3 weeks on it and still failed to get it working correctly. I should find my old demo and see if I can get it compiling again.

  • Homer Morisson
    Homer Morisson  a month ago

    GSYNC was a real game changer and a jaw-dropping moment for me when I first experienced it on otherwise identical hardware. It worked so much better than VSYNC ever could, so much smoother, especially when FPS dipped only momentarily. The only downside is that at very low FPS the refresh rate naturally also drops into the teens, which leads to super-juddery input lag from hell.
    So if GSYNC and async reprojection could be combined, with async taking over at those lower FPS, or at least supporting GSYNC by stopping the refresh rate from syncing down with the framerate, that might make for a best-of-both-worlds experience. Could be pretty cool, especially for mid-range gaming rigs.

  • Palaash Atri
    Palaash Atri  2 months ago +150

    Take this as a compliment: I love how LTT has transformed into more of a Computer Science/Electronics for Beginners channel than just another "Hey, we got a NEW GPU [REVIEW]" channel.

    • Sir Spam alot
      Sir Spam alot  2 months ago +1

      It's why I keep watching them. I got tired of watching reviews of hardware I can't afford/don't really need yet. Though my VR rig is getting very tired.

    • PrograError
      PrograError  2 months ago +4

      Well... they had covered everything in that aisle...

  • MASEON
    MASEON  a month ago

    Ahhh, so this is how Quest 2s run when linked to a PC using Air Link. I always wondered why I would sometimes see stretching or the stacked frames if the connection was weak.

  • StormyDoesVR
    StormyDoesVR  2 months ago

    As a huge VR fanatic, seeing the tech that makes standalone VR possible put to use in a flat-screen game is amazing!

  • Tillson
    Tillson  2 months ago +4812

    I asked in a VR subreddit about a year ago why nobody is making async for computer games, and people gave me shit about it like "wouldn't work that way, the idea is stupid, just not possible, etc.", so I gave up. Glad I asked the right people

    • Alkemyst
      Alkemyst  a month ago +1

      @FreeDooMusic Why would anyone need more than 640K.....

    • Charles Alexander
      Charles Alexander  a month ago

      It's because your head moves independently of your gun that it works in VR. Camera-attached stuff like the HUD and weapons can't work with it without layers (which VR supports; they're usually just used for the HUD, not weapons, as weapons won't react to the lighting correctly during the move without re-rendering). Notice in the demo there is no weapon in use by the character.

    • Sinity
      Sinity  a month ago

      @InfernosReaper Being an expert doesn't help much either. The problem is nicely described in "Technology Forecasting: The Garden of Forking Paths" by Gwern

    • Jorge Martinez
      Jorge Martinez  a month ago

      @InfernosReaper If a person speaks in absolutes then ... THEY ARE A SITH!!! 😱

    • glenfoxh
      glenfoxh  a month ago

      @zyxwvutsrqponmlkh Understood now.
      But quoting something posted without saying why can lead to interpretations as to why. Posting something as not your words says nothing about your intent in posting it.
      Leave anything up to interpretation, and anyone can only guess about it. Unfortunately.

  • CerN
    CerN  2 months ago

    Would love this in games. Even if I can run at 90 fps, making it feel like 140 would be amazing.

  • Rastislav Porin
    Rastislav Porin  2 months ago

    I hope this guy does something with this tech quickly, before some "company" incorporates it into theirs and calls it DLSS 3.0

  • Hatchet Hatter
    Hatchet Hatter  2 months ago

    I can't wait for Nvidia to take the idea and use it exclusively on the 40 series! What a great new feature!

  • HonoredMule
    HonoredMule  2 months ago

    It's all well and good to make the video feed _smoother,_ but the information we're _not_ getting is the most important - smooth, up-to-the-millisecond data on the position and movement of _other_ objects (esp. people and projectiles).
    This is exactly why VR games can often _feel_ smooth while the actual gameplay is janky: we aren't getting smooth interaction with in-game assets.
    In multiplayer FPS especially, there is simply no substitute for eliminating as much disruption to real, accurate feedback as possible, and at high levels of play the jankiness of anti-lag, prediction, ping compensation, or actual network latency is already nearly unbearable on its own. While there may be use cases for reprojection (quite notably VR), for the most part I'd rather not have games introducing error just to make the game _feel_ smoother than the actual gameplay _is._
    It's just one more lie to subconsciously deconstruct on the path to intuitive gameplay. And the fact that this _is_ (mostly) just a placebo would have been much clearer in the trials if there were actually something in the scene besides the player camera moving - or even just a gun to fire.

  • Grayson Smith
    Grayson Smith  a month ago +1

    This makes me wonder if you could get a low-latency eye tracker to use foveated rendering in 2D games. That could give a large performance boost with extremely few compromises. But maybe it's only useful in VR due to the FOV-to-screen ratio.

  • Hobox
    Hobox  2 months ago +1

    Seems like rendering further outside of the FOV would help eliminate a lot of the problems along the edges of the view during normal-speed movement and panning.

  • Matt Burkey
    Matt Burkey  2 months ago +1

    If the game were rendered with a slight overscan area, so the GPU had more information about the objects at the edges, and the image was then cropped back to the display resolution, I bet those noticeable edge smears probably wouldn't even appear.

  • alset333
    alset333  2 months ago

    The biggest issue I can think of... multiplayer and "tick" rate may prevent it from being used to upgrade old games. Some game engines would take significant redesign to make this worthwhile -- smoother is great, but it doesn't give you more data than the system is processing. If the game is still limited but now also has to run an extra step, it might not make any difference -- higher-level players use muscle memory for finer movements anyway, and once you surpass the game engine's tick rate, they may prefer low latency to high frame rates

  • Tactical Funny Man
    Tactical Funny Man  a month ago

    I thought this was VR when I first clicked, because VR has had this since 2017 at least. Awesome to see it finally make its way to the flat screen :D

  • John Doe
    John Doe  2 months ago

    This is pretty interesting, even if DLSS/FSR seems like the better alternative.

  • 215Days
    215Days  a month ago

    I can imagine this being used on old games that have a forced framerate, be it 30, 60 or even a weird number like 25 (Nintendo 64 emulation could count).
    Just think about it: you could play Red Alert 2 at 30 FPS but have it feel like 60+ FPS. It would be amazing!

  • Perry 714
    Perry 714  a month ago +1

    3kliksphilip is a CS legend; how he isn't paid by Valve to improve their game I don't know

  • NateLB
    NateLB  2 months ago

    Thank you for shouting out 2kliks on such a huge platform. I know he already has a fanbase, but obviously nothing compared to yours.

  • Nachname Vorname
    Nachname Vorname Před 2 měsíci

    So that's what the black border in VR games is about! Great technology, I love this kind of stuff!

  • QuantumAiCartoonist
    QuantumAiCartoonist Před 2 měsíci

    Thanks for driving tech into usability :)

  • Nemysis Retro Gaming
    Nemysis Retro Gaming Před 2 měsíci

    Async reprojection and all other implementations of it, such as motion smoothing for VR, have always had one major flaw: rendering stationary objects against a moving background. The best examples are driving and flying titles such as MSFS and American Truck Sim. The cab/cockpit generally doesn't change distance from the camera/player view, so when the scenery goes past the cockpit, the parts of the cockpit exposed to the moving background start rippling at the edges.
    This is one of the reasons async reprojection is not used in VR anymore, and also the reason why motion smoothing is avoided as well. And besides, we are talking about two different technologies, DLSS vs. async reprojection. One is designed to fill in frames and the other is an upscaler. Not really an apples-to-apples comparison!

  • MightyElemental
    MightyElemental Před měsícem

    I'm amazed this tech hasn't been implemented in more games.

  • Gertjan Van den Broek
    Gertjan Van den Broek Před 2 měsíci

    One thing strikes me though: what about in-game animations (like character movement), or even cutscenes for that matter?
    Those aren't tied to user controls like mouse and keyboard. So.. they would still be perceived as 10 fps.. right?

  • King Zilant
    King Zilant Před 2 měsíci

    Please keep updates on this going, need to know when I can make my rx580 into a rx1160.

  • Val
    Val Před 2 měsíci

    "...or on a Steam Deck 2?"
    Steam deck runs quite a bit of open source software. There's not much stopping someone from building it into Proton.

  • MlordSlav
    MlordSlav Před 2 měsíci

    it's kinda interesting how, as Moore's law grinds to a halt, we've been moving away from hardware improvements to just absolutely cranking the software trickery to 11: FOV rendering, async spacewarp, DLSS and such.
    like in 50 years, computers as we know them today won't be that much more powerful; it'll be the sheer improvement of the code we make them run that'll make them a multi-generation jump from today's probably stone-age way of software engineering

  • Armgoth
    Armgoth Před 2 měsíci +1

    This would be a huge asset to game developers if it catches on

  • Jonathan Tanner
    Jonathan Tanner Před měsícem

    I also wonder whether GPUs will spit out frames in 2 layers, one that gets reprojected, and the other that doesn’t.

  • Mark Langridge
    Mark Langridge Před 2 měsíci

    So is this technology heavier on the cpu? I assume interpolating a new frame from a 2D frame is less work than creating a new frame from 3D models.

  • Moonsickle Gaming
    Moonsickle Gaming Před 2 měsíci +624

    2kliksphilip is an unsung hero; his DLSS coverage is also some of his best content

    • MrPaxio
      MrPaxio Před měsícem

      @HonoredMule just wait til two pump chump philip comes thru

    • MrPaxio
      MrPaxio Před měsícem

      ouch, dont be so mean

    • HonoredMule
      HonoredMule Před 2 měsíci +4

      @Elise 3klicksphilip is just more work. Both will be _automatically_ obsolete when 0clicksphilip releases.

    • Elise
      Elise Před 2 měsíci +21

      2kliksphilip had a good idea, but 3kliksphilip is more advanced in every way!

    • Maspian
      Maspian Před 2 měsíci +38

      Personally super excited to see 2klicksphilip's video referred to in a LTT video, a lot of Philip's content is really high quality, especially the ones where he covers DLSS and upscaling as mentioned earlier. Can't recommend checking it out enough!

  • Jonny Smith
    Jonny Smith Před 2 měsíci

    When the testers started introducing themselves, in my mind I said "hi, I'm blank and I own a computer". And then there is Nicholas, and you really put "owns a display" below him, which was why I thought that sentence in the first place.
    I will always have to think of that statement when someone names their qualification.

  • Johnathan
    Johnathan Před 2 měsíci

    this is amazing, this better be the norm in 2d games!

  • Undercoverfire
    Undercoverfire Před 2 měsíci

    They could add this to the Steam Deck with just a software update. They already have the ability to let you toggle FSR 1.0 from outside of the game. I don't see why they couldn't add this to their Gamescope compositor.

  • Ultra-Widescreen-Gaming
    Ultra-Widescreen-Gaming Před měsícem

    14:02 - My TV supports interpolating frames, so I can play Nintendo Switch games at 60 FPS instead of 30 FPS. It also has fewer artefacts than DLSS 3.0, because most games are cartoon-looking, with clear edges, borders, outlines, etc., which makes interpolating easier/better.

  • Facundo
    Facundo Před 2 měsíci +45

    The main issue with these workarounds is that they depend on the Z buffer; they break down pretty quickly whenever you have superimposed objects, like something behind glass, volumetric effects or screen-space effects

    • David Goodman
      David Goodman Před měsícem +1

      You technically only need the depth buffer for positional reprojection (e.g. stepping side-to-side). Rotational reprojection (e.g. turning your head while standing still) can be done just fine without depth, and this is how most VR reprojection already works, as well as the electronic image stabilization features in phone cameras (they reproject the image to render it from a steadier perspective).
      It might sound like a major compromise, but try doing both motions and you'll notice that your perspective changes a lot more from the rotational movement than the positional one, which is why rotational reprojection is much more important (although having both is ideal).
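
      The rotational case really can be approximated with simple trigonometry and no depth buffer; a sketch (the 1920-pixel width and 90-degree FOV are arbitrary assumptions):

```python
import math

def yaw_to_pixel_shift(yaw_delta_deg, image_width_px, hfov_deg):
    """Pixel shift of the whole image for a pure camera yaw, derived from the
    pinhole-camera focal length; needs no depth information at all."""
    focal_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return focal_px * math.tan(math.radians(yaw_delta_deg))

# 1 degree of yaw on a 1920-pixel-wide view with a 90-degree horizontal FOV
shift = yaw_to_pixel_shift(1.0, 1920, 90.0)  # roughly 17 pixels
```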

    • Lucky-segfault
      Lucky-segfault Před 2 měsíci

      Ya, that sounds like it could be a big issue...

  • OldFrankHog
    OldFrankHog Před 2 měsíci

    Static objects look great in ASW when panning the camera around. Slow objects are ok-ish. The issue is fighter jets zooming across your screen. Yuk.

  • Rago
    Rago Před 2 měsíci

    whoa, now I understand why I can still move my head around the last rendered frame when a VR game (e.g. Pavlov) crashes!! super cool.
    ....and wouldn't it also be a gamechanger if car interfaces used this? imagine a responsive screen in a car :o :D

  • Gallopeermeneer
    Gallopeermeneer Před 2 měsíci +1

    Undoubtedly a very useful technology, although it would have been a more relevant demo if some objects were actually moving on the screen. This method works best when mainly the character is moving, and not much else. If there are, for example, enemies running across the screen, the "real" fps needs to be a lot higher to be convincing.

    • MrPaxio
      MrPaxio Před měsícem

      yeah, this only decreases input lag, which makes it feel faster, but that's all. I tried async timewarp in VR and it was disgusting, even more unplayable

  • Tyler O'Blenes
    Tyler O'Blenes Před měsícem

    "Even running diagnostics on that pesky printer that never cooperates. You know which one I'm talking about, all of them!"
    Actually the story of my life in IT...

  • James Cross
    James Cross Před 2 měsíci

    You see motion above 20 frames per second, but it depends how far things have moved and where they are; the centre of your vision is the slowest. It works well on some things and terribly on others. Parallax movements seem worst, so in a car or aircraft simulation the windscreen pillars completely break it. But maybe they could apply it to the far field and just composite the cockpit afterwards; after all, it isn't moving.

  • The OS expert Daymon
    The OS expert Daymon Před 2 měsíci

    I'm salivating over this tech. I would've killed for this when I was stuck gaming on a laptop iGPU.

  • IRMacGuyver
    IRMacGuyver Před 2 měsíci +1

    The biggest problem could be solved by rendering some overscan so the edges of the screen aren't right at the edge of the render

  • Johannes Otto
    Johannes Otto Před měsícem

    what an interesting discovery. I don't think that $600-1500 GPUs drawing 350+ W should be the future of gaming.

  • Hex
    Hex Před měsícem

    I am wondering if game engines will get something like localized rendering. Like, if the player is standing still, why not just render the parts of the frame that are due to change, like moving characters?
    Either way, I definitely like this technology. It won't give you more info on other players' movement, but just making the screen move more smoothly still helps you aim, because you get more feedback on placing your reticle on the other player.
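
    The "only render what changed" idea is the classic dirty-rectangle optimization from 2D engines; a minimal numpy sketch (the frame sizes and region are made up):

```python
import numpy as np

def composite_dirty_regions(prev_frame, new_content, dirty_rects):
    """Reuse the previous frame and redraw only the regions known to have
    changed -- a classic 'dirty rectangle' optimization."""
    frame = prev_frame.copy()
    for (y, x, h, w) in dirty_rects:
        frame[y:y + h, x:x + w] = new_content[y:y + h, x:x + w]
    return frame

prev = np.zeros((4, 4), dtype=int)   # last frame: all "old" pixels
new = np.ones((4, 4), dtype=int)     # what a full re-render would produce
out = composite_dirty_regions(prev, new, [(1, 1, 2, 2)])
assert out.sum() == 4  # only the 2x2 dirty region was actually redrawn
```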

  • Dissy
    Dissy Před měsícem

    I think what I would like most from this is reducing jitter, not having it 100% active

  • Impreziv 02
    Impreziv 02 Před 2 měsíci

    Love seeing that Radeon on the test bench at the end, instead of a GeForce card. 😉

  • Aestareth
    Aestareth Před 2 měsíci

    is there any reason it looks bad in the video, or is the screen tearing happening on the monitor too?
    because if they were pro gamers, they would have noticed that

  • zzzjz
    zzzjz Před 2 měsíci +1

    Ok, hear me out: what if we combined checkerboard rendering and interlaced rendering at half the res, and then upscaled it with FSR 2.1? So basically render a whole frame with 12.5% or even fewer of the pixels!
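
    One reading of the arithmetic that lands on the 12.5% figure, treating each trick as an independent halving of the pixel count (an assumption for illustration, not a spec):

```python
# Fraction of pixels rendered per frame, relative to a full native frame
full = 1.0
half_res = full * 0.5            # half resolution along one axis
interlaced = half_res * 0.5      # render every other line
checkerboard = interlaced * 0.5  # render every other pixel in a checker pattern
print(checkerboard)              # 0.125, i.e. 12.5% of the original pixels
```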

  • TONKAHANAH
    TONKAHANAH Před 2 měsíci

    Or it would be good on the current Steam Deck, since, you know, it just runs the software in your library. If games just have to support it, then the current Steam Deck could already start benefiting from it. And if it can at all be added at the compositor or driver level, I don't see why it couldn't be updated to add support here.

  • PhyshStycc
    PhyshStycc Před 2 měsíci +5

    As one of the top 250 VRChat players, I remember when async was a new tech in its buggy years, but new people take for granted HOW MUCH of a massive difference it makes nowadays. I've been with VR since its consumer conception, and it's interesting to see how things have shaped up.

    • tpodole
      tpodole Před měsícem

      I've seen a couple of people go "oh, I need to turn it off in this" and then forget that motion smoothing is not the same thing; and in SteamVR it's quite a hidden option that resets every session.

    • tiestofalljays
      tiestofalljays Před 2 měsíci +2

      This feels like a self-own

  • yensteel
    yensteel Před 2 měsíci

    Could this be game changing for cloud gaming? Lower latency response.

  • Coal Powers
    Coal Powers Před 2 měsíci +1

    Technically, the GPU reprojection is "rendering" the frame at 240fps, but the content being fed in to be reprojected is only updated at 30fps. You even show that it is rendering to a texture internally and then rendering a single surface (two triangles) with that texture on it. As long as the content (texture) update rate is above human flicker fusion rate, you might never notice. This falls apart at the edges of moving frames when you whip the mouse around, but if instead of rendering full resolution at 30 fps, you sacrifice 6 of those frames to overscan/oversampling, then your 20% reduction in update rate could be used instead to render 20% more pixels (so you don't have to copy from the edges so aggressively). It would resemble what GyroFlow does to stabilize video, but in 3D games, it would smooth any scene where your camera remains stationary (e.g. viewing 360° photos) and also in low-action scenes. Fast movement in the scene will not update at the same rate that you look around, so while this fixes a lot of motion sickness, I don't think it would help as much with racing (incl. driving, flying simulators) or fighting games. Put more simply, I don't think the static scene (and walking around slowly) was a good representation of the overall effectiveness of this technique. More varied examples (than a small, unmoving sandbox) would be needed.
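
    The trade-off arithmetic in that comment, spelled out (note the freed budget works out to 25% more time per frame, slightly more than the "20% more pixels" quoted, since 30/24 = 1.25):

```python
# Give up 6 of 30 real frames per second and spend the freed GPU time on
# overscan pixels instead (numbers taken from the comment above).
base_fps = 30
sacrificed = 6
new_fps = base_fps - sacrificed           # 24 freshly rendered frames per second
time_per_frame_gain = base_fps / new_fps  # 1.25 -> 25% more render time per frame
print(time_per_frame_gain)
```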

  • Ultra-Widescreen-Gaming
    Ultra-Widescreen-Gaming Před měsícem

    12:40 - I use an LG 60 Hz 21:9 LCD driven by an RTX 2080 Ti (yeah, I know, not the best combo). Anyway, just because my display can't show more frames per second doesn't mean I can't benefit from higher FPS while gaming. In fact, I disable V-Sync and set my FPS cap to 119.88 FPS. This results in much lower latency and better response in games, and also eliminates tearing. I mean, it feels like playing on a 120 Hz display, but without owning a 120 Hz display.
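
    The latency arithmetic behind capping above the refresh rate (a rough sketch that ignores scanout timing and pipeline queuing):

```python
def frame_time_ms(fps):
    """Time between finished frames at a given frame rate."""
    return 1000.0 / fps

# At 60 FPS a new frame exists every ~16.67 ms; at ~120 FPS one exists every
# ~8.34 ms, so each 60 Hz scanout grabs a frame that is on average fresher.
print(round(frame_time_ms(60), 2))      # 16.67
print(round(frame_time_ms(119.88), 2))  # 8.34
```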

  • Meister Wu
    Meister Wu Před 2 měsíci

    3:08 - this is indeed the smoothest 0fps I've ever seen

  • Daniel Holmes
    Daniel Holmes Před 2 měsíci

    I feel like this isn't the fancy thing you think it is; once the models are more complex or the textures higher-res, it's probably barely worth it.

  • MADagain
    MADagain Před 2 měsíci

    They need to add this to Viking: Battle for Asgard... I've still been trying to find a way to play that game, it being locked at 30fps, with a reasonable outcome that doesn't feel like a sluggish mess

  • Ryua
    Ryua Před 2 měsíci

    As someone who can't use V-Sync due to input lag delays feeling unplayable, I need this.

  • muffensmasher
    muffensmasher Před měsícem

    I've used this to fix black bars with Shadow PC inside the Oculus; using this method on desktop is genius. We need to push this to game devs immediately.

  • Charlie Maybe
    Charlie Maybe Před měsícem

    I see this in Pavlov when loading; this effect at 0 fps is evident

  • HotdogWater Enthusiast
    HotdogWater Enthusiast Před 2 měsíci

    you could get rid of edge effects by rendering at a higher FOV, then zooming in a little bit. it would likely hide the stretching issues

  • Thorn
    Thorn Před měsícem

    My PC-to-VR streaming setup is a bit choppy, and the visual artifacts in this match almost exactly the visual artifacts I see during lag spikes

  • Pampersrocker
    Pampersrocker Před 2 měsíci

    From an engine developer's perspective, I have a more critical stance on this, as there are a few oversights no one seems to talk about:
    On the one hand, things like DLSS 3 get pixel-peeped to the max and every wrongly predicted pixel is taken as a flaw, but here we treat far more drastic image errors as not dramatic. Also, the sample provided, with its flat colors, does not do justice to the amount of visual artifacts this generates when moving around.
    However, the bigger point here is the proper implementation of this technique. It works perfectly well while the GPU is not loaded at all and sits idly waiting to render those reprojected frames. But as soon as the GPU is under full load, and the reprojection would actually be useful, you quickly get massive frame timing issues reprojecting the frames in time, since you can't guarantee the timeslot you get on the GPU. Async compute pipelines do exist, but definitely do not guarantee execution in time.
    Modern engines pre-compute a lot of the draw calls and send them out in command lists to reduce the draw-call overhead needed to achieve the performance in the first place. You cannot easily interrupt the GPU at an arbitrary time to do a reprojection and continue where it left off; the graphics pipeline state, which the engine carefully created and sorted so that it changes as few times as possible, would be lost. An analogy here would be stopping a newspaper press mid-run to pick out a few examples and then trying to start it up again like nothing happened.
    VR gets away with this because the actual reprojection gets done on the display device (plus some minor reprojection at the end of the frame, just before submitting to the headset).
    So for this to be available in regular desktop games, extra hardware in either the monitor or the GPU would be required. It would take the last frame and the changed viewport transform and perform the reprojection in the background, while the rest of the GPU is computing the new frame.

  • Lievel
    Lievel Před 2 měsíci

    Intel NEEDS this for their Arc GPUs

  • Davii Mai
    Davii Mai Před 2 měsíci

    Never would've thought LTT would cover this, definitely not before Digital Foundry.