watching the 8k bluescreen struggling to slowly render-scan in at 5:50 was my favourite part
@Fyn Kozari 🤠
@Chris aka Schulbus could also be slow CPU
Sometimes bluescreens are really slow. My last one (intentional, via the kernelconnect bug; don't ask why i had to try it) also rendered this slowly, and i don't have an 8K screen. ^^
@Sebastian *all drivers and software stop working immediately and the kernel seizes all control
@Yuxuan Huang It is, the kernel is forcibly software drawing at this point, given that the GPU driver stops working immediately.
I think the game graphics will need to get closer to photorealistic before those resolutions will matter at all.
Cyberpunk at higher res with full RT is a photorealistic movie. If U can't relate ur 1440p monitor sucks.
@LucYfYre Arch of TwiLight for a standard monitor on a desk the resolution should be 1080p or 1440p (2k); going from 60 to 120Hz is better than going from 2k to 4k
@LucYfYre Arch of TwiLight absolutely not needed, and in an ultra fast-paced game like Doom, 1080p is good
The sweet spot in a few years will be 8k 120fps with tons of path tracing bells and whistles. 4k 60 will be plenty for most. And 8k 120fps will be VR demand. But a lot of people don't like VR. It's not going to become the "metaverse" for at least a decade. I currently game at 4k 120fps with my 4090. If it never exceeded this I'd be satisfied.
@aaaaaahhh! yes and no. In machine code, that is the term for non-integer values; you can also have a 16-bit float (half-precision). In audio DSP it means kind of the same thing, but specifically to an audio engineer it also means the signal can go above 1 amplitude. Kind of the same way images, besides 8-bit per channel, can be stored at 16, 24 or 32-bit: it can contain information outside the visibly projected standard range.
Nicole was right, most games aren't developed or intended to be that high of res. It depends on the resolution of the materials applied to the walls, boxes, etc.
@Ducky 6950 games evolve mate
@Kennedy EXACTLY THIS! Too many games are just interactive CGI movies, or walk simulators. Where's the fun? Where's the AI!? We're in 2023 and AI is either impotent or nonexistent. How about no more CGI trailers that have nothing to do with the game play, and use those people to make the game better and/or faster?
@Linus Kardell Don't forget the clarity of distant objects.
Only 2 to 10 TB of hard drive space for 8K textures? Count me in! :P
yup, her observation was an out of the box, smart one.
as a dev i assure you we DO NOT make graphical assets at 8k, and often we don't even make them at 4k. The amount of compression, file size and file handling would just be a waste of time. You would end up with games over 200 GB, and compression is already a massive issue within our games.
@TechnoArtfest please touch grass
@WebX what do you think DLSS is
Imagine a game that was only a couple hours long though... the trade-off for the most beautiful experience possible might be shorter experiences that require less storage.
@Carlos Oruna agree
Yup filesize 200gb for a game. Maybe 500 GB.
Mathematically it makes sense, just like you concluded. If you get a monitor large enough to see 8K worth of pixels, you have to sit far enough back that you can't see the difference. 8K for distribution only makes sense on signage in a mall where you might be standing 2 feet from the screen.
@Jeff Jones I believe someone jumped in the 8k bandwagon! Hahaha. Thanks for the info btw!
@Nice One QHD is well enough, barely any difference to 4k
@Jeff Jones maybe VR is the playground for such resolutions... maybe not necessarily at a standard FOV of around 100° horizontal, but for instance: the Pimax 8KX (two 4K screens, one for each eye), with i think 160° horizontal FOV, you still see pixels, you still see SDE... so 4K for that kind of FOV is not enough; it's only around 25-30 PPD (retina would be around 60 PPD). So there is a place for that high a resolution in the future: not in the living room with a television at 35-40° FOV, but for living-room VR headsets.
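The pixels-per-degree figure quoted above can be sanity-checked with simple division. This is a rough sketch: it assumes pixels are spread evenly across the FOV (real HMD optics are not uniform), and the 3840-pixel and 160° figures are taken from the comment, not from a spec sheet:

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Rough PPD estimate, assuming even pixel distribution across the FOV."""
    return horizontal_pixels / horizontal_fov_deg

# ~3840 horizontal pixels per eye stretched over ~160 degrees
print(pixels_per_degree(3840, 160))   # 24.0 PPD, well short of the ~60 PPD "retina" target
```

By comparison, a living-room TV filling roughly 40° of view at 4K lands near 3840 / 40 = 96 PPD, which is why screen-door effect is a VR problem and not a TV problem.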
@JTK - Flight sims might be one good use scenario. For movies though, the field of view is the biggest issue. If you get close enough for a screen to fill much more than about 50 degrees, it gets fatiguing to watch, because you can't take in the whole scene at once and you have to turn your head to see something happening on one edge of the screen or the other. So no matter how big a screen gets, you have to sit a certain distance away for it to be comfortable. That happens to be a distance where most people can't make out the 8K of detail.

There are certain scenarios where you might want to fill more of the field of view, but most movies are not framed that way. Movies almost always frame the image for a small screen, with close-ups that make someone's head fill the entire frame. That kind of close-up doesn't work on a screen that extends into or beyond your peripheral vision.

8K cameras will likely be an important thing, because acquiring 8K video gives more flexibility in editing, but distributing 8K is an issue of extremely diminishing returns. Unless you are in a scenario where people are standing right next to the screen, or a special video designed to create nausea like those 180 or 360 videos at Disney World, 8K doesn't make much sense.

The 1% of people with 20/10 vision, who can also afford a very large screen, might love it though.
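The "distance where most people can't make out the detail" claim can be put in numbers. A minimal sketch, assuming a 16:9 panel and the common ~1 arcminute acuity figure for 20/20 vision (real acuity varies, and sub-pixel effects like aliasing remain visible somewhat beyond this distance):

```python
import math

def max_resolvable_distance_m(diagonal_in: float, horizontal_px: int,
                              acuity_arcmin: float = 1.0) -> float:
    """Distance beyond which one pixel subtends less than the given
    visual acuity (default ~20/20 vision, 1 arcminute). Assumes 16:9."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pitch_m = width_m / horizontal_px          # physical pixel pitch
    theta_rad = math.radians(acuity_arcmin / 60)
    return pitch_m / (2 * math.tan(theta_rad / 2))

# On a 65-inch panel, individual pixels stop being resolvable at roughly:
print(round(max_resolvable_distance_m(65, 7680), 2))  # 8K: ~0.64 m
print(round(max_resolvable_distance_m(65, 3840), 2))  # 4K: ~1.29 m
```

So past about 1.3 m from a 65" screen, a 20/20 viewer cannot resolve even 4K pixels, which matches the argument that a comfortable movie-watching distance throws away the extra 8K detail.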
What are your thoughts on cinema/movie theatre screens? I could imagine something with a big enough screen like an IMAX there might be a perceivable difference. That and *maybe* simulators for training are the only actual use cases I can think of.
Spends thousands of dollars on a liquid cooled GPU. This is totally playable!
@jkeebla if you'll use it over many years then suddenly monthly cost of it does not feel that high
I bought my liquid cooled MSI Suprim 4090 for $1700. I plan on getting a second to double up on AI experiments.
"playable" != "affordable"
I got an AIO EVGA 1080 FTW Hybrid. It's watercooled as standard. (But lots of games are not that playable now haha)
yeah there's no point in a 4090 ti unless you're already a jakillionare
I am currently building a game in Unreal Engine and I think that Nicole hit the nail on the head. It doesn't matter what resolution the TV or graphics card can hit if the dev team only used 1080p materials on the mesh items in the game. I have only seen a few assets in 8K so far without looking for them, so, strictly based on my little experience, I would be willing to bet that most games are only using 4K materials at best, which makes having an 8K card and monitor/TV pointless. Love the content, and thank you for showing me the performance difference between the 3090 and 4090. I have been going back and forth trying to decide which one to get. My 3070 Ti doesn't like some of the graphics experimentation I have been doing with its little 8 GB of memory. Not a big enough difference that I would switch for regular gaming, but with game development and cinematics rendering, the more power the better. Thank you again and keep the great videos coming.
@Crecross haha yeah
@Random Orange Yours is minuscule; 8 GB is really low
@Random Orange I mean that speaks to how non existent yours is.
@NZGames its a cool concept. Guess it all would go down to how its implemented. Also have to make sure the upscale textures themselves look good.
@Mercury oh nah I'm just thinking you'd only upscale textures when you load them into vram. It's (probably) higher load times but it would mean that you don't need every game to take up 200 gigabytes on SSD.
The textures need to be made for 8k in order to tell a difference.
@Nuitari X and you're wrong. It's kind of like 400 hz gaming monitors, it's simply imperceptible after a certain point. This isn't like the old days where people were debating The difference between 30 and 60 FPS, there truly is a finite limit where it makes no difference whether you're getting a thousand FPS or 10,000 FPS, your eyes cannot tell the difference, nor can they tell the difference between an 8K display or a 4K display at certain ranges
@Ghola Tleilaxu 1 gbs internet connections are not enough. Need 10Gbe, 8k tv, 8k assets, 16TB NVMe, etc....8k gaming is still years away
@Joselito B. Maciel Huh? I thought the issue was that these video games have low resolution textures, as in 1080p at best.
@Eclisis Exactly... The whole test is nonsense... I thought Linus was smarter than that....
No, the issue here is that they are using the same 8K TV to do the test. The 8K TV will always display 8K regardless of the input signal; I mean, the 8K TV will upscale the 4K input signal to 8K, and that is why they don't see much difference between the 4K and 8K signals.
I used to be team 1440p. But honestly recently got a laptop with a 4k amoled display and am thinking of upgrading my whole rig to 4k now. Yes it’s overkill but there is just that little extra sharpness that makes me go wow this is how games should look regardless of the screen.
I play at 1080 so I don’t get seduced.
My rig can play 4K with a 4090 and 13700KF, but I just got an ultrawide 1440p monitor, which I love. But yeah, sitting close, 4K is insane. Oh well, 4K monitors are still expensive as fuck and barely available, if at all, in ultrawide with high refresh rates (at least under a thousand bucks).
I run 1440p 165Hz, I can definitely tell the difference between 4k, but 4k 144hz monitors are $500 at the cheapest
@bito Agreed. The whole unnoticeable "thing" stems from standards in viewing distance. A smartphone at 8-14 in. is held closer, so you should be able to easily tell, like a VR headset's display.
@Marvin the LG 27gn950 has all of those features and as far as i can tell comes in just under $700 USD
At 55 years old - 8K looks like 1080p to me.
I think GPUs are going to end up twice as big as the consoles. lol
Yup, and they should be a standalone product with a dedicated mains power supply integrated, just outputting video and taking input from the computer over a new cable that replaces the highly inefficient PCIe one.
My FE 3080ti renders on my HTC Vive Pro 2 at sufficiently high resolution that I can't see any pixels in Half-life: Alyx. I will likely get a newer card at some point in the future once power consumption comes down but right now I see absolutely no need.
i had an HP Reverb G2 with my Gigabyte 1660 Super and i did see pixels, and the GPU had to be overclocked
@Dang Nguyen i owned it too, you can still see SDE... HL:A has no open fields, so resolution is not that much of a problem. But for far sight there is still a lot of resolution they can pack into it... 8K on VR would be similar to 4K on a television...
i owned the HTC Vive Pro 2 as well; had to return it, the binocular overlap was bad on that headset and i didn't like some other features. AND: of course, there was still SDE to be seen... not annoying when playing, but still visible...
Really? I owned 3080 with a quest 2 and played Alyx at the resolution 3056x3172 per eye (steam VR super sampling x1.5) but I still saw shimmering effect with far objects (especially trees). For nearby objects, they were perfect and looked so real. Maybe because of the resolution of Quest 2 that caused shimmering effect? :D
Linus: Don't game at 8K. Me, who still uses a VGA cable with my monitor: :/ Don't worry Linus, I wasn't planning to anyway.
Wow, 8K looks so detailed on 1080p.
yes, 4k and 8k looks the same on my 1k monitor
this comment jesus ahahah
You're lucky, I'm getting "quality unavailable" pretty sure it's playing at 420p. I'm not even sure what I'm looking at.
try it in in 144p
I have an 8K TV, and I've watched native 8K videos and 4K videos upscaled to 8K. You could barely tell the difference. How can the TV upscale 4K to 8K without any problems, when it struggles to upscale 1080p to 4K? It's really simple. A 4K picture has so many pixels and so much detail that it could even be upscaled to 16K if needed: when you have that much information in a picture, you can multiply it out to any number of pixels. Whereas 1080p doesn't have that many pixels or details; even at 1080p there is missing detail, so you can't upscale a picture from pixels that are basically nothing.
8k gaming only makes sense in a VR application and nothing more. That's where the tech is needed.
@Keith Harner no, because with monitors you see each one with both eyes. In VR each eye has its own screen, so each eye sees 4K on its own, but combined that's 8K worth of pixels.
@RKsolar I run three 4K displays. By your logic is that 16K? LOL
@Keith Harner It's actually divided into 4K per eye. That's why.
Really? How many 8K VR headsets do you know of?
Yep. Any other screen is too far from your eyes for you to physically be able to see the difference.
8k tvs also have less vibrant colours for twice the price…
Obviously, they can tell 4K and 8K apart just by the FPS alone. Why did yall leave the FPS counter on? Yall are the best of the best? smh. lol love yall!
Me gaming at 720p-1440p: I think I'm ok.
Them kids, I played on the TV screen with ZX Spectrum compatible in those times. It was totally playable.
As Nicole said, it's the textures. Eventually, textures will probably reach that point, but for right now(just like with 4K), it isn't practical. Within the next 10 years, you'll probably see 8K become a bigger thing, at least with gaming.
It's not just the textures. You can run 16K textures on a 2K display and have them appear sharper the more you zoom in. But the details and outlines of objects are what make it count, and I don't see 8K doing it below a screen size of what, 160 inches?

The problem with the difference between 4K and 8K isn't the detail per se; it's that 4K was crammed into a display size standard that wasn't worth it from the beginning. 4K below 30 inches, for example, is more than wasted performance. 4K below 40 inches is still wasted performance. Around 50 inches is where you start getting more value as opposed to 2K, but increasing the 4K pixel density AGAIN by 4x at monitor sizes that haven't increased much at all in the last 7-8 years is an absolute waste.
@Tenchi That's a red herring. The recommended sitting distance from a TV screen means you aint gonna see the difference between 4 and 8k gaming.
@Tenchi The best way to see the difference is too look at thin models like antennas of cars. Those become solid on higher resolutions. It's especially apparent on Battlefield games.
@Tenchi Likely they chose it for FPS. The game is playable on pretty much anything from the last 10 or more years
CSGO came out 10 years ago, I don't know why they used that game as an example. A single player game with lots of small details that cause shimmer at low resolutions would have been so much better for testing.
Sir, it looks like you teared up when you saw that 8K native stuff.
What is that awesome keyboard/mousepad combo?
I have spent an obscene amount of time testing resolution and actively looking for differences with my big screen LG CX. Even at 55", 2k has always been the sweet spot for performance and quality balance. If you are playing an FPS/something fast paced, 8k is not going to be noticeable unless you are staring at something near motionless, like the sway of the weapon you're carrying. Even 4k is really not worth the performance loss if you're competitive gaming. 2k looks great and runs like butter with all settings maxed on virtually every game.
@Sav i RARELY drop below 70 on 4k on my rtx 3080 with all settings at high
I agree with you guys about grainy. I would argue that 2k looks good and 4k is just crispy. If I'm playing competitively, I prefer whatever gets me high frame rate(100+). If I'm playing Assassin's Creed or something singleplayer/open world then I definitely want 4k; as long as actions scenes don't dip below 60-70 fps. I think the latest offerings from green and red will do great in 4k.
@matthew Gerlach you are wrong. You are welcome. It's OK to call 2560*1440 "2.5K" though, though it's still less smart than calling it QHD, or actually writing down the damn resolution instead of marketing bullshitting terms.
@Евгений Санду 1440p is 2k, so back up dude
@matthew Gerlach while I do admit that quite some people call 2560*1440 "2K", it's just wrong.
What Linus fails to consider is productivity: a 65" 8K screen is the same as 4x 32" 4K screens. The main problem right now is that you cannot push over 8K@60Hz with current cables. As far as I am aware, the monitor they were using was only 60Hz, which would explain the less fluid feel compared to what people are used to. When you go down to 4K it runs at 120Hz.
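The pixel and cable math behind that comment is easy to verify. A quick sketch; the bandwidth figure assumes uncompressed 10-bit RGB (30 bits per pixel) and ignores blanking and link-encoding overhead, which only push it higher:

```python
# Raw pixel counts: one 8K panel carries exactly four 4K panels' worth
px_8k = 7680 * 4320   # 33,177,600 pixels
px_4k = 3840 * 2160   #  8,294,400 pixels
print(px_8k // px_4k)  # 4

# Uncompressed 8K at 60 Hz, 30 bits/pixel, in Gbit/s:
gbit_per_s = px_8k * 60 * 30 / 1e9
print(round(gbit_per_s, 1))  # 59.7 -- already past HDMI 2.1's 48 Gbit/s link
```

This is why 8K60 over today's cables relies on Display Stream Compression rather than a raw signal.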
Dawg. That TV is $49,000AUD here lol. So $30K+USD for a TV, that's unreal
Me with a 1080p monitor and an integrated graphics: Okay, I won't.
I've bought 1080p last year, very happy with 244Hz
@Pessi Escobar mad racist
2 1080p with an 1660 Super: i dont even know what 4k is
@Pessi Escobar bro what
@Pessi Escobar dude, you ain't gotta be rude to him like that. He probably doesn't even game, so why does he need to spend so much money on something he doesn't do, or want to spend so much money on something so small in life? You don't even know the dude, so how are you gonna tell him to get a job? He probably makes way more money than you. I had a laptop with integrated graphics connected to a non-1080p TV, and I was running games perfectly and everything else, running 4K videos smoothly, so don't be out here judging people for what they have.
Should have hidden the fps in the first place to avoid bias😂
Alex guesses every test they throw at him LOL
discovering the endless possibilities with Intel integrated graphics HD 4600. A new dawn in pc gaming. lololol
I'm pretty sure the textures developed by the designers don't magically upscale...
But meshes, procedural shaders and effects do.
I love Colton. Also, Alex and Jake immediately recognizing that it wasn't native 8K was impressive
Not really, they were just looking at the FPS
Employee test, fire the rest of them LOL!
@PCMasterRaceTechGod yeah you got a GPU bottleneck but with technology coming around the corner here soon I'm sure CPUs can start to take over a tiny bit of the load. Something similar to sli. But not nearly as good because it's a CPU not a gpu
Not really, they just deduced that they wouldn't get 200fps running at 8K. Common sense. Not difficult when the FPS counter is on the display.
yea they were just looking at the FPS
I feel like this technology could be game changing for VR headsets and like Linus said with projectors or really large tv's, but knowing that the cards now can run 8k over 60fps means that in a couple years 8k will be the norm most likely! Thanks Linus&Team!
I remember when DPI is what really mattered. I have nothing against 8k or 16k or 24k or 32...whatever. I just don't think it matters as much as basically every other spec. It's kind of like the megahertz wars in that it didn't really matter too much but for the progress directly related to the tech (like material and cooling.) AMD had the first 1 gigahertz cpu but it barely competed with Intel's 500 meg for the most part. I'd take a 27 inch 4k over even a 50 inch 8k any day and especially when you consider the trade offs at the moment. Of course I'm sitting close to the display...
I wonder how well this will hold up, i remember the resistance to 4k/UHD when that came out ...
Is it possible to apply some kind of filter that duplicates every second pixel for half of the screen, effectively giving you a picture where half of the screen is 4K and the other half is 8K, and then they have to pick which is which? That would be an interesting test.
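The split-screen test proposed above is straightforward to prototype in software. A minimal sketch using NumPy (a hypothetical helper, not anything from the video): the left half of an 8K frame is resampled so each 2x2 block shows one duplicated sample, simulating 4K, while the right half stays native. It assumes even frame dimensions:

```python
import numpy as np

def half_and_half(frame_8k: np.ndarray) -> np.ndarray:
    """Left half: simulated 4K (every 2x2 block replaced by its top-left
    sample, pixel-duplicated). Right half: untouched native resolution."""
    out = frame_8k.copy()
    h, w = frame_8k.shape[:2]
    left = frame_8k[:, : w // 2]
    sampled = left[::2, ::2]                      # drop every other row/col
    out[:, : w // 2] = np.repeat(np.repeat(sampled, 2, axis=0), 2, axis=1)
    return out

# Tiny stand-in frame instead of a real 7680x4320 capture
frame = np.arange(8 * 8 * 3, dtype=np.uint8).reshape(8, 8, 3)
result = half_and_half(frame)
assert (result[:, 4:] == frame[:, 4:]).all()      # right half untouched
assert (result[0:2, 0:2] == frame[0, 0]).all()    # left half duplicated 2x2
```

A blind A/B of which half is which would be a cleaner perceptual test than eyeballing two separate runs, since both halves share the same frame, motion, and panel.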
@John True, however it's hardly worth it if you have to get close to pick it.
it's definitely noticable, you just have to get close.
As a 1080p user i can't even imagine what 16 times the resolution of my screen would look like
@EVOLICIOUS no, it isn't.
@Steel ughh no????
I have a 4K monitor and in RDR2 i put the resolution scale in 200 percent which i think is simulating 6k and it looked goddamn incredible
It's a huge difference. My monitors are 1080 and the pixels are fucking huge
@EVOLICIOUS so long as games support 1080p it really doesn’t matter
When you raise the resolution, you have to get better artists; this has been the rule from the beginning. You need games that are written for 8k, then you will notice the difference.
@Zac Johnson Yeah, you would need curved or multiple screens to enjoy the higher resolution. It's not that uncommon for high end gamers to have 3 screens, 3 x 4k is already pretty close to 8k. This card seems more useful to get higher FPS at 4K than really 8K.
If you watched until the end of the video, you’d know that wouldn’t even matter. If you’re the correct distance from the screen, you won’t even notice a difference. Now of course there are those that sit 1 inch away from their screens, but they’re destroying their eyes for the sake of “clarity.”
Well we can finally say that monitor resolution has matured as 4K, with 8K being the good old overshoot recipe, more is better. I've recently owned a Sony 8K and returned it. It was a monster of a TV, but the difference was barely noticeable and only up close.
You'd have to be in the extreme right tail of human vision to get anything out of 8K. 4K is pretty much the peak that makes sense, and 8K is a desperation move to avoid totally commoditizing televisions and GPUs.

And, as noted, 8K needs textures of that resolution. 8K textures are 16 times larger than 2K, meaning roughly 16 times more artist work. That's expensive. Maybe AI assistance could do this in the future (it sucks right now for this), but I think that's a bit of a pipe dream.

We're deep into the flat part of the logarithmic improvement curve now.
@Nathan Bronx But your eyes do, eventually, have a limit. 8K is about the estimated resolution of the whole human visual field (if you have good vision). But you don't fill the whole visual field with your screen, more like half... so 4K is about as good as it ever needs to get. Except for VR, where it is about 8K (and eye tracking will eventually be necessary).

There are plenty of other ways to make graphics better, like better ray tracing implementations, better models, better textures, etc. It would make more sense to pursue those than just crank up resolution forever.
people said the same about 1080p, and then about 4k, and then about monitors above refresh rates of 144, 8k isnt ready for the average consumer yet. it sucks to not be at the top, with the highest gear imaginable so you can flex it on your little subreddits. but in reality, most of us don't need anything past a mid tier setup.
Personally, I don't see the point in increasing it any further. 4k already gives a lot of detail and helps illustrate my reason for playing a game, the story. Any more detail just adds too much realism and a game isn't normally about realism, it's about having fun playing out a mission, a story, or just competing against friends. A game is normally means of escaping reality, so adding more realism would just take away from it. I guess another way of saying it is that 8K is just too much of a good thing.
Agreed. 4k 144hz is the peak for me. Anything more and the diminishing returns make it unnoticeable
8k looks so out of focus 😂
8K is way past the point of diminishing returns. 4K upscaled with some temporal handling would pretty much be the max high end for displays, I would think. 4K is already usually retina.

Without temporal handling, the main benefit of 8K wouldn't be so much the output pixels as the increased number of samples. I.e., 8K native and 8K rendered on a 4K screen would look pretty much identical; 4K might actually look better because there would be less pixel pop-in. So I think we are more or less "done" in terms of resolution, and for displays, dynamic range and color depth would be more important.

The main problem with 8K gaming is that it is so rare that devs are rarely going to design around it, when they could instead focus on 4K and put more of that horsepower into lighting and assets. Going much above mainstream resolutions would be mostly about playing older titles that don't know what to do with the extra horsepower.

Beyond that, the main frontier might be VR, which can ALWAYS use more pixels, and 8K would be of benefit. But eye tracking and foveated rendering would very much help drive that. VR would be all about brightness, contrast, and FOV. Dynamic focal length pixels would be absolutely SICK. So they should probably let display manufacturers focus on the pixel count while VR manufacturers focus on... focus.
I still have a 1080p monitor, but at 360Hz; I think you'll get much more out of that than a 4K or 8K monitor. You don't really need such a high resolution for a good gaming experience. And beyond that, such a high resolution also has some side effects, because some programs or game mods don't support anything above 1080p, which causes annoying things like way too small windows or icons and makes some games or mods even unplayable. It's mostly the little things that cause this, but they can still be very important. I once had a 4K monitor and had nothing but problems and issues with it. So 1080p is not only way cheaper, but also the best option for me.
I finally got my first 24" monitor, 144Hz at 1080p. I was blown away
So proud of Alex for instantly catching onto the fact that it was actually not 8k
@Uwe Pieper You would be surprised how much of a difference it makes though. 4k Textures look fantastic in 4k, not so much in 1080 or even 1440. It's a surefire way to tell
@Cavey Möth But tree leaves are not always models. Those are often textures with an alpha channel. But depends on the game.
@Uwe Pieper Yeahh, gotta look at the borders of objects. Power lines and edges of buildings are often very aliased without AA at 4k. Heck, even with AA at 4k, tree leaves can look very blocky and difficult to discern from each other, especially when in motion. Far Cry 4's trees annoy me in particular because they look jaggy, no matter what AA setting I use. I guess DLSS would probably help mitigate the artifacts (Or even TAA like in Far Cry 5).
@Uwe Pieper I was thinking the same exact thing.
@Mike Well kind of. You need any modeled parts that has fine details and might show clipping or stairs. So diagonal edges are fine. The power lines Linus looked at were perfect. Trees in the distance could be textures, as well to save performance. Textures have a limited resolution and will look bad if you get too close. The mesh has an unlimited resolution. You might have simple objects but you can scale those up to infinity.
surely i could have the old "useless" 3090
People like to walk up to these low resolution textures. But the actual benefit of 8K over 4K is that a sign with text on it can appear at half the height on the TV and still have the same number of pixels. The benefit is objects and textures in the distance, not close up. But that's if you can even see the pixels in 4K.
Which, unless you are sitting uncomfortably close to a 4k display, you won't ever see the pixels.
Surprised Linus didn't go into the implications of 8k for VR. I know there's probably not any headset out there that can really take advantage of this yet, but I think if 8k has any practical benefit, it'd be with VR.
I was quite happy with moving from 1080p to 1440p. Decided to try 4k after that and the experience was quite underwhelming with Rx 6900xt. Maybe building a 4070 with 1440p would be ideal..
I suppose 4K/8K is only useful when everything in a game is photorealistic (with real life textures, lighting, color and shapes). The Tomb Raider in this video is still not a photorealistic game to me...it should be like this game chclip.net/video/RN0OWRmIPdk/video.html&ab_channel=ReallyGoodGames
This game that you linked is such a bad showcase for UE5. Just free drag and drop assets out of the quixel library, default unreal engine vehicle game template (lack of speed, wonky physics), overblown post-process effects, etc, etc. This is the equivalent of a next-gen steam greenlight scam game. Whats showcased in this video can be done in 2-3 days of 'work'.
Doesn't the screen's processor automatically upscale 4K content to appear as close to 8K as possible? Would a 4K screen of the exact same size as the 8K screen change the results for the end viewer, with a more noticeable difference?
Yeah, the settings of the TV itself are very important here, since they'll affect things like the sharpness (which is most of what you'll get from increasing resolution in a game without higher resolution textures) and even framerate if you use smoothing (though they don't seem to be using that here).
4K 240hz or 360hz might be the pinnacle of gaming in the future. Anything crazier may not have any benefit due to diminishing returns.
The only place I think it'd be relevant is in VR applications, a beautiful 8k panel would definitely look really sharp inside of a vr headset
Id like to see a comparison of 4k vs 8k for a game that is actually designed to run in either. Zooming in on low fidelity textures is fruitless as they are not designed to take advantage of the hardware...I have been gaming in 1080 for a while now, and have been seriously considering a move to 4k - however, this would require investing into a full upgrade for my hardware, my monitors, and my xbox to take any advantage of (you are only as fast as your slowest part...) I'm trying to discern if it is even worth it right now, as my hardware still runs new games with decent performance, and I'm still not convinced enough games are built to take advantage of 4k hardware.
The only thing that could benefit from more than 4K are VR displays.
Flat screen gaming might not gain much from a 4090 but I know that VR could def use the boost. I have a 3090 and there a tons of games you can’t run at a stable frame rate with native resolution ESPECIALLY simulators.
Literally been talking about this since 2016. I got a 34" 21:9 ultrawide monitor, my buddy got a 32" 4K monitor. Whenever we gamed on my monitor, he said the extra horizontal width and added visibility were amazing; meanwhile, the extra vertical pixels on his 4K screen aren't noticeable. Like, at all. On top of the extra space, my monitor even achieved more frames, because technically it's 20-30% fewer pixels that need to be rendered.

So yes, you can go above and beyond with the 4K monitor in terms of size, but where will you sit in front of a 50" "monitor"? Sit too close and you need to turn your head all the time and will get headaches; sit too far away and you won't be able to see the small details anymore that you might need.
Nicole has a good point about the texture assets, if they aren't 8k, 4k etc. it really isn't going to matter. Older games definitely were not using even 4k textures.
Only a few games I have use 4K in any meaningful way. Brotato looks sick though...
@RecurveNinja yeah, textures for specific big stuff like dinosaurs or giant ground planes/maps. But usually 1-2K (even in CG work) is quite enough. Doing stuff for games needs to be well optimized and the objects with textures on them that are seen don't usually take the whole frame, so no need for them to be super high.
@TheMrLeoniasty There are quite a lot of 4K textures in modern games. Not everything, but more than you would expect. Game artists often work one level of detail higher than what will ship, so if they're going to ship 2K textures, they'll work at 4K so if the game is updated with a high-res texture pack DLC, they can just re-export at 4K. A lot of environment textures in games like MGS V are shipped at 4K simply because they're designed to fit on a 4x4 meter piece of land. I haven't seen a game ship with 8K textures yet (aside from things like RAGE megatextures), but it wouldn't surprise me if they did. 8K and even 16K are very common in CG work outside of games. A big dinosaur for a hollywood blockbuster might have 40+ 8-16K UDIM 'materials' with 4-8 textures each.
@Rares Macovei Yes, but the textures themselves are not 4K; that would weigh too much and quickly exhaust all the VRAM.
It can matter. It's comparable to walking up to a wall in-game vs sitting closer to the monitor in real life. Walking closer to a texture is the same as zooming in on an image. Even on a 1080p screen you could zoom so far into an 8k image that it's possible to count the pixels (since zooming in basically reduces the pixel density).
I've been playing my first gen Xbone on the same 1080p 60hz I bought it with, can't imagine what I've been missing.
The average consumer setup will be so much more amazing in about 5 more generations, as the bottom-tier tech will be significantly more affordable without necessarily losing out on any image fidelity.
8K gaming might be a VR thing in the future. With some VR headsets displaying 4K per eye.
I made an experiment with ground dried clove tree leaf because of the mineral elements contained within it. I found it to be plasmonic because of the copper, and dielectric and semiconducting as well. I put it on a magnetic induction stove and it did shine. But you just gave me an insight when the guy said "that's a lot of pixels," because my phone video of the magnetism effect of ground clove took 20 GB for a 10-second clip while I was looking at the plasmonic resonance of the cloves. I found the vibrations of the elements resulted in trillions of additional pixels for my camera to record, increasing the file sizes at least 100-fold. This means these nanoparticle excitations produce more pixel resolution at the nano scale; I wonder if screen material could be produced out of those.
I have worked in games a long time and texture resolution is always a challenge due to memory. Normal texture sizes are often 2048x2048, but 4096x4096 is also used, even though it's a bit less common. I try to author our textures in 8k to make them future-proof, but often don't submit them to the game because they take so much disk space. If you walk up to a wall you could even argue that the texture resolution needs to be higher than the screen resolution if you are close enough. I would say that most games do not have enough texture resolution to do 4K gaming justice.
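The memory pressure this developer describes can be sketched with back-of-the-envelope math. A rough sketch, assuming uncompressed RGBA8 (4 bytes per pixel); real games use block compression (e.g. BC7 at 1 byte per pixel), so divide the results by roughly 4:

```python
def texture_mib(size, bytes_per_pixel=4, mipmaps=True):
    """Approximate VRAM cost of one square texture in MiB."""
    base = size * size * bytes_per_pixel
    # A full mip chain adds roughly one third on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / 2**20

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: {texture_mib(size):.0f} MiB")
```

Going from 2K to 8K per texture is a 16x jump in memory, which is why authoring at 8K but shipping at 2K or 4K, as described above, is the usual compromise.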
@bellowingsilence Upscaling can actually improve the effective resolution of the source
@Bud The Cyborg this is also, low key, part of why low res video and graphics always looked fine to people on CRT TVs. The grid and scanlines would softly apply a sort of fine texture to literally everything.
Neither do even some highly well made, gorgeous animated movie[s]. Many of those are actually upscaled to 4K, and the benefit for the viewer is simply a higher bitrate from the source, spread over a higher resolution.
Why don't you make a free DLC with higher texture resolution, to keep it optional? War Thunder did a similar thing.
@Alexander Schu Are you talking about Kkrieger from 2004? It was a 96k executable. The algos they used to generate stuff "from thin air" were quite insane.
I want a 4090. I don't even game at 8k. like my 3090 does 4k just fine but like...... 4090
Bro, Halo Infinite isn't "very demanding," it's basically a modern TF2.
Linus when not able to play 8k: "I need to play 8k, it's so much better." When he gets to play: "it's fucking bad."
This is like how 4k gaming was "totally possible" with a 1070, but no one would EVER use a 1070 for that now. We need a buuuunnch more generations of GPUs before it'll be feasible.
I still use my 1070 for 4k
Surely 8k is feasible for playing old games like Battlefield 3 which came out 11 years ago. It would look so much better.
I do like how Alex immediately noticed it was 4k because it was too good for 8k
Jake as well. He noticed how high the fps was.
I absolutely loved how Alex was like "nope, you guys are up to something. I know where I work."
Lol, he saw the fps, Alex knows when he's being used as a lab rat
I have watched 8k and even 12k video on YouTube on my 55" 4K OLED TV. It all looks amazing, and it's funny that I can see a difference on my 4K OLED TV, which you'd think you wouldn't notice, but you can tell. Crazy cool how my 4K TV handles it totally fine and the 8K and 12K look no different from one another. OLED FTW!
I think its great on 4k screens with DSR on older titles. No AA looks as good as DSR. Natively I think 8K has rather limited value. Not now, not ever.
Gaming has really been pushing the limit of technology
I love 8K for editing because you can crop in with no quality loss, but for anything I'm consuming I never really see the point of going above 4K.
27" 2K (1440p) 144hz is actually a pretty good spot for gaming at a desk. At 80cm from the screen, eye resolution and pixel density match up great.
I thought that it's gonna work similarly to 4K on RTX 2060, but it's actually playable
I mean it quite useful to know a lot of sim racer would love to run triple 4k monitors or super high fidelity vr gear at max setting with max fps and triple 4k is almost 8k in resolution.
Good old pixel pitch... I'd love to see you do something like a spreadsheet with mixed text and graphics, with the same font and size, to see if they still work. Screen fonts are better than when Verdana was created, back when no fonts aligned with CRT pixels and were harder to read/more blurry. Will going to 8k mean we need more pixel-heavy fonts?
I like that Nicole made a very good point: if a game wasn't designed for 8k, or even 4k, the assets may not even benefit from the higher res. Imo a better use of a card like the 4090 really is at least playing at higher quality settings while still having headroom.
@Tenchi a straight line is more than enough to see the resolution
@Eduardo Santiago The geometry would have been a lot more noticeable in a game like Dark Souls 3 or Elden Ring, CSGO and that shooting ground really weren't it.
"The assets may not even benefit from the higher res." And that's one of the greatest benefits of making games that assume 8K is the target resolution. For the last 20 years one of my biggest gripes with games has been muddy textures; now that we're in the 4K era it's getting better, but we really can push texture quality a lot higher.
I play FNV at 4k with no mods. Looks awful and it's great
@Motokid600 You guys are making some real good points, gotta be honest.
this 4k video feels low bitrate AF
8k will be great in the future. I currently have a 4090 with a 13900KF. In some games 8k DSR looks much better than native 4k, like in Mafia: Definitive Edition or The Witcher 3 next-gen.
I prefer the IMAX experience when watching films or playing games; sitting close so it takes up my entire field of view is a much more immersive experience. When your eyeballs are 70-100cm from the screen on a 65-inch 8k TV, you notice it more. I still use 4K input and let my 4th-gen LG AI upscaler or DLSS get to 8K, as it's pointless to do it natively until the in-game textures are 8k to begin with.
Good video, and well, just as expected I'd say... 8K would be nice on a 32" desktop display for productivity, editing etc. But huge TVs? Probably not so much... Granted, I usually work on a 15" laptop that has basically 4K resolution and it's fantastic, almost like a smartphone screen. So that scaled up to 8K at ~30" would be fantastic on a desktop as well. Finally, no visible pixels! But besides the fact that this needs drastic performance increases in the future, at this point I don't think it's a rational choice. Especially not with upscaling like DLSS. And anything upwards of 8K on a desktop would in my view be actually unnecessary and useless...
I think it's going to be pointless until game companies start optimizing their games for 8k monitors (PS: I'm at the beginning of the video, idk if Linus says this further on).
Hahaha! I've been saying I can't wait for 8k to become a thing. 4k still hasn't really become standard. It's really fucking close though. The reason I can't wait is so everything 4k can just drop in price.
8K is 4 times the pixel count of 4K, i.e.: 4K is 3840x2160 = 8,294,400 pixels, whereas 8K is not 4K x 2 but rather 7680x4320 = 33,177,600 pixels. If you compare the pixel counts, 8K uses exactly 4 times the pixels of 4K: 33,177,600 / 8,294,400 = 4. So a card that averages 80fps in 4K will do 15-18fps in 8K, provided the amount of RAM and the GPU can handle it... I wanted to point this out because many newcomers, from lack of experience, might think 8K is double 4K... it isn't... it's 4 times as much, in terms of pixels...
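The arithmetic in the comment above checks out, and a naive fill-rate-bound estimate follows directly from it:

```python
# 8K has exactly 4x the pixels of 4K, so a purely pixel-bound
# estimate divides the frame rate by 4.
uhd_4k = 3840 * 2160    # 8,294,400 pixels
uhd_8k = 7680 * 4320    # 33,177,600 pixels

ratio = uhd_8k / uhd_4k
print(ratio)            # 4.0
print(80 / ratio)       # 80 fps at 4K -> 20 fps at 8K, best case
```

Real scaling is rarely this clean, since not all of the frame cost is per-pixel, which is why the comment's observed 15-18 fps sits a bit below the naive 20 fps estimate.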
7:06 there's some screen tearing
As others have said, if the textures aren't natively 8k then you won't see more *in the textures*. The real difference is in the quality of the lines where surfaces meet, and how sharp or less stepped and aliased they are. Take a cube and turn it 45 degrees so the lines are angled, compare those between 4k and 8k, and you should see a difference.
8K won't be a thing until 2026 when games will be talking about 8k textures and an 88 inch 8k OLED will be 10k or less....
I mean technically you can more clearly see the pixels of the textures
I can't believe those idiots were looking at the textures instead of the edges. And these people have jobs at a gaming channel. Unbelievable.
you could almost argue at that point, due to the sheer pixel density, that you could basically disable AA and it would still look buttery smooth
With AMD's DisplayPort 2.1, it's capable of 165 fps at 8k. I have a feeling only the third-party cards pull it off though, with 3 instead of 2 8-pin plugs planned.
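A quick sanity check on the "8K at 165 Hz over DisplayPort 2.1" claim. This sketch assumes 10-bit RGB (30 bits per pixel) and ignores blanking intervals, so the real requirement is somewhat higher; the point is that even DP 2.1's fastest link rate (UHBR20, roughly 77 Gbit/s of usable payload) can't carry this uncompressed, so Display Stream Compression is doing the heavy lifting:

```python
pixels = 7680 * 4320
raw_gbps = pixels * 165 * 30 / 1e9   # ~164 Gbit/s uncompressed

UHBR20_PAYLOAD = 77.37               # approx. usable Gbit/s on DP 2.1 UHBR20
print(f"needs {raw_gbps:.0f} Gbit/s raw -> "
      f"DSC ratio >= {raw_gbps / UHBR20_PAYLOAD:.1f}:1")
```

That works out to at least a ~2:1 compression ratio, which is well within DSC's usual operating range, so the spec claim is plausible but not an uncompressed signal.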
I do play 6K DSR on 2x 4k at 120 Hz, which pushes the DisplayPort to its limit, but it works flawlessly at 170 FOV, and it's great to play Callisto like that in VR. Sadly OC does not make much sense on this model; still a good card for VR users.
Would VR be a situation where 8k could be kind of important still?
To compare 8k to 4k you need two TVs of the same size, one native 8k and one native 4k, running the same game; 4k on a native 8k screen looks different than 4k on a native 4k screen.
I own a projector and have a screen size of about 2.2 meters. From a distance of about 5 meters, it's not possible for me to see any difference between 2K and 4K material :) The difference is not as big as with old PAL/NTSC shows compared to 2K stuff.
(17:20) So what about metric?
Well, I guess we still don't know of any game more demanding and beautiful than Red Dead Redemption 2, a 2018 game, even in 2022. I get it! But if you guys really want to see what it takes to play native 8k with RTX on in December 2022, you'd better test it with something like A Plague Tale: Requiem; it's way more demanding and prettier than Shadow of the Tomb Raider by now, I'd guess!
I have a 65r656 and when I use the vrr with my ps5 the resolutions jumps around causing flickering and the resolution info in the corner constantly. Please help. Should I just leave vrr off?
@Daniel Daly Mine has been working fine now. I just unplugged the TV and waited like 30 seconds, and then the VRR worked. Still get the occasional bug in menu screens, but in-game it's perfect.
Yes. Both the 646 and 656 have issues with vrr @ 4k/120. Not a PS5 issue, happens on PC too. Funny enough it seems to work just fine at 60hz and below but why bother? Just turn it off and enjoy, it's still an awesome TV for dirt cheap. I got the 65" 646 and got my old PC connected to it in the bedroom. Love it
I love how Alex and Jake immediately noticed something is fishy from the fps counter alone :D
@Dexo They probably left it there to stop people from accusing them of faking the resolution or something.
@adlata tech jesus is boring, don't even compare him to Linus, you damn nerds
@adlata Wait... are you really telling me that I do the same thing as a dude with a [supposed] net worth of $40M?! WOW!
@adlata you need to take a chill pill 😂 it's not that serious.
@adlata okok we get it cool down
what lap accessory is he using for his keyboard and mouse?
roccat something I think it's discontinued
For me the go-to is 1080p ultra with some sharpening; looks great and plays great.
I tried Cyberpunk at 8K on my 4090 when I got it. I could barely tell the difference to be honest but then again I was gaming on a 4K monitor.
I'm still gaming at 1080p on a TV locked to 60fps. And it honestly is totally fine, I have no interest in upgrading at this time. 60 fps is fast enough for me and I only have to use a fraction of my GPUs power so I don't have to worry about my case, which really does not have good thermals. Maybe when I can get a 4k TV or monitor for 200 bucks I'll consider it.
@Klaus Jr. i can deal with 60fps but it needs to be a singleplayer game. i cant do anything remotely competitive unless its 120hz or more
@FatKiefBowls 8 years... I just built a PC that can run 144hz at 1440p, but I'm glad I still don't find it noticeable enough to my eyes. There's no way back from under 50fps, though.
@Klaus Jr. to me yeah but ive been playing at 144hz/fps for like 8 years
@FatKiefBowls is it as noticable as 30 to 60?
i wish i never played at 144hz. 60fps hurts my eyes it looks so terrible because I'm so used to 144.