4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.
*monkey’s paw curls*
Granted! Everything’s just internal render 25% scale and massive amounts of TAA.
He said next-gen not current gen. :/
For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.
TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.
Divide the resolution by 3 though; current-gen upscaling tech can give that much: 4k = upscaled 720p and 8k = upscaled 1440p
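A rough back-of-the-napkin sketch of that divide-by-3 arithmetic, in Python. Note the 3x-per-axis factor is the comment's assumption, not a fixed property of any particular upscaler:

```python
# Sketch of the "divide resolution by 3" claim: upscaling by 3x per axis
# maps an internal 720p render to 4k and an internal 1440p render to 8k.
# The 3x factor is an assumption from the comment, not a DLSS/FSR spec.
targets = {"4k": 2160, "8k": 4320}  # output height in lines

for name, height in targets.items():
    internal = height // 3  # internal render height before upscaling
    print(f"{name} ({height}p) = upscaled {internal}p")
```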
That’s how I feel when people complain about 4k only being 30fps on PS5.
I laugh because my 1080p tv lets the PS5 output at like 800fps.
Has anyone else here never actually bought a TV? I’ve been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I’m going to be really sad when they kick the bucket.
I’ve been using the same two TVs since 2008 and I have zero desire to upgrade.
Any TV is a dumb TV if you plug a Kodi box in the HDMI and never use the smart trash.
I set up a TV for my mother-in-law. No joke, I had to register with an email before it would let me switch to HDMI.
That’s terrible. And good to know how low they’ve sunk.
I was given a free, very decent dumb TV and upgraded it to a smart TV with a $5 Steam Link, and ran a Cat 6 cable to it from my router. Best $5 ever. Have no intention of buying a new one. If I ever do, I will try my hardest to make sure it’s a dumb one. I know they sell “commercial displays” that are basically a TV with no third-party apps or a way to install them.
Yes, people like me buy TVs. I’m the guy who keeps giving away perfectly good TVs to other people because I’ve bought a new one and don’t want to store the old one. I’ve given away 2 smart TVs so far, though I’m not sure what I’ll do with my current one when I inevitably upgrade.
I used my family’s first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it’s pretty nice having so much screen, but I’m never getting rid of the ol’ Sanyo.
One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the ‘smart’ TV era though.
Jokes on you – I’m still using the last TV I bought in 2005. It has 2 HDMI ports and supports 1080i!
I miss this the most; older TV models would have like over 30 ports to connect anything you wanted. All newer models just have like 1 HDMI connection, if even that.
To add, these older screens last. New stuff just dies after a few years, or gets killed with a firmware upgrade.
PSA: Don’t connect your “smart” appliances to the internet, folks.
We had an older Hitachi tv with 4 HDMI plus component plus RCA input and 4 different options for audio input.
New Samsung TV. 2 HDMI, that’s it. One is ARC, which is the only audio interface besides TOSLINK, so there’s effectively 1 HDMI to use.
But of course all the lovely ~~spyware~~ smart features more than make up for it.

I’ve got 4 HDMI 4k 120Hz ports on my LG…
Are you telling me there are modern TVs with only 1 HDMI port??
I was curious so I went and browsed some budget TVs on Walmart’s website. Even the no-name budget ones all had 3 HDMI. Maybe if it’s meant to be a monitor instead of a living room TV, but I just looked at living-room-style TVs.
Thanks for putting in the work. People like you make the internet a better place
Mine’s running a Wii on component YPbPr video. Looks mint!
4k is the reasonable limit, combined with 120 FPS or so. Beyond that, the returns are extremely diminished and aren’t worth truly considering.
“480 / 720 / 1080 / 1440 / 4k is as much as anyone’s gonna need, the next highest thing doesn’t look that much better”

There are legitimately diminishing returns. Realistically I would say 1080p would have been fine as the max, but 4k really is the sweet spot. Eventually, there is a physical limit.
I fully agree, but I also try to stay aware of when I’m repeating patterns. I thought the same thing about 1080p that I now do about 4k, and I want to be aware that I could be wrong again.
Yep, I’m aware of it too. The biggest thing for me is that we know we are much closer to physical limitations now than we ever were before. I believe efficiency is going to be the focus, and perhaps energy consumption will get more attention than raw performance gains, alongside sound computing practices.
Once we hit that theoretical ceiling on the hardware level, performance will likely be gained at the software level, with more efficient and clean code.
One of my TVs is 720p. The other is 1080p. The quality is just fine for me. Neither is a ‘smart’ TV and neither connects to the internet.
I will use them until they can no longer be used.
The last TV I owned was an old CRT that was built in the 70s. I repaired it, and connected the NES and eventually the SNES to it. Haven’t had a need for a TV ever since I went to university, joined IT, and gained a steady supply of second hand monitors.
My son is on his 3rd DualSense controller in about 18 months.
Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.
My Xbox Series S controller got stick drift like 3 months after I got it. My friend’s finally succumbed last week, after about a year of owning it. What is it with stick drift on new controllers? Seems like every modern system has the exact same problem
We are at a point where 4k rtx is barely viable if you have a money tree.
Why the fuck would you wanna move to 8k?
I’m contemplating getting 1440p for my setup, as it seems a decent obtainable option.
8k 15fps will be glorious.
lol
It’s all about dlss
And getting the newest GPU every year, because they lock you out of the most recent DLSS update when you don’t upgrade to the newest lineup, right?
Or just make dlss access a subscription
Make it pirateable then, I ain’t getting no subscription on hardware functionality.
Fuck that shit to the high heavens and back.
Oh I’m sure some folks will figure out how to pirate it for sure. But as long as big businesses pay the sub fee, Nvidia won’t give a shit about us.
My 46" Sharp Aquos that I paid $2,000 for in 2004 is still chugging along like a champ. It’s been used nearly daily.
Cherish it (though maybe not its power requirements?) - based on the big ole chunky bois I’ve seen at the dump 📺 (looked like those rear projector models or something).
Same here. 40” Sharp Aquos quattron not only still working, but working flawlessly. It’s also got way more inputs than any TV that size today, and a stand that swivels that I use all the time. I’m in no hurry to replace it.
Me still rocking the 1080p 42 inch I bought off a coworker for $50 10 years ago
I mean, you can get 4K TVs for cheap and fix them (As long as the display is NOT damaged, once that’s gone the TV is nothing but scrap)
Got a 60 inch 4K HDR TV for free off Facebook, the led backlights had just gone out. $20 for a replacement set, 2 hours of my time and a couple cuts on my hand and it’s been a fantastic TV since lmao
The performance difference between 1080p and 720p on my computer makes me really question if 4k is worth it. My computer isn’t very good because it has an APU and it’s actually shocking what will run on it at low res. If I had a GPU that could run 4k I’d just use 1080p and have 120fps all the time.
1440p is the sweet spot. Very affordable these days to hit high FPS at 1440 including the monitors you need to drive it.
1080@120 is definitely low budget tier at this point.
Check out the PC Builder YouTube channel. Guy is great at talking gaming PC builds, prices, performance.
TL;DR: Higher resolutions afford greater screen sizes and closer viewing distances.
There’s a treadmill effect when it comes to higher resolutions
You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience
The reason to upgrade to a higher resolution is because you want bigger screens
If you want a TV for a monitor, for instance, you’ll want 4k because you’re close enough that you’ll be able to SEE the pixels otherwise.
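A rough sketch of that "can you see the pixels" math, with hypothetical numbers (a 55" 16:9 panel at desk vs. couch distance, and ~60 pixels per degree as the common 20/20-vision rule of thumb; none of these figures come from the thread):

```python
import math

# Hypothetical setup: a 55" 16:9 TV used as a desk monitor at 30",
# vs. the same TV across a living room at 96". Roughly 60 pixels per
# degree (1 arcminute per pixel) is a common 20/20-vision threshold.
def pixels_per_degree(diagonal_in, horizontal_px, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    px_size = width_in / horizontal_px               # one pixel, in inches
    deg_per_px = math.degrees(2 * math.atan(px_size / (2 * distance_in)))
    return 1 / deg_per_px

for res_name, px in [("1080p", 1920), ("4k", 3840)]:
    for dist in (30, 96):
        ppd = pixels_per_degree(55, px, dist)
        verdict = "pixels visible" if ppd < 60 else "pixels invisible"
        print(f'{res_name} 55" at {dist}": {ppd:.0f} ppd ({verdict})')
```

By this estimate, a 55" 1080p screen at desk distance sits well under the threshold (pixels visible), while 4k across the room sits well over it, which matches the comment's point about TVs used as monitors.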
This is sort of how I feel about 3D movies and why I never go to them. After about 20 minutes, I mostly stop noticing the 3D.
My takeaway from this comment section is that smart TVs are straight from hell and should be treated as such. It is very important that you get a TV from BEFORE smart TVs were a thing.
Display technology has advanced quite a bit since smart TVs became ubiquitous, though. So you are sacrificing quality to avoid those headaches.
Personally I just don’t give my smart TV an internet connection.
That’s what I did too. It has no connection and I don’t use any of the smart TV features. Instead I have my own box I’m using. I never felt this stupid.
Mine is that I’m not in the same demographic as most Lemmy users who comment here.
Lmao sorry
Lol my phone has the best GPU and display in my house, and has raw specs of half the ram and cores of my 2012 desktop 😹
At work all day I remind people that a container with 1 vcpu and 2GB of ram is like running on a ten year old phone, theoretically 🙃
I game at 1440p. The day my 1080ti dies will be a sad day indeed.
I went from a 1070Ti to a €600 4070.
There was no need whatsoever for that. I have noticed almost no difference at 1440p 165Hz except newer games go like 20 fps higher.
I was thinking VR would be hugely different. It wasn’t, and now my headset cable is broken so I can’t even game in VR.
That kid was about as cool as kids could be back then. I wonder what he’s up to today.
https://knowyourmeme.com/memes/brent-rambo
Work(s/ed?) for Sony as of 2013 and worked on Planetside 2. Pretty cool!
> Work(s/ed?) for Sony as of 2013

Funny turn of events considering the meme