The 1080ti is already sold out everywhere in the US it seems. That didn't take long.
Make it color calibrated, 144Hz, and VA/something else awesome and I'll be sold!
- - - Updated - - -
Don't bother with him man. He's just angry lately. Also, for what it's worth... You were a bit more correct. True 2k is 2048x (unspecified) and REAL (read: true) 4k is actually the resolution you listed, 4096x2160.
So if someone wants to correct you on specifics, well, that's just silly. Here, have a source, @Lathais.
https://www.cnet.com/news/tv-resolut...they-all-mean/
He was wrong, and Artorius and Lathais were actually correct.
"That means 1080p is not "1K." It's 2K, as much as UHD TVs are 4K. Which is to say, at 1,920x1,080 they're close to the DCI's 2K specification of 2,048. That said, most people don't call 1080p 2K; they call it 1080p or Full HD."
Taken from your article
I get you.
It is like they were all a little wrong, though I didn't quite see where @Artorius posted. Q.Q
Close to 4k is not true 4k, just like close to 2k is not true 2k. If people want to get specific, then they need to make sure what they post is entirely true, otherwise everyone looks just a little dumb, as we can see here. hehe
I don't want to make too big a deal of it; it's just that the guy said 1080p is 1k, when it's actually almost 2k, so that's more than a little wrong.
Screen resolutions get referenced in too many different ways, and then you get these incorrect statements. Anyway, this is way off-topic, so I'm gonna leave it at that.
No matter how you slice it, 1080p is not 1k and 1440p is not 2k, despite people always referring to them as such. It's also pretty funny that you say I'm wrong and quote a source, when your source says exactly what I said:
"But now that "4K" has gained traction as a term used to describe TVs and content, "2K" is becoming increasingly common as shorthand for the 1080p resolution used by most HDTVs, as well as Blu-ray. It's not technically accurate, but that didn't stop "4K" from becoming more popular than UHD."
2k is becoming shorthand for 1080p the same way 4k became shorthand for 2160p/UHD. Calling 1080p 2k is commonly accepted. Calling it 1k is most certainly not accepted anywhere, and neither is calling 1440p 2k.
If you are allowed to round 3840 to 4k, I'm allowed to round 1920 to 1k and 2560 to 2k, especially if it's not important. You just round up, I round down. And you cannot say I'm doing it the wrong way. Unless you are bad at math and computing.
Or just use the HD (1280*720), FHD (1920*1080), (W)QHD (2560*1440), UWQHD (3440*1440), UHD (3840*2160), DCI 4K (4096*2160) designations.
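To put those designations in GPU-load terms, here's a quick throwaway sketch (Python, purely illustrative, the labels are just the ones from the post above) that prints each one's pixel count and how it compares to FHD:

```python
# Rough pixel-count comparison of the common designations (illustrative only).
resolutions = {
    "HD":     (1280, 720),
    "FHD":    (1920, 1080),
    "(W)QHD": (2560, 1440),
    "UWQHD":  (3440, 1440),
    "UHD":    (3840, 2160),
    "DCI 4K": (4096, 2160),
}

fhd_pixels = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:7} {w}x{h} = {pixels:>9,} px ({pixels / fhd_pixels:.2f}x FHD)")
```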
For what it's worth, the actual TV and film industry doesn't care as much as this thread does right now (on a conversational level). When I was at NAB last year, UHD was commonly referred to as 4K, especially in speech, while if someone meant the other they'd specify "DCI"- or "cinema"-4K.
Rounding 1920 to 1000 rather than 2000 is a bit of a stretch, though. That would confuse eeeveryyyonneeee.
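If anyone wants a sanity check, ordinary round-to-nearest agrees; a two-line illustration using Python's built-in round, nothing fancy:

```python
# Round each horizontal pixel count to the nearest thousand (standard rounding).
for width in (1920, 2560, 3840, 4096):
    print(f"{width} -> ~{round(width / 1000)}k")
# Output: 1920 -> ~2k, 2560 -> ~3k, 3840 -> ~4k, 4096 -> ~4k
```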
Slightly related to the unrelated topic: on my second PC I have set both my HDMI VA monitors to limited 16-235 range (with the corresponding setting in the nvidia driver, ofc) and I don't care. They seem to like it more; easier to calibrate and no weird shifts in certain colors. Though we're talking basic 100-dollar monitors here.
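For anyone wondering what limited range actually does to the signal, it's just a linear squeeze of the full 0-255 range into 16-235. A toy sketch of that mapping (my own function, not anything from the nvidia driver):

```python
def full_to_limited(value: int) -> int:
    """Map a full-range 8-bit value (0-255) into limited/video range (16-235)."""
    return round(16 + value * (235 - 16) / 255)

# Sample points: black, mid grey, white.
for v in (0, 128, 255):
    print(v, "->", full_to_limited(v))
# 0 -> 16, 128 -> 126, 255 -> 235
```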
Baaack to the 1080 Ti. I seem to finally, FINALLY, after all these years of buying hardware, have gotten a good sample of something. Hitting 2GHz at 77C on 90% fan speed atm without even touching voltage. My now sold-off 1080 (MSI Gaming X, super special mega duper card with extra power connector and extra phases and everything) struggled with 2GHz, and that was with about 1,000 fewer CUDA cores.
Don't have more time to play around with it for today, though. Let's just assume it will hit 3GHz
Holy crap. Post in the gpu oc thread @Wries! That's awesome! I really want to see some bench results!