Check http://en.wikipedia.org/wiki/Dvi for the correct connector. You need the DVI-D dual link.
Why do something simple, when there is a complicated way?
@Icycoldd I've bought the Pixel Perfect version and, as you can see, no broken pixels. I'm happy.
@ShadowTitan Yes, it will work (by "it" I mean the dual-DVI input on your GPU); they even ship the cable with the monitor.
First self-built rig
CPU: Intel i5-2500K | CPU Cooler: CM 212+ EVO
MoBo: ASRock Z77 Extreme4 | GPU: EVGA GTX 680
Memory: Corsair Vengeance 2x4GB black | SSD: Crucial M4 64GB
PSU: Corsair TX750
After reading through this thread I'm very tempted to exchange my 1080p monitors and go 3x1440p. Thanks for the pictures Ninjaxl.
Does anyone know how long these last? I hope they last more than a year.
System Specs -
CPU - i5 2500K @ 5GHz | CPU Cooler - RASA RX240 | Motherboard - Asus Maximus IV Gene-Z/Gen3 | GPU - Nvidia GTX 590
SSD - Crucial M4 128GB | RAM - 8GB Corsair Vengeance | PSU - Corsair TX850M | Case - CoolerMaster HAF 922
Monitor - Crossover 27Q (2560x1440) | Sidewinder X4 Keyboard
I don't see the draw of these 27" monitors; why do people consider bigger = better? I would much rather spend my 300 bucks on a 23" true 120Hz monitor.
Panel quality and 1440p. It's mainly the 1440p, I guess, which is handy for a lot of things besides better-looking games.
For me, I don't see the use in 120Hz panels.
I guess if all you play is BF3/BO2 or other shooters, a 120Hz monitor will really shine. But if you play other games more, or aren't that competitive with shooters, you could go for 1440p and IPS.
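For context, the raw pixel-count difference between the two resolutions is easy to work out; a minimal sketch (nothing here is monitor-specific, just the standard resolution numbers):

```python
# Rough pixel-count comparison: why 1440p is heavier on the GPU than 1080p.
res_1080p = 1920 * 1080   # 2,073,600 pixels
res_1440p = 2560 * 1440   # 3,686,400 pixels

ratio = res_1440p / res_1080p
print(f"1440p pushes {ratio:.2f}x the pixels of 1080p "
      f"({res_1440p - res_1080p:,} extra pixels per frame)")
# -> 1.78x, i.e. roughly 78% more pixels to render every frame
```

That ~78% extra work per frame is why a 120Hz 1080p panel and a 60Hz 1440p panel put a broadly similar load on the GPU, and why the choice really does come down to what you play.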
My reasons for picking a 1440p monitor (in short) were a larger workspace and a higher-quality image.
Of interest: since I was back in WoW on a free week (returned to see if soloing had really changed... Vengeance remains in the same nerfed state, which puts bosses like Brutallus slightly out of range for me now; if they buff the AP gained from magic damage, it'd be much nicer -- I did still manage all of BT, A'lar, Vashj, the elemental boss in SSC, and all of Hyjal), I tested VRAM usage last night.

At maximum settings with all the goodies enabled, running in windowed mode, I was sitting at 1700MB usage after 6-7 hours of playing. This was through a combination of raids from BT to TK to SSC and finally to Hyjal. Keep in mind that after you clear Hyjal (Doom is no longer cast, and I didn't get airbursted at all), a LOT of demons spawn at the Horde camp and start attacking the buildings. I went and pulled as many as I could, around 20 infernals, 10 ghouls and 10 aboms. These all take up VRAM as they're drawn into the game world.
Basically, WoW never came close to the 2GB of VRAM available on my 670. I've not monitored it in BF3, but judging by the FPS still being smooth, I'd say it also doesn't use enough VRAM to need to start offloading to system RAM.
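Worth noting that the display buffers themselves are only a tiny slice of that 1700MB; textures and geometry are what actually fill VRAM. A back-of-envelope sketch (this assumes 32-bit color and triple buffering, which may not match the actual driver setup):

```python
# Back-of-envelope: VRAM needed by the framebuffers alone at 1440p.
# Assumes 32-bit (4-byte) color and triple buffering -- the real driver
# setup may differ; textures and geometry dominate actual usage.
width, height, bytes_per_pixel = 2560, 1440, 4

one_buffer_mib = width * height * bytes_per_pixel / 2**20
triple_buffered_mib = 3 * one_buffer_mib
print(f"one framebuffer: {one_buffer_mib:.1f} MiB, "
      f"triple-buffered: {triple_buffered_mib:.1f} MiB")
# -> one framebuffer: 14.1 MiB, triple-buffered: 42.2 MiB
```

So even triple-buffered, the framebuffers are around 2% of a 2GB card; moving from 1080p to 1440p barely moves that needle on its own.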
I'm not quite as amazed at the resolution as I was. The initial shine has worn off, as happens with everything after a while (just as moving from 15Mb/s to 40Mb/s broadband next week will eventually feel standard), but it's still my single greatest IT-related purchase of the past 3 years, without a doubt. Everything I wanted to do could be done at 1080p, but everything is so much more pleasant at 1440p. The only exception is Steam's "Big Picture", which only offers resolutions up to 1080p and so looks terrible on a higher-resolution screen.
I'm not sure there'd be much benefit for me in moving up to 1600p. In fact, I'm not sure there'll be any benefit in moving above 1440p for at least another 5 years, for the simple reason that the design constraints of the tools and applications I use haven't changed in over a decade. Delphi 7 from 2002 had a similar layout to Delphi XE3, released in September 2012. In fact, judging from screenshots, Delphi 5 from 1999 has a similar layout to XE3, so it's safe to assume that unless they suddenly add something massive to the UI, I won't need to worry about changing resolution for quite some time.
With that said, I'd rather play BF3 at 1440p/60Hz now than at 1080p/120Hz, simply because the lower refresh rate isn't as detrimental there as it would be in a game like CS. CS, Quake, and Unreal would all be better at 120Hz due to the speed of those games, but in BF3 and CoD it's not nearly as important.
In all honesty, it's more that I can game and do design work at 1440p and get more benefit. If I had the option to move to 1600p (just to clarify, these same Korean sellers sell 1600p screens too), I would. I kinda wish GPUs supported resolutions above that, something like 3200x1800 or 3600x2025, but then we get into the realms of ridiculousness. 120Hz is nice, but it doesn't provide any real-world benefit compared to an increased resolution unless you're playing fast-paced FPS.
I think this picture I posted a few pages back explains it:
The red area and the more faded parts of the image are 1920x1080. The larger areas and the full-opacity areas are 2560x1440. You can see that the design area in the center at 1440p is almost the same size as the whole application at 1080p. 120Hz brings absolutely no benefit there, and for almost every single game I play there's no benefit either. It's nicer, but it isn't benefiting me in any way. I can play games and design effectively at 1440p, whereas I can only game effectively and do half-decent design at 1080p.
Now, 1440p on a 40" screen would be just as pointless as 1080p on a 40" screen for gaming. If we were talking 4K resolution (4096×2160), then it'd be beneficial. What you need to consider is that as resolution increases, either the screen size must increase or it becomes difficult to read text on the screen. Higher resolution lets you see more on the screen at once, but if the screen is physically too small, all that resolution goes to waste.
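That size/readability trade-off can be put in numbers as pixels per inch (PPI); a small sketch, with the sizes and resolutions picked to match the examples above:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a resolution and a diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')   # ~109
print(f'40" 1080p: {ppi(1920, 1080, 40):.0f} PPI')   # ~55
print(f'40" 4K:    {ppi(4096, 2160, 40):.0f} PPI')   # ~116
```

A 40" 1080p TV sits around half the pixel density of a 27" 1440p panel, which is exactly why text looks coarse on it at desk distance, while 4K would bring a 40" screen back up to the same density.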
You've also got to consider desk space. My desk is around 4ft wide, but I've got a wall-mounted 40" TV to account for (it's on a two-way lever arm). The wall behind the desk is approx 8ft, and the TV can be moved anywhere along it. Even with this amount of space, I've only just got room for the 27" screen and a 23.6" screen on the desk. Even then, the 23.6" screen sits at an angle, with half of its stand off the desk, leaning against a wall to make sure it doesn't fall over. With a smaller screen I'd have more desk space, but at a lower resolution I'd be just as ineffective at design as on the 40" 1080p TV. 27" is an ideal size, and it'd be difficult for me to fit a larger screen in here. If I could have got a 24" at 1440p, I probably would have.
So, for many of us, it's NOT about the physical 27" size. It's about the resolution. Gaming isn't affected much by it (other than performance, but I still get admirable performance out of my GTX 670 even at the larger resolution).
Coder, Gamer - Thoughtcloud | Node.js Monkey | Supporter of error 418 (I'm a teapot)
Knows: Delphi, PHP, WQL/SQL, Python, JS + jQuery, HTML + CSS, Ruby, Node.js (+ Express & Socket.io)
PC: 750D / 16GB / 256GB + 750GB / GTX780 / 4670K / Z87X-UD4H | Laptop: 8GB / 120GB + 480GB / GTX765M / 4700MQ
Quote: "If I had the option to move to 1600p (just to clarify, these same Korean sellers sell 1600p screens too), I would."
I think I saw some cheap 30-inch 1600p screens come by when I checked last week. Too bad 30 inch is a bit too big.
That said, I'm also getting a 1440p screen. One from Dell, though; I need HDMI, and then the difference is around 100 euros. And I'd rather spend that on a better/easier warranty than risk a Korean one. If I only used DVI, I would definitely try one out.
It is still on sale
@Synthax you're right about the physical size. A bigger monitor obviously gives off more light, but using a 43" 1080p TV as a monitor is pixel-counting... 1, 2, 3, 2.1 million!
A bigger screen giving more light is, in some eyes, better quality.
30" is wider by about 2".
Currently, what is the best 1440p monitor available? I'm looking to upgrade, and I'm very skeptical about the Korean models in case something goes wrong. I'm currently using a Dell U2311.
I have 2 Catleap 27'' monitors. I ordered them both from eBay and have been very pleased with them. They are vibrant and beautiful monitors. As previously stated, the stands do kind of suck. They aren't balanced terribly well, and side by side it's noticeable that one is somewhat taller than the other. I am very pleased with the picture and have had them for several months now with zero problems. No dead pixels on either one, even though I didn't purchase the "no dead pixel policy".
I wouldn't hesitate to order from the supplier again. Shipping was quick and the box was packaged well. They are fantastic monitors for the price. I had considered going for the Dell U3011 but just didn't want to shell out nearly $1000. I bought two 27'' for less than $350 each. I am very pleased.
I've had my Shimian for a few days now, and am posting my thoughts.
Here's how it was shipped: triple-wrapped in bubble wrap, as well as a fragile sign.
I assume the hole was from a security check in one of the countries it passed through.
Opened it on the right side; got about 3-4 static (?) electricity shocks when removing the bubble wrap.
Korean & English manual, standard stuff
They even included a Scandinavian power plug. +1!
Here's the Shimian after setting it all up, dwarfing my Samsung 22" & Xerox 24".
What are my thoughts on it?
It's absolutely brilliant. After trying several times, I couldn't find any dead pixels; there's only very light backlight bleeding that isn't noticeable in normal use. After gaming some LoL & Half-Life 2 on it, I can safely say I'm never going back to 1920x1080.
TL;DR: No issues, very light backlight bleeding that isn't noticeable. 10/10