Originally posted by sourcenemy: Like most have already pointed out, go for an IPS monitor. But here's a hint for most people: if you want the highest pixel count from any monitor, always use a DVI connection. Even if the monitor itself has HDMI, check whether your graphics card has a DVI port and use a DVI-to-HDMI cable, because HDMI caps the resolution. Going with DVI from the computer lets you set a noticeably higher resolution, which helps, as we know, when working with photos.
Only if you're using a really old HDMI implementation. DVI, being a rather old standard that hasn't been updated since the Clinton administration, only supports a max of 2560x1600 at 60 Hz (and that requires dual link). If you want a higher resolution, you have to reduce the refresh rate; 4K over a single link works out to a "count the flicker cycles" 17 Hz.
Initial versions of HDMI were capped at 1920x1200 at 60 Hz, but HDMI has supported 2560x1600 at 60 Hz since version 1.3, which came out in 2006. It has supported 4K since 1.4 (released in 2009), although only at up to 30 fps (24 fps at the full 4096-wide cinema resolution).
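You can sanity-check those resolution/refresh claims yourself with a back-of-the-envelope bandwidth calculation. This is a rough sketch, not real timing math: it assumes a ~20% blanking overhead instead of computing proper CVT/CVT-RB timings, and the pixel clock figures (165 MHz single-link DVI, 330 MHz dual-link, 340 MHz for HDMI 1.3+) are the nominal spec maximums:

```python
# Rough max refresh rate a link can drive at a given resolution.
# Assumes ~20% extra pixels for blanking intervals -- a simplification;
# real modes use CVT or CVT-RB timing formulas.
def max_refresh_hz(pixel_clock_hz, width, height, blanking_overhead=1.2):
    """Approximate the highest refresh rate the link's pixel clock allows."""
    return pixel_clock_hz / (width * height * blanking_overhead)

SINGLE_LINK_DVI = 165e6  # 165 MHz TMDS pixel clock
DUAL_LINK_DVI   = 330e6  # two TMDS links in parallel
HDMI_1_3        = 340e6  # 340 MHz max TMDS clock

print(round(max_refresh_hz(SINGLE_LINK_DVI, 3840, 2160)))  # ~17 Hz at 4K
print(round(max_refresh_hz(DUAL_LINK_DVI, 2560, 1600)))    # ~67 Hz, so 60 Hz fits
print(round(max_refresh_hz(HDMI_1_3, 2560, 1600)))         # ~69 Hz, so 60 Hz fits
```

Which lines up with the numbers in this thread: single-link DVI really does top out around 17 Hz at 4K, while both dual-link DVI and HDMI 1.3 comfortably do 2560x1600 at 60 Hz.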
Also be aware that over a single link, DVI carries only 24-bit RGB color. You need a dual-link cable to get 30-bit or deeper color. xvYCC is also not supported by DVI at all.
To make the most of new high-end displays, you need to switch to HDMI or DisplayPort. DVI is an old standard that will be abandoned in due time. If you're seeing gamut or gamma issues with HDMI, it's likely a black-level thing. A while ago I had a TV with DVI, and I had to turn on extended blacks so that everything wouldn't look washed out. With HDMI, that same setting made everything look dark and evil. Make sure your blacks are set to PC level, i.e. full range 0-255, rather than the video level of 16-235, because TVs and PCs default to different standards.
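The washed-out vs. crushed symptoms above come straight from that range mismatch. A minimal sketch of the conversion (assuming the standard limited-range mapping, where video black is 16 and video white is 235):

```python
def video_to_full(level):
    """Expand a limited-range (16-235) video level to full-range (0-255).

    If the display expects this expansion but the source already sends
    full range, blacks get crushed; if the source sends limited range
    and nothing expands it, the image looks washed out.
    """
    expanded = round((level - 16) * 255 / 219)  # 219 = 235 - 16
    return max(0, min(255, expanded))           # clamp out-of-range values

print(video_to_full(16))   # video black -> 0 (true black)
print(video_to_full(235))  # video white -> 255 (full white)
```

So "extended blacks" / "PC level" settings are essentially telling the display whether or not to apply this expansion; apply it twice (or not at all) and you get exactly the dark or washed-out picture described above.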