in theory, a 1920x1080 monitor and a 1920x1080 tv of the same size would render the image the same way. both have similar picture adjustments, although a tv's intrinsic contrast ratio, black point and the like are tuned for tv viewing, while a monitor's are tuned for computer images.
the big difference is that the TV won't have a DVI connection, which would be my preferred method of data transmission if you don't have a DisplayPort-capable monitor/gpu.
also, monitors tend to be a little lighter because they don't have as many functions, so they have fewer parts.
but basically if you can use a monitor as a tv, you can use a tv as a monitor.
Originally posted by JeffB Most "monitors" are designed for arms length viewing distance and are higher resolution than a "TV".
A 1920x1080 image is going to be the same image (2,073,600 pixels, or 2.1 MP) whether it's on a 22" or a 72" screen; the difference is pixel density. a character such as the letter "M" will look bigger on the 72" tv because the pixels are spaced farther apart, but it will still be made up of the same 100 or so pixels. that's handy when dealing with text/graphics, but it doesn't have a major impact on photo editing once you get past about 27" or drop under about 22". in fact, my photos viewed from the same distance look far worse on my 55" tv than on my old 28" monitor, because of the lower pixel density. i.e. you aren't gaining much value by spending more money on a larger screen. the caveat is the individual's eyesight (myopia, hyperopia, etc.): someone who's nearsighted might gain a little from a larger monitor, much like reading glasses magnify small print.
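to put rough numbers on that, pixel density (ppi) is just the diagonal pixel count divided by the diagonal screen size in inches. a quick sketch (the screen sizes here are just example values, not from any specific display):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal length in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# the same 1920x1080 image on different (example) screen sizes
for size in (22, 28, 55, 72):
    print(f'{size}" 1080p: {ppi(1920, 1080, size):.0f} ppi')
```

a 22" 1080p panel comes out around 100 ppi, while a 72" panel is barely 31 ppi, so every pixel (and every letter "M") is physically about three times as large even though it's the same 2.1 MP image.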
the real impact comes into play when you move up to a 4k monitor. at 3840x2160 you get 4x the pixels (8,294,400 pixels, or 8.3 MP), which works out to double the linear pixel density (ppi), for the same screen size. you'll really start to easily notice things in your photos, especially motion blur and fringing, that you didn't see on a 1080p monitor.
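the 1080p-to-4k arithmetic is worth spelling out, because "4x" and "2x" both show up depending on what you measure:

```python
# resolutions discussed above
w1080, h1080 = 1920, 1080
w4k, h4k = 3840, 2160

mp_1080 = w1080 * h1080   # 2,073,600 pixels (~2.1 MP)
mp_4k = w4k * h4k         # 8,294,400 pixels (~8.3 MP)

print(mp_4k / mp_1080)    # 4.0 -> four times the total pixels on the same screen
print(w4k / w1080)        # 2.0 -> twice the pixels per inch (linear density)
```

so on the same size panel, 4k quadruples the pixel count but doubles the ppi, which is why fine detail like fringing suddenly becomes visible.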