Originally posted by MadMathMind: While we here have a solid understanding of how pixels work and what all those specs mean, most consumers understand it only rudimentarily. They get that more pixels can mean a better image, but they don't quite know how or why. Look at when 1080p first came out: people bought in, listening to marketing that said it would be a lot better than their 720p TVs. In many cases, it wasn't. They bought too small a screen for their seating distance, so the better image quality couldn't be seen.
It's more like: "This monitor has higher resolution, so it must be better!" So people buy them, and then they find out it doesn't work the way they thought. That's more what I meant.
More resolution actually is better, though. I don't know anyone who went from 720p to 1080p and was disappointed in the image quality. I expect 4K would be fairly noticeable on large (>60in) screens, assuming you fed it a 4K video source. That's probably the source of more heartbreak than the difference between 720p and 1080p: there are a lot of 1080i Blu-rays out there, and 1080i carries no more data per second than 720p. Cable TV and Dish also tend to squish their HD channels into very little bandwidth, to bump up the number of "HD channels" they can advertise for the same amount of satellite capacity. It's better than SD, but it's not giving you the best your fancy screen can offer. It's not enough just to have a high-res display; the whole chain has to be high-res, and consumers are being fleeced about what they're actually getting. Regardless, the jump from 1080p to 4K is more than twice as large as the jump from 720p to 1080p: 4x the pixels is usually a noticeable difference.
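For anyone who wants to sanity-check the pixel math above, here's the arithmetic (assuming "4K" means UHD, 3840x2160, rather than DCI 4K):

```python
# Pixel counts for the standard consumer resolutions discussed above.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),  # UHD
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# Multiplicative jump at each step of the upgrade path:
print(pixels["1080p"] / pixels["720p"])   # 2.25x more pixels
print(pixels["4K"] / pixels["1080p"])     # 4.0x more pixels
```

So 1080p to 4K quadruples the pixel count, versus the 2.25x bump going from 720p to 1080p, which is why the second upgrade tends to be the more visible one (source material permitting).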
There's another argument on the PC side, though. A PC generates the images right there for you, so your desktop or a game rendered at 4K has a high-quality processing chain end to end (or you can upscale 1080p sources, DVDs, etc). And people love high-PPI displays (see: Apple's Retina branding), and they appreciate having more desktop space. It particularly matters when it's not a TV you're sitting across the room from; by comparison you're mashing your face right into a monitor, so you need a higher PPI. Now that 4K displays have crashed from $3-5k to $500-700 in the last year, you'll see a lot more uptake. Just like full-frame uptake is accelerating now that FF bodies are no longer flagship products costing $3-4k.
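The seating-distance point can be made concrete with a back-of-the-envelope acuity calculation. The ~1 arcminute figure is the usual rough limit for 20/20 vision, and the function here is my own illustration, not anything from the thread:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: rough 20/20 acuity limit

def max_useful_distance_in(diagonal_in, horiz_px, aspect=16 / 9):
    """Farthest viewing distance (inches) at which individual pixels
    are still resolvable; beyond this, extra resolution is invisible.
    Uses the small-angle approximation."""
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    pixel_pitch_in = width_in / horiz_px
    return pixel_pitch_in / ARCMIN

# 60-inch 16:9 TV: past roughly 8 feet, 1080p is already at the acuity
# limit, so a same-size 4K panel adds nothing at typical couch distances.
print(max_useful_distance_in(60, 1920) / 12)  # ~7.8 ft for 1080p
print(max_useful_distance_in(60, 3840) / 12)  # ~3.9 ft for 4K
```

A monitor two feet from your face is well inside both limits, which is exactly why high PPI pays off on the desktop long before it does in the living room.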
Quote: What we're likely to see is DSLRs take advantage of "smaller." Obviously, there are two things they can't fix: lenses (limited by the physics of optics) and sensor size. Right now, the average smartphone has far more processing power than even the most powerful SLR. Smartphones can use average optics and make up for it with fancy signal processing, something we don't see on SLRs, probably because SoCs are too expensive to pair with the other necessary expensive components. As tiny processors become cheaper and more powerful, we may see SLRs become a bit smaller; at the very least, they will become a lot more advanced.
You are totally barking up the wrong tree with this one. You can apply as much signal processing to a DSLR image as you want: it's called shooting RAW, and yes, you can usually squeeze quite a bit more out of an image than the onboard JPEG processor can. A RAW file is more or less a direct readout of the CCD/CMOS sensor data, and you can apply far more intensive algorithms in Aperture or Lightroom than a DSLR body could ever support.
The size thing is partly a function of sensor size (you have to fit a mirror box and a viewfinder in there, which get bigger as the sensor does), partly fixed costs (the image processors don't get much smaller or larger with the format, and current draw won't vary hugely, so battery size is roughly constant for a given battery-life target), and partly ergonomics (given how many buttons manufacturers want to squeeze onto their cameras). I think the big shift will come when manufacturers change the ergonomics equation: a mostly-touchscreen interface with a few assignable physical controls would let the same functions fit into a smaller space.
I generally feel that the "I have big hands" argument is simply the result of poor ergonomic design, given that virtually all film SLRs were smaller than the smallest DSLR bodies today (Canon SL1, etc). Get rid of all the useless "direct print" buttons the marketing department added and focus on delivering usable ergonomics in a smaller body. If ISO, shutter speed, and aperture are physically controllable, everything else can live on the touchscreen as far as I'm concerned. Or even just ISO and aperture, in aperture-priority mode. You can build a much smaller body that way. Give me a digital Pentax ME, basically.
Of course, mirrorless can inherently be smaller than DSLRs, since you don't have to fit a mirror box in there. That is, as long as you're willing to require an adapter for DSLR-lens compatibility; otherwise the legacy mount makes your camera fat, like the K-01. That advantage grows as the sensor format gets larger. Most DSLRs use legacy 35mm-era mounts, and if sensors get bigger, manufacturers will be forced to increase the register distance to fit the bigger mirror, whereas a MILC can stay relatively thin as long as the angle of incidence doesn't become too extreme for the sensor and microlenses at the edge of the frame. I've heard there are organic sensors coming down the pipe that handle extreme incidence angles much better.
As time goes on, the technical advantages are shifting from DSLRs to MILCs. Lack of phase-detect autofocus was mirrorless's biggest problem, and sensors with built-in phase-detect pixels should fix that. The remaining DSLR advantages are longer battery life and an optical viewfinder, but MILCs are simpler to build since there's no swinging mirror to deal with, and people charge their cellphones daily anyway. Right now mirrorless isn't taken seriously, but I think better AF will change that.