03-06-2016, 06:16 AM - 2 Likes | #751 |
But if you are shooting MF, you are not going after the same results or the same scenarios. Think not how a specific photo would look using the various formats; think what scenario best fits the format. The specs are immaterial. Look at the photos and you will find MF shots that couldn't be replicated with another format. For example, there is a wildlife/landscape photographer on the west coast of Canada who uses MF for bears. I shoot bears, and had a situation like this last year that I could not capture. One shot has this exquisite scene of greenery and water and a sleeping bear. No light, all in sharp focus, and the larger the print the more it took your breath away. Film, 4x5. I stared at it a long time, both seeing the impossibility of getting that shot with APS-C, and recognizing that my style and shooting assumptions would never even have seen that shot in the first place. So if you fail to see the value of MF, don't look at specs. Look at photos. The 645D has pretty poor specs, but in skilled hands it produces exquisitely beautiful tones. It would take me two years of concerted effort to get something decent out of one. I know that, and the problem isn't the camera; it would take me that long to begin to approach the vision and skill level of more skilled artists.
03-06-2016, 07:09 AM | #752 |
Quote — derekkite: "So if you fail to see the value of MF, don't look at specs. Look at photos."

Yeah, it's not just the numbers, ISO, f-stops, and the like.
03-06-2016, 08:28 AM | #753 |
This sounds like a statement that could have come from an LP and speaker cable salesman. It could easily be fended off by pointing to all the excellent photos taken with phone cameras in the hands of good photographers. I absolutely respect and admire good photographers, but I think it's not just about the people. Why would anyone buy an expensive camera instead of a phone if that were 100% true? I think great photos are a conjunction of good photographers, good gear, and good luck, a photography equivalent of the fire triangle. Luck might be taken out of the equation with careful planning, but I think that would take away some of the magic. For instance, take a look at Nick Brandt's or Ansel Adams' most famous photos. They wouldn't have been the same if Nick's animals had posed badly or if it had been raining on the days Ansel was out taking them. I'm sure there were a lot of rainy days in between the best images, and quite a lot of animal photos that were thrown away. In the digital age (or with a photographer's digital state of mind), the throw-away ratio would have been orders of magnitude higher.

Last edited by Simen1; 03-06-2016 at 08:38 AM.
03-06-2016, 09:13 AM | #754 |
Do it, but you'll cry looking through any other VF afterwards.
03-06-2016, 09:31 AM | #755 |
Quote — Simen1: "According to the simplification above, MF got a 1-stop advantage over FF in sensor size, while FF got a 2-stop advantage over MF in terms of available apertures. If the situation had been the same between APS-C and FF, then APS-C would have f/0.7 primes and f/1.4 zooms, when FF 'only' has f/1.4 primes and f/2.8 zooms. (I'm going way off topic now, but I actually got a 6x zoom lens with a constant f/1.0 aperture, though that's for a much smaller format. It was a stupid buy because I haven't been able to use it on my Q yet.)"

What this makes us forget is that for high ISO/noise, in the past ten years we easily gained 3 EV from sensors and another 2-3 EV from noise-removal software. In comparison, the rest looks quite small. Just switching to DxO Prime instead of Lightroom could make as much of a difference as switching formats. On the other hand, when you are shooting at base ISO (for me, most of the time), the noise level is so low that this comparison of noise per sensor format is simply not relevant, even on APS-C. So that difference is simply irrelevant. And believe it or not, people don't buy a 645Z or a Phase One to shoot a black cat on a moonless night at ISO 25600 and f/2.

Quote — Simen1: "This sounds like a statement that could have come from an LP and speaker cable salesman. [...] Why would anyone buy an expensive camera instead of a phone if that were 100% true?"

I used to think the same about audio cables. Ironically, I have 10 meters of audio cable between my computer and amplifier, and I can also use WiFi plus the digital input of a box. With the analog cable you can hear a soft bass noise when there's no music; WiFi plus digital (optical) out gives perfectly crisp sound. You can repeat this for the speakers; my 5.1 system has maybe 25 m of cable. My speakers came with high-quality cable, and I tried the more basic cable I had before. There was quite a difference. Just because you didn't try, or don't understand why there should be a difference, doesn't mean there isn't one. And in any case, even if you understand this difference, it doesn't mean you have to buy the best, nor that people who actually do are stupid.

Last edited by Nicolas06; 03-06-2016 at 09:46 AM.
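The stop bookkeeping being debated here can be sanity-checked in a few lines. This is a sketch, not anyone's exact figures: the sensor dimensions are the commonly quoted ones, and log2 of the area ratio is the same simplification the post relies on.

```python
import math

def stop_advantage(big, small):
    """Noise/light-gathering advantage in stops from sensor area alone
    (log2 of the area ratio). A deliberate simplification."""
    return math.log2((big[0] * big[1]) / (small[0] * small[1]))

FF = (36.0, 24.0)      # full frame, mm
MF645 = (44.0, 33.0)   # 645Z-style crop medium format, mm
APSC = (23.5, 15.6)    # typical APS-C, mm

print(f"MF over FF:    {stop_advantage(MF645, FF):.2f} stops")
print(f"FF over APS-C: {stop_advantage(FF, APSC):.2f} stops")
```

The exact area figure is about 0.75 stop for 44x33 over FF, which the quoted "simplification" rounds to 1 stop; meanwhile f/2.8 MF zooms versus f/1.4 FF primes is a full 2 stops of aperture, which is the asymmetry the post is pointing at.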
03-06-2016, 09:52 AM - 1 Like | #756 |
Wrt the discussion started by @Simen1 ... his arguments are all sound and valid. Speaking generally, however, there is, for any given image quality and any given moment in time, a sweet spot (for sensor size) which delivers this image quality in the most cost-effective way (read: at the lowest possible price). I tried to elaborate a bit on this topic in my blog. Two things to observe:

The reason for #1 is that the cost of silicon decreases over time, while the cost of glass decreases less rapidly or even increases. To make things worse, #1 and #2 combine to yield a cumulative effect favoring larger sensor sizes over time, more rapidly than some people believe. This is a general fact which can't be denied, as it can be proven rigorously. There are a few interesting corollaries to the above:
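The sweet-spot argument can be illustrated with a toy cost model. Every constant below is invented for illustration (this is not falconeye's actual derivation, which is in his blog): silicon cost decays with process maturity while glass cost stays roughly flat, so the sensor share of system cost shrinks and going bigger gets relatively cheaper over time.

```python
def sensor_cost(area_mm2, year):
    # Toy model: cost ~ area^1.5 (yield penalty on big dies),
    # halving every ~4 years of process maturity. Illustrative constants.
    return 0.05 * area_mm2 ** 1.5 * 0.5 ** ((year - 2000) / 4)

def lens_cost(area_mm2):
    # Toy model: glass cost scales with format and barely improves over time.
    return 0.6 * area_mm2

for year in (2000, 2010, 2020):
    for name, area in (("APS-C", 367), ("FF", 864), ("645", 1452)):
        total = sensor_cost(area, year) + lens_cost(area)
        share = sensor_cost(area, year) / total
        print(year, name, f"relative cost {total:7.0f}, sensor share {share:.0%}")
```

Under these assumptions the sensor's share of total cost for an FF-sized die collapses between 2000 and 2020, which is the mechanism behind "the sweet spot drifts toward larger sensors".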
03-06-2016, 10:17 AM | #757 |
Silicon prices decrease over time, but it's more complex than that. Most of the leverage in silicon comes from putting more into the same surface, which is not that interesting for sensor size. It may bring 4K and later 8K or BSI, but it doesn't mean a huge chip is going to cost nothing or consume no power. Intel predicts they will stagnate by 2020, and if you've noticed, processors haven't evolved much in the past 10 years. What happened is that we managed to close the gap between smartphones and desktops, not that processors overall are that much better today. They are maybe 2-4 times faster, while at the previous rate of evolution they should have been 30-50 times faster. It is not clear how long prices will keep going down, and it is not clear either how big a factor this price is in the price of the gear anyway. With enough volume, an FF camera like the K-1 could sell for $500, and the latest 70-200 too; MF would be maybe $2-3K. But the volume isn't there; worse, the market shrinks significantly each year. Sales are going down drastically, and if current prices drop as a desperate move to get back some volume, manufacturers might find themselves in a position where the R&D to go bigger/better simply proves too costly to justify. As long as there's nothing better and competitors don't do better, there's no reason to improve. All manufacturers dropped prices in Japan just after Pentax announced its FF for $1800; that's no random event. It will not happen again for no reason. But it doesn't sound like they have the money to invest or the market to sell the gear either. Quite the contrary.
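The volume point ("with enough volume, an FF camera like the K-1 could sell for $500") is plain fixed-cost amortization. A sketch with invented numbers: the $50M development cost, $400 marginal cost, and 20% margin are assumptions for illustration, not real Ricoh figures.

```python
def unit_price(dev_cost, marginal_cost, volume, margin=0.2):
    """Price needed to recoup fixed development cost plus per-unit
    build cost, with a flat margin. All inputs are illustrative."""
    return (dev_cost / volume + marginal_cost) * (1 + margin)

for volume in (50_000, 200_000, 1_000_000):
    print(f"{volume:>9} units -> ${unit_price(50e6, 400, volume):,.0f}")
```

At 50k units the fixed costs dominate the price; at a million units they nearly vanish, which is why a shrinking market pushes prices the wrong way even when silicon gets cheaper.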
03-06-2016, 11:08 AM | #758 |
Quote — Nicolas06: "Silicon prices decrease over time but this is more complex. [...] They are maybe 2-4 times faster, while at the previous rate of evolution they should have been 30-50 times faster."

Here is a source: https://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/

See the first two graphs under 'Raw Compute Performance'. The blue lines show the performance increases of Intel's Xeon processors; the most expensive consumer processors are usually not too far from those in performance. As you see, between 2007 and 2014 there was an increase from 100 to 1400 GFlops in single-precision, and from 50 to 700 GFlops in double-precision calculations. That is a factor of 14 in 7 years, which is consistent with Moore's law (which would predict a factor of 8 in 6 years and a factor of 16 in 8 years, although that is an oversimplified interpretation of Moore's law).

It's easy to think processors have stagnated, because not many people need the full processing power of the newest CPU. Their PCs are often limited instead by memory bandwidth or GPU performance, and very few people still buy PCs with high-end CPUs (which used to be much more common when the CPU was often the bottleneck in the system). Even older CPUs can run almost any task thrown at them, so people feel no need to upgrade, but that does not mean the technology isn't progressing. Similarly, while mobile processors have made huge leaps in performance, they are still very far from competing with desktop processors. The numbers (core counts and clock speeds) might make them look similarly fast, but they run a different instruction set, and therefore their speed cannot be compared using those numbers.

Getting back to cameras, one could ask whether there is a Moore's law for sensors. I haven't been following the market long enough to know, but it seems reasonable to assume that unlike CPUs, we aren't near the edge of what is possible yet. Digital camera sensors are a much newer technology, and the investments made by industry to push the technology to its edge aren't on the same scale.
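The factor-of-14 claim is easy to sanity-check against a doubling-every-two-years reading of Moore's law (itself a simplification, as the post says):

```python
def moores_law_factor(years, doubling_period=2.0):
    """Expected performance multiple after `years` if capability
    doubles every `doubling_period` years (a crude reading of Moore's law)."""
    return 2 ** (years / doubling_period)

observed = 1400 / 100               # Xeon single-precision GFlops, 2007 -> 2014
predicted = moores_law_factor(7)    # ~11.3
print(f"observed x{observed:.0f}, predicted x{predicted:.1f}")
```

The observed factor of 14 in 7 years slightly beats the naive prediction of ~11.3x, so "consistent with Moore's law" holds under this crude model.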
03-06-2016, 11:44 AM | #759 |
And I'll add to it. Even if your argument held true (it doesn't), the process cost for a given area of silicon on a given process would still decrease dramatically over time. That's because of the unavoidable depreciation of old technology / old machines / old fabs in a stagnating world. Canon in particular now works with very old (more than ten-year-old) fabs and can still maintain its market share despite competitors using ten-years-newer tech. If they wanted, Canon could deliver some extremely low-priced FF cameras ... But what are we discussing? That sensor sizes for a given budget have grown is an established fact by now. Just look at the market in 2000, 2005, 2010, 2015 and it should be strikingly clear to everybody.
03-06-2016, 11:45 AM | #760 |
Quote — xandos: "Here is a source: https://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/ [...] That is a factor of 14 in 7 years, which is consistent with Moore's law."

That kind of across-the-board scaling hasn't happened for many years now; each new processor generation brings 5-15% more efficiency at the same frequency every 2-4 years. Then there was the attempt with multicore, the idea being that you can do more in parallel, running arbitrary computations on each core. We got up to 4 cores for most desktops; most laptops are actually 2, and only servers have 8 or more. That's partly because it is difficult to speed things up by running more tasks in parallel. Outside of a few areas (graphics, scientific computation), general-purpose programs don't gain much from this approach; that's basically why we don't have more than 4 cores on desktops. On servers it works quite well, because when many clients are connected it is easy to deal with each request independently, as they actually are independent requests. Anyway, this avenue now stagnates too. Issues like memory bandwidth and cache size make it difficult to scale massively for the current type of applications we have.

Then there is the least useful gain of all: massively parallel, identical computations to bump numbers like floating-point operations per second. This sounds great in benchmarks, but again it works mostly for graphics or scientific computing. For your typical desktop, outside games and photo/video editing, it is useless, and it doesn't help the typical web server either. This is not really progress, because devices like that have existed for years to handle graphics and scientific computation: they are called GPUs. Yeah, thanks to that, a modern CPU now matches the processing power of a GPU from 10 years ago. That's about it. And it doesn't have the memory bandwidth needed to really keep up. People who genuinely need that kind of performance go with clusters of GPUs, not CPUs.

So I stand by my point. We stagnate. The day we stopped increasing frequency, progress slowed dramatically. Of course, thanks to dozens of billions in investment it still improved for a while, albeit slower and slower, with less and less visible benefit. I mean, a benchmark win does 3 things:
- generates heat and consumes energy (the most visible effect)
- lets the manufacturer put up fancy specs and say they have the biggest one
- lets the customer feel like he has the biggest one

But if actual applications don't follow, well... We will see whether hardware neural networks or quantum computing change things. But we are at the end of what the current CPU model can bring, or at least we don't innovate nearly as fast as before.

Last edited by Nicolas06; 03-06-2016 at 11:51 AM.
03-06-2016, 11:47 AM | #761 |
What xandos said. And I'll add to it.

Quote — from the post above: "Even if your argument held true (it doesn't), the process cost for a given area of silicon on a given process would still decrease dramatically over time. [...] If they wanted, Canon could deliver some extremely low-priced FF cameras ..."

Last edited by Nicolas06; 03-06-2016 at 11:53 AM.
03-06-2016, 12:02 PM | #762 |
When foundries make the step to 450 mm wafers (up from 300 mm), prices will decrease. The full-frame sensor became affordable at the end of 2012, when the D600 and D610 were launched. Pentax should have been ready so they could have launched a full frame in 2013, but Hoya probably didn't want to invest in that, so they had to wait for Ricoh to give the go-ahead.
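The wafer-size point can be made concrete with the standard first-order dies-per-wafer estimate (gross dies only, ignoring scribe lines, reticle limits, and yield, so treat the numbers as rough):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_w_mm, die_h_mm):
    """Common first-order estimate of gross dies per wafer:
    wafer area over die area, minus an edge-loss term."""
    d = wafer_diameter_mm
    die_area = die_w_mm * die_h_mm
    return int(math.pi * (d / 2) ** 2 / die_area
               - math.pi * d / math.sqrt(2 * die_area))

for wafer in (300, 450):
    print(f"{wafer} mm wafer: ~{dies_per_wafer(wafer, 36, 24)} FF sensor dies")
```

A 450 mm wafer has 2.25x the area of a 300 mm one, but yields roughly 2.5x the full-frame dies (about 150 vs 59 in this estimate), because edge losses matter relatively less on the bigger wafer; that is where the expected price decrease comes from.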
03-06-2016, 12:25 PM | #763 |
The reverse is true: investment in new fab technology grows exponentially as gates get smaller. Basically there's Intel... and that's it; the others are far behind. It's more that Canon is 20 years behind and the others only 10. And Intel doesn't make sensors; it's just not worth it money-wise. As far as cost / image quality goes, I think Falc is right. But he does not comment on the size and weight of the glass to put in front of a larger sensor. One could study how to reduce the weight and size of lenses for larger sensors (referring to the new Nikon 300mm prime lens).

Last edited by biz-engineer; 03-06-2016 at 12:37 PM.
03-06-2016, 01:10 PM | #764 |
I think the circuitry and designs like BSI benefit from it, though. But things evolve much more slowly anyway, as you said. And that's my personal opinion: as the market shrinks and fewer people are willing to buy a new camera, because many find the old one still works and is good enough, investment and R&D will shrink too... slowing things even more. We might not have gotten half the progress of the last 5 years if we hadn't benefited from smartphone sensor research.
03-06-2016, 01:25 PM | #765 |
Quote — Nicolas06: "Before, you basically got frequency increases: double the frequency, keep the same architecture, boost memory bandwidth and cache a bit, and you got nearly 2x performance on all programs, maybe only 1.8 or 1.9, but near 2x. [...] We are at the end of what the current CPU model can bring, or at least we don't innovate nearly as fast as before."

I also agree that graphics/scientific computations have benefited the most from newer CPUs so far. That is partly because those two applications rely heavily on CPU speed and therefore get optimized for new hardware (insofar as they still run on CPUs; many scientific computations now run on GPUs). Most other programs simply don't need the full speed of the CPU. I can't say anything about neural networks, but about quantum computing: don't expect a working consumer quantum computer in the next two decades.
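The "no more than 4 desktop cores" observation in the quoted post is essentially Amdahl's law: speedup saturates once the serial part of a program dominates. A minimal sketch, where the 50% parallel fraction is an assumed, illustrative workload:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of a program
    can use extra cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A workload that is only 50% parallelizable barely benefits beyond 4 cores:
for cores in (2, 4, 8, 64):
    print(f"{cores:>2} cores -> {amdahl_speedup(0.5, cores):.2f}x")
```

With a 50% parallel fraction the speedup can never exceed 2x no matter how many cores are added, which is why extra desktop cores stopped paying off while servers (with many independent requests, i.e. a parallel fraction near 1) kept scaling.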