Pentax Camera Forums
03-06-2016, 06:16 AM - 2 Likes   #751
Pentaxian




Join Date: Sep 2011
Location: Nelson B.C.
Posts: 3,782
But if you are shooting MF, you are not going after the same results or the same scenarios.

Think not of how a specific photo would look in the various formats; think of what scenario best fits the format. The specs are immaterial. Look at the photos and you will find MF shots that couldn't be replicated with another format.

For example, there is a wildlife/landscape photographer on the West Coast of Canada who uses MF for bears. I shoot bears, and had a situation like this last year that I could not capture. One shot has this exquisite scene of greenery and water and a sleeping bear. No light, all in sharp focus, and the larger the print, the more it took your breath away. Film, 4x5. I stared at it a long time, both seeing the impossibility of getting that shot with APS-C and recognizing that my style and shooting assumptions would never even have seen that shot in the first place.

So if you fail to see the value of MF, don't look at specs. Look at photos. The 645D has pretty poor specs, but in skilled hands it produces exquisitely beautiful tones. It would take me two years of concerted effort to get something decent out of one. I know that, and the problem isn't the camera; it would take me that long to begin to approach the vision and skill level of more skilled artists.

03-06-2016, 07:09 AM   #752
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by derekkite:
But if you are shooting MF, you are not going after the same results or the same scenarios.

Think not of how a specific photo would look in the various formats; think of what scenario best fits the format. The specs are immaterial. Look at the photos and you will find MF shots that couldn't be replicated with another format.

For example, there is a wildlife/landscape photographer on the West Coast of Canada who uses MF for bears. I shoot bears, and had a situation like this last year that I could not capture. One shot has this exquisite scene of greenery and water and a sleeping bear. No light, all in sharp focus, and the larger the print, the more it took your breath away. Film, 4x5. I stared at it a long time, both seeing the impossibility of getting that shot with APS-C and recognizing that my style and shooting assumptions would never even have seen that shot in the first place.

So if you fail to see the value of MF, don't look at specs. Look at photos. The 645D has pretty poor specs, but in skilled hands it produces exquisitely beautiful tones. It would take me two years of concerted effort to get something decent out of one. I know that, and the problem isn't the camera; it would take me that long to begin to approach the vision and skill level of more skilled artists.

Yeah, it's not just the numbers, ISO, f-stops and the like.
03-06-2016, 08:28 AM   #753
Veteran Member




Join Date: Jul 2009
Location: Tromsø, Norway
Photos: Albums
Posts: 1,031
Originally posted by Nicolas06:
The APS-C crop factor is 1.5 (and it depends a bit on whether you choose Canon or Pentax, for example). Now for MF, it depends which MF format you are speaking of: 6x4.5 is 1.66x FF, the 645Z more like 1.3x, and 6x9 is 2.5x, a bigger difference than m4/3 vs 24x36.
Just for ease of discussion I simplified to a crop factor of the square root of two between each of the formats (1", m43, APS-C, FF, MF), with the same aspect ratio. I think readability is lost if I'm too detailed.

Originally posted by Nicolas06:
You don't need f/0.7 APS-C primes and f/1.4 zooms to match; 1.4/1.5 = 0.93, so that's f/0.93, which you can round to f/1.
According to the simplification above, MF has a 1-stop advantage over FF in sensor size, while FF has a 2-stop advantage over MF in terms of available apertures. If the situation were the same between APS-C and FF, then APS-C would have f/0.7 primes and f/1.4 zooms, while FF "only" has f/1.4 primes and f/2.8 zooms. (I'm going way off topic now, but I actually have a 6x zoom lens with a constant f/1.0 aperture, though that's for a much smaller format. It was a stupid buy because I haven't been able to use it on my Q yet.)
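The arithmetic behind this is simple enough to sketch. A minimal Python illustration of the simplification above (the sqrt(2)-per-format crop step is the simplification from this discussion, not exact sensor dimensions):

```python
import math

# Equivalent aperture across formats: multiply the f-number by the
# crop factor. The light-gathering gap in stops is 2*log2(crop).
def equivalent_aperture(f_number, crop):
    return f_number * crop

def stops_advantage(crop):
    return 2 * math.log2(crop)

# Under the sqrt(2)-per-format simplification, each step is one stop:
print(round(stops_advantage(math.sqrt(2)), 2))        # -> 1.0

# An APS-C f/0.93 lens (crop 1.5) acts like an FF f/1.4 lens:
print(round(equivalent_aperture(0.93, 1.5), 1))       # -> 1.4
print(round(1.4 / 1.5, 2))                            # -> 0.93
```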

Originally posted by derekkite:
So if you fail to see the value of MF, don't look at specs. Look at photos.
This sounds like a statement that could have come from an LP and speaker-cable salesman. It could easily be fended off by referring to all the excellent photos taken with phone cameras in the hands of good photographers. I absolutely respect and admire good photographers, but I think it's not just about the people. Why would anyone buy an expensive camera instead of a phone if that were 100% true? I think great photos are a conjunction of good photographers, good gear and good luck, something like a photography equivalent of the fire triangle. Luck might be taken out of the equation with careful planning, but I think that takes away some of the magic. For instance, take a look at Nick Brandt's or Ansel Adams's most famous photos. They wouldn't have been the same if Nick's animals had posed badly or if it had been raining on the days Ansel was out taking those photos. I'm sure there were a lot of rainy days in between the best images, and quite a lot of animal photos that were thrown away. Although in the digital age (with photographers' digital state of mind), the throw-away ratio would be orders of magnitude higher.

Last edited by Simen1; 03-06-2016 at 08:38 AM.
03-06-2016, 09:13 AM   #754
Pentaxian
thibs's Avatar

Join Date: Jun 2007
Location: Belgium
Photos: Albums
Posts: 7,001
Originally posted by Simen1:
Sure, me too. That's why I'm asking someone to elaborate on the reasons I struggle to see myself. I have already started to learn some of those reasons: high-resolution single exposures of moving subjects, and lens sharpness.
Ever looked through a 645Z VF? No? Then you can't understand (I'm serious, not trying to insult you or anything).
Do it, but you'll cry looking into any other VF afterwards

03-06-2016, 09:31 AM   #755
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by Simen1:
According to the simplification above, MF has a 1-stop advantage over FF in sensor size, while FF has a 2-stop advantage over MF in terms of available apertures. If the situation were the same between APS-C and FF, then APS-C would have f/0.7 primes and f/1.4 zooms, while FF "only" has f/1.4 primes and f/2.8 zooms. (I'm going way off topic now, but I actually have a 6x zoom lens with a constant f/1.0 aperture, though that's for a much smaller format. It was a stupid buy because I haven't been able to use it on my Q yet.)
Well, the simplification doesn't work when the camera that gets only 0.3 EV on top of the D810 (still FF) costs $3000, the one that adds 0.6-0.7 EV on top of the same D810 costs $7000, and the one that rounds that up to 1 EV, like the Phase One, costs $30-40K. Prices grow exponentially for a very small gain each time. The approximation also makes us forget that, in the current state of affairs for ISO, there is more like a 1.5 EV high-ISO difference between APS-C and the best-performing FF, not the 1.1 EV the theory would suggest.

It also makes us forget that for high ISO/noise, in the past ten years we easily gained 3 EV from sensors and another 2-3 EV from noise-removal software. In comparison, the rest looks quite small. Just switching to DxO Prime instead of Lightroom could make as much of a difference as switching formats.

On the opposite end, when you are shooting at base ISO (for me, most of the time), the noise level is so low that this comparison of noise per sensor format is simply not relevant, even on APS-C. And believe it or not, people don't buy a 645Z or a Phase One to shoot a black cat on a moonless night at ISO 25600 and f/2.


Originally posted by Simen1:
This sounds like a statement that could have come from an LP and speaker-cable salesman. It could easily be fended off by referring to all the excellent photos taken with phone cameras in the hands of good photographers. I absolutely respect and admire good photographers, but I think it's not just about the people. Why would anyone buy an expensive camera instead of a phone if that were 100% true?
Ask yourself, man: the gear exists, is very expensive, and is actually used by many pros, some quite well known and respected. So either they got ripped off by the sales guy, or there is a reason they didn't buy a smartphone instead, or even an FF. And most of them have MANY cameras, so they have the MF on top of 1-2 FF bodies, and maybe a small compact or APS-C for vacations or to prepare a shoot.

I was like you about audio cables. Ironically, I have 10 meters of audio cable between my computer and amplifier, and I can also use WiFi plus the digital input from a box. With the analog cable you can hear a soft bass noise when there is no music; with WiFi plus digital (optical) out, perfectly crisp sound. You can repeat this for the speakers: my 5.1 system has maybe 25m of cable. My speakers came with high-quality cable, and I tried the more basic cable I had before. There was quite a difference.

Just because you didn't try, and because you don't understand why there should be a difference, doesn't mean there isn't one. And in any case, even if you understand the difference, that doesn't mean you have to buy the best, nor that the people who actually buy it are stupid.

Last edited by Nicolas06; 03-06-2016 at 09:46 AM.
03-06-2016, 09:52 AM - 1 Like   #756
Veteran Member
falconeye's Avatar

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Wrt the discussion started by @Simen1 ...

His arguments are all sound and valid.

Speaking generally however, there is, for any given image quality and any given moment in time, a sweet spot (for sensor size) which delivers this image quality in the most cost-effective way (read, at the lowest possible price).

I tried to elaborate a bit about this topic in my blog.

Two things to observe:
  1. The sweet spot wanders towards larger sensor sizes over time.
  2. The sweet spot wanders towards larger sensor sizes when increasing the given image quality.
The reason for #2 is that beyond some point, the cost of implementing the lens grows much faster than the cost of implementing the sensor. Think of the cost of f/0.5 lenses ...

The reason for #1 is that the cost of silicon decreases over time, while the cost of glass decreases less rapidly or even increases.

To make things worse, #1 and #2 combine into a cumulative effect favoring larger sensor sizes over time, more rapidly than some people believe.
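A toy Python model of the two observations above. All numbers here are invented for illustration; only the curve shapes matter: lens cost rises steeply as the required f-ratio gets faster, while sensor cost is linear in area and falls over time.

```python
# Toy "sweet spot" model: for a fixed target image quality, a smaller
# sensor needs a proportionally faster lens, and fast glass gets
# expensive much quicker than big silicon does.
SENSOR_AREA = {"m43": 225, "APS-C": 370, "FF": 864, "44x33": 1452}  # mm^2
FF_F_RATIO_FOR_TARGET = 2.8  # f-ratio that reaches the target IQ on FF

def total_cost(fmt, year):
    crop = (864 / SENSOR_AREA[fmt]) ** 0.5
    f_ratio = FF_F_RATIO_FOR_TARGET / crop     # smaller sensor -> faster lens
    lens = 200 * (2.8 / f_ratio) ** 3          # cost explodes with lens speed
    sensor = 0.8 * SENSOR_AREA[fmt] * 0.85 ** (year - 2016)  # silicon cheapens
    return lens + sensor

def sweet_spot(year):
    return min(SENSOR_AREA, key=lambda f: total_cost(f, year))

print(sweet_spot(2016), sweet_spot(2026))  # the sweet spot moves up over time
```

With these made-up coefficients the cheapest format for the target quality is FF in 2016 but 44x33 a decade later, which is exactly the wandering sweet spot described above.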

This is a general fact which can't be denied as it can be proven rigorously.

There are a few interesting corollaries to the above:
  • Pentax will vanish from the enthusiast market if they don't upgrade to FF at some moment in time (one may still argue about the correct moment though; for many it is just right now, for some too early, and for some, including myself, it is about 4 years too late).
  • FourThirds will be adopted by smaller image-quality standards (such as compacts) and will become extinct in the enthusiast market.
  • Because small lenses (and small cameras) are a most-wanted item, there will be such lenses for, e.g., full frame in the future, just like professional-grade F/4 primes and F/5.6 zooms. This is still a large gap in the market and, IMHO, one of a few ways Pentax could differentiate and gain market share. That's also one of the strengths of the Leica line-up.
  • As the difference between FF and cropped 44x33 medium format is so small (0.8x crop factor), all current medium format vendors will have to provide an upgrade path to full medium format at an affordable price. Except maybe Leica, which replaces the Leica S with the Leica SL.
  • A fixed-lens 44x33 medium format mirrorless camera must be imminent. The question is who delivers it first ...
  • There is an option ... that Canon and/or Nikon, when eventually jumping to professional-grade mirrorless, do it with cropped 44x33 medium format. It would be affordable enough, would save their DSLR business from cannibalization, would allow them to introduce a new mount with no questions asked, and would bridge AF performance until multi-pixel AF achieves professional tracking performance (which it eventually will; btw, Sony just delivered a Dual-Pixel AF sensor to Samsung ...).
  • ... many more corollaries exist; think about it for a few minutes ...
03-06-2016, 10:17 AM   #757
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by falconeye:
Wrt the discussion started by @Simen1 ...

His arguments are all sound and valid.
Only partially. People don't buy MF for high ISO; they don't necessarily buy it for more MP either, and they don't buy it for maximum aperture / shallow depth of field.

Silicon prices decrease over time, but it is more complex than that. Most of the leverage in silicon comes from putting more into the same surface area, which is not that interesting for sensor size. It may bring 4K, and later 8K or BSI, but it doesn't mean a huge chip is going to cost nothing or consume no power. Intel predicts they will stagnate by 2020, and if you've noticed, processors haven't evolved much in the past 10 years. What happened is that we managed to close the gap between smartphones and desktops, not that processors overall are that much better today. They are maybe 2-4x faster, while at the previous rate of evolution they should have been 30-50x faster.

It is not clear how long prices will keep going down, and it is not clear either how big a factor this is in the price of the gear anyway. With enough volume, an FF camera like the K-1 could sell for $500, and the latest 70-200 too. MF would be maybe $2-3K. But the volume isn't there; worse, the market shrinks significantly each year. Sales are dropping drastically, and if current prices drop as a desperate move to win back volume and sales, manufacturers might find themselves in a position where the R&D to go bigger/better simply proves too costly to justify.

As long as there is nothing better and competitors don't do better, there is no reason to improve. All manufacturers dropped prices in Japan just after Pentax announced its FF for $1800; that's no random event. It will not happen again for no reason.

But it doesn't sound like they have the money to invest or the market to sell the gear either. Quite the contrary.

03-06-2016, 11:08 AM   #758
Senior Member




Join Date: Jun 2010
Photos: Gallery | Albums
Posts: 120
Originally posted by Nicolas06:

Silicon prices decrease over time, but it is more complex than that. Most of the leverage in silicon comes from putting more into the same surface area, which is not that interesting for sensor size. It may bring 4K, and later 8K or BSI, but it doesn't mean a huge chip is going to cost nothing or consume no power. Intel predicts they will stagnate by 2020, and if you've noticed, processors haven't evolved much in the past 10 years. What happened is that we managed to close the gap between smartphones and desktops, not that processors overall are that much better today. They are maybe 2-4x faster, while at the previous rate of evolution they should have been 30-50x faster.
Sorry for moving part of the discussion off-topic, but I have to object to the statements about processor performance. Processors have evolved according to Moore's law over most of the last decade, although there has been a slight slowdown over the last few years (since the 22nm node in 2012) because miniaturizing circuits is starting to reach the edge of what is possible. Intel's prediction of stagnation by 2020 has to do with that, and the only way past it is to move on to a better technology platform than silicon.

Here is a source: https://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/
See the first two graphs under 'Raw Compute Performance'. The blue lines show the performance increases of Intel's Xeon processors; the most expensive consumer processors are usually not too far from those in performance. As you can see, between 2007 and 2014 there was an increase from 100 to 1400 GFlops in single-precision and from 50 to 700 GFlops in double-precision calculations. That is a factor of 14 in 7 years, which is consistent with Moore's law (which would predict a factor of 8 in 6 years and a factor of 16 in 8 years, although that is an oversimplified interpretation of Moore's law).
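The factor-of-14 claim is easy to check; a one-liner in Python, using the doubling-every-two-years reading of Moore's law (the numbers are the ones read off the linked graphs):

```python
# Observed Xeon speedup 2007 -> 2014 versus a doubling-every-two-years
# reading of Moore's law.
observed = 1400 / 100       # single precision: 14x in 7 years
predicted = 2 ** (7 / 2)    # ~11.3x if performance doubles every 2 years
print(observed, round(predicted, 1))
```

The observed factor slightly exceeds the two-year-doubling prediction, which is why it can reasonably be called consistent with Moore's law.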

It's easy to think processors have stagnated, because not many people need the full processing power of the newest CPU. Their PCs are often limited instead by memory bandwidth or GPU performance, and very few people still buy PCs with high-end CPUs (which used to be much more common when the CPU was often the bottleneck in the system). Even older CPUs can run almost any task thrown at them, so people feel no need to upgrade, but that does not mean the technology isn't progressing.

Similarly, while mobile processors have made huge leaps in performance, they are still very far away from competing with desktop processors. The numbers (number of cores and clock speed) might make them look similarly fast, but they are running a different instruction set, and therefore their speed cannot be compared using those numbers.

Getting back to cameras, one could ask whether there is a Moore's law for sensors. I haven't been following the market long enough to know, but it seems reasonable to assume that, unlike with CPUs, we aren't near the edge of what is possible yet. Digital camera sensors are a much newer technology, and the investments made by industry to push the technology to its edge aren't on the same scale.
03-06-2016, 11:44 AM   #759
Veteran Member
falconeye's Avatar

Join Date: Jan 2008
Location: Munich, Alps, Germany
Photos: Gallery
Posts: 6,871
Originally posted by Nicolas06:
Silicon prices decrease over time, but it is more complex than that. Most of the leverage in silicon comes from putting more into the same surface area.
What xandos said.

And I'll add to it. Even if your argument held true (it doesn't), the process cost for a given area of silicon and a given process would still decrease dramatically over time. That's because of the unavoidable depreciation of old technology / old machines / old fabs in a stagnating world. Canon in particular now works with very old (more than ten-year-old) fabs and can still maintain its market share despite competitors using ten-years-newer tech. If they wanted, Canon could deliver some extremely low-priced FF cameras ...

But what are we discussing?

That sensor sizes for a given budget have grown is an established fact by now. Just look at the market in 2000, 2005, 2010 and 2015, and it should be strikingly clear to everybody.
03-06-2016, 11:45 AM   #760
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by xandos:
Here is a source: https://www.karlrupp.net/2013/06/cpu-gpu-and-mic-hardware-characteristics-over-time/
See the first two graphs under 'Raw Compute Performance'. The blue lines show the performance increases of Intel's Xeon processors; the most expensive consumer processors are usually not too far from those in performance. As you can see, between 2007 and 2014 there was an increase from 100 to 1400 GFlops in single-precision and from 50 to 700 GFlops in double-precision calculations. That is a factor of 14 in 7 years, which is consistent with Moore's law (which would predict a factor of 8 in 6 years and a factor of 16 in 8 years, although that is an oversimplified interpretation of Moore's law).
Before, you basically got increases in frequency: if you doubled the frequency and kept the same architecture, boosted the memory bandwidth and cache a bit, you got nearly 2x performance on all programs. Maybe only 1.8x or 1.9x, but it was near 2x.

That hasn't happened for many years now; each new processor generation brings 5-15% more efficiency at the same frequency, every 2-4 years.

Then there was the multicore attempt. The idea is that you can do more in parallel, running arbitrary computations on each core. We got up to 4 cores on most desktops; most laptops are actually 2, and only servers have 8 or more. That's partly because it is difficult to speed things up by running more tasks in parallel: outside of a few areas (graphics, scientific computation), general-purpose programs don't gain much from this approach. That's basically why we don't have more than 4 cores on the desktop. On servers it works quite well, because when many clients are connected it is easy to handle each client request independently, as they really are independent requests. Anyway, this avenue is now stagnating too: issues like memory bandwidth and cache size make it difficult to scale massively for the kind of applications we have today.
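The limit on multicore gains described above is usually quantified with Amdahl's law (my framing; the post doesn't name it): if a fraction p of a program can run in parallel, n cores give a speedup of 1/((1-p) + p/n).

```python
# Amdahl's law: the serial fraction of a program caps the speedup
# you can get from adding cores, no matter how many you add.
def amdahl_speedup(p, n):
    return 1 / ((1 - p) + p / n)

# Even a 90%-parallel program never exceeds 10x, however many cores:
print(round(amdahl_speedup(0.9, 4), 2))     # -> 3.08
print(round(amdahl_speedup(0.9, 1000), 2))  # -> 9.91
```

This is why servers (many truly independent requests, p close to 1) scale well with cores while typical desktop programs do not.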

Then there is the least useful gain of all: massively parallel, identical computations to bump numbers like floating-point operations per second. This sounds great in benchmarks, but again it works mostly for graphics or scientific computing. For your typical desktop, outside games and photo/video editing, it is useless, and it doesn't help the typical web server either. It is not really progress, because devices like that have existed for years to handle graphics and scientific computation: they're called GPUs. Yes, thanks to that, a modern CPU matches the processing power of a GPU from 10 years ago. That's about it. And it doesn't have the memory bandwidth to really keep up. People who really need that kind of performance go with clusters of GPUs, not CPUs.

So I stand by my point: we stagnate. The day we stopped increasing frequency, progress slowed down dramatically. Of course, thanks to dozens of billions in investment it still improved and continued for a while, albeit slower and slower, with less and less visible improvement.

I mean, a benchmark does 3 things:
- generates heat and consumes energy (the most visible effect)
- allows the manufacturer to put up fancy specs and say they have the biggest one
- allows the client to feel like he has the biggest one

But if actual applications don't follow...

We will see if hardware neural networks or quantum computing change things. But we are at the end of what the current CPU model can bring, or at least we don't innovate nearly as fast as before.

Last edited by Nicolas06; 03-06-2016 at 11:51 AM.
03-06-2016, 11:47 AM   #761
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by falconeye:
What xandos said.

And I'll add to it. Even if your argument held true (it doesn't), the process cost for a given area of silicon and a given process would still decrease dramatically over time. That's because of the unavoidable depreciation of old technology / old machines / old fabs in a stagnating world. Canon in particular now works with very old (more than ten-year-old) fabs and can still maintain its market share despite competitors using ten-years-newer tech. If they wanted, Canon could deliver some extremely low-priced FF cameras ...
The reverse is true: investment in new fab technology grows exponentially as gates get smaller. Basically there is Intel... and that's it. The others are far behind. It is more that Canon is 20 years behind and the others only 10. And Intel doesn't make sensors; it is just not worth it money-wise.

Last edited by Nicolas06; 03-06-2016 at 11:53 AM.
03-06-2016, 12:02 PM   #762
Pentaxian
D1N0's Avatar

Join Date: May 2012
Location: ---
Photos: Gallery
Posts: 6,802
When foundries make the step to 450mm wafers (up from 300mm), prices will decrease. The full-frame sensor became affordable at the end of 2012, when the D600 and later the D610 were launched. Pentax should have been ready so they could have launched a full frame in 2013, but Hoya probably didn't want to invest in that, so they had to wait for Ricoh to give the go-ahead.
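The gain from a wafer-size step is straightforward geometry; a rough Python estimate for full-frame-sized dies (an upper bound that ignores edge loss and defect yield):

```python
import math

# Rough upper bound on dies per wafer: wafer area divided by die area.
# Moving from 300mm to 450mm wafers gives (450/300)^2 = 2.25x the area.
def max_dies(wafer_mm, die_w, die_h):
    return math.floor(math.pi * (wafer_mm / 2) ** 2 / (die_w * die_h))

print(max_dies(300, 36, 24), max_dies(450, 36, 24))  # 36x24mm FF dies
```

Real layouts lose more at the edges for large dies, but the ballpark 2.25x is why bigger wafers matter for big sensors.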
03-06-2016, 12:25 PM   #763
Pentaxian




Join Date: Feb 2015
Photos: Gallery
Posts: 12,237
Originally posted by Nicolas06:
Before, you basically got increases in frequency: if you doubled the frequency and kept the same architecture, boosted the memory bandwidth and cache a bit, you got nearly 2x performance on all programs. Maybe only 1.8x or 1.9x, but it was near 2x.
Originally posted by Nicolas06:
The reverse is true: investment in new fab technology grows exponentially as gates get smaller. Basically there is Intel... and that's it. The others are far behind. It is more that Canon is 20 years behind and the others only 10. And Intel doesn't make sensors; it is just not worth it money-wise.
You are a software guy. You are referring to what is called Moore's law: the VLSI front-end process node doubling density every 18 months (https://en.wikipedia.org/wiki/Moore's_law and https://en.wikipedia.org/wiki/International_Technology_Roadmap_for_Semiconductors ). It does provide compounded benefits: as transistor size decreases, bias voltages decrease, as do gate and channel capacitance; therefore smaller transistors offer more function per square mm, more speed and less power dissipation (more MIPS per mW). An image sensor, however, is mostly a large matrix of analog circuitry which does not follow the continuous improvement described by Moore's law; improvement of image sensors is slower and much more limited.
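The compounded benefits listed above are classic Dennard scaling (my label; the post lists the effects without naming them). A small Python sketch of the idealized rules, which held until roughly the mid-2000s:

```python
# Idealized Dennard scaling: shrink linear dimensions by k, and voltage
# and capacitance shrink by k too, while frequency rises by k.
def dennard(k):
    density = k ** 2                                   # transistors per mm^2
    frequency = k                                      # switching speed
    power_per_transistor = (1 / k) * (1 / k) ** 2 * k  # ~ C * V^2 * f
    power_density = density * power_per_transistor     # stays constant
    return density, frequency, power_density

print(dennard(2))  # -> (4, 2, 1.0)
```

Constant power density is what made each node a free lunch for digital logic; analog sensor circuitry gets none of these benefits automatically, which is the point being made.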

As far as cost / image quality, I think Falc is right. But he does not comment about the size and weight of the glass to put in front of a larger sensor. One could study how to reduce the weight and size of lenses for larger sensors (referring to the new Nikon 300mm prime lens).

Last edited by biz-engineer; 03-06-2016 at 12:37 PM.
03-06-2016, 01:10 PM   #764
Veteran Member




Join Date: Nov 2013
Posts: 4,854
Originally posted by biz-engineer:
An image sensor is mostly a large matrix of analog circuitry which does not follow the continuous improvement described by Moore's law; improvement of image sensors is slower and much more limited.
Agreed, I was not clear in my wording, but that was part of the point for me. Most silicon improvement comes from Moore's law, which doesn't really work anymore (as said officially by Moore himself), and that law never applied that much to sensors.

I think the circuitry and designs like BSI do benefit from it, though. But things evolve much more slowly anyway, as you said.

And this is my personal opinion: as the market shrinks, and fewer and fewer people are willing to buy a new camera because many find the old one still works and is good enough, this will reduce investment and R&D... slowing things down even more.

We might not have gotten half the progress of the last 5 years if we hadn't benefited from sensor research for smartphones.
03-06-2016, 01:25 PM   #765
Senior Member




Join Date: Jun 2010
Photos: Gallery | Albums
Posts: 120
Originally posted by Nicolas06:
Before, you basically got increases in frequency: if you doubled the frequency and kept the same architecture, boosted the memory bandwidth and cache a bit, you got nearly 2x performance on all programs. Maybe only 1.8x or 1.9x, but it was near 2x.

That hasn't happened for many years now; each new processor generation brings 5-15% more efficiency at the same frequency, every 2-4 years.

There was the multicore attempt. The idea is that you can do more in parallel, running arbitrary computations on each core. We got up to 4 cores on most desktops; most laptops are actually 2, and only servers have 8 or more. That's partly because it is difficult to speed things up by running more tasks in parallel: outside of a few areas (graphics, scientific computation), general-purpose programs don't gain much from this approach. That's basically why we don't have more than 4 cores on the desktop. On servers it works quite well, because when many clients are connected it is easy to handle each client request independently, as they really are independent requests. Anyway, this avenue is now stagnating too: issues like memory bandwidth and cache size make it difficult to scale massively for the kind of applications we have today.

Then there is the least useful gain of all: massively parallel, identical computations to bump numbers like floating-point operations per second. This sounds great in benchmarks, but again it works mostly for graphics or scientific computing. For your typical desktop, outside games and photo/video editing, it is useless, and it doesn't help the typical web server either. It is not really progress, because devices like that have existed for years to handle graphics and scientific computation: they're called GPUs. Yes, thanks to that, a modern CPU matches the processing power of a GPU from 10 years ago. That's about it. And it doesn't have the memory bandwidth to really keep up. People who really need that kind of performance go with clusters of GPUs, not CPUs.

So I stand by my point: we stagnate. The day we stopped increasing frequency, progress slowed down dramatically. Of course, thanks to dozens of billions in investment it still improved and continued for a while, albeit slower and slower, with less and less visible improvement.

I mean, a benchmark does 3 things:
- generates heat and consumes energy (the most visible effect)
- allows the manufacturer to put up fancy specs and say they have the biggest one
- allows the client to feel like he has the biggest one

But if actual applications don't follow...

We will see if hardware neural networks or quantum computing change things. But we are at the end of what the current CPU model can bring, or at least we don't innovate nearly as fast as before.
You might disagree with taking the number of GFlops as a useful indication of how fast a computer is. However, in my opinion it's the only useful number, as a CPU is a data-processing machine. The fact that most software does not manage to use the increased computational power in newer CPUs does not mean the CPUs are not faster; it means that software isn't being optimized for newer hardware quickly enough. At a low level, the speed of a processor should be measured by how quickly it can run operations on data, and that is what GFlops measures.

I also agree that graphics/scientific computations have benefited the most so far from newer CPUs. That is partly because those are two applications that heavily rely on CPU speed, and therefore need to be optimized for new hardware (as far as they still run on CPUs; many scientific computations are now run on GPUs). Most other programs simply don't need the full speed of the CPU.

I can't say anything about neural networks, but about quantum computing: don't expect a working consumer quantum computer in the next 2 decades.