Originally posted by Ayoh:
What about the additional costs associated with stitching?
OK, let me redo my analysis with greater care. It has been a while since I last did it ...
I am basically aware of two public sources:
#1
http://smithsonianchips.si.edu/ice/cd/CEICM/SECTION2.pdf (1997)
#2
IC Knowledge - Technology Trends
#1 contains a detailed break-down of 166MHz Pentium CPU cost, at that time one of the most complex chips around:
Defect density was 1.2/cm^2 (DRAM was better at 0.5/cm^2), yield was 37%, wafer cost was $1890, function test yield was 70%, final cost per CPU was $112, and the sales price per 1000 units was $350. That's about $26,000 revenue per wafer. At that time, DRAM was more like $3000 revenue per wafer at a wafer cost of $1500, despite the same yield rate (larger die for 64MB).
Intel has a monopoly on the "Intel" processor and charges about 3x the manufacturing cost as its chip price. DRAM is more like 1.8x. Image sensors may be between 2x (1/2.3") and 2.5x (FF).
#2 suggests that the defect density today is between 0.005/cm^2 and 0.05/cm^2, i.e., between ~0.04 and ~0.43 defects per FF chip (8.64 cm^2). In my model, I use a value of ~0.33 per FF chip. Yield on FF should have improved dramatically in recent years. So, after function test, there should be at least 30 good FF chips sold per 300mm wafer. With a 2.5x ASP, a single FF sensor should sell to Nikon et al. at 1/12 the cost of a 300mm wafer. Let's be pessimistic and say 1/10 the cost.
(Note: #2 contains an offline calculator for good units out, given defect density, die size and wafer size:
http://www.icknowledge.com/misc_technology/die_calculator.xls )
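For reference, the back-of-envelope behind the "at least 30 good chips per wafer" claim can be sketched in a few lines of Python. Two assumptions flagged: I use a Poisson yield model Y = exp(-D*A) and a common edge-loss approximation for gross dies per wafer; the linked die_calculator.xls may well use a different model (e.g., Murphy's), so treat this as a sanity check, not a reproduction of it.

```python
import math

def gross_dies(wafer_diameter_mm, die_w_mm, die_h_mm):
    """Rough gross-die count on a round wafer, edge-loss corrected.

    Common approximation: gross = pi*d^2/(4*A) - pi*d/sqrt(2*A).
    This is a stand-in; the die_calculator.xls model may differ.
    """
    area = die_w_mm * die_h_mm
    d = wafer_diameter_mm
    return math.pi * d**2 / (4 * area) - math.pi * d / math.sqrt(2 * area)

def poisson_yield(defect_density_per_cm2, die_area_mm2):
    """Poisson yield model Y = exp(-D*A) -- an assumption here."""
    return math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)

# FF sensor die: 36 x 24 mm = 864 mm^2 = 8.64 cm^2, on a 300mm wafer,
# with ~0.33 defects per FF chip as assumed in the text.
gross = gross_dies(300, 36, 24)
y = poisson_yield(0.33 / 8.64, 864)
good = gross * y
print(f"gross dies: {gross:.0f}, yield: {y:.0%}, good dies: {good:.0f}")
```

This lands at roughly 59 gross dies, ~72% yield, ~42 good dies before function test; a 70-80% function test yield then brings you to the "at least 30" figure quoted above.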
#2 also suggests that die sizes slowly grow, a result of decreasing defect densities. By 2001 (outside the imaging market), dies had become as large as an APSC sensor (Intel IA64 etc.). This explains why, 10 years ago, the imaging market was clipping at the APSC size and why Olympus, who started early, selected an even smaller die size. But #2 suggests that 10 years later (the corner in their chart), "normal" die sizes should have reached ~1000 mm^2 (I need a source though, although more recent IA64 CPUs seem to be almost as large as an FF die).
Wrt stitching.
You're correct, many reticle machines or steppers don't support such large mask sizes (900 mm^2). However, I guess Sony invested in this and now has large enough reticles without the need for stitching. Two reasons:
- They make 1/2 million FF sensors per year, so it is a cost-cutting investment.
- And the D800 chip is 35.9x24mm rather than 36x24 (like Canon, who do stitching AFAIK). Looks like an optimization: the reticles their machines support are just a tad too small.
Still assuming stitching, the extra cost is small though. According to #1, the cost for masks and reticles is only about 1% of overall cost, although growing faster than average. But the extra cost due to stitching should remain below 10%, which is already covered by our pessimistic 1/10 estimate. BTW, #1 contains an estimate of R&D overhead as well: below 10% too.
So, it all boils down to the cost of a wafer.
I only found this hidden link on the open net; most studies are very expensive to purchase.
#3
http://www.icknowledge.com/economics/A%20Simulation%20Study%20of%20450mm%20W...vision%201.pdf
It is recent (2008) and studies 450mm wafers, thereby revealing the 2008 cost of 300mm wafers: $1936.
Even assuming image sensors need a more expensive process, I think $3000 / 300mm wafer by 2012 is a fair cost estimate.
According to the reasoning above, this translates to an average sale price (ASP) per FF chip of $300.
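Spelling out the arithmetic of that chain (using the numbers assumed above: a $3000 wafer, ~30 good chips after function test, and a 2.5x ASP multiple):

```python
wafer_cost = 3000     # estimated 2012 cost of a 300mm image-sensor wafer, $
good_chips = 30       # good FF chips per wafer after function test (pessimistic)
asp_multiple = 2.5    # assumed sale price as a multiple of manufacturing cost

cost_per_chip = wafer_cost / good_chips     # $100 per chip
asp = cost_per_chip * asp_multiple          # 2.5/30 = 1/12 of wafer cost -> $250
asp_pessimistic = wafer_cost / 10           # the pessimistic 1/10 rule -> $300
print(cost_per_chip, asp, asp_pessimistic)
```

So the 1/12 figure gives $250 per chip, and rounding up to the pessimistic 1/10 gives the $300 ASP quoted above.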
In my previous post, I said $500 and that may be what Sony asks external customers when using column-parallel A/D technology. But that's about it.
So there is still no cost driver that explains why not all cameras are FF now, when they were APSC back in the early days.
Last edited by falconeye; 03-03-2012 at 06:14 AM.