Originally posted by kenyee: the cost is directly proportional to the area your chip takes up on the wafer (wafer cost is fixed, and the bigger the die, the more likely it is to contain a defect, so yield goes down as size goes up).
Right.
But try actually computing the cost of a wafer for a given process, the yield (defect density), and the waste (edge loss when placing rectangular dies on a round 300mm wafer).
I did, using the best public sources (Chipworks and others), and the cost is nowhere near what current market prices suggest. I posted it here a couple of months back. Maybe you have better sources and can redo the exercise?
BTW, if image sensors can be made on an older process node, the machines are much cheaper or already depreciated, and machines are the dominant part of wafer cost.
Edit: a quick search tells me a 300mm wafer should cost significantly below $10,000. Maybe I should cross-check by going from DRAM prices and known DRAM silicon real estate. A wafer minus edge waste gives about 60 FF dies, of which roughly 40 are good according to the defect density numbers I could find (about two-thirds yield). At $10,000 per wafer, that is $250 per good FF die, or less. Certainly not a decisive factor in overall cost. We're not talking about cost here; we're talking about what the market is willing to pay. And as long as we keep paying $1,000+ for APS-C cameras, FF prices will stay high too.
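Here's the back-of-envelope in Python, if anyone wants to play with the inputs. The wafer cost is the sub-$10,000 upper bound above, and the defect density is an assumption I picked so the output lands near the ~40 good dies I quoted; swap in better numbers if you have them:

```python
import math

# Back-of-envelope cost per good 24x36mm full-frame die on a 300mm wafer.
# All inputs are assumptions, not measured figures.
wafer_cost = 10_000.0        # $ per processed 300mm wafer (upper bound)
wafer_diameter = 300.0       # mm
die_w, die_h = 36.0, 24.0    # full-frame die, mm
defect_density = 0.045       # defects per cm^2 (assumed, tuned to ~40 good dies)

die_area = die_w * die_h                          # mm^2
wafer_area = math.pi * (wafer_diameter / 2) ** 2  # mm^2

# Standard gross-die-per-wafer approximation; the subtracted term is edge waste.
gross = wafer_area / die_area - math.pi * wafer_diameter / math.sqrt(2 * die_area)

# Poisson yield model: probability a die contains zero defects.
good = gross * math.exp(-defect_density * die_area / 100.0)

print(f"gross dies : {gross:.0f}")               # ~59
print(f"good dies  : {good:.0f}")                # ~40
print(f"$ per die  : {wafer_cost / good:.0f}")   # ~$250
```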
What is happening now is that sensor fabs charge a 100% margin (selling the $250 die for $500), and camera makers pile another 150% margin on top, making FF cameras roughly $1,500 more expensive for us. With more competition, those margins would come down.
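And the margin stacking itself, treating each "margin" as a markup on cost (one reading of my loose wording). Strictly multiplied through, 100% then 150% gives about $1,250, so the $1,500 figure implies some extra channel margin on top:

```python
# Margin stacking, with each "margin" read as a markup on cost.
die_cost = 250.0                        # good-die cost from the estimate above
sensor_price = die_cost * (1 + 1.00)    # fab: 100% markup -> $500
ff_premium = sensor_price * (1 + 1.50)  # camera maker: 150% markup -> $1,250

print(f"sensor price : ${sensor_price:.0f}")
print(f"FF premium   : ${ff_premium:.0f}")
```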