Originally posted by jsherman999 Are you fluent in python & perl, do you know Linux, are you
NAC'd, can you work from home and what's your hourly rate?
Yes, yes, yes, not willing to do clearance work right now (did a service academy, no thanks), prefer a real job that I can come in to every day, and as of today I'm no longer working hourly. Respectively.
Originally posted by Erictator So, are you willing to bet your "real-world service business" on quantum computing? Yes or no will do.
I mean, it's a pretty straightforward test to see who's an idiot in the field: if you think QUANTUM is the answer to all your problems, you're an idiot who's going to run your business into the ground.
Your "internet real business" means as much as my "internet real qualifications". Diddly squat, if what you're saying doesn't make sense.
Quote: As to how any of this relates to camera CPUs leaves a lot to be desired... heck, my cell phone is a quad-core nowadays... but the bottlenecks are always the lowest common denominator: bus and cache memory, as well as storage memory speed, which has to be able to keep up; plus power and heat, which in total has to be within reason for LSI in something as small as a camera running on a little battery that's expected to be good for at least 300 actuations, with a flash thrown in occasionally, on a single one-hour charge.
Eric
So give me a rough estimate, in computations per watt, of how much an ASIC consumes relative to a general-purpose phone CPU. It should be interesting, because the difference is orders of magnitude.
If you want to get a lot done on a little power - you program an FPGA or get an ASIC burned, same as it's been for the last 40 years.
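To make the perf-per-watt point concrete, here's a back-of-envelope sketch. All the efficiency figures below are illustrative assumptions (rough ballpark numbers for a general-purpose mobile CPU vs. a fixed-function ASIC, not measurements from any specific chip); the takeaway is only the order-of-magnitude gap, not the exact values.

```python
# Back-of-envelope energy-efficiency comparison: fixed-function ASIC
# vs. general-purpose phone CPU. Figures are assumed, not measured.

# Assumed efficiency in operations per joule (= ops per watt-second):
phone_cpu_ops_per_joule = 5e9   # ~5 GOPS/W, rough general-purpose figure
asic_ops_per_joule = 1e12       # ~1 TOPS/W, rough fixed-function figure

ratio = asic_ops_per_joule / phone_cpu_ops_per_joule
print(f"ASIC advantage: ~{ratio:.0f}x")

# Same workload, same battery: say demosaicing a 24 MP frame at a
# (hypothetical) ~100 ops/pixel costs ~2.4e9 ops per frame.
ops_per_frame = 24e6 * 100
joules_cpu = ops_per_frame / phone_cpu_ops_per_joule
joules_asic = ops_per_frame / asic_ops_per_joule
print(f"CPU:  {joules_cpu:.3f} J/frame")
print(f"ASIC: {joules_asic:.6f} J/frame")
```

With those assumed numbers the ASIC does the same frame for hundreds of times less energy, which is exactly why battery-powered cameras ship fixed-function silicon instead of a phone SoC.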
Quote: My guess is that large files will get easier to handle at about the same rate as the electronics progress to handle the throughput, meaning it will be a net break-even, with modest gains over time in FPS as new sensors come out. The PC tech will follow the same curve, and the latest output images will be handled within reasonable limits on current-generation PCs, or the manufacturer wouldn't bother making a camera for the average consumer that couldn't handle it.
Agreed - the person who said that new CPUs and memory are outpacing sensors by a factor of two is absolutely correct. The PCs aren't the limiting factor here; the diffraction limit is. Quantum doesn't do crap for cameras. If your computer is derpy in TYOOL 2014 - it's your fault.
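The diffraction-limit claim is easy to sanity-check with the standard Airy-disk formula, d = 2.44 * lambda * N. A minimal sketch, assuming green light (550 nm) and a ~3.9 um pixel pitch (roughly a 24 MP APS-C sensor) - the sensor figure is illustrative, not tied to any particular camera:

```python
# Sketch of the diffraction limit: past a certain f-number, optics, not
# the CPU or sensor, cap resolution. Assumed, illustrative numbers only.

WAVELENGTH_M = 550e-9  # green light, middle of the visible band

def airy_disk_diameter(f_number, wavelength=WAVELENGTH_M):
    """Airy disk diameter (to first minimum): d = 2.44 * lambda * N."""
    return 2.44 * wavelength * f_number

# Assumed pixel pitch: ~3.9 um, roughly a 24 MP APS-C sensor.
pixel_pitch_um = 3.9

for n in (2.8, 5.6, 8, 11):
    d_um = airy_disk_diameter(n) * 1e6
    # Crude criterion: once the Airy disk spans ~2 pixels, finer pixels
    # stop buying you resolution.
    limited = "diffraction-limited" if d_um > 2 * pixel_pitch_um else "sensor-limited"
    print(f"f/{n}: Airy disk ~{d_um:.1f} um -> {limited}")
```

By f/8 the assumed Airy disk (~10.7 um) already spans more than two of those pixels, so cramming in more megapixels or a faster CPU changes nothing about the resolved detail.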
Last edited by Paul MaudDib; 10-31-2014 at 08:27 AM.