Invest the money in a good 4K monitor, and skimp everywhere else.
Fast RAM is probably a good idea (most CPUs have < 8 MB of CPU cache, and RAW images tend to be much bigger!).
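To put rough numbers on that (the sensor size and bit depth below are typical assumed values, not from any particular camera):

```python
# Back-of-the-envelope: one RAW frame vs. a typical desktop CPU cache.
# 24 MP sensor at 14 bits per pixel is an assumed, common-ish example.
width, height = 6000, 4000            # 24 MP sensor (assumed)
bits_per_pixel = 14                   # common RAW bit depth
raw_mb = width * height * bits_per_pixel / 8 / 1024 ** 2

cache_mb = 8                          # generous desktop CPU L3 cache

print(f"RAW frame ~{raw_mb:.0f} MB vs {cache_mb} MB cache")  # → RAW frame ~40 MB vs 8 MB cache
```

So a single frame is roughly five times the size of the whole cache before you've done anything with it, which is why main-memory speed ends up mattering.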
Get a good SSD. If you really want to push the boat out, get one of the 512 GB Samsung M.2/PCIe SSDs (approx 4x faster than a normal SATA SSD).
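As a rough sanity check on that 4x figure (both throughput numbers below are ballpark assumptions, not measurements):

```python
# Ballpark sequential-read throughput in MB/s; both figures are assumptions.
sata_ssd = 550     # SATA III interface tops out around here
nvme_ssd = 2200    # early Samsung M.2/PCIe drives were in this region

print(f"~{nvme_ssd / sata_ssd:.1f}x")  # → ~4.0x
```

The point is that a SATA drive is capped by its interface, while the PCIe drive isn't, so the gap shows up mainly on large sequential reads like pulling RAW files off disk.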
As for graphics cards, these are easy to upgrade in the future if needed. I'd be tempted to just spend the money on a monitor, and upgrade if you feel you need to.
Most image processing is I/O bound, not CPU bound (I speak from experience here; my day job is writing CPU/GPU-optimised code for graphics processing). *IF* the code is optimised for the GPU, then it can be a huge performance win: a good GPU will typically have memory running at effective speeds above 5000 MHz, compared with main system RAM at 1600–2133 MHz.

I was under the impression that Lightroom was starting to utilise pixel & compute shaders in OpenGL 3/4, but I wasn't aware of any CUDA requirement. (Honestly, as a software developer, I can't see why anyone would target CUDA when OpenGL/OpenCL has wider support.)

Personally speaking (as a developer), I've always found the ATI drivers to be better than the Nvidia ones (admittedly, updating ATI drivers can be a ballache, usually requiring some 3rd-party driver-removal tool). For actual OpenGL/OpenCL support though, ATI is always better (unless you're using Linux, maybe).
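The raw clock comparison actually understates the gap, because the GPU also has a much wider memory bus. A rough peak-bandwidth sketch (the bus widths and channel count are assumed typical values, not specs for any particular part):

```python
# Rough peak memory bandwidth in GB/s. Bus widths/channels are assumptions.
gpu_transfers = 5000e6        # GDDR5 at an effective 5000 MT/s
gpu_bus_bits = 256            # mid/high-end card (assumed)
gpu_gbps = gpu_transfers * gpu_bus_bits / 8 / 1e9

ram_transfers = 2133e6        # DDR3-2133
ram_bus_bits = 64             # per channel
channels = 2                  # dual-channel desktop board (assumed)
ram_gbps = ram_transfers * ram_bus_bits / 8 / 1e9 * channels

print(f"GPU ~{gpu_gbps:.0f} GB/s, system RAM ~{ram_gbps:.0f} GB/s")  # → GPU ~160 GB/s, system RAM ~34 GB/s
```

Call it a 4–5x bandwidth advantage for the card, on top of the massively parallel compute, which is where the "huge win" comes from when the code actually uses it.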
Just checked here:
https://helpx.adobe.com/lightroom/kb/lightroom-gpu-faq.html
Yup, OpenGL 3.3 only (which even Intel's onboard GPUs will support).
The only practical reason an i7 may be better than an i5 is simply the larger cache memory on the CPU. (That's also the only reason you'd consider a Xeon over an i7.) A 'K'-series i5 or i7 will allow you to use higher-than-normal memory speeds, but to be honest it's going to be hard to go wrong with any desktop offering right now. Invest in the monitor and a good SSD (or two), and then any i3/i5/i7 you throw at it will be a joy to work with.