10-22-2020, 03:39 PM   #91
Digitiser of Film
Loyal Site Supporter
BigMackCam

Join Date: Mar 2010
Location: North East of England
Posts: 20,705
Originally posted by awscreo:
I think most of that performance is coming from the P2000. Your pro apps are probably utilizing that a ton.
Possibly. That said, it runs very quickly just off the integrated Intel GPU for most tasks. The place I notice it most is batch processing of raw-to-TIFF exports...
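As an aside, the shape of that batch job is simple enough to sketch. Purely illustrative Python here - it assumes the rawpy and tifffile packages, which aren't the tools I actually use:

[code]
# Minimal sketch: batch-convert a folder of raw files to 16-bit TIFFs.
# Assumes "pip install rawpy tifffile"; folder names and the .dng
# extension are placeholders.
from pathlib import Path

import rawpy
import tifffile

src = Path("raws")
dst = Path("tiffs")
dst.mkdir(exist_ok=True)

for raw_path in sorted(src.glob("*.dng")):
    with rawpy.imread(str(raw_path)) as raw:
        rgb = raw.postprocess(output_bps=16)  # demosaic - the CPU-heavy step
    tifffile.imwrite(dst / (raw_path.stem + ".tif"), rgb)
[/code]

Trivial per file, but run it over a few hundred large raws and that's exactly where the CPU (and GPU, where supported) difference shows up.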

Originally posted by awscreo:
In terms of an i7 in a tablet, the Microsoft Surface had i7 models.
Sure, but as you know there's a huge difference between different i7 processors. Those in the Surface - as with most tablets - are energy-efficient "U" versions. They benchmark considerably slower...

Originally posted by awscreo:
The CPU comment was a reply to your comment about "desktop performance". If it works for you, that's great and no change is needed. But there's always something more powerful.
Indeed, but rarely (if ever?) in tablet form. My point is, tablets are fantastic but not if processing power is your priority. Laptops with higher-end CPUs get rather closer to decent desktops. Which one is better depends entirely on individual requirements.

Originally posted by awscreo:
I don't mean to say that you have to follow what I'm doing with my platforms; it's like suggesting that the music I listen to is what you should be listening to - i.e. pointless and dumb. I was simply sharing my experience with laptops.
Same here

Originally posted by awscreo:
Most of my hate for laptops comes from being stuck with an uber-expensive MacBook Pro for work, and just not getting the performance I need out of it.
Understood. With respect, though, that's a pretty small sample group for a class of machine that ranges from the lowest-powered student laptops with no replaceable components to high-end mobile workstations like my own where every component is performance-orientated and many are upgradeable. Similarly, tablets cover a huge range of specs - some barely better than a half-decent Android phone... others closer to medium-level laptops. Like I said previously, there's a lot of crossover... but they're different devices with different competencies and specialisations...


Last edited by BigMackCam; 10-22-2020 at 04:09 PM.
10-22-2020, 05:46 PM   #92
Veteran Member
awscreo
Join Date: May 2016
Photos: Gallery | Albums
Posts: 2,722
Originally posted by BigMackCam:
Possibly. That said, it runs very quickly just off the integrated Intel GPU for most tasks. The place I notice it most is batch processing of raw-to-TIFF exports...



Sure, but as you know there's a huge difference between different i7 processors. Those in the Surface - as with most tablets - are energy-efficient "U" versions. They benchmark considerably slower...



Indeed, but rarely (if ever?) in tablet form. My point is, tablets are fantastic but not if processing power is your priority. Laptops with higher-end CPUs get rather closer to decent desktops. Which one is better depends entirely on individual requirements.



Same here



Understood. With respect, though, that's a pretty small sample group for a class of machine that ranges from the lowest-powered student laptops with no replaceable components to high-end mobile workstations like my own where every component is performance-orientated and many are upgradeable. Similarly, tablets cover a huge range of specs - some barely better than a half-decent Android phone... others closer to medium-level laptops. Like I said previously, there's a lot of crossover... but they're different devices with different competencies and specialisations...
In fairness, I also owned a regular Acer laptop in the past, and more expensive gaming machines with what seemed to me powerful GPUs. The last one I bought personally was the MSI GE72, with top-of-the-line components. It lasted me maybe two years in terms of gaming performance, and about the same for photography, since I bought a K-1 and went from 16 MP Micro Four Thirds raws to 36 MP full frame.

I happily built a Ryzen/Nvidia-based PC last fall, and will probably never buy another laptop. At least, not in the next few years.

---------- Post added 10-22-20 at 09:00 PM ----------

Out of curiosity, I looked at the benches.
According to this one here, the A13 Bionic chip from Apple outperforms the i7-8750H and goes toe to toe with the newer i9-9880H. I had no idea the Apple chip was so competitive, to be honest.

Intel Core i9-9880H vs Apple A13 Bionic vs Intel Core i7-8750H

PS: we're so off topic lol, kind of hilarious.

Last edited by awscreo; 10-22-2020 at 06:01 PM.
10-22-2020, 07:19 PM   #93
dbs
Pentaxian
Join Date: Jun 2010
Location: Clare Valley S A
Photos: Albums
Posts: 7,568
Original Poster
Dave here

Off topic?
Nah.
Gaming desktops are also a "must be the latest and greatest" thing - you can see a pattern here.
I've got my coffee and Tim Tams, so I'm sweet.


Dave


PS: Yes Noel, he does live at home, rent free (what's happened to the world?)
10-23-2020, 12:56 AM - 1 Like   #94
Site Supporter
Serkevan
Join Date: May 2019
Photos: Albums
Posts: 5,976
Originally posted by slartibartfast01:
Is a laptop really expensive compared to an iPad Pro? Unless it's an Apple laptop, of course.
No, my Asus gaming laptop (upgradeable 12 GB RAM, i7 CPU, 6 GB GPU, SSD+HDD, 17" 75 Hz 100% sRGB screen, so good enough for Lightroom too) was a sharp 1,000€ (tax included) three years ago. That's about 200-300 bucks under an iPad Pro.

For reference, a comparable Apple somethingbook is 2,400€, and I would have to install Windows anyway.

---------- Post added 10-23-20 at 01:00 AM ----------

Originally posted by awscreo:
In fairness, I also owned a regular Acer laptop in the past, and more expensive gaming machines with what seemed to me powerful GPUs. The last one I bought personally was the MSI GE72, with top-of-the-line components. It lasted me maybe two years in terms of gaming performance, and about the same for photography, since I bought a K-1 and went from 16 MP Micro Four Thirds raws to 36 MP full frame.

I happily built a Ryzen/Nvidia-based PC last fall, and will probably never buy another laptop. At least, not in the next few years.

Out of curiosity, I looked at the benches. According to this one here, the A13 Bionic chip from Apple outperforms the i7-8750H and goes toe to toe with the newer i9-9880H. I had no idea the Apple chip was so competitive, to be honest.

Intel Core i9-9880H vs Apple A13 Bionic vs Intel Core i7-8750H

PS: we're so off topic lol, kind of hilarious.
Nah, at 1920x1080 a GTX 1060 runs most of the current stuff at 60 fps if you don't go ham on special effects. The problem is gonna be the next couple of years, because the consoles are *loaded* with hardware this gen... Devs are going to go from sidelining* optimization to ignoring it completely.

The ARM chip uses a different architecture, so benchmarks are a bit iffy, but they look super promising. x86 might be at an end. The benchmark looks weird, though... the ARM chip gets literally the same numbers? On scales that go to 20K points? Repeated in half a dozen tests or more? I don't trust those numbers.

*There is no excuse for stuff that comes out today to both look AND run worse than BF3 or similar early-2010s stuff.


Last edited by Serkevan; 10-23-2020 at 01:05 AM.
10-23-2020, 01:45 AM   #95
Digitiser of Film
Loyal Site Supporter
BigMackCam

Join Date: Mar 2010
Location: North East of England
Posts: 20,705
Originally posted by dbs:
Off topic?
Nah.
Gaming desktops are also a "must be the latest and greatest" thing - you can see a pattern here.
I've got my coffee and Tim Tams, so I'm sweet.
With that seal of approval, I'll continue...

Originally posted by awscreo:
In fairness, I also owned a regular Acer laptop in the past, and more expensive gaming machines with what seemed to me powerful GPUs. The last one I bought personally was the MSI GE72, with top-of-the-line components. It lasted me maybe two years in terms of gaming performance, and about the same for photography, since I bought a K-1 and went from 16 MP Micro Four Thirds raws to 36 MP full frame.

I happily built a Ryzen/Nvidia-based PC last fall, and will probably never buy another laptop. At least, not in the next few years.
Unsurprisingly, we have different use cases. I'm not a gamer... The most CPU- and GPU-intensive stuff I do (and am ever likely to do) is batch photo processing and exports. Frankly, I don't actually need anything like the processing power I already have. I over-spec'd my HP ZBook 15 G5 a couple of years back to allow for OS and software advances and my own possible increase in (non-gaming) demands over a projected period of many years - hence the then top-of-the-line i7-8750H and Quadro P2000, both of which are now considered "long in the tooth". But if the chassis, display and hardware all hold up, I fully expect to be using the same laptop ten years from now with no significant concerns over performance and no need to upgrade... and if any component other than the main board or display fails, I'll slide off the service access cover and replace it myself in minutes.

As a side note, my older, structurally failing consumer-grade HP laptop remains in daily use and is still capable of delivering to my fairly modest requirements. That packs a lowly 4th-gen i7-4700MQ with a GeForce GT 740M GPU, 2TB HDD and no SSD. It's a dinosaur by current standards, but still quite capable of doing what I currently need after eight years of daily thrashing. I'm hoping for even better longevity from my newer ZBook.

Originally posted by awscreo:
Out of curiosity, I looked at the benches. According to this one here, the A13 Bionic chip from Apple outperforms the i7-8750H and goes toe to toe with the newer i9-9880H. I had no idea the Apple chip was so competitive, to be honest.

Intel Core i9-9880H vs Apple A13 Bionic vs Intel Core i7-8750H

PS: we're so off topic lol, kind of hilarious.
I'm not familiar with Apple's own chips, nor especially familiar with its products (other than knowing they exist). I know they were using Intel processors in their high-end MacBook Pro models, but I read somewhere that they're switching over to their own processors going forward. I'm sure they wouldn't do that unless they could at least match - or, preferably, exceed - Intel's performance...

Last edited by BigMackCam; 10-23-2020 at 02:07 AM.
10-23-2020, 06:03 AM   #96
Pentaxian
cmohr

Join Date: Apr 2011
Location: Brisbane. Australia
Photos: Gallery
Posts: 1,824
Unsubscribe.
10-23-2020, 06:17 AM   #97
Veteran Member
awscreo
Join Date: May 2016
Photos: Gallery | Albums
Posts: 2,722
Originally posted by BigMackCam:
I'm not familiar with Apple's own chips, nor especially familiar with its products (other than knowing they exist). I know they were using Intel processors in their high-end MacBook Pro models, but I read somewhere that they're switching over to their own processors going forward. I'm sure they wouldn't do that unless they could at least match - or, preferably, exceed - Intel's performance...
Me neither, to be honest. This iPad Pro is the first Apple product I've liked in ages. I don't really know much about their CPUs' architecture, but it's pretty cool that they seem to be on par (at least in synthetic benchmarks) with Intel/AMD.

My use case also includes working with graphics applications (PS, AI, Figma), photo editing, animation (AE), and 3D packages. Gaming GPUs do quite well in all of those, so it makes sense to get them.

One thing to note - my MSI laptop has a cracked hinge now, which is pretty annoying as it's obviously out of warranty. I guess they spent most of the money on the hardware (which was pretty good for its time: i7-5700HQ, 970M, SSD, etc.), but the build quality is truly not great. I guess that's one thing Apple has going for it - at least the chassis isn't falling apart after a few years (although they have a ton of other hardware issues).

---------- Post added 10-23-20 at 09:20 AM ----------

Originally posted by Serkevan:
Nah, at 1920x1080 a GTX 1060 runs most of the current stuff at 60 fps if you don't go ham on special effects. The problem is gonna be the next couple of years, because the consoles are *loaded* with hardware this gen... Devs are going to go from sidelining* optimization to ignoring it completely.

The ARM chip uses a different architecture, so benchmarks are a bit iffy, but they look super promising. x86 might be at an end. The benchmark looks weird, though... the ARM chip gets literally the same numbers? On scales that go to 20K points? Repeated in half a dozen tests or more? I don't trust those numbers.

*There is no excuse for stuff that comes out today to both look AND run worse than BF3 or similar early-2010s stuff.
I can't go cutting my settings any more :) I built a new Ryzen/Nvidia PC last fall and it's munching all the games at 1440p maxed out. I was planning to get the 3080 when it dropped, but as you probably know, it's easier to find water in the desert than to find one of those GPUs in stock. The new Zen 3 CPUs look very promising too. Maybe I'll wait for the hype to die down and upgrade next summer. A 5950X/3080 20GB machine should be spectacular.

*The worst offender I know is Fallout 76 - that thing looks hideous for 2020 and plays like it's running on an Intel Celeron with Voodoo graphics.

10-23-2020, 06:58 AM   #98
Digitiser of Film
Loyal Site Supporter
BigMackCam

Join Date: Mar 2010
Location: North East of England
Posts: 20,705
Originally posted by awscreo:
Me neither, to be honest. This iPad Pro is the first Apple product I've liked in ages. I don't really know much about their CPUs' architecture, but it's pretty cool that they seem to be on par (at least in synthetic benchmarks) with Intel/AMD.

My use case also includes working with graphics applications (PS, AI, Figma), photo editing, animation (AE), and 3D packages. Gaming GPUs do quite well in all of those, so it makes sense to get them.
I suspect your gaming GPUs are considerably more powerful than my Quadro P2000... having said that, the P2000 is good for the professional applications you mention, such as 3D / CAD, video and photo work. Not quite so good for gaming, I believe, but not bad for a lower-mid-level workstation GPU.

Originally posted by awscreo:
One thing to note - my MSI laptop has a cracked hinge now, which is pretty annoying as it's obviously out of warranty. I guess they spent most of the money on the hardware (which was pretty good for its time: i7-5700HQ, 970M, SSD, etc.), but the build quality is truly not great. I guess that's one thing Apple has going for it - at least the chassis isn't falling apart after a few years (although they have a ton of other hardware issues).
That's where my old HP ENVY 17 failed too... and it happened about six months out of warranty. I've repaired the hinge twice - the last time using epoxy resin... but the whole chassis is plastic and is gradually disintegrating. One day it'll just fall apart and we'll be done. Oh, and the keyboard backlight is on its last legs after I spilled half a glass of wine over it... A few keys were dead for a couple of days, but it all works OK now, except for varied backlight levels across the keyboard. I can't blame that one on HP, though. My ZBook mobile workstation is mostly metal and has a moisture-resistant keyboard with drainage channels to protect against spills that I hope never to test...
10-23-2020, 07:31 AM   #99
Site Supporter
Serkevan
Join Date: May 2019
Photos: Albums
Posts: 5,976
Originally posted by BigMackCam:
I suspect your gaming GPUs are considerably more powerful than my Quadro P2000... having said that, the P2000 is good for the professional applications you mention, such as 3D / CAD, video and photo work. Not quite so good for gaming, I believe, but not bad for a lower-mid-level workstation GPU.
Yeah, apart from gaming, a modest GPU is more than enough to power most applications - including "graphical" stuff like 3D CAD, as that (I think - at least it did before) leans heavily on the CPU. And honestly, horses for courses: it doesn't have to be a good GPU for gaming to be a good GPU for your use. I've always refrained from recommending stuff with monster graphics to people who wouldn't need it... just as my ex and I had to insist a lot to a friend that buying an A6400 with the 18-135 was ridiculous when they weren't even starting to push their RX100... V? VI? One of those two. It's incredible how much stock some people put in gear instead of their skills and needs...

Where I do feel the limits of my PC is in LR, because Adobe insist on the codebase being contemporary with the Torah, so multi-core optimization is nonexistent. I have to give a more serious try to RawTherapee (or was it darktable?), since I heard from Bert that it can use OpenCL and the GPU can pull off much more than a single anemic 2.3 GHz core on my i7-6700...
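Just to illustrate the multi-core point (toy Python, nothing to do with Adobe's actual code): when per-file exports are independent, a process pool spreads them across cores, and the total time drops roughly with the core count. The sleep stands in for the real demosaic/export work:

[code]
# Toy sketch of why multi-core matters for batch exports: independent
# per-file jobs fan out over a process pool, one worker per CPU core.
import time
from multiprocessing import Pool
from pathlib import Path

def export_one(raw_path):
    time.sleep(1.0)  # placeholder for the per-file processing
    return raw_path.name

if __name__ == "__main__":
    files = [Path(f"IMG_{i:04d}.PEF") for i in range(16)]  # dummy names
    start = time.perf_counter()
    with Pool() as pool:  # defaults to one worker per CPU
        names = pool.map(export_one, files)
    print(f"{len(names)} files in {time.perf_counter() - start:.1f}s")
[/code]

Sixteen one-second jobs finish in about two seconds on eight cores instead of sixteen single-threaded... which is exactly the speed-up a single anemic core never sees.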
10-23-2020, 08:21 AM   #100
Pentaxian
Join Date: Feb 2010
Location: Northern Michigan
Photos: Gallery | Albums
Posts: 6,176
Originally posted by surfar:
Anyone doubt that phone capability isn't going to outstrip "most" cameras in the future?
I would not merely doubt it; I would reject it altogether. Admittedly, such contentions involve very complicated issues related to the limits of technology, cognitive human nature, and the complex interaction of mind and matter. Specifically, the romantic notion that technology is always getting "better," so that one can make facile extrapolations into the future (e.g., smartphones will keep getting better ad infinitum until they pass ILCs), is just not true. There are physical limits to advancements in technology, and in sensor technology we've seen no real progress in the last ten years. The decline in camera sales is not a consequence of smartphones, but of the fact that, because of the lack of significant technology improvement, it's becoming increasingly difficult to persuade sensible people to upgrade their gear. If the DxOMark figures can be trusted, no FF camera has provided significant improvement on the ISO performance of the Nikon D3s, which was released in 2009. Smartphones rely heavily on computational technology, but there are limits there as well.
10-23-2020, 08:22 AM - 1 Like   #101
Site Supporter
Join Date: Jan 2013
Location: Hampshire, UK
Posts: 1,654
Interesting what you say about multi-core utilisation in Photoshop. It clearly isn't that efficient. However, Adobe Bridge is. I use it extensively, and when importing and preparing files, all my cores are evenly used.

PShop is sluggish probably because of its aged base design. With 32 GB of RAM there's no limitation when I run PS scripts, but total CPU utilisation only rarely goes above around 30%, with some cores just ticking over. Shame, but I've no real complaints, as I find I multitask while PS is doing its thing - often this is valuable thinking time and the occasional Eureka moment (e.g. "that script will never work"), and time for a coffee 😉
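If you want to watch that for yourself, a few lines of Python with the psutil package (my assumption - not anything Adobe ships) will print per-core load once a second while a script runs:

[code]
# Rough per-core load monitor (assumes "pip install psutil").
# Run it in a second window while Bridge or a PS script is working:
# an even spread means good multi-core use; one hot core means
# single-threaded. Stop it with Ctrl+C.
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{load:5.1f}" for load in per_core))
[/code]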
10-23-2020, 08:31 AM   #102
Veteran Member
awscreo
Join Date: May 2016
Photos: Gallery | Albums
Posts: 2,722
PS and LR have a long way to go in terms of optimizing for modern multi-core CPUs. They still prefer a fast single core, I believe; that's why Intel CPUs were doing better in Adobe benches up until the current AMD generation. But I think Adobe did say they will be working on that (or already are).
10-23-2020, 08:32 AM - 1 Like   #103
Site Supporter
Join Date: Dec 2012
Photos: Gallery | Albums
Posts: 2,809
Originally posted by Serkevan:
Yeah, apart from gaming, a modest GPU is more than enough to power most applications - including "graphical" stuff like 3D CAD, as that (I think - at least it did before) leans heavily on the CPU. And honestly, horses for courses: it doesn't have to be a good GPU for gaming to be a good GPU for your use. I've always refrained from recommending stuff with monster graphics to people who wouldn't need it... just as my ex and I had to insist a lot to a friend that buying an A6400 with the 18-135 was ridiculous when they weren't even starting to push their RX100... V? VI? One of those two. It's incredible how much stock some people put in gear instead of their skills and needs...

Where I do feel the limits of my PC is in LR, because Adobe insist on the codebase being contemporary with the Torah, so multi-core optimization is nonexistent. I have to give a more serious try to RawTherapee (or was it darktable?), since I heard from Bert that it can use OpenCL and the GPU can pull off much more than a single anemic 2.3 GHz core on my i7-6700...
I have a pretty substantial desktop system that I run RawTherapee on, and I'm still sometimes surprised at the processing power and time it takes to do some tasks. It's a Ryzen 9 3.8 GHz 12-core processor, 32 GB of fast RAM, a Sabrent Rocket M.2 drive and an 8 GB Radeon video card. And if I turn on capture sharpening, tone mapping, dynamic range compression... it can take 3-4 seconds to crank out a 1:1 rendering of a 24 MP K-3 II image. If I look at the system resources, I get all 24 threads at 50% or more capacity. I can imagine a mid-level system taking 30 seconds to show you the image, and you'd need to shovel coal into the boiler if the code didn't support multi-core.

I can watch JPEG exports peak briefly at 80% utilization of all 24 threads/cores.
10-23-2020, 08:38 AM   #104
Veteran Member
LeeRunge

Join Date: Jun 2010
Posts: 996
Originally posted by awscreo:
PS and LR have a long way to go in terms of optimizing for modern multi-core CPUs. They still prefer a fast single core, I believe; that's why Intel CPUs were doing better in Adobe benches up until the current AMD generation. But I think Adobe did say they will be working on that (or already are).
Is there a benchmark out there for Lightroom vs Lightroom mobile for processing raws?

I'd be curious whether the mobile app takes more advantage of the multiple cores on iPads/phones.
10-23-2020, 08:52 AM   #105
Veteran Member
awscreo
Join Date: May 2016
Photos: Gallery | Albums
Posts: 2,722
Originally posted by LeeRunge:
Is there a benchmark out there for Lightroom vs Lightroom mobile for processing raws?

I'd be curious whether the mobile app takes more advantage of the multiple cores on iPads/phones.
I haven't seen one. I'm also curious whether the mobile apps are more "modern" in terms of their core design.

In terms of multi-core CPUs with the desktop apps, Gamers Nexus often includes Adobe apps in its benchmarks for CPU reviews:
AMD Ryzen 9 3950X Review: Premiere, Blender, Overclocking, & Gaming CPU Benchmarks - YouTube
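Lacking a published benchmark, you could roll a crude one yourself: process the same folder of raws on each device and compare seconds per file. A desktop-side sketch in Python, assuming the rawpy package as a stand-in converter (illustrative only - compare numbers only within the same pipeline):

[code]
# Crude cross-device benchmark: average wall-clock seconds per raw file.
# Assumes "pip install rawpy"; rawpy stands in for the real converter,
# and the "raws" folder with .dng files is a placeholder.
import time
from pathlib import Path

import rawpy

files = sorted(Path("raws").glob("*.dng"))
start = time.perf_counter()
for f in files:
    with rawpy.imread(str(f)) as raw:
        raw.postprocess()  # demosaic with default settings
elapsed = time.perf_counter() - start
print(f"{elapsed / len(files):.2f} s/file over {len(files)} files")
[/code]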