10-24-2014, 01:28 PM   #16
Pentaxian




Join Date: Apr 2011
Photos: Albums
Posts: 8,756
I read reviews to get an impression of others' experience.


I find it silly for someone to review anything when their copy has an obvious fault, and then present its performance as if it were normal. For example, a lens with obvious damage being criticised for the poor performance resulting from that damage.
When reviewing NEW items, poor performance straight out of the box is worth reporting - that tells something about either the design or the manufacturing process for the item (but also report if the item was obtained through a non-standard delivery channel, such as a bargain-bin disposal the seller discounted because there was reason to suspect it, or the grey market, where it may have been rejected from proper sale and found by someone who saw it fall off the back of the dumpster).
When reviewing old items, try to determine if the performance is 'normal' or the result of the specific history of the sample. A review of a type of something cannot be valid if the sample is not truly representative of the normal characteristics of the type.


I do not know about all of you, but I use reviews when buying a significant item that is outside my usual experience. So, I bought a treadmill of a certain brand. Before buying it 2.3 years ago I found all the reviews of the brand were quite favourable - not ridiculously so, but along the lines of "I bought it and find it good or very good". After 2 years I had had a series of warranty repairs of major components (like the main electronics board, twice). I looked around for another at other shops and found the local industry would not touch it for a trade-in because the brand was known to be poor. I looked for reviews again, and still they were only good. No-one reported on reliability. I suspect supplier manipulation of the accessible reviews. Anyway, the next action on that is to try to take up my rights under 'statutory warranty', which is that the product must remain suitable, for a reasonable duration, for use as what it was sold to be.

10-24-2014, 01:29 PM - 1 Like   #17
Loyal Site Supporter




Join Date: Mar 2009
Location: Gladys, Virginia
Photos: Gallery
Posts: 27,663
To me, the big thing is making sure there is a narrative with the star rating. That is the only way to know if the person is truly familiar with good equipment and whether their reasons for rating something poorly (or extremely well) are something I would want, or would worry about, with that equipment.

I think when I review something, I try to break down strengths and weaknesses. Some "weaknesses" are silly -- "this lens isn't wide enough" or something like that. A 30mm lens is a 30mm lens -- docking points from a lens because it is the focal length it is, is just silliness.
10-24-2014, 05:11 PM   #18
Senior Member




Join Date: Sep 2014
Posts: 128
The accompanying Q-S1 received a 5. Q seems to be a very polarizing system, and some just don't get along with it.

Last edited by goodnight; 10-24-2014 at 05:12 PM. Reason: post got cut off somehow
10-24-2014, 05:39 PM   #19
Otis Memorial Pentaxian
stevebrot's Avatar

Join Date: Mar 2007
Location: Vancouver (USA)
Photos: Gallery | Albums
Posts: 42,007
Quote (originally posted by tim60):
I find it silly for someone to review anything when their copy has an obvious fault, and then present its performance as if it were normal. For example, a lens with obvious damage being criticised for the poor performance resulting from that damage.
A possible exception might be for conditions where the damage is due to a common fault of the item. A good example would be a jammed aperture ring on a Pentax-A 50/1.7. The problem is exceedingly common* and a well-known design/manufacturing fault of that model lens. A review of the lens might mention the defect, note that it is a known issue, and down-mark the lens appropriately. Is the problem fatal? Probably not, unless someone buys it for occasional use with a bellows or with a mirrorless camera. Grading the lens with that in mind and discussing the problem provides valuable assistance to a potential purchaser.


Steve

* Most, if not all, A 50/1.7 lenses either have this problem or will develop it with use. The issue is a brittle plastic part that breaks and jams the mechanism. There is a fix, but it requires special tools and steady hands.

10-24-2014, 07:01 PM   #20
dms
Site Supporter




Join Date: Aug 2011
Location: New York, NY
Photos: Gallery
Posts: 2,192
A statistical approach to the lens (or camera, etc.) data

I think a simple statistical analysis would work. By the way, this is likely a bit too mathematical for many here--but the point is simply that a method can be used, and I am proposing one here. There are likely other statistical methodologies that may be of use here--but this is what I am familiar with and use.

The system could automatically calculate the average and standard deviation; scores outside +/- 3 standard deviations (meaning they are likely not relevant) would not be included in the aggregate score and would be flagged as "outliers".

I would think giving the +/- 2 standard deviation spread (approximately 95% of the data) would be useful [except that the upper limit would be truncated to 10]. And I guess it would need to be called something else. Maybe "statistically relevant spread of the scores."
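
To make the idea concrete, here is a rough sketch in Python of the calculation I have in mind (purely illustrative--the function name, the sample scores, and the exact cutoffs are my own assumptions, not anything the site actually does):

import statistics

def summarize_scores(scores, lo=1, hi=10):
    """Illustrative sketch: flag scores more than 3 standard deviations from
    the mean as outliers, then report the mean and a +/- 2 SD spread
    truncated to the rating scale."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores) if len(scores) > 1 else 0.0

    # Scores beyond +/- 3 standard deviations are flagged and left out of the aggregate
    outliers = [s for s in scores if sd and abs(s - mean) > 3 * sd]
    kept = [s for s in scores if not (sd and abs(s - mean) > 3 * sd)]

    kept_mean = statistics.mean(kept)
    kept_sd = statistics.stdev(kept) if len(kept) > 1 else 0.0

    # "Statistically relevant spread of the scores": +/- 2 SD, truncated to [lo, hi]
    spread = (max(lo, kept_mean - 2 * kept_sd), min(hi, kept_mean + 2 * kept_sd))
    return {"mean": round(kept_mean, 2), "spread": spread, "outliers": outliers}

# A lone very low score among otherwise favourable ones is flagged rather than averaged in
print(summarize_scores([8] * 10 + [9] * 8 + [1]))

The point is only that the computation is simple; the site would just need to run something like it over the stored scores for each item.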
10-25-2014, 09:08 AM - 1 Like   #21
Veteran Member




Join Date: Apr 2014
Location: Cambridge, MA
Posts: 935
Quote (originally posted by Moropo):
I do not agree with your thinking. Imagine the following scenario: an item is produced and the number of defective items is higher than it should be. In other words, buyers of that item are receiving a defective one at a higher rate than usual. This is a problem that I, as a potential buyer, would want to be aware of. With your logic none of the reviews should mention such defective items, i.e., only perfectly working items should be reviewed. Kind of defeats the purpose.
My feelings exactly. For instance, when I was looking for a pair of astronomy binoculars, it was really useful when a lot of the reviews of one particular pair noted that they arrived out of collimation.
10-25-2014, 10:22 AM   #22
Otis Memorial Pentaxian
stevebrot's Avatar

Join Date: Mar 2007
Location: Vancouver (USA)
Photos: Gallery | Albums
Posts: 42,007
Quote (originally posted by dms):
I think a simple statistical analysis would work. By the way, this is likely a bit too mathematical for many here--but the point is simply that a method can be used, and I am proposing one here. There are likely other statistical methodologies that may be of use here--but this is what I am familiar with and use.

The system could automatically calculate the average and standard deviation; scores outside +/- 3 standard deviations (meaning they are likely not relevant) would not be included in the aggregate score and would be flagged as "outliers".

I would think giving the +/- 2 standard deviation spread (approximately 95% of the data) would be useful [except that the upper limit would be truncated to 10]. And I guess it would need to be called something else. Maybe "statistically relevant spread of the scores."
I think it would be enough to narrow the ranking options to integers 1 through 5. That will do a lot towards normalizing the data. One thing to consider is that perhaps the real-world distribution is not normal (and should not be normalized). The other thing to consider is that the data are not probabilistic.


Steve

10-25-2014, 11:29 AM   #23
Site Supporter
luftfluss's Avatar

Join Date: Jun 2011
Location: NJ
Photos: Gallery | Albums
Posts: 11,627
Original Poster
Quote (originally posted by Moropo):
I do not agree with your thinking. Imagine the following scenario: an item is produced and the number of defective items is higher than it should be. In other words, buyers of that item are receiving a defective one at a higher rate than usual. This is a problem that I, as a potential buyer, would want to be aware of. With your logic none of the reviews should mention such defective items, i.e., only perfectly working items should be reviewed. Kind of defeats the purpose.

- my opinion
But do you know what induced the failure? Perhaps the item in question suffered from poor handling by the store/seller (especially since many lenses are bought used)? How is a review of the lens itself valid if it has been damaged?

Quote:
With your logic none of the reviews should mention such defective items, i.e., only perfectly working items should be reviewed. Kind of defeats the purpose.
Not at all. For instance, a perfectly functioning 18-55 kit lens is not nearly as sharp as the Sigma 18-35. Reviews can help give the measure of *how* much better the Sigma lens is, by rating sharpness, AF ability, CA, etc.

For those who think that including ratings of obviously defective equipment is the way to determine if there is a pattern of failure, I don't think that's the case at all. The PF community, IMO, does a really good job discussing (sometimes heatedly, lol) a pattern of equipment-related failures and issues, like mirror-flop, SDM failure, etc.

The good thing is that once there are a multitude of reviews, the negative outliers have less impact.
10-25-2014, 11:56 AM - 1 Like   #24
Otis Memorial Pentaxian
stevebrot's Avatar

Join Date: Mar 2007
Location: Vancouver (USA)
Photos: Gallery | Albums
Posts: 42,007
Quote (originally posted by luftfluss):
The good thing is that once there are a multitude of reviews, the negative outliers have less impact.
I wish that Adam would put a disclaimer on the aggregate stats indicating that they are highly subjective. Another area that is misleading is the average price and price history. Many of us bought our stuff new 30+ years ago and those prices are included in the mix.


Steve
10-25-2014, 12:38 PM   #25
Veteran Member




Join Date: Jun 2013
Location: Nevada, USA
Photos: Albums
Posts: 3,348
I checked out the review and it is indeed a poor one - just a rant on their part, I believe. If I had to guess what went wrong, I think the reviewer doesn't understand focus peaking and zooming in on the LCD to fine-tune focusing. I also turn on the flashing highlights and shadows function of the Q. They are at their peak (i.e., highest contrast) when the image is in focus.
10-25-2014, 05:19 PM   #26
Pentaxian
reeftool's Avatar

Join Date: Dec 2007
Location: Upstate New York
Photos: Gallery | Albums
Posts: 9,555
I see things in reviews every day that leave me scratching my head in wonder. Many times I have seen 'distortion' listed as a "con" when reviewing fisheye lenses. Really? The dumbest I have seen recently was a product review at Newegg that marked a B&W laser printer down a couple of eggs because it didn't print in color. You really need to read the reviews and ignore the scores, because so many reviews are just downright dumb. How can anyone seriously justify taking points off a review score because a product won't do something it wasn't designed to do? Fisheye lenses are always a target because some software claims to be able to "de-fish" the shots. When the results aren't good, it's the lens that gets the bad review.
10-26-2014, 09:56 PM   #27
dms
Site Supporter




Join Date: Aug 2011
Location: New York, NY
Photos: Gallery
Posts: 2,192
Quote (originally posted by stevebrot):
I think it would be enough to narrow the ranking options to integers 1 through 5. That will do a lot towards normalizing the data. One thing to consider is that perhaps the real-world distribution is not normal (and should not be normalized). The other thing to consider is that the data are not probabilistic.
I think the pedigree of the statistical method need not be very high for this application--and that is why I suggested a very simple approach. I believe there is a tendency to set the requirements too high and, as a result, not do something that would be of practical benefit. It is in this light that I address Steve's comments below.

-- I agree the distribution is not normal, and that was implicit in my stating the need to truncate the upper limit of the 2 standard deviation spread to 10 (and I should have added: truncate the lower limit to 1).
-- I am not sure what is meant by the data not being probabilistic. If you mean that the reviewer sees prior reviews--that is certainly an influence--but to the degree that influence is present in all reviews, it is just part of the process.

Last edited by dms; 10-26-2014 at 10:02 PM.
10-26-2014, 10:06 PM   #28
Pentaxian
SpecialK's Avatar

Join Date: Dec 2006
Location: So California
Photos: Gallery
Posts: 16,482
Quote:
Personally, I believe no lens is likely to rate lower than 7, and this includes a whole lot of absolute crap out there.
This is part of the problem - 5 is average, 7 is pretty good. Crap (below average, at most) cannot be 7.
10-26-2014, 10:26 PM   #29
Otis Memorial Pentaxian
stevebrot's Avatar

Join Date: Mar 2007
Location: Vancouver (USA)
Photos: Gallery | Albums
Posts: 42,007
Quote (originally posted by SpecialK):
This is part of the problem - 5 is average, 7 is pretty good. Crap (below average, at most) cannot be 7.
In the full scope of camera optics (historic as well as current), most lenses reviewed here are easily in the 75th percentile. Many of our users are old enough to remember when truly poor lenses were very common on the market. The normal lens that was attached to my first SLR was part of a bait 'n' switch scheme (thank you, Bass Camera) that was very common in the late '60s and early '70s. I would rank it a 3 on a scale of 1-10 among the lenses I have used. It was pure junk, but I have seen worse.

So with the Holga at 1, the Lensbaby at 2, and the Zeiss Otus at 10, just where do we place a DA* 55/1.4? Is the fairly excellent Pentax-M 50/1.7* only a 5, or is it really just a notch off the Otus or the 55/1.4? Is the 18-55 kit only a little better than my turkey Rexatar of 45 years ago?


Steve

* Tested back in the day to be only a notch off the Summicron 50/2

Last edited by stevebrot; 10-26-2014 at 10:35 PM.
10-27-2014, 02:14 AM   #30
Pentaxian




Join Date: Apr 2011
Photos: Albums
Posts: 8,756
Actually I find the numbers in reviews to be pretty unenlightening, because every reviewer has a different concept of 'goodness' and a different relationship between the numbers and their perception of goodness. I am really only interested in the descriptive part of the reviews, where one can at least perceive something of the experience and reasoning that went into the rating.


On a forum like this we are never going to get agreement about an appropriate scale of 'goodness' of lenses that would be meaningfully applied by all our friends out there.


It is hard enough to get specialists to agree on these things. I speak as a specialist in measurement theory who is reasonably acquainted with the professional test and evaluation community.
Tags: equipment, lens, photography, review

Similar Threads
  • YN560 III test button defective on ALL of them? (peterh337, Flashes, Lighting, and Studio; 10 replies; last post 10-18-2014 02:12 AM)
  • Rant about bad PF reviews (innivus, Pentax SLR Lens Discussion; 54 replies; last post 09-10-2013 09:00 AM)
  • Suggestion: Pointless (wink, wink) equipment reviews (Giklab, Site Suggestions and Help; 16 replies; last post 09-07-2013 05:07 PM)
  • Is my K30 Defective or in need of adjustment? (Axeman, Pentax K-30 & K-50; 6 replies; last post 07-13-2013 05:53 PM)
  • Suggestion: Reviews of film scanning equipment... ??? (Robert107, Site Suggestions and Help; 1 reply; last post 05-31-2011 10:28 PM)


