Garden Watchdog "20 Worst"

Monroe, NC(Zone 7b)

We have a top 20 of the most highly rated nurseries and other sources. Why not a "worst 20," phrased to avoid lawsuits, to honor the handful of really unscrupulous rip-offs that some of us have encountered? This would more readily alert people as to whom to AVOID.

Franklin, LA(Zone 9a)

We just had a discussion about this a little while back ... LOL

http://davesgarden.com/t/393404/

=)

Cheri'

Monroe, NC(Zone 7b)

Thanks, Cheri.

I quickly read that link, and asked that the idea be revived. I had no idea how BAD mail-order companies could be until I got into gardening seriously in the last couple of years. I guess I have just been burned so badly, and seen others suffer the same fate, that I want revenge!

Peter

Deep South Coastal, TX(Zone 10a)

I look in the GWD before I buy from any mail order plant source. It kept me from ordering what seemed like a good deal but wasn't.

Monroe, NC(Zone 7b)

I agree. It's a great service! I just want the obviously dishonest to be more readily visible; but I can see that going on a case by case basis will probably prevail! Glad it worked for you.

Murfreesboro, TN(Zone 7a)

Then as now, my concern is that I don't see a good reason for users to be seeking out a list of the worst companies. As Baa pointed out, which is worse? A huge company whose sheer volume of comments makes them "mediocre" mathematically, or an obscure company with one comment, which unfortunately is negative?

There are several of those behemoths in the Watchdog that I personally wouldn't give a plugged nickel for a plant from, but they've got a less-negative score than lesser-known companies whose misfortune has guided only one customer to rate them, and it was a negative rating at that.

With the tools currently in place, we allow/encourage users to search for companies based on one or more of the following criteria:

1) a particular category of product the user is seeking;
2) the geographical proximity to the user (or at least being located in the user's country); and/or
3) good feedback from other customers (the "Watchdog 20")

I suggested in the other thread that being able to search by rating as well as the other criteria would be helpful, but I would only want this criterion set to find positive or neutral companies. I wouldn't suggest that users be allowed to intentionally seek companies with a negative rating, just because it serves no good purpose (as far as I can tell), and would cause many of those companies to become upset and possibly litigious if they felt they were being "tarred and feathered" by us.

Sundry followed up my suggestion (and I agree) that it would be nice if our search results could be re-sorted by clicking on the headings of location (alpha by state) and/or rating (descending order, from highest positive to lowest negative), rather than always in alphabetical order of company name.

But those are all enhancements that Dave will need to say yea or nay to, as they would demand his time and efforts to program :o)

Terry,

These enhancements have already been made. When you do your query, you can select the sort field. Take a look.

Dave

Franklin, LA(Zone 9a)

Dave, you are The Man.
very cool.
thanx!

Cheri'

Vicksburg, MS(Zone 8a)

Dave,

How did you implement the sort algorithm? When I sorted on ratings, the highest rated company was first, but a number of companies that had high ratings were after ones with lower numerical ratings. For example, one company that had a rating of 25 was after a number of companies with a rating of < 10 in the ornamental trees and shrubs category.

Also, it would be nice to be able to click on the heading and have the results sorted. This would be consistent with how you have implemented the journal sorting capabilities.

Baron,

Thanks - I'll look into it. I made one change with the ratings to fix that (hopefully). I'll check into the headings per your and Terry's request.

dave

Murfreesboro, TN(Zone 7a)

Well, whaddya know - sure enough, it's there!!! Thanks Dave! (Copperbaron, I think the algorithm is factoring in the number of negatives, which tend to drag down a company's rating beneath those that have fewer total comments, but also have fewer (or no) negative comments.) For ease of use, it should probably just do a simple +1 positive/-1 negative scoring on the sort.

It was indeed doing a +1/-1 scoring, but that was showing false readings to Baron (because the actual number is determined by +1/-5.)

I set the rating to be standard +1/-5 across the board (shouldn't it be consistent? I can't remember but we've gone over this several times) and it shows correctly now.

dave

Baron, I fixed the sorting. Check it out now.

Dave

Murfreesboro, TN(Zone 7a)

chuckle...ummmm, well. Ahem. Discreet cough. Stutter. Rub my hands over my face. Mumble a little. Sigh....

Pardon me, Dave, but uh, well, I thought we were using that only for the Watchdog 20, not their overall score?

True confession time: what y'all are reading here is a philosophical difference between Dave and me, albeit a good-humored one (I think...) And yes, the "Top 20" score uses different criteria than a simple +1/-1 scoring system.

This is because we found some companies making their way into the top echelon by sheer volume, not because they were really stellar companies. (Should it matter if you have 500 negative comments, if you have 650 positive ones and can show a net +150 score? I/we think it should matter, and that a company in that situation should not be as highly recommended as one that has 151 positive comments and 1 negative, which would also equal +150 net using +1/-1 scoring.)

We beat our heads against the wall with some highly involved/complex calculations (put together by a very generous actuary who was interested in seeing more equity in the scoring), but at the end of the proverbial day, we decided it was too complicated to implement, let alone explain/defend to companies and users.

The same effect (more or less) is achieved by using a +1/-5 scoring system. However, my thought was that we were only using that for the Top 20 spots, because there is (IMO) an inherent danger in giving a disgruntled person as much clout/weight/leverage (whatever you want to call it) as 5 happy customers for all aspects of a company's rating.
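The difference between the two weightings is easy to see in a few lines of code. This is a minimal sketch of the scoring described above, using Terry's two hypothetical companies; the function name and parameter are my own, not anything from the actual Watchdog code.

```python
def score(pos, neg, neg_weight=1):
    """Net score: each positive counts +1, each negative counts -neg_weight."""
    return pos - neg_weight * neg

# Terry's two hypothetical companies: 650 positives / 500 negatives
# versus 151 positives / 1 negative.
# Simple +1/-1 scoring makes them look identical:
print(score(650, 500))  # 150
print(score(151, 1))    # 150

# The +1/-5 weighting separates them sharply:
print(score(650, 500, neg_weight=5))  # -1850
print(score(151, 1, neg_weight=5))    # 146
```

Under +1/-1 the two companies tie at +150; under +1/-5 the high-volume company with 500 complaints falls far below zero while the nearly spotless one barely moves.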

(Now, to all of y'all reading this: Isn't it fun being flies on the wall? *grin*)

Vicksburg, MS(Zone 8a)

Here is a suggestion similar to one that I made some time ago. Base the rating on the following formula:

rating = (# positives - # negatives) / (# positives + # negatives)

This puts everyone on the same playing field regardless of how many ratings the company has had. For example (using Terry's examples above plus a company with a lot fewer ratings, but mainly positive, and a company with all negative ratings)

Company 1

Rating = (650 - 500) / (650 + 500) = 0.13

Company 2

Rating = (150 - 1) / (150 + 1) = 0.987

Company 3

Rating = (15 - 1) / (15 + 1) = 0.875

Company 4

Rating = (0 - 250) / (250 + 0) = -1

For these examples, Company 1 is clearly not very good, but isn't completely horrible as is Company 4. Company 2 is clearly the winner, but Company 3 is not so severely penalized for lacking a large number of ratings. Company 4 clearly sucks big time. The only possible drawback is that companies with only 1 rating would be either a 1 or a -1. This could be rectified by requiring a minimum number of ratings before DG rates the company + or -. You could still give the score, but show the ones with an insufficient number of ratings in a different color, or put an NR for the rating, indicating that there are too few ratings to make a decision. There are other ways to take into account companies that don't have a lot of ratings yet, but this is very, very simple.
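The worked examples above can be reproduced with a short sketch of this normalized formula. The function name and the minimum-ratings cutoff of 5 are my own illustrative choices (the thread doesn't fix a cutoff for individual ratings, only a suggested 30 for Top 20 eligibility below).

```python
def rating(pos, neg, min_ratings=5):
    """Normalized rating in [-1, 1]: (pos - neg) / (pos + neg).
    Returns None ("NR") when there are too few ratings to judge."""
    total = pos + neg
    if total < min_ratings:
        return None  # NR: insufficient ratings
    return (pos - neg) / total

print(rating(650, 500))  # ~0.13  (Company 1)
print(rating(150, 1))    # ~0.987 (Company 2)
print(rating(15, 1))     # 0.875  (Company 3)
print(rating(0, 250))    # -1.0   (Company 4)
print(rating(1, 0))      # None  (one rating: marked NR, not +1)
```

Note that the normalization keeps every company on the same -1 to 1 scale regardless of volume, which is the whole point of the proposal.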

The current way of doing things is apparently giving negative ratings to companies that have more positives than negatives (if the new system was implemented correctly). I made another post about this prior to seeing Terry's latest post to this thread.

The other nice thing about this is that all of the ratings would be between -1 and 1.

Regarding the top 20 (I just read Terry's response to my thread), just set a minimum number of ratings, say 30, in order to be eligible for consideration.

One addition - the formula should include the number of neutrals in the denominator as well. For example, a company with 2 +, 1 -, and 10 neutral ratings would have a score of:

(2 - 1) / (2 + 1 + 10) = 0.077

For a company with 2 +, 1 -, and 0 neutrals:

(2 - 1) / (2 + 1) = 0.333

So, the company with no neutrals would have a better rating than one with the exact same number of +'s and -'s but with neutral ratings as well. This is a clear improvement over what is being done now, in that the neutrals also enter into the rating, but not as strongly as a negative rating.

This message was edited Wednesday, Sep 3rd 4:28 PM

This message was edited Wednesday, Sep 3rd 4:38 PM

Terry, I believe this addresses your issue regarding the top 20 (they would still all be in the top 20), but would also eliminate the negative ratings for companies that have more +'s than -'s. One other change. For companies with more -'s than +'s, rather than adding the neutrals in the denominator, subtract them. This gives the following:

IF (# +'s - # -'s > 0) THEN
rating = (# +'s - # -'s) / (# +'s + # -'s + # neutrals)
ELSE
rating = (# +'s - # -'s) / (# +'s + # -'s - # neutrals)
END IF

This now penalizes neutrals for both positive and negative ratings.
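The pseudocode above translates directly; here is a sketch in Python (the function name is mine, not from the site). One caveat worth flagging: in the negative branch, subtracting neutrals can drive the denominator to zero or below when neutrals outnumber positives plus negatives, so a real implementation would need to guard that case.

```python
def rating(pos, neg, neutral=0):
    """Baron's final formula: neutrals dilute a positive score
    (larger denominator) and deepen a negative one (smaller denominator)."""
    if pos - neg > 0:
        return (pos - neg) / (pos + neg + neutral)
    return (pos - neg) / (pos + neg - neutral)

print(rating(2, 1, neutral=10))  # ~0.077 (neutrals pull the score down)
print(rating(2, 1))              # ~0.333 (same +/- counts, no neutrals)
print(rating(1, 3, neutral=1))   # ~-0.667 (neutrals deepen a negative score)
```

The asymmetric treatment of neutrals is the design choice being proposed: a neutral rating always moves the score away from whichever side of zero it would otherwise sit on.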

This message was edited Wednesday, Sep 3rd 4:46 PM

Monroe, NC(Zone 7b)

Terry - thanks for clearly restating your logic. I agree. I am glad that my question helped uncover Dave's new improvements!

Peter
