Source: Magnus Insights.
There is a trend in recent years for every purchase, service encounter, or dining experience to end with a customer satisfaction survey. As useful as feedback can be, it is obvious to me that many of these "surveys" are better called "fake surveys." That is, they lack objectivity and they lack validity.

As an example, I got my vehicle serviced, and when I picked it up, the service representative forewarned me that his (to remain nameless) car company would be sending me a survey. Further, he said he hoped I could rate him "excellent" in all categories, because anything less than "excellent" meant that he and the dealership would receive a failing grade. How stupid is that? If the point of the survey is to get feedback and enable the company to improve things, how does it help to pre-sell the survey results? If the purpose of the survey is to get all perfect scores to report as a marketing/sales tool ("all our customers are satisfied"), then how can anyone rely on these "fake data"?

The other frequent customer satisfaction survey question is, "Based on this encounter with the telephone representative today, how likely would you be to recommend the XYZ company?" I would rarely recommend a bank, a credit card, or anything else based on one encounter with one representative over one issue, so my answer will be as far toward the negative end of the scale as possible. That totally loses the feedback on the performance of the representative, because it is the wrong question. Someone obviously came up with this question some years back, and it became popular in such surveys. But this question, like some others I've read, is actually 2 or 3 questions in 1. It should start with "Do you make recommendations to your friends/family? Yes/No," and branch like a decision tree from that point. But in a "fake survey" used to make "fake marketing propaganda," or even "fake news," it appears these steps are skipped.

The next issue with fake surveys is representativeness. These samples are never truly random.
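To make the decision-tree point concrete, here is a minimal sketch (my own illustration, not from any real survey platform) of the branching described above: a gating question first, and the likelihood-to-recommend score only for respondents for whom it is meaningful.

```python
# Sketch of a branched recommendation question. The gating question keeps
# people who never make recommendations from dragging the score to the
# bottom of the scale for the wrong reason.

def recommend_question(makes_recommendations, likelihood=None):
    """Return the outcome of the recommendation question.

    makes_recommendations: answer to the gating question
        "Do you make recommendations to your friends/family?" (True/False)
    likelihood: 0-10 likelihood-to-recommend score, asked only
        when the gating answer is yes.
    """
    if not makes_recommendations:
        # Excluded from the likelihood score rather than counted as a zero.
        return {"asked_likelihood": False, "score": None}
    return {"asked_likelihood": True, "score": likelihood}
```

Scores from the two branches can then be reported separately, instead of mixing "I never recommend anything" in with genuine detractors.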
When given a receipt at McDonald's or a grocery store, only certain people will complete the survey for the free food or other coupon. (I do, sometimes, just to see what the survey looks like and how it is presented.) I suspect the response rate on these surveys is very low and is also skewed toward those who had an extraordinary experience, a bad experience, or who just want the free food. Maybe this is the most cost-effective way to collect data, but I hope that the consumers of those data are savvy enough to realize they are not objective. I fear too many are not. All of us consume data; survey results fill the news. It is critical to read between the lines to know what was asked, who responded, and what the data are really saying. Unfortunately, this dilution of survey and data collection quality has probably misled many people, whether in political choices or, in our world, when jury research is conducted by those who do not know proper data collection, analysis, or reporting procedures.