Customer satisfaction research: How soon is too soon?

There’s an old saying that, in life, timing is everything. Based on my recent experiences, I think this adage applies to customer satisfaction research as well.

Last month, my wife and I went to purchase a used Toyota Camry hybrid (we are trying to go green while saving some green at the pump). We test-drove the car on a Friday night and verbally agreed to purchase it. The next day we finished the paperwork and jumped in the car to head home. Halfway there, I looked at the gas gauge and it was nearly empty. I was a bit angry that the dealership had shorted us like that.

In addition, during our drive we found a host of problems: the navigation system stopped tracking the vehicle's location and the nav screen began to fade; the remote key fobs didn't work; and we noticed a poorly executed repair to the leather in the back seat.

While all of these things were fixable, my wife and I felt the hassle was more than we wanted to deal with. And, if the near-empty gas tank was any indication, the dealership’s attitude toward customer service might not be the greatest. So we decided to return the car under the dealership’s three-day return policy.

After a small fight with a manager, they agreed to take the car back that same day. That night, a customer satisfaction survey arrived in my e-mail in-box. They wanted to know our opinions of the salesperson, the finance manager and the dealership in general. They also wanted to know whether we were satisfied with the car.

After airing my views, I stopped to consider whether the dealership should have waited to send this survey. Why not at least let the three-day window pass? Even if we had not returned the car and everything had been okay, I don’t know that six hours is enough time for a customer to effectively evaluate the service and product they received.

Similarly, when we had DirecTV installed, we got a customer satisfaction survey e-mailed to us literally the minute the installer left the house. I was impressed by the speed but, honestly, I never took the survey: I went off to check out my satellite service and forgot to go back and give feedback. It seems DirecTV would have been better off sending the survey 24 hours after the install, giving me time to use and interact with the service.

I understand that, in many cases, the goal is to capture consumers’ responses while an experience is fresh. But how soon is too soon? When is the appropriate time to send a customer satisfaction survey? I would love to hear others’ opinions.


6 Responses to Customer satisfaction research: How soon is too soon?

  1. Great ideas, Steve. When I saw the title, my top-of-mind thought was the same as the folks who unleashed those sat surveys, I’m sure: the time to seek feedback is as soon as there has been an interaction with the customer. But your points are well-taken. If the customer has insufficient experience to develop an informed opinion, there may be a lot of wasted effort in return for no answer (or worse yet, a flawed answer).

    Now, imagine either of the vendors you mentioned had given you a chance to respond during the startup or installation phase (a step-by-step checklist of their expectations for service delivery), and then a follow-up after you’d had a chance to use the product. In your car example, the dealership might have actually had a chance to rectify the issues and save the day, or at a minimum, apologize before your disillusionment became too great to surmount. And the DirecTV folks would have received immediate feedback about your interaction with the installer (hmm…as part of a system test even?) plus informed feedback about whether the system works as well once you’re on your own.

    It seems to me the “fast response” method could be good – but it is no substitute for informed action.

  2. Chris says:

    Great questions … I had a similar case where I got a customer satisfaction survey from Toyota (I bought a new car in November) within a few days and felt it was too soon.
    I run some customer satisfaction surveys at my company and aim to survey people around the three-month mark. The nature of our product (IT data storage and infrastructure) means that it takes a little time to set up and become familiar with the product and service. Three months works well for us, but I think it really depends on the product. The car example might work better at the one-month mark to capture a good overall experience. A new iPhone user might be best surveyed at two weeks. I think the product type will determine when it’s appropriate to survey.

  3. Pingback: Customer satisfaction research: How soon is too soon? « Meyers Research Center: THE BLOG

  4. Anne Miner says:

    Steve and Megann, you both make great points! As a customer satisfaction measurement specialist, I am conscious of the need to tailor the exercise to the specific circumstances and objectives at hand. In both of the situations you describe, it is evident that there was a desire to capture your feedback “in the moment” of your experience.

    In the case of DirecTV, if their interest was in capturing your feedback specifically about your interactions with the installer, then their timing was appropriate, if somewhat overzealous. On the other hand, if DirecTV was hoping for feedback concerning their service, then, as Megann has pointed out, it would be more productive to send you the questionnaire after some period during which you have had the opportunity to enjoy the product or service.

    It seems that Toyota’s intention was to capture feedback concerning your purchase experience – that is, your experience with the salesperson, finance manager and dealership in general – “in the moment.” If their questions were limited to these areas, then sending the questionnaire immediately, that night, was appropriate. The questions concerning your satisfaction with the vehicle itself and after-sales service would be better sent at a later time – perhaps a week later – again, to allow you time to experience the vehicle.

    We are driven to constantly improve efficiencies, and this appears to be a case where improved efficiency – endeavoring to satisfy multiple objectives in a single questionnaire – is counterproductive. A more thoughtful approach to listening to customers is needed: one that collects the appropriate information in the appropriate timeframe. This will require multiple questionnaires/contacts with the customer to gather meaningful feedback.

  5. Salma Talaat says:

    Adequate interaction/trial time with the product itself should pass before any customer satisfaction surveys are carried out. This interaction/trial time will vary from one product to another depending, of course, on the type of product. I am not a car expert, but I think at least one week should have passed before evaluating the car dealer. At the very least, one needs to live with the car for a while before evaluating the whole experience.

  6. Beth Dungey says:

    Anne you are spot on. It is a case of timing and the nature of the “experience” you are measuring.

    If the experience is fleeting and likely to become part of a broader memory (e.g. making a deposit at a bank, making a call to a call centre) then the closer the survey is to the service delivery point the better the feedback.

    But if what you are measuring needs to be experienced over a longer period of time to get a good measurement (e.g. a car, cable TV service, a house painter, etc.), then you need to give the respondent sufficient time to experience the product or service before surveying them. The interpersonal service will still be remembered, particularly if after-sales service is required. (Although if your focus is on the minutiae of the installation or the receptionist in the car showroom, closer may be better. In that situation, you also need to ask yourself whether that is what the respondent will want to answer questions about. The survey still needs to be relevant and engage the respondent.)