
User Experience (UX)

Today’s consumers are bombarded with promises of compelling experiences; they are sophisticated and demanding. To be successful, a new product or service needs to be intuitive, usable, engaging and desirable. The user experience needs to be emotional in order to be memorable.

GfK’s User Experience (UX) research and design experts help our clients create and improve customer experiences for existing or new products and services.

We bring your customers into the heart of the design process from the start, reducing the risk of failed products and costly post-launch changes. We feed user insights into all stages of development, from early concepts and prototyping through launch and post-launch activity.

Our user experience findings translate into clear plans for differentiating your products and services, capitalizing on current market opportunities, and guiding the UX of future product and service design.

As a result, our clients create experiences that are engaging and meaningful, driving user adoption and customer satisfaction.

Andrew Therkelsen
Australia and New Zealand
+61 2 9900 2888

UX Labs

GfK’s custom-built UX laboratories across multiple key markets are standardized to ensure consistency and high quality, no matter where the research is conducted. We use our UX labs to host test scenarios to meet any need – from a simulated emergency room to a living room environment – and accommodate everything from focus groups to individual interviews. 

For user experience research outside the traditional laboratory environment, we have unmatched mobile studios that allow data-gathering to occur anywhere in the world, in any setting.

Read more about our UX labs

UXalliance

Our GfK UX team is a founding member of UXalliance, the international user experience network. With more than 500 UX professionals worldwide who speak 30-plus languages combined, the UXalliance gives you access to local experts with deep knowledge of local markets.

To ensure reports are comparable across countries, our partners adhere to strict quality standards and proprietary guidelines. We have been making global UX research easy since 2005, offering cost savings and shorter timelines for multi-country projects.

Related Links:

UXalliance

UX Masterclass bi-annual conference 

Latest insights

Here you can find the latest insights for User Experience. View all insights

    • 01/25/16
    • User Experience (UX)
    • Australia
    • English

    Turning the Tables

    In one of the latest Research News features, Catherine Eddy explains how turning user experience research tools inwards has helped reduce staff turnover and improve workplace culture.
    • 12/01/16
    • User Experience (UX)
    • Global
    • English

    5 ways to apply design thinking to UX research

    When I was just starting out as an industrial designer, I can remember rolling my eyes when I heard some prominent designer or design agency talking about how designers were going to save the world. I thought they were a bit full of themselves (and I still do), but that doesn’t mean there wasn’t some value in what they were saying. They didn’t have a name for it at the time, but what they were talking about is what we now refer to as design thinking. Design thinking is broad and vaguely defined; if you ask ten designers what it is, you’re likely to get 12 different opinions. But if you examine those various opinions, you’ll start to see some themes repeated, reflecting many of the tools that designers use in their process: user empathy, prototyping as exploration, abductive reasoning, re-framing, and the list goes on. As a starting point, we have identified five tools of design thinking that can be applied in a research context.
    1. Think systemically. Rather than focus on the immediate problem, look for the larger context. You might come to understand the problem better, and you might see solutions that you wouldn’t have otherwise. As famed engineer Paul MacCready said, “The problem is that we don’t understand the problem.”
    2. Be empathetic. It is vitally important to understand the subject from the user’s point of view. While you benefit from seeing the larger context, you don’t want to fall into the trap of seeing the issues through your lens and not the user’s. If you do, you could end up solving the wrong problem. The most useful piece of software in the world provides no value if the user can’t navigate it.
    3. Linger in ambiguity. Don’t jump to an interpretation or a solution too soon. Allow the question to be ill-defined; allow for multiple possible answers; don’t assume anything. As soon as you think you know the answer, you stop processing new information. Delay that decision point until you have all the data available, and you will make a better decision.
    4. Maintain a result-oriented focus. Instead of focusing on a solution, focus on the end result you are trying to achieve. Blockbuster focused on improving the video rental experience and did quite well, until Netflix focused on video viewing and eclipsed Blockbuster overnight.
    5. Reflect and invite feedback. Examine and re-examine your assumptions, your insights, and your solutions. Seek input from a variety of people, knowledge bases, expertise; imagine your solution in a different scenario, with a different user. Re-test not only to verify your answers, but to identify the next questions.
    Design is a subtle, intuitive, and non-linear process. It cannot simply be mapped and codified into a repeatable, cookie-cutter method, but the principles underlying it can be emulated and applied to other problems including research design. If we can remember these principles when we are planning, conducting, or analyzing research, we will open up new opportunities, generate more meaningful insights, and create richer feedback. Perhaps the most important element of design thinking is that—contrary to what those design luminaries would have you believe—it is not restricted to an elite group of people. As Nobel Prize laureate Herbert Simon said, “Everyone designs who devises courses of action aimed at changing existing situations into preferred ones” (Sciences of the Artificial, 3rd ed., 1996). So, while you may not have the training to design the next ground-breaking smartphone or web search algorithm, you can apply the mindset of design thinking to your area of expertise and go a step further, or maybe even leap beyond.

    Tyler Duston is a User Experience Lead Specialist at GfK. Please email Tyler.Duston@gfk.com to share your thoughts.

    Design compelling experiences grounded in research

    Read more about our User Experience Design solutions
    • 11/11/16
    • Health
    • Technology
    • User Experience (UX)
    • Global
    • English

    GfK’s Hannah Duffy to speak at global innovation and health technology conference

    Hannah Duffy, senior user experience (UX) consultant at GfK, will share perspectives on the topic “Avoiding the storm in the NHS through design.”
    • 11/02/16
    • Health
    • User Experience (UX)
    • Global
    • English

    Four best practices for bulletproofing drug and delivery device innovations

    When applying human factors engineering in medical and drug delivery device development, the end goal for manufacturers is a successful validation study. Proper application of best practices in human factors engineering throughout the development process, not just at the end, is how that success is achieved. Having managed and executed hundreds of such studies, we have observed some common pitfalls that, if not navigated properly, will likely result in FDA requests for additional research – pitfalls that can lead to time wasted, money lost and effort exhausted. Four best practices show how to apply human factors engineering to save time and money and increase your rate of success:
    1. Ensure participants in the human factors validation testing are representative of intended end users. Do not assume. Base your definition of intended users on data gleaned from past research, and document the inputs to your definition. We often see incomplete or incorrect assumptions about the nature of the end user. For example, during one study a manufacturer assumed physicians would use a particular device to accomplish a task, but ethnographic observations revealed that physicians typically handed the device to a nurse to complete the task. FDA guidance indicates that human factors validation testing must include participants who are representative of intended end users (adult patients, pediatric patients, various types of HCPs and caregivers). In some cases, support personnel (i.e. staff who perform equipment maintenance, repairs, cleaning, etc.) may need to be included as a separate user group, likely with separate tasks. The FDA requires a minimum of 15 users per user group, and sometimes more.
    2. Assess tasks and sub-tasks associated with product use with sufficient granularity to truly understand failure modes. It is crucial to perform a task analysis that is granular enough to identify every interaction a user has with an interface, breaking those interactions down into elements of perception, cognition and action. This helps you understand key failure modes. For example, we conducted formative research for a manufacturer with the goal of identifying opportunities for refinement in the packaging and labeling for a drug. Previously, a graphic designed to communicate the proper dose had been made larger in an attempt to reduce improper dosing. We helped the manufacturer redesign, rather than enlarge, the graphic, and saw a reduction in improper dosing in later research. For critical (and essential) tasks, it is crucial to observe behavior through simulated-use scenarios, because what users say they would do and what they actually do can be vastly different. Craft each scenario to allow participants to demonstrate what they would do at home, at work or in other intended use environments. Control environmental factors (light, sound, distractions, etc.) to be representative of the intended use environment.
    3. Conduct preliminary analyses with an eye towards defining and documenting context of use, in addition to designing the product and associated materials. Product manufacturers often assume that because they have implemented a training program, all of their users will be trained as prescribed. But when we have conducted contextual inquiries or ethnography in clinical settings, it is not at all uncommon to hear that some clinicians have skipped training, or that a “train the trainer” model is only loosely followed. This results in scenarios where the user might interact with the device without any formal training, or a long time after they were initially trained. Taking the opportunity during preliminary analyses to evaluate the context of use, who is using the product and how, is just as important as formative usability testing in ensuring that safe and effective use will be validated at the conclusion of the human factors effort.
    4. Prepare for the complexity of validation by establishing robust team training on best practices in the application of human factors engineering, and control for quality and consistency. In validation studies, sample sizes are typically larger, representative user populations are often difficult to identify and require data collection across multiple markets, and representative contexts of use must be simulated carefully. Add to all this the variety of team members involved in executing such an effort (research leads, participant recruiters, site coordinators, moderators, note-takers and trainers). It is important to have a robust system in place that ensures the team is appropriately trained on research protocols, that good documentation practices are adhered to, that a robust root cause analysis has yielded sufficient understanding of all observed use errors, and that any adverse events have been reported. Any misstep and, at best, significant time, effort and cost go into documenting and explaining deviations from protocol; at worst, the validity of your data falls into question, leaving you needing to conduct more research.
    Ultimately, implementing these best practices will not only support a successful validation study but is also critical to ensuring the product you are developing lives up to the promise of your innovation by delivering a superior user experience. For more information on our best practices for safeguarding drug and delivery device innovations, contact korey.johnson@gfk.com.

    Want to learn even more tips?

    Watch our on-demand webinar!