Qualtrics
Author: Amber Asaro | Last edit: August 19, 2024
Access and logging in
All Red Hatters with Qualtrics licenses can access our Qualtrics instance via their SSO login. Licenses and access privileges are controlled by the UXD Operations team.
Screener and survey checklist
Creating the study
- Start from the template (see below).
- Name your project following current naming conventions.
- Store your project according to the current filing/folder conventions.
- Verify that you are using the correct Theme in Qualtrics. To check this, go to Look and feel on the left nav bar in your survey, then make sure the Static Theme "Red Hat Survey Default" is checked.
- Cross-link with Salesforce:
- Reference your Qualtrics study in the Salesforce Study Case.
- Make your Ops contact a Collaborator so that they can add the appropriate metadata to the screener or survey, allowing the collected data to sync back to Salesforce. (See the instructions below for sharing and collaborating.)
Running the study
- When you are ready to distribute your survey, make sure you use the appropriate URLs to track where responses are coming from.
Closing the study
- Double-check that the Inactive Survey message from the library is applied, so that anyone who tries to take the study too late is redirected to other opportunities rather than simply turned away.
- To check this, go to the Survey Options on the left nav bar in your survey, then check the Responses area for the Inactive survey message.
- Go to the Distributions tab and "Pause response collection".
Template
When we start a screener or survey, we always start from a library template. This ensures that our terms and conditions are displayed and agreed to (as required by our legal department), our participants have informed consent (required by legal and privacy, but also the right thing to do), and that we know the country of residence for participants (important for BEAR Compliance, compensation, and regional privacy laws).
Note: copying from an old project that was copied from the template is not the same thing. The only way to ensure you are benefiting from all of the efforts of Ops to streamline workflows and keep you in compliance is to always pull from the template. Then you can feel free to copy in questions from the old study as well. But please do not do the opposite, where you copy from an old project and then afterwards try to merge things from the template; this often results in broken logic and much more work or errors.
Steps to pull from the library:
1. In Qualtrics, choose Create a New Project
2. On the Create a project page, choose survey (this may be selected by default already) and click the Get Started button.
3. In the How do you want to start your survey? dropdown (which defaults to "Create a blank survey project"), choose "Copy a survey from an existing project".
4. In the Survey dropdown, choose from the folder named “Current Templates and Always-on”
5. For now we have one template to use: “UXD TEMPLATE with SF Integration (make a copy for all screeners/surveys)”
6. Click the "Create Project" button. You may now proceed with your screener or survey from here.
Tips and tricks
Tips for preventing fake/repeat/spam responses in Qualtrics
- Include distractor (aka low-probability response) answers in screener surveys
- e.g., for "Which products do you use?", include Red Hat McGuffin (not a real product, even though Amber thinks it SHOULD be; she includes it in her lists to catch folks claiming to be good at everything)
- Google your distractor first. Be careful to make sure that the product doesn't actually exist!
- A person could still get mixed up or make an honest error, so factor in their other responses. They might not be a liar, but a distractor hit can at least indicate they aren't very reliable or careful with their responses.
- Request video during moderated sessions, at least before recording begins. This reduces the chance of one person trying to take an interview more than once (yes, this has happened!).
- Check IP address and geographical location in Qualtrics (especially if participants have similar names and you are unsure).
- Ask for honesty up front:
- “We want to hear from a variety of people with different expertise and technical backgrounds, so we need your honest answers”
- Randomize screener questions and don’t screen out right away
- Prevents participants from sharing answers or using multiple accounts to answer the same way
- Ask open-ended questions.
- Open-ended questions make it much harder to guess the "right" answer (remember how much we preferred multiple-choice questions to essay questions in school, because we had a 25% chance of guessing the correct answer or could infer it through process of elimination?).
- They also reveal the communication abilities, dedication, and depth of insight a participant might deliver. Are they trying to fly through and make their money, or give useful feedback?
- For example, if you were screening for 5 participants for a longer survey or test and really need to make sure they meet specific criteria, ask them to write in their responses instead of giving them multiple-choice questions. This technique is recommended by Nielsen Norman Group (see "How to Use Screening Questions to Select the Right Participants for User Research").
- While this means you have to sift through the responses more, it pays off in both better screening and a preview of response quality.
- Bury the “right” answer under “Other”
- For example, if you were specifically looking for folks who have used Google in the last six months, you could structure the screener with options for Bing, Safari, Opera, Internet Explorer, and Other. Choosing Other could then lead to a second structured list with options of Google, Mozilla, DuckDuckGo, and "Other (please describe)". This again makes it harder for potential participants to infer what the "correct" answer is.
- Include a disclaimer, or phrase the questions themselves with "In your current role..."
- Offer precise and realistic frequency options (in line with the role or product use), e.g.:
- Once per day
- Twice per week
- Add a repeat question: ask the same thing once at the beginning and again at the end, then compare the answers.
- Cross-check industry and compliance: a participant who selects "Healthcare" as their industry should be able to write in the relevant compliance standards.
- Check whether the survey software has an option to prevent ballot stuffing (multiple submissions from the same IP address).
- Check how long each respondent took between the survey's start and end timestamps.
- Too fast could indicate a bot, careless answers, or a repeat survey taker.
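The timing check above can be scripted against an exported response file. A minimal sketch, assuming hypothetical column names (StartDate, EndDate, ResponseId), a timestamp format, and a two-minute threshold; adjust all of these to match your actual export:

```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S"   # timestamp format assumed for the export
MIN_SECONDS = 120           # assumed minimum plausible completion time

def flag_fast_responses(rows, min_seconds=MIN_SECONDS):
    """Return the IDs of responses completed implausibly fast."""
    flagged = []
    for row in rows:
        start = datetime.strptime(row["StartDate"], FMT)
        end = datetime.strptime(row["EndDate"], FMT)
        if (end - start).total_seconds() < min_seconds:
            flagged.append(row["ResponseId"])
    return flagged

rows = [
    {"ResponseId": "R_1", "StartDate": "2024-08-01 10:00:00", "EndDate": "2024-08-01 10:01:00"},
    {"ResponseId": "R_2", "StartDate": "2024-08-01 10:00:00", "EndDate": "2024-08-01 10:12:30"},
]
print(flag_fast_responses(rows))  # ['R_1']
```

Treat flagged IDs as candidates for manual review, not automatic rejection; some legitimate respondents are simply fast.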
Tips for identifying fake Qualtrics responses (especially important for high-volume surveys)
- You can enable a reCAPTCHA score in the survey security settings. A score closer to 1 means the respondent is likely not a bot. (You probably still want to spot-check.)
- If something looks fishy, compare the IP address to the selected location.
- Bots can get around single-response questions easily, but free text is harder to fake in a way that looks right.
- Other red flags to watch for:
- Low reCAPTCHA score
- Mismatch between IP address and selected location
- Identical responses from different IPs
- Generic email addresses or job titles
- Wrong type of response in certain fields (e.g., an individual's name in the organization or job title field)
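Several of these signals can be combined into a quick triage pass over exported responses. A rough sketch, assuming hypothetical field names (id, ip, recaptcha, country, ip_country) rather than real Qualtrics export columns:

```python
from collections import Counter

def triage(responses):
    """Map response IDs to a list of fraud signals worth a manual look."""
    ip_counts = Counter(r["ip"] for r in responses)
    flags = {}
    for r in responses:
        reasons = []
        if r.get("recaptcha", 1.0) < 0.5:              # likely bot
            reasons.append("low reCAPTCHA score")
        if ip_counts[r["ip"]] > 1:                     # possible repeat taker
            reasons.append("duplicate IP")
        if r.get("ip_country") and r["ip_country"] != r.get("country"):
            reasons.append("IP/location mismatch")
        if reasons:
            flags[r["id"]] = reasons
    return flags

responses = [
    {"id": "R_1", "ip": "1.2.3.4", "recaptcha": 0.9, "country": "US", "ip_country": "US"},
    {"id": "R_2", "ip": "1.2.3.4", "recaptcha": 0.2, "country": "US", "ip_country": "IN"},
]
print(triage(responses))
```

As with the timing check, these flags only prioritize responses for human review; any single signal can have an innocent explanation (VPNs, shared offices, travel).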
Tips on importing questions from another instance of Qualtrics
If you've been here a while, you probably have a few projects that you often use as a starting point for building on top of the core template. To bring one of these into the new instance, or to share it with members of a team that uses a separate instance, follow these quick and simple steps.
To export
- From the Survey tab view of the survey you wish to export, click on the Tools dropdown.
- In the Tools menu, choose Import/Export > Export Survey.
- This should export the survey in the proprietary .qsf format of Qualtrics.
To import
- Go into the Qualtrics instance where you want to bring in (import) the survey.
- From the home page, choose Create a New Project
- On the Create a project page, choose survey (this may be selected by default already) and click the Get Started button.
- In the How do you want to start your survey? dropdown (which defaults to "Create a blank survey project"), choose "Import a QSF file".
- Click the Choose File button to browse for and select the file you exported from the other instance.
- Click the "Create Project" button. You may now proceed with your screener or survey from here, and it's of course totally separate from the original survey in the other instance.
Miscellaneous little tips
- Trouble finding a survey by name? Has the name changed several times? There’s a unique code for your survey in the url, Eg SV_86LQYxVhXiW3GfQ. You can search by this code later, regardless of the display name, to find it. Tuck this code away in your Jira or project documentation.
- Default security on Preview mode can cause issues for stakeholders when you share it with them, because they don't have a Qualtrics login. Folks who aren't in Qualtrics need the Public option: in the Link visibility permissions dropdown, switch from "Brand internal only" (meaning only logged-in users) to "Public" when generating the share URL.
- You cannot add a multiple-choice answer option without deleting the logic for ALL responses within that question.
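The SV_ code mentioned above can also be pulled out of a URL programmatically when you're cataloguing surveys. A small sketch (the example URL path is illustrative, not a guaranteed format):

```python
import re

def survey_id(url):
    """Extract the SV_... survey ID from a Qualtrics URL, or None if absent."""
    match = re.search(r"SV_[A-Za-z0-9]+", url)
    return match.group(0) if match else None

print(survey_id("https://example.qualtrics.com/jfe/form/SV_86LQYxVhXiW3GfQ"))
# SV_86LQYxVhXiW3GfQ
print(survey_id("https://example.com/no-survey-here"))  # None
```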
Additional Resources
For more tips about using Qualtrics, feel free to reference several documents the researchers on our team put together in Google Drive.
And here's a deck with some tips and tricks put together in 2022.
Sharing and collaborating
By default, you only see surveys created by you or shared with you. To collaborate, you need to make a survey available for others to see, and you can choose their level of access. To do this:
- From the Survey tab view of the survey you wish to share, click on the Tools dropdown.
- In the Tools menu, choose Collaborate.
- In the Collaborate on Project window, search for people with whom you wish to share this survey.
- Use checkmarks to indicate what types of access they should have to this study (see screen shot).
- Click Save.
Tracking the source of respondents
The Ops team is tracking the source of all respondents and participants, in order to understand what recruitment avenues are most productive.
If you copy from the template in Qualtrics, this should already be set up for you to use.
If you need to add these tags to a study, here's a handy tool that allows you to paste your study url and it will append the tags for you.
If you are for some reason creating from scratch (e.g. maybe you have very custom tags for your unique study), follow these steps:
- On the left navigation bar, click the icon for Survey flow.
- Scroll to the very bottom of the survey, below the End of Survey block.
- Click the green text: + Add a New Element Here
- In the field on the left, the placeholder text reads: Create New Field or Choose From Dropdown.... Replace this with the exact text: source=
- Do not place any values on the right; it's done. Click the blue Apply button on the bottom right.
- On the left navigation bar, click the icon for Builder.
- Be sure to click the Publish option on the upper right.
Whenever distributing this survey via different channels, simply add a short bit of metadata at the end of the url. This will be captured so we can tell who signed up/responded from where.
To do this:
- Generate your distribution link like normal (steps if you need them):
- Click on the Distributions tab at the top of the survey.
- On the left, choose Anonymous link
- Click the Copy survey link button
- Now append the following to the end of your distribution URL: ?source=[sourcename]
- For example, if my distribution URL is https://redhatsurvey.com and I want to distribute on LinkedIn and Twitter, for LinkedIn my URL will look like: https://redhatsurvey.com/?source=LinkedIn
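If you are tagging many links, appending the parameter by hand gets error-prone (especially when a URL already carries a query string). A small stdlib sketch that handles both cases; the URLs are just the examples from above:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def add_source(url, source):
    """Append (or overwrite) the source= query parameter on a distribution URL."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["source"] = source
    return urlunsplit(parts._replace(query=urlencode(query)))

print(add_source("https://redhatsurvey.com", "LinkedIn"))
# https://redhatsurvey.com?source=LinkedIn
print(add_source("https://redhatsurvey.com?lang=en", "Twitter"))
# https://redhatsurvey.com?lang=en&source=Twitter
```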
ALWAYS use the standardized source names from the list below when applicable, following the same naming conventions. If you ever find something is missing, please comment here or reach out to Ops so we can evaluate and add it to the list (do feel free to add your non-standardized source so you can keep moving along, just make sure we know!).
Standard List of Sources

| Source | Notes |
| --- | --- |
| Bambu | |
| PreviousParticipantCallback | |
| Redirect | |
| RedHatAccelerators | |
| RedHatUserResearchWebPage | |
| SFdb | A response from someone who is already in our participant database in Salesforce. |
| UserInterviews | |
Qualtrics administration
For general information on being account admin, see the Qualtrics Admin Basics Overview page. Below are details about how we manage things on our team specifically.
Maintaining Templates
Any time changes are going to be made:
- Make a COPY of the template.
- The new copy (a snapshot of the old version) should be named zzOLD and moved into an archive folder.
- The CURRENT one is always the latest template (to prevent any linking issues).
- Be sure to note in the first question of the template what changes were made.
- Update the Embedded Data field named "Template Version" so that we can make sure the most current template is always being employed. Use the V(CCYYMM) format. (see image)
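The V(CCYYMM) version string can be generated rather than typed, which avoids typos when updating the template. A tiny sketch:

```python
from datetime import date

def template_version(d=None):
    """Build the "Template Version" value in V(CCYYMM) format: V, then year, then month."""
    d = d or date.today()
    return f"V{d:%Y%m}"

print(template_version(date(2024, 8, 19)))  # V202408
```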
Rationale on using a study rather than the templates library
The templates in the library in Qualtrics do not support the Options section, which includes post-survey triggers. These are critical for configuring syncing with the Salesforce database. Therefore, we maintain a "template" study rather than an actual template in the library.
How we use Users, User Types, and Groups
Each user type and group has different security access. Note: We should update any security access at the Group level, so that members of the group all have the same access (this is simpler to track). If there are one-off exceptions, the tracking sheet will need to be updated with rationale.
| User Type | Description | Group |
| --- | --- | --- |
| UXD Ops | Members of the operations team, requiring full administrative access | UXD Ops |
| UXD Researchers | Researchers, with full access except the ability to modify library items or change the Theme (in Look and feel) to the generic themes | UXD Researchers |
| UXD PWDRs | Members of UXD who are doing some research, but are not researchers; slightly more limited access | UXD PWDRs |
Whenever a user is added, or access is modified or revoked, it must be updated in the Qualtrics Access tab of this tracking sheet.
Why do we have two instances of Qualtrics?
The "old" instance of Qualtrics (redhatdg.co1.qualtrics) forced all researchers to share one login (the username was uxdresearch@redhat.com). This has disadvantages: everyone has the same level of access and visibility into all projects, which makes it hard to partition and focus only on relevant projects, or to track who changed something. We are keeping it because it has the advantage of relatively unlimited use, since it rolls up to the corporate Red Hat account; it's a good back-up. It also means we can maintain access to the older studies while making a clean start in the new instance.
In the current instance (feedback.redhat.com), each person has their own login. We are now able to exercise some governance over naming conventions and folder structure, as well as share with other teams (the CXA team currently owns the account and is in contact with Qualtrics).