Preview

Preview mode lets you test your experiment exactly as participants will experience it. Run through your entire study to verify timing, check instructions, test response collection, and ensure everything works correctly before sharing with real participants.

Overview

Preview mode provides:
  • Participant-identical experience - See exactly what participants see
  • Full functionality testing - All features work as in real experiment
  • Data collection - Preview responses recorded for verification
  • Flow testing - Verify component sequencing and branching
  • Timing verification - Check that durations feel appropriate
  • No permanent data - Preview data separate from real participant data
When to use Preview:
  • After building any new components
  • Before sharing experiment with participants
  • When making timing or flow changes
  • To verify randomization works correctly
  • To test variable substitution
  • Before submitting for publication or sharing
Critical: Always preview your complete experiment before collecting real data. Errors discovered during data collection waste participant time and research resources.

Starting Preview

Opening Preview Mode

Three ways to start preview:
  1. Keyboard shortcut: Press P from any view in Task Editor
  2. Preview button: Click “Preview” button in top toolbar
  3. Preview panel: Open Preview panel from left sidebar

Preview Options

When starting preview, you can choose:
Start from beginning:
  • Runs complete experiment from first component
  • Recommended for full testing
Start from specific component:
  • Jump to middle of experiment
  • Useful for testing specific sections
  • Select component from dropdown
Test subset of trials:
  • For experiments with many trials (e.g., 160-trial Stroop)
  • Preview first 10-20 trials instead of all
  • Configured in frame or variable settings

Preview Controls

During preview, you have control buttons to manage the test session.

Control Buttons

Pause/Resume:
  • Pause experiment at current screen
  • Resume to continue from same point
  • Useful for: Taking notes, checking configuration, discussing with colleagues
Skip Forward:
  • Jump to next component
  • Bypasses current component duration
  • Useful for: Getting past long delays, testing flow connections
Skip Backward:
  • Return to previous component
  • Re-test specific screens
  • Useful for: Reviewing components, testing variations
Stop/Exit:
  • End preview session immediately
  • Returns to Task Editor
  • Useful for: Found issue that needs fixing, testing complete
Restart:
  • Start experiment over from beginning
  • Keeps preview mode open
  • Useful for: Second pass with fresh perspective, testing randomization

Keyboard Shortcuts During Preview

| Key | Action |
| --- | --- |
| Space | Pause/Resume |
| → | Skip to next component |
| ← | Skip to previous component |
| Esc | Stop preview and exit |
| R | Restart from beginning |

What Preview Shows

Preview mode replicates the exact participant experience.

Participant View

What you see:
  • Same screen layout participants will see
  • Same fonts, colors, images, and styling
  • Same element positioning
  • Same component sequencing
  • Same timing (durations, ITIs, timeouts)
What’s different:
  • Preview controls visible (can be hidden with H key)
  • Current component name shown in header (optional)
  • Preview data tag in database (doesn’t mix with real data)

Response Collection

All response methods work:
  • Keyboard responses recorded
  • Button clicks captured
  • Form inputs collected
  • Mouse interactions logged
  • Reaction times measured
Where preview data goes:
  • Stored temporarily in preview database
  • Visible in Analytics panel
  • Not mixed with real participant data
  • Can be cleared or reviewed
Check response collection:
  1. Complete preview trial
  2. Open Analytics panel
  3. Verify response recorded correctly
  4. Check reaction time measurements
  5. Confirm correct/incorrect scoring

Timing Accuracy

Preview shows real timing:
  • Component durations - Actual millisecond timing
  • Response windows - Timeouts and delays work correctly
  • ITIs and fixations - Inter-trial intervals shown
  • Auto-advance - Components transition at specified times
Test timing by:
  • Using stopwatch to verify durations
  • Noting if pacing feels rushed or slow
  • Checking if response windows are appropriate
  • Ensuring fixations are visible long enough
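A quick back-of-the-envelope calculation also helps: estimate how long the trial portion should take and compare it to your stopwatch. A minimal Python sketch, where the trial count, durations, and break length are illustrative assumptions rather than values from your experiment:

```python
# Rough estimate of total run time from per-trial timing.
# All counts and durations below are illustrative, not tool defaults.
n_trials = 160        # e.g., a 160-trial Stroop
fixation_ms = 500     # fixation cross
stimulus_ms = 1500    # stimulus / response window
iti_ms = 500          # inter-trial interval
break_every = 40      # block intermission interval
break_ms = 30_000     # assume a 30-second break screen

trial_ms = fixation_ms + stimulus_ms + iti_ms
n_breaks = (n_trials - 1) // break_every        # breaks after trials 40, 80, 120
total_min = (n_trials * trial_ms + n_breaks * break_ms) / 60_000
print(f"Trials + breaks: about {total_min:.1f} minutes")   # ~8.2 minutes
# Instructions, consent, and debrief add time on top of this.
```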

Flow Transitions

Verify experiment flow works correctly:
  • Sequential flow - Components advance in correct order
  • Loop iterations - Loops repeat correct number of times
  • Randomization - Order changes across preview runs
  • Conditional branching - Different paths based on responses
  • Frame behavior - Frames execute as configured

Testing Checklist

Systematically verify all aspects of your experiment during preview.

Before You Start

  • Save experiment (preview uses saved version)
  • Review checklist of what to test
  • Prepare notepad for issues found
  • Clear previous preview data (if needed)

Consent and Instructions

  • Consent form displays correctly
    • All text visible and readable
    • Consent button works
    • Required reading time enforced (if applicable)
  • Instructions are clear
    • No typos or grammatical errors
    • Instructions match actual task
    • Examples shown correctly (if applicable)
    • All images/diagrams load
  • Navigation works
    • “Continue” or “Next” buttons functional
    • Keyboard shortcuts work (if applicable)
    • Can’t accidentally skip important info

Stimuli and Content

  • All images load correctly
    • No broken image links
    • Images sized appropriately
    • Image fit mode works as intended
  • Text displays correctly
    • Font size readable
    • Colors provide adequate contrast
    • Alignment looks professional
    • No text overflow or truncation
  • Elements positioned correctly
    • Centered elements actually centered
    • Alignment looks intentional
    • No overlapping elements
    • Spacing consistent

Response Collection

  • Response keys work
    • All valid keys register responses
    • Invalid keys ignored
    • Keyboard focus working correctly
  • Buttons clickable
    • All buttons respond to clicks
    • Button states change (hover, click)
    • Correct button advances/submits
  • Forms functional
    • Text inputs accept typing
    • Radio buttons single-select
    • Checkboxes multi-select
    • Dropdowns open and select
    • Required fields enforced
    • Validation works (email format, number ranges, etc.)
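If your forms validate input, it helps to know which probe values to try during preview. The sketch below shows the kind of rules the last checklist item refers to; the email pattern and the 18-99 age range are assumptions for illustration, since the actual rules live in your form element settings.

```python
import re

# Illustrative validation rules - your form element defines the real ones.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def valid_email(value: str) -> bool:
    return bool(EMAIL_RE.match(value))

def valid_age(value: str, low: int = 18, high: int = 99) -> bool:
    return value.isdigit() and low <= int(value) <= high

# Probe values worth typing into the form during preview:
for probe in ["name@example.com", "name@example", "", "not an email"]:
    print(repr(probe), "->", valid_email(probe))
for probe in ["25", "17", "200", "abc"]:
    print(probe, "->", valid_age(probe))
```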

Timing and Pace

  • Component durations feel right
    • Not too fast to read/respond
    • Not so slow participants get bored
    • Fixations visible but not excessive
  • Response windows appropriate
    • Enough time for thoughtful responses (untimed)
    • Challenging but achievable (speeded tasks)
    • Timeouts don’t occur prematurely
  • Overall pacing comfortable
    • Experiment doesn’t feel rushed
    • Breaks at appropriate intervals (if long experiment)
    • Can maintain attention throughout

Flow and Structure

  • Components in correct order
    • Logical progression (consent → instructions → task → debrief)
    • No missing steps
    • No unexpected jumps
  • Loops work correctly
    • Repeat expected number of times
    • Exit properly after iterations complete
  • Randomization functions
    • Order changes across preview runs
    • All components appear (none skipped)
  • Conditional branching works
    • Correct path taken based on responses
    • All branches reachable and tested

Data Collection

  • Responses recorded
    • Check Analytics panel after preview
    • All responses captured
    • Correct/incorrect scored properly
    • Reaction times reasonable
  • Data structure correct
    • Expected columns present
    • Data types appropriate (numbers, text, etc.)
    • No missing or null values unexpectedly
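One way to check structure is to export the preview data and inspect it offline. A sketch assuming a CSV export with hypothetical column names (substitute the columns your experiment actually produces):

```python
import pandas as pd

# Hypothetical filename and column names - adjust to your export.
df = pd.read_csv("preview_data.csv")

expected = {"trial_index", "component", "response", "rt", "correct"}
print("Missing columns:", expected - set(df.columns) or "none")
print(df.dtypes)          # rt should be numeric, response should be text
print(df.isna().sum())    # unexpected nulls show up here, per column
```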

Preview with Timeline Variables

For experiments using timeline variables, preview with extra checks.

Testing Subset of Trials

Don’t preview all 160 trials - test a representative subset.
How to preview a subset:
  1. Before preview, go to Variables view
  2. Enable “Preview mode” or “Sample trials”
  3. Set sample size (e.g., 10 trials)
  4. Run preview with just those trials
  5. Verify variable substitution works correctly
What to sample:
  • Mix of all conditions (if applicable)
  • Example: If 4 conditions, preview 2-3 trials of each
  • Check randomization across sampled trials
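If you want to confirm offline that a sample would cover every condition, you can draw a stratified subset from your trial list. A sketch assuming your timeline variables are exported as CSV with a condition column (both the filename and the column name are hypothetical):

```python
import pandas as pd

# Hypothetical export of the Variables table with a 'condition' column.
trials = pd.read_csv("timeline_variables.csv")

# Two to three trials per condition gives a representative preview set.
subset = (
    trials.groupby("condition", group_keys=False)
          .apply(lambda g: g.sample(min(len(g), 3), random_state=1))
)
print(subset["condition"].value_counts())
```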

Seeing Variable Substitution

Verify variables fill in correctly.
Check substitution for:
  • Text content - ${word} becomes actual word
  • Images - ${stimulus}.png loads correct image
  • Colors - ${color} applies correct color
  • Correct responses - ${correctKey} matches stimulus
How to verify:
  1. Note variable values in Variables view for first few trials
  2. Run preview
  3. Check that each trial shows expected stimulus based on variable values
  4. Confirm variation across trials (not all showing same stimulus)
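Conceptually, each trial takes one row of your Variables table and fills its values into the ${...} placeholders. A minimal Python sketch of that idea, illustrative only and not the platform's actual implementation:

```python
import re

# Conceptual model of ${...} substitution - not the platform's own code.
def substitute(template: str, row: dict) -> str:
    return re.sub(r"\$\{(\w+)\}", lambda m: str(row[m.group(1)]), template)

row = {"word": "RED", "color": "blue", "correctKey": "j"}   # one Variables row
print(substitute("${word}", row))                           # -> RED
print(substitute("Press ${correctKey} for ${color}", row))  # -> Press j for blue
```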

Checking Randomization

If variables are set to randomize, test the randomization:
  1. Run preview once, note trial order
  2. Restart preview
  3. Note trial order again
  4. Verify order changed
  5. Confirm all trials still appear (none lost due to randomization)
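You can also confirm this from the data: export both preview runs and compare trial orders. A sketch assuming CSV exports with a stimulus column (filenames and column name are hypothetical):

```python
import pandas as pd

# Hypothetical exports of two preview runs.
run1 = pd.read_csv("preview_run1.csv")["stimulus"].tolist()
run2 = pd.read_csv("preview_run2.csv")["stimulus"].tolist()

print("Order changed across runs:", run1 != run2)
print("Same trials, none lost:", sorted(run1) == sorted(run2))
```
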
Check intermissions:
If using block intermissions (e.g., break every 40 trials):
  • Preview with enough trials to hit first intermission
  • Verify break screen appears at correct interval
  • Check break can be ended (continue button works)
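A quick calculation tells you where breaks should fall and how many trials you need to preview to reach the first one (the 160-trial and 40-trial numbers come from the example above):

```python
n_trials, break_every = 160, 40
break_points = list(range(break_every, n_trials, break_every))
print("Breaks expected after trials:", break_points)       # [40, 80, 120]
print("Preview at least", break_every + 1, "trials to see the first break")
```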

Common Issues to Check

Look for these frequent problems during preview.

Missing Images

Symptom: Broken image icon or blank space where the image should be
Causes:
  • Image not uploaded to media library
  • Incorrect image filename in properties
  • Wrong file path
Fix:
  1. Open Screens view
  2. Select image element
  3. Re-select image from media library
  4. Verify image appears in preview

Unclear Instructions

Symptom: Confusion about what to do, ambiguous wording
Causes:
  • Instructions too brief or too detailed
  • Jargon or technical terms
  • Missing examples
  • Response key mappings not explained
Fix:
  1. Revise instruction text for clarity
  2. Add example trial or screenshot
  3. Explicitly state response keys
  4. Test with colleague unfamiliar with task

Confusing Response Mappings

Symptom: Unclear which key does what, mistakes on easy trials
Causes:
  • Non-intuitive key assignments (e.g., X for “Yes”, C for “No”)
  • No reminder of response mapping during task
  • Too many response options
Fix:
  • Use intuitive keys (F/J for left/right, Y/N for yes/no)
  • Show reminder on each screen (e.g., “F = Red, J = Green”)
  • Simplify to fewer options if possible

Timing Too Fast or Slow

Symptom: Can’t keep up / getting bored
Causes:
  • Stimulus duration too short
  • ITI too long
  • Response window insufficient
  • No self-pacing allowed
Fix:
  • Adjust component durations based on preview experience
  • Pilot with someone else to get objective feedback
  • Check literature for standard timing in similar tasks
  • Consider self-paced option for non-timed tasks

Components in Wrong Order

Symptom: Unexpected jumps, illogical flow
Causes:
  • Dragged components to wrong position
  • Connections set up incorrectly
  • Frame encompasses wrong components
Fix:
  1. Check Timeline view for sequence
  2. Check Flow view for connections
  3. Reorder components as needed
  4. Re-preview to confirm

Tips for Effective Preview

Preview Early and Often

Don’t wait until experiment is complete:
  • Preview after adding each section - Catch issues early
  • Preview after major changes - Verify changes work as intended
  • Preview before sharing - Final quality check
Benefit: It’s easier to fix small issues as they appear than to discover 10 problems at once.

Test on Different Devices

If participants will use various devices:
  • Desktop - Standard for lab experiments
  • Laptop - Smaller screen, different keyboard
  • Tablet - Touch interface, mobile layout
  • Smartphone - Small screen, vertical orientation
How: Use responsive design features and test on actual devices, not just a resized browser window.

Get Colleague Feedback

Ask someone unfamiliar with your experiment to preview:
  • Fresh perspective - See things you’ve missed
  • Naive participant viewpoint - Identifies unclear instructions
  • Actual user testing - Real responses and confusion points
Ask them:
  • What was unclear?
  • Were instructions sufficient?
  • Was pacing comfortable?
  • Any technical issues?

Run Through Complete Experiment

Do at least one full-length preview:
  • Experience pacing - Feel how long experiment actually takes
  • Test fatigue effects - Is it too long?
  • Check breaks - Are breaks placed appropriately?
  • Verify ending - Ensure experiment completes properly
Tip: Time yourself - if it feels long to you, it will feel long to participants.

Take Notes During Preview

Document issues as you find them:
  • Keep notepad or document open
  • Pause preview to write notes
  • Include component names and specific issues
  • Prioritize critical vs. minor fixes
Example notes:
  • “Instruction component ‘Task Overview’ - typo in line 2”
  • “Stimulus duration too fast - increase from 1000ms to 1500ms”
  • “Missing image in trial 5 - check Variables row 5”

Check Analytics After Preview

Verify data collection:
  1. Complete preview
  2. Open Analytics panel
  3. Check data structure
  4. Verify all expected columns present
  5. Confirm values make sense
  6. Check correct/incorrect scoring
Look for:
  • Missing data (should be none)
  • Unexpected values (e.g., negative RTs)
  • Scoring errors (correct responses marked incorrect)
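The same kind of offline check works for values. A sketch assuming the hypothetical CSV export and column names from the earlier sketches, plus an assumed correctKey column:

```python
import pandas as pd

# Hypothetical export and column names - adjust to your data.
# 'correct' is assumed to be boolean or 0/1.
df = pd.read_csv("preview_data.csv")

print("Rows with missing values:", df.isna().any(axis=1).sum())
print("Non-positive reaction times:", (df["rt"] <= 0).sum())

# Spot-check scoring: was each response scored against the key
# that trial defined as correct?
mismatch = df[df["correct"] != (df["response"] == df["correctKey"])]
print("Scoring mismatches:", len(mismatch))
```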

Next Steps

After preview testing and fixes, you’re ready to share your experiment. Preview is your quality control step - use it thoroughly to ensure professional, error-free experiments that provide reliable data and a good participant experience.