Booking Flow
Redesign

Booking tasks rated 5 to 6.5 out of 7 on ease compared to repeated failures in initial research

UX Design

Usability Testing

Benchmarking

Project Overview

Have you ever abandoned a hotel booking out of pure frustration? Imagine redesigns the booking flow around a single principle: every click should feel predictable, every price should be visible, every decision should be easy.

Challenge

Users struggled with unpredictable date selection, comparison overload, and hidden key information.

Impact

Users completed all 3 booking tasks without the confusion and repeated attempts observed in initial research.

Team

UX Designer (Me 🙋🏽‍♂️)

Process Overview

From understanding the market to testing the solution. Here's how the work unfolded.

Research

Competitive Analysis

Usability Testing

Analysis

Affinity Diagram

User Journey Mapping

Flow diagram

Design

Sketching

Wireframes

Testing

Prototyping

Research

Competitive Benchmarking

To understand the UX standards in the market and identify where consistent gaps exist, I analysed 4 leading booking platforms across the full booking journey — from search to checkout.

| Criteria | airbnb.com | booking.com | hotels.com | wake.com |
| --- | --- | --- | --- | --- |
| Calendar UX | No state feedback | Real-time preview | No hover preview | Dynamic updates |
| Room Comparison | N/A | Cluttered layout | Clear & scannable + labels | Large cards -> scrolling |
| Key Info Visibility (Cancellation · Breakfast · Location) | Icons & clear layout | Cluttered layout | Structured & visual | Plain list, no icons |
| Trust Signals (Social proof · badges · rating) | Guest Favorite | Rate check badge | Lowest price label | Gimmicky execution |

Benchmarking defined the standard. Usability testing showed where users struggled.

Usability Testing

The benchmarking showed what good looked like. I then ran 4 moderated usability sessions across the full booking flow to understand where real users struggled and what would make them leave.

What I needed to understand

Context, Goals, Behaviors, User Flow

Friction points across search, comparison, and checkout

Trust signals and price transparency

Methodology

Duration

60 min moderated in-person sessions, 4 participants

Scope

Full booking flow, 4 websites

Method

Open-ended questions, Think-aloud, behavioral observation

My role

Test script adaptation, recruitment, moderation

"I'd just move on to a different website."

Said during the calendar interaction. One of three moments where users expressed intent to leave.

Analysis

Affinity Diagram

315 observations across 8 hotel booking websites. Clustering competitive benchmarking and usability testing data revealed three recurring, high-impact friction patterns that appeared in every session.

1

Unpredictable Calendar
Users could not tell which field was active, so the calendar often felt unpredictable. This led to repeated attempts, longer time on task, and early drop-off before users even reached room selection.

2

Unclear Room Comparison
Room options were hard to compare because key differences were scattered and required scrolling or back-and-forth navigation. This made it difficult to choose confidently and increased decision fatigue at the most important step in the flow.

3

Hidden Information
Key details like breakfast inclusion, location, and cancellation terms were not visible early enough to make a confident decision. When users had to search for these late in the journey, trust dropped and abandonment became more likely.

Customer Journey Map

The journey map highlights the biggest frustration spikes during date selection and room comparison. In both stages, users hesitated, repeated actions, and searched for clarity, which reduced trust and increased the likelihood of dropping off before checkout. This supported the same friction points identified in the affinity diagram and added a second layer of validation across the full journey.

Mapping the full booking journey made the emotional cost of these friction points visible.

Flow diagram

The flow diagram shows where the three main friction points were addressed across the booking journey: predictable date selection, scannable room comparison, and earlier access to key information.

Before designing, I mapped the ideal flow — ensuring every friction point was addressed at the exact step where users encountered it.

The calendar section is shown in more detail as an example. Unlike competitor flows observed in research, the overlay stays open until the user intentionally moves to the next input, and the active field is always indicated. Hover previews use distinct shapes for check-in and check-out, and the corresponding date field updates before selection.

Once selected, the date range remains visible and persistent. This reduces uncertainty, prevents repeated actions, and saves steps early in the booking process.
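This calendar behaviour (persistent overlay, indicated active field, hover preview, range that restarts rather than errors) can be sketched as a small state model. The following is a minimal TypeScript sketch for illustration only; `CalendarState`, `selectDate`, and `hoverPreview` are hypothetical names, not the production code.

```typescript
// Illustrative sketch of the redesigned calendar's selection rules.
// All names here are assumptions made for this example.
type Field = "checkIn" | "checkOut";

interface CalendarState {
  checkIn: string | null;  // ISO dates ("2024-06-01") compare lexicographically
  checkOut: string | null;
  activeField: Field;      // always indicated in the UI
  overlayOpen: boolean;    // never closes on its own
}

// Picking a date fills the active field and advances focus to check-out.
// A check-out on or before check-in restarts the range instead of erroring.
function selectDate(state: CalendarState, date: string): CalendarState {
  if (state.activeField === "checkIn" || (state.checkIn !== null && date <= state.checkIn)) {
    return { ...state, checkIn: date, checkOut: null, activeField: "checkOut", overlayOpen: true };
  }
  return { ...state, checkOut: date, overlayOpen: true };
}

// While choosing check-out, hovering previews the full range before commit.
function hoverPreview(state: CalendarState, hovered: string): [string, string] | null {
  if (state.activeField === "checkOut" && state.checkIn !== null && hovered > state.checkIn) {
    return [state.checkIn, hovered];
  }
  return null;
}
```

The key property is that no transition ever sets `overlayOpen` to false: closing remains an explicit user action, never a side effect of selection.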

Design

From Hidden to Visible

The principle was simple: if users have to guess, the design has failed. Every solution addresses one friction point, directly at the step where it occurs. The redesign focused on three areas: a predictable calendar with clear state feedback, a scannable room comparison view, and transparent pricing visible before the final booking step.

  • Homepage

  • Search Results

  • Hotel Details

  • Room Selection

  • Reviews

  • Personal Details

  • Add-Ons

  • Booking Confirmation

1

Predictable Date Selection

Before: A user struggled for nearly 2 minutes as the calendar switched fields unpredictably and the screen went blank repeatedly. She massaged her forehead and said: "I actually may just move on to a different website."

After: Errors, time on task, and frustration at the first step of the booking flow are reduced, building the trust needed to move users toward completion.

STATE FEEDBACK
Active input field always highlighted — users know exactly which date they're selecting next, preventing the repeated attempts observed in usability tests.

RANGE PREVIEW
Hovering shows the full range in real-time before confirming. Once selected, the date range persists on the calendar — giving users continuous visual reassurance.

NO OVERLAY CLOSE
On a competitor website, the calendar closed automatically when users tried to change their dates, forcing them to reopen it. Here, the calendar stays open, allowing direct corrections and reducing the clicks needed to modify a booking.

2

Scannable Comparison

Before: Users had no way to compare rooms without scrolling back and forth. Room cards took up the full screen, and key differences between rooms were unclear.

After: Side-by-side comparison reduces the time users spend evaluating options, lowering decision paralysis and supporting faster booking completion.

SIDE-BY-SIDE
Users can select up to 4 rooms to compare side-by-side. Limiting comparisons prevents decision paralysis: fewer options at once mean faster, more confident choices.

KEY INFO UPFRONT
Bed type, cancellation, and payment terms visible on every card without clicking.

DECISION GUIDANCE
'Best Value' tag reduces cognitive load at the most critical decision point — supporting conversion by helping users commit faster, while creating upselling opportunities for premium rooms at discounted rates.

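The four-room comparison cap above reduces to a simple selection rule. Here is a hedged TypeScript sketch; `toggleCompare`, string room IDs, and the constant name are illustrative assumptions, not the product's actual API.

```typescript
// Illustrative sketch only: a comparison tray capped at four rooms.
const MAX_COMPARE = 4;

function toggleCompare(selected: string[], roomId: string): string[] {
  if (selected.includes(roomId)) {
    // Clicking a selected room removes it from the comparison.
    return selected.filter((id) => id !== roomId);
  }
  if (selected.length >= MAX_COMPARE) {
    // At the cap, further selections are ignored; the UI would explain why.
    return selected;
  }
  return [...selected, roomId];
}
```

Returning the unchanged array at the cap, rather than evicting the oldest selection, keeps the behaviour predictable: nothing the user already chose disappears silently.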

3

Visible Key Information

Before: Users struggled to find essential details. Breakfast inclusion, hotel location, and cancellation terms only appeared late in the flow.

After: Surfacing key information early reduces uncertainty before users commit, lowering late-stage drop-off and supporting booking completion through transparency rather than urgency.

LOCATION UPFRONT
Location details required users to search through multiple screens before finding them. Showing distance to city center and train station immediately reduces time on task, prevents frustration from searching, and builds trust by being upfront about what matters most.

BREAKFAST CLARITY
"Breakfast (optional)" with a ⓘ icon placed in amenities before room selection, and shown again as a selectable add-on at checkout. The status remains visible and unambiguous at both decision points.

TERMS UPFRONT
Cancellation policy and check-in times visible before users commit — building trust & confidence early rather than creating doubt at the final step.

Evaluative Testing

To validate whether the redesign addressed the identified friction points, I conducted 3 moderated usability sessions using the SEQ (Single Ease Question) after each task. Each task mapped directly to one of the three friction points.

With n=3, these results are treated as directional indicators rather than statistically significant findings. They suggest the redesign reduces the friction documented in initial research and identify areas for further iteration.

| Task | Description | SEQ Score |
| --- | --- | --- |
| Date selection | Select check-in and check-out dates | 6.5 / 7 |
| Room comparison | Compare rooms and select one | 6 / 7 |
| Key Info | Find breakfast and cancellation details | 5 / 7 |

An Unexpected Finding

One user double-checked if breakfast was included during room selection by going back and forth, revealing that it should be added to the room cards as well.

Reflection

What I learned

What I learned

What I learned

Established platforms are a starting point, not a standard

Large platforms are a valuable starting point for competitive research, but they should not be treated as the definitive standard for good UX. Usability testing revealed friction points that users had normalised over time without questioning. Staying critical and aiming beyond existing patterns is what creates genuinely better experiences.

Research before features

A feature idea for room comparison seemed like an obvious improvement early in the project. A key principle from the UX Design Institute reframed this thinking: adding features before validating the existing experience is a common mistake.

Test what you think you've already solved

The unexpected finding during evaluative testing (a user struggling to locate breakfast information despite it being visible on the detail page) showed that research findings do not automatically translate into effective design solutions. Testing confirmed what worked and revealed what still needed attention.

Next steps

To validate these concepts, I would:

  • Add breakfast status to room comparison cards, as evaluative testing revealed users still searched for this information during room selection

  • Conduct a larger round of usability testing to move from directional SEQ indicators to more statistically reliable findings