Project Description


The Fortius Platform provides suppliers in the UK construction industry with an accreditation service for their businesses, giving them opportunities to join larger construction companies’ supply chains and projects.

As part of the accreditation service, suppliers must input a large amount of information about their company details, financial status, health and safety policies, modern slavery procedures, and much more.

The consequence is mammoth digital forms that customers need to complete and submit for assessment by the Fortius verification team, who can then verify their details and approve them as accredited suppliers on the system.

There was lots of feedback coming into the business from users and the customer service team saying that users were struggling to get through the assessment process for some sets of questions. This often led to the verification team needing to return submissions to customers with feedback, requiring them to resubmit with updated details. Customers regularly needed multiple submissions to achieve verified status, which left them frustrated, delayed their verification, and created inefficiencies for the verification team, who had to reassess multiple submissions.

Our Brief

As a design team, our brief from the business was to look at how a better approach to the design of question sets could help users to complete them more effectively the first time and get verified more quickly.

Business Objective

“How do we help Suppliers get verified first time round, and reduce resubmission rates?”

Documenting user feedback

We wanted to document insights coming back from users as well as the customer service and verification teams, to highlight any obvious pain points and understand where the issues were coming from.

“I wasn’t aware of the specific information I needed to enter for these questions”

User feedback

“I’m not sure where I am in the process and how much I need to do”

User feedback

“Guidance text is often really confusing, really long and take ages to read”

User feedback

“It wasn’t clear enough what format the document needed to be in”

User feedback

Themes from the feedback:

  • Users were often not aware of crucial acceptance criteria for questions, meaning their answers would fail assessment.
  • Question and guidance text was often confusing and poorly formatted.
  • In some cases, questions and the user input weren’t displayed together (due to the amount of guidance text in between). This meant users weren’t always sure where to input their answer to a question.
  • It wasn’t clear enough what criteria the documents users needed to upload as evidence had to meet.
  • Users were not always clear how long a form was or how far through completing it they were.

Looking at the data

The business was already tracking submission and verification success data. Analysis of this data backed up the feedback coming from the team. It was clear that some of the question sets were causing issues that prevented customers from getting through the verification process.

We used the data to analyse performance and to highlight specific question sets and particular questions with high failure rates, so we could give them particular attention. The team also planned to use continued data monitoring to track improvements in future releases.
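The failure-rate analysis described above can be sketched as a simple aggregation. This is a hypothetical illustration, not the platform’s actual code; the record shape and field names are assumptions.

```typescript
// Hypothetical sketch: computing per-question failure rates from
// assessment records. The AssessmentRecord shape is an assumption.
interface AssessmentRecord {
  questionId: string;
  passed: boolean;
}

function failureRates(records: AssessmentRecord[]): Map<string, number> {
  const totals = new Map<string, { failed: number; total: number }>();
  for (const r of records) {
    const t = totals.get(r.questionId) ?? { failed: 0, total: 0 };
    t.total += 1;
    if (!r.passed) t.failed += 1;
    totals.set(r.questionId, t);
  }
  // Convert raw counts into a failure rate per question.
  const rates = new Map<string, number>();
  for (const [id, t] of totals) {
    rates.set(id, t.failed / t.total);
  }
  return rates;
}
```

Sorting the resulting map by rate surfaces the questions that most deserve design attention.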

Breaking the problem down

We knew there wasn’t a single silver bullet that would deliver a solution to the business objective. We wanted to consider and test several well-documented UX principles that we felt, used in combination, could improve the performance of the submission and verification process. We broke the feedback down into individual goal statements that we could later measure against user stories and design solutions.

Goal statements:

  • How might we… minimise the pain of completing the question sets?

  • How might we… show users a path to completion?

  • How might we… humanise the process for customers and improve the dialogue with them?

  • How might we… create clear and consistent communication?

  • How might we… lead the conversation with customers through the form?

  • How might we… give suppliers a sense of progress and question scope within a question set?

  • How might we… help suppliers know what to expect when starting a question set?

  • How might we… show suppliers how to better meet a requirement’s criteria?

  • How might we… help suppliers understand system blockers that are preventing them from submitting?

  • How might we… surface that there are extra (private goal) criteria linked to a specific requirement?

Suggested design solutions

1. Set expectations before the start

For very long forms, it helps to prepare the person before they start. The hypothesis is that preparing the person before they begin the form, giving them as much context as possible, will inform them about the form’s scope and help them complete the process.

Our plan for more complex question sets was to design an introduction module that would be displayed to first-time users:

  • Including a short, friendly intro video to explain the context and inform users about specific criteria they should be aware of. We felt a video helped to humanise the process and give a human face to the organisation, while informing users without requiring them to read even more text. It could also introduce the new inline help tools used throughout the question sets.
  • Providing a checklist of documents they might need to collect. Offering this as a downloadable file allows customers to gather everything they need before they start, making the completion of the questions a smoother process.
  • Informing users of any additional system criteria or tasks (blockers) that might prevent them from submitting a form, so they can complete these tasks beforehand.



2. Improving individual question layout and formatting

Our plans for the format and structure of individual questions:

  • Displaying dynamic question numbers (“Question X of Y”) gives users a sense of orientation, progress and scope within a question set.

  • Moving the question text and user input components together. This maintains the context between the question being asked and the answer being entered.

  • Improving guidance notes: keeping language as concise as possible, using bullet points instead of paragraphs where possible, and translating jargon-filled questions and help text into easy-to-understand copy.

  • Using good help text to help users understand why they need to answer a question. This also encourages better answers on the first try, reducing the cost of additional communication after the form is submitted.
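The “Question X of Y” counter above can be derived directly from the question set rather than hard-coded. A minimal sketch, assuming a flat list of questions with string ids (the shape is illustrative, not the platform’s actual model):

```typescript
// Hypothetical sketch of a dynamic question counter.
interface Question {
  id: string;
  text: string;
}

function progressLabel(questions: Question[], currentId: string): string {
  const index = questions.findIndex((q) => q.id === currentId);
  if (index === -1) {
    throw new Error(`Unknown question: ${currentId}`);
  }
  // Question numbers are 1-based for display.
  return `Question ${index + 1} of ${questions.length}`;
}
```

Deriving the label from the data means the counter stays correct when questions are added, removed or reordered.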

3. Adding video and audio help tools

Individual questions with higher failure rates often carried precise acceptance criteria that the user needed to follow. We decided to test the hypothesis that short audio and video clips, accessible in context alongside these questions, could help.

Our hypothesis:

  • A brief video is a more immediate and consumable way of explaining to users what they need to do than providing even more lengthy guidance notes.

  • It also adds a friendly human face, which helps to humanise the process of getting help.

  • It also offers the opportunity to communicate other related criteria that a user might benefit from knowing.

4. Sorting questions into manageable chunks

Forms are like conversations and often have natural breaks that structure the dialogue between user and business. We considered options to sort question sets into shorter and more understandable sections.

Needing to consider the context

While deciding on this approach, we were aware that we first needed to consider the context of each question set individually, asking ourselves whether splitting up the questions would be beneficial. In some question sets, where users needed to see all questions in order to answer each one, we thought splitting would be inappropriate.

However, in other question sets, where questions ran across several different themes, we felt this was an option worth exploring. It potentially gave the UI opportunities to clearly map the user’s progress and to communicate the overall scope of the task. It also allows for small rewards, in a lengthy and painful process, when users complete smaller sections.
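The theme-based chunking described above can be sketched as a simple grouping step. This is a hypothetical illustration; the `theme` field is an assumption about how questions might be tagged.

```typescript
// Hypothetical sketch: grouping a flat question list into themed
// sections, assuming each question carries a `theme` tag.
interface ThemedQuestion {
  id: string;
  theme: string;
}

function chunkByTheme(questions: ThemedQuestion[]): Map<string, ThemedQuestion[]> {
  const sections = new Map<string, ThemedQuestion[]>();
  for (const q of questions) {
    const group = sections.get(q.theme) ?? [];
    group.push(q);
    sections.set(q.theme, group);
  }
  // Map preserves insertion order, so sections appear in the
  // order their themes first occur in the question set.
  return sections;
}
```

Each resulting section can then be rendered as its own page or step, giving users those smaller completion milestones.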

5. Using more explicit validation and feedback


Inline confirmation works best for questions with potentially high error rates or specific formatting requirements. We suggested an audit of existing feedback and validation messages.
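For fields with specific formatting requirements, inline validation can check the input as soon as it is provided rather than after submission. A minimal sketch for a document upload field; the accepted file types and size limit are illustrative assumptions, not the platform’s actual rules:

```typescript
// Hypothetical inline validation for an evidence upload field.
// Accepted types and the size limit are illustrative assumptions.
const ACCEPTED_TYPES = ["pdf", "docx"];
const MAX_SIZE_MB = 10;

function validateUpload(fileName: string, sizeMb: number): string[] {
  const errors: string[] = [];
  const extension = fileName.split(".").pop()?.toLowerCase() ?? "";
  if (!ACCEPTED_TYPES.includes(extension)) {
    errors.push(`File must be one of: ${ACCEPTED_TYPES.join(", ")}`);
  }
  if (sizeMb > MAX_SIZE_MB) {
    errors.push(`File must be smaller than ${MAX_SIZE_MB}MB`);
  }
  // An empty array means the upload passes validation.
  return errors;
}
```

Returning every failed rule at once, rather than stopping at the first, lets the user fix all problems in a single pass.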

Including better communication of system blockers for a submission

For some sets of questions, the system had submission blockers in place to prevent users from submitting question sets before other tasks in the platform had been completed.

We intended to communicate these blockers in the introduction module, but we felt they also needed to be displayed in a better way once a user was ready to make a submission: not just telling the user that there are tasks they need to complete, but also communicating why these extra tasks exist and why they are essential to the submission. It also gave us opportunities to route users to the appropriate place to resolve any blockers more quickly.


The design of online forms might seem an unexciting topic, but in the online world, forms are an essential part of almost every digital product.

It is crucial for organisations and UX designers to consider forms as a fundamental part of how users interact with their products. However, forms are often overlooked and deliver poor user experiences as a result. Online forms aren’t a fun experience for users, but getting the approach right can minimise users’ pain and improve business performance.

“Forms are integral to checkout, registration and data entry and are often the linchpin of successful Web applications.”

Luke Wroblewski (2008)

Some tips that I believe are important to consider when approaching the design of lengthy online forms:
  • Examine whether your form should be split across multiple pages or kept on one page. Considering the context is important.

  • Ask yourself whether the information you are asking is critical to the process the user is currently in. If not, can this data be captured at a different time or place in a user’s journey?
  • Think about how you communicate the scope of the form and user’s location within the process.

  • Consider appropriate labels and language for input fields (Descriptive/Concise) based on the context of your form.

  • Look to use smart defaults to help speed up users’ workflows.

  • Consider when to use help tools, and use them to explain to users why they need to provide specific data.

To read more about designing for online forms, go here.