Client
Department for Education
Role
  • Senior User Researcher within Service Design Delivery
About Department for Education
The Department for Education is a ministerial department responsible for children’s services and education, supported by 18 agencies and public bodies.

Challenge

The challenge was to digitize citizen-facing and staff-facing systems that were entirely offline, with a particular focus on improving internal systems for staff. The project aimed to streamline the complex, time-consuming process of transferring an academy from one academy trust to another, a process that involves coordinating the legal transfer of buildings, staff, assets, land, debt, and funding agreements held with the Education and Skills Funding Agency (ESFA).

With the increasing number of academy transfers, the inefficiencies of the manual process became apparent: it slowed down operations, lacked transparency, and created administrative burdens. The Department for Education (DfE) selected the Academy Transfers project as a priority for digitization, aiming to optimize these processes for DfE staff and Service Division teams. The goal was to ensure smoother and faster academy transfers, provide better support for trusts, and ultimately improve outcomes for children, while reducing wasted time and resources.

GDS: Unique Research Pattern

Approach to the Problem

Following the discovery research, several hypotheses were derived from the findings to inform the direction of the Academy Transfers project. For the Alpha phase, four key hypotheses were prioritized for testing, focusing on the most critical areas identified during the discovery phase. These hypotheses were essential in validating assumptions, ensuring that the digitization efforts effectively addressed the process inefficiencies, and guiding the next steps in the project’s development.

1. Initial Hypothesis

Six remote, in-depth interviews were conducted with users to explore the first hypothesis. These offered key insights into user behaviors, needs, and challenges, and helped validate the project's foundational assumptions.

2. Analysis

The project followed a two-week sprint structure, dedicating one week to user research and the following week to analysis and prototype preparation, enabling iterative testing and focused research.

3. Synthesis

During the synthesis phase, qualitative research methods were used to synthesize the data, identifying themes, user needs, and pain points. This informed the overall direction of the project and grounded it in validated user experiences.

4. Second Hypothesis

Building on the insights from the initial hypothesis, a second hypothesis was developed.

5. Developed Research Strategy

The research strategy was further refined, leading to team sketching and wireframing sessions.

6. Team Sketch & Wireframing Sessions

These sessions translated initial paper sketches into digital wireframes using Miro, allowing for more structured design iterations.

7. Remote, In-Depth Interviews

Another round of remote, in-depth interviews validated the second hypothesis, offering further insights into user interactions.

8. Team Analysis and Synthesis Session

Research sessions were followed by a team analysis and synthesis session. This collaborative effort ensured that the findings were accurately interpreted and integrated into the ongoing design process, continuously refining the project based on user feedback.

9. Prototype & Testing

The team was the first in the Academy Transfers project to prototype and test using the GOV.UK design kit, and was tasked with leading this critical effort.
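The GOV.UK Prototype Kit is built on Express and Nunjucks, which makes it quick to mock up journey pages in the GOV.UK style and put them in front of users. As a rough illustration of that working pattern only (not the project's actual code), the sketch below shows how one hypothetical question page in a transfer journey might be wired up; the route paths, field names, and question wording are all assumptions.

```typescript
// Hypothetical sketch: one question page of a GOV.UK-styled prototype.
// The GOV.UK Prototype Kit itself runs on Express with Nunjucks templates;
// plain Express is used here so the example is self-contained. Route paths,
// field names, and the question wording are invented for illustration.
import express, { Request, Response } from "express";

const app = express();
app.use(express.urlencoded({ extended: true })); // parse HTML form posts

// GET: render a single question page using GOV.UK Frontend class names.
app.get("/transfer/incoming-trust", (_req: Request, res: Response) => {
  res.send(`
    <form method="post" action="/transfer/incoming-trust">
      <div class="govuk-form-group">
        <label class="govuk-label govuk-label--l" for="incomingTrust">
          Which trust is the academy transferring to?
        </label>
        <input class="govuk-input" id="incomingTrust" name="incomingTrust" type="text">
      </div>
      <button class="govuk-button" type="submit">Continue</button>
    </form>
  `);
});

// POST: capture the answer and move the user to the next step in the journey.
app.post("/transfer/incoming-trust", (req: Request, res: Response) => {
  const incomingTrust = String(req.body.incomingTrust ?? "").trim();
  res.redirect(
    `/transfer/check-your-answers?trust=${encodeURIComponent(incomingTrust)}`
  );
});

app.listen(3000, () => console.log("Prototype sketch on http://localhost:3000"));
```

In prototypes like this the emphasis is on getting realistic-looking screens in front of users quickly; answers are typically held in session data rather than a real database, so journeys can be reshaped between research rounds with minimal effort.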

10. Political Situation

The implications of this initiative were significant: adopting the GOV.UK system could require terminating contracts with the existing IT contractor across all teams. Given the political sensitivity of this situation, the team proceeded with careful consideration.

11. Analysis

The transition to a new system prompted a request for user research, which the team led as a tech spike. This research was crucial for understanding the potential impacts and gathering insights to inform decision-making.

12. Synthesis

During the synthesis phase, the critical decision revolved around whether to continue with the current system or adopt the new one, with a focus on assessing the new system’s flexibility and suitability for the project’s needs.

13. Research on the Next Hypothesis

As the research progressed, the team explored the next hypothesis through collaborative Design Kit sessions, where wireframing and iterative design took place. These sessions included exploratory co-design activities and prototype usability testing with 10 users. Additionally, subject matter expert (SME) check-up sessions and interactive journey mapping were conducted to ensure all aspects of the project were thoroughly considered. Mapping data, documentation, and processes further supported the analysis.

14. GDS Assessment Alpha Peer Review

These efforts culminated in the GDS Alpha Peer Review, a critical presentation made to all departments to demonstrate the project’s progress and strategic direction. During this 1-2 hour session, the team presented its strategy and approach, seeking feedback from other departments to ensure it was on the right track and aligned with the broader goals of the organization.

Outcomes

One of the significant difficulties in this project was capturing and understanding the lingo and acronyms used by DfE staff and subject matter experts (SMEs). To keep up with their world effectively, I essentially had to become a subject matter expert myself.

Moving to Beta

  • Validated Technical Choices: Successfully validated tech options for SDD, ensuring a robust and user-friendly solution with the involvement of interaction designers.
  • Prototype Development: Developed an interactive prototype and APIs, enabling seamless data manipulation across different systems.
  • Optimized Technical Solution: Recommended keeping the existing system as the back end while enhancing the front end for a better user experience (see the sketch after this list).
  • User-Centric Dashboard: Identified preferred dashboard options through user research, aligning the design with user needs.
  • MVP Usability Testing: Conducted usability tests that informed the development of a flexible and user-friendly MVP.
  • Pre-Beta Usability Fixes: Identified and addressed key usability issues before the beta launch, ensuring a smoother experience.
  • Ongoing User Feedback: Implemented continuous research activities during the beta phase, integrating user feedback to refine the product.
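To illustrate the recommended shape of keeping the existing system as the back end while building a better front end on top, the hedged sketch below shows a small front-end adapter that reads a transfer record from a hypothetical legacy API and maps it into the view model a redesigned dashboard might render. The endpoint URL, field names, and status codes are invented for illustration and are not the real service's API.

```typescript
// Hypothetical sketch of the "existing back end, new front end" shape: the new
// front end talks to the legacy system only through a small adapter, so screens
// can be redesigned without changing the back end. All names are illustrative.
type LegacyTransferRecord = {
  transfer_id: string;
  outgoing_trust_name: string;
  incoming_trust_name: string;
  status_code: number; // e.g. 1 = proposed, 2 = in progress, 3 = completed
};

type TransferViewModel = {
  id: string;
  outgoingTrust: string;
  incomingTrust: string;
  status: "Proposed" | "In progress" | "Completed" | "Unknown";
};

const STATUS_LABELS: Record<number, TransferViewModel["status"]> = {
  1: "Proposed",
  2: "In progress",
  3: "Completed",
};

// Fetch a transfer from the (hypothetical) existing back-end API and map it
// into the shape the new dashboard front end renders.
export async function getTransfer(id: string): Promise<TransferViewModel> {
  const response = await fetch(`https://legacy-backend.example/api/transfers/${id}`);
  if (!response.ok) {
    throw new Error(`Back end returned ${response.status} for transfer ${id}`);
  }
  const record = (await response.json()) as LegacyTransferRecord;
  return {
    id: record.transfer_id,
    outgoingTrust: record.outgoing_trust_name,
    incomingTrust: record.incoming_trust_name,
    status: STATUS_LABELS[record.status_code] ?? "Unknown",
  };
}
```

An adapter of this kind keeps the dashboard work decoupled from the legacy data model, which is what made it practical to iterate on the front end with users while leaving the existing back end in place.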

Monthly Show & Tells

Weekly Cross-Departmental Presentations

Contact for detailed portfolio

+44(0) 7492222636

ipek.delia@gmail.com