UXPA Journal Project

A research-backed redesign strategy for the UXPA's Journal of UX

Timeline

3 Months

Role

UX Researcher

Tools

Screaming Frog
Optimal Workshop

Team

6 UX Researchers

Problem Statement

The UXPA Journal of UX, a valuable resource for UX professionals, faced significant challenges with its online platform. The website’s outdated design and poor information architecture hindered user experience, making it difficult for readers to find and access relevant articles and resources. Key issues included unclear content structure, broken links, outdated content, and limited search functionality. Our goal was to redesign the website to improve navigation, enhance usability, and align with the needs of its academic users.

Website Hierarchy Tree Graph

Research & Discovery

To understand the website’s shortcomings and user needs, we conducted a comprehensive research process:

Content Audit

A thorough inventory of the website's content was conducted using Screaming Frog to identify areas for improvement in organization and structure. We identified 18 broken links across 409 pages, along with outdated articles and missing metadata.

Content Inventory Sample
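The broken-link tally above can be reproduced from a crawl export. A minimal sketch, assuming the crawl is exported to CSV with `Address` and `Status Code` columns (Screaming Frog's actual export headers may differ):

```python
import csv
from collections import Counter

def audit_status_codes(csv_path):
    """Tally HTTP status codes from a crawl export and collect broken URLs.

    Assumes hypothetical column names "Address" and "Status Code";
    adjust to match the actual export.
    """
    counts = Counter()
    broken = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            code = int(row["Status Code"])
            counts[code] += 1
            if code >= 400:  # 4xx/5xx responses flagged as broken
                broken.append(row["Address"])
    return counts, broken
```

Running this over the full crawl gives the per-status breakdown and the list of broken links to prioritize for fixing.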

Card Sort

To understand users' mental models and preferences for content organization, we ran an unmoderated open card sort in which 10 UX students reorganized the site's content into categories that made sense to them. Key findings included a preference for broader categories among experienced users and more specific labels among beginners.

Tree Test

We performed a tree test with 10 participants to evaluate navigation paths.

Task 1:

  1. Prompt: “Suppose you are completing a scholarly research piece, find research about inclusive and universal design.”
  2. Anticipated Path: Home → Articles → Articles by Topic → Accessibility.
  3. Goal: Test how easily users could locate specific research topics

Task 2:

  1. Prompt: “Suppose you are a professor who has written a research paper. Where would you look to find guidelines about publishing your research?”
  2. Anticipated Path: Home → Call for Papers → Criteria for Submitting a Paper.
  3. Goal: Evaluate the clarity of navigation for publishing-related content.

In the first round of testing (Test 1), Task 1 had a 100% first-click accuracy. This showed that users found the task straightforward and easy to complete. However, Task 2 had only a 44% first-click accuracy. Many users struggled to find the “Call for Papers” section. Feedback from participants revealed that the placement of “Policies” as a level 2 category caused confusion.

Task Specifics for Test #1
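The two metrics reported above can be scored directly from recorded click paths. A minimal sketch with hypothetical participant data (not the actual study results), using Task 2's anticipated path:

```python
# Anticipated path for Task 2, from the test plan above.
ANTICIPATED = ["Home", "Call for Papers", "Criteria for Submitting a Paper"]

# Illustrative click paths for three hypothetical participants.
paths = [
    ["Home", "Call for Papers", "Criteria for Submitting a Paper"],
    ["Home", "Policies", "Call for Papers", "Criteria for Submitting a Paper"],
    ["Home", "About", "Policies"],
]

def first_click_correct(path):
    # First-click accuracy: did the first move after Home match the
    # anticipated path?
    return len(path) > 1 and path[1] == ANTICIPATED[1]

def direct_success(path):
    # Direct success: the exact anticipated path, with no detours.
    return path == ANTICIPATED

first_click_rate = sum(first_click_correct(p) for p in paths) / len(paths)
direct_rate = sum(direct_success(p) for p in paths) / len(paths)
print(f"First-click accuracy: {first_click_rate:.0%}")
print(f"Direct success:       {direct_rate:.0%}")
```

Tools like Optimal Workshop report both figures; the distinction matters because a participant can eventually reach the target (indirect success) after a wrong first click.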

Design Process

Restructuring the Information Architecture

We started by reorganizing the website's content structure. Using data from the card sort and tree test results, we grouped content into broader, more intuitive categories like “Articles,” “Call for Papers,” and “About.” This helped align the website with how users naturally think about the content. For example, we moved the “Policies” section under “Call for Papers” after noticing it shared similarities with the latter, which had caused confusion in the first round of testing. This change alone improved task success rates by 20% in the second round.

Improving Labels and Navigation

We updated category labels to make them clearer and more user-friendly. For instance, we changed “peer-reviewed paper” to “research paper” to better match the language users were familiar with. We also added breadcrumb navigation to help users keep track of where they were on the site. Additionally, we created a “Most Downloaded Articles” section and placed it as a third-level category to make it easier for users to find popular content.

Suggested Design Changes

To help users find content more easily, we introduced a faceted search system. This allowed users to filter articles by date, author, topic, and popularity. We also added a “Save Search” feature so users could revisit their previous searches without starting over. These changes were especially helpful for academic users who needed to locate specific research quickly.
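The faceted search described above treats each facet as an independent filter that narrows the result set. A minimal sketch with a hypothetical article schema (field names and sample records are illustrative, not JUX's actual data model):

```python
from datetime import date

# Hypothetical article records for illustration.
articles = [
    {"title": "Inclusive Design Patterns", "author": "A. Rivera",
     "topic": "Accessibility", "published": date(2023, 5, 1), "downloads": 1200},
    {"title": "Card Sorting at Scale", "author": "B. Chen",
     "topic": "Research Methods", "published": date(2021, 2, 14), "downloads": 800},
]

def facet_search(items, topic=None, author=None, since=None,
                 sort_by_popularity=False):
    """Apply each selected facet as an independent filter, then sort.

    Facets left as None are ignored, so users can combine any subset.
    """
    results = [a for a in items
               if (topic is None or a["topic"] == topic)
               and (author is None or a["author"] == author)
               and (since is None or a["published"] >= since)]
    if sort_by_popularity:
        results.sort(key=lambda a: a["downloads"], reverse=True)
    return results

hits = facet_search(articles, topic="Accessibility", since=date(2022, 1, 1))
print([a["title"] for a in hits])  # ['Inclusive Design Patterns']
```

A “Save Search” feature would then only need to persist the chosen facet values, replaying them through the same filter on the user's next visit.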

To increase user engagement, we added a subscription option so users could receive notifications about new publications and updates. We also included a bookmarking feature, which allowed users to save articles for later reading. This feature was highly requested during our research phase and addressed a key pain point for users who wanted to save content for future reference.

Results

Second Test

The results from our testing showed that our design decisions had a positive impact on the user experience. Task 2's direct success rate rose from 44% in Test 1 to 90% in Test 2, an improvement of 46 percentage points, and its first-click accuracy improved by the same margin. Task 1 maintained a 100% success rate across both tests, showing that its navigation path was consistently clear. Participants found the tasks easier to complete in the second round, and many commented on the improved clarity of the navigation paths. Median completion times for Task 2 also decreased, showing that users could navigate the site more efficiently. These results confirmed that our changes to the information architecture and labeling made the website more intuitive and user-friendly.

Task Specifics for Test #2

Final Report

Design Strategy for JUX

The suggested redesign for the JUX website features a more intuitive structure, improved labeling, and faceted search, making it easier for academic users to find relevant content. Suggested utility features like subscription options and bookmarking aim to boost engagement and retention. While limitations in metadata optimization and multi-device bookmarking present opportunities for future improvements, our recommendations address key pain points and enhance the site’s functionality. These changes would position JUX as a more user-centric platform, better serving the UX research community and hopefully fostering scholarly collaboration.

Retrospective

Reflection

In this project, we successfully identified key usability issues on the JUX website, conducted user research through card sorting and tree testing, and proposed a redesign that improves navigation, search functionality, and the overall user experience. However, there are always alternative approaches that could have been explored to enhance the design further.

What we could have done better

  • With more resources, we would have liked to build a high-fidelity version in Figma and run A/B testing against the current site, allowing us to collect user data and say definitively which version performs better. Overall, though, I think we did an excellent job given the required deliverables.

Feel free to reach out if you’d like to chat about this project or explore my other work!