The UXPA Journal of UX, a valuable resource for UX professionals, faced significant challenges with its online platform. The website’s outdated design and poor information architecture hindered the user experience, making it difficult for readers to find and access relevant articles and resources. Key issues included an unclear content structure, broken links, outdated content, and limited search functionality. Our goal was to redesign the website to improve navigation, enhance usability, and align with the needs of its academic users.
To understand the website’s shortcomings and user needs, we conducted a comprehensive research process:
A thorough inventory of the website's content was conducted using Screaming Frog to identify problems in organization and structure. The audit identified 18 broken links across 409 pages, along with outdated articles and missing metadata.
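The audit itself was done in Screaming Frog, but the core broken-link check is easy to reproduce in a few lines. A minimal sketch, assuming the crawler has already produced a URL-to-status mapping (the `find_broken_links` helper and the sample URLs below are illustrative, not the actual crawl data):

```python
def find_broken_links(status_by_url):
    """Return URLs whose HTTP status indicates a broken link (4xx/5xx)."""
    return sorted(url for url, status in status_by_url.items() if status >= 400)

# Illustrative sample: in practice the statuses come from a crawler
# such as Screaming Frog or a fetch loop over the site's pages.
sample = {
    "/articles/ux-methods": 200,
    "/articles/old-study": 404,
    "/about/archived-policy": 410,
    "/call-for-papers": 200,
}
print(find_broken_links(sample))  # → ['/about/archived-policy', '/articles/old-study']
```

On a real audit, the same filter applied to all 409 crawled pages would surface the 18 broken links in one pass.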
To understand users' mental models and preferences for content organization, we conducted card sorting with a sample of UX students: an unmoderated open card sort in which 10 participants grouped content into categories that made sense to them. Key findings included a need for broader categories for experienced users and more specific labels for beginners.
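A common way to analyze open-sort data like this is a pairwise co-occurrence matrix: for each pair of cards, count how many participants placed both in the same group, then use the high-agreement pairs to propose categories. A minimal sketch (the card labels and sort data are hypothetical, not our participants' actual sorts):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(sorts):
    """Count, for each unordered card pair, how many participants
    placed both cards in the same group."""
    counts = Counter()
    for groups in sorts:            # one participant's sort = list of groups
        for group in groups:        # each group is a list of card labels
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] += 1
    return counts

# Two hypothetical participants sorting four cards
sorts = [
    [["Archives", "Issues"], ["Policies", "Submit"]],
    [["Archives", "Issues", "Policies"], ["Submit"]],
]
pairs = cooccurrence(sorts)
print(pairs[("Archives", "Issues")])  # → 2 (grouped together by both participants)
```

Pairs that co-occur for most participants are strong candidates to live under the same top-level category.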
Performed a tree test with 10 participants to evaluate navigation paths across two tasks (Task 1 and Task 2).
In the first round of testing (Test 1), Task 1 had 100% first-click accuracy, showing that users found the task straightforward and easy to complete. Task 2, however, had only 44% first-click accuracy: many users struggled to find the “Call for Papers” section, and participant feedback revealed that the placement of “Policies” as a level-2 category caused confusion.
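First-click accuracy is simply the share of participants whose first selection in the tree matched the correct branch. A minimal sketch of the calculation, with hypothetical click logs (these are not the study's actual records):

```python
def first_click_accuracy(first_clicks, correct):
    """Fraction of participants whose first click hit the correct node."""
    if not first_clicks:
        return 0.0
    hits = sum(1 for click in first_clicks if click == correct)
    return hits / len(first_clicks)

# Hypothetical Task 2 logs: 4 of 9 recorded first clicks hit the right branch
clicks = ["Policies", "Call for Papers", "Policies", "About",
          "Call for Papers", "Policies", "Call for Papers",
          "Policies", "Call for Papers"]
print(round(first_click_accuracy(clicks, "Call for Papers"), 2))  # → 0.44
```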
We started by reorganizing the website’s content structure. Using the card sort and tree test results, we grouped content into broader, more intuitive categories such as “Articles,” “Call for Papers,” and “About,” aligning the website with how users naturally think about the content. For example, we moved the “Policies” section under “Call for Papers” after noticing that it shared similarities with the latter, which had caused confusion in the first round of testing. This change alone improved task success rates by 20% in the second round.
We updated category labels to make them clearer and more user-friendly. For instance, we changed “peer-reviewed paper” to “research paper” to better match the language users were familiar with. We also added breadcrumb navigation to help users keep track of where they were on the site. Additionally, we created a “Most Downloaded Articles” section and placed it as a third-level category to make it easier for users to find popular content.
To help users find content more easily, we introduced a faceted search system. This allowed users to filter articles by date, author, topic, and popularity. We also added a “Save Search” feature so users could revisit their previous searches without starting over. These changes were especially helpful for academic users who needed to locate specific research quickly.
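At its core, faceted search applies each selected filter in turn, narrowing the result set. A minimal sketch of that behavior (the article records and facet names below are illustrative, not the production schema):

```python
def faceted_filter(articles, **facets):
    """Return articles matching every provided facet (exact match)."""
    results = articles
    for field, value in facets.items():
        results = [a for a in results if a.get(field) == value]
    return results

articles = [
    {"title": "Eye Tracking in Practice", "topic": "methods", "year": 2021},
    {"title": "Survey Design Pitfalls",   "topic": "methods", "year": 2019},
    {"title": "Accessibility Heuristics", "topic": "a11y",    "year": 2021},
]
hits = faceted_filter(articles, topic="methods", year=2021)
print([a["title"] for a in hits])  # → ['Eye Tracking in Practice']
```

A “Save Search” feature then only needs to persist the facet dictionary (e.g. `{"topic": "methods", "year": 2021}`) so the same filter can be replayed later.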
To increase user engagement, we added a subscription option so users could receive notifications about new publications and updates. We also included a bookmarking feature, which allowed users to save articles for later reading. This feature was highly requested during our research phase and addressed a key pain point for users who wanted to save content for future reference.
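Conceptually, the bookmarking feature amounts to a per-user set of saved article IDs. A minimal in-memory sketch of that model (the `BookmarkStore` class and its method names are hypothetical; a real site would persist this server-side so bookmarks survive across sessions and devices):

```python
from collections import defaultdict

class BookmarkStore:
    """In-memory per-user bookmarks; illustrative only."""
    def __init__(self):
        self._saved = defaultdict(set)

    def add(self, user_id, article_id):
        self._saved[user_id].add(article_id)

    def remove(self, user_id, article_id):
        self._saved[user_id].discard(article_id)

    def list(self, user_id):
        return sorted(self._saved[user_id])

store = BookmarkStore()
store.add("reader1", "jux-2021-eye-tracking")
store.add("reader1", "jux-2019-survey-design")
store.remove("reader1", "jux-2019-survey-design")
print(store.list("reader1"))  # → ['jux-2021-eye-tracking']
```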
The results from our testing showed that our design decisions had a positive impact on the user experience. Task 2’s direct success rate rose 46 percentage points, from 44% in Test 1 to 90% in Test 2, and its first-click accuracy improved by the same margin. Task 1 maintained a 100% success rate across both tests, confirming that its structure was consistently clear and easy to navigate. Participants found the tasks easier to complete in the second round, and many commented on the improved clarity of the navigation paths. Median completion times for Task 2 also decreased, showing that users could navigate the site more efficiently. Together, these results confirmed that our changes to the information architecture and labeling made the website more intuitive and user-friendly.
The suggested redesign for the JUX website features a more intuitive structure, improved labeling, and faceted search, making it easier for academic users to find relevant content. Suggested utility features like subscription options and bookmarking aim to boost engagement and retention. While limitations in metadata optimization and multi-device bookmarking present opportunities for future improvements, our recommendations address key pain points and enhance the site’s functionality. These changes would position JUX as a more user-centric platform, better serving the UX research community and hopefully fostering scholarly collaboration.
In this project, we successfully identified key usability issues on the JUX website, conducted user research through card sorting and tree testing, and implemented a redesign that improves navigation, search functionality, and the overall user experience. However, there are always alternative approaches that could have been explored to enhance the design further.