Internal SaaS tooling
An Award-Winning Unified Platform for Psychometric Insights
Built for Consistency & Scale
A platform bringing together assessment creation and reporting, while introducing a more rigorous and scalable approach to how data is captured and interpreted.
Kirschtein Creative's Involvement
Lead Designer
Product Mentor
Brand Owner

A fragmented set of legacy portals, reports, and standalone tools was replaced with a unified assessment and insights platform.
Key outcomes included:
A coherent brand and interface system applied across web, mobile and assessment surfaces.
Restructured data‑gathering flows that reduced friction for candidates while improving data quality for practitioners.
Clearer mapping from raw responses to psychometric constructs, improving interpretability and confidence in results.
A scalable design system to support new assessment types and products over time.
Design team size
1
Establishing a baseline
The work began with a full audit of existing tools, assessment journeys, and brand touchpoints: marketing sites, candidate flows, practitioner consoles, and reporting outputs. This exposed duplicated patterns, inconsistent interaction models, and multiple interpretations of the same underlying constructs across products.
By mapping these journeys end‑to‑end, we established a baseline of what needed to remain (regulatory and psychometric constraints) versus what could change (flows, UI, and language). This diagnostic work grounded later design decisions in evidence, not preference.
Competitor and category analysis
A focused competitor review examined both psychometric platforms and adjacent B2B SaaS tools in HR, talent and team benchmarking. The goal was not to imitate, but to identify patterns users already understood, opportunities to differentiate visually, and gaps in how complex assessments and insights were typically communicated.
This analysis validated the direction toward a cleaner, more minimal visual language with a stronger emphasis on interpretability over raw data density.
Building a brand-led product vision
Brand and product strategy were brought together into a single vision: Thomas as a trusted, science-backed partner that makes complex psychometric insight feel approachable and actionable. Key principles from senior management centred on clarity, fairness, and confidence for both candidates and decision-makers.
Product branding and UI patterns
The new platform UI applied the evolved brand system across navigation, core components and data visualisation. Assessment tiles, result cards, and dashboards were designed mobile-first as part of a cohesive visual framework that could accommodate different instruments without fragmenting the experience.
Three interoperable design systems were created, one per platform, to accommodate each platform's distinct needs while preserving visual consistency. This reduced the perceived complexity of the product suite and made it easier to introduce new capabilities under a recognisable visual and interaction framework.

Design system
Across every project, design patterns, components, and tokens are standardised into a reusable design system so teams can ship faster with less risk and higher consistency. Explore the principles, workflows, and tooling used to build industry-grade systems.
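As a small illustrative sketch of what a token layer can look like (the names and values here are hypothetical, not the platform's actual tokens), components consume a single source of truth rather than hard-coded styles:

```typescript
// Hypothetical design tokens: one source of truth for colour,
// spacing, and radius values shared across every surface.
const tokens = {
  color: {
    surface: "#ffffff",
    textMuted: "#6b7280",
  },
  spacing: { xs: 4, sm: 8, md: 16, lg: 24 }, // px scale
  radius: { card: 8, pill: 999 },
} as const;

// Components reference tokens rather than raw values, so a token
// change propagates consistently across the whole suite.
function cardStyle() {
  return {
    background: tokens.color.surface,
    borderRadius: tokens.radius.card,
    padding: tokens.spacing.md,
  };
}
```

The benefit is less the values themselves than the indirection: changing `spacing.md` once restyles every component that uses it.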
Restructuring data‑gathering methodologies
A core focus of the project was reshaping how data is gathered from candidates and practitioners. Existing assessment flows were reviewed alongside psychometric and regulatory requirements to identify unnecessary friction and inconsistent metadata collection.
UX overhaul and information architecture
The platform’s information architecture was reshaped around real user goals:
Creating an assessment
Inviting participants
Monitoring progress
Interpreting results
Legacy navigation based on internal product names was refactored into task-oriented groupings, making it easier for new users to understand where to start and what to do next.
Interaction patterns were standardised across flows: consistent form handling, progress indicators, status states, and error messaging. This reduced cognitive load and created a predictable rhythm across different tools in the suite.
Wireframing and iterative validation
Low‑fidelity wireframes were used to explore alternative IA structures and reporting layouts, then validated with internal stakeholders and selected customers. Feedback cycles focused on task clarity, interpretability of results, and the balance between detail and digestibility.
These iterations de‑risked major changes to the platform experience and ensured that new flows would be usable by both experienced practitioners and first-time administrators.
Component alignment
Components were designed to be semantically flexible enough to support multiple assessment types while remaining visually consistent.
This system became the contract between product, design, and engineering, reducing one‑off implementations and enabling faster delivery of new features and experiments across the platform.
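As a sketch of what such a contract can look like in practice (all names here are hypothetical, invented for illustration), a single result-card spec can serve multiple instruments while keeping interpretation consistent:

```typescript
// Hypothetical component contract: one ResultCard spec shared by
// product, design, and engineering, flexible enough for any instrument.
type ScoreBand = "low" | "mid" | "high";

interface ResultCardProps {
  instrument: string;   // e.g. "aptitude", "behaviour" (illustrative)
  construct: string;    // the psychometric construct being displayed
  score: number;        // normalised 0-100
  band: ScoreBand;      // interpreted range shown alongside the number
  guidance?: string;    // optional plain-language interpretation
}

// Banding logic lives in one place, so the same score is never
// interpreted differently on two screens.
function bandFor(score: number): ScoreBand {
  if (score < 33) return "low";
  if (score < 67) return "mid";
  return "high";
}

const example: ResultCardProps = {
  instrument: "behaviour",
  construct: "dominance",
  score: 72,
  band: bandFor(72),
};
```

A typed contract like this is what lets new assessment types reuse existing components instead of spawning one-off implementations.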
Data informed design decisions
Usage analytics, completion rates, and support queries were integrated into the design process to validate hypotheses and prioritise enhancements. Patterns such as frequent navigation backtracking, abandoned assessments or misinterpreted score ranges were used to drive specific UI and content changes.
Over time, this created a feedback loop where the platform’s own data informed refinements to how psychometric information was introduced, explained and visualised to different user groups.
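A minimal sketch of that kind of analytics check (step names and counts here are invented, not real platform data) might flag where candidates drop out of a flow:

```typescript
// Hypothetical funnel check: given per-step completion counts from
// usage analytics, find the step with the largest drop-off so design
// effort can be prioritised there.
interface FunnelStep {
  name: string;
  completed: number; // candidates who finished this step
}

function worstDropOff(steps: FunnelStep[]): { name: string; drop: number } {
  let worst = { name: "", drop: 0 };
  for (let i = 1; i < steps.length; i++) {
    const drop = steps[i - 1].completed - steps[i].completed;
    if (drop > worst.drop) worst = { name: steps[i].name, drop };
  }
  return worst;
}

// Invented sample data: of 120 candidates who started, most are
// lost at the "background questions" step.
const funnel: FunnelStep[] = [
  { name: "welcome", completed: 120 },
  { name: "background questions", completed: 84 },
  { name: "assessment items", completed: 80 },
  { name: "submit", completed: 78 },
];
```

Here `worstDropOff(funnel)` points at "background questions" (36 candidates lost), the kind of signal that would prompt a targeted UI or content change rather than a broad redesign.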
B2B application
The redesign focused on making it easier for HR, talent and leadership teams to configure assessments and manage and interpret results in one coherent environment. Platform navigation was restructured around core workflows such as creating assessments, inviting cohorts, monitoring completion and comparing candidates or teams over time.
B2C application
On the candidate side, assessment journeys were reworked to prioritise clarity, fairness, and psychological safety. Language, pacing and visual hierarchy were refined to reduce anxiety and make expectations transparent from the first screen to completion. Mobile-friendly flows, accessible patterns and clearer progress indicators helped candidates stay engaged and complete assessments with fewer errors or drop-offs.