Transforming the DataRobot AI platform user experience
I lead the product design team at DataRobot, a Boston-based software company that has built a leading predictive and generative AI platform used by data scientists, software developers and ML engineers working across multiple industries to build, manage, and govern AI models.
Shortly after I joined in April 2022, we kicked off an ambitious and comprehensive redesign of the entire platform. I thought I would share some of the process here for others who may be interested or attempting to do something similar.
For context, there are approximately 800 people in DataRobot and we are a relatively small design team of eight product designers and one UX researcher, all of whom were involved in this project. There were many more involved across different domains (product management, engineering, and technical content) and while the project truly was a tightly integrated cross-domain collaboration, this case study focuses primarily on the role of the design team.
Understanding our starting point
First, it’s important to understand where we are as a company. DataRobot has been around for about a decade and has gone through several iterations in this time. We pioneered the automated machine learning (AutoML) industry, making the process of exploring data and building ML models accessible to new users. This established our strength in model selection, assessment, and comparison. The company has since evolved to support both low-code and code-first workflows across the full ML lifecycle — from building to governing and operating models in production.
By the time I joined in 2022, the product experience was under strain as a result of additional features that were bolted on over time, along with new acquisitions that were difficult to integrate. This was significantly impacting the user experience. We heard from our own team and customers that users were finding it harder to solve their problems with DataRobot. Even though there were plenty of capabilities, they found it hard to find and then use features. Examples included multiple levels of navigation and a lack of hierarchy among visible actions, meaning users had to hunt for the right action to progress.
The organization underwent significant changes, including a leadership transition in July 2022 and subsequent layoffs that hit our team hard. Despite the disruption, these changes (some of which predated the new leadership) brought a fresh perspective on product development and the role of design in the organization. In the early days of DataRobot, design reported into engineering, and a top-down approach to product development meant that design acted more like a feature factory. With new leadership members from established companies like Google and Microsoft, design as a capability shifted into a more strategic position, reporting into the product org. This marked a departure from merely executing features to actively shaping strategy and user experience.
In addition, our sales model was evolving to encompass a product-led growth SaaS model alongside our traditional sales-led model. It was clear that significant changes were required if we were to enable our customers to realize the value of the platform on their own and achieve our growth ambitions as a company.
The scope of this work covered the entire application user experience, meaning a new design system and front-end code. We also used this as an opportunity to rebuild some parts of the back-end, but much of this has remained intact. In addition, this work included recategorizing the platform from a single entity into three distinct products with a lot of new functionality, such as GenAI tools. This allowed us to better target the personas that interact with our platform at different parts of the life cycle, creating a simpler go-to-market value proposition and user experience overall.
Identifying the key challenges
To tackle this undertaking, we started by aligning on the key problems to be solved. To do this, we gathered input from internal teams and our users. We are fortunate in that we have an internal ‘AI Experts’ team, a white glove service for new and existing customers whose role is to help teams get started with our platform, providing an invaluable source of feedback.
Just before I joined, our design team kicked off some preliminary user research, conducting 1:1 interviews with AI Experts and existing users to understand their needs, pain-points, and goals with the product. We used the jobs-to-be-done framework to capture this feedback, which provided a useful and necessary foundation to build from.
Defining the system
Next, the R&D team (product management, product design, and engineering) needed to align around how to define the existing features, and how they relate to each other. DataRobot is a big product with a lot of breadth and complexity. We quickly realized that very few people in the organization actually understood how the entire system worked together — there was lots of deep knowledge in different parts that we needed to unify. We needed to first understand the system in order to know how to fix it. We did this by conducting several collaborative workshops with cross-functional stakeholders, mapping the nouns, verbs and defining user flows.
What quickly became clear was that there were different mental models for how the system worked. We needed to unify these if we were to build a cohesive experience that would make sense to our users. In addition, it was also clear that we were not aligned on how nouns and verbs were used across the platform. In some cases, it was simply inconsistent use, but in others, we had a lot of terminology that was understood implicitly or used informally, yet not actually captured in the interface or product documentation. By getting the right people together and collaboratively drawing simple, descriptive information flows, we were able to form a consensus to build upon.
Articulating project goals
Once we identified the major nouns and verbs, we started building out the experience in more detail — capturing how different internal teams overlapped with the information architecture. Bit by bit, the resolution increased and we started to explore a future state that tackled the key challenges with the experience. We summarized these challenges as the following goals for the project:
Help users to get started on their own
To enable a self-serve approach, optimize for new users while progressively revealing complexity to experienced users over time.
This included creating a new onboarding experience to educate new users and enabling people to use the system with default settings.
Simplify UX & navigation
Re-categorize product features into more intuitive groupings — new features had previously been bolted on with little consideration for the overall user journey.
This exploration resulted in three new top-level product areas: Workbench (building & experimentation), Registry (asset management & governance), and Console (managing and monitoring production models).
Make it clear what they need to do next
Make it clear to users what they should or can do next at each point in their journey.
This meant re-thinking the entire information architecture and navigation system, with clear wayfinding and CTAs throughout the experience.
Ensure universal accessibility
Aim to meet web accessibility standards, ensuring our platform is user-friendly and inclusive for individuals with diverse abilities.
In previous versions of the platform, accessibility was not a priority and we fell short in many areas. The new design system tackled this at a foundational level and we are now WCAG 2.2 AA compliant throughout.
Crafting our design strategy
Once we had a clear understanding of the problems to be addressed, the PM team started creating 6-pagers and PR/FAQs that described the vision for the redesigned platform (we use the working backwards methodology pioneered by Amazon). This method helped everyone on the team, including design, to assess and interrogate the product’s viability, guiding development, and maintaining a customer-centric focus.
In parallel with contributing to these documents, our design team started exploring new interaction and IA concepts. There is a lot of technical complexity and nuance to the workflows in our product and we operate in a rapidly evolving market. This meant working closely with the PM and engineering teams in particular. It was a highly collaborative process, with different disciplines sketching solutions, working back and forth with the common goal of finding the most robust and elegant flows possible.
This early exploration included a range of typical UX methods, from sketching user journeys in FigJam to wireframes and rough prototypes in Figma. This was not a linear journey — the complexity of the UX challenge necessitated a flexible approach that was highly exploratory. We iterated quickly, using synchronous (working meetings) and asynchronous (commenting) feedback to get input. In a relatively short period of time, our conviction increased and the fidelity of our thinking sharpened.
While static wireframes were useful for the team building them to iterate, it was hard for others to grasp the concepts and understand the nuance of the intended experience. We found low-fidelity yet highly interactive prototypes, like the example below, to be an effective way to convey design intent. Many of the ideas from these early concepts are visible in the production version that’s in use today.
DataRobot users interact with the platform in diverse ways: technical analysts often prefer the GUI for its ease in managing ML workflows, while data scientists may opt for coding directly in Notebooks using DataRobot APIs. Yet, this distinction isn’t rigid, as AI professionals and novices alike toggle between GUI and code based on their tasks. This hybrid approach, blending code and GUI capabilities, sets us apart and informs our ongoing development strategy. We’ve developed user personas to capture these varied interactions, continuously refining them to keep pace with market evolution.
Translating the vision into reality
After a few months of iteration, testing, and validation, we were ready to share the vision internally with the wider company. At this stage, we had worked through several versions and arrived at a concept that was ambitious and expansive, yet grounded in what was possible. This was a concept car, however, not a production model. We were aware that the engineering effort to deliver it would be significant; the new direction was positively received, but the work of making it a reality was only starting.
This phase, underpinned by tools like Dovetail for interviews and Maze for surveys and self-directed walk-throughs, sought extensive feedback to refine and adjust our approach. We saw the impact that the redesign was having: users were completing workflows such as experiment setup 3x faster, and the qualitative feedback was broadly positive, such as this example from an external tester:
“This was intuitive. I like how you laid it out, with the data and then my training and then my deployment, that’s the natural phase. So like, everything felt super, super intuitive. It felt like exactly like where my mind is.”
— Data scientist in user testing
In making such large changes in one go, we of course didn’t get everything right the first time. We sought feedback from a wide range of sources, both internally and externally, in order to not just address the concerns of current users, but to ensure we are building a system that will appeal to new users who may not know DataRobot or the legacy UI.
External launch and refinement
The first features of the redesigned platform launched last year, starting with Workbench, and we continue to release new features and migrate functionality today. We are taking a measured approach, building the new experience inside the old one. While this poses some obvious challenges, it allows us to address issues incrementally while minimizing friction as users transition to the new experience.
The redesigned platform comprises three parts:
- Workbench is where users explore their data and build predictive and generative AI models that solve business problems.
- Registry provides governance and oversight at scale. Users can register and manage models, approve workflows, and manage audit and compliance requirements.
- Console is where MLOps users manage models in production. This means monitoring and visualizing model performance — providing full control and confidence for organizations to operate AI at scale.
While we continue to evolve the platform, we remain committed to tracking user engagement and impact through tools like Amplitude, deepening our understanding of what is driving impact and ensuring our design choices resonate with our diverse and expanding user base.
Something I have skipped in the overview above is the fact that mid-way through this redesign, we pivoted (like many companies over the last year) to integrate new GenAI functionality. While our platform has traditionally focused on predictive AI, the new information architecture and design system meant that we were able to move quickly and add in new GenAI features with minimal disruption. An example of this is the Playgrounds experience in Workbench, which is used to evaluate and compare different LLMs. The structure and interactive patterns are quite similar to the Experiments page and meant we were able to launch this quickly.
In addition to migrating a lot of existing functionality, this redesign has run in parallel with the development of many additional new features that were released over the last 18 months. This reflects a continuation of our innovation journey, establishing DataRobot as leaders in value-driven AI. Some examples include:
- Guided flow onboarding experiences.
- Notebooks and codespaces for code-first users.
- GenAI playgrounds and vector databases.
- Data preparation tools to bridge the gap between data ingest and modeling.
- Applications that make it easy for users to share their work with stakeholders.
There is so much more that hasn’t been mentioned above — making our design system accessible, collaborating with Azure OpenAI to integrate AI assistants and our UX research story are all worthy of articles of their own. There are new features in progress that we can’t show yet but are very excited about, and will no doubt have a similarly transformative impact on how people build, govern and operate AI models.
Reflections & what’s next
Over the past 18 months, our team has undertaken an extensive platform redesign, learning and adapting through every challenge. This journey was a transformative experience for our team, helping us refine our approach, enhance collaboration, and ensure our platform remains at the forefront of AI innovation.
We’ve seen firsthand the importance of clear communication, especially when introducing new concepts and overcoming skepticism. Our process wasn’t perfect; we learned the hard way that involving key teams early on is crucial for alignment and momentum. This experience has taught us the importance of external research and that when leveraged with our collective expertise, we can move swiftly and effectively.
The redesign underscored the essential role of cross-functional collaboration in our product development process. As we integrated feedback and iterated, we solidified the foundation for our evolving product strategy, focusing on delivering a user experience that empowers and engages. There were some missteps, such as under-communicating with our field teams and engaging our technical content team late, which emphasized the complexity and collaborative needs of the project. We are also seeing the impact of this work, however. One example is the US healthcare provider Baptist Health, which is using our platform to integrate generative and predictive workflows that record, synthesize, and submit patient-doctor appointments into medical records — reducing admin so practitioners can spend more time with patients.
We chose DataRobot to be part of our AI journey because it just works…. As a major healthcare provider, our goal is to provide quality care, we’re not interested in managing infrastructure or point solutions. We want to focus our energy on the people — both patients and employees. Working with DataRobot and leveraging AI appropriately allows us to maintain that critical focus.
— Rosalia Tungaraza, Baptist Health
We are not done yet and as we look ahead, we remain dedicated to innovation, continuously improving and adapting to meet the needs of our users. This journey has been a testament to the resilience and adaptability of our team, especially our colleagues in Ukraine; we have many designers and engineers based there who were pivotal in ensuring we met our goals. Their commitment has been nothing short of inspiring.
It’s an exciting time to be building AI tools and we are looking forward to sharing more of what we are working on, especially new GenAI features. We are expanding our team; if you’re interested, check out the openings on our design team or connect with me on LinkedIn. Either way, I would love to hear how others are navigating similar journeys in their organizations.
John Moriarty leads the design team at DataRobot, an enterprise AI platform that helps data scientists, ML engineers, and software developers to build, deploy, and manage predictive and generative AI models. Before this, he worked at Accenture, HMH, and Design Partners.