The current Dell experience is fragmented. The unified search experience was part of a larger unified customer experience initiative aimed at bringing Dell's different acquisitions and domains together through a consistent visual language, shared design patterns, and a shared understanding of customer information and intent, so that our website could present itself as a single entity: the way our customers imagine it, not the fragmented, siloed way Dell sees its own parts.
Each of the four domains was created by a different business unit within Dell, built on a different platform, and designed for a specific set of customers with a singular need.
To understand the problem we had to look at search from several different perspectives. To map the search landscape, we talked not only to external customers but also to internal teams that had search as a touchpoint.
We started by dividing our customer segments (shop, support, enterprise, premier) into the stages they move through to complete their purpose, to understand how they approached search, what they were looking for, and how they used it.
For each customer segment, we used a variety of research methods, such as research through design, moderated user interviews, focus groups, and surveys. Each method gave us a deeper understanding of what people say, what they do, and how they feel about search as a function.
After gathering all the data points from the conversations and tests we ran, we conducted affinity mapping to extensively map out customer behavior, thought processes, emotions, and expectations at each step of the journey toward completing their purpose, along with their wish-list features and desires. We noticed several emergent insights from this activity.
We know that humans are task oriented.
As one customer we spoke to put it: “I don’t just go to websites for no reason… like I obviously have something to get done if I’m there, and I wish more sites would understand that at any given time I’m probably interested in only 1% of the content on your site.”
And that became the basis of how we think about intent. Before this, Dell had always had a very narrow and singular view of what “intent” is, with singular definitions of what is relevant to each intent. So the first thing we did was change that, by influencing the way the organization as a whole understands intent, what the object of that intent or task is, and then how to begin to cater to the customer.
The biggest challenge in combining all four sites into a single site was that we had to architect an experience that could serve everyone from a consumer looking for a basic e-commerce search, to an IT administrator looking for documents and resources, to a CIO looking to vet Dell Technologies as a worthy partner.
This meant that we really needed to build an engine, or a system of engines, that could identify, using all available data points, what the query meant RELATIVE TO the person putting it into the box.
For example…
“Data storage” can mean an entirely different thing to a consumer who may be looking for a $100 external hard drive than it does to an IT professional looking for a $200,000 NAS with Hyperconverged Infrastructure capabilities.
Even a simple term like XPS can and should trigger entirely different content and experiences depending on whether you’re a consumer looking for a single laptop for personal use or a business looking for 100 laptops and the relevant software to manage your new fleet.
The diagram above shows you a high-level view of the mental model that we established for our engines to run (and constantly re-run) in order to begin to generate content and the final end-user experience.
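To make that mental model a bit more concrete, here is a minimal TypeScript sketch of what one resolution step could look like. The type names, segments, and confidence values are purely illustrative assumptions, not Dell's actual engines.

```typescript
// Hypothetical sketch only: type names, segments, and scores are illustrative,
// not Dell's actual engines.
type Segment = "consumer" | "smb" | "enterprise" | "premier";

interface CustomerSignals {
  segment: Segment;         // known or inferred customer segment
  authenticated: boolean;   // unlocks extras, but isn't required
  recentBehavior: string[]; // e.g. pages viewed, prior queries this session
}

interface ResolvedIntent {
  task: string;            // what the person is trying to get done
  objectOfIntent: string;  // the thing the task is about
  confidence: number;      // how sure the engine is, 0..1
}

// The same raw query resolves to different intents RELATIVE TO the person typing it.
function resolveIntent(query: string, signals: CustomerSignals): ResolvedIntent {
  const q = query.trim().toLowerCase();

  if (q.includes("data storage")) {
    return signals.segment === "consumer"
      ? { task: "buy", objectOfIntent: "external hard drive", confidence: 0.7 }
      : { task: "evaluate", objectOfIntent: "enterprise NAS / HCI storage", confidence: 0.8 };
  }

  if (q === "xps") {
    return signals.segment === "consumer"
      ? { task: "buy", objectOfIntent: "a single XPS laptop", confidence: 0.75 }
      : { task: "procure", objectOfIntent: "an XPS fleet plus management software", confidence: 0.7 };
  }

  // Unknown queries fall back to a low-confidence exploratory intent.
  return { task: "explore", objectOfIntent: query, confidence: 0.3 };
}
```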
What was evident right away through our discovery was that our search experience needed to be completely dynamic—meaning that every single thing about the experience needed to be able to change depending on who the individual human using it was. This couldn’t be your average e-commerce experience because traditional e-commerce accounts for less than 6% of Dell’s revenue and less than 25% of our overall web traffic. So we came up with the first iteration of our modular search platform (above) which shows how different queries (indicating intent or task) could serve up different content and experiences.
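As a rough illustration of the modular idea (with invented module names and a simplified intent shape), the same resolved intent could drive which modules compose the page:

```typescript
// Hypothetical sketch of the modular idea; module names are invented for illustration.
interface ResolvedIntent {
  task: string;           // same shape as the intent sketch above
  objectOfIntent: string;
}

type ModuleId = "productGrid" | "learnContent" | "supportArticles" | "solutionBriefs";

// Different resolved intents compose different pages instead of one fixed layout.
function composePage(intent: ResolvedIntent): ModuleId[] {
  switch (intent.task) {
    case "buy":      return ["productGrid", "learnContent"];
    case "evaluate": return ["solutionBriefs", "productGrid"];
    case "support":  return ["supportArticles", "learnContent"];
    default:         return ["learnContent", "productGrid", "supportArticles"];
  }
}
```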
That led to our initial concept wireframes which took these initial scenarios and went into detail on what—and more importantly HOW AND WHY—each component module was placed on the page for each individual customer.
After a few more rounds of exploration with our concept modular system, we began to detail a set of ten wireframes—different scenarios that we saw frequently in the actual search data on our site—that showed how a modular search platform designed around intent and task could fundamentally change how customers are able to use search on Dell’s website.
Finally, after we’d come up with the system that worked for us, we put it out to test with customers in both moderated and unmoderated studies once more.
Once we had enough intelligence and insight from the engines running the mental model in the previous section, we could then use all the same data points and insights to begin to personalize the experience. Now, personalization is a triggering word because it means so many different things to so many different people.
To some, it simply means “show me the products you think I’ll like most.” But because our experience goes so far beyond e-commerce search, we took it ten steps further by thinking about which experience best suits your task at hand. We don’t just personalize the results coming back; we look at all the data points available to us to curate your entire experience: results, content architecture, and UI (more on the latter two below).
And, contrary to popular belief, authentication status isn’t the be-all and end-all that people think it is. It unlocks certain specific capabilities, but we can still get hyper-personalized to you no matter your authentication status. The more data points we’re able to access, the ‘warmer’ the experience can get for every customer.
Even in a zero-party data world.
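A loose sketch of what that could look like in practice, with entirely hypothetical signal names: every available data point adds warmth, and none of them, including authentication, is a prerequisite.

```typescript
// Hypothetical sketch: every available signal makes the experience "warmer";
// none of them, including authentication, is mandatory. Names are illustrative.
interface SessionSignals {
  authenticated?: boolean;
  declaredPreferences?: string[]; // zero-party data the customer volunteers
  segmentHint?: string;           // inferred from referrer, query phrasing, etc.
  browsingContext?: string[];     // pages viewed or refiners used this session
}

function personalizationWarmth(signals: SessionSignals): number {
  let warmth = 0;
  if (signals.declaredPreferences?.length) warmth += 0.3;
  if (signals.segmentHint) warmth += 0.2;
  if (signals.browsingContext?.length) warmth += 0.2;
  if (signals.authenticated) warmth += 0.3; // unlocks extra capabilities, not a prerequisite
  return Math.min(warmth, 1);
}
```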
Once we’ve been able to identify the human, their intent and/or task, and what content is most relevant to them at that given moment in time, we can begin to construct the actual end-user experience.
However, content presented a challenge because again, our search experience needed to be able to transform and flex depending on the customer, their background, segment, intent/task, and the content most relevant to that intent and task. The same experience needed to be able to serve up (just to name a few):
On top of that, we were very cognizant (from feedback collected through over 1,500 customer surveys and interviews) that regardless of which of the above experiences (or combination thereof) a customer lands in, no one wants to see irrelevant content.
Thus, we figured the best way to serve up the most relevant content to every user was through a flexible content architecture, in which we’d use tabs to segment the experience not by our own business verticals but by the way our customers think about the intent or task that they are there to accomplish.
For example: If I’m a consumer looking for a great gaming laptop, I don’t want to see enterprise/industry solutions or break/fix support. I want to see an overview of what you offer for gaming and why I should choose Alienware or G-Series (Dell’s gaming lines), all the products I can purchase, and forums and communities dedicated to gaming.
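As a hypothetical sketch of that content architecture, tabs and their contents could be generated from the customer's intent rather than from business verticals; the labels and MFE names below are assumptions for illustration only.

```typescript
// Hypothetical sketch: tabs are derived from the customer's intent, not Dell's
// business verticals. Labels and MFE names are illustrative only.
interface Tab {
  label: string;
  mfes: string[]; // which micro front ends render inside this tab
}

function tabsForIntent(intent: string): Tab[] {
  if (intent === "consumer-gaming-laptop") {
    return [
      { label: "Explore gaming", mfes: ["gamingOverview", "alienwareAndGSeriesOverview"] },
      { label: "Shop",           mfes: ["productFinder", "productGrid"] },
      { label: "Community",      mfes: ["gamingForums"] },
    ];
  }
  // Other intents get their own task-shaped tab sets.
  return [{ label: "All results", mfes: ["productGrid", "supportArticles"] }];
}
```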
Once we’ve gone through the process of identifying the individual customer, their intent and task, and the right content and organization of that content, we now need to figure out how to present those relevant results WITHIN each tab through the front-end UI.
That’s where MFEs or Micro Front Ends come into play (see here if you’re unfamiliar with what MFEs are).
On a unified platform such as the one Dell is looking to develop, MFEs are perfect because they allow cross-functional, technology-agnostic features to be built once and support all customer needs across the site and ecosystem.
So we said: by leveraging the MFEs Dell was already building, adding some new ones, and clearly delineating and defining the purpose of each (purpose for the end user), we’d be able to get all the right information into all the right tabs in the best format for consumption.
For unified site search, that means that the engine simply needs to identify which MFEs are relevant to the given intent, and following the content architecture and strategy, which tab each MFE needs to live within.
MFEs also support the flexibility and scalability that we needed out of our experience. Meaning, something like a product identifier MFE could be used for many if not all customer intents that have to do with using a product as a data point to refine a master data set down to a sub-set. Same with a product finder—as long as the MFE was designed and built to support all the use cases and mental models our customers have, we’d be able to plug it right into Search.
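One way to picture that mapping, purely as a sketch with assumed names: a registry that records which intents each MFE serves and where the content architecture places it, so the engine can select and slot MFEs per query.

```typescript
// Hypothetical sketch with assumed names: a registry records which intents each
// MFE serves and where the content architecture places it, so the engine can
// select and slot MFEs per query.
interface MfeDescriptor {
  id: string;
  servesIntents: string[]; // customer intents/tasks this MFE supports
  defaultTab: string;      // where the content architecture says it lives
}

const mfeRegistry: MfeDescriptor[] = [
  { id: "productIdentifier", servesIntents: ["buy", "support", "compatibility"], defaultTab: "Shop" },
  { id: "productFinder",     servesIntents: ["buy", "learn"],                    defaultTab: "Shop" },
  { id: "learningContent",   servesIntents: ["learn"],                           defaultTab: "Learn" },
];

// Given a resolved intent/task, pick the relevant MFEs and group them by tab.
function placeMfes(task: string): Record<string, string[]> {
  const placement: Record<string, string[]> = {};
  for (const mfe of mfeRegistry.filter(m => m.servesIntents.includes(task))) {
    (placement[mfe.defaultTab] ??= []).push(mfe.id);
  }
  return placement;
}
```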
Let’s look at how all this works in a basic, real-world use case. What you’ll see in the visuals below is how the experience for Nicki is entirely tailored to her very common use case of wanting a second monitor but not knowing much about monitors, what she really needs, and trying to make an informed decision on a purchase.
With her task of finding a new monitor, Nicki has come to Dell looking for one that will suit her work-from-home setup. She doesn’t know what she really needs other than that it has to be compatible with her MacBook. So we can infer that her time on our site should initially be spent learning and discovering what Dell has to offer her.
Nicki first lands on a tab with a series of MFEs designed to help her find and learn about monitors in mental models that she understands. For example, she can enter the model number of her MacBook and see only compatible monitors. She can also use the product finder, presented here in its use-case or quality-of-life view, to self-identify which qualities she’s most interested in. And finally, she’s presented with some learning content to help her understand connection standards (USB-C, HDMI, etc.). Every MFE on this page also allows her to begin shopping right from it, using her decision point as a refiner to generate an initial product set.
Once Nicki is ready to start shopping, whether she clicks on the next tab or enters the shopping experience via one of the MFEs, she’s presented with a typical shopping experience. The difference is that it’s now personalized to her at a deep level. In the left rail, we’re surfacing quality-of-life refiners (the same MFEs we saw previously in-line on the page). In her results set on the right, we’ve also placed our Product Concierge, which uses all the data points we have about her and her needs to formulate human-like product recommendations and provide individualized context about why each one is right for her and why we’ve recommended it.
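As a hypothetical sketch of the kind of output such a concierge might produce: a recommendation paired with an individualized, human-readable reason derived from the customer's own signals. The field names and wording are illustrative, not Dell's actual data model.

```typescript
// Hypothetical sketch of a Product Concierge output: a recommendation paired with
// an individualized reason built from the customer's own signals. Field names and
// wording are illustrative, not Dell's data model.
interface ConciergeContext {
  knownDevice?: string;   // e.g. "MacBook"
  statedUseCase?: string; // e.g. "work-from-home second monitor"
}

interface ConciergeRecommendation {
  productName: string;
  reason: string; // the individualized "why this one is right for you" context
}

function conciergeRecommend(ctx: ConciergeContext, candidates: string[]): ConciergeRecommendation[] {
  return candidates.map(productName => ({
    productName,
    reason: [
      ctx.knownDevice && `Connects to your ${ctx.knownDevice} with a single cable.`,
      ctx.statedUseCase && `A strong fit for your ${ctx.statedUseCase}.`,
    ]
      .filter((part): part is string => Boolean(part))
      .join(" "),
  }));
}
```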
In this example, we see one of our Premier customers coming to the new search experience to look for Dell Latitude laptops. Because of everything we know about this particular customer, we’ve shortcut them right into the shopping experience (versus the consumer who started on a learning-oriented page). Even here, though, this is your typical e-commerce search supercharged by personalization, achieved by leveraging all the data we have about them.
For starters, we hope that our customers never even have to scroll. At the top of the page, we again see our Product Concierge MFE, but tailored to our business customer and their mental models (so instead of good/better/best, we see their most-to-least-purchased items), matching the query and the refiners they’ve set.
In addition to that, our Premier customers think in terms of projects (since this is their job, after all), and we imagine a future for Dell where their entire experience is oriented around projects such as those Tonya has set up. And since that’s the primary way that TONYA THE HUMAN thinks about her work, it becomes her top filter. You’ll also see that products that have been tagged or added to projects get a color-keyed indicator on the related product card.
Finally, because of the features that Premier accounts unlock and the way we know these customers roadmap their fiscal years, products that will reach end of life in the next nine months are capped with a banner prominently alerting them that, while such products may still be purchased (sometimes even at a great discount), they will be discontinued soon.
By clicking on this banner, they’re presented with another shared MFE in a modal: our Product Lineage MFE. It shows the general family lineage of the product (so the 5420 was replaced by the 5520 and will now be replaced by the 5620) and, in addition, provides context and helpful links for each generation.
But not just context, context designed around their own human mental models.
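To illustrate the kind of data a Product Lineage MFE could render (with placeholder dates and an assumed shape, not Dell's actual model), here is a sketch of the replacement chain and the nine-month end-of-life check.

```typescript
// Hypothetical sketch of the data a Product Lineage MFE might render: the family
// replacement chain plus end-of-life context. Dates and the shape are placeholders.
interface LineageEntry {
  model: string;
  replacedBy?: string;    // next model in the family, if any
  endOfLifeDate?: string; // ISO date; placeholder values below
  links: { label: string; url: string }[];
}

// The chain from the scenario above: 5420 -> 5520 -> 5620.
const latitudeLineage: LineageEntry[] = [
  { model: "Latitude 5420", replacedBy: "Latitude 5520", endOfLifeDate: "2022-06-30", links: [] },
  { model: "Latitude 5520", replacedBy: "Latitude 5620", endOfLifeDate: "2023-06-30", links: [] },
  { model: "Latitude 5620", links: [] },
];

// Premier view: flag anything reaching end of life within the next nine months.
function reachingEndOfLife(entries: LineageEntry[], now = new Date()): LineageEntry[] {
  const horizon = new Date(now);
  horizon.setMonth(horizon.getMonth() + 9);
  return entries.filter(e => e.endOfLifeDate && new Date(e.endOfLifeDate) <= horizon);
}
```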
This approach to the UI proved hugely impactful for actual human users in our concept testing. They had struggled with homogeneous lists of text links, and many said that being able to identify the kind of asset each link corresponds to was a huge time saver: it allowed them to zero in on exactly the content that would best serve their task at hand and move quickly and efficiently to the next step in their journey through Dell’s online ecosystem.
The next phase of this journey is ramping up both our internal and external teams to begin engineering feasibility assessments, as well as building proofs of concept with various vendors who may be able to bring some of this immense functionality to life. We can’t wait to share more with you in the coming months.