Rethinking the Digital Experience Platform for the Modern Age

Michał Cukierman

Jun 28 2023 7 min read

[Diagram: architecture of a traditional DXP system]

Traditional Digital Experience Platforms (DXPs) have for years been the default driver of digital transformation efforts. These systems, descending directly from monolithic Content Management Systems (CMSs), were initially designed to manage and publish web content, and are to this day built around a content-centric philosophy. 

However, in our fast-evolving digital landscape, we must recognize that delivering memorable and engaging user experiences goes beyond mere content. As the volume of data companies and organizations have access to continues to grow, so does the opportunity to craft more personalized experiences. Yet there's a catch: the architectural framework of our current tools, particularly DXPs, is no longer up to the task.

This is the first in a series of articles in which I will offer a more detailed analysis of the problems DXPs face in today's highly demanding landscape. While these insights are based on my own observations from working on enterprise web development projects, they are not exclusive to my experience. They're shared across our industry, resonating with many engineers grappling with similar challenges. My team and I figured there must be a better way - and finding it has been the main driving force behind the StreamX platform.

Old solutions, new problems

DXPs have to talk to more systems and process more data. The results have to be delivered to the clients efficiently, at any scale. The problem is, this isn’t what they were built for.

To put this into perspective, let's consider some key milestones in digital history:

The first website ever created was published in 1991 by Tim Berners-Lee at CERN

In 2000 Google was still a little-known company, and the most popular website was AOL, followed by Yahoo

The first generation of the iPhone was announced in 2007

The amount of data on the internet keeps growing exponentially: the Zettabyte Era started in the mid-2010s, and available forecasts expect further growth

Most of how we experience the internet today originated in the early 2000s - and so did the DXPs we see leading the pack today. They are now mature products, implemented globally in countless deployments. Changing the architectural design of such entrenched systems is hard, if not impossible, without breaking backward compatibility. The need to protect existing clients and revenue often outweighs the need to respond to future challenges.

The content-centric model

Let's start by defining a traditional DXP: it's essentially an evolution of the Content Management System (CMS) or Web Content Management (WCM) system, employing a monolithic architecture, usually built on a Java or .NET platform.

Scalability and fault tolerance

A traditional DXP may be distributed across several machines to ensure scalability and fault tolerance, with content replication managed at the application layer. It may contain various authoring and publishing nodes, each employing its own relational or object database as a primary content store.

This design ensures data consistency, manages transactions, observes data changes, and facilitates concurrent updates. It is also why DXPs often come loaded with out-of-the-box functionality such as content querying, full-text search, user management, authorization, authentication, scripting, and templating.

Extensions and Integrations

Extending the platform or introducing custom services requires updating or deploying new modules to the running applications. Deployments are done on servers, which involves installing new releases onto running systems and altering their state. Integration with external systems is managed by the platform nodes using point-to-point communication.
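As an illustration, point-to-point integration can be sketched as the platform node holding a direct, synchronous reference to every external system it talks to. The `PlatformNode` and `ExternalSystem` names below are hypothetical, a simplified model rather than the API of any specific DXP:

```java
import java.util.Map;

// Hypothetical sketch of point-to-point integration: the platform node
// holds a direct reference to every external system and calls it
// synchronously, so each integration is hard-wired into the monolith.
public class PlatformNode {

    // Stand-in for an external system (PIM, CRM, search index, ...).
    public interface ExternalSystem {
        String fetch(String key);
    }

    private final Map<String, ExternalSystem> integrations;

    public PlatformNode(Map<String, ExternalSystem> integrations) {
        this.integrations = integrations;
    }

    // The node orchestrates the call itself; if the target system is
    // down or slow, the request handled by this node fails or stalls too.
    public String resolve(String system, String key) {
        ExternalSystem target = integrations.get(system);
        if (target == null) {
            throw new IllegalArgumentException("No integration wired for: " + system);
        }
        return target.fetch(key); // blocks until the external system answers
    }

    public static void main(String[] args) {
        PlatformNode node = new PlatformNode(
                Map.of("pim", key -> "product-data:" + key));
        System.out.println(node.resolve("pim", "sku-42"));
    }
}
```

The key property to notice is that the calling thread, the error handling, and the knowledge of each external system all live inside the platform node itself.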

Delivery

DXPs generate and deliver experiences using a standard request-reply model, otherwise known as a pull model. Experiences can be generated on demand, optionally cached, and then returned to the client in response to a request.
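The pull model can be sketched in a few lines, with a renderer function and an in-memory cache standing in for the real rendering pipeline (all names here are illustrative, not taken from any product):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal sketch of request-reply ("pull") delivery: an experience is
// generated only when a client asks for it, and optionally cached so
// later requests for the same path skip regeneration.
public class PullDelivery {

    private final Map<String, String> cache = new ConcurrentHashMap<>();
    private final Function<String, String> renderer;

    public PullDelivery(Function<String, String> renderer) {
        this.renderer = renderer;
    }

    // The first request for a path triggers on-demand rendering;
    // subsequent requests are answered from the cache.
    public String handle(String path) {
        return cache.computeIfAbsent(path, renderer);
    }

    public static void main(String[] args) {
        PullDelivery delivery =
                new PullDelivery(path -> "<html>rendered " + path + "</html>");
        System.out.println(delivery.handle("/home")); // generated on demand
        System.out.println(delivery.handle("/home")); // served from cache
    }
}
```

Note the implication: nothing happens until a request arrives, so freshness and load are both coupled to client traffic.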

Though this definition isn't perfect, it provides a good conceptual understanding of how such systems work. Two key observations arise: most market players fall into this category, and a common history and architecture suggest shared limitations and challenges.

[Diagram: architecture of a DXP system]

The limitations of CMS-based DXP

As I already stated, the internet of today is dramatically different from what it was two decades ago. Businesses face an increasingly dynamic and demanding digital landscape characterized by high traffic volumes, real-time interactions, diverse device usage, and exponential data growth. The architectural strategies built around traditional DXPs, which helped businesses achieve their digital transformation goals in past years, are no longer delivering the expected results.

Scalability and fault tolerance

Content-centric DXPs were originally designed for content publishing rather than for handling high traffic volumes. As data volumes increase, they suffer from low throughput and limited scalability due to their monolithic architecture and transactional databases.

Additionally, while DXPs serve as the backbone for various digital products, their single points of failure (the biggest usually being the DXP itself), limited throughput, and constrained scalability inhibit their effectiveness as integration hubs for digital ecosystems.

P2P Integrations

The monolithic architecture also results in a lack of composability, as every external system needs to be integrated point-to-point. This limits adaptability and innovation, and raises the question of whether an all-in-one solution is still the best fit.
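A back-of-the-envelope calculation shows why this hurts composability: wiring n systems point-to-point requires on the order of n² connections to build and maintain, while routing everything through a shared hub or event bus needs only n. The hub model below is a simplification for comparison, not a description of any particular product:

```java
// Connection counts for n integrated systems: point-to-point wiring
// grows quadratically, while a shared hub/bus grows linearly.
public class IntegrationLinks {

    // Worst case: every pair of systems gets its own dedicated integration.
    static int pointToPoint(int n) {
        return n * (n - 1) / 2;
    }

    // Each system connects once to a central hub or event bus.
    static int viaHub(int n) {
        return n;
    }

    public static void main(String[] args) {
        for (int n : new int[] {5, 10, 20}) {
            System.out.printf("n=%2d  point-to-point=%3d  hub=%2d%n",
                    n, pointToPoint(n), viaHub(n));
        }
    }
}
```

At 20 systems that is 190 dedicated integrations versus 20 hub connections, which is why every new integration in a point-to-point world gets progressively more expensive.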

Deployment model

Additionally, traditional DXPs follow a legacy deployment model, with most applications installed on dedicated servers or, at best, virtual machines - an approach that is now considered outdated.

Costly development

Finally, due to their complexity, DXPs require highly skilled technical teams for operation, extension, and integration tasks. Forming such a team around legacy technologies is always a challenge, with a scarce talent pool to draw from. 

We will dive deeper into each of these problems in the following articles.

But this multitude of problems raises the critical question: is a monolithic solution still the optimal approach?

The hidden cost of building upon outdated technology

As the digital landscape evolves, it becomes evident that to survive, let alone succeed, businesses need to innovate faster than the pace of external change. This involves not just being more creative, but also adding more value than any competitor. Companies must deliver experiences that build their customers' loyalty, and do so more rapidly than anyone else.

In the face of this reality, surpassing the limitations of traditional DXPs becomes crucial for any company that’s willing to innovate.

Conventional DXPs are struggling to keep pace with growing market demands such as broad data usage, interoperability with an expanding number of systems, and the need for flexibility and scalability. Rapid cloud adoption, which offers nearly unlimited infrastructure and access to specialized services, further underscores the constraints of these traditional platforms. Until now, companies have had two options to address these challenges.

Custom made = long way to go

The first option is to invest in custom in-house solutions that allow them to adopt the new technologies. While this approach may be successful, it requires significant time and resources. It is also risky: it's not uncommon that after dedicating 2-3 years to developing a custom in-house platform, the outcome still doesn't meet expectations. A single flawed architectural decision can determine the project's success, and such a problem usually becomes apparent only when it's too late to reverse.

This risk can be mitigated by using modern standards such as MACH (Microservices, API-first, Cloud-native, Headless) and building the platform from well-known products. Unfortunately, there are no standards for building end-to-end solutions such as those offered by traditional DXPs.

Custom-made solutions are also notoriously difficult to maintain, as the team working on them needs to be large, competent, and available throughout the lifecycle of the solution - right up to late maintenance and decommissioning.

Trying to live with what you have 

The second option is to remain with the traditional DXPs and operate within their boundaries. For some, this may be a viable strategy: companies operating in markets with less demanding digital requirements, or where customer interaction is primarily offline or low-tech.

Similarly, companies that have already made significant investments in these platforms, in terms of infrastructure and skilled engineers, may choose to continue using them to maximize their return on investment.

However, it's important to note that these scenarios are becoming less common as digital demands increase across all industries. Companies that decide to stay with traditional DXPs must strategically seek competitive advantage from other areas, and fully understand the challenges they will face in the near future.

The StreamX way

Finding a solution to the problems mentioned above is crucial, but the current DXP landscape falls short. Companies need more options than expensive, risky in-house development on one side, and staying with traditional solutions while hoping they don't fail on the other.

To truly grasp what an experience is, we need to start thinking differently about experience creation and distribution. We need to stop using a CMS for purposes it was not designed for.

In the upcoming articles, I will dive deeper into the approach of experience assembly and distribution that’s been embedded into StreamX. Our goal was to address the limitations of traditional DXPs and create a platform that was ready for tomorrow. A platform that could really accelerate the growth of the organization. A platform that was available off the shelf, giving companies the advantage of not having to invest years in in-house development.

Hero image source: Theo Eilertsen at Unsplash.com