The Fourth Wall of Content
What are we supposed to believe about the origins of the content we consume? Are we supposed to believe that this content just exists, suspended in some fantasy world, waiting to be magically cast out to our browsers, all Harry Potter-like? Or are we always aware that there’s a more raw, less refined version of the content somewhere else – something that was transformed into a more palatable form for us to consume?
Consider a templated page of highly-structured content (e.g., an employee profile in a directory). We show this to the world as a single page of HTML, with content and template values intermixed so that an uninterrupted string of HTML is produced and appears on our screens as an integrated document. But do we expect our visitors to actually … believe this? Do they actually think someone typed this all manually?
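To make that merge concrete, here’s a minimal sketch of the idea in Python – the content object and field names are hypothetical, and real CMSs use far richer template engines, but the principle is the same: structured values are poured into a template, and the output carries no visible trace of the structure.

```python
from string import Template

# Hypothetical structured content -- the "raw" version an editor manages.
employee = {
    "name": "Jane Doe",
    "title": "Director of Marketing",
    "phone": "555-0123",
}

# The template the visitor never sees on its own.
page = Template(
    '<article class="profile">'
    "<h1>$name</h1>"
    '<p class="title">$title</p>'
    '<p class="phone">$phone</p>'
    "</article>"
)

# The merged result is one uninterrupted string of HTML --
# nothing in it hints at which parts were content and which were template.
html = page.substitute(employee)
print(html)
```

Looking at `html` alone, a visitor has no way to know that "Jane Doe" came from a content field while the `<article>` wrapper came from a template applied to every profile.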
Or, do we accept that they know there’s some structured content “back there” somewhere? Just like someone in a theater audience knows that the actors break character as soon as they’re out of view of the audience, do our visitors know that there’s a content type in the background, and that some values on the page are part of that content while others are part of something called a “template” that gets applied to all content, and that there’s a set of rules we made to organize the content in the site? To what extent do our visitors “lift the curtain” on our content?
I started word processing back in the early 90s, in college. Back then, Microsoft Word wasn’t common – all the cool kids were using WordPerfect.
I used the DOS version for a while, then graduated to the Windows version, which was a revelation. The DOS version had to represent different formatting with colors – something was “bright white,” for instance, when it was bolded. To see what something really looked like, you needed to go to Print Preview.
But in Windows, you had this great feature called “Reveal Codes.” This would show you a little pane at the bottom of your text and the formatting codes. So you could see “bold” codes surrounding text which was bolded (a lot like HTML tags, really). This was my first introduction to the concept of markup.
I loved it. With this, I could peer behind the scenes. I could lift the curtain and see how the sausage was made. Everything was suddenly clear to me, and weird formatting issues suddenly made sense and were easily resolved. I felt like I was seeing the document for what it really was, rather than the homogenized version that was intended for the unenlightened masses. I had a privileged view while “they” did not – they merely saw the manufactured result of it; a PDF, or a printed page.
This dichotomy has never left me, and it’s paralleled in the larger content industry: we spent a lot of time trying to hide the things we do from the end user. We want them to see the result, but not how it happened. We want to maintain some … mystique (?), and there’s a strange intimacy to the extent that we reveal to them what we’re actually doing, both technically and logically.
We’re All In This Together
Indulge me with this gross generalization, but in a public website scenario, there are two types of people:
Editors make the content and are the ones who understand how it’s formed: the structure, the organization, the nomenclature. Editors understand the very subtle difference between why a News Article is not a Blog Post, and how putting multiple content objects in the same folder will cause the left navigation to change.
A visitor just consumes the content. They can’t “Reveal Codes.” They can’t see behind the curtain and understand how the content comes together. They can’t tell a News Article from a Blog Post, and they have no understanding of the underlying IA that makes it all go. Visitors only understand what we reveal to them. Things that are blatantly obvious to us are hidden from them. They are led to believe that the content sprang magically forth from the ether in a perfectly consumable form.
To understand this better, let’s turn it on its head, by discussing intranets – the internal, employee-focused information systems used within a company.
The editor/visitor segregation breaks down a little when some visitors are editors, and editors are also sometimes visitors, and everyone works for the same organization. That organizational co-mingling breaks the dichotomy of “us vs them.” When the visitors are employees and they know there’s a Corporate Comms department, we suddenly drop the pretense that we have to “fool” them.
When my first book came out last year, intranet expert Martin White wrote a great review of it. Martin and I had met a couple of times over the years, and we exchanged some emails about it. Martin said there needed to be a chapter on intranets. What followed was a discussion about whether intranets were “web content management” in the traditional form.
I made this case:
With [SharePoint, and ECM in general], there’s very little “hidden.” With something like Episerver or Drupal, you have an admin/editorial side, and then a public side, and the latter is a very disguised version of the former. Put another way, you have admins/editors, and you have visitors/consumers, and the two are not meant to share the same view.
With SP, intranets, and ECM, you really have differing gradients of the same class: “user.” You have users that only consume, users that are allowed to also contribute, and users that are allowed to edit and create content. The views are largely the same, and vary simply by degrees and permissions.
For instance, a particular document library in an ECM system might look roughly the same for everyone; some users just have a few more links and options than others. There’s little need or desire to “project” the library as something completely different, visually and functionally. Whereas with a public website, the raw document library would be the administrative view, and it would be viewed by the public through a facade that made it look like something not resembling a document library at all.
Unlike the traditional theater where the audience sits in the gallery and views the stage through the proscenium, intranets are like theater-in-the-round or “participatory theater,” where the stage is surrounded, there’s little place to hide, and actors come walking in through the aisles among the audience. The audience is “in” the performance. Other people are doing the acting, but the lines between “us” and “them” get blurrier and in some performances, actors come and sit in the audience and maybe even pull us up on stage. When this happens, we’re not voyeurs anymore; rather, we become part of the performance.
With intranets, we often have to be explicitly trained in how our intranet works. Someone will teach us the secrets of what a “section” is, and they’ll give names to things like “local navigation.” We understand that the content on the intranet isn’t magic, it’s simply the result of pieces of information interacting with each other through a set of rules. We interact with our intranet every single day, so we begin to reverse-engineer the patterns.
Just like you start to notice little tics and details of TV shows you watch over and over, we begin to figure out what’s happening behind-the-scenes of websites we’re exposed to every day. We learn the rules and dispel the magic – we slowly break through the fourth wall.
Shadows on the Wall
The idea of teaching our visitors how things were built seems bizarre. Contemporary usage of a CMS is as a form of “projection.” We take content and project it into something else, and then try to pretend that the original thing never existed in the first place.
I’m thinking now of Plato’s Cave:
Plato has Socrates describe a group of people who have lived chained to the wall of a cave all of their lives, facing a blank wall. The people watch shadows projected on the wall from objects passing in front of a fire behind them, and give names to these shadows. The shadows are the prisoners’ reality. Socrates explains how the philosopher is like a prisoner who is freed from the cave and comes to understand that the shadows on the wall are not reality at all, for he can perceive the true form of reality rather than the manufactured reality that is the shadows seen by the prisoners.
The content that visitors see is really just shadows on the wall.
But when you need to draw users deeper into content, perhaps it makes sense to reveal the underlying structure a little more?
For a brochureware site, there’s no point – the user is ephemeral. They will consume, then not return, so any training or indoctrination about the content is wasted. Brochureware is the one-night stand of CMS.
As mentioned, intranets are the polar opposite, but what about something in the middle – a reference site, or a content-based web app, where users will return again and again? How do you decide when to enlighten them with IA and content model – the “insider information” to help them understand better?
How could we make the source of the shadows more explicit? Consider:
Could we have a link in the footer that says “Show me the content model of this page”? Clicking that link would reveal the raw, untemplated content structure. What would be the point of this? Well, as I said before, it probably matters more as intimacy with the site increases – for brochureware, no one cares. But for more content-focused sites, there might be value here (consider that Wikipedia essentially lets you do this).
Could we have an “Explain this Page” link that revealed a bunch of pop-up menus explaining how the page and the site work? Like, it would label the left nav and explain why those links appear (“All these pages are in the same section, entitled ‘About Us’.”)
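Mechanically, the first idea could be as simple as serving the raw content object alongside its rendered form. A minimal sketch in Python – the content type, field names, and `reveal_codes` helper are all hypothetical, invented here for illustration:

```python
import json

# Hypothetical content object sitting behind a rendered page.
news_article = {
    "type": "News Article",      # the distinction visitors normally can't see
    "section": "About Us",       # the value that drives the left navigation
    "fields": {
        "headline": "New Office Opens",
        "byline": "Corporate Comms",
        "body": "We are pleased to announce...",
    },
}

def reveal_codes(content: dict) -> str:
    """Return the raw, untemplated structure -- the CMS equivalent of
    WordPerfect's Reveal Codes pane."""
    return json.dumps(content, indent=2)

# What the "show me the content model" link would return.
print(reveal_codes(news_article))
```

The visitor who clicks through sees not just the words on the page, but which content type they belong to and which values are content versus template machinery.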
Do those options seem weird to you? Probably. But why? Why is there some reluctance about letting visitors turn their heads toward the light that casts the shadows?
Actually, we’re at a weird inflection point right now where we’re revealing more than ever before. Front-end development and single-page applications (SPA) – the so-called “JAMStack” – can’t really hide what they do. Network calls can be monitored, and JSON isn’t hard to read. If you want to reverse engineer how a SPA works, it’s all there for you to understand.
There used to be a clear delineation between client and server, and the server was a “magic box” from which stuff sprang forth. The stuff was generated on the server from … what? Some more pure data, we assumed. But the visitor didn’t really know what happened on the server. They had no knowledge of what CMS or server architecture was being used, beyond what they might glean from artifacts in the HTTP headers. HTML was ejected from the server post-transformation.
But now you can monitor all traffic between your browser and the web server, and some of that traffic might be extremely detailed API calls (see GraphQL), complete with internal labels and names for logical concepts that might be presented to the end user in a completely different context. I realize the data we send might not be our exact content model, but it’s probably a lot closer than the shadows the visitor sees rendered in their browser.
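As a sketch of that leakage – the payload below is invented, not from any particular API, and the field names (“contentTypeAlias,” “internalOwner”) are hypothetical – a JSON response can carry internal labels that never appear in the rendered page:

```python
import json

# Hypothetical JSON payload a SPA might fetch. The internal
# nomenclature travels over the wire along with the content.
raw_response = """
{
  "data": {
    "page": {
      "contentTypeAlias": "newsArticle",
      "properties": {
        "headline": "New Office Opens",
        "internalOwner": "corp-comms"
      }
    }
  }
}
"""

payload = json.loads(raw_response)
page = payload["data"]["page"]

# Anyone watching the network tab sees the backstage labels...
internal_type = page["contentTypeAlias"]

# ...even though the rendered page only ever shows the headline.
rendered = f"<h1>{page['properties']['headline']}</h1>"
print(internal_type, rendered)
```

The rendered `<h1>` is the shadow on the wall; the `contentTypeAlias` and `internalOwner` fields are the stagehands, fully visible to anyone who opens the network tab.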
Monitoring network traffic might be like running up on stage and peering into the wings. You’re seeing the backstage of the production – breaking the fantasy contract – and you might be shocked to find out the same dude plays four different parts in the background.
Us vs Them
The web as it stands today has grown up with an “us vs. them” mentality. The traditional paradigm has been one of a thin client talking to the mothership. Strange things happened inside the mothership – both logically and technically – and we weren’t privy to them. We didn’t know how the content was generated, and to some extent, we didn’t know how it was organized. In fact, the goal was for a generation and organization process so unknowable that content just seemed to magically appear out of the ether.
This has inculcated in us the idea that visitors should be ignorant of the wizard behind the curtain. But is that limiting? I think the answer depends on your particular website and use case, but I can easily imagine situations where letting users understand your core definitions and architecture of content – not just your templated end product – might be the exact right thing to do.
I’m tempted to end this with some specific advice, but I can’t get behind anything in particular.
However, I just can’t shake the idea of “projection” in CMS, and how the very idea of content created for an audience is deeply imbued with the idea of “them” and “us.” We (us) create, manage, transform, and deliver. They consume.
As a mildly prolific content creator for more than a decade, I can tell you that “their” role is much shorter and simpler than ours, and I suppose that’s the basis of performance art in general. Actors rehearse and rehearse for a performance we consume in a couple of hours.
I think the basis of performance art – and of content – is that we hold our cards close to the vest, by design. We attempt to curate an image of spontaneous, miraculous conception. Our joy is in the visitor not even conceiving that it took work to deliver the end result. We try to be effortless and unseen, like shadows projected on a wall to an audience that never turns their heads.