Antonio Savorelli is a web designer, and a traveling writer and speaker on semiotics, television, and design. He is based in Italy, but keeps toothbrushes in Boston and San Francisco. He is the Chief Web Chef of Communikitchen, and co-chair of Television for PCA/ACA.
He has too many blogs, all probably stuck in 2012, and he tweets as @antiorario.
The November Slump
The first week of the month, my biggest client has its annual conference, an event I’ve worked toward for the better part of the year. After that, it seems that all I’m left with is piles of scrap paper covered in months’ worth of notes, the remnants of to-do lists that have been superseded by more pressing tasks, and an inbox that, even if I’m lucky, still stretches its tail back to the previous spring.
I know I should be prepared for the November slump, anticipate what the future is going to hold, and have the next few months of work already lined up. In most cases, however, that’s not what happens. I come back from the client’s conference with the aftertaste of glory for a job well done, and the knowledge that in a few months everything will start again.
November is the time when I depressurize, when I learn to take my time. I pick up side projects that only I and a few others care about. I catch up on TV shows. I figure out if I can still play the violin.
Even during busier times of year I try to establish a routine that allows me not to jump head first into my day, but ease into it with at least an hour of reading—and I make sure that not all of the things I read are work-related. Anything I can’t do during those early hours of the morning will inevitably accumulate in several corners, both physical and digital. What I can’t take care of during some more relaxed summer weeks will have to wait until November.
No matter when it happens, the end of a project is bound to be followed by a sense of void, and by the impression that I’m not doing enough. Putting aside the piles of notes, ignoring the incomplete task lists, and trying, even just for a few days, to forget about the minutiae of work is a possible cure for this void, and a way to swat at the anxiety about the future, whose shadow is already peeking from around the corner.
There is no compact Italian expression to translate the English “from scratch.” Even if I could find one, it would lack the self-congratulatory value it currently has, especially in American culture, where most things come pre-made and prepackaged. When it comes to food, the expression “made from scratch” seems to imply an intermediate state of being between store-bought and homemade: you buy the genoise, the custard, the can of whipped-cream substitute, and you have an Ikea cake, where the final product is assembled, not created. The from-scratch rhetoric values the process of making things rather than the substance of which they’re made.
In Italian, on the other hand, the priority is reversed, and the category by which food is judged is that of genuinità. For a few decades, food advertising has turned the adjective genuino almost into a false friend of the English “genuine.” A product that’s marked (and marketed) as genuino is not an unchanged version of an original model, no matter its source, but one that’s as close to nature as possible. One that’s actively good for you.
In Italy, homemade food is by definition genuino, no matter the amount of sugar it contains or whether it was fried in pork fat. For decades, advertising of industrial food, particularly baked goods, has used genuinità as one of its main selling points, and made it more perceivable through various strategies, among them:
A substantial strategy, which highlights the relative presence of ingredients assumed to be good for you, as opposed to those that are obviously (and often anachronistically) naughty: more milk, less cocoa! (Never mind the sugar and the trans fats.)
An aesthetic strategy, which minimizes packaging and possibly makes it transparent, to show that the shape of its content is as close as possible to something you could make at home. Compare that to the space-age appearance of something like the alien, American-born Pop-Tarts.
A mythical strategy, which links the origin of the product to imagery of uncontaminated nature, and to parts of the country where everything is made, from scratch, at home.
Both the from-scratch rhetoric and that of genuinità are ways to deal with the overwhelming presence of industrial products in our daily lives. When it comes to food, being homemade, the opposite of industrial, has a connotation of natural and healthy in the United States (adjectives that carry their own special set of ambiguities and contradictions), but of genuino in Italy. The idea of genuino, however, ultimately trumps healthy and natural: as long as it’s something your grandma could have made, it’s better than anything anyone else could ever make for you. And when you start earnestly applying it to industrial products, the word loses all meaning.
Making things from scratch is an illusion too. A few years ago I stopped following a hipster on Twitter, exhausted from his self-righteous tweets about how if you don’t make your own pasta you can’t say you’re cooking homemade food. I guess he owned a wheat field by his house in Brooklyn. According to this worldview, authenticity isn’t enough, and the effort of production must go to extreme lengths.
That attitude isn’t limited to food, and extends to anything that has an aspect of craftsmanship. Once, on a web-design forum, I caught an Italian (because Italians are proto-hipsters, I think) ranting about CMSs, and saying that “a real website” must be completely hand-coded, and that people who use WordPress and Drupal are just cheaters.
I no longer take offense at such opinions, annoying and uninformed as they may be. If someone makes solemn judgments about what a real anything should be like, it’s most likely because they’re collapsing under the weight of their own inadequacy. That includes any statements about real men, real women, real Italians, real Americans, real Italian food, and so on. (I’m not listing real American food because, as all real Italians know, there is no such thing.)
Learning how to make things from scratch, be that making bagels or coding an HTML page, is certainly a good way to learn how a certain technology works: how to raise dough, what boiling bagels does to them, what happens to your diacritics before you learn about HTML entities and UTF-8—those are all things you can read about, but the experience of doing them yourself will make them stick in your brain a lot better. However, that doesn’t mean you can’t enjoy a store-bought bagel, or a web page that was written entirely by a machine.
Any technology that takes some of the effort out of your daily process and allows you to get to your results faster shows the fallacy of an absolutist from-scratch rhetoric. PHP and MySQL are a way to make our websites dynamic. Does spending less time on HTML make me less of a web designer? No, because being a web designer doesn’t just mean writing HTML. Same goes for using Sass to write style sheets. By the way, you’re not a real web designer if you’re not using Sass. It’s 2015—come on!
It’s easy to fall for the from-scratch fallacy. It’s the myopic pride of thinking that whatever you’re doing today will always be good enough, and the way you’re doing it is the best possible way. It’s the delusion that using any prepackaged material, regardless of its source, is an inherently undesirable shortcut that will make what you’re doing less genuine, and less genuino.
Every new technology has some inevitable overhead. It forces you to spend time learning it, and, consequently, it redefines where the baseline is, thus making any attempt at defining what “from scratch” means almost completely irrelevant. My way of building a Drupal site from scratch will look nothing like that of someone who’s just starting out.
Different depths of understanding of the technology allow for different starting points: because my background is not in computer science, there are things about the inner workings of the Drupal core that I don’t (and probably never will) master. I might have a general understanding of them, but I’d rather let someone else get their hands dirty with that code and lose their sleep over it. I’m okay with that. I’ll buy the flour, but as tempted as I am by the idea of making my own yeast, the thought of purposely growing mold still kind of scares me.
Go ahead then, learn your basics, choose your frameworks, build your library of tricks, and remember that what makes your design process unique is not the number of lines of code you personally wrote, but your mastery of the medium, your understanding of your clients, and a commitment to never taking anything for granted.
Your Users Are Fainting Goats
You wake up at 4 am and decide to look at your email, and even though you know you shouldn’t, for so many reasons, it’s almost a reflex. You see a few messages—not many, maybe three or four, which is an okay number, not enough to make you think that some catastrophe has hit your server or one of your clients’. You could decide not to read any of those and go back to sleep, but you have no power against temptation at 4 am. Not to mention that at least one of the subjects starts with “Fw: R: Re: R: Re:” and you already know someone must be in trouble.
If you really can’t resist the urge to read the actual content of the email, do so with the full awareness of the fact that your users are fainting goats. Fainting goats—in case you haven’t stumbled upon them, metaphorically or otherwise—are goats that suffer from myotonia, a genetic disorder that makes their muscles go stiff when they’re startled. They don’t actually faint, but either topple over or start hopping around on all four legs for a few seconds. When the scare passes, they can get up and go on their merry way. Distressing as it may sound, it can also be quite entertaining to watch.
Receiving alarmed emails from users who suddenly believe they can no longer get anything done, or that the system you built has decided it doesn’t like them at all, can easily cause you to do an epic, Liz Lemon-style eye roll, and immediately jump on a response in which you explain, not without some snark, how they did something wrong, and how their expectations were simply misdirected, and how they should think logically before they send support requests drenched in despair. You, from the height of your knowledge of the system you built, can clearly imagine your users’ entire thought process, and can almost see them bang their heads against that specific feature that for you is so blatantly obvious.
But logic doesn’t work in the face of fear. Machines, even something as well-designed and most certainly pretty as that website you made, can be intimidating. Let’s face it: you, as a digital designer, should always assume that anything you make scares the crap out of most people who interact with it. A simple hiccup that may or may not depend on the code you wrote can make your users’ brains go into defense mode, their muscles spasm, their fingers automatically type that email in which they blame themselves for all that’s ugly in the world. As with fainting goats, that moment will pass. You can rest assured that in a few seconds everything will be back to normal. In most cases, they won’t sit there anxiously waiting for your response.
Make sure you resist the temptation of taking care of the matter right away. It’s maybe 4:15 am now, and you should go back to sleep, and leave your response for the morning. Your main goal in writing it should be to prevent or minimize the next scare. Ask for more information if needed—you know that a fainting goat in distress forgets to take screenshots and transcribe error messages. Explain what went wrong, reassuring them that it wasn’t their fault. Even if you believe it was their fault, consider that they didn’t build the system, and they don’t know it as well as you do. What may seem logical and sensible to you may appear arbitrary to someone else. Also, the web is a bitch, and can’t be trusted to behave consistently, which is precisely why we love it. (Now, where did I leave my Stockholm syndrome pills?)
No amount of user testing will prevent the fainting-goat effect, so there is no place for frustration at users’ requests for help. In-person training is useful and necessary, but most of that information will be forgotten in a heartbeat, and you can’t possibly do that for every single one of your potential users. What alleviates future scares is documentation. Writing documentation aimed at people (on top of code documentation, which you obviously always write) should be part of any web project, no matter how small. If after a day of coding you’re too exhausted to spend another hour describing everything you just did, make it the first thing you do tomorrow.
Documentation should be more than just a list of features and commands. It should be, above all, a narrative device that puts people back at the center of the action, helps them make sense of their work, and shows them that what you built for them is not a disconnected cluster of dumb tasks, but a structured process. Turning machine logic into narrative logic forces you to lower the camera angle and see your work from your users’ perspectives. It will also help you discover possible plot holes in what you’ve built. If the narrative works well, you’ll have spared your users a scare or two, and saved yourself some extra sleep.
Tales of a Fateful Year
This spring, a new Italian TV show made the news by setting its story in the year that shaped my country’s recent history. 1992 follows a few fictional characters, whose lives are entangled with those of fictionalized key figures in what is known as Tangentopoli, the bribery scandal that within a few months shook the entire Italian political system, and set the tone for the next twenty years. The parties that had built the backbone of post-World War II Italy were dissolved as many of their members were jailed or just fell out of favor. 1992 is very good at tracing the rise of new populist movements, such as the Lega Nord, founded on the cultural and economic divide between the north and the south of the country.
1992 has been compared to House of Cards and The Sopranos, when in fact it has more in common with Mad Men, if anything. However, while in Mad Men history flows by in the background, slowly shaping the lives and the attitudes of the characters, 1992 places it right in the middle of the action, and wants to bring it all back for the viewers to relive (or live for the first time, if they missed it), through names, faces, and the exuberance of daytime teen television.
Through the eyes of one of its main characters, a ruthless and seemingly heartless marketing guru, the show recreates—through invention, although it might not be too far from the facts—the narrative behind the political ascent of Silvio Berlusconi, the media magnate who redefined the Italian television market in the eighties. The party he founded thanks to his network of advertising salesmen and football fans would be the heart of the right-wing coalitions that would repeatedly gain power from 1994 until 2011. He would become the embodiment of Italian anomalies and contradictions for twenty years—and we’re still counting.
Of all the powers clashing together in 1992—justice, politics, corporations—the media, and television in particular, is the one that takes the spoils. Until the late seventies, the state was the only entity allowed to broadcast nationally. It was Berlusconi’s networks that challenged the status quo, first exploiting legal loopholes, then officially, thanks to the political support of Bettino Craxi, several times the prime minister of Italy, who in 1993 fled the country to avoid prosecution, and died in Tunisia in 2000.
The social and political quake of 1992 left state-owned TV networks, historically controlled by the very parties that were being attacked and progressively dismantled, scrambling to catch up with private channels, which were now basking in the light of change.
It’s hardly a coincidence that in 1992 three Italian universities started pilot programs in Communication Studies, followed the next year by a few more, one of which was the University of Bologna. That was when, all of a sudden, fifteen-year-old me found his future home. Once I got to Bologna—the oldest, the naughtiest, the one that makes parents at once proud and worried—it turned out not to be the same as it had been the previous two decades, now barely more than a caricature of the cradle of student effervescence it used to be. Maybe the eighties had finally caught up with it.
In 1992, Bologna is caught in the middle of the eternal rivalry and mutual misunderstanding between Rome, the political capital, symbol of the old guard, and Milan, the financial capital, where things actually get done, but which is the land of Tangentopoli. All that’s ever shown of Bologna are the intricacies of the tangenziale, the highway system that envelops it. It’s shown as the city you can’t avoid if you drive from Milan to Rome. The city you can easily love to hate in its imperfections and its messiness, just like the memories of childhood.
There could be no better place to host a Communication Studies program, an attempt at understanding and, possibly, mastering something that until then had been treated as corporate voodoo. The intent was humanistic in a broad, multidisciplinary sense, and, in retrospect, the program was conceived as a way to vaccinate future generations, and make even mine, born in the seventies, a child of the eighties, more aware of the impending dangers.
The timing couldn’t have been more perfect, but, at the same time, couldn’t have been any worse. I can only guess that designing a new university program must have taken a whole decade. The idea might have been born right when the Italian state lost the right to be the only television broadcaster in the eye of the public, if not in the eye of the law. Unfortunately, 1992 was the year that allowed communications to become a dirty word. Communications would bring about a new political system: many new faces, some old recycled ones, and a way of convincing people to approve of things that were the opposite of what they believed, the opposite of what would be better for them.
Televised as it may have been, it wasn’t a revolution that the year of changes brought about, but a shift in the way power was established and maintained. Whereas until then power was built through personal communication, in the nineties it became about public communication, and above all television. And we, the children of the eighties, those who would grow into our right to vote within the next few years, had been conditioned—trained, almost—to be receptive to those new ways, those faces, those sweet words over a decade of afternoons spent watching cheap Japanese cartoons, toy commercials, and, eventually, prurient teenage variety shows on Berlusconi’s networks. It had happened to no other generation before us.
1992 did not bring a revolution, because Italians were, in fact, only watching it on TV. Italians are very good at making a lot of noise, but always make sure to be back home by dinner time. 1992 was the year that gave us the potential for change, yet distracted us by making a show of it, while the grown-ups played in the other room. It gave us a seemingly clean excuse not to make the revolution we deserved and never had, then left us unprepared for what was coming next.
I can’t wait for season two.
The Friction of Quitting
A couple of months ago I decided to quit Facebook once and for all, and stop considering it an inevitability of my digital identity. Its business strategy had always creeped me out, even before the advent of frictionless sharing, which is something bad labeled as something good. Nothing I’ve done throughout the years has ever fully succeeded in toning down the sources of creepiness, from the so-called Facebook envy to the feeling I have that even the smartest of my friends become shallow, babbling dummies the moment they open Facebook—which makes me certain that I too, in their eyes, must appear the same way, even more than my offline self does.
Facebook adds all the friction back when you want to get rid of your content, because there’s no way to do that with a single click, not to mention the fact that nothing ever really gets deleted. The activity log is both a useful tool and a way for Facebook to taunt you by showing you all the stuff you’ve done, much more than just posting on your wall: there are comments, likes, friendships, tags, and many other small things. Anything you didn’t personally initiate can, at best, be hidden from your timeline. If you want a photo of you to be deleted, or a tag to be removed, you have to ask the person who posted it and tagged you.
It’s time-consuming, privacy-shaming, and emotionally abusive.
Facebook may have tweaked the meaning of the word “delete” to stand for something else, but going through the history of my relationship with it was a useful exercise. Being confronted with the daily abuse brought me clarity, and had a cathartic effect. No matter how little I thought I was doing on Facebook, I was surprised to discover the number of things I’d liked, and all the superfluous comments I’d made. I wanted to kick myself for coming off as a jumpy little dog, and for behaving in the same mindless, inconsiderate, and vapid way I criticize others for. I’d fallen into the trap.
It’s no coincidence that my years on Facebook have been those during which my websites have languished and my new book hasn’t written itself. Wasting my energy on Facebook is easier than putting it to work on something more meaningful and longer-lasting. Something that requires time and doesn’t need to be shared within the next five seconds lest everyone forget me forever.
The friction of quitting doesn’t just come from the physical labor you have to do to remove things, intensified by the certainty that they will never be gone. Facebook counts on it being emotional as well. By undoing actions you’re destroying meaning. Unliking things makes you feel disrespectful. Deleting comments damages conversations. Unfriending people makes you feel like an asshole. Asking others to untag you from posts and delete your photos forces them to do work for you, and as much as you think everyone should be doing what you’re doing, you can’t ask unwilling parties to endure that kind of process.
Don’t give in to the emotional abuse. Keep hiding and deleting, unliking and unfriending. See how needy, superficial, and impulsive you’ve been in the past few years.
I thought of it as a way to rebuild my own personal content strategy. I performed a destructive inventory of my content, noting that while some of it was fun to look back on, most of it had lost all relevance. Besides, everything was tainted with the certainty that no matter how frictionless my content production was, I wasn’t doing it for myself or for anyone else but Facebook.
Deciding to leave Facebook is essentially a socially inappropriate act that has given me a taste of what it would be like to leave a cult. That’s why I didn’t want to just delete my profile and be done with it. I wanted to feel the awkwardness now, to avoid withdrawal symptoms later.
I don’t even care about the mind games Facebook is playing. Psychological research, political experiments, advertising schemes—I don’t care, because they’re all equally disgusting, yet all equally understandable (if not excusable) on a business level. I don’t pay for the service, so I know I should expect something like that to go on behind the scenes. Just a few days ago, Laura Kalbag detailed all of that on A List Apart.
But the emotional abuse goes further. Even if we don’t consider the huge implications of Facebook’s stance on privacy and data ownership, the ability to share anything at any time with anyone makes it easy to forget why we’re sharing things in the first place. It makes us overlook the context in which any friendship or acquaintance exists.
The frictionless approach privileges quantity of information over quality of interactions. Almost everything on Facebook is reduced to small talk, background noise to fill up your days. Facebook transfers its own horror vacui (no content means no business) onto its users. It counts on the guilt associated with pruning your contact list and your timeline. Its sense of quality is based only on its ability to turn your behavior into money. Its repeated attempts at replacing the web aim at making sure you won’t need anything other than Facebook.
I need to go back to quality interactions, and to the quality of the things I write and share. That was the initial purpose of having a personal website, as well as the original intent behind the web itself. Not everyone has the knowledge or the time necessary to create that kind of personal presence on the web, and Facebook is the easiest way for people to get it.
As web designers we have the responsibility to prevent the web from falling into the hands of those who don’t have people’s best interests at heart. We must keep building tools that defy the logic and control of private companies whose primary strategy is to turn social interaction into an addiction, and to transform the innocent concept of sharing into giving yourself away.
One drawback of designing for digital media comes from the intangibility of pixels. A recurring theme I’ve been noticing for years in design blogs, conversations with other designers, and—yes—Pinterest boards is the frequent need to go beyond the screen, and make something that one can finally touch, something that lasts. It could be considered a form of pixel-induced alienation, in that exposure to various kinds of screens makes us long for something we can hold in our hands, that doesn’t glare in our faces, that doesn’t constantly solicit our attention. Also, something that doesn’t disappear in a blackout or become obsolete in six months.
It’s not about reading books, or going for a hike, or taking a cooking class. It’s about taking the design-related part of our brains on an analog, maybe even lo-fi vacation. Some designers dabble with letterpress printing, some experiment with hand-carved wood type, and some with fashion design. The goal is to keep making things, but also to fill the void of tangible objects.
Although I support these departures from the digital, and enjoy hearing stories about them and seeing the results, I think the perception of digital media as lacking physicality stems from a widespread discomfort with digital technologies, one that affects even the most tech-oriented of us. This is, in turn, due to a cultural bias when it comes to human perception, a hierarchy of the senses, if you will: touching something feels more real than just seeing it. We can trust our skin more than we can trust our eyes, and consequently we give the sense of touch more credit and more intrinsic value.
Granted, it would be reductive to say that digital design is only about sight, when in fact, if properly executed, it allows the same object to be accessed through different senses, including hearing and touch. The beauty of digital technologies is that they transcend the limitations of physical matter. But when I make a website, I know for a fact that the one thing my typical client desires the most is to finally be able to see it—never mind that the main interaction with it is through a form of touch, mediated as it may be by an input device.
Touch trumps sight, matter trumps light. The day after Thanksgiving 2014 I went to the Harvard Art Museums to see the recently restored Rothko murals. Painted with a photosensitive pigment, the murals have gradually faded over their fifty-year history. Conservators deemed it impractical to restore them using conventional techniques, chiefly because those wouldn’t have solved the aging of the paint, so they decided to act only on the perception of colors. This involved a combination of digital technologies to compensate for the missing vividness, using photos of the murals taken in 1964 as sources. What resulted may very well be the definition of pixel perfection.
No one can claim the murals are not real and tangible. Even so, they are but a shadow of their old selves. They’ve aged and faded, and only thanks to the lights projected onto them can we now see them in their former colors. I was at the museum at four in the afternoon, when the projectors are switched off. For one hour every day, the murals are naked, old, and dark. Their colors have very little to do with those you could see just a few seconds before, or with those of the sketches hanging in the next room.
I was amazed at the effectiveness of restoration by light, but I could hear sighs of disappointment, and a couple of peremptory statements about the legitimacy of this means of preserving art—as if replacing or enhancing colors by changing the substance of which the murals are made would have maintained the originality of the works more than superimposing rays of light onto them. But visual art is about perception, so what if we trick our eyes and our brains into thinking we’re seeing something that’s not there?
The issue of the intangibility of digital design is a subjective matter, but I see it as a symptom of the persistent idea that digital equals fake, or just not quite real enough. Although I understand the need to get away from the keyboard and take our eyes off the screen, I don’t see it as a way to fill the void of tangible things, or to fulfill the need to produce real objects. The web and other digital interfaces are real, because real are the effects they produce on the analog world. This misconception of the web and the internet at large as pertaining to a different world is what makes boundaries between different planes of reality appear fuzzier. What you say and do on the internet doesn’t really matter; it doesn’t define you as a person, because you’re not really doing it; it’s not you who’s doing it; it’s not real after all.
Except it is real. What’s built with pixels isn’t a virtuality or a fallback reality, a simulacrum of something that is completely realized only in the analog world. Only by embracing the web and other digital media as another kind of matter, not as an intermediary, will we finally win the battle against the idea that all we do as digital designers is draw pretty pictures. The truth is that we are building worlds. Better yet, we’re shaping new parts of the world we live in.
Intangibility and impermanence aren’t exclusive to digital media. Music faces a very similar conundrum: a violinist plays an instrument that gives him bad posture, probably damages his hearing, and results in an intangible and impermanent, albeit analog, product. Even a recording, which exists to defeat music’s ephemerality, is just as dependent on external factors, such as the availability of a music player, thus some source of electricity, and, more and more, an internet connection.
I’m not even sure I want the product of my work to be permanent. With the speed at which the progress of technology forces us to learn new tricks and techniques, do we really want that website we made ten years ago to be publicly available forever? For sentimental reasons, perhaps we do. But what should actually remain of our work is the craft itself, the knowledge required to take it one step forward every day. That, and the ability to think critically about it and its role in culture and society.
The intangibility and impermanence of digital media raise another, bigger concern. Digitizing the world means taking a signal, a physical phenomenon perceivable by humans, breaking it apart, and turning it into something else that, in its raw form, only a machine can understand. This process alone is enough to make many uncomfortable for ontological reasons: a digital song is not a song, a digital book is just a screen pretending to be a book, and a digital photograph is most definitely not a photograph—never mind that there’s nothing in the word “photograph” that demands it be an analog recording of light. I have no such ontological qualms. What concerns me is the permanence of the data we produce and the sustainability of a craft that depends on electricity and an internet connection.
Lose that machine, lose the knowledge of how the machine can process that digital code, and you lose the ability to turn it back into an analog signal, you lose the perception. That’s a scary thought. Digital archives are, at once, the best and the worst way to preserve our knowledge. They require preserving not only the data, but the tools used to encode and decode it. And they require planning for duplication, redundancy, accessibility, portability, and real privacy—in the sense of total control over the availability of one’s own data. I trust technology and science enough to know that we’ll solve this problem too. We’ll invent some meta-technologies that will be able to reverse-engineer anything—or maybe I’m just delusional, because I’m a digital designer and this is a thought that makes me sleep better at night. Whether I should trust humans to use these technologies appropriately, well, that’s a whole different story.
My Elusive Science
Recently I was reading someone’s bio on the web, and it said he studied “semiotics” in college. Like that, with scare quotes, as if it weren’t really a thing, followed by an apologetic statement about all the real, productive stuff he’s been doing since.
I would have found it mildly offensive, had it been the first time I noticed something like that. Apologizing for and distancing oneself from one’s past as a student or a scholar of semiotics is a recurring theme, particularly in Italy, where the subject has been central in the education of communications majors for the past twenty years, and of film and theater majors for the past forty.
I love semiotics. No quotes, no apologies. I loved semiotics even before I knew what it was, before Umberto Eco was my professor, and before I discovered there were several kinds of semiotics, and that Eco’s wasn’t even going to be my kind of semiotics. My understanding of the world, of language, of music, of the things I studied and did ever since I was a kid—my understanding of myself, even—has always taken a semiotic form.
Just to clear the air: semiotics is everywhere. My very first class at the University of Bologna was a sociology class, and the professor claimed sociology is the one thing in which all humans are experts. That was some sensationalistic bullshit meant to wow a young audience, and only marginally to make it feel at ease. All humans are, for sure, sociological informants, but there’s quite a gap between being an informant and being an expert.
Unfortunately, the instructor of the next class, my first in semiotics, made no such statement about her subject. I think that was a lack of planning on her part, which made it more difficult for most of my class—and, at first, even for me—to grasp the value of what she was teaching. I felt there was something there, but it would take me a few months to realize semiotics was the fabric of my thoughts.
Semiotics is the fabric of everyone’s thoughts. What it does is study meaning as produced, understood, and communicated by humans—if there’s one discipline that could be picked as the science of communication, that’s semiotics. However, I have a few ideas on why semioticians have a hard time explaining what they do, and in some cases end up apologizing for it:
- semiotics has a quasi-scientific approach, but unfortunately its objects are not quantifiable, they’re not entirely objective (paradoxical, I know), and they often require negotiations and further approximations. This baffles not only hard scientists, but also those who practice quantitative social sciences;
- semiotics makes fast things slow: I could spend an hour talking about a two-minute scene from Mad Men, and apparently not everyone likes that;
- semiotics has been plagued by all the bullshit some academics live by, their internal and external power struggles, the special way they have of making things sound difficult to prevent others from thinking they have nothing to say. (It has also been plagued by terrifying neologisms, of which both the American and the French schools of semiotics have been masters.)
Make no mistake, I’m an academic too. A satellite of the academic world, that is, which I never fully embraced, but to which I’ve continued to contribute. However, I believe that bullshit and buzzwords are detrimental to any activity, in academia as well as in the world of people who make things.
Thou shalt not use semiotics to make things
I believe that what really did semiotics in was the idea that it should never be used productively, but should remain purely an analytical tool. One may well argue that the first objective of science is to understand the world, but there is an implicit (and not so implicit, really) further goal of making the world better by, say, curing diseases, building machines, and, often enough, repairing the damage that previous science has done.
This very academic (in the pejorative sense) definition of semiotics implies that its only contribution to the world is to produce texts whose only audiences are students, scholars, and the occasional nutcase who will read just about anything. Texts that observe, dissect, analyze, and figure out how the world makes sense, but do not, and ought not to, affect it directly.
Scholars who subscribe to this sterile notion of semiotics frown upon any application of semiotic theories, and smirk at the fact that certain fields, such as advertising, but also much of contemporary television writing, have not only been aware of the existence of the discipline (because guess where those former students of semiotics went to work), but use it on a daily basis to do their jobs. They abhor the idea that the underlying structure of a text might reveal an active knowledge of how texts in general should work, and deem such a text unworthy of attention.
Talk about ivory tower, right? This kind of semiotician wants to be free to enjoy the world, and marvel at the underlying structures, at the patterns of meaning produced, at the multiple layers of understanding and cross-references, but only as long as the world remains oblivious to these eyes and hands that are observing and dissecting it. As long as the world maintains the ingenuousness it had before semiotics was even a thing.
But guess what: just because semiotics was given a name and systematized in a few different ways doesn’t mean people weren’t trying to figure out how meaning worked even before. The proof is in that first semiotics class, which starts with the Greek philosophers. For as long as humans have not only produced meaning but also been able to record it, they’ve been trying to figure it out, to figure each other out.
What’s the reverse of semiotics?
I will admit that semiotics as an analytical tool gets in the way of a few things. Art is one. Art—the practice of creating texts that have no function, no constraints except that of existing, of telling stories, of pleasing or shocking the mind of the audience—doesn’t necessarily fail when it’s founded on a semiotic awareness, but it shows all its cards more easily. It’s self-conscious, and may come off as dishonest, as more of a trick than an act of genius.
On the other hand, things like advertising or writing a structured TV show aren’t art. They may have some artistic qualities, but they involve a different kind of focus, a different level of awareness, and, above all, an understanding of how to produce and convey meaning for a certain audience that art is better off without. Those, however, are the qualities of design.
I’m a semiotician, and I’m a designer. The two things go together, for to be a designer one has to be, at least to some extent, a semiotician. The designer must be aware of what makes things work and produce meaning, because meaning doesn’t just impregnate things from the outside.
Despite having made websites professionally for a while, the realization that I was a designer came to me suddenly a few years ago, and it was a surprise. Until then, I’d thought that web design was a special kind of design, not really design after all. Reading the experiences of people who had made a name for themselves as web designers, I had been led to believe that because I hadn’t gone to design school I couldn’t really be a designer. My studies, all the semiotics, all the observation, made me at most an enabler, and by making websites, by writing HTML and PHP (and by being utterly unable to draw anything) I was doing someone else’s job. I thought I would eventually develop my business enough to be able to focus on things that might be web-related, but more in tune with my education—that wide and indistinct field of communication studies—leaving the more technical, computer-sciencey details to someone else.
Then one day I just opened my eyes to the bullshit that’s all over the place, in academia as well as in the world of people who make things. You’re not a designer because you went to design school. You’re a designer because you want to understand how things work, and try every single day to make them work better. The day I realized that what I did on the web was more than just tinkering, that my years of seeing the world with a semiotic mind had shaped the way I worked, was the day I realized I was a designer.
So there was the link that held everything together: semiotics wasn’t simply a theoretical framework I used to understand my clients better, to make my content strategy more effective, to justify the fact that I was a guy with a PhD doing something any fifteen-year-old could do—isn’t that what people believed? Design was semiotics in reverse: the process of building objects where there was only pure meaning, of conjuring up tools that people could use, starting from ideas, abstractions, and thoughts.