The Subtle Frames of Disruption

or How We Can See Something Different
Looking at the Same Thing

I hesitated to make Fragonard's The Swing (1766) my icon for this Tackk essay, but as I thought about it, it seemed apropos for what I'll discuss today. I relish the ambiguity of the painting because it leaves space in me, as the viewer, to ask questions, which subsequently pushes me to think. The fabulous aspect of looking at art is that there is no right answer. As in life, the answer any one of us comes up with depends upon the kind of person one is, one's background, whether one got enough sleep last night, or whether one is paying attention to that which needs attending. Sharing what one thinks is its own celebration of diversity, even if there isn't consensus. It's the debate that counts, because the act itself encourages creative thinking. Some cognitive scientists might call this distributed cognition. That doesn't always mean two heads are better than one, but it does mean more heads disagreeing are better than living inside your own bubble, which can be easily popped. I've been hearing a lot of popping sounds on the Internet lately.

When it was unveiled in the 18th Century, The Swing was risqué, even titillating, because it was emblematic of the preoccupations of the aristocracy, who delighted themselves in intrigues of love and sex instead of running the country. You know, Les Liaisons dangereuses, stuff like that.

The reason aristocrats could spend so much time slacking and loafing is that at that time they owned 90% of France's national wealth. That's a lot of bread. It also explains what people do in a class where no one has to work. Most of their time passed in a leisurely commitment to various psychodramas and conquests of the sensuous (of course, the word psychodrama was yet to be coined, but no matter, you get the gist). One became entangled with "having experiences." Some reveled in the scandalous, others wilted.

Financial freedom did not make people better people in this case. It made them corrupt and dissolute. Without doubt there were many reasons for this, none of which will be discussed here, largely because I am not a professor of history (although the French Revolution is one of the most fascinating periods of history for me). This is just to note that's what life was like back then if you were born into money, and lots of it.

For this essay the point is that after the French Revolution the painting became a symbol of what was wrong with the class system particularly concerning the higher strata of the Ancien Régime. Same painting, different context.

The spell of the foreplay going on in the foreground pushes the dimly lit man on the right side of the frame, who works to keep the swing in motion, into the background. We see him pulling the swing, either conscious of the same thing that we see or oblivious to what is really going on. It's hard to tell. We see the couple enthralled with one another; then we look back again to the man who pulls the swing, and then again to them, swinging our gaze, so to speak, until our eye movement is disrupted by the sandal flung into space before it falls to earth. Yes, the slipper is supposed to make us think of hurling ourselves into weightlessness, if you know what I mean. That's my interpretation, and you'll have yours.

Showing a little leg and looking up skirts when it's you who is showing the leg or looking up a skirt is one thing. But watching someone else show the leg and someone else look up a skirt as an interaction between two lovers, oblivious of being watched, is quite another.

For the couple in the foreground, it's as if they are the only ones in the world, just like people entwined in the act in question. As much as pornography tends to amuse voyeurs, for many people with other priorities, watching real people have intercourse can be just a little mortifying.

Given that the actual painting is, by today's standards, extremely tame, it might help to consider a possibility (a presumption of mine): that the middle-class French felt revulsion upon viewing the painting, much as one might feel today upon viewing sexual interludes in wide-open public spaces. It was shocking!

Strangely, whether this experience of watching others is enjoyable or revolting is a class signifier. The prudes were from the middle class, while the profligates were from the lower and upper classes. I'm not sure who exactly defined that, the middle-class prudes or the historians. I did read it somewhere. Even today this may be the only social common denominator between the rich (because they are bored and look for excitement) and the poor (because they have nothing to lose).

This isn't to say that sexual expression should be a certain way, but that how it is interpreted depends on what frame you're standing in. I say "standing in" because a frame is not just the edges of a given picture you're looking at; it's the edges of the world you live in, what you take for granted, how you view what is novel, and the like.

For example in our time Madonna made public sexual/power display a commodity. For DSK, public sexual/power display is something of an illness. Similar acts of display, different frames.

Chances are, as intensely as these public figures inflame our imaginations, we just might not look like the porn stars we feel we are while we are enjoying the act. Fortunately, I think, a very large majority of us don't feel compelled to exhibit in public our most intimate moments with our significant others, and that's fine by me.

This now brings me to the central topic of the essay: disruption. It's a word that seems to gain more relevance for all of us with each passing day, especially if you find technology swinging into your frame.

Personally, I have always found this word, disruption, to be a bit rude. Rude in the American sense of the word, but also in the British sense. Here we have an example of a word with two meanings in two different cultures speaking the same language. Same word in the same language, different cultures, therefore different frames. For example:

Rude in the US means being ill-mannered and boorish, or showing no gentleness, respect, or sensitivity.

Rude in the UK means being sexually aroused. "I feel rude" means "I'm turned on" (its American slang equivalent).

So with this in mind, perhaps we can imagine the collisions of meaning and opportunities for misunderstanding if I were to say, "You're being rude," one of us being British, the other American. This simple example is what I believe I am witnessing with regard to a clash of cultures concerning a modern economic concept of change called disruption, which again I find is a little rude, in the American sense.

For the knights of industry, however, the word arouses them much as market competition arouses the lust of capitalists. I could be enticed to wager that the concept of disruption will be exciting material for historians 100+ years from now, after all of us alive have passed on. As long as the Internet isn't scrubbed clean of historical material by some virus or world dictator (or both), I imagine there will be a lot of content to page... I mean, click through.

I hope this essay makes the cut, not as expert material, but as the record of a contemporary witness, because from what I understand, one of the hardest things about historical work is capturing how the world was seen by those who lived through something culturally important while it was happening. And, well, disruption is certainly happening.

Disruption appears to be a word that, when spoken, signifies that the speaker throws down the gauntlet, as if to say:

"Sir, I now remove my glove and give a slap to challenge you to a duel, with my technology as my second."

The reply?

"I accept your duel, but I shall have history as mine."

And so it is that the current controversy commences. In a recent New Yorker piece, The Disruption Machine: What the gospel of innovation gets wrong, Jill Lepore, a very accomplished Harvard professor of History, sets out to critically examine the underpinnings of a business theory dubbed disruption (also known as disruptive innovation), as prescribed by Clayton Christensen, a professor at Harvard Business School and the revered author of the 1997 tome, The Innovator's Dilemma, which explains why companies fail. The important word in that last sentence is critically. I don't believe her intent is to prove anything, as proving goes in the academic sense.

Unless we are talking about mathematics (we're not), according to Popper a claim can only be disproven, never proven, if it is to be considered scientific. Of course, people can argue that there can never be anything scientific about history, since it's always the victors who write the narrative and explain how to interpret it. Yet if it is true that history can't be scientific, then perhaps evolution can't be scientific either, since it too is a theory of us and how we came to be, just from a biological frame rather than through culturally tainted frames of memory, documents, and other ephemera. And no, no one has invalidated Darwin's theory, as much as literal interpreters of the Bible try.

As the Internet becomes more embedded in our lives and our memory-making, we'll possess more documentation of what people said and why they said it, and we'll collect and sort these into datasets to see what they reveal. I imagine that history will become more scientific, because all the documents available will offer our descendants a wealth of evidence to explain how trends of thought developed into and faded from our consciousness.

Generally, evidence is interpreted. But sometimes the evidence becomes so massive that it becomes difficult to interpret in any but one way. I imagine that historians are getting a lot better at developing innovative methods to let the evidence speak, just as Darwin was able to do as a naturalist using his powers of description and his flora and fauna collections. Rather than being relegated to the role of an old British guy walking around a field telling stories in front of a camera, historians will come up with new and exciting ways to analyze immense sets of data spanning many hundreds of years. If we last that long, it will be thousands. Likewise, theorizing about economics will no longer be practiced as a dark art, but perhaps as a game simulation of events.

Think Piketty on steroids as an app for the kids a millennium from now.

Oh, excuse me. Back to disruption.

Much like The Swing, and the connotation of rudeness in the English language, disruption appears to embody a modern clash of cultures: Incumbents and Disruptors. The Self-Made and The Intellectuals. Academics and Venture Capitalists. West coast and East coast. Women and Men. The Sacred and the Profane. And likely other dichotomies. I'm always suspicious of dichotomies, because they tend to be false, like good vs. evil.

Granted, the Lepore article is not exactly an academic paper; however, the New Yorker piece does appear to be well-researched. It has caused a stir on Twitter, Forbes, and other places, and may ripple out around the tech and business worlds like an earthquake. Or maybe it will do nothing: a tiny tremor we experience as we look around and shrug, not certain the earth moved at all. Tiny tremors are actually better than big ones, because tiny tremors release a bit of tension and push the date of the Big One farther away from the present. That's a good thing.

An earthquake is something that, as a native Californian, I take in stride. Why? Because as a force of nature, or an act of God (if you believe in that), earthquakes happen. I'm not sure there's anything more disruptive than an earthquake. Your entire world stops even though everything around you is in motion. It's a surreal experience. Some welcome the cosmic ride; others throw up in fear. These are all valid human responses, and no judgment need be attached to the visceral.

Living through an earthquake is a good reminder of what is important to you. It fuses you to THE NOW and to your own mortality. You go into survival mode.

For those of you not schooled in how to survive an earthquake, the best thing to do is head for a doorway and stay right there until the tremors cease. Whatever you do, don't go outside unless you are far away from power lines, trees, or skyscraper glass. Don't use elevators either, even if they appear to work. In an earthquake, the most dangerous time is usually not when the earth shakes, but the moments after the first tremor. Usually.

After the strange sounds of the earth snapping, there is a gasp of silence. Buildings settle, gas lines may start to leak, power lines may snap and coil, fires may start, and then, in the indefinite aftermath, come the aftershocks. During and immediately after an earthquake, the last thing you should do is panic, because your panic can be the very instrument of your undoing.

When I was in the Loma Prieta earthquake in 1989, one of the disruptive surges of pride that filled my heart was how cooperative everyone in San Francisco was that first night and the next day after the quake. We all hung together. No one was rude (American. British, I cannot say). I don't recall any looting, though I saw opportunities. I do remember how dark it was that night because the streetlights were out. I couldn't see my hand in front of my face; I know because I tried. I almost couldn't find my way home. We lit candles and told and retold what happened to us, as if we had seen the face of God. Comparing ourselves to the quake of 1906, we came to understand firsthand what life is like at the edge of the Ring of Fire, and what it must have been like 83 years before. Eventually, we swept the dust away, counted our blessings if there were blessings to have, and moved on. That's the resiliency of San Francisco, I like to think.

Reasoning metaphorically, there's not much difference between experiencing technological disruption and experiencing an earthquake. And yet, for as long as we've been measuring the activity of the lithosphere, no geophysical scientist has come up with a definitive theory of earthquake prediction. There are plenty of hypotheses, though. Despite not possessing an earthquake theory, nothing stops us from practicing preparedness and then hoping for the best. No one walks the earth touting an earthquake theory, and at this point in history no one should. Why? Because the data isn't clear enough to form one. A theory must predict; a hypothesis doesn't have to, but it is constructive in forming a theory if the evidence supports it.

In an earthquake, sand moves while seeming to stand still, making small waves like water in slow motion. It's near impossible to build a sound foundation on sand. So is it any wonder that Silicon Valley, that valley of sand, is doubly vulnerable to tectonic shifts along non-geological fault lines? Disruption is the thing you must get to so you can get to the thing, that thing being a nebulous holy grail, whatever it is.

Yep, we Californians live on the Ring of Fire. We walk the razor's edge. To survive, we understand how to snap back, zen-like. This isn't something that comes naturally to transplants who first arrive at Gold Mountain. Unless one is fortunate enough to hitch one's wagon to an old geezer who's seen a few big ones, it's fairly likely one will leave the area and go home, or be pushed out for not having sharp elbows. It's not for the faint of heart.

I bring up earthquakes because unless you have been through one, you can't understand what it's like. People not living here will exaggerate what an earthquake is about, or project such a feeling of doom and paranoia that their conceptions don't line up with what actually happens. Regardless, earthquakes happen. Likewise, disruption happens. The real question for all of us is: what is it? That's what we should work together to understand.

Disruption as a theory of change, and how it is interpreted, appears to be as subjective as how one responds to an earthquake. Perhaps that's why it shouldn't be called a theory. It's not that disruption doesn't happen or shouldn't happen. Fear can generate the strangest behaviors and expressions. At the same time, cool heads will always prevail, and that's why I think debate about the facts, and not ideologies and interpretations, can only be productive.

We should welcome this controversy. There is a there there; however, it's not clear at this point whether we are looking at the same phenomenon and interpreting it differently, or looking at different phenomena and applying the same word to both.

Unfortunately, one caustic response to the New Yorker article was written by Will Oremus, senior technology writer at Slate, in "The New Yorker Thinks Disruptive Innovation Is a Myth." Rather than argue on the merits, he doesn't really argue at all. One might call it hysterical spitballing.

In response to the article, I posted my 140 characters along these lines during a tweet conversation, and received a reply from King Arthur of the Round Tweetstorm himself.

Regardless, the Slate article is snark in its highest art form: sarcasm, ridicule, and bravado. Which saddens me, because an opportunity for authentic discussion was lost. Unless the article is an intentional clankle of irony (if so, I missed it), the takeaway is that disruption is a fighting word, not only in its eventuality in the marketplace, but in its definition for those who parlay it. Oremus is just more unwanted noise in the echo chamber.

I suggest that because disruption is many things to different people, it activates different frames within each of us. The big flag for that is all the emoting going on among the Venture Capitalists. Lepore's article is not devoid of emotion either. She does her fair bit of lunge and parry, especially when she notes:

“Use theory to help guide data collection,” Christensen advises.

If Christensen really advises that, I can't think of anything more damning, because it is anathema to legitimate research. Any academic worth her salt would take stormy exception to such a proclamation. Wouldn't you?

This is why, as an observer, I found the reaction to Ms. Lepore's article more bizarre and unexpected than if she had lobbed a cow from the ramparts.

Mount Harvard hurling

On the one hand we have Ms. Lepore making an earnest, albeit severely withering, critique of Christensen's theory, along with observations such as:

Most big ideas have loud critics. Not disruption. Disruptive innovation as the explanation for how change happens has been subject to little serious criticism, partly because it’s headlong, while critical inquiry is unhurried; partly because disrupters ridicule doubters by charging them with fogyism, as if to criticize a theory of change were identical to decrying change; and partly because, in its modern usage, innovation is the idea of progress jammed into a criticism-proof jack-in-the-box.

This seems to be a fair assessment, nothing truly out of touch or bizarre. If you don't agree with her, that's OK, but her assertions are not bizarre. On the other hand, her hurl seems to contain the very thing that proves it, if only because Mr. Oremus states:

“I couldn’t help thinking that her thesis boils down to: ‘Disruptive innovation is a myth, and also please stop doing it to my industry.’”

and also

“This may just be my ignorance as a scooter-riding, jeans-and-sneakers-wearing, office-sharing, couch-sprawling blogger, but Lepore lost me right about here: ‘Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification.’

Whew! I guess Christensen isn’t the only one who overreaches now and then.”

Is this not ridicule of a doubter? Is it not a thinly disguised charge of fogyism? Maybe even the oft-used "crazy woman" routine, such as Karl Rove's attack on Hillary's brain? Is it just me, or is the "she must be crazy!" charge getting to sound daftly old and predictable?

I have to point out, if only as a question (not an accusation): does the harsh slingshot ridicule by Oremus have anything to do with Lepore being a woman? Is there something about this that is tied to gender?

I have to ask because I recall that when Piketty's Capital in the 21st Century came out a month or so ago, the reaction in capitalist fire rings was certainly tsunami-like. Debate and discourse blew like a trade wind across the Internet, which subsequently sold more of the book on Amazon. All the various articles I read in response to Piketty's work had to do with the work, not the man. The debate that has ensued since the book's publication is healthy, and outside of the predictable internet meme humor, such as seeing the book set on the floor as a doorstop in an investment banker's office, the reaction was fairly professional. I did not see the gaggle of geese honking ad hominems that I've seen posted against Lepore. Even Larry Summers, who disagrees with Piketty's conclusion for a global wealth tax, does not stoop to conquer; he offers kudos, claiming it is Nobel-prize material, albeit in a backhanded way. So while I can't prove that there is a gender issue going on under the surface, am I allowed to ask, "Is that what's happening here?"

If I might offer a small defense of Lepore: she is a professor of History. Yes, at Harvard, something of an accomplishment for a woman, as most tenured History appointments typically go to men. However, she isn't overreaching, not even now and then. This only speaks to an area where Oremus shows a lack of understanding in both content and context, while his spittle flies from the screen. Lepore describes to us the evolution of thought and interpretation with regard to the human concept of History. That's with a capital H. If you know your History, you'll know that how we conceive human history, that is, a theory of history, is a long-established topic of study (any historians out there are invited to comment below). Lepore has done quite a nice job of explaining it to us.

All we have to do is read it:

“Every age has a theory of rising and falling, of growth and decay, of bloom and wilt: a theory of nature. Every age also has a theory about the past and the present, of what was and what is, a notion of time: a theory of history. Theories of history used to be supernatural: the divine ruled time; the hand of God, a special providence, lay behind the fall of each sparrow. If the present differed from the past, it was usually worse: supernatural theories of history tend to involve decline, a fall from grace, the loss of God’s favor, corruption. Beginning in the eighteenth century, as the intellectual historian Dorothy Ross once pointed out, theories of history became secular; then they started something new—historicism, the idea “that all events in historical time can be explained by prior events in historical time.” Things began looking up. First, there was that, then there was this, and this is better than that. The eighteenth century embraced the idea of progress; the nineteenth century had evolution; the twentieth century had growth and then innovation. Our era has disruption, which, despite its futurism, is atavistic. It’s a theory of history founded on a profound anxiety about financial collapse, an apocalyptic fear of global devastation, and shaky evidence.” (My emphasis)

It's an evocative lesson for the rest of us about something that isn't a mainstream notion packaged and streamed in a podcast. That's the New Yorker for you. Still, what she says is quite relevant to today and to us. Moreover, it's a ticklish morsel of evidence upon which the entire arc of her argument launches. Whether you see this as a fetchez la vache moment or not, a flying cow can squash the bejesus out of you if it happens to land on you. It seems to have landed upon Oremus, if you don't mind me saying.

Even Forbes is a little annoyed, as might be deduced from the title, Is Clayton Christensen's 'Disruptive Innovation' a Myth? I've read the Lepore article a few times and I can't find anywhere that she asserts disruptive innovation is a myth; she just claims that Christensen hasn't presented his evidence with rigor proportional to his assertions of predictiveness. Where's the controversy?

Instead, in an example of "if you repeat it enough times, it becomes true," Forbes Leadership Editor Frederick E. Allen closes his article parroting Oremus's ventriloquism, words that Lepore never said:

“‘Disruptive innovation is a myth, and also please stop doing it to my industry.’”

One early commenter on the Forbes article, writing under the handle DProssor, states:

I read Jill Lepore’s missive in The New Yorker a few days ago and then, after this column, read it again.

I respectfully believe you’re missing something in this conversation....

I was glad to know it wasn't just me who thought this. So the Forbes article is yet another instance of the Leporian whiplash the New Yorker piece has caused in the self-satisfied bizworld, angry about such a... disruption.

This lobbing of farm animals over walls reminds me of a legendary event, that of Lady Carcas in 8th-century medieval France. (Is this where the word carcass comes from, I wonder?) The story goes that the Saracen princess held out through a five-year siege by Charlemagne of the city we know today as Carcassonne, in the South of France. Running out of food, she conducted an inventory and learned that all that was left was a bag of wheat and a pig.

In a crafty example of early psychological warfare, Lady Carcas fed the pig the bag of wheat and then had the pig thrown from the highest tower. Charlemagne, impressed upon seeing the pig's innards full of wheat, and interpreting this to mean the Saracens had so much food that they could afford to waste a pig, ended the siege. (Talk about one-bit communiqués!)

It turns out there was no actual Lady Carcas, but the story does make pigs fly in our imagination. Cows, if you're Terry Gilliam. The parallel here is that what looks to be a messy hurl from Mount Harvard is, if we carefully examine the innards of the New Yorker pig, full of truths worth considering, even if it doesn't tell the entire story. Lepore did not say she held the definitive version of disruption. She doesn't have to. She is critiquing the assertion that it is a theory that predicts, something even Forbes reluctantly admits.

I have to say as well, no one has yet mentioned anything about the pompous Christensen data-follows-theory quote, which is really beyond amazing. It hurls me into a weightlessness of disbelief that anyone would take that advice seriously.

Moreover, while she justifiably challenges the academic integrity of Christensen's work, she does not, does not, call him a huckster. What has happened is that hucksters, like locusts, have jumped on the disruption bandwagon, so even if Christensen has made a viable contribution to understanding the siege-like destruction of companies, his theory has been obfuscated by those who, like hyenas, want to ravage the dead pig rather than interpret the one-bit message.

Maybe the lesson here is that if you proffer a theory that is not sound, you're opening the cab door for "theory bohemians" looking for a free ride. That serves no one: not the historians, not the business schools, and certainly not Silicon Valley.

Isn't debate between subject-matter experts de rigueur in institutions like, say, Harvard? Should we be surprised? I'm not. It's not as if Lepore is trying to take down the entire tech industry. She is merely critiquing one of her own. It's called peer review, albeit public and informal. Peer review is in essence the great-great-great-grandfather of the open source movement, so shouldn't it be given some respect? It's all about more eyeballs. At the moment, mine are fixed upon the slipper.

What we see is a clash of cultures unfolding before us. This is commonly why so many academics are misunderstood. It might also be why technologists are misunderstood. It is possible to debate on the merits, and to do so severely. So let's fix that problem and get people to talk to one another instead of trading barbs at the pissoir.

Disruption? Let's step outside to the pissoir, shall we?

If you've ever worked in a startup, or participated vicariously from the sidelines as a tech enthusiast (as tech journalists are wont to do), you know that technology companies are enormously stressful cooking pots of industry. There is no room for the inept, the slackers, or the non-clairvoyant. It takes a village of guts to risk the money, the time, and the reputation to participate in this culture of innovation.

Where the South bears a grudge against them damn Yankees for the destruction of southern gentility (an intentional euphemism), the West sustains an ongoing annoyance with the East for its presumption that the world revolves around New York, Boston, or Washington, D.C. California, on the other hand, has a bustling economy that competes with nation-states, so we must be doing something right way out here in the boonies.

West coast culture is pioneering. We welcome the new and newcomers. We are populated with the descendants of those who had an urgent desire to get away from the vertically static and asphyxiating social hierarchy of Old Money and sundry Mayflower bluebloods. In California anyway, we are always eager to try the new, to experiment, and to play with something novel. If something "has always been done that way," we'll consider a way to do it better, and then we'll tinker on something different by the shine of the garage light bulb.

Of course, Christensen is not exactly West coast, but I can understand why he would be a darling of Silicon Valley entrepreneurs. He has given a voice in the East to what we do in the West. What's not to love?

Still, as is typical of American business people, there is an aversion to intellectual pursuit and those who live by that standard. I can see both sides.

Frequently arguments over theory between academics can be as interesting as watching two dogs sniffing under tails circling one another ad infinitum. After a while, you just want them to stop it.

Theory doesn't mean anything if the rubber doesn't meet the pavement. But that's what's so bizarre to me. Lepore's challenge is that this particular theory doesn't have predictive power. It can't be applied.

By the way, disruptive innovation is something we can see much farther back than the late 1980s. The effects of innovative technology are peppered throughout history; it's a fascinating topic to study and the subject of countless BBC programs.

So while disruptive innovation may be an anthropological event in history (and I believe that it is), as an applied business theory it falls short. I'd think the self-made would want to listen instead of snapping towels at the messenger.

Christensen has made a valid contribution if only because he appears to be the first to detect something newish happening, which, from what little I've read, appears to be the devastating effect of accelerated change generated by innovation, and why businesses with monopoly-like control had better think twice. That is something those in the technology camp are fully aware of from lived experience.

It used to be that if you were first to market with an innovative product, you could play king of the mountain fairly successfully. But in technology, which has lowered the barrier to entry in some ways (though not in others), if you are first, it's more likely you'll be last before the end of the day, because someone will look at what you've done wrong, execute it better, and then take all your customers, or their eyeballs.

Just consider Internet Explorer to Navigator, or Facebook to Six Degrees, Friendster, or MySpace. There are other examples. This is the struggle of the Incumbent versus the Disruptor. Sometimes the disruptor can be a very large established company, and not the lean start-up. Just look at Apple and its iPhone, which has completely reshaped the mobile phone experience for millions of customers.

The biggest takeaway of the Lepore article and its reception has to do with something Sacred and something Profane. These frames activate differently for different groups, since what is profane to some is sacred to others and vice versa.

On the one side we have The Literary Magazine with Lepore as its jousting champion fighting for the honor of literature, history, research, and learned thinking, all refined ladies in their own right. On the other side we have Silicon Valley with its entrepreneurs fighting for the honor of ambition, enterprise, invention, and creativity, other charming ladies who beguile in their fast-company mannerisms.

The controversy concerning disruption is a culture war.

Regardless of the foundational issues that have caused this, the debate is healthy. Let's abandon defensive ridicule that only stirs up the dust. Something important is happening here. Let's try to understand what that is.

Necessity is the mother of invention. We need to talk, not only to talk about the content, but also so that we learn civility. It may mean inventing new ways to speak and new ways to listen.

Stop wasting time.

If both sides were to see what they share in common rather than where they differ, something truly productive might come from all this chatter, thrown slippers and whatnot. Each side has something meaningful to offer the other, if only to understand that the honor of all these ladies is important to protect if we hope to preserve moral excellence in human society.

We need one another right now more than any other time. We must learn to listen to what the other side finds grievous and even if we don't see it quite that way, we must be respectful of what the other finds sacred. It can be done. The alternative is a holy war.

If you get too far into that habit, your identity forms around the existence of enemies instead of friends, making the world an inhospitable place. One day our bones will be dust and all that will be left of most of us are the words we write upon the Internet. The world is wide, so there is no need to squabble when there's room for all of us. At the same time, we stand on a very small planet in a very large universe, and sometimes it's good to remember how tiny and insignificant we truly are in the absolutely big picture.

Maybe that's why in such a large universe just a little love can go a long way.

All we need is love.

You are most welcome to leave a reply using Tackk's commenting tools below.

Some Additional Links on Disruption
for Connecting Your Own Dots

I expect there will be many more articles concerning disruption, perhaps even a second response from Ms. Lepore. I hope so.

Silicon Valley, the New Yorker, and Disruption
Winterspeak - June 18, 2014

The New Yorker: Battle Of The Strategy Titans
Forbes - 6/19/2014

Disruption is a dumb buzzword. It's also an important concept - June 23, 2014

Leadership Hall of Fame: Clayton Christensen
Author of "The Innovator's Dilemma"
Fast Company - March 31, 2011

The Tyranny of Structurelessness - July 1973

58 Cognitive Biases That Screw Up Everything We Do
Business Insider - Jun. 18, 2014

Hey! You! Get Off Of Our Bandwagon
Tech Crunch - Mar 15, 2014

Even the Father of Disruption Thinks “Disruption” Has Become a Cliché
Slate Magazine - June 23 2014

An Incompetent Attack on the Innovator's Dilemma - June 27, 2014

(I'll add to this list as I find more! Cheers!)