Doing Away with Discipline: The Way of the Digital Scholar

In his 6th chapter, “Interdisciplinarity and Permeable Boundaries,” in The Digital Scholar: How Technology Is Transforming Scholarly Practice, Martin Weller (2011) anchors the idea of Interdisciplinarity in digital practices that reshape society. Drawing from Chris Anderson, the current TED curator, he claims that “lightweight and unrestricted forms of communication found in many Web 2.0 tools may serve the needs of Interdisciplinarity to overcome existing disciplinary and geographical boundaries” (p. 2).

Weller suggests that open, digital, networked technologies are, in many ways, responsible for an “unexpected collision of distinct areas of study” (p. 2). To an increasing extent, digital culture permeates the walls of the ivory tower as technologies enable new practices, which “create a common set of values, epistemological approaches and communication methods” that “override those of separate disciplines” (p. 3). Approaches to research emerge that refigure what it means to be a researcher as academic behaviors encompass more and more digital practices. Researchers adhere to new, emergent norms of discovery in their work, which often run counter to the traditional, fragmented, departmental models of an analog past. As a result, new pathways leading to different-yet-viable methods of knowledge production are formed, reshaping institutions and disciplines as they crystallize via publication. As scholars tread grounds beyond their familiar intellectual territory, pursuing innovative ideas outside of their academic home, they form alliances with others by way of new media. Blogs, social network sites, and wikis evolve with scholars’ ideas as convergence cultivates creativity, play, and other forms of generative learning that cut across disciplinary boundaries.

This is a big deal for an academy structured around a model of institutionalized knowledge, which developed a fragmentary schema of disciplined study sometime in the medieval period. In the clichéd words of Bob Dylan, “The times they are a-changin’.”

For Weller, Interdisciplinarity goes beyond the physical constraints of pre-networked society where “Journals need[ed] to have an identified market to interest publishers, people need[ed] to be situated within a physical department in a university, [and] books [were] placed on certain shelves in book shops” (p. 2). Digital practices lead to virtual spaces where cultural norms and standards adhere to new possibilities, enabled by global networks of scholars who reform the functions of their trade and find innovative uses for new media tools lent to research efforts.

Problems in the academy arise when a clash of realities between digitally-oriented and analog-secure scholars leads to disagreements about rigor and relevance. Many scholars oriented toward tools of a pre-network society (i.e., analog technologies and traditional means of gaining public recognition) remain unconvinced that digital practices can be rigorous or salient. As skeptical reactions toward Wikipedia’s credibility illustrate, many academic professionals who hold sway over tenure promotions and search committees remain suspicious of digital practices, distrusting the viability of knowledge that emerges through work that is digitally prodused under the cultural auspices of openness, free access, and quick turnover.

At the same time, Interdisciplinarity is condoned when tied to emergent digital practices. Weller frames this “schizophrenic attitude toward Interdisciplinarity” (p. 1) as a problem of exploding traditions.

He exposes a reality in the academy where scholars of an “old guard” who seek to defend the boundaries of institutional disciplines cling to analog tools and the methodological constraints of an old paradigm. The cohort of digital scholars entering the academy, as both students and new faculty, is forcing those who protect the standards of traditional approaches to yield their posts as they crash institutional gates with smartphones, tablets, Google Analytics, Blogger, and Twitter – all tools that diversify research audiences, amplify scholars’ messages, and ensure that scholarship has a larger impact when published.

In short, the digital difference in scholarship is Interdisciplinarity, since digital practices break down barriers. With digital tools come digital practices and standards that academic institutions must take into account as they move into the future. Academic definitions of knowledge and discipline are forced to shift with a paradigm of practice that threatens the authority of institutions everywhere (see Weller’s discussion in Chp. 3 regarding the music and newspaper industries).

In Weller’s view, Interdisciplinarity doesn’t only apply to academic work. In reference to blogs as a genre of writing that leads to inquiry, Weller suggests that the “personal mix is what renders blogs interesting” as he explains that, in one of his favorite blogs, the author “mixes thoughts on educational technology and advice on the blogging platform WordPress with meditations on B-horror films. The mix seems perfectly logical and acceptable within the norms of the blogging community” (p. 4). The takeaway here is that digital culture remixes other cultures, including the intelligentsia, and this leads to new social formations. Scholars reinforce altered practices of engagement, learning, and knowledge production with their research, regardless of its focus or content, as they use digital tools to conduct it.

This means that the academy is changing from the inside out—a centrifugal force pushing out old hierarchies as it makes way for new networks. As Benkler (2006) suggests in The Wealth of Networks, there is value in these networks, which is derived from the network itself and the swarm that embodies it. New networks have their own energy, establishing new modes of evaluation, new means of discovery, and new ways of making meaning through human action that gnaw at the edges of the disciplines that keep old hierarchies sturdy and analog identities intact.

As Weller notes in his 3rd chapter, “Lessons from Other Sectors”, academia should take note of alternative resources that lead to new forms of research and learning before it loses its institutional hold on knowledge as an ideological authority. While this may seem a bit pretentious, the everyday experiences of academics who utilize digital tools frequently reveal the pertinence of such a warning. As digital culture subsumes disciplinary culture, Interdisciplinarity becomes more of a reality and ideological apparatuses are reshaped to fit “the social classes at the grips in the class struggle” (Althusser, 1970). The “weakness of the other elements in the ‘university bundle’ could become apparent, and the attractiveness of the university system is seriously undermined” (Weller, 2011, Chp. 3, p. 8) if traditions remain carved in blocks of stone.

Digital practices chip away at those stones.

The networked foundation for digital scholars’ work gives them the stability and solidarity to tackle complex, societal issues in ways that “old guard” academics never imagined possible. As a result, they may find their efforts having a greater practical impact outside of academia because institutional standards fail to adapt. This is a dismal attitude to take towards schools, which have made technological development and intellectual growth possible for eons. However, as Weller warns, we should not confuse “higher education with the university system” (Chp. 6, p. 1); people will find a way to accrue new knowledge in any way available, and if that means subverting the dominant, traditional university system, so be it. The integrated perspective of Interdisciplinary pedagogy that Weller draws from Ernest Boyer, which makes “connections across the disciplines, placing the specialties in a larger context, illuminating data in a revealing way, often educating non-specialists” (p. 1), is more hopeful than the critical view taken by many scholars caught up in the current system. This may be because hardworking academics who strive to climb social hierarchies do not stand in solidarity together.

It is no lie that many graduate students and untenured scholars are bent on dismantling the good work of their brethren, who have spent a lifetime building the best stocks of knowledge they can in contribution to their discipline. In the end, these scholars belabor tired points in graduate seminars and faculty meetings, more concerned with asserting their self-centered agendas and personal politics as a way of accruing social capital than with fostering ongoing dialogue amongst their colleagues that would lead to new ideas and innovative inquiry. Digital practices tap networks that provide academics with outlets to collaborate laterally and avoid the traps of corporate machinery embedded in the institution, nullifying the need to burn bridges and step on toes as they make their way in academia.

The limiting scope that arises when scholars squabble over methods of research, play tug-of-war over lines of authority, and willfully thicken tensions between “hard” and “soft” sciences is perhaps the very reason why Interdisciplinary work evokes a laugh when suggested as a bona fide approach to research. Weller sees diversity as nothing to fight over. The habits of discipline are hard to break, “and interdisciplinary work requires transcending unconscious habits of thought” (p. 2). Scholars who commune through digital practices begin speaking new, integrated languages that bridge gaps between research agendas rather than widen disciplinary lacunae. This is because, in their practical nature, digital technologies dismantle boundaries of institutionalized thought, not the thoughts of institutionalized scholars.

So what would Weller’s Interdisciplinary model of higher education look like?

I asked my girlfriend this question after I finished reading Weller’s book. We both have different opinions about what counts as research. You might say that we both have trouble transcending disciplinary habits. While we both attended liberal arts universities in our undergraduate studies, our affiliations as graduate students differ. I study Communication, so I consider myself a humanities scholar; she dons the tag “social scientist” as she studies Applied Anthropology.

In our conversation, I envisioned a school where scholars work together to diversify fields of interest and broaden student perspectives. Explaining my ideas, I began brainstorming a curriculum that put Interdisciplinarity at the center of pedagogy instead of at the margin.

At first she was intrigued by my excitement.

“Could you imagine it? … What if, as an undergrad, you could take classes that blended different areas of study? Something like ‘Environmental Ecology and Spirituality’, ‘Statistics and Performance’, ‘Geographic Information Systems and Food Cultures’, or ‘Creative Writing and Biochemistry’. How cool would that be?”

Her expression went from hopeful to disturbed. “Everyone would be really confused,” she said.

Perhaps.

But I don’t see that as a bad thing. Then again, I’m a digital scholar.



There but for the Grace of God go I

I’ve been thinking about my grandmother a lot lately. She was a wonderful, God-fearing woman who believed in Grace above all things. “There but for the Grace of God go I,” she would say, almost as a way of leaving a phrase unfinished, as if it was her way of reminding us that life will always continue to challenge, trouble, disrupt, and dismay us, but that we will be required to keep living in spite of our shortcomings and displeasure.

I wish I’d been mature enough when she was still alive to tell her how much I appreciated her—how much I appreciate my memory of her now. The way I remember her coping with life’s small miseries creeps into my thoughts daily, serving as a manual for difficult moments that can only be read after the fact. No matter how hard I try, I can never seem to hang on to the lessons indefinitely.

I wasn’t close enough to her at the end of her life.

For some reason, my thoughts always drift toward her in my most challenging moments, searching for examples of similar situations where she exemplified a better strategy for coping, or told some story with an important moral lesson that could act as a guide the next time around.

She had a way of crafting words around caring gestures that warmed a room, never leading to controversy, awkward confrontation, or confusion. There was a pleasant, serene aura about her at all times, but not in a naive way; that is, there was a subtle intellectuality about her presence that was truly…

Graceful.

Religion is attractive for the sole reason that it allows us to make real, through parable and meta-narrative, the very fantasies that come to life in our memory. How romantic the Christian notion is that we can speak to the dead, or that metaphysical beings are watching over us, stewarding loved ones left living until their final hour. A Christian might tell me to pray and that the person I am thinking about, dead or alive, will receive my blessing. Or they might tell me that “your grandmother knows” how you feel because “she’s watching over you.”

It’s funny how the ways people talk about religion echo the ways people talk about the Internet.

Regardless, these narrative devices are nice ways to make sense of a confusing world, but they’re not very comforting, at least not to me. Maybe there is some truth in the “up there” and “out there” notion of a Christian spirituality, but the general dogma is much too contradictory and simplistic for me to embrace its romantic notions. In short, religious scriptural naivety ruins the magical moment of romance that most derive from the promise of salvation.

Still, I do admire those who have such firm belief in fantastic notions of life ever-after. I appreciate that they are able to find peace in ritual, study, and practice.

My grandmother was this sort of person—a religious person who truly believed in something. I absolutely respect such certainty as a way of being when it is grounded in notions of goodwill, honesty, and faith. Some people, like my grandmother, understand Grace to a much deeper degree than those who constantly question their beliefs and, as a result, struggle to suspend their assumptions when making connections with others that really matter.

I believe in humanity and in goodness, for sure, but I can’t rationalize a heaven (in the monotheistic sense) any better than I can order coffee at a diner speaking Gaelic. As an intellectual, I feel like I’ve distanced myself far from anything that resembles what I used to know as “faith,” and that salvation is a dish best served cold. As a result, I struggle daily to trust in others, to believe that others will help me, that they will offer some missing piece of a puzzle needed to keep living, or that they’ll help me achieve my goals.

How can I put stock in a mystical universe based on faith when my career is rooted in the practice of explaining away complex phenomena so that I might understand the most confusing aspects of the universe? After a while, I read so much that I begin to realize that everything can be explained, deconstructed, reconstructed, understood, performed, modulated, queered, queried, themed, grounded, compartmentalized, analyzed, translated, or criticized. Before too long, the quest for an all-encompassing Truth is easily abandoned, left to drift down the river of yesterdays, where bits and pieces of forgotten selves—religious beliefs, innocence, naivety, skepticism, goofiness, hopes, wishes, magic—collect in pools that straddle the banks of a personal past left behind.

There’s no Grace in my daily life—not usually. It isn’t something that comes easy to me, like it did for my grandmother. It’s something that I have to cultivate by reminding myself to take it easy, to take it slower, to stop and breathe, to cope, and be aware of what’s around me.

Grace is the one thing that my grandmother seamlessly embodied. I know that she had a lifetime of encounters to learn these traits, and that she was probably more willful in her youth, but my memory relentlessly reveals her Graceful prowess. I remember her best as the woman who was inviting, open, joyous, gentle, delighted, and easygoing. These are all qualities that would be useful to me now, more than anything, if they could only be remembered in the moment of stressful encounters with others.

I don’t think I’m alone in this.

I think Grace is a struggle for many, and I warm to people who don’t reflect my willful demeanor. I think most of us—academics especially, critical scholars most of all—ought to remember to strive for Grace while speaking our minds, making claims, and doing what we’re trained to do.

In I and Thou, Martin Buber says that all actual encounter is by Grace, not by seeking, and that all actual living is encounter. I interpret this to mean that the only way to truly live is to relinquish moments of willfulness so that the other—whomever or whatever the other might be—may be invited to experience oneself fully and wholly, no holds barred. To me, Grace is about finding comfort with one’s own vulnerability so that, when we least expect it, we may trust in a universe that can never be fully comprehended.

This semester has brought work-related stress to a whole new level and I’ve been struggling to cope. If all of life is suffering, like the Buddhists say, then I’ve been experiencing an order of it reserved for the busiest, most intensely demanding, and most productive times. It has been hard to be Graceful, much easier to assert my will over the tasks of daily living in order to command, conquer, and accomplish everything that I imagine is expected of me.

But there comes a point when all of the willfulness that helps me efficiently check things off my to-do list—the coarse, protective, power-focused, nerve-centric affect that I carry in my shoulders after 6 cups of black tea and 10 minutes to spare—becomes too much to bear.

 

There comes a point when I no longer recognize I.

I seem more like some one else.

I forget that I am only myself in relation to You.

But without Grace, there is no You—not to me, anyway;

There’s no trust, no vulnerability, and no openness.

Without Grace, there is no I, no actual living.

***

“Nick, I need to see you for a minute before we break off into groups,” she says, throwing her words over a few chatting colleagues sitting near me, already discussing their projects. I skirt the table quickly, weaving through spinning chairs as I near her end.

“Heya,” I say, with a certain nonchalance. “What’s up?” I can see that she has something pressing to discuss. I’m eager and a bit nervous, but in an expectant way that’s not so much fearful as ready and confident. My paper sits in front of her, notes written across it.

She looks forward in thought as she speaks, crafting the right words, responding in a thoughtful and focused way that senior professors have learned from what seems like thousands of years of experience, in classrooms where they’ve refined professorial wizardry via literary magic.

“That comment I made a minute ago about finding what you are writing about 5 pages into the paper—I hope you picked up that it was directed at you.”

I nod.

“I see what you have here, and it’s nice and I get it, but sometimes you get so abstract—I mean, you really let yourself go—that it becomes unclear what you’re getting at. The reader has to know where you are, what you’re doing, where you’re taking us. Your job is to rein yourself in and tighten it up, or else we’ll be totally lost.”

I’m grateful and I agree, but my face doesn’t show it. I want to be appreciative, but my voice won’t let me. The exact opposite comes out and we grapple in discussion over ideas for my paper. For the next 10 minutes we hash over ideas, rationalizing, mythologizing, tossing out possibilities in search of commonality. Something in the darkest cavern of my mind wants to escape and say “thank you just for reading it and offering feedback,” but waves of intellectual prowess infused with oily pride keep it hidden away, under pressure, unable to reach the surface. In the back of my mind, I’m wondering why I can’t ever accept feedback as it comes—the way it comes—without added expletives, justifications, or defenses. “Why the need to explain yourself?” I ask myself, inaudibly.

We continue to exchange details and get a feel for each other’s needs. Eventually, we reach some common ground, but not before I deliver a beautiful rendition of “difficult grad student” for an undeserving customer. “Must I always be such a pain in the ass?” I ask myself, clenching my teeth to keep my mouth shut and end the conversation.

I walk to my office pleased with the progress that we’ve just made, glad to have more direction, elated that a person with her esteem and sensibility is working with me. Undergirding all of this is an overbearing feeling of shame that I couldn’t show that I was grateful for the time and effort she put into helping me. Embarrassment is out of the question when self-spite like this is in order, and my head gets hot as pride boils on the tip of my tongue. I cuss at myself as I search the halls, turning the corner by the bathroom, wishing that I could be more of the appreciative person that I sense myself to be in actuality.

“Grace,” I say out loud, just before I see my colleague coming down the hall. “Where is the Grace?” I ask, in my mind, emphatically searching for a way to change my mode and shift my mood. My colleague comes up beside me and we stride in step as we enter my office to begin collaborating on each other’s work. I feel myself sink into a funk, unable to cope with the failed dialogue I just had and suspecting that I’ll botch this next encounter all the same.

We sit and read for a few moments and then toss around some comments, lightly chatting about each other’s work. Her paper is excellent, written in a voice I only wish I could capture. There is an elegance and directness about the way she describes her actions, writes her environment into the story, and illustrates her relationships to other characters that is unique and visual. It’s a style that is second to none in our department. Having read some of her writing a few years ago, I can see that it’s full-bodied and enlivened now. I can already see her paper panning out into a publication of some kind that is definitely worth both of our time and effort. I truly believe in a collaboration-driven ethos, especially when it comes to writing narrative.

She continues to admire some of the things I’ve written and casts a glance over the pages she’s holding that beckons me to respond to her work. I tell her that I’m pointing out a few things that are really mechanical by nature, no big deal. “Just some things that help invite the reader into the story a bit more,” I say, without urgency.

Her glance changes from an inquisitive hopefulness to a concerned dryness, more worried than ugly. We bicker for the next few minutes about whether or not the details I’ve annotated are really worth editing, arguing over the definition of conversational voice, before we notice that we have to get back to the classroom and report to the rest of the group.

“I’m a better editor than I am a writer,” I boast. I think it’s the truth. As we leave, I tell her that I’ll work through her paper overnight and bring her back some good feedback in the morning. I’m hoping that my willingness to put time and effort into editing will ease her nerves. I love editing because it helps me be a better writer and lets me practice providing solid feedback that might actually be of use. Helping people move in better directions is always fulfilling if they allow you to intervene.

Again, I’m met with what I sense to be distrust and discomfort.

Silence.

“I’m sorry, I didn’t mean to make you uncomfortable. I don’t have to edit your paper if you don’t want me to. I’m just trying to help.” It’s awkward, and I can see her struggling to find a gentle way of telling me that she’d rather not have my input. I’m struck by this, thinking that we had a fairly collegial relationship until now, and I can’t help but wonder if her eventual “I’d rather you didn’t read it” comes because she doesn’t trust my character in general—or if it has something to do with what’s written in the paper, something I have yet to discover about her that would reveal a tender spot of vulnerability.

I never liked not being allowed to help people. Helping is one of the only ways I feel like I can show my true colors and be the more gentle, caring, responsible, compassionate, and well-intentioned person I sense myself to be; it’s the only way that the sunken treasure of gratefulness has ever risen to the surface of my everyday experience with others. I know that tenderness breeds tenderness, and I want to show her that I can help, if she’d only give me a chance. I do my very best to block out self-consciousness and doubt in order to reaffirm my commitment to help her.

We turn the corner into the classroom without coming to a resolution. Her paper in hand, I engage with her about our brief conversation as we discuss our group meeting with the class. Surprisingly, we’re all at very similar places in our writing, and there are a few tense laughs shared by all—a welcome relief from the tensions incurred when writing from the soul and speaking from the hip. There is a general sense of agency and community in the air as we wrap up class.

I look down to pack up my things and see my colleague sitting next to me, having made a quick maneuver into the adjacent chair after it became vacant. She reaches her hand out to grab her paper, a half-hearted attempt to snatch back what is rightfully hers.

“After some thinking about it, I’ve decided that I just don’t want you to read it. I’m just gonna do something else,” she asserts, not wanting a conversation. She’s cautious and slow about her words, indirect and sheepish in a way that furthers my confusion. For the last time, I wonder if she’s suspicious of me as opposed to what’s in her paper.

“Look, it’s just a paper and I don’t really care what’s in here, but I like it so far and I think I can help. You have a wonderful voice and a way with words that I envy. I think if you let me give you some good feedback you might find it helpful. I promise I won’t be mean.” In a joking way, I crack a smile and wryly remark, “I swear that I won’t judge you any differently than I judge you already!”

She whips a quick, stern glance my way. “And let’s talk about that!” she says, her voice rising with her eyebrows as she turns toward me, ready to grab her paper for good and run. I can tell that I’m at least half-right about her issues trusting my intentions, but that the real onus of trust lies in the contents of the text itself. Though I was trying to be witty and dry, a way of alleviating tension, I think I pushed her a bit too hard and put pressure on a sore spot that I didn’t know existed. Bruises aren’t always apparent. I continue in a plea.

“Look—you can swear me to secrecy. I’ll sign a contract that says I won’t tell a soul what’s in this paper. I think you should let me read it, make some notes, and I’ll put it in your mailbox when I’m done. I’m here to help you, not to hurt you. It will be good in the long run, trust me.” Apprehensively, she deflates in her seat and calmly, quietly agrees, still unsure that she’s made the right decision. As I walk away, neither am I.

Moving from the classroom to my office, I try to make sense of the two encounters. Both were about writing; both required a degree of vulnerability that I just couldn’t show; both ended well but left the relationship on unstable ground. Questions start a race in my mind, unconcerned with a finish line.

Will I ever know Grace? Will I always be this abrasive? Why am I so disconfirming and off-putting? Will there come a day when being easy will be easy? When being difficult will catch me off guard? Will the deeper self, the one filled with gratitude and awe, ever find a way to fuse with the surface of my experience? Will people who wade into relation with me ever find something between us worth sustaining?

I just don’t know. I’ve got no answers to fleeting questions.

As I get into my car, a deep breath exits my body, releasing the hypertension before I turn over the engine. I press the clutch, turn the key, and throw the stick into reverse as I turn my head to gaze out the rear window.

“There but for the Grace of God go I,” I say out loud, smiling, remembering my grandmother as I slowly put the car into gear.


Digitally Rewired: Coming of Age in a Network Society

Philip Zimbardo thinks my brain has been “digitally rewired” by video games, computers, cell phones, and the Internet. If you grew up with digital technology, he thinks that yours has been, too. He thinks that we think differently—nonlinearly—and so we get bored in school. This poses a challenge to learning because analog teaching methods are still being used in the classroom. Sir Ken Robinson makes a similar argument in his RSA video discussing why schools kill creativity and how the arts should be embraced in learning environments. He also claims that divergent thinking is stifled because technology makes students more prone to webs of interaction that stimulate many interests at once. He concludes that a formal education should include various forms of play, which tap into our natural capacity for understanding, retaining, and refiguring information.

I wonder if there’s any purchase to these concepts. It makes me think back on my childhood… before there was Facebook, Web 2.0, and Android…

…When I was a kid, I was a Boy Scout. My grandfather was the troop leader, my mom chaired regular meetings, and my stepdad frequently drove me to the local church where I’d spend a few hours a week learning knots, reciting mottos, and telling fart jokes. Twice a year, we’d have fundraisers, squaring off in competition, each of us vying to be top seller and win a choice of prizes that our parents would never buy us. We’d sell popcorn or we’d sell flowers.

I excelled at flower sales, but it wasn’t because I knew anything about horticulture. The truth was simpler: my family worked hard to ensure that the Scout troop stayed financially stable, and they’d be damned if their kid didn’t win the top prize. Needless to say, I had a social network that, when activated, could rival the Gotti family in efficiency.

It’s amazing what parents will do in the name of their kids.

Each year, I won top seller. Each year, I sold maybe $50 worth of flowers myself. Each year, “I” managed to rake in a few thousand in sales.

I was an anomaly in the Boy Scout flower-business world, often ending up the regional top seller. Other kids dreaded me, and other parents were always suspicious that my scheme was a sham. I was heavily dissuaded from disclosing this information at the time, but my entire extended family (and friends) hustled flowers like street-grade powder. It was never a solo operation. Perennial blues and seasonal reds lined the edges of every house on my street; no cousin was too distant to receive a phone call; the white pages of the family phonebook were riddled with highlighting for quick reference. Harnessing the power of my familial network, I was a philanthropic force to be reckoned with. Eventually, people started calling us, asking when the sale would begin. They needed their fix.

To the chagrin of my fellow Scouts, my family’s hard work would always end in uproarious applause and admiration. I was the sole beneficiary, the shining, smiling, buttery face of a sinister flower-sales operation. Each year, I carried my new bike, year’s supply of bubble gum, or $500 savings bond back to my chair at the annual banquet feeling like I knew a secret the other kids didn’t. I knew how to get my family to work for me…

Thinking back on my childhood, it seems that Zimbardo and Robinson are right; I was raised to think differently. I was wired in a unique way.

But that wiring had very little to do with technology. It had more to do with the patterns of organization I was exposed to. I was embedded in efficient, beneficial, and rewarding peer-production networks from the start. In a social arena where the stakes were raised and motivations ran high, figuring out how to crowdsource was a necessary part of my youth experience. My parents insisted that I participate in nearly all of the organizations in my local community. If I was rewired by anything as a youth, it was by my parents’ willful style of parenting, not the computer games that connected me to the slow and arduous World Wide Web of the early ’90s.

I do like the idea of digital rewiring, but I think the word digital needs to be redefined so it is not so deterministic. Maybe the meaning of digital isn’t so attached to technology; maybe it actually means social, participatory, networked, dedicated, and modular.

Yochai Benkler (2006) says that “emerging models of information and cultural production, radically decentralized and based on emergent patterns of cooperation and sharing … are beginning to take on an ever-larger role in how we produce meaning” (Chp. 2, para. 6). “Emergent” is misleading in this phrase because I’ve been participating in cooperative networks since before I could pronounce the word “Xanga.” The models of production Benkler discusses were in place well before digital referred to a technological practice.

Could it be that the organizational patterns I learned as a child, before we were all tethered to smartphones, laptops, and tablets, made me privy to current patterns of participatory culture, with its convergent qualities and social production values? Maybe the reason I value open-source software, online gaming, Facebook, and music sharing isn’t because I was “born digital” (Palfrey & Gasser, 2008) but because the conditions of my upbringing revolved around modularity and granularity.

Maybe what makes the current age digital isn’t so much that we reformulate texts “by the fingers and thumbs (the digits), clicking and keying and pressing in” (Kirby, 2009, p. 1) buttons, but that knowledge is formed through patterned interactions tied to the economic and political realities of the past and the present. Maybe my childhood was littered with events, activities, and social functions that prepared me for a network society before network societies were ever a thought.

This would mean that members of an older generation—my parents’ generation—set the stage for the digital age. It would mean that their cultural norms and standards formed some sort of digital prehistory—a precursor to the Hive style of society (Kelly, 1994) we live in today. If this is true, and I think that a good socio-political case can be made that it is, then today’s youth have a sort of digital inheritance sans technology, and they’ve developed new media technologies to help them enact the patterns they were raised with.

There was no definite dawn of the digital age; it isn’t as though the sun came up one day and shed light on a field filled with touchscreen computers, iPhones, Spotify, and Mark Zuckerberg. As historian Thomas Kuhn (1962) reminds us, paradigms shift when the current state of affairs reveals “anomalies whose characteristic feature is [a] stubbornness to be assimilated to existing paradigms” (p. 97). In short, it isn’t until we notice that the old way of doing things isn’t working that we recognize our current conditions to be different. This means that the revolutionary nature of the digital age, exemplified in peer-production projects like Wikipedia, meritocratic mechanisms of group ranking on sites like The Pirate Bay and Reddit, and crowdsourcing trends that boost blog/news sites like the Huffington Post to the top of search engine results, is not the real reason for major shifts in values and culture. In actuality, what has changed is our perception.

Fundamentally, changes in culture and social values made more apparent by new technology are misunderstood to be inherent to the technology itself. But technology has no inherent quality. This sort of perceptual framing (or technological determinism) is a convenient rhetorical strategy used by many to resist participating in the network.

We like to think that we’re living in the midst of a paradigm shift—that those innovative forms of production we attribute to the contemporary Internet have changed everything, including the meaning of power, control, and social structure. It isn’t true. The paradigm shifted a while ago.

If a collective effort were made to spread the idea that things have been networked for a while, maybe we could have a conversation about the ethics of representation, participation, and authority in the digital age before accusations are made about the “inherent dangers” of new media technology. Maybe we’d all take one big step forward and quit being so afraid to participate. Maybe students would be more literate when it comes to digitality.

Henry Jenkins (2006) thinks that we need to play in order to learn how to live more amicably in the digital age. Both parents and youth can learn how to navigate a network society, together, if they abandon their allegiances to perceptions that frame new media as “new”; if they acknowledge that we have all been digitally rewiring each other for the better part of 30 years, and that participatory culture works better when everyone is on board; if they embrace the network for what it is and understand that they all have a role to play in it.
