The Digital Playground

We’re supposed to be playing games. We’re not. We’re starting a fight.

People argue and the rhythm beats against my skull. They toss ideas back and forth like a game of catch with a ball that’s easy to throw but difficult to throw back. The more that people argue, the less they mean and the more they attack one another.

I want to do something fun. That’s why I’m here—why we’re all here—to begin with. We’re supposed to learn through play. Instead, the back and forth of confrontation sails overhead, competitive, taunting, and demeaning. I put my hands against my temples, waiting for the ball, following along—annoyed but still attentive:

“I’m just sayin’.”

Someone yells, tossing the ball across the room.

“I’m just sayin’!”

Louder, throwing with more force.

“I’m just sayin’.”

It falls to the ground and someone picks it back up.

The ball passes in front of me, way above my blood pressure, making me tense. I’m not sure how to play when people fight. I’m a bigger fan of dialogue, where everyone plays along. When people contribute easily, included in the game—connecting with others as they share ideas, suspending assumptions. Playing fair and, for the most part, playing nice.

This is not that. This is people fighting over a ball…

Catch.

“But students aren’t that smart. They want things to be easy and they don’t want…”

Toss.

Catch.

“Do you know how ridiculous that sounds? Really, I mean you can’t honestly believe…”

Toss.

Catch.

“You can’t say that! That’s not necessarily true! Studies show that people don’t care…”

Toss.

Classes like this are ruined from the start by too many personalities pulling in every direction. Discussion is disruptive; dialogue is meaningful; but here learning is reduced to miscommunication. No one’s in charge and no one takes turns because everyone has something to say. And someone always gets left out.

In dialogue, when one person wins, everyone wins.

That’s just the way that it goes. I hate being the person who’s unsure if they’ll get to play. I let others know that I’m not going away. I assert my presence and take a firm stand. I struggle for attention among strong egos. The need to be heard comes before good ideas and competition trumps decorum. I’ll be the first to admit that I’m abrasive—that I get animated when I feel threatened. Motivated by malice and cursing under my breath, I look for ways to break the rules and stay involved, get my words in edgewise and find a way to throw the ball.

I get loud and speak out of turn. I interrupt just to digress. My chest is tight from my heart to my neck, suffocated with ambition, the empathy strangled out of my words. Hot with anger I hold my breath, biting my tongue in half at the sour taste as the room gets heated.

I realize that I’ve had it all wrong: This isn’t play; this is people fighting with guns.

I grip my desk to control the expressions on my face. Someone takes a shot at me, pulling me into the fight. Thrust into the open, I’m mocked by a person who’s got a way with words—criticism with a real need to be “right.” On guard, I pull back, holstering hasty ideas, taking my finger off the trigger, thinking about escape. There’s bedlam in my mind, generating thoughts too raw to express, harboring words in steady production as I prepare to draw. It’s only a matter of time before things get loud and ugly and I don’t want to miss the point when I get my chance to take my shot. Animosity is churned into gunpowder, held back with bated breath, and the smallest spark of excitement is explosive enough to set me off.

People draw and fire, the room filled with smoke—hot air pouring from the barrel of their tongues. Others take cover, taking shots at each other, not sure where their words will land. Good ideas are slaughtered and threads of conversation murdered—maimed into assertions with no conclusion or point. A few people throw out terms in a desperate measure of defense, hurling boulder-sized words like “agency” and “autoethnography,” struggling to get a grip on what they mean as they fight to survive. They kick up dust with forcible gestures, echoing no one but themselves in the absence of wisdom and common sense.

“I can’t believe that you think this is a…”

Bang.

“You have no clue what it’s like to teach a class with a…”

Bang.

“How can you say that knowing that people don’t…”

Bang.

“That’s unbelievable! I don’t know where you get this kind of…”

Bang.

My vocal cords shake, ringing shots out like bullets, shattering broken silences with hammering arrogance, bigger and meaner than others. A shotgun loaded with aggression, blasting away, spraying everyone, everywhere, all at once, silencing the crowd, commanding attention in rapid fire, pumping out shot after shot.

“What you’re saying doesn’t actually mean anything! You haven’t said a thing this entire time! You just keep talking, over and over, repeating yourself, filling the air with noise…”

BANG. BANG… BANG.

Pairs of eyes left blinking, targeting me with uncomfortable glares, holding their ground but not firing until the smoke clears. I stare back, queer and awkward—exposed but steady and my voice reverberates in my mind, filling a moment of sudden silence as a small stream of smoke sneaks up my side. I see that I’ve missed the target. I see that I’ve shot myself.

Sigh.

For a moment, there’s silence and then calamity ensues again. Conversation buried in the sarcasm of some new untenable game. Balls fly and guns blaze, but I pay them no mind. I opt out and disengage, shut off by the imaginary world I’m forced to inhabit in a class that’s gone wrong. It’s not a game worth playing or a fight worth fighting—not on this playground, anyway—and not with these kids.

There are other ways to learn and have fun.

I abandon the group to go off on my own, resigned to keep my thoughts undisclosed. Staying quiet, I notice a few others doing the same.

This is people playing alone, together.

Sliding open my computer, I close my mouth. A gust from the air conditioner cools my face and bits of imagination fill the room. My attention shifts into the virtual ether as I focus online, soothing interactions that don’t provoke humiliation.

My fingers do the talking, translating angst into social commentary. I climb over rungs of posts. I perch atop wifi bars, connecting networks of discussion in a jungle-gym of information. I peer through the glass of my screen, sanguine as others argue and fight. I reflect on my thoughts and respond at my discretion, productive as I communicate with distantly intimate others, learning to play on my own.

I open Twitter to observe the class-feed—our back channel of the discussion. I check lists of followers, scroll through posts, tweeting once every few minutes. There’s affirmation in the network; it explodes with creativity—forming scores of information that swing by my mind. I monkey around with others online, retweeting interesting links as I go, playing follow the leader as we all climb back to where we started.

On Facebook, my newsfeed rolls and I explore the slow churn of “conversation.” Others keep pace from the far reaches of my network and classmates make room for each other as they voice their opinions. They’re see-saw encounters, falling silent in-the-flesh while speaking up out of body, finding a way to collaborate and even smile.

I post comments that I overhear from the argument still going, using classmates’ words in puns and metaphors. I’m the captain of a ship that sails through cyberspace, passing by computer screens—windows into the very classroom setting on every desk. Quiet jeers of delight keep us moving as oblivious classmates walk the plank. Status updates and newsfeeds wash over them, drowning their cynicism in virtual presence. Other typed voices chime in, playfully layering intelligent anecdotes with humorous quips, cheering me on. Together we’re a crew and a therapeutic subtext, escaping a mutual dissatisfaction in the creative commons of our own devices.

Voices fade into the distance as I ascend deeper into the blue and alabaster of Facebook, Twitter, and Google, finding footing in complex thoughts, pounding out responses on my keyboard in a field of text. I swing between applications, more involved and emphatic, each time curling my feet behind my chair and pushing myself to new heights of participation. Tweets and retweets, posts and likes, all accumulate in affinity. Digital ideas re-place verbal accusations and typed enunciations elicit response. Fresh thoughts infuse with new discoveries, engaged in intellectual contention, swinging in tandem, building a cognitive surplus of trust, feeding ambient generosity that adds value to reality—freed from the bondage of the classroom, surrendered to the digital playground.

The same people are talking but fewer are listening, and everyone’s more engaged with themselves. I can see fingers moving, smirks on faces with heads bent as they type and press and drag their ideas across a screen, exploring new worlds in parallel play, meeting others they’ve never given a chance any other way. They play on the equipment—finally unafraid to get along. Clicks and ticks welcome the sounds of silence.

Images from the past flash across my mind…

I’m in a desk, in 5th grade, staring out the window on a sunny afternoon. The teacher talks about something I don’t understand, but the wind has got my attention. I don’t want to understand him so I tune it all out; I don’t want to pay attention as much as I want to play. I’m longing to be outside, where it’s warm and the air is clear; where the wind blows leaves with the smell of cut grass, and ants gather under swing-sets flexing in a rhythm. Other kids fly off of monkey bars as they hit the ground running, laughing and pulling at each other. People toss a ball, seeing who can throw the hardest, impressed at how good they all are. Friends on seesaws bounce and giggle as cops and robbers run by.

I wish the classroom was the playground, or the other way around—and I want to understand why that can’t happen.

Light floods through the window, casting networks of shadows on the floor. And there’s no need to fight, just good reasons to laugh. We play hide and seek, moving on and offline, together bringing the playground into the classroom and the classroom online. There’s so much more out in the digital wide open—so much more we can do together because play is the deepest lesson that we can learn.

Creative Commons License
The Digital Playground by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.

Move Your Body to Move Your Mind

I often tell people that we should offer lecture classes to undergraduates (particularly freshmen) at the gym. In my mind, I see a lecturer positioned in front of treadmills; the various screens that typically display ESPN and Dr. Phil are adorned with Prezi or SlideShare presentations.

“Wouldn’t it be great,” I usually say to other academics, “if you could take your required history or philosophy course while you jogged, or powerwalked, or went to town on a rowing machine? Don’t you think students would listen better and learn more? Their brains are technically functioning at a higher level when they’re working out.” At that point, I get dirty looks and contentious laughs.

“Ya right,” people say.

When I ask why they think it’s a bad idea, they usually say something like “no one would sign up for that” or “I listen to music at the gym.” I have to wonder what’s so different about listening to someone discuss complex ideas that may actually be new or interesting as opposed to Nicki Minaj. Or is there some innate human desire to hear the same top 40 song you heard yesterday blast through your eardrums during work-outs?

I, for one, listen to lectures as I run, or lift. I would do it while I swim, but I haven’t saved enough money for the underwater phone protector or the waterproof headphones. But X-Mas is right around the corner…(cough*Mom*cough)….

A recent NY Times article explores the monotony people feel toward exercise. Drawing from a number of psychological studies, Jane Brody concludes that the average person chalks working out up to something hard, challenging, or generally unenjoyable. Yet, study after study reveals that people who exercise on the regular are happier, more productive, and less stressed.

I can attest to the latter. Moving your body is not just a way to fit into that shirt you bought last winter when you were certain you’d be in shape by now. It’s a way to move your mind – to keep your mental state positively charged, resilient, and welling up with new ideas that motivate you to improve the conditions that help you sustain whatever it is that you do. And, reflecting on the shape my relationships are in since I’ve started working out on a daily basis, I’ll argue that it makes you a more pleasant person to be around.

Look – I used to be 100 lbs overweight and then I chose a profession that forces me to sit down all day long. That is the personal-health equivalent to making toast while you take a bubble bath. Sitting and staring in front of my computer screen most of the time, I suffer from the same hand-to-mouth disease as the next person. And I am much more concerned with getting my thoughts in order and well formed (because it makes me money and pays off my mountain of college debt) than I am worried about the shape my love handles make when I wear shorts just out of the dryer. But I’ve found that a lack of attention to one important aspect of my life (I’m suggesting that my bodily health is one of them) has a direct impact on another (I’m suggesting that financial/mental health is just as important).

As a technology user and graduate student, I’ve found a way to reconcile the Cartesian dualism that tortures my soul. It’s a dilemma that’s not just mine alone – I know for a fact that a “longing for” combined with the “lack of” motivated, enjoyable, routine exercise plagues the majority of my colleagues. And most of them can’t seem to understand how I stay on top of my work (which involves immense amounts of intensive reading, writing, blogging, teaching, and incessant talking) as well as work out every day (which most of them assume is an exaggeration, I’m sure).

I’ve turned their excuses into a solution. All it takes is a phone and headphones:

1. Don’t listen to music when you work out; listen to open courses, lectures, podcasts, or something intellectually stimulating. Teach yourself how to pause and fast-forward so when you need to talk to someone or shift your focus for a moment, you can get back on track with minimal interruption.

2. Download an app that lets you easily record yourself. You will be shocked at how incredible your ideas are at the peak of your workout. You’ll also get a kick out of hearing your winded self say words with more than 3 consonants. Go back and listen to these as a warm down – or, just throw them away. The magic is really in the talking-through-it.

3. Use a standard note taking app to write down any idea that comes to mind. This is especially great to do when you stop running, pause the workout, or are waiting between machines at the gym. I actually write a load of emails while I workout and sometimes – I’m not embarrassed to say – I write poetry. How ’bout that!

These three suggestions are easy, make working out more productive, and, at least for me, seem to keep the same old routine fresh and exciting. Every day. As an academic, you might find these suggestions helpful, but I can assure you that what I’m suggesting translates to any vocation that involves learning. You could just try it out for the hell of it. Who knows? I bet you’ll find yourself motivated and inspired at the same time.

And that’s not an exaggeration.
Creative Commons License
Move Your Body to Move Your Mind by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Digital Pedagogy: 2 Fears of Teaching Naked

I never realized that I was teaching naked. In fact, I’ve been doing it all summer.

Though there are all sorts of ways to construct a digital pedagogy, one powerful approach begins with pulling the plug. (Fyfe, 2012, para 20)

Paul Fyfe’s recent article in Digital Humanities Quarterly addresses some significant issues related to digital pedagogy. For him, teaching is “digital” not because computers are present in the classroom, but because it is hands-on, creative, dynamically emergent, and, for all intents and purposes, analogous to a place students and teachers want to be. Fyfe explores teaching strategies that utilize technologies beyond the walls of campus buildings, digitizing the whole experience of being a student. On their own, outside the classroom, students use technology to work on projects and collaborate. They blog, podcast, perhaps collaborate by annotating a shared document. This frees up time (and space) inside the classroom for learning in the “non-electronic senses” (para 8) where conversation carries the lesson, which emerges as students engage with each other about course content. Even with the use of minimal technology in the classroom – say, a screen projection of a text for collective reading – digital pedagogy is about peeling off the layers of institutional authority that normally conceal students’ desire to learn faithfully and teachers’ ability to really teach.

Hence the “naked” in teaching naked – being “exposed” together. Keep your shirt on, though – it’s not about skin and underwear.

It’s about finding ways to leverage technology for what it’s worth, freeing up the time people spend together, in the flesh, to expose the limitations and possibilities of learning. What results is a vulnerable situation where those involved – students and teachers – negotiate the tensions of learning together. This, of course, takes students who are willing to show up for more than just a grade – those who find value in the relationships they have with their own learning experience and their classmates – and teachers who don’t just show up to train students – those who abandon the “it has always worked” lesson plan and discover the lesson in the conversation with students, asking questions that guide group thinking and encourage participation.

As a teacher it’s scary to be in that sort of situation – where the plans are loose and the conversation can go wherever students take it. It takes a lot of trust and humility. It also insists that they take it somewhere. The last thing students want in a classroom is to be bored, and in this ideation, if they’re bored they share the burden. Excitement from improvisational course content that emerges from student and teacher interest does, however, get a little scary because everyone has to tolerate a certain level of ambiguity. It’s sort of like white-water rafting – you have to trust the people in your boat to work together, paddling, steering and staying on board.

There are two primary fears that I’ve experienced this semester as I’ve (unwittingly) implemented Fyfe’s suggestions:

Fear of Participation

First, teachers have to be comfortable relinquishing some of their authority over the course, authorizing students to learn on their own and trusting that they’ll remain engaged beyond the classroom. I’ve learned that a good way to guarantee student participation is to use blogs, vlogs, and wikis to explore course content. Asking my students to produce a 200-word blog each night (or to contribute minimally to a 20-sentence wiki) is the equivalent of a math teacher asking students to show their work. 200 words ain’t that much, really. Students know that both I and others can see their contribution and are waiting to respond – which is also part of the assignment. This leads to a rich conversation online the night before class, which usually builds on the conversation from the day before. There is a collaboration-driven ethos established among a small group of people working in this way – not unlike that discussed by Jono Bacon (regarding Open Source) and David Bohm (regarding Dialogue in small groups). The result is an ongoing conversation about course content that doesn’t feel like a class conversation; in fact, it feels like something that would happen on Facebook, but with better links to helpful sources and less inflammatory language. What zaps the fear of participation in this scenario is that digital tools expose whether a student does or doesn’t engage with the class. Of course, it won’t ensure that each student does every assignment, but it does mean that they learn at their own discretion, visible to everyone, which encourages others to follow suit.

Authorizing Student Expertise

The second fear stems from opening up class time for interaction, conversation, and constructive activities. There is, above all other things, a fear of engagement in any intimate group. Attend a high school dance or pep rally – you’ll see. Guiding a conversation among students, who are both excited and knowledgeable, takes a lot of energy and a substantial amount of risk. Sometimes I actually know less about the conversation at hand than the students do. It’s uncomfortable, certainly, to let loose the reins and allow students to educate each other, mainly because the expectations in a traditional learning environment involve the teacher dictating course content and authorizing the right answers. In fact, digital pedagogy necessarily rearranges these expectations so that each person decides what counts as “right” and “wrong” during conversation. Enter critical thinking skills. In nearly every instance, the validity of less-than-insightful claims made by less-than-involved students is regulated by others in the conversation. This often leads to rich debates – productive as long as people are respectful and prudent. Teachers have to trust their own abilities to intellectualize and mediate discussion as they roll with the conversation, nudging it toward important issues that ought to be discussed. They are not, however, in control. Avoiding conversation-placebo – where conversation is promised, people sit in a circle, and the teacher still lectures, usually from a chair with more pronounced gestures – is the hard part. In my experience as both a teacher and a student, when teachers feel exposed and their authority is brought into question, they tend to work very hard legitimating themselves and the lesson. Nothing could be more counterproductive in a collaborative situation. What derails the assumptions that may lead a teacher to dominate a conversation is simple – sit back, let the students carry the conversation forward, and arrive at the silent realization that being a teacher doesn’t negate your being a student, ever.

At the very least, being able to identify these fears (more like strategic obstacles) can help a teacher approach a class in a digital way, encouraging students to take ownership over their learning, utilizing technology in ways that make the whole experience more engaging. Now, teaching naked might not work for everyone or for all courses. I can’t imagine any way that science or math could be taught in this way; I also don’t see tenured lecturers dropping their drawers of PowerPoint slides and popping a squat in the crowd. Then again, maybe I’m mistaken. What I do know is that courses in the humanities and social sciences can be more collaborative, engaging, and…well, digital.

Just don’t take off your clothes.
Creative Commons License
Digital Pedagogy: 2 Fears of Teaching Naked by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

3 Reasons Students Should Blog

I took a risk this summer by integrating a lot of technology into my classroom and it paid off.

Steve Wheeler has been a big influence on me because he talks about the ways new technology can change how students learn and teachers teach.

I wanted to take his advice and get my students to use more technology. I was worried they wouldn’t be as tech-savvy as my colleagues and friends think. I was worried about the digital divide – that the stereotypes weren’t true. It’s no secret that social media is something for “young people” – because age somehow determines a person’s ability to be social, or understand how to push buttons and navigate LCD screens. Right? Because cell phones are like video games. Right?

“Show of hands – how many people in here have a cell phone that connects to the Internet and has some sort of audio or video recording device?” I ask.

All hands go up.

“Whoa…”

They all laugh.

Guess there is some truth to the “age = social media likelihood” equation.

My biggest fear this summer was introducing elements to my course that were contingent upon social media. See, I have this “crazy theory” that students writing papers – essays, to be exact – is not necessarily productive. It doesn’t foster learning.

A student writes a paper, they turn it in to me, I read it, make comments, and give it back whenever I find time to get through all of them. A few weeks go by. My comments reflect the untenable demands of reading hundreds of pages of poor grammar, bad sentence structure, re-typed arguments from Wikipedia, and undeveloped thoughts that have nothing to do with the matter at hand. About half of the class reads what I write. I know this because half of the class usually leaves their marked-up papers behind when they leave the room.

No one really learns much of anything in this situation, no matter how much effort we all put into the papers. It’s a crazy theory, I know, but I have good reason to believe it – beyond a desire to save some trees.

“Hogwash!” You say. “Now you’re just being hyperbolic, Nick! Essay writing is a traditional staple of a good education. I did it! You did it! Who are you to change it?”

I’m a person who takes risks. A person who cares about my students actually getting something out of the hours we spend together, and a person who wants to keep myself excited about teaching and reading student work.

I decided to have my students write blogs instead of papers. There were a few things I discovered that made the risk worthwhile and makes my theory seem not-so-crazy after all:

1. Students can critique each other’s work. In a traditional write-a-paper-and-the-teacher-hands-it-back format, students only get one person to read their work. Me. My sole perspective – though informed by a few years of teaching – is not the only one that has value in the classroom. Also, with my workload as a graduate student there is just no way that I can hope to give solid feedback to all of my students and remain deeply invested in doing my own work. Sadly, a few student papers usually fall through the cracks with blanket responses like “Great!” or “Rework this section” or “unclear” as I transition back to my own reading and writing in the wee hours of the morning. In my humble opinion, this type of alienating language (and practice) should be left out of any learning environment and educational experience. Reading my students’ blogs, I’ve found that they give each other both positive and critical feedback that goes into deeper detail than I could ever imagine doing alone. This type of dialogic process, I’ve found, contributes to the ethos of the course and everyone’s enthusiasm for having an opinion and learning something new.

2. Students get to write less, I get to read less. Any educator who is being honest will tell you how much they dislike having to read so many student papers. It isn’t that they dislike reading or dislike their students – it’s that reading so many papers so incredibly similar is tough to stay enthusiastic about. A 100-word blog is big enough to articulate a single idea with a bit of rigor and some hyperlinked sources (like this one). My students are writing 100 words at least 3 times a week, usually in response to some video I’ve posted for them to watch. I make them find other sources on the web to back up their argument. I also make some suggestions when I assign the video (via email) about what they should consider, in both form and content, when they respond. They’re also required to read and comment on at least one classmate’s blog for every one they write. This ensures that everyone gets feedback. Of course, I read and comment on all of them. All of this takes me (and them) less than an hour, and we do it 3-4 times a week. After a 6-week course, that’s 1800 words written per student in about 18 precise, nuanced arguments. That’s nothing to shake a stick at! I have to admit that the shorter reads and the salient points are addicting to go through and comment on as a teacher. It’s a lot more fun than doing my own work!

3. Covering uncharted territory. The worst thing for teachers and students to cope with is boredom. By the time students are college freshmen, most have taught themselves how to sniff out a reused lesson plan and give a teacher what they think is “good work”. Most of the time, it means regurgitating someone else’s point of view about a given subject. Many teachers trust their time-tested activities and lessons, falling back on the same examples and lectures they’ve used for years in a row. To be blunt about it, nothing could be less productive and worse for the education system, overall. No one learns unless they get somewhere new in their thinking. Production is not reproduction. I started the semester assigning a video about changing education paradigms by Sir Ken Robinson and had no other plans. After reading their responses, I realized that the vast majority had something to say about Robinson’s claims on ADD/ADHD diagnoses. It’s a compelling argument that tapped the core of class interest. Recognizing their interests, I assigned a video from Thomas Szasz about the dangers of calling mental illnesses a disease. The responses were enticing, thoughtful, and provocative. This led to even more uncharted ideas for out-of-the-classroom thinking, learning, and writing. The course content emerged through the blogs themselves.

For people who aren’t educators or couldn’t care less about teaching, maybe none of this means much. But we were all students once. We should all take a moment to think back to our youth – to our education – and try to remember what we disliked about it. What if we’d had new social media technology? Could using it in our classrooms have changed our minds about school, or learning, or those things we thought we were interested in but decided to leave behind because they were boring?

Perhaps.

The thing I suspect most students really dislike about education is this: that their teachers are afraid to take risks, to engage them, to look for new, exciting ways to understand what they want to learn. Call me ridiculous, but I think that students want their teachers to enjoy teaching as much as they want to enjoy learning. Most new technology is already in our pockets because we enjoy using it for work and play. It’s fun. Why shouldn’t we figure out a way to use it in the classroom? Learning can be fun. It can be productive, too.

Maybe we can all learn how to learn with each other as we learn to use social media, together.

Creative Commons License

3 Reasons Students Should Blog by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

PLEs Help Yourself: 9 Things Academics Can Do With Social Media

This one is specifically for my academic friends. Although, anyone who thinks of themselves as a lifelong learner should read on.

Last time, I spoke about being mindful of the Internet, tipping my hat to Howard Rheingold‘s Big Idea. There is a lot out there to be overwhelmed with, that’s for sure. We should all be taking advantage of the Interwebs – without question. There is knowledge at our fingertips.

Don’t misread that statement as zealotry – I certainly don’t mean to say that laying off the Facebook and Twitter feed is a bad thing. By all means, strip down and go to the woods as much as possible. And bring people with you, too.

“Living with” technology is different from “living for” it. We all might want to understand the difference.

I’m not quite a techno-cheerleader. On the other hand, I’m definitely not a Luddite. In fact, I strive for a certain technological balance. I like my media the way I like my relationships – particular, personal, discrete – overall, complementary to my lifestyle.

Rather than catapulting into the typical excursion about human and non-human relations, I’d like to make a few practical observations about the way I use social media. In general, it helps me maintain an aura of conversation and interaction with others throughout my day – people present in both real and cyber space. These conversations hang together as I work and play at different times and in different places, for different purposes and in different spaces. My thinking has developed in revolutionary ways, as a result, and I think (I’m not sure) that social media doesn’t have to be overwhelming. It can be managed.

We don’t have to count social mediation out of what we do and we definitely don’t have to assume it’s beyond our understanding – a perspective I’ve learned many academics have about technology, overall. I wonder if the first people who put language into use felt the same way? After all, language is the paramount technology.

Many all but scoff at the Facebooker, Tweeter, Blogger, or texter. Any mention of social media in conversation gets an eye roll or puckered lips. These same folks usually struggle with the most basic technology functions, missing out on great resources and passing up opportunities to extend their own learning and research. More and more people are likely to jump to irrational conclusions about the “good, the bad, and the ugly” of social media (like the ones put forth by Sherry Turkle). For so many, it’s because they’re unsure about how to manage a digital reality.

I have to admit, translating all of the data we could be exposed to on a daily basis takes a lot of effort. Then again, interpreting information has always been about how much effort you put toward it. Listening isn’t easy. Neither is filtering through the crap on the web.

I don’t blame folks for dismissing social media. There is a lot to know about, and a lot to learn how to do. The unknown has always been a major source of anxiety for people. Not knowing the “whats” and the “wheres” of social media is one thing. The popular press helps us keep track of those, so there is little reason to use everything out there. There’s little value in being “cool” or “trendy”. However, not knowing the “hows” of technology is a different issue altogether. Rheingold calls this sort of know-how “digital literacy” – comparable to any other type of literacy – and necessary for living in a digital world.

Knowing how to do social media is “second nature” for many, and not natural at all for others. Still, anyone and everyone can become familiar with how technology works and what others see as valuable about social media. They don’t have to use it, but they should have some know-how before passing judgment. In the end, the reality of a digital world is that technology is – contrary to popular belief – always at a person’s discretion.

9 Things Academics Can Do With Social Media

The following outlines part of my Personal Learning Environment (PLE). A PLE is a relatively new idea developed by interdisciplinary scholars who see the web as a rich source for learning and wish to move toward an open, global, collaborative education system. What I’ve laid out below is a short list: the basic tools that I use every day to curate content and tame the digital behemoth into an analogue companion. These tools both satiate my attention deficiency and relieve some of the socio-economic pressures of the academy. While PLEs are supplemental to higher education – not an alternative – they can certainly be prudent additions to a person’s cache, leading to more engagement, more conversation, and more thoughtful hours of the day.

Tablet PC: I spent the money a few months ago on a Tablet PC. I got the ASUS Slider because it has a keyboard that slides out and props the touch screen up on its own (hence the name). The touchscreen sold me because of its immediacy and convenience. I read more now than I ever did – and that’s a lot in your third year of grad school. Personally, I have a lot more fun reading, posting, and scrolling on a touch screen than on a laptop. On a university campus, connectivity is never an issue since wifi is (seemingly) everywhere. “App culture” is not just a new fetish but a way to pool the resources I use to work and play. The interactivity of reading on a Tablet is so engaging and tactile that it is more than “reading”. How about “treading”, as a combination of “touch” and “reading”? Yeah, that’s actually pretty accurate. The 500 bucks was well spent, by the way.

Google Reader: 90% of my daily readings are blogs. This means that, sandwiched in between all of the reading I’m supposed to be doing for class, I’m also reading the work of my peers – graduate students, younger scholars, leading researchers in technology, programmers, and comedians (because I like comedy). A little secret – I cite things from scholarly blogs on the reg-u-lar because, let’s be honest, sometimes their actual published papers are long-winded and boring. Most scholars who blog cover the same issues in a 150-word version on a daily basis. Google Reader is great because the blogs I like are delivered to me as a feed whenever they’re updated. Better than reading the morning paper, I do most of my blog reading over coffee or whenever I have ten free minutes (wherever I may be).

Samsung Galaxy S Smartphone: I dislike Apple products. Android isn’t much better, but, alas, the “third form” (open source software) has yet to develop a phone operating system (OS) that actually works. The touchscreen is essential, and like the Tablet, I chose my phone because it has a good ol’ fashioned keyboard that slides out. Call me old skool, but I still like buttons (and I really think autocorrect should be renamed “autoincorrect”). More significant than the tactility is the shared OS platform between my phone and my Tablet. Having both my mobile devices on the same OS makes every function so much easier and takes less mental effort. Also, I have a Sprint plan because they give me unlimited Internet access and texting for under 100 dollars a month. The service is spotty, but you can’t beat unlimited. For Tweeting, Fbing, email, taking quick pictures, and recording interviews, classes, and important dialogues with like-minded people who collaborate on work with me, having a solid smartphone is absolutely necessary and worth the money. I see it as a gateway to productivity.

Tape-A-Talk Audio Recorder: I use this app because it’s simple, it has big buttons, and it keeps track of my recordings by date as a default function. Most of my recording is done while I jog (because I have the best ideas when I exercise). I can keep the app running in the background while I jog and listen to music or a podcast. The buttons are large enough that I have no trouble finding them without looking, even when I’m out of breath, sweaty, and fumbling. There’s also an option to turn your camera button into the record button, so your phone will work like any other dictation machine. The quality is exceptional, too. The free version is great, and the pro version is worth the money for the added functionality.

Stitcher Radio: I’m a big fan of radio. Always have been. But for some reason the radio transmitter in my car doesn’t work, and it’s not worth the money to fix. Instead, I’ve taken to listening to podcasts. I started with Marc Maron’s WTF podcast (frequently the number one comedy podcast on iTunes) and This American Life (which is syndicated on NPR). When I found out that Stitcher collects the best podcasts from the web, I downloaded it and never turned back. It works a lot like Google Reader, except I can make different “stations” and sort podcasts into categories. In a single day I’ll listen to an hour-long interview with a famous comedian (usually something about their personal struggles with relationships and substance abuse), a 15-minute monologue by Garrison Keillor from A Prairie Home Companion, a 30-minute story on Radiolab about fistulated stomachs in both people and cows, and a 10-minute spiritual exegesis from the one and only Alan Watts. My favorite podcast recently has been a free class from Yale University on the Continental Foundations of the Social Sciences, which beautifully complements the Interpretive Social Sciences course I took this past spring. I’ll know more about Hobbes, Locke, Marx, and – everyone’s favorite – Durkheim by the end of the summer than I ever wanted to know. This is a world-class education, people. From a senior lecturer at Yale. For free. Podcasts have truly changed the way that I learn and listen and, in my humble opinion, have helped me turn workouts and drives into prime-time educational experiences, re-extending my technologically impaired attention span.

Dropbox: There are a lot of different “clouds” floating around the Internet. I suggest finding one that works for you because it makes traveling to and fro so much easier, especially if you’re an absent-minded academic like I am and frequently forget your flashdrive in your computer’s USB port, or fail to email yourself the necessary files for the next day’s presentation. It’s also an easy way to share files with colleagues and professors because you can upload as big a file as you want.

Tweetcaster, Friendcaster and Spotify: For all of your social media needs, Tweetcaster and Friendcaster are much more functional apps for sharing content across platforms than the traditional apps provided by Facebook and Twitter. Facebook mobile tends to crash mobile devices, and Twitter’s app is pretty difficult to navigate. The caster apps make micro-blogging a breeze and are more customizable. If you don’t know why you should use Twitter, try it for a few days and then see how you feel about it; it’s a bit like having a personal CB radio that other digital-truckers tune into as they drive through web traffic. You never know when someone will help you find your way to a gold mine of knowledge you didn’t know existed right under your nose. Facebook, of course, is the great social stethoscope of our time. Your Facebook page can be the pulse of your PLE if – and only if – you manage it properly. Taking the time to manage your network will generate more opportunities for conversation and exposure to new ideas than you ever imagined possible. Finally, if you like music and you haven’t heard of Spotify, visit the site and download it already. These designers really have solved the music piracy problem, and this is coming from a person who’s been swashbuckling digital data since the Internet was delivered over a phone line.

WordPress: I blog, obviously, because I have a lot to say. More than giving me an excuse to be long-winded, my blog has made me a better writer. It also gives me a voice in ongoing conversations about issues that other people think about, care about, and want to know about. For all intents and purposes, my blog is my corner of the web where I get to host the ideas that matter to me, and it gives me a good excuse to invite others into the conversation for support and criticism. It’s perhaps the most formidable way to develop an academic voice outside of publishing in journals, where making concise arguments in writing is a key skill and making connections between your thoughts and those of others is still the best way to maintain a strong ethos. Most people who claim that the web isn’t peer-reviewed probably don’t use technology and haven’t heard of peer-to-peer networks. Where self-publishing lacks “rigor”, it certainly empowers the author to write what they want to write, when they want to write it, for the people they want to read it. Of course, business folks write shorter blogs and find value in the super-hyperlinked variety of web writing that makes bold claims, reaches as large an audience as possible, and is more concerned with attention-seeking than thought development and careful examination of nuanced arguments. Academics, however, write longer blogs (it seems) because this genre of speech is a provincial way for them to work through thick thoughts, deep theories, and styles of writing that lead to fresh perspectives. Like the open-mic nights where stand-up comics work out new material, blogs are the open mic of the academic who is diligent about refining their craft.
For me, the real challenge in blogging isn’t procuring a readership – you can use your other social media channels for that. It’s sticking with it each week (or month) and finding something of value to talk about; that takes determination and stamina. For the contemporary scholar, there is really no excuse not to blog. Search for your favorite living theorist on Google – chances are, they blog. You should, too. Remember that you blog for yourself. I find that it matters little if others read what you write. The point is not to seek approval; it’s to practice your professional craft and develop mental and rhetorical skills. Readers are nice, though, especially when they comment (hint, hint).

Creative Commons: All people in the world of publishing and producing original content (which includes nearly all those who would ever willingly don the label ‘author’) need to familiarize themselves with the Creative Commons licensing initiative. Ever worry that you shouldn’t use that random picture you got off Google because someone might sue you? Ever had a concern about putting a new idea online and having it “stolen”? Creative Commons gives you a way to copyright your work in cyberspace. It’s brilliant, it’s easy, and it works. The best thing is – a lot of people not unlike yourself have made it their life’s work to ensure that CC licensing holds up in court. Check it out – it’s worth knowing and spreading the word to your colleagues, coworkers, and students. Protect yourself and protect your right to share what you write.

My hope is that some of this is new to you. This list only scratches the surface, but it’s enough to give you a sense of how technology can serve the contemporary academic, intellectual, or common person’s agenda. All it takes is a little bit of know-how. Social media can be used to filter out the crap on the web, which could certainly lead to some peace of mind and – who knows – maybe even a better way to live.

PLE’s Help Yourself: 9 Things Academics Can Do With Social Media by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

The Raging River of the Interwebs

Howard Rheingold tweets that being mindful about all of the data on the web means filtering out all of the crap as we wade through the waters of ever-rushing interest.

Ok, maybe I’m stretching his 140-character post on Hybrid Pedagogy‘s #digped discussion group about his new book Net Smart, which went live last month (and will stay active throughout the summer). Still, Rheingold is pulling together centuries-old spiritual thought with cutting-edge technology when he suggests that digital beings can be mindful beings. He’s saying that surviving the ever-growing, ever-moving datasphere at a time when information and ways to access it grow in abundance each day requires some mental agility. Dare I say, we all need to show some “digital hubris”, or what could otherwise be considered intellectual stubbornness.

We have to get unstuck from the school-age notion that the only way to really know something – to be right, to have a say, to pass the test – is to know everything there is to know. We have to decide not to be perfect – to let some things pass us by – but we have to keep trying and keep learning as we keep moving down the river of tweets and retweets, memes, likes, posts, blogs and vlogs, and oversourced schools of email that nibble at every second of our already overbooked day. Like the reborn alcoholic or addict, we can surrender to the datasphere, acknowledging our own learning limits and realizing the full magnitude of what it has to offer. In my view, cyberspace does qualify as some form of “higher power” that is “greater than ourselves”. It’s “virtual”, for God’s sake! What could be more mystical than that?

If you don’t like that idea, don’t worry: higher powers and guilty people have historically complemented each other nicely.

“We can’t all learn everything, but we can all learn something” is a line we used to tell pledges in my college fraternity. Today, I take it as my daily mantra of digital practice. I prevent myself from falling down the “YouTube hole” and resist my unbelievable propensity to scroll down. That sort of avoidance doesn’t amount to technological dismissal or denial. It’s a matter of discipline; like a spiritual practice, you do what feels right. Knowing how to move through the “crap” (Rheingold’s actual word) on the Internet is key to retaining peace of mind. It’s the only way we can manage the digital information overload, which for some reason seems bigger and meaner than all of the other information overloads that have happened throughout history. But just because there’s too much information doesn’t mean that there’s too much information to manage. A fact often overlooked by the common person is that social technology is a discretionary function of everyday life, not a mandated one. That means you have just as much ability to shut down as you do to power up. By virtue of that fact, you have just as much incentive to tailor technology to suit your needs as you do to be sucked in by flashy lights and funny pictures of cats.

We live in a world of artificial excess – the ocean, the land, the sky, and outer space are all bursting with too much stuff that gets in our way when we try to occupy them. “Space junk” threatens our safety and the purity of the environment that anchors whatever reality we’re living in at the moment. Today and forever from now on, cyberspace will be the same way. The hard part is recognizing what counts as “junk” and what doesn’t. We need tools to do that because – remember – the Interwebz is bigger and badder than us. We need super amazing information-metal detectors, hardcore data-rototillers, and social media rakes that collect all of that rich soil good for planting positive, clever, and humorous seeds of intellect and prestige in the gardens of our social networks (and, perhaps, our minds).

If we want to know how to manage the raging river that runs through our collective digital backyard, then we’d better know how to pick the right tools to help us reroute our expectations. We also need to know who can be our best guides along the way as we take on the rapids of discourse and debate. We need to know who will pull us back into the boat if we get thrown into the tossing waves of argument, cut up on the sharp rocks of disinformation.

Had enough metaphors, yet? Good. Me too.

You get the point, I hope: we all need to learn how to be mindful of the Internet, which means not passing up the opportunities it presents us for both work and play. No matter your vocation, digital resources can help you at least as much as they are said to hinder you – if not more.

The Raging River of the Interwebs by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

“Truth” in Analogue

Everything is going digital. Everywhere you look, there are digital things. People use the word “digital” frequently. Right now, I’m on a quest. Questioning other people this week, I’ve asked, “What does digital mean?” Most people just stared at me. A few took a shot at an answer:

“Zeros and ones, right? Binary code?”

“Anything that deals with computers, I think.”

“When I think of digital, I think of telephones and computers, anything that sends a signal.”

“To be honest, annoying is the first thing that popped into my head.”

“I always thought that it just meant something new.”

I appreciate these responses, but I still don’t know what is meant when people call something digital. There’s little continuity in the colloquial definitions that emerge in conversation with peers. As I suspected, no one really knows what they mean or what they’re referring to when they use the word. In actuality, few care.

There’s a term for this kind of thing:

Epic fail.

I’m driven to know about the meaning of digital by a severe suspicion about what free and easy use of the word indicates. I don’t think we should continue to employ a word as loosely as we do digital when we don’t fully understand it. This may seem like a problem of mere semantics. It’s not. It goes deeper than words. It’s a question about how we’re coming to live in a world that we continue to call by a name that we don’t fully grasp. It’s a question of virtues.

In the socially constructed reality that we embody as homo technicus, what we call the machines that we live with is indicative of how we come to understand our values as human beings. To inquire about the meaning of digital is to examine the labeling we’ve developed for packaging knowledge about the way we relate to each other in contemporary society. It’s to inquire about the ways our interactions involve technology and if they are, or aren’t, valuable.

In short, words is important.

See what I mean?

Words are symbols used to communicate and function in the world. Symbols stand for something. Digital is a symbol. I want to know what it stands for, but more poignantly, I want to know how the use of digital—as a naming device—signals a certain way of being in the world, where our actions are based on the values of a socially saturated society and our relation to technology straddles the thin line between virtue and vice, as Sherry Turkle suggests.

Symbols are made out of language. Language is a systematic tool for constructing symbols and expressing subjectivity in an objective way—a technology that we use, in talk and text, to give definition to what we see, think, and do. Through the objectivation of linguistic symbols, we cultivate a sense of agency as our ideas, thoughts, and actions become “things” that we can refer to, possess, change, and value. So understanding what is meant by digital is not just squabbling over words. It takes up the task of understanding how we live in a world that we continue to suggest is characteristic of a specific, albeit poorly understood, symbol.

Asking people to explain digital was hardly insightful, so I go to the library, looking for any book that includes the word digital, scouring for context clues and straightforward definitions. Hours later, I’m sprawled out on the floor between shelves, knee-deep in a puddle of scattered books, reading about circuits, channels, and transmissions. Engineering books. Books with digital in the title.

I’m no engineer. I don’t understand the math. But I’ve had enough communication classes to grasp the language, which includes words like signal, circuit, transmission, network, and system. In terms of communication research, these are concepts from a bygone age—an era before the paradigmatic shift to human communication research that privileges the mutual constitution of messages over an information-transmission model. I begin thinking about the ways engineers talk about the digital signal:

Digital signals are limited in flexibility.

Digital signals take less effort to transmit.

Digital signals can be manipulated and stored without much error.

Digital signals are better for performance.

Digital signals involve less noise.

Digital signals are less variable and more easily controlled.

Digital signals are based on a limited set of estimated values.
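The last of these claims can be made concrete with a small sketch of my own (an illustration, not something from the engineering texts): quantization, the step that makes a signal digital, maps every continuous reading onto a finite grid of allowed values. The function name and numbers below are hypothetical, chosen only to show the idea.

```python
# A minimal sketch of "a limited set of estimated values":
# digitizing rounds each continuous reading to the nearest grid step,
# so the in-between values are lost for good.
def quantize(value, step=0.25):
    """Round a continuous value to the nearest multiple of `step`."""
    return round(value / step) * step

analogue = [0.07, 0.31, 0.49, 0.88]   # continuous, infinitely variable
digital = [quantize(v) for v in analogue]
# Every output is drawn from the finite grid {0.0, 0.25, 0.5, 0.75, 1.0};
# whatever variation existed between grid points is discarded.
```

Whatever falls between two grid points is estimated as one or the other, which is exactly the trade the engineers describe: less noise and less variability, bought by giving up the values in between.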

Thinking harder, I search for ways that this knowledge-set translates to human communication in a digital world. I wonder what “digital”—as a symbol for the way we communicate in a technological society—signals about the way we live:

We are less flexible.

We expend less effort as we communicate.

We can manipulate and remember without much error.

We are better at performance.

We deal with less noise.

We are in more control and put up with less variability.

We base what we do on a limited set of estimated values.

What, then, is meant by digital? Precision is the word that comes to mind. To be digital is to be clear, repeatable, exact.

Indeed.

We have GPS. We travel with more accuracy by giving up our geographical sensitivity, relying on systems that estimate our destinations. We end up forgetting that we can value our journey as much as our arrival. We give up the possibility of getting lost for the guarantee of finding our way. We trade the excitement of risk for the security of reward.

With the Internet, the world is becoming clearer, our ideas are more repeatable, and what we do more exact. But there is a loss of value inherent in the digital, a sense of certainty ensured in the processing of signals, of life, that disallows discrepancy, variation, and divergence. Who doesn’t like certainty?

Yet, it is the variability we lose in the digital transmission of a signal that devalues living. Discrepancy, variation, and risk give a signal a richer set of values. They also make life more robust, vibrant, and fulfilling. Surprise and excitement are essential elements of a totalized experience of reality. They make our lives more interesting, enriched with more possibilities. When precision is favored over everything, we lose a sense of value that we might not be able to recover. We lose our sensitivity to variability and count on limited values, which is another way of saying that we limit our options for what could be considered “true” or “good”.

In the finite province of meaning that engineering language provides, digital means precision. When associated with the way we live in a digital world, life becomes predictable. It describes a world that is less forgiving, where the accuracy of signals relies on a smaller index of values and the “truth” of our lives rests on a limited set of possibilities.

Walking away from the library with a lot of questions and a few books, I’ve got a lucky find in my hands. Wedged in between books about transistors, binary code, and baseband waveforms was Being Digital by Nicholas Negroponte. It must’ve been misfiled because it’s not an engineering text. Negroponte wrote the book at the beginning of the digital revolution while he was still chairman of the MIT Media Lab (which he founded). It’s dated, for sure, and it’s clear that Negroponte wears rose-colored glasses when he talks about the future (now past) of digital systems, but these words strike me as eerily relevant:

“multimedia narrative includes such specific representations that less and less is left to the mind’s eye. By contrast, the written word sparks images and evokes metaphors that get much of their meaning from the reader’s imagination and experiences” (p. 8)

It’s ironic that no one seems to know what digital means but they continue to use the word. Do they know what they’re getting into? Do they know that being digital might mean less possibility and limited values? Have they considered that trading virtue for virtual could mean narrowing imagination and circumscribing creativity? I doubt it. It seems to me that the more we call the world digital, the more we ought to ask about the cost of precision. I wonder about the values that we forgo, the possibilities lost in the signal, and the remaining “truth” in analogue.

“Truth” in Analogue by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

It’s all fun and Facebook until somebody shoots their “I” out

Damn you, Joseph Kony.

I swear, the man is everywhere. Except for Uganda.

Right now, there is a lot of fuss about Invisible Children’s Kony 2012 campaign. I don’t want to discuss the particulars but I want to discuss friendship, values, and how we use Facebook.

A quick story:

I was de-friended on Facebook after a thick debate regarding the credibility of Joseph Kony propaganda. The argument came to blows once the conversation shifted from a contestation of opinions to a critique of values. Suggesting that my friend was drawing from poor sources, I questioned his judgment, telling him that his evidence was garbage and that he shouldn’t post things like that on my thread. Albeit much tamer than the “flaming” of yesterday’s Internet, the remark didn’t go over so well.

The next day, we were no longer friends.

I wanted to apologize. I searched everywhere, scouring the depths of Facebook. He was gone. No relationship page; no shared interests; no mutual friends; disappeared from members lists of mutually affiliated groups.

Poof.

I’ve never really noticed being de-friended before, so you can imagine that after 7 years of Facebooking, it came as a bit of a shock.

This person wasn’t just a Facebook friend. I would consider them a real friend—a person I’ve known for years, whose relationship I cherished. Like so many relationships that give way to time and distance, we remained friends because of Facebook. We took advantage of the channel, kept it open, and stayed connected. Without Facebook, the knot of our relationship would have come unraveled long ago. At least that’s the way I see it.

It was more than a “weak tie”—he was someone I respected and appreciated; a fraternity brother, a mentor, a person worth seeing again. But because our email addresses, cell phone numbers, and mailing addresses have most likely changed since we last saw each other, Facebook was the direct tether between us.

Since when did we start losing friends because of Facebook?

While I’m new to it, I know that it happens a lot to others. As people get more comfortable with digital communication, they become less aware that they’re playing with technologies of the self, fluid identity, slippery traditions, and social values. As danah boyd (2007) observes, social network sites give us the ability to “write [our]selves and [our] community into being” (p. 120) as we post and comment, friend and de-friend.

And it’s all fun and Facebook until somebody shoots their “I” out. Politics are especially challenging.

Theoretically, Facebook is complicated and paradoxical by nature. It is a utility for self-expression—what Clifford Geertz (1973) might call a venue for “serious play.” The type of communication that occurs there isn’t just something done for fun, and it’s no longer a novelty of youth culture. In the past few years, it’s become a significant arena for performances of all kinds, where “facework” is learned and done (Goffman, 1959), information is churned over until it becomes meaningful, and relationships are held in tensions that sustain and maintain bonds (Baxter, 2006). Put another way, it’s an important place where life is lived and culture is formed.

Also, it’s not going away anytime soon.

The lesson that I learned from being de-friended is a lesson that all Facebookians might consider. What’s at stake, when we’re so free to make connections wherever we see them, is the further weakening of a tradition of social connection that’s been thinned by American individualism for some time (see Bellah et al., 1985). If we can choose to connect so easily, friending whomever, willy-nilly, then we can choose to disconnect in the same way, without thinking about why we value friendships, what’s gained, and what’s lost. The very idea of friendship is cheapened when we fall in and out of relationships so easily.

The abstract, contingent, and discrete nature of reality in an increasingly digital world, which allows us to tinker with the meaning of social institutions, is something that Kevin Kelly (1994) describes in his book Out of Control. What he calls the “rising flow” (p. 404) is a wave of life that implicates each of us as architects of boundless, yet uncontrollable symbolic systems. We all cope in our own small ways with the inevitability of entropy by making sense of our lives through logic, reason, and the ordered coding of social construction.

The way we communicate with new media can lead to dismay, disarray, and isolation, not to mention desire, affinity, and what some might call addiction. As individuals are caught up in the rising flow, they’re swamped under a sea of moral turmoil and ethical chaos, like surfers tumbling under a “sustainable crest always falling upon itself, forever in the state of almost-toppled” (p. 405). Communication turns ugly quickly in a digital world. Kelly’s metaphor is one that forewarns the present, as de-friending becomes a practicable reality. Especially during times of high political tension, when movements like Kony 2012 come to the fore of Facebook newsfeeds and timelines, a wider sense of order can be shattered, logic and structure can be quickly abandoned, and conversations may end in easy dismissal. People begin to opt for disconnection when misinformation is too much to bear, and they vanish into the virtual ether.

As my friend posted before he dropped out of my network, “This is a subject that makes my blood boil, I have zero sense of humor, and very little patience on the subject, so I’m done. Do what your conscience tells you.”

I imagine that the crash of sensibility on Facebook is a reason why many choose not to use social media. Dismissiveness is a caveat of digital living, often framed as an unhealthy replacement for analogue communication (Turkle, 2011). While replacement is hardly the right word—in fact, communication is hardly ever purely analogue or purely digital (Bateson, 1972)—Facebooking, like all mediums, involves some sort of risk, some kind of uncertainty, and some degree of necessary vulnerability for it to be worthwhile. This sort of exchange value puts the “capital” in “social capital.”

Talk about politics in a public forum, risk losing a friend. Question a person’s intellectual integrity in front of others, you’ll probably get burned. Cite a phony source, take a chance of being called out. The safe way to cope with this system is to drop out of it altogether. On Facebook, de-friending is just as good.

That’s not a reason to either abandon or blame Facebook. It’s a reason to be more aware of ourselves as digital beings. It’s a call for a new tradition that doesn’t value disconnection in the same way.

At least not disconnection at the drop of a hat.

The key to sensible communication in the digital age is to be careful with fragmentation. I value Facebook because it allows me to hold onto relationships. Some of those relationships have fallen quite dormant, true, but others have been lively all along. In fact, I can only think of three instances when I’ve de-friended anyone, ever. The ability to stay connected with disparate others, despite estrangement, is a technological privilege, not a right. As a privilege, we can recognize that negative perceptions, which frame digital communication as contributing to disquiet, concern, and (perhaps) pathology, lead to self-fulfilling prophecies. We’d benefit by keeping conversations going—even if others rub us the wrong way—so that channels remain open between differently minded people, keeping communicative possibilities open and available if necessary.

Maybe then we can ride the crest of the wave for just a little longer. Maybe experience the next paradigmatic sea-change, together.

If, as political beings who use global communication networks, we decide to abandon relationships in the face of cyberspace disagreement, then we reify David Bohm’s (1996) fear of a contemporary society where “people’s self interest and assumptions take over … against the best of intentions — they produce their own intentions” (p. 14). In this scenario, the world grows more fragmented, and does so at the speed of Moore’s Law. The digital risk of Facebook friendship is that, one day, de-friending could become more common than friending. If this were to be the case, the value of friendship would rest on the mere contingency of disconnecting from others.

To use a common metaphor, Facebook is a technology of the self and a matter of the heart. It’s a place where values are negotiated as logic is tested. When social media—and social media campaigns—become a reason to end relationships, we can say that our collective orientation to the social world, to values, and to “the good” has shifted.

We’re in need of a digital ethics—some sort of tradition of digital communication that allows it to flow freely, slows down the crashes, and keeps channels open. Without some sort of guide—some fault line drawn collectively between social good and social ill—friendship may start to be defined by de-friendship. If we want to hold on to the worthwhile moral sensibilities of the past, we have to put connection over disconnection. We have to work on establishing the fault line together.

As Alexis de Tocqueville observed 150 years ago, the American sense of individualism threatens social disillusionment and despotism. As Bellah et al. (1985) argued about the American worldview over 25 years ago, “What we find hard to see is that it is the extreme fragmentation of the modern world that really threatens our individuation … our sense of dignity and autonomy” (p. 286). When being digital begins to retrain our habits, redefining friendship and blurring ethical boundaries, it becomes more and more crucial that we pay attention to the ways we relate—if for no other reason than to redraw the boundaries of our moral horizons, so that culture itself isn’t so “loosely bounded,” giving way to easy isolation.

Creative Commons License
PLE's It’s all fun and Facebook until somebody shoots their “I” out by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Digital Being Free

When I was 12 years old, I was super-proud. I’d finally accomplished the one thing that my buddy John had yet to do. It took a month, a lot of diligence, and some trial and error, but I finally did it.

My Winamp playlist had reached 25 songs.

It was a big deal.

No one else on the block had as much music as I did. John lived a few blocks away, and even he—a rich kid with a cable modem and a 4-gig hard drive—had a poor working knowledge of music and lacked the patience to download a full playlist. In fact, he would download a song, listen to it a few times, and delete it because “it got old.” It wouldn’t be until years later that he would make mini-discs filled with music that would eventually scatter the backseat and trunk of his car. As a pre-teen, however, he was much more concerned with hiding porn from his mom on an ftp site than he was with building a music collection.

Back then, it was one medium at a time, a “pick your poison” style of consumption. Processors weren’t that great, and a person worried about fires and literal machine meltdowns when music played in the background of games and other applications. I blew up two computers before I was 18.

Regardless, my music collection was my claim to fame, and I reveled in my collecting abilities compared to John. I prided myself on the knowledge I’d acquired about navigating share sites like Scour, AudioGalaxy, Morpheus and eventually Kazaa and Napster. I’d sit and play with MS Paint, search TuCows for wallpaper schemes, and work on developing the flashiest (and gaudiest) AngelFire site that I could. I basked in the family’s office, able to escape from the surveillance of my parents, learning the words to “Bittersweet Symphony” by The Verve, “What Would You Say” by Dave Matthews Band, “Hunger Strike” by Temple of the Dog, and anything by Primus as I flirted with classmates and stirred up gossip on AIM.

This was the digital world in which I was reared—a lifestyle enclave that I’m pretty sure any other 26-year-old, middle-class technophile might describe. Being “on the computer” was a nightly activity that provided a sense of freedom and mobility away from my parents, chores, and school work. Beyond the $10-a-month “high-speed” cable plan that my parents paid for every month, gritting their teeth and hanging it over my head, the web was free rein.

Music was free. Wallpapers were free. Access to John’s ftp site was free. Paying for anything was never even a thought. Why would this stuff on the Internet—which was pretty hard to locate and laborious to acquire—cost any money? To a 12-year-old, money grows on tall trees out of reach, and the Internet was not for tall people. It was for people like me—short people; young people; free people. Like in Peter Pan, we were the lost boys—free to our own digital devices.

I learned to be a pirate at an early age, but I didn’t know that it was piracy. I thought it was surfing the web. At 12, legal realities are nothing but futurist fantasies. Semantic contagion soon took hold, and I came to understand digital practice as rife with controversy.

Fast forward to college—to University networks with End User License Agreements that threatened to sue and sell your information whether you agreed or not; to stories of kids being hauled to jail for downloading music; to worries about child pornography on public networks; to suggestions from analysts on FOX News that Columbine happened because of Internet archives; to discourse about terrorists recruiting Americans for suicide bombings via Facebook; to Craigslist killers; to Megan Meier’s cyberbullying suicide; to stolen identities; to politician and athlete Twitter scandals; to…

The list goes on and on.

In ten years, the Internet went from a free place to a dark place. I grew up. I learned about the law. I learned that artists liked to make money from their music and that, if it was good music, they were entitled to that money.

But I didn’t care. They could still play it live.

I kept on pirating. Kept on asking my cable guy how closely the company monitored downloads, slipping him a few bucks for faster service, under the table. I sought out and discovered sites like TVLinks, ISOHunt and The Pirate Bay, where I could find movies that weren’t yet released, download software that was well beyond my economic reach, and continue to build my music library until it touched the digital sky.

25 songs became 50 gigs worth of songs as bandwidth increased and download speeds soared. I thought I was doing something really rebellious, but when I asked around I found myself trailing far behind the content collection leaders. In the world of digital pirates, I was practically the Swiss Family Robinson. Most had discovered bit-torrent long before I did and had exclusive access to sites like OiNK’s Pink Palace for every media longing their hearts desired.

I grew up at a time when cognitive surplus was welling up around personal computer consoles, where social and cultural capital was circulating at rates that were historically unmatched. I learned to be digital by being free. Much of the conceptual purchase-power of the early days of that culture—when the Internet was far from ubiquitous—stemmed from the non-monetary value of exchange. Digital practices like downloading music, scouring freeware, and surveying the depths of the “information super-highway” had value because they provided a certain freedom to the user, a way of “doing virtual” that was eventually redeveloped by venture capitalists and government officials, annexed by the corporation and the state.

What was incredibly important to an entire generation who grew up awed and enamored with digital technologies in the house, easy enough to use, and interesting enough to be fun, became a battleground for economic warfare and legal bargaining. Ironically, the more money I put into my digital devices nowadays, the less I appreciate them or the experience they facilitate. Isn’t that funny?

Isn’t that sad? It’s like losing your digital innocence.

To me and my cohort, free culture had less to do with money and more to do with possibility. Music sharing and collecting used to be about possibility, and embedded in that digital practice was not just a value, but a virtue that came to define the entire digital culture.

Music sharing was never really about not paying for music—that was just a perk realized fully once the generation came of age, got jobs, and witnessed the invention of the iPod. Music was a matter of prestige, a way to show that you had taste and knowledge bundled up in digital know-how. In the world we live in now, technology-centered law creates fertile ground for breeding guilt in youth who want to listen to music, share their wealth of artist knowledge, and learn how to express themselves through networked machines.

Music will never be their method for cultivating knowledge. Not like it was for my cohort. I was taught from the get-go that digital culture was non-commercial. Somewhere over time that changed entirely. Value was lost. But it might be coming back.

Lawrence Lessig (2004) makes a good point when he suggests that “competing with free” is a good thing because “the competition spurs the competitors to offer new and better products. This is precisely what the competitive market was to be about” (p. 302). What he could never have conceived before the advent of behemoth social network sites like Facebook, which reestablish the valence of social and cultural capital in exchange, was that free could itself be marketed and profited from in a network society.

Applications like Spotify have changed the game in ways that have only begun to be understood by most. Their business model, which exchanges free access to the world’s music (songs uploaded by individual users themselves) for subjection to advertising, shows that a free culture is still the richest culture that digital people have. It is hard to compete with free when it is done as well as Spotify has done it. In fact, you could say Spotify has changed the definition of free, so that it no longer includes monetary value in its reference at all.

Or maybe I’ve been brainwashed by all of the ads.

I guess that’s just the price you pay.

Creative Commons License
PLE's Digital Being Free by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Doing Away with Discipline: The Way of the Digital Scholar

In his 6th chapter, “Interdisciplinarity and Permeable Boundaries” in Digital Scholar: How Technology is Transforming Scholarly Practice, Martin Weller (2011) anchors the idea of Interdisciplinarity in digital practices that reshape society. Drawing from Chris Anderson, the current TED curator, he claims that “lightweight and unrestricted forms of communication found in many Web 2.0 tools may serve the needs of Interdisciplinarity to overcome existing disciplinary and geographical boundaries” (p. 2).

Weller suggests that open, digital, networked technologies are, in many ways, responsible for an “unexpected collision of distinct areas of study” (p. 2). To an increasing extent, digital culture permeates the walls of the ivory tower as technologies enable new practices, which “create[s] a common set of values, epistemological approaches and communication methods” that “override those of separate disciplines” (p. 3). Approaches to research emerge that refigure what it means to be a researcher as academic behaviors encompass more and more digital practices. Researchers adhere to new, emergent norms of discovery in their work, which often run counter to the traditional, fragmented, departmental models of an analog past. As a result, new pathways leading to different-yet-viable methods of knowledge production are formed, reshaping institutions and disciplines as they crystallize via publication. As scholars tread grounds beyond their familiar intellectual territory, pursuing innovative ideas outside of their academic home, they form alliances with others by way of new media. Blogs, social network sites, and Wikis evolve with scholars’ ideas as convergence cultivates creativity, play, and other forms of generative learning that cut across disciplinary boundaries.

This is a big deal for an academy structured around a model of institutionalized knowledge, which developed a fragmentary schema of disciplined study sometime in the medieval period. In the clichéd words of Bob Dylan, “The times they are a changin’.”

For Weller, Interdisciplinarity goes beyond the physical constraints of pre-networked society where “Journals need[ed] to have an identified market to interest publishers, people need[ed] to be situated within a physical department in a university, [and] books [were] placed on certain shelves in book shops” (p. 2). Digital practices lead to virtual spaces where cultural norms and standards adhere to new possibilities, enabled by global networks of scholars who reform the functions of their trade and find innovative uses for new media tools leant to research efforts.

Problems in the academy arise when a clash of realities between digitally-oriented and analog-secure scholars leads to disagreements about rigor and relevance. Many scholars oriented toward tools of a pre-network society (i.e., analog technologies and traditional means of gaining public notoriety) remain unconvinced that digital practices can be rigorous or salient. As skeptical reactions toward Wikipedia’s credibility illustrate, many academic professionals who hold sway over tenure promotions and search committees remain suspicious of digital practices, distrusting the viability of knowledge that emerges through work that is digitally produced under the cultural auspices of openness, free access, and quick turnover.

Interdisciplinarity is at once condoned and contested when tied to emergent digital practices. Weller’s discourse frames the “schizophrenic attitude toward Interdisciplinarity” (p. 1) as a problem of exploding traditions.

He exposes a reality in the academy where scholars of an “old guard” who seek to defend the boundaries of institutional disciplines clutch the analog tools and methodological constraints of an old paradigm. The compendium of digital scholars entering the academy, as both students and new faculty, is forcing those who protect the standards of traditional approaches to yield their posts, crashing institutional gates with smartphones, tablets, Google Analytics, Blogger, and Twitter—all tools that diversify research audiences, amplify scholars’ messages, and ensure that scholarship has a larger impact when published.

In short, the digital difference in scholarship is Interdisciplinarity since digital practices break down barriers. With digital tools come digital practices and standards that academic institutions must take into account as they move into the future. Academic definitions of knowledge and discipline are forced to shift with a paradigm of practice that threatens the authority of institutions everywhere (see Weller’s discussion in Chp 3 regarding the music and newspaper industries).

In Weller’s view, Interdisciplinarity doesn’t only apply to academic work. In reference to blogs as a genre of writing that leads to inquiry, Weller suggests that the “personal mix is what renders blogs interesting” as he explains that, in one of his favorite blogs, the author “mixes thoughts on educational technology and advice on the blogging platform WordPress with meditations on B-horror films. The mix seems perfectly logical and acceptable within the norms of the blogging community” (p. 4). The takeaway here is that digital culture remixes other cultures, including the intelligentsia, and this leads to new social formations. Scholars reinforce altered practices of engagement, learning, and knowledge production with their research, regardless of its focus or content, as they use digital tools to conduct it.

This means that the academy is changing from the inside out—a centrifugal force pushing out old hierarchies as it makes way for new networks. As Benkler (2006) suggests in Wealth of Networks, there is value in these networks, which is derived from the network itself and the swarm that embodies it. New networks have their own energy, which establishes new modes of evaluation, new means of discovery, and new ways of making meaning through human action that gnaw at the edges of disciplines keeping old hierarchies sturdy and analog identities intact.

As Weller notes in his 3rd chapter, “Lessons from Other Sectors”, academia should take note of alternative resources that lead to new forms of research and learning before it loses its institutional hold on knowledge as an ideological authority. While this may seem a bit pretentious, the everyday experiences of academics who utilize digital tools frequently reveal the pertinence of such a warning. As digital culture subsumes disciplinary culture, Interdisciplinarity becomes more of a reality and ideological apparatuses are reshaped to fit “the social classes at the grips in the class struggle” (Althusser, 1970). The “weakness of the other elements in the ‘university bundle’ could become apparent, and the attractiveness of the university system is seriously undermined” (Weller, 2011, Chp. 3, p. 8) if traditions remain carved in blocks of stone.

Digital practices chip away at those stones.

The networked foundation for digital scholars’ work gives them the stability and solidarity to tackle complex, societal issues in ways that “old guard” academics never imagined possible. As a result, they may find their efforts having a greater practical impact outside of academia because institutional standards fail to adapt. This is a dismal attitude to take towards schools, which have made technological development and intellectual growth possible for an eon. However, as Weller warns, we should not confuse “higher education with the university system” (Chp. 6, p. 1); people will find a way to accrue new knowledge in any way available, and if that means subverting the dominant, traditional university system, so be it. The integrated perspective of Interdisciplinary pedagogy that Weller draws from Ernest Boyer, which makes “connections across the disciplines, placing the specialties in a larger context, illuminating data in a revealing way, often educating non-specialists” (p. 1), is more hopeful than the critical view taken by many scholars caught up in the current system. This may be because hard working academics who strive to climb social hierarchies do not stand in solidarity together.

It is no lie that many graduate students and untenured scholars are bent on dismantling the good work of their brethren, who have spent a lifetime building the best stocks of knowledge they can in contribution to their discipline. In the end, these scholars belabor tired points in graduate seminars and faculty meetings, more concerned with asserting their self-centered agendas and personal politics as a way of accruing social capital, rather than fostering ongoing dialogue amongst their colleagues that would lead to new ideas and innovative inquiry. Digital practices tap networks that provide academics with outlets to collaborate unilaterally and avoid the traps of corporate machinery embedded in the institution, nullifying the need to burn bridges and step on toes as one makes their way in academia.

The limiting scope that arises when scholars squabble over methods of research, play tug-of-war over the line of authority, and willfully thicken tensions that arise between “hard” and “soft” sciences is perhaps the very reason why Interdisciplinary work evokes a laugh when suggested as a bona fide approach to research. Weller sees diversity as nothing to fight over. The habits of discipline are hard to break, “and interdisciplinary work requires transcending unconscious habits of thought” (p. 2). Scholars who commune through digital practices begin speaking new, integrated languages that bridge gaps between research agendas rather than widen disciplinary lacunae. This is because, in their practical nature, digital technologies dismantle boundaries of institutionalized thought, not thoughts of institutionalized scholars.

So what would Weller’s Interdisciplinary model of higher education look like?

I asked my girlfriend this question after I finished reading Weller’s book. We both have different opinions about what counts as research. You might say that we both have trouble transcending disciplinary habits. While we both attended liberal arts universities in our undergraduate studies, our affiliations as graduate students differ. I study Communication, so I consider myself a humanities scholar; she dons the tag “social scientist” as she studies Applied Anthropology.

In our conversation, I envisioned a school where scholars work together to diversify fields of interest and broaden student perspectives. Explaining my ideas, I began brainstorming for a curriculum that put Interdisciplinarity at the center of pedagogy, instead of the margin.

At first she was intrigued by my excitement.

“Could you imagine it? … What if, as an undergrad, you could take classes that blended different areas of study? Something like ‘Environmental Ecology and Spirituality,’ ‘Statistics and Performance,’ ‘Graphic Information Systems and Food Cultures,’ or ‘Creative Writing and Biochemistry.’ How cool would that be?”

Her expression went from hopeful to disturbed. “Everyone would be really confused,” she said.

Perhaps.

But I don’t see that as a bad thing. Then again, I’m a digital scholar.

Creative Commons License
PLE's Doing Away with Discipline: The Way of the Digital Scholar by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.