PLE’s Help Yourself: 9 Things Academics Can Do With Social Media

This one is specifically for my academic friends. That said, anyone who thinks of themselves as a lifelong learner should read on.

Last time, I spoke about being mindful of the Internet, tipping my hat to Howard Rheingold’s Big Idea. There is a lot out there to be overwhelmed by, that’s for sure. We should all be taking advantage of the Interwebs – without question. There is knowledge at our fingertips.

Don’t misread that statement as zealotry – I certainly don’t mean to say that laying off the Facebook and Twitter feed is a bad thing. By all means, strip down and go to the woods as much as possible. And bring people with you, too.

“Living with” technology is different from “living for” it. We all might want to understand the difference.

I’m not quite a techno-cheerleader. On the other hand, I’m definitely not a Luddite. In fact, I strive for a certain technological balance. I like my media the way I like my relationships – particular, personal, discrete – overall, complementary to my lifestyle.

Rather than launching into the typical excursion about human and non-human relations, I’d like to make a few practical observations about the way I use social media. In general, it helps me maintain an aura of conversation and interaction with others throughout my day – people present in both real and cyber space. These conversations hang together as I work and play at different times, in different places, and for different purposes. My thinking has developed in revolutionary ways as a result, and I think (I’m not sure) that social media doesn’t have to be overwhelming. It can be managed.

We don’t have to count social mediation out of what we do, and we definitely don’t have to assume it’s beyond our understanding – a perspective I’ve learned many academics hold about technology in general. I wonder if the first people who put language into use felt the same way? After all, language is the paramount technology.

Many all but scoff at the Facebooker, Tweeter, blogger or texter. Any mention of social media in conversation gets an eye roll or puckered lips. These same folks usually struggle with the most basic technology functions, missing out on great resources and passing up opportunities to extend their own learning and research. More and more people are likely to jump to irrational conclusions about the “good, the bad, and the ugly” of social media (like the ones put forth by Sherry Turkle). For so many, it’s because they’re unsure about how to manage a digital reality.

I have to admit, translating all of the data we could be exposed to on a daily basis takes a lot of effort. Then again, interpreting information has always been about how much effort you put toward it. Listening isn’t easy. Neither is filtering through the crap on the web.

I don’t blame folks for dismissing social media. There is a lot to know about, and a lot to learn how to do. The unknown has always been a major source of anxiety for people. Not knowing the “what’s” and the “where’s” of social media is one thing. The popular press helps us keep track of that, so there is little reason to use everything out there. There’s little value in being “cool” or “trendy”. However, not knowing the “hows” of technology is a different issue altogether. Rheingold calls this sort of know-how “digital literacy” – comparable to any other type of literacy – and necessary for living in a digital world.

Knowing how to do social media is “second nature” for many, and not natural at all for others. Still, anyone and everyone can be familiar with how technology works and what others see as valuable about social media. They don’t have to use it, but they should have some know-how before passing judgment. In the end, the reality of a digital world is that technology is – contrary to popular belief – always at a person’s discretion.

9 Things Academics Can Do With Social Media

The following outlines part of my Personal Learning Environment (PLE). A PLE is a relatively new idea developed by interdisciplinary scholars who see the web as a rich source for learning and wish to move toward an open, global, collaborative education system. What I’ve laid out below is a short list: the basic tools that I use every day to curate content and tame the digital behemoth into an analogue companion. These tools both satiate my attention deficiency and relieve some of the socio-economic pressures of the academy. While PLEs are supplemental to higher education – not an alternative – they can certainly be prudent additions to a person’s cache, leading to more engagement, more conversation, and more thoughtful hours of the day.

Tablet PC: I spent the money a few months ago on a Tablet PC. I got the ASUS Slider because it has a keyboard that slides out and props the touch screen up on its own (hence the name). The touchscreen sold me because of its immediacy and convenience. I read more now than I ever did – and that’s a lot in your third year of grad school. Personally, I have a lot more fun reading, posting and scrolling on a touch screen than on a laptop. On a university campus, connectivity is never an issue since Wi-Fi is (seemingly) everywhere. “App culture” is not just a new fetish but a way to pool the resources I use to work and play. I have to say that reading on a Tablet is so engaging and tactile that it is more than “reading”. How about “treading”, as a combination of “touch” and “reading”? Yeah, that’s actually pretty accurate. The 500 bucks was well worth it, by the way.

Google Reader: 90% of my daily readings are blogs. This means that, sandwiched in between all of the reading I’m supposed to be doing for class, I’m also reading the work of my peers – graduate students, younger scholars, leading researchers in technology, programmers, and comedians (because I like comedy). A little secret – I cite things from scholarly blogs on the reg-u-lar because, let’s be honest, sometimes their actual published papers are long-winded and boring. Most scholars who blog are covering the same issues in a 150-word version on a daily basis. Google Reader is great because the blogs I like are delivered to me as a feed whenever they’re updated. Better than reading the morning paper, I do most of my blog reading over coffee or whenever I have ten free minutes (wherever I may be).
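
For the curious, the machinery behind a feed reader is simple enough to sketch. Here’s a minimal, hypothetical version of what a reader like Google Reader does under the hood – not its actual implementation – assuming Python’s feedparser library; the second feed URL is a made-up placeholder you’d swap for the blogs you actually follow:

```python
import feedparser  # pip install feedparser

# Feeds to follow -- the second URL is a placeholder, not a real blog.
FEEDS = [
    "https://nicholasariggs.wordpress.com/feed/",
    "https://example-scholar-blog.org/rss",
]

for url in FEEDS:
    feed = feedparser.parse(url)  # fetch and parse the RSS/Atom feed
    print(feed.feed.get("title", url))
    for entry in feed.entries[:3]:  # first three entries, usually the newest
        print("  -", entry.get("title", "(untitled)"), "->", entry.get("link", ""))
```

A real reader just runs something like this on a schedule and remembers which entries you’ve already seen – that’s the whole trick behind “delivered to me whenever they’re updated.”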

Samsung Galaxy S Smartphone: I dislike Apple products. Android isn’t much better, but, alas, the “third form” (open source software) has yet to develop a phone operating system (OS) that actually works. The touchscreen is essential, and like the Tablet, I chose my phone because it has a good ol’ fashioned keyboard that slides out. Call me old skool but I still like buttons (and I really think autocorrect should be renamed “autoincorrect”). More significant than the tactility is the corresponding OS platform between my phone and my Tablet. Having both my mobile devices on the same OS makes every function so much easier and takes less mental effort. Also, I have a Sprint plan because they give me unlimited Internet access and texting for under 100 dollars a month. The service is spotty, but you can’t beat unlimited. For Tweeting, Fbing, email, taking quick pictures, and recording interviews, classes and important dialogues with other like-minded people who collaborate on work with me, having a solid smartphone is absolutely necessary and worth the money. I see it as a gateway to productivity.

Tape-A-Talk Audio Recorder: I use this app because it’s simple, it has big buttons and it keeps track of my recordings by date as a default function. Most of my recording is done while I jog (because I have the best ideas when I exercise). I can keep the app running in the background while I jog and listen to music or a podcast. The buttons are large enough that I have no trouble finding them without looking, even when I’m out of breath, sweaty and fumbling. There’s also an option to turn your camera button into the record button, so your phone will work like any other dictation machine. The quality is exceptional, too. The free version is great, and the pro version is worth the money for the added functionality.

Stitcher Radio: I’m a big fan of radio. Always have been. But for some reason the radio receiver in my car doesn’t work and it’s not worth the money to fix. Instead, I’ve taken to listening to podcasts. I started with Marc Maron’s WTF podcast (frequently the number one comedy podcast on iTunes) and This American Life (which is syndicated on public radio). When I found out that Stitcher collects the best podcasts from the web, I downloaded it and never turned back. It works a lot like Google Reader, except I can make different “stations” and sort podcasts into categories. In a single day I’ll listen to an hour-long interview with a famous comedian (usually something about their personal struggles with relationships and substance abuse), a 15-minute monologue by Garrison Keillor from A Prairie Home Companion, a 30-minute story on Radiolab about fistulated stomachs in both people and cows, and a 10-minute spiritual exegesis from the one and only Alan Watts. My favorite podcast recently has been a free class from Yale University on the Continental Foundations of the Social Sciences, which beautifully complements the Interpretive Social Sciences course I just took this past spring. I’ll know more about Hobbes, Locke, Marx and – everyone’s favorite – Durkheim by the end of the summer than I ever wanted to know. This is a world-class education, people. From a senior lecturer at Yale. For free. Podcasts have truly changed the way that I learn and listen, and, in my humble opinion, have helped me turn workouts and drives into prime-time educational experiences, re-extending my technologically impaired attention span.

Dropbox: There are a lot of different “clouds” floating around the Internet. I suggest finding one that works for you because it makes traveling to-and-fro so much easier, especially if you’re an absent-minded academic like I am and frequently forget your flash drive in your computer’s USB port, or fail to email yourself the necessary files for the next day’s presentation. It’s also an easy way to share files with colleagues and professors because you can share far bigger files than email allows.

Tweetcaster, Friendcaster and Spotify: For all of your social media needs, Tweetcaster and Friendcaster are much more functional apps for sharing content across platforms than the official apps provided by Facebook and Twitter. Facebook mobile tends to crash mobile devices and Twitter’s app is pretty difficult to navigate. The caster-apps make micro-blogging a breeze and are more customizable. If you don’t know why you should use Twitter, try it for a few days and then see how you feel about it; it’s a bit like having a personal CB radio that other digital-truckers tune into as they drive through web traffic. You never know when someone will help you find your way to a gold mine of knowledge you didn’t know existed right under your nose. Facebook, of course, is the great social stethoscope of our time. Your Facebook page can be the pulse of your PLE if – and only if – you manage it properly. Taking the time to manage your network will generate more opportunities for conversation and exposure to new ideas than you ever imagined possible. Finally, if you like music and you haven’t heard of Spotify, visit the site and download it already. These designers really have solved the music piracy problem, and this is coming from a person who’s been swashbuckling digital data since the Internet was delivered over a phone line.

WordPress: I blog, obviously, because I have a lot to say. More than giving me an excuse to be long-winded, my blog has made me a better writer. It also gives me a voice in ongoing conversations about issues that other people think about, care about, and want to know about. For all intents and purposes, my blog is my corner of the web where I get to host the ideas that matter to me, and it gives me a good excuse to invite others into the conversation for support and criticism. It’s perhaps the most formidable way to develop an academic voice outside of publishing in journals: making concise arguments in writing is a key skill, and making connections between your thoughts and those of others is still the best way to maintain a strong ethos. Most people who claim that the web isn’t peer-reviewed probably don’t use technology and haven’t heard of peer-to-peer networks. Where self-publishing lacks “rigor”, it certainly empowers the author to write what they want to write, when they want to write it, for people whom they are interested in having read it. Of course, business folks write shorter blogs and find value in the super-hyperlinked variety of web writing that makes bold claims, reaches as large an audience as possible, and is more concerned with attention seeking than with thought development and the careful examination of nuanced arguments. Academics, however, write longer blogs (it seems) because this genre of speech is a provincial way for them to work through thick thoughts, deep theories, and styles of writing that lead to fresh perspectives. Like the open-mic nights where stand-up comics work out new material, blogs are the open mic of the academic who is diligent about refining their craft. For me, the real challenge in blogging isn’t procuring a readership – you can use your other social media channels for that; it’s sticking with it each week (or month) and finding something of value to talk about. That takes determination and stamina. For the contemporary scholar, there is really no excuse not to blog. Search for your favorite living theorist on Google – chances are, they blog. You should, too. Remember that you blog for yourself. I find that it really matters little if others read what you write. The point is not to seek approval; it’s to practice your professional craft and develop mental and rhetorical skills. Readers are nice, though, especially when they comment (hint, hint).

Creative Commons: All people in the world of publishing and producing original content (which includes nearly all those who would ever self-willingly don the label ‘author’) need to familiarize themselves with the Creative Commons licensing initiative. Ever worry that you shouldn’t use that random picture you got off Google because someone might sue you? Ever had a concern about putting a new idea online and having it “stolen”? Creative Commons gives you a way to copyright your work in cyberspace. It’s brilliant, it’s easy, and it works. The best thing is – a lot of people not unlike yourself have made it their life’s work to ensure that CC licensing holds up in court. Check it out – it’s worth knowing and spreading the word to your colleagues, coworkers and students. Protect yourself and protect your right to share what you write.

My hope is that some of this is new to you. This list only scratches the surface, but it’s enough to give you a sense of how technology can serve the contemporary academic, intellectual, or common person’s agenda. All it takes is a little bit of know-how. Social media can be used to filter out the crap on the web, which could certainly lead to some peace of mind and – who knows – maybe even a better way to live.

Creative Commons License
PLE’s Help Yourself: 9 Things Academics Can Do With Social Media by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

The Raging River of the Interwebs

Howard Rheingold tweets that being mindful about all of the data on the web means filtering all of the crap as we wade through the waters of ever-rushing interest.

Ok, maybe I’m stretching his 140-character post on Hybrid Pedagogy’s #digped discussion group about his new book Net Smart, which went live last month (and will stay active throughout the summer). Still, Rheingold is pulling together centuries-old spiritual thought with cutting-edge technology when he suggests that digital beings can be mindful beings. He’s saying that surviving the ever-growing, ever-moving datasphere at a time when information and ways to access it grow in abundance each day requires some mental agility. Dare I say, we all need to show some “digital hubris” or what could otherwise be considered intellectual stubbornness.

We have to get unstuck from the school-age notion that the only way to really know something – to be right, to have a say, to pass the test – is to know everything there is to know. We have to decide not to be perfect – to let some things pass us by – but we have to keep trying and keep learning as we move down the river of tweets and retweets, memes, likes, posts, blogs and vlogs, and oversourced schools of email that nibble at every second of our already overbooked day. Like the reborn alcoholic or addict, we can surrender to the datasphere, acknowledging our own learning limits and realizing the full magnitude of what it has to offer. In my view, cyberspace does qualify as some form of “higher power” that is “greater than ourselves”. It’s “virtual”, for God’s sake! What could be more mystical than that?

If you don’t like that idea, don’t worry: higher powers and guilty people have historically complemented each other nicely.

“We can’t all learn everything, but we all can learn something” is a line we used to tell pledges in my college fraternity. Today, I take it as my daily mantra of digital practice. I prevent myself from falling down the “YouTube hole” and resist my unbelievable propensity to scroll down. That sort of avoidance doesn’t include technological dismissal or denial. It’s a matter of discipline – like a spiritual practice, you do what feels right. Knowing how to move through the “crap” (Rheingold’s actual word) on the Internet is key for retaining peace of mind. It’s the only way we can manage the digital information overload, which for some reason seems bigger and meaner than all of the other information overloads that have happened throughout history. But just because there’s too much information doesn’t mean that there’s too much information to manage. A fact often overlooked by the common person is that social technology is a discretionary function of everyday life, not a mandated one. That means that you have just as much ability to shut down as you do to power up. By virtue of that fact, you have just as much incentive to tailor technology to suit your needs as you do to be sucked in by flashy lights and funny pictures of cats.

We live in a world of artificial excess – the ocean, the land, the sky, and outer space are all bursting with too much stuff that gets in our way when we try to occupy them. “Space junk” threatens our safety and the purity of the environment that anchors whatever reality we’re living in at the moment. Today and forever from now on, cyberspace will be the same way. The hard part is recognizing what counts as “junk” and what doesn’t. We need tools to do that because – remember – the Interwebs is bigger and badder than us. We need super-amazing information-metal detectors, hardcore data-rototillers and social media-rakes that collect all of that rich soil good for planting positive, clever, and humorous seeds of intellect and prestige in the gardens of our social network (and, perhaps, our minds).

If we want to know how to manage the raging river that runs through our collective digital backyard, then we better know how to pick the right tools to help us reroute our expectations. We also need to know who our best guides can be as we take on the rapids of discourse and debate. We need to know who will pull us back in the boat if we get pulled into the tossing waves of argument, cut up on the sharp rocks of disinformation.

Had enough metaphors, yet? Good. Me too.

You get the point, I hope: we all need to learn how to be mindful of the Internet, which means not passing up the opportunities it presents us for both work and play. No matter your vocation, digital resources can help you just as much as – if not more than – they are said to hinder you.

Creative Commons License
PLE's The Raging River of the Interwebs by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Value in Virtual

You walk out of the gym, satisfied with your workout. You can feel the wear in your arms, still fresh from the swim. The chlorine smell from the pool drips onto the collar of your shirt as your hair dries in the sun. Paced and calm, you enjoy the walk back to the office, taking your time, taking it all in. You love being on campus in the afternoon. Students pass by in fast forward, late for class coming from work. Some are late for work coming from class. They get nearer to the cars passing on the crosswalk—closer than you’re comfortable with. One of them, wearing headphones, holds up a middle finger as a red sedan passes without even hesitating to stop, almost hitting him. You sigh in disbelief. The ignorance of Florida drivers. Still, you don’t feel sorry for the guy with the headphones. He wasn’t paying attention, either.

Three black guys coming your way are elated in conversation, smiling and joking, taking up a lot of room and making a ruckus. You smile behind your sunglasses, remembering college and recalling two friends you’ve lost touch with completely. You feel a twist of wistful nostalgia, envious of a carefree sensibility that you’ve lost. A feeling of freedom absent in a space left over from a not-so-distant youth. You catch a bit of the conversation as they pass: “Nigga, shut up! You don’t know what she said wh-” The words echo in your head. You can’t get past “Nigga” and you think about the power of words that construct reality. You wonder why so many people struggle with difference. You can’t help but think about how some words stay in use long after they should. You know that there’s a cultural identity tied up in certain labels, which have been purposed and re-purposed, but you’re not sure that some will ever be completely free of stigma. Your body gets tense when you hear certain language and you wonder if other people have the same reaction. You try to recall the last time you let your lack of cultural sensitivity get the best of you, but you can’t. You decide to omit the word “gay” from your vocabulary. You’re pretty sure you won’t be completely successful with that. Still, it’s worth trying.

Ahead of you is a series of waist-high boards, propped up and lined in a row next to a table. Some sort of campus group set them up on the green, no doubt. Maybe protesters. Maybe street preachers. Who knows. You guess names as you walk up, thinking up possible student groups: Students for Social Change…Young Democrats of Tampa…Occupy USF…Campus Coalition for the Homeless. You hope it isn’t the anti-abortion propagandists from last week.

Turning the corner, you see that the boards are yellow with a big, sloppy number painted on the front of each. The first one says “US Total Debt.” The number is in the trillions. Silly. You keep walking. Then you stop, take out your camera and kneel beside the last number, snapping a few pictures, taking a shot of the whole thing. You think about how ridiculous money really is, how the national debt is merely an indicator of a government’s inability to play by its own rules. Laughing, you resist the urge to go ask the student standing at the booth if he realizes that money has become less and less real. You want to know if he sees the irony in the whole display, which uses large, physical objects to, quite literally, make money real for us. Money that is rarely represented by dollar bills. Money that’s no longer in our pockets as much as it’s in our clouds. Money that you can spend on Google Checkout. You realize that you’re probably the only one reading into this so deeply, so you keep moving. You check around, but no one is staring.

You head in the direction of the cafe in the basement of the business building. You don’t even notice that you failed to catch the name of the student group responsible for what turned out to be a clever political statement about capitalism, systems of exchange, and material culture. When you get to the cafe, you order a tuna sandwich, wondering if it’s healthier than roast beef. You decide that you don’t really care. The workout was a good one. Looking through the pictures on your phone of the giant, wooden numbers, you think about your morning. You see yourself sitting in the communication building performance lab, surrounded by colleagues and mentors, listening intently to Mary Catherine Bateson talk about learning. She’s disarming, almost prophetic.

She leans forward in her seat, sculpting the air with her hands, looking at you, then past you, then next to you, then the other way. She talks about the importance of play and improvisation. You nod in agreement. You shift your weight. You lose track of the room as you zero in, focused. She answers questions with adapted lecture notes that come out like mini-seminars, genuinely honest and spontaneous, yet authentically true to her thoughts. Old thoughts. Thoughts she’s mulled over and adapted for years. You realize this is what she means when she says “we’re all making it up as we go along.” She says we need to spend more time being reflective—that all wisdom is derived from thinking about thinking. That “thinking about thinking” is the same as “learning.” She insists that we should find a way to dictate our actions as they’re happening, not just talk about them after the fact. If we can do that, we’ll reveal that we don’t really learn in the “now” but that we’re always making reality out of things that we already knew.

Mustard farts out of a bottle. You look at the woman behind the counter as you grab your sandwich and ask for a pickle. You decide that Bateson’s “now” has got to be connected to Michael Heim’s “virtual,” which he says is another word for “as if.” You hand over your card to pay. “$8.50,” the cashier tells you, handing it back swiped. You don’t get a chance to process the information, but you think that $8.50 is too much for lunch, especially considering the quality of the bread. You try to recall the last time you paid with cash, thinking about how your sense of money and value has shifted in the last decade. When did everyone start paying with plastic? When did that become normal? You can’t seem to pinpoint it.

Moving toward the plastic silverware, you steal more than your share of knives and take a handful of napkins. You briskly open the door with your back, hands full of food and utensils, hoping no one will yell at you. As you scale the steps, you move toward the sunlight, heading for your building. You talk to yourself out loud, unaware that someone’s coming down the steps: “Virtual is the moment we reflect on what we think. The moment we make reality in our own words. It’s a reality made out of nothing but what we remember from our experience. And our experience is only what we make of it.” You think that’s pretty clever, but know it needs some work.

The woman coming down the steps makes awkward eye contact. You stop talking, not sure if she heard you. You’re pretty sure she speeds up as she passes. You wonder why you’re so weird. You decide that when you get to your office, first things first, you’re going to start writing. Get it all out. At the top of the steps you breathe in the sunny Florida air and you ask yourself: Is all thinking virtual? Is money only a thought? If so, what’s the value in thinking? And what’s the value in virtual?

Creative Commons License
PLE's Value in Virtual by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

“Truth” in Analogue

Everything is going digital. Everywhere you look, there are digital things. People use the word “digital” frequently. Right now, I’m on a quest. Questioning other people this week, I’ve asked, “What does digital mean?” Most people just stared at me. A few took a shot at an answer:

“Zeros and ones, right? Binary code?”

“Anything that deals with computers, I think.”

“When I think of digital, I think of telephones and computers, anything that sends a signal.”

“To be honest, annoying is the first thing that popped into my head.”

“I always thought that it just meant something new.”

I appreciate these responses, but I still don’t know what is meant when people call something digital. There’s little continuity in the colloquial definitions that emerge in conversation with peers. As I suspected, no one really knows what they mean or what they’re referring to when they use the word. In actuality, few care.

There’s a term for this kind of thing:

Epic fail.

I’m driven to know about the meaning of digital by a severe suspicion about what free and easy use of the word indicates. I don’t think we should continue to employ a word as loosely as we do digital when we don’t fully understand it. This may seem like a problem of mere semantics. It’s not. It goes deeper than words. It’s a question about how we’re coming to live in a world that we continue to call by a name that we don’t fully grasp. It’s a question of virtues.

In the socially constructed reality that we embody as homo technicus, what we call the machines that we live with is indicative of how we come to understand our values as human beings. To inquire about the meaning of digital is to examine the labeling we’ve developed for packaging knowledge about the way we relate to each other in contemporary society. It’s to inquire about the ways our interactions involve technology and whether they are, or aren’t, valuable.

In short, words is important.

See what I mean?

Words are symbols used to communicate and function in the world. Symbols stand for something. Digital is a symbol. I want to know what it stands for, but more pointedly, I want to know how the use of digital—as a naming device—signals a certain way of being in the world, where our actions are based on the values of a socially saturated society and our relation to technology straddles the thin line between virtue and vice, as Sherry Turkle suggests.

Symbols are made out of language. Language is a systematic tool for constructing symbols and expressing subjectivity in an objective way—a technology that we use, in talk and text, to give definition to what we see, think, and do. Through the objectivation of linguistic symbols, we cultivate a sense of agency as our ideas, thoughts, and actions become “things” that we can refer to, possess, change, and value. So understanding what is meant by digital is not just squabbling over words. It takes up the task of understanding how we live in a world that we continue to suggest is characteristic of a specific, albeit poorly understood, symbol.

Asking people to explain digital was hardly insightful, so I go to the library, looking for any book that includes the word digital, scouring for context clues and straightforward definitions. Hours later, I’m sprawled out on the floor between shelves, knee-deep in a puddle of scattered books, reading about circuits, channels, and transmissions. Engineering books. Books with digital in the title.

I’m no engineer. I don’t understand the math. But I’ve had enough communication classes to grasp the language, which includes words like signal, circuit, transmission, network, and system. In terms of communication research, these are concepts from a bygone age—an era before the paradigmatic shift to human communication research that privileges the mutual constitution of messages over an information-transmission model. I begin thinking about the ways engineers talk about the digital signal:

Digital signals are limited in flexibility.

Digital signals take less effort to transmit.

Digital signals can be manipulated and stored without much error.

Digital signals are better for performance.

Digital signals involve less noise.

Digital signals are less variable and more easily controlled.

Digital signals are based on a limited set of estimated values.

Thinking harder, I search for ways that this knowledge-set translates to human communication in a digital world. I wonder what “digital”—as a symbol for the way we communicate in a technological society—signals about the way we live:

We are less flexible.

We expend less effort as we communicate.

We can manipulate and remember without much error.

We are better at performance.

We deal with less noise.

We are in more control and put up with less variability.

We base what we do on a limited set of estimated values.

What, then, is meant by digital? Precision is the word that comes to mind. To be digital is to be clear, repeatable, exact.

Indeed.
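
If the engineering talk feels abstract, the “limited set of estimated values” is easy to see in miniature. Here’s a minimal sketch of what digitizing a signal does—the sine wave and the 3-bit depth are arbitrary choices of mine, and NumPy is assumed:

```python
import numpy as np

# An "analog" signal: a smooth sine wave, finely sampled in time.
t = np.linspace(0, 1, 1000)
analog = np.sin(2 * np.pi * 5 * t)

# Digitize it: snap every sample to the nearest of 2**3 = 8 levels,
# roughly what a 3-bit analog-to-digital converter would do.
levels = 8
digital = np.round((analog + 1) / 2 * (levels - 1)) / (levels - 1) * 2 - 1

print(len(np.unique(analog)))   # many distinct values in the smooth wave
print(len(np.unique(digital)))  # at most 8: a limited set of estimated values
```

The smooth curve comes back as a staircase: every reading snapped to the closest allowed value, the in-between possibilities discarded.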

We have GPS. We travel with more accuracy by giving up our geographical sensitivity, relying on systems that estimate our destinations. We end up forgetting that we can value our journey as much as our arrival. We give up the possibility of getting lost for the guarantee of finding our way. We trade the excitement of risk for the security of reward.

With the Internet, the world is becoming clearer, our ideas are more repeatable, and what we do more exact. But there is a loss of value inherent in the digital, a sense of certainty ensured in the processing of signals, of life, that disallows discrepancy, variation, and divergence. Who doesn’t like certainty?

Yet, it is the variability we lose in the digital transmission of a signal that devalues living. Discrepancy, variation, and risk give a signal a richer set of values. They also make life more robust, vibrant, and fulfilling. Surprise and excitement are essential elements of a totalized experience of reality. They make our life more interesting, enriched with more possibilities. When precision is favored over everything, we lose a sense of value that we might not be able to recover. We lose our sensitivity to variability and count on limited values, which is another way of saying that we limit our options for what could be considered “true” or “good”.

In the finite province of meaning that engineering language provides, digital means precision. When associated with the way we live in a digital world, life becomes predictable. It describes a world that is less forgiving, where the accuracy of signals relies on a smaller index of values and the “truth” of our lives rests on a limited set of possibilities.

Walking away from the library with a lot of questions and a few books, I’ve got a lucky find in my hands. Wedged in between books about transistors, binary code, and baseband waveforms was Being Digital by Nicholas Negroponte. It must’ve been misfiled because it’s not an engineering text. Negroponte wrote the book at the beginning of the digital revolution while he was still chairman of the MIT Media Lab (which he founded). It’s dated, for sure, and it’s clear that Negroponte wears rose-colored glasses when he talks about the future (now past) of digital systems, but these words strike me as eerily relevant:

“multimedia narrative includes such specific representations that less and less is left to the mind’s eye. By contrast, the written word sparks images and evokes metaphors that get much of their meaning from the reader’s imagination and experiences” (p. 8)

It’s ironic that no one seems to know what digital means but they continue to use the word. Do they know what they’re getting into? Do they know that being digital might mean less possibility and limited values? Have they considered that trading virtue for virtual could mean narrowing imagination and circumscribing creativity? I doubt it. It seems to me that the more we call the world digital, the more we ought to ask about the cost of precision. I wonder about the values that we forgo, the possibilities lost in the signal, and the remaining “truth” in analogue.

Creative Commons License
PLE's “Truth” in Analogue by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

It’s all fun and Facebook until somebody shoots their “I” out

Damn you, Joseph Kony.

I swear, the man is everywhere. Except for Uganda.

Right now, there is a lot of fuss about Invisible Children’s Kony 2012 campaign. I don’t want to discuss the particulars but I want to discuss friendship, values, and how we use Facebook.

A quick story:

I was de-friended on Facebook after a thick debate regarding the credibility of Joseph Kony propaganda. The argument came to blows once the conversation shifted from a contestation of opinions to a critique of values. Suggesting that my friend was drawing from poor sources, I questioned his judgment when I told him that his evidence was garbage and that he shouldn’t post things like that on my thread. Albeit much tamer than the “flaming” of yesterday’s Internet, the remark didn’t go over so well.

The next day, we were no longer friends.

I wanted to apologize. I searched everywhere, scouring the depths of Facebook. He was gone. No relationship page; no shared interests; no mutual friends; disappeared from the member lists of mutually affiliated groups.

Poof.

I’ve never really noticed being de-friended before, so you can imagine that after 7 years of Facebooking, it came as a bit of a shock.

This person wasn’t just a Facebook friend. I would consider him a real friend—a person I’ve known for years, whose relationship I cherished. Like so many relationships that give way to time and distance, we remained friends because of Facebook. We took advantage of the channel, kept it open and stayed connected. Without Facebook, the knot of our relationship would have come unraveled long ago. At least that’s the way I see it.

It was more than a “weak tie”—he was someone I respected and appreciated; a fraternity brother, a mentor, a person worth seeing again. But because our email addresses, cell phone numbers, and mailing addresses have most likely changed since we last saw each other, Facebook was the direct tether between us.

Since when did we start losing friends because of Facebook?

While I’m new to it, I know that it happens a lot to others. As people get more comfortable with digital communication, they become less aware that they’re playing with technologies of the self, fluid identity, slippery traditions, and social values. As danah boyd (2007) observes, social network sites give us the ability to “write [our]selves and [our] community into being” (p. 120) as we post and comment, friend and de-friend.

And it’s all fun and Facebook until somebody shoots their “I” out. Politics are especially challenging.

Theoretically, Facebook is complicated and paradoxical by nature. It is a utility for self-expression—what Clifford Geertz (1973) might call a venue for “serious play.” The type of communication that occurs there isn’t just something done for fun, and it’s no longer a novelty of youth culture. In the past few years, it’s become a significant arena for performances of all kinds, where “facework” is learned and done (Goffman, 1959), information is churned over until it becomes meaningful, and relationships are held in tensions that sustain and maintain bonds (Baxter, 2006). Put another way, it’s an important place where life is lived and culture is formed.

Also, it’s not going away anytime soon.

The lesson that I learned from being de-friended is a lesson that all Facebookians might consider. What’s at stake, when we’re so free to make connections wherever we see them, is the further weakening of a tradition of social connection that’s been thinned by American individualism for some time (see Bellah et al., 1985). If we can choose to connect so easily, friending whomever, willy-nilly, then we can choose to disconnect in the same way, without thinking about why we value friendships, what’s gained, and what’s lost. The very idea of friendship is cheapened when we fall in and out of relationships so easily.

The abstract, contingent, and discrete nature of reality in an increasingly digital world, which allows us to tinker with the meaning of social institutions, is something that Kevin Kelly (1994) describes in his book Out of Control. What he calls the “rising flow” (p. 404) is a wave of life that implicates each of us as architects of boundless, yet uncontrollable symbolic systems. We all cope in our own small ways with the inevitability of entropy by making sense of our lives through logic, reason, and the ordered coding of social construction.

The way we communicate with new media can lead to dismay, disarray, and isolation, not to mention desire, affinity and what some might call addiction. As individuals are caught up in the rising flow, they’re swamped under a sea of moral turmoil and ethical chaos, like surfers tumbling under a “sustainable crest always falling upon itself, forever in the state of almost-toppled” (p. 405). Communication turns ugly quick in a digital world. Kelly’s metaphor is one that forewarns the present, as de-friending becomes a practicable reality. Especially during times of high political tension, when movements like Kony 2012 come to the fore of Facebook newsfeeds and timelines, a wider sense of order can be shattered, logic and structure can be quickly abandoned, and conversations may end in easy dismissal. People begin to opt for disconnection when misinformation is too much to bear, and they vanish into the virtual ether.

As my friend posted before he dropped out of my network, “This is a subject that makes my blood boil, I have zero sense of humor, and very little patience on the subject, so I’m done. Do what your conscience tells you.”

I imagine that the crash of sensibility on Facebook is a reason why many choose not to use social media. Dismissiveness is a caveat of digital living, often framed as an unhealthy replacement for analogue communication (Turkle, 2011). While replacement is hardly the right word—in fact, communication is hardly ever purely analogue or purely digital (Bateson, 1972)—Facebooking, like all mediums, involves some sort of risk, some kind of uncertainty, and some degree of necessary vulnerability for it to be worthwhile. This sort of exchange value puts the “capital” in “social capital.”

Talk about politics in a public forum, risk losing a friend. Question a person’s intellectual integrity in front of others, you’ll probably get burned. Cite a phony source, take a chance of being called out. The safe way to cope with this system is to drop out of it altogether. On Facebook, de-friending is just as good.

That’s not a reason to either abandon or blame Facebook. It’s a reason to be more aware of ourselves as digital beings. It’s a call for a new tradition that doesn’t value disconnection in the same way.

At least not disconnection at the drop of a hat.

The key to sensible communication in the digital age is to be careful with fragmentation. I value Facebook because it allows me to hold onto relationships. Some of those relationships have fallen quite dormant, true, but others have been lively all along. In fact, I can only think of 3 instances when I’ve de-friended anyone, ever. The ability to stay connected with disparate others, despite estrangement, is a technological privilege, not a right. As a privilege, we can recognize that negative perceptions, which frame digital communication as contributing to disquiet, concern, and (perhaps) pathology, lead to self-fulfilling prophecies. We’d benefit by keeping conversations going—even if others rub us the wrong way—so that channels remain open between differently minded people, keeping communicative possibilities open and available if necessary.

Maybe then we can ride the crest of the wave for just a little longer. Maybe experience the next paradigmatic sea-change, together.

If, as political beings who use global communication networks, we decide to abandon relationships in the face of cyberspace disagreement, then we reify David Bohm’s (1996) fear of a contemporary society where “people’s self interest and assumptions take over … against the best of intentions — they produce their own intentions” (p. 14). In this scenario, the world grows more fragmented, and it does so at the speed of Moore’s Law. The digital risk of Facebook friendship is that, one day, de-friending could become more common than friending. If that were to be the case, the value of friendship would rest on the mere contingency of disconnecting from others.

To use a common metaphor, Facebook is a technology of the self and a matter of the heart. It’s a place where values are negotiated as logic is tested. When social media—and social media campaigns—become a reason to end relationships, we can say that our collective orientation to the social world, to values, and to “the good” has shifted.

We’re in need of a digital ethics—some sort of tradition of digital communication that allows it to flow freely, slows down the crashes, and keeps channels open. Without some sort of guide—some fault line drawn collectively between social good and social ill—friendship may start to be defined by de-friendship. If we want to hold on to the worthwhile moral sensibilities of the past, we have to put connection over disconnection. We have to work on establishing the fault line together.

As Alexis de Tocqueville observed more than 170 years ago, the American sense of individualism threatens social disillusionment and despotism. As Bellah et al. (1985) argued about the American worldview over 25 years ago, “What we find hard to see is that it is the extreme fragmentation of the modern world that really threatens our individuation … our sense of dignity and autonomy” (p. 286). When being digital begins to retrain our habits, redefining friendship and blurring ethical boundaries, it becomes more and more crucial that we pay attention to the ways we relate—if for no other reason than to redraw the boundaries of our moral horizons, so that culture itself isn’t so “loosely bounded” in the face of easy isolation.

Creative Commons License
PLE's It’s all fun and Facebook until somebody shoots their “I” out by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Digital Being Free

When I was 12 years old, I was super-proud. I’d finally accomplished the one thing that my buddy John had yet to do. It took a month, a lot of diligence, and some trial and error, but I finally did it.

My Winamp playlist had reached 25 songs.

It was a big deal.

No one else on the block had as much music as I did. John lived a few blocks away, and even he—a rich kid with a cable modem and a 4-gig hard drive—had a poor working knowledge of music and lacked the patience to download a full playlist. In fact, he would download a song, listen to it a few times, and delete it because “it got old.” It wouldn’t be until years later that he would make mini-discs filled with music that would eventually scatter the backseat and trunk of his car. As a pre-teen, however, he was much more concerned with hiding porn from his mom on an ftp site than he was with building a music collection.

Back then, it was one medium at a time, a “pick your poison” style of consumption. Processors weren’t that great, and a person worried about fires and literal machine meltdowns when music played in the background of games and other applications. I blew up two computers before I was 18.

Regardless, my music collection was my claim to fame and I reveled in my collecting abilities compared to John’s. I prided myself on the knowledge I’d acquired about navigating share sites like Scour, AudioGalaxy, Morpheus and eventually Kazaa and Napster. I’d sit and play with MS Paint, search TuCows for wallpaper schemes, and work on developing the flashiest (and gaudiest) AngelFire site that I could. I basked in the family’s office, able to escape the surveillance of my parents, learning the words to “Bittersweet Symphony” by The Verve, “What Would You Say” by Dave Matthews Band, “Hunger Strike” by Temple of the Dog, and anything by Primus as I flirted with classmates and stirred up gossip on AIM.

This was the digital world in which I was reared—a lifestyle enclave that I’m pretty sure any other 26-year-old, middle-class technophile might describe. Being “on the computer” was a nightly activity that provided a sense of freedom and mobility away from my parents, chores, and school work. Beyond the $10-a-month “high-speed” cable plan that my parents paid for—gritting their teeth and hanging it over my head—the web was free rein.

Music was free. Wallpapers were free. Access to John’s ftp site was free. Paying for anything was never even a thought. Why would this stuff on the Internet—which was pretty hard to locate and laborious to acquire—cost any money? To a 12-year-old, money grows on tall trees out of reach, and the Internet was not for tall people. It was for people like me—short people; young people; free people. Like in Peter Pan, we were the lost boys—free to our own digital devices.

I learned to be a pirate at an early age, but I didn’t know that it was piracy. I thought it was surfing the web. At 12, legal realities are nothing but futurist fantasies. Semantic contagion soon took hold, and I came to understand digital practice as rife with controversy.

Fast forward to college—to University networks with End User License Agreements that threatened to sue and sell your information whether you agreed or not; to stories of kids being hauled to jail for downloading music; to worries about child pornography on public networks; to suggestions from analysts on FOX News that Columbine happened because of Internet archives; to discourse about terrorists recruiting Americans for suicide bombings via Facebook; to Craigslist killers; to Megan Meier’s cyberbullying suicide; to stolen identities; to politician and athlete Twitter scandals; to…

The list goes on and on.

In ten years, the Internet went from a free place to a dark place. I grew up. I learned about the law. I learned that artists liked to make money from their music and that, if it was good music, they were entitled to that money.

But I didn’t care. They could still play it live.

I kept on pirating. Kept on asking my cable guy how closely the company monitored downloads, slipping him a few bucks for faster service, under the table. I sought out and discovered sites like TVLinks, ISOHunt and The Pirate Bay, where I could find movies that weren’t yet released, download software that was well beyond my economic reach, and continue to build my music library until it touched the digital sky.

25 songs became 50 gigs’ worth of songs as bandwidth increased and download speeds soared. I thought I was doing something really rebellious, but when I asked around I found myself trailing far behind the content collection leaders. In the world of digital pirates, I was practically the Swiss Family Robinson. Most had discovered BitTorrent long before I did and had exclusive access to sites like OiNK’s Pink Palace for whatever media their hearts desired.

I grew up at a time when cognitive surplus was welling up around personal computer consoles, where social and cultural capital was circulating at rates that were historically unmatched. I learned to be digital by being free. Much of the conceptual purchase-power of the early days of that culture—when the Internet was far from ubiquitous—stemmed from the non-monetary value of exchange. Digital practices like downloading music, scouring freeware, and surveying the depths of the “information super-highway” had value because they provided a certain freedom to the user, a way of “doing virtual” that was eventually redeveloped by venture capitalists and government officials, annexed by the corporation and the state.

What was incredibly important to an entire generation who grew up enamored with digital technologies in the house—easy enough to use, and interesting enough to be fun—became a battleground for economic warfare and legal bargaining. Ironically, the more money I put into my digital devices nowadays, the less I appreciate them or the experience they facilitate. Isn’t that funny?

Isn’t that sad? It’s like losing your digital innocence.

To me and my cohort, free culture had less to do with money and more to do with possibility. Music sharing and collecting used to be about possibility, and embedded in that digital practice was not just a value, but a virtue that came to define the entire digital culture.

Music sharing was never really about not paying for music—that was just a perk realized fully once the generation came of age, got jobs, and witnessed the invention of the iPod. Music was a matter of prestige, a way to show that you had taste and knowledge bundled up in digital know-how. In the world we live in now, technology-centered law creates a fertile ground for breeding guilt in youth who want to listen to music, share their wealth of artist knowledge, and learn how to express themselves through networked machines.

Music will never be their method for cultivating knowledge. Not like it was for my cohort. I was taught from the get-go that digital culture was non-commercial. Somewhere over time that changed entirely. Value was lost. But it might be coming back.

Lawrence Lessig (2004) makes a good point when he suggests that “competing with free” is a good thing because “the competition spurs the competitors to offer new and better products. This is precisely what the competitive market was to be about” (p. 302). What he could never have conceived before the advent of behemoth social network sites like Facebook, which reestablish the valence of social and cultural capital in exchange, was that free could itself be marketed and profited from in a network society.

Applications like Spotify have changed the game in ways that have only begun to be understood by most. Their business model, which exchanges free access to the world’s music (songs that are uploaded by individual users themselves) for subjection to advertising, shows that a free culture is still the richest culture that digital people have. It is hard to compete with free when it is done as well as Spotify has done it. In fact, you could say Spotify has changed the definition of free, so that it no longer references monetary value at all.

Or maybe I’ve been brainwashed by all of the ads.

I guess that’s just the price you pay.

Creative Commons License
PLE's Digital Being Free by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.

Doing Away with Discipline: The Way of the Digital Scholar

In his sixth chapter, “Interdisciplinarity and Permeable Boundaries,” in The Digital Scholar: How Technology Is Transforming Scholarly Practice, Martin Weller (2011) anchors the idea of Interdisciplinarity in digital practices that reshape society. Drawing from Chris Anderson, the current TED curator, he claims that “lightweight and unrestricted forms of communication found in many Web 2.0 tools may serve the needs of Interdisciplinarity to overcome existing disciplinary and geographical boundaries” (p. 2).

Weller suggests that open, digital, networked technologies are, in many ways, responsible for an “unexpected collision of distinct areas of study” (p. 2). To an increasing extent, digital culture permeates the walls of the ivory tower as technologies enable new practices, which “create[s] a common set of values, epistemological approaches and communication methods” that “override those of separate disciplines” (p. 3). Approaches to research emerge that refigure what it means to be a researcher as academic behaviors encompass more and more digital practices. Researchers adhere to new, emergent norms of discovery in their work, which often run counter to the traditional, fragmented, departmental models of an analog past. As a result, new pathways leading to different-yet-viable methods of knowledge production are formed, reshaping institutions and disciplines as they crystallize via publication. As scholars tread ground beyond their familiar intellectual territory, pursuing innovative ideas outside of their academic home, they form alliances with others by way of new media. Blogs, social network sites, and wikis evolve with scholars’ ideas as convergence cultivates creativity, play, and other forms of generative learning that cut across disciplinary boundaries.

This is a big deal for an academy structured around a model of institutionalized knowledge, one that developed its fragmentary schema of disciplined study sometime in the medieval period. In the clichéd words of Bob Dylan, “The times they are a-changin’.”

For Weller, Interdisciplinarity goes beyond the physical constraints of pre-networked society, where “Journals need[ed] to have an identified market to interest publishers, people need[ed] to be situated within a physical department in a university, [and] books [were] placed on certain shelves in book shops” (p. 2). Digital practices lead to virtual spaces where cultural norms and standards adapt to new possibilities, enabled by global networks of scholars who reform the functions of their trade and find innovative uses for new media tools lent to research efforts.

Problems arise in the academy when clashes between the realities of digitally-oriented and analog-secure scholars lead to disagreements about rigor and relevance. Many scholars oriented toward the tools of a pre-network society (i.e., analog technologies and traditional means of gaining public notoriety) remain unconvinced that digital practices can be rigorous or salient. As skeptical reactions toward Wikipedia’s credibility illustrate, many academic professionals who hold sway over tenure promotions and search committees remain suspicious of digital practices, distrusting the viability of knowledge that emerges through work digitally prodused under the cultural auspices of openness, free access, and quick turnover.

Interdisciplinarity is at once condoned and contested when tied to emergent digital practices. Weller’s discourse frames this “schizophrenic attitude toward Interdisciplinarity” (p. 1) as a problem of exploding traditions.

He exposes a reality in the academy where scholars of an “old guard,” seeking to defend the boundaries of institutional disciplines, cling to the analog tools and methodological constraints of an old paradigm. Digital scholars entering the academy, as both students and new faculty, are forcing those who protect the standards of traditional approaches to yield their posts as they crash institutional gates with smartphones, tablets, Google Analytics, Blogger, and Twitter – all tools that diversify research audiences, amplify scholars’ messages, and ensure that scholarship has a larger impact when published.

In short, the digital difference in scholarship is Interdisciplinarity, since digital practices break down barriers. With digital tools come digital practices and standards that academic institutions must take into account as they move into the future. Academic definitions of knowledge and discipline are forced to shift with a paradigm of practice that threatens the authority of institutions everywhere (see Weller’s discussion in Chp. 3 regarding the music and newspaper industries).

In Weller’s view, Interdisciplinarity doesn’t only apply to academic work. In reference to blogs as a genre of writing that leads to inquiry, Weller suggests that the “personal mix is what renders blogs interesting” as he explains that, in one of his favorite blogs, the author “mixes thoughts on educational technology and advice on the blogging platform WordPress with meditations on B-horror films. The mix seems perfectly logical and acceptable within the norms of the blogging community” (p. 4). The takeaway here is that digital culture remixes other cultures, including the intelligentsia, and this leads to new social formations. Scholars reinforce altered practices of engagement, learning, and knowledge production with their research, regardless of its focus or content, as they use digital tools to conduct it.

This means that the academy is changing from the inside out – a centrifugal force pushing out old hierarchies as it makes way for new networks. As Benkler (2006) suggests in The Wealth of Networks, there is value in these networks, derived from the network itself and the swarm that embodies it. New networks have their own energy, which establishes new modes of evaluation, new means of discovery, and new ways of making meaning through human action – all of which gnaw at the edges of the disciplines keeping old hierarchies sturdy and analog identities intact.

As Weller argues in his 3rd chapter, “Lessons from Other Sectors,” academia should take note of alternative resources that lead to new forms of research and learning before it loses its institutional hold on knowledge as an ideological authority. While this may seem a bit pretentious, the everyday experiences of academics who frequently utilize digital tools reveal the pertinence of such a warning. As digital culture subsumes disciplinary culture, Interdisciplinarity becomes more of a reality and ideological apparatuses are reshaped to fit “the social classes at grips in the class struggle” (Althusser, 1970). The “weakness of the other elements in the ‘university bundle’ could become apparent, and the attractiveness of the university system is seriously undermined” (Weller, 2011, Chp. 3, p. 8) if traditions remain carved in blocks of stone.

Digital practices chip away at those stones.

The networked foundation of digital scholars’ work gives them the stability and solidarity to tackle complex societal issues in ways that “old guard” academics never imagined possible. As a result, they may find their efforts having greater practical impact outside of academia because institutional standards fail to adapt. This is a dismal attitude to take toward schools, which have made technological development and intellectual growth possible for eons. However, as Weller warns, we should not confuse “higher education with the university system” (Chp. 6, p. 1); people will find a way to accrue new knowledge in any way available, and if that means subverting the dominant, traditional university system, so be it. The integrated perspective of Interdisciplinary pedagogy that Weller draws from Ernest Boyer – which makes “connections across the disciplines, placing the specialties in a larger context, illuminating data in a revealing way, often educating non-specialists” (p. 1) – is more hopeful than the critical view taken by many scholars caught up in the current system. That may be because hard-working academics striving to climb social hierarchies do not stand in solidarity together.

It is no lie that many graduate students and untenured scholars are bent on dismantling the good work of their brethren, who have spent a lifetime building the best stocks of knowledge they can in contribution to their disciplines. In the end, these scholars belabor tired points in graduate seminars and faculty meetings, more concerned with asserting self-centered agendas and personal politics as a way of accruing social capital than with fostering the ongoing dialogue amongst colleagues that leads to new ideas and innovative inquiry. Digital practices tap networks that give academics outlets to collaborate laterally and avoid the traps of corporate machinery embedded in the institution, nullifying the need to burn bridges and step on toes as they make their way in academia.

The limiting scope that arises when scholars squabble over methods of research, play tug-of-war over authority, and willfully thicken the tensions between “hard” and “soft” sciences is perhaps the very reason why Interdisciplinary work evokes a laugh when suggested as a bona fide approach to research. Weller sees diversity as nothing to fight over. The habits of discipline are hard to break, “and interdisciplinary work requires transcending unconscious habits of thought” (p. 2). Scholars who commune through digital practices begin speaking new, integrated languages that bridge gaps between research agendas rather than widen disciplinary lacunae. This is because, in their practical nature, digital technologies dismantle the boundaries of institutionalized thought, not the thoughts of institutionalized scholars.

So what would Weller’s Interdisciplinary model of higher education look like?

I asked my girlfriend this question after I finished reading Weller’s book. We have different opinions about what counts as research – you might say that we both have trouble transcending disciplinary habits. While we both attended liberal arts universities as undergraduates, our affiliations as graduate students differ: I study Communication, so I consider myself a humanities scholar; she dons the tag “social scientist” as she studies Applied Anthropology.

In our conversation, I envisioned a school where scholars work together to diversify fields of interest and broaden student perspectives. Explaining my ideas, I began brainstorming a curriculum that put Interdisciplinarity at the center of pedagogy instead of at the margin.

At first she was intrigued by my excitement.

“Could you imagine it? … What if, as an undergrad, you could take classes that blended different areas of study? Something like ‘Environmental Ecology and Spirituality’, ‘Statistics and Performance’, ‘Geographic Information Systems and Food Cultures’, or ‘Creative Writing and Biochemistry’. How cool would that be?”

Her expression went from hopeful to disturbed. “Everyone would be really confused,” she said.

Perhaps.

But I don’t see that as a bad thing. Then again, I’m a digital scholar.

Creative Commons License
PLE's Doing Away with Discipline: The Way of the Digital Scholar by Nicholas A. Riggs is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at nicholasariggs.wordpress.com.