Category Archives: social media

Why don’t more tweets get @replies or retweets?

As Jennifer Van Grove wrote at Mashable yesterday, “research shows that 71% of all tweets produce no reaction — in replies or retweets — which suggests an overwhelming majority of our tweets fall on deaf ears.”

Sysomos, maker of social media analysis tools, looked at 1.2 billion tweets over a two-month period to analyze what happens after we publish our tweets to Twitter. Its research shows that 71% of all tweets produce no reaction — in the form of replies or retweets — which suggests that an overwhelming majority of our tweets fall on deaf ears.

Sysomos’ findings also highlight that retweets are especially hard to come by — only 6% of all tweets produce a retweet (another 23% solicit replies).

I’ll admit, this doesn’t shock me, based upon my experience over the years.

Many of my tweets are retweeted, but then, I have above-average reach at @digiphile and engaged followers.

I know I’m an outlier in many respects there, and that the community that I follow and interact with likely is as well.

This research backs up that anecdotal observation: people are consuming information rather than actively interacting with it. But my own experience doesn’t jibe with that greater truth, and that’s why I chimed in, even though I know it may expose me to more of my friend Jack Loftus’ withering snark. (If you don’t read him at Gizmodo, you’re missing out.)

Why don’t people @reply more?

So what’s going on? I have a couple of theories. The first is that @replies are much like comments: most people don’t make either. Even though social networking has shifted many, many more people into a content production role through status updates to Twitter, Facebook, Foursquare (and now perhaps LinkedIn), the 90-9-1 rule, or 1% rule, still appears to hold across most of the social Web. Participation inequality is not a new phenomenon.

The scope of that online history suggests that the behaviors of yesteryear aren’t completely subsumed by the explosion of a more social Web. Twitter and Facebook do appear to have diminished long-form blogging activity and comments on posts, as netizens have moved their meta commentary to external social networks. And even there, recent Forrester research suggests that social networking users are creating less content.

In other words, it’s not that Facebook or Twitter sucks, it’s that human behavior is at issue.

It’s not that Twitter or its employees or developers per se are at fault, though you can see where, for example, Quora or Vark are expressly designed to create question and answer threads.

It’s that, for better or worse, the culture of the people using Twitter is expressed in how they use it, including the choice to reply, RT or otherwise engage.

If the service is going to grow into an “information utility” and become a meaningful venue with respect to citizen engagement with government, the evolution of #NewTwitter may need to add better mechanisms to encourage that interaction.

So is Twitter useful?

As Tom Webster pointed out at his blog [Hat tip to @Ed]:

As a researcher, if I were writing this headline, I would have written it thusly: “Nearly 3 in 10 Tweets Provoke A Reaction.”

I follow about 3,000 people on Twitter. If we assume that this lot posts five tweets per week (a conservative figure), that’s 15,000 tweets I could see in a given week, were I to never peel my eyes away from Tweetdeck. The Sysomos data suggests that of those 15,000 tweets, 4,350 were replied to or at least retweeted. See, I think that’s actually a big number.

In other words, 29% of tweets do get a response. That’s better than direct mail or email marketing, as far as I know. I don’t expect a response from every tweet, though I’ve been guilty of that expectation in past years. That’s why I often ask the same question more than once now, or tweet stories again, or why I’ll syndicate a given post, video or picture into multiple networks.
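Webster’s back-of-the-envelope math checks out. A minimal sketch, using only the figures quoted above (his follower count, his conservative tweets-per-week assumption, and the Sysomos reaction rate):

```python
# Back-of-envelope check of Tom Webster's numbers, using the
# figures quoted above. These inputs are assumptions, not measurements.
following = 3_000        # accounts he follows
tweets_per_week = 5      # conservative tweets per account per week
reaction_rate = 0.29     # Sysomos: ~29% of tweets draw a reply or retweet

weekly_tweets = following * tweets_per_week
reacted_to = weekly_tweets * reaction_rate

print(weekly_tweets)          # 15000
print(round(reacted_to))      # 4350
```

Even at a one-in-three-ish response rate, the absolute numbers are large, which is Webster’s point.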

I continue to find Twitter a useful tool for my profession. While inbound Web traffic from Twitter is negligible when compared to Google, Facebook, StumbleUpon or even Fark, I’ve found it useful for sourcing, sentiment analysis and Q&A, as a directory, as a direct line to officials and executives, and of course for distributing my writing. Twitter may not be essential in the same sense that a cellphone, camera, notebook and an Internet connection are in my work, but I’ve found it to be a valuable complement to those tools. I’ve definitely sourced stories and gathered advice or recommendations through crowdsourcing questions there, with far less effort than more traditional means.

15 Comments

Filed under blogging, government 2.0, microsharing, research, social bookmarking, social media, technology, Twitter

Priceless: Futurama lampoons the eyePhone and “Twitcher”

As a huge fan of Matt Groening, a long-time Apple customer and a serious Twitter user, I found a recent episode of Futurama, “Attack of the Killer App,” to be crackling good satire. Excerpt below:

Yes, I know this all blew up back in July. I saw it tonight, and it made me laugh. The episode pokes fun at Apple and iPhone customers in all sorts of ways, along with viral video and Internet culture. As Engadget pointed out, Futurama critiqued modern gadget and social media obsession using ’50s technology. The folks over at EdibleApple.com also highlighted that this is far from the first time Futurama has satirized Apple:

Futurama’s focus on Apple is, of course, nothing new. Series co-founders David X. Cohen and Matt Groening are both big Apple nerds. We previously chronicled Futurama’s subtle and comical use of Apple and Mac references over here.

The viral Twitworm that creates many zombies is one of the best pop references to botnets and IT security I’ve seen recently, too. And there was one more (seriously geeky) detail that Engadget, Edible Apple and Mashable missed:

“When did the Internet become about losing your privacy?” asks Fry.

“August 6, 1991”-Bender.

Why? That was the day when Tim Berners-Lee posted “a short summary of the WorldWideWeb project” online. The real world has never been the same since.

WorldWideWeb – Executive Summary

The WWW project merges the techniques of information retrieval and hypertext to make an easy but powerful global information system.

The project started with the philosophy that much academic information should be freely available to anyone. It aims to allow information sharing within internationally dispersed teams, and the dissemination of information by support groups.

Reader view

The WWW world consists of documents, and links. Indexes are special documents which, rather than being read, may be searched. The result of such a search is another (“virtual”) document containing links to the documents found. A simple protocol (“HTTP”) is used to allow a browser program to request a keyword search by a remote information server.

The web contains documents in many formats. Those documents which are hypertext, (real or virtual) contain links to other documents, or places within documents. All documents, whether real, virtual or indexes, look similar to the reader and are contained within the same addressing scheme. To follow a link, a reader clicks with a mouse (or types in a number if he or she has no mouse). To search an index, a reader gives keywords (or other search criteria). These are the only operations necessary to access the entire world of data.

UPDATE: Ok, ok, sharp-eyed readers: The AVClub totally got that Bender reference.

3 Comments

Filed under research, scifi, social media, technology, Twitter, video

What is Gov 2.0? Carl Malamud putting the SEC online in 1993.

What is government 2.0?

Some days, it seems like there are as many definitions for Gov 2.0 as there are people. Tim O’Reilly says Gov 2.0 is all about the platform. In many ways, Gov 2.0 could be usefully described as putting government in your hands. And in three weeks, people will come from all around the world to learn more about what’s happening in the crucible of people, technology and government at the Gov 2.0 Summit in Washington.

I’m looking forward to the event and have been enjoying writing about many of its constituencies in the Gov 2.0 section of O’Reilly Radar, The Huffington Post, ReadWriteWeb and Mashable.

As I’ve previously observed in writing about language, government 2.0, jargon and technology, I believe the term should be defined primarily by its utility in helping citizens or agencies solve problems, either for individuals or the commons. Defining it in gauzy paeans evangelizing world-shaking paradigm shifts from the embrace of social media by politicians isn’t helpful on that level. That’s particularly true when they’re broadcasting, not having conversations that result in more agile government.

Earlier this morning, I was reminded again of the history of the movement in the United States when, through serendipity, I ended up watching the first few minutes of Tim O’Reilly’s webcast, “What is Gov 2.0?” I participated in the webcast when it premiered this spring but was struck again by a particular vignette:

“The first person who really put Gov 2.0 on my radar was Carl Malamud. Carl is really the father of this movement in so many ways. Back in 1993, that’s pretty darn early in the history of the World Wide Web, he put the SEC online.

He got a small planning grant from the  National Science Foundation, which he used to actually license the data, which at that point the SEC was licensing to big companies.

He got some servers from Eric Schmidt, who was the chief technology officer at Sun. And he basically put all this data he’d gotten from the SEC online, and he operated that for something like two years, and then he donated it to the federal government.

Carl’s idea was that it really mattered for the public to have access to SEC data.”

He still does.

Just look at PublicResource.org, which is dedicated to making information more accessible. Consider his years of working towards Law.gov, which would provide access to the raw materials of our democracy.

For even more backstory, read more about his work as “Washington’s I.T. Guy” in the American Prospect.

Here’s what the SEC wrote about the effort in 1996.

The Commission would like to extend its appreciation to Carl Malamud and Brad Burdick of Internet Multicasting Service. We would also like to express our thanks to Ajit Kambil and Mark Ginsburg of New York University, Stern School (http://edgar.stern.nyu.edu). Operating under a grant from the National Science Foundation for the past two years, IMS/NYU have been providing the EDGAR database to the public via the Internet as a pilot program. It has been an unquestioned success and has provided a significant public service. After the grant came to an end on October 1, 1995, the SEC decided to continue making the vast EDGAR database available to the public from an SEC facility. In addition to the EDGAR data, the Commission has also made available numerous investor guides, Commission reports, and other securities-related information. Much more will evolve from this initial service in the coming months.

Today, I found it notable to be reminded that Malamud was supported by the future CEO of Google in getting the SEC online. That’s the sort of public-private partnership that has substance beyond a buzzword, like his FedFlix effort to digitize films and videos produced by the government.

If you’re interested in Gov 2.0 and open government, the entire webcast with Tim is about 51 minutes long but well worth the time.

If you have some time, I highly recommend it for perspective on the history of Gov 2.0 and insight into what could be possible in the future.

3 Comments

Filed under government 2.0, movies, social media, technology, video

Considering Disasters, Social Media and Crisis Congress at FEMA [#Gov20]

Filtering facts from dross is doubly important during a time of war, which is a critical frame for discussing Wikileaks, open government and new media hurricanes. It’s also true during hurricane season, when accurate reporting of storm tracks, damage and conditions is crucial. A capacity to maneuver more effectively in the most elemental of environments will be useful in 2010 and beyond.

One place that’s happening is at the top of the Federal Emergency Management Agency, where FEMA Administrator Craig Fugate has been leveraging technology to more effectively deliver on his mission.

While FEMA has taken tough criticism over the years, its current administrator brings a common sense approach and deep experience from his work in emergency management in Florida.

Last month, Fugate talked frankly at the first “Crisis Congress” about social media, disasters and the role Crisis Commons and civil society efforts could play in crises.

There are good reasons for that conversation. According to Fugate, ESRI built the ability to add OpenStreetMap as a layer after watching their work crisis mapping Haiti.

He also highlighted the Crisis Commons Oil Reporter app as a prototype of the kind of robust app that could integrate FEMA open data.

“We work for the people, so why can’t they be part of the solution?” said Fugate to the assembled Crisis Congress. “The public is a resource, not a liability.”

As a recent example, Fugate said that FEMA used reporters’ tweets during Hurricane Ike for situational awareness. “We’ve seen mashups providing better info than the government.”

Fugate has been out in front in leading an agency-wide effort to enable information and e-services to find citizens where they are, when they need to access it. For instance, a new mobile FEMA.gov allows citizens to apply for benefits from a cell phone.

More features are on their way to mobile platforms soon, too, according to Fugate. “I want an app on multiple platforms that knows where my phone is,” he said.

For more on what’s happening with FEMA in this space, read about last week’s Emergency Social Data Summit in Washington from the Red Cross or Voice of America or watch Craig Fugate talk about social media at InCaseOfEmergencyBlog.com.

1 Comment

Filed under government 2.0, social media, technology, Twitter

On Wikileaks, government 2.0, open government and new media hurricanes

The war logs from Afghanistan may well be the biggest intelligence leak ever. Wikileaks represents a watershed in the difficult challenge of information control that the Internet represents for every government.

Aeschylus wrote nearly 2500 years ago that “in war, truth is the first casualty.” His words are no doubt known to a wise man, whose strategic “maneuvers within a changing information environment” would not be an utterly foreign concept to the Greeks in the Peloponnesian War. Aeschylus and Thucydides would no doubt wonder at the capacity of the Information Age to spread truth and disinformation alike.

In considering the shifting landscape above, Mark Drapeau has asserted that “government 2.0” is the “newest reality of new media.” I’m not convinced by his assertion that “no one is answering” the call to engage on that information battlefield. Given constant answers from various spokesmen over the past week, or this afternoon as the war logs leak breaks, that doesn’t appear accurate.

It’s similarly unclear to me that, were government agencies to develop a more agile media culture, it would sustain a more informed electorate. It’s not clear that it would lead to more effective data-driven policy, nor the transparency that a healthy representative democracy needs to thrive.

More nimble use of new media is important, particularly for the armed services, but given the existential challenges posed by energy, education, healthcare, the environment, unemployment and the long war, it’s hard to support the contention that it should be the focus of open government efforts.

As for his consignment of “journalistic standards” to the company of “other quaint attitudes,” I’d posit that differentiating between propaganda, agitprop and factual journalism matters even more today.

I don’t see standards for separating fact from fiction as quaint at all; if anything, the new media environment makes that ability more essential than ever, particularly in the context of the “first stateless news organization” Jay Rosen has described.

There’s a new kind of alliance behind the War Logs, as David Carr wrote in the New York Times.

That reality reinforces the fact that information literacy is a paramount concern for citizens in the digital age. As danah boyd has eloquently pointed out, transparency is not enough.

What is the essence of open government?

Governments that invest in more capacity to maneuver in this new media environment (the theater of public affairs officers and mainstream media now occupied by the folks formerly known as the audience) might well fare better in information warfare.

Open government is a mindset, but not simply a matter of new media literacy. To suggest that the “essence of open government” is to adopt a workplace environment that both accepts the power of new media and adapts to it seems reductive. I’m unconvinced that it is the fundamental element of open government, at least as proposed by the architects of that policy in Washington now.

It would also seem to have little to do with what research suggests citizens expect of government, even those of a libertarian bent.

Citizens are turning to the Internet for data, policy and services.

Given an estimated $1.47 trillion budget gap for next year, I wonder whether citizens might prefer a leaner, more agile government that leverages technology, citizen participation and civic hacking over a more new media-savvy culture. Those are, after all, the elements of social government, or government 2.0, that I’ve heard about from him for years.

There’s also the question of fully addressing the reality that in a time of war, some information can and will have to remain classified for years if those fighting are to have any realistic chances of winning. Asymmetries of information between combatants are, after all, essential to winning maneuvers on the battlefields of the 21st century.

There’s no doubt that government is playing catchup given the changed media environment, supercharged by the power of the Internet, broadband and smartphones. This week we’ve seen a tipping point in the relationship of government, media and technology. Comparing the Wikileaks War Logs to the Pentagon Papers is inevitable but, as ProPublica reported, not entirely valid.

It’s not at all clear to me, however, how the military would win battles, much less wars, without control over situational awareness, operational information or effective counterintelligence. Given the importance of the ENIGMA machine or intercepts of Japanese intel in WWII, or the damage caused by subsequent counterintelligence leaks from the FBI and elsewhere, I question the contention that “controlling information better” to limit intelligence leaks that damage ongoing operations will not continue to be vitally important to the military for as long as we have one.

More transparency and accountability regarding our wars to the nation, Congress and president are both desirable and a bedrock principle in a representative democracy, not least because of the vast amounts of spending outlaid since 9/11 on the shadow government that Dana Priest reported out in “Top Secret America” in the Washington Post.

Wikileaks and the Internet add the concept of asymmetric journalism to the lexicon of government 2.0 to the more traditional accountability journalism of Priest or database journalism of the new media crew online at Sunlight and elsewhere. Fortunately for their readers, many of those folks continue to “adhere to journalistic standards and other quaint attitudes and rule sets and guidelines.”

9 Comments

Filed under government 2.0, journalism, social media, technology

Twinfluence: A better measure of social capital at #DCWeek

A list of the most prolific tweeters from Digital Capital Week is making the rounds today.

The list, generated by the Bivings Group and “powered by TwitterSlurp,” does seem to accurately record the volume of tweets authored by individuals, as well as the number of @mentions generated by those tweets.

Over the course of the 10 day tech festival in Washington, there were 12,916 tweets by 2,425 people about Digital Capital Week or on the #DCWeek hashtag.

Well and good.

Unfortunately, these kinds of lists are akin to measuring the influence of people on Twitter by the number of followers they have.

As Anil Dash put it earlier this year, no one has a million followers on Twitter. The “million follower fallacy” has since been validated by research, confirming the common sense understanding of many long-term observers of Twitter.

Instead of measuring tweet volume, looking at influence as measured by retweets, @mentions and click throughs is useful, along with trickier offline analysis that might include catalyzing people to do things offline. Charlene Li’s tweet that she was heading over to a keynote on open leadership, for instance, motivated some people to come see her speak.

To get a sense of influence, it might be useful to parse the list of “top #DC Week” Twitter accounts through TweetReach.

A rough back-of-the-envelope calculation might compare the ratio of tweets to mentions. Pulling from the #DCWeek stats and using that ratio, it’s possible to generate a better list of the folks who had social capital during D.C. Week.

Andy Carvin (@acarvin), for instance, “only” tweeted 52 times but had 209 mentions.

Here are some other notable high ratios:

@frankgruber: 115 tweets, 259 mentions

@Jillfoster: 40 tweets, 104 mentions

@dcweek: 234 tweets, 767 mentions

@corbett3000: 96 tweets, 410 mentions

@digitalsista: 31 tweets, 82 mentions

@darthcheeta: 29 tweets, 82 mentions

@mikeschaffer: 33 tweets, 62 mentions

@noreaster: 46 tweets, 137 mentions

That ratio is confounded by the reach of an account like @jeffpulver: 36 tweets, 462 mentions, but to more than 360,000 followers.

If you took that ratio and factored in reach of the user, it might come closer to reflecting a “top Twitterer” from a given event or #hashtag chat.

Have at it, math geeks.
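To make the invitation concrete, here’s a minimal sketch of the calculation suggested above, using the tweet and @mention counts quoted in this post. The log-of-followers damping in `reach_adjusted` is my own assumption about how one might "factor in reach," not a published formula:

```python
from math import log10

# Tweet and @mention counts quoted above for #DCWeek accounts.
stats = {
    "acarvin":      (52, 209),
    "frankgruber":  (115, 259),
    "Jillfoster":   (40, 104),
    "dcweek":       (234, 767),
    "corbett3000":  (96, 410),
    "digitalsista": (31, 82),
    "darthcheeta":  (29, 82),
    "mikeschaffer": (33, 62),
    "noreaster":    (46, 137),
}

def mention_ratio(tweets, mentions):
    """Mentions earned per tweet sent -- the back-of-envelope metric above."""
    return mentions / tweets

def reach_adjusted(tweets, mentions, followers):
    """Hypothetical adjustment: damp the ratio by log10(followers) so
    huge accounts don't dominate purely on audience size."""
    return mention_ratio(tweets, mentions) / log10(followers)

# Rank by raw mentions-per-tweet, highest first.
ranked = sorted(stats, key=lambda u: mention_ratio(*stats[u]), reverse=True)
for user in ranked:
    tweets, mentions = stats[user]
    print(f"@{user}: {mentions / tweets:.2f} mentions per tweet")

# @jeffpulver's raw ratio (462/36) towers over the list above, but
# dividing by log10(360,000) followers pulls it back toward the pack.
print(f"@jeffpulver adjusted: {reach_adjusted(36, 462, 360_000):.2f}")
```

By the raw ratio, @corbett3000 and @acarvin lead the pack despite modest tweet counts, which matches the intuition above: earned mentions, not volume, are the better proxy for social capital.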

The bottom line is that we don’t have terrific technological tools to assess the “best tweets” or top Twitterers after the fact, though tools like Twazzup.com can help in the moment.

For those who think it’s all silly, fine. But measuring audience sentiment and journalists’ coverage at events is likely to be something of interest to politicians, businesses and media alike. Here’s hoping that the analysis relies upon more than volume.

5 Comments

Filed under social media, technology, Twitter

Dressing for success in Washington: Suits, shirtsleeves and shorts

Much was made of President Obama’s choice on day one of his Presidency to doff his jacket in the Oval Office. When the White House unbuttoned its formal dress code, it was a symbolic move that reflected a larger cultural shift to more casual business attire. While some may feel the President showed a lack of respect for the office, for many Americans, doffing the jacket in the office and rolling up shirt sleeves to get to work simply reflected their own experience.

For many people after all, it’s about whether you can get the job done, not what you’re wearing when you do it. That issue came into sharp relief yesterday, when some speakers at the 140 Conference held during Digital Capital Week in the District of Columbia came under criticism for not wearing pants.

I wish I could wear shorts more often around Washington. It’s now officially moved into “absurdly hot season” and wearing a suit is miserable. That said, there’s often no way around it. This week, for instance, I wore a suit to the Center for American Progress for the Law.gov workshop, since I knew I’d be meeting John Podesta and other lawyers who put stock in that kind of professionalism. I’ve pulled my suit on to go to the ballet at the Kennedy Center, to go to Congressional testimony or to attend a landmark event on community health data at the National Academy of Sciences.

That said, I wore linen shorts, sandals and a collared shirt to the Gov 2.0 day at Digital Capital Week, since it was damn hot, and that fit my vision of summer business casual in the District. And yesterday, at the 140 Conference, I wore jeans and an untucked dress shirt, since that fit the image of the tech journalist I am these days.

Mike Schaffer, a self-described social media strategist here in DC, focused on elevating the style of online communications professionals in public. Respectfully, I think he missed the point. In every situation above, what I wore mattered but, to my audience, was beside the point.

Peter Corbett may have worn shorts and a t-shirt but, in his role, it didn’t matter. Since I know him and have respect for the work he’s done for D.C. Week, at iStrategy Labs for Apps for the Army, and on other initiatives, I know what he’s done.

I also believe that the informal nature of 140 Conference requires no more of us than that we represent ourselves as ourselves and share what matters, much like, perhaps, we might approach Twitter.

Representative Mike Honda (D-CA) may have come dressed in a suit, as you might expect from a Congressman in D.C., but what he said reflected that sentiment:

“It’s about sharing who you are, rather than trying to sell what you’d like to have people believe about you.”

By focusing on what people wore instead of what they said or have done, I’m not sure Schaffer honored the hard work of the organizers, nor the quality of the experiences that, say, Justin Kownacki shared.

Kownacki, whose cargo shorts drew attention at the D.C. 140 Conference, tweeted afterwards that “I don’t believe in wardrobe labels. I judge words and actions, not packaging. I’m amused by the #140conf attendees who think my wardrobe ‘killed my credibility.’ Who knew packaging dictates truth? Wardrobes provide a shorthand by which we can exclude & ignore. Makes life easier for traditionalists & streamliners, I’m sure.”

I’ve been to dozens of tech conferences, many of which featured people dressed to the nines with little substantive tactical or strategic value.

I can frankly say, as someone who has overdressed on occasion, that sometimes wearing shorts and a hip t-shirt is absolutely the right choice.

Tools and Togs both matter

Schaffer wrote that “a carpenter is known for getting the job done, not which saw he uses.”

That’s both true and untrue. Master builders who can afford to work with Bosch or DeWalt tools do so because of the quality of the tools and the precision product they allow. It’s also true that someone who lacks the knowledge to use them will fare far worse than a skilled worker without them, just as a rube with an expensive composite fly rod might be outfished by a boy with a cheap piece of bamboo and string, if the young man knows where and how to apply his simple rig. What you do with the tools matters more than their quality, but don’t overlook the fact that those tools do matter.

If someone contracts with a professional videographer to create a broadcast-quality ad and she shows up with a disposable camera and a vintage iBook, what would the new client think?

Consider the building example again. Carpenters are known for building things out of wood. Getting the job done is dependent upon the general contractor who employs him or her, or the reputation of the master builder that is hired. I have some familiarity with carpentry, after working as an apprentice for 18 months in Massachusetts. In that role, I wore shorts when it was hot, Carhartt pants when it wasn’t and many layers of fleece and polypro when it was frigid. We dressed as needed to get the job done. If someone showed up on the job site improperly dressed, or without boots, a belt, gloves and a full set of tools, he couldn’t get the job done without a loan of same.

Working in digital media is no different, in the sense that we wear what we need to accomplish a goal, in the context of the social mores of the space we move in.

Virtually, that might mean creating a well-designed website that is standards compliant. Or developing a mobile app for a conference or service. In the social media world, it means adding an avatar, bio, link and other elements that fill out a profile before sallying forth. Dressing to impress can mean many things, but in the end, it’s what you can do and have done that will matter most to your clients, customers and audience. Did I get the story right? Will the house stay sound for decades? Is this a sustainable business? Does the app work?

Given the monumental challenges that lie ahead for government officials in Washington and around the nation, I suspect many citizens would rather they focus on getting real results, narrowing budgets, passing effective legislation and developing effective regulations that address issues in the financial, technical and environmental space, rather than any wardrobe choice.

As for me, I hope I can wear shorts more often around Washington.

10 Comments

Filed under article, blogging, friends, journalism, social media, technology, Twitter

Why is Twitter hiring a government liaison? Thoughts from @SG and more. [#gov20]

Twitter goes to Washington

Twitter goes to Washington?

A job posting for a government liaison has ignited plenty of controversy in the blogosphere, Twittersphere, and, one might imagine, in the halls of Twitter HQ out in San Francisco.

The Department of Health and Human Services’ new media guru, Andrew P. Wilson, offered up a thoughtful “Top 10 Requests for the New Government Liaison at Twitter.” Adriel Hampton, a former Congressional candidate and a leading voice in the government 2.0 community, wondered if Twitter could reimagine democracy.

And earlier today, Mark Drapeau, the director for innovative engagement at Microsoft, considered whether government 2.0 had passed Twitter by.

I don’t disagree with Mark that it would be useful for Twitter’s staff to be more of a part of the Gov 2.0 community, as Jack Dorsey has at times been, but I was surprised to read Drapeau write that “the help is really not needed.”

Given how lawmakers are tweeting, with many mistakes, lack of engagement or misunderstanding of conventions, some guidance would seem to be of use. More to the point, the fact that many aren’t tweeting at all is no doubt of interest to Twitter HQ.

After all, for every Claire McCaskill or Darrell Issa, there are a dozen Congressmen and women who aren’t using the service well – or at all. Many others have staff do it for them. Focusing on the role of Facebook’s Adam Conner here on Capitol Hill is spot on; hiring someone who understands the lingo, conventions and effective communications strategy for this role would be useful for both government and Twitter itself.

I found Drapeau’s selection of Kawasaki as a model to be particularly surprising, given the polarizing effect his use of Twitter has had, particularly with respect to “ghost tweeting.” Using Twitter authentically and personally is precisely what has been effective for politicians like Cory Booker. The blowback that came from people learning @BarackObama wasn’t tweeting himself should be instructive.

My own comments aside, Twitter’s VP of communications, Sean Garrett (@SG), shared more insight on Drapeau’s post into the microblogging juggernaut’s thinking in posting the job opening. I reproduce his comment below:

I’m Twitter’s head of communications and I have spent very little time in ivory towers in my career. You?

Before Twitter, much of my career was devoted to building bridges between the technology community and the policy world. Did things ranging from helping start TechNet in 1997, and working with them for a couple of years, to creating the first technology-policy focused communications consultancy and serving as a partner there for 6 years. This is all to say that I have a pretty decent view of how policymakers and political types view and use technologies, tech policy issues and where gaps remain.

We’ve done a lot of research and talked to a lot of people in Washington (including members of Congress and staffers, administration officials, think tank folks, etc) and elsewhere about what would be a good first step for us as we build a policy presence. That step is this position.

I also think it is important to recognize, when you say that this is a type of position that should have been filled one or two years ago, that in January of 2009 we had 22 employees. As recently as last October, we had 70 employees. We just crossed the 200 barrier and now have the ability to do things proactively, as opposed to simply fighting to keep the service up and doing the basics every day.

Do you think that Twitter should have made employee number 23 a DC-focused position or a network engineer?

Finally, and most constructively, thanks to the great work of the Gov 2.0 crowd that you mention, this hire won’t have to start work on day one with a blank slate. There’s a whole community that he or she could tap into to become more effective faster. They can attend the right events and get involved in the existing conversation that promises exciting transformation.

At the same time and in just one example, there are real live members of Congress who at this very moment are wrestling with whether to open a Twitter account and, if so, how to get the most out of it. Having someone able to walk over to their office and sit down with their team is going to be more helpful than telling them to just follow Guy Kawasaki or absorb the collective wisdom of the “countless consultants working inside the Beltway” through osmosis.

As the relationship of lawmakers, citizens and technology companies evolves, one thing is clear: there will continue to be plenty of discussion about how social media disrupts the playing field here in Washington and beyond.

UPDATE: Steve Lunceford of GovTwit posted an interview with Sean Garrett this morning that provides more detail on Twitter’s search for a government liaison. It’s worth reading the entire post but two answers will be of particular interest to the government 2.0 community:

Q: Is this U.S. federal-focused only, given that you’re hiring in Washington, D.C.?

@SG: Twitter is not just interested in government from a U.S. federal standpoint, but [also] outside the Beltway in states and localities. We’re obviously global as well, and this new role will look not only to U.S., but also how other governments use or don’t use Twitter; how campaigns work/don’t work and how they translate from one level to another.

Q: What need is Twitter trying to fill here?

@SG: We believe Twitter will be better off having a direct dialogue with public officials who use our service. And I would say that yes, the “Twitter 101” conversations are still important. Many in D.C. are eager to engage on Twitter and we want to help them maximize this experience. And, there are some who don’t understand how to use it or where the value is. We’d like to change this where we can. Having a point person that can help verify government IDs, someone that can be down the street to meet with officials in their office, or serve as an overall point person for government outside the Beltway is the initial goal here.

14 Comments

Filed under social media, technology, Twitter

On the failure of Quit Facebook Day, Social Utility and Privacy

"How to split up the US" by Pete Warden

At 10:19 PM EST tonight, the organizers of Quit Facebook Day reported that all of 33,313 people had dropped out of the Facebook universe. That figure represents a tiny fraction of Facebook’s 400 million users.

That minuscule percentage does not represent all of the people who have quit or deactivated their accounts in the past month, but it certainly implies that there hasn’t been a widespread movement to leave the social networking giant.

I’ve been a Facebook user since 2006. I never had the “college experience” of an electronic Facebook, but I found it instantly useful as a means to stay in touch with family, friends, classmates and former colleagues.

It was, clearly, just what its creators said: a social utility. I didn’t care for Facemail much – and still don’t – but many features, like IM, the newsfeed, photos, people search and video, are powerful ways for Facebook’s users to communicate with one another.

Facebook has become the Information Age’s White Pages, for good or ill, extending the service it provided Harvard students with contact details for one another back in 2004 to hundreds of millions around the world.

The changes to privacy and publicy over the past six months, however, fundamentally shifted the reality of using the platform for many users, particularly those who had trusted the site with sensitive information about their lives, friends or other affiliations. For those who never shared information that could be damaging, the shift to a public default meant little. For people with more to lose, by virtue of the cultures they live within, their gender or their health status, such changes have much greater significance if that information is revealed to a school, parent, employer or government.

By and large, research cited by digital ethnographers like danah boyd shows that many people remain ignorant of how public their updates are. And people care about their online reputation in 2010, given that every organizational gatekeeper or first date is rather likely to Google you.

Recent amendments to the privacy policy and user interface notwithstanding, concerns that those who have the most to lose are not being considered persist amongst privacy advocates. Reports of account deletions, page takedowns or harassment in other countries, with limited recourse for users, also reflect uncertainty over the future of the Internet’s #1 site. Recent shifts to Community pages have also resulted in consternation on the part of both brands and government, though Facebook’s spokesman promises that such features are in development and will be improved.

I have not quit Facebook. I continue to find it useful as a social utility, as before, applying Facebook as a “people browser” for those with whom I want or need to stay connected. I use LinkedIn as a business utility and Twitter as an information utility. I doubt those use cases will change for me personally this year, although I’m watching carefully to see how the most recent privacy controls are implemented.

Recent decisions by Facebook’s management around privacy and personalization may bring its operations under regulation by the FTC, as is already the case with privacy commissioners in Europe and Canada. If consumer harm due to management actions were proven by any of those entities, it would have significant implications for innovation in this sector, although it might also cause developers to build privacy into such platforms from the outset.

As government entities continue to create pages, they will likely be obligated, or even legally required, to archive conversations there using the Facebook API or other tools. After all, there’s substantial utility in measuring and analyzing the interactions there for those who wish to understand public reaction to policy, candidates or initiatives. If the terms of service are not clearly described to those interacting with government employees, additional layers of complexity around privacy and the rights of consumers will also be in play. And problems in Facebookistan, as Rebecca MacKinnon writes, extend abroad to exposing at-risk members of society to abuse, deleting activist accounts and taking down Pages.
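To make the archiving idea concrete, here is a minimal Python sketch that flattens a Graph-API-style feed payload into archive records. The envelope and field names (“data”, “id”, “from”, “message”, “created_time”) follow the general shape of Facebook’s feed responses, but treat them as assumptions to verify against the current API documentation; the sample payload and page names are invented for illustration.

```python
import json

def archive_feed(payload):
    """Flatten a Graph-API-style feed payload into archive records.

    Assumes the common {"data": [...]} envelope with per-post
    "id", "from", "message" and "created_time" fields; check real
    responses against Facebook's documentation.
    """
    records = []
    for post in payload.get("data", []):
        records.append({
            "post_id": post.get("id"),
            "author": post.get("from", {}).get("name"),
            "message": post.get("message", ""),
            "created": post.get("created_time"),
        })
    return records

# A hypothetical two-post feed payload, as a government page might see it.
sample = json.loads("""{
  "data": [
    {"id": "1_100", "from": {"name": "City of Example"},
     "message": "Town hall tonight at 7pm.",
     "created_time": "2010-06-01T19:00:00+0000"},
    {"id": "1_101", "from": {"name": "A Resident"},
     "message": "Will it be streamed?",
     "created_time": "2010-06-01T19:05:00+0000"}
  ]
}""")

archive = archive_feed(sample)
```

Records in this shape could then be written to whatever store an agency’s retention policy requires; the hard part, as the paragraph above suggests, is the policy and disclosure, not the code.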

There’s much more to electronic privacy than social networking, no matter how large Facebook becomes. Putting online privacy in perspective is essential. And I tend to agree with danah boyd’s position that quitting Facebook is not enough, especially for those of us in the tech media who have some degree of influence in informing the public and holding the social networking giant’s management to the standards that they and the law set for ethics and business practices.

But, fundamentally, human relationships are about trust. If we cannot trust that the manner in which we connect, filter and share information with one another will not change with the business needs of a platform, our relationships will be damaged. We have only to look at the statistics on jobs lost, applications denied and romances sunk through virtual actions to understand how those consequences may play out in our offline lives.

17 Comments

Filed under social bookmarking, social media

Using social media for better journalism: @Sreenet at #ONADC

“I used to say ‘justify every pixel,’” said Sree Sreenivasan. “Now I say ‘earn every reader.’”

Sreenivasan, a dean of student affairs and professor at the Columbia Journalism School, went beyond “what Jeff Jarvis calls the blog boy dance,” offering up more than an hour of cogent advice, perspective and tips on social media to a packed classroom populated by members of the DC Online News Association at Georgetown’s campus in Virginia.

Where once he went around newsrooms to talk about email, then Google and blogs, he has now moved on to the new tools of digital journalism, grounded in a reciprocal relationship between the audience and the reporter. Sreenivasan tailored his talk accordingly: his audience was a collection of writers, editors and producers already steeped in those tools, so he moved quickly beyond listing Twitter, Facebook and LinkedIn to the services that enable journalists to use those platforms to improve their reporting, editing and careers.

“The best people find the things that work for them and skip the rest,” said Sreenivasan. Services need to be useful, relevant and extend the journalist’s work. Quoting a student, now at the Wall Street Journal, Sreenivasan observed that you “can have greatest content in world but will die on the vine if we don’t have a way for our readers to find it.” He classified the utility of social media for journalists into four broad categories:

  • tracking trends on a given beat
  • connecting with the audience, wherever it is online
  • putting that audience to work, aka crowdsourcing
  • building and curating the journalist’s personal brand

“Tools should fit into workflow and life flow,” he said. “All journalists should be early testers and late adopters.” In that context, he shared three social media tools he’s tried but does not use: Google Wave, Google Buzz and Foursquare. Sreenivasan also offered Second Life as an example, quipping, “I have twins; I have no time for first life!”

The new Listener-in-Chief

One group that undoubtedly needs to keep up with new tools and platforms is the burgeoning class of social media editors. Sreenivasan watches the newly-minted “listeners-in-chief” closely, maintaining a list of social media editors on Twitter and analyzing how they’re using the social Web to advance the editorial mission of their mastheads.

He showed the ONA audience a tool new to many in the room, TagHive.com, which shows which tags are trending for a given group. What’s trending for social media editors? This morning, it was “news, love, work, today, great, people, awesome and thanks.” A good-natured group, at least as evidenced by their language.

Sreenivasan also answered a question I posed that is of great personal interest: Is it ethical to friend sources on social networking platforms?

The simple answer is yes, in his opinion, but with many caveats and tweaks to privacy settings. Sreenivasan described the experiences of people in NGOs, activists and other sources whose work has been impaired by associations on social media. To protect themselves and their sources, he recommended, Facebook users should untag themselves, practice “security by obscurity” and use lists. As an example of what can go wrong, he pointed to WhatTheFacebook.com.

Where should journalists turn next for information? Follow @sreenet on Twitter and browse through the resources in his social media guide, which he referenced in the four videos I’ve embedded in this post. He’s a constant source of relevant news, great writing and good tips.

1 Comment

Filed under blogging, education, journalism, social bookmarking, social media, technology, Twitter, video