
The Semantic Caucus

Politics and the Semantic Web

Updated: 2014-12-20T13:34:07.608-08:00


What's that line about to a hammer everything looks like a nail? (UPDATEDx2)


This is in response to Connie Schultz's article Tighter copyright law could save newspapers:

David Marburger is a First Amendment lawyer at Baker Hostetler who has represented newspapers, including The Plain Dealer, for nearly 30 years. Daniel is an economics professor at Arkansas State University. A panel discussion about newspapers' future sparked David's idea on how to save them. "I heard [Plain Dealer Editor] Susan Goldberg talking about how revenue from online advertising is pathetically low and newspapers can't recoup their investment. As soon as she said it, the wheels started turning. You have all these free riders like Daily Beast and Newser and local television stations aggregating your stories online while diverting readers and advertisers from your site. And they're doing it for a fraction of the cost of the newspapers that generated the original copy. "And it hit me: All those theories out there on how to prop up newspapers -- why isn't anyone saying this? Why aren't we talking about how this free-riding by aggregators affects the market rate for everyone?"

What's that line about to a hammer everything looks like a nail? If the solution to your business model is funding an army of lawyers, then it's time to write a new business model. And the last person I would ask for business advice would be an economist. Open any business section on the planet for my reason why.

This is the same tactic that the record industry tried with file sharing. It was also the tactic used by the old UNIX vendors against the modern open source Linux operating system. It turned their customers into their enemies. You can guess how well that's worked for them. Technology-savvy bands such as Nine Inch Nails discovered that they could make more money by working within this new dynamic instead of against it.
Eventually record companies retooled their relationship with technology and learned how to profit from it through commercial download sites such as iTunes and ring-tones. It was easy for record company executives to blame technology as they funded an endless army of boy bands and Britney Spears clones. "Why oh why aren't people buying our records? It's those damned file sharers." The fault was not in their stars, but who ever figures that out?

The money in modern media is metadata. Google makes its millions by trying to find as much information as possible about the people viewing content so that it can create targeted marketing. One of the all-time best sources for that information is newspaper portals. I would bet you a Graeter's sundae that if you talked to anyone in that industry, they would tell you that newspapers are barely scratching the surface of leveraging the amount of information that sails through their server logs every time someone clicks a link on their sites.

One of the things that I discovered back when I was blogging was that by pooling together logs from other allied sites we could find out geometrically greater information about who was visiting our collective sites. It was how we discovered who was anonymously posting threats on our sites during a certain Senate race that you might remember. Newspapers, thanks to their intense reader loyalty and their strong bonds to local communities, are one of the greatest metadata assets on the planet. If I were them I would be funding an effort to maximize those relationships instead of pursuing legal or political solutions.

Google and every other marketing entity is all about finding information about the people viewing their content. There is no better source for that metadata than the readers of newspapers.

In the future everyone is a parasitic aggregator. (Ideas are parasites. They feed off of the brains of the living.)
A prime example of this is Facebook, where I found this article since Connie posted it there. With a click of a button I could share this article with all of my friends.

The thing is that the technology is already being developed to make aggregate portals obsolete. By the time you've figured out how to profit out of their exploitation of y[...]

OpenLink Virtuoso Windows Notes


I'm keeping notes on playing with OpenLink Virtuoso 5.0.7 on Vista.

The Web's Bridegroom Cometh


I'm convinced that the marriage of web content with semantic web meta-data is the future of the Internet. (This isn't a big leap and lord knows I'm late in saying this.) The main reason why people like me are so late jumping on board this slow-moving freight train is that while the theory is all there, and is very solid, the tools just haven't been coming. There are few obvious incentives for the groups that are the builders of the web to make their products semantic web friendly, and even when they do, they're only interested in doing so to lock you into their product. Linking things together, even though it is the entire purpose of the semantic web, is like pulling teeth. The reason for this is simple: there is an almost complete disconnect between traditional databases, which are the lifeblood of the web, XML, and semantic databases. The engines are different. Their query languages are different, and their interfaces are completely different. While there has been a lot of effort to create a bridge between relational databases and XML, there has been little to add a bridge connecting them to RDF semantic data.
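
To make the missing bridge concrete, here's a minimal, hypothetical sketch of what one direction of it looks like: exposing rows from a relational table as RDF triples in N-Triples syntax. The table, columns, and base URI are all invented for illustration; a real mapping would need proper datatypes and escaping.

```python
import sqlite3

# Hypothetical mapping: each row becomes an RDF subject, each column
# a predicate, emitted as N-Triples lines. Names and URIs are made up.
BASE = "http://example.org/band/"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gigs (id INTEGER PRIMARY KEY, venue TEXT, gig_date TEXT)")
conn.execute("INSERT INTO gigs VALUES (1, 'The Southgate House', '2007-11-03')")

triples = []
for row_id, venue, gig_date in conn.execute("SELECT id, venue, gig_date FROM gigs"):
    subject = f"<{BASE}gig/{row_id}>"
    triples.append(f'{subject} <{BASE}venue> "{venue}" .')
    triples.append(f'{subject} <{BASE}date> "{gig_date}" .')

print("\n".join(triples))
```

The point isn't the ten lines of code; it's that nothing in the SQL world produces this for free, which is exactly the disconnect described above.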

Most of the initiatives have a hard time gaining acceptance because they make the same mistake that most failed projects make: they completely reinvent things instead of trying to seamlessly integrate with what already exists. Very few people are going to abandon their desktop for a "semantic" desktop. They've invested too much intellectual and emotional capital in their existing interfaces. Forcing them to abandon everything they have, in addition to learning a whole new way of envisioning data and its relationships, is completely unrealistic.

Semantic Web data is designed to make it easier for machines to process what's on the web. It is a construct that needs to be relegated primarily to the machines in order to be effective. When the users don't even think about it because they are so lost in the ideas, THAT'S when it can be considered successful. (This is true of all of what I call the invisible arts: typography, movie soundtracks, etc.)

There are two products that I know of that have successfully bridged this gap: Fedora-Commons and OpenLink Virtuoso. Each has its strengths and weaknesses.

I'll provide a comparison between them soon.

Getting Credit from Old Media


This blog, sparked by Jill Miller Zimon, fascinates me since it's totally contrary to my philosophy as a political blogger. You see, from the start my main goal was propaganda. I was a one-man left-wing pinko version of Fox News pouring liberal poison into the ears of the media elite.

My theory was that most regular reporters were too jaded/overworked/lazy to actually write good content, so that merely by reporting on stories that didn't get any coverage you could completely shape how the media covered them. The same way that old media acts as a catalyst/shaper of how the blogosphere reacts to mainstream stories, a blog that targets stories that don't get coverage could have the same effect on them. It's human nature. The first person to comment on anything shapes how the collective reacts to it. As I like to say, the first person to ride into Dodge is always the sheriff.

Bottom line, I wanted them to copy me, and I didn't want any credit, because if they gave it, it would lessen the power of the story. "Blogger sitting in his basement who doesn't even live in the district reports..." has a lot less power than "the venerable Cincinnati Enquirer (why we are venerable is beyond us, since we've been endorsing candidates such as Jean Schmidt who have been sucking the lifeblood out of this great nation for decades) reports..." I know that it worked because reporters have told me that I had an influence.

This is why I refused interviews where I would be on camera. The story was never about me. It was about my agenda. In the case of the special it was the Democratic Party's assault on the heart of the Republican base. I had embraced the medium and was playing the game as the landscape dictated. The fact that others took credit for my work even though they only showed up at the last minute was the price of a job well done.

In the end the only credit that really meant anything to me came from insiders within the Schmidt camp.

Here's an example. I'm on a mailing list where somebody was complaining about how some of his pro-Obama content was being copied, and that people weren't taking it down fast enough after they complained. I asked him: did he think the people writing slanderous emails about Obama were complaining that they were being forwarded? He should be happy that he's being "ripped off," because that means it is viral and thus having an impact on people. In politics, as in everything, the profit comes from selling the sausage, not from making it.

Jill Resigns


Jill Miller Zimon resigned today from working for the Plain Dealer on the Wide Open site. I can't even imagine how difficult it was for her. This was her vision.
Jean and several other people at the PD and in the blogosphere know that for almost two solid years, I’ve asked and written about and pushed issues related to integrating traditional journalism, new media, bloggers and citizen journalism - all in the name of providing better and more content for readers who consult more and different types of sources for reading news and information. Someone confirmed to me this afternoon, when I said to him, “I know there must be some folks saying, about how my efforts to integrate these groups were in vain, ‘I told her so,’” that, yes, some people are saying, “I told her so.”
She's right to see this as the future, and shouldn't let the bad planning and implementation of others dissuade her. It was a gutsy move. Unfortunately it all ended just when I thought it was really hitting its stride.

The post I had been working on concerning Wide Open was called Harmonic Convergence. You can see how we disseminate information is becoming more plastic and less bound by any single medium. Print media has the most to lose from all this... thus their klutzy overreactions. Unfortunately, they have yet to realize that they also have the most to gain.

This all makes me sad. It was nice seeing people trying new ways to rise above the static.

The old rules are dead. At this point it's all about the old referees fighting to keep their jobs.

An Obfuscated Dealer


For over a month now I've been wrestling with a post about the Cleveland Plain Dealer's Wide Open blog initiative, an experiment in hiring four partisan bloggers to create an online political dialog. Today's events made things moot when they fired the Ohio political blogger that I have more respect for than any other, Jeff Coryell, aka Yellow Dog Sammy. Their reason was simple: Congressman Steven LaTourette complained about Jeff because he'd been critical of LaTourette's campaign in the past.

I was curious to see how the paper would deal with the double-edged nature of hiring two bloggers who have been lauded in right-wing circles for leading the charge in trying to paint our current Governor as a champion of NAMBLA. Personally, I found it to be the most disgusting smear campaign I had ever seen. To take a difficult non-vote of conscience by a trained professional and try to twist it using the basest, most pretzel-like forms of twisted faux-logic was the kind of intellectual dishonesty reserved for the archest of partisan hacks. Back when I myself was a "partisan blogger" their actions in the matter caused me to pull links to their sites; an action that pained me given how BizzyBlog had stood up for me in the past. I must confess that it did give me a certain amount of ironic joy in knowing that their actions actually helped elect Governor Strickland and Senator Brown by promoting a Republican strategic campaign completely devoid of substance or integrity.

I was wondering how the PD would deal with things when their dark craft resurfaced the next time the GOP decided to anoint an obviously empty suit. Well... today's story has instead shifted things in an entirely unexpected direction.

The right thing to do for Jeff's fellow Wide Open bloggers is to resign. The actions by the Plain Dealer have proven the exercise to be a farce.
Before last year's Governor's race I had a lot of respect for NixGuy and BizzyBlog, and as I like to say, it's never too late to do the right thing.

You can read more about the situation at Buckeye State Blog as well as Jeff's post here. Here's Jeff's release on the subject:

FOR IMMEDIATE RELEASE
30 October 2007

My name is Jeff Coryell, although I have also written on political blogs under the pseudonym Yellow Dog Sammy.

Today I was terminated from my engagement as a freelance blogger at the blog "Wide Open" as a direct result of intervention by Rep. Steve LaTourette (R-Bainbridge Township) of the 14th Ohio Congressional District, in retaliation for my previous blogging about his re-election campaign and my financial support for two of his election opponents.

In August, the Cleveland Plain Dealer hired four Ohio political bloggers to contribute to a daily political group blog called "Wide Open." In order to assure balance, two bloggers with liberal leanings were chosen, and two with conservative leanings. The other participants are Jill Miller Zimon of Writes Like She Talks, Tom Blumer of BizzyBlog, and Dave of Nixguy. My participation in the project soon came to the attention of Rep. LaTourette. I had written extensively about LaTourette's 2006 re-election contest and I explicitly supported his challenger, law professor Lew Katz (D-Pepper Pike). I also wrote about what I regard as the suspicious connection between large amounts of campaign cash LaTourette received from the Ratner family of Cleveland, of the Forest City real estate empire, and their receiving an enormous contract to develop 44 acres of the Southeast Federal Center in Washington DC. LaTourette was a member of the powerful Committee on Transportation and Infrastructure and Chair of the Subcommittee on Economic Development, Public Buildings and Emergency Management, which oversees the agency that awarded the contract (the General Services Administration). Tha[...]

Beyond Corruption


Lawrence Lessig is one of my heroes. I returned to computers after a ten-year absence when I had finally figured out a good use for them: church hymns. I had taken on a job directing a small Lutheran church choir in Stockton, California. I was quickly frustrated trying to find easy music that we could sing that matched up with the week's Bible readings. This would be a great job for a computer program, I thought... a sort of database-driven digital hymnal. I quickly got to work learning as much as I could about computer databases and the maddening world of copyright law. Time and time again Professor Lessig's name came up as the one positive counter to the forces trying to pervert copyright laws in favor of corporations over people.

It was thus with great interest that I read that he was switching his efforts from copyright reform to the greater issue of political corruption. One thing worries me though... he's arguing the negative.

To me it's not enough to just fight against "corruption." There are too many shades of gray in politics. It's my money... show me how it's spent. Today there is no easy way to do that. Everything is couched in the vaguest of terms, and even then only during the biennial feeding frenzy that is the election cycle.

I am a process man. It's not enough to just stick a label on your lapel and show how well you can recite your party's focus-group-tested talking points. Until it is easy for every American to hold their representatives accountable for the trust that we've given them, we can never really trust any of them. Anyone who's ever run a business knows that your employees will steal from you. If you don't monitor what they are doing with your money, you are asking for them to steal from you.

It's not just about corruption... it's about accountability. If the bottom line you hold your employees to is that they aren't stealing from you, you've got one fucked up company. Well my friends... welcome to the United States of America.

Lessig on "corruption"


[embedded video]
(via Boing Boing)

On The Radar




The wonderful Open House Project has a post up of a video presentation by open government pioneer Carl Malamud on his Washington Bridge project, given at a Google Tech Talk on May 24, 2006. There's also a web page for the talk that includes a business plan for the venture. The idea was to provide streaming archived video of Congressional committee hearings.

I haven't been able to find anything recent about the proposed plan; however, since then he has sent an unsolicited report to the Speaker of the House on streaming committee data, and on August 3rd provided an update that does look promising:

... The analysis of this short-term solution by the Advanced Business Solutions unit has concluded: "from a technical stand point we now know this is very easy and inexpensive to do." 3. As a long-term strategy, the Office of the Speaker has conducted a large number of meetings, as has the Committee on House Administration, the Chief Administrative Officer, and several other groups. There is a concrete, funded set of initiatives to finish the wiring of the rooms so that all hearings have video coverage, and it is clear from a technical point of view that it is possible to achieve the goal of broadcast-quality video for download on the Internet by the end of the 110th Congress. The recommendation to adopt that goal is currently awaiting action from the Office of the Speaker and the Chairman of the Committee on House Administration.

The thing that's really interesting about the video is how the Google people (25 minutes in) latch onto the problem that fascinates me the most: annotating real-time data streams. There aren't enough hours in the day for me to listen to my collection of old-time radio shows, let alone watch everything that is going on in Congress. How do I make it easy for the public to collectively annotate information in real time so that it can be easily processed and reused?
It's hard as hell to annotate video, and unless video is annotated it's hard as hell to leverage it with automated tools. You need to be able to make a media clip start at a specific moment rather than playing the whole thing. You need to be able to easily sync up the video and transcripts using some sort of marker. I'd like to be able to drag a clip onto my blog editor and have it automatically embed the clip at that point as well as add a default folksonomy. Beyond that, I'd want synchronized transcripts so that I can do quick quotes. As he points out, the transcripts are out there, but they take months to be made available to the public, if they ever are. How do we synchronize video and audio to transcripts so that people can easily quote from them?

We need tags that work against a common ontology so that everyone is working off of the same page. When I say HR1515, am I talking about Representative Harris Fawell's amendment to the Balanced Budget Act of 1997, the 110th Congress' bill to amend the Housing and Community Development Act of 1974, or the Georgia General Assembly's bill honoring the life of Linton Webster Eberhardt, Jr.?

Ideally websites and tools should take care of those headaches for people. For instance... if I drag a bill in Thomas onto my blog editor, it knows that that is the one I am writing about, grabs the embedded meta-data from the page, and adds it as a folksonomy tag.[...]
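
The synchronized-transcript idea above needs surprisingly little machinery: just segments keyed by start time. Here's a minimal sketch; the timestamps and hearing dialogue are invented for illustration.

```python
import bisect

# Hypothetical time-coded transcript: (start time in seconds, text).
# Given a moment in the video, find the line being spoken, so a quote
# can link back to the exact point in the media stream.
transcript = [
    (0.0,  "The committee will come to order."),
    (12.5, "The chair recognizes the gentleman from Ohio."),
    (31.0, "I yield the balance of my time."),
]

def segment_at(t):
    """Return the transcript line playing at time t (seconds)."""
    starts = [start for start, _ in transcript]
    i = bisect.bisect_right(starts, t) - 1
    return transcript[max(i, 0)][1]

print(segment_at(15.0))  # → "The chair recognizes the gentleman from Ohio."
```

With markers like these, "quote this sentence" and "start the clip here" become the same lookup, which is the whole point of syncing transcripts to media.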

A True Congressional Record


I received an email last week from Derek Willis at the Washington Post concerning a post I did last week. He works on their Congressional Votes database and has been kind enough to let me reprint his email. First off, here's what he was responding to:

If print media wanted to cement their place in 21st century information streams they would work together to provide such semantic web reporting services. The Washington Post's US Congress Votes database is a start in this direction, even if it's crude and doesn't dive below the surface of what's really going on on Capitol Hill. Maybe I'm just super cynical, but I've always felt that the votes made in Congress are just misdirection in the three-card monte game that is public governance.

And here's Derek's email:

Chris,

Saw your posting on media and the growth of the semantic web, and I'm interested in hearing your ideas about how to improve our votes database (although I don't agree that it is crude and barely scratches the surface, given my experience as a reporter for Congressional Quarterly). If you've got some specific suggestions on ways we could make it a better service, I'm all ears.

Derek
--
Derek Willis
Database Editor
washingtonpost.com

First off, I want to apologize for coming off so snarky in talking about work that I think is really cool and obviously took a lot of effort on their part. That was wrong of me and a sin common online. The Washington Post is clearly leading the way in an area that I think is an extremely significant step in 21st century democracy. However, leading the way doesn't mean that you've completed the journey. In hindsight I could have chosen a much better way to describe something that I think is really innovative, both from a technology as well as a media perspective.

For me the interesting aspect of blogging is the interaction between streams of thought, much more so than just writing in a vacuum.
As such, I'll confess that one of my tactics as a writer/debater is to throw off quick confrontational asides in order to provoke reactions from people. This opens up a much more dynamic discussion of issues than what one would usually see in academic circles. And so, here's why it's crude and doesn't scratch below the surface ;-)

In examining the Washington Post site I'm going to use the two criteria that I think are most significant in terms of evaluating a semantic web application: depth and plasticity.

Depth. How deep is the data? How much does it reflect what's really going on?

For example: I'm playing a gig with my band. OK... When? Where? What band? Who's in the band? What do they sound like? These would be the minimum data dimensions that I would think needed to be provided in order for people to be able to understand what's going on. Does the online listing for that gig reflect the true depth of what's going on?

Plasticity. How flexible and reusable is the data?

With the above example the basic thing would be the event having some kind of meta-data so that visitors can easily import it into their calendars. Adding hCalendar Microformat annotations to the listing on the page is one way to do this. Then if I had the Operator extension for Firefox I could import the event to my Google calendar with the click of a button. Automated spiders looking for that kind of meta-data could also add it to their listings, making it easier for other people to find out about my band.

I could go even further by offering a FOAF file for the band, listing contact information and relationships that the group has with other bands, and an hCalendar-annotated blog that would make it much easier for people to keep up to date with our future gigs.

The easier it is for tools to export information into other tools, the more plastic it is. Plasticity is what the semantic web is all about. So let's[...]
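
As a sketch of what that hCalendar annotation on the gig listing could look like, here the markup is built as a Python string. The class names ("vevent", "summary", "dtstart", "location") come from the hCalendar microformat; the band, date, and venue are made up.

```python
# Hypothetical gig listing annotated with hCalendar microformat classes,
# so calendar-aware tools and spiders can pick the event out of the page.
event = {
    "summary": "Folk Engine live",   # invented band/event name
    "dtstart": "2008-06-14",         # machine-readable ISO date
    "location": "Stockton, CA",
}

html = (
    '<div class="vevent">'
    f'<span class="summary">{event["summary"]}</span>, '
    f'<abbr class="dtstart" title="{event["dtstart"]}">June 14</abbr> at '
    f'<span class="location">{event["location"]}</span>'
    "</div>"
)
print(html)
```

Humans see "Folk Engine live, June 14 at Stockton, CA"; a tool like Operator sees a calendar entry it can import with one click.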



I came up with a name for the project I'm working on about a year ago. I thought it was such a kewl name that I didn't want to say it anywhere for fear that it would get hijacked. (Paranoia is strong in my gene pool.) The basic idea is an open-source embeddable distributed engine for building and binding Folksonomy and other meta-data annotated content applications. Rather than building yet another awesome framework for sharing information about your bottle cap collection with other enthusiasts, this would be something that does all of the back end stuff that these Web 2.0 frameworks do over and over again, plus add some cool extra stuff.

So I bought all the core domains (.COM, .NET and .ORG) but never used them. Every now and then I'd google the word just to make sure that everything was quiet. For a year it was.

Then last week, BONK, lo and behold, up come a few hits. But instead of coming from the programming space it was coming from the Danish folk music space. Makes sense. I could see why someone would name a band that.

Anyway I'm going to start officially calling it what I call it. I guess I'll have to TM it at the end just to be safe. That gives me six months to cross my eyes and dot my tees. Luckily I've already got someone who's willing to let me use their business as a guinea pig.

So without further ado, here's my new favorite band, Folk Engine, which shares a name with the business/project I'm working on, FolkEngine(tm). It's the main reason (besides the boy) that I won't be blogging much these days.

This is a big load off my mind. I can finally start using the tag FolkEngine instead of code terms. :-) Now all I have to do is get it done before I drive everyone I know mad.



The first moment I looked at the RDF specification I had a problem with it: everything is URI based. If you want to annotate a thing, you have to refer to its URI.

What I'm working on completely separates content from a specific location. A link to a text document, a thumbnail image linking to a text document, a summary of a text document, and the text document itself are all facets of the same thing, and I can place those things in many places.

For instance, take a technical document with many sections. Now I can display those sections all on one web page for easy printing, or I can separate them out into many pages for easier reading. Also, I can place the document on many servers or offer a PDF version of the document. Which version is "the" document? Which URL do I point to? What if I make a new version of the document and I don't want to remove the old one?

In P2P frameworks content doesn't exist in any one place. It floats. I can grab it from many places. That makes the content extremely plastic. I want my content to have the same flexible characteristics.

This brings out what I consider to be the key weakness of what the W3C does. They create standards that define how the web works. The problem is that they do everything purely from the context of the web. Systems that store, display and annotate data don't just exist in a web context. I create a document on my computer that I want to place on the web. It has a local networked path and a system path. How do I annotate it with RDF if I haven't given it a home yet? What if, after I place it in one place, we decide to move it? Binding myself to URIs makes my data brittle when I want it flexible.

This also dovetails into another problem that is inherent in any framework: how do you identify people? I have many email addresses that change every now and then. I move from place to place and use different variations of my name depending upon context. What defines me digitally? Right now, nothing.

A Friend of a Friend (FOAF) file could be said to be such a thing, but it contains information that changes over time. Also, my FOAF from work would be totally different from the one that I exchange with my old drinking buddies.

I think conceptually I've got the solution: baselines.

A baseline is a collection of fields that define unchanging aspects of a thing.

This makes defining who I am digitally rather simple.

Robert Bob
Momma Bob
Billy Bo Bob
The Moon

This creates a SHA-1 hash of f54634c2c982500c67d254d2afa44c618104bfee

Now I've got an identifier that defines me without containing any information that I'd consider private. I can do the same thing for any content by creating fields that define it. Description, creation date, created by etc...

With a baseline I have a key to an object that isn't bound by its specific content, context, or location.
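
A minimal sketch of computing such a baseline with Python's standard library. Joining the fields with newlines and encoding as UTF-8 is an assumption on my part; the text above doesn't pin down exactly how the example hash was produced, so this code won't necessarily reproduce that specific digest.

```python
import hashlib

def baseline(fields):
    """Hash a list of unchanging fields into a stable identifier.

    The canonicalization (newline-joined, UTF-8) is an assumed
    convention, not one specified in the post.
    """
    canonical = "\n".join(fields)
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()

# The four baseline fields from the example above.
key = baseline(["Robert Bob", "Momma Bob", "Billy Bo Bob", "The Moon"])
print(key)  # a stable 40-character hex identifier
```

The same fields always yield the same key, and the key reveals none of the fields, which is what makes it usable as a location-independent identifier.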

The Blogosphere Revolution is a Semantic Web Revolution


I'm sure I'm not saying anything new but I wanted to get this down.

I've been separating out the concept of content syndication from the semantic web, but that is a mistake. Weblog content syndication, which is the technological force that has powered the current wave of online Progressive activism, is the original semantic web application. An RSS file is a collection of metadata that points to people's posts. That metadata is combined with tools allowing people to easily discover and distribute information.
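
To see the point concretely, here's a tiny hand-written RSS 2.0 fragment parsed with Python's standard library; the feed contents are invented, but the structure is real: the feed is nothing but metadata (titles, links) pointing at posts.

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 feed: pure metadata about posts.
rss = """<rss version="2.0"><channel>
  <title>The Semantic Caucus</title>
  <item><title>Jill Resigns</title><link>http://example.org/jill</link></item>
  <item><title>Beyond Corruption</title><link>http://example.org/lessig</link></item>
</channel></rss>"""

root = ET.fromstring(rss)
# Each <item> is a metadata record pointing to a post elsewhere.
posts = [(i.findtext("title"), i.findtext("link")) for i in root.iter("item")]
print(posts)
```

Everything an aggregator, a planet site, or a swarm of activist blogs does starts from records like these, not from the posts themselves.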

While the "blogs" get all the credit for what's been going on, it wouldn't be anything without the semantic web as the conceptual distribution mechanism. Take away the semantic web and all you'd be left with is bulletin boards, which we've had for a long time.

The semantic web as a concept has taken a lot of hits for being unworkable. What's already been done online with political activism is proof positive that it isn't.

In the end the real difficulty is in making the RDF specification something that can easily interact with content in various formats. The solution by makers of blog software has been to not use RDF and instead create specs that focus entirely on the one-dimensional streams that are blogs.

The future has got to be with breaking away from this 1-D prison.

Blogs and the Media


The Daily Bellwether has an interesting thread up reflecting on Jill Miller Zimon's post concerning relations between bloggers and the print media. This relates to rumors of the PD planning on hiring several bloggers including Jill.

It's going to be interesting watching how old media handles the growth of the semantic web. So far they've been mainly reactionary, which is always a dangerous sign. Both the New York Times and the Washington Post have been adapting in interesting ways.

I'm thinking that eventually we'll see things adjust to a multi-tier approach... blog feeds giving up-to-the-minute information and old-world print existing to provide overviews. The problem with adding more and more information into a system is that it gets harder and harder to process that information. Old media could do a much better job helping people with that. The problem is that they'd first have to understand what's actually going on, which means dropping a lot of their 20th century definitions of what news is.

One of my favorite examples of this is the Dayton Daily News' Get on the Bus blog, which focuses on education. They haven't done a very good job of promoting it, but it does provide a lot of valuable up-to-the-minute information.

One of the key steps of leveraging the semantic web in order to take on corruption in areas such as public education will be in creating standards for reporting public budgets at all levels of government. I can think of few things more important. Follow the money.

If print media wanted to cement their place in 21st century information streams they would work together to provide such semantic web reporting services. The Washington Post's US Congress Votes database is a start in this direction, even if it's crude and doesn't dive below the surface of what's really going on on Capitol Hill. Maybe I'm just super cynical, but I've always felt that the votes made in Congress are just misdirection in the three-card monte game that is public governance.

On Tactical Coordination


Here's an essay that gets into what this is about.

Politics is war by other means. It is an abstract war, fought with ideas on an ever-growing multi-dimensional landscape. As the newest addition to that landscape, the blogosphere is a dimension whose potential we have barely begun to discover.

The explosion of political blogging in the 21st century was born in large part from frustration over the Democratic Party's impotent response to Republican media tactics. Just as Goldwater's loss to LBJ fueled the conservative political revolution, two losing Democratic campaigns are to this day causing waves of change within the Democratic Party.

The first wave came from the 2004 Howard Dean campaign's use of the decentralized blogosphere in order to raise funds and promote activism. It was entirely focused on the race for the White House and used large portal sites to concentrate the message. The second wave came from the greatly enhanced decentralized swarm that was the 2005 Hackett campaign. This race added the dimensions of local targeted blogging and improved coordination between hundreds of small sites, thanks to the power of automated content syndication. The ability of these sites to quickly disseminate information and raise money was a sea change that the political establishment has barely begun to grasp intellectually, let alone take advantage of.

Large-scale political change is usually the child of technological innovation. However, it is not enough to just possess technology; one also has to understand how it sings. In order to fully leverage the political blogosphere one needs to play to the blogosphere's strengths. As Marshall McLuhan said, "the medium is the message," and the blogosphere is a decentralized, grassroots medium.
This bodes well for Progressive causes since it is a naturally democratic medium.

While the blogs have been good at countering lies and distortions in traditional media, there is still a large disconnect between what is talked about in the blogs and what is actually happening on Capitol Hill. How things work in the world's most powerful sausage factory is still largely the domain of beltway insiders. This creates patterns online that are largely superficial and reactionary. Form over function.

Breaking down the barriers that separate the day-to-day actions of Representatives and those they represent is the next wave of progressive political activism online. The key is for progressive legislators and progressive interest groups to use the twin technologies of web syndication and the semantic web to extend the legislative battlefield beyond Washington DC. (These are really the same thing. - Chris)

Here are several cases in point:

1) Caucusing the blogosphere

The farm bill is currently working its way through Congress. While this is a topic of much discussion amongst agribusiness around the world, you will hear almost nothing about it on the blogosphere. While progressive Senators are fighting to streamline farm subsidies, rural members of Congress are pushing for legislation that addresses the interests of corporate farming.

By having Senate staffers simply document the struggle, they make it much easier for bloggers local to the rural members of Congress to put political pressure on those representatives. Normally this information would only be available to lobbyists and connected insiders.

One of the big points in this is that it completely changes the nature of how legislators attempt to influence the blogosphere. The natural tactic of politicians is to initially try to court someone that they see as an influencer, and if that doesn't work, chalk them down as another political enemy. The key dynamic is that[...]