Archive for 2010

Google’s new news tagging scheme fails to give credit where credit is due


Far be it from me to question the brilliance of Google, but in the case of its new news meta tagging scheme, I’m struggling to work out why it is brilliant or how it will be successful.

First, we should applaud the sentiment. Most of us would agree that it is A Good Thing that we should be able to distinguish between syndicated and non-syndicated content, and that we should be able to link back to original sources. So it is important to recognize that both of these are – in theory – important steps forward from the perspective of both the news industry and the public.

But there are a number of problems with the meta tag scheme that Google proposes.

Problems With Google’s Approach

Meta tags are clunky and likely to be gamed. They are clunky because they cover the whole page, not just the article. As such, if the page contains more than one article or, more likely, contains lots of other content besides the article (e.g. links, promos, ads), the meta tag cannot distinguish between them. More importantly, meta tags are, traditionally, what many people have used to game the web. Put in lots of meta tags about your content, the theory goes, and you will get bumped up the search engine results. Rather than address this problem, the new Google system is likely to make it worse, since the “original source” meta tag will be assumed to carry material ranking value.
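For illustration, the two tags Google proposed are ordinary page-level meta elements that sit in the document head (the URLs below are placeholders, not real examples from Google’s announcement):

```html
<head>
  <!-- Marks this page as a syndicated copy and points at the preferred version -->
  <meta name="syndication-source" content="http://example.com/wire_story.html">
  <!-- Claims, or credits, the first report of the story -->
  <meta name="original-source" content="http://example.com/first_report.html">
</head>
```

Because both tags live in the head, they can only describe the page as a whole – which is precisely the granularity problem on a page carrying several articles plus promos and ads.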

Though there is a clear value in being able to identify sources, distinguishing between an “original source” as opposed to a source is fraught with complications. This is something that those of us working on hNews, a microformat for news, have found when talking with news organizations. For example, if a journalist attends a press conference then writes up that press conference, is that the original source? Or is it the press release from the conference with a transcript of what was said? Or is it the report written by another journalist in the room published the following day etc.? Google appears to suggest they could all be “original sources”, but if this extends too far then it is hard to see what use it is.

Even when there is an obvious original source, like a scientific paper, news organizations rarely link back to it (even though it’s easy using a hyperlink). The BBC – which is generally more willing to source than most – has, historically, tended to link to the front page of a scientific publication or website rather than to the scientific paper itself (something the Corporation has sought to address in its more recent editorial guidelines). It is not even clear, in the Google meta-tagging scheme, whether a scientific paper is an original source, or the news article based on it is an original source.

And what about original additions to existing news stories? As Tom Krazit wrote on CNET news,

… the notion of “original source” doesn’t take into account incremental advances in news reporting, such as when one publication advances a story originally broken by another publication with new important details. In other words, if one publication broke the news of Prince William’s engagement while another (hypothetically) later revealed exactly how he proposed, who is the “original source” for stories related to “Prince William engagement,” a hot search term on Google today?

Something else Google’s scheme does not acknowledge is that there are already methodologies out there that do much of what it is proposing, and are in widespread use (ironic given Google’s launch title “Credit where credit is due”). For example, our News Challenge-funded project, hNews, addresses the question of syndicated versus non-syndicated content, and in a much simpler and more effective way. Google’s meta tags do not clash with hNews (both conventions can be used together), but neither do they build on its elements or work in concert with them.

One of the key elements of hNews is “source-org”, or the source organization from which the article came. Not only does this go part-way towards Google’s proposed “original source” tag, it also neatly avoids the difficult question of how to credit a news article that may be based on wire copy but has been adapted since – a frequent occurrence in journalism. The Google syndication method does not capture this important difference. hNews is also already the standard used by the U.S.’s biggest syndicator of content, the Associated Press, and is also used by more than 500 professional U.S. news organizations.
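For comparison, a minimal hNews sketch: “source-org” is a class on an element inside the marked-up article itself, so it applies per-article rather than per-page. The markup below is illustrative (headline, names and dates are invented); the class names follow the hNews draft:

```html
<div class="hnews hentry">
  <h1 class="entry-title">Example headline</h1>
  <span class="author vcard"><span class="fn">Jane Reporter</span></span>
  <abbr class="published" title="2010-11-18">18 November 2010</abbr>
  <!-- The organization the article came from, e.g. a wire service -->
  <div class="source-org vcard org">
    <span class="fn org">Associated Press</span>
  </div>
  <p class="entry-content">Article text…</p>
</div>
```

A page with two articles can carry two different source-orgs, something a single head-level meta tag cannot express.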

It’s also not clear if Google has thought about how this will fit into the workflow of journalists. Every journalist we spoke to when developing hNews said they did not want to have to do things that would add time and effort to what they already do to gather, write up, edit and publish a story. It was partly for this reason that hNews was made easy to integrate into publishing systems; it’s also why hNews marks information up automatically.

Finally, the new Google tags only give certain aspects of credit. They give credit to the news agency and the original source but not to the author, or to when the piece was first published, or how it was changed and updated. As such, they are a poor cousin to methodologies like hNews and linked data/RDFa.

Ways to Improve

In theory Google’s initiative could be, as this post started by saying, a good thing. But there are a number of things Google should do if it is serious about encouraging better sourcing and wants to create a system that works and is sustainable. It should:

  • Work out how to link its scheme to existing methodologies — not just hNews but linked data and other meta tagging methods.
  • Start a dialogue with news organizations about sourcing information in a more consistent and helpful way.
  • Clarify what it means by original source and how it will deal with different types of sources.
  • Explain how it will prevent its meta tagging system from being misused such that the term “original source” fast becomes useless.
  • Use its enormous power to encourage news organizations to include sources, authors, etc. by ranking properly marked-up news items over plain-text ones.

It is not clear whether the Google scheme – as currently designed – is more focused on helping Google with some of its own problems sorting news or with nurturing a broader ecology of good practice.

One cheer for intention, none yet for collaboration or execution.

This article was first posted at PBS MediaShift Ideas Lab on Thursday 18th November.

Written by Martin Moore

November 19th, 2010 at 4:42 pm

Posted in hNews


How does local TV fit into the Big Society?


Given that the reinvention of local news seems to fit so well with the UK government’s idea of a ‘Big Society’, why does the government’s current media policy seem to contradict both?

It is becoming increasingly clear that the reinvention of local news will have to be from the ground up. There are certain aspects of news gathering and publication that are not profitable to do (think court and council reporting). Therefore, assuming there are no direct government subsidies – which seems a pretty safe assumption right now – if it is to be done it will need to be done by people who want to do it, are committed to doing it, and are not looking to make lots of money out of it. In other words, people who live locally and want to contribute something to the society in which they live.

This is just the type of thing this government wants to get people doing. Just last month the Prime Minister, David Cameron, said he intends ‘to build a nation of doers and go-getters, where people step forward not sit back, where people come together to make life better’. The government, Cameron continued, wants to empower people, particularly at a local level, ‘breaking apart the old system with a massive transfer of power, from the state to citizens, politicians to people, government to society’.

Local media is a great first step on the ladder of community involvement. You can start with a very low level of commitment – participating in a local online forum, posting pictures to a local Flickr site – and then progress organically towards a deeper and more substantial commitment – organising local events and writing up reviews, reporting on discussions at the local school’s open meeting.

Read ‘What Works’, a fascinating recent report by the J-Lab assessing 46 local community start-ups that they have been involved with, and you’ll see how closely local media links with local community and local participation. Take The Forum in Deerfield, New Hampshire. It is run by a core group of local volunteers, but more than 350 people have contributed news, articles, photos, columns, art and literature to the site (50 of whom contribute regularly). According to J-Lab ‘The School Board and Select Board now seek out coverage. The local police departments send crime reports. Recreation departments and libraries submit articles.’ And the whole thing is run from within the community.

Yet current government media policy seems to run counter both to this bottom-up reinvention of local news and to the ideals of the Big Society. This was made manifest in two one-day conferences that, by chance, fell on consecutive days last week – 1000 flowers in Norwich, and local TV news at City University in London.

At 1000 flowers Rick Waghorn had gathered together a motley collection of grassroots initiatives, entrepreneurs, local businesses, and local media players. Panels focused on the entrepreneurial side of local news, collaboration, and how to avoid onerous regulation. Even the big players who were there – like Trinity Mirror and STV – were clear about how much they had learnt these last few years about the limits of the top down approach. Trinity Mirror has, for example, now struck partnership deals with over 20 local bloggers in Birmingham. STV has spent the last few months talking to as many people as they can about collaborations.

By comparison the local TV news conference the following day illustrated how the local TV idea – the one on the table from government – seemed to come from another era. There were panels talking about transmitters, DTT platforms and whether Birmingham Alabama really is like Birmingham West Midlands (it isn’t). The assumption appears to be – from the government side – that if the State decides that the future of local news is local TV, provided on a digital terrestrial platform, then that is what the future will be.

This is classic top down thinking. Exactly the opposite of the Cameroonian Big Society. Rather than ‘breaking apart the old system with a massive transfer of power’, this is a Kevin Costner Field of Dreams approach: ‘If we provide the transmission frequency it will be filled’.

But it won’t. It won’t because the economics don’t work, the demand is not proven, and there is little evidence that this is what people want to do (unless the government pays them to do it). On the economics, many, many people with much more knowledge of money matters than me have made clear that local TV in the UK would not be profitable. If it were, as Claire Enders said on the Friday, people would be doing it.

The future of local media is likely to be messy – just like the Big Society. Messy in the sense that different communities will do things differently. There will not be homogeneity. Some communities will have a thriving community of journalists, geeks and bloggers covering local politics, local schools, and weekend fetes (like Birmingham). And they will do it in whatever way makes sense to them and what works for their community. Other communities will have very limited provision.

It is these gaps in provision we should be worrying about. But the way to fill them is not to ‘provide space on the DTT transmitter’; it is to motivate and incentivise people to fill them. The way to do this is to lower the cost of providing this sort of information and scrutiny (e.g. through transparency) and then provide incentives for people to do it in a sustainable way (e.g. by providing tax breaks for public interest news provision).

But let’s please start to realise that telling people to do this, and telling them to do it on a DTT TV platform, is not going to make it happen.

More posts about #1000 Flowers and City Local TV conferences:

#1000 Flowers: David Higgerson, Suzanne Kavanagh, Harry Harrold, Joseph Stashko

City Local TV: Dominic Ponsford, George Brock, Charlie Beckett

Written by Martin Moore

November 12th, 2010 at 3:06 pm

The report we didn’t set out to write


We didn’t set out to write a report on international news. We (the Media Standards Trust) set out to get a handle on what had really changed in newspapers – in terms of content – over the last few decades. There is so much – understandable – focus on the immediate, ongoing, news revolution that we wanted to take a step back, take the long view.

To do this we headed out to wonderful, windswept Colindale, the British Library’s newspaper archive stranded in the nether regions of the Northern line. Here we looked at national newspapers from the 1970s, 1980s, 1990s and 2000s.

Two changes were particularly striking (apart from the ballooning number of pages and supplements):

  • The fall in the extent and prominence of international reporting
  • The fall in the extent of regional news

We left the regional news for now (that’s for a separate report), and decided to concentrate on international reporting – to see if our eyeballing of the papers was borne out by the figures.

Knowing we could not count every story in every paper since the mid 1970s (the library would have moved to Yorkshire before we were finished) we chose a sample of papers and years. We picked an average week in 1979, 1989, 1999 and 2009 – a week that wasn’t skewed by a big news story that dominated the press, like MPs’ expenses or the election – and four newspapers (Daily Telegraph, The Guardian, Daily Mail and the Mirror), and we started counting.

And we counted. And we counted. We counted the number of international stories in the papers (being generous in our definition of international), and we counted the total number of stories in each paper – oh, and we made a note of the page number as well (e.g. 2 international stories on page 2 and 3 other news stories). In total we counted over 10,500 stories.

This way we could get an impression – and granted it is an impression – of how the extent and prominence of international news has changed.

The end result was pretty clear. International news in these four papers has declined in absolute and relative terms. In absolute terms, in other words in terms of the number of foreign news stories published, international coverage has dropped by almost 40%. In a working week in 1979 there were just over 500 international stories published in these four newspapers. By 2009 this had dropped to just over 300. The decline in international news as a proportion of each newspaper was even starker (because the papers have got bigger as international coverage has shrunk). So, in 1979 international news made up a fifth of each paper, on average. By 1989 this had fallen to 16%, by 1999 to 13% and by 2009 to 11%.
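As a sanity check, the headline figures can be reproduced from the numbers quoted above. This is a rough sketch using the rounded figures from this post, not the underlying dataset:

```python
# Approximate weekly counts of international stories across the four papers,
# and international news as a share of each paper, as quoted in the post.
intl_stories = {1979: 500, 2009: 300}
intl_share = {1979: 0.20, 1989: 0.16, 1999: 0.13, 2009: 0.11}

# Absolute decline in the number of international stories, 1979 to 2009
absolute_drop = (intl_stories[1979] - intl_stories[2009]) / intl_stories[1979]
print(f"absolute decline: {absolute_drop:.0%}")  # prints "absolute decline: 40%"

# Relative decline in international news as a share of each paper
share_drop = (intl_share[1979] - intl_share[2009]) / intl_share[1979]
print(f"share decline: {share_drop:.0%}")        # prints "share decline: 45%"
```

The relative figure falls faster than the absolute one because the papers grew while international coverage shrank, which is exactly the “even starker” proportional decline described above.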

Having done all this counting we then wanted to see if these numbers correlated with the experience of foreign correspondents and editors. So we spent some time speaking to people from these and other news organisations. The numbers, they say, mapped quite closely to their own impressions. We then chatted to them about the reasons for the decline and discussed where they thought foreign reporting might be going.

We’ve captured some of their thoughts, and a few of our own, in the Media Standards Trust report published today: ‘Shrinking World: the decline of international reporting in the British press’ (November 2010).

You can download it from the Media Standards Trust website or, if you’d like a print copy, give us a call (020 7727 5252).

Written by Martin Moore

November 1st, 2010 at 6:21 am

Friday note: Iraq war logs, Google’s $5m and media bugs


Links to stuff I’ve read this week about where news may – or may not – be going:

‘Biggest document leak in history’ – The Iraq War Logs

  • More, bigger leaks (as previously suggested on this blog). This time the Bureau of Investigative Journalism (TBIJ) helped Wikileaks work with more news organisations to ‘mediate’ the nearly 400,000 Iraq War Logs. In addition to hooking Assange & Co up with Channel 4’s Dispatches, TBIJ also launched the Iraq War Logs website, with its own stories from the files and links to original documents.

Google invests $5 million in news, sort of – ‘Google to give $5 million to journalism non-profits’

  • Google announced it would be putting $2 million towards news innovation in the US and $3 million internationally. In the US Google has given the Knight Foundation charge of directing the money. Internationally… we don’t know yet.

Knight News Challenge 2011 launches

  • Knight launched its fifth – and final? – Knight News Challenge, with different parameters than previous competitions. This time it called for entries in four specific categories: mobile, sustainability, community, and authenticity. The Media Standards Trust was very pleased to be cited as one of Knight’s previous winners along with Document Cloud and Patchwork Nation.

von Ahn and Castells at the Royal Society – Web Science Conference 2010

  • Talks from the Royal Society’s conference on the future of the web – web science – are now available online. I would highly recommend Luis von Ahn’s lecture explaining how he has used mass collective intelligence to digitise millions of books, and Manuel Castells’s strong defence of the web as a source of happiness.

MediaBugs goes national – Nieman Lab

  • MediaBugs, an ingenious online service for capturing mistakes in news and alerting news organisations, announced it was expanding across the US (having previously been based in San Francisco). Next stop the UK?

Written by Martin Moore

October 29th, 2010 at 3:49 pm

Posted in Uncategorized