Wednesday, January 14, 2009
The Evolving Newsroom is moving
The time has come to say goodbye to Blogger and hello to WordPress: this blog is moving shortly to http://evolvingnewsroom.co.nz.
If you're a subscriber you shouldn't notice any change in service - the RSS feed should update itself (all going well). And I'll put a redirect on here in the next day or so to automatically route to the new site.
Why am I moving?
Because the combination of my own domain with WordPress offers way more flexibility and I've wanted to have a good play with WordPress for ages.
As it turns out, it also offers a fairly sharp learning curve as I figure out how to make things work in this new environment. It's a good learning curve, though. It's fun chopping out bits of code to see what happens and hoping like heck I don't break anything.
The new site isn't as settled as I'd like it to be - I'm still hunting for the right look and functionality (a rather time-consuming process). But it's functional for the most part and I want to make the switch before my job cranks up again for the year and starts swallowing my time.
So, wish me luck and I'll see you after the jump.
Posted by Julie Starr at 8:57 PM
Labels: blogger, evolvingnewsroom, move, wordpress
Sunday, January 11, 2009
UK Telegraph outsources production to Pagemasters
I'd heard about this but hadn't followed up. So, via Jeff Jarvis, who wins headline of the week award for:
Throw Another Sub on the Barbie
The Telegraph of London is outsourcing production of some of its sections to Australia, the Sydney Morning Herald reports.
At my New Business Models for News Summit, Telegraph digital head Edward Roussel rephrased my admonition and told the room to do what you do best and outsource the rest. Guess they mean it.
The work will go to Pagemasters, a company owned by the Australian Associated Press (co-owned by Fairfax and Murdoch’s News Corp), which said it has received inquiries from publishers around the world. I’ll bet. I’m surprised that American newspaper chains haven’t consolidated nearly all their production; there’s no reason it can’t be centralized. As the Herald reports, it makes even more sense to do it in Australia because salaries are lower and the work there can be done on the cheaper day shift. All we’d have to do is teach them that sport is plural and footballs aren’t round.
UPDATE: Edward Roussel confirms in the comments:
Jeff - I can confirm that we are outsourcing the production work for newspaper weekend supplements to Australia - and thereby saving quite a bit of money. The copy goes to Australia once it has been approved by an editor in London. To Rob Mark’s point, the printing takes place in the UK. We have outsourced that too. Arch-rival News International takes care of our printing.
Both the outsourcing of production and printing has allowed us to reduce costs and raise standards: NI has state-of-the-art color printing presses and we are happy with the standard of work that’s being done in Australia.
Reducing the cost of manufacturing and distribution is an imperative for any newspaper group that is determined to remain profitable, as we are. This is a great time to be shopping around the world for value-for-money partners.
The principle holds true on the digital side. ITN creates our video content, providing quality and value that we would struggle to generate internally; Brightcove handles our video distribution; Google powers our search; Escenic provides our web publishing tool; we use software developers in Bulgaria and India.
Newspaper-web companies should focus internal resource on what they do best: creating premium editorial content.
Posted by Julie Starr at 1:10 PM
Labels: jeff jarvis, outsourcing, production, sub-editors, Telegraph
Friday, January 2, 2009
Alltop a good place to find journalism voices
Because some things are worth repeating: Alltop's journalism page is a great entry point to dozens of blogs about journalism and the news business.
Posted by Julie Starr at 4:21 PM
Labels: aggregators, alltop, blogs, journalism
Blogger, Facebook top 2008 social media list
Via TechCrunch, a list of the top 20 social media sites in 2008:
Top Social Media Sites
(ranked by unique worldwide visitors, November 2008; comScore)
- Blogger (222 million)
- Facebook (200 million)
- MySpace (126 million)
- Wordpress (114 million)
- Windows Live Spaces (87 million)
- Yahoo Geocities (69 million)
- Flickr (64 million)
- hi5 (58 million)
- Orkut (46 million)
- Six Apart (46 million)
- Baidu Space (40 million)
- Friendster (31 million)
- 56.com (29 million)
- Webs.com (24 million)
- Bebo (24 million)
- Scribd (23 million)
- Lycos Tripod (23 million)
- Tagged (22 million)
- imeem (22 million)
- Netlog (21 million)
Posted by Julie Starr at 4:09 PM
Labels: 2008, lists, social media, social networks, techcrunch
Big doesn't necessarily mean authoritative
A couple of interesting points in Jeff Jarvis's post about what gives reporters, bloggers etc authority. He was writing in reference to a conversation about how to filter for authoritative voices on Twitter - more specifically, whether number of followers is useful in determining someone's authority, or relevance.
Rest of the post is here.
The problem... is the same one that plagues analysis of online discussion using media metrics. In mass media, of course, big was better because you had to be big to own the press: Mass mattered. We still measure and value things online according to that scale, even though it is mostly outmoded. Indeed, we now complain about things getting too big - when, as Clay Shirky says, what we’re really complaining about is filter failure.
The press came to believe its own PR and it conflated size with authority: We are big, therefore we have authority; our authority comes from our bigness.
But the press, of all parties, should have seen that this didn’t give them authority, for the press was supposed to be in the business of going out to find the real authorities and reporting back to what they said. This is why I always cringe when reporters call themselves experts. No, reporters are expert only at finding experts.
Posted by Julie Starr at 3:43 PM
Labels: authority, jeff jarvis, journalism
News in colour
If lists of headlines don't work for you, this colour grid of news might do the trick. It tracks Google News and you can filter for a particular region.
Alternatively, you could try this news map from vocalnation.net. It aggregates news stories from MSMs round the world - nytimes, nzherald, washingtonpost etc - and pins them on a map.
Posted by Julie Starr at 3:23 PM
Labels: news, visualisation
Tuesday, December 30, 2008
What journalists need to know about search engine optimisation
This is a useful read for any journalist coming to terms with writing for the web and why that means understanding keywords and search engine optimisation.
It was written by Shane Richmond, Communities Editor for telegraph.co.uk, for the British Journalism Review.
I've pulled a few key chunks out of it below, but the whole thing is well worth reading. (There's also a rough keyword-checking sketch after the excerpts.)
The “Gotcha” headline on a Sun front-page splash about the sinking of the General Belgrano is one of the most famous, or infamous depending on your taste, in the history of British journalism. Yet no web producer with any experience would consider a headline like that today. The reason is search engine optimisation (SEO). SEO has been around almost as long as search engines themselves, but journalists were quite late to cotton on. It didn’t really reach newsrooms until a couple of years ago.
The concept is simple. It’s about ensuring that your content is found by the millions of people every day who use search engines as their first filter for news and those who don’t search at all but trust an automated aggregator, such as Google News, to filter stories for them. These people are essentially asking a computer to tell them the news. If you want your story to be read, you’d better make sure the computer knows what you’re writing about.
To do that you need to ensure your article contains certain keywords. That means not only the words that someone types into a search engine but also the keywords that the search engine knows are commonly associated with the search term. So if someone types “credit crunch” into a search engine, the computer knows that an article about the credit crunch often contains other words, such as financial crisis, bail out or bailout, banks, recession and so on.
...
Let’s go back to May 4, 1982 and that “Gotcha” headline. The sub-head read: “Our lads sink gunboat and hole cruiser.” Below that, the story began: “The Navy had the Argies on their knees last night after a devastating double punch.” Alongside was a graphic showing a British soldier and the words “Battle for the Islands”. All of this works perfectly for its audience and its medium, but it wouldn’t be likely to figure highly in search results. Imagine for a moment that the Falklands conflict was happening today. What would you type into a search engine to find the latest news about it? Well, “Falklands” certainly, or perhaps “Falkland Islands”. You wouldn’t search for “the Islands”, which is used in the Sun copy. You’d be more likely to search for “Argentina” than “Argies”, and “British Navy” or “Royal Navy” would get more relevant results than “The Navy”.
...Just as clever headlines, delayed drops and other journalistic tricks evolved to suit the medium, so we will learn new ways to take advantage of the opportunities SEO provides to reach a vast audience. Hopefully it should be clear by now that there’s nothing to debate when it comes to SEO. If you want your story to be found, you have to adopt these techniques. There’s no room for argument. But the debate frequently mutates into something else and unleashes a host of other concerns.
Once we know what people are searching for should we write stories to meet that demand? Will search engines end up dictating our news agenda as well as the way we format our stories? If we write stories simply to chase traffic, where do we find the resources to write the specialist stories, the ones that are important to our core readers but not massively popular?
All those concerns are legitimate, but they are not questions about SEO and shouldn’t be interpreted as such. They are editorial questions. If an editor wants to devote resources to writing stories based on topics people are searching for, they now have the data that will permit them to do so. Giving readers what they want is a sensible strategy, even though the overall mix of stories within a publication has to be balanced. Different editors will make different choices, but they are editorial choices, not SEO choices.
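To make the keyword idea a bit more concrete, here's a rough sketch in Python. The term lists, headlines and scoring are all invented for illustration - it's not anything the Telegraph or Richmond actually describes - but it shows how a producer might check whether a draft headline contains the terms readers are likely to search for:

```python
# Hypothetical sketch: score a draft headline against the search terms readers
# are likely to type, plus the related keywords a search engine tends to
# associate with the topic. The term lists below are made up for illustration.

RELATED_TERMS = {
    "credit crunch": {"financial crisis", "bail out", "bailout", "banks", "recession"},
    "falklands": {"falkland islands", "argentina", "royal navy", "british navy"},
}

def keyword_score(headline, topic):
    """Count how many topic-related search terms appear in the headline."""
    text = headline.lower()
    terms = {topic} | RELATED_TERMS.get(topic, set())
    hits = sorted(t for t in terms if t in text)
    return len(hits), hits

# The 1982-style headline scores zero; a plainer, keyword-rich one does better.
for headline in ("Gotcha: our lads sink gunboat and hole cruiser",
                 "Royal Navy sinks Argentina warship in Falklands conflict"):
    score, matched = keyword_score(headline, "falklands")
    print(score, headline, matched)
```

Crude as it is, something like this makes the point: the computer can only match what's actually in the words on the page.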
Posted by Julie Starr at 5:08 PM
Labels: keywords, search, seo, Shane Richmond, Telegraph
Common Craft on its success and business models
ReadWriteWeb have done a nice end-of-year profile on Common Craft, the clever folk behind those RSS in Plain English and other videos which explain social media and web stuff.
It's a good read, and the part about why Common Craft decided to move away from its custom video service to its current licensing business model is interesting:
- Custom videos do not scale. We would have to hire people to grow the company and we don't want to hire. We are a two person company.
- Custom videos are usually promotional. We are more comfortable with education than promotion. Another realization is that promotion is fad-driven and education isn't as much. We see a longer lifespan for our videos in education.
- Our goal is independence - we want to work for our own goals on our own schedule and maintain a lifestyle that supports us.
What is Common Craft going to do instead of making themselves available for hire to make custom videos? Lee says that for the past year they've been getting requests three or four times a week for permission to re-use their Plain English videos. The solution they decided on was licensing them for corporate and educational use.
Common Craft now sells licenses for high-quality, downloadable versions of their explanatory videos. All of their time working is now spent building out the library. Videos are licensed for under $20 for individual use and $350 for site-wide use, like on a company intranet. Commercial licensing, for use on public commercial websites, is the next option the company will be offering. Of course the video content is available free to anyone online, but Common Craft says that many companies feel far more comfortable paying for official permission to use high quality, unbranded versions. There's certainly no DRM involved.
"People want to do the right thing if they know the rules," Lee LeFever says. "Our challenge is to educate people about how we expect our videos to be used. We're lucky to have fans that feel good about supporting us with their purchases. Given limited resources, we would rather spend time educating people on the right thing to do than trying to make the wrong things impossible."
Posted by Julie Starr at 11:52 AM
Labels: business models, common craft, education, readwriteweb, rss, tools, video
Monday, December 22, 2008
JEANZ, penguins and keeping it simple
The prize for most enjoyable PowerPoint at the recent JEANZ (journalism educators of NZ) conference has to go to Susan Boyd-Bell, who demonstrated the value of keeping it simple and letting a few well-chosen quotes tell a story.
The quotes come from students Susan interviewed as part of her research into the value of experiential learning, specifically on AUT's terrific student newspaper project Te Waha Nui. The paper, incidentally, won a couple of Ossie awards recently including 'Best regular student publication 2008'. I had the pleasure of working with this year's award-winning team (there have been others over the years) and it's lovely to see them walk away with a prize. Well done!
Here's what some students had to say about the experience:
This was my first JEANZ conference and I enjoyed it. There's nothing quite like having a couple of days to talk shop non-stop. Not to mention a cosy dinner with penguins at the Antarctic Centre (a few blurry pictures from my phone here).
The conference agenda was fairly broad and I don't intend to summarise the whole event here. But one point I will make is that it's good to see journalism schools countrywide teaching digital media, multimedia, web 2.0 tools for journalists etc in some form or another.
In a very rough nutshell:
Massey University runs a convergence course for its graduate diploma students giving them an introduction to talking to camera, working with audio, editing packages and writing for the web; Aoraki is offering multiplatform courses at its Christchurch campus.
Jim Tucker at Whitireia has a website his diploma students write for and create images, video and slide shows for; and AUT has a digital media paper which is largely taught online and includes an exploration of online journalism and building a simple website.
Wintec is teaching its pre-journalism students how to use web2.0 tools such as social bookmarking, RSS, blogs and simple audio and video editing software to enhance their study, research and ultimately their journalism.
More about that course in another post. We've learned a lot from its maiden run - including that Gen X and Gen Y very often aren't 'digital natives'.
Posted by Julie Starr at 10:19 AM
Labels: conference, education, jeanz, journalism, powerpoint
Sunday, December 21, 2008
First step in bringing change: find the believers
Erik Ulken has posted a must-read top 10 list of lessons learned while setting up the data desk in the LA Times newsroom.
The data desk's job is to take detailed information that's dreary to read in text or table form and make it useful by presenting it in compelling and interactive formats. A well-known example is the LA Times' Homicide Map, which allows readers to filter through a database of crime statistics and see them represented visually in graphics and on a map.
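Just to make "filter a database and see it on a map" a little more tangible, here's a minimal sketch in Python. The records and field names are invented, and the real Homicide Map is obviously far more sophisticated, but this is the kind of filtering and counting that sits underneath a project like it:

```python
# Hypothetical sketch: filter a handful of homicide records by year and
# neighbourhood, then count incidents per neighbourhood - the sort of slice
# a reader builds interactively on a data project like the Homicide Map.
# All records below are invented for illustration.
from collections import Counter

records = [
    {"neighbourhood": "Westlake", "year": 2008, "cause": "gunshot"},
    {"neighbourhood": "Westlake", "year": 2008, "cause": "stabbing"},
    {"neighbourhood": "Van Nuys", "year": 2008, "cause": "gunshot"},
    {"neighbourhood": "Westlake", "year": 2007, "cause": "gunshot"},
]

def filter_records(rows, year=None, neighbourhood=None):
    """Return only the rows matching the reader's chosen filters."""
    return [r for r in rows
            if (year is None or r["year"] == year)
            and (neighbourhood is None or r["neighbourhood"] == neighbourhood)]

# Count 2008 incidents per neighbourhood - the numbers a map layer would plot.
counts = Counter(r["neighbourhood"] for r in filter_records(records, year=2008))
print(counts)  # Counter({'Westlake': 2, 'Van Nuys': 1})
```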
The 10 lessons Erik posted are all good, so I'm pasting them all in here, but it's still worth checking out his post to see what else he has to say. (There's also a rough Django sketch after the list.)
Erik's list echoes some of the things learned at the Telegraph when it was reinvigorating its website and beginning to integrate its web and print operations. I think these lessons would apply in all sorts of development and change management scenarios.
- Find the believers: You'll likely discover enthusiasts and experts in places you didn't expect. In our case, teaming up with the Times' computer-assisted reporting staff, led by Doug Smith, was a no-brainer. Doug was publishing data to the web before the website had anybody devoted to interactive projects. But besides Doug's group, we found eager partners on the paper's graphics staff, where, for example, GIS expert Tom Lauder had already been playing with Flash and web-based mapping tools for a while. A number of reporters were collecting data for their stories and wondering what else could be done with it. We also found people on the tech side with a good news sense who intuitively understood what we were trying to do.
- Get buy-in from above: For small projects, you might be able to collaborate informally with your fellow believers, but for big initiatives, you need the commitment of top editors who control the newsroom departments whose resources you'll draw on. At the Times, a series of meetings among senior editors to chart a strategic vision for the paper gave us an opportunity to float the data desk idea. This led to plans to devote some reporting resources to gathering data and to move members of the data team into a shared space near the editorial library (see #8).
- Set some priorities: Your group may come from a variety of departments, but if their priorities are in alignment, disparate reporting structures might not be such a big issue. We engaged in "priority alignment" by inviting stakeholders from all the relevant departments (and their bosses) to a series of meetings with the goal of drafting a data strategy memo and setting some project priorities. (We arrived at these projects democratically by taping a big list on the wall and letting people vote by checkmark; ideas with the most checks made the cut.) Priorities will change, of course, but having some concrete goals to guide you will help.
- Go off the reservation: No matter how good your IT department is, their priorities are unlikely to be in sync with yours. They're thinking big-picture product roadmaps with lots of moving pieces. Good luck fitting your database of dog names (oh yes, we did one of those) into their pipeline. Early on, database producer Ben Welsh set up a Django box at projects.latimes.com, where many of the Times' interactive projects live. There are other great solutions besides Django, including Ruby on Rails (the framework that powers the Times' articles and topics pages and many of the great data projects produced by The New York Times) and PHP (an inline scripting language so simple even I managed to learn it). Some people (including the L.A. Times, occasionally) are using Caspio to create and host data apps, sans programming. I am not a fan, for reasons Derek Willis sums up much better than I could, but if you have no other options, it's better than sitting on your hands.
- Templatize: Don't build it unless you can reuse it. The goal of all this is to be able to roll out projects rapidly (see #6), so you need templates, code snippets, Flash components, widgets, etc., that you can get at, customize and turn around quickly. Interactive graphics producer Sean Connelley was able to use the same county-level California map umpteen times as the basis for various election visualizations in Flash.
- Do breaking news: Your priority list may be full of long-term projects like school profiles and test scores, but often it's the quick-turnaround stuff that has the biggest immediate effect. This is where a close relationship with your newsgathering staff is crucial. At the Times, assistant metro editor Megan Garvey has been overseeing the metro staff's contributions to data projects for a few months now. When a Metrolink commuter train collided with a freight train on Sept. 12, Megan began mobilizing reporters to collect key information on the victims while Ben adapted an earlier Django project (templatizing in action!) to create a database of fatalities, complete with reader comments. Metro staffers updated the database via Django's easy-to-use admin interface. (We've also used Google Spreadsheets for drama-free collaborative data entry.) ... Update 11/29/2008: I was remiss in not pointing out Ben's earlier post on this topic.
- Develop new skills: Disclaimer: I know neither Django nor Flash, so I'm kind of a hypocrite here. I'm a lucky hypocrite, though, because I got to work with guys who dream in ActionScript and Python. If you don't have access to a Sean or a Ben — and I realize few newsrooms have the budget to hire tech gurus right now — then train and nurture your enthusiasts. IRE runs occasional Django boot camps, and there are a number of good online tutorials, including Jeff Croft's explanation of Django for non-programmers. Here's a nice primer on data visualization with Flash.
- Cohabitate (but marriage is optional): This may be less of an issue in smaller newsrooms, but in large organizations, collaboration can suffer when teams are split among several floors (or cities). The constituent parts of the Times' Data Desk — print and web graphics, the computer-assisted reporting team and the interactive projects team — have only been in the same place for a couple months, but the benefits to innovation and efficiency are already clear. For one thing, being in brainstorming distance of all the people you might want to bounce ideas off of is ideal, especially in breaking news situations. Also, once we had everybody in the same place, our onetime goal of unifying the reporting structure became less important. The interactive folks still report to latimes.com managing editor Daniel Gaines, and the computer-assisted reporting people continue to report to metro editor David Lauter. The graphics folks still report to their respective bosses. Yes, there are the occasional communication breakdowns and mixed messages. But there is broad agreement on the major priorities and regular conversation on needs and goals.
- Integrate: Don't let your projects dangle out there with a big ugly search box as their only point of entry. Weave them into the fabric of your site. We were inspired by the efforts of a number of newspapers — in particular the Indianapolis Star and its Gannett siblings — to make data projects a central goal of their newsgathering operations. But we wanted to do more than publish data for data's sake. We wanted it to have context and depth, and we didn't want to relegate data projects to a "Data Central"-type page, something Matt Waite (of Politifact fame) memorably dubbed the "data ghetto." (I would link to Waite's thoughtful post, but his site unfortunately reports that it "took a dirt nap recently.") I should note that the Times recently did fashion a data projects index of its own, but only as a secondary way in. The most important routes into data projects are still through related Times content and search engines.
- Give back: Understand that database and visualization projects demand substantial resources at a time when they're in very short supply. Not everyone in your newsroom will see the benefit. Make clear the value your work brings to the organization by looking for ways to pipe the best parts (interesting slices of data, say, or novel visualizations) into your print or broadcast product. For example, some of the election visualizations the data team produced were adapted for print use, and another was used on the air by a partner TV station.
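For anyone wondering what the Django approach in points 4 and 6 might look like in practice, here's a bare-bones sketch. The model and field names are invented for illustration and it uses modern Django syntax - it's not the Times' actual code. The model would live in a Django app's models.py and the admin registration in admin.py; once registered, reporters can enter records through Django's built-in admin, and the same skeleton can be reused for the next breaking-news database:

```python
# models.py - a hypothetical quick-turnaround data model (fields invented).
from django.db import models

class Fatality(models.Model):
    name = models.CharField(max_length=100)
    age = models.IntegerField(null=True, blank=True)
    hometown = models.CharField(max_length=100, blank=True)
    incident = models.CharField(max_length=200)  # e.g. "Metrolink collision"
    date = models.DateField()
    notes = models.TextField(blank=True)

    def __str__(self):
        return self.name

# admin.py - expose the model through Django's built-in admin so newsroom
# staff can add and edit records without touching code.
from django.contrib import admin

class FatalityAdmin(admin.ModelAdmin):
    list_display = ("name", "age", "incident", "date")
    list_filter = ("incident", "date")
    search_fields = ("name", "hometown")

admin.site.register(Fatality, FatalityAdmin)
```

The appeal of templatizing is exactly what Erik describes: the next time news breaks, you copy the skeleton, rename the fields and start entering data rather than building from scratch.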
Posted by Julie Starr at 10:19 AM
Labels: change management, data, homicidemap, latimes, lessons, newsrooms