Adrian Short

Design, citizenship and the city

Archive for the ‘Sutton Council’ tag

Worst practice: 10 ways that Sutton Council’s website (still) drives me nuts

with 7 comments

Someone famous once said that the true definition of madness is doing the same thing over and over and expecting the results to be different. Well, I keep going back to the Sutton Council website, and nine months after launch it’s still no better. Arguably it’s worse.

Wibble.

In no particular order:

1. No redirect from sutton.gov.uk to www.sutton.gov.uk

That’s one small step for the DNS admin, one large dollop of time-wasting annoyance for dozens of users every day.
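And the fix really is that small. Here’s a minimal sketch as a Ruby/Rack app (purely illustrative: the council’s site certainly isn’t running Ruby, and the same job is a one-line RewriteRule in Apache):

```ruby
# config.ru -- run with `rackup`. A sketch of the missing redirect:
# any request for the bare domain gets a permanent redirect to www,
# preserving the path and query string.
require 'rack'

run lambda { |env|
  req = Rack::Request.new(env)
  if req.host == 'sutton.gov.uk'
    [301, { 'Location' => "http://www.sutton.gov.uk#{req.fullpath}" }, []]
  else
    [200, { 'Content-Type' => 'text/plain' }, ['the site proper goes here']]
  end
}
```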

2. Enormously bloated top navbar

[Image: Sutton Council’s top navbar]

So useful that they now let you hide it. Does that tell you something?

3. No distinct visited link colours

[Image: Sutton Council links with no distinct visited colour]

Want to know which links you’ve already clicked? Tough. Perhaps the designers skipped Usability 101. It’s so irritating that I wrote a Greasemonkey script to fix it. (Who says users never want to customise their council’s website?)

4. Abysmal RSS implementation

No autodiscovery. The homepage RSS icons link to a help page rather than to the feeds themselves. On the help page, even the enormous RSS icon isn’t a feed link, just a pretty picture. And once you finally manage to subscribe, you have the exquisite pleasure of renaming “Latest press releases RSS feed” to “Sutton Council news” and “Sutton Council” to “Sutton Council jobs” in your feed reader. All of which makes me think that none of this was designed by anyone who has ever used RSS, let alone properly tested. Please fix it before one of us dies.
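For the record, autodiscovery is trivial to check for. Here’s a quick sketch using Hpricot (the Ruby HTML parser I use for scraping) that lists whatever feeds a page advertises in its head; point it at sutton.gov.uk and it comes up empty:

```ruby
# List the RSS/Atom feeds a page advertises via <link rel="alternate">
# autodiscovery tags -- the tags that let browsers and feed readers
# find feeds without the user hunting for icons.
require 'rubygems'
require 'open-uri'
require 'hpricot'

doc = Hpricot(open('http://www.sutton.gov.uk/'))
feeds = doc.search("link[@rel='alternate']").select do |link|
  link['type'].to_s =~ %r{application/(rss|atom)\+xml}
end

if feeds.empty?
  puts 'No autodiscoverable feeds on this page.'
else
  feeds.each { |f| puts "#{f['title']} => #{f['href']}" }
end
```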

5. Distracting, patronising, juvenile stock photos

If the current homepage is to be believed, Sutton is the kind of place where people are ecstatic to have TWO ice creams, wear flowers in their hair and grow beards. This isn’t cool; it’s the dad dance of civic web design. How about letting the real content speak for itself without making it compete with this junk?

6. The clock/calendar anti-pattern

[Image: the clock/calendar widget on Sutton Council’s homepage]

Put the entirely useless current time and date where the content date should go, then type the content date into the story titles. Is this really a content management system or is someone just bashing it out with FrontPage? (Extra bonus points will be awarded to any designer who can find the time and date on the screen of every user’s computer. Clue: it’s not in the browser.)
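The fix is a one-liner in any real CMS template: print the story’s own stored date, not the clock. A hypothetical sketch:

```ruby
# A hypothetical CMS record -- the point is which date gets printed.
story = { :title => 'New recycling scheme', :published_at => Time.local(2009, 8, 3) }

puts story[:published_at].strftime('%d %B %Y')  # 03 August 2009: the content date
puts Time.now.strftime('%d %B %Y')              # whenever you load the page: clutter
```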

7. Search form uses POST rather than GET

Want to bookmark or link to a page of search results? No can do. Some basic instruction in the meaning and usage of HTTP methods is required. Failing that, just copy every other search form on the entire web.
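The difference in a nutshell, with the usual caveat that the parameter name q is my assumption rather than Sutton’s actual field name:

```ruby
# With GET, the query is part of the URL, so the results page has an
# address you can bookmark, link to, email or cache.
require 'cgi'

query = 'planning applications'
puts "http://www.sutton.gov.uk/search?q=#{CGI.escape(query)}"
# => http://www.sutton.gov.uk/search?q=planning+applications

# With POST, the query travels invisibly in the request body. The
# results page has no address of its own, so there is nothing to
# bookmark and nothing to link to.
```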

8. No permalinks

[Image: non-permanent URLs on the Sutton Council website]

1999 called — they want their URLs back. I wonder whether I’ll have time to fix all my inbound deep links and bookmarks to the site before they change them. Again. Permalinks are cool. Two-ice-cream girl take note.

9. Don’t Contact Us

It’s there, but can you find it? Enjoy the multi-step form when you do. Wizards are magic!

10. Subscribe to this page

Except it doesn’t work. Never has. Makes no sense. A small prize is offered to anyone who can explain clearly 1) what it’s supposed to do and 2) how you use it. I’m just a web designer, and not a very bright one at that. It goes right over my head. (Tip: there’s already a general subscription mechanism for web content. It’s called RSS.)

11. £200K and rising

I had to help pay for it, too. Now that really hurts. Got a spare £200K? You can get a site like this for your council as well.

Written by Adrian Short

August 7th, 2009 at 2:34 pm

Is Sutton Council too white?

with one comment

The Sutton Minority Ethnic Forum is running a project called the Shadow Councillor Scheme, which lets people who may be interested in becoming councillors find out more by closely following a councillor for a month. While everyone is welcome to apply to the project, the council says that:

We are particularly hoping to attract interest from Sutton’s minority groups to help improve the political representation of the borough’s minority ethnic communities.

There seem to be two assumptions in this statement. The first is that there aren’t enough ethnic minority councillors (hereafter “ME”) and the second is that the ethnic composition of the council actually matters. Read the rest of this entry »

Written by Adrian Short

May 13th, 2009 at 11:39 pm

Why I’m throwing down the gauntlet to our councils over RSS feeds

without comments

[Image: Mash the State logo]

You’re free to republish this article under the Creative Commons Attribution 2.0 UK licence with credit and a link to Adrian Short / Mash the State

Today I connected 66 councils to their citizens by making it easy to subscribe to their news by email. It took me around ten minutes. I’d say this was a fairly good use of my time in terms of the ratio of effort to value produced, but I can’t claim to have done it single-handed. What made it possible is that all 66 of these councils serve an RSS feed from their websites — and they’re the only ones in the country that do. Hooking those feeds up to FeedMyInbox through the council pages at Mash the State was a simple matter of dropping a single web link into a template and pushing it to the live site. Job done.

RSS is a simple way of getting data out of a website and into another program. The technology is ten years old and RSS feeds are ubiquitous on blogs, on mainstream news media websites and in Web 2.0 applications. The three leading web browsers — Internet Explorer, Firefox and Safari — all contain built-in RSS readers. Yet despite each running a website that costs tens of thousands of pounds a year, only 15% of UK councils bother with RSS. Nothing could be more symbolic of large parts of government’s unwillingness to think beyond the confines of their own websites than making it practically impossible to receive basic local council information like news and events except by taking a trip to anytown.gov.uk and doing it on the council’s own terms.

The ten minutes it took to emailify those 66 councils compares rather unfavourably with the similar number of hours I’ve probably spent trying to scrape Sutton Council’s news into a database, and from there through Delicious into RSS and Twitter. Writing screen scrapers — programs which extract text from web pages and turn it into structured, reusable data — is sometimes tricky, but Sutton’s news is trickier than most: the news archive has inconsistent page structures and even dynamically changing URLs to contend with. I vowed never to write another scraper, though as we’ll see, that’s a promise I soon had to break.

Screen scraping and copyright infringement are the dirty not-so-secrets of the civic hacking world. Show me a useful, innovative third-party civic website and I’ll most probably be able to show you the terms and conditions that were ignored and the data that was taken and repurposed without permission or legal licence. Similar behaviour is not unknown in the public sector itself, in some cases because government organisations are recycling that very same stolen data from third-party applications into their own websites. The recent Rewired State National Hack the Government Day saw some incredibly inspiring, innovative and useful projects produced in very short order. How many of these projects didn’t involve citizens jailbreaking their own government to get the data they’ve paid for? What kind of society not only massively impedes but actually criminalises — in principle if not in practice — citizens devoting their own time, skills and money to write software to improve democracy and public services? Our society, it seems.

This has to stop. Hackers have shown their ability and willingness to surmount technical obstacles and run legal risks to get the data they need but less technical citizens simply cannot. No-one should have to. A rich, technologically-advanced and supposedly forward-thinking society such as ours should make citizens’ access to government data so commonplace that it doesn’t deserve comment. No technical wizardry required. No legal minefields to navigate. Just all the data served through common protocols with open licences that permit, well, anything. Then we can focus our time and energy on the considerably more interesting higher-order opportunities that come from actually using government data, not just getting hold of it.

Last week I launched Mash the State, a national campaign to get government data to the people. It’s not a new idea but our method is. We’ll be setting up a series of challenges to the public sector, asking one group of public bodies at a time to release one specific set of data. Our first challenge asks every local council to serve up an RSS news feed by Christmas. I wouldn’t have bet good money in 2003 that 370 councils would still be without RSS by 2009, but here we are. I’ve thrown down the gauntlet and I’m pleased to see that a couple of hundred people have signed up to our website or followed us on Twitter to help make this happen. The councils have over eight months to do what in most cases will be no more than half a day’s work: serving RSS from their websites. Others less fortunate will have to persuade their content management system suppliers to enable the feature for them. All have plenty of time to perform this technically trivial task and give the public a small but highly symbolic Christmas present: a sign that government in this country is prepared to trust its citizens with their own data.
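To give a sense of the scale of the task, here’s a sketch of a complete RSS 2.0 feed built with nothing but Ruby’s standard library. The stories array stands in for whatever records a council’s CMS already holds.

```ruby
# Build a valid RSS 2.0 feed from existing records using Ruby's
# bundled RSS::Maker -- no third-party code required.
require 'rss/maker'

stories = [
  { :title => 'Example press release',
    :url   => 'http://example.gov.uk/news/123',
    :date  => Time.now }
]

feed = RSS::Maker.make('2.0') do |maker|
  maker.channel.title       = 'Council news'
  maker.channel.link        = 'http://example.gov.uk/news/'
  maker.channel.description = 'The latest news from the council'
  stories.each do |story|
    item = maker.items.new_item
    item.title = story[:title]
    item.link  = story[:url]
    item.date  = story[:date]
  end
end

puts feed  # the finished XML, ready to serve
```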

As for my promise never to write another scraper, it didn’t last long. The very first task in building Mash the State was an hour spent writing a scraper to tease a list of councils out of a government website. Join us and help hasten the day when no-one will ever have to do anything like that again.

Written by Adrian Short

April 14th, 2009 at 10:30 pm

Building a local news mashup with Twitter, TwitterFeed, Delicious, Yahoo! Pipes, Ruby and RSS

with 17 comments

[Diagram of the Sutton local news mashup: sources, APIs, mashers and destinations. Also available as a 19KB PDF.]

Like this? Follow me on Twitter: http://twitter.com/adrianshort

I’m a self-confessed and unashamed news junkie, and this is how I’m starting to mash up news in my local area. For those who aren’t local: Sutton is a London borough with a population of approximately 180,000; Stonecot Hill is a neighbourhood within Sutton with a population of a few thousand.

Here’s how it all works.

Sources (green boxes)

I write Stonecot Hill News, a local news blog running as a standalone WordPress installation on its own server. It produces an RSS 2.0 feed, which here is treated as an outbound API.

Paul Burstow is the local member of parliament (constituency: Sutton & Cheam). Paul posts news regularly to his website and for many years that site has been serving an RSS 1.0 (RDF) feed. Whether he realises it or not, Paul laid one of the first foundations for news mashability in the borough.

The Sutton Guardian is the local newspaper, published by Newsquest. Together with its sister titles in other areas, it publishes several dozen RSS 2.0 feeds covering a wide variety of content.

Sutton Council is the local authority for the borough. Despite a recent £270,000 revamp of their website they haven’t yet managed to step into the twenty-first century and produce any RSS feeds. However, they do publish a variety of content regularly on their website, including their press releases.

APIs (grey boxes)

For the non-technical: API stands for Application Programming Interface, but that doesn’t tell you very much. Think of APIs like connectors or adapters that allow one program to plug into another in the same way that our household appliances can all connect to the electrical network because they share common plugs and sockets.

An API may be inbound (allowing data to be put into an application), outbound (allowing data to be extracted) or both.

As we can see in the diagram, applications which use APIs can be daisy-chained together, with the output of one application being fed into another.

RSS and Atom feeds are also APIs in that they provide a structured way for a program to get data out of an application. These feed formats are simple to implement (many applications produce them automatically) and are the first thing to consider when implementing a simple outbound API for an application.
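By way of illustration, consuming a feed programmatically takes only a few lines of standard-library Ruby (the feed URL here is an illustrative guess, not a documented endpoint):

```ruby
# Read a feed as structured data: every item arrives with a title
# and link rather than as text buried in a web page.
require 'open-uri'
require 'rss'

feed = RSS::Parser.parse(open('http://www.suttonguardian.co.uk/rss/').read, false)
feed.items.first(5).each do |item|
  puts "#{item.title} -- #{item.link}"
end
```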

Mashers (pink boxes)

Mashers are small programs that connect otherwise incompatible inbound and outbound APIs together. TwitterFeed is a simple example. Say you want to automatically post the new items from your blog to your Twitter account. Your blog serves an RSS feed but Twitter, while it has an inbound API, cannot accept RSS directly as input. TwitterFeed links the two, allowing the user to define any number of RSS feeds as inputs and any number of Twitter accounts as outputs, via the Twitter API. In this way, TwitterFeed plugs blogs into Twitter.
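Reduced to its essentials, a masher like TwitterFeed is doing something close to this sketch. The feed URL and credentials are placeholders, and the real service also remembers which items it has already posted; the statuses/update call is Twitter’s current Basic Auth API.

```ruby
# Take the newest item from a feed and post it to Twitter.
require 'open-uri'
require 'rss'
require 'net/http'

feed = RSS::Parser.parse(open('http://blog.example.org/feed/').read, false)
item = feed.items.first

update = Net::HTTP::Post.new('/statuses/update.json')
update.basic_auth('username', 'password')
update.set_form_data('status' => "#{item.title} #{item.link}"[0, 140])
Net::HTTP.new('twitter.com', 80).request(update)
```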

Yahoo! Pipes is a much more sophisticated and flexible masher. It can take inputs from a variety of sources (RSS, Atom, CSV, the Flickr API, Google Base or even raw web pages), sort, filter and combine them in every conceivable way, and output the results as a single stream in various formats (RSS, JSON, and KML, the geo-format used by Google Earth). For my mashup I created this pipe to filter Paul Burstow’s, the Sutton Guardian’s and Sutton Council’s news and pass through only the items containing the word “stonecot” to the stream that eventually ends in the @stonecothill Twitter feed, which is just for Stonecot Hill residents. The number of items about Stonecot Hill coming through these sources is very low, but when something appears residents will want to see it. (By way of example, only one of Sutton Council’s last 227 press releases concerns the Stonecot Hill area specifically.)
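There’s nothing exotic in the pipe’s logic. Expressed in Ruby instead of Pipes’ visual editor, it amounts to this (the source URLs are illustrative stand-ins for the pipe’s real inputs):

```ruby
# Merge several feeds and pass through only the items that mention
# "stonecot" anywhere in the title or description.
require 'open-uri'
require 'rss'

sources = [
  'http://feeds.example.org/paulburstow.rss',    # stand-in
  'http://feeds.example.org/suttonguardian.rss'  # stand-in
]

items = sources.map do |url|
  RSS::Parser.parse(open(url).read, false).items
end.flatten

items.select { |i| "#{i.title} #{i.description}" =~ /stonecot/i }.each do |i|
  puts "#{i.title} -- #{i.link}"
end
```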

As mentioned above, Sutton Council doesn’t provide an RSS feed or any other kind of outbound API for its press releases. I wrote a screen scraper in Ruby (using Hpricot) that grabs the press releases directly from the council website, dumps them into a MySQL database and pushes new items into the Delicious API. I’ve used Delicious here for two reasons. Firstly, it generates an RSS feed automatically from all the items posted to it, so I can easily connect this output to other mashers and APIs further downstream without having to generate and host an RSS feed myself. Secondly, Delicious provides a useful search facility on its website, allowing me to search just the press releases from Sutton Council. This isn’t possible with the council’s own website, where searches are scoped to the entire site.
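In miniature, that scrape-and-repost loop looks like the sketch below. The council URL and the Hpricot selector are simplified placeholders (the real page structure is inconsistent, which is the whole problem), though posts/add is the genuine Delicious v1 API.

```ruby
# Scrape press release links from the council site and push anything
# new into Delicious, which then serves the RSS feed for me.
require 'rubygems'
require 'open-uri'
require 'hpricot'
require 'net/https'
require 'cgi'

doc = Hpricot(open('http://www.sutton.gov.uk/news/'))  # placeholder URL
doc.search("div.news-item a").each do |link|           # placeholder selector
  title = link.inner_text.strip
  url   = link['href']
  # ...the real script checks the MySQL database here and skips
  # anything it has already posted...

  http = Net::HTTP.new('api.del.icio.us', 443)
  http.use_ssl = true
  request = Net::HTTP::Get.new(
    "/v1/posts/add?url=#{CGI.escape(url)}&description=#{CGI.escape(title)}"
  )
  request.basic_auth('username', 'password')
  http.request(request)
end
```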

Destinations (orange boxes)

In my diagram, the destinations are sites and services which represent new ways of consuming information coming from the original sources. Don’t want to read Sutton Council’s press releases on their own website? You can follow them in Delicious or on Twitter. Want to keep up with the latest news about Stonecot Hill? Again, the @stonecothill Twitter account can find this for you from various sources. I also add my own items to @stonecothill, making it a unique mashup of original and syndicated content that’s highly targeted and very local.

The information stream doesn’t need to end with these destinations. Any destination that provides an outbound API can simply be another link in the chain to downstream services. In my diagram, the RSS feed from Delicious is used to do just that, pushing all its content on to the @suttonboro Twitter account, and just the Stonecot Hill-related content on to the @stonecothill account via the Yahoo! Pipes filter. Twitter has its own specific outbound API and also serves RSS feeds. There’s nothing to stop anyone else building on these destinations by combining and filtering them with other sources to produce their own unique, relevant information streams that they find useful.

What next?

If you run a website, it’s time to start thinking of mashability with the same degree of seriousness as you treat human visitors. Your website needs to serve up feeds and APIs so that other programs can connect to your content and deliver it to people in ways and contexts that they find useful. Some of these may have an audience of thousands or even millions. Others may have an audience of one. Regardless, by providing an API to your content you enable others to build things that you haven’t imagined, don’t have the resources or desire to build yourself, and won’t have to maintain. Businesses like newspapers that survive by selling their content (or selling advertising around their content) are thinking very carefully about the challenges and opportunities for the future of their industries. For government and voluntary organisations, it’s time to start thinking more like evangelists than economists. Spread the word like the free Bibles in hotel bedrooms and take every opportunity to get your message out there.

Sutton Council have been encouraged in various ways to implement feeds on their own website and the song will remain the same until they do. I don’t want to maintain my scraper for ever and I certainly don’t want to build any more of them.

The whole API and mashability agenda is far bigger than simple web feed formats like RSS and Atom. It’s time for technologists to stop flogging the line that “RSS is an easy way for people who follow lots of websites to read all their news in one place”. Direct human consumption of RSS feeds is never going to hit the mainstream in that way. If you’re reading this, you’re far more likely than average to use an RSS reader (I’ve got 86 feeds in my Google Reader right now). The average web user has barely heard of the concept and most definitely doesn’t do it. I suspect they never will. But they’re likely already benefiting from syndicated content through sites and applications that they use. If they never have to see or care about the underlying technology, that’s really no more of a problem than worrying that the average web user doesn’t understand HTTP or DNS. It’s just plumbing that can stay out of sight and out of mind as long as it works.

For the minority that do use personal RSS readers, I’d like to see more of them with built-in filtering features. Setting a simple keyword filter on a feed makes RSS reading considerably more powerful.

For those serving up feeds, I’d like to see Atom more widely used. Without wanting to open a can of Wineresque worms, RSS 2.0 fudges a number of important issues around content semantics and provides no support whatsoever for correctly attributing items in feeds mashed from several sources. Atom was designed to solve these problems and it does. Let’s use it.

Lastly, mashability is about every conceivable kind of content and content type. It’s not just about news and text. Every stream of information should have its own machine-readable feed. Every system that can accept data from human input should implement an inbound API to do likewise. To take one example, FixMyStreet is a website for people to report street faults to local authorities and currently takes around 1,000 reports a week. It even has its own iPhone application so people can report faults complete with GPS locations and photos directly from the street. Only a single local authority in over 400 has implemented an inbound API to receive these reports. The rest get them by email, and the reports must be manually copied into their own databases with all the effort, expense, possibility for error and opportunity cost that represents. Third parties building extensions to other people’s systems is no longer unusual, so organisations need to embrace the possibilities rather than fighting them or standing around looking bemused.
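To make the idea concrete, here’s a hypothetical sketch of a report being delivered to a council’s inbound API instead of an inbox. The endpoint and field names are invented for illustration; they are not FixMyStreet’s actual interface.

```ruby
# Post a street fault report as structured data to a council's
# (hypothetical) inbound API -- no email, no re-keying.
require 'net/http'

report = {
  'category'  => 'Pothole',
  'detail'    => 'Large pothole outside 12 Example Road',
  'latitude'  => '51.361',
  'longitude' => '-0.194'
}

request = Net::HTTP::Post.new('/api/street-faults')  # invented endpoint
request.set_form_data(report)
Net::HTTP.new('reports.example.gov.uk', 80).request(request)
```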

It’s time to open the doors and windows and get the web joined up, mashed up and moving.

Written by Adrian Short

March 15th, 2009 at 6:35 pm

Fixing Sutton Council’s usability with Greasemonkey

with 2 comments

Having dealt with the issue of broken links on Sutton Council’s new website, today I’ll turn to some of the other usability issues that beset the hapless traveller on their road to local government web nirvana. True to the spirit of my own advice about fixing problems where possible rather than just moaning about them, I’ll present a fix that will curb some of the worst excesses and give the site better usability in some areas. Scroll to the bottom for the good stuff if you can’t wait. First, the discussion.

1. No distinct link colours, no visited link colours

I’ve read half of these stories, but which ones?

Two of the web’s strongest conventions are to use different colours for links and body text, and to use different colours for visited and unvisited links. Ignore them at your peril.

Links need to stand out from body text so they’re easily visible at a glance, not just on closer scrutiny. The usual method is to use a contrasting colour for the links and to underline them. The underlining can be dropped in obvious groups of links such as navigation bars and, at a push, in body text. A different colour is pretty much mandatory. If you’ve got links, why camouflage them?

Using a different colour for visited links is all but essential so that the user can easily see which links they’ve used and which they haven’t. The more links a page has, the more important this becomes. Again, it’s effectively a mandatory usability requirement, so widespread as to be ubiquitous. Not using different colours for visited links is one of Jakob Nielsen’s Top 10 Mistakes in Web Design.

On Sutton Council’s new site, the body text is black, the links are black and underlined, and the visited links are black and underlined. Spot the difference? Clearly, badly-conceived ideas about graphic design have taken precedence over the convenience and sanity of the poor souls who might actually have to plough through some of the site’s several hundred pages. Or maybe the designers have short-term memories that can hold twenty or thirty items. Who knows?

2. The Clock/Calendar anti-pattern

Perhaps I’m not really in the target audience, but when I want to know the time or the date my first instinct isn’t to visit Sutton Council’s website. Right now I can see the time in three different places (watch, wall clock, taskbar) and finding the date requires no more effort than hovering my mouse over the clock in the corner of my screen.

Putting the current date and time in a web page is rarely necessary and often confuses. Aside from the obvious cost of cluttering the page with something that just doesn’t belong there, it can lull the user into a false sense of contemporaneity. Hey, this site is bang up to date! Just like the clock on my wall!

Sadly, the current date on a web page is often mistaken for the publication date of the page itself. This is a problem, as I’d hazard that very little of Sutton Council’s web content has been published within the last minute. It would be all too easy to read that date as belonging to an otherwise undated news article or press release.

Dumping the current date and time into a web page is a shoddy anti-pattern that needs to stop. It’s a bad habit picked up by lousy designers (or lousy clients) who presumably feel that it’ll liven up an otherwise pedestrian site. If it’s not contextual it’s clutter, so leave it out.

Incidentally, given that the council’s PR department ploughs through nearly £600,000 a year, it’s worth asking whether we can get dated press releases and news articles for that money or whether we’ll have to stump up a bit more. What’s it worth?

3. Teeny text

Is it just me getting old or is the text just a tad too small? Yes, there are gratuitous “accessibility” widgets at the top of every page to adjust it, but a better approach might well have been to make it a bit bigger by default. Not everyone on the web is a 20-something 1337 h4x0r.

Help is at hand!

Better Sutton Council is a Greasemonkey script I’ve written to fix these problems and enable colourful, legible and bad-date-free browsing.

How to get it:

1. You must be using the Firefox browser. No Internet Explorer, Opera, Chrome or what have you.

2. Install the Greasemonkey add-on if you don’t already have it. You’ll probably know about it if you do.

3. Install Better Sutton Council as a user script and, if necessary, activate Greasemonkey by clicking on the greyed-out sad monkey face on the status bar at the bottom of your browser window. Once the monkey face is smiling, happy and colourful, you should be ready to go.

4. Just refresh/reload/visit Sutton Council and enjoy a whole new way of browsing.

A couple of important points:

  • I haven’t bothered to track down the exceptions to the default link colours I’ve defined for darker backgrounds. My aim is to make the site more legible and usable, not to improve its overall prettiness. If you’re expecting a comprehensive redesign you’ll be disappointed.
  • This “hack” operates purely in the user’s browser within a well-managed script framework for modifying downloaded web pages before they’re displayed. At no point have I compromised Sutton Council’s security or created any vulnerability on anyone’s computer. Don’t embarrass yourself by trying to McKinnon me: I haven’t done anything worse than the web equivalent of colouring in my daily newspaper with crayons.

The software’s in the public domain. Modify to taste if you know how. If not, just enjoy it as it is or uninstall through Manage User Scripts on the Greasemonkey menu (right-click on the monkey face).

That’s better.

Written by Adrian Short

September 29th, 2008 at 10:36 am

Permalinks — a guide for the perplexed at Sutton Council

with 7 comments

Sutton Council launched their long-awaited new website this week and it’s disappointingly dreadful in many ways. Possibly worse than anything in the design or content of the site is the sad fact that the relaunch has broken all inbound links, just as it did the last time and the time before that.

What does this mean, why does it matter and what can be done about it?

Read the rest of this entry »

Written by Adrian Short

September 27th, 2008 at 11:47 am

Twittering Sutton

without comments

Problems:

1. Sutton Council’s Latest News section doesn’t have an RSS feed or any easy way for the public to track it other than by visiting it regularly.

2. The Sutton Guardian has more dirt than diamonds (although at least it has a feed).

3. Other things happen that don’t get reported.

4. You don’t have time to plough through two dozen websites to keep track of what’s going on in Sutton.

Solutions:

1. Visit http://twitter.com/suttonboro for a concise, well-edited overview of borough activity.

2. If you use an RSS reader, subscribe to the feed at http://feeds.feedburner.com/suttonboro

3. Subscribe to the latest updates by email, if that’s your thing.

Enjoy.

Written by Adrian Short

August 18th, 2008 at 10:00 am