As somebody who works in local government IT, constant scraping of our data like this is the bane of our lives. We get hit by thousands of these scrapers, many with no rate limiting, making hugely intensive requests that cause downtime and knock-on effects for actual customers and citizens. We block IPs, add captchas, and yet it persists.
If you really want the data, just FOI it for goodness' sake.
I get the distinct impression that many of these outfits aren't really advocating for improved transparency but are simply trying to exploit and monetise illicitly obtained government data to make a quick buck.
Fair points and yeah you must be sick of unrate-limited mass scraping. I run with 1.5-3 second delays from a single residential IP and back off when portals push back, but from your side I look the same as someone hammering you.
On your point regarding FOI, what you say is fair, and I should probably have led with that for the trickier councils. The honest reason I haven't is that filing 240 FOI requests at once felt like it would put a different kind of strain on councils, but if you're telling me the scraping is worse, then I take that seriously.
On "monetise illicitly obtained data"... I'm not going to pretend the £19 is altruism. But there is a public interest in this data being navigable across council boundaries, and that's not something individual councils can deliver. I must stress that I'm not sure I've got the model right yet and a lot of feedback today is pushing me toward more free, which I'm seriously considering.
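For transparency's sake, the throttling I described above is roughly the following (a minimal sketch using only the Python stdlib; the function names, exact delays and retry counts are illustrative, not my production code):

```python
import random
import time
import urllib.error
import urllib.request

def backoff_seconds(attempt, base=5.0):
    """Exponential backoff: 5s, 10s, 20s, ... for attempt 0, 1, 2, ..."""
    return base * (2 ** attempt)

def polite_get(url, min_delay=1.5, max_delay=3.0, max_retries=4):
    """Fetch a URL with a randomised inter-request delay, backing off on 429/503."""
    for attempt in range(max_retries):
        time.sleep(random.uniform(min_delay, max_delay))  # spread requests out
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                return resp.read()
        except urllib.error.HTTPError as e:
            if e.code in (429, 503):  # portal pushing back: wait longer, retry
                time.sleep(backoff_seconds(attempt))
                continue
            raise
    raise RuntimeError(f"gave up on {url} after {max_retries} attempts")
```

The point of the randomised delay is to avoid a fixed-interval signature, and the backoff means a struggling portal sees fewer requests, not more.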
Maybe I'm just naive, but why wouldn't a citizen do both?
I'm not implying that anything would get deliberately redacted, but it seems likely that information released through other channels would not match the web. A request might also reveal information that was not on the web.
Waste disposal and planning for quarrying and mineral extraction are different functions, decided at a higher tier of local government, and are not directly comparable to development management/planning.
Websites usually link to their RSS feed using a <link> element in the head of the page.
Browsers used to detect this and show an RSS icon near the address bar if the website you were viewing had a feed - and you could click the icon to see more details and subscribe.
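The autodiscovery markup is standardised enough that browsers (and feed readers) can find it mechanically; a typical feed link looks like this (the title and href here are placeholders):

```html
<head>
  <!-- RSS autodiscovery: rel="alternate" plus the feed's MIME type -->
  <link rel="alternate" type="application/rss+xml"
        title="Example feed" href="/feed.xml">
</head>
```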
I find this so sad. I would gladly pay/donate to support Firefox, far in excess of however much money they would make from data mining and advertising. I am sure that enough people feel the same way to make it a viable model.
Thunderbird raises more than $8mn a year in donations to support its development, which shows that this model can work.
I bought an N9 in 2011 and it was an incredible phone. The design and UI were gorgeous and it was a joy to use. I still miss the swipe-driven UI - it was clever, intuitive and well thought out. The phone itself had Facebook, Twitter, WhatsApp, and Spotify clients, and MS Exchange support for calendaring and email (I believe Nokia developed or ported many of these in-house) and was completely usable day-to-day.
Compared to Nokia's Symbian phones and earlier Maemo efforts, it felt revolutionary, and I'd agree Nokia had a device which could have paved the way for a post-Symbian future. It definitely felt like, with continued investment, it would have been a real iPhone competitor, and just in the nick of time.