Upon loading the map I must admit to feeling a sense of trepidation. It's really not the most pleasing of sights: you get Bing satellite imagery combined with a baffling array of place names at various sizes. Too much! And slapped on top are some rather odd-looking marker symbols... but at least they are designed to be seen across the dark basemap and all that detail.
As you zoom, the map content scales nicely and the actual flood risk surfaces appear. Personally, I'd like some further information on how these are calculated, and they aren't clickable, so I don't get a sense of the scale of the potential threat other than visually. Is this important? Well, it's a flood risk zone, and it seems sensible to use it as a gateway to details about the risk rather than have those details listed only against the more generally located marker symbol.
There are options to switch between Bing imagery, Ordnance Survey raster data and a Bing road map, and at different scales this becomes useful for ensuring you can see the flood risk zones clearly. Of course, none of these is a perfect solution, because the basemaps are not primarily designed to act as the underlay for operational layers like flood risk zones... but that's a perennial problem when mashing up polygonized operational overlays on top of general-purpose reference basemaps. A nice touch is the ability to apply some transparency to the flood risk zone layer, and also to invoke an 'emphasis layer', which adds a dark grey transparent wash across the map, effectively darkening the background and allowing emphasis to be given to the flood risk zones and markers. This is a novel way of countering over-cluttered basemaps when you don't have access to a nice, clean, de-cluttered basemap.
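The map's own implementation of the emphasis layer isn't published, but the visual effect is just standard "source-over" alpha compositing of a semi-transparent dark wash over each basemap pixel. A minimal sketch, assuming a pure-black wash at 50% opacity (both values are illustrative, not the map's actual settings):

```typescript
// Sketch of the 'emphasis layer' effect: a semi-transparent dark wash
// composited over the basemap, leaving fully opaque markers drawn on top
// to stand out. Wash colour and alpha here are assumptions for illustration.

type RGB = { r: number; g: number; b: number };

// Standard "source-over" alpha compositing of a wash colour onto a basemap pixel.
function applyWash(base: RGB, wash: RGB, alpha: number): RGB {
  const blend = (w: number, b: number) => Math.round(w * alpha + b * (1 - alpha));
  return { r: blend(wash.r, base.r), g: blend(wash.g, base.g), b: blend(wash.b, base.b) };
}

// A black wash at 50% opacity halves the basemap's brightness:
const basemapPixel: RGB = { r: 200, g: 180, b: 160 };
const washed = applyWash(basemapPixel, { r: 0, g: 0, b: 0 }, 0.5);
// washed → { r: 100, g: 90, b: 80 }
```

In practice this would be a single full-extent rectangle with an rgba fill sitting between the basemap and the operational layers, rather than per-pixel code, but the arithmetic is the same.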
The search works well: I managed to locate a range of places easily using either place names or postcodes, and you can also sign up to receive alerts via other mechanisms.
Overall, a pretty good effort and a good example of a well-constructed web map. It's not over-complicated and its use is fairly intuitive. I guess the only real question is the accuracy of the flood risk surface itself... when you dig into the help it's clear that the risk zones are a little spurious. Flood mapping is notoriously difficult, and the map admits this. It states that the zones do not include information on flood depth, speed or volume of flow, and do not cover groundwater, runoff and other sources. Hmm... so what is it, then? It's a predicted surface of areas of risk based on what? Well, that's not made clear, and that's where maps like this fall over. On the face of it, it gives access to what many would view as a detailed and accurate map of flood risk. But it's only as detailed and accurate as the input data, which is partial at best. The danger here is that the map, while generally well presented, implies a level of accuracy that might not be valid. Not many people (I would think) would read the help file to explore the details of how the risk surface was or wasn't constructed... they're going to go straight to their home location and see a great big orange or red polygon splattered over their house. Notwithstanding that the map explains it shouldn't be used to infer any risk to individual properties, this is the natural inference.
Implied accuracy is something that web maps in general need to wrestle with. People view maps as accurate (they always have). They rarely question them, rarely ask the right questions about data efficacy, the mapping technique and how it was applied, or whether the map's form and function have been in any way subjectively manipulated. Since we look at web maps rapidly and want rapid information retrieval, there is a need for the map to somehow qualify what it's showing, so that the potential risk is communicated to the public without causing panic or hysteria through implied accuracy. One way this might be achieved is to create graded flood risk surfaces. At the moment it's a binary surface... a zone is either there or it isn't... and while the alert itself is graded as Flood Alert, Flood Warning or Severe Flood Warning, that grading isn't expressed spatially within the zone. Why not encode it in the map itself as a layer with different internal zones... and why not calculate some measure of how 'believable' the alert might be? For instance, not all areas have the same probability of flooding despite being in the same alert zone. This is more complicated, but this is also where data science needs to meet web mapping to provide more accurate frameworks.
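The graded-surface idea above can be sketched as a simple classifier: each cell of the risk surface carries an estimated annual flood probability, and the map symbolizes it by band rather than as a single on/off polygon. The thresholds, band names and fill opacities below are invented purely for illustration; real bands would have to come from the underlying hydrological model.

```typescript
// Sketch of a graded (rather than binary) risk surface: map an estimated
// annual flood probability for a cell to a symbology band. All thresholds
// and opacities here are hypothetical values chosen for illustration.

interface Band {
  name: string;
  minProbability: number; // annual exceedance probability threshold
  fillOpacity: number;    // stronger fill for higher risk
}

// Ordered from highest risk to lowest; first match wins.
const bands: Band[] = [
  { name: "severe", minProbability: 0.1,   fillOpacity: 0.8 }, // > 1-in-10 year
  { name: "high",   minProbability: 0.033, fillOpacity: 0.6 }, // > 1-in-30 year
  { name: "medium", minProbability: 0.01,  fillOpacity: 0.4 }, // > 1-in-100 year
  { name: "low",    minProbability: 0.001, fillOpacity: 0.2 }, // > 1-in-1000 year
];

function gradeCell(annualProbability: number): Band | null {
  for (const band of bands) {
    if (annualProbability >= band.minProbability) return band;
  }
  return null; // below the lowest threshold: draw nothing
}

// e.g. a 1-in-50-year cell (p = 0.02) falls in the "medium" band
const example = gradeCell(0.02);
```

The point isn't the particular cut-offs; it's that a per-cell probability gives the reader internal structure within an alert zone, instead of one undifferentiated polygon splattered over their house.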
UPDATE:
Owen raises an interesting issue regarding the complexity of the data as it stands, what it shows, and how more complex data might lead to a confusing map. I agree to an extent. Making complex data easy to understand is tricky, BUT to my mind this map is an example of less is less. @mapperz tweeted the following this morning, which characterises the problem very well:
So a map made for public consumption should be simple, so that the public neither need to think too hard nor get bogged down in detail? Whereas if the map were made for 'GIS statistical analysis' (whatever that is), then by definition we require something altogether more complex, because either we can handle complexity or we demand the detail to do more with it. I disagree here... and this is my beef with far too many maps (and web maps in particular): they over-simplify and justify the approach as satisfying a general public with low levels of understanding and cognitive capability. They don't want to create confusion, so the answer is to create a very vague map using data that generalises. Actually the reverse is true: it's an art to take something with a complex structure and handle it in a sophisticated manner, communicating the complexity clearly and without ambiguity. Just creating a simple version and claiming people don't want to do GIS analysis is a bit of a cop-out. The argument that the map was made for public consumption shouldn't absolve the map-maker of dealing with complexity in a sophisticated manner.