Sunday, October 30, 2005

Community Wireless Emergency Response -- Updates on Our Work & the Lessons Learned

A volunteer community wireless brigade reports from New Orleans

# The rigidity of the 'official' disaster response continues to hamper core mission objectives -- even today. For example, the only supported browser disaster survivors can use to apply for FEMA assistance is IE 6.0 (in violation of the government's own Section 508 accessibility rules) -- you can check this out for yourself at: FEMA was aware of this problem by September 8th but has still not fixed it -- meaning that Mac users, as well as Linux and other OS users, will have trouble even gaining access to disaster aid.

# Ad-hoc (wireless) networks were often the first telecommunications infrastructure made available to evacuees, beating out the major providers by days (and often weeks).

# Had a diverse array of telecommunications infrastructures been in place, the cataclysmic failure may have been avoided. In addition, networks that are set up to 'phone home' to central locations/servers are prone to failure when most needed.

# The telecom incumbents are spending a ton of time & energy to obfuscate these issues and are conducting extensive lobbying efforts to spin this tragedy to their own advantage. Especially important to them is preventing the growth of unlicensed spectrum, ad-hoc networking technologies, and bandwidth-sharing infrastructures.

more at

Thursday, October 13, 2005

visual complexity collection (information aesthetics weblog)

visual complexity collection

14 October 2005

a very beautiful collection of networked data visualizations, meant as a unified online resource space for anyone interested in the visualization of complex networks. the collection contains many examples retrieved online as well as from literature (with many new ones, & most online projects similar to those in the information aesthetics or infovis category).
according to the author: 'the project's main goal is to leverage a critical understanding of different visualization methods, across a series of disciplines as diverse as biology, social networks or the world wide web. the website truly hopes it can inspire, motivate & enlighten any person doing research on this field'. sounds somehow familiar. [thnkx Andrew]

Tuesday, October 11, 2005

Pigeon Rank

Somehow I missed this a few years ago

The technology behind Google's great results

As a Google user, you're familiar with the speed and accuracy of a Google search. How exactly does Google manage to find the right results for every query as quickly as it does? The heart of Google's search technology is PigeonRank™, a system for ranking web pages developed by Google founders Larry Page and Sergey Brin at Stanford University.

PigeonRank System

Building upon the breakthrough work of B. F. Skinner, Page and Brin reasoned that low-cost pigeon clusters (PCs) could be used to compute the relative value of web pages faster than human editors or machine-based algorithms. And while Google has dozens of engineers working to improve every aspect of our service on a daily basis, PigeonRank continues to provide the basis for all of our web search tools.

It gets better...

BlogMarks is cool

Saturday, October 01, 2005

Scientific American: Drowning New Orleans

Scientific American October 2001:

Drowning New Orleans

A major hurricane could swamp New Orleans under 20 feet of water, killing thousands. Human activities along the Mississippi River have dramatically increased the risk, and now only massive reengineering of southeastern Louisiana can save the city

By Mark Fischetti

New Orleans is a disaster waiting to happen. The city lies below sea level, in a bowl bordered by levees that fend off Lake Pontchartrain to the north and the Mississippi River to the south and west. And because of a damning confluence of factors, the city is sinking further, putting it at increasing flood risk after even minor storms. The low-lying Mississippi Delta, which buffers the city from the gulf, is also rapidly disappearing. A year from now another 25 to 30 square miles of delta marsh -- an area the size of Manhattan -- will have vanished. An acre disappears every 24 minutes. Each loss gives a storm surge a clearer path to wash over the delta and pour into the bowl, trapping one million people inside and another million in surrounding communities. Extensive evacuation would be impossible because the surging water would cut off the few escape routes. Scientists at Louisiana State University (L.S.U.), who have modeled hundreds of possible storm tracks on advanced computers, predict that more than 100,000 people could die. The body bags wouldn't go very far. ... continued at Scientific American Digital ...