I spent yesterday at the IUPUI Conference Center attending an NSF "Campus Bridging" workshop. Your first response will likely be the same as that of everyone I've spoken to: "What the heck is that?"

Well, this was the first one I attended, but I believe it is one track in a series of workshops to help the NSF decide how to structure its future Cyberinfrastructure (CI) funding programs. The focus was on how to get campuses ramped up to support the data deluge generated by scientific instruments, from gene sequencers to the LHC. Obviously networking is a big part of that equation, but certainly not the only part. There was a lot of discussion about data storage and indexing, metadata, federated identity, and so on.

Here are a couple of good presentations that I think hit the nail on the head in terms of how we should be building campus networks to handle big-data science applications:

Network Architecture for High Performance (Joe Metzger - ESNET)
The Data Intensive Network (Guy Almes - TAMU)

Incidentally, we at IU started building our campus networks this way around 2003-04, and I think that's one of the reasons we've been so successful with projects like the Data Capacitor.