With news stories we are all becoming comfortable with the idea of dynamic or live feeds of content to our web pages. In other areas we take dynamic data updates as a given, or at least as something that will shortly be the norm: the stock market, defense, and even medical information are all moving, or have already moved, to "online, all of the time". For reasons that are not entirely clear, this has not yet happened in the geographic information industry, in spite of the fact that few other kinds of information demand such integration more strongly than geographic information does. News feeds – real-time information syndication – offer an instructive model for real-time integration of geographic information. They show that it is possible. And since some of these feeds already have geographic information associated with them (see www.georss.org), it seems only a small step from feeds of geo-oriented content to feeds of geographic information itself. This was the reason for the creation of GML and is at the heart of the meaning of the Geo-Web.

The news feed analogy also carries another point: the importance of combining the right information. While a news editor may juxtapose dissimilar stories, this is always done for a reason – to achieve a particular effect. We need to understand how to do the same thing in the geo-domain, meaning real-time data integration.

Automated news feeds also make us think about information quality. Is this story true? As a publisher, can I accept the feed without review? Can I accept it from just anyone? Similar issues exist, albeit in a more complex guise, in the world of geographic information. Data quality is paramount: garbage in, garbage out. What do we know of the quality of what we are receiving? How can we provide for high data quality in an online environment? How do we restrict who can publish, and what they can publish?
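To make the feed analogy concrete, here is a minimal sketch of what a geo-tagged feed item looks like and how a consumer might extract its location. The fragment follows the GeoRSS-Simple convention of a `georss:point` element carrying "latitude longitude"; the item content and the helper function are illustrative assumptions, not part of any particular published feed.

```python
import xml.etree.ElementTree as ET

# Hypothetical RSS item tagged with a GeoRSS-Simple point
# ("latitude longitude"), using the www.georss.org namespace.
FEED_ITEM = """
<item xmlns:georss="http://www.georss.org/georss">
  <title>Flood warning issued</title>
  <georss:point>45.256 -71.92</georss:point>
</item>
"""

GEORSS_NS = "{http://www.georss.org/georss}"

def extract_point(item_xml):
    """Return (lat, lon) from a GeoRSS-Simple point element, or None."""
    root = ET.fromstring(item_xml)
    point = root.find(GEORSS_NS + "point")
    if point is None or not point.text:
        return None
    lat, lon = (float(v) for v in point.text.split())
    return lat, lon

print(extract_point(FEED_ITEM))  # (45.256, -71.92)
```

The small step the text describes is visible here: once every item carries coordinates, the feed is no longer just geo-oriented content but a stream of geographic information that can be placed on a map as it arrives.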