Personalized Summaries for Large Collections of Geo-Referenced Photographs
We describe a framework for automatically selecting a summary set of photographs from a large collection of geo-referenced photos. The summarization algorithm is based on spatial patterns in photo sets, but can be extended to incorporate social, temporal, and topical-textual factors as well. The summary set can be biased by the content of the query, the user making the query, and the context in which the query is made. An initial evaluation of our implementation on a set of geo-referenced photos shows that the algorithm performs well, producing summaries that users rate highly.
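To make the spatial component of such a summary concrete, the following is a minimal, hypothetical sketch (not the paper's actual algorithm, which also incorporates social, temporal, and textual factors): it clusters photos by their (latitude, longitude) coordinates with a basic k-means and picks the photo nearest each cluster centroid as a representative. The `photos` dictionary format and the `summarize_photos` function name are assumptions introduced for illustration.

```python
import random

def summarize_photos(photos, k=3, iters=20, seed=0):
    """Select up to k representative photos by clustering (lat, lon)
    coordinates with a simple k-means, then choosing the photo closest
    to each centroid. Illustrative sketch only."""
    random.seed(seed)
    # Initialize centroids from k randomly chosen photos.
    centroids = [(p["lat"], p["lon"]) for p in random.sample(photos, k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: attach each photo to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in photos:
            i = min(range(k),
                    key=lambda c: (p["lat"] - centroids[c][0]) ** 2
                                + (p["lon"] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster;
        # keep the old centroid if the cluster is empty.
        new_centroids = []
        for i, cl in enumerate(clusters):
            if cl:
                new_centroids.append((sum(p["lat"] for p in cl) / len(cl),
                                      sum(p["lon"] for p in cl) / len(cl)))
            else:
                new_centroids.append(centroids[i])
        centroids = new_centroids
    # Representative selection: the photo nearest each non-empty centroid.
    summary = []
    for (clat, clon), cl in zip(centroids, clusters):
        if cl:
            summary.append(min(cl, key=lambda p: (p["lat"] - clat) ** 2
                                               + (p["lon"] - clon) ** 2))
    return summary
```

In a full system along the lines the abstract describes, the distance function could be reweighted per query or per user to bias the summary toward particular regions, times, or topics.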