You can never be 100% sure that the data in the system is the most recent. In a decentralized network there will be many copies of the data, rather than one big database that serves all users. In this environment, much of the data will not be up to date. We can't have one big server, and we can't check with a central source every time the data is to be used. We must make do with the last known data.
A frequently visited page could store, in its links, some data about each link's destination (last updated, its size, etc.), and that information would be easily accessible to visitors. In this way the network load could be reduced. Information that seldom changes could be recalculated or re-checked maybe once a month. Other information could maybe be checked every 24 hours. Still other information may only change in special circumstances, such as on a user's request, or on the basis of information submitted by a user. This of course applies to all information, not only to the information in links. I firmly believe that we should let go of the concept from relational databases, where all data should be normalized to the fifth normal form.
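The per-item refresh intervals described above can be sketched as a small cache where each entry carries its own time-to-live. This is a minimal illustration, not an implementation from the text; all names (`StaleCache`, `refresh`, the interval constants) are hypothetical.

```python
import time

# Hypothetical refresh intervals, matching the examples in the text:
MONTH = 30 * 24 * 3600   # seldom-changing data: re-check roughly once a month
DAY = 24 * 3600          # more volatile data: re-check every 24 hours

class StaleCache:
    """Serve the last known data; re-fetch only when an entry's TTL expires."""

    def __init__(self):
        self._entries = {}   # key -> (value, fetched_at, ttl)

    def get(self, key, refresh, ttl=DAY):
        entry = self._entries.get(key)
        if entry is not None:
            value, fetched_at, entry_ttl = entry
            if time.time() - fetched_at < entry_ttl:
                return value                    # last known data is good enough
        value = refresh(key)                    # re-check with a source copy
        self._entries[key] = (value, time.time(), ttl)
        return value

    def invalidate(self, key):
        """Force a re-check in special circumstances, e.g. a user's request."""
        self._entries.pop(key, None)
```

The point of the sketch is that staleness tolerance is a per-item decision made by the reader of the data, not a global guarantee enforced by a central server.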