NRS Brainstorming Megathread

NRS brainstorm Post-it summary

A summary of the discussion points above, which ranged from the desirability of NRS to the meaning of the perpetual web to approximating timekeeping on Safe. It’s a wiki, so please add, subtract or qualify.

What is NRS?

Name Resolution System (NRS) is a system that translates a human-readable web address into a content-based 256-bit XOR address. It’s analogous to the Domain Name System (DNS) on the Internet.

XOR addresses (unique identifiers for stored data) can also be translated into more manageable base32-encoded XOR-URLs, but though shorter, these are still not easy for people to read. XOR-URLs may also encode metadata such as MIME type.

Features of NRS

NRS is an app-layer system rather than a feature of the network itself, meaning it can be replaced by a competitor. However, it is currently part of the MVP feature set.

NRS only translates the address of a domain. It does not encode any metadata.

Websites on the Safe Network can be identified using NRS thus: safe://service_name.public_id

Files on the Safe Network can be identified using NRS thus: safe://service_name.public_id/file?v=[int]. The version number applies to mutable data; when the data is changed, the version number increases by 1.

There is arguably a case for enforcing version numbers in NRS links to resources so that the version returned is predictable. There may also be a case for allowing ?v=latest so that the latest version of the file is returned (See below)
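The address forms above can be illustrated with a small parser. This is a sketch only: `parseSafeUrl` and its return shape are invented for illustration (they are not part of safe-api), and it assumes exactly the `safe://service_name.public_id/path?v=N` form quoted above.

```typescript
// Hypothetical parser for the Safe URL forms described above.
function parseSafeUrl(url: string): {
  serviceName: string;
  publicId: string;
  path: string;
  version: number | null;
} {
  // safe:// scheme, optional service name, public id, optional path, optional ?v=N
  const m = url.match(/^safe:\/\/(?:([^./?]+)\.)?([^/?]+)(\/[^?]*)?(?:\?v=(\d+))?$/);
  if (!m) throw new Error(`not a Safe URL: ${url}`);
  return {
    serviceName: m[1] ?? "",
    publicId: m[2],
    path: m[3] ?? "/",
    version: m[4] !== undefined ? Number(m[4]) : null, // null = unversioned link
  };
}
```

An unversioned link parses with `version: null`, which is exactly the case the enforcement proposals below would reject.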

Advantages of NRS

Ease of use, people want simplicity.

NRS is a key differentiator.

Can allow simple transactions without copy-pasting long keys

XOR URLs are hard to use for people with visual impairments, scary for the non-technical

Possible business opportunity for MaidSafe selling domains

Disadvantages of NRS

In its current form it allows domain squatting.

Link rot: “For something to be called the ‘permanent web’ it must do more than just store bits, it must have them accessible.” (See below)

NRS is currently the only game in town (there are no competing resolution systems).

The simplicity and tidiness of NRS may encourage unversioned links (See below)

Allows the use of visually similar and commonly confused characters (homoglyphs etc.)

Confusion of addressing and ID: “There’s an assumption that follows from DNS that public names are unique.”

Other comments

Should priority be search instead of refining NRS?

Competitive NRS could be advantageous.

But could lead to competing incompatible ecosystems.

Possible alternative approaches/enhancements to NRS

Just use XOR-URLs.

Pros: everything is versioned, no squatting, secure. Cons: not user friendly

Don’t use a single global namespace; instead have hundreds or thousands of short-ish TLDs, with more becoming available over time.

Instead of alphabetic TLDs, you could also use the year

Identity

Title (or Petname) / PublicName combo - a non-unique name to go alongside the PublicName.

“These three elements, Public Name, Site Name, and Pet Name, are the elements that would allow a site to be Findable, and communicable.”

There can be other ways to check the Public ID of a publisher, including a profile, flags and site page history. Do they pass the smell test?

Link rot

Link rot is where a link to an external resource goes out of date when the version of that resource changes.

“An important difference with Safe is that the destination data doesn’t rot, it’s the link pointing to it that may no longer point to the correct version.”

Versioned files might link to CSS stylesheets, JavaScript, and other programs and frameworks that might change in unpredictable ways and may not be backwards compatible.

Possible solutions to link rot

Link to the latest version by default using something like safe://link_to_resource?v=latest.

Insist all NRS links are versioned - i.e. they point to a specific version of the resource, or use XOR-URLs instead of NRS.

Enforce versioning: safe-api could reject any SafeUrl without a version identifier.
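The “reject any SafeUrl without a version identifier” idea might look something like this sketch. `requireVersioned` is a hypothetical name, not a real safe-api function; it only assumes the `?v=N` query form used elsewhere in this summary.

```typescript
// Hypothetical strict check: refuse to resolve any Safe URL that does not
// pin a version, returning the pinned version number otherwise.
function requireVersioned(url: string): number {
  const m = url.match(/[?&]v=(\d+)(?:&|$)/);
  if (!m) {
    throw new Error(`unversioned Safe URL rejected: ${url}`);
  }
  return Number(m[1]);
}
```

Relaxing this later to accept `v=latest` (as suggested above) would be a one-line change to the pattern.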

Drawbacks to these solutions

Link to latest by default:

Makes the perpetual web difficult to imagine, because a page viewed some time later might show a different embedded image, different text etc, and changes to the external resources are out of our control. “Being able to link to “latest” anything is easier/simpler/cleaner. But we cannot have that and a permanent web as far as I can see at the mo.”

Cannot use dates or relative time to pin version (Safe has no time).

All external links versioned

Could be inflexible:

“Links never go stale on a permaweb.”

“They do, if the intent is to link to the latest version and that not an option.”

The Perpetual Web

The Perpetual Web is a Safe USP, but what do we mean by the Perpetual Web? It needs a clearer definition.

Does it mean:

  • Historical pages are strictly immutable and will look and behave in exactly the same way in 20 years’ time?
  • Historical pages will feature the same content but may look and/or behave differently (eg updated CSS/JS)?
  • Historical pages will exist forever but the links within them can change?

What proportion of Safe’s users will value the Perpetual Web? Should it be the default setting or an option for archivists, journalists and fact checkers?

If links are made to static versions, how can we be sure that links in the linked documents are also static? What if they link to the latest version of some image? How do we get to the latest version?

If links are by default to the latest version, how do we smoothly roll back to previous versions?

How can a perpetual web and one that’s constantly changing coexist?

Safe should be able to accommodate both the need to snapshot the past and the requirement to keep up with the present. But there are likely lots of edge cases.

Example: a page that presents images at random.

“If there is random behavior at version N, then there should be randomness again 10 years later at version N.”

Where, and how, should these ‘snapshots’ be stored? If a niche use case requires that an entire site be saved exactly as is for the historical record, this could be done in private (paid) storage.

Is there an alternative?

Possible solutions:

Start with enforcing versioning and later relax it to allow version=latest as tools (apps) become available.

Use JavaScript to control versions (however JS might change the content, see below).

Stick to pointers and versioning to avoid JS, as in IoT devices like cameras.

Browser / app level controls to switch between current and historical versions.

“By a) requiring that version always specified, b) providing option preferlatest flag, and c) implementing normal and historic browsing modes in user-agent, allow the web to be viewed as a dynamic ever-changing thing in the present, but also as global fixed snapshots in the past. This is powerful and has not existed before.”

Calendar tab in browser/app to dial back through time, but to do that you would need to timestamp each version when published (See Timestamps below)

Perhaps a distinction could be made between “things that are used to build this page” vs “things that this page is linking to”.

Static vs dynamic sites

Safe can host static sites, but dynamic sites are difficult because of versioning. Some static sites can behave like dynamic sites (e.g. linking to ?v=latest), but they are not truly dynamic, and this should be made clear to avoid confusion.

“You can link to documents such as a price list at latest version via normal NRS, just not load them directly into the page (as things stand)”

“Anything loaded as part of the page (images, javascript, css, audiofiles, video etc) has to be versioned or it will not load”

JavaScript

JavaScript can help with linked version choice but it may also promote risky behaviour.

A static page with JavaScript can show different content every time (e.g. Math.random(), a new content feed), but a page without JS cannot.

Mixing old and current materials on the same site may be tricky.

“JavaScript breaks all the efforts to enforce versions.”

An image can be loaded with just two lines of JavaScript.

Is using JavaScript to link to latest a useful halfway house or adding unnecessary complexity? How much load would it put on the network?
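The “two lines of JavaScript” idea can be sketched like this (written as TypeScript). Only the `?v=` query syntax comes from the thread; the function name, the example URL and the element selector are made up for illustration.

```typescript
// Build a versioned (or "latest") URL for a linked resource.
function buildVersionedUrl(baseUrl: string, version: number | "latest"): string {
  return `${baseUrl}?v=${version}`;
}

// In a page, the actual two lines might be (DOM-only, shown as comments):
//   const img = document.querySelector("img#hero") as HTMLImageElement;
//   img.src = buildVersionedUrl("safe://img.alice/hero.png", 4);
```

Because the version argument is chosen at run time, this is exactly the mechanism that can both enable version pickers and, as quoted above, break efforts to enforce versions.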

How to tell if linked content has been updated

Devs should be encouraged to version their content and users need a quick way of checking the currency of the linked content.

A three-digit versioning scheme (x.y.z, i.e. major.minor.patch) would provide an intuitive measure of the relative weight of an update.

Display the sum total of all the linked-to versions; if any linked resource changes, this number will change too.

Tracking app.
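The version-sum indicator could be sketched as follows. The function name and input shape (a map of linked resource to its pinned version) are assumptions for illustration only.

```typescript
// Sum the versions of every linked resource. If any one of them is
// republished (its version bumps), the total changes, giving users a
// single number to compare for "has anything I link to moved?".
function versionFingerprint(linkedVersions: Record<string, number>): number {
  let total = 0;
  for (const url of Object.keys(linkedVersions)) {
    total += linkedVersions[url];
  }
  return total;
}
```

A plain sum cannot say *which* link changed (and offsetting changes could in principle collide), so a real tracking app would likely store the per-link versions too.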

Browsers / tools

Since NRS is app-level, the controls around versioning will be at the app level too.

Have an indicator in browser when version is not the latest.

The option to scroll back through versions in the browser could be encouraged.

At the moment the browser just won’t load unversioned content (images/scripts/CSS)

But this doesn’t mean you cannot link to unversioned content or that (versioned) JavaScript cannot load unversioned content.


Backwards compatibility: Will browsers always maintain backwards compatibility over long periods of time?

Maybe insist on web standards like HTTP headers to ensure compatibility.

Possible ‘developer mode’ in the browser which catches uncertain links that may cause a problem, plus a tool that automatically updates the version in links, etc.

Timestamps

Timestamps are a very human way of placing content in its temporal context, but difficult to achieve in a network that knows nothing of time. Some ideas:

Could timestamps be reliable if based on a section (elder/adult) or close group consensus?

NRS only resolves the site name. Updating AD objects does not store a timestamp either.

Adding them would be a major change; leave timestamps to the application layer.

Timestamps could perhaps be managed off-network with a third-party service, e.g. Opentimestamps.org, but it may be dangerous to depend on external services.

Time (if used) should enhance version, not supplant it.

Could time be factored into node age? If an elder is consistently out of time with the group they lose age.

Each timestamp could carry a tolerance, introducing an uncertainty that reflects what is fairly known.

Maybe the size of the Safe Network itself could be used instead of actual clock/calendar time.

“There already is a section timestamp similar to a Lamport clock [a simple algorithm used to determine the order of events in a distributed computer system]. But time is weird and does not work very well.”

The section chain, network-wide, is a tree structure, so we can go back to any point in section time.
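For reference, a minimal Lamport clock (the algorithm the quote above compares section timestamps to) looks like this. It is a textbook sketch, not Safe Network code: each node keeps a counter, increments it on local events, and on receiving a message takes the maximum of its own and the sender’s counter, plus one.

```typescript
// Textbook Lamport clock: provides a logical ordering of events
// without any wall-clock time, which is why it suits a network
// that "knows nothing of time".
class LamportClock {
  private counter = 0;

  // A local event happened: bump and return the new timestamp.
  tick(): number {
    return ++this.counter;
  }

  // A message arrived stamped with the sender's timestamp.
  receive(remote: number): number {
    this.counter = Math.max(this.counter, remote) + 1;
    return this.counter;
  }

  now(): number {
    return this.counter;
  }
}
```

As the quote notes, this gives an *order* of events, not calendar time, which is why it can support “going back to any point in section time” without supplying human-readable dates.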
