I wasn’t quite sure where to put this, but since I’m thinking it might be an alternative to the classic HTTP DNS approach, I thought I’d stick it in Features.
Okay, so as I understand it, DNS works a bit like old-school telephone operators used to: you call up the DNS, ask for a name, and it looks that name up in a long directory of IP addresses. Now, what if, instead of a web address system (the classic HTTP approach), we created an AI to do something similar? Say you asked an AI backed by an LLM, “Please load the Autonomi forum”, or “Can you please list all related Autonomi sites” (similar to doing a search), or “Please search the Safe Network for cat videos”, or whatever.

Remember, there will be a ton of public-domain content on the Safe Network. The problem won’t actually be finding the sites; the problem is making those addresses human-readable. But an AI doesn’t care if it has to deal with hash-code addresses. It’s all just strings in a database to an AI, and they’re great at remembering things, unlike us. So, get the AI to act as a bookmarking operator for you; heck, you could have it act as your personal DNS system if you add a profiling system to it. You could easily add a tagging function for content that way too. You could create a browser extension, or just write an AI in Python that runs in the web browser itself.

Then people would just need to submit their addresses to some kind of public dataset, which could in turn be stored on the Safe Network and retrieved by the AI. You could even write a simple function into the AI for submitting your website’s info, couldn’t you? “Hey, could you submit my website’s info to the public list?” The AI asks for the specifics, the user inputs them, and the AI sends them off. Would that be so complicated?
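To make the idea concrete, here is a minimal sketch of that “bookmarking operator” in Python. Everything here is hypothetical: the registry is a local dict standing in for a public dataset on the Safe Network, and the address strings are made up, not real XOR addresses.

```python
# Hypothetical sketch: a name-to-address registry the AI would query.
# In the real idea this store would live on the Safe Network; here it
# is just a local dict, and the addresses are placeholder strings.

registry = {}  # site name -> {"address": xor-style string, "tags": [...]}

def submit_site(name, xor_address, tags=()):
    """What 'submit my website's info to the public list' boils down to."""
    registry[name.lower()] = {"address": xor_address, "tags": list(tags)}

def resolve(query):
    """Return addresses whose name or tags match the query (the 'DNS' part)."""
    q = query.lower()
    return [entry["address"]
            for name, entry in registry.items()
            if q in name or q in entry["tags"]]

submit_site("Autonomi Forum", "a3f9c2...e71b", tags=["forum", "autonomi"])
print(resolve("forum"))  # -> ['a3f9c2...e71b']
```

The AI’s job would mostly be translating natural-language requests (“load the Autonomi forum”) into calls like `resolve("autonomi forum")`, which is exactly the kind of string-matching task it’s good at.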
I realize creating an AI to do all this might be a bit more complicated than it sounds, but it can’t be THAT much more complicated. And I’m positing the idea here because it would affect a wide array of people, so I think it’s fair to get a bunch of community feedback on it.
We can use QR Codes to help with sharing XOR addresses, which is handy when you can’t physically click a link (e.g. on a flyer, an advert, etc).
We can bookmark XOR addresses and give them a friendly name. By default, bookmarks are searched when you type in the URL bar too, so the experience is pretty good.
We share our bookmarks with others as our ‘pet names’ for sites. Maybe this forms a sort of curated list of names for XOR addresses, but it’s decentralised.
Once we have a search engine, finding XOR addresses by names/keywords gets much easier, especially if some sort of ranking system can be used.
Maybe there is room for de-facto domain names, but I suspect it would either have to be a gold rush: a free-for-all filled with squatters trying to extort folks. Or it would need centralised elements, which seems at odds with what Autonomi is all about.
Well, what if you paired a pet name with a unique ID code? There might be a dozen people named Joe Smith, but how many of those Joe Smiths have matching ID codes? So just slap an ID hash onto each pet name, or a set of easily recognizable symbols/NFT emojis. The ID code could be buried in the metadata but would show up in things like search results or URLs: http://safe.ID.petname.suffix or something like that. How many combinations of name and ID code could we get if we just added a 5–7 character alphanumeric hash as an ID code? What if it included symbols?
I mean, if the goal is to prevent people from buying up domain names, then increasing the number of available variants of each pet name from 1 to millions makes buying them all up cost-prohibitive.
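The scale of that ID space is easy to check. Assuming a case-insensitive alphanumeric alphabet (36 characters), a short suffix already gives tens of millions of variants per pet name:

```python
# Back-of-envelope count of ID codes per pet name, assuming a
# case-insensitive alphanumeric alphabet (a-z plus 0-9 = 36 chars).
for length in (5, 6, 7):
    print(f"{length} chars: {36 ** length:,} possible ID codes")
# 5 chars: 60,466,176
# 6 chars: 2,176,782,336
# 7 chars: 78,364,164,096
```

Adding symbols grows the alphabet and multiplies these numbers further, so squatting every variant of even one popular name quickly becomes impractical.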
Whether you’re using bookmarks, ID codes, a search feature, or whatever, having an AI to help manage it would be useful. The bookmarks could be stored as simple XML or CSV, just like they are when exporting or importing from Firefox. Then it’s just a matter of sharing bookmark files with other users, like you would playlists or something. The AI could simply act as an easy-to-use UI and formatting tool, so you don’t have to manually edit the file and can just say, “Add this link with this info to my favorites.”
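A sketch of that sharing workflow in Python, using CSV as the interchange format. The field names and the sample addresses are my own placeholders, not any real Autonomi format:

```python
# Hypothetical bookmark sharing: store bookmarks as CSV rows, export
# them like a playlist, and merge a friend's list into your own.
import csv
import io

def add_bookmark(rows, name, xor_address, tags=""):
    rows.append({"name": name, "address": xor_address, "tags": tags})

def export_csv(rows):
    """Serialize bookmarks to a CSV string you could share as a file."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "address", "tags"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

def merge(mine, theirs):
    """Import a friend's list, skipping addresses we already have."""
    seen = {r["address"] for r in mine}
    return mine + [r for r in theirs if r["address"] not in seen]

mine, theirs = [], []
add_bookmark(mine, "Autonomi Forum", "a3f9...e71b", "forum")
add_bookmark(theirs, "Autonomi Forum", "a3f9...e71b", "forum")
add_bookmark(theirs, "Cat Videos", "9c4d...02aa", "video,cats")
combined = merge(mine, theirs)
print(len(combined))  # 2 -- the duplicate forum entry is skipped
```

The AI layer would just sit on top of functions like these, turning “add this link to my favorites” into an `add_bookmark` call.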
The website is a concept the same age as DNS. We basically use the web as an interface to do two things: to access data from some source, and to order some service (both online and in the real world).
I don’t see AI as a search and bookmarking tool for websites. What AI can do is interact directly with the databases and APIs of service providers and prepare a personalized user interface for whatever you want to do at the moment.