Better go and polish that then…
Thanks for validating rather than following the LLM output. Few people bother to do that for obvious reasons and it’s a problem.
I suspect few people are smart enough to use LLMs effectively which makes them vulnerable to both simple errors and manipulation (all existing LLMs are centralised and that is an even bigger problem - even if you run them locally, if you can’t validate the training data you can’t expect the output not to be deliberately tainted).
Back to BIP v RFC, I don’t plan to dig into the details - maybe others will.
Yep, since the ’80s at least.
Not only that, but the majority of training data already has a bias. Initially it was 18–25-year-old, white, nerdy, unmarried males in Silicon Valley, usually without girlfriends and often with a sheltered upbringing.
However “well trained” it has become, it is still biased due to the nature of the information put out there. Very few pages of plain truth, and hundreds or thousands of theories around the facts, or conspiracies about the facts.
In this environment, with the owners of LLMs racing to get their trained models out there, there is no way to give them a measure of reality; they just hope Wikipedia and the millions of blogs and Reddit and the rest are factual enough to overcome the errors.
For example, if enough people said RFC stood for “request for consideration”, then at some stage LLMs would say that instead of “request for comments”.
I went through the various BIP procedures documents (BIP-1, BIP-2, BIP-3, and BIP-123) and have drafted AIP-1: AIP Purpose, Process, and Guidelines. I checked this into a new repo on the safenetwork-community github: aips
This is a line-by-line walk-through of BIP-3, merging in BIP-123, with changes focused on Autonomi. Having gone through it, the document is very thorough and well written. I’m honestly impressed that the Bitcoin folks went to this much trouble; we should take advantage of that.

There are a handful of FIXMEs related to licensing options (we need to decide what licenses we want to support for AIPs and associated code), as well as a blurb about how to document/classify Autonomi network-layer AIPs, especially regarding how to handle potential network forks. We also need to define a list of people willing to take on the role of AIP editor to manage requests as they come in. The role is defined in the document, but it mostly amounts to making sure documents are in the proper format, handling abandoned AIPs, and managing pull requests.
I think this is a viable path forward. What do you guys think?
I’m going through the comments on how to build this authentication app, trying to understand as best I can. There is a lot of talk here about siloing keys/data between applications, with application IDs used to derive keys. I do see certain use cases for this, especially with regard to application configuration, but I would also note that, where possible, data should be able to port between applications.
For example, if I’m using application A to store all of my pictures and application B for all of my music, and then application C comes out that does everything, I want the ability to open up application C, see/interact with all of my stuff, and have it just work. And I should be able to see anything I build in application C in my A and B setups. Today there aren’t metadata standards that would enable something like this, but there may be one day. All that to say, I don’t think it is a good idea to derive all keys by application ID, unless we reserve an ID like ‘0’ as “common data” or something.
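To make the idea concrete, here is a minimal sketch of per-application key derivation with a reserved “common data” ID. Everything here is hypothetical: the master seed, the ID string `"0"`, and the hash-based derivation are illustrative stand-ins, not Autonomi APIs; a real scheme would use a proper KDF (HKDF, BIP-32-style derivation, etc.).

```python
import hashlib

# Hypothetical values: the master seed would come from the user's wallet/identity,
# and "0" as a reserved common-data ID is just the suggestion above.
MASTER_SEED = b"example-master-seed"
COMMON_APP_ID = "0"

def derive_app_key(master_seed: bytes, app_id: str) -> bytes:
    """Derive a per-application key from the master seed and an app ID.
    Illustrative only; a real scheme would use a proper KDF."""
    return hashlib.sha256(master_seed + b"/" + app_id.encode()).digest()

# Each application gets its own namespace...
key_a = derive_app_key(MASTER_SEED, "app-A")
key_b = derive_app_key(MASTER_SEED, "app-B")
assert key_a != key_b

# ...but every application can derive the same reserved "common data" key,
# so data stored under it is visible to all of the user's apps.
common_key = derive_app_key(MASTER_SEED, COMMON_APP_ID)
assert common_key == derive_app_key(MASTER_SEED, COMMON_APP_ID)
```

The point is just that the derivation is deterministic: any app holding the master seed can recompute the shared namespace key, while per-app keys stay separate.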
At least that’s how I’m understanding the conversation so far, let me know if I’m not getting it
It’s good to be aware of the MaidSafe RFCs and the community CEP/RFPs; maybe it’s worth taking from our own backyard what we already have here.
Thanks for the links, I’ll dig into these.
I went through the MaidSafe RFC process and the Rust RFC process (which it very closely mirrors). A very good read; I appreciate you sending me the links. I didn’t know where the MaidSafe RFC process lived, and it looks like it has been dormant for several years. Here are my thoughts:
The RFC process is tightly coupled with MaidSafe: an RFC is created, some back and forth occurs in the community and with MaidSafe devs, and if all goes well, it is accepted by MaidSafe. It is very centralized around the core development team and the network (or language, in Rust’s case) the team is building, with its purpose being to update some function of the network protocol itself. In the earlier days this made a lot of sense, because we were trying to get the network running, so all work would need to go through MaidSafe.
But now we have a live network and a community of devs building upon it. The function of the improvement process has grown in scope, so we’re going to be less and less concerned with core network and API updates and much more interested in application-related topics and standards. MaidSafe will become much less central, to the point where, as a community, we may often not interact with them at all. So tying a community proposal management system to a single entity will not only become a bottleneck as we gain developers, but could even be a liability if something were to happen to that entity.
That is all to say, we could update the RFC verbiage and process to include a greater community stake (it looks like it hasn’t been touched in several years), but I would argue: why bother, when the Bitcoin folks already did it? In the end, it is my opinion that a modified Bitcoin BIP-style process will be more applicable for the Autonomi community in the long term than the Rust language RFC process.
That will be down to your applications to store the data in a way that other apps can utilise. If the files/data are private, then the only data to share between apps is the datamaps to the files.
There should be no reason you cannot give the “datamap” store a UID, so that an app can ask the authentication app for the “datamap store” data key. The user then responds to the authentication app to grant permission (with an option to add that app to the list of approved apps that can get access, so next time there is no need for the user to give permission).
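A toy model of that flow, just to pin down the moving parts: an app asks the authentication app for another app’s datamap-store key, the user approves, and the approval can be remembered so later requests skip the prompt. All class, method, and key names here are made up for illustration; this is not a real Autonomi API.

```python
class AuthApp:
    """Toy model of the permission flow described above. Illustrative only."""

    def __init__(self):
        self._datamap_keys = {}  # app UID -> datamap-store key
        self._approved = set()   # (requester, owner) pairs the user already approved

    def register(self, app_uid, datamap_key):
        self._datamap_keys[app_uid] = datamap_key

    def request_access(self, requester, owner, ask_user):
        pair = (requester, owner)
        if pair not in self._approved:
            if not ask_user():        # prompt the user for permission
                return None
            self._approved.add(pair)  # remember, so next time no prompt is needed
        return self._datamap_keys.get(owner)

auth = AuthApp()
auth.register("app-A", "datamap-key-A")

prompts = []
def approve():
    prompts.append(1)
    return True

# First request prompts the user; the second is served from the approved list.
assert auth.request_access("app-B", "app-A", approve) == "datamap-key-A"
assert auth.request_access("app-B", "app-A", approve) == "datamap-key-A"
assert len(prompts) == 1  # the user was only asked once
```

The design point is that the authentication app is the single choke point for consent: end applications never see each other’s keys except via an explicit, user-approved hand-off.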
Sharing data doesn’t require sharing private keys, so portability of data is a separate issue.
And does the CEP have anything worth taking from there, what do you think?
I’m not necessarily talking about sharing data. If I go make some scratchpads with application A, I would want to be able to find them and interact with them in application B. I may not be sharing them with anyone else.
Yes, good point!
I read through the CEP description and the comments. It does sound like an interesting concept to layer on top of this, and I don’t see it as necessarily competing. Maybe the bamboo garden fund could be used to fund/incentivise proposal development and discovery? That said, it reminds me of the IF challenge snafu we just went through, so I have a somewhat bitter taste about voting with tokens at this point.
I do like the structure of how it interacts with the forum by binning proposals into separate categories. In the end, it is my opinion that the forum should be the central hub for discussions, rather than starting a mailing list or doing this on Discord.
Ah, that makes sense. So each application could have its own ID and derivation path, but the authentication app can pass application A’s main key/path to other end applications requesting access. It wouldn’t be automatic, you’d have to approve it. Did I get that right? If so, I like it. You get the best of both worlds: separate name spaces when you want it, security of key chains by default, discovery of data by request.
The only downside I can potentially see is that we will have a lot riding on this one app. This thing better be rock solid or it could take the whole scheme down.
Exactly, I just wondered if we can get some good ideas from there.
Hmm, I think the voting was done by clicking “Vote” on a topic in the “Ideas” category, not with tokens.
Hmm, this makes me think of somehow ensuring decentralization, or at least de-siloing. Maybe we should think about how we could migrate content to another platform… or use platforms that have APIs, so we could maintain a mirror somewhere else in case GitHub / Discourse went down or @Southside lost access to GitHub.
Very good point. I guess I should say, the forum is the central place UNTIL a viable Autonomi replacement is ready.
I would consider making @loziniak and @zettawatt (and possibly others) owners of the safenetforum-community alongside @Josh, @JPL and @aatonnomicc.
We won’t all lose access, but even so, a mirror off of GitHub is a good idea.
This is the principle I use for ryyn file sharing:
Each file is encrypted with a key specific to it, scoped under the app, etc.
A link can be sent to the recipient, with a payload encrypted to the recipient’s key, containing a) the address of the file, and b) the file secret key to decrypt it.
You can see my example here: GitHub - oetyng/ryyn
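The scheme above can be sketched in a few lines. To stay self-contained this uses a toy SHA-256 counter-mode stream cipher as a stand-in for real authenticated encryption, and a shared symmetric key as a stand-in for the recipient’s public key; none of this reflects ryyn’s actual implementation, just the shape of the link flow.

```python
import hashlib, json, secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode stream cipher, illustrative only.
    Symmetric: applying it twice with the same key decrypts."""
    out = bytearray()
    for i in range(0, len(data), 32):
        block = hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[i:i + 32], block))
    return bytes(out)

# 1. Encrypt the file under its own random key (scoped under the app in ryyn).
file_key = secrets.token_bytes(32)
ciphertext = xor_cipher(file_key, b"the file contents")
file_address = hashlib.sha256(ciphertext).hexdigest()  # hypothetical content address

# 2. Build the link payload: the file's address plus its secret key, encrypted
#    to the recipient (a shared key stands in for their public key here).
recipient_key = secrets.token_bytes(32)
payload = json.dumps({"address": file_address, "key": file_key.hex()}).encode()
link = xor_cipher(recipient_key, payload)

# 3. The recipient decrypts the payload, fetches the file by address, decrypts it.
info = json.loads(xor_cipher(recipient_key, link))
assert info["address"] == file_address
assert xor_cipher(bytes.fromhex(info["key"]), ciphertext) == b"the file contents"
```

The nice property is the one noted above: sharing never exposes the sender’s private keys, only the single per-file key carried inside the encrypted link payload.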