Undoubtedly
Well, I would focus then only on coin transfers and private data if I were Maidsafe. Leave the public part for web3 projects (they can store their public data as private on Safe). Or maybe some anonymous community can create a public data layer on top of Maidsafe's Safe Network.
One issue I see here is that all aspects of the network can become illegal over time. If encryption becomes illegal, then this tactic won't work again.
Just want to note that the biggest issue here is governments censoring the development of the network itself. If they can blackmail MaidSafe into including this file censorship in the protocol, then they have too much power and the network isn't autonomous.
I have nothing against attempts to get the super bad content off the network in general, but judging by this thread there isn't an agreed-on way to do that, so what we have is a discussion on how to change the protocol to something that the supporters of the network don't actually want.
But that government control isn't just limited to this, and they'll try to meddle again in the future. Back when this project started, there wasn't quite so much overreach, but now there is and it's set to get worse. Whatever happens with file censorship on network launch doesn't matter, as long as there is a clear path to decentralisation of development and potential removal of censorship afterwards.
It seems that the discussion has stopped, so I will speak again. I would like to comment on deduplication, since it was mentioned in the topic of the last update.

Deduplication doesn't seem like such a good idea to me now. We found out that there is a possibility of damage to the integrity of a file if censorship algorithms are created by someone. If the owners of the nodes that store all copies of at least one chunk each install software that censors the network, all copies of that chunk can be deleted at the same time. Thus, the file may be corrupted. The larger the file, the more likely it is to be corrupted.

I understand that deduplication can significantly reduce the cost of uploading files to the network, but as a counterweight another algorithm could be created: one that increases the security of data integrity. The idea is not to remove "extra chunk copies", but to include more chunk copies in the overall list if the file has been uploaded to the network more than once. In this way, it would be possible to better protect the integrity of a file from being corrupted by external censoring software. I understand that this idea is likely to be rejected by the developers, but I could not help expressing it.
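For what it's worth, here is a minimal sketch of that idea, assuming a hypothetical per-chunk replica-count table (the names, base factor and cap below are purely illustrative, not anything Maidsafe has specified): instead of discarding a duplicate upload, the network could raise the target replica count for that chunk.

```rust
use std::collections::HashMap;

/// Hypothetical content address of a chunk (e.g. a hash of its bytes).
type ChunkAddress = [u8; 32];

const BASE_REPLICAS: u32 = 4; // assumed default replication factor
const MAX_REPLICAS: u32 = 16; // cap so popular chunks don't grow without bound

/// On a duplicate upload, bump the desired redundancy instead of ignoring it.
fn record_upload(targets: &mut HashMap<ChunkAddress, u32>, addr: ChunkAddress) -> u32 {
    *targets
        .entry(addr)
        .and_modify(|c| *c = (*c + 1).min(MAX_REPLICAS)) // re-upload: more copies
        .or_insert(BASE_REPLICAS) // first upload: default factor
}

fn main() {
    let mut targets = HashMap::new();
    let addr = [0u8; 32]; // placeholder address for illustration
    for _ in 0..3 {
        println!("target replicas: {}", record_upload(&mut targets, addr));
    }
    // Prints 4, 5, 6: every extra upload of the same chunk raises its redundancy.
}
```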
If all copies of a file's data are deleted, then obviously the file can't be recovered.
But reasoning about the possibility that all nodes do this at the same time does not produce useful conclusions.
There is no sense in the network existing under such circumstances.
It is better to assume that some percentage of nodes will damage data and the other nodes will act as they should.
From such a model, the required amount of duplication can be derived.
If nodes delete data frequently, then more copies are needed; if such events are rare, then having fewer copies is fine.
"Every node stores every file" and "store only a single copy in the whole network" are equally bad; something in between is needed.
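As a rough illustration of deriving that from the model (a back-of-the-envelope sketch with assumed, illustrative numbers, treating each replica holder as misbehaving independently):

```rust
/// Probability that every replica of a single chunk lands on a misbehaving node,
/// assuming each of the `replicas` holders misbehaves independently with probability `p_bad`.
fn chunk_loss_probability(p_bad: f64, replicas: u32) -> f64 {
    p_bad.powi(replicas as i32)
}

fn main() {
    for &replicas in &[3u32, 5, 8] {
        for &p_bad in &[0.05_f64, 0.2, 0.5] {
            println!(
                "p_bad = {:.2}, replicas = {} -> per-chunk loss probability {:.2e}",
                p_bad,
                replicas,
                chunk_loss_probability(p_bad, replicas)
            );
        }
    }
}
```

The more often nodes are expected to delete or damage data, the higher p_bad, and the more replicas are needed to keep that probability acceptably small.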
If censorship of some kind is implemented, it's implemented. If you dislike it, better to ensure it is not implemented. Messing around with deduplication misses the point: don't allow censorship, or find efficient ways (e.g. increased replication) to nullify it.
I'm not sure of the maths, but even for large files I suspect it would require a significant proportion of all nodes to opt in to censorship of a particular file to ensure its deletion. Which in a way is a form of consensus.
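Extending the per-chunk estimate sketched earlier in the thread to a whole file (again with assumed numbers, purely to get a feel for the maths): a file is only lost if at least one of its chunks loses every replica.

```rust
/// Probability that a file of `chunks` chunks loses at least one chunk entirely,
/// given a fraction `p_bad` of censoring nodes and `replicas` copies per chunk.
fn file_loss_probability(p_bad: f64, replicas: u32, chunks: u32) -> f64 {
    let per_chunk = p_bad.powi(replicas as i32);
    1.0 - (1.0 - per_chunk).powi(chunks as i32)
}

fn main() {
    // Assumed numbers: a large file of 10_000 chunks, 8 replicas per chunk.
    for &p_bad in &[0.1_f64, 0.3, 0.5, 0.8] {
        println!(
            "censoring fraction {:.1} -> file corruption probability {:.3e}",
            p_bad,
            file_loss_probability(p_bad, 8, 10_000)
        );
    }
}
```

With those illustrative parameters, a small fraction of censoring nodes leaves even a large file very likely intact, while corruption only becomes probable once a substantial share of nodes opts in, which supports the "form of consensus" reading.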
Also, the network won't be the sole holder of most data, so it's not necessarily lost. If it is important, it's likely to be recoverable somewhere.
I'm not arguing for censorship here; like others I think that any filtering should be application layer and under the control of individuals. My point is that we don't need to be as concerned as some are, because as things stand it is not likely to be implemented and should not be used to divert the team or change the implementation.
The point of this debate is to anticipate legal and regulatory moves which might undermine the Safe fundamentals and counter them.
Please read this carefully peolple!
The threat model should include both digital and physical activities.
The original thread reads like "wow, we found out that we have not only digital but also physical threats".
That doesn't mean the digital threats are gone, of course.
I suggest Maidsafe start expressing themselves, their goals and aims much more exactly and literally. No more ants, nature, and "soon". Please pay attention to detail, including spelling and grammar. We don't need abstract dreams of "goodness". We need unbreakable code.
(peolple → people)

Deduplication doesn't seem like such a good idea
It is an inherent attribute of the design. The chunk's address is determined from its content. If a chunk of data is stored, then trying to store the same chunk always results in the same place to store it.
Yes, for private data the hashing of the content includes a salt (the user ID, from memory) to produce a different address. But that is the only difference.
Changing this would require a redesign of the storage, away from this very simple but efficient method of determining the address.
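A minimal sketch of that addressing scheme, using Rust's non-cryptographic std hasher purely for illustration (the real network derives addresses with cryptographic hashes via self-encryption, so treat the details here as assumptions):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

/// Illustration only: derive a chunk's address from its content,
/// optionally salted (e.g. with a user ID) for private data.
fn chunk_address(content: &[u8], salt: Option<&[u8]>) -> u64 {
    let mut hasher = DefaultHasher::new();
    if let Some(salt) = salt {
        salt.hash(&mut hasher);
    }
    content.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let chunk = b"same bytes";
    // Public data: identical content always maps to the same address -> deduplication.
    assert_eq!(chunk_address(chunk, None), chunk_address(chunk, None));
    // Private data: the salt makes the address differ per user, so no cross-user dedup.
    assert_ne!(
        chunk_address(chunk, Some(&b"alice"[..])),
        chunk_address(chunk, Some(&b"bob"[..]))
    );
    println!("addresses are a pure function of content (plus optional salt)");
}
```

Because the address is a pure function of the content (plus any salt), deduplication falls out for free: a second upload of identical public content targets exactly the same address.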
For this discussion the main issue is to find a solution to the regulatory requirements that either minimises the effect or eliminates the need for network changes to be made.
For me, having chunks of a horrific file on the network is not the same as having the file available; the datamap is needed to make the file a file. A chunk is gobbledygook otherwise.
@dirvine If the solution must be node side and not in the client (sigh - it should be in the client), then why worry about the chunks? Be concerned only with the datamap (and metadata). If the datamap of a horrific file is refused storage on the network, that is equivalent to the file not being stored.
That way, if a chunk happens by some quirk to be the same as a chunk in a horrific file, the good file will not be destroyed.
@dirvine Important - correct me if I am wrong, but if you have a file of many chunks, and each chunk is encrypted using the chunk before and the chunk after it, then it is possible to have a file where a few sequential chunks are replaced to make a horrific file. For instance, a good, legal image file has a portion replaced to turn it into a horrific file (byte count kept the same, e.g. a JPG with a face replaced by an underage one); only a portion of the chunks will change (the replaced ones plus one before and one after, or possibly the rest), so some chunks remain the same.
This means that I could upload perfectly legal images of nudes or whatever, and someone changing a bit of one would leave many chunks unchanged and only some changed. The chunk list of the horrific file would then result in the good chunks of the legal file being deleted, because they also exist in the bad file.
This is like the earlier example where a modified file only sees some chunks replaced, except here it is the specific case I outlined.
This would have to be unacceptable, since good files would be destroyed because a horrific file happened to share some of the same chunks.
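To make that concrete, here is a toy model of neighbour-keyed chunk encryption. It is not the actual self_encryption implementation - the key derivation below is a deliberately simplified assumption using the std hasher - but it shows the effect: replacing one plaintext chunk changes only that ciphertext chunk and its immediate neighbours, and every other chunk stays identical.

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

fn h(bytes: &[u8]) -> u64 {
    let mut hasher = DefaultHasher::new();
    bytes.hash(&mut hasher);
    hasher.finish()
}

/// Toy stand-in for neighbour-keyed encryption: each chunk's ciphertext depends on
/// its own content plus the content of the chunk before and the chunk after it.
fn encrypt_chunks(chunks: &[&[u8]]) -> Vec<u64> {
    (0..chunks.len())
        .map(|i| {
            let prev = if i > 0 { h(chunks[i - 1]) } else { 0 };
            let next = if i + 1 < chunks.len() { h(chunks[i + 1]) } else { 0 };
            let mut hasher = DefaultHasher::new();
            (prev, h(chunks[i]), next).hash(&mut hasher);
            hasher.finish()
        })
        .collect()
}

fn main() {
    let original: Vec<&[u8]> = vec![
        &b"c0"[..], &b"c1"[..], &b"c2"[..], &b"c3"[..], &b"c4"[..], &b"c5"[..],
    ];
    let mut tampered = original.clone();
    tampered[3] = &b"evil"[..]; // replace one chunk in the middle of the file

    let a = encrypt_chunks(&original);
    let b = encrypt_chunks(&tampered);
    for i in 0..a.len() {
        println!("chunk {i}: {}", if a[i] == b[i] { "unchanged" } else { "changed" });
    }
    // Only chunks 2, 3 and 4 change; chunks 0, 1 and 5 are identical in both files,
    // which is the shared-chunk problem described above.
}
```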
But in the end, as @happybeing says, the point is the regulatory system and working towards a method that does not require censorship within the nodes. Maybe, if doing it in the client is not acceptable to the devs, then censoring the datamap is the cleaner solution, since it does not destroy encrypted chunks that may belong to other files as well.
Many formats allow garbage at the end of the file, by the way.
So you can take a large legal file (.zip), replace its start with an illegal file (.jpg), and lots of programs won't complain about it.

@dirvine If the solution must be node side and not in the client (sigh - it should be in the client)
We just don't know yet; there may be no issue at all node side, and handling all of this client side may be fine. It's a huge investigation really.

@dirvine Important - Correct me if I am wrong but if you have a file of many chunks and chunks are encrypted using the chunk before and after to encrypt the chunk then it's possible to have a file where a few sequential chunks are replaced to make a horrific file
Yes, just like changing an actual file, but in our case you need to alter the data map, and data map holders won't accept your data map. So it should be a non-issue in terms of chunks. But if you can inject a new data map, it could really be anything. How you would do that is not clear, though, unless you directed folk on a website (like a register) to a new link and then showed horrific content there.
I posted the Strategy Aims we are building plans around a few days ago, which are important in the context of this discussion. Because we all want the project to meet its objectives. We all need a better Internet. My hope is that you all manage to have a read of that and let the implications soak in.
A lot has changed in the 16-odd years since this project started. It's a long time in the history of the Internet. In fact, the Safe Network's conception is closer in time to the launch of the world wide web than it is to the present day.
I say this to make the point that the context and environment we will be launching the Network into matters. It matters in how it will be received, perceived, used, adopted, attacked, and abused too.
The Network is intended to be a shared resource. A knowledge commons.
Commons can be fragile things that need to be continually nurtured and tended to. This isn't a new challenge, or even a technological one… it's sociological in nature. That commons could be a rice paddy, or a drinking-water well.
Yes, the network will be autonomous, but it won't be all-powerful, and it will always have humans defining and organising the parameters of its operation, use, development, and its future.
Those people could be acting with good intentions, or with malice, and their decisions could have foreseeable, or unforeseen consequences. But they are still decisions taken by humans, for the benefit of other humans. The key is how decentralisation works to distribute power rather than concentrate it.
There still are, and there must be, mechanisms for the Network to adapt, change, and course correct over time based on the needs of humans. We aren't making an indestructible robot, or a virus - we are making a shared resource that is owned by humanity, and it must be answerable to humanity. The question is: how does humanity articulate those questions and demands? That is the problem to be solved.
We are having this discussion in the light of the fact that we do not have the answers yet. Neither do any other teams and projects grappling with the same problem.
But we must work diligently and responsibly on it, and face up to it directly in good faith in order to solve that problem. As it's not going to just go away with "This One Neat Legal Trick", or some launch tactic, or technology alone. It's rolling-up-sleeves time.
I have an answer.
Build a decentralized, content-neutral protocol/technology, walk away from it (provide no centralized ownership target), and let the chips fall where they may.
It is that, or become just another controlled, regulated, captured, censored entity.
That's how I see it.
This project needs to decide if it is cypherpunk or not. There isn't really a middle path.

This project needs to decide if it is cypherpunk or not. There isn't really a middle path.
After their public statements regarding this topic you should already know this is not a cypherpunk project.
My stand is that nodes and the network should be neutral, and then Maidsafe should make a client with a system for filtering out downloads for each jurisdiction - just like TCP, UDP, FTP, HTTP, HTTPS etc. are all neutral!
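A minimal sketch of what such client-side filtering could look like, assuming a hypothetical per-jurisdiction denylist of datamap identifiers maintained outside the network (every name and type here is illustrative, not an actual Maidsafe API):

```rust
use std::collections::{HashMap, HashSet};

/// Hypothetical identifier for a file's datamap (e.g. its content hash, hex-encoded).
type DataMapId = String;

/// A client-side filter: the network stores and serves everything,
/// while each client applies the denylist for its own jurisdiction.
struct ClientFilter {
    denylists: HashMap<String, HashSet<DataMapId>>, // jurisdiction -> blocked datamaps
}

impl ClientFilter {
    fn allows(&self, jurisdiction: &str, datamap: &DataMapId) -> bool {
        self.denylists
            .get(jurisdiction)
            .map_or(true, |list| !list.contains(datamap))
    }
}

fn main() {
    let mut denylists = HashMap::new();
    denylists.insert(
        "EX".to_string(), // illustrative jurisdiction code
        HashSet::from(["bad-datamap-hash".to_string()]),
    );
    let filter = ClientFilter { denylists };

    // Downloads are refused locally, at the client; nodes never inspect content.
    assert!(filter.allows("EX", &"good-datamap-hash".to_string()));
    assert!(!filter.allows("EX", &"bad-datamap-hash".to_string()));
    println!("filtering happens in the client, the network stays neutral");
}
```

The design point is that nodes keep storing and serving opaque chunks unconditionally; any legal obligation to filter is satisfied locally, in whichever client each person or jurisdiction chooses to run.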

build a decentralized, content neutral protocol/technology
Wasn't that the point from the beginning? Maidsafe has taken a wrong turn if they concern themselves with any 3rd-party content whatsoever.

The Network is intended to be a shared resource. A knowledge commons
Wrong. The Safe Network itself was supposed to be a decentralized collection of computers that just manage chunks of random noise. If the storage technology and communications are ultimately secure, then it may be possible for a knowledge commons or perpetual web to emerge and grow on top of that. Content was only supposed to exist at the client end points. Clients are ultimately responsible for, and in control of, what they upload/download, and are accountable to their local laws and regs for it.
They aren't concerning themselves with content; they're concerned with getting the network built in an increasingly hostile environment.

They aren't concerning themselves with content
I'm perplexed… then what is the topic title of this thread? Dealing with an increasingly hostile environment, or something else?
Maybe read the thread?