The new legislation changes the rules of the game for Autonomi

Ah, back to native token! :sweat_smile:

OG token investors have had many opportunities to sell out, at a profit, from their original purchase prices.

Holding a company to what it promised a decade ago is unreasonable, especially as folks haven’t been forced to hold. The last year made selling even easier.

BTTF folks excepted, but that is another story.

If the world was full of rainbows, we would all be unicorns.

2 Likes

You can yawn at "the native token" thing a thousand times and it still doesn’t change two facts:

An anonymous, instant-transaction, non-blockchain native token was one of the key parts of the planned and proclaimed network design/function for ten years.

Claiming now that people could have sold many times during a decade in which a significantly different product was promised reeks.

Seriously:

  1. You make it sound as if the company made a one-time promise some time in 2014. In reality, the company was promising the native token for ten years. But even if your attempt at twisting facts were correct, what would it matter? A promise is still a promise.
  2. Folks surely haven’t been forced to hold, but they have been misled to believe the company would deliver, and that the company would deliver something specific, as whitepapered.
  3. Last year made selling even easier, hence the company is off the hook?

You make things worse, and instead of successfully disguising the ugly truth, you actually make it even uglier.

3 Likes

The team has said for months now that a native token isn’t on the product plan. It was obvious at least a year ago, probably two, that it wasn’t a priority either. They talked with exchanges too, who said no.

How much more information is needed?

If folks are mainly here for native token, they should probably look for the exits too. No point going over and over it.

I get that people hoped for native. I was one of them. However, the priority has long been data storage first.

Since 2014, the digital currency landscape has changed vastly. The space is crammed with well connected fintech businesses. Payments have evolved, laws have evolved, outcomes have changed.

I am sorry that some people feel hoodwinked. I’ve dealt with much worse, tbh. If folks feel betrayed, just sell up and leave… move on.

3 Likes

For me personally, and I think for some on the forum here, the issue isn’t necessarily that we want native. I applaud the team for taking the Arbitrum direction for multiple reasons, to name a few:

  • Focus on core product (data storage) first
  • No vulnerability risk in payments
  • Exchange access

What bothers me personally is the extreme lack of transparency when it comes to questions regarding the blockchain fee. A few examples:

  • The team mentioned a Merkle tree could cover payment for 4 GB of data. Right now, we’re only seeing 1 GB.
  • The team mentioned fees would be near gone with the Merkle tree, but a Merkle tree payment is $0.12/GB right now. That’s not near gone; that’s very much significant.
  • They mention that they’re working on improving it much further, but when asked what they expect the cost can come down to, they don’t have an answer.

I strongly believe that the team is very well intended here and can make leaps forward. But what I don’t get is the secrecy around the subject. I think these questions are quite simple to answer, right?

Will a Merkle tree pay for 1 GB or 4 GB of data in the future? Either 1 GB, 4 GB, or we don’t know yet.
When you mentioned blockchain fees would be near gone, was $0.12 per Merkle tree payment the aim? Yes, no, or we don’t know.
You mentioned you’re working on lowering the cost much further. Is there any indication of how far the fees can be reduced? 10%, 90%, or we don’t know.
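For readers wondering why a Merkle tree would reduce per-chunk blockchain fees at all: the idea is batching, where one on-chain transaction commits to many chunk payments via a single root hash, and each chunk can later prove its inclusion with a logarithmic-size proof. Below is a minimal, generic sketch of that mechanism; the hash function (SHA-256), the duplicate-last-node padding, and all names here are illustrative assumptions, not Autonomi’s actual payment scheme.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Hash helper; SHA-256 chosen purely for illustration."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Fold a list of leaves up to a single root hash."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                      # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def inclusion_proof(leaves, index):
    """Collect (sibling_hash, leaf_is_left) pairs from leaf to root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        proof.append((level[index ^ 1], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Recompute the root from one leaf plus its sibling path."""
    node = h(leaf)
    for sibling, leaf_is_left in proof:
        node = h(node + sibling) if leaf_is_left else h(sibling + node)
    return node == root

# One root commits to eight chunk payments; each chunk needs only
# a 3-hash proof, so the on-chain cost is paid once, not per chunk.
chunks = [f"chunk-{i}".encode() for i in range(8)]
root = merkle_root(chunks)
proof = inclusion_proof(chunks, 3)
assert verify(chunks[3], proof, root)
```

The point of the sketch is the cost model: the proof grows with log2(number of chunks), so batching more chunks under one root amortizes the fixed blockchain fee, which is presumably what "near gone" was meant to describe.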

Instead what we’ve got is silence and vague answers. For me personally, that is worrying. Not just because it adds a level of uncertainty, but also because I know anyone who has been a fan of native now has another reason to raise the pitchforks, and that’s not helping the team/community relationship.

And then for me personally, as a Discord moderator, how do I moderate when people ask the same question a thousand times, if a thousand times they have not received an answer? I decide to let them ask the question over and over as long as it’s raised in a constructive manner. But then that demoralizes/pisses off some team members: "Because it’s been asked a thousand times".

2 Likes

There’s no strong argument against having a native token. Even from an investor’s point of view, if they wanted to make a quick buck there are hundreds of crypto projects they could have invested in; success in crypto isn’t about offering a great product. Nobody uses Storj. I want a good product.

1 Like

Seems like the priority now is LLMs and Crypto, not the data storage. The plan changed again.

2 Likes

Well, you said holding a company to what it said a decade ago was unreasonable. I felt the need to point out that saying that was in fact unreasonable.

I agree, though, that people who disagree with the new direction are better off doing something else than complaining, which won’t really change anything. Even legal action is probably better for their own sake, if they feel like taking their best shot.

Autonomi 1.0 literally lost a bunch of data a few months ago. I’d call that argument pretty compelling.

(Not to mention Autonomi 1.0 has now been sunsetted)

1 Like

Just asked this on Discord; thought I’d ask here as well. I was thinking about Fae only running on powerful, expensive Macs (for now) and believe it should be this way. Has any other possibly groundbreaking tech ever run on average hardware straight away?

1 Like

You’re correct, it was to be expected and while it was claimed this would change within months, some of us said that was unlikely (three years ago - and nothing has changed in this respect).

Right now, apart from tweaking nodes, which can be done effectively in other ways, I’m not aware of anything useful this LLM will be able to do even on a high-end machine - which means it’s for a tiny number of users. This is supposed to be for everyone, remember.

One day maybe, but it makes no sense to be focusing effort on this and tainting the project further, unless there are some real wins here.

What we need is a functioning network. Even something pretty limited, but reliable would be fantastic. But we keep going for fantastic things in unrealistic timescales again and again.

3 Likes

Is Fae not future-proofing the network? Like it or not, Mark, this AI stuff is here to stay. I think we need to see what 2.0 is like before we condemn it, don’t we?

3 Likes

I’m not really sure what to make of all this yet.

I will say I didn’t really manage to run nodes on 1.0.

If I’d spent time actually looking at the cli I feel sure I’d have got there in the end.

Maybe 2.0 would help many who would otherwise not supply / use the network.

I don’t have anything Apple; I could acquire some, but it seems many others don’t either.

Native would have been amazing, but surely there are solutions out there, even if not imagined as yet.

1 Like

Did try it two weeks ago on a Windows machine with (an older) NVIDIA card; it’s still there in its GitHub repository, along with a compiled version, "fae-windows-x86_64.exe".

2 Likes

I spent a lot of time thinking about Fae over the past few days.

Immediate impressions:

How dare David betray us all by concentrating on rich man’s toys like Apple Silicon MacBooks?
Do we really need this?

Then I thought some more, and came to the same conclusion as @scottefc86.

David is using the most capable tools he has; he just happens to be more fortunate than most and has a high-end Mac. These Macs work extremely well with LLMs; possibly it was what they were explicitly designed for. So of course he will use the most appropriate kit for the job in hand. If he can prove both the concept and the practical demonstration of Fae on a MacBook, he’s done better than anyone else so far. Prove the concept, kinda productionise it, see what breaks, fix it and learn. And once sufficient learning has been done (including whether Fae itself is worth the candle), then make it possible for work to begin on non-Mac kit.

David says it would take a few weeks of dedicated work to make a Linux port. DeepSeek and Claude differ in their immediate priorities, but both think it could be done by five or six second- or third-year CS students, each working on their own area, for a few months (a semester).

I very much doubt that David would (or could) stand in the way of any team that took up that challenge and nobody can accuse him of a lack of documentation.

So I will allow time for a period of lamentation and hand-wringing, and then be back to try and encourage all you properly brainy kids to work together to see what can be done whilst David finishes work on the Mac Fae (sounds like something out of a latter Pratchett story; I’m thinking this is no coincidence :slight_smile:).

AI is not going to go away. We need to take it on and master it ourselves, or be overtaken and succumb to what the bankers (we’ll just call them that for now) really want for AI. David is hopefully showing us that it can be taken on and used on our own terms, with the security that we desire.

I’m just pissed off that I don’t have a suitable MacBook and don’t see myself affording one anytime soon. The best I could quickly find with >24 GB was just under £1500: MacBook Pro M3 2023, 14-inch, 36 GB, 500 GB | eBay UK.

NONE of this stops us running Autonomi 2.0 nodes on our existing kit. It promises to be a lot more straightforward than Autonomi 1.0 with NAT hassles removed and other enhancements.
Maybe for 99% of users it will work out of the box? Remember, the days of running 1000+ nodes are over except for a VERY few. Tools like anm and Formicaio may not be needed at all, or used purely for reporting rather than management.

So let’s just let the dust settle a little, see what is actually delivered with Autonomi 2.0, and let’s try to be nice to each other here, or we will lose many more besides the dear departed @anon26713768.

5 Likes

@jane

Let me put it bluntly…

Are you ever going to come up with something constructive, or are you simply going to hang around like a bad smell, moaning like buggery and never actually doing anything for the project?

You are obviously technically aware, most probably quite competent across a range of skills if I have read your posts correctly.
Ever thought about what you could contribute other than snark from the sidelines?

Ask your own choice of AI to look at the README from Fae and ask it how it would go about porting Fae to Linux/Windows.

Let 1000 flowers bloom. - Mao

That’s an argument not to store your data on there, in which case your blockchain tokens were also lost/wasted.

Bitcoin transactions would be a big one. Plus Steam, Napster, Netflix, Tor, torrents, Nostr, Veilid, and almost every other novel p2p network being worked on today. All of these ran on potatoes from day one. I believe this was/is a key element of their success. Veilid actually makes a point that its protocol has to be able to run on phones, since for much of the world that is the main way they access anything on the internet. But even people who own or could afford high-end Apple MacBooks probably don’t want to use them for many of the things Autonomi is intended for: uploading photos, archiving, blogs, etc. Is Fae really groundbreaking anyway? It is an off-the-shelf LLM (Qwen) with some custom configs, like countless other openclaw/opencode/ollama-type apps that use local models.

3 Likes

From David on discord

It’s really effing ridiculous that I have to cross-post here because some folk are so ideologically hidebound that they won’t go on Discord themselves.

But I do because I want to keep this community together.

2 Likes

Rejecting Discord is hardly ideological.

3 Likes

What are your non-ideological reasons for rejecting Discord, then?

Actually, don’t bother, ’cos I’m not sure it matters.

Hopefully those who may see benefit in a Linux/Windows version of Fae can read what David said there and choose if they are interested or not.