Testing

Concerns have been raised about testing, or rather the lack of it.
We all know the devs are busy busy busy ants these days and cannot do everything we would wish.
I think we now have a window of a few days in which to discuss if/how the community can step up and lend a hand.

I’d like to hear from the rest of you on the following points:

  • Is the concept of a community testing initiative feasible?
  • If so, how should we go about this?
  • Testing uploads will cost attos, should we appeal to Autonomi for some help here?
  • How many folk do you think we could get to commit to 10-15 minutes’ work each day until TGE, with or without incentives - allowing for holidays of course?

Obviously an awful lot more to discuss - I’d love to hear your thoughts.

6 Likes

I should add I have bought the world’s cheapest domain to use if necessary for downloads, record keeping, whatever…

http://wellitwascheap.online

Clicking the above will prove underwhelming for now.

While feasible, it takes organisation. It requires a number of people to commit to it, and realistically that is the hard part. If you thought business meetings were bad at getting anything done, community organisation is certainly no better.

Well ideally we need to be testing against the latest stable release.

A testing plan needs to be drawn up. People need to list the parts they can test, with many running them at the same time to create stress.
E.g.:

  • uploading of many small files at once. This tests the rate of chunk storage, not data volume but record volume
  • uploading of large many GB files at same time
  • uploading a range of sizes at same time
  • uploading a range of sizes while others are downloading many files (at the same time)
  • all tests need to be timed with the unix “time” command (e.g. “time autonomi files”), and it is best to have prepared scripts to do this
  • and other tests people have like site browsing
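The tests above could share one small timing wrapper. This is just a sketch: the commented `autonomi files upload` invocation is an assumption about the CLI, so substitute whatever the real upload command turns out to be.

```shell
#!/usr/bin/env bash
# Sketch of a reusable timing wrapper for the tests listed above.
run_timed() {
    local label="$1"; shift
    local start end
    start="$( date +%s )"
    "$@"                                   # run the command under test
    end="$( date +%s )"
    echo "Task ${label} start:${start} end:${end} elapsed:$(( end - start ))" >> test-report.txt
}

# Hypothetical real usage (CLI invocation is an assumption):
#   run_timed small-files autonomi files upload ./many-small-files/
# Harmless stand-in so the sketch runs anywhere:
run_timed demo sleep 1
```

Every tester then produces report lines in the same format, which makes collating results much easier.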

It is likely we will have to limit it to Linux, since Windows has its known issues, making testing perhaps not so reliable except by Windows experts.

Definitely, and gas too, as many will not have any/enough.

I’d say we need coordinated times to start a script for each set of tasks, and if “time” is placed before the command we can get the time taken for each test.

So we need to decide what tests, then someone to build a set of scripts that will allow the person to start the script and walk away. Then at worst the user will have to note down the times taken. Of course, if running scripts, then using $SECONDS allows the script to calculate the seconds taken to do something and write that out to a report file.
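The $SECONDS approach mentioned above can be sketched in a few lines; here a sleep stands in for the real upload or download task.

```shell
#!/usr/bin/env bash
# Minimal sketch of the $SECONDS approach: reset the built-in counter,
# run the task, and append the elapsed time to a report file.
SECONDS=0                              # bash resets the counter to 0
sleep 2                                # stand-in for the real task
echo "Task xyz took ${SECONDS}s" >> report.txt
cat report.txt
```

Because $SECONDS is maintained by bash itself, the script needs no external timing tools and the report line can be written without any arithmetic.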

5 Likes

upload to wellitwascheap.online and I’ll collate them from there

thanks - saves me writing that out :slight_smile:

Absolutely, and this is my big fear - it will be too little too late - but if we get enough interest then maybe it’s a worthwhile task.

Also
I’d suggest that we don’t restrict it to Linux only - we can only (easily) get times from Linux boxes, but having Windows users adding to the overall load should be OK.

5 Likes

I’d be happy to help. Maybe you could set up a poll asking people to answer yes if they can commit to helping with this. If it is a script that could be set to run on a given day and provided to everyone, then people with time commitments could just start the script running in the morning, and it would deploy the uploads etc. at given pre-set times. Just some thoughts.

4 Likes

Thank you

Yep this is what I’m thinking. :slight_smile:
How is your scripting? Are you any use with PowerShell? Cos I’m not.

At its simplest, we would need someone to look at the Linux scripts and redo them in PowerShell without the time reporting - unless of course that could be easily handled in PS.

I’ll put a poll up tomorrow if I see much more interest.

I have not used Powershell. If we could get a Linux script going, someone may volunteer to do the Windows side, but they should have ample lead time.

1 Like

Doesn’t Windows 11 have a Linux subshell that can be run? I’ve seen people do stuff using Linux in that system on Windows 11, and it’s not a VirtualBox sort of thing but built into Windows.

2 Likes

yes, I forgot about that - never used it myself

To start a script at a certain time, to the second:

starttime="17........"  # unix time stamp for the time you want to start
nowtime="$( date +%s )"
sleeptime="$(( starttime - nowtime - 30 ))"   # allow 30 seconds as a buffer
if (( sleeptime > 0 )); then sleep $sleeptime; fi
nowtime="$( date +%s )"
while (( nowtime < starttime )); do sleep 1; nowtime="$( date +%s )"; done

starttask="$( date +%s )"
############# do the task
endtask="$( date +%s )"
taskelapsedtime="$(( endtask - starttask ))"

echo "Task xyz start:$starttask  end:$endtask   elapsed:$taskelapsedtime" >> test-report-file

Then rinse and repeat for more tasks at their set time.

Suggest a large gap between tasks start time since some computers/nodes will take a lot longer than others.

5 Likes

If there are no problems with being a few seconds out, then we can get rid of the 30-second buffer.

I’ve done too many real-time jobs where to-the-second timing was important, so I just write code that way.

I am sure there are simpler ways to do this; that was just a quick way that’s simple enough.

2 Likes

I’m ready to go when you guys are. I’m following the thread.

2 Likes

Yes, that is correct. I don’t usually use Windows, but I do have one running and just installed Linux from PowerShell. You do have to do that one-time install, at least on my version of Windows 11. It is called WSL: “Windows Subsystem for Linux.” What is Windows Subsystem for Linux | Microsoft Learn

I guess now I can say I have used PowerShell.

3 Likes

I’ll be starting on Introducing Verifi, regularly ensure uploaded files aren't lost or corrupt in a couple of days. If anyone has good ideas as to how timing (performance) could be added to it I’m open to ideas.

I haven’t put much time into thinking about testing performance because Verifi is primarily focused on integrity, but what if a run of it optionally saved download times to serve as a baseline so subsequent runs could report deltas?

2 Likes

tl;dr: I’m happy to help, if some of the others are ready to pitch in.

Obviously feasible, though six months too late, and we have to ask ourselves what we are looking to achieve - dragging some of the “issues” into the open might get “push back”. If it’s not clear by now, Bux will launch this at TGE and there is nothing we can say or do to change that; we will be labelled again as complaining, negative, [… long list of negatives… ]

GitHub would be my preference - code and process need to be out in the open for scrutiny, and trying to code via a forum would be a nightmare, though the forum is good for coordination.

I would like to see a blueprint on GitHub covering the objectives of this test, and ideas around testing, so people who agree to help know what the aim is.

We need a communicated kill switch, probably via GitHub, that can stop the test via code.

We need some test plans that can be coded into something cron-style that we can use to coordinate the test agents.

I can see an “agent” being put onto a tester’s machine that would download a test from GitHub and then run it on schedule. The output from the test needs to go somewhere, so maybe upload that back onto Autonomi and then share the public link onto the blockchain, or back onto GitHub.
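The agent idea above, together with the kill switch mentioned earlier, could look something like this. It is only a sketch: the repo URL and the fetch/upload steps are assumptions (shown as comments), with local stand-ins so it runs offline.

```shell
#!/usr/bin/env bash
# Sketch of a community test "agent" with a simple kill-switch check.
set -euo pipefail

WORKDIR="$(mktemp -d)"

# Kill switch: a real agent would check for a STOP flag in the repo, e.g.
#   curl -fsS "$REPO_RAW_URL/STOP" >/dev/null 2>&1 && exit 0
# Local stand-in:
if [ -e "$WORKDIR/STOP" ]; then
    echo "kill switch active - aborting"
    exit 0
fi

# A real agent would fetch the current test with:
#   curl -fsSL "$REPO_RAW_URL/test.sh" -o "$WORKDIR/test.sh"
# Local stand-in so the sketch runs offline:
printf '#!/usr/bin/env bash\necho "test ran"\n' > "$WORKDIR/test.sh"
chmod +x "$WORKDIR/test.sh"

"$WORKDIR/test.sh" > "$WORKDIR/result.txt"
cat "$WORKDIR/result.txt"   # a real agent would upload this to Autonomi/GitHub
```

Because the agent re-fetches the test script on every run, publishing a new script (or the STOP flag) to the repo is enough to retask or halt every participant.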

Someone with Grafana skills can then scrape the test results to give us some metrics.

We should target Linux first; other OSes can come later, but it makes the dev really complex trying to code for everything.

Bash and Python would get us something basic, other languages are available :hourglass_flowing_sand:

No - we’ve asked for the last 9 months for some developer attos and they aren’t interested. We should community-fund this: I would say a community ERC-20 address that people can donate their attos to.

The testing needs to supply agents with ERC-20 addresses and load them for participants. To be clear, no one should be using their proper ERC-20 wallets or addresses for this; it’s just not “safe and secure” at the moment, as private keys are exposed.

Long term, we could even think about a community reward scheme, where we provide attos back to agent runners, but that will have to wait for native…

Not many - the timing is so bad. Holidays are coming, so resources are limited…

What I would say is that I see a future for community testing. I can see an agent being run on the machines of people who opt in, allowing distributed stats on the network to be collected and displayed. We can track upload times, download times, CRC checks for data corruption, and many more. This is good, as it will help grow confidence in the network - don’t trust, verify…
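The corruption check mentioned above can be sketched very simply: hash the file before “upload” and again after “download”, then compare. Here the network round trip is simulated with a local copy; a real check would hash the file fetched back from the network.

```shell
#!/usr/bin/env bash
# Sketch of a checksum-based integrity check (sha256 in place of CRC).
src="$(mktemp)"; dst="$(mktemp)"
echo "sample payload" > "$src"
before="$( sha256sum "$src" | awk '{print $1}' )"
cp "$src" "$dst"                        # stand-in for upload + download
after="$( sha256sum "$dst" | awk '{print $1}' )"
if [ "$before" = "$after" ]; then
    echo "checksum OK"
else
    echo "CORRUPT: $before != $after"
fi
```

Logging the "before" hash at upload time means any later download, by anyone, can verify the data independently.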

5 Likes

Pretty much agree with all that.

I will be AFK most of the day, will respond later tonight

You know I’m in, as long as it’s just copy and pasting :joy:

1 Like

For the first year, :sweat_smile: this is why I’m hoping we get NRS asap :see_no_evil: for 1 attos :upside_down_face: my friend with :muscle:250K+ users is actually waiting for this

It would be useful if someone (not me) would make a video explaining how to farm, how to upload, how to…

Would be fun if someone showed Formicaio and Awe sum, then my friend will sign the :dotted_line_face:

Can your friend run Decker?

Huh, you mean Docker? No, not even I can run Docker :sweat_smile:

Would be fun though, if there were some video tutorials of how-to’s.