Testing

Sorry, should have said: I can’t run docker, yet… Sure there are online tutorials

But this was more my point, that we need more online video tutorials to get new people playing with this Network.

:beers:

1 Like

Plenty of vids on running Docker.
A vid showing Formaciao on Docker should be short and doable.
Look on YouTube and get yourself Docked.
Formaciao on top is dead simple

1 Like

This much is true and we have to deal with that.
Our job is to forewarn the team before Joe Public tells them if there is a problem.
As for being labelled “negative”, I’ve been called worse and fully expect to be so again before teatime…

Indeed, and I will try to get something up tonight. Kinda difficult; I burnt my hand badly the other day. Just back from having it re-dressed, and it's a lot more restrictive when typing, plus the painkillers are slowing down the braincells.

I think there is a danger of overcomplicating this in the short term. We just want zillions of folk hammering the network. If we can do that in a semi-structured way, that's good. If we can do that and gather some performance data, that's better.

Let's not over-complicate this too soon.

All of what you say is true, especially if we were having this convo last June. I feel guilty for not having made noises sooner, but we are where we are and we need to do what's best for the network right now.

3 Likes

At its simplest, it's perhaps someone putting a couple of nodes online together with EVM, and client/node code modified to run on this easily, or a guide on how to compile/run manually for this setup.

Compiling the client and node from source would probably greatly limit the tester count, but I can't imagine a bunch of people trusting and downloading some unofficial binaries either…

Maybe we don't need the official Arbitrum blockchain; just make a local setup available on some VPS? Then on test attos/gas, an address could be set up with a private key available to everybody to fill their clients with. Or a forum thread with a list of addresses, then someone sending attos regularly to all of them?

100% agree, no binary distro, for transparency and trust. I can probably see a docker build, if people don't want to run a python code environment.

Maybe a bit much to code quickly; I was just planning to use a wrapper around the autonomi supplied binaries, which is also a good test for them.

Oh no, sorry to hear that. Don't feel you have to do anything until that has healed; health is more important.

Apologies to all - my burnt hand and associated painkillers plus some other hassles have meant there has been no progress on this over the past couple of days. Also looks like I will have a lot less time to spare than I had hoped before the next network goes up.
I hope we can get some co-ordinated testing done, but I will only be able to join in at best, not help organise as I had hoped. Apologies.

PS If anyone can use the wellitwascheap.online domain and hosting to move this forward, DM me for access details. Otherwise I will cancel the hosting and use the domain for something else eventually.

It's a hyper-v thing in disguise…

1 Like

No worries. I've had a quick fiddle today and chucked some python boilerplate on github for a testing agent. It's a long way from being usable, since with no network online all the autonomi client calls are failing, but with the next network online in a few days I might get a chance to update it to a point where it could be alpha'd. With the holidays upon us, though, that might be in January or later…

I've written a 60-minute scheduler with downtime windows, so it's not a denial of service, along with a kill-switch to terminate all distributed agent tests, driven from an XML test plan. Tests run in 50 minutes of that 60-minute schedule; dates and hours are ignored for scheduling, just minutes matter (TZ). I've defined three test types: a download of defined file sizes (tiny, medium, huge etc., defined in a CSV file), a quoting task (boilerplate), again based on defined file sizes, and an upload task with EVM integration (boilerplate).
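
For the curious, a rough sketch of that minutes-only windowing idea. This is a hypothetical illustration, not the agent's actual code; `RUN_MINUTES` and both function names are made up:

```python
from datetime import datetime, timezone

# Hypothetical sketch of a minutes-only scheduling window: tests run
# during the first 50 minutes of every hour and the last 10 minutes
# are a downtime window. Dates and hours are ignored; only the minute
# of the hour matters (UTC here, to sidestep TZ questions).
RUN_MINUTES = 50  # tests may run in minutes 0-49 of each hour

def in_test_window(minute: int) -> bool:
    """Return True if tests are allowed to run at this minute of the hour."""
    return 0 <= minute < RUN_MINUTES

def should_run_now() -> bool:
    return in_test_window(datetime.now(timezone.utc).minute)
```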

The agent is using threads, so it runs parallel tasks, and the test definitions let the number of worker threads be defined (although I've put some hard caps in at the moment, as the autonomi client calls appeared to be a CPU hog). I've got a ton of rate-limiting code missing, and that needs to be in place before the agent can be trialled.
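
For what it's worth, the missing rate limiting could look something like this sketch: a hard cap on concurrent workers plus a minimum interval between task launches. All names and numbers here are assumptions, not the agent's real API:

```python
import threading
import time

# Hypothetical sketch of the kind of rate limiting still missing from
# the agent: a hard cap on concurrent worker threads plus a minimum
# interval between task launches.
MAX_WORKERS = 4      # hard cap on parallel tasks
MIN_INTERVAL = 0.5   # minimum seconds between task launches

_slots = threading.Semaphore(MAX_WORKERS)
_launch_lock = threading.Lock()
_last_start = 0.0

def run_task(task):
    """Run a callable, respecting the launch interval and worker cap."""
    global _last_start
    with _launch_lock:
        wait = MIN_INTERVAL - (time.monotonic() - _last_start)
        if wait > 0:
            time.sleep(wait)        # throttle the launch rate
        _last_start = time.monotonic()
    with _slots:                    # blocks while MAX_WORKERS tasks run
        return task()
```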

Output is into (currently two) InfluxDB-formatted files with performance details: one is a rolling 1-minute snapshot roll-up, the other is per-task, per-thread, per-worker detailed stats (boilerplate), which might one day make it into a nice grafana dashboard.
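
For anyone unfamiliar with the format, a simplified sketch of emitting a stat in InfluxDB line protocol might look like this. The measurement, tag and field names are invented for illustration, and real line protocol has escaping and field-type rules not shown here:

```python
import time

# Simplified sketch of InfluxDB line protocol output:
#   measurement,tag1=v1,tag2=v2 field1=v1,field2=v2 timestamp_ns
# Tags and fields are sorted for a stable output order.
def influx_line(measurement, tags, fields, ts_ns=None):
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    ts = ts_ns if ts_ns is not None else time.time_ns()
    return f"{measurement},{tag_str} {field_str} {ts}"

line = influx_line("download", {"worker": "w1", "thread": "t3"},
                   {"bytes": 1048576, "secs": 2.7}, ts_ns=1700000000000000000)
# line == "download,thread=t3,worker=w1 bytes=1048576,secs=2.7 1700000000000000000"
```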

As per one of my early comments, I’m not completely sure what the aim is here, but if someone has ideas of what needs testing, then it would be great to hear.

Jad

7 Likes

That’s some work!

Once you’ve alpha tested it, how easy would it be to distribute to everyone who wants to participate in testing to use as something they can easily download and run, with perhaps pre-specified files that they will be uploading/downloading?

If you scroll up on this thread, Neo posted some criteria for what we might want to be testing.

1 Like

Perhaps if you published it on GitHub, someone could help with pull requests or review?

3 Likes

Pull requests are always welcome, but what's there at the moment isn't worth it; it was a quick few hours of boilerplate to see what's possible :slight_smile: I've got a new branch I'm working on that puts in much better thread safety and sorts out the classes, now I have a better idea of what might be fun.

#edit: now on community github account

I've written it in python, so there is no binary; it's self-compile and run. That can either be a local python install script or a docker build; to be fair I've not thought that far ahead :laughing:

The download file pointers are currently stored in a csv format on github, which points them back onto the autonomi network with filename, ant address, md5 hash and file_size. I would love to store the source on the autonomi network and use registers to reference them, but that might have to wait…
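
As a guess at what a row of that pointers file might look like (the column names and the ant address value here are purely illustrative, not the real file's schema):

```python
import csv
import io

# A guess at one row of such a download-pointers CSV; parsing it with
# the stdlib csv module is then trivial.
sample = (
    "filename,ant_address,md5_hash,file_size\n"
    "test_tiny.bin,a1b2c3d4,0cc175b9c0f1b6a831c399e269772661,2048\n"
)

rows = list(csv.DictReader(io.StringIO(sample)))
assert rows[0]["filename"] == "test_tiny.bin"
assert int(rows[0]["file_size"]) == 2048
```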

2 Likes

You have now :slight_smile:

Anybody else wanting access to the community hub, please DM me with your Github username or associated email.

2 Likes

Don't forget that files have to be unique in most cases, since uploading the same file results in dedup happening. That would be a separate test on its own, to test dedup.

Either the python script generates its own files, large and small, or everyone is given a different linux variant to upload the iso files from.

2 Likes

Quick update, I’ve had a few spare hours today to bump the main branch.

The code is still very poor, but I’ve had a chance to flush out some errors, and it “should” compile.

The EVM upload branch is not merged at the moment, as I can’t test it, so the code is still boiler plate.

Been working on a dashboard, to allow agent stats to be viewed as well…

The plan is still to have something that others can try before TGE; however, my time is limited on this.

When it’s usable I’ll push an announcement with all the instructions needed to run it, for those who are keen to try :slight_smile:

If people have files they have uploaded, that they have made public and don’t mind them being used for testing, feel free to post the Autonomi network address.

5 Likes

OK I have a thought on some testing for the genesis network.

At this time the network is a huge bunch of nodes doing sod all. So I have thought up a series of tests that people can run on most linux machines. Simple scripts they will be, and the more people who run them the better the testing, since there are nearly 10 million nodes out there.

The first script is perhaps the easiest. It simply requests quotes continuously until the user stops it. It can also do multiple quotes in parallel, with the only parameter being the rate: how many seconds between quote requests. The script also cleans up the files.

Requirements

  1. best for the script to have its own subdirectory
  2. the client installed (the ant application).
    install it with antup client
  3. download the zip file from here and unzip it in the directory.
    Hopefully people will examine the tiny script file and see it doesn’t do anything sneaky. Do not trust anyone!
  4. adopt the slogan “Resistance is Not Futile”. Let the resistance against free loaders rise up :sweat_smile:

quote-exercise.zip (1.7 KB)

#
#
#    Script to request quotes from random nodes
#
#    quote-exercise -r 1.234
#
#    It generates a semi random data file, small at less than 2000 bytes, that is then used to
#    ask for quotes from the network. This will increase the load on the network as a whole, but not
#    in a too aggressive way.
#
#    Just one machine running this script will have little impact, but the more people that run it the
#    better the stress testing that will occur.  It is not a be-all-end-all stress test but one component,
#    with another being the downloading of existing files.  The difference here is that the downloading
#    only affects nodes that have the files and in this genesis network that is a really small number 
#    of nodes.
#
#    The script will issue quotes at the requested rate by putting each quote request into the background.
#    Thus there will be overlapping quotes occurring at the same time.
#
#    This requires the filename used to be different for each quote requested and obviously
#    needs cleaning up afterwards
#
#
#################
#    
#    parameters
#        -r secs   - rate in decimal seconds 
#                    Each quote will be issued (approx) every secs seconds after the previous one
#                    quote requests are sent as a background task and output sent to /dev/null
#
#
#
#
#################
#
#    NOTE: in addition to the 3 chunks being quoted on, there is a minimum of 5 nodes for each
#          chunk that are asked to give quotes.  That is, a minimum of 15 nodes will be asked
#          for a quote.
#
#          This will generate a lot more traffic across the network than is being experienced
#          with near zero requests happening in this genesis network
#
#
#

########################################################################################################
#    Some declarations and ensure sub directories are there
########################################################################################################

declare -i i j k quoteCnt

if [[ ! -d tmp ]];      then mkdir tmp;      fi

########################################################################################################
#    Check for rate of quoting - too fast and system will start falling over itself
########################################################################################################

if [[ ! -z $1 ]] && [[ ${1} == "-r" ]]
then
    if [[ ! -z $2 ]] && ( [[ ${2} =~ ^[1-9][0-9]*[.]?[0-9]*$ ]] || [[ ${2} =~ ^0\.[0-9]*$ ]] )
    then
        quoteRate="${2}"
    else
        echo "ERROR: invalid quoting rate supplied.  Decimal number required" >&2
        exit
    fi
else
    quoteRate="5.0"
fi


########################################################################################################
########################################################################################################
#
#    FUNCTION: do quote and clean up after itself.  Meant to run in the background
#
########################################################################################################
########################################################################################################
#
#    parameters
#    1 - quoteNo - Sequential number supplied to keep runs separated
#


function doQuote {

declare -i i j k

    if [[ -z $1 ]];                      then echo "ERROR: error in parameters to quoting function ($1)" >&2; exit; fi
    quoteFile="tmp/${1}.file"
    logFile="tmp/${1}.logs"

#	ensure files do not exist from a previous aborted run
	rm -f $quoteFile $logFile

#	make a semi random set of text
	echo "$SECONDS $( date ) $RANDOM $RANDOM" > $quoteFile
	echo "$(( $RANDOM * $RANDOM ))  $( date +%s%N ) $RANDOM" >> $quoteFile

#	use client to request a quote and direct all output to temp file
    ant --log-output-dest stdout file cost $quoteFile > $logFile 2>&1
	
#	cleanup
    rm -f $quoteFile $logFile

#	exit background task
    exit
}
    



########################################################################################################
########################################################################################################
#
#        MAIN PROGRAM
#
########################################################################################################
########################################################################################################

quoteCnt="0"
while (( 1 == 1 ))
do
       
    quoteCnt="$(( quoteCnt + 1 ))"
    echo -e -n "\r$quoteCnt:       "
    doQuote $quoteCnt >/dev/null &
    sleep $quoteRate
    
done 

16 Likes

If people want to actually upload these test files then I’d suggest two changes

  1. find a file that is approximately 10MB in size, place it in the directory, and add a line in the script after line 98 to append the big file to the quoting file. EG
    cat your-big-file >> $quoteFile
  2. remove the “cost” from the ant command, so that
  ant --log-output-dest stdout file cost $quoteFile > $logFile 2>&1

becomes

    ant --log-output-dest stdout file $quoteFile > $logFile 2>&1
  3. add a token to the wallet you add/create for this. Use ant wallet --help to see how wallets work

If there is demand for it, I will do another script for this, and also make the files public and have it build a file of the datamap addresses, so another script can be made to continuously read the files from the datamap list.

Thus people can post their datamap address files here, and those can be combined for even more files to download. This will test the genesis network even more.

In a year's time these few TB will be like a grain of sand in the network, not much space at all when we have hundreds of millions of nodes out there.

6 Likes

Great to see some action! Unfortunately I’m AFK until Monday, but will surely join the ranks then.

2 Likes

Excellent idea and script!
I am running it at the moment. How can I display the quote obtained from the network in the console? Not super important; it's just to have something to look at in addition to the quote count :slight_smile:

2 Likes

what?!?!

you mean

isn’t good enough for you? :slight_smile:

@neo this script of yours is quite heavy on resources

maybe change the default rate for folks without Ryzens and 32GB of RAM :slight_smile:

I'm going to try quote-exercise -r 8 and see if my box feels less like jogging through 12 inches of treacle.

See if you can spot when I changed that parameter

Treacle depth is down to a couple of inches now :sweat:

2 Likes