MaynardMiner/SWARM

Issue with nanominer-n-1 script....

UserDC-LeGrand opened this issue · 8 comments

Noticed an issue with the nanominer-n-1 script, which passes wrong arguments (none [octopus]); this prevents the miner from starting and gets it banned from SWARM execution...

nanominer-n-1 6 00 Hours 00 Minutes 5 Times

nanominer none [octopus] wallet=1234567890-abcdefg.rig3-SWARM pool1=octopus.auto.nicehash.com:9200 webport=9091 logPath=/hive/miners/custom/SWARM.3.6.4.linux/logs/NVIDIA1.log rigPassword=x


I see the issue, will fix.

Actually, I don't see an issue, @UserDC-LeGrand.

The reason it looks like this is that nanominer accepts no arguments; SWARM writes a config file for it instead. That's why it appears that way in get active.

You need to check the config file in .\bin\nanominer-n-1 and see if there is an issue there.

https://github.com/nanopool/nanominer#configuration-file

Example configuration files for reference.
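The config file SWARM writes isn't shown in this thread, but its likely shape can be inferred from the argument string in the log above plus nanominer's README. The layout and key names below are a hypothetical reconstruction, not a dump of the actual file:

```ini
; Hypothetical reconstruction -- option names inferred from the
; argument string above; check the nanominer README for exact spelling.
webPort = 9091
logPath = /hive/miners/custom/SWARM.3.6.4.linux/logs/NVIDIA1.log
rigPassword = x

[Octopus]
wallet = 1234567890-abcdefg.rig3-SWARM
pool1 = octopus.auto.nicehash.com:9200
```

If the section header or one of the keys is malformed in the generated file, nanominer would refuse to start, which matches the behavior reported above.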

For the record, too: SWARM creates a swarm_start_[algo].sh in every miner folder under .\bin, which allows you to self-test miners without having to constantly restart SWARM.

So in .\bin\nanominer-n-1 there should be a swarm_start_octopus.sh, an executable script you can run to see the issue. You can also open/read that file to see how SWARM launches the miner.
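A hypothetical self-test session, using the install path taken from the logPath in the output above; adjust the SWARM version and miner folder to your own rig:

```shell
#!/usr/bin/env bash
# Hypothetical paths, taken from the logPath in the output above;
# adjust the SWARM version and miner folder to your own install.
dir=/hive/miners/custom/SWARM.3.6.4.linux/bin/nanominer-n-1
if [ -d "$dir" ]; then
  cat "$dir/swarm_start_octopus.sh"    # see exactly how SWARM launches nanominer
  bash "$dir/swarm_start_octopus.sh"   # run the miner by hand and watch the error
else
  echo "adjust dir to your SWARM install"
fi
```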

If you can do that and show me a screenshot of the error, that will help.


Nanominer benches for me at the moment. It is considerably slower than the t-rex version when compared via the get benchmarks command, so I'm not sure if you want to put effort into finding out why.

Thanks for all the pointers. BTW, your 3.6.5 /bin miner .sh scripts are pointing at the prior release's (now deleted) folder. For example, see the last line of swarm_start_ethash.sh:

#!/usr/bin/env bash
export LD_LIBRARY_PATH=/usr/local/swarm/lib64:/hive/miners/custom/SWARM.3.6.4.linux/bin/nanominer-n-1
sleep .1
export DISPLAY=:0
sleep .1
/hive/miners/custom/SWARM.3.6.4.linux/bin/nanominer-n-1/nanominer

Not really a bug: the scripts are rewritten each time SWARM launches the miner. It just means those miners haven't run since you updated.
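One way to spot launchers that still reference a retired folder is to grep the bin directory for the old version string. A sandboxed sketch, recreating the stale script from the comment above (all paths are illustrative; on a real rig you would grep your actual SWARM bin directory instead):

```shell
#!/usr/bin/env bash
# Sandboxed sketch: recreate the stale launcher quoted above, then grep
# for scripts that still reference the old SWARM.3.6.4 folder.
set -eu
bin="$(mktemp -d)/bin/nanominer-n-1"
mkdir -p "$bin"
cat > "$bin/swarm_start_ethash.sh" <<'EOF'
#!/usr/bin/env bash
export LD_LIBRARY_PATH=/usr/local/swarm/lib64:/hive/miners/custom/SWARM.3.6.4.linux/bin/nanominer-n-1
/hive/miners/custom/SWARM.3.6.4.linux/bin/nanominer-n-1/nanominer
EOF
# list launchers still pointing at the retired 3.6.4 folder
stale="$(grep -rl 'SWARM\.3\.6\.4' "$bin")"
echo "$stale"
```

Since the scripts are rewritten at every miner launch, anything this turns up simply hasn't run since the update, consistent with the explanation above.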

Closing because I can't reproduce, as noted above.