Some issue with locator related commands #135

Open
buhtignew opened this issue Jul 3, 2024 · 1 comment
Assignees
Labels
anomaly If something works not as expected

Comments

buhtignew commented Jul 3, 2024

At the moment I'm not able to check whether the locator-related commands are creating duplicates in my blocklocator.json file, so issue #117 is on standby for me for now.

The issue I got while trying to create duplicates is the following:
I ran token cache 539839eb5c3a3dbf1eb9b942f4a8126b58a7733efaf9d1f3ccba86904b6fcee3, then token cache 160679b53e6785664e75bc3cde5e1c41e88a9aacc8afcc92b641152f51dec959, and then address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -s 445930 -b 10000. After the blocks were processed I got the following error message:

General error raised by PeerAssets. Check if your input is correct.

and no entry in the blocklocator.json file was created.

Thinking my blocklocator.json file was corrupted, I renamed it and ran address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -s 445930 -b 1000. I got the same message, and no new blocklocator.json file was created.

However, when I renamed my blocklocator.json file back and ran address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -b 100, no error message was displayed and a new entry was created in my blocklocator.json file.

Thinking there was something wrong in what I had done so far, I tried to run token cache 160679b53e6785664e75bc3cde5e1c41e88a9aacc8afcc92b641152f51dec959 -b 100 and got the following message:

Storing blockheight locators for decks: ['160679b53e6785664e75bc3cde5e1c41e88a9aacc8afcc92b641152f51dec959']
First deck spawn at block height: 3339
Start block: 3339 End block: 3439 Number of blocks: 100
Processing block: 3400
 
General error raised by PeerAssets. Check if your input is correct.

and nothing has changed in the blocklocator.json file.

Then I was able to successfully run address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -b 1000 twice, although there were still no transactions in the blocks I processed.

Probably I'm doing something wrong, but I don't know what.

Right now I'm running address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP; as soon as I get the result I'll publish it here.

UPDATE:
The address scanning went smoothly.
The output after the blocks were processed was:

Stored block data until block 52100 with hash 00000004713e6a69aeb30eb95aa96cd2595fad02357fe09ece6e4ca0c6f4a315 .
Block heights for the checked addresses: {'mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP': [21987, 22003, 23190, 23191]}
Storing new locator block heights.

I was expecting block 454302 to be mentioned in the mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP address section of the blocklocator.json file, since the corresponding transaction bc8c92dfe02b6c9ed53c0add55a347318d7eba6a8cde138e386f1859763be6bc contains that address as receiver. But since the block is already mentioned in the mkwJijAwqXNFcEBxuC93j3Kni43wzXVuik address section, I assumed the block would somehow be taken from there.

So I checked the time needed to get the output of the transaction list mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -x -l -f 445930 -e 455622 command (after running address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP as above) and compared it with the time needed to get the same output using transaction list mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -x -f 445930 -e 455622. The only difference was that the former command reports block progress while scanning, like this:

Processing block: 446000
Processing block: 446100
Processing block: 446200
[... one line per 100 blocks ...]
Processing block: 455500
Processing block: 455600

while the latter command wasn't reporting anything while scanning.

So this time my hypothesis was that the blocks not mentioned in the address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP output (above) were not only absent from the blocklocator.json entry for that address, but were also not retrieved from the other addresses' scans, contrary to my previous assumption.

To verify, I ran transaction list mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -x -l -f 21986 -e 23192 and then transaction list mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP -x -f 21986 -e 23192, and the time needed to retrieve the output was very different: the first command needed about a couple of seconds, while the latter required about a minute.

So I'd say that the address cache mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP command hasn't stored all the blocks in which the address appears, only those not already stored for another address (mkwJijAwqXNFcEBxuC93j3Kni43wzXVuik in my case). This prevents taking advantage of previous scans to get quicker results when running commands that use the locator feature.
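To illustrate why incomplete locator entries defeat the purpose of the feature, here is a minimal sketch (hypothetical names, not the actual pacli/PeerAssets code) of the difference between a locator-assisted query and a full scan:

```python
# Hypothetical sketch: with locators (-l), a query only visits the cached
# block heights; without them, it visits every block in the range.
# Function and variable names are illustrative, not from the real codebase.

def blocks_to_scan(start: int, end: int, locator_heights=None):
    """Return the block heights a transaction-list query must visit."""
    if locator_heights is not None:
        # Locator mode: only the cached heights inside the range.
        return [h for h in locator_heights if start <= h <= end]
    # Full scan: every block in the range.
    return list(range(start, end + 1))

cached = [21987, 22003, 23190, 23191]          # heights from the output above
fast = blocks_to_scan(21986, 23192, cached)    # 4 blocks to visit
slow = blocks_to_scan(21986, 23192)            # 1207 blocks to visit
print(len(fast), len(slow))                    # 4 1207
```

This matches the observed timings: a few seconds with -l versus about a minute without, because the scanned set shrinks from the whole range to a handful of heights.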
UPDATE 09-07-2024
The token init command has the same issue when used with the -c flag.

d5000 commented Aug 25, 2024

The first bug was probably fixed with commit 1b65ff1.

The reason the error appeared only sometimes is that it always appeared when the message "Provided start block is above the cached range. Not using nor storing locators to avoid inconsistencies." was thrown (which, due to side effects, produced the "red" error).

This was, however, not intended for the first caching run of a new address: you should always be able to start caching an address from an arbitrary block, as long as it's the first time you cache that address, or after you erase its entry in blocklocator.json.
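The intended guard described above could be sketched roughly like this (illustrative only; the function and dictionary shape are assumptions, not the real implementation):

```python
# Sketch of the intended rule: a start block above the cached range is
# rejected, EXCEPT when the address has no entry yet in blocklocator.json.
# All names here are hypothetical.

def can_start_caching(locator: dict, address: str, start_block: int) -> bool:
    entry = locator.get(address)
    if not entry:
        # First caching run for this address: any start block is allowed.
        return True
    cached_top = max(entry)
    # Continuing an existing cache: the start block must connect to
    # the cached range instead of jumping above it.
    return start_block <= cached_top + 1

locator = {"cached_addr": [21987, 22003]}
print(can_start_caching(locator, "new_addr", 445930))     # True: no entry yet
print(can_start_caching(locator, "cached_addr", 445930))  # False: gap above cache
```

The bug was that the "new address" exception was missing, so even a first caching run with -s above block 0 triggered the error.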

I'll now look through the rest of the post after your first "UPDATE" and will update this post accordingly if I find and fix new bugs.


About your first update:

I was expecting the block 454302 being mentioned in the mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP address section of the blocklocator.json file since the corresponding transaction bc8c92dfe02b6c9ed53c0add55a347318d7eba6a8cde138e386f1859763be6bc contains that address as receiver,

As far as your reported commands show, you only seem to have cached the first 50000 blocks for the ...YUP address after you switched to a new blocklocator.json, so a block above 400000 would of course not show up there yet.

since the block is already mentioned in the mkwJijAwqXNFcEBxuC93j3Kni43wzXVuik address section I assumed the block would be being taken from there somehow.

If the addresses are cached separately, no entry in blocklocator.json influences another one. Only when you cache a whole deck or a list of addresses are they updated together.
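The per-address independence can be seen from the shape of the file. The following layout is an assumption inferred from the command output quoted above (each address mapping to its own list of block heights), not a verified specification of blocklocator.json:

```python
import json

# Assumed (not verified) shape of blocklocator.json, inferred from the
# output "Block heights for the checked addresses: {...}":
locator = {
    "mkwJijAwqXNFcEBxuC93j3Kni43wzXVuik": [454302],
    "mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP": [21987, 22003, 23190, 23191],
}

def heights_for(locator: dict, address: str) -> list:
    """Look up cached heights for one address; other entries are never consulted."""
    return locator.get(address, [])

# A height stored under one address is invisible to queries for another:
print(454302 in heights_for(locator, "mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP"))  # False
print(json.dumps(heights_for(locator, "mx5MdsenFDZufFuwT9ND7BKQBzS4Hy9YUP")))
```

Under this assumption, block 454302 being stored for mkwJ... simply has no effect on queries for mx5M..., which matches the behavior reported above.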


Having read the update, I think the problem is simply that you haven't cached all the blocks, only the first 50000, which was the default for this command. If you cache the whole chain, you will benefit from the locators when doing transaction list queries over the whole chain.

Having thought a bit about how to make some easy and useful improvements, I have made some changes to the address cache command in the last update:

  • I renamed --full option to --chain or -c. This is the option which will always cache the whole chain.
  • I added a --force option which can be shortened to -f. This is the reason why I renamed --full.
  • To be consistent with deck cache, I also renamed its --full option to --chain (the force option is not available there at the moment).

The --force option for address cache allows something that was previously not allowed: you can now cache discontinuously, for example from block 0 to 1000 and then from 2000 to 3000. This is meant for situations where you know the address was used at certain blocks.
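A rough sketch of what discontinuous caching with --force might look like at the storage level (the merge function and its semantics are my assumption, not the committed implementation):

```python
# Hypothetical sketch: heights found in a later, disjoint range are merged
# into the existing entry for the address, leaving a gap of unscanned blocks
# in between. Names are illustrative only.

def merge_heights(existing: list, new: list) -> list:
    """Merge newly found block heights into an address's cached heights."""
    return sorted(set(existing) | set(new))

entry = merge_heights([], [150, 900])        # first run: blocks 0-1000
entry = merge_heights(entry, [2500, 2750])   # forced run: blocks 2000-3000
print(entry)  # [150, 900, 2500, 2750]
```

The trade-off of such a scheme is that the gap (blocks 1001-1999 here) was never scanned, so the entry is complete only for the ranges the user actually cached; this is presumably why the option requires an explicit --force.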

Commit f375a49
