Loading a wallet with conflicts without a chain (e.g. wallet tool and
migration) would previously result in an assertion failure, because -1 was
used both as a valid number of conflict confirmations and as the indicator
that the member had not been set yet.
Github-Pull: #28542
Rebased-From: 782701ce7d
MarkConflicted calculates conflict confirmations incorrectly when both
the last block processed height and the conflicting height are negative
(i.e. uninitialized). If either is negative, we should not be marking
conflicts and should exit early.
Github-Pull: #28542
Rebased-From: 4660fc82a1
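For illustration, the early-exit rule described in the two commits above
amounts to the following (a minimal Python sketch with hypothetical names;
the real check lives in the C++ wallet's MarkConflicted):
```
def mark_conflicted(last_block_processed_height, conflicting_height):
    # Negative heights are the "not set yet" sentinel, e.g. when no chain
    # is attached (wallet tool, migration). No conflict depth can be
    # derived from them, so return early instead of asserting.
    if last_block_processed_height < 0 or conflicting_height < 0:
        return
    # ... otherwise compute conflict confirmations and update the
    # affected transactions (omitted) ...
```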
This commit adds tests to ensure that old fee_estimates.dat files
are not read and that fee estimates are periodically flushed to the
fee_estimates.dat file.
Additionally, it tests the regtest-only option -acceptstalefeeestimates.
Github-Pull: #27622
Rebased-From: d2b39e09bc
If the -acceptstalefeeestimates option is passed, stale fee estimates can
now be read when operating in regtest environments.
Additionally, this commit updates all declarations of the CBlockPolicyEstimator
class to include the second constructor parameter.
Github-Pull: #27622
Rebased-From: cf219f29f3
Old fee estimates could cause transactions to become stuck in the
mempool. This commit prevents the node from using stale estimates
from an old file.
Github-Pull: #27622
Rebased-From: 3eb241a141
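Taken together, the behaviour from #27622 can be exercised roughly as
follows (an illustrative functional-test sketch; the class name, timings,
and staleness assumptions are hypothetical, and building up estimates from
mined transactions is omitted):
```
import time

from test_framework.test_framework import BitcoinTestFramework


class FeeEstimateStalenessSketch(BitcoinTestFramework):
    def set_test_params(self):
        self.num_nodes = 1

    def run_test(self):
        node = self.nodes[0]
        # ... mine fee-paying transactions here so that fee_estimates.dat
        # actually contains data when it is flushed (omitted) ...
        # Assumption: restarting far enough in the future makes the
        # flushed estimates file count as stale.
        future = int(time.time()) + 60 * 60 * 60
        self.restart_node(0, extra_args=[f"-mocktime={future}"])
        # Stale estimates should not be used ...
        assert "feerate" not in node.estimatesmartfee(2)
        # ... unless the regtest-only escape hatch is passed.
        self.restart_node(0, extra_args=[f"-mocktime={future}",
                                         "-acceptstalefeeestimates"])


if __name__ == "__main__":
    FeeEstimateStalenessSketch().main()
```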
The migration process must skip any invalid scripts inside the legacy
spkm, and all the addressbook records linked to them.
These scripts are not being watched by the current wallet, nor should
they be watched by the migrated one.
IsMine() returns ISMINE_NO for them.
Github-Pull: #28125
Rebased-From: 8e7e3e6149
The legacy wallet allowed importing any raw script, without checking
whether it was valid, and appended it to the watch-only set.
This causes a crash in the migration process, because we only expect
to find valid scripts inside the legacy spkm.
These stored scripts internally map to `ISMINE_NO` (the same as if they
weren't stored at all).
So we need to check for this special case and take into account that
the legacy spkm could be storing invalid, non-watched scripts.
In code terms, this means IsMineInner() returning IsMineResult::INVALID
for them.
Github-Pull: #28125
Rebased-From: 1de8a2372a
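Schematically, the migration-time rule from the two #28125 commits above is
(illustrative Python with hypothetical names; the actual logic is in the
C++ wallet migration code):
```
ISMINE_NO = 0


def migrate_watchonly_scripts(legacy_scripts, address_book, is_mine):
    """Return the scripts to carry over, dropping invalid ones."""
    migrated = []
    for script in legacy_scripts:
        if is_mine(script) == ISMINE_NO:
            # An invalid script the legacy wallet never actually watched:
            # skip it and any address book records linked to it.
            address_book.pop(script, None)
            continue
        migrated.append(script)
    return migrated
```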
The functional test feature_taproot.py fails in some rare cases on the
execution of the `"branched_codesep"` spending script. The problem
occurs if the first data-push (having random content with a random
length in the range [0, 510]) has a length of 1 and the single byte has
a value of 1...16 or -1; in this case, the data-push is not minimally
encoded by the test framework's CScript class (i.e. it doesn't use the
special op-codes OP_1...OP_16 or OP_1NEGATE) and the script interpreter
throws a SCRIPT_ERR_MINIMALDATA error:
```
test_framework.authproxy.JSONRPCException: non-mandatory-script-verify-flag (Data push larger than necessary) (-26)
```
Background:
The functional test framework's CScript class always translates passed
bytes/bytearrays into data pushes using OP_PUSHx/OP_PUSHDATA{1,2,4}
op-codes. E.g. the expression `CScript([bytes([1])])` yields
`bytes([OP_PUSH1, 1])` instead of the minimal-encoded `bytes([OP_1])`.
Fix this by adapting the random-size range to [2,...], i.e. never pass
byte-arrays shorter than two bytes to be pushed.
Closes #27595.
Github-Pull: #27631
Rebased-From: 54877253c8
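For reference, the non-minimal encoding can be reproduced directly with the
framework's script helpers (illustrative snippet):
```
from test_framework.script import OP_1, CScript

non_minimal = CScript([bytes([1])])  # one-byte data push: 0x01 0x01
assert non_minimal.hex() == "0101"
minimal = CScript([OP_1])            # what MINIMALDATA requires: 0x51
assert minimal.hex() == "51"
```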
This allows submitted packages to be slightly larger than what
may be allowed in the mempool, to permit simpler reasoning
about contextless checks vs chain limits.
Github-Pull: #28471
Rebased-From: 533660c58a
In functional tests it is a quite common scenario to generate fresh
elliptic curve keypairs, which is currently a bit cumbersome as it
involves multiple steps, e.g.:
```
privkey = ECKey()
privkey.generate()
privkey_wif = bytes_to_wif(privkey.get_bytes())
pubkey = privkey.get_pubkey().get_bytes()
```
Simplify this by providing a new `generate_keypair` helper function that
returns the private key either as an `ECKey` object or as a WIF string
(depending on the boolean `wif` parameter) and the public key as a
byte string; these formats are what we mostly need (currently we don't
use `ECPubKey` objects from generated keypairs anywhere).
With this, most of the affected code blocks following the pattern above
can be replaced by one-liners, e.g.:
```
privkey, pubkey = generate_keypair(wif=True)
```
Github-Pull: #27733
Rebased-From: 1a572ce7d6 [partial]
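For context, the helper's shape is roughly the following (a sketch based on
the description above; the actual helper lives in the test framework's
wallet utilities):
```
from test_framework.key import ECKey
from test_framework.wallet_util import bytes_to_wif


def generate_keypair(wif=False):
    """Generate a fresh keypair; return the privkey as ECKey or WIF string."""
    privkey = ECKey()
    privkey.generate()
    pubkey = privkey.get_pubkey().get_bytes()
    if wif:
        return bytes_to_wif(privkey.get_bytes()), pubkey
    return privkey, pubkey
```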
Right now, the help for -torcontrol says that there is a default IP and
port, but it does not specify that if an IP is given without a port, we
would also use port 9051 as the default.
Github-Pull: #28101
Rebased-From: 9a84200cfc
During a reorg, continuing execution when a block cannot be
reversed leaves the coinstats index in an inconsistent state,
which was surely overlooked when 'CustomRewind' was implemented.
Github-Pull: #28427
Rebased-From: eef595560e
The test `feature_addrman.py` currently serializes the addrdb without
specifying endianness for `int`s, so the machine's native byte order is used (see
https://docs.python.org/3/library/struct.html#byte-order-size-and-alignment)
and the generated `peers.dat` would be invalid on big-endian systems.
Fix this by explicitly specifying little-endian serialization via the
`<` character in `struct.pack(...)`.
Github-Pull: #27529
Rebased-From: 53c990ad34
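A small illustration of the difference (hypothetical value; only the `<`
prefix guarantees the on-disk little-endian layout regardless of host):
```
import struct

value = 0x0badcafe
native = struct.pack("i", value)   # host byte order: differs on big-endian
little = struct.pack("<i", value)  # explicit little-endian, as peers.dat expects
assert little == b"\xfe\xca\xad\x0b"
```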