mirror of
https://github.com/bitcoin-s/bitcoin-s.git
synced 2025-03-13 11:35:40 +01:00
Node (#490)
* WIP: 2018 12 22 node project (#280)
* Add files from old spv node project: src compiling, test files compiling, ran scalafmt. Fix serializer tests. Get non-networking test cases to work. WIP: debug PeerMessageHandler. Update CRUD, remove all of the Actor craziness. Add DbManagement trait and unit test db. WIP: reworking PeerMessageHandler; create Peer, DataMessageHandler, PeerHandler. Rework Client to handle all tcp messages and message alignment for bitcoin p2p messages
* WIP: Node refactor
* Create node test project, move all node tests into that project and move all generators for the node project into testkit
* Rework ClientTest to use testkit, start minimizing akka usage, implement connect(), isConnected(), disconnect(), isDisconnected() in PeerMessageReceiver
* Create Peer, PeerHandler, PeerMessageSender and PeerMessageReceiver
* Update readme about status of node project (#359)
* Add flyway plugin to manage database schemas (#361)
* Add flyway plugin to manage database schemas
* Switch database driver to sqlite3 to be more portable, rework configs for sqlite3
* Set up sqlite database directories and files if they are not already created
* Add torkel's review
* Add chain, wallet, db-commons projects (#367)
* Add chain, wallet, db-commons projects
* Rework db creation logic if the db does not exist
* Add config logging to try to debug travis ci
* Pass explicit class loader for db config
* Remove duplicate call to dbConfig
* Make DbConfig.dbConfig a lazy val
* Remove noisy log
* Add scaladoc to DbConfig
* Switch dbConfig readme paragraphs
* Fix compile issues introduced during rebase onto master with rpc changes (#394)
* WIP: 2019 03 12 tip validation (#378)
* Implement blockchain handling data structures. Add TipValidation happy path. Add more test cases for TipValidation.checkNewTip for badPrevBlockHash and badPOW. Add overflow check, fix endianness bug for checking proof of work. Add pow transition check, refactor difficultyChangeInterval into chain params, add more tests, fix a few nits. Fix compile error, clean up unused import. Remove redundant files from node project
* Implement GetNextWorkRequired/CalculateNextWorkRequired, move BlockHeaderDAOTest cases into chain project
* Add full POW change check in TipValidation, address code review nits
* Configure logging in chainTest, turn logging OFF in other test projects
* Address code review pt2
* Add coverage minimum for chain project (#398)
* Add coverage minimum for chain project
* Add first Blockchain.connectTip() unit test, switch to an in-memory sqlite database for unit tests, start using fixtures for BlockHeaderDAO in unit tests
* Add tests for ChainHandler.processNewHeader(), ChainHandler.getHeader(), Blockchain.connectTip(). Refactor redundant configurations being passed around excessively
* Address code review, fix a flaky test in ClientTest.scala
* Test Fixtures (#403)
* Working test fixtures
* Removed ChainTestFixture trait in main code
* Composing Fixtures (#413)
* Downloaded over 9000 mainnet BlockHeaders into a json file
* Added new fixture with populated blockHeaderDAO
* Split writing to db into batches
* Rebased
* Simplified fixtures with makeFixture abstraction
* Added util functions for composing builders
* Add integration test between bitcoind <-> zmq <-> bitcoin-s-chain project. Test that we can relay a header from bitcoind over zmq into the bitcoin-s chain project. Redo ZmqConfig to use InetSocketAddress
* Address code review
* wip
* A compiling withBitcoindZmqChainHandler fixture
* Tests passing!
* Made blockHeaderDAO private
* Got 9000 new block headers from 562375 to 571375
* Added offset to populated blockHeaderDAO fixture
* Added scaladocs to fixture things
* Initial wallet: import UTXO and spend it (#391)
* Updates ExtKeyVersion with fromChainParams method
* Add equals to Address
* Update BIP44 classes
* Add ScriptType
* Initial work on wallet support
* Add foreign keys pragma for SQLite
* Add UTXO models and DAO
* Add address P2WPKH generation and WIP for addUTXO
* Add logging config for wallet
* Add change address generation, proper-ish addUtxo and sendToAddress
* Address code review on #391
* Add empty AES passphrase invariant
* Add poor man's test fixtures
* Add listUtxos, listAddresses and getBalance to wallet API
* Use fixtures from chain project
* Fix CI test failures
* Fix broken up package path
* Updates bloop config for new projects (#424)
* Multi fixture file (#419): created FixtureTag and ChainFixture, used ChainFixture in BitcoinPowTest, added implicit conversions for nice syntactic sugar
* Added documentation for multi-fixture
* Made defaultTag a val
* Add a logback-test.xml to the wallet project (#433)
* Introduce AppConfig that combines ChainParams and DbConfig (#432)
* 2019 04 23 app config per project db config per project (#434)
* Add DB conf file resolution that works across projects
* Create application configurations for specific projects, rework DbConfig structures for individual projects. Force network to be mixed into DbConfig rather than DbConfig to be mixed into the network
* Add ammonite to db-commons, remove noisy logs
* Remove mixin for DbConfig that required a NetworkDb. Now networkDb is just a field on 'DbConfig'; this simplifies things downstream type-wise when interacting with the project's AppConfig. This commit also removes a parameter from AppConfig: now only a DbConfig needs to be passed in, and we can derive the network and chain params from the DbConfig. The only exception is UnitTestDbConfig, as it is sometimes handy to specify a different network (i.e. mainnet) when testing
* Turn DbConfig objects into case objects, wrap those case objects in their parent type companion object
* Remove cast in Wallet.scala
* Add EnhancedEither class for 2.11 compat (#437): add implicit conversion from Either to 2.11-compatible Either-wrapper. Also remove trailing comma in WalletTestUtil that breaks the 2.11 build
* Fix CI tests hanging (#438)
* Execute wallet tests sequentially to avoid SQLite deadlocks
* Refactor logback config to reduce duplication
* Use in-memory SQLite DB for unit tests
* Debug prints for DatabaseConfig.forConfig
* Fork JVMs in test to ensure proper in-memory DBs
* Pass in Akka config to Eclair tests, avoid cluttering Akka log output
* Don't fork JVM on node tests
* Move things out of ChainUnitTest (#448)
* Move things out of ChainUnitTest
* Remove printlns
* 2019 04 29 client test (#449)
* Bump timeout on connect to node test
* Change from isConnected -> isInitialized to avoid the error of trying to disconnect before we are fully initialized
* Wrote tests for POW difficulty change calculation and header processing (#429): fixed BitcoinPowTest, rebased onto AppConfig code, rewrote ChainHandler integration test, made chain handler test synchronous, fixed a couple of test bugs, implemented a more efficient getAncestorByHeight, fixed ChainHandler integration test by using the correct starting conditions, responded to code review (twice), deleted redundant Pow test, made BlockHeaderDAO.getAncestorAtHeight use a List for its loop to improve performance
* WIP: Create ChainSync, BitcoindChainHandlerViaRpc, add simple ChainSyncTes… (#450)
* Create ChainSync, BitcoindChainHandlerViaRpc, add simple ChainSyncTest to sync one block from an external bitcoind instance via rpc
* Add check for having the best block hash in our chain state already
* Fix prev block hash to be the empty hash if genesis block header
* BlockchainBuilder (#439)
* First commit for implementing a BlockchainBuilder
* Use Builder rather than ReusableBuilder to be compatible with scala 2.11.x
* Decouple Blockchain & BlockHeaderDAO
* Rebase onto node, incorporate changes in #429
* Add more comments
* Reverse order of headers in builder
* Rebase onto node branch, refactor apis
* DB: Add utility method for listing tables in a DB (#447)
* Node rebase (#458)
* Implement BIP32 path diffing
* Rebase node onto newest HD changes in master
* Fix 2.11 compile errors
* 2019 05 01 wallet ammonite scripts pt2 (#452): wip -- not finding testkit in doc worksheet; wip -- classdef not found for create-wallet.sc; zmq bug; clean up some logs; nest zmq start in bitcoindF; update jeromq to 0.5.2-SNAPSHOT to get rid of annoying log to stdout; rebase onto node branch with new configs; successfully running ammonite script create-wallet.sc
* 2019 05 01 wallet ammonite scripts pt2 (#25)
* Refactor Ammonite dep
* Add basic error handling in AmmoniteBridge
* Add very basic README for doc project. Fix compile issues after rebasing onto master. Add code to sync our wallet code with bitcoind after creating a tx
* Refactor ZMQSubscriber to _hopefully_ avoid hanging when we call context.term(). We do this by closing the socket before calling context.term() and using socket.setLinger()
* Update doc/src/main/scala/org/bitcoins/doc/wallet/create-wallet.sc (Co-Authored-By: Christewart <stewart.chris1234@gmail.com>)
* 2019 05 05 sync chain (#460)
* Add code to sync our wallet code with bitcoind after creating a tx. Add script to illustrate how the chain is persisted and how to sync against a running bitcoind instance on regtest
* Fix bug relating to the subtraction operator not being commutative in Pow.getNextWorkRequired(). This kept us from being able to switch proof of work intervals correctly
* Rename script from persist-chain.sc -> sync-chain.sc
* Fix 2.11.x compile issues
* Refactor chain, node, wallet config (#463): get rid of NetworkDb, DbConfig; add proper structure to the conf system, moving everything under the bitcoin-s root key
* Remove Scalacheck from node project
* Add doc on configuration
* Add override feature to AppConfig
* Address code review in #463
* Throw if default data dir is used in tests, add Scaladoc to AppConfig
* Add explanations for withOverrides, link to configuration.md from AppConfig
* Fix compile error
* Moves chain fixtures to testkit project (#475); reset node files
* Store encrypted mnemonic to disk (#462)
* Add WalletStorage object
* Add encrypted mnemonic storage and locked wallet: add lock and unlock operations to the wallet, separate between locked and unlocked wallet
* Handle non-existent seed file
* Respond to code review from Chris
* Use val instead of import
* Add doc on how mnemonics are encrypted/stored
* 2019 05 15 spv sync headers (#479)
* Implement SpvNode skeleton, create NodeUnitTest and move it to the testkit
* Implement test case to sync a header via spv into bitcoin-s
* Fix compiler errors
* Make node project Main runnable (#26)
* Add logging configuration to node project
* Make default config workable in non-test environments
* Add more logging of config in BH DAO and AppConfig
* Make Peer id optional
* Make node Main.scala runnable
* Implement Main.scala to sync with a locally running bitcoind instance. You can now run with 'bloop run node' and sync the node if you adjust the parameters inside of Main.scala. This also reworks the structure of 'AppConfig': it turns the *AppConfig into case classes instead of case objects, which allows us to pass custom configs into those case classes
* Address code review from torkel
* Reintroduce withOverrides (#29)
* Turn off chain validation logs
* Make datadir a parameter to the bitcoind config rather than having it implicitly written to the bitcoin.conf file. This was a difference between the node branch, which had a parameter for the datadir, and master, which was implicitly writing it to bitcoin.conf
* Add ability to overwrite the conf file except in the case of overwriting the DEFAULT_DATADIR & DEFAULT_CONF
* Remove extra Bitcoind.stopServers in WalletIntegrationTest
This commit is contained in:
parent 3e7fb2be42
commit 839d520206
276 changed files with 14987 additions and 393 deletions
@@ -54,6 +54,7 @@ stages:
   script: sbt ++$TRAVIS_SCALA_VERSION coverage test &&
     sbt ++$TRAVIS_SCALA_VERSION core/coverageReport &&
+    sbt ++$TRAVIS_SCALA_VERSION chain/coverageReport &&
     sbt ++$TRAVIS_SCALA_VERSION coverageAggregate &&
     sbt ++$TRAVIS_SCALA_VERSION coveralls

@@ -1,23 +0,0 @@
-<configuration>
-
-  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
-    <file>logs/rpc-test-application.log</file>
-    <encoder>
-      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
-    </encoder>
-  </appender>
-
-  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
-    <encoder>
-      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
-    </encoder>
-  </appender>
-
-
-  <root level="INFO">
-    <appender-ref ref="STDOUT" />
-    <appender-ref ref="FILE"/>
-  </root>
-
-
-</configuration>

@@ -34,20 +34,6 @@ class BitcoindInstanceTest extends BitcoindRpcTest {
     pw.close()
   }
 
-  override def afterAll(): Unit = {}
-
-  def addDatadirAndWrite(conf: BitcoindConfig): BitcoindConfig = {
-    val tempDir = Files.createTempDirectory("")
-    val confWithDatadir = conf.datadir match {
-      case None =>
-        conf.withOption("datadir", tempDir.toString)
-      case Some(value) => conf
-    }
-    val tempfile = Paths.get(Properties.tmpDir, "bitcoin.conf")
-    BitcoindRpcTestUtil.writeConfigToFile(confWithDatadir)
-    confWithDatadir
-  }
-
   behavior of "BitcoindInstance"
 
   it should "start a bitcoind with cookie based authentication" in {
@@ -58,7 +44,7 @@ class BitcoindInstanceTest extends BitcoindRpcTest {
        |rpcport=${RpcUtil.randomPort}
     """.stripMargin
 
-    val conf = addDatadirAndWrite(BitcoindConfig(confStr))
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     val instance = BitcoindInstance.fromConfig(conf)
     assert(
       instance.authCredentials
@@ -81,7 +67,7 @@ class BitcoindInstanceTest extends BitcoindRpcTest {
        |rpcport=${RpcUtil.randomPort}
     """.stripMargin
 
-    val conf = addDatadirAndWrite(BitcoindConfig(confStr))
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     val instance = BitcoindInstance.fromConfig(conf)
     assert(
       instance.authCredentials
@@ -113,7 +99,7 @@ class BitcoindInstanceTest extends BitcoindRpcTest {
        |rpcport=${RpcUtil.randomPort}
     """.stripMargin
 
-    val conf = addDatadirAndWrite(BitcoindConfig(confStr))
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     val authCredentials =
       BitcoindAuthCredentials.PasswordBased(username = "bitcoin-s",
                                             password = "strong_password")
@@ -123,7 +109,7 @@ class BitcoindInstanceTest extends BitcoindRpcTest {
       uri = new URI(s"http://localhost:$port"),
       rpcUri = new URI(s"http://localhost:$rpcPort"),
       authCredentials = authCredentials,
-      datadir = conf.datadir.get
+      datadir = conf.datadir
     )
 
     for {

@@ -28,18 +28,10 @@ class MempoolRpcTest extends BitcoindRpcTest {
       case (client, otherClient) =>
         val defaultConfig = BitcoindRpcTestUtil.standardConfig
 
-        val datadir: Path = {
-          val tempDirPrefix = null // because java APIs are bad
-          Files.createTempDirectory(tempDirPrefix)
-        }
-
         val configNoBroadcast =
           defaultConfig
-            .withOption("datadir", datadir.toString())
             .withOption("walletbroadcast", 0.toString)
 
-        val _ = BitcoindRpcTestUtil.writeConfigToFile(configNoBroadcast)
-
         val instanceWithoutBroadcast =
           BitcoindInstance.fromConfig(configNoBroadcast)

@@ -2,9 +2,9 @@ package org.bitcoins.rpc.common
 
 import org.bitcoins.core.crypto.ECPrivateKey
 import org.bitcoins.core.protocol.P2PKHAddress
+import org.bitcoins.core.script.ScriptType
 import org.bitcoins.rpc.client.common.BitcoindRpcClient
 import org.bitcoins.rpc.client.common.RpcOpts.AddressType
-import org.bitcoins.rpc.jsonmodels.RpcScriptType
 import org.bitcoins.testkit.rpc.BitcoindRpcTestUtil
 import org.bitcoins.testkit.util.BitcoindRpcTest
 
@@ -37,7 +37,7 @@ class UtilRpcTest extends BitcoindRpcTest {
       decoded <- client.decodeScript(multisig.redeemScript)
     } yield {
       assert(decoded.reqSigs.contains(2))
-      assert(decoded.typeOfScript.contains(RpcScriptType.MULTISIG))
+      assert(decoded.typeOfScript.contains(ScriptType.MULTISIG))
       assert(decoded.addresses.get.contains(address))
     }
   }

@@ -4,13 +4,14 @@ import org.bitcoins.testkit.util.BitcoinSUnitTest
 import org.bitcoins.rpc.config.BitcoindAuthCredentials.CookieBased
 import org.bitcoins.rpc.config.BitcoindAuthCredentials.PasswordBased
 import org.bitcoins.core.config.RegTest
+import org.bitcoins.testkit.rpc.BitcoindRpcTestUtil
 
 class BitcoindAuthCredentialsTest extends BitcoinSUnitTest {
   it must "handle cookie based auth" in {
     val confStr = """
                    |regtest=1
     """.stripMargin
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     val auth = BitcoindAuthCredentials.fromConfig(conf)
     val cookie = auth match {
       case cookie: CookieBased => cookie
@@ -28,7 +29,7 @@ class BitcoindAuthCredentialsTest extends BitcoinSUnitTest {
                    |rpcuser=foo
                    |rpcpassword=bar
     """.stripMargin
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     val auth = BitcoindAuthCredentials.fromConfig(conf)
 
     val pass = auth match {
@@ -48,7 +49,7 @@ class BitcoindAuthCredentialsTest extends BitcoinSUnitTest {
                    |rpcpassword=bar
     """.stripMargin
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, BitcoindRpcTestUtil.tmpDir())
     BitcoindAuthCredentials.fromConfig(conf) match {
       case _: CookieBased => fail
       case PasswordBased(username, password) =>

@@ -9,10 +9,11 @@ import org.bitcoins.testkit.util.BitcoindRpcTest
 
 class BitcoindConfigTest extends BitcoinSUnitTest {
 
+  def tmpDir = BitcoindRpcTestUtil.tmpDir()
   it must "have to/fromString symmetry" in {
     val conf = BitcoindRpcTestUtil.standardConfig
     val confStr = conf.toWriteableString
-    val otherConf = BitcoindConfig(confStr)
+    val otherConf = BitcoindConfig(confStr, tmpDir)
     val otherConfStr = otherConf.toWriteableString
     assert(confStr == otherConfStr)
   }
@@ -20,7 +21,8 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
   it must "parse networks" in {
     val conf = BitcoindConfig("""
                                |regtest=1
-    """.stripMargin)
+    """.stripMargin,
+                              tmpDir)
     assert(conf.network == RegTest)
   }
 
@@ -35,7 +37,7 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
       |rpcport=4000
     """.stripMargin.split("\n")
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, tmpDir)
     assert(conf.rpcport == 3000)
     assert(conf.network == RegTest)
   }
@@ -52,7 +54,7 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
      |regtest.rpcport=3000
    """.stripMargin.split("\n")
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, tmpDir)
     assert(conf.rpcport == 4000)
     assert(conf.network == RegTest)
   }
@@ -67,7 +69,7 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
      |regtest.rpcport=3000
    """.stripMargin.split("\n")
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, tmpDir)
     assert(conf.rpcport == TestNet3.rpcPort)
     assert(conf.network == TestNet3)
   }
@@ -87,7 +89,7 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
      |rpcport=1000
    """.stripMargin.split("\n")
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, tmpDir)
     assert(conf.rpcport == 3000)
     assert(conf.network == TestNet3)
     assert(conf.username.contains("username"))
@@ -116,7 +118,7 @@ class BitcoindConfigTest extends BitcoinSUnitTest {
      |rpcuser=username
    """.stripMargin.split("\n")
 
-    val conf = BitcoindConfig(confStr)
+    val conf = BitcoindConfig(confStr, tmpDir)
     assert(conf.rpcport == 4000)
     assert(conf.network == RegTest)
     assert(conf.username.contains("username"))

@@ -58,7 +58,7 @@ object BitcoindAuthCredentials extends BitcoinSLogger {
       datadir: File = BitcoindConfig.DEFAULT_DATADIR)
       extends BitcoindAuthCredentials {
 
-    lazy private[bitcoins] val cookiePath = {
+    private[bitcoins] lazy val cookiePath = {
       val middleSegment = network match {
         case TestNet3 => "testnet3"
         case MainNet  => ""
@@ -87,7 +87,7 @@ object BitcoindAuthCredentials extends BitcoinSLogger {
   }
 
   def fromConfig(config: BitcoindConfig): BitcoindAuthCredentials = {
-    val datadir = config.datadir.getOrElse(BitcoindConfig.DEFAULT_DATADIR)
+    val datadir = config.datadir
     val username = config.username
     val password = config.password
     (username, password) match {

@@ -1,12 +1,13 @@
 package org.bitcoins.rpc.config
 
-import org.bitcoins.core.util.BitcoinSLogger
+import org.bitcoins.core.util.{BitcoinSLogger, BitcoinSUtil}
 import org.bitcoins.core.config._
 import java.io.File
 import java.nio.file.Files
 
 import scala.util.Properties
 import java.nio.file.Paths
-import java.net.URI
+import java.net.{InetSocketAddress, URI}
 import java.nio.file.Path
 
 /**
@@ -23,8 +24,27 @@ import java.nio.file.Path
  *
  * @see https://github.com/bitcoin/bitcoin/blob/master/doc/bitcoin-conf.md
  */
-abstract class BitcoindConfig extends BitcoinSLogger {
-  private[bitcoins] def lines: Seq[String]
+case class BitcoindConfig(
+    private[bitcoins] val lines: Seq[String],
+    datadir: File)
+    extends BitcoinSLogger {
+
+  //create datadir and config if it DNE on disk
+  if (!datadir.exists()) {
+    logger.info(
+      s"datadir=${datadir.getAbsolutePath} does not exist, creating now")
+    datadir.mkdirs()
+    BitcoindConfig.writeConfigToFile(this, datadir)
+  }
+
+  private val confFile = datadir.toPath.resolve("bitcoin.conf")
+
+  //create bitcoin.conf file in datadir if it does not exist
+  if (!Files.exists(confFile)) {
+    logger.info(
+      s"bitcoin.conf in datadir=${datadir.getAbsolutePath} does not exist, creating now")
+    BitcoindConfig.writeConfigToFile(this, datadir)
+  }
 
   /**
    * Converts the config back to a string that can be written
@@ -187,16 +207,16 @@ abstract class BitcoindConfig extends BitcoinSLogger {
     }.headOption
   }
 
-  lazy val datadir: Option[File] = getValue("datadir").map(new File(_))
-
   lazy val username: Option[String] = getValue("rpcuser")
   lazy val password: Option[String] = getValue("rpcpassword")
-  lazy val zmqpubrawblock: Option[URI] =
-    getValue("zmqpubrawblock").map(new URI(_))
-  lazy val zmqpubrawtx: Option[URI] = getValue("zmqpubrawtx").map(new URI(_))
-  lazy val zmqpubhashblock: Option[URI] =
-    getValue("zmqpubhashblock").map(new URI(_))
-  lazy val zmqpubhashtx: Option[URI] = getValue("zmqpubhashtx").map(new URI(_))
+  lazy val zmqpubrawblock: Option[InetSocketAddress] =
+    getValue("zmqpubrawblock").map(BitcoinSUtil.toInetSocketAddress)
+  lazy val zmqpubrawtx: Option[InetSocketAddress] =
+    getValue("zmqpubrawtx").map(BitcoinSUtil.toInetSocketAddress)
+  lazy val zmqpubhashblock: Option[InetSocketAddress] =
+    getValue("zmqpubhashblock").map(BitcoinSUtil.toInetSocketAddress)
+  lazy val zmqpubhashtx: Option[InetSocketAddress] =
+    getValue("zmqpubhashtx").map(BitcoinSUtil.toInetSocketAddress)
 
   lazy val port: Int = getValue("port").map(_.toInt).getOrElse(network.port)
 
@@ -224,13 +244,14 @@ abstract class BitcoindConfig extends BitcoinSLogger {
   /** Creates a new config with the given keys and values appended */
   def withOption(key: String, value: String): BitcoindConfig = {
     val ourLines = this.lines
-    new BitcoindConfig {
-
-      def lines: Seq[String] = {
-        val newLine = s"$key=$value"
-        newLine +: ourLines
-      }
-    }
+    val newLine = s"$key=$value"
+    val lines = newLine +: ourLines
+    val newConfig = BitcoindConfig(lines, datadir)
+    logger.debug(
+      s"Appending new config with $key=$value to datadir=${datadir.getAbsolutePath}")
+    BitcoindConfig.writeConfigToFile(newConfig, datadir)
+
+    newConfig
   }
 
   /** Creates a new config with the given key and values,
@@ -254,30 +275,30 @@
       network: NetworkParameters): BitcoindConfig =
     withOption(key = s"${networkString(network)}.$key", value = value)
 
+  def withDatadir(newDatadir: File): BitcoindConfig = {
+    BitcoindConfig(lines, newDatadir)
+  }
+
 }
 
-object BitcoindConfig {
+object BitcoindConfig extends BitcoinSLogger {
 
   /** The empty `bitcoind` config */
-  lazy val empty: BitcoindConfig = BitcoindConfig("")
-
-  /** Constructs a `bitcoind` config from the given lines */
-  def apply(config: Seq[String]): BitcoindConfig = new BitcoindConfig {
-    val lines: Seq[String] = config
-  }
+  lazy val empty: BitcoindConfig = BitcoindConfig("", DEFAULT_DATADIR)
 
   /**
    * Constructs a `bitcoind` config from the given string,
   * by splitting it on newlines
   */
-  def apply(config: String): BitcoindConfig =
-    apply(config.split("\n"))
+  def apply(config: String, datadir: File): BitcoindConfig =
+    apply(config.split("\n"), datadir)
 
   /** Reads the given path and construct a `bitcoind` config from it */
-  def apply(config: Path): BitcoindConfig = apply(config.toFile)
+  def apply(config: Path): BitcoindConfig =
+    apply(config.toFile, config.getParent.toFile)
 
   /** Reads the given file and construct a `bitcoind` config from it */
-  def apply(config: File): BitcoindConfig = {
+  def apply(config: File, datadir: File = DEFAULT_DATADIR): BitcoindConfig = {
     import scala.collection.JavaConverters._
     val lines = Files
       .readAllLines(config.toPath)
@@ -285,7 +306,7 @@ object BitcoindConfig {
       .asScala
       .toList
 
-    apply(lines)
+    apply(lines, datadir)
   }
 
   /**
@@ -321,4 +342,25 @@ object BitcoindConfig {
       .toPath()
       .resolve("bitcoin.conf")
       .toFile
+
+  /**
+   * Writes the config to the data directory within it, if it doesn't
+   * exist. Returns the written file.
+   */
+  def writeConfigToFile(config: BitcoindConfig, datadir: File): Path = {
+
+    val confStr = config.lines.mkString("\n")
+
+    Files.createDirectories(datadir.toPath)
+    val confFile = datadir.toPath.resolve("bitcoin.conf")
+
+    if (datadir == DEFAULT_DATADIR && confFile == DEFAULT_CONF_FILE) {
+      logger.warn(
+        s"We will not overrwrite the existing bitcoin.conf in default datadir")
+    } else {
+      Files.write(confFile, confStr.getBytes)
+    }
+
+    confFile
+  }
 }

@@ -50,6 +50,8 @@ sealed trait BitcoindInstance extends BitcoinSLogger {
       case _: String => BitcoindVersion.Unknown
     }
   }
+
+  def p2pPort: Int = uri.getPort
 }
 
 object BitcoindInstance {
@@ -104,8 +106,7 @@ object BitcoindInstance {
       val file = configPath.toFile()
       fromConfigFile(file)
     } else {
-      fromConfig(
-        BitcoindConfig.empty.withOption("datadir", configPath.toString))
+      fromConfig(BitcoindConfig.empty)
     }
   }
 
@@ -120,15 +121,9 @@ object BitcoindInstance {
     require(file.exists, s"${file.getPath} does not exist!")
     require(file.isFile, s"${file.getPath} is not a file!")
 
-    val conf = BitcoindConfig(file)
+    val conf = BitcoindConfig(file, file.getParentFile)
 
-    val confWithDatadir = if (conf.datadir.isEmpty) {
-      conf.withOption("datadir", file.getParent.toString)
-    } else {
-      conf
-    }
-
-    fromConfig(confWithDatadir)
+    fromConfig(conf)
   }
 
   /** Constructs a `bitcoind` instance from the given config */
@@ -137,21 +132,11 @@ object BitcoindInstance {
   ): BitcoindInstance = {
 
     val authCredentials = BitcoindAuthCredentials.fromConfig(config)
 
-    config.datadir match {
-      case None =>
-        BitcoindInstance(config.network,
-                         config.uri,
-                         config.rpcUri,
-                         authCredentials,
-                         zmqConfig = ZmqConfig.fromConfig(config))
-      case Some(datadir) =>
-        BitcoindInstance(config.network,
-                         config.uri,
-                         config.rpcUri,
-                         authCredentials,
-                         zmqConfig = ZmqConfig.fromConfig(config),
-                         datadir = datadir)
-    }
+    BitcoindInstance(config.network,
+                     config.uri,
+                     config.rpcUri,
+                     authCredentials,
+                     zmqConfig = ZmqConfig.fromConfig(config),
+                     datadir = config.datadir)
   }
 }

@@ -1,27 +1,29 @@
 package org.bitcoins.rpc.config
 
-import java.net.URI
+import java.net.InetSocketAddress
+
+import org.bitcoins.core.util.BitcoinSLogger
 
 sealed trait ZmqConfig {
-  def hashBlock: Option[URI]
-  def rawBlock: Option[URI]
-  def hashTx: Option[URI]
-  def rawTx: Option[URI]
+  def hashBlock: Option[InetSocketAddress]
+  def rawBlock: Option[InetSocketAddress]
+  def hashTx: Option[InetSocketAddress]
+  def rawTx: Option[InetSocketAddress]
 }
 
-object ZmqConfig {
+object ZmqConfig extends BitcoinSLogger {
   private case class ZmqConfigImpl(
-      hashBlock: Option[URI],
-      rawBlock: Option[URI],
-      hashTx: Option[URI],
-      rawTx: Option[URI]
+      hashBlock: Option[InetSocketAddress],
+      rawBlock: Option[InetSocketAddress],
+      hashTx: Option[InetSocketAddress],
+      rawTx: Option[InetSocketAddress]
   ) extends ZmqConfig
 
   def apply(
-      hashBlock: Option[URI] = None,
-      rawBlock: Option[URI] = None,
-      hashTx: Option[URI] = None,
-      rawTx: Option[URI] = None
+      hashBlock: Option[InetSocketAddress] = None,
+      rawBlock: Option[InetSocketAddress] = None,
+      hashTx: Option[InetSocketAddress] = None,
+      rawTx: Option[InetSocketAddress] = None
   ): ZmqConfig =
     ZmqConfigImpl(hashBlock = hashBlock,
                   rawBlock = rawBlock,
@@ -33,7 +35,7 @@ object ZmqConfig {
    * `localhost` and the same port
    */
   def fromPort(port: Int): ZmqConfig = {
-    val uri = new URI(s"tcp://localhost:$port")
+    val uri = new InetSocketAddress("tcp://127.0.0.1", port)
     ZmqConfig(hashBlock = Some(uri),
               rawBlock = Some(uri),
               hashTx = Some(uri),

@@ -1,8 +1,9 @@
 package org.bitcoins.rpc.jsonmodels

-import org.bitcoins.core.crypto.{DoubleSha256DigestBE}
+import org.bitcoins.core.crypto.DoubleSha256DigestBE
 import org.bitcoins.core.currency.Bitcoins
 import org.bitcoins.core.number.{Int32, UInt32}
+import org.bitcoins.core.protocol.blockchain.BlockHeader
 import org.bitcoins.core.wallet.fee.BitcoinFeeUnit

 sealed abstract class BlockchainResult

@@ -104,7 +105,26 @@ case class GetBlockHeaderResult(
     chainwork: String,
     previousblockhash: Option[DoubleSha256DigestBE],
     nextblockhash: Option[DoubleSha256DigestBE])
-    extends BlockchainResult
+    extends BlockchainResult {
+  def blockHeader: BlockHeader = {
+
+    //prevblockhash is only empty if we have the genesis block
+    //we assume the prevhash of the genesis block is the empty hash
+    val prevHash = {
+      if (height == 0 && previousblockhash.isEmpty) {
+        DoubleSha256DigestBE.empty
+      } else {
+        previousblockhash.get
+      }
+    }
+    BlockHeader(version = Int32(version),
+                previousBlockHash = prevHash.flip,
+                merkleRootHash = merkleroot.flip,
+                time = time,
+                nBits = bits,
+                nonce = nonce)
+  }
+}

 case class ChainTip(
     height: Int,
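The new `blockHeader` method above substitutes an all-zero hash when the genesis header omits `previousblockhash`. That rule can be isolated in a standalone sketch (hypothetical names, plain hex strings standing in for `DoubleSha256DigestBE`):

```scala
// Sketch of the prev-hash rule from GetBlockHeaderResult.blockHeader:
// only the genesis header (height 0) may lack previousblockhash, in which
// case the empty (all-zero) hash is substituted.
def prevHashHex(height: Int, previousblockhash: Option[String]): String =
  previousblockhash.getOrElse {
    require(height == 0, "only the genesis header may omit previousblockhash")
    "0" * 64 // DoubleSha256DigestBE.empty rendered as hex
  }
```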
@@ -6,6 +6,7 @@ import org.bitcoins.core.number.UInt32
 import org.bitcoins.core.protocol.script.{ScriptPubKey, ScriptSignature}
 import org.bitcoins.core.protocol.transaction.{Transaction, TransactionInput}
 import org.bitcoins.core.protocol.{BitcoinAddress, P2PKHAddress, P2SHAddress}
+import org.bitcoins.core.script.ScriptType

 sealed abstract class RawTransactionResult

@@ -27,35 +28,17 @@ case class RpcTransactionOutput(
     scriptPubKey: RpcScriptPubKey)
     extends RawTransactionResult

-/**
-  * @see [[https://github.com/bitcoin/bitcoin/blob/fa6180188b8ab89af97860e6497716405a48bab6/src/script/standard.cpp#L27 standard.cpp]]
-  * from Bitcoin Core
-  */
-sealed abstract class RpcScriptType extends RawTransactionResult
-
-object RpcScriptType {
-  final case object NONSTANDARD extends RpcScriptType
-  final case object PUBKEY extends RpcScriptType
-  final case object PUBKEYHASH extends RpcScriptType
-  final case object SCRIPTHASH extends RpcScriptType
-  final case object MULTISIG extends RpcScriptType
-  final case object NULLDATA extends RpcScriptType
-  final case object WITNESS_V0_KEYHASH extends RpcScriptType
-  final case object WITNESS_V0_SCRIPTHASH extends RpcScriptType
-  final case object WITNESS_UNKNOWN extends RpcScriptType
-}
-
 case class RpcScriptPubKey(
     asm: String,
     hex: String,
     reqSigs: Option[Int],
-    scriptType: RpcScriptType,
+    scriptType: ScriptType,
     addresses: Option[Vector[BitcoinAddress]])
     extends RawTransactionResult

 case class DecodeScriptResult(
     asm: String,
-    typeOfScript: Option[RpcScriptType],
+    typeOfScript: Option[ScriptType],
     reqSigs: Option[Int],
     addresses: Option[Vector[P2PKHAddress]],
     p2sh: P2SHAddress)
@@ -5,6 +5,7 @@ import org.bitcoins.core.currency.Bitcoins
 import org.bitcoins.core.protocol.BitcoinAddress
 import org.bitcoins.core.protocol.script.ScriptPubKey
 import org.bitcoins.core.protocol.transaction.Transaction
+import org.bitcoins.core.script.ScriptType
 import org.bitcoins.core.script.crypto.HashType

 sealed abstract class RpcPsbtResult

@@ -40,7 +41,7 @@ final case class RpcPsbtInput(
 final case class RpcPsbtScript(
     asm: String, // todo(torkelrogstad) split into Vector[ScriptToken]?
     hex: ScriptPubKey,
-    scriptType: Option[RpcScriptType],
+    scriptType: Option[ScriptType],
     address: Option[BitcoinAddress]
 ) extends RpcPsbtResult
@@ -14,6 +14,7 @@ import org.bitcoins.core.number.UInt32
 import org.bitcoins.core.protocol.BitcoinAddress
 import org.bitcoins.core.protocol.script.{ScriptPubKey, WitnessVersion}
 import org.bitcoins.core.protocol.transaction.Transaction
+import org.bitcoins.core.script.ScriptType
 import org.bitcoins.core.wallet.fee.BitcoinFeeUnit
 import org.bitcoins.rpc.client.common.RpcOpts.LabelPurpose
 import org.joda.time.DateTime

@@ -198,7 +199,7 @@ case class AddressInfoResult(
     iscompressed: Option[Boolean],
     witness_version: Option[WitnessVersion],
     witness_program: Option[String], // todo what's the correct type here?
-    script: Option[RpcScriptType],
+    script: Option[ScriptType],
     hex: Option[ScriptPubKey],
     pubkeys: Option[Vector[ECPublicKey]],
     sigsrequired: Option[Int],
@@ -20,6 +20,7 @@ import org.bitcoins.core.protocol.{
   P2PKHAddress,
   P2SHAddress
 }
+import org.bitcoins.core.script.ScriptType
 import org.bitcoins.core.script.crypto.HashType
 import org.bitcoins.core.wallet.fee.{BitcoinFeeUnit, SatoshisPerByte}
 import org.bitcoins.rpc.client.common.RpcOpts.LabelPurpose

@@ -448,7 +449,7 @@ object JsonReaders {
     for {
       asm <- (json \ "asm").validate[String]
       hex <- (json \ "hex").validate[ScriptPubKey]
-      scriptType <- (json \ "type").validateOpt[RpcScriptType]
+      scriptType <- (json \ "type").validateOpt[ScriptType]
       address <- (json \ "address").validateOpt[BitcoinAddress]
     } yield
       RpcPsbtScript(asm = asm,

@@ -498,20 +499,11 @@ object JsonReaders {

   }

-  implicit object RpcScriptTypeReads extends Reads[RpcScriptType] {
-    import RpcScriptType._
-    override def reads(json: JsValue): JsResult[RpcScriptType] =
-      json.validate[String].flatMap {
-        case "nonstandard"           => JsSuccess(NONSTANDARD)
-        case "pubkey"                => JsSuccess(PUBKEY)
-        case "pubkeyhash"            => JsSuccess(PUBKEYHASH)
-        case "scripthash"            => JsSuccess(SCRIPTHASH)
-        case "multisig"              => JsSuccess(MULTISIG)
-        case "nulldata"              => JsSuccess(NULLDATA)
-        case "witness_v0_keyhash"    => JsSuccess(WITNESS_V0_KEYHASH)
-        case "witness_v0_scripthash" => JsSuccess(WITNESS_V0_SCRIPTHASH)
-        case "witness_unknown"       => JsSuccess(WITNESS_UNKNOWN)
-      }
+  implicit object ScriptTypeReads extends Reads[ScriptType] {
+    override def reads(json: JsValue): JsResult[ScriptType] =
+      json
+        .validate[String]
+        .map(ScriptType.fromStringExn)
   }

   implicit object TestMempoolAcceptResultReads
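The hunk above replaces a hand-written nine-case string match with a single `ScriptType.fromStringExn` lookup. The underlying pattern (a sealed ADT plus a name-keyed map, throwing on unknown input) can be sketched in plain Scala with hypothetical names:

```scala
// Sketch of the fromStringExn pattern the new ScriptTypeReads relies on:
// a sealed ADT with a string lookup table, instead of an inline match
// repeated at every JSON deserialization site.
sealed trait SType
case object PubKeyHash extends SType
case object ScriptHash extends SType

object SType {
  private val all: Map[String, SType] =
    Map("pubkeyhash" -> PubKeyHash, "scripthash" -> ScriptHash)

  // "Exn" suffix: throws on unrecognized input, like ScriptType.fromStringExn
  def fromStringExn(s: String): SType =
    all.getOrElse(
      s,
      throw new IllegalArgumentException(s"Unknown script type: $s"))
}
```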
@@ -20,6 +20,7 @@ import org.bitcoins.core.protocol.{
   P2PKHAddress,
   P2SHAddress
 }
+import org.bitcoins.core.script.ScriptType
 import org.bitcoins.core.wallet.fee.BitcoinFeeUnit
 import org.bitcoins.rpc.client.common.RpcOpts.AddressType
 import org.bitcoins.rpc.jsonmodels._

@@ -87,7 +88,7 @@ object JsonSerializers {
     ((__ \ "asm").read[String] and
       (__ \ "hex").read[String] and
       (__ \ "reqSigs").readNullable[Int] and
-      (__ \ "type").read[RpcScriptType] and
+      (__ \ "type").read[ScriptType] and
       (__ \ "addresses").readNullable[Vector[BitcoinAddress]])(RpcScriptPubKey)
   implicit val rpcTransactionOutputReads: Reads[RpcTransactionOutput] =
     Json.reads[RpcTransactionOutput]

@@ -96,7 +97,7 @@ object JsonSerializers {

   implicit val decodeScriptResultReads: Reads[DecodeScriptResult] =
     ((__ \ "asm").read[String] and
-      (__ \ "type").readNullable[RpcScriptType] and
+      (__ \ "type").readNullable[ScriptType] and
       (__ \ "reqSigs").readNullable[Int] and
       (__ \ "addresses").readNullable[Vector[P2PKHAddress]] and
       (__ \ "p2sh").read[P2SHAddress])(DecodeScriptResult)

@@ -359,7 +360,7 @@ object JsonSerializers {
   implicit val walletCreateFundedPsbtResultReads: Reads[
     WalletCreateFundedPsbtResult] = Json.reads[WalletCreateFundedPsbtResult]

-  implicit val rpcScriptTypeReads: Reads[RpcScriptType] = RpcScriptTypeReads
+  implicit val scriptTypeReads: Reads[ScriptType] = ScriptTypeReads

   implicit val testMempoolAcceptResultReads: Reads[TestMempoolAcceptResult] =
     TestMempoolAcceptResultReads
@@ -17,6 +17,9 @@ import play.api.libs.json._

 import scala.collection.mutable

+// for mapWrites below
+import scala.language.implicitConversions
+
 object JsonWriters {
   implicit object HashTypeWrites extends Writes[HashType] {
     override def writes(hash: HashType): JsValue = hash match {
189  build.sbt
@@ -6,7 +6,10 @@ import scala.util.Properties

 cancelable in Global := true

-fork in Test := true
+//don't allow us to wipe all of our prod databases
+flywayClean / aggregate := false
+//allow us to wipe our test databases
+Test / flywayClean / aggregate := true

 lazy val timestamp = new java.util.Date().getTime

@@ -57,6 +60,7 @@ lazy val commonSettings = List(
   assemblyOption in assembly := (assemblyOption in assembly).value
     .copy(includeScala = false),
   licenses += ("MIT", url("http://opensource.org/licenses/MIT")),

   /**
     * Adding Ammonite REPL to test scope, can access both test and compile
     * sources. Docs: http://ammonite.io/#Ammonite-REPL

@@ -76,9 +80,16 @@ lazy val commonSettings = List(
 )

 lazy val commonTestSettings = Seq(
-  publish / skip := true
+  publish / skip := true,
 ) ++ commonSettings

+lazy val commonTestWithDbSettings = Seq(
+  // To make in-memory DBs work properly
+  Test / fork := true,
+  // To avoid deadlock issues with SQLite
+  Test / parallelExecution := false
+) ++ commonTestSettings
+
 lazy val commonProdSettings = Seq(
   Test / bloopGenerate := None
 ) ++ commonSettings

@@ -87,16 +98,22 @@ lazy val bitcoins = project
   .in(file("."))
   .aggregate(
     secp256k1jni,
+    chain,
+    chainTest,
     core,
     coreTest,
-    zmq,
     bitcoindRpc,
     bitcoindRpcTest,
     bench,
     eclairRpc,
     eclairRpcTest,
+    node,
+    nodeTest,
+    wallet,
+    walletTest,
     testkit,
-    scripts
+    scripts,
+    zmq
   )
   .settings(commonSettings: _*)
   .settings(crossScalaVersions := Nil)

@@ -207,14 +224,45 @@ lazy val coreTest = project
   )
   .enablePlugins()

+lazy val chainDbSettings = dbFlywaySettings("chaindb")
+lazy val chain = project
+  .in(file("chain"))
+  .settings(commonProdSettings: _*)
+  .settings(chainDbSettings: _*)
+  .settings(
+    name := "bitcoin-s-chain",
+    libraryDependencies ++= Deps.chain
+  ).dependsOn(core, dbCommons)
+  .enablePlugins(FlywayPlugin)
+
+lazy val chainTest = project
+  .in(file("chain-test"))
+  .settings(commonTestWithDbSettings: _*)
+  .settings(chainDbSettings: _*)
+  .settings(
+    name := "bitcoin-s-chain-test",
+    libraryDependencies ++= Deps.chainTest,
+  ).dependsOn(chain, core, testkit, zmq)
+  .enablePlugins(FlywayPlugin)
+
+
+lazy val dbCommons = project
+  .in(file("db-commons"))
+  .settings(commonSettings: _*)
+  .settings(
+    name := "bitcoin-s-db-commons",
+    libraryDependencies ++= Deps.dbCommons
+  ).dependsOn(core)
+  .enablePlugins()
+
+
 lazy val zmq = project
   .in(file("zmq"))
   .settings(commonSettings: _*)
   .settings(name := "bitcoin-s-zmq", libraryDependencies ++= Deps.bitcoindZmq)
   .dependsOn(
     core
-  )
-  .enablePlugins(GitVersioning)
+  ).enablePlugins(GitVersioning)

 lazy val bitcoindRpc = project
   .in(file("bitcoind-rpc"))

@@ -264,13 +312,56 @@ lazy val eclairRpcTest = project
   .dependsOn(testkit)
   .enablePlugins()

+lazy val nodeDbSettings = dbFlywaySettings("nodedb")
+lazy val node = {
+  project
+    .in(file("node"))
+    .settings(commonSettings: _*)
+    .settings(nodeDbSettings: _*)
+    .settings(
+      name := "bitcoin-s-node",
+      libraryDependencies ++= Deps.node
+    )
+    .dependsOn(
+      core,
+      chain,
+      dbCommons,
+      bitcoindRpc
+    ).enablePlugins(FlywayPlugin)
+}
+
+lazy val nodeTest = {
+  project
+    .in(file("node-test"))
+    .settings(commonTestWithDbSettings: _*)
+    .settings(nodeDbSettings: _*)
+    .settings(
+      name := "bitcoin-s-node-test",
+      // There's a weird issue with forking
+      // in node tests, for example this CI
+      // error: https://travis-ci.org/bitcoin-s/bitcoin-s-core/jobs/525018199#L1252
+      // It seems to be related to this
+      // Scalatest issue:
+      // https://github.com/scalatest/scalatest/issues/556
+      Test / fork := false,
+      libraryDependencies ++= Deps.nodeTest
+    ).dependsOn(
+      node,
+      testkit
+    ).enablePlugins(FlywayPlugin)
+}
+
 lazy val testkit = project
   .in(file("testkit"))
   .settings(commonProdSettings: _*)
   .dependsOn(
     core,
+    chain,
     bitcoindRpc,
-    eclairRpc
+    eclairRpc,
+    node,
+    wallet,
+    zmq
   )
   .enablePlugins(GitVersioning)

@@ -304,6 +395,30 @@ lazy val docs = project
   )
   .enablePlugins(MdocPlugin, DocusaurusPlugin)

+lazy val walletDbSettings = dbFlywaySettings("walletdb")
+lazy val wallet = project
+  .in(file("wallet"))
+  .settings(commonProdSettings: _*)
+  .settings(walletDbSettings: _*)
+  .settings(
+    name := "bitcoin-s-wallet",
+    libraryDependencies ++= Deps.wallet
+  )
+  .dependsOn(core, dbCommons)
+  .enablePlugins(FlywayPlugin)
+
+lazy val walletTest = project
+  .in(file("wallet-test"))
+  .settings(commonTestWithDbSettings: _*)
+  .settings(walletDbSettings: _*)
+  .settings(
+    name := "bitcoin-s-wallet-test",
+    libraryDependencies ++= Deps.walletTest,
+  )
+  .dependsOn(core, testkit, wallet)
+  .enablePlugins(FlywayPlugin)
+
+
 lazy val scripts = project
   .in(file("scripts"))
-  .dependsOn(core, bitcoindRpc, eclairRpc, zmq)

@@ -312,6 +427,17 @@ lazy val scripts = project
     name := "bitcoin-s-scripts",
     libraryDependencies ++= Deps.scripts
   )
+  .dependsOn(
+    bitcoindRpc,
+    chain,
+    core,
+    eclairRpc,
+    node,
+    secp256k1jni,
+    testkit,
+    wallet,
+    zmq
+  )

 // Ammonite is invoked through running
 // a main class it places in test sources

@@ -327,3 +453,52 @@ lazy val scripts = project
 addCommandAlias("amm", "test:run")

-publishArtifact in bitcoins := false
+def dbFlywaySettings(dbName: String): List[Setting[_]] = {
+  lazy val DB_HOST = "localhost"
+  lazy val DB_NAME = s"${dbName}.sqlite"
+  lazy val network = "unittest" //mainnet, testnet3, regtest, unittest
+
+  lazy val mainnetDir = s"${System.getenv("HOME")}/.bitcoin-s/mainnet/"
+  lazy val testnetDir = s"${System.getenv("HOME")}/.bitcoin-s/testnet3/"
+  lazy val regtestDir = s"${System.getenv("HOME")}/.bitcoin-s/regtest/"
+  lazy val unittestDir = s"${System.getenv("HOME")}/.bitcoin-s/unittest/"
+
+  lazy val dirs = List(mainnetDir, testnetDir, regtestDir, unittestDir)
+
+  //create directories if they DNE
+  dirs.foreach { d =>
+    val file = new File(d)
+    file.mkdirs()
+    val db = new File(d + DB_NAME)
+    db.createNewFile()
+  }
+
+  def makeNetworkSettings(directoryPath: String): List[Setting[_]] = List(
+    Test / flywayUrl := s"jdbc:sqlite:$directoryPath$DB_NAME",
+    Test / flywayLocations := List("nodedb/migration"),
+    Test / flywayUser := "nodedb",
+    Test / flywayPassword := "",
+    flywayUrl := s"jdbc:sqlite:$directoryPath$DB_NAME",
+    flywayUser := "nodedb",
+    flywayPassword := ""
+  )
+
+  lazy val mainnet = makeNetworkSettings(mainnetDir)
+
+  lazy val testnet3 = makeNetworkSettings(testnetDir)
+
+  lazy val regtest = makeNetworkSettings(regtestDir)
+
+  lazy val unittest = makeNetworkSettings(unittestDir)
+
+  network match {
+    case "mainnet"  => mainnet
+    case "testnet3" => testnet3
+    case "regtest"  => regtest
+    case "unittest" => unittest
+    case unknown: String =>
+      throw new IllegalArgumentException(s"Unknown network=${unknown}")
+  }
+}
+
+publishArtifact in bitcoins := false
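The `dbFlywaySettings` helper above builds a per-network SQLite JDBC URL out of a home-directory path and a database name. The string construction can be sketched standalone (the function name here is hypothetical; the directory layout follows the diff):

```scala
// Sketch of the JDBC URL construction used by dbFlywaySettings above:
// one sqlite file per network under ~/.bitcoin-s/<network>/<dbName>.sqlite
def jdbcUrl(home: String, network: String, dbName: String): String =
  s"jdbc:sqlite:$home/.bitcoin-s/$network/$dbName.sqlite"
```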
1  chain-test/src/test/resources/block_headers.json  (new file)
File diff suppressed because one or more lines are too long
@@ -0,0 +1,33 @@
package org.bitcoins.chain

import org.bitcoins.testkit.util.BitcoinSUnitTest
import org.bitcoins.core.config.TestNet3
import com.typesafe.config.Config
import com.typesafe.config.ConfigFactory
import org.bitcoins.core.config.RegTest
import org.bitcoins.core.config.MainNet
import org.bitcoins.chain.config.ChainAppConfig

class ChainAppConfigTest extends BitcoinSUnitTest {
  val config = ChainAppConfig()

  it must "be overridable" in {
    assert(config.network == RegTest)

    val otherConf = ConfigFactory.parseString("bitcoin-s.network = testnet3")
    val withOther: ChainAppConfig = config.withOverrides(otherConf)
    assert(withOther.network == TestNet3)

    val mainnetConf = ConfigFactory.parseString("bitcoin-s.network = mainnet")
    val mainnet: ChainAppConfig = withOther.withOverrides(mainnetConf)
    assert(mainnet.network == MainNet)
  }

  it must "be overridable with multiple levels" in {
    val testnet = ConfigFactory.parseString("bitcoin-s.network = testnet3")
    val mainnet = ConfigFactory.parseString("bitcoin-s.network = mainnet")
    val overriden: ChainAppConfig = config.withOverrides(testnet, mainnet)
    assert(overriden.network == MainNet)
  }
}
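`ChainAppConfigTest` above exercises layered overrides: later configs win, and overrides can be stacked in one call. The semantics can be sketched with plain maps standing in for Typesafe `Config` objects (all names hypothetical):

```scala
// Sketch of the withOverrides layering ChainAppConfigTest exercises:
// each successive override map is merged on top, so the last one wins.
def withOverrides(
    base: Map[String, String],
    overrides: Map[String, String]*): Map[String, String] =
  overrides.foldLeft(base)((acc, o) => acc ++ o)
```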
@@ -0,0 +1,51 @@
package org.bitcoins.chain.blockchain

import akka.actor.ActorSystem
import org.bitcoins.rpc.util.RpcUtil
import org.bitcoins.testkit.chain.ChainUnitTest
import org.bitcoins.testkit.chain.fixture.BitcoindChainHandlerViaZmq
import org.scalatest.FutureOutcome

import scala.concurrent.Future

class BitcoindChainHandlerViaZmqTest extends ChainUnitTest {

  override type FixtureParam = BitcoindChainHandlerViaZmq

  override implicit val system: ActorSystem = ActorSystem("ChainUnitTest")

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withBitcoindChainHandlerViaZmq(test)

  behavior of "BitcoindChainHandlerViaZmq"

  it must "peer with bitcoind via zmq and have blockchain info relayed" in {
    bitcoindChainHandler: BitcoindChainHandlerViaZmq =>
      val bitcoind = bitcoindChainHandler.bitcoindRpc

      val chainHandler = bitcoindChainHandler.chainHandler

      val assert1F = chainHandler.getBlockCount
        .map(count => assert(count == 0))

      //mine a block on bitcoind
      val generatedF = assert1F.flatMap(_ => bitcoind.generate(1))

      generatedF.flatMap { headers =>
        val hash = headers.head
        val foundHeaderF: Future[Unit] = {
          //test case is totally async since we
          //can't monitor processing flow for zmq
          //so we just need to await until we
          //have fully processed the header
          RpcUtil.awaitConditionF(() =>
            chainHandler.getHeader(hash).map(_.isDefined))
        }

        for {
          _ <- foundHeaderF
          header <- chainHandler.getHeader(hash)
        } yield assert(header.get.hashBE == hash)
      }
  }
}
@@ -0,0 +1,39 @@
package org.bitcoins.chain.blockchain

import akka.actor.ActorSystem
import org.bitcoins.chain.models.BlockHeaderDAO
import org.bitcoins.testkit.chain.{BlockHeaderHelper, ChainUnitTest}
import org.scalatest.FutureOutcome

class BlockchainTest extends ChainUnitTest {

  override type FixtureParam = BlockHeaderDAO

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withBlockHeaderDAO(test)

  override implicit val system: ActorSystem = ActorSystem("BlockchainTest")

  behavior of "Blockchain"

  it must "connect a new header to the current tip of a blockchain" in {
    bhDAO: BlockHeaderDAO =>
      val blockchain = Blockchain.fromHeaders(
        headers = Vector(ChainUnitTest.genesisHeaderDb)
      )

      val newHeader =
        BlockHeaderHelper.buildNextHeader(ChainUnitTest.genesisHeaderDb)

      val connectTipF = Blockchain.connectTip(header = newHeader.blockHeader,
                                              blockHeaderDAO = bhDAO)

      connectTipF.map {
        case BlockchainUpdate.Successful(_, connectedHeader) =>
          assert(newHeader == connectedHeader)

        case fail: BlockchainUpdate.Failed =>
          assert(false)
      }
  }
}
@@ -0,0 +1,139 @@
package org.bitcoins.chain.blockchain

import akka.actor.ActorSystem
import org.bitcoins.chain.config.ChainAppConfig
import org.bitcoins.chain.models.BlockHeaderDbHelper
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.core.util.FileUtil
import org.bitcoins.testkit.chain.fixture.ChainFixtureTag
import org.bitcoins.testkit.chain.{
  BlockHeaderHelper,
  ChainTestUtil,
  ChainUnitTest
}
import org.scalatest.{Assertion, FutureOutcome}
import play.api.libs.json.Json

import scala.concurrent.Future

class ChainHandlerTest extends ChainUnitTest {

  override type FixtureParam = ChainHandler

  override implicit val system = ActorSystem("ChainUnitTest")

  // we're working with mainnet data
  override lazy implicit val appConfig: ChainAppConfig = mainnetAppConfig

  override val defaultTag: ChainFixtureTag = ChainFixtureTag.GenisisChainHandler

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withChainHandler(test)

  behavior of "ChainHandler"

  it must "process a new valid block header, and then be able to fetch that header" in {
    chainHandler: ChainHandler =>
      val newValidHeader =
        BlockHeaderHelper.buildNextHeader(ChainUnitTest.genesisHeaderDb)
      val processedHeaderF =
        chainHandler.processHeader(newValidHeader.blockHeader)

      val foundHeaderF =
        processedHeaderF.flatMap(_.getHeader(newValidHeader.hashBE))

      foundHeaderF.map(found => assert(found.get == newValidHeader))
  }

  it must "have an in-order seed" in { _ =>
    val source = FileUtil.getFileAsSource("block_headers.json")
    val arrStr = source.getLines.next
    source.close()

    import org.bitcoins.rpc.serializers.JsonReaders.BlockHeaderReads
    val headersResult = Json.parse(arrStr).validate[Vector[BlockHeader]]
    if (headersResult.isError) {
      fail(headersResult.toString)
    }

    val blockHeaders = headersResult.get

    blockHeaders.reduce[BlockHeader] {
      case (prev, next) =>
        assert(next.previousBlockHashBE == prev.hashBE)
        next
    }

    succeed
  }

  it must "be able to process and fetch real headers from mainnet" in {
    chainHandler: ChainHandler =>
      val source = FileUtil.getFileAsSource("block_headers.json")
      val arrStr = source.getLines.next
      source.close()

      import org.bitcoins.rpc.serializers.JsonReaders.BlockHeaderReads
      val headersResult = Json.parse(arrStr).validate[Vector[BlockHeader]]
      if (headersResult.isError) {
        fail(headersResult.toString)
      }

      val blockHeaders =
        headersResult.get.drop(
          ChainUnitTest.FIRST_POW_CHANGE - ChainUnitTest.FIRST_BLOCK_HEIGHT)

      val firstBlockHeaderDb =
        BlockHeaderDbHelper.fromBlockHeader(ChainUnitTest.FIRST_POW_CHANGE - 2,
                                            ChainTestUtil.blockHeader562462)

      val secondBlockHeaderDb =
        BlockHeaderDbHelper.fromBlockHeader(ChainUnitTest.FIRST_POW_CHANGE - 1,
                                            ChainTestUtil.blockHeader562463)

      val thirdBlockHeaderDb =
        BlockHeaderDbHelper.fromBlockHeader(ChainUnitTest.FIRST_POW_CHANGE,
                                            ChainTestUtil.blockHeader562464)

      /*
       * We need to insert one block before the first POW check because it is used on the next
       * POW check. We then need to insert the next two blocks to circumvent a POW check since
       * that would require we have an old block in the Blockchain that we don't have.
       */
      val firstThreeBlocks =
        Vector(firstBlockHeaderDb, secondBlockHeaderDb, thirdBlockHeaderDb)

      val createdF = chainHandler.blockHeaderDAO.createAll(firstThreeBlocks)

      createdF.flatMap { _ =>
        val processorF = Future.successful(chainHandler)
        // Takes way too long to do all blocks
        val blockHeadersToTest = blockHeaders.tail
          .take(
            (2 * chainHandler.chainConfig.chain.difficultyChangeInterval + 1).toInt)
          .toList

        processHeaders(processorF = processorF,
                       remainingHeaders = blockHeadersToTest,
                       height = ChainUnitTest.FIRST_POW_CHANGE + 1)
      }
  }

  final def processHeaders(
      processorF: Future[ChainHandler],
      remainingHeaders: List[BlockHeader],
      height: Long): Future[Assertion] = {
    remainingHeaders match {
      case header :: headersTail =>
        val newProcessorF = processorF.flatMap(_.processHeader(header))
        val getHeaderF = newProcessorF.flatMap(_.getHeader(header.hashBE))
        val expectedBlockHeaderDb =
          BlockHeaderDbHelper.fromBlockHeader(height, header)
        val assertionF =
          getHeaderF.map(tips => assert(tips.contains(expectedBlockHeaderDb)))
        assertionF.flatMap(_ =>
          processHeaders(newProcessorF, headersTail, height = height + 1))
      case Nil => succeed
    }
  }
}
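The `processHeaders` helper above threads a `Future[ChainHandler]` through a list of headers one at a time, recursing with the updated state. That sequential-async-fold pattern can be sketched generically (hypothetical names, plain integers standing in for chain state):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Generic sketch of the processHeaders pattern: fold a list through an
// async step strictly in order, threading the accumulated state so each
// item sees the result of processing all previous items.
def processAll[S, A](state: Future[S], items: List[A])(
    step: (S, A) => Future[S]): Future[S] =
  items match {
    case head :: tail =>
      // chain the step onto the prior state before recursing on the tail
      processAll(state.flatMap(s => step(s, head)), tail)(step)
    case Nil => state
  }
```

Using `flatMap` before recursing is what guarantees strict ordering; a `Future.traverse` would start all steps concurrently, which would not work for header processing where each header builds on the previous tip.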
@@ -0,0 +1,74 @@
package org.bitcoins.chain.blockchain.sync

import akka.actor.ActorSystem
import org.bitcoins.chain.api.ChainApi
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.testkit.chain.ChainUnitTest
import org.bitcoins.testkit.chain.fixture.BitcoindChainHandlerViaRpc
import org.scalatest.FutureOutcome

import scala.concurrent.Future

class ChainSyncTest extends ChainUnitTest {
  override type FixtureParam = BitcoindChainHandlerViaRpc

  override implicit val system = ActorSystem(
    s"chain-sync-test-${System.currentTimeMillis()}")

  override def withFixture(test: OneArgAsyncTest): FutureOutcome = {
    withBitcoindChainHandlerViaRpc(test)
  }

  behavior of "ChainSync"

  it must "sync our chain handler when it is not synced with bitcoind" in {
    bitcoindWithChainHandler: BitcoindChainHandlerViaRpc =>
      val bitcoind = bitcoindWithChainHandler.bitcoindRpc
      val chainHandler = bitcoindWithChainHandler.chainHandler
      //first we need to implement the 'getBestBlockHashFunc' and 'getBlockHeaderFunc' functions
      val getBestBlockHashFunc = { () =>
        bitcoind.getBestBlockHash
      }

      val getBlockHeaderFunc = { hash: DoubleSha256DigestBE =>
        bitcoind.getBlockHeader(hash).map(_.blockHeader)
      }

      //let's generate a block on bitcoind
      val block1F = bitcoind.generate(1)
      val newChainHandlerF: Future[ChainApi] = block1F.flatMap { hashes =>
        ChainSync.sync(chainHandler = chainHandler,
                       getBlockHeaderFunc = getBlockHeaderFunc,
                       getBestBlockHashFunc = getBestBlockHashFunc)
      }

      newChainHandlerF.flatMap { chainHandler =>
        chainHandler.getBlockCount.map(count => assert(count == 1))
      }
  }

  it must "not fail when syncing a chain handler that is synced with its external data source" in {
    bitcoindWithChainHandler: BitcoindChainHandlerViaRpc =>
      val bitcoind = bitcoindWithChainHandler.bitcoindRpc
      val chainHandler = bitcoindWithChainHandler.chainHandler
      //first we need to implement the 'getBestBlockHashFunc' and 'getBlockHeaderFunc' functions
      val getBestBlockHashFunc = { () =>
        bitcoind.getBestBlockHash
      }

      val getBlockHeaderFunc = { hash: DoubleSha256DigestBE =>
        bitcoind.getBlockHeader(hash).map(_.blockHeader)
      }

      //note we are not generating a block on bitcoind
      val newChainHandlerF: Future[ChainApi] =
        ChainSync.sync(chainHandler = chainHandler,
                       getBlockHeaderFunc = getBlockHeaderFunc,
                       getBestBlockHashFunc = getBestBlockHashFunc)

      newChainHandlerF.flatMap { chainHandler =>
        chainHandler.getBlockCount.map(count => assert(count == 0))
      }
  }
}
@@ -0,0 +1,175 @@
package org.bitcoins.chain.models

import akka.actor.ActorSystem
import org.bitcoins.testkit.chain.{BlockHeaderHelper, ChainUnitTest}
import org.scalatest.FutureOutcome

import scala.concurrent.Future

/**
  * Created by chris on 9/8/16.
  */
class BlockHeaderDAOTest extends ChainUnitTest {

  override type FixtureParam = BlockHeaderDAO

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withBlockHeaderDAO(test)

  override implicit val system: ActorSystem = ActorSystem("BlockHeaderDAOTest")

  behavior of "BlockHeaderDAO"

  private val genesisHeaderDb = ChainUnitTest.genesisHeaderDb
  it should "insert and read the genesis block header back" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val readF = blockHeaderDAO.read(genesisHeaderDb.hashBE)

      val assert1 = readF.map { readHeader =>
        assert(readHeader.get.blockHeader.hashBE == genesisHeaderDb.hashBE)
      }
      val read1F = blockHeaderDAO.getAtHeight(0)

      val assert2 = {
        read1F.map { headersAtHeight0 =>
          assert(headersAtHeight0 == List(genesisHeaderDb))
        }
      }

      assert1.flatMap(_ => assert2.map(_ => succeed))

  }

  it must "delete a block header in the database" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val blockHeader = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)

      val createdF = blockHeaderDAO.create(blockHeader)
      //delete the header in the db
      val deletedF = {
        createdF.flatMap { _ =>
          blockHeaderDAO.delete(blockHeader)
        }
      }

      deletedF.flatMap { _ =>
        blockHeaderDAO
          .read(blockHeader.blockHeader.hashBE)
          .map(opt => assert(opt.isEmpty))
      }

  }

  it must "retrieve the chain tip saved in the database" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val blockHeader = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)

      val createdF = blockHeaderDAO.create(blockHeader)

      val chainTip1F = createdF.flatMap { _ =>
        blockHeaderDAO.chainTips
      }

      val assert1F = chainTip1F.map { tips =>
        assert(tips.length == 1)
        assert(tips.head.blockHeader.hash == blockHeader.blockHeader.hash)
      }

      val blockHeader2 = BlockHeaderHelper.buildNextHeader(blockHeader)

      //insert another header and make sure that is the new last header
      assert1F.flatMap { _ =>
        val created2F = blockHeaderDAO.create(blockHeader2)
        val chainTip2F = created2F.flatMap(_ => blockHeaderDAO.chainTips)

        chainTip2F.map { tips =>
          assert(tips.length == 1)
          assert(tips.head.blockHeader.hash == blockHeader2.blockHeader.hash)
        }
      }

  }

  it must "return the genesis block when retrieving block headers from an empty database" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val chainTipsF = blockHeaderDAO.chainTips
      chainTipsF.map { tips =>
        assert(tips.headOption == Some(genesisHeaderDb))
      }
  }

  it must "retrieve a block header by height" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val blockHeader = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)

      val createdF = blockHeaderDAO.create(blockHeader)

      val getAtHeightF: Future[Vector[BlockHeaderDb]] = {
        createdF.flatMap { _ =>
          blockHeaderDAO.getAtHeight(1)
        }
      }

      val assert1F = getAtHeightF.map {
        case headers =>
          assert(headers.head == blockHeader)
          assert(headers.head.height == 1)
      }

      //create one at height 2
      val blockHeader2 = BlockHeaderHelper.buildNextHeader(blockHeader)

      val created2F = blockHeaderDAO.create(blockHeader2)

      val getAtHeight2F: Future[Vector[BlockHeaderDb]] = {
        created2F.flatMap(_ => blockHeaderDAO.getAtHeight(2))
      }

      val assert2F = getAtHeight2F.map { headers =>
        assert(headers.head == blockHeader2)
      }

      assert1F.flatMap(_ => assert2F.map(_ => succeed))
  }

  it must "find the height of the longest chain" in {
    blockHeaderDAO: BlockHeaderDAO =>
      val blockHeader = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)
      val createdF = blockHeaderDAO.create(blockHeader)
|
||||
|
||||
val maxHeightF = createdF.flatMap(_ => blockHeaderDAO.maxHeight)
|
||||
|
||||
val blockHeader2 = BlockHeaderHelper.buildNextHeader(blockHeader)
|
||||
|
||||
val created2F =
|
||||
maxHeightF.flatMap(_ => blockHeaderDAO.create(blockHeader2))
|
||||
|
||||
val maxHeight2F = created2F.flatMap(_ => blockHeaderDAO.maxHeight)
|
||||
|
||||
maxHeightF.flatMap { h1 =>
|
||||
maxHeight2F.map { h2 =>
|
||||
assert(h1 == 1)
|
||||
assert(h2 == 2)
|
||||
|
||||
}
|
||||
}
|
||||
|
||||
}
|
||||
|
||||
it must "find the height of two headers that are competing to be the longest chain" in {
|
||||
blockHeaderDAO: BlockHeaderDAO =>
|
||||
val blockHeader = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)
|
||||
val createdF = blockHeaderDAO.create(blockHeader)
|
||||
|
||||
val blockHeader1 = BlockHeaderHelper.buildNextHeader(genesisHeaderDb)
|
||||
val created2F = createdF.flatMap(_ => blockHeaderDAO.create(blockHeader1))
|
||||
|
||||
//now make sure they are both at height 1
|
||||
val getHeightF = created2F.flatMap(_ => blockHeaderDAO.getAtHeight(1))
|
||||
|
||||
getHeightF.map {
|
||||
case headers =>
|
||||
assert(headers == Seq(blockHeader, blockHeader1))
|
||||
}
|
||||
}
|
||||
}
@ -0,0 +1,76 @@
package org.bitcoins.chain.pow

import akka.actor.ActorSystem
import org.bitcoins.chain.config.ChainAppConfig
import org.bitcoins.chain.models.BlockHeaderDAO
import org.bitcoins.core.protocol.blockchain.MainNetChainParams
import org.bitcoins.db.AppConfig
import org.bitcoins.testkit.chain.fixture.{ChainFixture, ChainFixtureTag}
import org.bitcoins.testkit.chain.{ChainTestUtil, ChainUnitTest}
import org.scalatest.FutureOutcome

import scala.concurrent.Future

class BitcoinPowTest extends ChainUnitTest {

  override type FixtureParam = ChainFixture

  override lazy implicit val appConfig: ChainAppConfig = mainnetAppConfig

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withChainFixture(test)

  override implicit val system: ActorSystem = ActorSystem("BitcoinPowTest")

  behavior of "BitcoinPow"

  it must "NOT calculate a POW change when one is not needed" inFixtured {
    case ChainFixture.Empty =>
      val blockHeaderDAO = BlockHeaderDAO(appConfig)
      val header1 = ChainTestUtil.ValidPOWChange.blockHeaderDb566494
      val header2 = ChainTestUtil.ValidPOWChange.blockHeaderDb566495

      val nextWorkF =
        Pow.getNetworkWorkRequired(header1, header2.blockHeader, blockHeaderDAO)

      nextWorkF.map(nextWork => assert(nextWork == header1.nBits))
  }

  it must "calculate a pow change as per the bitcoin network" inFixtured {
    case ChainFixture.Empty =>
      val firstBlockDb = ChainTestUtil.ValidPOWChange.blockHeaderDb564480
      val currentTipDb = ChainTestUtil.ValidPOWChange.blockHeaderDb566495
      val expectedNextWork =
        ChainTestUtil.ValidPOWChange.blockHeader566496.nBits
      val calculatedWorkF =
        Pow.calculateNextWorkRequired(currentTipDb,
                                      firstBlockDb,
                                      MainNetChainParams)

      calculatedWorkF.map(calculatedWork =>
        assert(calculatedWork == expectedNextWork))
  }

  it must "GetNextWorkRequired correctly" taggedAs ChainFixtureTag.PopulatedBlockHeaderDAO inFixtured {
    case ChainFixture.PopulatedBlockHeaderDAO(blockHeaderDAO) =>
      val iterations = 4200

      // We must start after the first POW change to avoid looking for a block we don't have
      val assertionFs =
        (ChainUnitTest.FIRST_POW_CHANGE + 1 until ChainUnitTest.FIRST_POW_CHANGE + 1 + iterations)
          .map { height =>
            val blockF = blockHeaderDAO.getAtHeight(height).map(_.head)
            val nextBlockF = blockHeaderDAO.getAtHeight(height + 1).map(_.head)

            for {
              currentTip <- blockF
              nextTip <- nextBlockF
              nextNBits <- Pow.getNetworkWorkRequired(currentTip,
                                                      nextTip.blockHeader,
                                                      blockHeaderDAO)
            } yield assert(nextNBits == nextTip.nBits)
          }

      Future.sequence(assertionFs).map(_ => succeed)
  }
}
@ -0,0 +1,81 @@
package org.bitcoins.chain.validation

import akka.actor.ActorSystem
import org.bitcoins.chain.db.ChainDbManagement
import org.bitcoins.chain.models.{
  BlockHeaderDAO,
  BlockHeaderDb,
  BlockHeaderDbHelper
}
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.testkit.chain.{
  BlockHeaderHelper,
  ChainTestUtil,
  ChainUnitTest
}
import org.scalatest.{Assertion, FutureOutcome}

import scala.concurrent.Future
import org.bitcoins.db.AppConfig
import org.bitcoins.chain.config.ChainAppConfig
import com.typesafe.config.ConfigFactory

class TipValidationTest extends ChainUnitTest {

  override type FixtureParam = BlockHeaderDAO

  // we're working with mainnet data
  override lazy implicit val appConfig: ChainAppConfig = mainnetAppConfig

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withBlockHeaderDAO(test)

  override implicit val system: ActorSystem = ActorSystem("TipValidationTest")

  behavior of "TipValidation"

  //blocks 566,092 and 566,093
  val newValidTip = BlockHeaderHelper.header1
  val currentTipDb = BlockHeaderHelper.header2Db

  it must "connect two blocks that are valid" in { bhDAO =>
    val newValidTipDb =
      BlockHeaderDbHelper.fromBlockHeader(566093, newValidTip)
    val expected = TipUpdateResult.Success(newValidTipDb)

    runTest(newValidTip, expected, bhDAO)
  }

  it must "fail to connect two blocks that do not reference the previous block hash correctly" in {
    bhDAO =>
      val badPrevHash = BlockHeaderHelper.badPrevHash

      val expected = TipUpdateResult.BadPreviousBlockHash(badPrevHash)

      runTest(badPrevHash, expected, bhDAO)
  }

  it must "fail to connect two blocks with two different POW requirements at the wrong interval" in {
    bhDAO =>
      val badPOW = BlockHeaderHelper.badNBits
      val expected = TipUpdateResult.BadPOW(badPOW)
      runTest(badPOW, expected, bhDAO)
  }

  it must "fail to connect two blocks with a bad nonce" in { bhDAO =>
    val badNonce = BlockHeaderHelper.badNonce
    val expected = TipUpdateResult.BadNonce(badNonce)
    runTest(badNonce, expected, bhDAO)
  }

  private def runTest(
      header: BlockHeader,
      expected: TipUpdateResult,
      blockHeaderDAO: BlockHeaderDAO,
      currentTipDbDefault: BlockHeaderDb = currentTipDb): Future[Assertion] = {
    val checkTipF =
      TipValidation.checkNewTip(header, currentTipDbDefault, blockHeaderDAO)

    checkTipF.map(validationResult => assert(validationResult == expected))
  }
}
chain/README.md (new file, 17 lines)
@ -0,0 +1,17 @@
### chain

This is meant to be a standalone project that processes a new block / transaction and stores it.
It also provides an interface to query information related to a blockchain.

The design goal of this project is to be agnostic of how the project receives
the blockchain data, only that it processes and stores it. For instance,
you could provide the blockchain data via

- rpc
- zmq
- p2p
- satellite

This project just stores relevant [`block`](../core/src/main/scala/org/bitcoins/core/protocol/blockchain/Block.scala)
and [`transaction`](../core/src/main/scala/org/bitcoins/core/protocol/transaction/Transaction.scala) information and allows
it to be queried via an API.
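The README describes headers arriving from any data source (rpc, zmq, p2p) and being folded into the chain state one at a time. That sequential, fold-based shape is what `ChainApi.processHeaders` in this commit uses; a minimal self-contained sketch follows, where `Chain`, `processHeader`, and integer "headers" are hypothetical stand-ins, not bitcoin-s APIs:

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import ExecutionContext.Implicits.global

// Hypothetical stand-in for the chain state: processing a header yields a new state.
final case class Chain(heights: Vector[Int]) {

  def processHeader(h: Int): Future[Chain] =
    Future.successful(Chain(heights :+ h))

  // Process headers strictly in order: each step waits on the previous chain state,
  // mirroring the foldLeft/flatMap pattern of ChainApi.processHeaders.
  def processHeaders(hs: Vector[Int]): Future[Chain] =
    hs.foldLeft(Future.successful(this)) {
      case (chainF, h) => chainF.flatMap(_.processHeader(h))
    }
}

object Demo extends App {
  val synced = Await.result(Chain(Vector(0)).processHeaders(Vector(1, 2, 3)), 5.seconds)
  println(synced.heights) // Vector(0, 1, 2, 3)
}
```

The fold guarantees ordering: if an out-of-order header makes a `processHeader` step fail, the whole resulting `Future` fails, which is the behavior the `ChainApi` scaladoc calls out.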
chain/build.sbt (new file, 3 lines)
@ -0,0 +1,3 @@
coverageMinimum := 90

coverageFailOnMinimum := true
chain/src/main/resources/chain.conf (new file, 5 lines)
@ -0,0 +1,5 @@
bitcoin-s {
    database {
        name = "chaindb.sqlite"
    }
}
chain/src/main/scala/org/bitcoins/chain/api/ChainApi.scala (new file, 48 lines)
@ -0,0 +1,48 @@
package org.bitcoins.chain.api

import org.bitcoins.db._
import org.bitcoins.chain.models.BlockHeaderDb
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.core.protocol.blockchain.BlockHeader

import scala.concurrent.{ExecutionContext, Future}

/**
  * Entry API to the chain project for adding new things to our blockchain
  */
trait ChainApi {

  def chainConfig: AppConfig

  /**
    * Adds a block header to our chain project
    * @param header the block header to process
    * @return a new [[ChainApi chain api]] that contains the given header
    */
  def processHeader(header: BlockHeader)(
      implicit ec: ExecutionContext): Future[ChainApi]

  /** Processes all of the given headers and returns a new [[ChainApi chain api]]
    * that contains these headers. This method processes headers in the order
    * that they are given. If the headers are out of order, this method will fail.
    * @param headers the block headers to process, in order
    * @return
    */
  def processHeaders(headers: Vector[BlockHeader])(
      implicit ec: ExecutionContext): Future[ChainApi] = {
    headers.foldLeft(Future.successful(this)) {
      case (chainF, header) =>
        chainF.flatMap(_.processHeader(header))
    }
  }

  /** Gets a [[org.bitcoins.chain.models.BlockHeaderDb]] from the chain's database */
  def getHeader(hash: DoubleSha256DigestBE): Future[Option[BlockHeaderDb]]

  /** Gets the number of blocks in the database */
  def getBlockCount: Future[Long]

  /** Gets the hash of the block that is what we consider "best" */
  def getBestBlockHash(
      implicit ec: ExecutionContext): Future[DoubleSha256DigestBE]
}
@ -0,0 +1,109 @@
package org.bitcoins.chain.blockchain

import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb}
import org.bitcoins.chain.validation.{TipUpdateResult, TipValidation}
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.core.util.BitcoinSLogger

import scala.concurrent.{ExecutionContext, Future}

/**
  * In memory implementation of a blockchain.
  * This data structure maintains the state of a
  * blockchain in memory; the headers can be accessed
  * with [[headers]]. The headers are stored with the most
  * recent header at index 0, the second most recent header at index 1, etc.
  * You can walk the chain by
  * {{{
  *   headers.map(h => println(h))
  * }}}
  */
case class Blockchain(headers: Vector[BlockHeaderDb]) extends BitcoinSLogger {
  val tip: BlockHeaderDb = headers.head
}

object Blockchain extends BitcoinSLogger {

  def fromHeaders(headers: Vector[BlockHeaderDb]): Blockchain = {
    Blockchain(headers)
  }

  /**
    * Attempts to connect the given block header with the given blockchain.
    * This is done via the companion object for blockchain because
    * we query [[BlockHeaderDAO block header dao]] for the chain tips.
    * We then attempt to connect this block header to all of our current
    * chain tips.
    * @param header the block header to connect to our chain
    * @param blockHeaderDAO where we can find our blockchain
    * @param ec
    * @return a [[Future future]] that contains a [[BlockchainUpdate update]] indicating
    *         we [[BlockchainUpdate.Successful successfully]] connected the tip,
    *         or [[BlockchainUpdate.Failed failed]] to connect to a tip
    */
  def connectTip(header: BlockHeader, blockHeaderDAO: BlockHeaderDAO)(
      implicit ec: ExecutionContext): Future[BlockchainUpdate] = {

    //get all competing chains we have
    val blockchainsF: Future[Vector[Blockchain]] =
      blockHeaderDAO.getBlockchains()

    val tipResultF: Future[BlockchainUpdate] = blockchainsF.flatMap {
      blockchains =>
        val nested: Vector[Future[BlockchainUpdate]] = blockchains.map {
          blockchain =>
            val tip = blockchain.tip
            logger.info(
              s"Attempting to add new tip=${header.hashBE.hex} with prevhash=${header.previousBlockHashBE.hex} to chain with current tips=${tip.hashBE.hex}")
            val tipResultF = TipValidation.checkNewTip(newPotentialTip = header,
                                                       currentTip = tip,
                                                       blockHeaderDAO =
                                                         blockHeaderDAO)

            tipResultF.map { tipResult =>
              tipResult match {
                case TipUpdateResult.Success(headerDb) =>
                  val newChain =
                    Blockchain.fromHeaders(headerDb +: blockchain.headers)
                  BlockchainUpdate.Successful(newChain, headerDb)
                case fail: TipUpdateResult.Failure =>
                  BlockchainUpdate.Failed(blockchain, header, fail)
              }
            }
        }
        parseSuccessOrFailure(nested = nested)
    }

    tipResultF
  }

  /** Takes in a vector of blockchain updates being executed asynchronously. We can only connect one [[BlockHeader header]]
    * to a tip successfully, which means _all_ other [[BlockchainUpdate updates]] must fail. This is a helper method
    * to find the one [[BlockchainUpdate.Successful successful]] update, or else returns one of the [[BlockchainUpdate.Failed failures]]
    * @param nested
    * @param ec
    * @return
    */
  private def parseSuccessOrFailure(nested: Vector[Future[BlockchainUpdate]])(
      implicit ec: ExecutionContext): Future[BlockchainUpdate] = {
    val successfulTipOptF: Future[Option[BlockchainUpdate]] = {
      Future.find(nested) {
        case update: BlockchainUpdate =>
          update.isInstanceOf[BlockchainUpdate.Successful]
      }
    }

    successfulTipOptF.flatMap {
      case Some(update) => Future.successful(update)
      case None =>
        //if we didn't successfully connect a tip, just take the first failure we see
        Future
          .find(nested) {
            case update: BlockchainUpdate =>
              update.isInstanceOf[BlockchainUpdate.Failed]
          }
          .map(_.get)
    }
  }
}
@ -0,0 +1,26 @@
package org.bitcoins.chain.blockchain

import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb}

import scala.collection.mutable

/**
  * @inheritdoc
  * @param blockHeaderDAO
  */
case class BlockchainBuilder(blockHeaderDAO: BlockHeaderDAO)
    extends mutable.Builder[BlockHeaderDb, Blockchain] {
  private val internal = Vector.newBuilder[BlockHeaderDb]

  override def result(): Blockchain = {
    Blockchain.fromHeaders(internal.result().reverse)
  }

  override def +=(blockHeaderDb: BlockHeaderDb): this.type = {
    internal.+=(blockHeaderDb)
    this
  }

  override def clear(): Unit = internal.clear()
}
@ -0,0 +1,35 @@
package org.bitcoins.chain.blockchain

import org.bitcoins.chain.models.BlockHeaderDb
import org.bitcoins.chain.validation.TipUpdateResult
import org.bitcoins.core.protocol.blockchain.BlockHeader

/** Represents the state of an update to our [[org.bitcoins.chain.blockchain.Blockchain Blockchain]].
  * An example of a successful update is receiving a [[BlockHeader BlockHeader]] and successfully
  * adding it to our database.
  *
  * An example of a [[org.bitcoins.chain.blockchain.BlockchainUpdate.Failed Failed]] update
  * is when we receive a [[BlockHeader]] that is invalid, resulting in a
  * [[org.bitcoins.chain.validation.TipUpdateResult.Failure TipUpdateFailure]]
  * such as [[org.bitcoins.chain.validation.TipUpdateResult.BadPOW BadPOW]] or
  * [[org.bitcoins.chain.validation.TipUpdateResult.BadNonce BadNonce]] etc.
  */
sealed abstract class BlockchainUpdate

object BlockchainUpdate {

  /** The key thing we receive here is a [[org.bitcoins.chain.models.BlockHeaderDb BlockHeaderDb]]
    * with a height assigned to it. This happens after
    * calling [[ChainHandler.processHeader() ChainHandler.processHeader]]
    */
  case class Successful(blockchain: Blockchain, updatedHeader: BlockHeaderDb)
      extends BlockchainUpdate {
    def height: Long = updatedHeader.height
  }

  case class Failed(
      blockchain: Blockchain,
      failedHeader: BlockHeader,
      tipUpdateFailure: TipUpdateResult.Failure)
      extends BlockchainUpdate
}
@ -0,0 +1,73 @@
package org.bitcoins.chain.blockchain

import org.bitcoins.chain.api.ChainApi
import org.bitcoins.chain.config.ChainAppConfig
import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb}
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.core.util.BitcoinSLogger

import scala.concurrent.{ExecutionContext, Future}

/**
  * Chain Handler is meant to be the reference implementation
  * of [[ChainApi]]; this is the entry point into the
  * chain project.
  */
case class ChainHandler(
    blockHeaderDAO: BlockHeaderDAO,
    chainConfig: ChainAppConfig)
    extends ChainApi
    with BitcoinSLogger {

  override def getBlockCount: Future[Long] = {
    blockHeaderDAO.maxHeight
  }

  override def getHeader(
      hash: DoubleSha256DigestBE): Future[Option[BlockHeaderDb]] = {
    blockHeaderDAO.findByHash(hash)
  }

  override def processHeader(header: BlockHeader)(
      implicit ec: ExecutionContext): Future[ChainHandler] = {

    val blockchainUpdateF = Blockchain.connectTip(header, blockHeaderDAO)

    val newHandlerF = blockchainUpdateF.flatMap {
      case BlockchainUpdate.Successful(_, updatedHeader) =>
        //now that we have successfully connected the header, we need to insert
        //it into the database
        val createdF = blockHeaderDAO.create(updatedHeader)
        createdF.map(_ => ChainHandler(blockHeaderDAO, chainConfig))
      case BlockchainUpdate.Failed(_, _, reason) =>
        val errMsg =
          s"Failed to add header to chain, header=${header.hashBE.hex} reason=${reason}"
        logger.warn(errMsg)
        Future.failed(new RuntimeException(errMsg))
    }

    blockchainUpdateF.failed.foreach { err =>
      logger.error(
        s"Failed to connect header=${header.hashBE.hex} err=${err.getMessage}")
    }

    newHandlerF
  }

  /**
    * @inheritdoc
    */
  override def getBestBlockHash(
      implicit ec: ExecutionContext): Future[DoubleSha256DigestBE] = {
    //naive implementation, this is looking for the tip with the _most_ proof of work
    //this does _not_ mean that it is on the chain that has the most work
    //TODO: Enhance this in the future to return the "heaviest" header
    //https://bitcoin.org/en/glossary/block-chain
    blockHeaderDAO.chainTips.map { tips =>
      //sortBy is ascending, so the tip with the most proof of work is last
      val sorted = tips.sortBy(header => header.blockHeader.difficulty)
      sorted.last.hashBE
    }
  }
}
@ -0,0 +1,5 @@
package org.bitcoins.chain.blockchain

import org.bitcoins.chain.validation.TipUpdateResult

case class CheckHeaderResult(result: TipUpdateResult, chain: Blockchain)
@ -0,0 +1,124 @@
package org.bitcoins.chain.blockchain.sync

import org.bitcoins.chain.api.ChainApi
import org.bitcoins.chain.blockchain.ChainHandler
import org.bitcoins.chain.models.BlockHeaderDb
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.core.util.BitcoinSLogger

import scala.concurrent.{ExecutionContext, Future}

trait ChainSync extends BitcoinSLogger {

  /** This method checks if our chain handler has the same tip of the blockchain as an external source.
    * If we do not have the same chain, we sync our chain handler until we are at the same best block hash.
    * @param chainHandler our internal chain handler
    * @param getBlockHeaderFunc a function that we can call to retrieve a block header
    * @param getBestBlockHashFunc a function that can call a third party source (bitcoind, block explorer etc)
    *                             to retrieve what the best block is on the blockchain
    * @param ec
    * @return
    */
  def sync(
      chainHandler: ChainHandler,
      getBlockHeaderFunc: DoubleSha256DigestBE => Future[BlockHeader],
      getBestBlockHashFunc: () => Future[DoubleSha256DigestBE])(
      implicit ec: ExecutionContext): Future[ChainApi] = {
    val currentTipsF: Future[Vector[BlockHeaderDb]] = {
      chainHandler.blockHeaderDAO.chainTips
    }

    //TODO: We are implicitly trusting whatever
    // getBestBlockHashFunc returns as the best chain
    // and we don't ever even have to have this connect
    // with our current best tips
    // do we somehow want to mitigate against the divergence
    // in chains here?
    val bestBlockHashF = {
      getBestBlockHashFunc()
    }

    val updatedChainApi = bestBlockHashF.flatMap { bestBlockHash =>
      currentTipsF.flatMap { tips =>
        syncTips(chainApi = chainHandler,
                 tips = tips,
                 bestBlockHash = bestBlockHash,
                 getBlockHeaderFunc = getBlockHeaderFunc)
      }
    }

    updatedChainApi
  }

  /**
    * Keeps walking backwards on the chain until we match one
    * of the tips we have in our chain
    * @param chainApi the chain api that represents our current chain state
    * @param tips the best block headers we know about
    * @param bestBlockHash the best block header seen by our third party data source
    * @param getBlockHeaderFunc how we can retrieve block headers
    * @param ec
    * @return
    */
  private def syncTips(
      chainApi: ChainApi,
      tips: Vector[BlockHeaderDb],
      bestBlockHash: DoubleSha256DigestBE,
      getBlockHeaderFunc: DoubleSha256DigestBE => Future[BlockHeader])(
      implicit ec: ExecutionContext): Future[ChainApi] = {
    require(tips.nonEmpty, s"Cannot sync without the genesis block")

    //we need to walk backwards on the chain until we get to one of our tips

    val tipsBH = tips.map(_.blockHeader)

    def loop(
        lastHeaderF: Future[BlockHeader],
        accum: List[BlockHeader]): Future[List[BlockHeader]] = {
      lastHeaderF.flatMap { lastHeader =>
        if (tipsBH.contains(lastHeader)) {
          //means we have synced back to a block that we know
          Future.successful(accum)
        } else {
          logger.debug(s"Last header=${lastHeader.hashBE.hex}")
          //we don't know this block, so we need to keep walking backwards
          //to find a block we know
          val newLastHeaderF = getBlockHeaderFunc(lastHeader.previousBlockHashBE)

          loop(newLastHeaderF, lastHeader +: accum)
        }
      }
    }

    val bestHeaderF = getBlockHeaderFunc(bestBlockHash)

    bestHeaderF.map { bestHeader =>
      logger.info(
        s"Best tip from third party=${bestHeader.hashBE.hex} currentTips=${tips.map(_.hashBE.hex)}")
    }

    //one sanity check to make sure we aren't _ahead_ of our data source
    val hasBlockHashF = chainApi.getHeader(bestBlockHash)

    hasBlockHashF.flatMap { hasBlockHashOpt: Option[BlockHeaderDb] =>
      if (hasBlockHashOpt.isDefined) {
        //if we have the best block hash in our
        //chainstate already, we don't need to search
        //for it again!
        Future.successful(chainApi)
      } else {
        //this represents all headers we have received from our external data source
        //and need to process with our chain handler
        val headersToSyncF = loop(bestHeaderF, List.empty)

        //now we are going to add them to our chain and return the chain api
        headersToSyncF.flatMap { headers =>
          logger.info(
            s"Attempting to sync ${headers.length} blockheaders to our chainstate")
          chainApi.processHeaders(headers.toVector)
        }
      }
    }
  }
}

object ChainSync extends ChainSync
@ -0,0 +1,62 @@
package org.bitcoins.chain.config

import com.typesafe.config.Config
import org.bitcoins.chain.db.ChainDbManagement
import org.bitcoins.db._
import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDbHelper}
import org.bitcoins.core.util.FutureUtil

import scala.concurrent.ExecutionContext
import scala.concurrent.Future
import scala.concurrent.Promise
import scala.util.Success
import scala.util.Failure

case class ChainAppConfig(confs: Config*) extends AppConfig {
  override protected val configOverrides: List[Config] = confs.toList
  override protected val moduleConfigName: String = "chain.conf"
  override protected type ConfigType = ChainAppConfig
  override protected def newConfigOfType(
      configs: List[Config]): ChainAppConfig = ChainAppConfig(configs: _*)

  def isInitialized()(implicit ec: ExecutionContext): Future[Boolean] = {
    val bhDAO = BlockHeaderDAO(this)
    val p = Promise[Boolean]()
    val isDefinedOptF = {
      bhDAO.read(chain.genesisBlock.blockHeader.hashBE).map(_.isDefined)
    }
    isDefinedOptF.onComplete {
      case Success(bool) =>
        logger.info(s"Chain project is initialized")
        p.success(bool)
      case Failure(err) =>
        logger.info(s"Failed to init chain app err=${err.getMessage}")
        p.success(false)
    }

    p.future
  }

  /** Initializes our chain project if it is needed.
    * This creates the necessary tables for the chain project
    * and inserts preliminary data like the genesis block header
    */
  def initialize(implicit ec: ExecutionContext): Future[Unit] = {
    val blockHeaderDAO = BlockHeaderDAO(this)
    val isInitF = isInitialized()
    isInitF.flatMap { isInit =>
      if (isInit) {
        FutureUtil.unit
      } else {
        val createdF = ChainDbManagement.createAll()(this, ec)
        val genesisHeader =
          BlockHeaderDbHelper.fromBlockHeader(height = 0,
                                              bh =
                                                chain.genesisBlock.blockHeader)
        val bhCreatedF =
          createdF.flatMap(_ => blockHeaderDAO.create(genesisHeader))
        bhCreatedF.flatMap(_ => FutureUtil.unit)
      }
    }
  }
}
@ -0,0 +1,31 @@
package org.bitcoins.chain.db

import org.bitcoins.db._
import org.bitcoins.chain.models.BlockHeaderTable
import slick.lifted.TableQuery

import scala.concurrent.Future

/**
  * Responsible for creating and destroying database
  * tables inside of the Chain project.
  */
sealed abstract class ChainDbManagement extends DbManagement {

  private val chainTable: TableQuery[BlockHeaderTable] =
    TableQuery[BlockHeaderTable]

  override val allTables = List(chainTable)

  def createHeaderTable(createIfNotExists: Boolean = true)(
      implicit config: AppConfig): Future[Unit] = {
    createTable(chainTable, createIfNotExists)
  }

  def dropHeaderTable()(implicit config: AppConfig): Future[Unit] = {
    dropTable(chainTable)
  }
}

object ChainDbManagement extends ChainDbManagement
@@ -0,0 +1,196 @@
package org.bitcoins.chain.models

import org.bitcoins.chain.blockchain.Blockchain
import org.bitcoins.chain.config.ChainAppConfig
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.db._
import slick.jdbc.SQLiteProfile
import slick.jdbc.SQLiteProfile.api._

import scala.annotation.tailrec
import scala.concurrent.{ExecutionContext, Future}

/**
  * This class is responsible for all database access related
  * to [[org.bitcoins.core.protocol.blockchain.BlockHeader]]s in
  * our chain project
  */
case class BlockHeaderDAO(appConfig: ChainAppConfig)(
    implicit override val ec: ExecutionContext)
    extends CRUD[BlockHeaderDb, DoubleSha256DigestBE] {

  import org.bitcoins.db.DbCommonsColumnMappers._

  override val table: TableQuery[BlockHeaderTable] =
    TableQuery[BlockHeaderTable]

  /** Creates all of the given [[BlockHeaderDb]]s in the database */
  override def createAll(
      headers: Vector[BlockHeaderDb]): Future[Vector[BlockHeaderDb]] = {
    SlickUtil.createAllNoAutoInc(ts = headers,
                                 database = database,
                                 table = table)
  }

  override protected def findAll(
      ts: Vector[BlockHeaderDb]): Query[Table[_], BlockHeaderDb, Seq] = {
    findByPrimaryKeys(ts.map(_.hashBE))
  }

  def findByHash(hash: DoubleSha256DigestBE): Future[Option[BlockHeaderDb]] = {
    val query = findByPrimaryKey(hash).result
    database.runVec(query).map(_.headOption)
  }

  override def findByPrimaryKeys(hashes: Vector[DoubleSha256DigestBE]): Query[
    Table[_],
    BlockHeaderDb,
    Seq] = {
    table.filter(_.hash.inSet(hashes))
  }

  /** Retrieves the ancestor of the given block header at the given height */
  def getAncestorAtHeight(
      child: BlockHeaderDb,
      height: Long): Future[Option[BlockHeaderDb]] = {
    /*
     * To avoid making many database reads, we make one database read for all
     * possibly useful block headers.
     */
    val headersF = getBetweenHeights(from = height, to = child.height - 1)

    /*
     * We then bucket sort these headers by height so that any ancestor can be found
     * in linear time, assuming a bounded number of contentious tips.
     */
    val headersByHeight: Array[Vector[BlockHeaderDb]] =
      new Array[Vector[BlockHeaderDb]](_length = (child.height - height).toInt)

    /*
     * Arrays of objects are initialized with null, which is evil,
     * so we start by giving each element of the array a Vector.empty.
     */
    headersByHeight.indices.foreach(index =>
      headersByHeight(index) = Vector.empty)

    // Bucket sort
    headersF.map { headers =>
      headers.foreach { header =>
        val index = (header.height - height).toInt
        headersByHeight(index) = headersByHeight(index).:+(header)
      }

      // Now that the bucket sort is done, we get rid of mutability
      val groupedByHeightHeaders: List[Vector[BlockHeaderDb]] =
        headersByHeight.toList

      @tailrec
      def loop(
          currentHeader: BlockHeaderDb,
          headersByDescHeight: List[Vector[BlockHeaderDb]]): Option[
        BlockHeaderDb] = {
        if (currentHeader.height == height) {
          Some(currentHeader)
        } else {
          val prevHeaderOpt = headersByDescHeight.headOption.flatMap(
            _.find(_.hashBE == currentHeader.previousBlockHashBE))

          prevHeaderOpt match {
            case None             => None
            case Some(prevHeader) => loop(prevHeader, headersByDescHeight.tail)
          }
        }
      }

      loop(child, groupedByHeightHeaders.reverse)
    }
  }

  /** Retrieves all [[BlockHeaderDb]]s at the given height */
  def getAtHeight(height: Long): Future[Vector[BlockHeaderDb]] = {
    val query = getAtHeightQuery(height)
    database.runVec(query)
  }

  def getAtHeightQuery(height: Long): SQLiteProfile.StreamingProfileAction[
    Seq[BlockHeaderDb],
    BlockHeaderDb,
    Effect.Read] = {
    table.filter(_.height === height).result
  }

  /** Gets block headers between `from` and `to` (inclusive); the result may be out of order */
  def getBetweenHeights(from: Long, to: Long): Future[Vector[BlockHeaderDb]] = {
    val query = getBetweenHeightsQuery(from, to)
    database.runVec(query)
  }

  def getBetweenHeightsQuery(
      from: Long,
      to: Long): SQLiteProfile.StreamingProfileAction[
    Seq[BlockHeaderDb],
    BlockHeaderDb,
    Effect.Read] = {
    table.filter(header => header.height >= from && header.height <= to).result
  }

  /** Returns the maximum block height from our database */
  def maxHeight: Future[Long] = {
    val query = maxHeightQuery
    val result = database.run(query)
    result
  }

  private def maxHeightQuery: SQLiteProfile.ProfileAction[
    Long,
    NoStream,
    Effect.Read] = {
    val query = table.map(_.height).max.getOrElse(0L).result
    query
  }

  /** Returns the chain tips in our database. This can be multiple headers if we have
    * competing blockchains (i.e. a fork) */
  def chainTips: Future[Vector[BlockHeaderDb]] = {
    logger.debug(s"Getting chaintips from: ${database.config.dbConfig.config}")
    val aggregate = {
      maxHeightQuery.flatMap { height =>
        logger.debug(s"Max block height: $height")
        val atHeight = getAtHeightQuery(height)
        atHeight.map { headers =>
          logger.debug(s"Headers at $height: $headers")
        }
        atHeight
      }
    }

    database.runVec(aggregate)
  }

  /** Returns competing blockchains that are contained in our BlockHeaderDAO.
    * Each chain returns the last [[org.bitcoins.core.protocol.blockchain.ChainParams.difficultyChangeInterval difficulty interval]]
    * of block headers, as defined by the network we are on. For instance, on bitcoin mainnet this will be 2016 block headers.
    * If no competing tips are found, we only return one [[Blockchain blockchain]]; otherwise we
    * return n chains, one for each of the competing [[chainTips tips]] we have
    * @see [[Blockchain]]
    */
  def getBlockchains()(
      implicit ec: ExecutionContext): Future[Vector[Blockchain]] = {
    val chainTipsF = chainTips
    val diffInterval = appConfig.chain.difficultyChangeInterval
    chainTipsF.flatMap { tips =>
      val nestedFuture: Vector[Future[Blockchain]] = tips.map { tip =>
        val height = Math.max(0, tip.height - diffInterval)
        val headersF = getBetweenHeights(from = height, to = tip.height)
        headersF.map(headers => Blockchain.fromHeaders(headers.reverse))
      }
      Future.sequence(nestedFuture)
    }
  }
}
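The bucket-sort walk in `getAncestorAtHeight` can be sketched in isolation. This is a simplified, in-memory sketch: `Header` is a hypothetical stand-in for `BlockHeaderDb`, and a `Map` keyed by height plays the role of the DAO's bucket array.

```scala
// Simplified in-memory sketch of the ancestor lookup in
// BlockHeaderDAO.getAncestorAtHeight. `Header` is a hypothetical
// stand-in for BlockHeaderDb.
case class Header(height: Int, hash: String, prevHash: String)

def ancestorAtHeight(
    child: Header,
    height: Int,
    headers: Vector[Header]): Option[Header] = {
  // bucket the candidate headers by height, as the DAO does with its array
  val byHeight: Map[Int, Vector[Header]] =
    headers.groupBy(_.height).withDefaultValue(Vector.empty)

  // walk prev-hash links from the child down to the target height
  @annotation.tailrec
  def loop(current: Header): Option[Header] =
    if (current.height == height) Some(current)
    else
      byHeight(current.height - 1).find(_.hash == current.prevHash) match {
        case Some(prev) => loop(prev)
        case None       => None
      }

  loop(child)
}
```

Each step only scans the single bucket below the current header, so the walk is linear in the height difference when the number of competing headers per height is bounded.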
@@ -0,0 +1,89 @@
package org.bitcoins.chain.models

import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.core.number.{Int32, UInt32}
import org.bitcoins.core.protocol.blockchain.BlockHeader
import slick.jdbc.SQLiteProfile.api._

case class BlockHeaderDb(
    height: Long,
    hashBE: DoubleSha256DigestBE,
    version: Int32,
    previousBlockHashBE: DoubleSha256DigestBE,
    merkleRootHashBE: DoubleSha256DigestBE,
    time: UInt32,
    nBits: UInt32,
    nonce: UInt32,
    hex: String) {

  lazy val blockHeader: BlockHeader = {
    val blockHeader = BlockHeader.fromHex(hex)

    require(blockHeader.hashBE == hashBE)
    require(blockHeader.previousBlockHashBE == previousBlockHashBE)
    require(blockHeader.version == version)
    require(blockHeader.nBits == nBits)
    require(blockHeader.nonce == nonce)

    blockHeader
  }
}

object BlockHeaderDbHelper {

  def fromBlockHeader(height: Long, bh: BlockHeader): BlockHeaderDb = {
    BlockHeaderDb(
      height = height,
      hashBE = bh.hashBE,
      previousBlockHashBE = bh.previousBlockHashBE,
      merkleRootHashBE = bh.merkleRootHashBE,
      time = bh.time,
      nBits = bh.nBits,
      nonce = bh.nonce,
      version = bh.version,
      hex = bh.hex
    )
  }
}

/** A table that stores block headers related to a blockchain */
class BlockHeaderTable(tag: Tag)
    extends Table[BlockHeaderDb](tag, "block_headers") {
  import org.bitcoins.db.DbCommonsColumnMappers._

  def height = column[Long]("height")

  def hash = column[DoubleSha256DigestBE]("hash", O.PrimaryKey)

  def version = column[Int32]("version")

  def previousBlockHash = column[DoubleSha256DigestBE]("previous_block_hash")

  def merkleRootHash = column[DoubleSha256DigestBE]("merkle_root_hash")

  def time = column[UInt32]("time")

  def nBits = column[UInt32]("n_bits")

  def nonce = column[UInt32]("nonce")

  def hex = column[String]("hex")

  /** The sql index for searching based on [[height]] */
  def heightIndex = index("height_index", height)

  def hashIndex = index("hash_index", hash)

  def * = {
    (height,
     hash,
     version,
     previousBlockHash,
     merkleRootHash,
     time,
     nBits,
     nonce,
     hex).<>(BlockHeaderDb.tupled, BlockHeaderDb.unapply)
  }

}
119
chain/src/main/scala/org/bitcoins/chain/pow/Pow.scala
Normal file
@@ -0,0 +1,119 @@
package org.bitcoins.chain.pow

import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb}
import org.bitcoins.core.number.UInt32
import org.bitcoins.core.protocol.blockchain.{BlockHeader, ChainParams}
import org.bitcoins.core.util.{BitcoinSLogger, NumberUtil}

import scala.concurrent.{ExecutionContext, Future}

/**
  * Implements functions found inside of bitcoin core's
  * @see [[https://github.com/bitcoin/bitcoin/blob/35477e9e4e3f0f207ac6fa5764886b15bf9af8d0/src/pow.cpp pow.cpp]]
  */
sealed abstract class Pow extends BitcoinSLogger {

  /**
    * Gets the next proof of work requirement for a block
    * @see [[https://github.com/bitcoin/bitcoin/blob/35477e9e4e3f0f207ac6fa5764886b15bf9af8d0/src/pow.cpp#L13 Mimics the bitcoin core implementation]]
    */
  def getNetworkWorkRequired(
      tip: BlockHeaderDb,
      newPotentialTip: BlockHeader,
      blockHeaderDAO: BlockHeaderDAO)(
      implicit ec: ExecutionContext): Future[UInt32] = {
    val chainParams = blockHeaderDAO.appConfig.chain
    val currentHeight = tip.height

    val powLimit = NumberUtil.targetCompression(bigInteger =
                                                  chainParams.powLimit,
                                                isNegative = false)
    if ((currentHeight + 1) % chainParams.difficultyChangeInterval != 0) {
      if (chainParams.allowMinDifficultyBlocks) {
        // Special difficulty rule for testnet:
        // If the new block's timestamp is more than 2 * 10 minutes
        // after the tip's, then allow mining of a min-difficulty block.
        if (newPotentialTip.time.toLong > tip.blockHeader.time.toLong + chainParams.powTargetSpacing.toSeconds * 2) {
          Future.successful(powLimit)
        } else {
          // Return the last non-special-min-difficulty-rules-block

          // This is complex to implement and requires walking the
          // chain until we find a block header that does not have
          // the minimum difficulty rule on testnet

          //TODO: This is not correctly implemented, come back and fix this when BlockHeaderDAO has a predicate to satisfy
          Future.successful(powLimit)
        }
      } else {
        Future.successful(tip.blockHeader.nBits)
      }
    } else {
      val firstHeight = currentHeight - (chainParams.difficultyChangeInterval - 1)

      require(firstHeight >= 0,
              s"We must have our first height be positive, got=${firstHeight}")

      val firstBlockAtIntervalF: Future[Option[BlockHeaderDb]] = {
        blockHeaderDAO.getAncestorAtHeight(tip, firstHeight)
      }

      firstBlockAtIntervalF.flatMap {
        case Some(firstBlock) =>
          calculateNextWorkRequired(currentTip = tip, firstBlock, chainParams)
        case None =>
          Future.failed(
            new IllegalArgumentException(
              s"Could not find ancestor for block=${tip.hashBE.hex}"))
      }

    }
  }

  /**
    * Calculates the next proof of work requirement for our blockchain
    * @see [[https://github.com/bitcoin/bitcoin/blob/35477e9e4e3f0f207ac6fa5764886b15bf9af8d0/src/pow.cpp#L49 bitcoin core implementation]]
    */
  def calculateNextWorkRequired(
      currentTip: BlockHeaderDb,
      firstBlock: BlockHeaderDb,
      chainParams: ChainParams): Future[UInt32] = {
    if (chainParams.noRetargeting) {
      Future.successful(currentTip.nBits)
    } else {
      var actualTimespan = (currentTip.time - firstBlock.time).toLong
      val timespanSeconds = chainParams.powTargetTimeSpan.toSeconds
      if (actualTimespan < timespanSeconds / 4) {
        actualTimespan = timespanSeconds / 4
      }

      if (actualTimespan > timespanSeconds * 4) {
        actualTimespan = timespanSeconds * 4
      }

      val powLimit = chainParams.powLimit

      var bnNew = NumberUtil.targetExpansion(currentTip.nBits).difficulty

      bnNew = bnNew * actualTimespan

      bnNew = bnNew / timespanSeconds

      if (bnNew > powLimit) {
        bnNew = powLimit
      }

      val newTarget = NumberUtil.targetCompression(bnNew, false)

      Future.successful(newTarget)
    }
  }
}

object Pow extends Pow
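The core of `calculateNextWorkRequired` is the clamp-and-scale arithmetic on the target. A standalone sketch on plain `BigInt`s, with the compact nBits encoding omitted (the numbers used below are illustrative, not real network values):

```scala
// Sketch of the retarget arithmetic in calculateNextWorkRequired,
// operating on expanded BigInt targets instead of compact nBits.
def nextTarget(
    oldTarget: BigInt,
    actualTimespan: Long,
    targetTimespan: Long,
    powLimit: BigInt): BigInt = {
  // clamp the observed timespan into [targetTimespan / 4, targetTimespan * 4]
  val clamped =
    math.min(math.max(actualTimespan, targetTimespan / 4), targetTimespan * 4)
  // scale the old target by how far off schedule the interval was
  val scaled = oldTarget * clamped / targetTimespan
  // never exceed the network's minimum-difficulty target
  if (scaled > powLimit) powLimit else scaled
}
```

Blocks found twice as fast halve the target (doubling difficulty); the clamp limits any single retarget to a factor of four in either direction.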
@@ -0,0 +1,28 @@
package org.bitcoins.chain.validation

import org.bitcoins.chain.models.BlockHeaderDb
import org.bitcoins.core.protocol.blockchain.BlockHeader

/** Represents the result of updating the chain with
  * the given header
  */
sealed abstract class TipUpdateResult

object TipUpdateResult {

  /** Indicates we successfully updated the chain tip with this header */
  case class Success(header: BlockHeaderDb) extends TipUpdateResult

  sealed abstract class Failure extends TipUpdateResult {
    def header: BlockHeader
  }

  /** Means that [[header.previousBlockHashBE]] was incorrect */
  case class BadPreviousBlockHash(header: BlockHeader) extends Failure

  /** Means that [[header.nBits]] was invalid */
  case class BadPOW(header: BlockHeader) extends Failure

  /** Means that [[header.nonce]] was invalid */
  case class BadNonce(header: BlockHeader) extends Failure
}
@@ -0,0 +1,107 @@
package org.bitcoins.chain.validation

import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb, BlockHeaderDbHelper}
import org.bitcoins.chain.pow.Pow
import org.bitcoins.core.number.UInt32
import org.bitcoins.core.protocol.blockchain.BlockHeader
import org.bitcoins.core.util.{BitcoinSLogger, NumberUtil}

import scala.concurrent.{ExecutionContext, Future}

/**
  * Responsible for checking if we can connect two
  * block headers together on the blockchain. It checks
  * things like proof of work difficulty, whether the new header
  * references the previous block header correctly, etc.
  */
sealed abstract class TipValidation extends BitcoinSLogger {

  /** Checks if the given header can be connected to the current tip.
    * This is the method where a [[BlockHeader]] is transformed into a
    * [[BlockHeaderDb]]. What this really means is that a height is
    * assigned to a [[BlockHeader header]] after all these
    * validation checks occur
    */
  def checkNewTip(
      newPotentialTip: BlockHeader,
      currentTip: BlockHeaderDb,
      blockHeaderDAO: BlockHeaderDAO)(
      implicit ec: ExecutionContext): Future[TipUpdateResult] = {
    val header = newPotentialTip
    logger.info(
      s"Checking header=${header.hashBE.hex} to try to connect to currentTip=${currentTip.hashBE.hex} with height=${currentTip.height}")

    val powCheckF = isBadPow(newPotentialTip = newPotentialTip,
                             currentTip = currentTip,
                             blockHeaderDAO = blockHeaderDAO)

    val connectTipResultF: Future[TipUpdateResult] = {
      powCheckF.map { expectedWork =>
        if (header.previousBlockHashBE != currentTip.hashBE) {
          logger.warn(
            s"Failed to connect tip=${header.hashBE.hex} to current chain")
          TipUpdateResult.BadPreviousBlockHash(newPotentialTip)
        } else if (header.nBits != expectedWork) {
          //https://github.com/bitcoin/bitcoin/blob/eb7daf4d600eeb631427c018a984a77a34aca66e/src/pow.cpp#L19
          TipUpdateResult.BadPOW(newPotentialTip)
        } else if (isBadNonce(newPotentialTip)) {
          TipUpdateResult.BadNonce(newPotentialTip)
        } else {
          val headerDb = BlockHeaderDbHelper.fromBlockHeader(
            height = currentTip.height + 1,
            bh = newPotentialTip
          )
          TipUpdateResult.Success(headerDb)
        }
      }
    }

    logTipResult(connectTipResultF, currentTip)
    connectTipResultF
  }

  /** Logs the result of [[org.bitcoins.chain.validation.TipValidation.checkNewTip() checkNewTip]] */
  private def logTipResult(
      connectTipResultF: Future[TipUpdateResult],
      currentTip: BlockHeaderDb)(implicit ec: ExecutionContext): Unit = {
    connectTipResultF.map {
      case TipUpdateResult.Success(tipDb) =>
        logger.info(
          s"Successfully connected ${tipDb.hashBE.hex} with height=${tipDb.height} to block=${currentTip.hashBE.hex} with height=${currentTip.height}")

      case bad: TipUpdateResult.Failure =>
        logger.warn(
          s"Failed to connect ${bad.header.hashBE.hex} to ${currentTip.hashBE.hex} with height=${currentTip.height}, reason=${bad}")

    }

    ()
  }

  /** Checks if [[header]] hashes to meet the POW requirements for this block (nBits).
    * Mimics this
    * @see [[https://github.com/bitcoin/bitcoin/blob/eb7daf4d600eeb631427c018a984a77a34aca66e/src/pow.cpp#L74]]
    */
  def isBadNonce(header: BlockHeader): Boolean = {
    //convert hash into a big integer
    val headerWork = BigInt(1, header.hashBE.bytes.toArray)
    if (headerWork <= 0 || NumberUtil.isNBitsOverflow(nBits = header.nBits)) {
      true
    } else {
      headerWork > header.difficulty
    }
  }

  private def isBadPow(
      newPotentialTip: BlockHeader,
      currentTip: BlockHeaderDb,
      blockHeaderDAO: BlockHeaderDAO)(
      implicit ec: ExecutionContext): Future[UInt32] = {
    Pow.getNetworkWorkRequired(tip = currentTip,
                               newPotentialTip = newPotentialTip,
                               blockHeaderDAO = blockHeaderDAO)

  }
}

object TipValidation extends TipValidation
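The proof-of-work check in `isBadNonce` boils down to interpreting the header hash as an unsigned big-endian integer and comparing it against the expanded target. A minimal sketch of that comparison (the byte arrays used below are illustrative, not real 32-byte hashes):

```scala
// Sketch of the hash-vs-target comparison behind isBadNonce:
// the hash, read as a positive big-endian integer, must be
// nonzero and must not exceed the target.
def meetsTarget(hashBytes: Array[Byte], target: BigInt): Boolean = {
  val headerWork = BigInt(1, hashBytes) // signum = 1 forces a positive value
  headerWork > 0 && headerWork <= target
}
```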
@@ -1,23 +1,9 @@
<configuration>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/test-application.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <include resource="common-logback.xml" />

  <root level="OFF">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="FILE"/>
  </root>

</configuration>
@@ -10,7 +10,6 @@ class AesCryptTest extends BitcoinSUnitTest {

  val password = AesPassword("PASSWORD")
  val badPassword = AesPassword("BAD_PASSWORD")
  val emptyPassword = AesPassword("")

  /**
    * The test vectors in this test were generated by using

@@ -111,26 +110,6 @@ class AesCryptTest extends BitcoinSUnitTest {

  }

  it must "fail when encrypting with an empty password" in {
    val encryptE =
      AesCrypt.encrypt(plainText = hex"abcdef", password = emptyPassword)
    encryptE match {
      case Right(_) => fail("Was able to encrypt with an empty password!")
      case Left(AesException.EmptyPasswordException) => succeed
      case Left(exc) => fail("Failed with unexpected exception", exc)
    }
  }

  it must "fail when decrypting with an empty password" in {
    val encrypted = AesCrypt.encryptExc(plainText = hex"123456789", password)
    AesCrypt.decrypt(encrypted, emptyPassword) match {
      case Right(_) => fail("Was able to decrypt with an empty password!")
      case Left(AesException.EmptyPasswordException) => succeed
      case Left(exc) => fail("Failed with unexpected exception", exc)
    }

  }

  it must "have encryption and decryption symmetry" in {
    forAll(NumberGenerator.bytevector,
           StringGenerators.genString.suchThat(_.nonEmpty)) {

@@ -155,4 +134,8 @@ class AesCryptTest extends BitcoinSUnitTest {
    }
  }
}

  it must "fail to create an empty AES password" in {
    assertThrows[IllegalArgumentException](AesPassword(""))
  }
}
@@ -12,6 +12,7 @@ import org.scalacheck.{Gen, Shrink}
import org.scalatest.path

import scala.util.{Success, Try}
import org.bitcoins.core.crypto.ExtPrivateKey

class BIP32PathTest extends BitcoinSUnitTest {


@@ -127,4 +128,67 @@ class BIP32PathTest extends BitcoinSUnitTest {
    assert(path == BIP32Path.fromString(toString))
  }
}

  it must "do path diffing" in {
    {
      val first = BIP32Path.fromString("m/44'/1'")
      assert(first.diff(first).contains(BIP32Path.empty))
    }

    {
      val first = BIP32Path.fromString("m/44'/0'/0'")
      val second = BIP32Path.fromString("m/44'/0'/0'/0/2")
      val expected = BIP32Path.fromString("m/0/2")
      assert(first.diff(second).contains(expected))
    }

    {
      val first = BIP32Path.fromString("m/44'/0'/0'/1")
      val second = BIP32Path.fromString("m/44'/0'/0'/1/2")
      val expected = BIP32Path.fromString("m/2")
      assert(first.diff(second).contains(expected))
    }

    {
      val first = BIP32Path.fromString("m/44'/1'")
      val second = BIP32Path.fromString("m/44'")
      assert(first.diff(second).isEmpty)
    }

    {
      val first = BIP32Path.fromString("m/44'")
      val second = BIP32Path.fromString("m/44'/1'")
      val expected = BIP32Path.fromString("m/1'")
      assert(first.diff(second).contains(expected))
    }

    {
      val first = BIP32Path.fromString("m/44'/1'")
      val second = BIP32Path.fromString("m/43'/2'")
      assert(first.diff(second).isEmpty)
    }

    {
      val first = BIP32Path.fromString("m/44'/1/0")
      val second = BIP32Path.fromString("m/44'/2/0")
      assert(first.diff(second).isEmpty)
    }

  }

  it must "do path diffing without altering the result" in {
    forAll(HDGenerators.diffableHDPaths, CryptoGenerators.extPrivateKey) {
      case ((short, long), xpriv) =>
        val diffed = short.diff(long) match {
          case None        => fail(s"$short and $long was not diffable!")
          case Some(value) => value
        }

        val longDerived = xpriv.deriveChildPrivKey(long)
        val diffDerived =
          xpriv.deriveChildPrivKey(short).deriveChildPrivKey(diffed)
        assert(longDerived == diffDerived)

    }
  }
}
54
core/src/main/resources/common-logback.xml
Normal file
@@ -0,0 +1,54 @@
<included>
  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/application.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <root level="OFF">
    <appender-ref ref="FILE" />
    <appender-ref ref="STDOUT" />
  </root>

  <!-- get rid of "Slf4jLogger started" messages -->
  <logger name="akka.event.slf4j.Slf4jLogger" level="OFF" />

  <!-- get rid of annoying warning messages from tests -->
  <logger name="org.bitcoins.chain.validation" level="OFF" />

  <!-- inspect resolved config -->
  <logger name="org.bitcoins.chain.config" level="INFO" />
  <logger name="org.bitcoins.node.config" level="INFO" />
  <logger name="org.bitcoins.wallet.config" level="INFO" />

  <!-- inspect resolved db connection -->
  <logger name="org.bitcoins.db.SafeDatabase" level="INFO" />

  <!-- see how long statements took to execute by setting to DEBUG -->
  <logger name="slick.jdbc.JdbcBackend.benchmark" level="INFO"/>

  <!-- see what statements are executed by setting to DEBUG -->
  <logger name="slick.jdbc.JdbcBackend.statement" level="INFO"/>

  <!-- see what slick is compiling to in sql -->
  <logger name="slick.compiler" level="INFO"/>

  <!-- see what's returned by Slick -->
  <logger name="slick.jdbc.StatementInvoker.result" level="INFO" />

  <logger name="slick" level="INFO"/>

  <!-- Get rid of messages like this:
       Connection attempt failed. Backing off new connection
       attempts for at least 800 milliseconds. -->
  <logger name="akka.http.impl.engine.client.PoolGateway" level="OFF"/>
</included>
@@ -45,12 +45,9 @@ sealed abstract class NetworkParameters {
    */
  def magicBytes: ByteVector

  /** In bitcoin, the network recalculates the difficulty for the network every 2016 blocks */
  def difficultyChangeThreshold: Int
}

sealed abstract class BitcoinNetwork extends NetworkParameters {
  override def difficultyChangeThreshold: Int = 2016

  override def chainParams: BitcoinChainParams
}

@@ -74,20 +71,23 @@ sealed abstract class MainNet extends BitcoinNetwork {
  /**
    * @inheritdoc
    */
  override def dnsSeeds: Seq[String] =
    Seq("seed.bitcoin.sipa.be",
  override def dnsSeeds = {
    List(
      "seed.bitcoin.sipa.be",
      "dnsseed.bluematt.me",
      "dnsseed.bitcoin.dashjr.org",
      "seed.bitcoinstats.com",
      "bitseed.xf2.org",
      "seed.bitcoin.jonasschnelli.ch")
      "seed.btc.petertodd.org",
      "seed.bitcoin.jonasschnelli.ch",
      "seed.bitcoin.sprovoost.nl"
    )
  }

  /**
    * @inheritdoc
    */
  override def magicBytes = ByteVector(0xf9, 0xbe, 0xb4, 0xd9)

  override def difficultyChangeThreshold: Int = 2016
}

object MainNet extends MainNet

@@ -117,7 +117,6 @@ sealed abstract class TestNet3 extends BitcoinNetwork {
    */
  override def magicBytes = ByteVector(0x0b, 0x11, 0x09, 0x07)

  override def difficultyChangeThreshold: Int = 2016
}

object TestNet3 extends TestNet3

@@ -144,7 +143,6 @@ sealed abstract class RegTest extends BitcoinNetwork {
    * @inheritdoc
    */
  override def magicBytes = ByteVector(0xfa, 0xbf, 0xb5, 0xda)
  override def difficultyChangeThreshold: Int = 2016
}

object RegTest extends RegTest
|
@ -17,7 +17,12 @@ case class AesSalt(
|
|||
value: ByteVector
|
||||
)
|
||||
|
||||
case class AesPassword(value: String)
|
||||
/**
|
||||
* @throws IllegalArgumentException if passed an empty string
|
||||
*/
|
||||
case class AesPassword(value: String) {
|
||||
require(value.nonEmpty, "AES passwords cannot be empty!")
|
||||
}
|
||||
|
||||
/**
|
||||
* Provides functionality for encrypting and decrypting with AES
|
||||
|
|
|
@ -31,6 +31,8 @@ object BIP39Seed extends Factory[BIP39Seed] {
|
|||
override def fromBytes(bytes: ByteVector): BIP39Seed =
|
||||
BIP39SeedImpl(bytes)
|
||||
|
||||
val EMPTY_PASSWORD = ""
|
||||
|
||||
private val ITERATION_COUNT = 2048
|
||||
private val DERIVED_KEY_LENGTH = 512
|
||||
|
||||
|
@ -39,7 +41,9 @@ object BIP39Seed extends Factory[BIP39Seed] {
|
|||
* seed from a mnemonic code. An optional password can be supplied.
|
||||
* @param password Defaults to the empty string
|
||||
*/
|
||||
def fromMnemonic(mnemonic: MnemonicCode, password: String = ""): BIP39Seed = {
|
||||
def fromMnemonic(
|
||||
mnemonic: MnemonicCode,
|
||||
password: String = EMPTY_PASSWORD): BIP39Seed = {
|
||||
val salt = s"mnemonic$password"
|
||||
|
||||
val words = mnemonic.words.mkString(" ")
|
||||
|
|
|
@ -68,6 +68,11 @@ object Sha256Digest extends Factory[Sha256Digest] {
|
|||
}
|
||||
override def fromBytes(bytes: ByteVector): Sha256Digest =
|
||||
Sha256DigestImpl(bytes)
|
||||
|
||||
private val e = ByteVector(Array.fill(32)(0.toByte))
|
||||
|
||||
val empty: Sha256Digest = Sha256Digest.fromBytes(e)
|
||||
|
||||
}
|
||||
|
||||
/**
|
||||
|
@ -109,6 +114,9 @@ object DoubleSha256Digest extends Factory[DoubleSha256Digest] {
|
|||
override def fromBytes(bytes: ByteVector): DoubleSha256Digest =
|
||||
DoubleSha256DigestImpl(bytes)
|
||||
|
||||
private val e = ByteVector(Array.fill(32)(0.toByte))
|
||||
val empty: DoubleSha256Digest = DoubleSha256Digest.fromBytes(e)
|
||||
|
||||
}
|
||||
|
||||
/** The big endian version of [[org.bitcoins.core.crypto.DoubleSha256Digest DoubleSha256Digest]] */
|
||||
|
@ -128,6 +136,7 @@ object DoubleSha256DigestBE extends Factory[DoubleSha256DigestBE] {
|
|||
override def fromBytes(bytes: ByteVector): DoubleSha256DigestBE =
|
||||
DoubleSha256DigestBEImpl(bytes)
|
||||
|
||||
val empty: DoubleSha256DigestBE = DoubleSha256Digest.empty.flip
|
||||
}
|
||||
|
||||
/**
|
||||
|
|
|
@ -53,7 +53,7 @@ sealed abstract class MnemonicCode {
|
|||
* Returns the entropy initially provided to construct
|
||||
* this mnemonic code
|
||||
*/
|
||||
private[crypto] def toEntropy: BitVector = {
|
||||
private[bitcoins] def toEntropy: BitVector = {
|
||||
val entropyWithChecksumBits = toEntropyWithChecksum
|
||||
val lengthNoEntropy = MnemonicCode
|
||||
.getMnemonicCodeInfo(words)
|
||||
|
|
|
@@ -6,6 +6,78 @@ import org.bitcoins.core.number.UInt32

abstract class BIP32Path {

  def path: Vector[BIP32Node]

  /**
    * BIP32 paths can be subsets/supersets of each other.
    * If all elements in a path `p` are included in a path
    * `P` (i.e. `p` is a subset of `P`), then `p.diff(P)`
    * is the elements from `P` that are not in `p`.
    *
    * @example
    * {{{
    * // equal paths
    * m/44'/1' diff m/44'/1' == Some(BIP32Path.empty)
    *
    * // diffable paths
    * m/44'/0'/0' diff m/44'/0'/0'/0/2 == Some(m/0/2)
    * m/44'/0'/0'/1 diff m/44'/0'/0'/1/2 == Some(m/2)
    *
    * // this is longer than other
    * m/44'/1' diff m/44' == None
    *
    * // any fields are unequal along the way
    * m/44'/1' diff m/43'/2' == None
    * m/44'/1'/0 diff m/44'/2'/1 == None
    * }}}
    */
  def diff(that: BIP32Path): Option[BIP32Path] = {
    import that.{path => otherPath}

    if (path.length > otherPath.length) {
      None
    } else if (path == otherPath) {
      Some(BIP32Path.empty)
    } else {
      val lengthDiff = otherPath.length - path.length

      val extendedPath: Vector[Option[BIP32Node]] = path.map(Some(_)) ++
        Vector.fill[Option[BIP32Node]](lengthDiff)(None)

      val pathsWithIndices = extendedPath
        .zip(otherPath)
        .zipWithIndex

      val calculatedDiff: Option[BIP32Path] = pathsWithIndices
        .foldLeft(Option(BIP32Path.empty)) {
          // we encountered an error along the way, return None
          case (None, _) => None

          // we've reached the end of our path, append
          // the element from their path but don't
          // include the previous one (as that's shared)
          case (Some(_), ((None, their), index)) if index == path.length =>
            Some(BIP32Path(their))

          // append the next divergent element to
          // the accumulated value
          case (Some(accum), ((None, their), _)) =>
            Some(BIP32Path(accum.path :+ their))

          // we've not yet reached the start of diverging paths
          case (Some(_), ((Some(our), their), _)) if our == their =>
            Some(BIP32Path(our))

          // paths are divergent, fail the computation
          case (Some(_), ((Some(_), _), _)) =>
            None
        }

      calculatedDiff
    }
  }

  override def toString: String =
    path
      .map {
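The diff semantics documented above boil down to a shared-prefix check. The following is a minimal, self-contained sketch of that idea, using `Vector[Int]` in place of `Vector[BIP32Node]`; the name `pathDiff` is illustrative and not part of the bitcoin-s API.

```scala
// Simplified model of BIP32Path.diff: if `ours` is a prefix of
// `theirs`, return the remaining elements; otherwise None.
object PathDiffSketch {
  def pathDiff(ours: Vector[Int], theirs: Vector[Int]): Option[Vector[Int]] = {
    if (ours.length > theirs.length) None          // ours can't be a subset
    else if (ours == theirs) Some(Vector.empty)    // equal paths diff to empty
    else if (theirs.startsWith(ours))              // shared prefix
      Some(theirs.drop(ours.length))
    else None                                      // paths diverge
  }

  def main(args: Array[String]): Unit = {
    assert(pathDiff(Vector(44, 1), Vector(44, 1)) == Some(Vector.empty))
    assert(pathDiff(Vector(44, 0, 0), Vector(44, 0, 0, 0, 2)) == Some(Vector(0, 2)))
    assert(pathDiff(Vector(44, 1), Vector(44)).isEmpty)     // ours longer
    assert(pathDiff(Vector(44, 1), Vector(43, 2)).isEmpty)  // divergent
    println("ok")
  }
}
```

The fold in the real implementation achieves the same result in one pass while tracking the index where the shared prefix ends.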
@@ -18,7 +18,7 @@ sealed abstract class HDAddress extends BIP32Path {
  def chain: HDChain
  def index: Int

-  def toPath: HDPath[_] = purpose match {
+  def toPath: HDPath = purpose match {
    case HDPurposes.Legacy => LegacyHDPath(this)
    case HDPurposes.SegWit => SegWitHDPath(this)
    case HDPurposes.NestedSegWit => NestedSegWitHDPath(this)
@@ -1,16 +1,44 @@
package org.bitcoins.core.hd
import scala.util.Try

-private[bitcoins] trait HDPath[T <: HDPath[T]] extends BIP32Path {
+private[bitcoins] trait HDPath extends BIP32Path {

  /**
    * This type gives a cleaner return
    * type for `next`.
    *
    * Consider:
    *
    * {{{
    * def next: this.type = ???
    *
    * val first: SegWitHDPath = ???
    * val second = first.next
    * // second is now:
    * // first.type (with underlying type org.bitcoins.core.hd.SegWitHDPath)
    * }}}
    *
    * {{{
    * def next: NextPath = ???
    *
    * // in SegWitHDPath
    * override type NextPath = SegWitHDPath
    *
    * val first: SegWitHDPath = ???
    * val second = first.next
    * // second is now:
    * // SegWitHDPath
    * }}}
    */
  protected type NextPath <: HDPath

  /**
    * Increments the address index and returns the
    * new path that can be passed into a
    * [[org.bitcoins.core.crypto.ExtKey ExtKey]]
    */
  // TODO check out this cast
-  def next: T =
-    HDAddress(chain, account.index + 1).toPath.asInstanceOf[T]
+  def next: NextPath =
+    HDAddress(chain, account.index + 1).toPath.asInstanceOf[NextPath]

  def account: HDAccount = address.account
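The abstract type member pattern the scaladoc above describes can be demonstrated standalone: each concrete subtype fixes the member, so `next` returns the concrete type rather than a path-dependent `this.type`. The names below (`Path`, `SegWitLike`) are illustrative stand-ins, not bitcoin-s classes.

```scala
// Abstract type member used to give `next` a precise return type
// in each subtype, as in HDPath's NextPath.
trait Path {
  protected type Next <: Path
  def index: Int
  def next: Next
}

final case class SegWitLike(index: Int) extends Path {
  override protected type Next = SegWitLike
  override def next: SegWitLike = SegWitLike(index + 1)
}

object TypeMemberSketch {
  def main(args: Array[String]): Unit = {
    val first = SegWitLike(0)
    val second: SegWitLike = first.next // statically typed as SegWitLike
    assert(second.index == 1)
    println("ok")
  }
}
```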
@@ -28,7 +56,7 @@ private[bitcoins] trait HDPath extends BIP32Path {
object HDPath {

  /** Attempts to parse a string into a valid HD path */
-  def fromString(string: String): Option[HDPath[_]] =
+  def fromString(string: String): Option[HDPath] =
    Try(LegacyHDPath.fromString(string))
      .orElse(Try(SegWitHDPath.fromString(string)))
      .orElse(Try(NestedSegWitHDPath.fromString(string)))
@@ -1,6 +1,8 @@
package org.bitcoins.core.hd

-sealed abstract class LegacyHDPath extends HDPath[LegacyHDPath]
+sealed abstract class LegacyHDPath extends HDPath {
+  override protected type NextPath = LegacyHDPath
+}

object LegacyHDPath extends HDPathFactory[LegacyHDPath] {
@@ -1,6 +1,8 @@
package org.bitcoins.core.hd

-sealed abstract class NestedSegWitHDPath extends HDPath[NestedSegWitHDPath]
+sealed abstract class NestedSegWitHDPath extends HDPath {
+  override protected type NextPath = NestedSegWitHDPath
+}

object NestedSegWitHDPath extends HDPathFactory[NestedSegWitHDPath] {
@@ -1,6 +1,8 @@
package org.bitcoins.core.hd

-sealed abstract class SegWitHDPath extends HDPath[SegWitHDPath]
+sealed abstract class SegWitHDPath extends HDPath {
+  override protected type NextPath = SegWitHDPath
+}

object SegWitHDPath extends HDPathFactory[SegWitHDPath] {
@@ -17,6 +17,11 @@ sealed abstract class Address {
  /** The string representation of this address */
  def value: String

  override def equals(obj: Any): Boolean = obj match {
    case addr: Address => value == addr.value
    case _: Any => false
  }

  /** Every address is derived from a [[org.bitcoins.core.crypto.HashDigest HashDigest]] in a
    * [[org.bitcoins.core.protocol.transaction.TransactionOutput TransactionOutput]] */
  def hash: HashDigest
@@ -3,6 +3,7 @@ package org.bitcoins.core.protocol.blockchain
import java.math.BigInteger
import java.nio.charset.StandardCharsets

import org.bitcoins.core.config.{BitcoinNetwork, MainNet, NetworkParameters, RegTest, TestNet3}
import org.bitcoins.core.consensus.Merkle
import org.bitcoins.core.crypto.DoubleSha256Digest
import org.bitcoins.core.currency.{CurrencyUnit, Satoshis}
@@ -178,6 +179,25 @@ sealed abstract class ChainParams {
  def difficultyChangeInterval: Long = {
    powTargetTimeSpan.toSeconds / powTargetSpacing.toSeconds
  }

  /**
    * Whether we should allow minimum-difficulty blocks or not.
    * As an example, you can trivially mine blocks on [[RegTestNetChainParams]] and [[TestNetChainParams]],
    * but not on [[MainNetChainParams]].
    */
  def allowMinDifficultyBlocks: Boolean

  /**
    * Whether this chain supports
    * proof-of-work retargeting or not.
    * @see [[https://github.com/bitcoin/bitcoin/blob/eb7daf4d600eeb631427c018a984a77a34aca66e/src/consensus/params.h#L72 link]]
    */
  def noRetargeting: Boolean

  /** The [[org.bitcoins.core.config.BitcoinNetwork network]] that corresponds to this chain param */
  def network: NetworkParameters
}

sealed abstract class BitcoinChainParams extends ChainParams {
@@ -200,21 +220,11 @@ sealed abstract class BitcoinChainParams extends ChainParams {
  /** The best chain should have this amount of work */
  def minimumChainWork: BigInteger

-  /**
-    * Whether we should allow minimum difficulty blocks or not
-    * As an example you can trivially mine blocks on [[RegTestNetChainParams]] and [[TestNetChainParams]]
-    * but not the [[MainNetChainParams]]
-    */
-  def allowMinDifficultyBlocks: Boolean
-
-  /**
-    * Whether this chain supports
-    * proof of work retargeting or not
-    * [[https://github.com/bitcoin/bitcoin/blob/eb7daf4d600eeb631427c018a984a77a34aca66e/src/consensus/params.h#L72 link]]
-    */
-  def noRetargeting: Boolean
+  /**
+    * @inheritdoc
+    */
+  def network: BitcoinNetwork
}

/** The Main Network parameters. */
@@ -272,6 +282,11 @@ object MainNetChainParams extends BitcoinChainParams {
    * [[https://github.com/bitcoin/bitcoin/blob/a083f75ba79d465f15fddba7b00ca02e31bb3d40/src/chainparams.cpp#L76 mainnet pow retargetting]]
    */
  override lazy val noRetargeting: Boolean = false

  /**
    * @inheritdoc
    */
  override lazy val network: BitcoinNetwork = MainNet
}

object TestNetChainParams extends BitcoinChainParams {
@@ -320,6 +335,11 @@ object TestNetChainParams extends BitcoinChainParams {
    * [[https://github.com/bitcoin/bitcoin/blob/a083f75ba79d465f15fddba7b00ca02e31bb3d40/src/chainparams.cpp#L193 testnet pow retargetting]]
    */
  override lazy val noRetargeting: Boolean = false

  /**
    * @inheritdoc
    */
  override lazy val network: BitcoinNetwork = TestNet3
}

object RegTestNetChainParams extends BitcoinChainParams {
@@ -362,6 +382,11 @@ object RegTestNetChainParams extends BitcoinChainParams {
    * [[https://github.com/bitcoin/bitcoin/blob/a083f75ba79d465f15fddba7b00ca02e31bb3d40/src/chainparams.cpp#L288 regtest pow retargetting]]
    */
  override lazy val noRetargeting: Boolean = true

  /**
    * @inheritdoc
    */
  override lazy val network: BitcoinNetwork = RegTest
}

sealed abstract class Base58Type
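The `difficultyChangeInterval` added to `ChainParams` above can be sanity-checked with the well-known mainnet constants: a two-week target timespan with ten-minute block spacing gives the familiar 2016-block retarget interval. The values below are the standard Bitcoin constants, not read from bitcoin-s.

```scala
// Sanity check of powTargetTimeSpan / powTargetSpacing for
// mainnet-like parameters.
object RetargetIntervalSketch {
  val powTargetTimeSpanSeconds: Long = 14L * 24 * 60 * 60 // two weeks
  val powTargetSpacingSeconds: Long = 10L * 60            // ten minutes

  def difficultyChangeInterval: Long =
    powTargetTimeSpanSeconds / powTargetSpacingSeconds

  def main(args: Array[String]): Unit = {
    assert(difficultyChangeInterval == 2016L)
    println(difficultyChangeInterval) // prints 2016
  }
}
```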
@@ -0,0 +1,69 @@
package org.bitcoins.core.script

/**
  * The different Bitcoin Script type variations
  *
  * @see [[https://github.com/bitcoin/bitcoin/blob/fa6180188b8ab89af97860e6497716405a48bab6/src/script/standard.h#L56 standard.h]]
  * and [[https://github.com/bitcoin/bitcoin/blob/03732f8644a449af34f4df1bb3b8915fb15ef22c/src/script/standard.cpp#L27 standard.cpp]]
  * from Bitcoin Core
  */
sealed abstract class ScriptType {
  import org.bitcoins.core.script.ScriptType._
  override def toString = this match {
    case NONSTANDARD => "nonstandard"
    case PUBKEY => "pubkey"
    case PUBKEYHASH => "pubkeyhash"
    case SCRIPTHASH => "scripthash"
    case MULTISIG => "multisig"
    case NULLDATA => "nulldata"
    case WITNESS_V0_KEYHASH => "witness_v0_keyhash"
    case WITNESS_V0_SCRIPTHASH => "witness_v0_scripthash"
    case WITNESS_UNKNOWN => "witness_unknown"
  }
}

/**
  * The different Bitcoin Script type variations
  *
  * @see [[https://github.com/bitcoin/bitcoin/blob/fa6180188b8ab89af97860e6497716405a48bab6/src/script/standard.h#L56 standard.h]]
  * and [[https://github.com/bitcoin/bitcoin/blob/03732f8644a449af34f4df1bb3b8915fb15ef22c/src/script/standard.cpp#L27 standard.cpp]]
  * from Bitcoin Core
  */
object ScriptType {
  private val all: Seq[ScriptType] = Vector(NONSTANDARD,
                                            PUBKEY,
                                            PUBKEYHASH,
                                            SCRIPTHASH,
                                            MULTISIG,
                                            NULLDATA,
                                            WITNESS_V0_KEYHASH,
                                            WITNESS_V0_SCRIPTHASH,
                                            WITNESS_UNKNOWN)

  def fromString(string: String): Option[ScriptType] =
    all.find(_.toString == string)

  /** Throws if the given string is invalid */
  def fromStringExn(string: String): ScriptType =
    fromString(string)
      .getOrElse(
        throw new IllegalArgumentException(
          s"$string is not a valid script type!"))

  final case object NONSTANDARD extends ScriptType

  // ╔ "standard" transaction/script types
  // V
  final case object PUBKEY extends ScriptType
  final case object PUBKEYHASH extends ScriptType
  final case object SCRIPTHASH extends ScriptType
  final case object MULTISIG extends ScriptType

  /** unspendable OP_RETURN script that carries data */
  final case object NULLDATA extends ScriptType
  final case object WITNESS_V0_KEYHASH extends ScriptType
  final case object WITNESS_V0_SCRIPTHASH extends ScriptType

  /** Only for Witness versions not already defined */
  final case object WITNESS_UNKNOWN extends ScriptType
}
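The `ScriptType` file above follows a common Scala enum pattern: a sealed hierarchy whose `toString` doubles as the wire name, with `fromString` implemented as a lookup over all members. A cut-down, self-contained version of the same pattern (`ScriptKind` and its members are illustrative names, not bitcoin-s classes):

```scala
// Sealed-hierarchy enum with string round-tripping, as in ScriptType.
sealed abstract class ScriptKind {
  override def toString: String = this match {
    case ScriptKind.PubKeyHash => "pubkeyhash"
    case ScriptKind.ScriptHash => "scripthash"
  }
}

object ScriptKind {
  case object PubKeyHash extends ScriptKind
  case object ScriptHash extends ScriptKind

  private val all: Vector[ScriptKind] = Vector(PubKeyHash, ScriptHash)

  // parsing is just a linear search over the members' string forms
  def fromString(s: String): Option[ScriptKind] = all.find(_.toString == s)

  def main(args: Array[String]): Unit = {
    assert(fromString("pubkeyhash").contains(PubKeyHash))
    assert(fromString("bogus").isEmpty)
    println("ok")
  }
}
```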
@@ -49,14 +49,23 @@ sealed abstract class RawSerializerHelper {
  def writeCmpctSizeUInt[T](
      ts: Seq[T],
      serializer: T => ByteVector): ByteVector = {
-    val serialized = ts.foldLeft(ByteVector.empty) {
-      case (accum, t) =>
-        val ser = serializer(t)
-        accum ++ ser
-    }
+    val serialized = write(ts, serializer)
    val cmpct = CompactSizeUInt(UInt64(ts.size))
    cmpct.bytes ++ serialized
  }

  /** Serializes a [[Seq]] of [[org.bitcoins.core.protocol.NetworkElement]] to a [[scodec.bits.ByteVector]] */
+  def writeNetworkElements[T <: NetworkElement](ts: Seq[T]): ByteVector = {
+    val f = { t: T => t.bytes }
+    write(ts, f)
+  }
+
+  def write[T](ts: Seq[T], serializer: T => ByteVector): ByteVector = {
+    ts.foldLeft(ByteVector.empty) { case (accum, t) =>
+      accum ++ serializer(t)
+    }
+  }
}

object RawSerializerHelper extends RawSerializerHelper
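The refactor above extracts the fold-and-concatenate loop into a generic `write` helper. A dependency-free sketch of that helper, with `Vector[Byte]` standing in for scodec's `ByteVector`:

```scala
// Fold a sequence through a per-element serializer and concatenate
// the results, as in RawSerializerHelper.write.
object WriteSketch {
  def write[T](ts: Seq[T], serializer: T => Vector[Byte]): Vector[Byte] =
    ts.foldLeft(Vector.empty[Byte]) { case (accum, t) =>
      accum ++ serializer(t)
    }

  def main(args: Array[String]): Unit = {
    // serialize each Int as a single byte, for illustration only
    val out = write(Seq(1, 2, 3), (i: Int) => Vector(i.toByte))
    assert(out == Vector[Byte](1, 2, 3))
    println("ok")
  }
}
```

`writeCmpctSizeUInt` is then just this fold with a CompactSize length prefix prepended.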
@@ -5,10 +5,8 @@ import org.slf4j.{Logger, LoggerFactory}
/**
  * Created by chris on 3/11/16.
  */
-abstract class BitcoinSLogger {
-
-  def logger: Logger = LoggerFactory.getLogger(this.getClass().toString)
-
+trait BitcoinSLogger {
+  lazy val logger: Logger = LoggerFactory.getLogger(getClass)
}

object BitcoinSLogger extends BitcoinSLogger
@@ -1,5 +1,8 @@
package org.bitcoins.core.util

import java.net.InetSocketAddress

import com.sun.jndi.toolkit.url.Uri
import org.bitcoins.core.protocol.NetworkElement
import scodec.bits.{BitVector, ByteVector}

@@ -101,6 +104,10 @@ trait BitcoinSUtil {
    h.foldLeft(ByteVector.empty)(_ ++ _.bytes)
  }

  def toInetSocketAddress(string: String): InetSocketAddress = {
    val uri = new Uri(string)
    new InetSocketAddress(uri.getHost, uri.getPort)
  }
}

object BitcoinSUtil extends BitcoinSUtil
core/src/main/scala/org/bitcoins/core/util/EitherUtil.scala (new file, 88 lines)
@@ -0,0 +1,88 @@
package org.bitcoins.core.util

import scala.concurrent.{ExecutionContext, Future}
import scala.util.{Try, Success, Failure}

/**
  * @define liftBiasedFut Given a [[scala.Either Either]] that contains a
  * [[scala.concurrent.Future Future[L | R] ]] only on one side,
  * transforms it into a future [[scala.Either Either[L, R] ]]
  */
object EitherUtil {

  /**
    * Flattens a nested `Either[Foo, Future[Either[Foo, Bar]]]` into
    * a `Future[Either[Foo, Bar]]`. This is useful for situations
    * where the right-hand side of an either is asynchronous.
    */
  def flattenFutureE[L, R](
      either: Either[L, Future[Either[L, R]]]
  ): Future[Either[L, R]] = {

    def ifLeft(left: L): Future[Either[L, R]] = Future.successful(Left(left))
    def ifRight(rightF: Future[Either[L, R]]): Future[Either[L, R]] = rightF

    either.fold(ifLeft, ifRight)
  }

  /** $liftBiasedFut */
  def liftRightBiasedFutureE[L, R](
      either: Either[L, Future[R]]
  )(implicit ec: ExecutionContext): Future[Either[L, R]] =
    either match {
      case Right(fut) => fut.map(Right(_))
      case Left(l) => Future.successful(Left(l))
    }

  /** $liftBiasedFut */
  def liftLeftBiasedFutureE[L, R](
      either: Either[Future[L], R]
  )(implicit ec: ExecutionContext): Future[Either[L, R]] =
    either match {
      case Left(fut) => fut.map(Left(_))
      case Right(r) => Future.successful(Right(r))
    }

  object EitherOps {
    import scala.language.implicitConversions
    implicit def either2EnhancedEither[A, B](
        either: Either[A, B]
    ): EnhancedEither[A, B] = EnhancedEither(either)

    implicit def enhancedEither2Either[A, B](
        enhanced: EnhancedEither[A, B]): Either[A, B] = enhanced.underlying
  }

  /** The methods here are copied directly from the 2.12 stdlib */
  case class EnhancedEither[A, B](
      private[EitherUtil] val underlying: Either[A, B]) {

    /** The given function is applied if this is a `Right`.
      *
      * {{{
      * Right(12).map(x => "flower") // Result: Right("flower")
      * Left(12).map(x => "flower") // Result: Left(12)
      * }}}
      */
    def map[B1](f: B => B1): EnhancedEither[A, B1] = underlying match {
      case Right(b) => EnhancedEither(Right(f(b)))
      case _ => EnhancedEither(underlying.asInstanceOf[Either[A, B1]])
    }

    /** Binds the given function across `Right`.
      *
      * @param f The function to bind across `Right`.
      */
    def flatMap[A1 >: A, B1](f: B => Either[A1, B1]): EnhancedEither[A1, B1] =
      underlying match {
        case Right(b) => EnhancedEither(f(b))
        case _ => EnhancedEither(underlying.asInstanceOf[Either[A1, B1]])
      }

    def toTry(implicit ev: A <:< Throwable): Try[B] = underlying match {
      case Right(b) => Success(b)
      case Left(a) => Failure(a)
    }
  }

}
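The "lift a Future out of an Either" idea above can be checked in isolation. This sketch has the same shape as `liftRightBiasedFutureE`, but is written against the standard library only; the `LiftSketch`/`liftRight` names are illustrative.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

// Turn Either[L, Future[R]] into Future[Either[L, R]]: a Right's
// Future is mapped into Right, a Left is wrapped in an already
// completed Future.
object LiftSketch {
  def liftRight[L, R](either: Either[L, Future[R]])(
      implicit ec: ExecutionContext): Future[Either[L, R]] =
    either match {
      case Right(fut) => fut.map(Right(_))
      case Left(l)    => Future.successful(Left(l))
    }

  def main(args: Array[String]): Unit = {
    implicit val ec: ExecutionContext = ExecutionContext.global
    val lifted: Future[Either[String, Int]] =
      liftRight(Right(Future.successful(42)))
    assert(Await.result(lifted, 1.second) == Right(42))
    println("ok")
  }
}
```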
@@ -0,0 +1,9 @@
package org.bitcoins.core.util

object FileUtil {

  /** Returns a BufferedSource for any file on the classpath */
  def getFileAsSource(fileName: String): scala.io.BufferedSource = {
    scala.io.Source.fromURL(getClass.getResource(s"/$fileName"))
  }
}
@@ -0,0 +1,8 @@
package org.bitcoins.core.util

import scala.concurrent.Future

object FutureUtil {

  val unit: Future[Unit] = Future.successful(())
}
@@ -298,6 +298,30 @@ sealed abstract class NumberUtil extends BitcoinSLogger {
    targetCompression(difficultyHelper.difficulty, difficultyHelper.isNegative)
  }

  /**
    * Implements the overflow check for [[org.bitcoins.core.protocol.blockchain.BlockHeader.nBits]]
    * @see [[https://github.com/bitcoin/bitcoin/blob/2068f089c8b7b90eb4557d3f67ea0f0ed2059a23/src/arith_uint256.cpp#L220 bitcoin core check]]
    */
  def isNBitsOverflow(nBits: UInt32): Boolean = {
    val noSignificand = nBits.bytes.takeRight(3)
    val mantissaBytes = {
      val withSignBit = noSignificand
      val noSignBit = false +: withSignBit.bits.tail
      noSignBit.toByteVector
    }

    val nSize: Long = nBits.toLong >>> 24L

    val nWord: UInt32 = UInt32.fromBytes(mantissaBytes)

    nWord != UInt32.zero && (
      nSize > 34 ||
        (nWord > UInt32(UInt8.max.toInt) && nSize > 33) ||
        (nWord > UInt32(0xffff) && nSize > 32)
    )
  }
}

object NumberUtil extends NumberUtil
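The overflow rule above can be exercised without bitcoin-s types by working on the raw 32-bit compact value directly: the high byte is the exponent (the byte length of the target) and the low three bytes, sign bit cleared, are the mantissa. The object name below is illustrative.

```scala
// nBits overflow check on a raw Long, mirroring the UInt32 logic
// above: a non-zero mantissa overflows 256 bits when the exponent
// pushes it past 2^256.
object NBitsOverflowSketch {
  def isOverflow(nBits: Long): Boolean = {
    val nSize = (nBits >>> 24) & 0xff
    val nWord = nBits & 0x007fffffL // mantissa without the sign bit
    nWord != 0 && (
      nSize > 34 ||
        (nWord > 0xff && nSize > 33) ||
        (nWord > 0xffff && nSize > 32)
    )
  }

  def main(args: Array[String]): Unit = {
    assert(!isOverflow(0x1d00ffffL)) // genesis-block difficulty: fits
    assert(isOverflow(0xff00ffffL))  // absurdly large exponent: overflows
    println("ok")
  }
}
```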
@@ -21,8 +21,26 @@ sealed abstract class FeeUnit {
  */
sealed abstract class BitcoinFeeUnit extends FeeUnit

-case class SatoshisPerByte(currencyUnit: CurrencyUnit) extends BitcoinFeeUnit
+case class SatoshisPerByte(currencyUnit: CurrencyUnit) extends BitcoinFeeUnit {
+  def toSatPerKb: SatoshisPerKiloByte = {
+    SatoshisPerKiloByte(currencyUnit.satoshis * Satoshis(Int64(1000)))
+  }
+}
+
+case class SatoshisPerKiloByte(currencyUnit: CurrencyUnit) extends BitcoinFeeUnit {
+  def toSatPerByte: SatoshisPerByte = {
+    val conversionOpt = (currencyUnit.toBigDecimal * 0.001).toBigIntExact()
+    conversionOpt match {
+      case Some(conversion) =>
+        val sat = Satoshis(Int64(conversion))
+        SatoshisPerByte(sat)
+
+      case None =>
+        throw new RuntimeException(
+          s"Failed to convert sat/kb -> sat/byte for ${currencyUnit}")
+    }
+  }
+}

/**
  * A 'virtual byte' (also known as virtual size) is a new weight measurement that
  * was created with segregated witness (BIP141). Now 1 'virtual byte'
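The sat/kB to sat/byte conversion above relies on `toBigIntExact` returning `None` when dividing by 1000 does not land on a whole satoshi. A minimal sketch of that behavior with plain `BigDecimal` (the names here are illustrative):

```scala
// sat/kB -> sat/byte: exact conversion or None, as in
// SatoshisPerKiloByte.toSatPerByte.
object FeeConversionSketch {
  def satPerKbToSatPerByte(satPerKb: BigDecimal): Option[BigInt] =
    (satPerKb * BigDecimal("0.001")).toBigIntExact

  def main(args: Array[String]): Unit = {
    assert(satPerKbToSatPerByte(BigDecimal(2000)).contains(BigInt(2)))
    // 1500 sat/kB is 1.5 sat/byte: not a whole satoshi, so None
    assert(satPerKbToSatPerByte(BigDecimal(1500)).isEmpty)
    println("ok")
  }
}
```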
db-commons/README.md (new file, 21 lines)
@@ -0,0 +1,21 @@
### db-commons

This project contains reusable database-related infrastructure for bitcoin-s. It is a dependency of [`wallet`](../wallet/) and [`node`](../node).

The library bitcoin-s currently uses for database access is [`Slick`](http://slick.lightbend.com/doc/3.3.0/).

The most important file in this project is [`DbConfig`](src/main/scala/org/bitcoins/db/DbConfig.scala). This provides a
common way for databases to be accessed from configuration files. For more information on how Slick configuration files
work, please see this [reference](http://slick.lightbend.com/doc/3.3.0/gettingstarted.html#database-configuration).

This project expects the following keys for databases:

- mainnetDb
- testnet3Db
- regtestDb
- unittestDb

These will be read by [`DbConfig`](src/main/scala/org/bitcoins/db/DbConfig.scala) to specify database information related
to a specific project. You can look at the database configuration for the [`node`](../node/src/main/resources/application.conf) project for an example
of how this works.
db-commons/src/main/resources/db.conf (new file, 17 lines)
@@ -0,0 +1,17 @@
bitcoin-s {
  database {
    dataSourceClass = slick.jdbc.DatabaseUrlDataSource
    profile = "slick.jdbc.SQLiteProfile$"
    dbPath = ${bitcoin-s.datadir}/${bitcoin-s.network}/

    # this config key is read by Slick
    db {
      driver = org.sqlite.JDBC
      url = "jdbc:sqlite:"${bitcoin-s.database.dbPath}${bitcoin-s.database.name}

      # as long as we're on SQLite there's no point
      # in doing connection pooling
      connectionPool = disabled
    }
  }
}
db-commons/src/main/resources/reference.conf (new file, 4 lines)
@@ -0,0 +1,4 @@
bitcoin-s {
  datadir = ${HOME}/.bitcoin-s
  network = regtest # regtest, testnet3, mainnet
}
db-commons/src/main/scala/org/bitcoins/db/AppConfig.scala (new file, 278 lines)
@@ -0,0 +1,278 @@
package org.bitcoins.db

import org.bitcoins.core.config.NetworkParameters
import org.bitcoins.core.protocol.blockchain.ChainParams
import java.nio.file.Path
import java.nio.file.Paths

import org.bitcoins.core.config.MainNet
import org.bitcoins.core.config.TestNet3
import org.bitcoins.core.config.RegTest
import com.typesafe.config._
import org.bitcoins.core.util.BitcoinSLogger
import slick.jdbc.SQLiteProfile
import slick.jdbc.SQLiteProfile.api._

import scala.util.Try
import scala.util.Success
import scala.util.Failure
import slick.basic.DatabaseConfig
import org.bitcoins.core.protocol.blockchain.MainNetChainParams
import org.bitcoins.core.protocol.blockchain.TestNetChainParams
import org.bitcoins.core.protocol.blockchain.RegTestNetChainParams
import java.nio.file.Files

import scala.util.Properties
import scala.util.matching.Regex

/**
  * Everything needed to configure functionality
  * of bitcoin-s applications is found in here.
  *
  * @see [[https://github.com/bitcoin-s/bitcoin-s-core/blob/master/doc/configuration.md `configuration.md`]]
  * for more information.
  */
abstract class AppConfig extends BitcoinSLogger {

  /** Subtypes of AppConfig should override this type with
    * their own type, ensuring `withOverrides` returns
    * the correct type
    */
  protected type ConfigType <: AppConfig

  /** Constructor to make a new instance of this config type */
  protected def newConfigOfType(configOverrides: List[Config]): ConfigType

  /** List of user-provided configs that should
    * override defaults
    */
  protected val configOverrides: List[Config] = List.empty

  /**
    * This method returns a new `AppConfig`, where every
    * key under `bitcoin-s` overrides the configuration
    * picked up by other means (the `reference.conf`
    * provided by bitcoin-s and the `application.conf`
    * provided by the user). If you pass in configs with
    * overlapping keys (e.g. several configs with the key
    * `bitcoin-s.network`), the latter config overrides the
    * first.
    */
  def withOverrides(config: Config, configs: Config*): ConfigType = {
    // the two val assignments below are workarounds
    // for awkward name resolution in the block below
    val firstOverride = config

    val numOverrides = configs.length + 1

    if (logger.isDebugEnabled()) {
      // force lazy evaluation before we print
      // our lines
      val oldConfStr = this.config.asReadableJson

      logger.debug(s"Creating AppConfig with $numOverrides override(s) ")
      logger.debug(s"Old config:")
      logger.debug(oldConfStr)
    }

    val newConf = newConfigOfType(
      configOverrides = List(firstOverride) ++ configs
    )

    // to avoid an unnecessary lazy load
    if (logger.isDebugEnabled()) {
      // force lazy load before we print
      val newConfStr = newConf.config.asReadableJson

      logger.debug("New config:")
      logger.debug(newConfStr)
    }

    newConf
  }

  /**
    * Name of the module-specific
    * config file: `wallet.conf`, `node.conf`,
    * etc.
    */
  protected def moduleConfigName: String

  /**
    * The configuration details for connecting/using the database for our projects
    * that require database connections
    */
  lazy val dbConfig: DatabaseConfig[SQLiteProfile] = {
    //if we don't pass a specific class, we get non-deterministic
    //errors around the loaded configuration depending
    //on the state of the default classLoader
    //https://github.com/lightbend/config#debugging-your-configuration
    val dbConfig = {
      Try {
        DatabaseConfig.forConfig[SQLiteProfile](path = "database", config)
      } match {
        case Success(value) =>
          value
        case Failure(exception) =>
          logger.error(s"Error when loading database from config: $exception")
          logger.error(s"Configuration: ${config.asReadableJson}")
          throw exception
      }
    }

    logger.trace(s"Resolved DB config: ${dbConfig.config}")

    val _ = createDbFileIfDNE()

    dbConfig
  }

  /** The database we are connecting to */
  lazy val database: Database = {
    dbConfig.db
  }

  /** The path where our DB is located */
  // todo: what happens to this if we
  // don't use SQLite?
  lazy val dbPath: Path = {
    val pathStr = config.getString("database.dbPath")
    val path = Paths.get(pathStr)
    logger.debug(s"DB path: $path")
    path
  }

  private def createDbFileIfDNE(): Unit = {
    //should add a check in here that we are using sqlite
    if (!Files.exists(dbPath)) {
      logger.debug(s"Creating database directory=$dbPath")
      val _ = Files.createDirectories(dbPath)
      ()
    }
  }

  /** Chain parameters for the blockchain we're on */
  lazy val chain: ChainParams = {
    val networkStr = config.getString("network")
    networkStr match {
      case "mainnet" => MainNetChainParams
      case "testnet3" => TestNetChainParams
      case "regtest" => RegTestNetChainParams
      case other: String =>
        throw new IllegalArgumentException(
          s"'$other' is not a recognized network! Available options: mainnet, testnet3, regtest")
    }
  }

  /** The blockchain network we're on */
  lazy val network: NetworkParameters = chain.network

  /**
    * The underlying config that we derive the
    * rest of the fields in this class from
    */
  protected lazy val config: Config = {
    val moduleConfig =
      ConfigFactory.load(moduleConfigName)

    logger.debug(
      s"Module config: ${moduleConfig.getConfig("bitcoin-s").asReadableJson}")

    // `load` tries to resolve substitutions,
    // `parseResources` does not
    val dbConfig = ConfigFactory
      .parseResources("db.conf")

    logger.trace(
      s"DB config: ${dbConfig.getConfig("bitcoin-s").asReadableJson}")

    val classPathConfig =
      ConfigFactory
        .load()

    logger.trace(
      s"Classpath config: ${classPathConfig.getConfig("bitcoin-s").asReadableJson}")

    // loads reference.conf as well as application.conf,
    // if the user has made one
    val unresolvedConfig = classPathConfig
      .withFallback(moduleConfig)
      .withFallback(dbConfig)

    logger.trace(s"Unresolved bitcoin-s config:")
    logger.trace(unresolvedConfig.getConfig("bitcoin-s").asReadableJson)

    val withOverrides =
      if (configOverrides.nonEmpty) {
        val overrides =
          configOverrides
          // we reverse to make the configs specified last take precedence
            .reverse
            .reduce(_.withFallback(_))

        val interestingOverrides = overrides.getConfig("bitcoin-s")
        logger.trace(s"User-overrides for bitcoin-s config:")
        logger.trace(interestingOverrides.asReadableJson)

        // to make the overrides actually override
        // the default settings we have to do it
        // in this order
        overrides.withFallback(unresolvedConfig)
      } else {
        unresolvedConfig
      }

    val config = withOverrides
      .resolve()
      .getConfig("bitcoin-s")

    logger.debug(s"Resolved bitcoin-s config:")
    logger.debug(config.asReadableJson)

    config
  }

  /** The data directory used by bitcoin-s apps */
  lazy val datadir: Path = {
    val basedir = Paths.get(config.getString("datadir"))
    val lastDirname = network match {
      case MainNet => "mainnet"
      case TestNet3 => "testnet3"
      case RegTest => "regtest"
    }
    basedir.resolve(lastDirname)
  }

}

object AppConfig extends BitcoinSLogger {

  /**
    * Matches the default data directory location
    * with a network appended,
    * both with and without a trailing `/`
    */
  private val defaultDatadirRegex: Regex = {
    (Properties.userHome + "/.bitcoin-s/(testnet3|mainnet|regtest)/?$").r
  }

  /**
    * Throws if the encountered datadir is the default one. Useful
    * in tests, to make sure you don't blow up important data.
    */
  private[bitcoins] def throwIfDefaultDatadir(config: AppConfig): Unit = {
    val datadirStr = config.datadir.toString()
    AppConfig.defaultDatadirRegex.findFirstMatchIn(datadirStr) match {
      case None => () // pass
      case Some(_) =>
        val errMsg =
          List(
            "It looks like you haven't changed the data directory in your test configuration.",
            s"Your data directory is $datadirStr. This would cause tests to potentially",
            "overwrite your existing data, which you probably don't want."
          ).mkString(" ")
        throw new RuntimeException(errMsg)
    }
  }
}
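The override precedence `AppConfig.config` implements with Typesafe Config's `withFallback` (later overrides shadow earlier ones; all overrides shadow defaults) can be modeled without the library using plain `Map`s, where the caller's keys win on conflict. All names below are illustrative.

```scala
// Dependency-free model of the withFallback precedence used above.
object OverrideSketch {
  type Conf = Map[String, String]

  // mimics Typesafe Config's withFallback: `self` wins on conflicts
  def withFallback(self: Conf, fallback: Conf): Conf = fallback ++ self

  def resolve(defaults: Conf, overrides: List[Conf]): Conf =
    if (overrides.isEmpty) defaults
    else {
      // reverse so the config specified last takes precedence
      val merged = overrides.reverse.reduce(withFallback)
      withFallback(merged, defaults)
    }

  def main(args: Array[String]): Unit = {
    val defaults = Map("network" -> "regtest", "datadir" -> "~/.bitcoin-s")
    val first = Map("network" -> "testnet3")
    val second = Map("network" -> "mainnet")
    val conf = resolve(defaults, List(first, second))
    assert(conf("network") == "mainnet")      // last override wins
    assert(conf("datadir") == "~/.bitcoin-s") // defaults still visible
    println("ok")
  }
}
```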
db-commons/src/main/scala/org/bitcoins/db/CRUD.scala (new file, 148 lines)
@@ -0,0 +1,148 @@
package org.bitcoins.db

import org.bitcoins.core.util.BitcoinSLogger
import slick.jdbc.SQLiteProfile.api._

import scala.concurrent.{ExecutionContext, Future}

/**
  * Created by chris on 9/8/16.
  * This is an abstract class that can be used to implement any sort of
  * data access object backed by a SQL database. It provides
  * read, update, upsert, and delete methods for your class to call.
  * You are responsible for the create function. You also need to specify
  * the table and the database you are connecting to.
  */
abstract class CRUD[T, PrimaryKeyType] extends BitcoinSLogger {

  def appConfig: AppConfig
  implicit val ec: ExecutionContext

  /** The table inside our database we are inserting into */
  val table: TableQuery[_ <: Table[T]]

  /** Binding to the actual database itself, this is what is used to run queries */
  def database: SafeDatabase = SafeDatabase(appConfig)

  /**
    * create a record in the database
    *
    * @param t - the record to be inserted
    * @return the inserted record
    */
  def create(t: T): Future[T] = createAll(Vector(t)).map(_.head)

  def createAll(ts: Vector[T]): Future[Vector[T]]

  /**
    * read a record from the database
    *
    * @param id - the id of the record to be read
    * @return Option[T] - the record if found, else none
    */
  def read(id: PrimaryKeyType): Future[Option[T]] = {
    val query = findByPrimaryKey(id)
    val rows: Future[Seq[T]] = database.run(query.result)
    rows.map(_.headOption)
  }

  /** Update the corresponding record in the database */
  def update(t: T): Future[T] = {
    updateAll(Vector(t)).map { ts =>
      ts.headOption match {
        case Some(updated) => updated
        case None => throw UpdateFailedException("Update failed for: " + t)
      }
    }
  }

  /** Updates all of the given ts in the database */
  def updateAll(ts: Vector[T]): Future[Vector[T]] = {
    val query = findAll(ts)
    val actions = ts.map(t => query.update(t))
    val affectedRows: Future[Vector[Int]] = database.run(DBIO.sequence(actions))
    val updatedTs = findAll(ts)
    affectedRows.flatMap { _ =>
      database.runVec(updatedTs.result)
    }
  }

  /**
    * delete the corresponding record in the database
    *
    * @param t - the record to be deleted
    * @return int - the number of rows affected by the deletion
    */
  def delete(t: T): Future[Int] = {
    logger.debug("Deleting record: " + t)
    val query: Query[Table[_], T, Seq] = find(t)
    database.run(query.delete)
  }

  /**
    * insert the record if it does not exist, update it if it does
    *
    * @param t - the record to be inserted / updated
    * @return t - the record that has been inserted / updated
    */
  def upsert(t: T): Future[T] = upsertAll(Vector(t)).map(_.head)
|
||||
|
||||
/** Upserts all of the given ts in the database, then returns the upserted values */
|
||||
def upsertAll(ts: Vector[T]): Future[Vector[T]] = {
|
||||
val actions = ts.map(t => table.insertOrUpdate(t))
|
||||
val result: Future[Vector[Int]] = database.run(DBIO.sequence(actions))
|
||||
val findQueryFuture = result.map(_ => findAll(ts).result)
|
||||
findQueryFuture.flatMap(database.runVec(_))
|
||||
}
|
||||
|
||||
/**
|
||||
* return all rows that have a certain primary key
|
||||
*
|
||||
* @param id
|
||||
* @return Query object corresponding to the selected rows
|
||||
*/
|
||||
protected def findByPrimaryKey(id: PrimaryKeyType): Query[Table[_], T, Seq] =
|
||||
findByPrimaryKeys(Vector(id))
|
||||
|
||||
/** Finds the rows that correlate to the given primary keys */
|
||||
protected def findByPrimaryKeys(
|
||||
ids: Vector[PrimaryKeyType]): Query[Table[_], T, Seq]
|
||||
|
||||
/**
|
||||
* return the row that corresponds with this record
|
||||
*
|
||||
* @param t - the row to find
|
||||
* @return query - the sql query to find this record
|
||||
*/
|
||||
protected def find(t: T): Query[Table[_], T, Seq] = findAll(Vector(t))
|
||||
|
||||
protected def findAll(ts: Vector[T]): Query[Table[_], T, Seq]
|
||||
|
||||
}
|
||||
|
||||
case class SafeDatabase(config: AppConfig) extends BitcoinSLogger {
|
||||
|
||||
import config.database
|
||||
|
||||
/**
|
||||
* SQLite does not enable foreign keys by default. This query is
|
||||
* used to enable it. It must be included in all connections to
|
||||
* the database.
|
||||
*/
|
||||
private val foreignKeysPragma = sqlu"PRAGMA foreign_keys = TRUE;"
|
||||
|
||||
def run[R](action: DBIOAction[R, NoStream, _]): Future[R] = {
|
||||
|
||||
val result = database.run[R](foreignKeysPragma >> action)
|
||||
result
|
||||
}
|
||||
|
||||
def runVec[R](action: DBIOAction[Seq[R], NoStream, _])(
|
||||
implicit ec: ExecutionContext): Future[Vector[R]] = {
|
||||
val result = database.run[Seq[R]](foreignKeysPragma >> action)
|
||||
result.map(_.toVector)
|
||||
}
|
||||
}
|
||||
|
||||
case class UpdateFailedException(message: String)
|
||||
extends RuntimeException(message)
|
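The create/read/update/upsert/delete shape of `CRUD` above can be illustrated without Slick at all. The following is a sketch only, against an in-memory map instead of a database; `InMemoryCrud` and its demo names are hypothetical and not part of bitcoin-s:

```scala
import scala.collection.mutable

// Hypothetical in-memory store mirroring the CRUD method shape above:
// rows are keyed by a primary key extracted from the record itself.
final class InMemoryCrud[K, T](primaryKey: T => K) {
  private val rows = mutable.Map.empty[K, T]

  def create(t: T): T = { rows.update(primaryKey(t), t); t }

  def read(id: K): Option[T] = rows.get(id)

  // like CRUD.update, fails loudly if the row does not already exist
  def update(t: T): T = {
    val k = primaryKey(t)
    require(rows.contains(k), s"Update failed for: $t")
    rows.update(k, t)
    t
  }

  // insert-or-update collapses to a plain put for a map
  def upsert(t: T): T = create(t)

  // returns the number of rows affected, as CRUD.delete does
  def delete(t: T): Int =
    if (rows.remove(primaryKey(t)).isDefined) 1 else 0
}

object InMemoryCrudDemo extends App {
  case class Row(id: Long, name: String)
  val crud = new InMemoryCrud[Long, Row](_.id)
  crud.create(Row(1, "a"))
  crud.upsert(Row(1, "b")) // updates the existing row
  println(crud.read(1)) // Some(Row(1,b))
  println(crud.delete(Row(1, "b"))) // 1
}
```

The real implementation differs in that every method returns a `Future` and runs a Slick action through `SafeDatabase`, but the contract per method is the same.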
30 db-commons/src/main/scala/org/bitcoins/db/CRUDAutoInc.scala Normal file

@@ -0,0 +1,30 @@
package org.bitcoins.db

import slick.dbio.Effect.Write
import slick.jdbc.SQLiteProfile.api._

import scala.concurrent.Future

abstract class CRUDAutoInc[T <: DbRowAutoInc[T]] extends CRUD[T, Long] {

  /** The table inside our database we are inserting into */
  override val table: TableQuery[_ <: TableAutoInc[T]]

  override def createAll(ts: Vector[T]): Future[Vector[T]] = {
    val query = table
      .returning(table.map(_.id))
      .into((t, id) => t.copyWithId(id = id))
    val actions: Vector[DBIOAction[query.SingleInsertResult, NoStream, Write]] =
      ts.map(r => query.+=(r))
    database.runVec(DBIO.sequence(actions))
  }

  override def findByPrimaryKeys(ids: Vector[Long]): Query[Table[_], T, Seq] = {
    table.filter(_.id.inSet(ids))
  }

  override def findAll(ts: Vector[T]): Query[Table[_], T, Seq] = {
    val ids = ts.filter(_.id.isDefined).map(_.id.get)
    findByPrimaryKeys(ids)
  }
}
@@ -0,0 +1,169 @@
package org.bitcoins.db

import org.bitcoins.core.crypto._
import org.bitcoins.core.number.{Int32, UInt32, UInt64}
import org.bitcoins.core.protocol.BitcoinAddress
import org.bitcoins.core.protocol.script.{ScriptPubKey, ScriptWitness}
import org.bitcoins.core.protocol.transaction.{
  TransactionOutPoint,
  TransactionOutput
}
import org.bitcoins.core.script.ScriptType
import org.bitcoins.core.serializers.script.RawScriptWitnessParser
import scodec.bits.ByteVector
import slick.jdbc.SQLiteProfile.api._
import org.bitcoins.core.hd.HDCoinType
import org.bitcoins.core.hd.HDPath
import org.bitcoins.core.hd.HDChainType
import org.bitcoins.core.hd.HDPurpose
import org.bitcoins.core.hd.HDPurposes
import org.bitcoins.core.hd.SegWitHDPath
import slick.jdbc.GetResult

abstract class DbCommonsColumnMappers {

  /**
    * If executing something like this:
    *
    * {{{
    * sql"SELECT * FROM sqlite_master where type='table'"
    * }}}
    *
    * you end up with something like this:
    * {{{
    * /-------+---------------+---------------+----------+----------------------\
    * | 1     | 2             | 3             | 4        | 5                    |
    * | type  | name          | tbl_name      | rootpage | sql                  |
    * |-------+---------------+---------------+----------+----------------------|
    * | table | block_headers | block_headers | 2        | CREATE TABLE "blo... |
    * \-------+---------------+---------------+----------+----------------------/
    * }}}
    *
    * This is most likely an implementation that will break if you try and cast
    * the result of a different raw SQL query into a
    * [[org.bitcoins.db.SQLiteTableInfo SQLiteTableInfo]].
    */
  implicit val sqliteTableInfoReader: GetResult[SQLiteTableInfo] =
    GetResult[SQLiteTableInfo] { row =>
      row.nextString() // type
      row.nextString() // name
      val tableName = row.nextString()
      row.nextString() // rootpage
      val sql = row.nextString()
      SQLiteTableInfo(tableName, sql)
    }

  /** Responsible for mapping a [[DoubleSha256Digest]] to a String, and vice versa */
  implicit val doubleSha256DigestMapper: BaseColumnType[DoubleSha256Digest] =
    MappedColumnType.base[DoubleSha256Digest, String](
      _.hex,
      DoubleSha256Digest.fromHex
    )

  implicit val doubleSha256DigestBEMapper: BaseColumnType[
    DoubleSha256DigestBE] =
    MappedColumnType.base[DoubleSha256DigestBE, String](
      _.hex,
      DoubleSha256DigestBE.fromHex
    )

  implicit val ecPublicKeyMapper: BaseColumnType[ECPublicKey] =
    MappedColumnType.base[ECPublicKey, String](_.hex, ECPublicKey.fromHex)

  implicit val sha256Hash160DigestMapper: BaseColumnType[Sha256Hash160Digest] =
    MappedColumnType
      .base[Sha256Hash160Digest, String](_.hex, Sha256Hash160Digest.fromHex)

  /** Responsible for mapping a [[UInt32]] to a long in Slick, and vice versa */
  implicit val uInt32Mapper: BaseColumnType[UInt32] =
    MappedColumnType.base[UInt32, Long](
      tmap = _.toLong,
      tcomap = UInt32(_)
    )

  implicit val int32Mapper: BaseColumnType[Int32] = {
    MappedColumnType.base[Int32, Long](tmap = _.toLong, tcomap = Int32(_))
  }

  /** Responsible for mapping a [[TransactionOutput]] to hex in Slick, and vice versa */
  implicit val transactionOutputMapper: BaseColumnType[TransactionOutput] = {
    MappedColumnType.base[TransactionOutput, String](
      _.hex,
      TransactionOutput(_)
    )
  }

  implicit val uint64Mapper: BaseColumnType[UInt64] = {
    MappedColumnType.base[UInt64, BigDecimal](
      { u64: UInt64 =>
        BigDecimal(u64.toBigInt.bigInteger)
      },
      // this has the potential to throw
      { bigDec: BigDecimal =>
        UInt64(bigDec.toBigIntExact().get)
      }
    )
  }

  implicit val transactionOutPointMapper: BaseColumnType[TransactionOutPoint] = {
    MappedColumnType
      .base[TransactionOutPoint, String](_.hex, TransactionOutPoint(_))
  }

  implicit val scriptPubKeyMapper: BaseColumnType[ScriptPubKey] = {
    MappedColumnType.base[ScriptPubKey, String](_.hex, ScriptPubKey(_))
  }

  implicit val scriptWitnessMapper: BaseColumnType[ScriptWitness] = {
    MappedColumnType
      .base[ScriptWitness, String](
        _.hex,
        hex => RawScriptWitnessParser.read(ByteVector.fromValidHex(hex)))
  }

  implicit val byteVectorMapper: BaseColumnType[ByteVector] = {
    MappedColumnType
      .base[ByteVector, String](_.toHex, ByteVector.fromValidHex(_))
  }

  implicit val xpubMapper: BaseColumnType[ExtPublicKey] = {
    MappedColumnType
      .base[ExtPublicKey, String](_.toString, ExtPublicKey.fromString(_).get)
  }

  implicit val hdCoinTypeMapper: BaseColumnType[HDCoinType] = {
    MappedColumnType.base[HDCoinType, Int](_.toInt, HDCoinType.fromInt)
  }

  implicit val hdPathMappper: BaseColumnType[HDPath] =
    MappedColumnType
      .base[HDPath, String](_.toString, HDPath.fromString(_).get) // hm rethink .get?

  implicit val segwitPathMappper: BaseColumnType[SegWitHDPath] =
    MappedColumnType
      .base[SegWitHDPath, String](_.toString, SegWitHDPath.fromString(_)) // hm rethink .get?

  implicit val hdChainTypeMapper: BaseColumnType[HDChainType] =
    MappedColumnType.base[HDChainType, Int](_.index, HDChainType.fromInt)

  implicit val hdPurposeMapper: BaseColumnType[HDPurpose] =
    MappedColumnType
      .base[HDPurpose, Int](_.constant, HDPurposes.fromConstant(_).get) // hm rethink .get

  implicit val bitcoinAddressMapper: BaseColumnType[BitcoinAddress] =
    MappedColumnType
      .base[BitcoinAddress, String](_.value, BitcoinAddress.fromStringExn)

  implicit val scriptTypeMapper: BaseColumnType[ScriptType] =
    MappedColumnType
      .base[ScriptType, String](_.toString, ScriptType.fromStringExn)

  implicit val aesSaltMapper: BaseColumnType[AesSalt] =
    MappedColumnType.base[AesSalt, String](
      _.value.toHex,
      hex => AesSalt(ByteVector.fromValidHex(hex)))
}

object DbCommonsColumnMappers extends DbCommonsColumnMappers
51 db-commons/src/main/scala/org/bitcoins/db/DbManagement.scala Normal file

@@ -0,0 +1,51 @@
package org.bitcoins.db

import org.bitcoins.core.util.BitcoinSLogger
import slick.jdbc.SQLiteProfile.api._

import scala.concurrent.{ExecutionContext, Future}

abstract class DbManagement extends BitcoinSLogger {
  def allTables: List[TableQuery[_ <: Table[_]]]

  /** Lists all tables in the given database */
  def listTables(db: Database): Future[Vector[SQLiteTableInfo]] = {
    import DbCommonsColumnMappers._
    val query = sql"SELECT * FROM sqlite_master where type='table'"
      .as[SQLiteTableInfo]
    db.run(query)
  }

  def createAll()(
      implicit config: AppConfig,
      ec: ExecutionContext): Future[List[Unit]] = {
    Future.sequence(allTables.map(createTable(_)))
  }

  def dropAll()(
      implicit config: AppConfig,
      ec: ExecutionContext): Future[List[Unit]] = {
    Future.sequence(allTables.reverse.map(dropTable(_)))
  }

  def createTable(
      table: TableQuery[_ <: Table[_]],
      createIfNotExists: Boolean = true)(
      implicit config: AppConfig): Future[Unit] = {
    import config.database
    val result = if (createIfNotExists) {
      database.run(table.schema.createIfNotExists)
    } else {
      database.run(table.schema.create)
    }
    result
  }

  def dropTable(
      table: TableQuery[_ <: Table[_]]
  )(implicit config: AppConfig): Future[Unit] = {
    import config.database
    val result = database.run(table.schema.dropIfExists)
    result
  }
}
15 db-commons/src/main/scala/org/bitcoins/db/DbRowAutoInc.scala Normal file

@@ -0,0 +1,15 @@
package org.bitcoins.db

/** This is meant to be coupled with [[CRUDAutoInc]]
  * and [[TableAutoInc]] to allow for automatically incrementing an id
  * when inserting something into a database. This removes the
  * boilerplate of this having to happen everywhere a [[CRUD]]
  * is created
  */
abstract class DbRowAutoInc[T] {

  def id: Option[Long]

  def copyWithId(id: Long): T

}
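The `copyWithId` contract above is trivially satisfied by a case-class `copy`. A self-contained sketch (the `WidgetDb` name is hypothetical, and the `DbRowAutoInc` supertype is elided so the snippet runs standalone):

```scala
// Hypothetical row type following the DbRowAutoInc pattern: `id` is None
// until the database assigns one, and copyWithId is just a case-class copy.
case class WidgetDb(id: Option[Long], name: String) {
  def copyWithId(id: Long): WidgetDb = this.copy(id = Some(id))
}

object WidgetDbDemo extends App {
  val unsaved = WidgetDb(id = None, name = "widget")
  // CRUDAutoInc.createAll uses Slick's `returning ... into` to call
  // copyWithId with the auto-incremented id the database handed back.
  val saved = unsaved.copyWithId(1L)
  println(saved) // WidgetDb(Some(1),widget)
}
```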
@@ -0,0 +1,7 @@
package org.bitcoins.db

/**
  * @param name The name of the table
  * @param sql The SQL executed to create the table
  */
case class SQLiteTableInfo(name: String, sql: String)
19 db-commons/src/main/scala/org/bitcoins/db/SlickUtil.scala Normal file

@@ -0,0 +1,19 @@
package org.bitcoins.db

import scala.concurrent.Future
import slick.jdbc.SQLiteProfile.api._

sealed abstract class SlickUtil {

  /** Creates rows in a database that are not auto incremented */
  def createAllNoAutoInc[T, U <: Table[T]](
      ts: Vector[T],
      database: SafeDatabase,
      table: TableQuery[U]): Future[Vector[T]] = {
    val actions = ts.map(t => (table += t).andThen(DBIO.successful(t)))
    val result = database.run(DBIO.sequence(actions))
    result
  }
}

object SlickUtil extends SlickUtil
17 db-commons/src/main/scala/org/bitcoins/db/TableAutoInc.scala Normal file

@@ -0,0 +1,17 @@
package org.bitcoins.db

import slick.jdbc.SQLiteProfile.api._

/** Defines a table that has an auto-incremented field named id.
  * This is useful for things we want to store that don't have an
  * inherent id, such as a hash.
  * @param tag
  * @param tableName
  * @tparam T
  */
abstract class TableAutoInc[T](tag: Tag, tableName: String)
    extends Table[T](tag, tableName) {

  def id: Rep[Long] = column[Long]("id", O.PrimaryKey, O.AutoInc)

}
16 db-commons/src/main/scala/org/bitcoins/db/package..scala Normal file

@@ -0,0 +1,16 @@
package org.bitcoins

import com.typesafe.config.Config
import com.typesafe.config.ConfigRenderOptions

package object db {

  implicit class ConfigOps(private val config: Config) extends AnyVal {

    def asReadableJson: String = {
      val options = ConfigRenderOptions.concise().setFormatted(true)
      config.root().render(options)
    }
  }

}
16 doc/README.md Normal file

@@ -0,0 +1,16 @@
## Ammonite scripts

This project contains [Ammonite](https://ammonite.io) scripts that demonstrate
functionality of `bitcoin-s`.

#### Running them with sbt:

```bash
$ sbt "doc/run path/to/script.sc" # this is very slow, not recommended
```

#### Running them with the [Bloop CLI](https://scalacenter.github.io/bloop/):

```bash
$ bloop run doc --args path/to/script.sc # much faster than through sbt
```
40 doc/configuration.md Normal file

@@ -0,0 +1,40 @@
# bitcoin-s configuration

bitcoin-s uses [HOCON](https://github.com/lightbend/config/blob/master/HOCON.md)
to configure the various parts of the application the library offers. HOCON is a
superset of JSON, that is, all valid JSON is valid HOCON.

All configuration for bitcoin-s is under the `bitcoin-s` key. The most interesting
configurable parts right now are `datadir` and `network`. See
[`db-commons/src/main/resources/reference.conf`](../db-commons/src/main/resources/reference.conf)
for more information. In the future there will be separate keys under `bitcoin-s`
for the `wallet`, `chain` and `node` modules.

If you have a file `application.conf` anywhere on your classpath when using
bitcoin-s, the values there take precedence over the ones found in our
`reference.conf`.

The resolved configuration gets parsed by
[`AppConfig`](../db-commons/src/main/scala/org/bitcoins/db/AppConfig.scala).
You can call `.withOverrides` on this to override any value in the
bitcoin-s configuration. An example of this would be:

```scala
import org.bitcoins.wallet.config.WalletAppConfig
import com.typesafe.config.ConfigFactory

val myConfig = ConfigFactory.parseString("bitcoin-s.network = testnet3")
val walletConfig = WalletAppConfig.withOverrides(myConfig)
```

You can pass as many configs as you'd like into `withOverrides`. If any
keys appear multiple times, the last one encountered takes precedence.
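For instance, a sketch building on the example above (the specific key values are illustrative only):

```scala
import org.bitcoins.wallet.config.WalletAppConfig
import com.typesafe.config.ConfigFactory

val first  = ConfigFactory.parseString("bitcoin-s.network = testnet3")
val second = ConfigFactory.parseString("bitcoin-s.network = regtest")

// `bitcoin-s.network` appears in both configs; the last one
// encountered wins, so the resolved network is regtest.
val walletConfig = WalletAppConfig.withOverrides(first, second)
```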
## Internal configuration

Database connections are also configured by using HOCON. This is done in
[`db.conf`](../db-commons/src/main/resources/db.conf)
(as well as [`application.conf`](../testkit/src/main/resources/application.conf)
in `testkit` for running tests). The options exposed here are **not** intended to
be used by users of bitcoin-s, and are internal only.
13 doc/database.md Normal file

@@ -0,0 +1,13 @@
## bitcoin-s databases

### node project

This contains information related to peer-to-peer networking and chainstate for the bitcoin-s project. You can see the configuration for these databases [here](../node/src/main/resources/reference.conf)

Database names:

- `nodedb` - the mainnet database
- `nodedb-testnet3` - the testnet3 database
- `nodedb-regtest` - the regtest database
- `nodedb-unittest` - the database used by unit tests.
@@ -1,23 +1,22 @@
<configuration>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/test-application.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/doc.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <logger name="slick" level="INFO"/>
  <root level="DEBUG">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="FILE"/>
  </root>

</configuration>
157 doc/src/main/scala/TxBuilderExample.scala Normal file

@@ -0,0 +1,157 @@
import org.bitcoins.core.config.RegTest
import org.bitcoins.core.crypto.ECPrivateKey
import org.bitcoins.core.currency.Satoshis
import org.bitcoins.core.number.{Int32, Int64, UInt32}
import org.bitcoins.core.protocol.script.P2PKHScriptPubKey
import org.bitcoins.core.protocol.transaction.{
  BaseTransaction,
  Transaction,
  TransactionOutPoint,
  TransactionOutput
}
import org.bitcoins.core.script.crypto.HashType
import org.bitcoins.core.wallet.builder.BitcoinTxBuilder
import org.bitcoins.core.wallet.fee.SatoshisPerByte
import org.bitcoins.core.wallet.utxo.BitcoinUTXOSpendingInfo
import org.scalatest.{FlatSpec, MustMatchers}

import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future

class TxBuilderExample extends FlatSpec with MustMatchers {

  behavior of "TxBuilderExample"

  it must "build a signed tx" in {

    //This is a documented example of how to create a signed bitcoin transaction
    //with bitcoin-s. You can run this test case with the following sbt command

    //$ sbt "doc/testOnly *TxBuilderExample -- -z signed"

    //generate a fresh private key that we are going to use in the scriptpubkey
    val privKey = ECPrivateKey.freshPrivateKey

    //this is the script that the TxBuilder is going to create a
    //script signature that validly spends this scriptPubKey
    val creditingSpk = P2PKHScriptPubKey(pubKey = privKey.publicKey)
    val amount = Satoshis(Int64(10000))

    //this is the utxo we are going to be spending
    val utxo =
      TransactionOutput(currencyUnit = amount, scriptPubKey = creditingSpk)

    //the private key that locks the funds for the script we are spending to
    val destinationPrivKey = ECPrivateKey.freshPrivateKey

    //the amount we are sending -- 5000 satoshis -- to the destinationSPK
    val destinationAmount = Satoshis(Int64(5000))

    //the script that corresponds to destination private key, this is what is protecting the money
    val destinationSPK =
      P2PKHScriptPubKey(pubKey = destinationPrivKey.publicKey)

    //this is where we are sending money to
    //we could add more destinations here if we
    //wanted to batch transactions
    val destinations = {
      val destination1 = TransactionOutput(currencyUnit = destinationAmount,
                                           scriptPubKey = destinationSPK)

      List(destination1)
    }

    //we have to fabricate a transaction that contains the
    //utxo we are trying to spend. If this were a real blockchain
    //we would need to reference the utxo set
    val creditingTx = BaseTransaction(version = Int32.one,
                                      inputs = List.empty,
                                      outputs = List(utxo),
                                      lockTime = UInt32.zero)

    //this is the information we need from the crediting tx
    //to properly "link" it in the transaction we are creating
    val outPoint = TransactionOutPoint(creditingTx.txId, UInt32.zero)

    // this contains all the information we need to
    // validly sign the utxo above
    val utxoSpendingInfo = BitcoinUTXOSpendingInfo(outPoint = outPoint,
                                                   output = utxo,
                                                   signers = List(privKey),
                                                   redeemScriptOpt = None,
                                                   scriptWitnessOpt = None,
                                                   hashType =
                                                     HashType.sigHashAll)

    //all of the utxo spending information, since we are only
    //spending one utxo, this is just one element
    val utxos: List[BitcoinUTXOSpendingInfo] = List(utxoSpendingInfo)

    //this is how much we are going to pay as a fee to the network
    //for this example, we are going to pay 1 satoshi per byte
    val feeRate = SatoshisPerByte(Satoshis.one)

    val changePrivKey = ECPrivateKey.freshPrivateKey
    val changeSPK = P2PKHScriptPubKey(pubKey = changePrivKey.publicKey)

    // the network we are on, for this example we are using
    // the regression test network. This is a network you control
    // on your own machine
    val networkParams = RegTest

    //yay! Now we have a TxBuilder object that we can use
    //to sign the tx.
    val txBuilder: Future[BitcoinTxBuilder] = {
      BitcoinTxBuilder(
        destinations = destinations,
        utxos = utxos,
        feeRate = feeRate,
        changeSPK = changeSPK,
        network = networkParams
      )
    }

    txBuilder.failed.foreach { case err => println(err.getMessage) }

    //let's finally produce a validly signed tx
    //The 'sign' method is going to produce a validly signed transaction
    //This is going to iterate through each of the 'utxos' and use
    //the corresponding 'UTXOSpendingInfo' to produce a validly
    //signed input. This tx has:
    //
    //1. one input
    //2. two outputs (destination and change outputs)
    //3. a fee rate of 1 satoshi/byte
    val signedTxF: Future[Transaction] = txBuilder.flatMap(_.sign)

    //let's print these things out so you can examine them
    signedTxF.map { tx =>
      println("\nInputs:")
      tx.inputs.foreach(println)

      println("\nOutputs:")
      tx.outputs.foreach(println)

      //here is the fully signed serialized tx that
      //you COULD broadcast to a cryptocurrency p2p network
      println(s"\nFully signed tx in hex:")

      println(s"${tx.hex}")
    }

    //The output from the print statements should read something like this

    //Inputs:
    //TransactionInputImpl(TransactionOutPointImpl(DoubleSha256DigestImpl(43c75d1d59e6f13f2ad3baf6e124685ba0919bccdbdf89c362fe2f30fee4bdfc),UInt32Impl(0)),P2PKHScriptSignature(6a4730440220573a7bbbd59192c4bf01b8f1dcafe981d11ab8528fead9d66d702c1b72e5dc76022007946a423073c949e85a4ca3901ab10a2d6b72873a347d2a55ef873016adae8601210356d581971934349333066ed933cdea45ae9c72829ce34d8dd6a758d56967e4cb),UInt32Impl(0))
    //
    //Outputs:
    //TransactionOutputImpl(SatoshisImpl(Int64Impl(5000)),P2PKHScriptPubKeyImpl(1976a914dbdadae42124c46a00d81181e5d9ab28fbf546ed88ac))
    //TransactionOutputImpl(SatoshisImpl(Int64Impl(4774)),P2PKHScriptPubKeyImpl(1976a914a95eb0d284593f0c8f818f64a55fa6e3852012a688ac))
    //
    //Fully signed tx in hex:
    //020000000143c75d1d59e6f13f2ad3baf6e124685ba0919bccdbdf89c362fe2f30fee4bdfc000000006a4730440220573a7bbbd59192c4bf01b8f1dcafe981d11ab8528fead9d66d702c1b72e5dc76022007946a423073c949e85a4ca3901ab10a2d6b72873a347d2a55ef873016adae8601210356d581971934349333066ed933cdea45ae9c72829ce34d8dd6a758d56967e4cb000000000288130000000000001976a914dbdadae42124c46a00d81181e5d9ab28fbf546ed88aca6120000000000001976a914a95eb0d284593f0c8f818f64a55fa6e3852012a688ac00000000

    //remember, you can call .hex on any bitcoin-s data structure to get the hex representation!
  }

}
43 doc/src/main/scala/org/bitcoins/doc/AmmoniteBridge.scala Normal file

@@ -0,0 +1,43 @@
package org.bitcoins.doc
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import scala.util.Properties

object amm extends App {

  /** Gets all files ending with .sc in dir or subdirs */
  def getScripts(dir: Path): Seq[Path] = {
    import scala.collection.JavaConverters._

    Files
      .walk(dir)
      .iterator()
      .asScala
      .filter(Files.isRegularFile(_))
      .filter(_.toString.endsWith(".sc"))
      .toList
  }

  if (args.isEmpty || args.headOption.forall(_.isEmpty)) {
    import System.err.{println => printerr}

    printerr("No script name provided!")
    printerr()

    val cwd = Paths.get(Properties.userDir)
    val scripts = getScripts(cwd)

    if (scripts.nonEmpty) {
      printerr("Available scripts:")
      scripts.foreach { script =>
        printerr(s"  ${cwd.relativize(script)}")
      }
    } else {
      printerr("No .sc scripts found!")
    }
    sys.exit(1)
  } else {
    ammonite.Main.main(args)
  }
}
82 doc/src/main/scala/org/bitcoins/doc/chain/sync-chain.sc Normal file

@@ -0,0 +1,82 @@
import org.bitcoins.rpc.config._

import akka.actor.ActorSystem
import org.bitcoins.chain.db._
import org.bitcoins.chain.config._
import org.bitcoins.chain.blockchain._
import org.bitcoins.chain.blockchain.sync._
import org.bitcoins.chain.models._

import org.bitcoins.core.protocol.blockchain._
import org.bitcoins.rpc.client.common._
import org.bitcoins.testkit.chain._
import org.bitcoins.wallet._
import org.bitcoins.wallet.api._

import org.slf4j.LoggerFactory

import scala.collection.JavaConverters._
import scala.concurrent._
import scala.concurrent.duration.DurationInt
import scala.util._

//the goal for this script is to create a chain and sync it
//to disk after creation

//we should be able to read this chain on subsequent runs
//assuming we are connected to the same bitcoind instance

//you can run this script with
//$ sbt "doc/run doc/src/main/scala/org/bitcoins/doc/chain/sync-chain.sc"

//boring config stuff
val logger = LoggerFactory.getLogger("org.bitcoins.doc.chain.SyncChain")
val time = System.currentTimeMillis()
implicit val system = ActorSystem(s"sync-chain-${time}")
import system.dispatcher

//first we are assuming that a bitcoind regtest node is running in
//the background, you can see the 'connect_bitcoind.sc' script
//to see how to bind to a local/remote bitcoind node
//This script assumes that you have a bitcoind instance running in the
//background and that you have ~/.bitcoin/bitcoin.conf setup.
//you need to have 'rpcuser' and 'rpcpassword' set in that bitcoin.conf file
//You can pass in an alternative datadir if you wish by constructing a new java.io.File()
val bitcoindInstance = BitcoindInstance.fromDatadir()
val rpcCli = new BitcoindRpcClient(bitcoindInstance)

logger.info(s"Done configuring rpc client")
//next we need to create a way to monitor the chain
val getBestBlockHash = ChainTestUtil.bestBlockHashFnRpc(Future.successful(rpcCli))

val getBlockHeader = ChainTestUtil.getBlockHeaderFnRpc(Future.successful(rpcCli))

val chainDbConfig = ChainDbConfig.RegTestDbConfig
val chainAppConfig = ChainAppConfig(chainDbConfig)

logger.info(s"Creating chain tables")
//initialize chain tables in bitcoin-s if they do not exist
val chainProjectInitF = ChainTestUtil.initializeIfNeeded(chainAppConfig)

val blockHeaderDAO = BlockHeaderDAO(appConfig = chainAppConfig)

val chainHandler = ChainHandler(blockHeaderDAO, chainAppConfig)

val syncedChainApiF = chainProjectInitF.flatMap { _ =>
  logger.info(s"Beginning sync to bitcoin-s chain state")
  ChainSync.sync(chainHandler, getBlockHeader, getBestBlockHash)
}

val syncResultF = syncedChainApiF.flatMap { chainApi =>
  chainApi.getBlockCount.map(count => logger.info(s"chain api blockcount=${count}"))

  rpcCli.getBlockCount.map(count => logger.info(s"bitcoind blockcount=${count}"))
}

syncResultF.onComplete { case result =>
  logger.info(s"Sync result=${result}")
  system.terminate()
}
doc/src/main/scala/org/bitcoins/doc/wallet/create-wallet.sc (new file, 232 lines)
@@ -0,0 +1,232 @@
import java.io.File

import org.bitcoins.chain.blockchain.{Blockchain, ChainHandler}
import org.bitcoins.chain.models.{BlockHeaderDAO, BlockHeaderDb, BlockHeaderDbHelper}
import org.bitcoins.core.protocol.blockchain.{Block, RegTestNetChainParams}
import org.bitcoins.wallet.Wallet
import org.bitcoins.wallet.api.InitializeWalletSuccess
import scodec.bits.ByteVector
import akka.actor.ActorSystem
import org.bitcoins.chain.api.ChainApi
import com.typesafe.config.ConfigFactory
import org.bitcoins.chain.db.ChainDbManagement
import org.bitcoins.chain.db.ChainDbConfig
import org.bitcoins.chain.config.ChainAppConfig
import org.bitcoins.chain.blockchain.sync.ChainSync
import org.bitcoins.core.util.BitcoinSLogger
import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.core.currency._
import org.bitcoins.core.protocol.transaction._
import org.bitcoins.core.number._
import org.bitcoins.rpc.client.common.BitcoindRpcClient
import org.bitcoins.rpc.client.v17.BitcoindV17RpcClient
import org.bitcoins.rpc.config.BitcoindInstance
import org.bitcoins.rpc.util.RpcUtil
import org.bitcoins.testkit.rpc.BitcoindRpcTestUtil
import org.bitcoins.wallet.db.WalletDbManagement
import org.bitcoins.wallet.db.WalletDbConfig
import org.bitcoins.wallet.config.WalletAppConfig

import org.bitcoins.zmq.ZMQSubscriber
import org.slf4j.LoggerFactory

import scala.collection.JavaConverters._
import scala.concurrent._
import scala.concurrent.duration.DurationInt
import scala.util._

/**
  * This is for example purposes only!
  * It shows how to peer a bitcoin-s wallet
  * with a bitcoind instance that relays
  * information about what is happening on the blockchain
  * to the bitcoin-s wallet.
  *
  * This is useful if you want more flexible signing
  * procedures in the JVM ecosystem and more
  * granular control over your utxos with
  * popular databases like postgres, sqlite, etc.
  */

//you can run this script with the following command
//$ sbt "doc/run doc/src/main/scala/org/bitcoins/doc/wallet/create-wallet.sc"

val logger = LoggerFactory.getLogger("org.bitcoins.doc.wallet.CreateWallet")
val time = System.currentTimeMillis()
//boilerplate config
implicit val system = ActorSystem(s"wallet-scala-sheet-${time}")
import system.dispatcher

val chainDbConfig = ChainDbConfig.RegTestDbConfig
val chainAppConfig = ChainAppConfig(chainDbConfig)
implicit val chainParams = chainAppConfig.chain

val walletDbConfig = WalletDbConfig.RegTestDbConfig
val walletAppConfig = WalletAppConfig(walletDbConfig)

val datadir = new File(s"/tmp/bitcoin-${time}/")
val bitcoinConf = new File(datadir.getAbsolutePath + "/bitcoin.conf")

logger.info(s"bitcoin.conf location=${bitcoinConf.getAbsolutePath}")
datadir.mkdirs()
bitcoinConf.createNewFile()

val config = BitcoindRpcTestUtil.standardConfig
val _ = BitcoindRpcTestUtil.writeConfigToFile(config, datadir)

//construct bitcoind
val instance = BitcoindInstance.fromConfig(config = config, datadir)
val bitcoind = new BitcoindRpcClient(instance = instance)

//start bitcoind, this may take a little while
val bitcoindF = bitcoind.start().map(_ => bitcoind)

//create a native chain handler for bitcoin-s
val blockHeaderDAO: BlockHeaderDAO = BlockHeaderDAO(appConfig = chainAppConfig)
val genesisHeader = BlockHeaderDbHelper.fromBlockHeader(
  height = 0,
  bh = chainAppConfig.chain.genesisBlock.blockHeader)

val blockHeaderTableF = {
  //drop the regtest table if it exists
  val dropTableF = ChainDbManagement.dropHeaderTable(chainDbConfig)

  //recreate the table
  val createdTableF = dropTableF.flatMap(_ => ChainDbManagement.createHeaderTable(chainDbConfig))

  createdTableF
}
val createdGenHeaderF = blockHeaderTableF.flatMap(_ => blockHeaderDAO.create(genesisHeader))

val chainF = createdGenHeaderF.map(h => Vector(h))

val blockchainF = chainF.map(chain => Blockchain(chain))

val chainHandlerF = blockchainF.map(_ => ChainHandler(blockHeaderDAO, chainAppConfig))

//generate 101 blocks so we have mature coinbase money in our bitcoind wallet
val chainApi101BlocksF = sync(chainHandlerF, 101)

val bitcoinsLogF = chainApi101BlocksF.flatMap { chainApi =>
  chainApi.getBlockCount.map(count => logger.info(s"bitcoin-s blockcount=${count}"))
}

val walletF = bitcoinsLogF.flatMap { _ =>
  //create the wallet tables
  val dropTablesF = WalletDbManagement.dropAll(walletDbConfig)
  val createTablesF = dropTablesF.flatMap(_ => WalletDbManagement.createAll(walletDbConfig))
  createTablesF.flatMap { _ =>
    Wallet.initialize(walletAppConfig)
      .collect { case success: InitializeWalletSuccess => success.wallet }
  }
}

val bitcoinsAddrF = walletF.flatMap(_.getNewAddress())

//send money to our wallet with bitcoind
val amt = Bitcoins.one
val transactionOutputIndexF: Future[(Transaction, Int)] = for {
  bitcoind <- bitcoindF
  bitcoinsAddr <- bitcoinsAddrF
  txid <- bitcoind.sendToAddress(bitcoinsAddr, amt)
  tx <- bitcoind.getRawTransactionRaw(txid)
} yield {
  logger.info(s"Sending ${amt} to address ${bitcoinsAddr.value}")
  //find the output that pays our wallet's address
  val Some((_, index)) = tx.outputs.zipWithIndex.find { case (output, _) =>
    output.scriptPubKey == bitcoinsAddr.scriptPubKey
  }

  (tx, index)
}

//add the utxo that was just created by bitcoind to our wallet
val addUtxoF = for {
  wallet <- walletF
  (tx, index) <- transactionOutputIndexF
  addUtxo <- wallet.addUtxo(tx, UInt32(index))
} yield {
  logger.info(s"Add utxo result=${addUtxo}")
  addUtxo
}

//bury the utxo under enough proof of work to make it confirmed
val chainApi6BlocksF = for {
  _ <- addUtxoF
  (tx, _) <- transactionOutputIndexF
  _ <- sync(chainApi101BlocksF, 6)
} yield {
  logger.info(s"txid=${tx.txId.flip.hex}")
}

//check balance & clean everything up
chainApi6BlocksF.onComplete { _ =>
  val balanceF = walletF.flatMap(_.getBalance)

  balanceF.onComplete(balance => logger.info(s"bitcoin-s wallet balance=${balance}"))

  balanceF.flatMap(_ => cleanup())
}

/** Syncs the given number of blocks to our chain */
def sync(chainHandlerF: Future[ChainApi], numBlocks: Int)(implicit ec: ExecutionContext): Future[ChainApi] = {
  //we need a way to connect bitcoin-s to our running bitcoind; we do this via rpc for now.
  //we need to implement the 'getBestBlockHashFunc' and 'getBlockHeaderFunc' functions
  //to be able to sync our internal bitcoin-s chain with our external bitcoind chain
  val getBestBlockHashFunc = { () =>
    bitcoindF.flatMap(_.getBestBlockHash)
  }

  val getBlockHeaderFunc = { hash: DoubleSha256DigestBE =>
    bitcoindF.flatMap(_.getBlockHeader(hash).map(_.blockHeader))
  }

  //now that we have bitcoind set up correctly and rpc linked to
  //the bitcoin-s chain project, generate the requested number of
  //blocks so there is something to sync
  val genBlocksF = chainHandlerF.flatMap { _ =>
    bitcoindF.flatMap(_.generate(numBlocks))
  }

  //now we need to sync those blocks into bitcoin-s
  val chainSyncF = genBlocksF.flatMap { _ =>
    chainHandlerF.flatMap { ch =>
      ChainSync.sync(
        ch.asInstanceOf[ChainHandler],
        getBlockHeaderFunc,
        getBestBlockHashFunc)
    }
  }

  chainSyncF
}

def cleanup(): Future[Unit] = {
  logger.info("Beginning clean up of create wallet script")
  val bitcoindStopF = bitcoindF.flatMap(_.stop())
  //note: File.delete() only removes the datadir if it is empty
  datadir.delete()
  logger.debug("cleaning up chain, wallet, and system")
  val chainCleanupF = ChainDbManagement.dropAll(chainDbConfig)
  val walletCleanupF = WalletDbManagement.dropAll(walletDbConfig)

  val doneWithCleanupF = for {
    _ <- bitcoindStopF
    _ <- chainCleanupF
    _ <- walletCleanupF
    _ <- system.terminate()
  } yield {
    logger.info(s"Done cleaning up")
  }

  doneWithCleanupF
}
@@ -1,24 +0,0 @@
<configuration>

  <appender name="FILE" class="ch.qos.logback.core.FileAppender">
    <file>logs/eclair-rpc-test.log</file>
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %logger{5}.%M\(%line\) - %msg%n</pattern>
    </encoder>
  </appender>

  <root level="INFO">
    <appender-ref ref="STDOUT" />
    <appender-ref ref="FILE"/>
  </root>

</configuration>
@@ -12,7 +12,7 @@ class EclairRpcTestUtilTest extends AsyncFlatSpec with BeforeAndAfterAll {

   private val logger = LoggerFactory.getLogger(getClass)

-  private implicit val actorSystem: ActorSystem =
+  implicit private val actorSystem: ActorSystem =
     ActorSystem("EclairRpcTestUtilTest", BitcoindRpcTestUtil.AKKA_CONFIG)

   private lazy val bitcoindRpcF = {
@@ -0,0 +1,15 @@
package org.bitcoins.node

import org.bitcoins.testkit.node.NodeTestUtil
import org.bitcoins.testkit.util.BitcoinSUnitTest

/**
  * Created by chris on 6/28/16.
  */
class NetworkMessageTest extends BitcoinSUnitTest {

  "NetworkMessage" must "be able to serialize then deserialize a message and get the original hex back" in {
    NetworkMessage(NodeTestUtil.rawNetworkMessage).hex must be(
      NodeTestUtil.rawNetworkMessage)
  }
}
@@ -0,0 +1,33 @@
package org.bitcoins.node

import org.bitcoins.testkit.util.BitcoinSUnitTest
import org.bitcoins.node.config.NodeAppConfig
import org.bitcoins.core.config.TestNet3
import com.typesafe.config.Config
import com.typesafe.config.ConfigFactory
import org.bitcoins.core.config.RegTest
import org.bitcoins.core.config.MainNet

class NodeAppConfigTest extends BitcoinSUnitTest {
  val config = NodeAppConfig()

  it must "be overridable" in {
    assert(config.network == RegTest)

    val otherConf = ConfigFactory.parseString("bitcoin-s.network = testnet3")
    val withOther: NodeAppConfig = config.withOverrides(otherConf)
    assert(withOther.network == TestNet3)

    val mainnetConf = ConfigFactory.parseString("bitcoin-s.network = mainnet")
    val mainnet: NodeAppConfig = withOther.withOverrides(mainnetConf)
    assert(mainnet.network == MainNet)
  }

  it must "be overridable with multiple levels" in {
    val testnet = ConfigFactory.parseString("bitcoin-s.network = testnet3")
    val mainnet = ConfigFactory.parseString("bitcoin-s.network = mainnet")
    val overridden: NodeAppConfig = config.withOverrides(testnet, mainnet)
    assert(overridden.network == MainNet)
  }
}
node-test/src/test/scala/org/bitcoins/node/SpvNodeTest.scala (new file, 53 lines)
@@ -0,0 +1,53 @@
package org.bitcoins.node

import org.bitcoins.core.crypto.DoubleSha256DigestBE
import org.bitcoins.rpc.util.RpcUtil
import org.bitcoins.testkit.node.NodeUnitTest
import org.bitcoins.testkit.node.fixture.SpvNodeConnectedWithBitcoind
import org.scalatest.FutureOutcome

import scala.concurrent.Future

class SpvNodeTest extends NodeUnitTest {

  override type FixtureParam = SpvNodeConnectedWithBitcoind

  override def withFixture(test: OneArgAsyncTest): FutureOutcome =
    withSpvNodeConnectedToBitcoind(test)

  behavior of "SpvNode"

  it must "receive notification that a block occurred on the p2p network" in {
    spvNodeConnectedWithBitcoind: SpvNodeConnectedWithBitcoind =>
      val spvNode = spvNodeConnectedWithBitcoind.spvNode
      val bitcoind = spvNodeConnectedWithBitcoind.bitcoind

      assert(spvNode.isConnected)

      assert(spvNode.isInitialized)

      val hashF: Future[DoubleSha256DigestBE] = {
        bitcoind.generate(1).map(_.head)
      }

      //check we have that hash inside of our chain project!
      val spvSyncF = for {
        _ <- hashF
        sync <- spvNode.sync()
      } yield sync

      def isSameBestHash(): Future[Boolean] = {
        for {
          spvBestHash <- spvNode.chainApi.getBestBlockHash
          hash <- hashF
        } yield spvBestHash == hash
      }

      spvSyncF.flatMap { _ =>
        RpcUtil
          .retryUntilSatisfiedF(isSameBestHash)
          .map(_ => succeed)
      }
  }
}
@@ -0,0 +1,33 @@
package org.bitcoins.node.headers

import org.bitcoins.core.config.TestNet3
import org.bitcoins.core.number.UInt32
import org.bitcoins.core.util.{BitcoinSUtil, CryptoUtil}
import org.bitcoins.node.messages.VerAckMessage
import org.bitcoins.testkit.node.NodeTestUtil
import org.scalatest.{FlatSpec, MustMatchers}

/**
  * Created by chris on 6/10/16.
  */
class NetworkHeaderTest extends FlatSpec with MustMatchers {

  "MessageHeader" must "create a message header for a message" in {
    val messageHeader = NetworkHeader(TestNet3, NodeTestUtil.versionMessage)
    messageHeader.network must be(TestNet3.magicBytes)
    messageHeader.commandName must be(NodeTestUtil.versionMessage.commandName)
    messageHeader.payloadSize must be(
      UInt32(NodeTestUtil.versionMessage.bytes.size))
    messageHeader.checksum must be(
      CryptoUtil.doubleSHA256(NodeTestUtil.versionMessage.bytes).bytes.take(4))
  }

  it must "build the correct message header for a verack message" in {
    val messageHeader = NetworkHeader(TestNet3, VerAckMessage)
    messageHeader.network must be(TestNet3.magicBytes)
    messageHeader.commandName must be(VerAckMessage.commandName)
    messageHeader.payloadSize must be(UInt32.zero)
    BitcoinSUtil.encodeHex(messageHeader.checksum) must be("5df6e0e2")
  }

}
@@ -0,0 +1,18 @@
package org.bitcoins.node.messages

import org.bitcoins.node.headers.NetworkHeader
import org.bitcoins.testkit.node.NodeTestUtil
import org.bitcoins.testkit.util.BitcoinSUnitTest

class NetworkPayloadTest extends BitcoinSUnitTest {

  "NetworkMessage" must "create a payload object from its network header and the payload bytes" in {
    val rawNetworkMessage = NodeTestUtil.rawNetworkMessage
    val header = NetworkHeader(rawNetworkMessage.take(48))
    logger.debug("Header: " + header)
    val payloadHex = rawNetworkMessage.slice(48, rawNetworkMessage.length)
    val payload = NetworkPayload(header, payloadHex)
    payload.isInstanceOf[VersionMessage] must be(true)
    payload.commandName must be(NetworkPayload.versionCommandName)
  }
}
@@ -0,0 +1,23 @@
package org.bitcoins.node.messages

import org.bitcoins.node.messages.TypeIdentifier.{
  MsgBlock,
  MsgFilteredBlock,
  MsgTx
}
import org.bitcoins.testkit.util.BitcoinSUnitTest

class TypeIdentifierTest extends BitcoinSUnitTest {

  "MsgTx" must "serialize to 01000000" in {
    MsgTx.hex must be("01000000")
  }

  "MsgBlock" must "serialize to 02000000" in {
    MsgBlock.hex must be("02000000")
  }

  "MsgFilteredBlock" must "serialize to 03000000" in {
    MsgFilteredBlock.hex must be("03000000")
  }
}
Some files were not shown because too many files have changed in this diff.