Compare commits


66 Commits

Author SHA1 Message Date
Muzahid 59311f3f31
update core changes reuse sql_helper in testcase 2021-04-07 15:47:45 +05:30
Muzahid a3e52099bc
update test and thread 2021-04-07 15:47:44 +05:30
Muzahid e41d200e81
add sql file for the query 2021-04-07 15:47:44 +05:30
Muzahid ff84454a95
change as per creae_function 2021-04-07 15:47:44 +05:30
Muzahid 27e07954a0
refactor methods and add create_function and its test 2021-04-07 15:47:43 +05:30
Muzahid 56dea46d98
refactor the versioning code and apply sql files behalf of query 2021-04-07 15:47:43 +05:30
Muzahid a38bcb2f04
rever changes 2021-04-07 15:47:42 +05:30
Muzahid 7a076e92fe
fix py3 import error 2021-04-07 15:47:42 +05:30
Muzahid ce0f99a8c2
refactor the sqlthread test cases and fix py3 issue 2021-04-07 15:47:42 +05:30
Muzahid 17d40ef43c
test case for versioning 10,9,8,1 2021-04-07 15:47:41 +05:30
Muzahid aed71c11ab
test case for sqlthread version 1 2021-04-07 15:47:41 +05:30
Muzahid b6155b1af3
fix db migration issue 2021-04-07 15:47:41 +05:30
Dmitri Bogomolov f2b4c97d1d
Handle old psutil in TestProcess 2021-04-07 15:47:40 +05:30
Dmitri Bogomolov bb433ac58b
Replaced print operator by print function in network.asyncore_pollchoose
and unmaintained modules.
2021-04-07 15:47:40 +05:30
Dmitri Bogomolov d46bda07e3
Don't run tests when build deb 2021-04-07 15:47:40 +05:30
Dmitri Bogomolov b4247409c6
Work around deprecation of platform.dist() in recent python 2021-04-07 15:47:39 +05:30
Dmitri Bogomolov c1b533f8db
Remove import from debug from openclpow, remove shebang, format 2021-04-07 15:47:39 +05:30
Dmitri Bogomolov 52146b8394
Support tox and request more warnings:
- make separate tests runner - tests.py; python setup.py test still works
  - tox.ini with coverage config
  - -b: issue warnings about comparing bytearray with unicode
  - export PYTHONWARNINGS=all on stage install
2021-04-07 15:47:38 +05:30
Dmitri Bogomolov 52a1ac21a8
Fix python3 issues in test_blindsig:
- simplify imports
 - signatures are of type bytes
 - chain kwarg of pyelliptic.ECCBlindChain is bytes
2021-04-07 15:47:38 +05:30
Dmitri Bogomolov 8e8697d9b2
Fix python3 issues in test_crypto:
- use bytes for python3
 - encode the result of arithmetic.privtopub
 - add test for arithmetic.base10_multiply
2021-04-07 15:47:38 +05:30
Dmitri Bogomolov 28a602536f
Make addresses module available for testing with python3:
- remove import from debug
 - use divmod and bytes
2021-04-07 15:47:37 +05:30
Dmitri Bogomolov ea6f23e596
test_randomtrackingdict: revert bytes to string for python3 2021-04-07 15:47:37 +05:30
Dmitri Bogomolov eede298a22
Fix python3 issues in pyelliptic:
- use dotted imports, remove unneeded shebangs
 - openssl._OpenSSL._version is of type bytes
 - use b'\x00' literal instead of chr(0) in eccblind and test_openssl
 - use // and divmod in arithmetic to fit PEP238:
   https://docs.python.org/3/whatsnew/2.2.html#pep-238-changing-the-division-operator
2021-04-07 15:47:37 +05:30
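The PEP 238 item in the commit above is about integer division: under Python 3 the / operator between two ints yields a float, so code that relies on the old truncating behaviour has to switch to // or divmod. A tiny illustration (not project code):

q, r = divmod(7, 2)      # (3, 1) under both Python 2 and Python 3
print(7 // 2)            # 3 everywhere
print(7 / 2)             # 3 on Python 2, 3.5 on Python 3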
Dmitri Bogomolov 0e9674bce1
Universal pathmagic returns app dir; activated in setup for python3 2021-04-07 15:47:36 +05:30
Dmitri Bogomolov eaab6d3f5f
Use common.skip_python3() to skip tests modules not supporting python3 2021-04-07 15:47:36 +05:30
Dmitri Bogomolov dba54841bf
Add normal exit in depends if detected python3 2021-04-07 15:47:35 +05:30
Dmitri Bogomolov 9dabd31470
Add python 3.7. Use general shebangs in scripts to test with python3;
Use 2.7_with_system_site_packages for python2 to run qt tests
as suggested in Travis doc instead of bypassing virtualenv by shebang.
2021-04-07 15:47:35 +05:30
Peter Šurda 1adb593cfa
Add Dockerfile for running test
- run ./run-tests-in-docker.sh to run travis tests locally
2021-04-07 15:47:35 +05:30
Dmitri Bogomolov 0c4f3e0dc0
Move desktop plugin initialization to updateStartOnLogon(); Fixes: #1735 2021-04-07 15:47:34 +05:30
Dmitri Bogomolov da1dcfd62f
Entry point 'desktop' for plugins managing desktop environment;
desktop_xdg will do it with pyxdg. Fixes: #857
2021-04-07 15:47:34 +05:30
Dmitri Bogomolov 4c214b945b
Added Network category in desktop file 2021-04-07 15:47:34 +05:30
Dmitri Bogomolov 552733778b
Move addressbook test to bitmessageqt.tests because it uses Qt 2021-04-07 15:47:33 +05:30
Dmitri Bogomolov b0cc81d643
A test for listening port 8444 2021-04-07 15:47:33 +05:30
Dmitri Bogomolov f5fa7d47ad
Add test for BITMESSAGE_HOME 2021-04-07 15:47:33 +05:30
Dmitri Bogomolov 30a78901f9
Instruct git to use LF as line ending for knownnodes.dat test pattern 2021-04-07 15:47:32 +05:30
Dmitri Bogomolov 7497cc5704
Format and simplify bitmessagemain.spec, exclude unused libs and files 2021-04-07 15:47:32 +05:30
Muzahid 42007d02ad
fix DB thread issue 2021-04-07 15:47:31 +05:30
surbhicis 3fa531c223
remove kivy specification file along with component changes of version from upstream 2021-04-07 15:47:31 +05:30
Muzahid 0dfc4dac7b
test commit 2021-04-07 15:47:31 +05:30
Muzahid bb38c158fe
Refactoring of database upgrade mechanism 2021-04-07 15:47:30 +05:30
Muzahid 583e5910fe
proper inherit remove return from __init__ 2021-04-07 15:47:30 +05:30
Muzahid 31c5e4ff38
remove no in case 2021-04-07 15:47:30 +05:30
Muzahid cdc2977331
Migrate Db with respect to their versions 2021-04-07 15:47:29 +05:30
navjot 86001ce749
remove TestProcessProto import from test_openclpow module 2021-04-07 15:47:29 +05:30
Dmitri Bogomolov f23487a155
Prevent adding bootstrap servers to knownnodes when received in addr 2021-04-07 15:47:28 +05:30
Dmitri Bogomolov 3feea1ee3f
Shorten Bootstrapper methods:
handle_close() and set_connection_fully_established()
2021-04-07 15:47:28 +05:30
Dmitri Bogomolov 1b198e9a4a
Try to find bootstrap server in knownnodes after bootstrapping 2021-04-07 15:47:28 +05:30
Dmitri Bogomolov 1a3a7089cc
A separate test for dontconnect setting 2021-04-07 15:47:27 +05:30
Dmitri Bogomolov f75268bc56
Set close_reason for exceptions in network.tls 2021-04-07 15:47:27 +05:30
Dmitri Bogomolov 2a8e91e6a7
Fixing tor related tests:
- knownnodes.cleanupKnownNodes() should set knownNodesActual = False
   if there are no nodes in stream 1 (repeated bootstrapping)
 - set socksproxytype before _initiate_bootstrap()
 - wait 5 sec in _initiate_bootstrap() to be sure all connections are closed
 - plugins do not work on travis - use socksproxytype = SOCKS5,
   check tor presence by trying to bind on port 9050
 - successfull connection to 3 onion nodes in 6 minutes is not guaranteed -
   check that bitmessage doesn't try non-onion nodes
2021-04-07 15:47:27 +05:30
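One bullet in the commit above describes detecting a local Tor daemon by trying to bind port 9050. A minimal sketch of that check, not the project's actual test code: if the bind succeeds the port is free (no SOCKS listener), if it fails something, presumably Tor, already owns it.

import socket

probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    probe.bind(('127.0.0.1', 9050))   # port free: no Tor SOCKS listener here
    tor_running = False
except socket.error:                  # port taken: assume Tor is listening
    tor_running = True
finally:
    probe.close()
print(tor_running)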
Dmitri Bogomolov 9b27e6734a
Another possible approach for connection check 2021-04-07 15:47:26 +05:30
Dmitri Bogomolov 93c86bc8d5
A separate test for connection to bootstrap servers 2021-04-07 15:47:26 +05:30
navjot 0283339138
add Gpu skip condition and setupclass in test_openclpow module 2021-04-07 15:47:25 +05:30
navjot 27383ff635
test case for openclpow module 2021-04-07 15:47:25 +05:30
navjot a5786f72e6
replace print with logger and remove unused file 2021-04-07 15:47:25 +05:30
surbhicis 8dd766fcb8
detach kivy version from upstream 2021-04-07 15:47:24 +05:30
navjot 07a1ce9914
imported helper_addressGenerator module 2021-04-07 15:47:24 +05:30
navjot 7825229444
updated addressbook table in class_sqlThread module 2021-04-07 15:47:24 +05:30
navjot c61436a5ca
written test case for addressbook 2021-04-07 15:47:23 +05:30
navjot 4e1990b63e
fixed Own address should not save in address book issue
- removed redundant code

- written test case for address book own address saving

- fixed CQ issues

- added helper_addressbook module

- Fixed CQ issue of src.helper_addressbook module

- fixed travis-ci checks failing issue
2021-04-07 15:47:23 +05:30
navjot 8763c80bad
added general exception handler 2021-04-07 15:47:22 +05:30
navjot c96f51b193
ignoring ValueError from proofofwork module 2021-04-07 15:47:22 +05:30
navjot 16ded2c558
added timer of less then 10 seconds 2021-04-07 15:47:22 +05:30
navjot d24bf4b8af
remove qt dependency 2021-04-07 15:47:21 +05:30
navjot 6e236680cd
add helper_addressGenerator module 2021-04-07 15:47:21 +05:30
813492291816 6e69755c8a
Add missing TTL to API sendMessage 2021-04-07 15:47:20 +05:30
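The last commit above adds a TTL argument to the API's sendMessage call. A hedged client-side sketch over the XML-RPC API; the endpoint, credentials and parameter order are assumptions (the 127.0.0.1:8442 default appears in bmconfigparser further down this diff) and are not verified against the server code:

import base64
try:
    from xmlrpc.client import ServerProxy   # Python 3
except ImportError:
    from xmlrpclib import ServerProxy       # Python 2

# placeholder credentials; real values come from the API settings in keys.dat
api = ServerProxy('http://apiuser:apipass@127.0.0.1:8442/')
ackdata = api.sendMessage(
    'BM-toaddress', 'BM-fromaddress',
    base64.b64encode(b'subject').decode(),
    base64.b64encode(b'message body').decode(),
    2,                   # encoding type
    4 * 24 * 60 * 60)    # TTL in seconds, the newly added argument
print(ackdata)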
41 changed files with 1377 additions and 609 deletions

.gitmodules (vendored): 3 lines removed

@ -1,3 +0,0 @@
[submodule "packages/flatpak/shared-modules"]
path = packages/flatpak/shared-modules
url = https://github.com/flathub/shared-modules.git


@ -1,57 +0,0 @@
{
"id": "org.bitmessage.BaseApp",
"branch": "19.08",
"runtime": "org.freedesktop.Platform",
"sdk": "org.freedesktop.Sdk",
"runtime-version": "19.08",
"separate-locales": false,
"modules": [
"shared-modules/python2.7/python-2.7.json",
"shared-modules/qt4/qt4-4.8.7-minimal.json",
{
"name": "python-sip",
"sources": [
{
"type": "archive",
"url": "https://www.riverbankcomputing.com/static/Downloads/sip/4.19.25/sip-4.19.25.tar.gz",
"sha256": "b39d93e937647807bac23579edbff25fe46d16213f708370072574ab1f1b4211"
}
],
"buildsystem": "simple",
"build-commands": [
"python configure.py --sip-module PyQt4.sip --no-dist-info",
"make",
"make install"
]
},
{
"name": "python-qt4",
"sources": [
{
"type": "archive",
"url": "http://sourceforge.net/projects/pyqt/files/PyQt4/PyQt-4.12.3/PyQt4_gpl_x11-4.12.3.tar.gz",
"sha256": "a00f5abef240a7b5852b7924fa5fdf5174569525dc076cd368a566619e56d472"
}
],
"buildsystem": "simple",
"build-commands": [
"python configure.py -w --confirm-license",
"make",
"make install"
]
},
{
"name" : "PyBitmessage-dependencies",
"buildsystem" : "simple",
"build-options": {
"build-args": [
"--share=network"
]
},
"build-commands": [
"pip --version",
"pip install setuptools msgpack"
]
}
]
}


@ -1,48 +0,0 @@
{
"app-id": "org.bitmessage.PyBitmessage",
"runtime": "org.freedesktop.Platform",
"runtime-version": "19.08",
"branch": "stable",
"sdk": "org.freedesktop.Sdk",
"base": "org.bitmessage.BaseApp",
"command": "pybitmessage",
"base-version":"stable",
"finish-args" : [
"--share=network",
"--socket=x11",
"--share=ipc",
"--filesystem=xdg-config/PyBitmessage:create"
],
"modules": [
{
"name" : "PyBitmessage",
"buildsystem" : "simple",
"build-options": {
"build-args": [
"--share=network"
]
},
"build-commands": [
"python --version",
"pwd",
"ls",
"python checkdeps.py",
"python setup.py install --prefix=/app --exec-prefix=/app",
"sed -i 's~/usr/bin/~/app/bin/~' /app/bin/pybitmessage",
"cat /app/bin/pybitmessage",
"mv /app/share/applications/pybitmessage.desktop /app/share/applications/org.bitmessage.PyBitmessage.desktop",
"sed -i 's~Icon=pybitmessage~Icon=org.bitmessage.PyBitmessage~' /app/share/applications/org.bitmessage.PyBitmessage.desktop",
"mv /app/share/icons/hicolor/scalable/apps/pybitmessage.svg /app/share/icons/hicolor/scalable/apps/org.bitmessage.PyBitmessage.svg",
"mv /app/share/icons/hicolor/24x24/apps/pybitmessage.png /app/share/icons/hicolor/24x24/apps/org.bitmessage.PyBitmessage.png",
"which pybitmessage"
],
"sources" : [
{
"type" : "dir",
"path" : "../../"
}
]
}
]
}

@ -1 +0,0 @@
Subproject commit fd4d38328ccb078b88ad4a891807e593ae8de806


@ -321,10 +321,9 @@ class Main(object):
receiveQueueThread = ReceiveQueueThread(i)
receiveQueueThread.daemon = True
receiveQueueThread.start()
if config.safeGetBoolean('bitmessagesettings', 'udp'):
state.announceThread = AnnounceThread()
state.announceThread.daemon = True
state.announceThread.start()
announceThread = AnnounceThread()
announceThread.daemon = True
announceThread.start()
state.invThread = InvThread()
state.invThread.daemon = True
state.invThread.start()


@ -19,7 +19,7 @@ import widgets
from bmconfigparser import BMConfigParser
from helper_sql import sqlExecute, sqlStoredProcedure
from helper_startup import start_proxyconfig
from network import knownnodes, AnnounceThread
from network import knownnodes
from network.asyncore_pollchoose import set_rates
from tr import _translate
@ -138,8 +138,6 @@ class SettingsDialog(QtGui.QDialog):
config.get('bitmessagesettings', 'port')))
self.checkBoxUPnP.setChecked(
config.safeGetBoolean('bitmessagesettings', 'upnp'))
self.checkBoxUDP.setChecked(
config.safeGetBoolean('bitmessagesettings', 'udp'))
self.checkBoxAuthentication.setChecked(
config.getboolean('bitmessagesettings', 'socksauthentication'))
self.checkBoxSocksListen.setChecked(
@ -328,8 +326,7 @@ class SettingsDialog(QtGui.QDialog):
self.lineEditTCPPort.text()):
self.config.set(
'bitmessagesettings', 'port', str(self.lineEditTCPPort.text()))
if not self.config.safeGetBoolean(
'bitmessagesettings', 'dontconnect'):
if not self.config.safeGetBoolean('bitmessagesettings', 'dontconnect'):
self.net_restart_needed = True
if self.checkBoxUPnP.isChecked() != self.config.safeGetBoolean(
@ -342,26 +339,11 @@ class SettingsDialog(QtGui.QDialog):
upnpThread = upnp.uPnPThread()
upnpThread.start()
udp_enabled = self.checkBoxUDP.isChecked()
if udp_enabled != self.config.safeGetBoolean(
'bitmessagesettings', 'udp'):
self.config.set('bitmessagesettings', 'udp', str(udp_enabled))
if udp_enabled:
announceThread = AnnounceThread()
announceThread.daemon = True
announceThread.start()
else:
try:
state.announceThread.stopThread()
except AttributeError:
pass
proxytype_index = self.comboBoxProxyType.currentIndex()
if proxytype_index == 0:
if self._proxy_type and state.statusIconColor != 'red':
self.net_restart_needed = True
elif state.statusIconColor == 'red' and self.config.safeGetBoolean(
'bitmessagesettings', 'dontconnect'):
elif state.statusIconColor == 'red' and self.config.safeGetBoolean('bitmessagesettings', 'dontconnect'):
self.net_restart_needed = False
elif self.comboBoxProxyType.currentText() != self._proxy_type:
self.net_restart_needed = True
@ -387,11 +369,8 @@ class SettingsDialog(QtGui.QDialog):
self.lineEditSocksPassword.text()))
self.config.set('bitmessagesettings', 'sockslisten', str(
self.checkBoxSocksListen.isChecked()))
if (
self.checkBoxOnionOnly.isChecked()
and not self.config.safeGetBoolean(
'bitmessagesettings', 'onionservicesonly')
):
if self.checkBoxOnionOnly.isChecked() \
and not self.config.safeGetBoolean('bitmessagesettings', 'onionservicesonly'):
self.net_restart_needed = True
self.config.set('bitmessagesettings', 'onionservicesonly', str(
self.checkBoxOnionOnly.isChecked()))
@ -453,8 +432,8 @@ class SettingsDialog(QtGui.QDialog):
acceptableDifficultyChanged = False
if (
float(self.lineEditMaxAcceptableTotalDifficulty.text()) >= 1
or float(self.lineEditMaxAcceptableTotalDifficulty.text()) == 0
float(self.lineEditMaxAcceptableTotalDifficulty.text()) >= 1
or float(self.lineEditMaxAcceptableTotalDifficulty.text()) == 0
):
if self.config.get(
'bitmessagesettings', 'maxacceptablenoncetrialsperbyte'
@ -470,8 +449,8 @@ class SettingsDialog(QtGui.QDialog):
* defaults.networkDefaultProofOfWorkNonceTrialsPerByte))
)
if (
float(self.lineEditMaxAcceptableSmallMessageDifficulty.text()) >= 1
or float(self.lineEditMaxAcceptableSmallMessageDifficulty.text()) == 0
float(self.lineEditMaxAcceptableSmallMessageDifficulty.text()) >= 1
or float(self.lineEditMaxAcceptableSmallMessageDifficulty.text()) == 0
):
if self.config.get(
'bitmessagesettings', 'maxacceptablepayloadlengthextrabytes'
@ -562,8 +541,8 @@ class SettingsDialog(QtGui.QDialog):
self.parent.updateStartOnLogon()
if (
state.appdata != paths.lookupExeFolder()
and self.checkBoxPortableMode.isChecked()
state.appdata != paths.lookupExeFolder()
and self.checkBoxPortableMode.isChecked()
):
# If we are NOT using portable mode now but the user selected
# that we should...
@ -585,8 +564,8 @@ class SettingsDialog(QtGui.QDialog):
pass
if (
state.appdata == paths.lookupExeFolder()
and not self.checkBoxPortableMode.isChecked()
state.appdata == paths.lookupExeFolder()
and not self.checkBoxPortableMode.isChecked()
):
# If we ARE using portable mode now but the user selected
# that we shouldn't...


@ -231,7 +231,7 @@
</layout>
</widget>
</item>
<item row="3" column="0">
<item row="2" column="0">
<widget class="QGroupBox" name="groupBox_3">
<property name="title">
<string>Bandwidth limit</string>
@ -322,7 +322,7 @@
</layout>
</widget>
</item>
<item row="2" column="0">
<item row="1" column="0">
<widget class="QGroupBox" name="groupBox_2">
<property name="title">
<string>Proxy server / Tor</string>
@ -432,14 +432,7 @@
</layout>
</widget>
</item>
<item row="1" column="0">
<widget class="QCheckBox" name="checkBoxUDP">
<property name="text">
<string>Announce self by UDP</string>
</property>
</widget>
</item>
<item row="4" column="0">
<item row="3" column="0">
<spacer name="verticalSpacer">
<property name="orientation">
<enum>Qt::Vertical</enum>


@ -2,10 +2,6 @@
from addressbook import TestAddressbook
from main import TestMain, TestUISignaler
from settings import TestSettings
from support import TestSupport
__all__ = [
"TestAddressbook", "TestMain", "TestSettings", "TestSupport",
"TestUISignaler"
]
__all__ = ["TestAddressbook", "TestMain", "TestSupport", "TestUISignaler"]


@ -1,34 +0,0 @@
import threading
import time
from main import TestBase
from bmconfigparser import BMConfigParser
from bitmessageqt import settings
class TestSettings(TestBase):
"""A test case for the "Settings" dialog"""
def setUp(self):
super(TestSettings, self).setUp()
self.dialog = settings.SettingsDialog(self.window)
def test_udp(self):
"""Test the effect of checkBoxUDP"""
udp_setting = BMConfigParser().safeGetBoolean(
'bitmessagesettings', 'udp')
self.assertEqual(udp_setting, self.dialog.checkBoxUDP.isChecked())
self.dialog.checkBoxUDP.setChecked(not udp_setting)
self.dialog.accept()
self.assertEqual(
not udp_setting,
BMConfigParser().safeGetBoolean('bitmessagesettings', 'udp'))
time.sleep(5)
for thread in threading.enumerate():
if thread.name == 'Announcer': # find Announcer thread
if udp_setting:
self.fail(
'Announcer thread is running while udp set to False')
break
else:
if not udp_setting:
self.fail('No Announcer thread found while udp set to True')


@ -2,22 +2,13 @@
BMConfigParser class definition and default configuration settings
"""
import sys
if sys.version_info[0] == 3:
# python 3
import configparser as ConfigParser
SafeConfigParser = ConfigParser.ConfigParser
else:
# python 2
import ConfigParser
SafeConfigParser = ConfigParser.SafeConfigParser
import state
from singleton import Singleton
import ConfigParser
import os
import shutil
from datetime import datetime
import state
from singleton import Singleton
BMConfigDefaults = {
"bitmessagesettings": {
@ -28,32 +19,30 @@ BMConfigDefaults = {
"maxtotalconnections": 200,
"maxuploadrate": 0,
"apiinterface": "127.0.0.1",
"apiport": 8442,
"udp": "True"
"apiport": 8442
},
"threads": {
"receive": 3,
},
"network": {
"bind": "",
"bind": '',
"dandelion": 90,
},
"inventory": {
"storage": "sqlite",
"acceptmismatch": "False",
"acceptmismatch": False,
},
"knownnodes": {
"maxnodes": 20000,
},
"zlib": {
"maxsize": 1048576
'maxsize': 1048576
}
}
@Singleton
class BMConfigParser(SafeConfigParser):
class BMConfigParser(ConfigParser.SafeConfigParser):
"""
Singleton class inherited from :class:`ConfigParser.SafeConfigParser`
with additional methods specific to bitmessage config.
@ -70,47 +59,26 @@ class BMConfigParser(SafeConfigParser):
raise ValueError("Invalid value %s" % value)
return ConfigParser.ConfigParser.set(self, section, option, value)
def get(self, section, option, raw=False, vars=None):
if sys.version_info[0] == 3:
# pylint: disable=arguments-differ
try:
if section == "bitmessagesettings" and option == "timeformat":
return ConfigParser.ConfigParser.get(
self, section, option)
try:
return self._temp[section][option]
except KeyError:
pass
def get(self, section, option, raw=False, variables=None):
# pylint: disable=arguments-differ
try:
if section == "bitmessagesettings" and option == "timeformat":
return ConfigParser.ConfigParser.get(
self, section, option)
except ConfigParser.InterpolationError:
return ConfigParser.ConfigParser.get(
self, section, option)
except (ConfigParser.NoSectionError, ConfigParser.NoOptionError) as e:
try:
return BMConfigDefaults[section][option]
except (KeyError, ValueError, AttributeError):
raise e
else:
# pylint: disable=arguments-differ
self, section, option, raw, variables)
try:
if section == "bitmessagesettings" and option == "timeformat":
return ConfigParser.ConfigParser.get(
self, section, option, raw, vars)
try:
return self._temp[section][option]
except KeyError:
pass
return ConfigParser.ConfigParser.get(
self, section, option, True, vars)
except ConfigParser.InterpolationError:
return ConfigParser.ConfigParser.get(
self, section, option, True, vars)
except (ConfigParser.NoSectionError, ConfigParser.NoOptionError) as e:
try:
return BMConfigDefaults[section][option]
except (KeyError, ValueError, AttributeError):
raise e
return self._temp[section][option]
except KeyError:
pass
return ConfigParser.ConfigParser.get(
self, section, option, True, variables)
except ConfigParser.InterpolationError:
return ConfigParser.ConfigParser.get(
self, section, option, True, variables)
except (ConfigParser.NoSectionError, ConfigParser.NoOptionError) as e:
try:
return BMConfigDefaults[section][option]
except (KeyError, ValueError, AttributeError):
raise e
def setTemp(self, section, option, value=None):
"""Temporary set option to value, not saving."""
@ -222,4 +190,3 @@ class BMConfigParser(SafeConfigParser):
if value < 0 or value > 8:
return False
return True
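
The get() override shown above falls back to the BMConfigDefaults dictionary whenever the requested option is missing from the parsed config. The pattern in isolation, with a shortened defaults dict and a version-tolerant import (a sketch, not the project's class):

try:
    import ConfigParser                      # Python 2, as in the diff
except ImportError:
    import configparser as ConfigParser      # Python 3

DEFAULTS = {'bitmessagesettings': {'apiport': 8442}}

class FallbackConfig(ConfigParser.RawConfigParser):
    def get(self, section, option, **kwargs):
        try:
            return ConfigParser.RawConfigParser.get(self, section, option, **kwargs)
        except (ConfigParser.NoSectionError, ConfigParser.NoOptionError) as err:
            try:
                return DEFAULTS[section][option]
            except KeyError:
                raise err

cfg = FallbackConfig()
print(cfg.get('bitmessagesettings', 'apiport'))   # 8442, served from the defaults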


@ -17,10 +17,236 @@ import state
import tr
from bmconfigparser import BMConfigParser
from debug import logger
from addresses import encodeAddress
# pylint: disable=attribute-defined-outside-init,protected-access
root_path = os.path.dirname(os.path.dirname(__file__))
class sqlThread(threading.Thread):
def connection_build():
conn = sqlite3.connect(state.appdata + 'messages.dat')
conn.text_factory = str
cur = conn.cursor()
return conn, cur
class UpgradeDB():
"""Upgrade Db with respect to versions"""
# cur = None
parameters = None
current_level = None
max_level = 11
conn = None
conn, cur = connection_build()
def __index__(self):
self.conn = conn
self.cur = cur
def get_current_level(self):
# Upgrade Db with respect to their versions
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
return int(self.cur.fetchall()[0][0])
def upgrade_one_level(self, level):
""" Apply switcher to call methods accordingly """
if level != self.get_current_level():
return None
# Migrate Db with level
method_name = 'upgrade_schema_data_' + str(level)
method = getattr(self, method_name, lambda: "Invalid version")
return method()
def run_migrations(self, file):
try:
print("----------file", file)
sql_file = open(os.path.join(root_path, "src/sql/init_version_{}.sql".format(file)))
sql_as_string = sql_file.read()
# self.cur.executescript(sql_as_string)
self.cur.executescript(sql_as_string)
except Exception as err:
print("err")
print(err)
print("err")
if str(err) == 'table inbox already exists':
return "table inbox already exists"
else:
sys.stderr.write(
'ERROR trying to create database file (message.dat). Error message: %s\n' % str(err))
os._exit(0)
def versioning(func):
def wrapper(*args):
self = args[0]
func_name = func.__name__
version = func_name.rsplit('_', 1)[-1]
self.run_migrations(version)
ret = func(*args)
return ret # <-- use (self, ...)
return wrapper
def upgrade_to_latest(self, cur, conn):
"""
Initialise upgrade level
"""
# Declare variables
self.conn = conn
self.cur = cur
self.current_level = self.get_current_level()
self.max_level = 11
# call upgrading level in loop
for l in range(self.current_level, self.max_level):
self.upgrade_one_level(l)
self.upgrade_schema_data_level(l)
def upgrade_schema_data_level(self, level):
item = '''update settings set value=? WHERE key='version';'''
parameters = (level + 1,)
self.cur.execute(item, parameters)
@versioning
def upgrade_schema_data_1(self):
"""inventory
For version 1 and 3
Add a new column to the inventory table to store tags.
"""
logger.debug(
'In messages.dat database, adding tag field to'
' the inventory table.')
@versioning
def upgrade_schema_data_2(self):
"""
For version 2
Let's get rid of the first20bytesofencryptedmessage field in the inventory table.
"""
logger.debug(
'In messages.dat database, removing an obsolete field from'
' the inventory table.')
def upgrade_schema_data_3(self):
"""
For version 3
Call method for version 1
"""
self.upgrade_schema_data_1()
@versioning
def upgrade_schema_data_4(self):
"""
For version 4
Add a new column to the pubkeys table to store the address version.
We're going to trash all of our pubkeys and let them be redownloaded.
"""
@versioning
def upgrade_schema_data_5(self):
"""
For version 5
Add a new table: objectprocessorqueue with which to hold objects
That have yet to be processed if the user shuts down Bitmessage.
"""
@versioning
def upgrade_schema_data_6(self):
"""
For version 6
Changes related to protocol v3
In table inventory and objectprocessorqueue, objecttype is now
an integer (it was a human-friendly string previously)
"""
logger.debug(
'In messages.dat database, dropping and recreating'
' the inventory table.')
logger.debug(
'Finished dropping and recreating the inventory table.')
@versioning
def upgrade_schema_data_7(self):
"""
For version 7
The format of data stored in the pubkeys table has changed. Let's
clear it, and the pubkeys from inventory, so that they'll
be re-downloaded.
"""
logger.debug(
'In messages.dat database, clearing pubkeys table'
' because the data format has been updated.')
logger.debug('Finished clearing currently held pubkeys.')
@versioning
def upgrade_schema_data_8(self):
"""
For version 8
Add a new column to the inbox table to store the hash of
the message signature. We'll use this as temporary message UUID
in order to detect duplicates.
"""
logger.debug(
'In messages.dat database, adding sighash field to'
' the inbox table.')
@versioning
def upgrade_schema_data_9(self):
"""
For version 9
We'll also need a `sleeptill` field and a `ttl` field. Also we
can combine the pubkeyretrynumber and msgretrynumber into one.
"""
logger.info(
'In messages.dat database, making TTL-related changes:'
' combining the pubkeyretrynumber and msgretrynumber'
' fields into the retrynumber field and adding the'
' sleeptill and ttl fields...')
logger.info('In messages.dat database, finished making TTL-related changes.')
logger.debug('In messages.dat database, adding address field to the pubkeys table.')
# We're going to have to calculate the address for each row in the pubkeys
# table. Then we can take out the hash field.
self.cur.execute('''ALTER TABLE pubkeys ADD address text DEFAULT '' ''')
item = '''UPDATE 'pubkeys' SET `hash`='87788778877887788787' where hash=''; '''
self.cur.execute(item)
# replica for loop to update hashed address
self.cur.execute('''UPDATE pubkeys SET address=(select enaddr(pubkeys.addressversion, 1, hash)) WHERE hash=pubkeys.hash; ''')
self.run_migrations("9_1")
logger.debug(
'In messages.dat database, done adding address field to the pubkeys table'
' and removing the hash field.')
@versioning
def upgrade_schema_data_10(self):
"""
For version 10
Update the address colunm to unique in addressbook table
"""
logger.debug(
'In messages.dat database, updating address column to UNIQUE'
' in the addressbook table.')
class sqlThread(threading.Thread, UpgradeDB):
"""A thread for all SQL operations"""
def __init__(self):
@ -29,52 +255,57 @@ class sqlThread(threading.Thread):
def run(self): # pylint: disable=too-many-locals, too-many-branches, too-many-statements
"""Process SQL queries from `.helper_sql.sqlSubmitQueue`"""
helper_sql.sql_available = True
self.conn = sqlite3.connect(state.appdata + 'messages.dat')
self.conn.text_factory = str
self.cur = self.conn.cursor()
# self.conn = sqlite3.connect(state.appdata + 'messages.dat')
# self.conn.text_factory = str
# self.cur = self.conn.cursor()
conn, cur = connection_build()
self.conn = conn
self.cur = cur
self.cur.execute('PRAGMA secure_delete = true')
# call create_function for encode address
self.create_function()
try:
self.cur.execute(
'''CREATE TABLE inbox (msgid blob, toaddress text, fromaddress text, subject text,'''
''' received text, message text, folder text, encodingtype int, read bool, sighash blob,'''
''' UNIQUE(msgid) ON CONFLICT REPLACE)''')
self.cur.execute(
'''CREATE TABLE sent (msgid blob, toaddress text, toripe blob, fromaddress text, subject text,'''
''' message text, ackdata blob, senttime integer, lastactiontime integer,'''
''' sleeptill integer, status text, retrynumber integer, folder text, encodingtype int, ttl int)''')
self.cur.execute(
'''CREATE TABLE subscriptions (label text, address text, enabled bool)''')
self.cur.execute(
'''CREATE TABLE addressbook (label text, address text, UNIQUE(address) ON CONFLICT IGNORE)''')
self.cur.execute(
'''CREATE TABLE blacklist (label text, address text, enabled bool)''')
self.cur.execute(
'''CREATE TABLE whitelist (label text, address text, enabled bool)''')
self.cur.execute(
'''CREATE TABLE pubkeys (address text, addressversion int, transmitdata blob, time int,'''
''' usedpersonally text, UNIQUE(address) ON CONFLICT REPLACE)''')
self.cur.execute(
'''CREATE TABLE inventory (hash blob, objecttype int, streamnumber int, payload blob,'''
''' expirestime integer, tag blob, UNIQUE(hash) ON CONFLICT REPLACE)''')
self.cur.execute(
'''INSERT INTO subscriptions VALUES'''
'''('Bitmessage new releases/announcements','BM-GtovgYdgs7qXPkoYaRgrLFuFKz1SFpsw',1)''')
# self.cur.execute(
# '''CREATE TABLE inbox (msgid blob, toaddress text, fromaddress text, subject text,'''
# ''' received text, message text, folder text, encodingtype int, read bool, sighash blob,'''
# ''' UNIQUE(msgid) ON CONFLICT REPLACE)''')
# self.cur.execute(
# '''CREATE TABLE sent (msgid blob, toaddress text, toripe blob, fromaddress text, subject text,'''
# ''' message text, ackdata blob, senttime integer, lastactiontime integer,'''
# ''' sleeptill integer, status text, retrynumber integer, folder text, encodingtype int, ttl int)''')
# '''CREATE TABLE subscriptions (label text, address text, enabled bool)''')
# self.cur.execute(
# '''CREATE TABLE addressbook (label text, address text, UNIQUE(address) ON CONFLICT IGNORE)''')
# self.cur.execute(
# '''CREATE TABLE blacklist (label text, address text, enabled bool)''')
# self.cur.execute(
# '''CREATE TABLE whitelist (label text, address text, enabled bool)''')
# self.cur.execute(
# '''CREATE TABLE pubkeys (address text, addressversion int, transmitdata blob, time int,'''
# ''' usedpersonally text, UNIQUE(address) ON CONFLICT REPLACE)''')
# self.cur.execute(
# '''CREATE TABLE inventory (hash blob, objecttype int, streamnumber int, payload blob,'''
# ''' expirestime integer, tag blob, UNIQUE(hash) ON CONFLICT REPLACE)''')
# self.cur.execute(
# '''INSERT INTO subscriptions VALUES'''
# '''('Bitmessage new releases/announcements','BM-GtovgYdgs7qXPkoYaRgrLFuFKz1SFpsw',1)''')
self.cur.execute(
'''CREATE TABLE settings (key blob, value blob, UNIQUE(key) ON CONFLICT REPLACE)''')
self.cur.execute('''INSERT INTO settings VALUES('version','11')''')
self.cur.execute('''INSERT INTO settings VALUES('lastvacuumtime',?)''', (
int(time.time()),))
self.cur.execute(
'''CREATE TABLE objectprocessorqueue'''
''' (objecttype int, data blob, UNIQUE(objecttype, data) ON CONFLICT REPLACE)''')
# self.cur.execute(
# '''CREATE TABLE objectprocessorqueue'''
# ''' (objecttype int, data blob, UNIQUE(objecttype, data) ON CONFLICT REPLACE)''')
self.conn.commit()
logger.info('Created messages database file')
except Exception as err:
if str(err) == 'table inbox already exists':
logger.debug('Database file already exists.')
else:
sys.stderr.write(
'ERROR trying to create database file (message.dat). Error message: %s\n' % str(err))
@ -178,241 +409,21 @@ class sqlThread(threading.Thread):
'''update sent set status='broadcastqueued' where status='broadcastpending' ''')
self.conn.commit()
# Let's get rid of the first20bytesofencryptedmessage field in
# the inventory table.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
if int(self.cur.fetchall()[0][0]) == 2:
logger.debug(
'In messages.dat database, removing an obsolete field from'
' the inventory table.')
self.cur.execute(
'''CREATE TEMPORARY TABLE inventory_backup'''
'''(hash blob, objecttype text, streamnumber int, payload blob,'''
''' receivedtime integer, UNIQUE(hash) ON CONFLICT REPLACE);''')
self.cur.execute(
'''INSERT INTO inventory_backup SELECT hash, objecttype, streamnumber, payload, receivedtime'''
''' FROM inventory;''')
self.cur.execute('''DROP TABLE inventory''')
self.cur.execute(
'''CREATE TABLE inventory'''
''' (hash blob, objecttype text, streamnumber int, payload blob, receivedtime integer,'''
''' UNIQUE(hash) ON CONFLICT REPLACE)''')
self.cur.execute(
'''INSERT INTO inventory SELECT hash, objecttype, streamnumber, payload, receivedtime'''
''' FROM inventory_backup;''')
self.cur.execute('''DROP TABLE inventory_backup;''')
item = '''update settings set value=? WHERE key='version';'''
parameters = (3,)
self.cur.execute(item, parameters)
# Add a new column to the inventory table to store tags.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 1 or currentVersion == 3:
logger.debug(
'In messages.dat database, adding tag field to'
' the inventory table.')
item = '''ALTER TABLE inventory ADD tag blob DEFAULT '' '''
parameters = ''
self.cur.execute(item, parameters)
item = '''update settings set value=? WHERE key='version';'''
parameters = (4,)
self.cur.execute(item, parameters)
# Add a new column to the pubkeys table to store the address version.
# We're going to trash all of our pubkeys and let them be redownloaded.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 4:
self.cur.execute('''DROP TABLE pubkeys''')
self.cur.execute(
'''CREATE TABLE pubkeys (hash blob, addressversion int, transmitdata blob, time int,'''
'''usedpersonally text, UNIQUE(hash, addressversion) ON CONFLICT REPLACE)''')
self.cur.execute(
'''delete from inventory where objecttype = 'pubkey';''')
item = '''update settings set value=? WHERE key='version';'''
parameters = (5,)
self.cur.execute(item, parameters)
# Add a new table: objectprocessorqueue with which to hold objects
# that have yet to be processed if the user shuts down Bitmessage.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 5:
self.cur.execute('''DROP TABLE knownnodes''')
self.cur.execute(
'''CREATE TABLE objectprocessorqueue'''
''' (objecttype text, data blob, UNIQUE(objecttype, data) ON CONFLICT REPLACE)''')
item = '''update settings set value=? WHERE key='version';'''
parameters = (6,)
self.cur.execute(item, parameters)
# changes related to protocol v3
# In table inventory and objectprocessorqueue, objecttype is now
# an integer (it was a human-friendly string previously)
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 6:
logger.debug(
'In messages.dat database, dropping and recreating'
' the inventory table.')
self.cur.execute('''DROP TABLE inventory''')
self.cur.execute(
'''CREATE TABLE inventory'''
''' (hash blob, objecttype int, streamnumber int, payload blob, expirestime integer,'''
''' tag blob, UNIQUE(hash) ON CONFLICT REPLACE)''')
self.cur.execute('''DROP TABLE objectprocessorqueue''')
self.cur.execute(
'''CREATE TABLE objectprocessorqueue'''
''' (objecttype int, data blob, UNIQUE(objecttype, data) ON CONFLICT REPLACE)''')
item = '''update settings set value=? WHERE key='version';'''
parameters = (7,)
self.cur.execute(item, parameters)
logger.debug(
'Finished dropping and recreating the inventory table.')
# The format of data stored in the pubkeys table has changed. Let's
# clear it, and the pubkeys from inventory, so that they'll
# be re-downloaded.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 7:
logger.debug(
'In messages.dat database, clearing pubkeys table'
' because the data format has been updated.')
self.cur.execute(
'''delete from inventory where objecttype = 1;''')
self.cur.execute(
'''delete from pubkeys;''')
# Any sending messages for which we *thought* that we had
# the pubkey must be rechecked.
self.cur.execute(
'''UPDATE sent SET status='msgqueued' WHERE status='doingmsgpow' or status='badkey';''')
query = '''update settings set value=? WHERE key='version';'''
parameters = (8,)
self.cur.execute(query, parameters)
logger.debug('Finished clearing currently held pubkeys.')
# Add a new column to the inbox table to store the hash of
# the message signature. We'll use this as temporary message UUID
# in order to detect duplicates.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 8:
logger.debug(
'In messages.dat database, adding sighash field to'
' the inbox table.')
item = '''ALTER TABLE inbox ADD sighash blob DEFAULT '' '''
parameters = ''
self.cur.execute(item, parameters)
item = '''update settings set value=? WHERE key='version';'''
parameters = (9,)
self.cur.execute(item, parameters)
# We'll also need a `sleeptill` field and a `ttl` field. Also we
# can combine the pubkeyretrynumber and msgretrynumber into one.
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 9:
logger.info(
'In messages.dat database, making TTL-related changes:'
' combining the pubkeyretrynumber and msgretrynumber'
' fields into the retrynumber field and adding the'
' sleeptill and ttl fields...')
self.cur.execute(
'''CREATE TEMPORARY TABLE sent_backup'''
''' (msgid blob, toaddress text, toripe blob, fromaddress text, subject text, message text,'''
''' ackdata blob, lastactiontime integer, status text, retrynumber integer,'''
''' folder text, encodingtype int)''')
self.cur.execute(
'''INSERT INTO sent_backup SELECT msgid, toaddress, toripe, fromaddress,'''
''' subject, message, ackdata, lastactiontime,'''
''' status, 0, folder, encodingtype FROM sent;''')
self.cur.execute('''DROP TABLE sent''')
self.cur.execute(
'''CREATE TABLE sent'''
''' (msgid blob, toaddress text, toripe blob, fromaddress text, subject text, message text,'''
''' ackdata blob, senttime integer, lastactiontime integer, sleeptill int, status text,'''
''' retrynumber integer, folder text, encodingtype int, ttl int)''')
self.cur.execute(
'''INSERT INTO sent SELECT msgid, toaddress, toripe, fromaddress, subject, message, ackdata,'''
''' lastactiontime, lastactiontime, 0, status, 0, folder, encodingtype, 216000 FROM sent_backup;''')
self.cur.execute('''DROP TABLE sent_backup''')
logger.info('In messages.dat database, finished making TTL-related changes.')
logger.debug('In messages.dat database, adding address field to the pubkeys table.')
# We're going to have to calculate the address for each row in the pubkeys
# table. Then we can take out the hash field.
self.cur.execute('''ALTER TABLE pubkeys ADD address text DEFAULT '' ''')
self.cur.execute('''SELECT hash, addressversion FROM pubkeys''')
queryResult = self.cur.fetchall()
from addresses import encodeAddress
for row in queryResult:
addressHash, addressVersion = row
address = encodeAddress(addressVersion, 1, hash)
item = '''UPDATE pubkeys SET address=? WHERE hash=?;'''
parameters = (address, addressHash)
self.cur.execute(item, parameters)
# Now we can remove the hash field from the pubkeys table.
self.cur.execute(
'''CREATE TEMPORARY TABLE pubkeys_backup'''
''' (address text, addressversion int, transmitdata blob, time int,'''
''' usedpersonally text, UNIQUE(address) ON CONFLICT REPLACE)''')
self.cur.execute(
'''INSERT INTO pubkeys_backup'''
''' SELECT address, addressversion, transmitdata, time, usedpersonally FROM pubkeys;''')
self.cur.execute('''DROP TABLE pubkeys''')
self.cur.execute(
'''CREATE TABLE pubkeys'''
''' (address text, addressversion int, transmitdata blob, time int, usedpersonally text,'''
''' UNIQUE(address) ON CONFLICT REPLACE)''')
self.cur.execute(
'''INSERT INTO pubkeys SELECT'''
''' address, addressversion, transmitdata, time, usedpersonally FROM pubkeys_backup;''')
self.cur.execute('''DROP TABLE pubkeys_backup''')
logger.debug(
'In messages.dat database, done adding address field to the pubkeys table'
' and removing the hash field.')
self.cur.execute('''update settings set value=10 WHERE key='version';''')
# Update the address colunm to unique in addressbook table
item = '''SELECT value FROM settings WHERE key='version';'''
parameters = ''
self.cur.execute(item, parameters)
currentVersion = int(self.cur.fetchall()[0][0])
if currentVersion == 10:
logger.debug(
'In messages.dat database, updating address column to UNIQUE'
' in the addressbook table.')
self.cur.execute(
'''ALTER TABLE addressbook RENAME TO old_addressbook''')
self.cur.execute(
'''CREATE TABLE addressbook'''
''' (label text, address text, UNIQUE(address) ON CONFLICT IGNORE)''')
self.cur.execute(
'''INSERT INTO addressbook SELECT label, address FROM old_addressbook;''')
self.cur.execute('''DROP TABLE old_addressbook''')
self.cur.execute('''update settings set value=11 WHERE key='version';''')
self.upgrade_to_latest(self.cur, self.conn)
# Are you hoping to add a new option to the keys.dat file of existing
# Bitmessage users or modify the SQLite database? Add it right
# above this line!
self.add_new_option()
# Let us check to see the last time we vaccumed the messages.dat file.
# If it has been more than a month let's do it now.
self.check_vaccumed()
def add_new_option(self):
try:
testpayload = '\x00\x00'
t = ('1234', 1, testpayload, '12345678', 'no')
@ -453,8 +464,7 @@ class sqlThread(threading.Thread):
else:
logger.error(err)
# Let us check to see the last time we vaccumed the messages.dat file.
# If it has been more than a month let's do it now.
def check_vaccumed(self):
item = '''SELECT value FROM settings WHERE key='lastvacuumtime';'''
parameters = ''
self.cur.execute(item, parameters)
@ -622,3 +632,12 @@ class sqlThread(threading.Thread):
helper_sql.sqlReturnQueue.put((self.cur.fetchall(), rowcount))
# helper_sql.sqlSubmitQueue.task_done()
def create_function(self):
# create_function
try:
self.conn.create_function("enaddr", 3, func=encodeAddress, deterministic=True)
except (TypeError, sqlite3.NotSupportedError) as err:
logger.error(
"Got error while pass deterministic in sqlite create function {}, Passing 3 params".format(err))
self.conn.create_function("enaddr", 3, encodeAddress)
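The two-step registration above exists because sqlite3.Connection.create_function only accepts the deterministic keyword on Python 3.8+ with SQLite 3.8.3 or newer; older interpreters raise TypeError and older SQLite builds raise NotSupportedError, so the call is retried without the flag. A standalone illustration with a stand-in function (encodeAddress itself needs the addresses module):

import sqlite3

def enaddr_stub(version, stream, ripe):   # stand-in for addresses.encodeAddress
    return 'BM-%s-%s-%s' % (version, stream, ripe)

conn = sqlite3.connect(':memory:')
try:
    conn.create_function('enaddr', 3, enaddr_stub, deterministic=True)
except (TypeError, sqlite3.NotSupportedError):
    conn.create_function('enaddr', 3, enaddr_stub)   # fall back without the flag
print(conn.execute("SELECT enaddr(4, 1, 'ripe')").fetchone()[0])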

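The upgrade mechanism introduced in the file above keys every migration to a numbered SQL script: the versioning decorator derives the level from the method name, runs src/sql/init_version_N.sql, and upgrade_to_latest walks from the stored version up to max_level, bumping the settings row after each step. A condensed sketch of that pattern (class and variable names shortened; it assumes a database that already has the settings table and the SQL files added later in this diff):

import os
import sqlite3

SQL_DIR = 'src/sql'      # assumed location of the numbered migration files
MAX_LEVEL = 11

def versioning(func):
    """Run the SQL file matching the method's trailing number, then the method."""
    def wrapper(self, *args):
        level = func.__name__.rsplit('_', 1)[-1]
        self.run_migrations(level)
        return func(self, *args)
    return wrapper

class Upgrader(object):
    def __init__(self, conn):
        self.conn = conn
        self.cur = conn.cursor()

    def get_current_level(self):
        self.cur.execute("SELECT value FROM settings WHERE key='version'")
        return int(self.cur.fetchall()[0][0])

    def run_migrations(self, suffix):
        path = os.path.join(SQL_DIR, 'init_version_{}.sql'.format(suffix))
        with open(path) as sql_file:
            self.cur.executescript(sql_file.read())

    @versioning
    def upgrade_schema_data_2(self):
        pass   # the actual schema change lives in init_version_2.sql

    def upgrade_to_latest(self):
        for level in range(self.get_current_level(), MAX_LEVEL):
            getattr(self, 'upgrade_schema_data_%d' % level, lambda: None)()
            self.cur.execute(
                "UPDATE settings SET value=? WHERE key='version'", (level + 1,))
        self.conn.commit()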

@ -16,9 +16,16 @@ SQLite objects can only be used from one thread.
or isn't thread-safe.
"""
import Queue
# import Queue
try:
import queue as Queue #python3
except ImportError:
import Queue #python2
import threading
sqlSubmitQueue = Queue.Queue()
"""the queue for SQL"""
sqlReturnQueue = Queue.Queue()
@ -105,6 +112,15 @@ def sqlExecute(sql_statement, *args):
return rowcount
def sqlExecuteScript(sql_statement):
"""Execute SQL script statement"""
statements = sql_statement.split(";")
with SqlBulkExecute() as sql:
for q in statements:
sql.execute("{}".format(q))
def sqlStoredProcedure(procName):
"""Schedule procName to be run"""
assert sql_available
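
The new sqlExecuteScript helper above splits a script on ';' and pushes each statement through SqlBulkExecute on the SQL thread. The same split-and-execute idea shown standalone against an in-memory database (a sketch only; the real helper goes through the queue rather than a direct cursor):

import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()

script = """
CREATE TABLE addressbook (label text, address text, UNIQUE(address) ON CONFLICT IGNORE);
INSERT INTO addressbook VALUES ('me', 'BM-placeholder');
"""
for statement in script.split(';'):   # same splitting rule as sqlExecuteScript
    if statement.strip():
        cur.execute(statement)
conn.commit()
print(cur.execute('SELECT * FROM addressbook').fetchall())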


@ -7,6 +7,7 @@ import state
from bmconfigparser import BMConfigParser
from network.assemble import assemble_addr
from network.connectionpool import BMConnectionPool
from network.udp import UDPSocket
from node import Peer
from threads import StoppableThread
@ -14,13 +15,12 @@ from threads import StoppableThread
class AnnounceThread(StoppableThread):
"""A thread to manage regular announcing of this node"""
name = "Announcer"
announceInterval = 60
def run(self):
lastSelfAnnounced = 0
while not self._stopped and state.shutdown == 0:
processed = 0
if lastSelfAnnounced < time.time() - self.announceInterval:
if lastSelfAnnounced < time.time() - UDPSocket.announceInterval:
self.announceSelf()
lastSelfAnnounced = time.time()
if processed == 0:


@ -8,7 +8,6 @@ import time
import protocol
import state
from bmproto import BMProto
from constants import MAX_TIME_OFFSET
from node import Peer
from objectracker import ObjectTracker
from queues import receiveDataQueue
@ -19,6 +18,7 @@ logger = logging.getLogger('default')
class UDPSocket(BMProto): # pylint: disable=too-many-instance-attributes
"""Bitmessage protocol over UDP (class)"""
port = 8444
announceInterval = 60
def __init__(self, host=None, sock=None, announcing=False):
# pylint: disable=bad-super-call
@ -82,8 +82,8 @@ class UDPSocket(BMProto): # pylint: disable=too-many-instance-attributes
decodedIP = protocol.checkIPAddress(str(ip))
if stream not in state.streamsInWhichIAmParticipating:
continue
if (seenTime < time.time() - MAX_TIME_OFFSET
or seenTime > time.time() + MAX_TIME_OFFSET):
if (seenTime < time.time() - self.maxTimeOffset
or seenTime > time.time() + self.maxTimeOffset):
continue
if decodedIP is False:
# if the address isn't local, interpret it as
@ -94,8 +94,9 @@ class UDPSocket(BMProto): # pylint: disable=too-many-instance-attributes
logger.debug(
"received peer discovery from %s:%i (port %i):",
self.destination.host, self.destination.port, remoteport)
state.discoveredPeers[Peer(self.destination.host, remoteport)] = \
time.time()
if self.local:
state.discoveredPeers[Peer(self.destination.host, remoteport)] = \
time.time()
return True
def bm_command_portcheck(self):
@ -124,9 +125,9 @@ class UDPSocket(BMProto): # pylint: disable=too-many-instance-attributes
def handle_read(self):
try:
recdata, addr = self.socket.recvfrom(self._buf_len)
except socket.error:
logger.error("socket error on recvfrom:", exc_info=True)
(recdata, addr) = self.socket.recvfrom(self._buf_len)
except socket.error as e:
logger.error("socket error: %s", e)
return
self.destination = Peer(*addr)
@ -142,7 +143,7 @@ class UDPSocket(BMProto): # pylint: disable=too-many-instance-attributes
try:
retval = self.socket.sendto(
self.write_buf, ('<broadcast>', self.port))
except socket.error:
logger.error("socket error on sendto:", exc_info=True)
except socket.error as e:
logger.error("socket error on sendto: %s", e)
retval = len(self.write_buf)
self.slice_write_buf(retval)

src/sql/__init__.py (new, empty file)


@ -0,0 +1,6 @@
--
-- Alter table `inventory`
--
ALTER TABLE inventory ADD tag blob DEFAULT '';


@ -0,0 +1,12 @@
ALTER TABLE addressbook RENAME TO old_addressbook;
CREATE TABLE `addressbook` (
`label` text NOT NULL,
`address` text NOT NULL,
UNIQUE(address) ON CONFLICT IGNORE
) ;
INSERT INTO addressbook SELECT label, address FROM old_addressbook;
DROP TABLE old_addressbook;


@ -0,0 +1,55 @@
--
-- Temp Table structure for table `inventory_backup`
--
CREATE TEMP TABLE `inventory_backup` (
`hash` blob NOT NULL,
`objecttype` text DEFAULT NULL,
`streamnumber` int NOT NULL,
`receivedtime` int NOT NULL,
`payload` blob DEFAULT NULL,
-- `integer` integer NOT NULL,
-- `tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory_backup`
--
INSERT INTO `inventory_backup` SELECT hash, objecttype, streamnumber, payload, receivedtime FROM inventory;
--
-- Drop table `inventory`
--
DROP TABLE inventory;
--
-- Table structure for table `inventory`
--
CREATE TABLE `inventory` (
`hash` blob NOT NULL,
`objecttype` text DEFAULT NULL,
`streamnumber` int NOT NULL,
`receivedtime` int NOT NULL,
`payload` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory`
--
INSERT INTO inventory SELECT hash, objecttype, streamnumber, payload, receivedtime FROM inventory_backup;
--
-- Drop data for table `inventory_backup`
--
DROP TABLE inventory_backup;


@ -0,0 +1,26 @@
--
-- Drop Table `pubkeys`
--
DROP TABLE pubkeys;
--
-- Table structure for table `pubkeys`
--
CREATE TABLE `pubkeys` (
`hash` blob NOT NULL,
`addressversion` int DEFAULT NULL,
`transmitdata` blob NOT NULL,
`time` int NOT NULL,
`usedpersonally` text DEFAULT NULL,
UNIQUE(hash, addressversion) ON CONFLICT REPLACE
) ;
--
-- Drop from Table `pubkeys`
--
DELETE FROM inventory WHERE objecttype = 'pubkey';


@ -0,0 +1,17 @@
--
-- Drop Table `knownnodes`
--
DROP TABLE knownnodes;
--
-- Table structure for table `objectprocessorqueue`
--
CREATE TABLE `objectprocessorqueue` (
`objecttype` text DEFAULT NULL,
`data` blob DEFAULT NULL,
UNIQUE(objecttype, data) ON CONFLICT REPLACE
) ;


@ -0,0 +1,39 @@
-- --
-- -- Drop table `inventory`
-- --
DROP TABLE inventory;
-- --
-- -- Table structure for table `inventory`
-- --
CREATE TABLE `inventory` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob NOT NULL,
`expirestime` integer DEFAULT NULL,
`tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
-- --
-- -- Drop table `inventory`
-- --
DROP TABLE objectprocessorqueue;
-- --
-- -- Table structure for table `objectprocessorqueue`
-- --
CREATE TABLE `objectprocessorqueue` (
`objecttype` int DEFAULT NULL,
`data` blob DEFAULT NULL,
UNIQUE(objecttype, data) ON CONFLICT REPLACE
) ;


@ -0,0 +1,18 @@
-- --
-- -- Drop table `inventory`
-- --
DELETE FROM inventory WHERE objecttype = 1;
-- --
-- -- Drop table `pubkeys`
-- --
DELETE FROM pubkeys;
-- --
-- -- Update table `pubkeys`
-- --
UPDATE sent SET status='msgqueued' WHERE status='doingmsgpow' or status='badkey';


@ -0,0 +1,5 @@
-- --
-- -- Alter table `inbox`
-- --
ALTER TABLE inbox ADD sighash blob DEFAULT '';


@ -0,0 +1,74 @@
-- --
-- -- Table structure for table `sent_backup`
-- --
CREATE TEMPORARY TABLE `sent_backup` (
`msgid` blob DEFAULT NULL,
`toaddress` text DEFAULT NULL,
`toripe` blob DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`message` text DEFAULT NULL,
`ackdata` blob DEFAULT NULL,
`lastactiontime` integer DEFAULT NULL,
`status` text DEFAULT NULL,
`retrynumber` integer DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL
) ;
-- --
-- -- Dumping data for table `sent_backup`
-- --
INSERT INTO sent_backup SELECT msgid, toaddress, toripe, fromaddress, subject, message, ackdata, lastactiontime, status, 0, folder, encodingtype FROM sent;
-- --
-- -- Drope table `sent`
-- --
DROP TABLE sent;
-- --
-- -- Table structure for table `sent_backup`
-- --
CREATE TABLE `sent` (
`msgid` blob DEFAULT NULL,
`toaddress` text DEFAULT NULL,
`toripe` blob DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`message` text DEFAULT NULL,
`ackdata` blob DEFAULT NULL,
`senttime` integer DEFAULT NULL,
`lastactiontime` integer DEFAULT NULL,
`sleeptill` int DEFAULT NULL,
`status` text DEFAULT NULL,
`retrynumber` integer DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`ttl` int DEFAULT NULL
) ;
-- --
-- -- Dumping data for table `sent`
-- --
INSERT INTO sent SELECT msgid, toaddress, toripe, fromaddress, subject, message, ackdata, lastactiontime, lastactiontime, 0, status, 0, folder, encodingtype, 216000 FROM sent_backup;
--UPDATE pubkeys SET address= (select enaddr(?, ?, ?)", (addressVersion, 1, addressHash)) WHERE hash=?
-- --
-- -- Drop table `sent`
-- --
DROP TABLE sent_backup;


@ -0,0 +1,68 @@
-- --
-- -- Table structure for table `pubkeys_backup`
-- --
CREATE TEMPORARY TABLE `pubkeys_backup` (
`address` text DEFAULT NULL,
`addressversion` int DEFAULT NULL,
`transmitdata` blob DEFAULT NULL,
`time` int DEFAULT NULL,
`usedpersonally` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT REPLACE
) ;
-- --
-- -- Dumping data for table `pubkeys_backup`
-- --
INSERT INTO pubkeys_backup SELECT address, addressversion, transmitdata, time, usedpersonally FROM pubkeys;
-- --
-- -- Drope table `pubkeys`
-- --
DROP TABLE pubkeys;
-- --
-- -- Table structure for table `pubkeys`
-- --
CREATE TABLE `pubkeys` (
`address` text DEFAULT NULL,
`addressversion` int DEFAULT NULL,
`transmitdata` blob DEFAULT NULL,
`time` int DEFAULT NULL,
`usedpersonally` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT REPLACE
) ;
-- --
-- -- Dumping data for table `pubkeys`
-- --
INSERT INTO pubkeys SELECT address, addressversion, transmitdata, time, usedpersonally FROM pubkeys_backup;
-- self.cur.execute(
-- '''CREATE TEMPORARY TABLE pubkeys_backup'''
-- ''' (address text, addressversion int, transmitdata blob, time int,'''
-- ''' usedpersonally text, UNIQUE(address) ON CONFLICT REPLACE)''')
-- self.cur.execute(
-- '''INSERT INTO pubkeys_backup'''
-- ''' SELECT address, addressversion, transmitdata, time, usedpersonally FROM pubkeys;''')
-- self.cur.execute('''DROP TABLE pubkeys''')
-- self.cur.execute(
-- '''CREATE TABLE pubkeys'''
-- ''' (address text, addressversion int, transmitdata blob, time int, usedpersonally text,'''
-- ''' UNIQUE(address) ON CONFLICT REPLACE)''')
-- self.cur.execute(
-- '''INSERT INTO pubkeys SELECT'''
-- ''' address, addressversion, transmitdata, time, usedpersonally FROM pubkeys_backup;''')
-- self.cur.execute('''DROP TABLE pubkeys_backup''')

src/sql/run.sql (new file, 126 lines added)

@ -0,0 +1,126 @@
--
-- Table structure for table `inbox`
--
CREATE TABLE `inbox` (
`msgid` blob DEFAULT NULL,
`toaddress` text DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`received` text DEFAULT NULL,
`message` text DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`read` bool DEFAULT NULL,
`sighash` blob DEFAULT NULL,
UNIQUE(msgid) ON CONFLICT REPLACE
) ;
--
-- Table structure for table `sent`
--
CREATE TABLE `sent` (
`msgid` blob DEFAULT NULL,
`toaddress` text DEFAULT NULL,
`toripe` blob DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`message` text DEFAULT NULL,
`ackdata` blob DEFAULT NULL,
`senttime` integer DEFAULT NULL,
`lastactiontime` integer DEFAULT NULL,
`sleeptill` integer DEFAULT NULL,
`status` text DEFAULT NULL,
`retrynumber` integer DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`ttl` int DEFAULT NULL
) ;
--
-- Table structure for table `subscriptions`
--
CREATE TABLE `subscriptions` (
`label` text DEFAULT NULL,
`address` text DEFAULT NULL,
`enabled` bool DEFAULT NULL
) ;
--
-- Table structure for table `addressbook`
--
CREATE TABLE `addressbook` (
`label` text DEFAULT NULL,
`address` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT IGNORE
) ;
--
-- Table structure for table `blacklist`
--
CREATE TABLE `blacklist` (
`label` text DEFAULT NULL,
`address` text DEFAULT NULL,
`enabled` bool DEFAULT NULL
) ;
--
-- Table structure for table `whitelist`
--
CREATE TABLE `whitelist` (
`label` text DEFAULT NULL,
`address` text DEFAULT NULL,
`enabled` bool DEFAULT NULL
) ;
--
-- Table structure for table `pubkeys`
--
CREATE TABLE `pubkeys` (
`address` text DEFAULT NULL,
`addressversion` int DEFAULT NULL,
`transmitdata` blob DEFAULT NULL,
`time` int DEFAULT NULL,
`usedpersonally` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT REPLACE
) ;
--
-- Table structure for table `inventory`
--
CREATE TABLE `inventory` (
`hash` blob DEFAULT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int DEFAULT NULL,
`payload` blob DEFAULT NULL,
`expirestime` integer DEFAULT NULL,
`tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Insert data for table `subscriptions`
--
INSERT INTO subscriptions VALUES ('Bitmessage new releases/announcements','BM-GtovgYdgs7qXPkoYaRgrLFuFKz1SFpsw',1);
--
-- Table structure for table `settings`
--
CREATE TABLE `settings` (
`key` blob DEFAULT NULL,
`value` blob DEFAULT NULL,
UNIQUE(key) ON CONFLICT REPLACE
) ;


@ -11,7 +11,6 @@ import shutil
import socket
import string
import sys
import threading
import time
import unittest
@ -62,13 +61,6 @@ class TestCore(unittest.TestCase):
"""Test case, which runs in main pybitmessage thread"""
addr = 'BM-2cVvkzJuQDsQHLqxRXc6HZGPLZnkBLzEZY'
def tearDown(self):
"""Reset possible unexpected settings after test"""
knownnodes.addKnownNode(1, Peer('127.0.0.1', 8444), is_self=True)
BMConfigParser().remove_option('bitmessagesettings', 'dontconnect')
BMConfigParser().remove_option('bitmessagesettings', 'onionservicesonly')
BMConfigParser().set('bitmessagesettings', 'socksproxytype', 'none')
def test_msgcoding(self):
"""test encoding and decoding (originally from helper_msgcoding)"""
msg_data = {
@ -278,36 +270,6 @@ class TestCore(unittest.TestCase):
return
self.fail('Failed to connect to at least 3 nodes within 360 sec')
def test_udp(self):
"""check default udp setting and presence of Announcer thread"""
self.assertTrue(
BMConfigParser().safeGetBoolean('bitmessagesettings', 'udp'))
for thread in threading.enumerate():
if thread.name == 'Announcer': # find Announcer thread
break
else:
return self.fail('No Announcer thread found')
for _ in range(20): # wait for UDP socket
for sock in BMConnectionPool().udpSockets.values():
thread.announceSelf()
break
else:
time.sleep(1)
continue
break
else:
self.fail('UDP socket is not started')
for _ in range(20):
if state.discoveredPeers:
peer = state.discoveredPeers.keys()[0]
self.assertEqual(peer.port, 8444)
break
time.sleep(1)
else:
self.fail('No self in discovered peers')
@staticmethod
def _decode_msg(data, pattern):
proto = BMProto()


@ -0,0 +1,11 @@
CREATE TABLE `testhash` (
`addressversion` int DEFAULT NULL,
`hash` blob DEFAULT NULL,
`address` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT IGNORE
);
INSERT INTO testhash (addressversion, hash) VALUES(4, '21122112211221122112');
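The address column in this fixture is left NULL on purpose: the test fills it by calling a user-defined SQL function named enaddr inside an UPDATE (see test_create_function below). A hedged sketch of how such a function can be registered on a sqlite3 connection is given here; the actual registration in this branch happens inside the SQL thread, and the import path is assumed from how the tests import encodeAddress.

import sqlite3
from pybitmessage.addresses import encodeAddress  # assumed import path

conn = sqlite3.connect(':memory:')
# Register a 3-argument SQL function "enaddr" backed by encodeAddress,
# so SQL can call enaddr(addressversion, streamnumber, hash).
conn.create_function("enaddr", 3, encodeAddress)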


@ -0,0 +1,31 @@
--
-- Table structure for table `settings`
--
CREATE TABLE IF NOT EXISTS `settings` (
`key` blob NOT NULL,
`value` text DEFAULT NULL,
UNIQUE(key) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `settings`
--
INSERT INTO `settings` VALUES ('version','1');
--
-- Table structure for table `inventory`
--
CREATE TABLE IF NOT EXISTS `inventory` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
-- `tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
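In this version-1 fixture the tag column is deliberately commented out. Judging from the assertion in test_sql_thread_version_1 further below (column index 5, name 'tag', type blob, default ''), the upgrade is expected to add that column, roughly equivalent to the single statement sketched here (an inference from the fixture and the test, not a copy of the upgrade code):

# hypothetical equivalent of the version-1 schema upgrade
sqlExecute("ALTER TABLE inventory ADD COLUMN tag blob DEFAULT ''")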


@ -0,0 +1,40 @@
--
-- Table structure for table `addressbook`
--
CREATE TABLE IF NOT EXISTS `addressbook` (
`label` blob NOT NULL,
`address` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT IGNORE
) ;
--
-- Alter table `addressbook`
--
ALTER TABLE addressbook RENAME TO old_addressbook;
--
-- Table structure for table `addressbook`
--
CREATE TABLE IF NOT EXISTS `addressbook` (
`label` text NOT NULL,
`address` text DEFAULT NULL,
UNIQUE(address) ON CONFLICT IGNORE
) ;
--
-- Insert data into table `addressbook`
--
INSERT INTO addressbook SELECT label, address FROM old_addressbook;
--
-- Drop the old table `old_addressbook`
--
DROP TABLE old_addressbook;


@ -0,0 +1,20 @@
--
-- Table structure for table `inventory`
--
CREATE TABLE IF NOT EXISTS `inventory` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`receivedtime` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
-- `tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory`
--
INSERT INTO `inventory` VALUES ('hash', 1, 1,1, 1,'test');


@ -0,0 +1,27 @@
--
-- Table structure for table `inventory`
--
CREATE TABLE IF NOT EXISTS `inventory` (
`hash` blob NOT NULL,
`objecttype` text DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory`
--
INSERT INTO `inventory` VALUES ('hash', "pubkey", 1, 1,'test');
--
-- Table structure for table `pubkeys`
--
CREATE TABLE IF NOT EXISTS `pubkeys` (
`objecttype` int,
UNIQUE(objecttype) ON CONFLICT REPLACE
) ;


@ -0,0 +1,12 @@
--
-- Table structure for table `knownnodes`
--
CREATE TABLE IF NOT EXISTS `knownnodes` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;


@ -0,0 +1,27 @@
--
-- Table structure for table `inventory`
--
CREATE TABLE IF NOT EXISTS `inventory` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory`
--
INSERT INTO `inventory` VALUES ('hash', 1, 1, 1,'test');
--
-- Table structure for table `objectprocessorqueue`
--
CREATE TABLE IF NOT EXISTS `objectprocessorqueue` (
`objecttype` int,
UNIQUE(objecttype) ON CONFLICT REPLACE
) ;


@ -0,0 +1,69 @@
--
-- Table structure for table `inventory`
--
CREATE TABLE IF NOT EXISTS `inventory` (
`hash` blob NOT NULL,
`objecttype` int DEFAULT NULL,
`streamnumber` int NOT NULL,
`payload` blob DEFAULT NULL,
`integer` integer NOT NULL,
-- `tag` blob DEFAULT NULL,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `inventory`
--
INSERT INTO `inventory` VALUES ('hash', 1, 1, 1,'test');
--
-- Table structure for table `pubkeys`
--
CREATE TABLE IF NOT EXISTS `pubkeys` (
`hash` text,
`addressversion` int,
`transmitdata` blob,
`time` int,
`usedpersonally` text,
UNIQUE(hash) ON CONFLICT REPLACE
) ;
--
-- Dumping data for table `pubkeys`
--
INSERT INTO `pubkeys` VALUES ('hash','1','1','1','test');
--
-- Table structure for table `sent`
--
CREATE TABLE IF NOT EXISTS `sent` (
`msgid` blob DEFAULT NULL,
`toaddress` text DEFAULT NULL,
`toripe` blob DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`message` text DEFAULT NULL,
`ackdata` blob DEFAULT NULL,
`senttime` integer DEFAULT NULL,
`lastactiontime` integer DEFAULT NULL,
`sleeptill` integer DEFAULT NULL,
`status` text DEFAULT NULL,
`retrynumber` integer DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`ttl` int DEFAULT NULL
) ;
--
-- Dumping data for table `sent`
--
INSERT INTO `sent` VALUES
('msgid','toaddress','toripe','fromaddress','subject','message','ackdata','senttime','lastactiontime','sleeptill','doingmsgpow','retrynumber','folder','encodingtype','ttl'),
('msgid','toaddress','toripe','fromaddress','subject','message','ackdata','senttime','lastactiontime','sleeptill','badkey','retrynumber','folder','encodingtype','ttl');


@ -0,0 +1,16 @@
--
-- Table structure for table `inbox`
--
CREATE TABLE IF NOT EXISTS `inbox` (
`msgid` blob NOT NULL,
`toaddress` text DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`received` text DEFAULT NULL,
`message` text DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`read` bool DEFAULT NULL,
UNIQUE(msgid) ON CONFLICT REPLACE
) ;


@ -0,0 +1,37 @@
--
-- Table structure for table `sent`
--
CREATE TABLE IF NOT EXISTS `sent` (
`msgid` blob NOT NULL,
`toaddress` text DEFAULT NULL,
`toripe` blob DEFAULT NULL,
`fromaddress` text DEFAULT NULL,
`subject` text DEFAULT NULL,
`message` text DEFAULT NULL,
`ackdata` blob DEFAULT NULL,
`senttime` integer DEFAULT NULL,
`lastactiontime` integer DEFAULT NULL,
`sleeptill` integer DEFAULT NULL,
`status` text DEFAULT NULL,
`retrynumber` integer DEFAULT NULL,
`folder` text DEFAULT NULL,
`encodingtype` int DEFAULT NULL,
`ttl` int DEFAULT NULL,
UNIQUE(msgid) ON CONFLICT REPLACE
) ;
--
-- Table structure for table `pubkeys`
--
CREATE TABLE IF NOT EXISTS `pubkeys` (
`hash` text,
`addressversion` int,
`transmitdata` blob,
`time` int,
`usedpersonally` text,
UNIQUE(hash) ON CONFLICT REPLACE
) ;


@ -2,7 +2,11 @@
Various tests for config
"""
import os
import unittest
import tempfile
from .test_process import TestProcessProto
from pybitmessage.bmconfigparser import BMConfigParser
@ -34,3 +38,32 @@ class TestConfig(unittest.TestCase):
BMConfigParser().safeGetInt('nonexistent', 'nonexistent'), 0)
self.assertEqual(
BMConfigParser().safeGetInt('nonexistent', 'nonexistent', 42), 42)
class TestProcessConfig(TestProcessProto):
"""A test case for keys.dat"""
home = tempfile.mkdtemp()
def test_config_defaults(self):
"""Test settings in the generated config"""
self._stop_process()
self._kill_process()
config = BMConfigParser()
config.read(os.path.join(self.home, 'keys.dat'))
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'settingsversion'), 10)
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'port'), 8444)
# don't connect
self.assertTrue(config.safeGetBoolean(
'bitmessagesettings', 'dontconnect'))
# API disabled
self.assertFalse(config.safeGetBoolean(
'bitmessagesettings', 'apienabled'))
# extralowdifficulty is false
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'defaultnoncetrialsperbyte'), 1000)
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'defaultpayloadlengthextrabytes'), 1000)


@ -1,38 +0,0 @@
"""
Various tests for config
"""
import os
import tempfile
from pybitmessage.bmconfigparser import BMConfigParser
from .test_process import TestProcessProto
class TestProcessConfig(TestProcessProto):
"""A test case for keys.dat"""
home = tempfile.mkdtemp()
def test_config_defaults(self):
"""Test settings in the generated config"""
config = BMConfigParser()
self._stop_process()
self._kill_process()
config.read(os.path.join(self.home, 'keys.dat'))
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'settingsversion'), 10)
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'port'), 8444)
# don't connect
self.assertTrue(config.safeGetBoolean(
'bitmessagesettings', 'dontconnect'))
# API disabled
self.assertFalse(config.safeGetBoolean(
'bitmessagesettings', 'apienabled'))
# extralowdifficulty is false
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'defaultnoncetrialsperbyte'), 1000)
self.assertEqual(config.safeGetInt(
'bitmessagesettings', 'defaultpayloadlengthextrabytes'), 1000)

src/tests/test_sqlthread.py Normal file

@ -0,0 +1,248 @@
"""
Test for sqlThread
"""
import os
import unittest

from ..addresses import encodeAddress
from ..class_sqlThread import sqlThread, UpgradeDB
from ..helper_sql import (
    SqlBulkExecute, sql_ready, sqlExecute, sqlExecuteScript, sqlQuery,
    sqlStoredProcedure)
from .common import skip_python3

skip_python3()
class TestSqlThread(unittest.TestCase):
    """Test case for sqlThread and the DB schema upgrades"""

    # path to the SQL fixture files used by the tests
    root_path = os.path.dirname(os.path.dirname(__file__))

    @classmethod
    def setUpClass(cls):
        # Start the SQL thread and wait until the DB is ready
        sqlLookup = sqlThread()
        sqlLookup.daemon = False
        sqlLookup.start()
        sql_ready.wait()
    def setUp(self):
        # Start every test from an empty database
        tables = list(sqlQuery(
            "SELECT name FROM sqlite_master WHERE type = 'table'"))
        with SqlBulkExecute() as sql:
            for q in tables:
                sql.execute("DROP TABLE IF EXISTS %s" % q)

    def tearDown(self):
        pass

    @classmethod
    def tearDownClass(cls):
        # Stop the SQL thread
        sqlStoredProcedure('exit')
    def initialise_database(self, file):
        """Initialise the DB from tests/sql/<file>.sql"""
        with open(os.path.join(
                self.root_path, "tests/sql/{}.sql".format(file))) as sql_file:
            sqlExecuteScript(sql_file.read())
    def versioning(func):
        """Decorator: derive the schema version from the test name, load
        the matching init_version_<N> fixture and run the corresponding
        upgrade_schema_data_<N> before the test body."""
        def wrapper(*args):
            self = args[0]
            version = func.__name__.rsplit('_', 1)[-1]
            # Mock a database at the old schema version
            self.initialise_database("init_version_{}".format(version))
            # Run the upgrade under test
            upgrade_db = UpgradeDB()
            getattr(upgrade_db, "upgrade_schema_data_{}".format(version))()
            return func(*args)
        return wrapper
    def test_create_function(self):
        """Test the enaddr function registered with create_function"""
        encoded_str = encodeAddress(4, 1, "21122112211221122112")
        # Initialise the testhash table from its fixture
        self.initialise_database("create_function")
        sqlExecute('''INSERT INTO testhash (addressversion, hash) VALUES(4, '21122112211221122112')''')
        # Call the function in a query
        sqlExecute('''UPDATE testhash SET address=(enaddr(testhash.addressversion, 1, hash)) WHERE hash=testhash.hash''')
        # Assertion
        query = sqlQuery('''SELECT * FROM testhash;''')
        self.assertEqual(query[0][-1], encoded_str, "test case fail for create_function")
        sqlExecute('''DROP TABLE testhash''')
    def filter_table_column(self, schema, column):
        """Yield `column` for every row of a PRAGMA table_info result
        that contains it"""
        for x in schema:
            for y in x:
                if y == column:
                    yield y
    @versioning
    def test_sql_thread_version_1(self):
        """Test the upgrade from version 1"""
        # Assertion after versioning
        res = sqlQuery('''PRAGMA table_info('inventory');''')
        result = list(self.filter_table_column(res, "tag"))
        res = [tup for tup in res if "tag" in tup]
        self.assertEqual(result, ['tag'], "Data not migrated for version 1")
        self.assertEqual(
            res, [(5, 'tag', 'blob', 0, "''", 0)],
            "Data not migrated for version 1")
    @versioning
    def test_sql_thread_version_10(self):
        """Test the upgrade from version 10"""
        # Assertion
        res = sqlExecute(''' SELECT count(name) FROM sqlite_master WHERE type='table' AND name='old_addressbook' ''')
        self.assertNotEqual(res, 1, "Table old_addressbook not deleted")
        self.assertEqual(res, -1, "Table old_addressbook not deleted")
        res = sqlQuery('''PRAGMA table_info('addressbook');''')
        result = list(self.filter_table_column(res, "address"))
        self.assertEqual(result, ['address'], "Data not migrated for version 10")
# @versioning
# def test_sql_thread_version_9(self):
# """
# Test with version 9
# """
#
# # Assertion
# self.cur.execute(''' SELECT count(name) FROM sqlite_master WHERE type='table' AND name='pubkeys_backup' ''')
# self.assertNotEqual(self.cur.fetchone(), 1, "Table pubkeys_backup not deleted")
#
# res = self.cur.execute('''PRAGMA table_info('pubkeys');''')
# res = res.fetchall()
# result = list(self.filter_table_column(res, "address"))
# self.assertEqual(result, ['address'], "Data not migrated for version 9")
#
# @versioning
# def test_sql_thread_version_8(self):
# """
# Test with version 8
# """
#
# # Assertion
# res = self.cur.execute('''PRAGMA table_info('inbox');''')
# res = res.fetchall()
# result = list(self.filter_table_column(res, "sighash"))
# self.assertEqual(result, ['sighash'], "Data not migrated for version 8")
#
# @versioning
# def test_sql_thread_version_7(self):
# """
# Test with version 7
# """
#
# # Assertion
# pubkeys = self.cur.execute('''SELECT * FROM pubkeys ''')
# pubkeys = pubkeys.fetchall()
# self.assertEqual(pubkeys, [], "Data not migrated for version 7")
#
# inventory = self.cur.execute('''SELECT * FROM inventory ''')
# inventory = inventory.fetchall()
# self.assertEqual(inventory, [], "Data not migrated for version 7")
#
# sent = self.cur.execute('''SELECT status FROM sent ''')
# sent = sent.fetchall()
# self.assertEqual(sent, [('msgqueued',), ('msgqueued',)], "Data not migrated for version 7")
#
# @versioning
# def test_sql_thread_version_6(self):
# """
# Test with version 6
# """
#
# # Assertion
#
# inventory = self.cur.execute('''PRAGMA table_info('inventory');''')
# inventory = inventory.fetchall()
# inventory = list(self.filter_table_column(inventory, "expirestime"))
# self.assertEqual(inventory, ['expirestime'], "Data not migrated for version 6")
#
# objectprocessorqueue = self.cur.execute('''PRAGMA table_info('inventory');''')
# objectprocessorqueue = objectprocessorqueue.fetchall()
# objectprocessorqueue = list(self.filter_table_column(objectprocessorqueue, "objecttype"))
# self.assertEqual(objectprocessorqueue, ['objecttype'], "Data not migrated for version 6")
#
# @versioning
# def test_sql_thread_version_5(self):
# """
# Test with version 5
# """
#
# # Assertion
# self.cur.execute(''' SELECT count(name) FROM sqlite_master WHERE type='table' AND name='knownnodes' ''')
# self.assertNotEqual(self.cur.fetchone(), 1, "Table knownnodes not deleted in versioning 5")
# self.cur.execute(
# ''' SELECT count(name) FROM sqlite_master WHERE type='table' AND name='objectprocessorqueue'; ''')
# self.assertNotEqual(self.cur.fetchone(), 0, "Table objectprocessorqueue not created in versioning 5")
#
# @versioning
# def test_sql_thread_version_4(self):
# """
# Test with version 4
# """
#
# # Assertion
# self.cur.execute('''select * from inventory where objecttype = 'pubkey';''')
# self.assertNotEqual(self.cur.fetchone(), 1, "Table inventory not deleted in versioning 4")
#
# def test_sql_thread_version_3(self):
# """
# Test with version 3 and 1 both are same
# """
# pass
#
# @versioning
# def test_sql_thread_version_2(self):
# """
# Test with version 2
# """
#
# # Assertion
# self.cur.execute(''' SELECT count(name) FROM sqlite_master WHERE type='table' AND name='inventory_backup' ''')
# self.assertNotEqual(self.cur.fetchone(), 1, "Table inventory_backup not deleted in versioning 2")
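The commented-out checks above still use a raw cursor (self.cur), which this test case no longer owns. A possible way to re-enable one of them with the sqlQuery helper is sketched below; it assumes an init_version_9 fixture and an upgrade_schema_data_9 method exist, following the naming convention of the versioning decorator, and it is not part of the diff itself.

    @versioning
    def test_sql_thread_version_9(self):
        """Test the upgrade from version 9"""
        res = sqlQuery(
            "SELECT count(name) FROM sqlite_master"
            " WHERE type='table' AND name='pubkeys_backup'")
        self.assertNotEqual(
            res[0][0], 1, "Table pubkeys_backup not deleted")
        res = sqlQuery('''PRAGMA table_info('pubkeys');''')
        result = list(self.filter_table_column(res, "address"))
        self.assertEqual(
            result, ['address'], "Data not migrated for version 9")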