init from memdb

Andre Staltz 2023-04-01 11:36:06 +03:00
commit 0cebc33d94
18 changed files with 1197 additions and 0 deletions

25
.github/workflows/node.js.yml vendored Normal file

@@ -0,0 +1,25 @@
name: CI
on:
push:
branches: [master]
pull_request:
branches: [master]
jobs:
test:
runs-on: ubuntu-latest
timeout-minutes: 10
strategy:
matrix:
node-version: [16.x, 18.x]
steps:
- uses: actions/checkout@v2
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v1
with:
node-version: ${{ matrix.node-version }}
- run: npm install
- run: npm test

9
.gitignore vendored Normal file

@@ -0,0 +1,9 @@
.vscode
node_modules
pnpm-lock.yaml
package-lock.json
coverage
*~
# For misc scripts and experiments:
/gitignored

7
.prettierrc.yaml Normal file

@@ -0,0 +1,7 @@
# SPDX-FileCopyrightText: 2021 Anders Rune Jensen
# SPDX-FileCopyrightText: 2021 Andre 'Staltz' Medeiros
#
# SPDX-License-Identifier: Unlicense
semi: false
singleQuote: true

121
LICENSE Normal file

@@ -0,0 +1,121 @@
Creative Commons Legal Code
CC0 1.0 Universal
CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
HEREUNDER.
Statement of Purpose
The laws of most jurisdictions throughout the world automatically confer
exclusive Copyright and Related Rights (defined below) upon the creator
and subsequent owner(s) (each and all, an "owner") of an original work of
authorship and/or a database (each, a "Work").
Certain owners wish to permanently relinquish those rights to a Work for
the purpose of contributing to a commons of creative, cultural and
scientific works ("Commons") that the public can reliably and without fear
of later claims of infringement build upon, modify, incorporate in other
works, reuse and redistribute as freely as possible in any form whatsoever
and for any purposes, including without limitation commercial purposes.
These owners may contribute to the Commons to promote the ideal of a free
culture and the further production of creative, cultural and scientific
works, or to gain reputation or greater distribution for their Work in
part through the use and efforts of others.
For these and/or other purposes and motivations, and without any
expectation of additional consideration or compensation, the person
associating CC0 with a Work (the "Affirmer"), to the extent that he or she
is an owner of Copyright and Related Rights in the Work, voluntarily
elects to apply CC0 to the Work and publicly distribute the Work under its
terms, with knowledge of his or her Copyright and Related Rights in the
Work and the meaning and intended legal effect of CC0 on those rights.
1. Copyright and Related Rights. A Work made available under CC0 may be
protected by copyright and related or neighboring rights ("Copyright and
Related Rights"). Copyright and Related Rights include, but are not
limited to, the following:
i. the right to reproduce, adapt, distribute, perform, display,
communicate, and translate a Work;
ii. moral rights retained by the original author(s) and/or performer(s);
iii. publicity and privacy rights pertaining to a person's image or
likeness depicted in a Work;
iv. rights protecting against unfair competition in regards to a Work,
subject to the limitations in paragraph 4(a), below;
v. rights protecting the extraction, dissemination, use and reuse of data
in a Work;
vi. database rights (such as those arising under Directive 96/9/EC of the
European Parliament and of the Council of 11 March 1996 on the legal
protection of databases, and under any national implementation
thereof, including any amended or successor version of such
directive); and
vii. other similar, equivalent or corresponding rights throughout the
world based on applicable law or treaty, and any national
implementations thereof.
2. Waiver. To the greatest extent permitted by, but not in contravention
of, applicable law, Affirmer hereby overtly, fully, permanently,
irrevocably and unconditionally waives, abandons, and surrenders all of
Affirmer's Copyright and Related Rights and associated claims and causes
of action, whether now known or unknown (including existing as well as
future claims and causes of action), in the Work (i) in all territories
worldwide, (ii) for the maximum duration provided by applicable law or
treaty (including future time extensions), (iii) in any current or future
medium and for any number of copies, and (iv) for any purpose whatsoever,
including without limitation commercial, advertising or promotional
purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
member of the public at large and to the detriment of Affirmer's heirs and
successors, fully intending that such Waiver shall not be subject to
revocation, rescission, cancellation, termination, or any other legal or
equitable action to disrupt the quiet enjoyment of the Work by the public
as contemplated by Affirmer's express Statement of Purpose.
3. Public License Fallback. Should any part of the Waiver for any reason
be judged legally invalid or ineffective under applicable law, then the
Waiver shall be preserved to the maximum extent permitted taking into
account Affirmer's express Statement of Purpose. In addition, to the
extent the Waiver is so judged Affirmer hereby grants to each affected
person a royalty-free, non transferable, non sublicensable, non exclusive,
irrevocable and unconditional license to exercise Affirmer's Copyright and
Related Rights in the Work (i) in all territories worldwide, (ii) for the
maximum duration provided by applicable law or treaty (including future
time extensions), (iii) in any current or future medium and for any number
of copies, and (iv) for any purpose whatsoever, including without
limitation commercial, advertising or promotional purposes (the
"License"). The License shall be deemed effective as of the date CC0 was
applied by Affirmer to the Work. Should any part of the License for any
reason be judged legally invalid or ineffective under applicable law, such
partial invalidity or ineffectiveness shall not invalidate the remainder
of the License, and in such case Affirmer hereby affirms that he or she
will not (i) exercise any of his or her remaining Copyright and Related
Rights in the Work or (ii) assert any associated claims and causes of
action with respect to the Work, in either case contrary to Affirmer's
express Statement of Purpose.
4. Limitations and Disclaimers.
a. No trademark or patent rights held by Affirmer are waived, abandoned,
surrendered, licensed or otherwise affected by this document.
b. Affirmer offers the Work as-is and makes no representations or
warranties of any kind concerning the Work, express, implied,
statutory or otherwise, including without limitation warranties of
title, merchantability, fitness for a particular purpose, non
infringement, or the absence of latent or other defects, accuracy, or
the present or absence of errors, whether or not discoverable, all to
the greatest extent permissible under applicable law.
c. Affirmer disclaims responsibility for clearing rights of other persons
that may apply to the Work or any use thereof, including without
limitation any person's Copyright and Related Rights in the Work.
Further, Affirmer disclaims responsibility for obtaining any necessary
consents, permissions or other rights required for any use of the
Work.
d. Affirmer understands and acknowledges that Creative Commons is not a
party to this document and has no duty or obligation with respect to
this CC0 or use of the Work.

1
README.md Normal file

@@ -0,0 +1 @@
**Work in progress**

1
index.js Normal file

@@ -0,0 +1 @@
module.exports = require('./lib/plugin')

63
lib/encryption.js Normal file

@@ -0,0 +1,63 @@
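// Encrypted content strings have the shape "<base64 ciphertext>.<encryption format name>"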
function ciphertextStrToBuffer(str) {
const dot = str.indexOf('.')
return Buffer.from(str.slice(0, dot), 'base64')
}
function decrypt(msg, ssb, config) {
const { author, previous, content } = msg.value
if (typeof content !== 'string') return msg
const encryptionFormat = ssb.db.findEncryptionFormatFor(content)
if (!encryptionFormat) return msg
const feedFormat = ssb.db.findFeedFormatForAuthor(author)
if (!feedFormat) return msg
// Decrypt
const ciphertextBuf = ciphertextStrToBuffer(content)
const opts = { keys: config.keys, author, previous }
const plaintextBuf = encryptionFormat.decrypt(ciphertextBuf, opts)
if (!plaintextBuf) return msg
// Reconstruct KVT in JS encoding
const nativeMsg = feedFormat.toNativeMsg(msg.value, 'js')
// TODO: feedFormat.fromDecryptedNativeMsg() should NOT mutate nativeMsg
// but in the case of ssb-classic, it is
const msgVal = feedFormat.fromDecryptedNativeMsg(
plaintextBuf,
{ ...nativeMsg, value: { ...nativeMsg.value } }, // TODO revert this
'js'
)
return {
key: msg.key,
value: msgVal,
timestamp: msg.timestamp,
meta: {
private: true,
originalContent: content,
encryptionFormat: encryptionFormat.name,
},
}
}
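// Undo decrypt(): restore the original encrypted content, keeping the log
// offset/size metadata when present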
function reEncrypt(msg) {
return {
key: msg.key,
value: { ...msg.value, content: msg.meta.originalContent },
timestamp: msg.timestamp,
...(msg.meta.size
? {
meta: {
offset: msg.meta.offset,
size: msg.meta.size,
},
}
: null),
}
}
module.exports = {
decrypt,
reEncrypt,
}

404
lib/plugin.js Normal file

@@ -0,0 +1,404 @@
const path = require('path')
const push = require('push-stream')
const AAOL = require('async-append-only-log')
const promisify = require('promisify-4loc')
const Obz = require('obz')
const { ReadyGate } = require('./utils')
const { decrypt, reEncrypt } = require('./encryption')
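// The "db" secret-stack plugin: an in-memory database that keeps every msg in
// the `msgs` array and persists them in an append-only log on disk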
exports.name = 'db'
exports.init = function initMemDB(ssb, config) {
const hmacKey = null
const msgs = []
const feedFormats = new Map()
const encryptionFormats = new Map()
const onMsgAdded = Obz()
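// Tracks the latest msg (as a nativeMsg) per feed, used as the `previous`
// link when creating msgs and for validation when adding msgs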
const latestMsgPerFeed = {
_map: new Map(), // feedId => nativeMsg
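// During the initial log scan, only the *index* of the latest msg per feed is
// remembered; commitAllPreupdates() then converts those indexes to nativeMsgs,
// so toNativeMsg() runs once per feed instead of once per scanned msg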
preupdateFromKVT(kvtf, i) {
const feedId = kvtf.feed ?? kvtf.value.author
this._map.set(feedId, i)
},
commitAllPreupdates() {
for (const i of this._map.values()) {
if (typeof i === 'number') {
this.updateFromKVT(msgs[i])
}
}
},
updateFromKVT(kvtf) {
const feedId = kvtf.feed ?? kvtf.value.author
const feedFormat = findFeedFormatForAuthor(feedId)
if (!feedFormat) {
console.warn('No feed format installed understands ' + feedId)
return
}
const msg = reEncrypt(kvtf)
const nativeMsg = feedFormat.toNativeMsg(msg.value, 'js')
this._map.set(feedId, nativeMsg)
},
update(feedId, nativeMsg) {
this._map.set(feedId, nativeMsg)
},
get(feedId) {
return this._map.get(feedId) ?? null
},
has(feedId) {
return this._map.has(feedId)
},
getAsKV(feedId, feedFormat) {
const nativeMsg = this._map.get(feedId)
if (!nativeMsg) return null
const feedFormat2 = feedFormat ?? findFeedFormatForAuthor(feedId)
if (!feedFormat2) {
throw new Error('No feed format installed understands ' + feedId)
}
const key = feedFormat2.getMsgId(nativeMsg, 'js')
const value = feedFormat2.fromNativeMsg(nativeMsg, 'js')
return { key, value }
},
deleteKVT(kvtf) {
const feedId = kvtf.feed ?? kvtf.value.author
const nativeMsg = this._map.get(feedId)
if (!nativeMsg) return
const feedFormat = findFeedFormatForAuthor(feedId)
if (!feedFormat) {
console.warn('No feed format installed understands ' + feedId)
return
}
const msgId = feedFormat.getMsgId(nativeMsg, 'js')
if (msgId === kvtf.key) this._map.delete(feedId)
},
delete(feedId) {
this._map.delete(feedId)
},
}
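// On-disk append-only log persisting every msg as a JSON-encoded record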
const log = AAOL(path.join(config.path, 'memdb-log.bin'), {
cacheSize: 1,
blockSize: 64 * 1024,
codec: {
encode(msg) {
return Buffer.from(JSON.stringify(msg), 'utf8')
},
decode(buf) {
return JSON.parse(buf.toString('utf8'))
},
},
validateRecord(buf) {
try {
JSON.parse(buf.toString('utf8'))
return true
} catch {
return false
}
},
})
ssb.close.hook(function (fn, args) {
log.close(() => {
fn.apply(this, args)
})
})
const scannedLog = new ReadyGate()
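// Scan the whole log once at startup, decrypting each record and loading it
// into the in-memory `msgs` array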
// setTimeout to let ssb.db.* secret-stack become available
setTimeout(() => {
let i = -1
log.stream({ offsets: true, values: true, sizes: true }).pipe(
push.drain(
function drainEach({ offset, value, size }) {
i += 1
if (!value) {
// deleted record
msgs.push(null)
return
}
// TODO: for performance, don't decrypt on startup; instead decrypt on
// demand or in the background. Alternatively, store the log with
// decrypted msgs and only encrypt when moving them to the network.
const msg = decrypt(value, ssb, config)
msg.meta ??= {}
msg.meta.offset = offset
msg.meta.size = size
msg.meta.seq = i
msgs.push(msg)
latestMsgPerFeed.preupdateFromKVT(msg, i)
},
function drainEnd(err) {
// prettier-ignore
if (err) throw new Error('Failed to initially scan the log', { cause: err });
latestMsgPerFeed.commitAllPreupdates()
scannedLog.setReady()
}
)
)
})
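// Append a KVT to the disk log and push its decrypted form onto `msgs`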
function logAppend(key, value, feedId, isOOO, cb) {
const kvt = {
key,
value,
timestamp: Date.now(),
}
if (feedId !== value.author) kvt.feed = feedId
if (isOOO) kvt.ooo = isOOO
log.append(kvt, (err, newOffset) => {
if (err) return cb(new Error('logAppend failed', { cause: err }))
const offset = newOffset // latestOffset
const size = Buffer.from(JSON.stringify(kvt), 'utf8').length
const seq = msgs.length
const kvtExposed = decrypt(kvt, ssb, config)
kvt.meta = kvtExposed.meta = { offset, size, seq }
msgs.push(kvtExposed)
cb(null, kvt)
})
}
function installFeedFormat(feedFormat) {
if (!feedFormat.encodings.includes('js')) {
// prettier-ignore
throw new Error(`Failed to install feed format "${feedFormat.name}" because it must support JS encoding`)
}
feedFormats.set(feedFormat.name, feedFormat)
}
function installEncryptionFormat(encryptionFormat) {
if (encryptionFormat.setup) {
const loaded = new ReadyGate()
encryptionFormat.setup(config, (err) => {
// prettier-ignore
if (err) throw new Error(`Failed to install encryption format "${encryptionFormat.name}"`, {cause: err});
loaded.setReady()
})
encryptionFormat.onReady = loaded.onReady.bind(loaded)
}
encryptionFormats.set(encryptionFormat.name, encryptionFormat)
}
function findFeedFormatForAuthor(author) {
for (const feedFormat of feedFormats.values()) {
if (feedFormat.isAuthor(author)) return feedFormat
}
return null
}
function findFeedFormatForNativeMsg(nativeMsg) {
for (const feedFormat of feedFormats.values()) {
if (feedFormat.isNativeMsg(nativeMsg)) return feedFormat
}
return null
}
function findEncryptionFormatFor(ciphertextJS) {
if (!ciphertextJS) return null
if (typeof ciphertextJS !== 'string') return null
const suffix = ciphertextJS.split('.').pop()
const encryptionFormat = encryptionFormats.get(suffix) ?? null
return encryptionFormat
}
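// Add a msg (in native format) from some feed, validating it against that
// feed's latest known msg when available, otherwise out-of-order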
function add(nativeMsg, cb) {
const feedFormat = findFeedFormatForNativeMsg(nativeMsg)
if (!feedFormat) {
// prettier-ignore
return cb(new Error('add() failed because no installed feed format understands the native message'))
}
const feedId = feedFormat.getFeedId(nativeMsg)
const prevNativeMsg = latestMsgPerFeed.get(feedId)
if (prevNativeMsg) {
feedFormat.validate(nativeMsg, prevNativeMsg, hmacKey, validationCB)
} else {
feedFormat.validateOOO(nativeMsg, hmacKey, validationCB)
}
function validationCB(err) {
// prettier-ignore
if (err) return cb(new Error('add() failed validation for feed format ' + feedFormat.name, {cause: err}))
const msgId = feedFormat.getMsgId(nativeMsg)
const msgVal = feedFormat.fromNativeMsg(nativeMsg)
latestMsgPerFeed.update(feedId, nativeMsg)
logAppend(msgId, msgVal, feedId, false, (err, kvt) => {
if (err) return cb(new Error('add() failed in the log', { cause: err }))
onMsgAdded.set({
kvt,
nativeMsg,
feedFormat: feedFormat.name,
})
cb(null, kvt)
})
}
}
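// Create, sign and append a new msg authored by `opts.keys` (default
// `config.keys`), encrypting the content when `content.recps` is a non-empty array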
function create(opts, cb) {
const keys = opts.keys ?? config.keys
const feedFormat = feedFormats.get(opts.feedFormat)
const encryptionFormat = encryptionFormats.get(opts.encryptionFormat)
// prettier-ignore
if (!feedFormat) return cb(new Error(`create() does not support feed format "${opts.feedFormat}"`))
// prettier-ignore
if (!feedFormat.isAuthor(keys.id)) return cb(new Error(`create() failed because keys.id ${keys.id} is not a valid author for feed format "${feedFormat.name}"`))
if (!opts.content) return cb(new Error('create() requires a `content`'))
if (opts.content.recps && !encryptionFormat) {
// prettier-ignore
return cb(new Error(`create() does not support encryption format "${opts.encryptionFormat}"`))
}
// Create full opts:
let provisionalNativeMsg
try {
provisionalNativeMsg = feedFormat.newNativeMsg({
timestamp: Date.now(),
...opts,
previous: null,
keys,
})
} catch (err) {
return cb(new Error('create() failed', { cause: err }))
}
const feedId = feedFormat.getFeedId(provisionalNativeMsg)
const previous = latestMsgPerFeed.getAsKV(feedId, feedFormat)
const fullOpts = {
timestamp: Date.now(),
...opts,
previous,
keys,
hmacKey,
}
// If opts ask for encryption, encrypt and put ciphertext in opts.content
const recps = fullOpts.content.recps
if (Array.isArray(recps) && recps.length > 0) {
const plaintext = feedFormat.toPlaintextBuffer(fullOpts)
const encryptOpts = {
...fullOpts,
keys,
recps,
previous: previous ? previous.key : null,
}
let ciphertextBuf
try {
ciphertextBuf = encryptionFormat.encrypt(plaintext, encryptOpts)
} catch (err) {
// prettier-ignore
return cb(new Error('create() failed to encrypt content', {cause: err}));
}
if (!ciphertextBuf) {
// prettier-ignore
return cb(new Error('create() failed to encrypt with ' + encryptionFormat.name))
}
const ciphertextBase64 = ciphertextBuf.toString('base64')
fullOpts.content = ciphertextBase64 + '.' + encryptionFormat.name
}
// Create the native message:
let nativeMsg
try {
nativeMsg = feedFormat.newNativeMsg(fullOpts)
} catch (err) {
return cb(new Error('create() failed', { cause: err }))
}
const msgId = feedFormat.getMsgId(nativeMsg)
const msgVal = feedFormat.fromNativeMsg(nativeMsg, 'js')
latestMsgPerFeed.update(feedId, nativeMsg)
// Encode the native message and append it to the log:
logAppend(msgId, msgVal, feedId, false, (err, kvt) => {
// prettier-ignore
if (err) return cb(new Error('create() failed to append the log', { cause: err }))
onMsgAdded.set({
kvt,
nativeMsg,
feedFormat: feedFormat.name,
})
cb(null, kvt)
})
}
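// Delete a msg: null out its slot in `msgs` and delete its record on disk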
function del(msgId, cb) {
const kvt = getKVT(msgId)
latestMsgPerFeed.deleteKVT(kvt)
msgs[kvt.meta.seq] = null
log.onDrain(() => {
log.del(kvt.meta.offset, cb)
})
}
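// Pull-stream source over `msgs`; each read resumes the scan where the
// previous read stopped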
function filterAsPullStream(fn) {
let i = 0
return function source(end, cb) {
if (end) return cb(end)
if (i >= msgs.length) return cb(true)
for (; i < msgs.length; i++) {
const msg = msgs[i]
if (msg && fn(msg, i, msgs)) {
i += 1
return cb(null, msg)
}
}
return cb(true)
}
}
function* filterAsIterator(fn) {
for (let i = 0; i < msgs.length; i++) {
const msg = msgs[i]
if (msg && fn(msg, i, msgs)) yield msg
}
}
function filterAsArray(fn) {
return msgs.filter(fn)
}
function forEach(fn) {
for (let i = 0; i < msgs.length; i++) if (msgs[i]) fn(msgs[i], i, msgs)
}
function getKVT(msgKey) {
for (let i = 0; i < msgs.length; i++) {
const msg = msgs[i]
if (msg && msg.key === msgKey) return msg
}
return null
}
function get(msgKey) {
return getKVT(msgKey)?.value
}
function loaded(cb) {
if (cb === void 0) return promisify(loaded)()
scannedLog.onReady(cb)
}
return {
// public
installFeedFormat,
installEncryptionFormat,
loaded,
add,
create,
del,
onMsgAdded,
filterAsPullStream,
filterAsIterator,
filterAsArray,
forEach,
getKVT,
get,
// internal
findEncryptionFormatFor,
findFeedFormatForAuthor,
}
}

21
lib/utils.js Normal file

@@ -0,0 +1,21 @@
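// Calls back immediately if already ready, otherwise queues the callback
// until setReady() is called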
class ReadyGate {
#waiting
#ready
constructor() {
this.#waiting = new Set()
this.#ready = false
}
onReady(cb) {
if (this.#ready) cb()
else this.#waiting.add(cb)
}
setReady() {
this.#ready = true
for (const cb of this.#waiting) cb()
this.#waiting.clear()
}
}
module.exports = { ReadyGate }

53
package.json Normal file

@@ -0,0 +1,53 @@
{
"name": "@ppppp/db",
"version": "0.0.1",
"description": "Default ppppp database",
"main": "index.js",
"files": [
"*.js",
"lib/*.js",
"compat/*.js"
],
"engines": {
"node": ">=16"
},
"author": "Andre Staltz <contact@staltz.com>",
"license": "CC0-1.0",
"homepage": "https://github.com/staltz/ppppp-db",
"repository": {
"type": "git",
"url": "git@github.com:staltz/ppppp-db.git"
},
"dependencies": {
"async-append-only-log": "^4.3.10",
"obz": "^1.1.0",
"promisify-4loc": "^1.0.0",
"push-stream": "^11.2.0"
},
"devDependencies": {
"c8": "^7.11.0",
"husky": "^4.3.0",
"prettier": "^2.6.2",
"pretty-quick": "^3.1.3",
"rimraf": "^4.4.0",
"secret-stack": "^6.4.1",
"ssb-bendy-butt": "^1.0.0",
"ssb-box": "^1.0.1",
"ssb-caps": "^1.1.0",
"ssb-classic": "^1.1.0",
"ssb-keys": "^8.5.0",
"tap-arc": "^0.3.5",
"tape": "^5.6.3"
},
"scripts": {
"test": "tape test/*.js | tap-arc --bail",
"format-code": "prettier --write \"*.js\" \"(test|compat|indexes|operators)/*.js\"",
"format-code-staged": "pretty-quick --staged --pattern \"*.js\" --pattern \"(test|compat|indexes|operators)/*.js\"",
"coverage": "c8 --reporter=lcov npm run test"
},
"husky": {
"hooks": {
"pre-commit": "npm run format-code-staged"
}
}
}

112
test/add.test.js Normal file

@@ -0,0 +1,112 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const os = require('os')
const rimraf = require('rimraf')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const classic = require('ssb-classic/format')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-add')
rimraf.sync(DIR)
test('add() classic', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.use(require('ssb-box'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
const nativeMsg = classic.toNativeMsg(
{
previous: null,
author: '@FCX/tsDLpubCPKKfIrw4gc+SQkHcaD17s7GI6i/ziWY=.ed25519',
sequence: 1,
timestamp: 1514517067954,
hash: 'sha256',
content: {
type: 'post',
text: 'This is the first post!',
},
signature:
'QYOR/zU9dxE1aKBaxc3C0DJ4gRyZtlMfPLt+CGJcY73sv5abKKKxr1SqhOvnm8TY784VHE8kZHCD8RdzFl1tBA==.sig.ed25519',
},
'js'
)
const msg = await p(ssb.db.add)(nativeMsg)
t.equal(msg.value.content.text, 'This is the first post!')
await p(ssb.close)(true)
})
test('add() some classic message starting from non-first', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.use(require('ssb-box'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
const nativeMsg1 = classic.toNativeMsg({
previous: '%6jh0kDakv0EIu5v9QwDhz9Lz2jEVRTCwyh5sWWzSvSo=.sha256',
sequence: 1711,
author: '@qeVe7SSpEZxL2Q0sE2jX+TXtMuAgcS889oBZYFDc5WU=.ed25519',
timestamp: 1457240385000,
hash: 'sha256',
content: {
type: 'post',
text: 'Nulla ullamco laboris proident eu sint cillum. Est proident veniam deserunt quis enim sint reprehenderit voluptate consectetur adipisicing.',
root: '%uH8IpYmw6uV1M4uhezcHq1v0xyeJ8J8bQqR/FVm0csM=.sha256',
branch: '%SiM9aUnQSk01m0EStBHXD4HLf773OJm998IReSLO1So=.sha256',
mentions: [
{
link: '&bGFib3J1bWRvbG9yYWxpcXVhY29tbW9kb2N1bHBhcGE=.sha256',
type: 'image/jpeg',
size: 1367352,
name: 'commodo cillum',
},
{
link: '@zRr3265aLU/T1/DfB8+Rm+IPDZJnuuRgfurOztIYBi4=.ed25519',
name: 'laborum aliquip',
},
],
},
signature:
'ypQ+4ubHo/zcUakMzN4dHqd9qmx06VEADAZPjK0OXbseaEg9s0AWccKgn+WFI0XSO1y7TIphFOA6Dyn6kDzXAg==.sig.ed25519',
})
const nativeMsg2 = classic.toNativeMsg({
previous: '%l8drxQMuxpOjUb3RK9rGJl6oPKF4QPHchGvRyqL+IZ4=.sha256',
sequence: 1712,
author: '@qeVe7SSpEZxL2Q0sE2jX+TXtMuAgcS889oBZYFDc5WU=.ed25519',
timestamp: 1457253345000,
hash: 'sha256',
content: {
type: 'post',
text: 'Commodo duis eiusmod est tempor eu fugiat commodo sint excepteur non est mollit est exercitation. Sit velit eu quis aute reprehenderit id sit labore quis mollit fugiat magna. Proident eu et proident duis labore irure laboris dolor. Cupidatat aute occaecat proident ut cillum sunt ullamco laborum labore cillum eu ut excepteur laborum aliqua. Magna adipisicing in occaecat adipisicing duis mollit esse. Reprehenderit excepteur labore excepteur qui elit labore velit officia non consectetur id labore ullamco excepteur. Laborum cillum anim ex irure ex proident consequat aute ipsum quis id esse. Exercitation mollit deserunt labore ut eu ea eu consectetur ullamco ex.\nEiusmod qui in proident irure consequat enim duis elit culpa minim dolore nisi aute. Qui anim Lorem consectetur ad do dolore laborum enim aute ex velit eu dolor et incididunt. Nisi nulla aliquip anim irure proident deserunt nostrud in anim elit veniam exercitation aliquip sint. Culpa excepteur sit et eu quis reprehenderit sunt. Id velit reprehenderit nostrud incididunt dolore sint consequat officia pariatur dolore ipsum. Nisi incididunt tempor voluptate fugiat esse. Amet ut elit eu nulla adipisicing non veniam nulla ut culpa.\nDolor adipisicing anim id anim eiusmod laboris aliquip. Anim sint deserunt exercitation nostrud adipisicing amet enim adipisicing Lorem voluptate anim. Sunt pariatur cupidatat culpa dolore ullamco anim. Minim laborum excepteur commodo et aliqua duis reprehenderit exercitation.',
root: '%0AwZP5C5aFwzCV5OCxG/2D6Qx70N6ZVIoZ0ZgIu0pPw=.sha256',
branch: '%oZF1M4cKj6t2LHloUiegWD1qZ2IIvcLvOPIiVHbQudI=.sha256',
},
signature:
'uWYwWtG2zTmdfpaSTmOghW3QsNCgYNGh5d3VKOFtp2MNQopSCAxjDDER/yfj3k8Bu+NKEnAy5eJ2ylWuxeuEDQ==.sig.ed25519',
})
const msg1 = await p(ssb.db.add)(nativeMsg1)
t.equal(msg1.value.sequence, 1711)
const msg2 = await p(ssb.db.add)(nativeMsg2)
t.equal(msg2.value.sequence, 1712)
await p(ssb.close)(true)
})

57
test/create.test.js Normal file

@@ -0,0 +1,57 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const os = require('os')
const rimraf = require('rimraf')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-create')
rimraf.sync(DIR)
let ssb
test('setup', async (t) => {
ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.use(require('ssb-box'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
})
test('create() classic', async (t) => {
const msg1 = await p(ssb.db.create)({
feedFormat: 'classic',
content: { type: 'post', text: 'I am hungry' },
})
t.equal(msg1.value.content.text, 'I am hungry', 'msg1 text correct')
const msg2 = await p(ssb.db.create)({
content: { type: 'post', text: 'I am hungry 2' },
feedFormat: 'classic',
})
t.equal(msg2.value.content.text, 'I am hungry 2', 'msg2 text correct')
t.equal(msg2.value.previous, msg1.key, 'msg2 previous correct')
})
test('create() classic box', async (t) => {
const msgBoxed = await p(ssb.db.create)({
feedFormat: 'classic',
content: { type: 'post', text: 'I am chewing food', recps: [ssb.id] },
encryptionFormat: 'box',
})
t.equal(typeof msgBoxed.value.content, 'string')
t.true(msgBoxed.value.content.endsWith('.box'), '.box')
const msgVal = ssb.db.get(msgBoxed.key)
t.equals(msgVal.content.text, 'I am chewing food')
})
test('teardown', (t) => {
ssb.close(t.end)
})

88
test/del.test.js Normal file

@@ -0,0 +1,88 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const os = require('os')
const rimraf = require('rimraf')
const SecretStack = require('secret-stack')
const AAOL = require('async-append-only-log')
const push = require('push-stream')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-del')
rimraf.sync(DIR)
test('del', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
const msgIDs = []
for (let i = 0; i < 5; i++) {
const msg = await p(ssb.db.create)({
feedFormat: 'classic',
content: { type: 'post', text: 'm' + i },
})
msgIDs.push(msg.key)
}
const before = ssb.db
.filterAsArray(() => true)
.map((msg) => msg.value.content.text)
t.deepEqual(before, ['m0', 'm1', 'm2', 'm3', 'm4'], 'msgs before the delete')
await p(ssb.db.del)(msgIDs[2])
const after = ssb.db
.filterAsArray(() => true)
.map((msg) => msg?.value.content.text ?? null)
t.deepEqual(after, ['m0', 'm1', null, 'm3', 'm4'], 'msgs after the delete')
await p(ssb.close)(true)
const log = AAOL(path.join(DIR, 'memdb-log.bin'), {
cacheSize: 1,
blockSize: 64 * 1024,
codec: {
encode(msg) {
return Buffer.from(JSON.stringify(msg), 'utf8')
},
decode(buf) {
return JSON.parse(buf.toString('utf8'))
},
},
})
const persistedMsgs = await new Promise((resolve, reject) => {
let persistedMsgs = []
log.stream({ offsets: true, values: true, sizes: true }).pipe(
push.drain(
function drainEach({ offset, value, size }) {
if (!value) {
persistedMsgs.push(null)
} else {
persistedMsgs.push(value)
}
},
function drainEnd(err) {
if (err) return reject(err)
resolve(persistedMsgs)
}
)
)
})
t.deepEqual(
persistedMsgs.map((msg) => msg?.value.content.text ?? null),
['m0', 'm1', null, 'm3', 'm4'],
'msgs in disk after the delete'
)
})

45
test/filter-as-array.test.js Normal file

@@ -0,0 +1,45 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const os = require('os')
const rimraf = require('rimraf')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-filter-as-array')
rimraf.sync(DIR)
test('filterAsArray', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
for (let i = 0; i < 10; i++) {
await p(ssb.db.create)({
feedFormat: 'classic',
content:
i % 2 === 0
? { type: 'post', text: 'hello ' + i }
: { type: 'about', about: ssb.id, name: 'Mr. #' + i },
})
}
const results = ssb.db
.filterAsArray((msg) => msg.value.content.type === 'post')
.map((msg) => msg.value.content.text)
t.deepEqual(
results,
['hello 0', 'hello 2', 'hello 4', 'hello 6', 'hello 8'],
'queried posts'
)
await p(ssb.close)(true)
})

49
test/filter-as-iterator.test.js Normal file

@@ -0,0 +1,49 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const os = require('os')
const rimraf = require('rimraf')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-filter-as-iterator')
rimraf.sync(DIR)
test('filterAsIterator', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
for (let i = 0; i < 10; i++) {
await p(ssb.db.create)({
feedFormat: 'classic',
content:
i % 2 === 0
? { type: 'post', text: 'hello ' + i }
: { type: 'about', about: ssb.id, name: 'Mr. #' + i },
})
}
const iterator = ssb.db.filterAsIterator(
(msg) => msg.value.content.type === 'post'
)
const results = []
for (const msg of iterator) {
results.push(msg.value.content.text)
}
t.deepEqual(
results,
['hello 0', 'hello 2', 'hello 4', 'hello 6', 'hello 8'],
'queried posts'
)
await p(ssb.close)(true)
})

48
test/filter-as-pull-stream.test.js Normal file

@@ -0,0 +1,48 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const rimraf = require('rimraf')
const os = require('os')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const pull = require('pull-stream')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-filter-as-pull-stream')
rimraf.sync(DIR)
test('filterAsPullStream', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
for (let i = 0; i < 10; i++) {
await p(ssb.db.create)({
feedFormat: 'classic',
content:
i % 2 === 0
? { type: 'post', text: 'hello ' + i }
: { type: 'about', about: ssb.id, name: 'Mr. #' + i },
})
}
const results = await pull(
ssb.db.filterAsPullStream((msg) => msg.value.content.type === 'post'),
pull.map((msg) => msg.value.content.text),
pull.collectAsPromise()
)
t.deepEqual(
results,
['hello 0', 'hello 2', 'hello 4', 'hello 6', 'hello 8'],
'queried posts'
)
await p(ssb.close)(true)
})

48
test/for-each.test.js Normal file

@@ -0,0 +1,48 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const rimraf = require('rimraf')
const os = require('os')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-for-each')
rimraf.sync(DIR)
test('forEach', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
for (let i = 0; i < 10; i++) {
await p(ssb.db.create)({
feedFormat: 'classic',
content:
i % 2 === 0
? { type: 'post', text: 'hello ' + i }
: { type: 'about', about: ssb.id, name: 'Mr. #' + i },
})
}
const results = []
ssb.db.forEach((msg) => {
if (msg.value.content.type === 'post') {
results.push(msg.value.content.text)
}
})
t.deepEqual(
results,
['hello 0', 'hello 2', 'hello 4', 'hello 6', 'hello 8'],
'queried posts'
)
await p(ssb.close)(true)
})

45
test/on-msg-added.test.js Normal file

@@ -0,0 +1,45 @@
const test = require('tape')
const ssbKeys = require('ssb-keys')
const path = require('path')
const rimraf = require('rimraf')
const os = require('os')
const SecretStack = require('secret-stack')
const caps = require('ssb-caps')
const p = require('util').promisify
const DIR = path.join(os.tmpdir(), 'ssb-memdb-on-msg-added')
rimraf.sync(DIR)
test('onMsgAdded', async (t) => {
const ssb = SecretStack({ appKey: caps.shs })
.use(require('../'))
.use(require('ssb-classic'))
.call(null, {
keys: ssbKeys.generate('ed25519', 'alice'),
path: DIR,
})
await ssb.db.loaded()
const listened = []
const remove = ssb.db.onMsgAdded((ev) => {
listened.push(ev)
})
const msg1 = await p(ssb.db.create)({
feedFormat: 'classic',
content: { type: 'post', text: 'I am hungry' },
})
t.equal(msg1.value.content.text, 'I am hungry', 'msg1 text correct')
await p(setTimeout)(500)
t.equal(listened.length, 1)
t.deepEquals(Object.keys(listened[0]), ['kvt', 'nativeMsg', 'feedFormat'])
t.deepEquals(listened[0].kvt, msg1)
t.deepEquals(listened[0].nativeMsg, msg1.value)
t.equals(listened[0].feedFormat, 'classic')
remove()
await p(ssb.close)(true)
})