Damus
HORNETS
@HORNETS

🌐 Nostr Relay with Dashboard
👥 Choose Paid, Public or Invite-Only Mode
📂 Self-Host Your Posts and Code

Relays (7)
  • wss://relay.damus.io – read & write
  • wss://relay.primal.net – read & write
  • wss://relay.nos.social – read & write
  • wss://nostr.bitcoiner.social – read & write
  • wss://nos.lol – read & write
  • wss://purplepag.es – read & write
  • wss://temp.iris.to – read & write

Recent Notes

jb55 · 22w
just getting started. WoT table next!
Scionic Merkle Tree v2.0.0 Release 🎉

• Fixed many verification issues
• Added diff helper functions
• Added parallel DAG creation
• Added a cleaner DAG-creation API with a config covering all customization options
• Fixed partial DAG creation and verification
• Fixed transmission-packet creation and verification
• Fixed batched transmission-packet creation and verification
• Ensured batched transmission packets support partial DAGs

https://github.com/HORNET-Storage/Scionic-Merkle-Tree
Discounts usually require modifying weight parameters, so this median-fee mechanism is a first and could be useful in future soft forks, even if this BIP for spam isn't adopted.

Detecting the median fee trend of a new transaction type to penalize old transaction types can avoid the need to modify weight when issuing discounts during soft forks.

This novel approach is paired with the tightening of every possible data bucket, without breaking basic Lightning and its anchors.
reiartur · 24w
In taproot scripts, I disabled the opcodes OP_PUSHDATA2 and OP_PUSHDATA4, and limited scripts to three OP_PUSHDATA1 pushes each.
HORNETS · 24w
BIP-X: Discounting Normal Transactions without a Block Size Increase via Median Fees and Constrained Data Bucket Sizes https://github.com/HORNET-Storage/bips/blob/master/BIP-X.mediawiki https://imag...
TL;DR

We detect the median fee rate of transactions that qualify for the discount over the last two weeks of blocks.

Then we multiply that median fee rate by a little more than 3x to create an economic equilibrium where inscriptions become just as expensive as fragmenting NFTs across thousands of transactions.

Spam unilaterally becomes 3x more expensive than all basic transactions, while preserving enough outputs for Lightning anchors (which Luke has blocked). No opcodes are penalized, only bucket sizes; the rule strives to be timeless and future-proof.

“1. Collect all qualifying transactions from blocks [H-2016, H-1]
2. Calculate MEDIAN_FEE = median(qualifying_fee_rates)
3. Required fee for non-qualifying tx ≥ MEDIAN_FEE × 3.14
4. Reject blocks violating this rule”
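The quoted steps can be sketched as a validation rule. This is a minimal illustration assuming per-vbyte fee rates and hypothetical helper names, not the BIP's reference code:

```python
import statistics

DISCOUNT_MULTIPLIER = 3.14  # "a little more than 3x", per the proposal
WINDOW = 2016               # roughly two weeks of blocks: [H-2016, H-1]

def median_fee(qualifying_fee_rates):
    """Step 2: median fee rate over qualifying txs in the window."""
    return statistics.median(qualifying_fee_rates)

def tx_valid(fee_rate, qualifies, median):
    """Step 3: non-qualifying txs must pay >= MEDIAN_FEE * 3.14."""
    return qualifies or fee_rate >= median * DISCOUNT_MULTIPLIER

def block_valid(txs, median):
    """Step 4: reject blocks containing any violating tx.

    txs is a list of (fee_rate, qualifies) pairs.
    """
    return all(tx_valid(rate, q, median) for rate, q in txs)

m = median_fee([1.0, 2.0, 3.0])   # -> 2.0
print(tx_valid(5.0, False, m))    # False: 5.0 < 2.0 * 3.14
print(tx_valid(7.0, False, m))    # True:  7.0 >= 6.28
```

Qualifying (discounted) transactions are unconstrained by the rule; only non-qualifying transactions must clear the 3.14x median floor, which is what makes the penalty track prevailing fees instead of a fixed weight discount.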
epsql · 24w
Interesting proposal, what are the implications for general programmability and for coinjoins in your opinion?
JackTheMimic · 26w
"Luke Dashjr's pattern-based filtering in Bitcoin Knots creates a cat-and-mouse game that worsens mining centralization: when nodes filter inscription transactions, users may attempt to bypass the P2P full-node network entirely, establishing private relay networks directly to mining farms." Side ch...
Dee 007 · 26w
Run Knots. Don't run Core v30.
SatsAndSports · 26w
TL;DR (of just the opening sections, which I think/hope I have summarised here):
  • Add a new SegData section of the block, which can be ignored by nodes that don't understand or enforce this section
  • Data can be referenced by a 32-byte identifier: sha256(sha256(the_data))
  • op_returns with exact...
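The 32-byte identifier in that summary is a double SHA-256, the same construction Bitcoin already uses for txids. A minimal sketch of just the identifier computation (the SegData section itself is not shown):

```python
import hashlib

def segdata_id(data: bytes) -> bytes:
    """32-byte identifier: sha256(sha256(data)), as described above."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

ident = segdata_id(b"hello")
print(len(ident), ident.hex())  # prints 32 and the hex-encoded digest
```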
nomadshiba⚡ · 26w
or just store the hash, and store the data somewhere else.