Damus
Carsten Keutmann
@keutmann

Decentralized Web of Trust Reputation

https://github.com/DigitalTrustProtocol/DWoTR-Documentation/blob/main/Trust.md

Relays (6)
  • wss://fiatjaf.nostr1.com – read & write
  • wss://relay.primal.net – read & write
  • wss://relay.damus.io/ – read & write
  • wss://nos.lol/ – read & write
  • wss://nostr.wine/ – read & write
  • wss://relay.mostr.pub/ – read & write

Recent Notes

note1hsmmj...
You’re absolutely right—using a social network for filtering content is a simple and effective solution. You follow people because you like their posts, and in a way, that’s a form of indirect trust. Plus, the functionality’s already there in most cases, and I know clients like iris.to used to do it. It works for now, no doubt.

But here’s the thing: while it’s great for quickly filtering out bots and randoms, it’s not perfect. The follow system is more about content preference than actual trust. Just because you follow someone for their spicy memes doesn’t mean you’d trust them with, say, medical advice or fact-checking. And that’s where it falls short—it doesn’t allow for a reputation system or any sort of "fact-checking" style rating on posts. It’s like giving every post the same level of credibility just because you follow the person, even if it’s not all equal.

In a Web of Trust, new users (and bots) with zero reputation aren’t automatically filtered out—they're visible at first. But here’s the catch: bots will quickly be marked as untrustworthy by just a few people, and then they’ll be filtered for everyone else in the network. It’s like crowdsourced spam control—once a bot is flagged, it’s as good as invisible to the rest of us.

To add another layer of protection, relay servers could require proof of work before accepting posts from new accounts, enforce per-IP rate limits, and so on, making it harder for bots to spam endlessly.
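To make the proof-of-work idea concrete, here is a minimal hashcash-style sketch of what a relay could require: the poster searches for a nonce that gives the hash of their event a certain number of leading zero bits, and the relay verifies it in one hash. (Nostr does have a proof-of-work NIP in this spirit, NIP-13; the code below is a generic illustration, not that exact scheme, and the names are my own.)

```python
import hashlib

def count_leading_zero_bits(digest: bytes) -> int:
    """Number of leading zero bits in a byte string."""
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
            continue
        bits += 8 - byte.bit_length()  # zero bits at the top of this byte
        break
    return bits

def mine(event_id: str, difficulty: int) -> int:
    """Find a nonce so sha256(event_id:nonce) has `difficulty` leading zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{event_id}:{nonce}".encode()).digest()
        if count_leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1

# The poster pays ~2^12 hashes up front; the relay checks with a single hash.
nonce = mine("example-event", 12)
digest = hashlib.sha256(f"example-event:{nonce}".encode()).digest()
print(count_leading_zero_bits(digest) >= 12)
```

Raising the difficulty makes bulk spamming from fresh accounts expensive while costing a legitimate poster only a fraction of a second.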

As for bots that trust each other—well, that’s not really a problem. In a Web of Trust, it’s not about the size of the network; it’s about who you trust. So, if bots are busy trusting each other, it doesn’t affect your network unless someone in your network starts trusting them. And since no one in your network is likely to trust a bot, those fake trust loops don’t impact you at all.
note1p9yny...
Using the social graph as a Web of Trust isn’t ideal because it mixes two different things. The social graph is about who you follow and who follows you—connections, not trust. Just because you follow someone doesn’t mean you trust them with important info, and vice versa.

In my opinion, keeping trust separate from the social graph makes more sense. It lets you build a trust network based on actual trustworthiness, not just social ties. This way, new users aren’t automatically flagged as spammers but can earn trust through their actions, not just who they’re connected to.

Separating these systems ensures spam control while giving newcomers a fair shot to prove themselves.
note12lym7...
Not at all! It’s not about being "hidden forever" just because a few people don’t like your posts. In a Web of Trust, trust is subjective. So, if a few people distrust you, it only affects their network and the people who trust them. Your posts would still be visible to others who trust you or those who haven’t built trust with the people who flagged you. It's a decentralized system, so no single opinion or small group can wipe you out across the entire network—your visibility depends on the trust relationships you build.
note1a2ngk...
In a network of trust with, say, 100,000 people, it only takes one or a few people to spot the bot and decide to distrust it. Once that happens, the rest of the network gets the signal that the account isn’t trustworthy, and most people won’t even see the spam. Instead of everyone having to individually check if the content is valid, the Web of Trust allows for the community to quickly filter out the bad actors. It’s like crowdsourced spam protection—one or two people deal with it, and the rest benefit from it.

#wot #weboftrust #reputation #trust
note1jmw99...
Here's the thing: in a Web of Trust, your trust choices don’t just affect you—they ripple out to the people who trust you too. So, if you just go with the popular "trusted users," you might not be doing yourself, or your network, any favors. It’s kind of like everyone following the same food critic. Sure, they might know the trendy spots, but that doesn’t mean you’ll like the same thing. Plus, the people who trust your judgment might not appreciate those mainstream picks either. It’s better to trust based on your own experiences. Build a network that reflects what really matters to you—it’ll be more genuine, and your peers will trust you more for it!

In the decentralized space, trust and reputation are the new currency for services, time, and attention.

#wot #weboftrust #reputation
How to score trust and reputation in a Web of Trust system

Trust and reputation play a big role in both the real world and online. Trust is something that grows over time through personal experiences. It can be good or bad, depending on how someone has acted in the past. If someone consistently shows they can be reliable and honest, we naturally start trusting them more. That trust becomes a kind of safety net, allowing us to engage with others without constantly worrying about getting hurt or deceived.

But trust doesn’t just appear—it’s built slowly, through repeated interactions and consistent behavior. And, of course, trust is completely subjective. It’s shaped by how we each see and experience the world, which means one person’s trust in someone might not match another’s.

Reputation, on the other hand, is more about how others collectively see you. It’s the sum of all those individual trust levels, but still seen through everyone’s personal perspectives. In a way, reputation feels like a community reflection of trust. Even though it might look like an objective score, it’s really built on a whole range of subjective experiences. That’s what makes the dynamic between trust and reputation so interesting—it's constantly evolving, just like our interactions with people every day.

Figuring out how to score someone with trust in a digital system is like trying to rate your friend’s cooking. Do you go with 5 stars, 10 stars, or maybe something super detailed like 0-100%? The reality is, trust is a tricky thing to measure. We all have our own way of deciding who we trust—some of it’s personal, some of it’s cultural, and none of it fits neatly into a simple rating system. What makes sense to one person might be completely different for someone else.

So, instead of trying to create a complicated scale, the best solution is to simplify: trust, neutral, or distrust. It’s like a thumbs-up or thumbs-down—you either trust someone or you don’t. No need to overthink it. This binary approach is easy to implement digitally, and it keeps things straightforward. Plus, it’s easier for the algorithms to handle. No one wants an algorithm having an existential crisis over whether 4 stars means "pretty good" or "just okay."
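As a sketch of how this binary scheme might be represented, the three states and the running tally described below fit in a few lines. (The names here are illustrative, not from any Nostr or DWoTR spec.)

```python
from enum import IntEnum

class Trust(IntEnum):
    """Deliberately coarse trust values: no stars, no percentages."""
    DISTRUST = -1
    NEUTRAL = 0   # the default: no claim made either way
    TRUST = 1

# A claim is simply (issuer, subject) -> value; a score is just a sum.
claims = {
    ("alice", "dave"): Trust.TRUST,
    ("bob", "dave"): Trust.TRUST,
    ("carol", "dave"): Trust.DISTRUST,
}
score = sum(v for (_, subject), v in claims.items() if subject == "dave")
print(score)  # 1
```

Because the values are just -1, 0, and +1, aggregation never has to interpret what "4 stars" means, which is exactly the point of the binary approach.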

Once you’ve got this simple trust/distrust system in place, you can start adding more details if needed. For example, you can confirm certain facts or give 5-star ratings for products, if that’s relevant. But at its heart, the binary trust system keeps things easy to understand and manage, for both people and computers.

Reputation gets interesting because it’s all about perspective. In a decentralized system, there’s no universal score—it’s all subjective, calculated by each observer based on their view. The Web of Trust comes in by aggregating these individual perspectives, creating a broader sense of someone’s reputation.

The process is pretty simple: every trust statement about the subject adds a +1 to their score, and every distrust statement counts as a -1. It's really just a running tally of trust vs. distrust across the network.

When the system calculates reputation, it only counts opinions from the nearest degree of connections at which an opinion about the subject, trust or distrust, first appears. The idea is that opinions from closer peers matter more, while those further out aren't as relevant and don't get considered in the calculation.

This keeps the reputation system streamlined and ensures it reflects trust from those who are most relevant to you. After that, it’s up to you to decide how to interpret the score. For example, if the subject has 5 trust points and 2 distrust points, you might view them as generally trustworthy but still be cautious because some people in the network have expressed doubts. The system gives you the information, but how you weigh those points and act on them is entirely your call.
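The nearest-degree rule above can be sketched as a breadth-first walk of the observer's trust graph that stops at the first degree where any opinion about the subject exists. (Function and field names here are my own invention, a sketch rather than the DWoTR implementation.)

```python
def reputation(observer, subject, trust_edges, claims):
    """Return (trust_count, distrust_count) about `subject`, counting only
    opinions from the nearest degree of `observer`'s network that has any.

    trust_edges: dict issuer -> set of peers that issuer trusts (the graph)
    claims: dict issuer -> {subject: +1 or -1} (explicit trust statements)
    """
    seen = {observer}
    frontier = [observer]
    while frontier:
        # Collect opinions at the current degree.
        votes = [claims.get(peer, {}).get(subject) for peer in frontier]
        votes = [v for v in votes if v is not None]
        if votes:
            # First degree with any opinion: farther degrees are ignored.
            return (votes.count(1), votes.count(-1))
        # No opinions here; expand one degree outward through trusted peers.
        next_frontier = []
        for peer in frontier:
            for trusted in trust_edges.get(peer, ()):
                if trusted not in seen:
                    seen.add(trusted)
                    next_frontier.append(trusted)
        frontier = next_frontier
    return (0, 0)

edges = {"me": {"a", "b"}, "a": {"c"}}
claims = {"a": {"spammer": -1}, "b": {"spammer": -1}, "c": {"spammer": 1}}
print(reputation("me", "spammer", edges, claims))  # (0, 2): c is a degree further out, so ignored
```

Note how c's +1 never reaches the observer: a and b sit one degree closer and already have an opinion, which is the "closer peers matter more" rule in action.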

This simple scoring approach enables algorithms that automated systems can follow, leveraging human trust to make decisions. By doing so, systems can guard against misuse and spam while still respecting individual preferences. The prospects for the Web of Trust are immense: an untapped industry where information filtering no longer depends on centralized platforms but instead on earning the trust of individuals. In this decentralized world, trust becomes the key currency, shaping how we engage with and filter the overwhelming flow of information around us.

#weboftrust #wot #reputation #dwotr
note158jld...
I have extended the iris.to Nostr client into a clone of the site with a reputation system on it (proof of concept). It works, but it is still fairly simple: https://dpeep.com.
You can trust people and posts, and the trust network extends 3 degrees deep. You need to create multiple accounts that trust each other to see the effects of the web of trust network.
I'll write some posts about how the general system works later.