Damus
Danie
@Danie

Testing out new wallet

Relays (20)
  • wss://nostr.oxtr.dev/ – read & write
  • wss://relay.nostr.band/ – read & write
  • wss://nostr.mom/ – read & write
  • wss://relay.momostr.pink/ – read & write
  • wss://relay.mostr.pub/ – read & write
  • wss://nostr-pub.wellorder.net/ – read & write
  • wss://relay.nostr.com.au/ – read & write
  • wss://nostr-verified.wellorder.net/ – read & write
  • wss://relay.nostr.bg/ – read & write
  • wss://relay.primal.net/ – read
  • wss://relay.damus.io/ – read
  • wss://nos.lol/ – read
  • wss://nostr.wine/ – read
  • wss://relay.snort.social/ – read
  • wss://puravida.nostr.land/ – read
  • wss://eden.nostr.land/ – read
  • wss://atlas.nostr.land/ – read
  • wss://relay.noswhere.com/ – read
  • wss://nostr.inosta.cc/ – read
  • wss://relay.orangepill.dev/ – read

Recent Notes

Danie
Microsoft's Copilot spills the beans, summarising emails it's not supposed to read

“The bot couldn't keep its prying eyes away. Microsoft 365 Copilot Chat has been summarising emails labelled 'confidential' even when data loss prevention policies were configured to prevent it.”

Given that this is Microsoft, it is difficult to tell whether this is deviousness or plain sloppiness. One would have thought that with a company the size of Microsoft, neither should be the case. But it was only a year ago that we had the Windows screen-recording debacle, where the feature had to be withdrawn and re-released.

It is probably just never a good idea to let any AI have raw access to your documents or e-mails. It is better to create a separate area with documents you have vetted, or to manually upload what you want to have processed.

AI is proving to be an incredible way of worming into private document and information repositories where it can just vacuum everything up. As an end user, you have zero control once you've opened that door.

I can see governments funding AI one day, as it is a tremendous way to spy on citizens or opponents. As much as the USA warns us about China and its AI, I'm just wondering how much of the same behaviour is being perpetrated by the USA itself. Users basically invite AI into their private areas and then let it scoop everything up. If the country that owns the AI company has warrantless access to that data, just think of the possibilities.

We've also heard about AI reorganising user data, and then apologising for deleting it. AI is not infallible, and with zero contextual knowledge, it is also not actually intelligent.

You want to keep AI at arm's length, and use it with caution, just like any other tool.

See https://www.theregister.com/2026/02/18/microsoft_copilot_data_loss_prevention

#technology #AI #Microsoft #privacy
Danie
This is Probably the Best Video Downloader App (And it is Free and Open Source)

“You come across an interesting video on social media and thought of downloading it so that you can send it to someone or modify it to share it on some other platform. You know, the meme videos? Not every platform allows downloading videos, and thus you need a good, reliable video downloader.”

I do like that its options show you the available resolutions along with FPS, storage size, etc. As good as one of the paid options floating around that was quite popular.

See https://itsfoss.com/vidbee and their project at https://github.com/nexmoe/VidBee

#technology #opensource #downloader

Danie
LastSignal Is a New Open-Source Dead Man’s Switch You Can Self-Host

“A new open-source tool called LastSignal has emerged to let users run their own encrypted dead man’s switch on a self-hosted server. The software is designed for scenarios where messages should only be delivered if the sender becomes unreachable and stops responding to scheduled check-ins.”

In the past these services were usually cloud-hosted, and after a few years the service would shut down (hopefully you were aware when it did). The other issue was that you might have to leave sensitive information that you want to pass on, and today we are more and more aware that you have to be careful about leaving such information in someone else's cloud.

So this does solve two of those issues: You are hosting it, so you can keep it available, and you have full control over the information.

Quite a bit of thought seems to have gone into it too. The project is only about three weeks old, so it is still pretty new.
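
For a sense of how such a switch works under the hood, here is a minimal sketch of the core check-in logic in Python (illustrative only, with assumed intervals and function names; it is not LastSignal's actual implementation):

```python
# Minimal sketch of a dead man's switch's core logic -- illustrative only,
# not LastSignal's actual implementation. Intervals and names are assumed.
# The stored messages are presumed already encrypted at rest.
import time

CHECK_IN_INTERVAL = 7 * 24 * 3600    # owner must check in every 7 days
GRACE_PERIOD = 2 * 24 * 3600         # allow 2 extra days before firing

last_check_in = time.time()          # updated whenever the owner responds

def record_check_in() -> None:
    """Call this when the owner responds to a scheduled check-in."""
    global last_check_in
    last_check_in = time.time()

def should_release() -> bool:
    """True once a check-in plus the grace period has been missed."""
    return time.time() > last_check_in + CHECK_IN_INTERVAL + GRACE_PERIOD

def tick(deliver) -> None:
    """Run periodically (e.g. from cron). `deliver` is a callback that
    sends the stored, encrypted messages to their recipients."""
    if should_release():
        deliver()
```

All the real work is in the encryption and delivery plumbing; the switch itself is just a deadline check.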

See https://linuxiac.com/lastsignal-is-a-new-open-source-dead-mans-switch-you-can-self-host and their open source project at https://github.com/giovantenne/lastsignal

#technology #opensource #selfhosting #deadmanswitch
Danie
Science says your Kindle might be better for sleep than a paperback

The funny thing is, I've always “known” this. For many years I've been reading my Kindle every night before I go to sleep. OK, that is anecdotal (so not really evidence), but a Kindle has no bright backlight like an iPad or similar tablets have. It is not heavy like many 600-page books either, you do not have to turn physical pages, and the font can be made bigger too. You can also read it without disturbing a sleeping partner.

Katherine Sharkey, doctor and sleep researcher at Wake Forest School of Medicine, has stated “the actions required for using an e-reader are less than that of a traditional book, and that lighter load on the brain makes it easier to wind down for sleep”.

I get that some like the feel of paper, but that has nothing to do with going to sleep. The act of reading does help induce sleep, no matter what the medium is.

Thinking back over the years, I still have four large bookcases, stacked double deep, with books I used to read before I got my first Kindle. If I'd not got a Kindle, I have no idea how many more bookcases I'd need by now.

The only downside of a Kindle is that if you “buy” your books from Amazon (and especially from them), you don't really own the e-book. It can disappear if Amazon so chooses. But I do back up all my e-books, and when possible, I source them elsewhere.

See https://www.androidpolice.com/e-readers-better-for-sleep-than-traditional-books-researcher-claims

#technology #reading #sleep
Danie
These video doorbells don’t rely on the cloud or subscriptions

“Picking a video doorbell that doesn’t rely on the cloud means you can save footage locally and not rely on pricey subscriptions. Pair it with a smart home platform like Home Assistant, and your doorbell will keep working when the internet doesn’t. Here are some ideas for video doorbells that work offline, even if they do have optional cloud subscriptions.”

Obviously, when I bought my last Ring doorbell, there were no Reolink doorbells available yet. I already have a Reolink NVR with PoE cameras running, so in future this will be a no-brainer choice for me. I especially like the PoE option: not only is it a nice clean signal, but there is no messing with the odd 19 V AC power that a wired Ring doorbell wants.

See https://www.howtogeek.com/these-video-doorbells-dont-rely-on-the-cloud-or-subscriptions

#technology #security #doorbells #privacy
note1fvnz3...
Danie
Yes, that is a good idea. I should see if I can get it to work. I have a good 50 containers, some with ports on isolated networks and some on the host network. That is where it gets messy, when the same ports are used across different networks. Good practice is to try not to expose external ports at all and to use container names with the internal port number instead. That keeps things a bit simpler too, but there will always be external ports on the front-facing stuff.
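
To illustrate that practice: inside a shared Docker network, one container can reach another by its container name and internal port, so nothing needs to be published on the host. The names below are hypothetical:

```python
# Hypothetical illustration: reaching a sibling container by name over a
# shared Docker network, using its internal port. No host ports published.
import urllib.request

# "api" is an assumed container name on the same Docker network, and 8080
# is its internal port; neither appears in any "ports:" mapping on the host.
with urllib.request.urlopen("http://api:8080/health") as resp:
    print(resp.status, resp.read().decode())
```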
Danie
This Docker Compose visual builder is the tool I wish I had as a beginner

“If you’ve ever attempted to run too many containers on the same machine, you may have encountered failed deployments because multiple services tried accessing the same port. Fortunately, DCM sidesteps that issue by checking for port conflicts in your multi-container Compose files. Going back to our network stack example, Nginx and Pi-hole both use ports 80 and 443 on the host. If you try to deploy these services with the same Compose files, DCM detects the port conflicts and assigns different numbers to the second container.”

With 40 to 50 containers running on one of my servers, I constantly run into that port-clashing issue (when ports are exposed to the host machine). This tool looks quite interesting, as does the way it handles volume mappings.

It also makes me realise how Portainer really needs to take a few steps forward with regard to better stack editing.

I'm going to be looking a bit deeper into this app. It looks like it will work in tandem with Portainer too. Basically DCM will help you create a clean and functional docker-compose file to paste into Portainer.
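
As a rough illustration of the kind of check DCM performs, detecting host-port clashes in a Compose file takes only a few lines. This is my own sketch (handling the short "host:container" port syntax only), not DCM's code:

```python
# My own sketch of the kind of check DCM performs -- not DCM's actual code.
# Detects services that claim the same host port in a Compose file.
# Requires PyYAML (pip install pyyaml).
import sys
from collections import defaultdict

import yaml

def host_ports(service: dict):
    """Yield host-side ports from a service's 'ports' list."""
    for entry in service.get("ports", []):
        parts = str(entry).split(":")   # "8080:80" or "127.0.0.1:8080:80"
        if len(parts) >= 2:
            yield parts[-2]             # host port is second from the right

def find_conflicts(compose_path: str) -> dict:
    """Map each clashing host port to the services that claim it."""
    with open(compose_path) as f:
        compose = yaml.safe_load(f)
    claims = defaultdict(list)
    for name, service in compose.get("services", {}).items():
        for port in host_ports(service):
            claims[port].append(name)
    return {port: svcs for port, svcs in claims.items() if len(svcs) > 1}

if __name__ == "__main__":
    for port, services in find_conflicts(sys.argv[1]).items():
        print(f"Host port {port} claimed by: {', '.join(services)}")
```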

See https://www.xda-developers.com/this-docker-compose-visual-builder-is-the-tool-i-wish-i-had-as-a-beginner or the project site at https://github.com/ajnart/dcm

#technology #docker #selfhosting
Danie
Rallly is an open source self-hosted tool to make organising events and meetings easier

If you've ever had to try to find a time for an event or meeting that suits a family, colleagues, or another large group of people, you'll know what a pain that can be. There is often a lot of back and forth, and the conversations can happen in different places too.

Rallly allows you to set up some time slot options, and provides an invite link which you can share via e-mail or other messenger chat groups. Everyone votes for their choices on the browser page that is presented, and can comment there too for others to read. Nothing needs to be installed by them, nor do they need to register to use it.

If you don't want to self-host the service, there is also a free tier on their cloud hosted service which you can use for the ad-hoc events or meetings you may want to schedule.

I like that it is not only pretty seamless, but also not locked into any vendor, operating system, or even messenger platform.

My video highlights what I like about it, how to set up a poll and send it out, and what my Docker configuration looks like. I also mention how you can add extra members on the self-hosted instance, but why you may not need to do so.

Watch https://www.youtube.com/watch?v=HYsJFPS9G-A

#technology #opensource #selfhosting #scheduling
Danie
Chatbots Make Terrible Doctors, New Study Finds

“Chatbots may be able to pass medical exams, but that doesn’t mean they make good doctors, according to a new, large-scale study of how people get medical advice from large language models. The controlled study of 1,298 UK-based participants, published today in Nature Medicine from the Oxford Internet Institute and the Nuffield Department of Primary Care Health Sciences at the University of Oxford, tested whether LLMs could help people identify underlying conditions and suggest useful courses of action, like going to the hospital or seeking treatment.”

AI lacks any form of real context for situations. The more I've used chatbots for technical issues, the more I've seen how limited they really are. They will doggedly go down deep holes without ever stepping back to look at the bigger picture. They will grasp at all sorts of solutions for an application from a GitHub project, without even realising that the newest logged issue is in fact a bug waiting to be fixed. I could go on and on, and I've lost count of the apologies I've extracted from AI for its real stupidity.

Used as an assistant to a thinking human being, it is fine, but never follow AI advice blindly. It is great for answering questions, but the more you use it, the more you see its limitations surface. It can save time, but it can also waste a lot of time, and even do damage. Most recently, a chatbot gave me a command to remove the history of a file I had updated on GitHub, but the command also deleted the file itself (I got what I'd asked for, except I did not ask for the file to be deleted, and the chatbot made no mention of that possibility).

So just as a lathe would be a tremendous tool in a skilled technician's hands, giving me a lathe would likely do more harm than good.

See https://www.404media.co/chatbots-health-medical-advice-study

#technology #medical #healthcare #AI
Danie
Matrix secure chat is quietly becoming the chat layer for governments chasing digital sovereignty

"Matrix is currently talking to circa 35 countries about FOSS communications infrastructure. For instance, Hodgson told us the United Nations is on board: it's using Matrix as the basis of its own in-house air-gapped communications tool, which helps it to remain independent of any country or hosting provider. It's also being adopted at the International Criminal Court (ICC), which as The Register reported in October 2025 is busily ditching Microsoft Office. After the Trump administration imposed sanctions on ICC chief prosecutor Karim Khan, he reported losing access to his email and banking, disrupting the court's work.”

Yes, any government can host its own Matrix instance and have full control over it. Matrix is actually implemented inside other messaging tools too, so many people are using it without even knowing it.

See https://www.theregister.com/2026/02/09/matrix_element_secure_chat

#technology #privacy #digitalsovereignty #chat #opensource
Nate · 1w
It might be a bit rough around the edges, but I'm hoping Matrix wins out on becoming the chat standard.
Danie
Netdata and some free AI searches saved me a ton of resource usage on my desktop and server

I recently installed the free Netdata in a container on my homelab server to see what it would show in terms of resource usage, bottlenecks, etc. When you start out with Netdata, you get a 2-week trial of the business subscription, which includes 10 free AI credits (for analysis reporting).

I let it run for 24 hours and then asked Netdata's AI to give me any key findings and recommendations. The key things it showed me pretty quickly already blew me away. It's the old story: things are “working” but are far from optimal. Basically, the issues I had were too many tasks firing off concurrently (or before others had finished), causing major bottlenecks (disk backlogs) on my different drives. I have three different backups, S.M.A.R.T. drive tasks, Timeshift, RAID array scrubs, and more all running.

I took the report from Netdata and fed it as-is into Google Gemini (Perplexity has been leading me down very long rabbit holes the last few months) and asked what to do next. To cut a long story short, Gemini took me through various tests and recommendations around spacing all the tasks out far better, advising which should run daily, weekly, or monthly. It also suggested tweaking settings for the drives as well as for the rsync jobs. For example, when exporting to an external USB drive, it showed how to slow the rsync transfer down so that neither the drive nor the server CPU was choking. It also gave a nice summary table of how all the tasks were now spaced out over days and weeks.
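
To give an idea of what that looks like in practice, an rsync job can be throttled and de-prioritised along these lines (the flags and paths here are my assumptions, not the exact commands from the report):

```python
# Hypothetical example of the kind of throttled rsync job that was suggested
# (my flag and path choices, not the exact commands from the report).
import subprocess

subprocess.run(
    [
        "ionice", "-c3",        # idle I/O class: yield the disk to other jobs
        "nice", "-n19",         # lowest CPU priority
        "rsync", "-a", "--delete",
        "--bwlimit=20000",      # cap at ~20 MB/s so the USB drive keeps up
        "/srv/backups/",        # assumed source path
        "/mnt/usb-backup/",     # assumed destination path
    ],
    check=True,
)
```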

I then decided to install Netdata on my desktop PC, and am glad I did. It boots quicker, terminal screens open instantly (especially the Atuin history), and so on. Again, the issue identified by Netdata was massive disk backlogs. It turns out my main /home data disk is 5.8 years old and has a 161 ms response time, where it should be 10x quicker. I need to replace this drive soon, but the optimisations suggested by Gemini have now eased a lot of the strain I was putting on it.

My Manjaro desktop configuration is a good 8 or 9 years old, with tons of crud. I used to use VirtualBox for VMs but switched to KVM a while back, yet the old VirtualBox vboxnet0 network and kernel hooks were still in my system.

I have a beautiful Conky window on my desktop, but I did not realise the amount of resources it was using through massive inefficiencies: firing off sudo smartctl every 3 seconds to check drive temperatures (polling the drive controller 28,800 times a day), if/then statements that each fired off the same query three times, outdated network calls, and so on. Gemini helped optimise that dramatically by collapsing the queries, using memory caching instead, and reducing many checks to intervals of 30 seconds or longer where the data does not change quickly.

There were also rsync jobs that were made less intense, so that CPU usage was smoothed out more. Some old snapd stuff that was loading into memory, even though I no longer use it, was identified and cleared out as well. And I was using Samba shares with a Windows VM running in KVM; Gemini advised ditching the Samba shares in favour of the faster Virtio-FS folder sharing, along with the VirtIO network mode in KVM.
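
The fix for the Conky polling boils down to caching: query the drive once, then serve the cached answer until a TTL expires. A minimal sketch (assumed names and intervals, not my exact config):

```python
# Minimal sketch of the caching fix (assumed names and intervals, not my
# exact config). Instead of running `sudo smartctl` every 3 seconds per
# query, run it at most once per TTL and serve the cached output, since
# drive temperatures change slowly.
import subprocess
import time

CACHE_TTL = 30.0  # seconds
_cache: dict[str, tuple[float, str]] = {}

def drive_temp_output(device: str) -> str:
    """Return smartctl attribute output, refreshing at most once per TTL."""
    now = time.time()
    cached = _cache.get(device)
    if cached and now - cached[0] < CACHE_TTL:
        return cached[1]              # served from cache: no smartctl run
    out = subprocess.run(
        ["sudo", "smartctl", "-A", device],
        capture_output=True, text=True, check=False,
    ).stdout
    _cache[device] = (now, out)
    return out
```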

As Gemini pointed out initially, some events were coming together on my homelab server to create a perfect storm. My desktop PC is now booting up again in seconds, network acquisition is quicker, and with less intensive polling, my browsers are also more responsive.

I'm actually scaling back on my Grafana, Prometheus, Telegraf, InfluxDB stack on my server too. Netdata collects tons of data every second, and since it is already running, I'd rather try to optimise around that, as the information I get is a lot more useful. Netdata requires basically no configuration, unlike Grafana, InfluxDB, Telegraf, and Prometheus, which must all work together. There are some things that Grafana must still do, like pulling my Home Assistant stats through into graphs. The free Netdata tier only gives you 5 nodes in their cloud service, but you can view more locally if you host it yourself. Obviously, after the trial period, I will also lose the AI credits.

Netdata is open source on the client agent (data collection) side, and source-available for the client dashboard. The cloud side, and the AI, is proprietary. I'll see how it goes on the free tier after the 2 weeks, and what sort of reporting I can still export. But the benefit so far has made a dramatic difference, and will likely also ensure my hard drives have a longer, healthier life.

Netdata running in a Docker container on my homelab server is consuming 2.1% CPU and 327 MB of RAM. Disk usage is now at 1.3 GB, so I'll need to keep an eye on that. There are retention sizes that can be set for each tier of data being stored (per-second, per-minute, and per-hour tiers).

A tip on installing for Arch-based systems: Netdata's install script had all sorts of network permission issues on my PC. In the end I just did a plain pacman/AUR package install and everything worked.

See https://github.com/netdata/netdata

#technology #optimisation #dashboards