Self-hosted nutstash.app wallet ftw 🤙🏻
Michael Henke
michaelhenke@codingmerc.com
npub1arnm...vlnd
₿itcoin Enthusiast
Please DM me a few nuts (from a Cashu mint that is not my own). I would like to test whether I can redeem them and/or convert them to sats on my own mint. Looking at you #[0] or #[1] 🙏 please and thank you in advance.
#plebchain is not working for me. I got more relays than followers 🤙🏻


🌊 …wave 6 was quelled, and so clients got respite from today's turmoil and the continued waves of #zappening


I will be living on my #[0] home feed for the next while, as the #hellthread seems to have made things unbearably slow or broken them.


Do you know the tingling sensation that makes the hair stand up at the back of your neck when #[0] gets #[1] ready for the next wave?


Today is the day #nostr blew up my purple bird app 🤙🏻


PV keep #nostr purple 🤙🏻


#[0] should be the #lightning conductor and @zappr should be the thunderclap.
#[0] info
@zappr info
#[0] What exactly happens in #Damus when I long-press a post and select Broadcast?
Does it push the same note to all connected relays?
#pv #nostr #nostrica #nostrich #nostrgram
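For reference, "broadcast" in most nostr clients appears to mean exactly that: re-publishing the same already-signed event, unchanged, to each connected relay using the NIP-01 wire format. A minimal sketch of that behavior, with example relay URLs and a pre-signed `event` dict assumed:

```python
# Sketch: re-publishing (broadcasting) one signed nostr event to several relays.
import asyncio
import json

import websockets  # pip install websockets

RELAYS = ["wss://relay.damus.io", "wss://nos.lol"]  # example relays

async def broadcast(event: dict) -> None:
    """Send the same signed event to every relay in the list."""
    for url in RELAYS:
        try:
            async with websockets.connect(url) as ws:
                # NIP-01 wire format: the client publishes ["EVENT", <event JSON>]
                await ws.send(json.dumps(["EVENT", event]))
                # Most relays answer with ["OK", <event id>, <accepted>, <message>]
                print(url, await ws.recv())
        except Exception as exc:
            print(url, "failed:", exc)

# Usage (the event must already be signed by the client):
# asyncio.run(broadcast(signed_event))
```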


In the context of machine learning, a Stable Diffusion model is a deep neural network, specifically a latent diffusion model, that has been trained and optimized to generate images from text prompts.
Pruning is a technique for reducing the size of a neural network by removing unimportant connections or neurons while maintaining, or even improving, its performance. Applied to a Stable Diffusion model, it reduces computational complexity and memory footprint, making the model more efficient to run on devices with limited resources, such as mobile phones or embedded systems.
When a Stable Diffusion model is pruned, connections or neurons identified as unimportant are removed, and the remaining ones are retrained to fine-tune the model's performance and compensate for the removed elements. The resulting pruned model has fewer parameters and a lower computational cost while maintaining accuracy close to that of the original.
Overall, pruning is a useful technique for optimizing a Stable Diffusion model for deployment in resource-constrained environments without sacrificing much accuracy or performance.
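To make this concrete, here is a minimal sketch of unstructured magnitude pruning using PyTorch's torch.nn.utils.prune utilities. The toy convolutional model and the 30% pruning ratio are illustrative stand-ins, since no specific framework or model is named above:

```python
# Sketch: L1 (magnitude) pruning of conv layers with PyTorch's prune utilities.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small stand-in model; a real diffusion U-Net would be pruned the same way.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 3, kernel_size=3, padding=1),
)

# Zero out the 30% of weights with the smallest absolute value in each conv layer.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.l1_unstructured(module, name="weight", amount=0.3)

# (Fine-tune / retrain here to recover accuracy.)

# Make the pruning permanent by removing the re-parametrization hooks.
for module in model.modules():
    if isinstance(module, nn.Conv2d):
        prune.remove(module, "weight")

# Report the resulting global sparsity.
total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"global sparsity: {zeros / total:.1%}")
```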


PV #nostr 🤙🏻


#[0] info
I just noticed #[0]. It made me reminisce about ₿itcoin faucets from way back in the day…
#[0] should #[1] work with LNURL hosted on Tor? Apparently my lightning address was no good, as I just got notified of a failed zap.
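One plausible explanation for the failed zap: the zapping client resolves the lightning address (LUD-16) over clearnet HTTPS, and a Tor-hosted (.onion) endpoint is unreachable without a SOCKS proxy. A minimal sketch with a hypothetical address, assuming a local Tor daemon:

```python
# Sketch: how a zapping client resolves a lightning address (LUD-16) to an
# LNURL-pay endpoint. The address and the Tor proxy setup are hypothetical.
import requests  # pip install requests[socks] for the SOCKS proxy support

name, domain = "michael@example.onion".split("@")  # placeholder address
url = f"http://{domain}/.well-known/lnurlp/{name}"  # LUD-16 well-known path

# A clearnet-only client fails at this step: .onion hosts are unreachable
# without routing through Tor, which would explain the failed zap.
proxies = {
    "http": "socks5h://127.0.0.1:9050",   # local Tor SOCKS proxy assumed
    "https": "socks5h://127.0.0.1:9050",
}
resp = requests.get(url, proxies=proxies, timeout=30)
# Expect {"callback": ..., "minSendable": ..., "maxSendable": ..., "tag": "payRequest"}
print(resp.json())
```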
Zip Zap
Zipped Zapped
PV #nostr 
