• 0 Posts
  • 4 Comments
Joined 4 months ago
Cake day: January 13th, 2025

  • Signal isn’t that kind of app. It protects your data in flight, but offers only minimal protection once the recipient has the message. Protecting data at the endpoint is a whole other game: if you can’t trust your recipients to protect data, then you shouldn’t send them data that needs protection. Doing that would require control over every layer of the receiving device: hardware, operating system, file system, and software. Anything less leaves openings for data at rest at the destination to be compromised by untrustworthy recipients.
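
    Very roughly, the distinction looks like the toy sketch below. It uses symmetric Fernet encryption purely as a stand-in, not Signal’s actual Double Ratchet protocol, and the names are made up; the point is only that transport encryption stops mattering the moment the recipient decrypts.

    ```python
    # Toy sketch: transport encryption vs. data at rest on the recipient's device.
    # Fernet stands in for a negotiated session key; Signal's real protocol differs.
    from cryptography.fernet import Fernet

    session_key = Fernet.generate_key()   # shared secret agreed for this conversation
    channel = Fernet(session_key)

    ciphertext = channel.encrypt(b"meet at 6pm")   # what an eavesdropper on the wire sees
    plaintext = channel.decrypt(ciphertext)        # what lands on the recipient's device

    print(ciphertext)   # opaque bytes in flight
    print(plaintext)    # b'meet at 6pm' at rest -- from here on, only the endpoint protects it
    ```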



  • LLMs are perfectly fine, and cool tech. The problem is that they’re billed as actual intelligence, or as things that can replace humans. Sure, they mimic humans well enough, but it would take a lot more than absorbing content to be good enough to replace a human rather than just aid one. Either the content needs to be manually processed to add social context, or new tech needs to be built that models how to interpret content in every culture represented in the material, including dead cultures whose work is available to the model. Otherwise, “hallucinations” (i.e. misinterpretation and thus miscategorization of data) will make them totally unreliable without human filtering.

    That being said, there are many more targeted uses of the tech that are quite good, but always with the need for a human to verify, as in the rough sketch below.
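
    A minimal illustration of that human-in-the-loop point (hypothetical helper names, no real LLM API): a gate like the following keeps unverified model output from ever being used directly.

    ```python
    # Human-in-the-loop sketch: every model output passes a reviewer before use.

    def generate_summary(text: str) -> str:
        """Stand-in for an LLM call; a real system would query a model here."""
        return text[:100] + "..."

    def human_approves(draft: str) -> bool:
        """Stand-in for the review step; a real system would queue this for a person."""
        answer = input(f"Accept this draft? [y/N]\n{draft}\n> ")
        return answer.strip().lower() == "y"

    def summarize_with_review(text: str) -> str | None:
        draft = generate_summary(text)
        return draft if human_approves(draft) else None   # never use unverified output
    ```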