• 1 Post
  • 22 Comments
Joined 2 years ago
Cake day: October 4th, 2023


  • I want someone to prove their LLM can be as insightful and accurate as a paid one.

    I mean, you can train a domain-specific model for a niche that no commercial provider has a proprietary model to address. A model can only store so much information, and you can choose to weight that information towards what’s important to you. Or providers may simply not offer a model in the field that you want to deal with at all.

    But I don’t think that a random individual user who just wants a general-purpose chatbot is likely to get better performance out of something self-hosted. It’ll probably also cost more for the hardware, since local hardware isn’t likely to be kept saturated and probably won’t have shared costs, though you don’t say that cost is something that you care about.

    I think that the top reason for wanting to run an LLM model locally is the one you explicitly ruled out: privacy. You aren’t leaking information to someone else’s computers.

    Some other possible benefits of running locally:

    • You can guarantee access to the computational hardware. If my Internet connection goes down, whatever I’m doing with the LLM keeps working.

    • Latency isn’t a factor, either from the network or shared computational systems. Right now, I don’t have anything that has much by way of real-time constraints, but I’m confident that applications will exist.

    • You aren’t exposed to a cloud LLM provider changing the terms of their service. I mean, sure, in theory you could set up some kind of contract that locks in a service (though the VMware customers dealing with Broadcom right now may not feel that that’s the strongest of guarantees). But if I’m running something locally, I can keep it running as long as I want, and I know the costs. Lot of certainty there.

    • I don’t have to worry about LLM behavior changing underfoot, either from the service provider fiddling with things or new regulations being passed.







  • tal@lemmy.today to Selfhosted@lemmy.world · Liquid Trees · 2 hours ago

    As I recall, at least under US law, you can’t copyright genetically-engineered life, just get a twenty-year biological patent. So I don’t think that FOSS status would be directly germane, other than maybe in how some such licenses might deal with patent licensing.


  • tal@lemmy.today to Selfhosted@lemmy.world · Liquid Trees · 2 hours ago (edited)

    Just give me a 4U tank somewhere where someone else can deal with harvesting the algae, plus a webcam aimed at it, and I can enjoy it just fine from here. For me, selfhosting is mostly about the privacy, not principally about needing to be resistant to loss of Internet connectivity or the like.





  • I don’t think that that’s the case. I remember reading about theories that dragon traditions developed independently, based on discoveries of exposed dinosaur fossils.

    kagis

    Wikipedia says that the earliest dragons we have records of are from the Near East, but that it’s not clear where the idea originated. It’s possible that dragon traditions developed independently.

    https://en.wikipedia.org/wiki/Dragon

    Draconic creatures are first described in the mythologies of the ancient Near East and appear in ancient Mesopotamian art and literature.

    Nonetheless, scholars dispute where the idea of a dragon originates from,[11] and a wide variety of hypotheses have been proposed.[11]

    In his book An Instinct for Dragons (2000), anthropologist David E. Jones suggests a hypothesis that humans, like monkeys, have inherited instinctive reactions to snakes, large cats, and birds of prey.[12] He cites a study which found that approximately 39 people in a hundred are afraid of snakes[13] and notes that fear of snakes is especially prominent in children, even in areas where snakes are rare.[13] The earliest attested dragons all resemble snakes or have snakelike attributes.[14] Jones therefore concludes that dragons appear in nearly all cultures because humans have an innate fear of snakes and other animals that were major predators of humans’ primate ancestors.[15] Dragons are usually said to reside in “dark caves, deep pools, wild mountain reaches, sea bottoms, haunted forests”, all places which would have been fraught with danger for early human ancestors.[16]

    In her book The First Fossil Hunters: Dinosaurs, Mammoths, and Myth in Greek and Roman Times (2000), Adrienne Mayor argues that some stories of dragons may have been inspired by ancient discoveries of fossils belonging to dinosaurs and other prehistoric animals.[17] She argues that the dragon lore of northern India may have been inspired by “observations of oversized, extraordinary bones in the fossilbeds of the Siwalik Hills below the Himalayas”[18] and that ancient Greek artistic depictions of the Monster of Troy may have been influenced by fossils of Samotherium, an extinct species of giraffe whose fossils are common in the Mediterranean region.[18] In China, a region where fossils of large prehistoric animals are common, these remains are frequently identified as “dragon bones”[19] and are commonly used in traditional Chinese medicine.[19] Mayor, however, is careful to point out that not all stories of dragons and giants are inspired by fossils[19] and notes that Scandinavia has many stories of dragons and sea monsters, but has long “been considered barren of large fossils.”[19] In one of her later books, she states that, “Many dragon images around the world were based on folk knowledge or exaggerations of living reptiles, such as Komodo dragons, Gila monsters, iguanas, alligators, or, in California, alligator lizards, though this still fails to account for the Scandinavian legends, as no such animals (historical or otherwise) have ever been found in this region.”[20]

    Robert Blust in The Origin of Dragons (2000) argues that, like many other creations of traditional cultures, dragons are largely explicable as products of a convergence of rational pre-scientific speculation about the world of real events. In this case, the event is the natural mechanism governing rainfall and drought, with particular attention paid to the phenomenon of the rainbow.[21]




  • “Each page of plaintiff’s complaint appears on an e-filing which is dominated by a large multi-colored cartoon dragon dressed in a suit,” he wrote on April 28 (PDF). “Use of this dragon cartoon logo is not only distracting, it is juvenile and impertinent. The Court is not a cartoon.”

    The Court is not a cartoon.

    They’re portraying themselves as a scalie, not you.

    That being said, why is anyone involved here watermarking a PDF with anything? I mean, normally the purpose of a watermark is to link content with its creator. But I seriously doubt that the text and the background image have been merged into some kind of raster image.

    investigates

    Yeah, they link to the original dragonized PDF.

    https://storage.courtlistener.com/recap/gov.uscourts.miwd.114988/gov.uscourts.miwd.114988.1.0.pdf

    It’s just text on top of the image. You can copy-paste the text:

    DRAGON LAWYERS PC
    Jacob A. Perrone (P71915)
    Attorneys for Plaintiff
    325 East Grand River Ave., Suite 250
    East Lansing, MI 48823
    Phone: (844) JAKELAW
    jacob.perrone@yahoo.com

    It’s like having a screensaver on an LCD monitor.

    And pdftotext, in poppler-utils, looks like it makes a pretty decent de-watermarked text file of it too.
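
    If you want to reproduce that yourself, the invocation is about as simple as it gets; the output filename here is just my choice:

    pdftotext gov.uscourts.miwd.114988.1.0.pdf complaint.txt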


  • I don’t believe that they’re likely to do GNU/Linux. I bet that they’re going to do a fork of Android off AOSP or something like that.

    Android’s had a huge amount of work put into it to make it suitable as a consumer mobile phone OS, and the companies here aren’t doing this because they want stuff that GNU/Linux does, but rather because they’re Chinese companies worried about a US-China industrial decoupling and its risks for them. Like, they were okay with the technical status quo; what changed was that they started to worry about having the rug pulled out from under them.

    That being said, I can at least imagine that helping GNU/Linux phone adoption. So, think about what happened with video games. There were some major platforms out there – MacOS, iOS, Windows, various consoles, Android, GNU/Linux. That fragmented the market. Trying to port software to all platforms became a huge pain. What a lot of game developers did was to target a more-or-less platform-agnostic engine and let the engine handle the platform abstraction.

    If the mobile OS space fragments further (like Android splitting into “Google Android” and “China Android”), my guess is that that’ll help drive demand for platform-agnostic engines to improve portability, and porting one engine to GNU/Linux is a lot easier than porting every individual program.


  • https://www.history.com/articles/royal-palace-life-hygiene-henry-viii

    The Western European belief that baths were unhealthy did not help matters, either. Although neat freak Henry VIII bathed often and changed his undershirts daily, he was a royal rarity. Louis XIV is rumored to have bathed twice in his life, as did Queen Isabella of Castile, Herman says. Marie-Antoinette bathed once a month. The 17th century British King James I was said to never bathe, causing the rooms he frequented to be filled with lice.

    It was the Sun King himself, Louis XIV, whose choice to no longer travel from court to court would lead to a particularly putrid living situation. In 1682, in an effort to seal his authority and subjugate his nobles, Louis XIV moved his court permanently to the gilded mega-palace of Versailles. At times over 10,000 royals, aristocrats, government officials, servants and military officers lived in Versailles and its surrounding lodgings.

    Despite its reputation for magnificence, life at Versailles, for both royals and servants, was no cleaner than the slum-like conditions in many European cities at the time. Women pulled their skirts up to pee where they stood, while some men urinated off the balustrade in the middle of the royal chapel. According to historian Tony Spawforth, author of Versailles: A Biography of a Palace, Marie-Antoinette was once hit by human waste being thrown out the window as she walked through an interior courtyard.

    The heavily trafficked latrines often leaked into the bedrooms below them, while blockages and corrosion in the palace’s iron and lead pipes were known to occasionally “poison everything” in Marie-Antoinette’s kitchen. “Not even the rooms of the royal children were safe,” writes Spawforth. An occasional court exodus could have reduced the wear and tear on Versailles, perhaps leading to fewer unpleasant structural failures.

    This unsanitary way of living no doubt led to countless deaths throughout royal European households. It was not until the 19th century that standards of cleanliness and technological developments improved life for many people, including members of royal courts. Today, many European royals still move from residence to residence—but for pleasure, not to try and outrun squalor.


  • Me too, and I’ve done it in the past on one laptop that I did get with Linux when there was no bring-your-own-OS option, but I suppose that OP’s got a point: there are people out there for whom installing the OS on a blank laptop is going to be intimidating.

    If you’ve installed an OS a zillion times, this is all old hat. If you never have before, it probably feels kind of scary.

    For those people, having a preinstalled OS can be a significant value-add.


  • which developers can activate some sort of device whitelist to allow the Steam Deck with SteamOS.

    The whole “kernel anti-cheat” PC multiplayer thing is just an endless game of whack-a-mole that I don’t want to get involved with. End of the day, PC hardware is open, and that’s a big part of what’s nice about it. With open hardware, though, comes the fact that you can change things, and one thing that people love using that ability to do is to cheat. And with multiplayer games, a cheater can ruin the game for a non-cheater. Trying to figure out some way to make an open PC be locked down is hard, unreliable, and ultimately just making the thing act like a bad console. If there’s enough demand for it and money in it, a game developer can keep playing that whack-a-mole, but it’s never something that can really be permanently fixed all that well.

    Consoles are really good at blocking players from doing things that will make the playing field not level. They are in a good position to stop fiddling with memory, or modifying game binaries, or extracting information that should be hidden and showing it to the player. They can restrict people from getting a controller or a keyboard or a mouse or a fancier GPU or whatever that would give Player 1 a pay-to-win edge over Player 2. That’s a desirable characteristic if your goal is to have players playing against each other on even footing.

    I really think that the long-term, realistic route to deal with this is for PC games to shift towards single-player, or at least away from competitive multiplayer.

    It used to be that PC multiplayer games were rare. There were two major changes after this that made PC multiplayer games a lot more viable:

    • The Internet came along. Now anyone can communicate with anyone in a very wide area cheaply.

    • People moved off POTS analog modems. This not only provided a lot more bandwidth, but slashed latency — a POTS modem inserted a bit over 100 ms of latency. A tenth of a second of latency at the hardware level was a serious issue for some genres, like first-person-shooter games, so getting rid of that solved a lot of problems.

    Okay, great. That unleashed a flood of multiplayer games. And making a game multiplayer solves a lot of problems:

    • Writing good game AI that stays competitive against a human is hard. Other humans are pretty good at filling that role.

    • Humans are good at coordinating, so cooperative games work well when humans team up with other humans.

    • Some people specifically want to play against other people, to spend time with them.

    The problem is that I don’t think that there is going to be any future big jump like that improving the multiplayer competitive situation. Those two big jumps are pretty much the end of the road in terms of making multiplayer significantly better. Maybe it’s possible to see some minor gains via better predictive algorithms to shave off perceived latency, though I don’t think that there is going to be game-changing stuff there. Maybe someone can improve on matchmaking. But I think that we’ve seen all the big improvements that are going to come.

    And multiplayer comes with a host of problems that are very hard to fix:

    • By and large, realtime multiplayer games cannot be paused. There are a few exceptions, mostly for games with a small number of players. This is a real problem if you, say, have a kid and want to play a game, and in the middle of it you hear something smash in the next room and the kid starts screaming. Real life is not friendly to people requiring uninterrupted blocks of time.

    • People don’t always do things that are fun for other people. Griefing, spawn-camping, cheating, whatever. Even minor stuff, like breaking out of character, makes the game less immersive. You can try to mitigate that with game design, but it’s always going to be an issue. Human nature doesn’t change: humans come firmly attached to human foibles.

    • Multiplayer games stop being playable at some point, when they no longer have enough players. Often before that, if they have centralized servers operated by the publisher — which is almost universally the case today — and the servers get shut down.

    • Even with modern networks, latency and flaky connectivity are still factors for real-time games. For people living in remote areas, they’re particularly annoying.

    • For multiplayer competitive games, one can only win at some given rate; for a player to win against a human, that other human has to lose. I’d wager that that rate is most likely not the optimal rate for game enjoyment. If a player isn’t competing against humans, that constraint on game designers goes away.

    On the other hand, while it is hard to make sophisticated game AI, hard to make it as good as a human…there are also no real fundamental limits there. I am confident that while we are not there today, we can make an AGI comparable to a human, and that for the simpler problem of making game AI, it’s possible to make much less sophisticated AIs that solve many of the problems that humans do in games. That’s expensive to do on a per-game basis – game AI is hard – but my guess is that most games have a fairly similar set of problems to solve, and that it’s probably possible to abstract a lot of that and have “game AI engines” used in many games that solve a lot of those problems. We’ve done that with graphics engines and physics engines; there was a point where having the kind of graphics and physics that we do in many games was prohibitively expensive too, until we figured out how to reuse a lot of work from game to game. And improvement in game AI is a ratchet: it’s only going to get better, whereas human nature isn’t going to change.

    I’m not saying that multiplayer games are going to vanish tomorrow. But I think that the environment we’re going to see is going to differ from the one we saw from maybe the 1990s to the 2010s, when technological change dramatically improved the relative situation for multiplayer games; I think that there’s going to be a continued trend back towards the situation favoring single-player games.


  • My issue is frequent crashing/freezing, meaning I can’t play longer than a few minutes at a time

    Could be overheating.

    I use AMD hardware.

    However, a few years back, I had a particular AMD card that, using its default on-card power profiles, tended to overheat in games which really exercised the thing; I understand that the vendor that made these cards had issues with insufficient thermal paste, or the thermal paste detaching, or something. That’s the card vendor’s fault – the card shouldn’t be able to get into trouble via overheating – but regardless, it was still a problem. Some people disassembled the thing and put more thermal paste on. I forced the thing to a more conservative power profile, and that worked.

    I haven’t done this with Nvidia hardware, but it sounds like nvidia-smi can do this:

    https://forum.level1techs.com/t/how-to-set-nvidia-gpu-power-limit-nvidia-smi/131467

    Then to query your power limit:

    sudo nvidia-smi -q -d POWER
    

    And to set it:

    sudo nvidia-smi -pl (base power limit+11)
    

    Might try restricting the power usage and see if your crashing stops.
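
    For illustration, if the query above reported a default power limit of, say, 170 W (a number I’m making up; substitute whatever yours actually reports), capping it somewhat lower would look like:

    sudo nvidia-smi -pl 150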

    EDIT: Might also try turning down in-game graphical settings. That’d decrease load and maybe also avoid any potential overheating issues, though it’d be a less-reliable option than the above, as you probably don’t want to make your system freeze just by running some program that happens to throw a lot of load at your card. That also might avoid any issues that the drivers could have that the game is tickling. Worth a shot, at least from an experimentation standpoint, if you are looking for things to try.

    EDIT2: If those do successfully address your problem and it looks like it’s an overheating problem, you might also try figuring out whether you can improve the cooling situation on the hardware side, rather than sacrificing performance for stability.