  • It is mostly professional/office use where this makes sense. I’ve implemented this (well, something similar that accomplishes the same thing) for clients that want versioning and compliance.

    I’ve worked with/for a lot of places that keep everything because disks are cheap enough that they’ve decided it’s better to have a copy of every git version than not have one and need it some day.

    Or places that have compliance reasons to have to keep copies of every email, document, spreadsheet, picture and so on. You’ll almost never touch “old” data, but you have to hold on to it for a decade somewhere.

    It’s basically cold storage that can immediately pull the data into a fast cache if/when someone needs the older data, but otherwise it just sits there forever on a slow drive.


  • …depends on what your use pattern is, but I doubt you’d enjoy it.

    The problem is that the cached data will be fast, but the uncached data will, well, be on a hard drive.

    If the cache drive is big enough to hold your OS and the data you actually use, it’s great, but if you’ve already got that much SSD space, why are you doing this in the first place?

    If you don’t have enough cache drive to hold your commonly used data, then it’s absolutely going to perform worse than just buying another SSD.

    So I guess if this is an ‘I keep my whole Steam library installed, but only play 3 games at a time’ kind of use case, it’ll probably work fine.

    For everything else, eh, I probably wouldn’t.

    Edit: a good use case for this is more the ‘I have 800TB of data, but 99% of it is historical and the daily working set is just a couple hundred gigs’ NAS sort of thing (rough sizing sketch below).
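    Purely to illustrate sizing that working set: below is a minimal Python sketch (assuming a Linux-ish filesystem and a made-up 30-day window) that walks a tree and adds up how much of it was actually touched recently. relatime/noatime mounts make access times fuzzy, so treat the numbers as a ballpark, not gospel.

    ```python
    #!/usr/bin/env python3
    """Rough estimate of the 'hot' working set under a directory tree,
    based on file access times. The path and the 30-day window are
    placeholders; atime can be unreliable, so this is only a ballpark."""
    import os
    import sys
    import time

    ROOT = sys.argv[1] if len(sys.argv) > 1 else "."
    WINDOW_DAYS = 30
    cutoff = time.time() - WINDOW_DAYS * 86400

    hot_bytes = 0
    total_bytes = 0
    for dirpath, _dirnames, filenames in os.walk(ROOT):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path, follow_symlinks=False)
            except OSError:
                continue  # permissions, files vanishing mid-walk, etc.
            total_bytes += st.st_size
            if st.st_atime >= cutoff:
                hot_bytes += st.st_size

    gib = 1024 ** 3
    print(f"total: {total_bytes / gib:.1f} GiB")
    print(f"touched in last {WINDOW_DAYS} days: {hot_bytes / gib:.1f} GiB")
    ```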



  • One thing you probably need to figure out first: how are the dgpu and igpu connected to each other, and then which ports are connected to which gpu.

    Everyone does funky shit with this, and you’ll sometimes have dGPUs that require the iGPU to do anything, or cases where the internal panel is only hooked up to the iGPU (or only the dGPU), and the HDMI and DisplayPort outputs and so on can be any damn thing.

    So uh, before you get too deep into planning what gets which GPU, you probably need to see if the outputs you need support what you want to do (there’s a quick way to check this on Linux sketched below).
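    If it’s a Linux machine, one low-effort way to see that wiring is to poke around /sys/class/drm, which lists each GPU and the connectors hanging off it. The Python sketch below is just that idea; the sysfs paths are standard, and nothing in it is specific to any particular laptop.

    ```python
    #!/usr/bin/env python3
    """List which display connectors belong to which GPU, via /sys/class/drm.
    Linux-only sketch; an odd laptop may still need the vendor's own tools."""
    import glob
    import os


    def read(path):
        """Read a small sysfs file, or return '?' if it isn't readable."""
        try:
            with open(path) as f:
                return f.read().strip()
        except OSError:
            return "?"


    for card in sorted(glob.glob("/sys/class/drm/card[0-9]")):
        pci = os.path.basename(os.path.realpath(os.path.join(card, "device")))
        vendor = read(os.path.join(card, "device", "vendor"))
        # 0x8086 = Intel, 0x10de = NVIDIA, 0x1002 = AMD
        print(f"{os.path.basename(card)}: PCI {pci}, vendor {vendor}")
        for conn in sorted(glob.glob(card + "-*")):
            status = read(os.path.join(conn, "status"))  # connected / disconnected
            print(f"  {os.path.basename(conn)}: {status}")
    ```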



  • They really do.

    They sound great, and the ANC is great, but the “official” battery life for a brand new one (which these are not) is “up to 4.5 hours” with ANC on, and 5 without it.

    It ends up being 2-3 charge cycles basically every day, plus a full recharge of the charging case.

    They do, however, work amazingly well if you’re in the Apple ecosystem; for example they’ll swap between my iPad and Mac Mini if audio starts on one or the other.

    But for actually sitting down with something and listening to a thing, I’d rather just plug in some headphones (via the lovely USB-C dongle) and not have to think about if the stupid things are going to die before I’m ready to stop listening.

    (Disclaimer: I’m also a weirdo who doesn’t carry a smartphone, and still uses an iPod for listening to stuff outside of the house, so feel free to roll your eyes and disregard my obviously bad opinions :P )


  • My complaint has always been that the stupid things need to endlessly be recharged.

    I’ve got some AirPod Pros and they’re great… for about 4 hours.

    Then you’re stopping what you’re doing, recharging for half an hour, and then you’re good for uh, another 3 hours because that wasn’t a full charge.

    And after the 2nd or 3rd time you’ve done that, your case is dead and you get to throw everything on a charger for a couple of hours.

    Ooooooooor I can put in my wired headphones, and not give a shit about any of that, because that’s not how those work at all.

    I suppose most people don’t spend most of their day listening to podcasts and audiobooks and thus 4 hours is fine, but good lord is it annoying as crap.







  • Looks like others have provided MOST of the answers.

    Radarr/Sonarr do the heavy lifting, making symlinks where symlinks are required, but there’s still the occasional bit of manual downloading.

    I also have a script that checks for broken symlinks about once a week and notifies me of them (rough sketch below), and I’ll go through and clean them up occasionally. That’s not super common, though; it only happens when I’m manually removing content I made manual symlinks for, since I’ll just let Radarr/Sonarr deal with it otherwise.

    (The full stack is jellyseerr -> radarr/sonarr -> qbittorrent/sabnzb -> links for jellyfin)
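    For the curious, the broken-symlink check doesn’t need to be fancy. Something along these lines works (the /srv/media path is a placeholder; point it at wherever your jellyfin links live, and feed the output into whatever notification setup you already have):

    ```python
    #!/usr/bin/env python3
    """Weekly broken-symlink sweep. Prints any symlink whose target is gone;
    exits non-zero if it found any, so a cron/systemd timer can alert on it."""
    import os
    import sys

    LIBRARY = sys.argv[1] if len(sys.argv) > 1 else "/srv/media"  # placeholder path

    broken = []
    for dirpath, _dirnames, filenames in os.walk(LIBRARY):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # a link that points at nothing: islink() is true, exists() is false
            if os.path.islink(path) and not os.path.exists(path):
                broken.append(path)

    for path in broken:
        print(path)
    sys.exit(1 if broken else 0)
    ```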



  • So, this is a ~15 year old laptop?

    The first two things that immediately come to mind when you’re kernel panicking are bad RAM and bad CPU temperatures.

    Thermal paste doesn’t last forever, so it’s worth checking whether your CPU or GPU is overheating, and repasting if so.

    And, as always, a memtest is a quick and easy step to rule that out. I’d say half the “weird crashes” I’ve ever seen end up being bad RAM, and well, at least it’s cheap and easy to replace?
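    If it’s running Linux, you can get a quick read on temperatures without installing anything by reading /sys/class/hwmon; a rough sketch of that is below. (For the RAM, memtest86+ from boot media is still the way to actually test it.)

    ```python
    #!/usr/bin/env python3
    """Dump every temperature sensor the kernel exposes under /sys/class/hwmon.
    Values in the *_input files are millidegrees C, so divide by 1000."""
    import glob
    import os


    def read(path):
        """Read a small sysfs file, or return '' if it isn't readable."""
        try:
            with open(path) as f:
                return f.read().strip()
        except OSError:
            return ""


    for hwmon in sorted(glob.glob("/sys/class/hwmon/hwmon*")):
        chip = read(os.path.join(hwmon, "name")) or os.path.basename(hwmon)
        for temp in sorted(glob.glob(os.path.join(hwmon, "temp*_input"))):
            raw = read(temp)
            if not raw:
                continue
            label = read(temp.replace("_input", "_label")) or os.path.basename(temp)
            print(f"{chip} {label}: {int(raw) / 1000:.1f} C")
    ```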