  • No.

    I pirate everything, but am very very reluctant to do so with software or games.

    I only pirate in cases where the company involved is just too gross to support (looking at you, Adobe), or if there’s absolutely no other option.

    But I consider pirated software and games absolutely suspect 100% of the time, because I’m old enough to remember when every keygen was also a keylogger, every crack was also a rootkit, and touching any pirated software was going to give you computer herpes without fail.

    So maybe it’s not that bad anymore, but I mean, do you fully trust the morals of someone who’d spend their time helping you steal someone else’s shit not to add just one more little thing to it for themselves?



  • One thing I ran into, though it was a while ago, was that having disk caching enabled would trash write performance on removable media for me.

    The issue ended up being that the kernel would keep flushing the cache to disk, and while it was doing that, none of your transfers were happening. So it’d end up doubling the copy time or more, because the write cache wasn’t actually helping removable drives.

    It might be worth remounting without any caching, if it’s on, and seeing if that fixes the mess (or capping the kernel’s write-back cache, sketched below).

    But, as I said, it’s been a few years, so that may no longer be the case.
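
    A minimal sketch of that cap, assuming Linux and root; /proc/sys/vm/dirty_bytes and /proc/sys/vm/dirty_background_bytes are the real tunables, but the sizes here are just illustrative:

        # Cap the kernel's write-back cache so flushes to slow removable
        # media happen in small increments instead of one giant stall.
        # Assumes Linux and root privileges; the sizes are illustrative only.

        DIRTY_BG_BYTES = 8 * 1024 * 1024    # start background write-back at 8 MiB dirty
        DIRTY_BYTES = 16 * 1024 * 1024      # block writers and force a flush at 16 MiB dirty

        def set_sysctl(path: str, value: int) -> None:
            """Write an integer to a /proc/sys tunable (same effect as `sysctl -w`)."""
            with open(path, "w") as f:
                f.write(str(value))

        set_sysctl("/proc/sys/vm/dirty_background_bytes", DIRTY_BG_BYTES)
        set_sysctl("/proc/sys/vm/dirty_bytes", DIRTY_BYTES)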



  • It is mostly professional/office use where this makes sense. I’ve implemented this (well, something similar that does the same job) for clients that want versioning and compliance.

    I’ve worked with/for a lot of places that keep everything because disks are cheap enough that they’ve decided it’s better to have a copy of every git version than not have one and need it some day.

    Or places that have compliance reasons to keep copies of every email, document, spreadsheet, picture, and so on. You’ll almost never touch “old” data, but you have to hold on to it for a decade somewhere.

    It’s basically cold storage that can immediately pull data into a fast cache if/when someone needs it, but otherwise it just sits there forever on a slow drive.
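
    Roughly the shape of it, as a toy read-through sketch; the tier paths and promotion logic here are made up for illustration, not any particular product:

        # Toy read-through cache: data lives on a slow "cold" tier, and any
        # read promotes it to a fast tier first. Paths are hypothetical.
        import shutil
        from pathlib import Path

        COLD = Path("/mnt/cold")     # big, slow archive disks (assumed mount)
        HOT = Path("/mnt/cache")     # small, fast SSD tier (assumed mount)

        def read_through(relpath: str) -> bytes:
            """Serve from the fast tier, promoting from cold storage on a miss."""
            cached = HOT / relpath
            if not cached.exists():
                cached.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(COLD / relpath, cached)    # promote: cold -> hot
            return cached.read_bytes()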


  • …depends on what your use pattern is, but I doubt you’d enjoy it.

    The problem is the cached data will be fast, but the uncached data will, well, be on a hard drive.

    If you have enough cache space to keep your OS and your working data on it, it’s great, but if you have enough SSD space to keep your OS and working data on it anyway, why are you doing this in the first place?

    If your cache drive isn’t big enough to hold your commonly used data, then it’s absolutely going to perform worse than just buying another SSD.

    So I guess if this is an ‘I keep my whole Steam library installed, but only play 3 games at a time’ kind of use case, it’ll probably work fine.

    For everything else, eh, I probably wouldn’t.

    Edit: a good use case for this is more the ‘I have 800TB of data, but 99% of it is historical and the daily working set is just a couple hundred gigs’ NAS-type thing.
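
    To put rough numbers on why the working set matters (the latencies are ballpark guesses, not benchmarks):

        # Effective access time of an SSD cache in front of an HDD, for a
        # few hit rates. Figures are illustrative, not measured.
        SSD_MS = 0.1     # ~NVMe-class random read
        HDD_MS = 10.0    # ~7200rpm seek + rotation

        for hit_rate in (0.99, 0.90, 0.50):
            eff = hit_rate * SSD_MS + (1 - hit_rate) * HDD_MS
            print(f"hit rate {hit_rate:.0%}: ~{eff:.2f} ms per access")

    At 99% hits you mostly see the SSD (~0.2 ms); at 50% you’re averaging ~5 ms, which is barely better than the bare hard drive.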



  • One thing you probably need to figure out first: how the dgpu and igpu are connected to each other, and which ports are wired to which gpu.

    Everyone does funky shit with this: you’ll sometimes have dgpus that require the igpu to do anything, or cases where the internal panel is only hooked up to the igpu (or only the dgpu), and the HDMI, DisplayPort, and so on can be wired to any damn thing.

    So uh, before you get too deep into planning what gets which gpu, you probably need to check whether the outputs you need support what you want to do.
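
    On Linux, one rough way to check before committing to a plan: the DRM entries under /sys/class/drm tell you which connectors hang off which gpu. The layout is standard, but card numbering and connector names vary per machine, so treat this as a sketch:

        # List which display connectors belong to which GPU, and the driver
        # behind each (e.g. i915 for an Intel igpu, nvidia/amdgpu for a dgpu).
        # Assumes Linux; run it on the machine in question.
        import glob, os

        for conn in sorted(glob.glob("/sys/class/drm/card[0-9]*-*")):
            card = os.path.basename(conn).split("-")[0]    # e.g. "card0"
            driver = os.path.basename(
                os.readlink(f"/sys/class/drm/{card}/device/driver"))
            with open(os.path.join(conn, "status")) as f:
                status = f.read().strip()    # "connected" / "disconnected"
            print(f"{card} ({driver}): {os.path.basename(conn)} -> {status}")

    Plug a monitor into each output and rerun it; whichever card’s connector flips to “connected” tells you which gpu that port is wired to.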