• darkmatternoodlecow@programming.dev · ↑73 ↓1 · 10 months ago

    The point hinted at in the title is not part of this article. This is an overview of various versions and branches of UNIX, and nothing more.

    • samc@feddit.uk · ↑59 · 10 months ago

      At the end there’s a little jab towards Wayland:

      Today, the Wayland enthusiasts like to talk about how they are modernizing the Linux graphics stack. But Linux is a Unix, and in Unix, everything is meant to be a file. So any Wayland evangelists out there, tell us: where in the file system can I find the files describing a window on the screen under the Wayland protocol? What file holds the coordinates of the window, its place in the Z-order, its colour depth, its contents?

      As far as I’m aware, nobody has even considered extending the file metaphor to the graphics stack, and it sounds a bit ridiculous to me.

      It also reminds me of this talk that suggests maybe trying to express everything as a file might not be the best idea…

      • SavvyWolf@pawb.social · ↑22 ↓2 · 10 months ago

        I have a 144Hz display. I’m sure my system would love every frame hitting the filesystem layer.

        • Kornblumenratte@feddit.de · ↑21 ↓2 · 10 months ago

          /dev/fb0 is the framebuffer. So yes, you can feed data into the filesystem and you’ll see it on your display.

          For Unixoids, being a file does not mean that this data is stored on a hard disk, but that all data, processes and hardware are accessible with the same toolkit. /dev/fb0, for instance, is part of the file-like interface of your graphics card.
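
          As a concrete sketch of that file-like interface (and only a sketch: it assumes /dev/fb0 exists, a 32-bits-per-pixel mode, and enough permissions, e.g. running from a text console as root or a member of the video group), a few lines of C are enough to paint the whole screen:

          #include <fcntl.h>
          #include <linux/fb.h>
          #include <stdint.h>
          #include <stdio.h>
          #include <sys/ioctl.h>
          #include <sys/mman.h>
          #include <unistd.h>

          int main(void) {
              int fd = open("/dev/fb0", O_RDWR);              /* the display, as a file */
              if (fd < 0) { perror("open /dev/fb0"); return 1; }

              struct fb_var_screeninfo var;                   /* resolution, bpp, ...   */
              struct fb_fix_screeninfo fix;                   /* bytes per scanline     */
              if (ioctl(fd, FBIOGET_VSCREENINFO, &var) < 0 ||
                  ioctl(fd, FBIOGET_FSCREENINFO, &fix) < 0) { perror("ioctl"); return 1; }

              size_t len = (size_t)fix.line_length * var.yres;
              uint8_t *fb = mmap(NULL, len, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
              if (fb == MAP_FAILED) { perror("mmap"); return 1; }

              for (uint32_t y = 0; y < var.yres; y++)         /* fill every visible pixel */
                  for (uint32_t x = 0; x < var.xres; x++)     /* assumes 4 bytes/pixel    */
                      *(uint32_t *)(fb + y * fix.line_length + x * 4) = 0x003366ff;

              munmap(fb, len);
              close(fd);
              return 0;
          }

          Even without any code, something like cat /dev/urandom > /dev/fb0 from a TTY makes the same point: the display really is reachable through the file API.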

            • skilltheamps@feddit.de · ↑7 ↓1 · 10 months ago

            /dev/fb is mostly one thing: deprecated. It is also not really an interface to your graphics card; it is a legacy way, kindly still provided, for pushing fullscreen pixels to your monitor in an unaccelerated fashion, for things that have not made it to KMS/DRM (which at this point is pretty much just the console emulation on the TTYs). It is not an interface to the graphics card because it doesn’t expose any of the capabilities a graphics card has (like shaders etc.). In fact, for just pushing pixels you can leave the graphics card out of your computer entirely if you connect your screen by other means (think of SPI, which is common in embedded devices; you can find many examples of such drivers in the kernel source at drivers/gpu/drm/tiny).
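
            (KMS/DRM itself still lives behind device nodes under /dev/dri/, by the way. A minimal sketch using libdrm, assuming libdrm is installed and that the node is /dev/dri/card0, which varies between machines, that lists the connected outputs:)

            /* build with: cc kms_list.c $(pkg-config --cflags --libs libdrm) */
            #include <fcntl.h>
            #include <stdio.h>
            #include <unistd.h>
            #include <xf86drm.h>
            #include <xf86drmMode.h>

            int main(void) {
                int fd = open("/dev/dri/card0", O_RDWR);    /* node name is an assumption; varies per machine */
                if (fd < 0) { perror("open /dev/dri/card0"); return 1; }

                drmModeRes *res = drmModeGetResources(fd);  /* KMS mode-setting resources */
                if (!res) { fprintf(stderr, "not a KMS-capable node?\n"); return 1; }

                for (int i = 0; i < res->count_connectors; i++) {
                    drmModeConnector *c = drmModeGetConnector(fd, res->connectors[i]);
                    if (!c) continue;
                    if (c->connection == DRM_MODE_CONNECTED && c->count_modes > 0)
                        printf("connector %u: %dx%d @ %u Hz\n", c->connector_id,
                               c->modes[0].hdisplay, c->modes[0].vdisplay,
                               c->modes[0].vrefresh);
                    drmModeFreeConnector(c);
                }

                drmModeFreeResources(res);
                close(fd);
                return 0;
            }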

      • Avid Amoeba@lemmy.ca · ↑16 · edited · 10 months ago

        It’s nonsense. The author arbitrarily decides on some expression of the windowing model in terms of files. OK, cool. Every author of a system that uses files decides how to represent their data: how many files to use, whether to use sockets, what data flows through each, and what format that data is represented in. Why not go to the authors of Btrfs and argue about why the data format of /dev/btrfs-control is the way it is, or why it’s a single file instead of 5? It’s an arbitrary decision.

        When not used for storing data, files in POSIX-like OSes are a type of IPC mechanism. How many channels that IPC needs and what data flows over those channels is an arbitrary decision by the authors on one or both sides of that IPC. The OS provides the IPC mechanism; the software that uses it creates some abstraction on top of it, which doesn’t have to conform to any lower-level OS models. Could we model Postgres tables and rows as files in a directory structure? Sure. There are pros and cons to using that model. It might not be great for terabyte-scale DB performance.
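
        To make the “files as IPC” point concrete, here is a tiny sketch in C (the path /tmp/demo.fifo and the one-line “window description” are made up for the example): the kernel only supplies the channel, a named pipe, and the message format flowing over it is purely a convention between the two sides, exactly the kind of arbitrary decision described above.

        #include <fcntl.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/stat.h>
        #include <sys/wait.h>
        #include <unistd.h>

        int main(void) {
            const char *path = "/tmp/demo.fifo";        /* made-up path for the example   */
            mkfifo(path, 0600);                         /* the "file" both sides agree on */

            if (fork() == 0) {                          /* child: one side of the IPC     */
                int fd = open(path, O_WRONLY);
                const char *msg = "window 42: x=10 y=20 z=3\n";  /* format is our choice  */
                write(fd, msg, strlen(msg));
                close(fd);
                return 0;
            }

            char buf[128];                              /* parent: the other side         */
            int fd = open(path, O_RDONLY);
            ssize_t n = read(fd, buf, sizeof(buf) - 1);
            if (n > 0) { buf[n] = '\0'; fputs(buf, stdout); }
            close(fd);
            wait(NULL);
            unlink(path);
            return 0;
        }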

        • kbal@kbin.melroy.org · ↑7 · 10 months ago

          Dennis Ritchie and Ken Thompson […] ignored what the industry was doing, went back to their original ideas, and kept working on refining them. The result is the next step in the development of Unix

          Plan 9 is clearly what the article is talking about. Odd that they don’t name it.

            • BaumGeist@lemmy.ml · ↑1 · 10 months ago

            They do, if you consider that this article doesn’t stand alone at all: read the blurb in italics at the very bottom acknowledging that it’s part of a bigger series.

      • ijhoo@lemmy.ml · ↑3 · 10 months ago

        This was a great talk (the video you linked, not the article). I wonder what Linus would say about C being the wrong thing today.

  • Avid Amoeba@lemmy.ca · ↑35 ↓4 · 10 months ago

    The author lost me at “Linux is Unix.” I kept reading and it didn’t get any better. 🥺

  • LeFantome@programming.dev · ↑29 · 10 months ago

    What an odd article. First, the author goes to great lengths to assert that “Linux IS UNIX”, with pretty circumstantial evidence at best. Then, I guess to hide the fact that his point has not been proved, he goes through the history of UNIX, presumably to reinforce that Linux is just a small piece of the UNIX universe? Then he chastises people working on Linux for not staying true to the UNIX philosophy and original design principles.

    Questions like “are you sure this is a UNIX tool?” do not land with the weight he hopes, as the answer is almost certainly “No. This is not a UNIX tool. It is not trying to be. Linux is not UNIX.”

    The article seems to be mostly a complaint that Linux is not staying true enough to UNIX. The author does not really establish why that is a problem though.

    There is an implication, I guess, that the point of POSIX and then the UNIX certification was to bring compatibility to a universe of diverging and incompatible Unices. While I agree that fragmentation works against commercial success, this is not a very strong point. Not only was the UNIX universe (with its coherent design philosophy and open specifications) completely dominated by Windows in the market, it was also completely displaced by Linux (without the UNIX certification).

    Big companies found in Linux a platform that they could collaborate on. In practice, Linux is less fragmented and more ubiquitous than UNIX ever was before Linux. Critically, Linux has been able to evolve beyond the UNIX certification.

    Linux does follow standards. There is POSIX of course. There is the LSB. There is freedesktop.org. There are others. There is also only one kernel.

    Linux remains too fragmented on the desktop to displace Windows. To address that, a common set of Linux standards is emerging, including Wayland, PipeWire, and Flatpak.

    Wayland is an evolution of the Linux desktop. It is a standard. There is a specification. There is a lot of collaboration around its evolution.

    As for “other” systems, I would argue that compatibility with Linux will be more useful to them than compatibility with “UNIX”. I would expect other systems to adopt Wayland in time. It is already supported on systems like Haiku. FreeBSD is working on it as well.

    • Soleil (she/her ♀)@beehaw.org · ↑5 · edited · 10 months ago

      This is my real problem with this (and also with broadly pointing the finger at the “Unix philosophy” whenever a project like systemd or Wayland exists, ignoring that the large, complex, multifaceted, monolithic Linux kernel itself flies in the face of that philosophy). Linux may have originally been built to be Unix-like, but it has become its own thing that shares a few similarities with Unix.

  • Veraxis@lemmy.world · ↑24 · 10 months ago

    Blah blah blah blah blah…

    tl;dr the author never actually gets to the point stated in the title about what the “problem” is with the direction of Linux and/or how knowing the history of UNIX would allegedly solve this. The author mainly goes off on a tangent listing out every UNIX and POSIX system in their history of UNIX.

    If I understand correctly, the author sort of backs into the argument that, because certain Chinese distros like Huawei EulerOS and Inspur K/UX were UNIX-certified by Open Group, Linux therefore is a UNIX and not merely UNIX-like. The author seems to be indirectly implying that all of Linux therefore needs to be made fully UNIX-compatible at a native level and not just via translation layers.

    Towards the end, the author points out that Wayland doesn’t comply with UNIX principles because the graphics stack does not follow the “everything is a file” principle, despite previously admitting that basically no graphics stack, X11’s and macOS’s included, has ever done this.

    Help me out if I am missing something, but all of this fails to articulate why any of this is a “problem” which will lead to some kind of dead-end for Linux or why making all parts of Linux UNIX-compatible would be helpful or preferable. The author seems to assume out of hand that making systems UNIX-compatible is an end unto itself.

  • wolf@lemmy.zip · ↑22 · 10 months ago

    Seriously, I don’t understand the point of the article, if there is one.

    It seemed more like a confused enumeration of systems which are POSIX-conformant, and in the end it talks about Wayland.

    Is the point that Wayland breaks compatibility with X11/X.org and is mostly a Linux thingy? (AFAIK FreeBSD is working on a Wayland port, but no one else.)

    Anyway, I have been a happy Wayland user for several years now. I am of course unhappy about the split with the *BSDs, but OTOH most 'NIX software nowadays uses so many Linux APIs that Wayland is IMHO no big game changer for portability anyway.

    • Zamundaaa@discuss.tchncs.de · ↑10 · 10 months ago

      FreeBSD isn’t working on a Wayland port; that’s already happened. The Plasma Wayland session has supported it for quite a while… KDE even runs a CI job on FreeBSD for every merge request, where the kwin_wayland autotests are run.

      Considering the number of complaints we got when something broke recently, though (which is to say, none), it doesn’t look like it has a lot of users.

        • wolf@lemmy.zip · ↑1 · 10 months ago

        Good to know that FreeBSD pulls Wayland off! :-)

        It is a pity that FreeBSD is not used more on desktops.

    • Chewy@discuss.tchncs.de · ↑6 · edited · 10 months ago

      Is anyone even running anything besides maybe FreeBSD on desktops? Most advantages of BSD over Linux seem to be relevant for servers, but not really for typical desktop usage.

      Additionally, apps use toolkits anyway, which provide backends for both Wayland and X11. If at some point X really isn’t viable anymore, people will put in the work and port Wayland from FreeBSD to the other BSDs.
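
      As a rough illustration of that fallback (not how any particular toolkit actually implements it), the detection usually boils down to a couple of environment variables that the session sets:

      #include <stdio.h>
      #include <stdlib.h>

      int main(void) {
          const char *wl  = getenv("WAYLAND_DISPLAY");  /* set inside a Wayland session */
          const char *x11 = getenv("DISPLAY");          /* set by an X server/XWayland  */

          if (wl && *wl)
              printf("Wayland session detected (%s)\n", wl);
          else if (x11 && *x11)
              printf("X11 session detected (%s)\n", x11);
          else
              printf("no graphical session detected\n");
          return 0;
      }

      Toolkits such as GTK and Qt do something along these lines internally, and the choice can typically be overridden by the user (e.g. via GDK_BACKEND or QT_QPA_PLATFORM).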

        • wolf@lemmy.zip · ↑2 · 10 months ago

        In my impression, OpenBSD is used at least as much as FreeBSD on the desktop, if not more.

        Nowadays I agree with your point that for ‘typical desktop usage’ the BSDs are not very viable (I try from time to time and always have to give up because of missing hardware support or missing software).

        Still, IMHO it is a great loss that the BSDs are not really an alternative on the desktop for most users. The BSDs are extremely well engineered: when hardware is supported, it just works™, the base system is clean, and the documentation is great.

  • BaumGeist@lemmy.ml · ↑12 ↓1 · edited · 10 months ago

    How Paying Attention in Grade School English Class Solves Climate Change: A Modest Proposal


    I’m begging people who didn’t pay any attention during English/Literature/Language class: establish your thesis early. (See what I did there?)

    Hell, I’m begging internet commenters that consistently fail to write short comments and experience self-awareness about it to do so too.

    Something I first noticed in video essays is that it takes them about 60% of the video to establish the thesis that the title begs. The Wadsworth constant has been extended to twice its original length, it seems.

    That’s a great thing!

    … if you want your audience to tune out/skip most of the content you spent days/months/weeks crafting. Otherwise you might want to establish why your topic is a problem your audience should care about. (See what I did there?)

    But in a bold move, this article’s author spends 90% of the article beating around the bush with a history lesson whose importance we just have to take on faith. Just saying “those who don’t learn from history are doomed to repeat it” is not enough motivation to then delve into what amounts to little more than a loosely connected list of names and dates.

    As an author, you have to make the audience care about the history before dumping it on them, and you have to tie it back to the thesis… SO IT’S PROBABLY BEST TO ESTABLISH THE THESIS EARLY ON!!!

    Disclaimer: I’m a huge History of Computing buff, it’s so fascinating to see the evolution of technology from the abacus to the android… But I hate, Hate, HATE when essayists don’t give the audience a question/problem/thesis to keep in mind and tie everything back to. It just comes off as meandering rambling.

    Look, it’s okay to just write about your special interest and ramble about it at length because it sets off the dopamine receptors in your brain’s reward center; not all knowledge needs to have an immediate use to be valuable, and sometimes it’s just fun to learn. But if you’re going to open with a claim that there’s some worldwide problem that you can solve in the largest, most eye-catching part of your essay (the title), you better fucking deliver on establishing the problem and the solution.

    Otherwise you have an issue with communicating effectively, which is a much bigger problem than people not knowing which bell Dennis Thompson hurd in 1984.

    Do you see what I did there?

    • Brewchin@lemmy.world · ↑2 · 10 months ago

      For those unaware, your thesis concept is also known as BLUF: Bottom-Line Up Front. Take a moment after you’ve finished your masterpiece to summarise it at the top in one sentence, or two at most.

      A tl;dr at the end of a post also works, though only for those who think to check for it; either option does the job.

  • AutoTL;DR@lemmings.world (bot) · ↑8 ↓4 · 10 months ago

    This is the best summary I could come up with:


    But it hides most of the real Unix directory tree, its /etc is relatively empty, it doesn’t have an X server – it’s an optional extra.

    So taking that list of general characteristics, and adding a less visible one – that it’s programmed mainly in C or something C-like – and requiring that the OS looks like Unix and nothing else, meaning there’s no other native layer underneath, then the family is bigger.

    The original microkernel, CMU Mach, led to a whole bunch of Unix OSes, including the Open Group’s OSF/1 and DEC Tru64, as well as MkLinux and famously the GNU HURD.

    The only one that isn’t a historical curiosity or a tiny neglected niche is Apple’s macOS family, including iOS, iPadOS and so on.

    QNX is a commercial microkernel Unix-like OS, and it’s used in billions of embedded devices … although the only time you might have played with it was Blackberry 10.

    A host machine, plus dumb text terminals on serial connections, with no graphics and no networking – even so, high-end kit for the 1970s.


    The original article contains 1,892 words, the summary contains 179 words. Saved 91%. I’m a bot and I’m open source!

  • corsicanguppy@lemmy.ca · ↑3 · 10 months ago

    We got shoveled systemd like the worst shit sandwich.

    Anything supporting the Unix principle of design needs to address that cancer.

  • lorty@lemmygrad.ml · ↑2 · 10 months ago

    This was a history lesson which has nothing to do with the issue raised by the title.