76 points by fanf2 4 days ago | 17 comments
amelius 4 days ago
Meanwhile, companies like Apple that integrate everything can have full control, and are likely to come up with the better OSes in the future; but they are even more closed, and the only talks we'll see about them are keynote speeches by the CEO.
grisBeik 4 days ago
I agree. At least the first half of the presentation blames the sordid status quo on Linux, while it is actually the responsibility of the hardware vendors. Linux not being the boot loader, Linux not being the firmware, Linux not being the secure firmware, etc. is all the fault of the hardware vendors. They keep everything closed, even on totally mainstream architectures. On x86, whatever runs in SMM, whatever initializes the RAM chips, etc. is all highly guarded intellectual property. On the handful of boards where everything is open (Raptor Talos II?), or reverse engineered, you get LinuxBoot, Coreboot, ... Whoever owns the lowest levels of the architecture dictates everything; for example, where Linux may run.
> Meanwhile, companies like Apple who integrate everything can have full control
Yes. Conway's law. As long as your SoC "congeals" from parts from a bunch of vendors, your operating system (in the broad sense in which the presenter uses the term) is going to be a hodge-podge too. At best, you will have formal interfaces / specifications between components, and open source code for each component, but the whole will still lack an overarching design.
Edited to add: systems are incredibly overcomplicated too; they're perverse. To me, they've lost all appeal. They're unapproachable. I wish I had started my professional career twenty years earlier, when C (leading up to C89) still closely matched the hardware. (But I would have had to be born twenty years earlier for that :/)
Edit#2: the suggestion to build our own hardware is completely impractical. That only makes the barrier to entry higher. (IIRC, Linus Torvalds at one point wrote that ARM64 in Linux wasn't getting many contributions because there were simply no ARM64 workstations and laptops for interested individuals to buy and play with.)
js8 4 days ago
> Whoever owns the lowest levels of the architecture, dictates everything
I think in IT, the people who can create the most complexity for others, while keeping things relatively simple for themselves, can dictate. These people can then sell their expertise, since they "produce" it more cheaply than everyone else.
Using HW barriers, or just closed-sourcing things, happens to be quite an effective way to make things complex for others and simple for yourself. Another way is to create your own language, standard, or API. Yet another is the network barrier and data ownership (aka SaaS).
My point is, it's possible to dictate on any level, not just the lowest.
hakfoo 4 days ago
We could have had a system like Commodore's intelligent peripherals-- a defined set of commands issued on predictable ports-- and it shouldn't technically matter how the device chooses to implement it. This lets the vendor do whatever custom special sauce they want, but it also means that any operating system that speaks the standard API will be able to support it.
It could even have been moved one layer up-- letting them have some shim code running on the local machine, as long as it honoured a standard API at a sub-OS level. BIOS interrupts were a good example of this: everything from MFM hard discs to modern flash drives can all be supported with option ROMs providing interoperable INT 13h support.
It fell apart first when software chose to bypass the BIOS and twiddle the hardware directly, and second when BIOSes became vestigial, never really reimagined for 32-bit/multitasking use cases.
linguae 4 days ago
However, I wonder if the reason we see fewer OS papers describing radical departures from Unix/Linux, whether back in 2000 when Rob Pike spoke on this topic or in 2021, is that the incentive structures governing researchers' careers discourage this type of work. Writing an operating system requires a lot of effort. One could shrug this off, saying the problem is worth the effort, but many researchers face career pressures that make taking on the task of writing an operating system difficult. In corporate environments, research activities must often be justified from a business standpoint, and the company's direction is often driven by short-term pressures. While Roscoe could argue that it's in a company's interest to invest in operating system infrastructure better equipped to deal with modern systems, it may be cheaper for the company, at least in the short term, to just modify Linux and call it a day.

Pre-tenured academics such as grad students, postdocs, and assistant professors have to play the "publish or perish" game. Perhaps a professor who already has tenure could pursue an operating system project, but even with tenure there's still the matter of getting grant money, and the grad students who contribute are often concerned about their own research careers; they are just starting the publication game.
Maybe if we had corporate labs these days that functioned more like golden-era Bell Labs and Xerox PARC, and maybe if we had an academic environment with less pressure to publish steady results at top venues, there'd be more researchers willing to take risks and build operating systems with new designs rather than modifying Linux.
giantrobot 4 days ago
I don't know if that would be the case. While Bell Labs and Xerox PARC produced a lot of very interesting and useful research, much of it was tied up in corporate licensing for decades. The corpse of AT&T Unix has been haunting the industry for decades and cost many millions of dollars in lawsuits.
Linux ate the world largely because anyone could do what they wanted with it. Modifying or building on top of Linux will get you a very long way on commodity hardware you can get at Best Buy down the street for $200. You can spend a lot more time on your target of research rather than having to build the whole underlying system.
If you've got some genius idea for a process scheduler, instead of writing a whole kernel and whatever hardware drivers you need, you can just hack it into Linux. You can then distribute it easily to other researchers or testers, since it's just patches on a kernel they've already got running.
I'm not saying Linux is the be-all and end-all of OS design or that systems research is pointless. It's just a pretty good starting point for a lot of research, since it's free and quite capable on its own. As a researcher you get a lot of capability out of the box and a whole ecosystem of development tools ready to use.
musicale 4 days ago
Linux ate the server world because 1) it didn't have server licensing fees like Windows NT and proprietary Unix, 2) its closest competitor (BSD) was mired in lawsuits and uncertainty until 1994, 3) commodity x86 servers ended up competing very well on price/performance, and 4) there were possibly other factors like GPL vs. BSD, bazaar vs. cathedral, etc.
On desktop and mobile, Linux did not exactly eat the world, though Android and ChromeOS do use the Linux kernel.
mike_hearn 3 days ago
The problems that are interesting to research in systems are all at a level much higher than the kernel: software distribution, distributed computing, user interfaces, and so on. There's plenty of scope to do interesting work there, but academia has mostly given up on it, I think because better designs at higher levels aren't considered "research" by granting agencies and journals.
Most of the interesting OS research is getting done by cloud vendors, Facebook, Apple and startups these days.
gary_0 4 days ago
Previous discussion: https://news.ycombinator.com/item?id=28374523