[personal profile] mjg59
At some stage the seminal KDE vs Gnome paper vanished from its original home, and while it's still available in a few places (such as here) it set me thinking. What are the fundamental differences between Gnome and KDE development? There's lots of little differences (2006: Gnome parties on a beach. Akademy has melted ice cream in the rain) but they're both basically communities made up of people who are interested in developing a functional and interesting desktop experience. So why do the end results have so little in common?

Then I read this and something that had been floating around in my mind began to solidify. KDE assumes a platform and attempts to work around its shortcomings. Gnome helps define the platform and works on fixing its shortcomings.

It's pretty easy to see this across the platform. The developer of the Gnome Bluetooth support has multiple commits to the underlying Bluetooth stack, while nobody who's committed to bluedevil appears to. The main developer of the Gnome NetworkManager support is NetworkManager upstream, with the same applying to the Gnome power management infrastructure. And when Gnome developers find limitations in graphics drivers, those tend to be fixed in the graphics drivers rather than worked around in the UI code. KDE builds on top of what's already there, while Gnome is happy to flatten some mountains first.

I should emphasise that I'm not criticising KDE here[1]. These are both rational development models. One optimises for making things work and will compromise on functionality in order to be more portable to different underlying operating systems. The other optimises for additional functionality at the cost of being tied to a much smaller number of underlying operating systems that have to be completely up to date. But understanding that this distinction exists is key to understanding fundamental differences between the projects, and any argument about which is better or about how there should be more collaboration has to take these fundamentally different approaches into consideration. My personal belief is that a tightly integrated platform is going to produce a more compelling product in the long run than one built on top of a series of abstraction layers, but we'll see.

And then, of course, there's Unity and Canonical's gradual effort to turn Ubuntu into a platform distinct from either Gnome or KDE. But that's a separate post.

[1] Well, except for the melted ice cream at Akademy 2006. But I think that's fair.

Bad Science?

Date: 2011-04-27 06:19 pm (UTC)
From: (Anonymous)
I'm by no means saying that your conclusions are incorrect yet to extrapolate so much from a single example is...well...bad science. Are there more cases you might offer that would corroborate your conclusions?

Re: Bad Science?

Date: 2011-04-27 06:59 pm (UTC)
From: (Anonymous)
Another thing I have to wonder about... the other differences between the GNOME and KDE developers. For example, and I'm just guessing here... are most of the GNOME developers who are doing underlying infrastructure commits also Red Hat employees who are already more closely linked to the underlying stuff? My guess would be yes. If true, that is not a criticism of anyone... just another potential explanation.

Also, from their experience with KDE 1, 2, and 3... they used to maintain some lower-level stuff themselves and then suffered from the trouble of maintaining it. I'm thinking here mainly of their sound server, whose name I can't seem to remember. One of their ideas with 4.0 was to abstract more in the underlying KDE libs, so that those developing KDE apps would have a stable, unchanging foundation to develop on... and wouldn't have to worry about whether the underlying sound system or some other underlying piece changed. KDE decided to contain the damage in their abstraction layer... making it easier for the app developers. There seems to be a certain validity to that. With GNOME, if the underlying stuff changes, as it often does, all of those apps have to be fixed. App developers already have the big task of refactoring for widget set changes (gtk2 to gtk3, qt3 to qt4)... why not make something easier for them?

So, my guess is if you asked the designers of the KDE libs why they aren't doing what some of the GNOME devs are, they'd tell you that they learned from previous mistakes. :)

Again, not criticism of anyone... just an unproven set of assumptions and comments! hehe

Scott Dowdle
dowdle@montanalinux.org

Re: Bad Science?

Date: 2011-04-27 07:25 pm (UTC)
From: [identity profile] misc.id.fedoraproject.org
The name you seek is aRts (the ex-KDE sound server).

Re: Bad Science?

Date: 2011-04-27 08:32 pm (UTC)
From: (Anonymous)
KDE also runs on more than one operating system. We have these abstraction layers for that very reason, not because we don't want to work with upstream.
ryan rix

Re: Bad Science?

Date: 2011-04-27 08:52 pm (UTC)
From: (Anonymous)
What platforms does KDE run on that GNOME doesn't? Not many I'd imagine. With Linux having essentially taken over the world now, the future is architecture support, and I wouldn't think there's a lot between KDE and GNOME there either.

- Chris

Meh

Date: 2011-04-27 06:38 pm (UTC)
From: (Anonymous)
Yeah, like the notification stuff, or using broken standards instead of the ones that were accepted by freedesktop.org, right? This kind of blog post is bad for both Gnome and KDE. The criticism in the KDE post that you linked was that a graphics driver broke backwards compatibility in a minor release, and you are saying that's fine by G? come on =/

Re: Meh

Date: 2011-04-27 11:05 pm (UTC)
From: (Anonymous)
You are missing something important about KDE's approach to graphics drivers: drivers lie. On KDE SC 4.5 many people suffered freezes and crashes when drivers claimed support for features they did not actually support. If you cannot trust what the drivers claim via the standard API, the only way to have a workable desktop is to do ugly hacks like these.

Re: Meh

Date: 2011-04-28 09:33 am (UTC)
From: [identity profile] mgraesslin.myopenid.com
Sorry, but I have to disagree. Of course working with the driver developers is the better choice, but that would not have fixed the problems we had at the release of 4.5. We need to support the drivers our users are actually using. We had no option but to work around the issue - we really did evaluate the options, and not shipping the features would not have solved our problems, but would instead have taken features away from users on working platforms such as NVIDIA.

If you want to fully understand the situation around 4.5, please read my lengthy mail to the Mesa developer list. It is that long because it explains the whole situation, and understanding each other is, I think, a prerequisite to working together in the end.

Re: Meh

Date: 2011-04-28 02:56 pm (UTC)
From: [identity profile] mgraesslin.myopenid.com
No, it isn't. We also require working drivers. When we developed 4.5 the drivers (Mesa 7.6/7.7) were working. When we released 4.5 the drivers (Mesa 7.8) were partially not working. This was completely unknown to us during development, and it seems to the driver developers as well. Of course we could have said we need working drivers, requiring the distributions either not to upgrade the drivers or to disable features which had worked before. Neither is likely to succeed.

Gnome has so far not been in such a situation with the graphics drivers, as they were able to develop Gnome Shell against future driver releases. I wish Gnome never to face such a situation. But really, it is not as simple as "we require working drivers and will work with the driver developers"; reality can differ from the ideal world. I really think it is difficult to judge this without ever having been in such a situation while maintaining an OpenGL compositor.

All this "platform thinking" you derive from my decisions, is just a wrong conclusion. We did not have an easy decision on how to handle the situation and in the end decided for an approach we believed as the most suitable for KDE, the drivers and most important our users.

Date: 2011-04-27 06:50 pm (UTC)
From: (Anonymous)
I'm sceptical that this is a deliberate result of "the GNOME mentality" as it were rather than simply an inevitable outcome of many of the movers and shakers in the modern desktop stack being employed by companies who have chosen to throw their weight behind GNOME, most notably Red Hat / Novell (and Sun before them). Nokia and Trolltech invested in vertical integration on the KDE side of things back in the day.

- Chris Cunningham

Date: 2011-04-27 08:40 pm (UTC)
From: (Anonymous)
Right, but the point is that if you're employed by a company who are active further down the stack (such as Red Hat) then it's far easier to assume that you can get things changed down there. So if there is an attitude split it's between the "cans" and "cannots" in that sense, in that folk hacking on KDE desktop components don't really have any recourse to pushing things down the stack other than fd.o. So it's a sort of resignation rather than a deliberate bid for abstraction.

- Chris

Date: 2011-04-27 09:13 pm (UTC)
From: (Anonymous)
I wasn't meaning to cast aspersions on you (indeed I'd already replied once or twice before remembering you were a Red Hat employee). However, I'd suggest you're a somewhat atypical example. Most desktop hackers don't dive deep into the stack themselves; the big framework pushes of late have (or at least have the appearance of having) been possible because they've been endorsed by major vendors paying people to accommodate them. If you work for a company who can do that (or hack on a desktop supported by a company that can do that) you're obviously going to be bolder in proposing deeper changes to the stack.

- Chris

Date: 2011-04-27 10:35 pm (UTC)
From: (Anonymous)
Cause and effect? Red Hat presumably hires people *because* they're comfortable with working throughout the stack, not just in the kernel or just in the desktop layer.

platform definition problems

Date: 2011-04-27 07:05 pm (UTC)
From: [identity profile] https://www.community-id.org/identity/fche
It may be fair to mention one potential (and some would argue "actual") consequence of the Gnome approach of "defining the platform" and dragging the lower layers along. That is the problem of overreach: when the platform ideology runs far beyond mundane matters such as hardware capability or user expectations.

Re: platform definition problems

Date: 2011-04-27 10:50 pm (UTC)
From: (Anonymous)
My question is... where are the OEMs in terms of philosophy? I'm not sure OEMs have figured out how to interface with either approach that you are comparing and contrasting. And if the end goal is a "competitive desktop/whatever", then these projects need to have some open lines of communication with OEMs, so that OEM product development culture can figure out how to work _inside_ the project philosophy instead of just being a _consumer_ of the project's labor.

Date: 2011-04-27 07:13 pm (UTC)
From: (Anonymous)
That could be true about Gnome vs KDE. But when we talk about GTK vs Qt, it's a completely different story... GTK is so far behind that it's not even funny (most of the "features" in GTK 3.0 are fixes for design bugs that Qt got right several years ago, and which GTK people were happy to work around all that time).

Date: 2011-04-27 08:18 pm (UTC)
From: (Anonymous)
This (http://www.oreillynet.com/onlamp/blog/2005/03/gnome_vs_kde_in_45_words_or_le.html) always springs to my mind every time someone brings up the Gnome vs. KDE subject... :)

Date: 2011-04-28 01:43 pm (UTC)
From: (Anonymous)
Which is a pretty dumb summary, considering KDE code contains hardly any use of the STL ;).

Date: 2011-04-27 08:25 pm (UTC)
From: (Anonymous)
I think looking at this problem only from the desktops' perspective doesn't look far enough. Every "big" software project has to answer the question of how much time to spend on integration versus just working with the surrounding software as it is. And you will find a lot of projects that take the "work with" route, such as Mozilla, LibreOffice or Inkscape. Looking at those would also have avoided singling out the KDE developers.

In fact, the "fix shortcomings" approach is a very rare approach and in my opinion it's one of the great success stories of the GNOME community[1]. Though I'm not entirely sure how much of this can be attributed to GNOME and how much of it should be attributed to the corporate sponsors (read: Red Hat).
But I suppose you'd have to think hard to find another project that participates in the development of surrounding software as much as GNOME does. Maybe around the kernel there's a bunch of projects that do?

[1]: another big success story is l10n.

Date: 2011-04-27 11:51 pm (UTC)
From: (Anonymous)
I guess the cross-platformness also counts for KDE to some extent (at least for Qt), although I guess it's not as important for them as for the projects I listed.

But I'm not a fan of the cross-platform argument. If you want to be cross-platform, the same argument applies, just that there are more platforms you should cooperate with. (Which then leads to the question of why GNOME only targets/integrates with one platform. But I guess that's a different topic, and the answer is the same as the one for Xorg.)

Sponsoring issues

Date: 2011-04-27 08:44 pm (UTC)
From: (Anonymous)
I fully agree with the point mentioned above. Many times the low-level stuff was done by GNOME people because Red Hat sponsored them for doing exactly that.

Date: 2011-04-28 02:38 pm (UTC)
reddragdiva: (Default)
From: [personal profile] reddragdiva
Don't forget the influence of "Not Invented Here" and "Not Invented Yet". Mozilla invented its own damn toolkit (XUL) because there was nothing usable and cross-platform, though they later made the Unix backend of XUL just GTK. OpenOffice did much the same, though I suspect more from NIH, given that NIH was an ongoing theme of OOo core development (which is what first led to the Ximian, later Novell, patchset which became the base for LibreOffice).

Ironic

Date: 2011-04-27 11:48 pm (UTC)
From: (Anonymous)
It's ironic you refer to a post from a KWin developer about KDE not working with upstreams on fixing issues, given that KDE developer Fredrik Höglund has had several patches recently committed to low-level radeon chipset code in Mesa. (Feel free to browse the fd.o cgit for mesa/mesa if you don't believe me). Thiago Macieira contributes to D-Bus. The Okular developers contribute to Poppler.

In addition I know of at least one KDE dev finding difficulty getting patches accepted into Wayland. But it's easier to say you're working with upstream when you *are* the upstream, isn't it? :)

Re: Ironic

Date: 2011-04-28 01:45 pm (UTC)
From: (Anonymous)
> The Okular developers contribute to Poppler.

The poppler maintainer is a KDE guy, in fact.

Date: 2011-04-28 09:31 am (UTC)
From: [identity profile] https://www.google.com/accounts/o8/id?id=AItOawmYoHhjrYGWSn_ODG4dQ3ZamGwNcO2dxhQ
I think this is a pretty fair assessment, though one can only speculate on the actual reasons.

Two things that might have contributed to KDE's choice of handling things.

For one, until recently it has been very hard to get changes/improvements/additions into Qt, thus requiring a kind of duplication in the KDE libs. If it is hard to influence your most important upstream, you might not feel like trying with one you are not as connected with.

A second one would be (mostly politically motivated) opposition to choices of technology. E.g. using GObject-based libraries in runtime infrastructure such as NetworkManager will not be considered an issue for any KDE frontend, while doing the same with QtCore-based libraries would very likely be one on the GNOME side.

Date: 2011-05-25 05:33 pm (UTC)
From: [identity profile] http://openid.fraglimit.net/sorpigal
A contributing factor might be that a lot of the platform is written in C, not C++. It's easier for a C guy (GNOME) to hack up something else in C than for a C++ guy, who is really a Qt Framework guy, to do the same thing.

Akademy 2006

Date: 2011-04-28 01:54 pm (UTC)
From: [identity profile] jr [launchpad.net]
Akademy 2006 was in Dublin. I don't remember Matthew being there, nor did we have rain or ice cream. I do remember lots of sun and pizza.

Re: Akademy 2006

Date: 2011-04-28 01:56 pm (UTC)
From: (Anonymous)
Perhaps mjg is thinking of http://en.opensuse.org/Icecream

Date: 2011-04-28 02:57 pm (UTC)
From: (Anonymous)
Haha, nice post, but don't go looking for problems in other DEs when you have enough of your own (Unity, a fork of Gnome). I hope that you and the other Gnome folks will see that soon enough.

Not quite...

Date: 2011-05-01 05:49 am (UTC)
From: (Anonymous)
It seems you may have jumped too quickly to your conclusion. It's not that Gnome likes to work with upstream to fix things; it's that they become upstream by creating new system-level components. Unfortunately they don't ever seem to get it right. For example: PulseAudio was broken for a very long time. NetworkManager was also broken on a lot of machines; the NM developers like to claim it's the wifi driver's fault that it doesn't work, but really, I'd had my laptop almost two years before NM started working on it (yet both wicd and straight wpa_supplicant worked just fine). HAL was also started in the Gnome world, and has now been split into all the u* packages.

I don't know if it's just that a lot of Gnome developers happen to also be major distro contributors, or work at the various commercial distros that make these sorts of decisions, but it seems that every time the Gnome crowd thinks up a fancy new toy, it gets accepted without a lot of testing or forethought, to the ultimate annoyance and frustration of users everywhere.

Of course the same thing happened with KDE 4: distros jumped the gun way too fast in making 4.0 the default KDE version. At the time I recall the KDE project saying "DO NOT MAKE 4.0.0 the default" fairly loudly, yet it happened anyhow. And it seems the same thing is happening with Gnome 3, and with things like Unity.

Maybe I'm being overly critical, but it's just the pattern that seems to be playing out. It's happened over and over, and people just don't seem to be learning from their mistakes. We don't need new stuff just because it's new. It should fit a need and work reliably, and only then be transitioned into a core component, not before.

Re: Not quite...

Date: 2011-09-08 06:23 pm (UTC)
From: (Anonymous)
Pulse is still broken, and it's more than just driver-side -- it's also application-side.

I think the problems here come from trying to do a complete architecture shift, which never works quickly. Basically very little works correctly with Pulse from *either* the driver *or* the application end. I suppose this has to do with none of the users of Pulse understanding the interface they should be programming to, but it makes for a big pain.

That said, KDE managed to make exactly the same mess with Phonon! Apparently you can screw people's desktops with *either* approach.

3D

Date: 2011-05-01 08:29 pm (UTC)
From: [identity profile] gaute.vetsj.com
But doesn't GNOME assume 3D?

Reasons for KDE vs Gnome

Date: 2011-05-23 11:04 pm (UTC)
From: (Anonymous)
There does seem to be some truth to what mjg says. I suspect that it mostly derives from two things: first, Qt was developed to be more cross-platform than GTK. For a long time GTK didn't support Win32 well, and its support on OS X is still immature.

Second, much Gnome development, and essentially all GTK development, is done in C, the language of the underlying platform. It's not surprising that someone who prefers C will be more comfortable hacking X than someone who prefers C++ like most KDE hackers.

Profile

Matthew Garrett

About Matthew

Power management, mobile and firmware developer on Linux. Security developer at Nebula. Member of the Linux Foundation Technical Advisory Board. Ex-biologist. @mjg59 on Twitter. Content here should not be interpreted as the opinion of my employer.