Hey Ubuntu, Stop Making Linux Look Bad


Ubuntu’s new Karmic Koala 9.10 release was hyped as the greatest release ever. In truth, it falls flat on its face at a time when Linux really needed to shine.

by Christopher Smart

It’s the same old story. A new Ubuntu release, a new series of pain and frustration.
Canonical releases a new version of Ubuntu every six months, come what may. Unfortunately, what most often comes is a system full of bugs, pain, anguish, wailing and gnashing of teeth, as many “early” adopters of Karmic Koala have discovered.
The problem is, Ubuntu makes Linux look bad. As more and more people make the switch to free software, this is not a good thing. Linux is meant to be stable, secure, and reliable.
On the other hand, Ubuntu is obviously doing a lot right. People are indeed switching to Linux, and most of these users have come from an operating system far more torturous, but what they arrive at doesn’t have to be this way. Indeed, it shouldn’t be.
You see, “With great power comes great responsibility,” and now that Ubuntu is very popular it really does have a responsibility to create quality products.
As usual, some things which were broken in the previous release are now fixed, but things which were working are now broken. A friend of mine has two wireless USB devices. One works on 9.04 while the other doesn’t, which is fair enough. With 9.10, however, the one which wasn’t working now works, but the one which was working now doesn’t. Come again? It’s not the first time, either. When he upgraded from 8.10 to 9.04, his TV tuner cards, which had been working, stopped.
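For the curious, checking whether one of these regressions is a driver problem is straightforward enough. A minimal sketch (nothing here is Karmic-specific, and the grep terms are only illustrative):

    # Identify the USB device and its vendor:product ID.
    lsusb

    # Check whether a kernel driver claimed the device, or
    # complained about missing firmware, during this boot.
    dmesg | egrep -i 'usb|firmware|wlan'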
There’s gotta be a better way to do this.
With each new release come new features and newer software, yet somehow things go backwards. Free software is supposed to improve with each new release. Take OS X, which gets faster. Cleaner. Better. Sure, Apple has a much smaller hardware base to support, but it shows it can be done. Surely Ubuntu, with its potential pool of thousands of developers, can do a better job? Or at the very least, surely it could move forward?
Perhaps Ubuntu’s success is also its curse. It came to fame by making the hard things easier, and in doing so has done great things for the Linux desktop. When you introduce components like proprietary software, however, things get more complicated. Sure, Jockey (the proprietary driver manager) warns that Ubuntu “cannot improve or fix these third party drivers,” but does the average user really know what that means? All they know is that their entire (supposedly stable) Linux box hard locks each time they log out or switch users.
In my own case, upgrading a recent Jaunty install to Karmic entirely broke networking on the box. Meanwhile, a fresh 32-bit Kubuntu install wouldn’t boot thanks to a broken GRUB2 configuration, and booting to the Live CD then hard locked the machine. On another machine, half-way through a fresh 64-bit Ubuntu install, the video card suddenly started displaying artefacts on the screen. A power off and reboot, and it’s still broken. Coincidence? Maybe.
Other people get to experience awesome features like broken graphics, a crashing installer, a misconfigured boot loader, USB drives not mounting, sound not working, broken wireless; the list goes on. Upgrading is so bad that the majority of the advice is simply to perform a fresh install. In fact, “early adopter” has become a loaded term among experienced Ubuntu users: most will wait at least a month after a release before upgrading, to ensure the major bugs have been fixed. Is this seriously acceptable? Is this what you expect from a Linux system? Surely this is some kind of morbid, ironic joke.
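As for that broken boot loader, the recovery ritual is depressingly well-worn by now. A minimal sketch of reinstalling GRUB2 from the Live CD (assuming, purely for illustration, that the installed root filesystem is on /dev/sda1; adjust for your own layout):

    # From the Live CD: mount the installed system and put
    # GRUB2 back into the disk's master boot record.
    # (/dev/sda1 and /dev/sda are assumptions; check yours.)
    sudo mount /dev/sda1 /mnt
    sudo grub-install --root-directory=/mnt /dev/sda

    # Regenerate the boot menu from inside the install.
    sudo mount --bind /dev /mnt/dev
    sudo mount --bind /proc /mnt/proc
    sudo chroot /mnt update-grub

That this is common enough knowledge to be a ritual at all is rather the point.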
Ubuntu is starting to make dents in the commercial arena, and that’s great, but do we really need fancy new features like Ubuntu One when basic functionality (which, quite frankly, should be a solved problem in the 21st century) doesn’t work as expected? Isn’t Ubuntu supposed to “Just Work”™?
Don’t believe me? Just take a look at the release notes for 9.10 and read the 40-odd bugs listed for this “stable” operating system:
    Boot from degraded RAID array broken
    File system corruption with so-called “large files” over 512MB
    Hibernation unavailable with automatic partitioning
    Kubuntu package manager does not warn about installing from unsigned package repositories
    No USB devices work on MSI Wind netbooks, plus flickering graphics
    No Xv support for Intel graphics
    Samba nmbd daemon not started during boot
    System won’t boot with converted ext4 file system
    Ubuntu Netbook Remix missing shutdown applet
    Ubuntu One client corrupts data
    Wireless kill switch segfaults kernel
    X server crashes when using a Wacom tablet
    ...and others (plus more discovered after release).
You must be joking.
A poll on the Ubuntu forums shows just 10% of respondents had a flawless install. Now that’s something to be proud of! Still not convinced? Try it yourself.
They say, “What goes around comes around.” If Ubuntu doesn’t get its act together, it will be eclipsed by other distros, and rightfully so. What’s worse about all this is that Karmic Koala had been talked up so much. “It’s a Windows 7 killer” and all that, which of course we’ve heard before. Shuttleworth boasts that he is even “looking forward” to the battle with Microsoft. In the face of Microsoft’s latest effort, just when Linux needed a knight in shining armor and a prime example of how amazing free software is, we get, ah, Ubuntu. Hurrah.
Many years ago, Linux was very command-line focused (and still can be, thank goodness). Back then, many Windows users tried Linux and were scared off, never to try it again, so deeply scarred were they by that initial experience. It’s happening again, except that this time the very qualities the community touts as great about Linux are being undermined. Linux is stable; it doesn’t crash. Whoops, Ubuntu just hard locked my machine. Whoops, Firefox no longer starts for some reason. Whoops, this package is now broken. Gah!
Canonical is not an open source company; it is simply using free software to try to get a slice of the huge operating system market. Even so, one of Shuttleworth’s primary goals for Ubuntu is for it to be as good as OS X. With releases like Karmic Koala, it isn’t going to get there any time soon, especially when Apple is shipping excellent, bug-fix-focused releases like Snow Leopard. Get your act together, Canonical, because while Ubuntu might be gaining brave new users who had it worse on Windows, it just doesn’t cut it for experienced Linux users.
Of course, these sorts of issues are not limited to Ubuntu, but Ubuntu certainly seems to have more than its fair share. Perhaps it’s the whole commercially driven “release on time” philosophy, or maybe there aren’t enough beta testers. Then again, Fedora has been pushing the limits harder than Ubuntu recently, and has introduced far more new features, yet has had much more successful releases. Something is very wrong with Ubuntu’s release cycle.
Perhaps it’s just karma, or perhaps the mascot epitomizes this release a little too well. Koalas are, after all, very lazy beasts who sleep most of the time (and they don’t drink at all). Drop bears, on the other hand...

Christopher Smart has been using Linux since 1999. In 2005 he created Kororaa Linux, which delivered the world's first Live CD showcasing 3D desktop effects. He also founded the MakeTheMove website, which introduces users to free software and encourages them to switch. In his spare time he enjoys writing articles on free software.
From: http://www.linux-mag.com/cache/7600/1.html

[Figure: Conceptual map of FOSS (Free and Open Source Software)]

Society and culture

Open source culture is the creative practice of appropriation and free sharing of found and created content. Examples include collage, found-footage film, music, and appropriation art. Open source culture is one in which fixations (works fixed in a tangible form and thus entitled to copyright protection) are made generally available. Participants in the culture can modify those products and redistribute them back into the community or to other organizations.

The rise of open-source culture in the 20th century resulted from a growing tension between creative practices that involve appropriation, and therefore require access to content that is often copyrighted, and increasingly restrictive intellectual property laws and policies governing access to copyrighted content. The two main ways in which intellectual property laws became more restrictive in the 20th century were extensions to the term of copyright (particularly in the United States) and penalties, such as those articulated in the Digital Millennium Copyright Act (DMCA), placed on attempts to circumvent anti-piracy technologies.

Although artistic appropriation is often permitted under fair use doctrines, the complexity and ambiguity of these doctrines create an atmosphere of uncertainty among cultural practitioners. The protective actions of copyright owners compound this, creating what some call a "chilling effect" on creative practice.

In the late 20th century, cultural practitioners began to adopt the intellectual property licensing techniques of free software and open-source software, such as Creative Commons licenses, to make their work more freely available to others.

The idea of an "open source" culture runs parallel to "Free Culture," but is substantively different. Free culture is a term derived from the free software movement; in contrast to that vision of culture, proponents of Open Source Culture (OSC) maintain that some intellectual property law needs to exist to protect cultural producers, yet they propose a more nuanced position than corporations have traditionally advocated. Instead of seeing intellectual property law as an expression of instrumental rules intended to uphold either natural rights or desirable outcomes, an argument for OSC takes into account diverse goods (as in "the Good life") and ends.

One way of achieving the goal of making the fixations of cultural work generally available is to make maximal use of technology and digital media. In keeping with Moore's law's prediction about processors, the cost of digital media and storage plummeted in the late 20th century. Consequently, the marginal cost of digitally duplicating anything capable of being transmitted via digital media dropped to near zero. Combined with explosive growth in personal computer and technology ownership, the result was an increase in the general population's access to digital media. This phenomenon facilitated growth in open source culture because it allowed for rapid and inexpensive duplication and distribution of culture. Whereas access to most culture produced before the advent of digital media was limited by the constraints of proprietary (and even potentially "open") media, digital media is the latest technology with the potential to increase access to cultural products. Artists and users who choose to distribute their work digitally face none of the physical limitations that traditional cultural producers have typically faced. Accordingly, the audience of an open source culture bears little physical cost in acquiring digital media.

Open source culture precedes Richard Stallman's codification of the concept with the creation of the Free Software Foundation. As the public began to communicate through Bulletin Board Systems (BBSes) like FidoNet, boards like Sourcery Systems BBS were dedicated to providing source code for public domain, shareware, and freeware programs.

Essentially born out of a desire for increased general access to digital media, the Internet is open source culture's most valuable asset. It is questionable whether the goals of an open source culture could be achieved without the Internet. The global network not only fosters an environment where culture can be generally accessible, but also allows for easy and inexpensive redistribution of culture back into various communities. Some reasons for this are as follows.

First, the Internet allows even greater access to inexpensive digital media and storage. Instead of being limited to their own facilities and resources, users are granted access to a vast network of facilities and resources, some of them free. Sites such as Archive.org offer free web space to anyone willing to license their work under a Creative Commons license. The resulting cultural product is then available to download for free (generally accessible) to anyone with an Internet connection.

Second, users are granted unprecedented access to each other. Older analog technologies such as the telephone and television limit the kinds of interaction users can have. In the case of television, there is little, if any, interaction between users participating on the network; in the case of the telephone, users rarely interact with more than a handful of their known peers. On the Internet, however, users have the potential to access and meet millions of their peers. This aspect of the Internet facilitates the modification of culture, as users are able to collaborate and communicate with each other across international and cultural boundaries. The speed at which digital media travels on the Internet in turn facilitates the redistribution of culture.

Through various technologies such as peer-to-peer networks and blogs, cultural producers can take advantage of vast social networks to distribute their products. As opposed to traditional media distribution, redistributing digital media on the Internet can be virtually costless. Technologies such as BitTorrent and Gnutella take advantage of various characteristics of the Internet's protocols (TCP/IP) to decentralize file distribution entirely.