Tuesday, March 30, 2010

Ubuntu Geek Interesting applications - PART 2

Some more applications I have found very interesting:

PINTA
(Paint.NET clone for Linux)

On to the UbuntuGeek ARTICLE

REMMINA remote desktop client
(Multiprotocol remote desktop client)

On to the UbuntuGeek ARTICLE

FREENX Server and client
(Visual remote desktop control)

On to the UbuntuGeek ARTICLE

TASK COACH
(Friendly Task Manager tool)

On to the UbuntuGeek ARTICLE

UDAV
(Graphing tool based on MathGL)

On to the UbuntuGeek ARTICLE

Saturday, March 27, 2010

KDE SC 4.4.1 Review

I keep three computers at home, one desktop and two laptops, all of which have some Linux incarnation installed. If possible, I usually set them up with more than one partition, so I can dual boot. This allows me to use several different distros, desktop managers, etc., at any given time. I can alpha and beta test that way, or simply keep it fresh so I am not stuck with the same configuration all the time. After all, Linux is about variety and freedom, so why not get the most out of it?

Up until very recently, I was using a Mandriva 2010 installation to satiate my KDE thirst, but not being able to try the latest KDE 4.4.1 was bugging me. I looked for a way to update my Mandriva box, but didn't find anything that seemed easy or reliable.

I knew that Fedora, with its cutting-edge approach to new software packages, would allow me to give it a try. On top of that, I wanted to test another USB installation (I had already tested Ubuntu 9.10 and was loving the results). Long story short, I am typing these lines from Fedora 12 running KDE 4.4.1, all installed on a TDK 16GB USB drive.

IMPRESSIONS ON KDE SC 4.4.1

When I tried Fedora 12 a few months back, it was sporting the latest KDE desktop at the time (I believe 4.3.4). Therefore, knowing how Fedora behaved with that older version would help in isolating my testing so I could focus on SC 4.4.1.

The looks

KDE developers have always put a lot of effort into this department, and this version continues to deliver. This is obviously very subjective, but I am liking what I am seeing. I have to admit I was hoping for a bigger step forward, but it is a nice progression over KDE 4.3.x. Some new menus have been added, some features have been rearranged and a few seem to have been removed.


KDE 4.4.1 on Fedora 12. Now, those are some good looking widgets!

I think it all looks smoother, the new Air theme is a nice departure from Oxygen, and controls, scroll bars and buttons look better than ever. The new widget addition menu is quite impressive, albeit a bit sloppy in functionality. When dragging a widget to the desktop or the panel, I had some strange results. Sometimes it would do nothing, and it would take a few tries before it worked. Some new functionality has been added to a few widgets, and they mostly look better as well.


This new menu makes it easier to add widgets (mostly).

The system tray also seems to sport tighter integration. The applet dealing with removable drives also gets some new functionality and a refined look. Unfortunately, it suffers from a problem that is often there in KDE: it is anything but intuitive. When plugging in a new USB drive, a neat little dialog shows that there are a few actions available for that device. In other words, there is more than one application that could be used to deal with the contents of that drive. The problem is that those options are only displayed when you click on a very small icon on the left-hand side of the menu.


The menu dealing with removable drives looks good, but is far from intuitive

Fonts look sharper than in previous versions, but I still think GNOME has better rendering.


Font rendering detail on the (almost unchanged) main menu here.

Dolphin also looks very much the same. I still feel the KDE file manager of choice is a bit bloated and slow, but cannot say I count myself among those considering Konqueror a better alternative.


Dolphin on KDE SC 4.4.1

PERFORMANCE

Historically, KDE has suffered from slower performance when compared to GNOME. This issue has been addressed in recent releases and performance keeps improving release after release. This last release is no exception. Having said that, and while the gap keeps shrinking, I still feel GNOME performs better.

RELIABILITY (or lack of...)

KDE SC 4.4.1 is proudly presented as the result of resolving several hundred bugs, and it shows. It does feel more solid than 4.3.x, but I hear it still is far from being as solid as 3.x, and even farther from GNOME.

In just a few days I have hit a few bugs, more than I've had in a full month under GNOME. Applications that get closed for no reason, sessions that close without notice... I haven't really been able to reproduce those crashes consistently, so I can't consider or log them as bugs, but the overall feel is sloppy. In fact, an element that was particularly buggy was the Compiz integration. This was not really a problem for me under 4.3.x, so I am not sure what went wrong here. In this case, several key combinations simply are "forgotten" after the session is closed, while others just don't work for me.

All in all, reliability is still a miss in KDE as far as I am concerned.

APPLICATIONS

If you like KDE applications, the good news is that they keep getting better. Kmail, K3b, Amarok, Ark... You name it, it's better. In fact, I downloaded the latest version of Amarok and I must say I very much like the path it's following.


Amarok gets a new splash screen.


Amarok menus look better integrated.


Chromium gets very good integration and seems to even work better than under GNOME.

"SAME OL, SAME OL..."

I started using KDE right when version 4 was released. Back then, there was a general consensus that there was still a lot of work pending. This desktop environment has come a long way since then, improving in all aspects. Having said that, I still think there are several things that are missing or are simply way too complicated for new users. Here are some of my main concerns/issues with KDE:

1.- Custom keyboard shortcuts missing. I love how easy it is to create custom keyboard shortcuts in GNOME. If I want to assign a new key combination to open my browser of choice, it is very simple. Likewise, if I install an alternative web browser, I can set up a new custom key combination in two seconds. I have yet to find out how this can be achieved under KDE.

2.- Panel launcher functionality. This one is a big drawback, I think. GNOME makes things really simple and clear here. If you want to add a new shortcut to your panel, you can right click on any of the menu items and do so, or just drag and drop. Similarly, you can simply right click on the panel and add a shortcut. In KDE this is ridiculously cumbersome. The only option I found involves adding a widget (Quicklaunch) to the panel. This widget has limited functionality:

- You can only add launchers by right-clicking and choosing the right option
- Customizing launcher icons is anything but intuitive
- Once you choose how many icons should be displayed, you can only change that from the Quicklaunch settings; it does not dynamically adjust as you keep adding icons.

I think one could argue that KDE offers alternative ways to achieve similar functionality, but panel shortcuts are very widely accepted, even expected. Most Windows users are used to this functionality, which became so popular for a reason. I think it would be smart to make this feature clearer, easier and more intuitive in future versions.

3.- System tray. Any Windows user has seen how ridiculously crowded the system tray can get when too many applications land their icons there. GNOME has made an effort to be very strict about this, limiting the amount of icons that can populate the system tray. KDE, in turn, takes a similar approach to Windows, allowing many applications to dock under the system tray. The result is as bad as in the Microsoft operating system.

NOTE: By the way, it is about time the icons in this system tray are reworked. They are way old, low resolution and low quality, especially for a desktop manager so focused on the looks!

4.- Not taking advantage of multiple desktops. Once again back to a Windows analogy: what happens to the panel when you have 10 or more applications open at the same time? Well, once again it becomes overcrowded. Docked windows become small squares that cannot even show the window title. Some Windows users increase the panel height so twice as many docked windows fit in. KDE does that for you, but I really cannot understand why we should suffer from this at all when multiple desktops can be used.

GNOME works around this efficiently. Each desktop panel is only populated by the corresponding docked windows. As a result, if you have 3 applications open on desktop 1, and another 3 applications open on desktop 2, you will only see 3 docked windows on each panel. KDE would show 6 docked windows regardless of the desktop they are in (What tha...?).

5.- Very inconsistent icon themes. Once you get used to how easily you can change icon themes in GNOME and how well it works, it feels like a big step backwards when you try it on KDE. To begin with, the applet that allows downloading icon themes from external sources (mostly kde-look.org) fails too often, for many of the icon themes displayed are simply not available.

When you do manage to download one of the available themes, it works so badly it is not even funny. Default folder icons are never updated correctly (you seem to be stuck with the Oxygen default), the system tray icons almost always remain untouched, and the main menu randomly updates some icons while leaving others unchanged.

Once again, I think this is important because KDE has been all about the looks and being able to customize things to the last pixel.

CONCLUSION

KDE SC 4.4.1 is definitely a step forward, one in the right direction, but I feel it still suffers from issues that have been there for way too long. If KDE ever plans to take over the world as the best desktop manager available, it will have to make its functionality accessible to "human beings" and become flexible where it really matters.

Having said that, I very much encourage users of previous KDE versions to give SC 4.4.1 a try. If you are comfortable using KDE already, SC 4.4.1 will surely give you many reasons to be happy!

For those who have never used KDE, by all means give it a go. Even if some edges could benefit from a bit of polishing, KDE is still a great desktop manager. Most importantly, it really shows strong and continuous improvement, so things can only get better.

Enjoy!

Thursday, March 25, 2010

Ubuntu Road Test (Final Report)

About two weeks ago I posted the FIRST PART (recommended read to get the full picture) of this article. The basic idea was to find out how Linux (Ubuntu 9.10 Karmic Koala, to be specific) would cope with not being rebooted for a whole month while under intense use, regularly going from suspend to normal mode and back throughout the whole time.

Since that first post, not much has changed, really. I am amazed at how consistent the user experience has stayed all along. Resuming activity from suspension feels exactly the same as it did the first time, about a month ago, and so do all other tasks.

In terms of responsiveness, clicking on the start button wakes up the system in about 1 second, making it responsive straight away. A fresh Windows box usually wakes up pretty quickly as well, but then it takes a couple of seconds before the keyboard becomes responsive so the session can be unlocked. After that, though, Windows does particularly well at resuming the wireless connection. My experience with Ubuntu (and this has been logged as a bug by many already) is that resuming the wireless connection from suspend mode can take quite a long time, sometimes even requiring user intervention in order for it to work. NOTE: I believe that this problem is down to the gnome-network-applet, though, as I am using Wicd on this testing machine and it resumes the connection faster than Windows. Most times it has already resumed the connection by the time I manage to unlock my session!! If you can live without 3G modem support, Wicd is THE network manager.
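For those curious to try it, Wicd is available in the standard Ubuntu repositories; something along these lines should do it (you will generally want to remove or disable network-manager afterwards so the two don't fight over the same interface):

sudo apt-get install wicd

Just keep the 3G limitation I mentioned in mind before making the switch.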

In terms of performance, even if I am almost exclusively using this testing machine for all activities, I still do boot my other Linux boxes every now and then. I wanted to keep an unbiased view here, so it was important to compare the performance from my testing machine to that of others which were not going through the same demands.

Once again, not much to report here. My testing machine just felt fresh and fast, and I could not tell any significant difference to how other machines would perform similar tasks.

As you may have noticed, I am not really timing application opening or suspend recovery times, for example, but that's because I really can't see any drag or slowness. Remember the point of my testing was not to measure response to the millisecond, but to find if performance would be as heavily degraded in Linux as it was in Windows PCs.

Surprisingly, I have not found ANY performance degradation on my Linux box. I am aware that perhaps the Windows machines that influenced this test did get a bit more "beating", but there simply is no contest when it comes to results. The degradation in those XP machines varied from significant to simply making the machine useless, but nothing that could even stand a comparison to my testing results.

NOTE: I want to clarify that the machines that inspired this test were using Windows XP. I am pretty sure Windows Vista would suffer from similar or worse performance degradation, but I must admit I am curious about how Windows 7 would do. Please share your experience if you have tested it!

CONCLUSION

So there you have it, Linux passed this test with flying colors, performing consistently throughout the whole month under some intense use. In addition, I want to note that I have not had a single issue or crash during these past 30 days, which may not be a surprise under Linux standards, but significant when comparing it to other operating systems.

I always recommend that people I know use the software that best fits their needs. I am no die-hard Linux fanboi and have no problem acknowledging Linux flaws or weaknesses. Having said that, I still feel many people try Linux and simply follow their first impression. Eventually, it is mostly an exercise of "Well, this is not how I do it in Windows", and they just go back to what they know better. If they got past that getting-used-to phase, though, I believe Linux could add a lot of value in terms of performance, consistency, security and flexibility. At the end of the day, that all translates into higher productivity for the end user which, unless you are using your PC as a gaming console or a media center, is what it's all about, isn't it?

Thanks for reading!

Tuesday, March 23, 2010

Understanding viruses in Linux

Before I started using Linux I was exclusively using Windows. I had never tried Apple, and my interaction with UNIX systems was limited and infrequent. When I started using Linux, it was my first contact with many new concepts that had little or nothing to do with those of Windows.

One of the things that got my attention initially was reading that there were no viruses in Linux, which was quite a departure from Windows ways. I was always curious about that... How could it be? After all, Windows users are flooded with attacks, so how was Linux performing the magic? Inevitably, I started searching for answers and found out that it was a somewhat controversial concept. Some people claimed that Linux was mostly benefiting from a very small market share, thus making it unattractive for those creating viruses. Some others claimed it was down to the very diverse and segregated nature of Linux (countless distros, no unified packaging, etc.). Finally, there were also people who claimed Linux was completely immune to viruses and that those who claimed otherwise didn't know what they were talking about.

Eventually, I found no reason to doubt Linux's immunity to viruses, so I took it for granted, and thought it was a given in general. My experience is that lots of new Linux users understand, just like I did, that "no viruses" equals "no security threats". I believe that this is mostly down to how the term "virus" has been abused and misused. It has almost become a "wildcard" for all things malware.

In this post I will try to give some background about viruses and Linux security, hopefully clarifying some potential voids and misconceptions while I am at it.

WHAT IS A VIRUS?

First off, let me say that it is TRUE that there are no Linux viruses. That much is right, but it doesn't really mean much unless we know exactly what a virus is. Here's the definition from Wikipedia:

"A computer virus is a program that can copy itself and infect a computer. The term "virus" is also commonly but erroneously used to refer to other types of malware, adware, and spyware programs that do not have the reproductive ability. A true virus can only spread from one computer to another (in some form of executable code) when its host is taken to the target computer"

This definition already clarifies many things. Here are the most important concepts:

- A virus must be an executable program.
- A virus must have the ability to run and copy itself somehow, with no user intervention.
- The only spread mechanism available for a true virus to infect a computer is through its host being on the target machine.

Now, if you have heard about the many forms of malware in existence, you will quickly realize this definition only covers a part of them. This subset is the one we Linux users should not be concerned about. I will briefly touch on the ones we should be conscious of at the end of this post, but for now, let's see why viruses are not our problem:

VIRUS BASICS AND LINUX ARCHITECTURE

As we just learnt, and this is a very important part of its definition, a virus must be able to "do its thing on its own". In other words, user interaction is not required and the virus activity should go unnoticed. There are two methods a virus can use to copy itself:

METHOD 1: Adding its own code to system executables.

Linux, being the good UNIX sibling it is, sports a file system that natively supports ownership and privileges. Simply put, here's how it would work in real life:

1.- If a user creates, copies or downloads a file into a Linux system, that file is owned by the user account and group, and it lacks executable rights. Therefore, it cannot execute itself (there is a practical exception to this which we will cover later).

2.- If the user is misled into trusting some malicious piece of code and grants executable rights to it, it would still be bound to the user account's access rights, which are limited to the user's home folder. Therefore, if a user were having this kind of problem, it would be as simple as creating another account and moving the necessary files over to the new home folder. Note that in this case we would no longer be talking about a virus, for user interaction was required for the trick to work. In practical terms, a virus would have no way to infect any other applications unless it was run under the root (superuser) account. A quick terminal illustration of these permission basics follows below.
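Just to illustrate the point at the terminal (the file name and user are made up for this example), here is the kind of thing you would see when checking a freshly downloaded file:

ls -l ~/Downloads/some-program
-rw-r--r-- 1 user1 user1 5120 2010-03-23 10:15 /home/user1/Downloads/some-program

The file belongs to user1 and none of the "x" (execute) bits are set, so it simply cannot run on its own. Only an explicit decision by the user changes that:

chmod u+x ~/Downloads/some-program

And even then, it would run with user1's limited privileges, never root's.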

The root (superuser) account is disabled on many Linux distros out of the box. If it is not, warning messages are displayed frequently while in use or at login time, trying to discourage the user from using it. In fact, unless you are a system admin, you should be able to get the most out of your Linux desktop without ever having to log in as root.

Please, DO NOT use the root account unless strictly necessary!

METHOD 2: Anchoring itself to another process' memory during execution time

Most desktop Linux systems run on Intel's x86 architecture (AMD's 64-bit instruction set is an extension of x86), so it is important to understand how Linux uses it. The x86 architecture provides four privilege rings, labeled 0 through 3. Linux uses two of those rings, namely ring 0 for kernel (system) code and ring 3 for process (user tasks, applications, etc.) code. These two kinds of code are never mixed under Linux; they live in different rings and there is only one "gate" through which they communicate. The bottom line is that only the kernel itself could change this arrangement, so there is no opening here for a virus to exploit.

So process code cannot infect kernel code... How about a process infecting another process?... Well, this is also a no go. The Linux kernel provides each process with an isolated piece of memory, one that is not shared with any other process. As a result, even if one of those processes scanned all memory available to it, it would not be able to address that of any other process, for it would be out of its scope. Long story short, this method does not work either.
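If you are curious, you can actually get a glimpse of this per-process isolation through the /proc filesystem. A harmless peek at the memory map of the command itself, for example, shows that a process only ever sees its own virtual address ranges, never those belonging to another process:

cat /proc/self/maps | head -n 5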

Obviously, this is very technical talk, but hopefully I managed to explain why viruses are not a concern for Linux users without causing more confusion!!

OTHER FORMS OF MALWARE

Now that viruses are out of the way, let's talk a bit about other similarly malicious pieces of code.

ROOTKITS

Available for a wide variety of operating systems, Linux included, rootkits are modifications either to the kernel or to application code. In the case of Linux, kernel rootkits are the most concerning, as they are very difficult to spot and can compromise the whole system. As a result, even with the use of specific applications, it can be extremely difficult to detect a rootkit of this nature.

Fear not, for creating a successful rootkit for Linux is no trivial task. It must be created using the exact same code that will be available on the target machine, and its installation would once again require admin rights. Because of the sheer diversity in the Linux world, the fact that there are so many distros, so many packaging variants, etc., it would be very difficult to create something that could have any significant impact. Having said that, rootkit infections have been reported.

If you ran a certain executable you did not trust and suspect you could be infected by a rootkit, or if you simply want to give yourself some peace of mind, here's what you can do:

Because rootkits can become virtually undetectable during runtime, the best thing is to boot from a removable drive (CD-ROM, USB pendrive, etc). Then, use CHKROOTKIT or RKHUNTER, which are two popular rootkit scanners available to us Linux users.
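On Ubuntu and most Debian-based distros, both tools are available from the standard repositories; something along these lines should get you going (package names may vary slightly on other distros):

sudo apt-get install chkrootkit rkhunter
sudo chkrootkit
sudo rkhunter --check

Both scanners print a report of the suspicious files and known rootkit signatures they check for; expect the occasional false positive, so read the output calmly before panicking.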

TROJANS

Sometimes referred to as Trojan horses, these are applications designed to deceive the user, seemingly providing a service, while actually opening the door for a third party to remotely control the machine or access personal information. In other words, they can potentially steal passwords, confidential information, install software, log key strokes, use the machine for spamming, etc.

I have already discussed a GNOME and KDE VULNERABILITY that would allow a trojan in the form of a launcher to execute without admin rights. It would still require the user to save the launcher locally and double click on it, but judging by how frequently that happened in Windows, I believe this is something to watch out for.
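A simple precaution, by the way: a launcher is just a plain text file, so before double clicking one you did not create yourself, have a look at what it would actually run. The Exec line is the one to watch (the path below is just an example):

grep "^Exec" ~/Desktop/*.desktop

If the command shown there is not something you recognise and expect, do not run the launcher.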

Some users have reported being infected by trojans when using packages downloaded from a popular site containing eyecandy for the GNOME desktop. In fact, Linux users are potentially easy targets for such attacks, for what exactly could be wrong about anything downloaded from community resources? There is a sense of trust which is inherent to the community itself, and I believe that could be a weakness if it is misunderstood. Trust is fine, just do not be careless.

CONCLUSION

I guess the most important thing to take away from this article is that using Linux will do a lot for your computer security, but does not perform miracles. Viruses are no concern, but we sure cannot be careless. Be careful and protective of your own data and privacy. Stay away from using the root account, avoid running software from untrusted sources, never share your passwords... and react quickly if you think your computer has been compromised.

Thanks for reading and good Luck!

Friday, March 19, 2010

Ubuntu 10.04 Lucid Lynx Beta 1 released NOW!

Alright, alright, the Lucid Lynx is almost here. Get the first Beta while it's hot!

Official announcement and download mirrors HERE.

Download it, test it, report bugs, but most importantly... ENJOY IT!

Useful command line little tricks (part 4)

It's been a while, but I think it's time for yet another installment in this series. Last time I used an entire post to talk about lshw. Today I want to talk about commands that help in understanding what the system is doing and in running diagnostics.

FINGER

Very nice command which displays who is logged in to the system.
finger -l
This would return something like the following:
Login: user1   Name: John
Directory: /home/user1     Shell: /bin/bash
On since Fri Mar 12 12:40 (CET) on tty1   4 seconds idle
(messages off)
No mail.
No Plan.

Login: user2        Name: Jane
Directory: /home/user2     Shell: /bin/bash
On since Fri Mar 12 08:21 (CET) on tty7 from :0
4 hours 20 minutes idle
On since Fri Mar 12 11:26 (CET) on pts/0 from :0.0
No mail.
No Plan.

DMESG

Prints kernel information from its ring buffer. Especially interesting for diagnostics and understanding boot problems.
dmesg > boot.log
dmesg returns loads of information, most of which may not be interesting if you are trying to troubleshoot something specific. We can apply the usual filtering to narrow down the content. For example, if I wanted to save the full command output into a log file for later review, but display information about my wireless connection activity (wlan0 in my case) on screen, it would go something like this:
dmesg | tee boot.log | grep wlan0
As usual, creativity is key as you will need to use different command options and filters depending on what you are trying to achieve.

LOGS

As with most systems, the use of logs is frequently the best way to understand what is not working correctly. As we learned in the first article of this series, Linux stores log messages under /var/log. You can take a quick look to see what you can find in there:
ls -lh /var/log | less
By listing contents this way we can get a better understanding of them, as we see owners, groups, masks, sizes, permissions, etc. Many applications log their messages in here: dpkg, xorg, aptitude, etc. Even more importantly, you will see some logs which are system logs: syslog, boot, dmesg, kern and others display information related to system processes and services. You may have noticed that there are folders as well: apt, samba, cups, gdm...

This is probably a bit of overkill, so let's try to cover some of the most important or commonly used ones:

SYSLOG

Pretty much self explanatory, this log contains information about system activity. It is constantly logging data while the system is up and running. You may have noticed how the system keeps a history of these logs, which are eventually compressed to save space (thus the *.gz extension).

As I mentioned, this system log is constantly adding data as system events occur. If we opened the file, we would only get a snapshot of what was happening at that very moment.
less /var/log/syslog
This allows us to scroll through the contents of that snapshot at command line level. If the system has been up and running for some time, you will surely realise that there is a lot of information in there!
less /var/log/syslog > system.log
Therefore, you may want to store that snapshot in a file and then open it with your text editor of choice, which should provide a friendlier interface, as well as making the content a bit easier to read and work with.
tail -n20 -f /var/log/syslog
A very nice thing we can do is monitor this log in real time, as shown by the command right above. This is especially interesting when you are troubleshooting something you can control (like plugging in a device, reproducing an application problem, etc.), for you can see exactly what the system logs when the problem happens.

Let's say we are having issues with an external USB drive that is underperforming, maybe taking too long for read/write operations. Because PCs nowadays usually have several USB ports, and some are not as fast as others, it may be interesting to find out how the system recognises our external device. Now, let's run the same command before we plug in our device:
tail -n20 -f /var/log/syslog
Once we see its output in real time, let's plug in the USB drive. Here's the output I got:
Mar 19 16:54:52 KarmicKoala kernel: [17846.861024] usb 1-4: new high speed USB device using ehci_hcd and address 2
Mar 19 16:54:52 KarmicKoala kernel: [17847.328590] usb 1-4: configuration #1 chosen from 1 choice
Mar 19 16:54:52 KarmicKoala kernel: [17847.328865] scsi5 : SCSI emulation for USB Mass Storage devices
Mar 19 16:54:52 KarmicKoala kernel: [17847.328939] usb-storage: device found at 2
Mar 19 16:54:52 KarmicKoala kernel: [17847.328942] usb-storage: waiting for device to settle before scanning
Mar 19 16:54:57 KarmicKoala kernel: [17852.328183] usb-storage: device scan complete
Mar 19 16:54:57 KarmicKoala kernel: [17852.328818] scsi 5:0:0:0: Direct-Access     MEM      Drive Mini Metal 0.00 PQ: 0 ANSI: 2
Mar 19 16:54:57 KarmicKoala kernel: [17852.329346] sd 5:0:0:0: Attached scsi generic sg5 type 0
Mar 19 16:54:57 KarmicKoala kernel: [17852.333553] sd 5:0:0:0: [sdd] 31588352 512-byte logical blocks: (16.1 GB/15.0 GiB)
Mar 19 16:54:57 KarmicKoala kernel: [17852.334162] sd 5:0:0:0: [sdd] Write Protect is off
Once again, lots of interesting information here. We can see how my USB drive was recognized as a high speed device, the available disk space capacity, information about write protection, etc.

Obviously, this is an example, there is nothing here that looks concerning about this USB drive. If we had a problem though, this would be a good way to find out lots of info that could potentially help in troubleshooting.

CUPS

This folder contains logs detailing printer access and/or errors.
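For example, if a print job fails, the error log is usually the first place to look:

tail -n 50 /var/log/cups/error_log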

GDM

This folder stores logs from the GNOME display manager, which manages our login screen themes among other things.
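If the login screen misbehaves, a quick look inside usually helps (exact file names may differ between GDM versions):

ls -lh /var/log/gdm/
less /var/log/gdm/:0.log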

XORG

The Xorg logs show information from the X server, which gives us a starting point for troubleshooting problems with video (resolution, crashes, etc.).
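Conveniently, Xorg tags its messages, with (EE) marking errors and (WW) marking warnings, so the interesting lines of the current session's log are easy to isolate:

grep -E '\(EE\)|\(WW\)' /var/log/Xorg.0.log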

APT & APTITUDE

Having issues with any of these package managers? Look here first.
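dpkg also keeps its own history in /var/log/dpkg.log, which is handy for finding out exactly when a package was installed, upgraded or removed. For example:

less /var/log/apt/term.log
grep " install " /var/log/dpkg.log | tail -n 20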

DIST-UPGRADE

As already discussed in other posts, upgrading to a new release can get painful. If that is the case, the information found here may prove useful.
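On Ubuntu, the release upgrade tool keeps its logs in a folder of their own; if an upgrade goes wrong, these are the files worth reading (and attaching to a bug report). Exact file names may vary between releases:

ls /var/log/dist-upgrade/
less /var/log/dist-upgrade/main.log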

CONCLUSION

I think you will agree about the relevance of this logging feature. Moreover, this is a very important element of the system when we are trying to troubleshoot potential issues. /var/log certainly does not store every single log in existence, so you may need to look elsewhere for a specific application you have downloaded, but it offers tons of good information you may find very useful.

In fact, if you come from the Windows world, you may get a kick out of being able to get so much information from your system. Being able to actually "get" why something is not working as it should and eventually get to fix it is nice for a change.

Finally it feels like you (not your computer) are the one in control!

Enjoy!

Wednesday, March 17, 2010

HELP MySQL!

I recently found out about this initiative and would like to add my two cents to try and help a little bit. In case you don't know, here's the drill, as explained in the first introductory paragraph of the initiative's HOME SITE.

"In April 2009, Oracle announced that it had agreed to acquire Sun. Since Sun had acquired MySQL the previous year, this would mean that Oracle, the market leader for closed source databases, would get to own MySQL, the most popular open source database."

If you ever used MySQL, you probably know what a great product it's become. In my opinion, it is one of those products that really show how good opensource software can be. If you have never used it, well, take my word for it. ;-)

What this initiative is trying to achieve could be summarised as follows:

- Ensure that MySQL's future innovation is guaranteed. As you can imagine, the chances that Oracle will invest in maintaining MySQL development are slim at best.

- Push competition authorities to ensure a viable solution for MySQL, either by transferring ownership to a third party, raising an exception so MySQL is licensed under GPL, or releasing it altogether under the Apache license.


I have already signed the petition, and if you have not done the same yet, I would like to urge you to do so as soon as possible.

You can sign HERE

Thanks!

Tuesday, March 16, 2010

Gear up before upgrading to Ubuntu 10.04!!

One of the topics that gets the most heat every time a new Ubuntu release is out is that of upgrading. If you take a quick look through any Ubuntu forum out there, you will likely see how most experienced users will advise not to upgrade to a new version, but to back up your data and run a fresh install instead. In fact, I am pretty sure that most experienced Linux users will tell you to use a fresh installation. It is a fair statement, and a much safer approach.

Having said that, sometimes you have no choice but to take the upgrade route. Especially for corporations that stick to LTS (Long Term Support) releases and only upgrade once every two years, this is critical. Therefore, when you have to take the upgrade path, it is VERY important to get ready and be prepared for anything.

NOTE: This is by no means a Linux or Ubuntu specific thing. The company I work for tests Windows Service Pack releases for weeks, sometimes even months. It does require a huge amount of effort and resources to find out how old applications will cope with new Windows releases or patches.

William Shotts has recently published two very good blog entries with advice on how to approach this upgrade process. With simple examples and very clear insights, I think these articles will help anybody who's thinking of upgrading to Lucid Lynx when it comes out next month.

Getting ready for Ubuntu 10.04, PART 1

Getting ready for Ubuntu 10.04, PART 2

I think this series will continue with more blog entries, so I would recommend following them up. However, even with the information provided so far, you have very good hints at what to look into. In fact, these activities usually have a lot to do with common sense, so do not simply follow these articles. Make sure you think of things which may be specific to your setup, and how you can overcome any issues that may result from the upgrade process. A small example of what I mean follows below.
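As a very small example of the kind of preparation I mean (a starting point, not a full backup strategy), saving your package selections and repository configuration before upgrading makes it much easier to put things back together if something goes sideways:

mkdir -p ~/upgrade-backup
dpkg --get-selections > ~/upgrade-backup/package-selections.txt
cp -r /etc/apt/sources.list /etc/apt/sources.list.d ~/upgrade-backup/

That, together with a proper backup of your home folder, covers a surprising amount of ground.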

Most importantly, expect no miracles from any upgrade process, regardless of the OS. Be sure to get yourself prepared!

Enjoy!

Monday, March 15, 2010

GNOME3: Not to everybody's liking

I knew the current GNOME3 mockups had been controversial, getting lots of negative feedback, so I was expecting a similar result from my poll. No surprises so far; here are the results:

Love it!                   6 (33%)
Not bad...                 5 (27%)
I don't like it            3 (16%)
It sucks bad               4 (22%)

So there you have it: it seems like supportive feedback has an edge, but there are many who either don't fully appreciate it, or simply don't like it at all.

I personally hope they incorporate some nice new features into the current design, instead of creating something so radically different that it gets significant rejection from the community.

Friday, March 12, 2010

Install OpenShot in Ubuntu

I was excited to see OpenShot would be included in the Lucid repositories. I am no video editing expert, but this project looks amazing, maybe set to fill the gap in professional yet easy to use video editing software that many complain about in the Linux world.

I recently found that OpenShot 1.1 could easily be installed on current versions of Ubuntu, so I went ahead and gave it a try on my Karmic Koala installation. I want to share here how you can install it, as well as a few screenshots depicting how good it looks. Moreover, I found it very easy and intuitive, even for someone with no video editing experience like me.

INSTALL OPENSHOT 1.1

There are two ways you can install OpenShot 1.1 on Ubuntu Karmic, and both are easy. Let's start with the "official" and recommended way:

Installing from the repositories

Open a virtual terminal and run the following command

sudo add-apt-repository ppa:jonoomph/openshot-edge

Then update...

sudo apt-get update

And finally, install...

sudo apt-get install openshot openshot-doc

You should then see OpenShot under Applications menu > Sound and Video > OpenShot 1.1.


Installing Openshot 1.1 on Ubuntu Karmic Koala

Installing from DEB packages

NOTE: This is the preferred installation option if you are running Ubuntu 8.04, 8.10 or 9.04.

Download the DEB package corresponding to your Ubuntu version and architecture from HERE. I think that in this case, the easiest option is to just double click on the DEB file you just downloaded and complete the installation from the GUI. Should you want to do it from the command line, here's how:

sudo dpkg -i ????????

Where "????????" should include the path you saved the DEB file under, as well as the DEB file name. For example, I decided to save the OpenShot DEB file under my home folder, so in my case it would go like this:

sudo dpkg -i ~/openshot_1.1.0-1_all.deb

Just for your information, this method resulted in some missing dependencies when I tried it, so once again, I recommend using the PPA. I reckon this could be related to my machine, as I have only tried once, but... For those of you not using Karmic, if you have this same problem, dependencies can be worked around easily. When Gdebi or dpkg complains about a missing dependency, just install that package from Synaptic or from the command line, as shown right below.
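For the record, the quickest command line fix when dpkg leaves a package half installed because of unmet dependencies is to let APT pull them in and finish the job:

sudo apt-get install -f

Once that completes, OpenShot should show up in the menu just as it does with the PPA method.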

USING OPENSHOT

Once again, I have never done any video editing myself, but I found OpenShot to be very easy and intuitive in my brief playing around with it. I downloaded a short video (one of StarCraft II's cinematics) and did some simple transitions and editing... It was a lot of fun!


Playing around with OpenShot 1.1

So there you have it: I completely recommend installing OpenShot if you want to get into video editing under Linux, or simply want to have some good fun playing around with it. There are other alternatives (I hear Kdenlive is probably the strongest right now), but OpenShot is already showing lots of potential, and it is GTK oriented!

Have fun!

Tuesday, March 9, 2010

Ubuntu Road Test

Every now and again at work, I am involved in helping end users (sales representatives) with their machines. Sometimes I am amazed at the beating those machines get; it is surprising they even boot after a year in the field.

No kidding, those machines go through a lot: they are often thrown into the car trunk, or maybe dropped accidentally. They have an assortment of beverages spilled on them and it is not that unusual for them to be missing a key or two. Of course, they almost never get a full battery charge, and many users like to keep them always on, just in standby mode when not in use. This is necessary when (as is the case in my company) the corporate build is so bloated and slow that it may take several minutes to boot. Obviously, that's not something you can afford in front of a customer.

To give a bit of background here, these are HP tablet PCs running a corporate Windows XP SP2 build under Safeboot drive encryption.

Now, if you have ever tried to keep Windows running for days or weeks without a proper reboot, you surely experienced the pain of its performance degradation. This is obviously more exaggerated on a work PC that has to undergo the beating I explained above. The result? Well, I have seen cases in which it would take 1 or 2 minutes for MS Word to open. That translated into maybe 5 or more minutes in the case of Lotus Notes, and even worse for Siebel. This poor performance was all over the place, which eventually rendered the PC useless for day to day work. For the most part, a reboot was all it took to bring things back to acceptable levels, but I always wondered how my Linux boxes would cope with such intense use.

Obviously leaving the physical hardware abuse aside, I decided I would keep one of my machines on for a month, going from suspend to standard use mode and back, and putting it under intense activity. In fact, I am typing these lines from the PC in question. In the two weeks that have passed, here are some of the things I have been doing:

- Coding: Working on a Python application, using Geany, MySQL administrator and Query Browser, the Python command line interpreter, PyGTK libraries, etc.
- Internet browsing: Let's just say that the poor fox is really on fire.
- eMailing: Pretty standard sending and receiving, often including attachments.
- OpenOffice frenzy: Calc is by far the most used, but Writer and Math have both had their time.
- Transmission: I have downloaded several beta and alpha distros which I have tested on...
- Virtual Box: Creating and removing virtual machines for Ubuntu 10.04 alpha 3 and PCLinuxOS beta.
- Social networking: Though most of it I cover on Firefox (Google stuff), Pidgin and Tweetdeck are usually online.
- Multimedia: Both XMMS and Songbird are often rocking, and I have watched a couple movies. Obviously, browsing did include lots of Flash video watching.
- Games: Gbrainy, Iagno and Mahjongg took care of some breaks between coding sessions.
- Installation/Uninstallation: A few applications I was interested in testing came and went in these two weeks.

After two weeks of this intense use, I am finding very pleasant, even surprising results. The machine's behavior is very consistent, always taking the same time to resume from suspension. Performance is as good as usual, even better for some tasks. This is probably related to the way Linux handles memory.

Essentially, Linux uses more and more memory for caching the longer it is online. In my case, I can confirm that is happening, as my testing PC is now showing in excess of 700MB in use, when it usually boots to about 220-230MB. This does not mean that Linux is using memory ineffectively, or that it will eventually use it all, but that it "reserves" it for its own use. As such, file transfers to devices happen very quickly, quicker than they normally would. That's mainly because they are not being committed immediately. Instead, they are committed to memory, and the actual transfer to the device is deferred to a later time, which Linux manages on its own. (NOTE: This is one of the reasons why it is important to properly unmount removable drives before they are unplugged. If we don't do that, we risk losing information that may not have been committed to the device yet.)
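You can actually watch this behaviour with a couple of simple commands. The "-/+ buffers/cache" line of free shows how much of the "used" memory is really just cache that Linux will hand back to applications the moment they need it, and sync forces any pending writes to be flushed to disk, which is essentially what a proper unmount does for you:

free -m
sync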

So long story short, halfway through the test, Linux excels in consistency and performance, not showing any signs of slowing down. OpenOffice, Firefox, Songbird... you name it, they all behave as usual, sometimes a bit quicker than I am used to!

Let's see how the rest of the month goes, expect another update in about two weeks...

Thanks for reading!

Monday, March 8, 2010

UbuntuGeek and Interesting Linux Applications

UBUNTUGEEK is a great site with lots of interesting Ubuntu related articles. It covers pretty much all topics, from issue resolution to applications and everything in between. In fact, you can get very interesting information about upcoming releases, the future of Ubuntu, GNOME, etc.

Needless to say, I very much recommend keeping this site among your favorites if you want to be up to date with Ubuntu development and news.

As I said, Ubuntugeek does provide information about new/interesting applications for Linux/Ubuntu. Because of the sheer volume of applications published, I wanted to keep a record of those I find interesting for my own use, even if I have not installed them yet. The idea is to keep track so I can install them when/if I need them.

I thought I'd share this list in case you guys find it useful as well. Here's my list so far:

LUCIDOR
(Simple Linux eBook Reader)

On to the UBUNTUGEEK ARTICLE

KONTROLPACK
(Cross platform remote network controller)

On to the UBUNTUGEEK ARTICLE

WINEASIO
(Wine with ASIO to JACK support)

On to the UBUNTUGEEK ARTICLE

HARDWARE MAP
(List computers in your network)

On to the UBUNTUGEEK ARTICLE

UBUNTU COMPILATOR
(Create .DEB packages easily from code)

On to the UBUNTUGEEK ARTICLE

ADOBE AIR uninstaller
(Uninstall ADOBE AIR applications)

On to the UBUNTUGEEK ARTICLE

Friday, March 5, 2010

Internet freedom at risk?

This blog entry has nothing to do with Linux, but I still wanted to bring this controversial matter to your attention, as I believe it will impact us directly.

There are many initiatives already ongoing to try to control the only truly free medium left in the world: the Internet. Many have already died, unable to fulfil their purpose, but many more will come. It is almost certain that those in power will not stop until they can manipulate the Internet just like they do with radio and TV.

We have recently seen examples of this very thing. I guess they ran out of ideas, so 'why not apply the same "war on terror" rationale to the Internet?', they might have thought. The concept is once again very simple: if enough concern and fear is raised, people will approve whatever measures of control and limits to their own freedom, hoping that such initiatives will bring safety "back".

We can see how they are slowly orchestrating this, as put forward by none other than Robert Mueller. You can read the official press conference content HERE.

As usual, a very scary scenario is brought to our attention, one that will surely bring concern. In case the reader is not smart enough, they spoon feed him/her:

"...And it again raises the question of whether a similar attack could happen in Seattle or San Diego, Miami or Manhattan."

Much to the contrary, it raises the question of why these so-called terrorists can use such tools to avoid being caught, while the US/UN military cannot use them to avoid civilian casualties in Iraq and Afghanistan (which, by the way, already count in the thousands). As you can understand, lots of similar questions could be asked.

Of course, I have absolutely nothing against protecting regular people from such attacks, and I would be the first to support such initiatives if they were truly genuine and done correctly. However, I fear it won't be long before they come up with proposals to limit content on the Internet. They may propose a licensing system, which would allow for controlling who can publish information on the web (paying a fee, of course). I can think of many similar scenarios, and I am sure so can you.

Before we know it, the masses may not only accept cuts on Internet freedom, but they shall be asked to pay for them. Needless to say, this is just a hypothesis, but one that I believe will not differ much from reality, unfortunately.

I really hope I am wrong here, but if I am not, let's all stand against cuts to Internet freedom. The Internet should continue to be the amazing free communication medium it's become, and not just propaganda in the hands of a few.

As usual, I am looking forward to reading your comments on this subject.

Thanks for reading!

Wednesday, March 3, 2010

Amazing new Ubuntu branding and proposed new themes!!!

I just found out that Ubuntu has published its new branding, with some new themes, fonts, colors, etc.

For some, the departure from the brownish themes will be welcome, while for others like me... Well, I love it just as much.

Please take a few minutes to check out THIS ENTRY from the Ubuntu Wiki. There are mockups, screenshots, and very interesting design proposals for future releases.

Enjoy!

Tuesday, March 2, 2010

Some more desktops

It's about time I share a few more desktops!

UBUNTU 9.04 Jaunty Jackalope

With a fierce Lamborghini wallpaper, the Mac OS GTK2 theme and Cairo Dock with customized icons, this is one of my favorites!



UBUNTU 9.04 Jaunty Jackalope

In this case, this is a session I use more for work/programming and all that fun stuff, so it is a bit quieter and more serious.



MANDRIVA 2010

Let's throw in another KDE screenshot for good measure. In this case, I decided to go back to the default KDE wallpaper, which is simplistic, but works very well with those few busy widgets.



Enjoy!