Ubuntu Articles

[Solved] Chrome Cordova Apps (cca) stuck on version 0.0.8

by Stephen Fluin 2014.04.26

Chrome Cordova Apps (cca) is a way for you to build Android and iOS applications from a standard Chrome App. This is the story of how my version got stuck, and what I did to fix it.

I run Ubuntu 14.04, and at some point my cca package got stuck at 0.0.8 (the current version being 0.0.9). No combination of npm install -g cca, npm update -g cca, or npm uninstall -g cca seemed to have any effect; the binary was still present and still reported the old version.

I followed the code a little bit and found that the old version was being installed to /usr/local/lib/node_modules, so I blew this folder away, and then tried installing cca again. This time cca was installed successfully and I was on the latest version.
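
If you hit the same problem, here is a minimal sketch of the recovery, assuming the same /usr/local/lib/node_modules location the article found; it removes only the stale cca module rather than blowing away the whole directory.

# remove the stale global module, then reinstall and confirm the installed version
sudo rm -rf /usr/local/lib/node_modules/cca
sudo npm install -g cca
npm ls -g cca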


permalink

Canonical Announces Ubuntu For Phones

by Stephen Fluin 2013.01.02

There is now a fourth player in the smartphone arena (and it's not RIM).

Today at 12:00PM CST, Canonical released a video recognizing the past and announcing the future of Ubuntu. The future, as proposed by Canonical's Mark Shuttleworth, is for Ubuntu to be a universal computing platform: apps, content, and data are universal, with customized interfaces for different form factors like tablets, TVs, phones, and desktops.

This news is exciting because, being Linux and open source-based, multiple-architecture and multiple-device application development will be an easier dream to achieve. This would be a strong message to Apple, Microsoft, and Google that they have failed to unify all of a user's devices. Ubuntu is a platform that I have used every day of my life for the last 4 years on the desktop; taking that level of capability, stability, and power to other devices may be a winning combination.

In late 2012, Ubuntu TV was announced as the first extension to the desktop Ubuntu experience. Shortly thereafter, Ubuntu for Android was announced. Ubuntu 12.10 was heavily optimized for touch and tablet interfaces. Now, to complete the spectrum, Ubuntu for phones is here.

Major Innovations

The biggest and most exciting accomplishment has been the promotion of web applications to true applications. I've been predicting this move from Google's Android for a while, but it looks like Ubuntu may beat them to it. Apps built using HTML5 for iOS and Android will work perfectly on this device.

QML, a native development technology that combines standard application development methodologies with simplified markup plus JavaScript and CSS for UI glue, is exciting.

Finally, Ubuntu has been working on intelligent, context-based menus for more than a year now. On the Ubuntu phone they are exposing all of these interfaces via voice commands, surpassing even Google's voice capabilities.

Support for Android-designed hardware is part of the plan. This is a huge deal and something neither Microsoft nor Apple could ever replicate. It means that any Android phone (and there are a ton of great ones) should be able to run Ubuntu, and that all of the great Android hardware I've acquired should work very well with this exciting new mobile operating system.

Ubuntu's Challenges

The biggest challenge facing Canonical is a lack of experience with heavy cloud applications. Historically they have relied on third parties, which could result in a fragmented or broken experience. This is exacerbated by the fact that the major service providers also have their own mobile platforms, so support may be slow in coming or missing entirely.

As of 1:00PM CST, their app development website is offline due to heavy load. This is an interesting sign of the level of interest in developing for this platform.

Currently there are no plans for wearable computing, which will be an area for Google to innovate in and easily exceed the capabilities of Apple, Microsoft, and Ubuntu.


permalink

Localize Ubuntu Installs To Your Timezone

by Stephen Fluin 2012.03.14

One of the common tasks with a new Ubuntu server setup is localizing the server to a timezone. Timezones are used by the filesystem for timestamps, by PHP to perform date lookups, and by databases when doing date comparisons based on NOW().

GNU/Linux, and Ubuntu in particular, stores time by keeping the system clock synchronized to UTC (Coordinated Universal Time) and storing an offset to the localized timezone. To set this up in Ubuntu, just install the tzdata package with the following command:

sudo apt-get install tzdata

After running this command and entering your sudo password, you will be prompted for your geographic region and then for a specific timezone. After selecting the correct timezone for your server (I like to keep my servers localized to my local time), you will need to restart any applications that use it. For a standard web server, that means running the following:

sudo apache2ctl restart
sudo service mysql restart
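
To confirm the change took effect, the current timezone is visible with a couple of quick commands (the /etc/timezone file is standard on Debian-based systems):

cat /etc/timezone
date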

Synchronizing and Updating the Server Time

Over time, the clock will drift out of sync with the true passage of time (as measured by atomic clocks across the globe, by the GPS system, and by several publicly available network time pools). Ubuntu makes it very easy to resync your system clock against a network time pool. In stock GNU/Linux, this is achieved by running the ntpdate application and supplying a network time pool. On Ubuntu it's even easier: you can just run sudo ntpdate-debian, which syncs your system against the Debian-specified time servers.

You can put this command in a crontab to resync your clock regularly. Syncing the clock takes very little network bandwidth or CPU time. My experience is that, depending on the quality of the hardware clock, your system will drift by a second every few days or every few weeks.
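
As a sketch, a root crontab entry (edited with sudo crontab -e) along these lines resyncs the clock nightly; the path and schedule here are assumptions, so adjust them to taste:

# resync the system clock at 03:17 every night
17 3 * * * /usr/sbin/ntpdate-debian > /dev/null 2>&1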


permalink

X Forwarding over Multiple SSH Hops

by Stephen Fluin 2010.08.07

X11 forwarding (AKA X forwarding) is a slow but manageable way to run a program remotely, using a remote system's disk, memory, CPU, and filesystem, while sending all user interaction and display over the internet to be shown on your computer. X forwarding allows me to run visual diagnostic tools like kdirstat, or even pull up my home photo management program (f-spot), without needing to install a local copy or connect to the remote disk and deal with those complexities.

Typical X Forwarding Use Case

Let's imagine for a moment that you just want to browse the remote filesystem with dolphin.

Step 1 - ssh server01 -XYC
The -X, -C, and -Y flags enable X forwarding, compress the communication, and enable "trusted" X forwarding, respectively.

Step 2 - Run your program, for example kdirstat /
This command will run kdirstat using the memory, CPU, etc. of server01, but the display will be shown on the client and will respond to the client's mouse and keyboard.

Multiple hop X Forwarding

Unfortunately it's not as easy to chain X11 forwarding as it is to chain normal SSH connections. The workaround is SSH tunneling: the general strategy is to create an SSH tunnel through which you can open a second SSH connection.

TERMINAL 1: ssh server01 -L2200:server02:22
TERMINAL 2: ssh localhost -XYC -p2200 kdirstat /

Using these commands in two terminal windows (the first one is just a normal SSH connection to server01 that you will need to leave open) will open kdirstat / using the CPU, memory, etc. of server02, on the display of the original client, as desired.


permalink

Dell Fully Drops Ubuntu

by Stephen Fluin 2010.07.26

Despite Dell's own statements about the quality and security of Linux (and Ubuntu in particular), it seems that they have now dropped Ubuntu support from their website. As of now, Dell is no longer selling Ubuntu-based machines from their website.

I'm continually astounded that more people don't use Linux. Economics should dictate that when people want Ubuntu and it sells well, vendors increase their offering. The problem with operating system economics is that there is a huge fear of changing operating systems, making lock-in much worse than in normal markets. The other piece is that Microsoft has an established monopoly: in order to use the software someone wants, that software has to be built for one or more operating systems. Most people can't switch to a better, higher quality, lower cost solution because their software is built for a specific list of operating systems (GoToMeeting, Adobe products). This lock-in hurts consumers and prevents better options from even being a choice.

This problem has been alleviated somewhat by web-based software, but it continues to this day. This is why it's so saddening that Dell is giving up its Ubuntu offering.


permalink

Dell Makes Awesome Ubuntu Commercial - And doesn't show anyone

by Stephen Fluin 2010.07.25

It seems that Dell's investment in Ubuntu went further than they ever let anyone know. They actually made a pretty decent commercial for Ubuntu on Dell laptops; check out the video below. The strangest part is that I've never seen it, and no one I know has seen it. It's almost as if they made a commercial but then decided not to market Ubuntu on Dell. The simplest explanation is that Microsoft blocked them from marketing Ubuntu, as Microsoft has done time and time again with Dell, as well as with all of the major manufacturers.


permalink

Connect Multiple Screens with x2vnc

by Stephen Fluin 2010.07.19

There are a lot of ways to connect various Linux or Windows machines. Linux has great support for the Windows Remote Desktop Protocol (RDP) with rdesktop, but if you have a second (or third) screen that can be seen at the same time as your main screen, you should try out x2vnc.

What is X2VNC

X2VNC is a software tool readily available for Linux that creates a mapping between a VNC server (on any system type) and an X screen in Linux.

Example X2VNC Setup

I have a two-screen setup on Kubuntu. I also have a TV above and to the left of my two monitors. x2vnc allows me to map the VNC server running on the TV's computer as a third screen on my main system. The first step was to install a VNC server on the computer connected to my TV. The second step was to run x2vnc -west yt:5900, which creates the screen mapping.
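
For reference, a minimal sketch of that setup; the package names and the tv-host hostname are assumptions (x11vnc is just one VNC server option), and the display spec should point at wherever your VNC server is actually listening:

# on the computer driving the TV: serve its current X display over VNC
sudo apt-get install x11vnc
x11vnc -display :0

# on the main desktop: map that display as a screen to the west (left)
sudo apt-get install x2vnc
x2vnc -west tv-host:0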


permalink

Shotwell Replaces F-Spot in Ubuntu and Trashes your Collection

by Stephen Fluin 2010.06.15

It has allegedly been announced that a new photo management tool called Shotwell will replace F-Spot in future Ubuntu releases. Whether or not this is true remains to be seen, as the only evidence I have found comes from a Slashdot article which refers to various tech blogs.

If it is the case that Shotwell is taking over for F-Spot as the photo management tool, I'm extremely disappointed. I just installed Shotwell to try it out on my computer and was shocked by what I found. After opening Shotwell, I discovered that, in traditional GNOME fashion, there are NO OPTIONS, which means you can't configure how the import works or where you store your collection.

The inability to do any configuration is a huge deal for me because, although I store my photos in ~/Pictures, I currently manage them with F-Spot. This meant that after I tried out an import with Shotwell, my F-Spot folders were filled with pictures unmanaged by F-Spot. Fortunately Shotwell copies images rather than moving them, so I was able to simply trash all of the photos in Shotwell to restore my computer to its earlier state. Because there are no configuration options, there is absolutely no way for me to try out Shotwell, or to properly migrate my collection from what I have presently.

The other problem with Shotwell is that it seems very far from complete. To add tags, you have to right click or use the menu to manually "Add Tags" which then allows you to manually type each of the tags you want associated with a photo or a set of photos. This type of interface is clunky and takes users away from the ease of use and visual capabilities that F-Spot has.

Shotwell Recommendation

Perhaps things will get better in the future, but for now, stay very far away from Shotwell, or install it in a virtual machine.


permalink

Guide for Moving to New Linux Distribution

by Stephen Fluin 2010.03.16

Ubuntu upgrades usually work relatively well, maintaining settings and configurations and upgrading packages to new versions. This is nice, but sometimes it's better to just install the new system from scratch. This is always a tough decision, because a balance needs to be struck between migrating files from the old system to the new one and leaving behind things you don't need or want anymore.

Deciding when to start fresh with a new distribution

In general, if you are happy with what you have, DON'T START FRESH! The only time you want to start fresh is when you have had your computer for a while and are beginning to suspect things are getting a little dusty or more clogged than they should be. For me this meant I had gone through 4 Ubuntu release upgrades, but I had also been dangerous and tried out some of the betas and alphas, which probably left a few files and configurations in places I didn't even know about. It also meant that my GRUB install had never been upgraded, and my /etc/ folder was filled with settings and configurations from programs I hadn't used in years. These things don't typically have a huge impact, but for me, having as few files and as few legacy configurations on my computer as possible is a cleanliness and reliability goal.

Another reason for me to upgrade was to get new applications and features that wouldn't be deployed with normal upgrades. The three main features that come to mind are an ext4 filesystem, GRUB 2, and starting fresh with my audio stack. I could have migrated and upgraded these things manually (the filesystem change would have been a huge pain, and could have corrupted my whole system in the process), but with the number of unknown factors involved, it was easier to begin again from scratch.

Before you start with a new distribution

Before you start, the most important thing is to make backups. Without backups, your data is extremely likely to be lost. If you don't have good quality backups, your only hope when your data, applications, and settings go missing is to give up and forget anything you lost.

I recommend you make a backup of your entire /home/ folder, as well as your /etc/ folder. You might not use either of these folders again, but they are extremely useful if you do need to restore an earlier system, or if you just want to copy or reference an earlier version. For example, my /etc/ backup came in handy about 2 months after a recent upgrade: I needed to reference my /etc/X11/xorg.conf file to determine the proper setup for metamodes on my computer.
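
A minimal sketch of those backups, assuming /media/backup is wherever your external drive or spare partition is mounted:

# archive-mode copies preserve permissions, ownership, and timestamps
sudo rsync -a /home/ /media/backup/home/
sudo rsync -a /etc/ /media/backup/etc/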

This is probably a good time to make a separate /home/ partition. Putting this folder on an entirely separate partition makes upgrading, installing new distros, and trying things out in general much easier. While changes to your operating system can completely break your install, there is almost nothing you can do to break your /home/ folder. Keep in mind one notable exception: if you have compositing enabled in the KDE settings in your home folder, you may not be able to boot into your desktop environment if you boot into an OS that doesn't support compositing. I recommend you disable compositing before doing any upgrades or reinstalls.

If you have spare disk space on another drive, you could rsync your entire root filesystem to a safe location. Most files you would want to migrate are in /home/ or /etc/, but occasionally programs will store files in other places. Two examples of this are gitosis, which by default keeps its files in /opt, and databases, which are likely to be stored in /var/db/.
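
If you do want that full copy, a sketch along these lines works, again assuming the /media/backup destination; the pseudo-filesystems and /media itself are excluded so the copy doesn't recurse into them or into the backup:

sudo rsync -aAX --exclude=/proc --exclude=/sys --exclude=/dev --exclude=/tmp --exclude=/media / /media/backup/root/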

Picking a distro

If you have a distro that you like, I wouldn't recommend starting from scratch while trying a new distro. New distributions are best learned in a virtual machine, or in a separate partition, where you still have something to fall back to.

Save your files

Here is a list of files you should consider transferring when you are moving to a new or updated distribution. One thing I recommend is not transferring your home folder wholesale. At least for me, my home folder tends to get filled with configuration files from programs I ran months or years ago, as well as numerous files that should be sorted elsewhere. The strategy that worked for me was to keep a separate copy of my old home folder as a backup, and then copy files from it as I discovered I needed them. If you have copies of the items below, that should cover some of your basic needs, but everyone uses their computer differently, so making sure you have a way to recover your files can be very important.

  • Browser Config
    This includes ~/.config/chromium/ or ~/.mozilla/firefox/ depending on your browser of choice. The reason to keep this is that there is value in saving your bookmarks (if you don't have online sync), as well as your history and passwords. These are not trivial to recreate in a new environment.
  • File System Table (fstab)
    For me this meant reading my old /etc/fstab file and adding any entries that were missing from my new system. You don't want to just copy the old file, because some entries may have options or preferences chosen differently between your old distribution and your new one.
  • F-Spot (Or photo management)
    For F-Spot, these files are located in ~/.config/f-spot/. This is important if you want to keep your database and organization associated with your images, especially if you don't store your tags in the files themselves (which I highly recommend).
  • Git Global Configuration
    Git stores several global configurations (such as your username and email address) when you set them up. It may be easier not to migrate the Git configuration files, but simply to keep track of what you had set on the old system and set it again when you begin using Git on your new system (a minimal sketch follows after this list). This item most likely only applies to developers.
  • Wine Environments
    These are stored in ~/.wine/, or if you use PlayOnLinux, they are stored in ~/.PlayOnLinux/. I highly recommend migrating these because they tend to be a lot of work to set up.
  • Virtual Machines
    Virtual machines have similar logic to Wine Environments in that you typically have them set up exactly as you need them, and they are non-trivial to rebuild to the same state.
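
For the Git item above, re-applying the global configuration on the new system is usually just two commands; the name and address here are placeholders:

git config --global user.name "Your Name"
git config --global user.email "you@example.com"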

permalink

Basic Ubuntu Network Hardening

by Stephen Fluin 2010.02.27

Hardening a Linux system is easy to do, and can have a lot of additional security benefits for the paranoid and for those who might be targets. There are a lot of guides to hardening different parts of the system. I'm going to review the steps I took recently to harden one of my desktop computers.

Understand what is open

Before you can harden anything, you need to understand how exposed you currently are. There are a few ways of doing this, and I recommend you try at least a couple. The first way is to use nmap to scan your IP address. You will need to do this from another computer, preferably on your local network. This scan will report all of the ports your computer responded on. The second way to understand what your system is exposing is to run the command netstat -atuv. This command will return a list of listening and active connections on your computer.
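
As a sketch, run something like the following from another machine on the LAN; the address is a placeholder for your computer's IP, and -p- scans all TCP ports rather than just the common ones:

nmap 192.168.1.50
nmap -p- 192.168.1.50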

Review the open ports list

This step requires some knowledge of your system and of networking. Each port typically serves a standardized purpose: you will need port 22 open, for example, if you want to allow incoming SSH connections, port 25 open if you want to run a mail server, and port 80 or 443 open if you want to run a web server. Unless you want no remote access to your machine at all (a valid choice for some people), you will need to be careful about what you disable.

On my default install, the open ports list included ssh, smtp, mdns, bootpc, and port 37319. You can tell a port is open from netstat, as it will look as follows:

Active Internet connections (servers and established)
Proto Recv-Q Send-Q Local Address Foreign Address State
tcp 0 0 *:ssh *:* LISTEN
tcp 0 0 localhost:ipp *:* LISTEN
tcp 0 0 localhost:smtp *:* LISTEN

In this output, "*:ssh" means that SSH is listening on all network addresses, and that any computer that can reach mine can open an SSH connection. "localhost:ipp" means that my CUPS printing setup is listening only on the loopback address, so only programs local to my machine can open connections to it. Localhost-only servers are generally safe and won't expose your machine to malicious remote users, unless they break your user authentication or already have a local user.

Secure smtp or Postfix (port 25)

Postfix by default listens on your network and trusts mail that reports it is coming from your local network. This typically isn't needed, and you can secure Postfix so that it only listens on the local machine. Open the file /etc/postfix/main.cf, find the line containing "inet_interfaces = all", and change it to "inet_interfaces = localhost". Then restart Postfix with sudo postfix stop and sudo postfix start. Check netstat -atuv again to ensure this worked.
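
Put together, the change and restart described above look like this; the sed line is just one way to make the edit non-interactively (editing the file by hand works just as well), and the grep at the end is a quick check that smtp is now bound only to localhost:

sudo sed -i 's/^inet_interfaces = all/inet_interfaces = localhost/' /etc/postfix/main.cf
sudo postfix stop
sudo postfix start
netstat -atuv | grep smtp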

Secure mdns (port 5353)

After researching mdns briefly, it seems this is something I wish to keep. It allows computers on the local network to find other computers using friendly names, which means your computer will resolve automatically by hostname rather than you always having to rely on your router or use the IP directly. This is useful to me and worth the security risk of leaving it open.

Secure bootpc (port 68)

Bootpc is part of the standard DHCP system. DHCP works by sending out a broadcast UDP packet to the network requesting information from any available server, which means you must also have a process listening for responses. This is a standard part of Linux networking. You could remove your DHCP capabilities, but the bootpc listener shouldn't be a security risk.

Remember to use a firewall

One important thing to remember is that you should always use a separate firewall between yourself and the internet, and only use port forwarding for the things you absolutely need to be reachable. I recommend only forwarding SSH, as you can use SSH to tunnel any other sort of traffic. I also recommend running SSH on a nonstandard port, as this will reduce the amount of "doorknocking" your computer receives. These security steps are primarily designed to protect you if someone else has gained access to your network, or if there is another compromised machine somewhere behind your firewall.
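
Moving SSH to a nonstandard port is a one-line change followed by a daemon restart; the port number below is just an example, and remember to update your router's port forward to match. In /etc/ssh/sshd_config:

Port 2222

Then restart the SSH daemon so the change takes effect:

sudo /etc/init.d/ssh restart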


permalink

File Menu Missing in Dolphin

by Stephen Fluin 2010.02.11

One of the most annoying things I have encountered with Dolphin is that there is a keyboard shortcut to remove the menu bar, as well as a button to remove it, but no equally obvious way to get it back. You can remove the menu bar by hitting the keyboard shortcut Ctrl+M, or by right clicking on the menu bar and clicking "Hide Menubar".

How to get it back

For me, finding out how to get it back was a frustrating experience. My first problem was that I didn't know KDE called it the "Menubar", so when I looked in the keyboard shortcuts or googled online, I couldn't find anything relating to the Dolphin file menu. The second usability problem I see is that right clicking on the title bar, or on any of the toolbars (which take the place of the menubar at the top of the window after it is removed), doesn't give you the option to bring it back.

It was only after I had practically given up looking for the menubar that I right clicked in the MAIN pane of the window. That context menu provided me with both the appropriate keyboard shortcut and a button to bring the menu bar back.


permalink

AutoPano Pro and Giga Font Problems

by Stephen Fluin 2010.02.06

I recently installed copies of AutoPano Pro and AutoPano Giga, two great panorama stitching programs that I highly recommend, as they work on Linux, Mac, and Windows. After installing these programs, there was a horrible font problem that made all of the text in the application illegible.

I believe the solution was to install qt4-config, run qtconfig, change the font size up to 9, and then back down to the default of 8. When you launch the program again, all of the fonts and text should be fixed and legible. Have fun using this awesome program!


permalink

Upcoming talk on Linux Browsers

by Stephen Fluin 2010.02.05

In March I'm going to be giving a talk on browsers in Linux, with a lot of specific examples and demos from Ubuntu. Hopefully this talk will cover the basics for beginners, like what the options are and how to install new browsers in Ubuntu, as well as some of the more unique and advanced features of the available browsers, and how to get started with browser development.

This talk will be at the Penguins Unbound Linux users group in Falcon Heights, Minnesota on the last Saturday of the month. I will post additional details and documents closer to the date.


permalink

Video File Thumbnail Previews in Dolphin

by Stephen Fluin 2010.01.31

In a default Kubuntu install, Dolphin is a great file manager. It typically works with many different file types seamlessly. One of the things it doesn't seem to handle out of the box is previews/thumbnails for video files.

How to install video thumbnails

The first step is to install the mplayer thumbnails package. My research indicates that this has been in the standard repos since Jaunty.

sudo apt-get install mplayerthumbs

This will change Dolphin so that when you click on a video, it will be previewed in the preview panel. To make thumbnails show up in the icon view, you need to do a little more configuration. Go into Settings->Configure Dolphin, click on the "General" section in the list on the left, select the "Previews" tab, and check the box next to "Video Files (MPlayerThumbs)".

That's it. Enjoy your video thumbnails in Dolphin on KDE.


permalink

KDE 4.4 Rocks Part 1 and other thoughts on Lucid Lynx alpha 2

by Stephen Fluin 2010.01.29

A few weeks ago I installed the second alpha of Ubuntu Lucid Lynx. Typically Ubuntu alphas and betas have quite a few bugs, and I have been burned in the past by upgrading Ubuntu versions prematurely. This time I decided to try it out a little early by installing it in a separate partition, in space I had cleaned up and taken back from my NTFS partition.

Lucid Lynx - Ubuntu Alpha 2

So far, Lucid Lynx has been extremely stable. For the entire release cycles of Jaunty and Karmic, as well as alpha 1 of Lucid Lynx, the live CD wouldn't work on my system. It wouldn't boot at all, regardless of whether I used the alternate installer or the live CD with a plethora of boot flags. Lucid Lynx even works on my system without the kernel flag acpi=off, which I have needed for my entire life with Ubuntu.

The installer was nice and easy to use, and although my Nvidia drivers weren't installed properly by the Jockey (KDE) GUI, they were very easy to install with sudo apt-get install nvidia-current.

In addition to this, Kubuntu installed ALSA without PulseAudio by default, which in the past hasn't worked with my microphone, but this install seemed to fix everything I have been fighting with for the past few years. Once again we will see how long it lasts, and I had almost gotten used to the per-application controls of pavucontrol. Who knows, maybe I will install PulseAudio a year from now and everything will just work with ALSA + PulseAudio in perfect harmony. I'm not holding my breath.

KDE 4.4

A couple of the features I was waiting for in KDE have finally landed. The first, which I wanted but had no idea why, is most easily described as "Windows 7 Snap". This means that I can drag windows to the left or right edges of my screens and they will snap into place at 100% height and 50% width. You can also drag windows to the top of the screen to maximize them. This hurts my workflow a little in that it is no longer instant to drag and drop maximized windows between monitors, but I believe I will get used to it. Snapping also works across multiple monitors, which is surprising because as of a week ago (before I ran some package updates) it wasn't working on the middle edge between the monitors.

Another improvement that I wasn't expecting is that they remade (or finished) the Add Widget menu. Now when you add a widget, whether to the desktop or a panel, a very nice bar pops out that is easily navigable, and uses drag-and-drop for placement.

The final thing I didn't expect was that moving files and deleting files is much smoother. In Karmic and before, when I deleted a set of files in Dolphin (or sent them to the trash), they would remain on the screen for a few moments while KDE worked in the background. Now these types of processes are instant, as they should be. The notifications for file transfers and activities have also been much improved. Now the useful data is presented first, with the option to expand the notification to show the rest of the information. Biggest of all about the notifications, they actually seem accurate now.

Hopefully I will be able to write more about any new features of KDE I discover as I continue this dangerous journey through the Ubuntu 10.04 alphas.


permalink

Install Firefox 3.6 in Ubuntu

by Stephen Fluin 2010.01.27

Installing the latest Firefox 3.6 release in Ubuntu is easy. You have to get the packages from Mozilla rather than Ubuntu, as Ubuntu is much more cautious with its release cycle and with the rate at which it imports packages. This is a step you will only have to do once, and then you will have access to all of Mozilla's releases designed for Ubuntu. Simply type the following commands to install Firefox 3.6:

sudo apt-key adv --recv-keys --keyserver keyserver.ubuntu.com 247510BE
sudo add-apt-repository ppa:ubuntu-mozilla-daily
sudo apt-get update
sudo apt-get remove firefox
sudo apt-get install firefox-3.6

The step to remove existing Firefox installations may or may not be necessary for you. On the machine I tried it on, I had to remove "firefox" before the new packages required for firefox-3.6 would install properly. Don't worry: installing and uninstalling software should never affect your user preferences.

What's awesome in Firefox 3.6?

One of the best things about Firefox 3.6 is the new concept of "personas". Rather than a full theme, a persona is more like a desktop background: it affects the colors and images used in the browser, but not much else. The best thing about personas is that they can be previewed live in the browser simply by hovering over each one, and apparently there are already 30,000 of them, most of which are pretty cool looking. Check out getpersonas.com to browse the entire catalog.

The second most interesting thing is that Mozilla has continued improving the overall speed of the browser; apparently it's 20% faster than Firefox 3.5, which is great for a browser that most people love but that has been receiving a lot of flak for being slower than Chrome.


permalink

Fixing DenyHosts after being blocked

by Stephen Fluin 2010.01.25

One of the built-in pieces of security on many Linux machines is a piece of software called DenyHosts. It works with the SSH daemon to catalog the kinds of requests and failures that come in. If it detects an IP connecting to the computer that has exceeded certain thresholds, it adds that IP to the /etc/hosts.deny file, which immediately bans it from connecting over SSH.

With the default settings in Ubuntu, valid users are only allowed 10 failed valid-username attempts over a period of 5 days, or 5 failed invalid-username attempts over the same period. This may seem like plenty, but occasionally when travelling you will repeatedly use the wrong password or the wrong username, and your entire IP can get blocked. At this point you have to use another machine (or another IP) to connect to the server to fix the problem.

How to unban yourself in DenyHosts

The easiest way to temporarily unban yourself is to delete the entry containing your IP from /etc/hosts.deny. This will only work for a few seconds before DenyHosts regenerates the file, but it should be enough to connect and fix the issue from your main host. You will also want to do this in combination with the next step.
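
As a sketch (run from a machine that isn't banned), the offending entry can be stripped like this; the IP address is a placeholder for whichever one got blocked:

# remove any hosts.deny line mentioning the banned IP
sudo sed -i '/203.0.113.7/d' /etc/hosts.deny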

How to fix the problem permanently

In my opinion, the problem with DenyHosts in a default Ubuntu install is that successful connections don't reset any of the failure counts. This means that if you attempt to connect each day, fail twice, and succeed on the third attempt, by the third day you could be banned from your own server.

The fix involves editing /etc/denyhosts.conf. Open this file as root in your preferred editor, locate the configuration option RESET_ON_SUCCESS, and make the line read as follows:

RESET_ON_SUCCESS = yes

After making this change and restarting DenyHosts with /etc/init.d/denyhosts restart, you will want to unban your IP and connect again successfully so that your fail counts are reset. From there, hopefully you won't be banned by your own innocent login failures.


permalink

Chromium: The best browser on Linux

by Stephen Fluin 2010.01.24

The browser wars are ongoing, but there is a clear leader on Linux at this time: Chromium. Chromium is the open source base that Chrome is built from, but it doesn't have any of the proprietary parts or unknown data reporting built into Chrome.

Chromium includes a super-fast WebKit rendering engine, the V8 JavaScript engine, and process-separated tabs. Chromium also supports HTML5 tags, including the video tag, with both Ogg Theora/Vorbis and the controversial H.264/MP4.

How to install Chromium on Ubuntu

The easiest way to install it is to add the PPA to your repositories. You can follow the instructions on the PPA directly, or follow the instructions below, assuming you are running Karmic or later.

sudo apt-key adv --recv-keys --keyserver keyserver.ubuntu.com 4E5E17B5
sudo add-apt-repository ppa:chromium-daily
sudo apt-get update
sudo apt-get install chromium-browser chromium-codecs-ffmpeg-nonfree

How to Get Started

Either run chromium-browser directly, or choose it from your menu. Chromium supports extensions, as well as bookmark sync using your existing Google account. Find these things in the preferences menu, which can be opened by clicking the wrench icon in the upper right-hand corner.

Youtube HTML5 Videos

Until YouTube offers full support for HTML5 rather than Flash for videos, you can install the following extension, which swaps out the page components for you: YouTube HTML5-ifier, which is in development by Mark Renouf and myself.


permalink

Restart Ubuntu faster with kexec

by Stephen Fluin 2010.01.20

One of the longest and most time-consuming parts of the bootup process for any computer is that the default reboot process completely turns off the computer and returns the CPU to the very first startup task, the BIOS. BIOS initialization and POST involve a lot of hardware testing, detection, and initialization. When rebooting the computer, this doesn't really add any value, as these devices have already been detected and had their initial setup done.

This is where kexec comes in. kexec is a tool that allows Ubuntu (or any Linux distribution) to pass control directly between kernels. This means that when rebooting, control never gets passed back to the BIOS, and no additional device testing or detection (beyond that done by Linux) is needed. For me, all of the initialization in the BIOS and GRUB before Linux takes over takes around 30-40 seconds. This is a little longer than the average case, but I'm sure this is a problem for a lot of people.

How to install kexec

You can install kexec in Ubuntu using the following command:

sudo apt-get install kexec-tools

The installation will ask if you want kexec to handle reboots. In order to get the benefit of skipping unnecessary steps, you need to choose the option that uses kexec for your reboot.
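
If you would rather trigger a kexec reboot by hand instead of (or alongside) that option, a minimal sketch looks like this; it assumes the kexec-tools binary and the /vmlinuz and /initrd.img symlinks Ubuntu keeps in the root directory:

# load the installed kernel and initrd, reusing the running kernel's command line
sudo kexec -l /vmlinuz --initrd=/initrd.img --reuse-cmdline
# jump straight into the freshly loaded kernel, skipping the BIOS
# (note: -e jumps immediately, without the clean service shutdown the debconf option gives you)
sudo kexec -e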

What could go wrong?

The only problem with this is when you actually want to return to the BIOS, or when you want to go back to GRUB to choose another operating system if you dual boot. This isn't that big of a deal though. Although I haven't found a way to pass control directly to GRUB or back to the BIOS, you can just choose shutdown, and then turn the computer back on after the shutdown has completed. It doesn't take any extra time, it just takes more effort, as you need to press the power button on the computer.


permalink

Use KDirStat to clean up disk space in Ubuntu

by Stephen Fluin 2010.01.16

My traditional process for cleaning up disk space is to recursively run "du -s * | sort -n" starting at /, descending into folders and looking for things I know I can delete. This process has worked well for me, and in the past I have tried out some of the visual tools to diagnose the issue, but in the end it always seemed like I broke down and went back to the command line.

Today I tried a new tool called KDirStat that I found in the Ubuntu package repository. After installing it and trying it out, I found that it is a great tool for diagnosing disk usage problems and for cleaning up your disk.
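
Installing it is a one-liner; the package name below is the one I found in the repositories at the time, so double-check with apt-cache search kdirstat if it has moved:

sudo apt-get install kdirstat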

Steps To Use

After launching the application and selecting a folder as a starting point (I chose my NTFS partition mounted at /media/disk/), the program greeted me with a useful list of the folders in that directory and immediately began scanning them for size. Not only could I see the calculated subtree size, but it also reported the queue of disk reads that would be necessary for it to complete the scanning.

To clean up the filesystem, I sorted by subtree size and started descending my file tree. My general process is as follows:

  1. Duplicate Backups
  2. Duplicate Files
  3. Internet Media (.iso files, then media such as Audio/Video that could be retrieved from the internet later)

If these steps aren't enough, it's probably time to purchase additional hard drives. I'm a huge fan of software RAID 5, which is easy on Linux using mdadm (see the sketch below).
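
As a minimal sketch of that mdadm setup, the three device names are placeholders for whatever spare drives you add:

sudo apt-get install mdadm
# build a 3-disk RAID 5 array; every block on these drives will be claimed by the array
sudo mdadm --create /dev/md0 --level=5 --raid-devices=3 /dev/sdb /dev/sdc /dev/sdd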


permalink