For nearly a week now I have been exclusively using an Eee PC. I promised an action shot and here it is:
Using this device I have gone about all my regular activities, even a bit of web development.
The biggest hindrance I've found has been using a screen resolution lower than what I'm accustomed to, and I'm not too keen on the button strip below the trackpad. I have gotten used to the keyboard to an extent, though I would certainly benefit from smaller hands.
Enough negativity, buy one today!
After a lot of frustration, reading of documentation and even giving up completely on certain paths of action, I finally got Jabber up and running. The “Jabber burnout”, as Aidan called it, was terrible and only now do I feel de-stressed enough to write about it.
I initially set up an installation of jabberd2, as I have had previous experience with it and was comfortable with its administration. I got it working without difficulty and could connect to it via a standalone client, but ended up abandoning it when I tried to get a web interface working with it.
As it stands, jabberd2 doesn't have its own HTTP polling system, which is necessary for use with a web client. I looked at a couple of implementations, including Punjab,
but decided to throw in the towel and go with a different Jabber server which had this facility built-in, along with logging and multi-user chat: all things jabberd2 didn't provide. Enter ejabberd.
I found configuring ejabberd a bit awkward and the documentation a bit unclear, but it eventually bent to my will. This was followed by a lot of playing about with mod_proxy, which made me want to cry more than once, but the result was that I got JWChat working, even from behind a restrictive corporate firewall.
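For anyone attempting the same, here is a rough sketch of the proxy rules involved. The port and path assume ejabberd's stock HTTP module defaults, not necessarily what any given installation uses:

```apache
# Sketch only: forward JWChat's HTTP polling traffic to ejabberd.
# Port 5280 and the /http-poll/ path are ejabberd defaults.
ProxyRequests Off
ProxyPass        /http-poll/ http://localhost:5280/http-poll/
ProxyPassReverse /http-poll/ http://localhost:5280/http-poll/
```

With JWChat's static files served by Apache itself, only the polling requests need to cross over to ejabberd, which is what lets the whole thing work from behind a firewall that only allows HTTP.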
Getting multi-user chat operational was the final piece of the puzzle and I learned an important lesson: ejabberd's
mod_muc module likes to be configured to use a subdomain of the virtual host, even if it doesn't resolve to anything in DNS. This was a painful lesson to learn…
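For the record, the relevant fragment of ejabberd.cfg ends up looking something like this (hostnames here are illustrative, not my actual configuration):

```erlang
%% Excerpt from ejabberd.cfg -- hostnames are examples.
{hosts, ["example.com"]}.

%% Within the modules list: mod_muc bound to a subdomain of the
%% virtual host. The subdomain need not exist in DNS for local use.
{mod_muc, [
  {host, "conference.@HOST@"},
  {access, muc},
  {access_create, muc}
]},
```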
Today, we have a groupchat facility, with logging, that's accessible from anywhere on the web, magic!
As I previously mentioned, my current task as Infurious system admin is providing the team with a bug/task tracking system, namely Trac.
My initial thought was: “our server runs Ubuntu, this should be easy…”
I could get Trac running via
tracd and I could see that
mod_python was working via
mod_python.testhandler but the two didn’t seem to want to play together. Last night, after much frustration, I just gave up and configured Trac to run as a CGI application. Problem solved.
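The Apache side of the CGI setup is only a few lines. A rough sketch follows, with the script and environment paths being guesses for a stock Ubuntu install of the era rather than my exact ones:

```apache
# Sketch: serve Trac as plain CGI. Both paths are assumptions.
ScriptAlias /trac /usr/share/trac/cgi-bin/trac.cgi
<Location /trac>
    SetEnv TRAC_ENV /var/lib/trac/myproject
</Location>
```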
Unfortunately this solution will introduce a performance penalty but at this stage it’s my priority to get the system functional before I start worrying about access speed.
All I have left to do is get the Trac permissions set up and I’m going to move on to configuring a Jabber server which will free us from our dependency on Campfire.
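The permissions work itself is done with trac-admin, along these lines (the environment path and username are placeholders):

```
$ trac-admin /var/lib/trac/myproject permission add aidan TICKET_MODIFY
$ trac-admin /var/lib/trac/myproject permission list
```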
Me? A Linux hippy? You bet!
It’s been a busy week. The lads and myself have been quite industrious, making plans and Getting Things Done.
I’ve taken on responsibility for the Linux side of things and last night finished setting up an SSL-enhanced, WebDAV-accessible Subversion repository, for which Aidan has written an introductory guide.
My current task is getting Trac installed and I’m quite enjoying being up to my elbows in command-line goodness. It must be the Linux hippy in me.
There’s definitely an atmosphere of excitement about the endeavour and it makes the day job seem more tolerable knowing there could be more interesting things on the horizon. Geek on!
I got speaking to Matt this morning when I arrived in the office and he demonstrated to me the wealth of information provided by Google Analytics. It was really interesting stuff and I could see why he was so excited about it.
I recently upgraded to Apache 2.2 on my development machine,
substance, to “easily” get TLS working so I could use AjaxTerm and I haven’t had the time yet to play about getting awstats up and running again.
I wouldn’t call myself a blogger and my site doesn’t draw huge traffic but it’s still nice to look at the statistics every once in a while so I decided to give it a go.
All I have to do now is let the numbers crunch.
I finally got sick of not being able to use SQL subqueries and decided to upgrade my MySQL installation from 4.0.x to 5.0.x.
I had wanted to do this previously but was afraid I’d end up breaking something and be left without a working development environment, or a website either for that matter, so I resorted to complicating my custom queries in CakePHP with JOIN statements.
I couldn’t find a 5.0.x package for Slackware 10.0 in the package browser, so I bit the bullet and downloaded the source archive…
I extracted the files, issued the immortal
./configure && make commands, left things for a while and was pleasantly surprised when the compilation succeeded.
removepkg got rid of the old package, a new one was easily produced using checkinstall and
installpkg installed it for me.
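For my own future reference, the whole package swap boils down to the following (the version string is illustrative):

```
# removepkg mysql
# checkinstall                        (wraps "make install" into a .tgz)
# installpkg mysql-5.0.x-i486-1.tgz
```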
The only problem I had was when I went to fire up the daemon and nothing happened. It turned out that the startup script was expecting
mysqld_safe to be found in
/usr/bin instead of
/usr/local/bin where it had been installed to: that was quickly remedied with the creation of a symbolic link. From there it was plain sailing, all my databases were functional.
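The remedy is a single root command: ln -s /usr/local/bin/mysqld_safe /usr/bin/mysqld_safe. Since linking into /usr/bin needs root, here's the same idea demonstrated in a throwaway directory:

```shell
# The real fix (as root):
#   ln -s /usr/local/bin/mysqld_safe /usr/bin/mysqld_safe
# Demonstrated here in a scratch directory instead.
demo="$(mktemp -d)"
mkdir -p "$demo/usr/local/bin" "$demo/usr/bin"
touch "$demo/usr/local/bin/mysqld_safe"    # stand-in for the installed binary
ln -s "$demo/usr/local/bin/mysqld_safe" "$demo/usr/bin/mysqld_safe"
ls -l "$demo/usr/bin/mysqld_safe"          # shows the symlink and its target
```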
But enough techno-gibberish. The fact of the matter is that I managed to build a package from source, get it up and running, and the system as a whole still worked! Linux administration definitely appears to be coming more naturally to me. Bob be praised!
Not two days after mentioning my initial impressions of EuroFeeds’ usenet service, I get an email notifying me that they have upgraded retention on binary newsgroups to 60 days, with a planned upgrade to 85 days over the summer, as well as doubling the number of simultaneous connections on unlimited accounts from 4 to 8. Nice.
I’m looking forward to testing this out.
Update: Ninan is now maxing out at 1.3 MB/s. My previous newshosting.com account had it peaking at 1.1 MB/s.
My annual subscription to NewsHosting expired recently and, as I had a huge backlog of media to get through, I put up with the lack of new stuff for a while. I managed to leech a couple of releases from my ISP’s servers but was unable to obtain a few others which piqued my interest, so I endeavoured to remedy the situation.
A credit card is not something I currently have access to, so I had no option but to go with a European-based provider that would accept payment via Maestro. I studied the available information, compared and contrasted, and decided to sign up for a one-off month with EuroFeeds.
I opted for the unlimited 12 Mbit/s account, reckoning that it would be capable of saturating my 10 Mbit/s cable modem, even though I’m only allowed 4 simultaneous connections compared to 8 with NewsHosting. So far my impression is that the EuroFeeds subscription I went with isn’t as fast as my previous one, even with their servers being based in Romania and NewsHosting having theirs in the US. I haven’t tested the limits of article retention yet, but I don’t think it will be in the same league as before.
All in all, I’m glad to have access to binary newsgroups again; I’m even willing to overlook the price being comparable to the unlimited offering from the current leader in premium usenet access, Giganews. I now have a month to either track down someone with a credit card or do the unthinkable and apply for one myself, and if it comes to that I might well go with Giganews.
Now that MacServ has been deployed, keeping development and production copies of the code synchronised has become an issue. The app is still very much a work in progress, with daily requests for fixes & tweaks from the technicians using it. Instead of keeping track of modified files and then manually updating them via
scp, I decided to let laziness motivate me to utilise a less painful system.
I spent a bit of time researching the use of
rsync but decided that Subversion would better suit my needs. I started by adding a new user and creating a directory for my repository:
# adduser svn
# su svn
$ cd ~
$ mkdir repo
Next was to create the repository and make it writable by all accounts in the svn group:
$ svnadmin create repo/
$ chmod -R g+w repo/
Just to be awkward, I decided on having multiple access methods to the repository. For local access I added a username & password for myself into
repo/conf/passwd and fired up the daemon, restricting its root to the repository directory:
$ svnserve -d -r /home/svn/repo/
In order for remote access to preserve file permissions on the repository, wrappers for
svnlook had to be created:
$ cat /usr/local/bin/svn
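The standard recipe here is a wrapper that forces a group-writable umask before handing off to the real binary. Assuming the genuine svn lives in /usr/bin, it looks roughly like:

```
#!/bin/sh
# Force group-writable files so every committer can share the repository.
umask 002
exec /usr/bin/svn "$@"
```

Because /usr/local/bin comes earlier in the PATH, the wrapper gets picked up instead of the real command; the same few lines work for svnlook and svnserve.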
To prevent the logs on my development machine from filling with failed login attempts,
ssh connections to it are on a non-standard port, so the final step in enabling remote
svn+ssh:// access was to tweak the
ssh settings on the hosting account:
$ cat ~/.ssh/config
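The file ends up containing a stanza along these lines (the host name and port are placeholders, not my actual values):

```
# ~/.ssh/config on the hosting account
Host dev.example.com
    Port 2222
    User svn
```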
Finally, I was able to perform the initial check-in of my code and then check it out on the production side of things.
I’ve still got a lot of things to learn about the day-to-day use of version control: for instance, I had problems with some configuration and
.htaccess files which are required to be different between development & production. Having to enter my password multiple times to perform an update is also a bit of a drag, but it might motivate me to look into the use of ssh-agent and public-key access to my development machine, doing away with login passwords altogether…
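Should I go down that road, the key setup would look roughly like the following. The paths are scratch locations purely for illustration (a real key would live in ~/.ssh) and the host details are placeholders:

```shell
# Generate a passphrase-less RSA keypair in a scratch location.
key="$(mktemp -u /tmp/demo_key.XXXXXX)"
ssh-keygen -t rsa -N "" -f "$key" -q
# The public half gets appended to ~/.ssh/authorized_keys on the
# development machine; the private half is loaded into an agent, e.g.
#   eval "$(ssh-agent)" && ssh-add "$key"
cat "$key.pub"
```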
The latest stable release of Ninan came out a few days ago and I’ve just gotten around to upgrading my installation of it.
There wasn’t much to the upgrade process: I downloaded and untarred the archive, and I thought I’d play it smart by copying over my old
ninanconfig.xml. It appeared to work, but gave up the ghost when it came to actually downloading something. I renamed the file, restarted Ninan, re-entered all my details and preferences, and I’m now happily downloading at 1.2 MB/s.
I love this program. I still haven’t put the effort into getting the restart feature to work, and I’m not too keen on having to use the memory hog that is Java to run it, but Ninan does the job at hand and does it well.
May the leeching resume!