I was recently interviewed by Ryan Arsenault, the Assistant Site Editor for SearchEnterpriseLinux.com for a feature on their website discussing the new fifth edition of The Official Ubuntu Book. You may read that interview here.
Not long ago I had the privilege of interviewing the authors of what I consider to be the best book for learning systems administration with Unix or Linux from a large, enterprise perspective. This book is unusual in another way: it was published by Prentice-Hall and the foreword was written by one of their competitors, Tim O’Reilly, the founder and head of O’Reilly Media. That says something.
My interview with the authors appeared today on InformIT’s website. Take a look.
I have a new job that I am pretty excited about. The one downside is that the amount of time I have available to dedicate to Ubuntu-related projects will be a bit more limited, especially as I get going. I’ll still be around, but I probably won’t be quite as quick to respond or as readily available.
I am thrilled that as of last week, I am the Senior Technical Documentation Specialist for iPlant Collaborative, a National Science Foundation funded project that is creating a new cyberinfrastructure to assist research in plant biology. My responsibilities include working with programmers and biologists to create the documentation for the project software, which requires some translation between those who are highly proficient in computer technology but not biology and those who are highly proficient in biology but not computer technology…which means I’m spending some time in intense study to learn about plant genetics. Fun stuff. 🙂
Hackers: Heroes of the Computer Revolution is a history of how the use of computers began, grew, and spread beyond the big businesses and governments that created them in proprietary silos. This 25th anniversary edition of Steven Levy’s classic book retains its detailed and interesting chronicle of the events that brought computing power to the masses. It also records some of the problems, pitfalls, and failures along the way. Here you will find many names that computer lovers are sure to recognize, from Bill Gates to Richard Stallman, as well as many that are less well known but whose victories also deserve to be recorded.
I greatly appreciate that this book exists. To be honest, it wasn’t always a fun read. That isn’t a commentary on the quality of the writing, but rather on the ups and downs of the narrative. There were times when I found myself wishing I was there in the middle of the action and other times when I had difficulty knowing who to root for. There were still other moments when I found myself cringing as I read about events long past, wishing that different decisions had been made or disappointed at the actions and attitudes of geniuses.
I’m not going to spoil the book for anyone interested by giving out specific details. All I’ll say here is that the story begins with a bunch of model railroaders who love technology and who fall in love with a computer they discover they may access freely in an out-of-the-way room in a building at MIT in the late 1950s. They took their love of piecing together technological gadgets in imaginative and creative ways (hacks) and applied it to this new tool/toy. The story follows their exploits and adventures through the 1960s en route to a second wave of hackers in Northern California in the 1970s who took that love home, creating machines on a smaller budget that could be used by ordinary people. Hot on their heels was another group of Californians who led a third wave, hacking software to do things never before dreamed of and leading the way to the commercialization of the computer. The book ends with a series of afterwords: one written when the book was first published in 1984, another written 10 years later, and another just added to this newly published edition. Each adds details and commentary to the history that were not known at the time of the original interviews and research.
If the history of hacking, free and open source software and the attitudes embodied in the current movement interest you, you will appreciate this book greatly.
Disclosures: I was given my copy of this book free by O’Reilly as a review copy. I also write for O’Reilly.
Note: this post is outdated. Use at your own risk.
I’ve been using nginx for this blog and other sites for well over a year, beginning with Ubuntu 8.10. I have had to figure out some things, but overall I have been very pleased. I have upgraded the server for each Ubuntu release since then with no real problems. Yesterday I upgraded to 10.04 and thought all was well when I went to bed last night. However, at some time during the night all of my sites began to return 504 Gateway Timeout errors. Hmm.
I did some checking in logs and detective work with top and such and found that my load averages were running between 6 and 8, on a server that has averaged less than 1 for well over a year. After some research, I discovered that the php-fastcgi process was spawning child processes that did not die off when complete. I have no idea why as I did not change any of the nginx, php-fastcgi or other settings. The high load averages dropped to 0 when I stopped the php-fastcgi service.
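For anyone wanting to do the same kind of detective work, a minimal sketch of the checks described above (standard Linux commands; nothing here is specific to my server):

```shell
# Snapshot the 1-, 5-, and 15-minute load averages
cut -d' ' -f1-3 /proc/loadavg

# List the top CPU consumers; lingering FastCGI child processes
# that never exit will pile up near the top of this list
ps aux --sort=-%cpu | head -n 10
```

On a box that normally idles below 1, sustained load averages of 6 to 8 paired with a growing number of identical PHP processes in the `ps` output is a strong hint that children are being spawned but never reaped.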
After some documentation reading and other failed attempts, I finally solved the 504 problem by making one change in my
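I won’t pretend to remember which knob it was for everyone’s setup, but as an illustration only (this is not necessarily the change I made), two settings commonly involved in 504s on an nginx + php-fastcgi stack are nginx’s `fastcgi_read_timeout` and the `PHP_FCGI_MAX_REQUESTS` environment variable, which caps how many requests a PHP child serves before it exits and is respawned:

```nginx
# Illustrative fragment of an nginx site config — the location path,
# backend address, and timeout value are assumptions, not my actual setup.
location ~ \.php$ {
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_read_timeout 120s;   # nginx default is 60s
    include fastcgi_params;
}
```

The `PHP_FCGI_MAX_REQUESTS` side lives wherever php-fastcgi is launched (an init script, in my era), not in the nginx config; setting it to a modest value forces long-lived children to recycle themselves instead of lingering.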
I would really like to figure out why the child processes were not ending properly before and why they are now and better understand what is going on. I’ve also noticed that the responsiveness of the site seems slower, but that could just be my imagination as I have no measurements to confirm/deny. Anyway, if anyone has any ideas, please comment.
All of my sites, including this one, were offline for a bit today while I upgraded the operating system on my server. I am now running Ubuntu 10.04 LTS on this server. The upgrade was easy and smooth. Yay!
Yes, we can! Robbie Williamson, Engineering Manager at Canonical and an influential voice in Ubuntu’s release schedule, responded on his blog to Mark Shuttleworth’s call to see if we could release 10.10 on 10/10/10 (which, read as the binary number 101010, equals 42, every geek’s favorite number). Take a look.
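For the skeptical, the parenthetical arithmetic checks out; bash can evaluate it directly with its `base#value` syntax:

```shell
# 101010 in binary = 32 + 8 + 2 = 42
echo $((2#101010))   # prints 42
```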
Do you like it when your operating system “just works?” I do. This does not happen easily or without hard work. Ubuntu has a wonderful QA team that has a systematic method of testing releases on diverse hardware platforms. However, they don’t own every piece of equipment out there. This doesn’t have to be the end of the story. Anyone who is willing to do a little bit of work and follow some very clearly outlined procedures may become a part of the team and help make releases better. Interested? Take a look at https://wiki.ubuntu.com/Testing for ways that community members can join the Testing Team and http://qa.ubuntu.com/ for information on the QA Team. These two groups work together toward the common goal of making Ubuntu releases the best they can be through finding bugs, reporting them, and helping find problems on an even wider set of hardware.
For those who hadn’t seen it and want to.
There is a new initiative from the Canonical/Ubuntu Design Team to do a much better job communicating their thoughts, ideas and plans to the wider community. They have started a blog at http://design.canonical.com/ that I believe is worth reading regularly. Fire up your RSS feed reader and subscribe after taking a look at the wonderful foundation they have created to kick things off.
EDIT: I should mention that the main way that the Design Team communicates is via the ayatana mailing list. You can find it here: