Scott Dowdle's blog
Don't custom build that site! The many uses for Drupal by Jakob Perry.
Here's the video of Jesse Keating's introduction to git.
This is the second hour of the two-hour Intro to SELinux presentation by Hal Pomeranz.
Please note, my battery died with about 10 minutes to go so the last bit of the presentation is missing. Sorry!
This is the first hour of the two-hour Intro to SELinux presentation by Hal Pomeranz.
Day 1 went very well. Every presenter whose session I attended allowed me to record their presentation... and as in previous years, I plan on sharing those with everyone. I have a problem though... and that is with transcoding the videos.
My family bought a new video camera for Xmas - a Samsung SC-MX20. I'll admit it, we bought it because it was inexpensive. I read the reviews before I bought it and was well aware of the issues I was going to have with it. I have run into one problem that I have yet to be able to resolve... but first a little background.

This camera records to an SDcard in .mp4 format. I had not previously had any problems transcoding MP4 video with mencoder, ffmpeg, or ffmpeg2theora... but for some reason the .mp4 files created by this camera confuse every piece of transcoding software I've tried. How? At second 2186 some property of the video changes and it just breaks the transcoder. Unfortunately I don't have the error messages in front of me, so I'm going to be quite vague. mencoder doesn't freak out, but when it hits second 2186 I start to see a lot of duplicate frame messages... which is usually no problem because I see that a lot in various videos I've converted. In this case, however, the output video mencoder creates freezes on the last frame while the audio continues just fine. I even tried iMovie on an iMac and it flat out crashed when it hit second 2186.
At first I thought it was a filesystem problem or a corrupted filesystem on the SDcard. But I've allowed the camera to format every card I've used, I've used multiple cards from different vendors, and I've used multiple computers... and every video longer than 2186 seconds has encountered this problem. I'm baffled. I've done some Internet searches but so far they have come up empty.
What's odd is that the original .mp4 video files play back just fine... I'm only having a problem transcoding them. So the worst case scenario is that I have to distribute the very big .mp4 files. I do have one strategy left to try: use Handbrake to convert them to another .mp4 file and then try to transcode that. I haven't had time to do that yet, but at this point I'm crossing my fingers and hoping that works. If I can't find a solution to this problem, I'm definitely going to have to buy another camera. Videos shorter than 2186 seconds are no problem. I guess while recording I could hit stop and start every 30 minutes, but that is very error prone.
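A related idea worth trying before the Handbrake route: have ffmpeg rewrite just the container without re-encoding, since a clean remux sometimes gets rid of mid-stream quirks that confuse transcoders. This is only a sketch of the idea, and the filename clip.mp4 is made up for illustration:

```shell
# Remux only: copy the audio and video streams into a fresh .mp4
# container without re-encoding (fast, no quality loss).
ffmpeg -i clip.mp4 -vcodec copy -acodec copy remuxed.mp4

# If the remuxed file still trips up the transcoder at second 2186,
# fall back to a full re-encode and then transcode the result.
ffmpeg -i clip.mp4 reencoded.mp4
```

If the remuxed copy transcodes cleanly, that would suggest the problem is in the container metadata rather than the video stream itself.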
I didn't have much time to proofread this, so sorry for any typos!
Warren Sanders, Paul Arnot and I arrived in Bellingham, WA pretty late. Warren drove up the night before and attended the BozemanLUG meeting. We left about 7:45 AM. Got to Missoula and stopped for breakfast at The Shack. Our good buddy psymin was there as was Franklin (I don't recall how to spell his handle) and we also got to meet sthon. psymin for the second year in a row paid for our meal. We really owe him some food if and when he ever makes it to Bozeman.
Stopped about 44 miles outside of Bellingham for dinner at some Japanese fast food type place. I had the #32, which was grilled vegetables. Warren and Paul had something called "Bento boxes". They were good.
Finally made it to the Hampton Inn about 10 PM PDT. It was amazing that we actually made it there, considering Warren didn't have any map or instructions for the trip and just decided to wing it. Amazingly, I found our previous hotel from two years ago, but Warren did have to consult Google Info for the street address of the hotel and was then transferred to the hotel to ask for the highway exit number. There it is. We found it.
The weather? Getting lots of "Seattle area sunshine" which means rain.
Paul won't stop talking about Japan. I have lots of gas although less than in previous years. Paul said he didn't hear me snoring which both Warren and I have trouble believing.
Yeah, silly blog post but hey, it WAS day 0.
Charged up the digital still camera and the video camera... so I hope to get a lot of sharables today. I'm wearing my Fedora tee-shirt and hope to get a Fedora netbook skin because I AM running Fedora on my netbook.
I recently started using a tool that I find very handy. It is named func, and it is a remote API for the management, configuration, and monitoring of systems. What does that mean exactly? I'll get into that, but first a little background.
In my day job I manage a number of Linux systems. Some are servers and more are desktop machines in labs used by students. All of the lab machines are triple-boot (Windows XP Pro, CentOS 5.4, and Fedora 12). Fedora has a lot of updates... and it is hard to keep up with them. Typically I have to ssh into each machine to work on it, but most of what I do is the same thing over and over again. Wouldn't it be nice to be able to manage multiple machines at once with a single command line? That is exactly what func does: it lets you run commands on many remote machines in parallel.
func was written by Fedora developers mainly to help them manage the server infrastructure that makes up the Fedora distribution's online public servers and build systems. They have an active mailing list that you are encouraged to participate in if you are interested in asking questions and helping to shape the future development of func.
func is written in Python and comes with a number of modules that are custom built for certain tasks. If there is an existing module for your task(s), use the existing module. If not, you can use the command module which basically allows you to run whatever command(s) you want on your remote machines.
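To give a feel for it, here is a rough sketch of what func invocations look like using the command module described above. The hostname pattern is hypothetical, and the exact commands you run would of course depend on your own machines:

```shell
# Run one command on every registered machine, in parallel.
func "*" call command run "uptime"

# Target only a subset of machines with a glob pattern
# (lab*.example.edu is a made-up pattern for illustration).
func "lab*.example.edu" call command run "yum -y update"
```

The general shape is: func <target glob> call <module> <method> <args>. The command module is the catch-all; when a purpose-built module exists for your task, use that instead.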
Red Hat Enterprise Linux 5 (Tikanga) was released on March 14, 2007 and yesterday was RHEL 5's 3rd birthday. Since then we have gotten 4 update releases.
Given the fact that Red Hat's original plan was to have a new RHEL release every 18 - 24 months, one has to wonder where RHEL 6 is and why it is so late. My best guess is that RHEL 6 (which so far has had a non-public alpha release within Red Hat as witnessed in some Bugzilla reports) will come out sometime this summer... possibly in time for the Red Hat Summit in Boston (June 22-25, 2010). For that to happen I would expect a public beta for RHEL 6 to be released in the not too distant future. We'll see how that pans out.
While we are waiting, how about some idle discussion?
I periodically check out Fedora Planet and today I noticed a big post by Josh Boyer entitled, "Why Fedora needs an Updates Policy". I left a medium-sized comment there that I decided to post here as well.
It is working pretty well without a policy... but that isn't to say that a policy isn't needed, because it would be good to have an update policy. I however like the rapid pace of updates and version churn in Fedora and I think the codification of an update policy would be slanted to always favor more conservative updates.
I like that Fedora updates KDE every time there is a new release from the KDE project. I like how I can get newer versions of things as they appear... and yes it will sometimes lead to breakage, but that was one of the charms of Fedora. On the other hand it seems that some packages are constantly updated, like every other week. That may be an exaggeration but sometimes it feels like that.
Ideally there would be a conservative updates repo and a newest-stuff repo... but I'm sure that would be more work than your already overworked group of Red Hat employees and Fedora volunteers would want to take on... and I don't blame them.
Given the rapid 6-month development cycle of Fedora and the limited lifespan of any given release... the better answer, if stability is the concern, would be to lengthen the development release cycle... but no one wants to do that, right? Another solution would be to have stated LTS releases every couple of releases, but again... that idea has been batted around several times and dismissed.
It seems many wish something would fall between the rapid development cycle of Fedora and the slow development cycle of RHEL. I don't see how that is going to happen.
Not having an update policy and the recent complaints about it will be something that is heavily criticized by those from other distros and the Linux press... but it doesn't mean that the system you have been working with and the decisions you have been making haven't been working well enough. Package makers are supposed to submit their stuff to testing, people are supposed to test and provide feedback, and only when a package is deemed sufficiently ready should it be released. I think it is better to leave it up to the package maintainers themselves what version of a piece of software they want to release... unless of course it is an underlying package that disrupts things above it... and you have tried to address that by identifying core/critical packages and putting more rules on their being updated.
I would hope any update policy Fedora comes up with would retain the current flavor of Fedora with rapid and constant updates... rather than being stuck with older releases of things when upstream has fixed a lot of bugs and released newer versions with additional features. If you don't retain that quality then it will just encourage the development of yet more third-party repositories with newer software and just make an even bigger mess. This gets back to the seemingly constant desire for Fedora to define itself and who it is targeting... and then potentially limiting itself to those more strictly defined goals. I for one like it fast and loose... but I'm just a user. :)
Just got done reading, "Confessions of an Ubuntu Fanboy". While I'm glad the author has decided to be more practical in his promotion of Linux and Ubuntu, I strongly disagree with some of his conclusions and I'll cover them below.
I have been using Linux for about 15 years now, and over the course of that time I've helped more people than I care to count with Linux installs, removals, and everything in-between. I've seen people try Linux out for a few days and give up on it. I've seen people tough it out and become valued members of our local Linux community. Linux isn't for everyone and choice is good. I no longer advocate Linux for someone who isn't willing to learn new things. I quit trying to push it on people, and now I'm somewhat selective about helping people who say they want to try Linux. I state up front that there is a learning curve and that they should expect it. If I sense that they don't have the patience to learn new things, I don't even bother.
The problem with the article in question is that the author seems to want to make Linux for everyone, and in doing so he advocates violating some important tenets. He primarily focuses on Windows users, but it could be any proprietary OS or application.