“Civilization, in fact, grows more and more maudlin and hysterical; especially under democracy it tends to degenerate into a mere combat of crazes; the whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by an endless series of hobgoblins, most of them imaginary.”
H.L. Mencken
From the following article:
The Airport Security Follies
Monday, December 31, 2007
Thursday, December 6, 2007
Unix command fun ...
I try to use Unix commands whenever possible to keep my knowledge of them fresh. Today, I needed to remove quite a few files from an SVN repository; I had moved them using mv instead of svn mv, so Subversion still thought they existed. I used the following to capture the output of svn status, break it up and filter it, and then run the results through a script I wrote that performs the svn del:
svn_del.sh `svn st | cut -d' ' -f7 | grep .sql`
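The post doesn't show svn_del.sh itself, so here is a minimal sketch of what such a helper might look like; the function name and behavior are assumptions, not the author's actual script:

```shell
#!/bin/sh
# Hypothetical sketch of svn_del.sh: schedule each path passed as an
# argument for deletion with `svn del`, skipping any empty arguments
# that slip through the backtick expansion.
svn_del() {
  for f in "$@"; do
    [ -n "$f" ] && svn del "$f"
  done
}

svn_del "$@"
```

Because the paths come from command substitution, a file name with spaces would be split into separate arguments; for this quick-and-dirty workflow that trade-off is usually acceptable.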
Wednesday, October 10, 2007
Dynamic Reports In Excel using HTML
We use XSLT to generate reports, and typically the destination is Excel. So a report is generally basic HTML tables, and Excel parses the table info. Today, I needed to sum an amount column but had no way of knowing in advance how many rows the result would contain. After spending considerable time on Google, I came away with this:
=SUM(A1:INDIRECT(ADDRESS(ROW()-1,1)))
Seemed pretty intuitive once I found it: ADDRESS(ROW()-1,1) builds a text reference to column A of the row just above the formula, and INDIRECT turns that text into a live cell reference, so the SUM range always ends at the last data row no matter how many rows the report produces. The problem is hitting the right combination of keywords when searching, so hopefully, by describing my problem and the solution here, I have narrowed someone else's search a bit.
Friday, September 28, 2007
Service companies that aren't
I completely agree with this post.
crappy-companies-with-broken-services
And perhaps the White House and those in Congress should read it as well. I heard this morning they are considering legislation for the airline industry with respect to on-time arrivals. The airlines only treat customers this way because we keep flying with them; if we stop flying, things will change. And before you say "but we need to fly and we don't have any other options": you can choose not to fly with these airlines, or you can stop complaining about the way they treat you. I know, I do it too, but we don't really have the right to complain when we condone the behavior by continuing to fly with them.
And honestly, don't the people in D.C. have more critical matters to concern themselves with?
Wednesday, September 26, 2007
Financial Information
Why don't banks have RSS feeds? I spend a considerable amount of time writing screen-scraping utilities to get my data out of various banking web sites. Why can't they simply make this information available via a feed? I would gladly pay additional fees for this service.
Saturday, September 22, 2007
S3 as object store
I'm not entirely sure where this is going but these two links appear to be suggesting that S3 could be used as a database for certain applications. I've been playing with S3 persistent objects as suggested in nutrun's post and it is pretty interesting.
amazon-s3-persistent-ruby-objects
read-consistency-dumb-databases-smart-services
Doubt it will be of much use in the immediate future but certainly something to play with and understand better.
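The persistent-object idea can be sketched in a few lines: serialize a Ruby object with Marshal and write it to a key/value store. A Hash stands in for S3 below; with the aws-s3 gem you would swap the backend reads and writes for AWS::S3::S3Object.store / S3Object.value (those gem method names are from memory, so treat them as an assumption):

```ruby
# Minimal sketch of S3-as-object-store: Marshal handles serialization,
# and the backend is anything with []= and [] -- a Hash here, S3 in
# the real thing.
class ObjectStore
  def initialize(backend = {})
    @backend = backend
  end

  def save(key, obj)
    @backend[key] = Marshal.dump(obj)   # any marshalable object works
  end

  def load(key)
    Marshal.load(@backend[key])
  end
end

store = ObjectStore.new
store.save("report", { "year" => 2007, "rows" => [1, 2, 3] })
store.load("report")   # round-trips to an equal hash
```

The interesting design question is exactly the one raised in the read-consistency post: S3 is eventually consistent, so a load immediately after a save may see stale data.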
Wednesday, September 19, 2007
Mac OSX and gcc_select
I was installing a gem and then trying to run a short test and received this error.
cc: installation problem, cannot exec `cc1': No such file or directory
After a quick Google search I saw the problem appeared to be with gcc itself. I then ran gcc_select and realized I set the version to 3.3 at some point in the past (most likely to install other software) and when I set it back to 4.0, the problem vanished.
sudo gcc_select 4.0
Sunday, September 9, 2007
Generating Value
I saw this over the weekend and thought it was interesting. They were able to generate value for the client well before the actual technology solution was even close to ready. We often think of value as residing in the end product when in fact it can be realized much sooner through the services delivered.
RollerSkateImplementation
It also helps the development staff tweak the final feature because they are going through the process iteratively. Good stuff!
Saturday, September 8, 2007
TestXSLT - XSL Processor for the Mac
I needed to update a few XSL templates and decided to look for a processor that would run on my MBP to test the output. I ran XMLSpy on WinXP previously and found it quite useful. Although TestXSLT doesn't have all the cool IDE features of XMLSpy, it's perfect for making and reviewing quick changes.
Download it here!
Branding trumps quality
I was listening to Bloomberg today (podcast) and part of the discussion went along the lines of "craftsmanship doesn't matter as much as the branding". The point was no one cared about the quality and focused on the "brand". This seems completely counter-intuitive. Quality of product and service will always be the most important feature for me!
Thursday, September 6, 2007
Email Issue Solved (smtp transport event sink to the rescue)
I was asked to make a small modification to the emails sent out when a new user is registered for an application. In 99% of the cases, this is a simple task. However, I found the 1% that it's not.
This is a rather large legacy application which I have not attempted to recompile yet. Recompile you ask? To change an email? Yep, the text of the email was stuck right there in the code. So to make my minor change, I would have to tackle this and do a significant amount of testing.
So I decided to look for an alternative and stumbled upon the transport event sink in the SMTP server running in Windows 2000 Server. This feature will allow you to register a script to run before the email is actually sent.
How to register a transport event sink for the SMTP Service in Exchange 2000 Server
So I followed the instructions, wrote a little script to handle my changes and went on about my day.
Wednesday, September 5, 2007
CouchDB
Assaf over at Labnotes has a great introduction to an interesting new DB. After searching around a bit more I found this to help get everything installed on my Mac. building and installing couchdb on osx.
I find this interesting because it appears to be highly flexible with respect to the data elements each table contains. Would certainly help in applications where users want to extend concepts with their own attributes.
Monday, August 13, 2007
Ruby Exceptions and instance_eval
I started working on the tax natural language project again and noticed that whenever the parser encountered a line of text it did not understand, an exception was thrown. This is the intended behavior, but I was not able to catch the error. After a few minutes on Google, I found an article stating that for errors raised inside instance_eval to be caught, the custom exception class MUST inherit from StandardError. Once I made this change, all worked as expected.
http://hasno.info/2006/12/14/ruby-gotchas-and-caveats
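The gotcha reproduces in a few lines: a bare `rescue` clause is shorthand for `rescue StandardError`, so an exception class that inherits directly from Exception sails right past it (the class names below are made up for illustration):

```ruby
# A bare `rescue` only catches StandardError and its descendants.
class BadParseError  < Exception;     end  # escapes a bare rescue
class GoodParseError < StandardError; end  # caught by a bare rescue

def parse(line, error_class)
  raise error_class, "cannot parse: #{line}"
rescue => e                  # bare rescue == rescue StandardError
  "caught #{e.class}"
end

puts parse("gibberish", GoodParseError)   # caught GoodParseError
begin
  parse("gibberish", BadParseError)       # the bare rescue misses this one
rescue Exception => e
  puts "escaped all the way out: #{e.class}"
end
```

This is why inheriting the parser's custom error from StandardError fixed the problem: the rescue clauses finally matched it.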
Sunday, March 25, 2007
Breaking up large files for transport to S3
I decided to try to place a copy of my Windows XP virtual image on S3 for backup. I quickly realized that JungleDisk has a file-size limitation, and so does S3. That's probably not a bad idea, since an upload from home or the office will most likely be limited to 384k or 768k up; a 30G file would take a while. And if we break the file into pieces and something happens to the connection, it will be much easier to pick up again next time.
I looked for an easy way to split the file up and stumbled across split. It takes an input file and breaks it into pieces based on the options you provide; see the man page for details.
Here's the command I used:
split -b 102400k winxp.hdd.aa winxp.hdd.part.
This broke my file up into 100M chunks. Now I can load them to S3.
If you ever need to reconstruct the file, the cat command works well. You'll need to cat the pieces together in the appropriate order, but that shouldn't be difficult given the way we split them up. For example, to recreate the original image, you would simply type
cat winxp.hdd.part.aa winxp.hdd.part.ab ... winxp.hdd.part.az > winxp.hdd.aa
Could this be any easier?
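A quick way to convince yourself the split/cat round trip is lossless, scaled down to 1M chunks and a throwaway 3M test file:

```shell
# Create a 3M test file, split it into 1M chunks, reassemble, and compare.
# The shell expands big.img.part.* in lexical order (aa, ab, ac, ...),
# which is exactly the order split wrote the pieces in.
dd if=/dev/urandom of=big.img bs=1024 count=3072 2>/dev/null
split -b 1024k big.img big.img.part.
cat big.img.part.* > big.img.restored
cmp big.img big.img.restored && echo "files match"
```

The lexical-order property of the generated suffixes is what makes the glob safe; with the 100M chunks above, the same `cat prefix.*` shortcut would save typing out every part name.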
Monday, March 12, 2007
More EC2 Progress
I was able to get a few more things installed. After finishing up with SWIG and the ruby bindings so I could get Collaboa running, things started acting funny. Turns out I was running out of space on the main volume. It was then I realized that I needed to move some directories to the /mnt volume.
I moved the /usr/local directories and the /var directory (and with it the MySQL files). I then went back and uninstalled a few items that had been installed via yum, since it appears to dump everything into /usr. Since I compiled Apache 2.2 myself, everything for it was already in /usr/local/apache2.
This also led me to further enhancements to the autorun script. It now does a bit more than simply register the domain name with ZoneEdit: I sync the /mnt directory to S3 and pull it back down on boot. That means I'll need to create some symbolic links so everything looks right and then restart the services that tried to start before my script ran.
I haven't tried a reboot yet; it shouldn't be too difficult, though it will lengthen the start-up time since it's pulling about 200M from S3.
I think I'll also be running ec2-bundle-vol possibly daily to have a hot backup ready. Not sure on this one yet.
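The boot-time flow described above might be sketched like this; register_dns, s3_pull, and restart_service are placeholders, since the post doesn't show the actual ZoneEdit call or the S3 sync tool:

```shell
# Hypothetical sketch of the autorun script's boot sequence. Swap the
# placeholder functions for whatever tools you actually use (a ZoneEdit
# HTTP call, s3sync, /sbin/service, etc.).
autorun() {
  register_dns                          # point the domain at the new public IP
  s3_pull /mnt                          # restore ~200M of synced data from S3
  ln -sfn /mnt/mysql /var/lib/mysql     # recreate the paths services expect
  restart_service mysqld                # bounce services that started too early
}
```

The ordering matters: the symlinks must exist before the services are restarted, or MySQL would come up against an empty data directory again.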
Saturday, March 10, 2007
Editing Cron through a script
This one is pretty easy ...
Basically, copy the current crontab to a temporary file, then append the text from some_file, and finally hand the merged file back to crontab to install it as the new schedule.
crontab -l > /tmp/file
cat some_file >> /tmp/file
crontab /tmp/file
EC2 Progress
I finally managed to create a baseline image on EC2. I needed Apache to serve several sites as well as Subversion, which took a while due to my lack of understanding of the Apache config files. Basically, when Apache is serving name-based virtual hosts (more than one site on a single IP address), the global DocumentRoot and ServerName settings are ignored; each virtual host must define its own. That took a while to work through.
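A minimal name-based setup looks roughly like this (Apache 2.2 syntax; the host names and paths are placeholders, not the actual configuration):

```apache
# With NameVirtualHost enabled, ServerName and DocumentRoot come from
# each <VirtualHost> block -- the global settings no longer apply.
NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /usr/local/apache2/htdocs/example
</VirtualHost>

<VirtualHost *:80>
    ServerName svn.example.com
    <Location /repos>
        DAV svn
        SVNParentPath /usr/local/svn
    </Location>
</VirtualHost>
```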
Next step will be to have the image connect to S3 on boot and pull down a set of scripts to perform various tasks, such as setting the domain name properly, since we don't get a static IP. We'll also need to modify the cron files to make sure all the data is backed up on a regular schedule. It will be set up so that, based on the URL passed in when the instance is started, it can perform different tasks; this way we can use the same mechanism across multiple instances for different needs.