06 December, 2007

OpenOffice File Converter Macro

Some time back I published an entry showing how to execute an OpenOffice macro from the command line. This opens up the possibility of batch-processing files, the original reason I started pursuing this. I author a series of publications stored in odt, OpenOffice's native file format. Unfortunately, since that format is far from universally supported, this is only useful if I can convert the files to a more widely recognized standard (like PDF).

The macro defined below accepts a file name as an argument, converts the file to an associated PDF and exits.


REM ***** BASIC *****

Sub SaveAsPDF( cFile )

' cFile = "/home/lipeltgm/sample.odt"
cURL = ConvertToURL( cFile )
oDoc = StarDesktop.loadComponentFromURL( cURL, "_blank", 0, Array() )

cFile = Left( cFile, Len( cFile ) - 4 ) + ".pdf"
cURL = ConvertToURL( cFile )
oDoc.storeToURL( cURL, Array( MakePropertyValue( "FilterName", "writer_pdf_Export" ) ) )

Shell("pkill soffice.bin")
End Sub

Function MakePropertyValue( Optional cName As String, Optional uValue ) As com.sun.star.beans.PropertyValue
Dim oPropertyValue As New com.sun.star.beans.PropertyValue
If Not IsMissing( cName ) Then
oPropertyValue.Name = cName
End If
If Not IsMissing( uValue ) Then
oPropertyValue.Value = uValue
End If
MakePropertyValue() = oPropertyValue
End Function



Execute the macro from the command line as follows:


$ soffice -invisible "macro:///Standard.Module1.SaveAsPDF(/home/myUser/sample.odt)"

It's worth noting that the 3rd '/' in 'macro:///' refers to the application-level macro container (as opposed to a macro stored inside a particular document); in this instance the macro lives in the application's Standard library.

02 December, 2007

Command Line OpenOffice Macro Execution

First start by creating a new macro, something simple.

Create the macro by selecting Tools->Macros->Organize Macros->OpenOffice.org Basic

Edit a new Macro1 in Module1

Define the Macro1 as below:

Sub Macro1
MsgBox( "Hello World" )
End Sub


Save it and execute it from the command line by:

$ soffice macro:///Standard.Module1.Macro1

You can run OpenOffice headless by specifying the -invisible argument.

$ soffice -invisible macro:///Standard.Module1.Macro1

19 September, 2007

Piping to Tcl Script

Ok, maybe I've lived up to my name. I may very well be the fat slow
kid.

I struggled for some time trying to figure out how to pipe input to a Tcl script, and immediately ventured off on a quest to open a pipe (|) file descriptor. In actuality, since I just wanted to pipe input into the Tcl script, my needs were met by:

#!/usr/bin/tclsh
while { [gets stdin line] >= 0 } {
puts "got '$line'"
}

so

$ ls | ./tclPipe

results in

got 'Desktop'
got 'IM'
got 'ImagingManager.tar.gz'
got 'ISC'
got 'tclPipe'


My quest did, however, turn up an alternative: if the input command is fixed,

#!/usr/bin/tclsh

set fp [open "|ls" r]
while { [gets $fp line] >= 0 } {
puts "got $line"
}

would suffice.

16 September, 2007

Command Line Processing with Popt

General practice for writing flexible applications leans toward the ability to specify command-line arguments that tailor the behavior of your application. A common example is specifying -v to enable a verbose mode in which debugging information is echoed as the application runs. Specifying a unique port for networking applications, or enabling/disabling an application GUI, are two other common examples. Since command-line arguments can take the form of integers, floats, doubles, strings and so on, the code for parsing them can be far from simple. Hand-tailoring a parser every time you develop an application is labor-intensive and a waste of time when a general mechanism is available.

Lucky for us, such a mechanism does exist; namely the Popt library.

A short example of an application which uses the library follows:

#include <stdio.h>
#include <stdlib.h>
#include <popt.h>

int main(int argc, char **argv) {
/* option parsing variables */
int ch;
poptContext opt_con; /* context for parsing command-line options */
char *extra_arg;
int i=0;
char *s="";
float f=0.0;
double d=0.0;
int verbose=0;

static struct poptOption options_table[] = {
{ "integer", 'i', POPT_ARG_INT, &i, 'i', "grab an integer", "INT" },
{ "string", 's', POPT_ARG_STRING, &s, 's', "grab a string", "STRING" },
{ "float", 'f', POPT_ARG_FLOAT, &f, 'f', "grab a float", "FLOAT" },
{ "double", 'd', POPT_ARG_DOUBLE, &d, 'd', "grab a double", "DOUBLE" },
{ "verbose", 'v', POPT_ARG_NONE, NULL, 'v', "enable verbose", "" },
POPT_AUTOHELP
{ NULL, 0, 0, NULL, 0 } /* end-of-list terminator */
};

opt_con = poptGetContext(NULL, argc, (const char **)argv, options_table, 0);

/* Now do options processing */
while ((ch = poptGetNextOpt(opt_con)) >= 0) {
printf("between while & switch: ch = %c\n", ch);
switch (ch) {
case 'i':
printf("handling 'i' option.\n");
break;
case 's':
printf("handling 's' option.\n");
break;
case 'f':
printf("handling 'f' option.\n");
break;
case 'd':
printf("handling 'd' option.\n");
break;
case 'v':
printf("handling 'v' option.\n");
verbose = 1;
break;
}
}

if (ch < -1) {
/* the user specified an invalid option, tell them */
poptPrintHelp(opt_con, stderr, 0);
}

/* non-option args */
while ((extra_arg = (char *)poptGetArg(opt_con))) {
printf("extra arg: %s\n", extra_arg);
exit(1);
}

/* cleanup */
poptFreeContext(opt_con);

printf("(%s:%d) i = %d\n",__FILE__,__LINE__,i);
printf("(%s:%d) s = '%s'\n",__FILE__,__LINE__,s);
printf("(%s:%d) f = %f\n",__FILE__,__LINE__,f);
printf("(%s:%d) d = %lf\n",__FILE__,__LINE__,d);
printf("(%s:%d) v = %d\n",__FILE__,__LINE__,verbose);

return EXIT_SUCCESS;
}


You compile and link the application by:

$ gcc -Wall main.c -lpopt -o main


An added bonus of using the library is the automatically generated interface for brief and more verbose help descriptions.


~/Desktop/sourceCode/C/popt$ ./main --help
Usage: main [OPTION...]
-i, --integer=INT grab an integer
-s, --string=STRING grab a string
-f, --float=FLOAT grab a float
-d, --double=DOUBLE grab a double
-v, --verbose enable verbose

Help options:
-?, --help Show this help message
--usage Display brief usage message



~/Desktop/sourceCode/C/popt$ ./main --usage
Usage: main [-v?] [-i|--integer INT] [-s|--string STRING] [-f|--float FLOAT]
[-d|--double DOUBLE] [-v|--verbose] [-?|--help] [--usage]



The heart of the library lies with proper initialization of the popt structure which is defined as follows:

struct poptOption {
const char * longName;
char shortName;
int argInfo;
void * arg;
int val;
const char * descrip;
const char * argDescrip;
};

The ability to specify either a long or short argument name is common practice; -h or --help is a familiar example of this form.
The long and short forms are specified as the 1st and 2nd elements of the popt structure. The 3rd and 4th specify the type of the argument that follows and the address at which the value will be stored. The 6th and 7th fields give the argument description and the argument field name displayed in the help and usage output. The 5th field, val, is the value poptGetNextOpt() returns when the option is encountered, which is why the switch statement in the example above can match on the short-option characters.

That's all for now.

28 August, 2007

Gprof Profiling

Profiling is a mechanism for identifying program 'hotspots': the regions where your program spends the majority of its time. These regions are prime candidates for optimization, where a concentrated effort at efficiency will give you the most bang for your buck.

Profiling with Gprof consists of 3-phases: 1) compiling your application with profiling enabled, 2) executing the application to gather profiling metrics, and 3) evaluating the collected metrics.

Compiling
You first must compile your application similar to the way you normally compile it. Two additional flags must be specified however, the -pg option to enable profiling, and the -g option to introduce the debugging symbols for tracking source lines of code. Actually, the -g option is only necessary for line-by-line profiling, but for good measure I suggest specifying it regardless.

$ gcc -pg -g foo.c -o foo


Executing
You run your application in the same manner as you normally run it, specifying the same arguments, inputs, outputs, . . . What you may notice, however, is that the program executes slower than normal. This is reasonable when you consider what is taking place: profiling metrics are collected during the execution of the program. Worth noting, your executable needs to terminate in a normal fashion, by returning from the main routine or calling exit().
Immediately prior to the program terminating, an output file (gmon.out) is generated that contains the collected profiling metrics. This file is later used in evaluating the program's performance.

Evaluation
Evaluation of the results stored in the gmon.out file is the last step in our quest. Various types of reports are available from the collected profiler info, the most readily used is the flat model. This is done by executing the command:

$ gprof foo


This results in the default report, a flat profile followed by the call graph. The flat profile lists the functions in decreasing order of time spent in each, allowing quick identification of the functions your application spends most of its time in. You may notice that two functions appear in every profile, mcount and profil; each is part of the profiling apparatus, and the time spent in both can be viewed as profiler overhead.

12 August, 2007

FFMpeg Video Clip Generation

Hey all,

I just published my first two videos on YouTube.

I downloaded a couple of videos from our Tivo, used tivodecode to decode them to MPEG files, and FFmpeg to extract 30-second clips.


$ ffmpeg -ss 600 -t 30 -i infile.mpg outfile.mpg


This seeks 600 seconds (10 minutes) into infile.mpg, encodes the next 30 seconds, and saves the clip to outfile.mpg.

02 August, 2007

Theora/Ogg Example

I've recently been playing with Theora for encoding video. I found few examples of encoding raw video frames, and those I did find were more complicated than I wanted. Below you'll find my first toe-dip into the Theora realm. It generates a spinning dot in the middle of the frame and encodes the result to video at 10 frames per sec.

Have fun.


#define _FILE_OFFSET_BITS 64

#include <stdio.h>
#include <stdlib.h>
#include <ogg/ogg.h>
#include "theora/theora.h"
#include <string.h>
#include <math.h>

static FILE *ogg_fp = NULL;
static ogg_stream_state ogg_os;
static theora_state theora_td;
static theora_info theora_ti;

static int
theora_open(const char *pathname)
{
printf("(%s:%d) out filename: %s\n",__FILE__,__LINE__,pathname);
ogg_packet op;
ogg_page og;
theora_comment tc;

ogg_fp = fopen(pathname, "wb");
if(!ogg_fp) {
fprintf(stderr, "%s: error: %s\n",
pathname, "couldn't open output file");
return 1;
}

if(ogg_stream_init(&ogg_os, rand())) {
fprintf(stderr, "%s: error: %s\n",
pathname, "couldn't create ogg stream state");
return 1;
}

if(theora_encode_init(&theora_td, &theora_ti)) {
fprintf(stderr, "%s: error: %s\n",
pathname, "couldn't initialize theora encoding");
return 1;
}

theora_encode_header(&theora_td, &op);
ogg_stream_packetin(&ogg_os, &op);
if(ogg_stream_pageout(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

// encode a comment into the packet
theora_comment_init(&tc);
theora_encode_comment(&tc, &op);
ogg_stream_packetin(&ogg_os, &op);
if(ogg_stream_pageout(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

theora_encode_tables(&theora_td, &op);
ogg_stream_packetin(&ogg_os, &op);
if(ogg_stream_pageout(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

if(ogg_stream_flush(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

return 0;
}

static int
theora_write_frame(unsigned long w, unsigned long h, unsigned char *yuv)
{
yuv_buffer yuv_buf;
ogg_packet op;
ogg_page og;

unsigned long yuv_w;
unsigned long yuv_h;

unsigned char *yuv_y;
unsigned char *yuv_u;
unsigned char *yuv_v;

unsigned int x;
unsigned int y;

/* Must hold: yuv_w >= w */
yuv_w = (w + 15) & ~15;

/* Must hold: yuv_h >= h */
yuv_h = (h + 15) & ~15;

yuv_y = malloc(yuv_w * yuv_h);
yuv_u = malloc(yuv_w * yuv_h / 4);
yuv_v = malloc(yuv_w * yuv_h / 4);

yuv_buf.y_width = yuv_w;
yuv_buf.y_height = yuv_h;
yuv_buf.y_stride = yuv_w;
yuv_buf.uv_width = yuv_w >> 1;
yuv_buf.uv_height = yuv_h >> 1;
yuv_buf.uv_stride = yuv_w >> 1;
yuv_buf.y = yuv_y;
yuv_buf.u = yuv_u;
yuv_buf.v = yuv_v;

for(y = 0; y < yuv_h; y++) {
for(x = 0; x < yuv_w; x++) {
yuv_y[x + y * yuv_w] = 0;
yuv_u[(x >> 1) + (y >> 1) * (yuv_w >> 1)] = 0;
yuv_v[(x >> 1) + (y >> 1) * (yuv_w >> 1)] = 0;
}
}

for(y = 0; y < h; y++) {
for(x = 0; x < w; x++) {
yuv_y[x + y * yuv_w] = yuv[3 * (x + y * w) + 0];
yuv_u[(x >> 1) + (y >> 1) * (yuv_w >> 1)] =
yuv[3 * (x + y * w) + 1];
yuv_v[(x >> 1) + (y >> 1) * (yuv_w >> 1)] =
yuv[3 * (x + y * w) + 2];
}
}

if(theora_encode_YUVin(&theora_td, &yuv_buf)) {
return 1;
}

if(!theora_encode_packetout(&theora_td, 0, &op)) {
return 1;
}

ogg_stream_packetin(&ogg_os, &op);
if(ogg_stream_pageout(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

free(yuv_y);
free(yuv_u);
free(yuv_v);

return 0;
}

static void
theora_close(void)
{
ogg_packet op;
ogg_page og;

if (ogg_fp) {
/* flush the final packet/page */
theora_encode_packetout(&theora_td, 1, &op);
if(ogg_stream_pageout(&ogg_os, &og)) {
fwrite(og.header, og.header_len, 1, ogg_fp);
fwrite(og.body, og.body_len, 1, ogg_fp);
}

theora_info_clear(&theora_ti);
theora_clear(&theora_td);

fflush(ogg_fp);
fclose(ogg_fp);
ogg_fp = NULL;
}

ogg_stream_clear(&ogg_os);
}

int
main(int argc, char *argv[])
{
int c;
int n;
unsigned i;
const char* const outFile="foo.ogg";
const unsigned NumFrames = 512;

const unsigned Width = 640/2;
const unsigned Height = 480/2;
const int video_fps_numerator = 30;
const int video_fps_denominator = 1;
const int video_aspect_numerator = 0;
const int video_aspect_denominator = 0;
const int video_rate = 0;
const int video_quality = 63;
theora_info_init(&theora_ti);

theora_ti.width = ((Width + 15) >> 4) << 4;
theora_ti.height = ((Height + 15) >> 4) << 4;
theora_ti.frame_width = Width;
theora_ti.frame_height = Height;
theora_ti.offset_x = 0;
theora_ti.offset_y = 0;
theora_ti.fps_numerator = video_fps_numerator;
theora_ti.fps_denominator = video_fps_denominator;
theora_ti.aspect_numerator = video_aspect_numerator;
theora_ti.aspect_denominator = video_aspect_denominator;
theora_ti.colorspace = OC_CS_UNSPECIFIED;
theora_ti.pixelformat = OC_PF_420;
theora_ti.target_bitrate = video_rate;
theora_ti.quality = video_quality;
theora_ti.dropframes_p = 0;
theora_ti.quick_p = 1;
theora_ti.keyframe_auto_p = 1;
theora_ti.keyframe_frequency = 64;
theora_ti.keyframe_frequency_force = 64;
theora_ti.keyframe_data_target_bitrate = video_rate * 1.5;
theora_ti.keyframe_mindistance = 8;
theora_ti.noise_sensitivity = 1;

theora_open(outFile);

for(i = 0; i < NumFrames; i++) {
printf("frame %u of %u\n", i, NumFrames);

unsigned char yuv[3 * Width * Height];

/* background color (white) */
const unsigned char R = 255;
const unsigned char G = 255;
const unsigned char B = 255;
const unsigned char Y = (abs(R * 2104 + G * 4130 + B * 802 + 4096 + 131072) >> 13);
const unsigned char U = (abs(R * -1214 + G * -2384 + B * 3598 + 4096 + 1048576) >> 13);
const unsigned char V = (abs(R * 3598 + G * -3013 + B * -585 + 4096 + 1048576) >> 13);
// generate a frame with specified color
{
unsigned i;
for(i = 0; i < Width * Height; i++) {
yuv[3 * i + 0] = Y;
yuv[3 * i + 1] = U;
yuv[3 * i + 2] = V;
}
}

/* draw a rotating dot (blue) */
{
const unsigned char R = 0;
const unsigned char G = 0;
const unsigned char B = 255;
const unsigned char Y = (abs(R * 2104 + G * 4130 + B * 802 + 4096 + 131072) >> 13);
const unsigned char U = (abs(R * -1214 + G * -2384 + B * 3598 + 4096 + 1048576) >> 13);
const unsigned char V = (abs(R * 3598 + G * -3013 + B * -585 + 4096 + 1048576) >> 13);
const unsigned cX = Width/2;
const unsigned cY = Height/2;
const unsigned Radius = 50;
static double theta = 0.0;
const unsigned x = Radius * sin(theta) + cX;
const unsigned y = Radius * cos(theta) + cY;
const unsigned k = 3 * (x + (Width * y));
theta -= 5.0 * 3.14159/180.0;
yuv[k] = Y;
yuv[k+1] = U;
yuv[k+2] = V;
}

if(theora_write_frame(Width, Height, yuv)) {
theora_close();
exit(1);
}
}

theora_close();
return 0;
}

05 July, 2007

Logging to SysLog

Trace logging of significant events is a fundamental method for debugging and isolating errors. User warnings, pop-up notification boxes, and application-level log files are one way to go. Another is to make use of the standard system log services found on *nix systems. The remainder of this post presents some brief information I've recently discovered concerning interfacing with the system log services.

The syslog system import defines the interface:


extern void syslog (int __pri, __const char *__fmt, ...)


which allows for variable-length argument lists, much like printf. The interface is pretty simple: the first argument specifies a priority, or type, of log statement; the second, a format string of the same form that printf takes.

A simple C example:

#include <stdio.h>
#include <unistd.h>
#include <syslog.h>

int main(int argc, char *argv[]) {
syslog(LOG_NOTICE,"%s process started", argv[0]);
sleep(3);
syslog(LOG_NOTICE,"%s process terminating", argv[0]);
return 0;
}


Compiling and running this example results in two entries in the system logs. Which log file receives them is determined by 1) the log priority, and 2) the /etc/syslog.conf file. Note that the priority here is LOG_NOTICE, and the syslog.conf entry:

*.=info;*.=notice;*.=warn;\
auth,authpriv.none;\
cron,daemon.none;\
mail,news.none -/var/log/messages


shows that the notice log entries will be evident in /var/log/messages.

Since the log files are only accessible to superusers, logging in as root and examining the log file should show the messages:

# cat /var/log/messages
.
.
.
Jul 5 20:30:18 riffraff syslogd 1.4.1#18: restart.
Jul 5 20:42:07 riffraff app: ./app process started
Jul 5 20:42:10 riffraff app: ./app process terminating


You can interface with the syslog service using scripts as well.

$ logger -p user.notice "hello again"
# cat /var/log/messages
.
.
.
Jul 5 20:42:07 riffraff app: ./app process started
Jul 5 20:42:10 riffraff app: ./app process terminating
Jul 5 21:04:06 riffraff lipeltgm: hello again


Pretty slick, huh?

17 June, 2007

Software Programming Paradigms

Oh, what did we do before Wikipedia!

My wife is defining some software development training materials focusing on the object-oriented programming paradigm. One of the slides presents the idea that object-oriented programming is _a_ programming technique, not _the_ programming technique.

Far too often I think we get caught up with a new technology and believe that it is the end-all of software development. Object-oriented programming, agile development, Ruby-on-rails, . . . I think we all too often jump on the new technology bandwagon. It is worth a lunch-hour to investigate some of the other programming paradigms.

http://en.wikipedia.org/wiki/Programming_paradigm

15 April, 2007

Whiz Kid

As I stated in a previous entry, I recently got back from an SPIE conference. It was the first I've ever attended and, with luck, it won't be the last.

An interesting event that occurred at this conference was the presentation of a technique for 'head-locating' in infrared camera images. The technique didn't exactly blow my skirt up; while I have some experience in computer vision, it isn't my current focus. What did get my attention was the presenter: a grade-school kid with an internship at Johns Hopkins. Not only did he author the paper, he presented it as well, and by all rights did so with more enthusiasm and professionalism than many of the adults.

Sure, the kid can locate a humans head in a busy scene....he can use terms like 'modality' and solve differential equations....but dollars to donuts he's never seen a boobie first-hand.

That is the only comfort that lets me sleep at night.

Minnesota Bound

I am currently en route from Orlando, FL on my way back from an SPIE conference where a colleague authored and presented a paper. I tagged along as a result of my co-author status.....senor' coat-tails has been the title that comes to mind.

A number of interesting topics were presented, primarily concerned with wireless communications and fields of study concerning sensors. An observation worth noting was the evident differences in the disciplines of the presenters and how it affected the detail of the topic and the presentation style.

As I've said in the past, I am a software engineer by trade, having BS and MS degrees in Computer Science. In both school and industry, I've been surrounded by individuals in Comp Sci as well as other engineering disciplines such as physics, electrical, and mechanical engineering, and the differences between the disciplines are very evident.

As much as a person wishes not to admit it, computer science is still a 'soft' science. Sure, it is far more disciplined than some other engineering persuasions, but it is still far less structured than the likes of pure mathematics, physics, optics, or electrical or mechanical engineering. The majority of the presentations were given by doctorates of physics, electrical, or optical engineering, but every once in a while you'd get a computer science major up there and the presentation took a different tone. The best way I can say it is that the logic was softer, more fuzzy. Most presentations from the more formal disciplines were followed with a mathematical description of the technology. Computer science presentations rarely had such formality, with the occasional exception of a presenter who had an undergrad degree in a more disciplined major and a PhD in computer science.

Am I knocking Comp Sci? To a degree I am doing just that. With the minor exception of those who pursue theoretical computer science, which is more in line with formal mathematics, the majority of us still groan in the presence of a formal proof: a detailed description of the fundamental technology in the only pure language, mathematics. As a result, I think we give up things that could make our discipline more successful. I can, for one, say that if we took a more structured look at proving techniques on paper, rather than jumping in and starting to code, we might have less error-prone technologies.

I'm not touting that we should all begin doing formal proofs of the effectiveness and efficiency of our algorithms. I'm only saying that perhaps it's worth half a day of investigation on paper before firing up your trusty editor right out of the gate.

04 March, 2007

Technology Webcasts

The recent availability of personal high-speed networking has made delivery of tech netcasts over the web practical. Two of my favorite regular broadcasts are 1) commandN, available from youtube.com, and 2) Tech Talks off of video.google.com.

Worth a view if you haven't seen them before.

28 February, 2007

Desires and Diversions

In college our computer science department had a series of videos, each with a computer science theme. One of the "Distinguished Lectures" series was given by Allen Newell, entitled Desires and Diversions, and it is one of the most motivational lectures I've ever experienced. I recently located it on the web and figured I'd share.

I've recently uploaded to YouTube, links below:

26 February, 2007

Tinkering with Ruby

I've recently decided to expand on my OO arsenal by picking up Ruby and Python.

Not much to discuss yet as I am just beginning my journey into Ruby.

One thing perhaps worth noting is the reason I chose Ruby: Scott Meyers, a notable authority on C++, has a high opinion of Ruby and "Ruby on Rails", stating that he feels it's the next-generation object-oriented language.

14 February, 2007

The Dirty Little Secret of CMMI-Level 5

There are a-many things that I think are great concepts but tend to fall apart in the real world; for instance "always tell the truth", "turn the other cheek", "fat-free desserts"...sure, they are great ideas on paper, but when practically implemented they leave you with a black eye, two bruised cheeks, and a mouthful of what tastes like damp sawdust.

The first time I heard about the CMM-level assessment concept I thought: what a great idea. Have an independent assessment of your company's strengths and weaknesses, identify an action plan to shore up the weaknesses, and strive for an even stronger level of capability by reaching for the next ladder rung, working your way to the top. Hey, it was authored by a character from CMU, a notable university; it must be good.

I won't speculate on what might have been, but this concept took a downward spin the day that the Department of Defense decided that all defense contractors have a deadline to achieve level-5 or risk loss of program funding (sigh). Since that wise and ever-so-informed decision (wink) it's been every defense contractor's ambition to make it up to the last rung as quickly as possible. I've witnessed some of the most unethical behaviors in pursuit of this highly acclaimed prize that I only wish I never again see. Contracted independent assessor findings bought and paid for, the hand-selecting of individuals that will mindlessly tout from the Book of Process, and 9th inning e-mails identifying the expected questions assessors will be asking and the proper response (with emphasis on stating only the defined answer and directing to say no more). I ask you, in the pursuit of all contractors achieving level-5 how greatly has it reduced the cost of development programs? Not a dime, it has instead increased program costs.

Sorry for the short rant; the validity of CMMI assessments isn't the subject I wanted to write about, but it is directly related. See, now that most defense contractors have fully embraced CMM, there is an underlying tone that comes with the price of admission. Assets are all interchangeable; this is the basis of CMM: take a development process and define enough paperwork and instructions such that any monkey that can read can follow it to success (hearty laugh). In recent history employees somehow lost loyalty to their employers, and as a result the employers suffer high turnover. How do you fix this, you ask? Well, let me tell you: CMMI level-5 (ta-da). This magic elixir will allow you to lose an employee, snag an innocent pedestrian off the street, strap a cubicle 'round them, hand 'em your defined processes and procedures, and not miss a beat. Employees are one-size-fits-all and replaceable at a moment's notice.

While I've always suspected that this is how the Mahogany Row muckety-mucks looked at their underlings, it became painfully apparent as I spent 4 1/2 hours in process training this morning. When asked how the company selects teams for a newly defined program, the instructor all but said "pick who is available." "How do you find candidates with the proper tool, development methodology, and domain experience?" I asked. "I don't know," he replied. So what you're saying is that there is no way to establish my skill set or career interests in hopes of being recognized as a candidate for a new, exciting program. What you're saying is that there is no way to align my career aspirations, technical experience, or interests with the company's goals? So even though I accepted a position at company X because they were working in an interesting domain Y, and I have expertise in Y, when that program is completed there is no way for me to be selected for a new project in Y? Are you kidding?

My point is, I'm still young and ambitious and still love working on 'interesting problems'. I hired into my current company because they have 'interesting problems' (and a boat-load of dull ones). By looking at me as a cog and not taking into account my experience and technical desires, you risk losing me when you blindly move me to a position not in line with my individual goals. Treating me like a replaceable cog only increases the likelihood that I'll leave, which increases your turnover, which was the very reason you adopted CMMI. The solution becomes the problem.

31 January, 2007

Code Generators

On our way to lunch a colleague of mine asked for my thoughts concerning MatLab-generated code. I repeat my thoughts on the subject here, just for fun.

Even in college, some 16 years ago, I can recall strong opinions on generated code and drag-n-drop programming languages. As a computer science major, I shared the same opinion as the majority of my peers: frankly, that drag-n-drop and auto-generated code is sloppy, unmaintainable, and for people who can't handle programming in a real programming language.

I find that over the past couple of years my opinion on such subjects has changed a bit, softening from the defensive stance of yester-years. Present-day, however, some of my colleagues share my past opinion with the same level of zest and fire I once had.

My change of opinion started with not getting emotional on the matter, which is easier said than done for individuals who make their livelihood developing software. Code generators and drag-n-drop utilities can be viewed as a replacement for the software developer of the day. The popularity of Visual Basic is a testament to such an opinion; VB is marketed as a tool for MBAs to develop software without the presence of software developers. My opinion began to change after I quit taking that personally.

Auto-generated code still has challenges, two of which are maintainability and performance.

One of the first major criticisms of auto-generated code is that the code is unmaintainable. Code generators typically take a model representation and generate a language-specific representation of the model, which is then presented to a compiler. It takes little time to form an opinion of the maintainability of this source code if you've ever glanced at the auto-generated C source from many code generators. While the syntax and semantics are valid, the output could easily be a contender for The International Obfuscated C Code Contest (http://www.ioccc.org/). The variable naming conventions are cryptic enough to produce a slight migraine, and the mere idea of modifying such source, even slightly, readily produces a sense of panic. I became less concerned about maintainability when I stopped focusing on the source code and more on the model. The source code is merely a by-product of the model. Maintainability of the model is the issue, not maintainability of the source code. In actuality, the source code doesn't have to be understandable, or maintainable for that matter; the model does.

Performance is the strongest criticism of auto-gen'd code and brings out the strongest feelings in software developers. A quick examination of the generated source can certainly prove it to be more memory- and cpu-intensive than hand-tailored software; it would take little more than a second-year computer science major to produce more efficient code than most code generators. But to put it into perspective, I've witnessed some incredibly memory- and cpu-intensive hand-tailored code from very experienced professionals. Wastefulness isn't unique to auto-generated code or drag-n-drop programming; it is a product of rapid development. Given a relatively complex problem and a desire to solve it quickly, you tend to produce waste, period. Aggressive time constraints lead you to the simplest and most readily available solution, and generally speaking, the simplest solutions are greedy and therefore wasteful. Before you cry foul, remember what you gained: a working solution in a short time period.

I've witnessed, on two separate occasions, real-time distributed military systems prototyped in auto-generated code surpassing the re-engineered, hand-tailored software developed by experienced, professional developers. The auto-generated code consistently performed system missions faster than the hand-tailored code, and performed more reliably as well. More important, however, is the fact that the auto-generated code was developed more quickly and by a significantly smaller team (read: cheaper as well). To be fair, this team was highly motivated and were the recognized domain experts.

Time to throw in the towel and use code generators exclusively? Hardly. I am not implying that code generators are the wave of the future; I'm only saying that you should look at them as another tool in your arsenal. Stop looking at software from a technologist's perspective and start looking at it from a business perspective. If your domain experts are fluent in code generators and less experienced in general-purpose programming languages, consider using the code generators. If your CPU/memory requirements can accommodate the overhead of auto-generated code and you believe the development effort using code generators will be less than hand-tailoring, at least give them a look.

15 January, 2007

Private Classes in C++

For all you object-oriented software developers using C++ out there, I hope this post will be of interest.

Without getting into specifics, there are numerous occasions when an abstraction spans more than a single class, and to preserve encapsulation you may choose to define a private class (an alternative to defining a friend class). I have generally chosen to embed the private class declaration, in its entirety, in the public class's private region. Recently, however, I've come across a couple of other options that I thought would be interesting to share.

Option 1
Let's start with the most straightforward method: embedding the private class in the public class's private region.


// -- file: SomeClass.h --
#ifndef SOMECLASS_H
#define SOMECLASS_H

class SomeClass {
public:
    SomeClass();
    virtual ~SomeClass();
protected:
private: //-- methods and private classes --
    class SomePrivateClass {
    public:
        SomePrivateClass();
        ~SomePrivateClass();
        void doSomethingInteresting();
    protected:
    private:
    };
private: //-- attributes --
    SomePrivateClass myPrivates_;
};

#endif

// -- file: SomeClass.cpp --
#include "SomeClass.h"
#include <stdio.h>

SomeClass::SomeClass() : myPrivates_() {
    myPrivates_.doSomethingInteresting();
}

SomeClass::~SomeClass() {
}

SomeClass::SomePrivateClass::SomePrivateClass() {
}

SomeClass::SomePrivateClass::~SomePrivateClass() {
}

void SomeClass::SomePrivateClass::doSomethingInteresting() {
    printf("(%s:%d) doing something interesting\n", __FILE__, __LINE__);
}


One disadvantage of this approach is that the header file becomes quite large to accommodate the multiple class definitions. Additionally, fully declaring the private class smack dab in the middle of the public class's private region can easily give the reviewer/maintainer a headache. Indentation is pretty much your only indicator of which class you are currently looking at. Imagine your current page shows three method declarations (with documentation); you can easily lose track of which class the methods are associated with.

Option 2
The second option is similar to option 1 in that both class declarations are defined in the header, but note that the public class's instance of the private class is no longer contained by value but instead held by pointer. Since the private class declaration is not complete prior to declaring an instance of it, the size of the private class is unknown, and therefore an instance cannot be defined. A pointer to an object, however, is allowed because a pointer is of fixed size (oftentimes the size of an integer). As you are probably aware, pointers are evil, and an unmanaged pointer is even more evil. If you don't know why pointers are evil, I'd highly recommend reading Marshall Cline's FAQ. I've used a dumb pointer solely for ease of demonstration; I'd certainly recommend a managed pointer to avoid resource leaks.


// -- file: SomeClass.h --
#ifndef SOMECLASS_H
#define SOMECLASS_H

class SomeClass {
public:
    SomeClass();
    virtual ~SomeClass();
protected:
private: //-- methods and private classes --
    class SomePrivateClass;
private: //-- attributes --
    SomePrivateClass* myPrivates_;
};

class SomeClass::SomePrivateClass {
public:
    SomePrivateClass();
    ~SomePrivateClass();
    void doSomethingInteresting();
protected:
private:
};

#endif

// -- file: SomeClass.cpp --
#include "SomeClass.h"
#include <stdio.h>

SomeClass::SomeClass() : myPrivates_(new SomePrivateClass()) {
    myPrivates_->doSomethingInteresting();
}

SomeClass::~SomeClass() {
    delete myPrivates_; // release the impl (the dumb pointer must be freed)
}

SomeClass::SomePrivateClass::SomePrivateClass() {
}

SomeClass::SomePrivateClass::~SomePrivateClass() {
}

void SomeClass::SomePrivateClass::doSomethingInteresting() {
    printf("(%s:%d) doing something interesting\n", __FILE__, __LINE__);
}


While this option makes it easier to tell which class you are looking at, it still suffers from a potentially large header file. Additionally, since the private class declaration is located in the header file, changes to the private class due to implementation changes will trigger compile-time dependencies on the header: every client that includes it must recompile.

Option 3
This option is similar to option 2 except that the private class is fully declared and defined in the cpp file, separating the implementation from the public interface.
This is formally known as the Pimpl (pointer-to-implementation) idiom.


// -- file: SomeClass.h --
#ifndef SOMECLASS_H
#define SOMECLASS_H

class SomeClass {
public:
    SomeClass();
    virtual ~SomeClass();
protected:
private: //-- methods and private classes --
    class SomePrivateClass;
private: //-- attributes --
    SomePrivateClass* myPrivates_;
};

#endif

// -- file: SomeClass.cpp --
#include "SomeClass.h"
#include <stdio.h>

// note: the private class is defined prior to the declaration of any
// instance of it
class SomeClass::SomePrivateClass {
public:
    SomePrivateClass();
    ~SomePrivateClass();
    void doSomethingInteresting();
protected:
private:
};

SomeClass::SomeClass() : myPrivates_(new SomePrivateClass()) {
    myPrivates_->doSomethingInteresting();
}

SomeClass::~SomeClass() {
    delete myPrivates_; // release the impl (the dumb pointer must be freed)
}

// private class definitions
SomeClass::SomePrivateClass::SomePrivateClass() {
}

SomeClass::SomePrivateClass::~SomePrivateClass() {
}

void SomeClass::SomePrivateClass::doSomethingInteresting() {
    printf("(%s:%d) doing something interesting\n", __FILE__, __LINE__);
}




I don't know that I strongly recommend any of these methods as of yet. I've only recently discovered the latter two options and am slowly forming opinions on them. Regardless of my preferences, it's nice to have options.

14 January, 2007

Top 5 Technologies That Have Changed My Life

There is no shortage of technologies these days, few of which I could live without; these five, I could not live without major inconvenience.

5) Internet
Perhaps a cliché... but truth be told, the Internet has changed my life significantly. God bless Al Gore :) Whether at work investigating the proper usage of an STL container, researching a competitor's product specifics, or just general leisurely browsing for stupid humor tricks. Couple the WWW with a wireless 802.11 connection and an available laptop and you've got yourself a party.

4) Text Messaging
Sure, cell phones have changed many people's lives. You can be reached at any time, any place, which brings up all kinds of etiquette issues. Sure, cell phones are nice... but what I really love is text messaging. Properly used, you never really have to speak to your wife, nor she to you. Besides, what else can you do during your hours of training sessions or meetings?

3) Tivo
While the remote resembles a phallic symbol, once you get past that you'll learn to love your Tivo. What other product comes with a window sticker that you can proudly display on your car window? Once you've spent some time defining your wishlist, the programs you'd like recorded, and the priority of each recording, you're all set. My wife and I must have a couple dozen programs defined in our season pass manager, and each night we come home we've got a selection of programs to watch... and can fast-forward through those pesky commercials. I'd be willing to bet that we even watch less television since we got our little cable-buddy. We no longer spend time watching shows just 'cause they are on; now we only watch those programs that are really worth watching. An hour-long program takes ~45 minutes to watch, a half-hour show ~20 minutes; the end result is more entertainment in less time.


2) Wikipedia
Rarely does a day go by when a passing conversation or lunch topic does not result in a piqued curiosity. In the days of yore, one would normally consider digging out the Britannica for a full 7 seconds, shrug it off, and just go about one's day... never to know who "Aqua Lad" was. Fast forward to today, and a similar question takes less than 7 seconds to answer (or at least to retrieve someone's rendition of the answer) from this God-given website.


1) Google
Last but not least, you have to admit that Google has changed your life if you spend any time rattling on a keyboard. Few other products of our generation have seen their name morph into a formal verb. Sure, Yahoo was my search engine of choice during the college years (along with AltaVista), but today Google is my first choice in the web-searching realm.

That's it, boys and girls: my top 5 technologies that have made a significant change to my life.

09 January, 2007

4 Books Every C++ Developer Should Have On His/Her Shelf

Over the years, development languages change from project to project. A challenge in learning a language is selecting worthwhile references. Far too often I've made poor selections in textbooks, and selecting C++ references some years back was no exception.

I'd like to list the top 4 books in my arsenal that I feel are essential to doing proper C++ (in no particular order):
1) The C++ Programming Language - Stroustrup
To understand C++, or any other language, you've gotta understand the language syntax and semantics. The rules...what's allowed and what is not. Who better to inform you of such things than the father of the programming language; nuff said.
2) C++ FAQs 2nd Edition - Marshall Cline
After you know the language syntax and semantics, the next thing you should be educated in is what you should do and what you shouldn't do. Marshall Cline identifies many of the pitfalls of common practices in an educational and surprisingly entertaining manner. I've attended a series of training sessions with the man himself, and I highly recommend both the book and the courses, if you can get someone else to pay for them :)
3) Effective C++ 3rd Edition - Scott Meyers
I'd consider this book to be the Bible with respect to best C++ programming practices. I cannot say enough good things about this book, but be forewarned... it expects you to understand the language to a fair degree. Don't buy this as your introduction to C++; buy it as a moderately experienced developer.
4) Effective STL - Scott Meyers
What is C++ without the STL? The equivalent of a three-legged dog; never living up to the potential of a four-legged dog. Come on, most interesting problem domains are complex enough to benefit from the STL, including a great deal of embedded software products.

These are the books I'd recommend to anyone developing in C++. I have a set of Sutter and Koenig books available to me and I've heard great things, but since I haven't personally read them I cannot yet recommend them.

06 January, 2007

I am the Victim of a Drive-By Coding

I've been a software developer for the past 9 years for various defense organizations.

A common practice, in times of need, is to employ temporary assets in the form of contract engineers. This practice has time and time again proven to aid in the development of programs that have an aggressive schedule and a budget to accommodate the extensive salaries of contractors.

The typical role of a contractor is to be brought in at a time of need; after that need has been eliminated, the contractor is let go shortly thereafter. The lofty salaries of these contractors are justifiable when you consider that they are brought in and placed immediately on the critical path, with the understanding that they will be eliminated shortly after they are no longer needed. I've worked with dozens of contractors over the years, and my opinion of the majority of them is quite positive. Most of them are not what I'd consider extraordinarily talented; the majority of contractors are pretty average. This coming from a software engineer who rates himself... average. Besides, on average... people are average. The reason I even mention this is to dispel any delusion that contractors are extraordinarily efficient, knowledgeable, or brilliant. No, most are just like you or I.

A notable observation I've made over these years is a typical side effect of contract work. Note that typical contractors are involved with a project for a matter of months. Most often, they come in at the preliminary or detailed design phase and are responsible for designing and coding some defined task(s). Keeping the short duration of involvement in mind, a common side effect of contractor work is minimal or nonexistent documentation in the detailed design or implementation phases of development. I don't necessarily fault contractors for failing to document their design or code; over their years of playing revolving doors with companies, they are never around long enough to see the pain and suffering that results from updating a design or code that contains no documentation.

One of my most recent sorrows entailed reviewing a design for a radio component from a contract engineer. The majority of the minimal documentation was of the nature of 'do good things here'. I suffered through a code review of this product, and after a numbing 16 hours of reviewing I was near tears. One method in particular spanned approximately 3 pages (or ~120 lines, since hardcopy is dead). After better than an hour of studying some of the most complicated code on the planet, I finally realized that the method was responsible for scheduling message delivery with a priority scheme in which messages intended for a single destination are given priority over messages intended for multiple destinations. How was I supposed to know of this scheme? Well, not from the nonexistent comments, nor from the radio interface control document (the priority scheme wasn't dictated by the hardware)... no, the only way to become enlightened to this design was to read the code... the hundreds of lines of code... the code that rivals in complexity that of the Apollo space program. Days from now this contractor will be gone, and some sap will be responsible for his product.
That sap has been me on more than one occasion, and I suspect it will once again be me. I look forward to the days of being bombarded with questions about this software component, followed by hours of diving into the code to produce the answer, only to return to a fresh set of questions. Better yet, fielding the software component will undoubtedly identify errors, which will involve hours and hours of unpaid overtime.

I'll end this post with another plea. For God's sake, document your damn code!!

05 January, 2007

A Love-Hate Relationship

I love my job... I really do. The software field is by far one of the most exciting fields of our time. With a $500 PC, expertise in some area, motivation, and dedication, you can accomplish some extraordinary feats. A couple of college drop-outs in a garage create an operating system that one day dominates the PC market. A kid with an idea for a centralized video repository later sells it for billions. The ideas that can be accomplished with software are nearly endless; they are not directly bound by physics like electrical and mechanical engineering. I really love this field.

At the same time, there are days when I am one step away from my co-workers finding me hanging from the rafters, lifeless... with a smile on my face signifying an end to the madness.

One of the present-day aggravations that steers me toward a noose and a chair is a newly hired contractor. I have to say that bringing a new body onto the team is most often exciting. New folks often come with fresh experiences, a newly defined sense of drive, and oftentimes technical expertise that is of interest. I have to say that I was exceptionally excited to work with this new guy. I helped conduct a phone interview with him, and of all the candidates he was by far the most qualified. His responses to each interview question were spot-on, textbook responses. I was really looking forward to working with him.

The usual new-guy dance entails a couple of days of 20 questions and the assignment of a well-defined task; they finish the task, they may get another of increasing complexity, or otherwise they are on their way. Our new guy, however, has two right feet; he doesn't know the steps and is stuck in the 20-question loop. Touting his 20+ years of experience, he is under the delusion that he was hired as a consultant. I do believe that I've been asked every possible question, with two exceptions: why our hardware casings are green, and why we are using C++. I expect both to be raised within the week.

While I have no experience as a consultant or a contractor, I have yet to meet a contractor that would not rather be a consultant. The difference? A contractor is a temporary stand-in for a senior engineer; a consultant, a temporary stand-in for a system architect. Consultants are typically hired to look at the big picture of the system, making recommendations to better the product. Education in, and enforcement of, best practices is a typical role for a consultant. Consultants are typically an authority on a given subject or domain; hiring the likes of Scott Meyers or Marshall Cline to teach proper C++ best practices would be an example. Contractors, on the other hand, are typically a temporary resource with more average skill sets. The typical contractors I have been involved with are software engineers; I think of them as coders for hire. If you ever wonder whether you're a contractor or a consultant, just take a peek at your paycheck. If you are bringing in ~$500/hr, chances are you are a consultant; if you're in the ~$100/hr range, you're a contractor.

So, what does this have to do with our new guy? Well, he's been hired as a contractor and is playing consultant. Rather than focus on accomplishing the tasks at hand, he instead snoops around our entire source tree asking questions about everything and then expressing his 'opinion' on why it should have been done differently (sigh). He has just finished week 3 of his self-proclaimed 2-day task; the other 13 days entailed a good deal of 'why did you do it like this... why didn't you do it like that'. Worse yet, he's slowed down everyone else with his unproductive questions.

I'll finish my rant with a plea. For those of you hired as contractors but who want to consult: either sit down, shut up, and do your current job, or seek out a consulting opportunity elsewhere.

04 January, 2007

Who Is The Fat, Slow Kid

Why am I the Fat, Slow Kid?

Well, I'm shaped more like a Weeble than an Olympic swimmer. And while I am not exactly 'slow', I'm not necessarily the sharpest pencil in the pencil box.

By profession, I am a software engineer; I hold undergraduate and graduate degrees in Computer Science and have been at it going on 9 years now.

I anticipate the majority of my posts will entail various rantings and ravings concerning software development as a profession and technologies that I find of interest.