Wednesday, December 30, 2009

That strange old thing called "Windows"

Apparently, I've been working with Linux more than I realize.

It's not just the little things, like trying to use CTRL-W to close windows (the key sequence used by MS-Windows is CTRL-F4 for document windows, ALT-F4 for applications). It's bigger things.

This morning, I came to the office, logged in to Windows, and started MS-Outlook. Outlook refused to run. It provided a dialog with the message "Outlook has had a fatal error" or something to that effect. A colleague and I worked on the problem for about fifteen minutes with no success. Eventually the colleague suggested that I restart Windows.

Restarting Windows is of course the first suggestion offered by most help desks. And it works. (It worked in this morning's case, too.)

I've gotten used to Linux and its ability to run and run and run, for days or weeks or months. So much so that I forgot the number one solution to problems in Windows.

Somehow I don't feel bad about that.


Monday, December 21, 2009

Side trip to Ruby

The office was closed today, so I took advantage of the free time to work on a side project. It's one I started earlier this year: creating OOXML files with Ruby.

This project was a nice change from the normal work. (That work uses C++.) I needed some time to "dust off" the programs and refresh myself on their internals. I had last used them back in October, and while they are not that large, they are large enough to require some thought.

The organization of the project helped me, as did the test framework. I've divided the project into two groups of programs. One group generates a set of files in a mini-language that is specific to XML, and the second group (a single program, actually) converts the mini-language scripts into proper XML. I find this division of labor yields a clean design and small programs that are easily constructed and modified.

I also met some former co-workers for lunch. They were working, so I went to them and we had lunch at a nice little Irish pub. It was good to chat with them and catch up on news.


Thursday, December 17, 2009

Microsoft .NET - implementation, not innovation

I just finished reading O'Reilly's ".NET Framework Essentials", printed in 2001. It's a good overview of the first release of Microsoft's .NET platform.

The authors comment on Microsoft's focus on distributed development. Looking back, we can see that distributed development did not occur. The idea that .NET was a better DCOM didn't take hold with developers.

Yet the book highlights the success that Microsoft has achieved. The .NET platform is much easier to use than the previous mix of DLL, VBX, OLE, COM, DCOM, ATL, DAO, ADO, and what-have-you components. The object framework is reasonably consistent and provides a greater degree of interoperability for Windows development. (Not with much outside of Windows, perhaps, but better interoperability within Windows.)

Granted, Microsoft invented very little in .NET. They took ideas from Java and other platforms. They cannot be lauded for innovation, but we can appreciate the implementation. Microsoft made development in Windows a lot saner. (And possibly saved their company. Without .NET and C#, Microsoft would be losing ground to Java, Perl, Python, and Ruby, languages that run on multiple platforms.)


Wednesday, December 16, 2009

I was wrong

In my previous post, I described a problem that occurred in 'release' mode but not in 'debug' mode. These can be difficult problems to solve. (Frequently they are the result of optimizations made by the compiler.)

I was wrong.

The problem was not caused by the compiler or its optimizations. The problem was in my code, but it occurred only when run with certain input data. My debugger tests don't run with that data, but the more comprehensive test suite (which runs in 'release' mode) does use it.

I identified and corrected the problem. And learned a lesson: don't assume that a problem is caused by someone else.


Tuesday, December 15, 2009

A visit by an old problem

I have been visited by an old and long-forgotten friend, like Scrooge.

For me, the friend is a coding problem. Or perhaps I should call it a debugging problem. Or an environment problem. It is the classic problem of "it works in 'debug' mode but not in 'release' mode".

Yep, that's the problem I have. I wrote some code and carefully tested it today. (The code is the 'delete a tree of directories' routine, and I was extra careful when testing it.) The code works -- in the debugger. The code almost works in release mode. It fails to remove the top-level directory (but removes all of the files and subdirectories).

I cannot remember the last visit by this problem. If pressed, I would guess sometime in 1997, but I can't be sure.

Some things change, and some things don't.


Tuesday, November 24, 2009

Good and bad, at the same time

I think it was Kernighan who ranted about the differences between C and Pascal, and how Pascal was a straitjacket and that there was no way to fix it.

I've been working with C++ today, and my mind has been thinking along two lines.

One thought has been: This is the best C++ experience I have had in a long time. And it has been a good experience, for I am using the STL for the first time. STL fixes a lot of problems with C++ and makes many things easier. The 'string' class has made many things easier, and the 'vector' and 'map' classes have also helped. I can program faster than before (in C++) and get more done.

The counter-thought has been: This is the worst programming experience I have had in a long time. And it has been a bad experience, for I am stuck in the static-type world of C++ while accustomed to languages such as Perl and Ruby. I have been working harder than usual to perform simple tasks. An especially difficult problem is one of collections. The static types of C++ demand that I define a collection and its contained types in advance, whereas Perl and Ruby don't care and accept anything you throw at them. (Well, almost anything.)
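
To make the contrast concrete, here is a hypothetical Ruby fragment (not code from the project; the C++ equivalents are sketched in the comments):

# Ruby accepts a mixed collection, with no up-front type declarations.
things = [42, "report", 3.14, [1, 2, 3]]
things.each { |t| puts "#{t.class}: #{t}" }

# The C++ equivalent fixes the contained type before the collection exists:
#   std::vector<int> values;               // ints, and only ints
#   std::map<std::string, Widget> widgets; // both types declared in advance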

Perhaps I have learned a bit of programming in Perl and Ruby. During the day, I had various thoughts on the design of programs, but found that C++ could not handle the designs. I had to "dumb down" the design for the compiler. (Perhaps there is a way to torment the compiler into accepting the design, but it eludes me. I suspect it would involve a lot of template magic, and I find that templates tend to make programs harder to read, not easier.)

In the back of my mind is the thought: There is no way to fix C++. Kernighan's rant (if it was his) has come back to haunt me. Yes, the C language is "better" than Pascal because you can do what you need. In that sense Perl and Ruby are better than C++ because you can do what you need.

The problem seems to be the static typing. And that got me thinking.

Static typing is good, for the compiler. It allows the compiler to check operations at compile-time. (Languages with dynamic typing must perform checks at run-time, and that incurs a cost to performance.) The compiler can also optimize code.

But static typing is a pain when developing programs. It forces me to think about types when I want to think about the problem. It distracts me.

I want a language (or development environment) with variable typing. I want a language that lets me rough out the design with dynamic typing. Then, after I have the correct algorithms and code, I want to turn on static typing. Perhaps in small amounts, or for specific objects in the program. (I would expect the IDE to show me which objects have static typing and which do not, and let me implement static typing on my schedule.)

I'm not against static typing. At least, not all the time. Only when I'm programming.


Sunday, November 15, 2009

Looking back

I've completed three weeks at the gig at OMB (and gotten paid for one of them!) and things are still going well.

I've had some time to reflect on the experience of finding a job. It's something that few people do often. I hadn't done it for over nineteen years.

Some thoughts:

If you have a job, it's easy to become complacent. It's also easy to think that your skills are up-to-date. The corporate process for evaluation is not always a good indicator of your skills. Your company has its technology and culture. Other companies have different technologies and cultures. What works in one company does not necessarily work in others.

When looking for a job, optimism helps. So does discipline.

I split my time between three major tasks: job search, new technical skills, and networking. The job search consisted of the traditional things: posting my resume on job sites, talking with recruiters, and going on interviews. The networking consisted of meeting people, from former co-workers to local user groups.

I revised my resume many times. Seventeen. (I kept count.) It took longer than I expected to pull together a list of my responsibilities and accomplishments and mix in the technologies that I used.

Keeping my normal routine helped. I woke at the usual time, ate at the usual times, and stayed focused on the "work" of finding a position. Having no television (I have no cable TV, and the set is an old analog model) made the task easier. I also set up a fake commute to motivate me to get out of bed. The fake commute was nothing more than a walk to the light rail, a short ride of about three stops, and then a walk back home. But it was enough to keep me in the "I have to get up and catch the train" mindset.

Advertising helps. Not advertising in the sense of purchasing air time or magazine pages, but other types. I used two: job-board freshness and e-mails to recruiters. The e-mails are easy: I picked the four or five best recruiters and sent them e-mails of my accomplishments. (This is where learning new tech is helpful.) I sent the e-mails twice a month: not so frequent as to be a pain, but frequent enough to keep me in their minds.

For job boards, I used a different form of advertising: I kept my resume up to date. I set up a weekly plan to visit the different job boards (dice.com on Monday, monster.com on Tuesday...) and updated my profile and resume. Sometimes it was as trivial as adding a space or blank line; other times I would make corrections or add new items. The point was to keep my "last updated" date on the web site recent. Recruiters who searched by "show me the latest" would have a better chance of seeing my resume.

I picked the best recruiters and worked with them. I worked with other recruiters, too. At first, I had no idea of a recruiter's skills. After working with them I could group them into three categories: recruiters, placers, and head-hunters.

Head-hunters are my least favorite. They do very little to learn about the candidate or the positions they have. They perform a simple keyword match against my resume and their jobs and then call me to learn my interest. Their conversations are brief and to the point, usually "I have a position with ${skill} in ${city}, are you interested and what is your rate?"

Placers are better than head-hunters. They ask questions about one's skills and background. They take some time to get to know the candidate. They have a decent understanding of the position. Often they will have placed other people at the same company. But they work on a position, and once it's filled, they don't bother to talk with you.

Recruiters (as I use the term) work harder than head-hunters and placers. They know quite a bit about the hiring company. They want to learn about the candidate. Often they require an in-person interview before presenting the candidate. They work with multiple companies and multiple positions. They can suggest alternative positions, and will even tell a candidate that they are not a good fit for some positions.

I don't know for certain but I expect that recruiters have a better "hit rate" with qualified candidates than the others. Which means that hiring companies, if they are interested in a long-term match, should do better with recruiters than with placers or head-hunters. Candidates should do better, too.


Sunday, November 1, 2009

Starting a new gig

No posts for a week! Have I been slacking?

Hardly. I started a gig in Washington DC last Tuesday. I've been commuting in, working during the day, commuting home, and handling necessary chores in the evenings.

Starting a new gig is interesting. You meet a bunch of people (and promptly forget their names), get shown a desk in an office (and sometimes forget the way to it), and review some code. You change passwords on your new accounts. You learn the process for time-tracking and billing.

The code for this project is in C++. There are three major parts. I've looked at them and the code seems reasonable. It can be improved, but I've seen worse. (Much worse.) I used the MKS tools to count lines of code (LOC) and get a feel for the size of the three programs. (All are small enough for a small team to handle.) I wrote some Perl scripts to parse the programs and generate class dependencies. These show a lot of linkage between classes. The programs are tightly coupled. One of our goals is to make them less so.
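
The idea behind those scripts is simple. Here is a rough re-creation in Ruby (my actual scripts were Perl; the directory names are illustrative, and the regex is a naive approximation, not a real C++ parser):

# Find class declarations in the headers, then report which classes
# each source file mentions.
classes = []
Dir.glob('src/*.h').each do |header|
  File.readlines(header).each do |line|
    classes << $1 if line =~ /^\s*class\s+(\w+)/
  end
end

Dir.glob('src/*.cpp').each do |source|
  text = File.read(source)
  used = classes.select { |name| text.include?(name) }
  puts "#{source}: #{used.join(', ')}"
end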


Thursday, October 22, 2009

e-mail servers

I finally configured "postfix" on //grendel today. Configured it such that it works reasonably well. Woo-hoo! I can send e-mail messages to specific users! (All local; the e-mail server is not configured to relay messages to other servers.)
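
For the record, the local-only behavior comes down to a few lines in /etc/postfix/main.cf, roughly these (a sketch; your distribution's defaults may differ):

# accept mail on the loopback interface only
inet_interfaces = loopback-only
# deliver only to local mailboxes
mydestination = $myhostname, localhost.$mydomain, localhost
# refuse to send anything to the outside world
default_transport = error:outside mail is not deliverable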

Next is DNS.

I'm configuring e-mail and DNS for PHPmotion. It sends confirmation e-mails, and uses DNS to find your e-mail server. Without a working e-mail server, I have to fake out the confirmation, and I think my procedures to "fake out" such confirmations are incomplete. PHPmotion doesn't document the complete set of steps; why should it? Just click on the link in the e-mail and you're done!


Tuesday, October 20, 2009

The date is set

I heard from the contracting company today. I start on Tuesday. Yay!

I filled out the "user agreement" form today. This lays out the basic working conditions: government equipment is for government business, the equipment may be monitored and I have no expectation of privacy, don't attach any non-government devices, and the equipment is non-secured and cannot be used for classified data. The rules are pretty much the same as the rules at UPS, only spelled out a bit better.


Printer yes, scanner no

I worked on hardware today.

I moved the HP Deskjet printer from //desdemona to //grendel. //grendel is becoming quite the server! It now supports FTP, HTTP, printers, and soon e-mail. Not bad for a ten-year-old Pentium II box. I tested the printing from //desdemona and had no problems.

The other hardware was the Epson Perfection 1200S scanner. I have it attached to //desdemona, through the funny little SCSI adapter card that came with it. //desdemona runs SuSE 11.1 and almost works with the scanner. It sees the little adapter card (I think it is an Adaptec 2904). It has a driver (a few drivers, actually) for the Epson scanner. But it does not recognize the scanner. YaST scans for hardware and reports that there were no scanners found. YaST also won't let me configure the driver - it says that only drivers with devices attached may be configured.

So one victory and one failure.


Monday, October 19, 2009

The end of this part of the journey

The contracting company called late today. I've passed phase 1 of the security check, which means I can work in the office! Now they must negotiate the start date. It could be as early as Wednesday. Friday and Monday are out, as I am visiting my parents. Thursday is possible and so is next Tuesday.

This has been a long journey. I'm glad to move on to the next phase.


Trains, e-mail servers, and Ruby

This morning I made a test run of the commute into Washington DC. It went smoothly, almost. The train stopped at New Carrollton, held there by dispatchers. (Apparently there was a problem with the catenary.) The primary purpose of the test commute was to ensure that I could wake up and be at the train station on time. The secondary purpose was to train my body for the commute. Having met both goals, I decided to return home rather than be obstinate.

The train rides allowed me time to read up on e-mail servers. I will probably need an e-mail server for the test environment for the video web site project. The book from the CPOSC prize table has just the information that I need. (Or so it seems. I have yet to try any of it.)

I met my former co-workers Larry and Will for lunch. We had half-price hamburgers and good conversation.

This afternoon I sat down with Ruby and the Microsoft OOXML files for Excel. Extracting data from them is possible but not straightforward -- the files are linked and there are pointers from one file to another. It's not impossible, just not simple. Extracting data should be easier than creating new files; I should have started on the extraction side first. Oh well, I'm learning about the files either way.


Sunday, October 18, 2009

Where the open source boys are

I attended the Central Pennsylvania Open Source Conference this weekend. It was a small conference. With the attendee count at 150, a better description might be "tiny". Yet even with a small number of people, it was a good conference.

There are few conferences for open source software. The big conference is O'Reilly's OSCON, which has been held for over ten years, at a variety of locations. Beyond OSCON, there are smaller conferences, but no large cluster of conferences. CPOSC is in Harrisburg, PA; Open Source Bridge is in Portland, OR; and there are conferences in Utah and Georgia.

I started thinking that there might be a clustering of conferences around open source communities, which led me to think about such communities. Is there a geographic concentration of open source projects? Something akin to a Silicon Valley for open source?

The more I thought about it, the more I realized that it did not exist, and probably would not exist. Open source projects are typically manned by volunteers, working at home or in employer-supplied facilities, but not in a central location. The open source model does not require a central office. Contributors do not commute to a common office every day, report to managers, sit at assigned desks, or attend mandatory status meetings.

Open source works in a distributed manner. The resources are people (and a few computers and network connections), not ores extracted from the ground or chemicals manufactured in a large plant. Open source projects don't need massive assembly plants, deep supply chains, voluminous warehouses, starched uniforms, large cafeterias, or any of the industrial-age mechanisms that require incredible support structures. The economic forces that pulled people together in the industrial age don't exist in open source. Daily physical presence is not needed.

Which is not to say that physical presence is useless. E-mail, instant messaging, and web cameras provide a narrow channel of communication, much narrower than being in the same room. Physical presence provides a "high bandwidth" channel, and it lets one get to know another person quickly. The communication through non-verbal language lets one person build trust in another. Physical presence is needed occasionally, not every day.

I expect to see more open source conferences. Small conferences, such as the Open Source Bridge conference in Portland and the CPOSC in Harrisburg. I expect that they will be in cities, in places with support for travel, lodging, and meetings. Convention centers, hotels, and colleges will be popular places.

I expect that they will occur across the country, and around the world. There will be no Silicon Valley of open source, no one location with a majority of the developers. Instead, it will be everywhere, with people meeting when and where they like.


Thursday, October 15, 2009

Web site demo

I met with Ben today. I showed him the PHPmotion web site. He likes it; PHPmotion has the features that he wants.

Now to get e-mail and DNS working. PHPmotion sends confirmation e-mails to new users, and I think it completes the account setup when a user clicks on the registration link in that e-mail. My system does not have DNS or e-mail hooked up, so PHPmotion skips the e-mail and never completes the setup work. The web site for the one user I created has a few things broken, such as uploading files.

I need to understand e-mail servers and DNS. Maybe I can configure them as a private network, with no connections to the outside world.


Wednesday, October 14, 2009

Spreadsheet comparison

I tried three different spreadsheets today, performing the same task in each and comparing the results.

The spreadsheets were OpenOffice.org, Google Docs, and Zoho. (Each of these office suites has a spreadsheet component.) I did not include Microsoft Excel, as I know that it can perform this task. (Also because I don't have a copy of Microsoft Office.)

The task was to paste some data and create a chart. I used the same data for each spreadsheet. The data showed browser popularity over the past fifteen months. It had statistics for IE8, IE7, IE6, Firefox, Safari, Chrome, and Opera.

I used Google Docs first. I found the experience reasonable. Google Docs has a very good user interface, mimicking the typical (2007) spreadsheet program. (No "ribbon" UI.) I pasted the data and Google Docs did the right thing and parsed my values properly. I deleted some blank rows (they were in the source data), inserted a column and gave it formulas to calculate total IE popularity, sorted the data (the source data was in reverse order), and created a chart. Google Docs walked me through a short wizard to define the data for the chart, define some properties such as title and legend position, and then gave me a chart. The chart was readable and pleasant to look at.

My experience with Zoho was similar. It does not mimic the desktop application as much as Google Docs does; it takes some time to adjust to Zoho's UI. (But not much.) Zoho also parsed my data properly and provided a wizard to create the chart. (I also sorted the data and created formulas for total IE popularity.) Zoho's performance was slightly better than Google Docs, although Google Docs has the edge on UI. The chart in Zoho was not quite as nice as the one in Google Docs, but still quite usable. Zoho correctly omitted datapoints for empty cells; Google Docs drew the lines with values of zero.

The OpenOffice.org spreadsheet had the best performance. (It was a local application, not a web app.) It provided more granular control over the pasting of data, allowing me to select delimiters. I performed the same operations, removing blank rows, adding columns and formulas, and sorting the data. OpenOffice.org was the only application that could sort a subset of data; Google Docs and Zoho sorted everything with no option to exclude rows or columns. (Perhaps they can with judicious use of the selected cells.) OpenOffice.org also used a wizard to create the chart. Its configuration of chart data is more granular than Google Docs and Zoho; OpenOffice.org lets me select the range for each series where the others use a single block of cells. Oddly, OpenOffice.org's chart was the worst in appearance, with thick, fuzzy lines where the other packages used thin, crisp lines.

Based on this brief evaluation, I see no clear leader. All three did the job. The web-based packages offer sharing of documents; OpenOffice.org uses the "my files on my computer" model. But OpenOffice.org has the better performance and the finer control. I think any could be used successfully for basic spreadsheet needs.


Monday, October 12, 2009

whereis for Windows

One of the utilities that I had written at UPS was 'whereis'. Windows does not have such a thing (or if it does, it is called something else). It is a useful program, and I resolved a number of problems with it.

The 'whereis' program tells you where a specific command is located. In our Windows systems, we had multiple copies of programs named 'find', 'diff', and 'sort'. The different versions behaved differently, and we found it important to set the PATH variable properly for our applications. The 'whereis' program helped us identify errors in the PATH variable.

Today I missed my old friend, so I decided to write it. (Write it again, since I do not have access to the programs I wrote at UPS.)

My old version was in Perl. My new version is in Ruby. The new version is much cleaner than the old version. I remember that I had a number of issues with the original program, and I needed several weeks (working on and off) to get the program correct.

The Ruby version is shorter, cleaner, and correct. It comes in at 20 lines (including comments and blank lines). I think that is about half the length of the Perl version. I built the program in three stages, testing each stage as I went. Total time: about ten minutes. (Less time than it took me to write this entry!)
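
In outline, the new version is something like this (a sketch from memory, not the program itself):

# whereis: list every directory in PATH that holds the named command.
name = ARGV[0] or abort "usage: whereis command"

# On Windows, PATHEXT lists the executable extensions (.exe, .bat, ...).
extensions = (ENV['PATHEXT'] || '').split(';').push('')

ENV['PATH'].split(File::PATH_SEPARATOR).each do |dir|
  extensions.each do |ext|
    candidate = File.join(dir, name + ext.downcase)
    puts candidate if File.file?(candidate)
  end
end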


Refactoring Ruby

I refactored some Ruby code this afternoon. It went faster than I expected. The changes were in the "Excel sheet creator" programs; I changed some modules to classes. (Ruby allows for modules and classes. I had chosen the module approach but really had designed the code for classes, so the change made things simpler and more obvious.) I used my test cases to verify that my programs still work as expected. (They did not pass immediately after the change, due to syntax errors.) It felt good having the tests in place to verify my work.
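
The shape of the change, in miniature (hypothetical names, not the actual sheet-creator code):

# Before: a module, with one shared piece of state.
module SheetCounter
  @count = 0
  def self.next_id
    @count += 1
  end
end

# After: a class, so each object carries its own state.
class Sheet
  def initialize
    @count = 0
  end
  def next_id
    @count += 1
  end
end

a = Sheet.new
b = Sheet.new   # independent state -- what the design really wanted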


More Rails

I picked up the third edition of Agile Web Development with Rails this week-end. It was at the local Barnes and Noble. I used it this morning and followed the exercises in the book. With this book (which is for Rails V2) the exercises work as expected. Yippee!


Friday, October 9, 2009

Rails without alignment

I did some work with Ruby on Rails today; made less progress than I would have liked.

The problem I encountered was a mis-alignment of my installation of Ruby and the Agile Web Development with Rails book. I have Rails version 2 installed; my copy of the book is for version 1. I need a later edition of the book.

I made some progress, and have a pretty good idea of Rails and how it organizes Ruby code. I want to pursue this, and getting the later book should be possible.


Wednesday, October 7, 2009

Refactoring Ruby code

I worked on my "macron" macro processor today. This is a program written in Ruby, and it has two purposes. The first is to convert text into XML files and provide closing tags at the proper place. The second is to let me improve my knowledge of Ruby.

I focused on the second purpose today. I had two sections of code that read macro definitions from a file. (My initial design of the program was sloppy.) Today I refactored the program to use a single section of code. The adjustment used bits of both old sections of code, combining them into a single function.

I'm pleased with the result. The code is smaller (from 270 lines to 235 lines) and easier to read. Also, the code is more robust; the two old sections of code each had their specific functions but neither did everything. Now the logic is complete and in a single place.

I used automated tests during the refactoring effort. They helped immensely. At the end of the day, I knew that I had all functionality working.
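
The harness is nothing fancy -- essentially this (the file names are illustrative):

# Run the program against each input file and compare the output
# with a saved expected-output file.
failures = 0
Dir.glob('tests/*.in').sort.each do |input|
  expected = input.sub(/\.in$/, '.out')
  actual = `ruby macron.rb #{input}`
  if actual == File.read(expected)
    puts "pass: #{input}"
  else
    puts "FAIL: #{input}"
    failures += 1
  end
end
exit(failures.zero?)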

The program is not perfect, nor complete. I have more features to add, and a few tests to add. But for today, it's good progress.


Thoughts on Silverlight

I attended a local user group yesterday evening and saw a presentation on Silverlight, Microsoft's web app environment. The presentation was quite good; it gave us a lot of information about Silverlight and the presenter knew his stuff. He walked us through the construction of a simple Silverlight application, showing us the support in Visual Studio.

Silverlight is an odd duck. It is not a standard Windows application, nor is it a standard web application (read that as "ASP.NET"). Originally built to compete with Adobe's Flash, Silverlight has become much more than a video player. Microsoft has extended it, giving it useful controls and widgets and making it more capable.

-- I'll digress here and say that Microsoft's extension strategy is not the "embrace and extend" strategy that developers dislike. Silverlight, from what I know, was never "Flash-compatible". Microsoft is not "corrupting the X standard" (in this case the Flash API) with proprietary extensions. With Silverlight Microsoft has created a competing technology (version 1.0) and improved it (version 2.0) with extensions of its own. This kind of innovation is more welcome than the "embrace and extend" innovation that Microsoft used with Java. -- End of digression.

Silverlight is quite different from Microsoft's ASP.NET platform. The biggest difference is in screen layout. (I still use the term "screen", showing my development roots.) Silverlight defaults to a dynamic layout, not the fixed-position layout that old-school Visual Basic and Visual C++ programmers know and love. The new model requires a re-thinking of screen design, and while some may complain, I think that it is the right thing for Microsoft to do. Silverlight apps live in the web, and the web uses dynamic screens, not the known, fixed windows of a PC application.

Yet Silverlight has its faults. The construction of a simple app, even something like "Hello, world!", requires a fair amount of typing and clicking. The presenter showed us that he could bind a data source to a grid with only four mouse clicks, but he had done a bit of work in advance. The effort is not terribly large, and it is less than previous Microsoft platforms, but I think the advantage still goes to other tools such as Rails. (Although Rails requires more effort on the screen layout side.)

My other complaint with Silverlight is its use of XML, specifically the XAML configuration files. This is my personal bias against XML for configuration files, and less of a technical complaint. I find XML files hard to read -- that is, my internal XML parser is inefficient. Microsoft has provided support for constructing and editing the XAML files with IntelliSense type-completion and syntax checking, and those help, but it still leaves me with my internal parser for reading the files.

Overall, I'm impressed with Silverlight. For Microsoft shops, I think that it is the way to go.


Thursday, October 1, 2009

A practice commute

Today I took a practice run at the expected commute for the OMB gig. Up at 6:00 (well, a little earlier), a quick shower, a regular breakfast, and then a brisk walk to Penn Station. Arrived with time to spare -- which means I may be able to sleep until 6:00.

On the ride in I read a copy of InfoWorld, some of "Communications of the ACM", and a bit of "Web Navigation" from O'Reilly. The last is an old book on web design, dating back to 1998. (Wow... is anything web-related that old?) While it predates a lot of the modern web stuff, it still has good ideas. Ideas that are now accepted as the norm, such as "keep your navigation controls consistent".

On the commute back I read a copy of The Washington Post. I also noted items for the OMB gig, mostly set-up such as access to servers, e-mail, and tools for building the application. It's a longish list, going on about meetings, core hours, building access, work location, resources, and procedures for technical support. An article from Fast Company gave me a shorter list:

 - Meet a bunch of people, talk with them, and learn about the application
 - Work on the application
 - When I need help, talk to the right people

It's things like this short list that make Fast Company useful.


Tuesday, September 29, 2009

Recruiters and insurance

I spoke with several recruiters in the past few days. When they call I give them the news that I have accepted a position. All of them have been very good about the news. They have all spoken with me before.

I think that I have completed the "job search shutdown", going to job sites and marking resumes as hidden and disabling job alerts. The cybercoders.com web site had no way to mark resumes as hidden, so I removed the resume. The guru.com web site keeps sending me notices about projects; I may keep those coming to stay informed about tech requests.

A person from Constellation Energy called today, thinking that we had a technical screening interview. I gave him the news. While I had spoken with the recruiter last week, I guess that the news did not reach him. Somewhat embarrassing.

On another front, I have been debating medical insurance. I can subscribe through Prism (the contracting company for the OMB gig) or stay on the COBRA plan. The latter is expensive and ends in July 2010 (COBRA is good for only 18 months). The Prism-offered plan is also expensive and is not really set up for Maryland. It appears that I would be constantly "out-of-network".

A third option is to buy medical insurance from someone like United Health Care or Aetna. I looked at some plans, and they also seem expensive. I don't understand all of the numbers, either. (I can, with a bit more studying.) Purchasing my own insurance may be the better choice for me.


Thursday, September 24, 2009

Closer to a situation

I visited the good folks at Prism today and signed a bunch of papers. This act starts the process for the position at OMB. The next step is a background check, something that can take up to four weeks. At least one trip into Washington will be necessary, for an in-person interview.

I feel pretty good about this opportunity. And I feel pretty good about finding it; I have built up my self-confidence. This blog started as the leap from a known position into the unknown, with faith that I would find something. While I have yet to land (much less check out the landing zone) I think that the exercise has been worth the fear, frustration, and doubt.

I am shutting down some activities. I have pulled my resume off of job boards (or marked it 'non-searchable'), turned off the e-mail alerts for job postings, and let some recruiters know that I am no longer looking. I have a few more recruiters to inform, and maybe some web pages to adjust.

I will keep some things going, though. I've joined Facebook, Twitter, and Identi.ca; they provide useful bits of news. (Won't be able to view them at the job, though.) I've joined some user groups for MSDN and Linux and would like to keep active in them. That may be tricky, given the timing and location of the meetings and my work and commute schedule.

I have a list of preparations for the job. Most items are logistical: train pass, metro pass, new shoes, and maybe a few new shirts. I can handle them in small batches.


Windows 7(RC) allergic to KVM switch?

I experimented with Windows 7(RC) and the KVM switch this morning. When the SystemMax is the only computer attached to the KVM switch, Windows behaves consistently: it recognizes the keyboard but not the mouse.

I've replaced the mouse (with an old Microsoft mouse!), moved the KVM cable to a different port on the switch, and replaced the KVM cable. The behavior remains: Windows sees the keyboard but not the mouse.

I booted the SystemMax with Ubuntu Linux 9.04, keeping the hardware configuration. Ubuntu Linux finds the keyboard and mouse!

Which leaves the sole variable of Windows 7(RC). Is it doing something with the mouse that the KVM switch does not understand? The PS/2 port for the mouse and the communication protocol are old, and should be well-understood by everyone making equipment. I doubt that the KVM switch manufacturer got it wrong. (The KVM is a Starview SV411.) Let's see what the interweb says.

. . .

After some searches I can see other people have various problems with various configurations and various KVM switches. Some have Windows 7, others have other versions of Windows. Some have Starview KVM switches, others have other brands. The combination of Windows 7 and Starview did not occur with any great frequency. (Which doesn't give me information, or comfort.)

I will try adding a power supply to the KVM switch. (I want to see if that fixes the problem caused by the Dell GX240.) If that doesn't work, I can use the old Microsoft mouse on the Windows PC and use the KVM to switch only the keyboard and monitor.


Wednesday, September 23, 2009

KVM blues

The KVM switch arrived today. (That was fast!)

I hooked it up and things are a bit dodgy. I have three PCs attached: a Dell Dimension 2350, a Dell GX240, and a SystemMax 765. The Dimension 2350, running SuSE Linux 11.1, works without problems. The GX240 should work -- I have yet to test it -- but when turned off it prevents the KVM switch from working. (It did this to another KVM switch too, so I know the problem is the Dell.)

The SystemMax 765 has odd behavior. The keyboard and mouse don't work... most of the time. Sometimes they do, for a short while. Sometimes the mouse works but not the keyboard. I don't see a pattern. It might be the cable, or the KVM switch, or the SystemMax CPU board (specifically the PS/2 ports) or it might be Windows 7(RC). Or possibly the mouse -- some KVM switch/mouse combinations do not get along. Or maybe the KVM switch needs an external power supply.

I don't think I have a spare PS/2 mouse. Or a power supply for the KVM switch. (It does not come with one.)

Another puzzle to solve!


A long process for a job

I heard from the staffing company (well, one of the staffing companies) that has been talking about possible positions. If all goes well, I will have a gig in mid-October. This has been a long process -- it started in July.

The position has some interesting challenges with a C++ program that reads spreadsheets, processes data, and writes the results back to spreadsheets. The people seem pretty reasonable too.

The position is in Washington, DC. I can get there via the MARC train and metro. I'm glad that I have kept to my daily routine: get up early, have a small breakfast, and start the day. I've even been walking to Penn Station in the morning and checking the time that I arrive. I could have slacked and slept in late. Had I done so, it would be harder to restart a normal commute.


Tuesday, September 22, 2009

Ruby in Windows

I installed Ruby 1.9 on my Windows computer and tested some programs. The install went smoothly and the programs work without problems.

The Ruby package includes SciTE, an editor with syntax highlighting. It seems about the same as KWrite in Linux, but I really ought to examine it more before commenting.


Scripts to Ruby

I converted the bash scripts to Ruby this morning. They do some simple text processing, converting text into an XML file for OOXML.

The conversion went quite quickly, and the Ruby programs seem more readable than the scripts. Maybe that's a personal bias; I'm more comfortable with Ruby and Algol-syntax languages (C, C++, Java, C#) than I am with bash scripts.

Or maybe it's not just a personal bias. I find that Ruby programs are more consistent. Bash scripts are collections of different things. Ruby scripts are collections of things, too, but they are all Ruby things. The things in bash are files and programs, and the programs have some degree of variation.


Monday, September 21, 2009

More Ruby and some scripting

This afternoon I built more of the program for creating an OOXML spreadsheet. I started in Ruby but ended with a lot of it in bash scripts.

I tend to push tasks down to the lowest possible level. C#, C++, and Java are at the uppermost level. Perl and Ruby are in the middle. Bash and DOS scripts are at the lowest level. (Assembly language does not appear in the hierarchy, nor does Visual Basic.)

I have since realized that the bash script is the wrong level for my work. I want this utility to run on Linux and Windows. Therefore, it must live in a layer that is compatible across operating systems. The scripting level (bash or DOS) is the wrong level. I will have to re-do this afternoon's work and put it into a higher level. Probably Ruby, as other parts are already there and I want the experience.

Moving the work to Ruby may simplify the program. I had to resort to some bash tricks to get all of the information that I need. In Ruby, I will have more capable data structures. I've learned a few lessons today.


Ruby in the morning

I did some work with Ruby this morning, extending the 'macron' program. I added capabilities for variable substitution, much like the macro expansion in bash or other shells. Nothing new in terms of the Ruby language or environment; this morning's work was using what I already knew and reinforcing my knowledge.
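
The substitution itself is tiny in Ruby. A sketch, assuming the definitions are collected into a hash:

# Replace ${name} with its definition, shell-style.
def substitute(line, vars)
  line.gsub(/\$\{(\w+)\}/) { vars.fetch($1, '') }
end

vars = { 'name' => 'budget' }
puts substitute('.sheet name=${name}', vars)   # prints: .sheet name=budget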

With today's changes, I can build the XML file for the Excel OOXML strings file. OOXML uses a number of files for Excel spreadsheets; one file holds the string entries and other files use indexes into this file.

The code is a bit rough. It works, but needs error checking.

I also spoke with a recruiter this morning, about a position in Pittsburgh. The client needs someone to test their client/server system on configurations with the new Windows Server. This position is not the best fit for me. I don't have experience with Windows Server, and I suspect that the client needs someone to examine their product and tune Windows Server for it.

I think that I talked myself out of the job. I'm not upset -- I would rather talk myself out of a job than land a job that is not a good match.


Friday, September 18, 2009

Getting up in the morning

I accomplished several things today.

First, I woke up a little earlier than usual: 6:00 sharp! I've been slacking a bit, getting up at 7:00 or maybe 7:15. I want to stick with my normal routine, so that when I get the job I will be ready. Being late on the first day is not good.

I read a bit about Matlab. It is an interesting language, and very capable. It can perform many different mathematical operations (lots on matrices, explaining the name) and plot graphs of the results. I'm not sure that I can use it for my normal work, which tends to be text-based and not mathematical.

I examined the OOXML format for Microsoft Excel spreadsheets. My intent is to create routines that can convert plain text (such as a text stream from "regular" tools like sed and awk) into a readable .xlsx file. I also added the 'options' module to my 'interpret' program, which is one step in the bigger process. Now 'interpret' can parse options on the command line.
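
Ruby's standard OptionParser handles this kind of thing; a minimal sketch (the option names here are illustrative, and my 'options' module may work differently):

require 'optparse'

settings = { :verbose => false }
OptionParser.new do |opts|
  opts.on('-o FILE', 'write output to FILE') { |file| settings[:output] = file }
  opts.on('-v', 'print verbose messages')    { settings[:verbose] = true }
end.parse!(ARGV)

# whatever remains in ARGV is treated as input files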

I lunched with Julie and Derek, two former co-workers. We had a too-short lunch at the local Qdoba.

I joined the LinkedIn group 'Baltimore Connections', for people looking for contacts in the area.

I signed up for the Central PA Open Source Conference. It's a one-day show, with an admission fee of $35. They have some interesting sessions, but I am more interested in making contacts for job opportunities.

The folks at Omnyx decided to pass on me as a new employee. The reason they gave was that I didn't know the 'Dispose' pattern and could speak only superficially about the differences between multi-threaded and multi-process applications. Both true, but I suspect that the real reason is just not "clicking" on the phone interview. When interviewing people, I wouldn't flag them on a few technical items -- but if I didn't have a good feel for them, I would use a few technical items as an excuse.

An odd day, at the end. I feel that I did little. Yet looking at the list, I did quite a few things.


Thursday, September 17, 2009

Sysadmin work

I made adjustments to the configurations for NFS and Samba, and now I can read and write files on my server. I'm pretty sure that I will want to make a few more adjustments, limiting the areas for reading and writing, but this configuration works for me.

I've also adjusted the configuration for NTP on //desdemona, and I think that it is now getting time updates from //grendel. Which is what I wanted. Yay!


File sharing

This morning I configured //grendel for file sharing. I have it working as a "publisher" of sorts, as I can read files on it but not write to it.

I configured //grendel with NFS and Samba. I use NFS with my other Linux computers and Samba with my one Windows computer. The configuration for each took longer than expected and had more "fiddling" than I expected. (And since I'm sharing in "read only" mode, I have more adjustments ahead of me.)


Tuesday, September 15, 2009

Limiting factors

I did some more work in Ruby this afternoon, refactoring a small program and using Ruby's code block capabilities. While I was working, I had a small realization about the power of Ruby (which can possibly be extended to other recent tools):

I work at a pace governed by a limiting factor. With Ruby, the limiting factor is not the technology but my knowledge of programming.

For the past three decades, my programming has been limited by the technology. First was BASIC, with its limited file I/O operations. Assembly language had full access to the operating system API for file handling but constructing meaningful programs took lots of time, planning, and discipline. C and later C++ had better support for I/O and better support for code modules but still took lots of planning. Packages such as dBase II, R:Base 5000, and Condor had limited control structures and error handling.

In all of those environments, my brain was ahead of the technology. My frustration was not with my knowledge of data structures, control structures, and other programming concepts; my frustration was with the tools.

Ruby changes that. Ruby offers excellent support for data structures, control structures, I/O, and error handling. So much so that I don't spend time thinking about them. Now I can think about the problem at hand and easily use efficient programming concepts. I can use advanced programming concepts such as closures.
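
Closures are a good example. This sort of thing is natural in Ruby and was unthinkable in the older tools:

# make_accumulator returns a lambda that captures 'total' from the
# enclosing scope; the variable lives on between calls.
def make_accumulator
  total = 0
  lambda { |amount| total += amount }
end

acc = make_accumulator
acc.call(10)
puts acc.call(32)   # prints 42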

For once, the tool is not holding me back. Now, what holds me back is my understanding of the advanced programming concepts. That's a big change. For years, I have felt smart and considered the tools dumb and unsophisticated.

Now I don't feel so smart.


Windows Update

This past week-end I started Windows Update on the new PC. The update feature is configured to update Windows and nothing else. (At some point I will ask it to update other things. For now, updating Windows is a sufficient test.)

There are lots of updates for Windows 7(RC). I downloaded and applied the "serious" updates on Sunday. They went quickly and had no problems. There were thirty or so "optional" updates, all of them language pack updates. I have been applying these in batches, due to the total size. In total, the updates are 1GB in size; I believe that is the download size and the actual update is larger when uncompressed.

The updates, even in batches, take a bit of time to download (about an hour for four or five update files) and then a bunch more time to apply (about thirty minutes for the same updates). The download process consumes all external network bandwidth, and the update process requires a lot of disk activity. It seems like a lot of activity for updates to language packs. But then maybe I don't understand the method that Microsoft used to store its text information; maybe MS uses lots of files for each application.

The Windows Update process and the Windows Power Management function work poorly together. Since the updates take so long, I start a set and then walk away from the PC. After a period of no activity on the keyboard and mouse, the Windows Power Management routines kick in and put Windows in "sleep" mode. This stops the update process. I would expect them to be more coordinated. Re-trying the operation has reported success, so there seems to be no permanent harm.

The Windows Update process also creates "restore points", which I assume are sets of backup files that can be restored in the event of an update problem. It's nice to know that I have these backups... but I can't manage them. That is, I see no way to look at different restore points and possibly discard some. (I know that they take disk space, and perhaps I want it back.) Also, Windows has a separate feature called "Windows Backups", which is independent of restore points. These are apparently user-requested backups, different from the backups of restore points.

I'm trying to schedule updates for "off" periods. Since they consume network bandwidth, they interfere with my "normal" work, such as checking job boards, using on-line documents, and posting to this blog.


Sunday, September 13, 2009

New PC and Windows 7(RC)

The SystemMax PC arrived on Friday. This PC was a special deal from TigerDirect.com - a refurbished unit at a discounted price, and with no operating system. The PC has an Intel Pentium dual-core processor, 2GB RAM, and 160GB DASD. It has a spacious case, with room for six or seven additional disks. (The CPU card has two open SATA ports, so anything beyond two additional disks would require an adapter.)

The "no operating system" aspect did not bother me -- in fact, I found it appealing. My plan was to install Windows 7 (the Release Candidate, or RC, version) and I didn't want to throw away a good operating system. A PC without an operating system is better for me.

I installed Windows 7RC with no problems. Microsoft has improved their install program. I am used to the old DOS-mode, white-characters-on-blue-screen setup programs for Windows. Windows 7 uses a nice graphics mode (just like Linux installs), asks a few questions up front and not one at a time during the install (just like Linux), and needs no interaction during the install (just like Linux).

Windows found drivers for my video card and network card, a pleasant surprise. I am used to the MSDN OEM installs for Windows XP, which include drivers for 640x480 and nothing for networks. Those installs required the hunting of drivers and then the acrobatics to copy them onto a computer without a network interface.

I like Windows 7 more than Windows Vista. Microsoft has learned from the experience of Windows Vista and fixed things that were problems. For me, the most noticeable change is in the wallpaper: Vista had visually noisy pictures (the mountain lake was particularly annoying) and Windows 7 has quieter, soothing images. Also, the screen widgets do not come up by default. (I dislike the screen widgets, especially the clock with its constantly-moving second hand.)

Microsoft has also tuned UAC (User Access Control). Dialogs for access to critical areas are still displayed, but only when necessary. In this, Microsoft has matched the experience of Linux distros such as SuSE and Ubuntu.

I downloaded some additional packages: Word viewer, Excel viewer, and Visual Studio Express for C# and Web Development. I have yet to work with them.

All in all, Microsoft has made changes to the Windows experience that I consider improvements. They are on par with Linux for the user interface. This might be a problem for Microsoft, since while they are no worse than Linux, they are no better.


Friday, September 11, 2009

PHPmotion hacks, and then Ruby

I tried hacking around the PHPmotion e-mail problem this morning. I looked through the MySQL database for the web site and found a table called 'member_profile'. I changed the 'account_status' column from 'new' to 'active' and now the user can log in!
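
The hack boils down to one SQL statement (the WHERE clause here is illustrative; I edited my test user's row directly):

UPDATE member_profile
SET account_status = 'active'
WHERE member_id = 1;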

It's not a perfect hack, though. I can log in and change a few things, but I cannot view my profile. I'm guessing that the 'confirmation' action does a few more things after changing the 'account_status' value.

I could hack more, but maybe setting up an e-mail server would be a better solution.

I sent Ben a summary of my progress. The e-mail was lengthy; I would like a way to write shorter e-mails.

After the PHPmotion tests, I spoke with two recruiters about possible positions. One is in Pittsburgh (from the same recruiters I spoke with earlier this week -- they have another opportunity) and one is in West Palm Beach. I'm quite OK moving to Pittsburgh; it has been recommended by web sites and friends. West Palm Beach was not on the 'tech radar' and I did not consider it as a possible location. I'm pondering the move.

In Ruby, I modified my basic macro translator program (interpret) and extracted the core function into a Ruby module. This is the first step in creating a common 'macro' module that can be used by 'interpret' and 'macron', my two macro-expansion programs. The process was fairly easy; most of the effort is in picking a good name for the module. I consider that a good sign -- the tool gets out of my way and lets me think about the problem I am solving, not the ceremony for the tool.


Thursday, September 10, 2009

PHPmotion in test environment

I installed PHPmotion in my test environment today. Using my notes from the experiment on my development server, I put together a set of installation steps and performed them as I installed PHPmotion on //patricia. I had to make some adjustments as I went along, but that is usual for a new procedure.

I have Apache, PHP, PHPshield, and PHPmotion working on //patricia, mostly. The one thing that I need is an e-mail handler. PHPmotion wants to send e-mails for various events. One of them is to confirm that a new user really does exist. My test server (//patricia) has no e-mail handler.

I may be able to hack around the problem by adjusting entries in the database. PHPmotion probably has a flag for the user, indicating that the user has not verified their e-mail address. If it is a simple yes/no value then I should be able to change it manually and fake out PHPmotion. (The production system will need an e-mail server, though.)

I also learned about PHPmotion and its capabilities for translations. PHPmotion V2 allows you to configure it for one of a number of languages. (You pick a language and then everyone sees your site in that language.) PHPmotion V3 has no such capability; every page is in English. The producers have plans to allow users to select their language, but this feature is not available (nor is there an expected date).


Monitoring systems with 'reconnoiter'

I accomplished some other things yesterday.

I finished reading Eric Raymond's The Cathedral and the Bazaar. I found the predictions for Windows 2000 amusing: Raymond forecast a wide rejection of the operating system due to its size and incompatibilities. Looking back, Windows 2000 was accepted and became a popular choice. Perhaps ESR was predicting the release of Windows Vista?

I added some features to the Ruby version of 'interpret'. I also created a script to run test cases. Good thing I did; I found two problems in the code and one in the test data. Fixing them gave me a better program.

In the evening I attended the CALUG (Columbia Area Linux User Group) meeting and heard a presentation on "Reconnoiter", a tool for monitoring systems. It is an impressive tool. I think I am more impressed with it because of the work I did monitoring systems at UPS - I understand the need for such a tool and the issues in collecting data.

A recruiter (a pair of them, actually) called yesterday about a position in Pittsburgh. The work would be automating tests for a team developing a C#/.NET application. I'm talking with them more today. I'm OK with moving to Pittsburgh -- it's one of the cities that has been recommended to me by friends and web sites. I'm also OK working on the testing side (and not pure development) because there is enough challenge in automating manual tests and running the test "operation".


Wednesday, September 9, 2009

Fonts, graphics, and PHP

I solved the image-generation problem with PHPmotion, PHP, and Ubuntu this morning. It's all a matter of having the right packages installed and naming font files properly in the application. I installed the PHP5-GD package and a bunch of fonts (including the msttcorelib fonts), modified the PHPmotion script to use an available font (its default of 'DoradoHeadline' is not available, at least not easily), and things are working!


Tuesday, September 8, 2009

A busy Tuesday

Lots accomplished today.

I sent Ben an update on my progress with PHPmotion. I described my progress, some of the problems that I had to overcome, the open problems, and my ideas for the next steps to solve them.

I sent updates to several recruiters. The updates contain a summary of my recent accomplishments: new technology learned, experiments, and such. I send them out twice a month, mostly to keep in front of recruiters. (And it worked: one recruiter called with a possible position. Unfortunately, it was with a company I interviewed back in March.)

I read a bit in The Cathedral and the Bazaar, Eric Raymond's collection of essays on open source development.

I had a conversation with a recruiter about a position for a network storage engineer. (Different from the March position I mentioned above.)

I dug into PHPmotion and diagnosed some problems. I have identified the cause of the login problem - it's a failure to generate the captcha image. The failure occurs because the specific font used by the captcha is not available on my system. I ran tests on //desdemona (my main development system) and figured out the changes to make the routine work. I can migrate them to //grendel, the server I use for PHPmotion.

The solution raises more questions. My intention was to run PHPmotion on //patricia, a laptop that is more portable than //grendel. The PHP GD package apparently uses X windows to create the image. I have X on //desdemona and //grendel but not on //patricia, since that PC is slated to be a simple server. Maybe I will have to install X. Or maybe not. More questions.

I finished the day with some work in Ruby. I built the core routine for my 'interpret' program. I plan on using the same core routine in 'macron', the macro expander for XML files. Once again I made a lot of progress in a short amount of time.


Sunday, September 6, 2009

Progress with PHPmotion

After some experimentation, I have PHPmotion working. Or at least working to the point where it lets me in as the site administrator.

The symptom was that the setup routines would not connect to the database. The actual problem was the account that I was using and the rights that it had. I switched to the MySQL root account, and PHPmotion let me configure it. (It would be nice if the documentation mentioned the MySQL account requirements. I like to keep rights to the minimum necessary.)

With that problem solved, I can now try some templates and some uploads. I suspect that I do not have all of the codecs loaded; I will need more time to investigate those.

Saturday, September 5, 2009

The Ruby puzzle

I've been working on a little project the past 24 hours. Once again, I am impressed with the power of Ruby.

The project is a re-make of a utility I attempted at UPS. It's a macro expander, one that's tuned for XML.

Most macro expanders (at least the ones that I know) will expand a macro into text. But I want something a little different, one that expands macros into XML. ("But you can expand into XML! Just define the right expansion results!" I hear people cry. Yes, I can. But I need something more. Read on.)

I want to start with something like this:

.sheet name=budget
.row
.col
.value value=Jan
.col
.value value=100
.row
.col
.value value=Feb
.col
.value value=120

and end with something like this:

<sheet name="budget">
  <row>
    <col>
      <value value="Jan"></value>
    </col>
    <col>
      <value value="100"></value>
    </col>
  </row>
  <row>
    <col>
      <value value="Feb"></value>
    </col>
    <col>
      <value value="120"></value>
    </col>
  </row>
</sheet>

Notice the closing tags? I want the macro expander to provide them. I don't want a pair of macros (one for the open tag and a second for the closing tag). I want one macro that generates both tags.

I want the tags inserted at the proper place. That means that the expander must remember which macros it has expanded, and also have a sense of precedence. (I don't need the closing tag until another macro at the same or a higher level starts, or the file ends. But when either of those conditions occurs, I want the closing tag.)

I've built a small utility to perform this task. Using Ruby, it was easy to do. (My code is about 100 lines.) It reads a file of tag definitions (the definitions include the opening tag, the closing tag, and precedence) and then processes the macro script. It's not complete, but the basic logic works. I want to add variable substitution and some error handling.
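The core of the logic is short enough to sketch. The real program is in Ruby; this is just the idea, re-drawn in PHP with made-up names and no attribute quoting:

<?php
// Sketch of the expander core (illustration only; the real version is Ruby).
// Each definition: open-tag template, closing tag, precedence (1 = outermost).
$defs = array(
    'sheet' => array('<sheet%s>', '</sheet>', 1),
    'row'   => array('<row%s>',   '</row>',   2),
    'col'   => array('<col%s>',   '</col>',   3),
    'value' => array('<value%s>', '</value>', 4),
);
$stack = array();   // macros whose closing tags are still pending

foreach (file('script.txt') as $line) {   // input file name invented
    if (!preg_match('/^\.(\w+)\s*(.*)$/', trim($line), $m)) {
        continue;   // not a macro line
    }
    $name = $m[1];
    $args = $m[2];
    // A new macro closes every open macro at its level or deeper.
    while ($stack && $defs[end($stack)][2] >= $defs[$name][2]) {
        echo $defs[array_pop($stack)][1], "\n";
    }
    echo sprintf($defs[$name][0], $args == '' ? '' : ' ' . $args), "\n";
    $stack[] = $name;
}
while ($stack) {   // end of input: close whatever is still open
    echo $defs[array_pop($stack)][1], "\n";
}
?>

Variable substitution and error handling will layer on top of this loop.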

What impresses me is the speed at which I can create a usable program. I've spent about three hours with Ruby. In C#, this would have taken me about seven hours. (That's an estimate. If I get a Windows machine working, and Visual Studio, I can re-write the program in C# and measure that effort. My guess is that I will spend more time re-writing the program in C# than I took writing it in Ruby.)

What I do not understand is the reason for the speed difference. I am faster in Ruby than in other languages. (How much faster depends on the other language.) Even when I have already solved the problem and have a good algorithm, I spend more time on the second implementation. (And I keep the second implementation simple -- no added features!)

So this is what I call the "Ruby puzzle": Why am I faster at development in Ruby? (Especially when I barely know the language?)

Friday, September 4, 2009

Frustrations with PHPmotion

I installed PHPmotion this morning. Almost. It's been a frustrating experience.

Downloading the ZIP file was easy. Unzipping it was easy. Following the on-line directions was easy too -- but they are incomplete.

My first frustration is with PHPmotion's use of PHPshield. The install instructions for PHPmotion assume that PHPshield is installed and configured. Not necessarily true. A few minutes of reading up on PHPshield, a few more minutes of configuring php.ini, and I have solved the problem.

Now PHPmotion wants to talk to a database. I have a database, and it is ready, but PHPmotion won't talk to it. Won't provide much of an error message either. Bah!

A phone call with a recruiter

I had a long phone conversation with my recruiter-friend Lynn this morning. She had a possible position; after discussion we determined that it was for Becton-Dickinson. Since I've already been presented there by another recruiter, she can't present me.

Beyond the one position, we talked more about what I am looking for. We talked about technology, we talked about company size, and we talked about culture/personality matches.

Lynn and I agree that the company culture and candidate personality are the primary criteria. A poor match there will result in poor results, regardless of the skills match.

I'm holding out for a position that I want, one where I feel a part of the team and feel that I can contribute. I'm not going to jump at any job that happens to come my way. (As tempting as that is.)

Wednesday, September 2, 2009

Job fair

I attended a job fair this morning. It was run by monster.com, which calls it the "Keep America Working" fair.

It followed the same format as a monster job fair I attended earlier this year. This time my registration had been recorded, so I simply walked up, said my name, received a packet, and walked in. The number of hiring companies was about the same as the earlier fair -- about twenty. The companies covered a wide spectrum of the field: Sleepy's Mattresses, Manpower, Lumber Liquidators, and even some technical recruiters. Of the four technology companies registered, two were present, and I talked with them quickly. After that, I was free to leave.

The job-hunting crowd seemed to be mostly professionals, in a variety of ages. The match between the company selection and the candidate selection seemed poor. That may be due to monster.com's targeting of e-mails and marketing (to both candidates and companies), or it might be due to the local economy.

On the way home, I visited the Ukazoo bookstore. It's larger than I expected, with a pretty good selection of books in the science fiction category. (Several books by Glen Cook and a bunch by Sheri S. Tepper but nothing that I had to have.)

I also visited the local Apple store, primarily to see if it was still there. My one Apple MacBook is running without problems and serves me well. I can defer other Apple purchases.

I will give more thought to Windows development this afternoon. I would like to try Windows 7 (I have the release-candidate DVD), but the one computer that can run it is the Dell Dimension 4600 -- which still needs a power switch. I priced a new Dell PC for running Windows 7; the desktop (sans monitor) costs just over $500. I will check the TigerDirect.com web site later today.

Tuesday, September 1, 2009

CMAP meeting - Microsoft Sketchflow

I attended tonight's meeting of CMAP (Central Maryland Association of .NET Professionals). They had a presentation on Microsoft's Sketchflow, a prototyping/agile-design tool. I was impressed with the presentation -- I learned more than I expected.

I was less impressed with Sketchflow. It seems to be made for business analysts and designers, allowing people to quickly lay out screens and define the flow from one screen to another. It even uses the "sketchy thing" look that the Art Technology Group had a decade ago, with line drawings of windows and controls.

Yet it fails in major and minor ways. Microsoft gets a lot of the minor details wrong. For example, they spent great effort on building the sketchy-thing look and allowing third-party controls to take on the skin. But the sample photos provided by Microsoft are real photos. They stand out like sore thumbs, since every other control is in squiggly lines and the photos are in nice, high-res color. (You can create your own photos and add them, but isn't the tool supposed to save time?)

The major failings are more disturbing. While targeted at analysts and non-programmer users, Sketchflow has a lot of programming details in it. The events for controls have been trimmed, but the general properties remain numerous -- so numerous that Microsoft provides a "search" box to find the property you want. (And it works, if you know the name of the property.)

The other failing is the lack of collaboration abilities. Microsoft sticks to the single-user document model, in which every person is working on their own document, or occasionally a copy of someone else's document. To share a Sketchflow model, you must e-mail the file to the recipient who must use a player (or their own copy of Sketchflow) to run it. They can add comments and e-mail the modified file back to you. This is a lot of work for collaboration.

The bottom line: Sketchflow was designed by individually-minded programmers. It will work for an individual but using it in groups will be time-consuming.

Prizes! - sort of

This past week-end I've been reading Deconstructing Flash from the team at JUXT. It's a bit old but it provides a very nice overview of Flash and some techniques for neat-o things.

I've been reading about Flash as part of the web-site work for Ben. He wants to move away from news and toward video. If I'm going to help him, I need to know the technologies. I found the "deconstructing" book at the Book Thing.

This morning, on my way home from the ATM, I found a bunch of books on the sidewalk. (Along with some pots and pans and a few other household items.) The books include Flash 5 for Dummies, Macromedia Flash MX Developer's Guide, Coldfusion for Flash, Flash Developer Study Guide, and Using Dreamweaver 3. The books also included Through the Global Lens, and The Clash of Civilizations and the Remaking of World Order.

I guess this is my opportunity to learn about Flash and possibly take over the world.

Wednesday, August 26, 2009

Dell 1, me 0

The replacement part for the Dell Dimension 4600 computer arrived today, just when Dell said it would.

I need the on-off switch and the cable that connects it to the motherboard.

I opened the box... and the part is the wrong part.

Dell shipped me the front panel USB port cage and the cable that connects them to the motherboard.

This is the part I ordered. That is, it is the part number I specified. I got that from the Dell technical support line.

The Dell web site was not helpful in locating this part. (If you have a part number, you can order it. If you don't know the part number, there is no way to find it.)

I could fight Dell and demand my money back. But I just don't have the energy for that debate. Instead, I will simply drop Dell down in the supplier list and consider someone like Asus for the next PC purchase.

In the meantime, I must re-consider my plan for running Windows. The idea was to install Windows on the Dell Dimension 4600. I had hoped that the power switch would arrive today; I could then order memory and a hard drive tonight and have them ready for me when I return from a family visit this week-end.

Bother.

A fast moving company

Some companies move faster than others.

Most of the recruiters that I deal with work at a certain pace. I've gotten used to this pace. When they have an opportunity, they send an e-mail. I respond with an e-mail. The next day, they might call and we talk about the position. They then contact the hiring company and schedule a phone-screen interview, usually for the next week.

Within the past twenty-four hours, a company has: found my resume, sent me an e-mail (to which I responded), sent a follow-up e-mail, talked with me on the phone, and run through a technical screening interview. They move quickly.

The position sounds challenging, with some work in C++ and C# and some work in web technologies. (They list both ASP.NET and PHP; they may be using different technologies on different projects.) The location is a bit of a far commute (probably in the eighty-minute range), but the trip can be made by train and subway.

I am excited about this opportunity. I may not get it -- they have other candidates -- but I have a chance at it.

Tuesday, August 25, 2009

Yet another schema, and results from interview

I worked on the news aggregation web site today. I added advertisements, or rather made changes to the database to support advertisements. I think we're on schema revision 6 now. I should probably stop counting; it's not the number of revisions that counts, it's the end result.

Tomorrow I can revise the HTML and PHP to display ads. My idea is to pull them from the database at random, matching advertisement image size.
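In MySQL terms that can be as simple as an ORDER BY RAND() query. A sketch, with invented table and column names:

<?php
// Pick one random ad of a given size. Assumes the page has already
// connected to MySQL; 'ads' and its columns are invented names.
$result = mysql_query(
    "SELECT image_file FROM ads
      WHERE width = 160 AND height = 600
      ORDER BY RAND() LIMIT 1");
if ($row = mysql_fetch_assoc($result)) {
    echo '<img src="' . $row['image_file'] . '" alt="advertisement" />';
}
?>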

In other news, I heard from the recruiter about Becton Dickinson. They are going with another candidate, one with more experience in C#. Given their needs, I can understand. The recruiter said that they were willing to bring me on in a later position. I'm not sure if they are sincere or just being nice.

I talked with another staffing company about a position in West Chester, PA. The hiring company (ING) wants a person to reverse-engineer their annuities-modeling software and remove a third-party library. They want someone who knows C++, annuities, and UML. (And the staffing company is willing to pay a glorious $48/hr on a 1099 basis.) I'm not sure that they will find someone with those skills at that price.

And in other other news, I met some folks at the pool today (of all places) and gave them some business cards. One is in the hotel industry, the other is a waiter here in Baltimore. I gave the waiter a copy of Ubuntu; he said that his Windows had BSOD problems. (And I have the copies of Ubuntu to give away to people!)

Monday, August 24, 2009

Network admin work

A little bit of network administration this morning.

Over the week-end, I picked up two monitors and a Linksys router. This morning I tested them.

The monitors were easy: plug them in, attach to a working computer, and verify the display. Both are working and will be good monitors for servers on Ben's project.

The router was a little more work. You can attach a router to a computer but you must know its IP address. And to change anything, you must know the password. I knew neither.

The router is a Linksys BEFSR41 4-port WAN/LAN router, suitable for home use with a broadband feed. I use the wireless sibling of this router for my home network.

A little bit of research on the internet provided the instructions for a complete reset of the BEFSR41. (Power on, press and hold the reset button until the red lights come on and go out.) It also provided the default password ('admin') and the default IP address (192.168.1.1). I reset the router on a completely separate network, to avoid conflicts on my home network.

Once reset, I was able to configure the router and I am confident that we can use it for a small network.

OK, that's enough hardware fun for today!

Friday, August 21, 2009

An hour with Dell

After my last post I felt rather silly about the Dell 4600 and not using it because of the power switch. Or rather, I felt that it was blocking me, preventing me from moving forward.

So during my lunch hour, I visited the Dell web site and hunted down the part. My search included various web sites (courtesy of altavista.com), dell.com, the Dell on-line chat service, and Dell phone support. The on-line chat was helpful in that it provided me with a part number. Dell's phone support was not particularly helpful. The dell.com web site was helpful -- once I supplied the part number. The web site did not help me find the part number for the power switch.

The switch assembly (it's the switch, a mounting bracket, and a cable) should arrive next week. (Probably Wednesday.) When it arrives, I will verify that it is the right component. If it is, I will order memory and a hard drive for the computer. (Those will take another few days.) By early September, I should have the PC working.

This PC will be the Windows computer. My plan is to install Windows 7 and as many free Microsoft packages as I can. I'm thinking of Visual Studio Express, SQL Server Express, IIS, and... um... that's all I can think of for now.

I may want two PCs running Windows: one server and one client.

I'm avoiding virtual PCs for now. My hardware can support Windows, but not virtual PCs running Windows.

Possible job -- oops, no

Accomplished two things this morning: some web site stuff and some recruiter stuff.

I took some screen-shots of the news aggregator web site and sent them off to Ben, my partner-in-crime. I think that the web site is now 'portable', meaning that I can take the laptop computer and bring it to others for a demonstration.

On the recruiting side, I spoke with Lynn E about a position here in town. Lots of skill matches but apparently they insist on Microsoft SQL Server experience. (Possibly for admin or reporting tools.) That's a hole in my skillset and I have no simple way to gain that experience. I have one computer that is capable of running Windows Vista (or Windows 7) but it needs memory, a disk, and most importantly an on-off switch. I can get all of those but I need the on-off switch. I even have the Windows 7 preview disks, so I could install it.

This afternoon I will start reading Understanding the Linux Kernel. It's an O'Reilly book, and I think it will help me on the Linux side of things.

Thursday, August 20, 2009

Web site progress

I made a bit of progress with the news web site today.

First, I added a separate page for displaying a news story. I don't want the news story on the front page, just the headlines and teaser. Adding a new page was easy, linking to it was easy, and changing the query to retrieve a specific story... was hard.

I needed a way to specify a story; the database had no unique key for stories. It was a minor change to add the column and make it auto-increment in SQL. (I dropped the table, changed the 'create' scripts, and re-added the table and data.)
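The revised 'create' script amounts to one extra column. Roughly like this, with illustrative names rather than my real schema:

<?php
// Illustrative re-creation of the stories table with an auto-increment key.
// Assumes an open MySQL connection; names are not my real schema.
mysql_query("
    CREATE TABLE stories (
        story_id INT NOT NULL AUTO_INCREMENT,
        headline VARCHAR(200) NOT NULL,
        teaser   VARCHAR(500),
        body     TEXT,
        PRIMARY KEY (story_id)
    )");
?>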

For a while, PHP was having problems. I thought it was an SQL error, but it turned out to be a PHP syntax problem. I had changed "$q" to "q" in my code, and PHP did not like it. The errors from PHP were not very helpful, and in fact led me down the wrong path. I had to get up and walk around for a bit, and then come back to the code with "fresh eyes". Once I did, the problem was obvious.

Once that problem was solved, I then had two web pages with a fair amount of identical HTML code. I learned a bit about PHP's "include" capabilities and used them to combine the common HTML into a single module. The result was a cleaner set of web pages.
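The mechanics are nothing fancy -- each page now starts and ends with an include. (The file names here are mine for illustration.)

<?php include 'header.php'; ?>
<p>Page-specific content goes here.</p>
<?php include 'footer.php'; ?>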

I still have a bit of clean-up to perform. But that can wait until tomorrow. I want to absorb what I learned with my adventures today.

Wednesday, August 19, 2009

Success with characters

I've solved the "question mark" problem with my web site! After careful testing and experiments (with some help from web pages), I found the problem was in my use of gettext(). I needed one additional call to bind_textdomain_codeset(), and now the web site displays the expected characters. Yay!

This was an interesting problem. My early analysis led me down the wrong path. I had used PHP in command-line mode, and those tests produced the proper characters on the console. (I'm still not sure how that happened.) Since PHP and gettext() were giving me the correct results, I thought the problem was in Apache.

Later tests with other text in the web page showed me that I was wrong -- Apache was serving characters just fine. I narrowed the problem to gettext(), and then it was an easy fix.
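For future reference, the working sequence looks something like this (the domain name and paths are simplified examples, not my site's actual values):

<?php
// Working gettext() setup; bind_textdomain_codeset() was the missing call.
setlocale(LC_ALL, 'fr_FR.UTF-8');
bindtextdomain('messages', './locale');
bind_textdomain_codeset('messages', 'UTF-8');  // without this: question marks
textdomain('messages');
echo gettext('greeting');
?>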

During these tests, I put the project under version control. It's now important enough to keep older versions. And I can use the experience with Subversion -- I still have problems getting the right directory names when importing projects. (I had difficulties in this area with PVCS and SourceSafe too. Since it affects every version control system that I have used, maybe the problem is in how I think about version control systems.)

Friday, August 14, 2009

More progress with web server

I've made a little more progress with my web server project today. I have successfully added text into the MySQL database and used PHP to read that text when building web pages. This is another step (or maybe set of steps) towards a dynamic web page.

What's interesting is that the text is displayed with all non-US characters intact. I've been having problems with non-US characters in the items that are supplied by gettext(). I thought the problem was in Apache, but that doesn't seem to be the case. Perhaps the problem is in gettext() -- but then PHP and gettext() do the right thing in command-line mode.

This problem is proving to be an interesting puzzle. But I'm not letting it stop me. I can make progress in other areas and keep working on the problem.

Next steps will be to add more data to my database, remove the constant (phony) text in the web page, and demo the results to the client.

Thursday, August 13, 2009

Success with Apache, PHP, and MySQL

I have successfully configured //patricia to use Apache, PHP, and MySQL. (And all in console mode -- no GUI!)

The database will hold news articles. I'm thinking of storing text in the MySQL database as HTML, and simply pulling the text and dropping it into the web page. I wrote a small Ruby program to convert plain text into HTML. Right now, it assumes that each line is a paragraph. I think a better arrangement would be to convert blank lines into paragraph breaks.
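The blank-line version is only a few lines of logic. A sketch -- in PHP here for convenience, though the converter itself is a Ruby script:

<?php
// Sketch: treat blank lines as paragraph breaks, wrap each chunk in <p>.
// 'story.txt' is an invented input file name.
$text = file_get_contents('story.txt');
$paragraphs = preg_split('/\n\s*\n/', trim($text));
foreach ($paragraphs as $p) {
    echo '<p>' . htmlspecialchars(trim($p)) . "</p>\n";
}
?>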

More progress with Debian Linux

I installed MySQL on Debian Linux, created a database, and re-configured things to allow for remote access. (Out of the box, MySQL allows connections from the local host and no others.) It meant assigning //patricia (the host running Debian, Apache, PHP, and now MySQL) a static IP address. That went more easily than I expected. It was the first time I configured a Linux host via console mode and text files; my previous experience had been with GUI programs like YaST.

Now I need to verify that PHP can talk to MySQL. (Do I have all of the right packages? Do the package installs do the right thing to configuration files?)
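A quick test script should answer both questions (the host address and account here are invented for the example):

<?php
// Does this PHP have the MySQL extension, and can it reach //patricia?
if (!function_exists('mysql_connect')) {
    die("The php5-mysql package (or equivalent) is not installed.\n");
}
$db = mysql_connect('192.168.1.50', 'newsuser', 'secret')
    or die('connect failed: ' . mysql_error());
echo 'Connected to MySQL ', mysql_get_server_info($db), "\n";
mysql_close($db);
?>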

After that, I need a way to insert news stories into the MySQL database. Should I do something with newline characters? (I suppose that I should, as MySQL considers them significant in its scripts.)

Tuesday, August 11, 2009

Progress with Debian Linux

Debian Linux 5 seems to be happy on the IBM T21 ThinkPad. I installed it with the "web server" configuration, so there is no GUI desktop. Linux runs in "plain text" mode, with a simple terminal console and no graphics except for what can be displayed with ncurses.

The web server is working and I used the 'aptitude' package manager to install the PHP modules. Next I will copy over my web site files from //grendel and try them, first with just PHP and then with PHP inside Apache.

Monday, August 10, 2009

Setting up a Linux server

The IBM T21 ThinkPad has decided to power up again. It has Ubuntu Linux 5.10 on it; later versions will not install. I need a configuration with Apache and PHP, and that means a later version of Linux. I tried Puppy Linux 4.1.1, which runs (nicely, too), but it seems geared for the desktop and not a server. (There's no package to install Apache, much less PHP.)

My next choice is Debian 5. It seems to be installing. (Puppy Linux ran live off the CD-ROM; Debian wants to live on the hard drive.)

Friday, August 7, 2009

Interview - BD

I interviewed with Becton Dickinson today. It was a three-hour process! Various folks talked with me and asked questions. Some questions were technical; some focused on people issues. They seemed to have a lot of questions about presenting ideas and persuading others -- that may be a bad sign.

I won't know what they think until next week. They have at least one other candidate to interview.

If I time the job offer (from anyone) right, I could have a few days to visit my parents. That might be a good thing.

Thursday, August 6, 2009

Interview with OMB

I interviewed with the folks at OMB today. It went well; we both learned a lot during the meeting. ("An interview is a meeting with a purpose", as the HR director at MCSB used to say.)

They have some C++ systems that need maintenance, and possibly some improvement. They want to move across platforms (Windows today, Linux in the future), so they are thinking about staying with C++. They also have constraints (deliverables, deadlines, audits, and communications retention) that limit the options for a development project. The project is not impossible, but it is a bit harder than it would be without the constraints.

During the interview I mentioned my work with the Paranoid build system at UPS. That system built Worldship every night and reported problems. I think that they found the idea of nightly builds appealing. Maybe this will work in my favor.

The interview lasted an hour. We covered the systems, the technology, the major tasks, where they wanted to go, and some side issues such as commuting and work hours.

Tuesday, August 4, 2009

Lambdas for LINQ

I went to the monthly CMAP meeting tonight. I didn't want to -- I had been working on a problem with PHP and translated text. But I'm glad that I did go.

Tonight's meeting had free pizza, a long-ish advertisement for Telerik (one of their sponsors), and a presentation about LINQ. The latter was the most interesting.

Microsoft added LINQ to the .NET framework and its languages a few years ago, in release 3.0. The advertised reason was for easy connection to databases and web services. I must admit it does deliver on that promise. In .NET 2.0, queries required quite a bit of ceremony; in .NET 3.0 and LINQ, queries are brief and to the point.

But the changes for LINQ were not simply new classes in the .NET framework. Microsoft changed the compilers (for C# and Visual Basic, at least) to allow the new syntax for easy queries. They changed the way code is generated, and possibly the CLR run-time engine, to allow for new code constructs such as lambda expressions.

These changes are significant and keep Microsoft competitive with other technologies. You need the ability to create lambda expressions if you want to use cloud computing effectively.

What bothers me is that Microsoft has hidden these changes. They have not advertised them; while LINQ uses them, it does not make them obvious. The magazine articles I have read laud LINQ and explain its neat-o capabilities for queries but don't explain the underlying changes. Is Microsoft ashamed of them? Afraid of them? Fearful that their average developers will be unable to understand or use the advanced concepts? I don't know.

As a bonus, I won a book at the group's end-of-meeting raffle. It is the Visual Studio 2008 Unleashed tome. Rather heavy. Not sure that I want to read it.

Modest success with translations

I have convinced gettext() to work with my web pages!

At least in my development environment. The development environment is SuSE 11.1; my production web server runs on Ubuntu 8.10. I'll have to check the production environment and make sure that it has support for the expected locales.

Gettext is very picky about the locale specification. I was using 'fr' for my locale, since that was listed in the example. But that text does not match any of the available locales on my computer. Since gettext() found no exact match, it fell back to the default message, which is the ID you provided. Instead of 'fr' I must specify 'fr_FR' for the locale (and 'UTF-8' for the encoding).

Solving this problem took longer than I expected. (Problems often do.) I learned a few other things about PHP while I was struggling with this problem:

1) Debugging PHP is difficult. Or maybe I need to learn more debugging skills for PHP.

2) Using PHP in script mode is a quick way to test files. It's much easier than editing a file, FTP'ing it to the server, moving it to the proper directory, and then calling up the web page.

3) I can use better search techniques for documentation on the web. My initial searches kept pointing me to the PHP documentation. The GNU documentation (although lengthy) had the information that I needed.

4) I can use more patience reading long documents.

5) Sometimes one has to think, run tests, think again, and run more tests. Thinking alone does not solve the problem, nor does running random tests.

Monday, August 3, 2009

Web sites in multiple languages

After meeting with Ben last week, I started thinking about his idea. He wants a web site that provides news and serves people in several different countries. This means web pages in different languages.

My experience on the Worldship project at UPS is helpful here. I understand the general process of "translating" a web site (or application) into multiple languages.

Some quick searches on the web show that the gettext toolset will do the job nicely. As a bonus, gettext plugs into PHP! (It's callable from Java and other languages too.)

After some reading of web sites, some consulting with the books I have on hand, and a little bit of thinking, I was ready. I took the static web page, pulled out some text strings, processed the string file with the gettext tools, added the logic in the web page to call gettext, and tried it.

It doesn't work.

I've checked that files are present in the proper directories, and contain the correct (or so I think) PHP commands. It seems to fail to find the messages.mo file.

Blah!

Thursday, July 30, 2009

Start-up opportunity

I met with Ben A. and discussed his start-up opportunity. He has an interesting idea and I need a little time to get things sorted out in my head. (I won't discuss details here.)

We talked for about two hours, reviewing his plan and some of the technology that is necessary to make it happen. It is certainly reachable with technology; for me, the question is: will it succeed in the marketplace?

Javascript experiments part 1

I experimented with Javascript this morning. The "Javascript Missing Manual" explains the language and the concepts clearly and with meaningful examples. (I'm using the ACM's connection to O'Reilly Safari for the on-line version of the book. I prefer paper books but on-line at least allows for copy and paste of code samples.)

So far, I've learned a bit about Javascript. Read that as "made errors and then figured out what was wrong". I learn only from my mistakes, so the more mistakes the more I learn!

I'm using KWrite as an editor and Opera as the browser. I'm not concerned about browser compatibility since I am using jQuery and it handles most of the browser issues. I'm not too fond of KWrite though; I would prefer an editor with better syntax highlighting.

In other news, a recruiter from Manpower called yesterday with an opening for what they called a senior architect. I looked at the job description that they sent via e-mail and am not quite sure what to call it. The hiring company (I think that it is Lexis/Nexis) wants a person with strong technical skills, strong project management skills, and strong presentation and mentoring skills. The recruiter told me that the position was in either Cary, NC, or a city (I forget the name) in Ontario.

The recruiter called back an hour later and asked if I had reviewed the description and the questions accompanying it, and also indicated that this was an urgent item. I promised a response by this morning, and she seemed satisfied, if a little disappointed.

I'm not too keen on recruiters playing the "urgent" card, especially when they don't really know the candidate. I'm guessing that an "urgent" position has been handed out to several firms and they are running to find candidates quickly. A brief search of the web showed that there were several job postings with selected text identical to Manpower's description.

I understand that staffing companies compete, and that they need to supply candidates before the other guy. But I think the last-minute scramble to search the web job boards, quickly identify candidates with scripted questions, vet them with a senior recruiter, and possibly place them before a hiring company is a poor use of time and resources. It certainly wastes my time, and I can't see it making the recruiters any friends. It sends the message "you are a commodity that we happen to have a need for right now, and we'll take advantage of that, but after this opportunity we're not really interested in talking with you".

Maybe this is the result of the "hiring company pays" model. Since I'm not paying, I am the commodity.

Wednesday, July 29, 2009

Javascript and interview

I read more on Javascript this morning. I learned about opening new windows, and with my new-found knowledge I better understand the problems we had at UPS with the HTML help files generated by RoboHelp. (They would open new windows or replace the contents of existing windows, usually when we wanted the other behavior.)

Today's phone interview went well. It was a technical screening interview, to verify that I have the skills that they need. Or to at least get me to the next level of interview. I did better than I thought on the database questions and worse than I thought on the C#/.NET questions. If I pass, I will be invited for an in-person interview.

The person running the Yahoo-like start-up called today, and we have a meeting set for tomorrow. From his explanation, this is a completely new venture - no code or infrastructure exist. No employees, either. I've begun listing the things that they will have to worry about, from development and source code control to production and migration processes. I filled the back of an envelope quickly and I am still thinking of more things. There is quite a lot; fortunately they don't have to be handled all at once. Many items can be deferred; the important thing is to decide which.

Tuesday, July 28, 2009

Well, today I read up on Javascript but ran no experiments. I'm comfortable with the language - lots of operations are similar to those in Perl. I have some specific tasks in mind for Javascript: setting the focus to an input field, verifying that text has been entered, and validating text against a format. (All things that can be done natively in HTML 5, since they are so common on the web. I won't need the exact code when the web grows up to HTML 5, but I can use it before then.) I see more reading in the future.

The phone interview with OMB turned out to be not OMB but the contracting company that provides people to OMB. (There are lots of layers in this opportunity.) The job has a number of challenges -- better than the original description. Of the positions I am discussing with folks, this one seems most interesting. The manager I spoke with said he had a few other candidates to interview; I hope to hear something in the next few days.

Tomorrow I have another phone interview with a different company, one a little more local to Baltimore than the OMB. It's a drive, though, not within the transit circle.

Back to normal

After the conference, one returns to the normal routine.

I spent a bit of the morning going through contacts from OSCON and connecting with them through LinkedIn and e-mail. One company (Eventbrite) is looking for a Python programmer for San Francisco. I'm not the right person for that slot, since I don't know Python. (Note to self: add Python to the list of tech to learn.)

I met some former co-workers for lunch yesterday. It got me out of the apartment, and away from the computer -- for a short time.

This morning I checked Craigslist and responded to an ad placed by Ukazoo. They ran ads a few months ago, and I responded to them too -- with no response from Ukazoo. I find the position interesting but the ad a little disturbing: it lists "experience 0 to 15 years". I wonder if they are discriminating on age. And given the frequency of ads, either they are expanding, or they didn't hire anyone, or they cannot hold on to people.

Today I have a phone interview with someone from OMB. Also, I plan to experiment with Javascript.

Friday, July 24, 2009

OSCON 2009 day 5

Day 5 of OSCON 2009 -- the last day of the conference.

The OSCON schedule is pretty aggressive: four full days of keynotes, sessions, and events with evening activities too. The fifth day is a "half day" with keynotes and sessions running up to noon or thereabouts. Yet it doesn't end at noon. There was a get-together (with lunch provided) after the wrap-up session and then informal conversations for as long as one wanted. I spent some time chatting with a fellow Nokia N800 enthusiast.

Keynote sessions today covered open source principles in federal government and "rewilding", the ability to value natural effects and refrain from controlling them. (Think wildfires on a natural schedule for stronger forests, but applied to many other areas.)

I attended sessions on HTML 5 and cloud computing. There are good things in HTML 5 and the major browsers are on board to implement the standard. Cloud computing still eludes me, but it is slowly coming into focus. Right now, I view it as commodity computing power for web servers, scalable up and down as you need it. There are competing APIs, just as there were competing standards in the early days of electricity. I'm confident that common standards will emerge; once they do I expect adoption to increase quickly.

Cloud computing also raises the bar for programming. One needs strong languages to work effectively in the cloud; transporting current applications won't work. Programs need to be designed for scalability and parallel processing (or at least multiple instances). Languages such as Ruby and Groovy are more effective; I'm not sure that Java or C# can do the job.

Looking back at the conference, I am impressed with the variety of projects. Open source has moved beyond the typical set of Linux, Apache, and Perl. (Of course, it expanded beyond those many years ago.) There are lots -- and I do mean lots -- of projects and new concepts in open source. It will take some effort to identify the relevant fraction and keep up with them. If you're not paying attention, then you are falling behind!

Thursday, July 23, 2009

OSCON 2009 day 4

Another packed day at OSCON.

The day started with keynote sessions.

Technical sessions included cloud computing with Eucalyptus, managing innovative teams, graph databases, and patterns in dynamic languages. The last was the best of the sessions. The presenter showed how Ruby and Groovy have implemented some patterns (Iterator and Command) into their syntax, and how other patterns (Strategy, Decorator, and Command) can be implemented easily. The "patterns" book was written in the age of C++ but languages have advanced a lot.

A bonus in the "patterns" session was seeing what Ruby and Groovy can really do. They have capabilities that are not present in the traditional languages of C++ and Java. (Which is interesting since Groovy sits on top of Java, or at least the JVM.)

In other news, I have given away almost all of my business cards. Next time I must bring more!

Wednesday, July 22, 2009

OSCON 2009 day 3

I attended several sessions today. First up was Tim O'Reilly's keynote on open source and transparency in government. Then sessions on maintenance and abstractions, private cloud systems, profiling tools for Perl, leading teams, testing web apps, Ruby on Rails 3, and visualizing large data sets.

There were conversations between classes and at lunch. Some even talked about computers and tech! (I think that others, like myself, need to talk about other things to keep from going crazy.)

I ended the day with an informal session held by Yahoo. They are looking to hire people and they presented information about Yahoo -- their internal systems, the skills they look for, and the good (and bad) things about working at Yahoo. I chatted with a few Yahoo-vians (Yahooligans?) and left my contact information.

Tuesday, July 21, 2009

OSCON 2009 - days 1 and 2

I've attended some good sessions at OSCON 2009 so far. Yesterday saw sessions for the Google App Engine and iPhone programming. The former lets you create cloud applications; the latter lets you build apps for iPhones. Both are easier than I expected.

Today I was in sessions for Gearman (the remote worker task manager), improving APIs (Damian Conway), and semantic technologies for storing data.

The best sessions have been lunch and dinner, where I can meet other folks and we can exchange ideas more informally. Today at lunch I chatted with a few folks and one of them showed me a web site that tracks consulting jobs by rate, technology, and location. Very interesting! This kind of information lets one identify the "up and coming" technologies.

Saturday, July 18, 2009

Javascript

I've been experimenting with Javascript these past few days. Using the ACM-provided subscription to Safari, I've been reading books on Javascript and going through the tutorials. The overview of browser internals from Dynamic HTML has been very helpful -- it provided a good mental map of the innards of the browser.

The tutorial exercises, so far, have been light. They are designed for people familiar with web pages but not coding. But I'm happy with the pace; I need to absorb/internalize the browser side of things.

While that's going on, I've been thinking about creating my own task for Javascript code. I work better with specific goals, and I used that technique to learn about PHP programming on the server side. My goal there was to provide statistics for source code.

I'm thinking of a list subscription web service, one that allows individuals to sign up for different lists. I had wanted this at UPS; it would have eliminated some administrative work for Worldship builds. The idea is this: System administrators create mailing lists and use them to send notices (for whatever events they want). Individuals sign up for mailing lists (subscribe) and unsubscribe as they want. A self-service mailing list, if you will. (I need a way to verify e-mail addresses; I can't allow Sam to sign up with Ryan's address.)
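The usual answer for address verification is a random token: e-mail it to the address and activate the subscription only when the token comes back. A sketch, with an invented table and URL:

<?php
// Sketch: record a pending subscription and mail a verification token.
// Assumes an open MySQL connection; the 'subscriptions' table and the
// verify URL are invented for illustration.
$email = $_POST['email'];
$list  = $_POST['list'];
$token = md5(uniqid(rand(), true));   // random, hard-to-guess token
mysql_query(sprintf(
    "INSERT INTO subscriptions (list_name, email, token, verified)
     VALUES ('%s', '%s', '%s', 0)",
    mysql_real_escape_string($list),
    mysql_real_escape_string($email),
    $token));
mail($email, 'Confirm your subscription',
     "Visit http://example.com/verify.php?token=$token to confirm.");
?>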

Wednesday, July 15, 2009

Job criteria

I'm often asked to describe my ideal position. I look for several things.

First and foremost: smart, creative people. I don't need a specific technology.

Second: challenging work. I want to contribute and I want to learn new things.

Third: new technology. For me, this would be web technologies, social networks, and mobile app technologies.

Fourth: location. The location tells me the commute and the possible options for the commute. My preference is a location with options for the commute. If I have to drive, then my preference is for a location away from the rush hour traffic.

Fifth: the compensation package. This includes salary, insurance, time off, and retirement package.

Most job boards and recruiters match the technology against current skills, with no thought to the other criteria.