Thursday, October 22, 2009

e-mail servers

I finally configured "postfix" on //grendel today, and it works reasonably well. Woo-hoo! I can send e-mail messages to specific users! (All local; the e-mail server is not configured to relay messages to other servers.)
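
A quick way to exercise the new server is a few lines of Ruby using the standard net/smtp library. (The "grendel" addresses and the user name below are just assumptions about my setup, not anything postfix requires.)

  # Sanity check of local-only delivery, assuming postfix is listening on
  # localhost:25 and "joe" is an existing local user.
  require 'net/smtp'

  message = <<END_OF_MESSAGE
From: test@grendel
To: joe@grendel
Subject: postfix test

Hello from the new mail server.
END_OF_MESSAGE

  Net::SMTP.start('localhost', 25) do |smtp|
    smtp.send_message(message, 'test@grendel', 'joe@grendel')
  end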

Next is DNS.

I'm configuring e-mail and DNS for PHPmotion. It sends confirmation e-mails, and uses DNS to find your e-mail server. Without a working e-mail server I have to fake out the confirmation step, and I suspect my procedure for faking it is incomplete. PHPmotion doesn't document the complete set of steps; why should it, when you can just click the link in the e-mail and be done?


Tuesday, October 20, 2009

The date is set

I heard from the contracting company today. I start on Tuesday. Yay!

I filled out the "user agreement" form today. This lays out the basic working conditions: government equipment is for government business, the equipment may be monitored and I have no expectation of privacy, don't attach any non-government devices, and the equipment is non-secured and cannot be used for classified data. The rules are pretty much the same as the rules at UPS, only spelled out a bit better.


Printer yes, scanner no

I worked on hardware today.

I moved the HP Deskjet printer from //desdemona to //grendel. //grendel is becoming quite the server! It now supports FTP, HTTP, printing, and soon e-mail. Not bad for a ten-year-old Pentium II box. I tested printing from //desdemona and had no problems.

The other hardware was the Epson Perfection 1200S scanner. I have it attached to //desdemona, through the funny little SCSI adapter card that came with it. //desdemona runs SuSE 11.1 and almost works with the scanner. It sees the little adapter card (I think it is an Adaptec 2904). It has a driver (a few drivers, actually) for the Epson scanner. But it does not recognize the scanner. YaST scans for hardware and reports that no scanners were found. YaST also won't let me configure the driver -- it says that only drivers with devices attached may be configured.

So one victory and one failure.


Monday, October 19, 2009

The end of this part of the journey

The contracting company called late today. I've passed phase 1 of the security check, which means I can work in the office! Now they must negotiate the start date. It could be as early as Wednesday. Friday and Monday are out, as I am visiting my parents. Thursday is possible and so is next Tuesday.

This has been a long journey. I'm glad to move on to the next phase.


Trains, e-mail servers, and Ruby

This morning I made a test run of the commute into Washington DC. It went smoothly, almost. The train stopped at New Carrollton, held there by dispatchers. (Apparently there was a problem with the catenary.) The primary purpose of the test commute was to ensure that I could wake up and be at the train station on time. The secondary purpose was to train my body for the commute. Having met both goals, I decided to return home rather than be obstinate.

The train rides allowed me time to read up on e-mail servers. I will probably need an e-mail server for the test environment for the video web site project. The book from the CPOSC prize table has just the information that I need. (Or so it seems. I have yet to try any of it.)

I met my former co-workers Larry and Will for lunch. We had half-price hamburgers and good conversation.

This afternoon I sat down with Ruby and the Microsoft OOXML files for Excel. Extracting data from them is possible but not straightforward -- the files are linked and there are pointers from one file to another. It's not impossible, just not simple. Extracting data should be easier than creating new files; I should have started on the extraction side first. Oh well, I'm learning about the files either way.
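
Here is the rough shape of the extraction side as I currently understand it. This is only a sketch: it assumes the rubyzip gem, a made-up file name, and it glosses over XML namespaces.

  # Sketch: pull cell values out of an .xlsx package, following the pointer
  # from a cell into the shared-strings file. Assumes the rubyzip gem;
  # REXML is in the standard library.
  require 'zip'
  require 'rexml/document'

  Zip::File.open('report.xlsx') do |xlsx|
    strings = REXML::Document.new(xlsx.read('xl/sharedStrings.xml'))
    shared  = REXML::XPath.match(strings, '//t').map { |t| t.text }

    sheet = REXML::Document.new(xlsx.read('xl/worksheets/sheet1.xml'))
    REXML::XPath.each(sheet, '//c') do |cell|
      value = REXML::XPath.first(cell, 'v')
      next if value.nil?
      text = value.text
      text = shared[text.to_i] if cell.attributes['t'] == 's'   # 's' means "shared string"
      puts "#{cell.attributes['r']}: #{text}"
    end
  end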


Sunday, October 18, 2009

Where the open source boys are

I attended the Central Pennsylvania Open Source Conference this weekend. It was a small conference. With the attendee count at 150, a better description might be "tiny". Yet even with a small number of people, it was a good conference.

There are few conferences for open source software. The big conference is O'Reilly's OSCON, which has been held for over ten years, at a variety of locations. Beyond OSCON, there are smaller conferences, but no large cluster of conferences. CPOSC is in Harrisburg, PA; Open Source Bridge is in Portland, OR; and there are conferences in Utah and Georgia.

I started thinking that there might be a clustering of conferences around open source communities, which led me to think about such communities. Is there a geographic concentration of open source projects? Something akin to a Silicon Valley for open source?

The more I thought about it, the more I realized that it did not exist, and probably would not exist. Open source projects are typically manned by volunteers, working at home or in employer-supplied facilities, but not in a central location. The open source model does not require a central office. Contributors do not commute to a common office every day, report to managers, sit at assigned desks, or attend mandatory status meetings.

Open source works in a distributed manner. The resources are people (and a few computers and network connections), not ores extracted from the ground or chemicals manufactured in a large plant. Open source projects don't need massive assembly plants, deep supply chains, voluminous warehouses, starched uniforms, large cafeterias, or any of the other industrial-age apparatus that demands enormous supporting infrastructure. The economic forces that pulled people together in the industrial age don't exist in open source. Daily physical presence is not needed.

Which is not to say that physical presence is useless. E-mail, instant messaging, and web cameras provide a narrow channel of communication, much narrower than being in the same room. Physical presence provides a "high bandwidth" channel, and it lets one person get to know another quickly; the non-verbal communication builds trust. Physical presence is needed occasionally, not every day.

I expect to see more open source conferences. Small conferences, such as the Open Source Bridge conference in Portland and the CPOSC in Harrisburg. I expect that they will be in cities, in places with support for travel, lodging, and meetings. Convention centers, hotels, and colleges will be popular places.

I expect that they will occur across the country, and around the world. There will be no Silicon Valley of open source, no one location with a majority of the developers. Instead, it will be everywhere, with people meeting when and where they like.


Thursday, October 15, 2009

Web site demo

I met with Ben today. I showed him the PHPmotion web site. He likes it; PHPmotion has the features that he wants.

Now to get e-mail and DNS working. PHPmotion sends confirmation e-mails to new users, and I think it completes the account setup when the user clicks the link in that e-mail. My system does not have DNS or e-mail hooked up, so PHPmotion skips the e-mail and never completes the setup work. The account for my one test user has a few things broken, such as uploading files.

I need to understand e-mail servers and DNS. Maybe I can configure them to be a private network, with no connections to the outside world.
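
If I do go the private-network route, a few lines of Ruby would tell me whether the private DNS is answering mail (MX) queries. The server address and domain name below are placeholders I made up.

  # Ask a specific (private) DNS server for the mail exchanger of a domain.
  # The 192.168.1.10 address and the videotest.local name are placeholders.
  require 'resolv'

  dns = Resolv::DNS.new(:nameserver => ['192.168.1.10'])
  dns.getresources('videotest.local', Resolv::DNS::Resource::IN::MX).each do |mx|
    puts "MX #{mx.preference} #{mx.exchange}"
  end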


Wednesday, October 14, 2009

Spreadsheet comparison

I tried three different spreadsheets today, performing the same task in each and comparing the results.

The spreadsheets were OpenOffice.org, Google Docs, and Zoho. (Each of these office suites has a spreadsheet component.) I did not include Microsoft Excel, as I know that it can perform this task. (Also because I don't have a copy of Microsoft Office.)

The task was to paste some data and create a chart. I used the same data for each spreadsheet. The data showed browser popularity over the past fifteen months. It had statistics for IE8, IE7, IE6, Firefox, Safari, Chrome, and Opera.

I used Google Docs first. I found the experience reasonable. Google Docs has a very good user interface, mimicking the typical (2007) spreadsheet program. (No "ribbon" UI.) I pasted the data and Google Docs did the right thing and parsed my values properly. I deleted some blank rows (they were in the source data), inserted a column and gave it formulas to calculate total IE popularity, sorted the data (the source data was in reverse order), and created a chart. Google Docs walked me through a short wizard to define the data for the chart, define some properties such as title and legend position, and then gave me a chart. The chart was readable and pleasant to look at.
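
(For the record, the total-IE formula is nothing fancy. Assuming the IE8, IE7, and IE6 percentages landed in columns B through D, it is something like the following, and it works the same way in all three packages.)

  =B2+C2+D2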

My experience with Zoho was similar. It does not mimic the desktop application as closely as Google Docs does; it takes some time to adjust to Zoho's UI. (But not much.) Zoho also parsed my data properly and provided a wizard to create the chart. (I also sorted the data and created formulas for total IE popularity.) Zoho's performance was slightly better than Google Docs', although Google Docs has the edge on UI. The chart in Zoho was not quite as nice as the one in Google Docs, but still quite usable. Zoho correctly omitted datapoints for empty cells; Google Docs drew the lines with values of zero.

The OpenOffice.org spreadsheet had the best performance. (It was a local application, not a web app.) It provided more granular control over the pasting of data, allowing me to select delimiters. I performed the same operations, removing blank rows, adding columns and formulas, and sorting the data. OpenOffice.org was the only application that could sort a subset of data; Google Docs and Zoho sorted everything with no option to exclude rows or columns. (Perhaps they can with judicious use of the selected cells.) OpenOffice.org also used a wizard to create the chart. Its configuration of chart data is more granular than in Google Docs and Zoho; OpenOffice.org lets me select the range for each series where the others use a single block of cells. Oddly, OpenOffice.org's chart was the worst in appearance, with thick, fuzzy lines where the other packages used thin, crisp lines.

Based on this brief evaluation, I see no clear leader. All three did the job. The web-based packages offer sharing of documents; OpenOffice.org uses the "my files on my computer" model. But OpenOffice.org has the better performance and the finer control. I think any could be used successfully for basic spreadsheet needs.


Monday, October 12, 2009

whereis for Windows

One of the utilities that I had written at UPS was 'whereis'. Windows does not have such a thing (or if it does, it is called something else). It is a useful program, and I resolved a number of problems with it.

The 'whereis' program tells you where a specific command is located. On our Windows systems, we had multiple copies of programs named 'find', 'diff', and 'sort'. The different versions behaved differently, and we found it important to set the PATH variable properly for our applications. The 'whereis' program helped us identify errors in the PATH variable.

Today I missed my old friend, so I decided to write it. (Write it again, since I do not have access to the programs I wrote at UPS.)

My old version was in Perl. My new version is in Ruby. The new version is much cleaner than the old version. I remember that I had a number of issues with the original program, and I needed several weeks (working on and off) to get the program correct.

The Ruby version is shorter, cleaner, and correct. It comes in at 20 lines (including comments and blank lines). I think that is about half the length of the Perl version. I built the program in three stages, testing each stage as I went. Total time: about ten minutes. (Less time than it took me to write this entry!)
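
The new version looks roughly like this. (This is a from-memory sketch of the idea, not a paste of the finished program.)

  # whereis: list every copy of a command found along the PATH.
  # On Windows, also try each extension listed in PATHEXT (.EXE, .BAT, ...).
  name = ARGV[0] or abort "usage: whereis command"

  extensions = (ENV['PATHEXT'] || '').split(';')
  extensions.unshift ''   # also try the bare name, for Unix-style systems

  ENV['PATH'].split(File::PATH_SEPARATOR).each do |dir|
    extensions.each do |ext|
      candidate = File.join(dir, name + ext)
      puts candidate if File.file?(candidate)
    end
  end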


Refactoring Ruby

I refactored some Ruby code this afternoon. It went faster than I expected. The changes were in the "Excel sheet creator" programs; I changed some modules to classes. (Ruby allows for both modules and classes. I had chosen the module approach but had really designed the code for classes, so the change made things simpler and more obvious.) I used my test cases to verify that my programs still work as expected. (They did not pass immediately after the change, due to syntax errors.) It felt good having the tests in place to verify my work.
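
The change was essentially this kind of shift. (The names below are invented to illustrate the idea; they are not the actual sheet-creator code.)

  # Illustration only: the old shape kept state in module-level variables,
  # so only one sheet could be in progress at a time.
  module SheetBuilder
    def self.start(name)
      @name = name
      @rows = []
    end

    def self.add_row(values)
      @rows << values
    end
  end

  # The new shape: a class, so each sheet object carries its own state.
  class Sheet
    def initialize(name)
      @name = name
      @rows = []
    end

    def add_row(values)
      @rows << values
    end
  end

  # Usage: each sheet is now an independent object.
  invoices = Sheet.new('Invoices')
  invoices.add_row(['date', 'amount'])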


More Rails

I picked up the third edition of Agile Web Development with Rails this weekend. It was at the local Barnes and Noble. I used it this morning and followed the exercises in the book. With this book (which is for Rails V2) the exercises work as expected. Yippee!


Friday, October 9, 2009

Rails without alignment

I did some work with Ruby on Rails today, but made less progress than I would have liked.

The problem I encountered was a mis-alignment of my installation of Ruby and the Agile Web Development with Rails book. I have Rails version 2 installed; my copy of the book is for version 1. I need a later edition of the book.

I made some progress, and have a pretty good idea of Rails and how it organizes Ruby code. I want to pursue this, and getting the later book should be possible.


Wednesday, October 7, 2009

Refactoring Ruby code

I worked on my "macron" macro processor today. This is a program written in Ruby, and it has two purposes. The first is to convert text into XML files and provide closing tags at the proper place. The second is to let me improve my knowledge of Ruby.

I focused on the second purpose today. I had two sections of code that read macro definitions from a file. (My initial design of the program was sloppy.) Today I refactored the program to use a single section of code. The adjustment used bits of both old sections, combining them into a single function.
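
The consolidated function ended up along these lines. This is a sketch that assumes a simple "name = expansion" file format, so the details differ from the real macron code.

  # Sketch of a single definition-reading function, assuming a plain
  # "name = expansion" file format (the real macron format differs in detail).
  def read_macro_definitions(filename)
    macros = {}
    File.open(filename) do |file|
      file.each_line do |line|
        line = line.strip
        next if line.empty? || line.start_with?('#')   # skip blanks and comments
        name, expansion = line.split('=', 2)
        macros[name.strip] = expansion.to_s.strip
      end
    end
    macros
  end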

I'm pleased with the result. The code is smaller (from 270 lines to 235 lines) and easier to read. Also, the code is more robust; the two old sections of code each had their specific functions but neither did everything. Now the logic is complete and in a single place.

I used automated tests during the refactoring effort. They helped immensely. At the end of the day, I knew that I had all functionality working.
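
The tests themselves are roughly of this shape: a minimal Test::Unit sketch that exercises the definition reader above, not the actual test suite.

  # A minimal Test::Unit check for the definition reader sketched above.
  require 'test/unit'
  require 'tempfile'

  class TestMacroDefinitions < Test::Unit::TestCase
    def test_simple_definition_is_parsed
      file = Tempfile.new('macros')
      file.puts 'hello = world'
      file.close
      assert_equal({ 'hello' => 'world' }, read_macro_definitions(file.path))
    ensure
      file.unlink if file
    end
  end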

The program is not perfect, nor complete. I have more features to add, and a few tests to add. But for today, it's good progress.


Thoughts on Silverlight

I attended a local user group yesterday evening and saw a presentation on Silverlight, Microsoft's web app environment. The presentation was quite good; it gave us a lot of information about Silverlight and the presenter knew his stuff. He walked us through the construction of a simple Silverlight application, showing us the support in Visual Studio.

Silverlight is an odd duck. It is not a standard Windows application, nor is it a standard web application (read that as "ASP.NET"). Originally built to compete with Adobe's Flash, Silverlight has become much more than a video player. Microsoft has extended it, giving it useful controls and widgets and making it more capable.

-- I'll digress here and say that Microsoft's extension strategy is not the "embrace and extend" strategy that developers dislike. Silverlight, from what I know, was never "Flash-compatible". Microsoft is not "corrupting the X standard" (in this case the Flash API) with proprietary extensions. With Silverlight, Microsoft has created a competing technology (version 1.0) and improved it (version 2.0) with extensions of its own. This kind of innovation is more welcome than the "embrace and extend" innovation that Microsoft used with Java. -- End of digression.

Silverlight is quite different from Microsoft's ASP.NET platform. The biggest difference is in screen layout. (I still use the term "screen", showing my development roots.) Silverlight defaults to a dynamic layout, not the fixed-position layout that old-school Visual Basic and Visual C++ programmers know and love. The new model requires a re-thinking of screen design, and while some may complain, I think that it is the right thing for Microsoft to do. Silverlight apps live in the web, and the web uses dynamic screens, not the known, fixed windows of a PC application.

Yet Silverlight has its faults. The construction of a simple app, even something like "Hello, world!", requires a fair amount of typing and clicking. The presenter showed us that he could bind a data source to a grid with only four mouse clicks, but he had done a bit of work in advance. The effort is not terribly large, and it is less than on previous Microsoft platforms, but I think the advantage still goes to other tools such as Rails. (Although Rails requires more effort on the screen layout side.)

My other complaint with Silverlight is its use of XML, specifically the XAML configuration files. This is more a personal bias against XML for configuration files than a technical complaint. I find XML files hard to read -- that is, my internal XML parser is inefficient. Microsoft provides support for constructing and editing the XAML files, with Intellisense type-completion and syntax checking, and those help, but reading the files still falls to my internal parser.

Overall, I'm impressed with Silverlight. For Microsoft shops, I think that it is the way to go.


Thursday, October 1, 2009

A practice commute

Today I took a practice run at the expected commute for the OMB gig. Up at 6:00 (well, a little earlier), a quick shower, a regular breakfast, and then a brisk walk to Penn Station. Arrived with time to spare -- which means I may be able to sleep until 6:00.

On the ride in I read a copy of InfoWorld, some of "Communications of the ACM", and a bit of "Web Navigation" from O'Reilly. The last is an old book on web design, dating back to 1998. (Wow... is anything web-related that old?) While it predates a lot of the modern web stuff, it still has good ideas. Ideas that are now accepted as the norm, such as "keep your navigation controls consistent".

On the commute back I read a copy of The Washington Post. I also noted items for the OMB gig, mostly set-up such as access to servers, e-mail, and tools for building the application. It's a longish list, going on about meetings, core hours, building access, work location, resources, and procedures for technical support. An article from Fast Company gave me a shorter list:

 - Meet a bunch of people, talk with them, and learn about the application
 - Work on the application
 - When I need help, talk to the right people

It's things like this short list that make Fast Company useful.