Documentation

I appreciate good documentation. The best documentation I've found includes a good summary, followed by examples and illustrations, and then progressively more detailed information. If I recognize a relationship with a concept I already know, I don't need the deep dive; I can usually glean what I need from the summary and examples, which is why the detailed information should come after the examples and illustrations.

brady book of turbo pascal

In the late '80s, when we were still reading hard-copy books, I found that Brady technical books, especially 'The Brady Book of Turbo Pascal' by Mike Edelhart, were very good, mostly because they followed a convention where topic details, highlights, examples, and figures were laid out consistently from page to page. By the end of the book I was able to get most of the information from the examples and highlights alone. That is still good practice for our electronic documents today.

round trip

I've always called IBM documentation a 'race to knowledge' (and not in a good way). What I mean is, IBM documentation usually contained a summary and a reference to another document that, in turn, provided a summary and a reference to yet another document, and so on, until you were back at the original document. A few laps around the track and you begin to gain comprehension and can start experimenting.

Today, many websites follow this same documentation flow, where short summaries and links fail to bring the major concepts to light. The content behind each link becomes an abstract of the whole and loses its effectiveness in conveying the information you actually need. If you want an example of this pattern, look at the materials on the OASIS site.

Documentation is not a substitute


Most complex problems need documentation before a solid solution can be implemented. Too often, however, documentation is where the software vendor ends its delivery experience to the consumer. For example, take software installs.

During the early days of local area networks, vendors like IBM and Novell dominated the LAN market. Their installs, much to their credit, included detailed instructions defining the configuration steps required to get the network stack up and running. Once the install was complete, you had all the information required to finish the process yourself. Meanwhile, Microsoft was trying to make inroads toward becoming the network of choice. There was documentation, but it was rarely needed, because the configuration was complete at the end of the install.

I remember the experience being less than optimal when installing a network stack on OS/2. Installing the software was easy enough and even created a template configuration file. But it required reading the manual, editing the configuration file, and ensuring that the different drivers were loaded in the proper order. And heaven forbid you had a misplaced semicolon, a misspelled keyword, or the wrong load sequence. If you needed to reinstall, you made a backup of your config first, because a reinstall generally broke the existing configuration and forced you to start all over.
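For anyone who never lived through it, the fragment below gives a rough flavor of that hand-edited configuration. The directory paths and driver file names are approximate and purely illustrative, not taken from any specific manual, but the pain point was real: DEVICE lines in CONFIG.SYS had to load the protocol manager, the network card (MAC) driver, and the protocol drivers in the right order, and a companion PROTOCOL.INI had to bind them together with exactly the right keywords.

    REM  CONFIG.SYS fragment -- illustrative only; paths and driver names varied by card and vendor.
    REM  Load order mattered: protocol manager first, then the card (MAC) driver,
    REM  then the protocol drivers, with PROTOCOL.INI binding them together.
    DEVICE=C:\IBMCOM\PROTMAN.OS2 /I:C:\IBMCOM
    DEVICE=C:\IBMCOM\MACS\ELNK3.OS2
    DEVICE=C:\IBMCOM\PROTOCOL\NETBEUI.OS2

Get one of those lines out of order, or one keyword wrong in PROTOCOL.INI, and the stack simply failed to start.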

My Microsoft experience, however (I know, fanboy), required only that I knew our environment's details and entered them during the install. The layout, formatting, keywords, and load sequences inside the config file were no longer my concern. One reboot and the network was accessible.

install wizard

This pattern of installing and immediately running the application is a hallmark of Microsoft software. As I said, I'm a bit of a fanboy, but I lived through the days of manual configuration and it was not pretty. One missed keystroke, a typo, or a couple of transposed letters and you spent hours trying to find the error. We are pretty much used to this install-and-'Run Now' process today, but it has not always been the case. I believe it is the main reason Windows won the client wars late last century and early this one. And it won the server wars of the last century as well: OS/2 or Novell NetWare, anyone? Ever heard of them?

Since the start of this century, Linux has taken the larger part of the network server market, mostly due to its smaller footprint and lighter hardware requirements. But give credit to companies like Red Hat for adding the install-and-run process to their suites. It drives down the total cost of ownership and enables web application platforms to compete and take advantage of the internet era.

Those are a few examples of third-party documentation and install experiences, but let me know your ideas on the subject.

In a subsequent blog I will dive into documentation practices that we should embrace and avoid in our vertical software development.

Vertical Software Development

Many of us develop software for vertical markets such as banking, insurance, energy, retail, health care, lending, and so on. Unlike Facebook, Google, Microsoft, and Apple, our markets are smaller, but in many ways more demanding, because we cater to a client base whose processes and systems include legacy technology and are far more entrenched.

major players

I mention these heavyweight companies because they are successful and provide many of the resources, tools, software stacks, and hardware that we rely upon. Not to mention, they also provide products and services we need to support. We can learn a lot from them and should take advantage of all their resources. But our software requirements are different, our users are different, and we have a lot more interaction with legacy systems, processes, and data.

Not to downplay the excellence that is Facebook, but the internet, while maturing, is still relatively young, especially compared to the industries we support. Implementing a revelation such as Facebook is, from a software standpoint, much easier than the software you create and maintain on a daily basis. We can learn a lot from Facebook (and others) about performance, scaling, deployment, versioning, and upgrading. Their sheer volumes are enormous, and it is rare that anyone complains about response times or service. I know I can usually blame my ISP for any outage or performance issue.

industry logos

Our industries have processes and systems that have been in place for decades, and they drive very important aspects of a company's business or a municipality's services. Not all of those processes are automated. Many are manual and follow paradigms that are not easily translated into software solutions (or they would have been automated already). Happy paths are easily explained and are usually the first aspects translated to software. But the exception paths are many, are not easily identified, and are obscured by the many transitions from exception back to happy path.

Those transitions can pass through many hands, from one department to another, and can involve research, confirmation, and in some cases authorization. While the happy path constitutes 90% of business as usual, it is the remaining 10% that costs our industries the most. From a widget-counting standpoint, 90% sounds really good (and it is), but it does not address the major costs and pain points that our clients really want and need us to address.

integration

Integrations are a major concern for our software, and they are critical if you want to compete for new accounts. It is not likely your software can replace every system a client uses, so your ability to integrate with their existing systems can make the difference for your sales team. You don't necessarily have to include the integrations out of the box, but you must have extensibility built in to reduce the implementation timeline and costs. Being able to show a simple integration, like importing a spreadsheet, during a demo has been very effective in the past, especially if you can illustrate the same capability as an automated process. A rough sketch of what I mean follows.
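As a minimal sketch of that kind of demo-friendly import, here is a small Python example. The file name, column names, and validation rules are hypothetical, not from any product I have worked on; the point is that the same routine can be run by hand during a demo or kicked off by a scheduler to show the automated version of the capability.

    import csv
    from pathlib import Path

    def import_accounts(csv_path):
        """Load account records from a client-supplied spreadsheet export (CSV).

        Returns the cleaned rows plus a list of (line_number, error) tuples,
        so a demo can show both the happy path and the exceptions.
        """
        records, errors = [], []
        # utf-8-sig tolerates the byte-order mark Excel often adds to CSV exports.
        with Path(csv_path).open(newline="", encoding="utf-8-sig") as f:
            reader = csv.DictReader(f)
            for line_no, row in enumerate(reader, start=2):  # header is line 1
                account_id = (row.get("AccountId") or "").strip()
                balance = (row.get("Balance") or "").strip()
                if not account_id:
                    errors.append((line_no, "missing AccountId"))
                    continue
                try:
                    row["Balance"] = float(balance)
                except ValueError:
                    errors.append((line_no, f"bad Balance: {balance!r}"))
                    continue
                records.append(row)
        return records, errors

    if __name__ == "__main__":
        # Run interactively for a demo, or from cron / Task Scheduler to
        # illustrate the same import as an unattended, automated process.
        recs, errs = import_accounts("accounts.csv")
        print(f"imported {len(recs)} rows, {len(errs)} exceptions")

Nothing fancy, but walking a prospect through the imported rows and the exception list in the same demo makes the extensibility story concrete.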

This is where I want to focus my blog. Developing software for vertical markets and legacy industries is where a lot of us live. I hope you find this useful and will participate by providing your ideas, criticisms, and insights.

Please register and let me know what you think.

Starting a Blog

I've been wanting to start a blog for a very long time. As you can probably see, I created this site in 2013, almost three years ago. I really just want to jot down some of my thoughts, experiences, and ideas related to software development, and hopefully share some mistakes to avoid and some practices that may be useful to future software developers.

EDS Logo

I've been in software development (professionally) since 1988, when I started with EDS (Electronic Data Systems). It wasn't a very exciting environment, but the potential for a young developer was amazing. EDS was an IT provider to many industries, including insurance, banking, and automotive. In the beginning, COBOL, PL/1, and Assembly were my main programming languages, but EDS provided so much more. It gave me access to many of the latest technologies, methodologies, and business education. By the time I left EDS in 1996, I was a full-on OOP developer with a very strong background in data modelling and n-tier software development.

ROI Logo

After EDS, I went to work for ROI (Resource One Incorporated). This was an amazing opportunity, as the company was updating its original software from older COBOL and Clipper applications to a client/server application based on the latest technology stacks of the time. When I joined ROI, it had a total of six employees in two locations, providing performance measurement and incentive software. ROI served the largest banks in North America, along with many smaller credit unions and financial institutions.

The original COBOL application was developed with Micro Focus COBOL and tested on OS/2. The code was uploaded to the bank's IBM mainframe (MVS/z/OS), where it was compiled and run as a series of batch jobs producing report files and CICS screens for online viewing of results. Alongside the server application there was a client application, running on a PC, written with Micro Focus Dialog. Because it too was COBOL, much of the server COBOL source code was reused. The Clipper application was a totally different technology, developed specifically for the PC as a stand-alone application where the printer was the communication device. Originally, the two systems had the same feature sets; by the time I joined the team, they had diverged.

Tyler Technologies Logo

After successfully merging and transforming the ROI applications into Windows clients, .NET services, web pages, and Java web services, I was ready for new challenges. So I tried my hand at consulting and contracting before joining Tyler Technologies as a senior architect. Tyler is an extremely good company with a very bright future. In terms of size, Tyler is definitely larger than ROI but significantly smaller than EDS, and it is very similar in size and technology stack to many of the places where I consulted. However, the management team, developers, and support staff are among the most dedicated, thoughtful, and highest-performing people I have ever had the opportunity to work with.

Frank Image

I will be sharing an expanded view of these experiences in the coming weeks and months, and I hope you find them interesting and thought provoking. I am also very interested in your comments and experiences, so please register and share them with me.

Hello world!

Welcome to FrankHavens.com. I am Frank Havens and I love my profession. Software development has been a part of my life since 1981. My first computer, a Tandy (Radio Shack) Color Computer (CoCo), was a 16K machine with a cassette tape drive, attached to a 19-inch color TV. It included a BASIC interpreter for all my computing needs.

I couldn't wait for the monthly CoCo Magazine, where I would type in that month's program and watch it fail. I'd work out the bugs, make my changes, and voilà, I had a working program. The following month, I'd check the publication's corrections from previous issues and see how my changes stacked up to the editors'. It was quite entertaining, challenging, and rewarding.

Now, I've been developing software for some 20+ years, and the excitement and challenge are at the same fever pitch as when I was a hobbyist. The times have definitely changed, but the goal is still the same: empower people to be self-sufficient by turning something foreign, tedious, and error-prone into a process that is enlightening, automated, and accurate.

I hope this blog will be informative, thought provoking, and fun.  I will enjoy hearing your opinions, so please join me.