Video of conversation with Ralph Johnson, Ward Cunningham, and Tom Munnecke about refactoring VistA

Here is a video of an after-dinner conversation I had last Friday night at Ward Cunningham's home in Portland, OR, with Ralph Johnson, software refactoring guru.

Ward Cunningham, best known as the inventor of the wiki, invited me to his home for dinner last Friday night. Ralph Johnson, a world-class leader in object oriented programming technology, pattern languages, and refactoring, happened to be his house guest. The after dinner conversation turned to a spirited discussion about how to refactor the VA VistA Electronic Health Record system, so I turned on my iPhone to record the discussion.

Ward Cunningham is also well known for his contributions to the developing practice of object-oriented programming, in particular the use of pattern languages and (with Kent Beck) CRC (Class-Responsibility Collaboration) cards. He is also a significant contributor to the Extreme Programming (Agile) software development methodology. 

Ralph E. Johnson is a Research Associate Professor in the Department of Computer Science at the University of Illinois at Urbana-Champaign. He is a co-author of the influential computer science textbook Design Patterns: Elements of Reusable Object-Oriented Software.

Tom Munnecke was one of the original software architects of what is now known as VistA, the VA's electronic health record, as well as CHCS, a similar system for US Department of Defense hospitals world-wide.






Conrad Clyburn:


Thank you for the fantastic video.




Roy Gaber:

Thanks Tom.

My two cents follow:

I think one of the biggest issues with the attempts to "modernize" VistA is that those doing so tend to lose sight of the simple nature of VistA, it seems that people want to add layers of complexity to a basically simple structure.

There have been numerous attempts to "refresh" certain aspects of VistA and move them into "state of the art" domains. They all failed miserably, and most, if not all, required access to the back-end data anyway; they tended to be considerably slower and less user-friendly than the "roll and scroll" environments the users had grown accustomed to.

VA developed a means of integrating these "state of the art" domains with the back-end data a long time ago: the RPC Broker. This tool gave software developers the means to create nice front-ends to VistA, notably CPRS, BCMA, etc.  It requires knowledge of the back-end M code in order to effectively develop a new user interface. Oftentimes there is no RPC for the desired functionality, so the developer needs to understand M and how it relates to VistA, and must write the M code to mine the data for presentation to the user.
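To make the dispatch idea concrete, here is a small sketch in Python (my own illustration, not the RPC Broker's actual wire protocol or API): a front-end asks the broker for a named RPC, the broker runs the registered server-side handler (M code in real VistA), and a missing RPC means server-side work before the GUI feature can exist. The RPC name and the stub handler are purely illustrative.

```python
# Hypothetical registry standing in for VistA's table of published RPCs.
RPC_REGISTRY = {}

def rpc(name):
    """Register a handler, the way an RPC is published to the broker."""
    def register(handler):
        RPC_REGISTRY[name] = handler
        return handler
    return register

@rpc("ORWPT SELECT")  # illustrative name only
def select_patient(dfn):
    # In VistA this would be M code reading FileMan globals; here it's a stub.
    return {"dfn": dfn, "name": "PATIENT,TEST"}

def broker_call(name, *params):
    """What a GUI client conceptually does: call an RPC by name."""
    handler = RPC_REGISTRY.get(name)
    if handler is None:
        # The situation described above: no RPC exists, so someone who
        # knows M must write one before the front-end feature is possible.
        raise LookupError(f"No RPC named {name!r}; M-side work required")
    return handler(*params)
```

The point of the sketch is the failure branch: the front-end developer's reach ends at the registry, and anything not already published as an RPC pushes the work down into M.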

I think one of the reasons people want to tear apart the fabric of VistA and "modernize" it is that they do not understand it.  They are classically trained Java developers, or C# developers, etc., and have little or no understanding of how easy M coding really is, or how straightforward access to the VistA databases is via a conduit like the RPC Broker. 

There are merits in the desire to modernize VistA, but until the development community as a whole takes the time to learn how it works, I fear we will continue to see the failures we have witnessed in the past.

An excerpt from the video stands out in my mind: "If it ain't broke, don't fix it."



Check out the new VHA's Health Management Platform

Peter Li:

Hi Roy,

It is not necessary to tear apart the fabric of VistA to modernize it.  The new VHA Health Management Platform is an example where Java and M-system developers can work together to create an enterprise-level web application.  There is also a PowerPoint presentation by Marcia Pickard that provides some detail on this architecture.  By the way, Tom posted a positive comment on this new platform.  



Roy Gaber:

Exactly my point.  There were/are many who would rather build a new VistA as opposed to interfacing into it.



Why the unnecessary layers?

Rob Tweed:

My problem with this is all those unnecessary layers, in particular those necessitated by the use of <sigh/> Java.  It all means more moving parts, more complexity, more cost, more development time, more maintenance overhead.

Web UI integration really does not need to be so complex, as we keep demonstrating in recent WorldVistA projects such as VistACom.

Two steps forward, one step back I'm afraid.





Roy Gaber:

Too bad the VA is not adopting your tool, Rob.  I guess Steve O. and I can continue to plug away at trying...



Yes, IEHR seems to be stealing defeat from the jaws of victory.

Tom Munnecke:

I guess what is so striking about this situation to me is that we have the perfect "lessons learned" opportunity, an A/B comparison between two very different extremes about how to design health information systems.  The same DHCP code base forked into VistA and CHCS in the early 1990's.  VistA went on to become one of the most successful and widely used EHR systems in the world, and CHCS went on to become AHLTA, a candidate for one of the worst Federal IT fiascos of all time.

VistA was a decentralized, patient-centric approach, developed in extremely close cooperation with VA's clinical staff, at one point engaging 50,000 VA employees to develop and shape the evolution of the system.

AHLTA was a centralized, top down enterprise approach that was developed by outside contractors, based on top-down requirements "waterfall" approaches, engaging virtually none of the eventual users in the design of the system.

It would appear that the IEHR is trying to "integrate" these two approaches by throwing away everything that has been successful in the past, and replacing it with the approaches that have been failing for the past 3-4 decades. 








thanks for all the back channel replies

Tom Munnecke:

I get a lot of back channel messages from my postings on OSEHRA... thanks for your encouragement, and I understand the sensitivity of expressing yourselves publicly.  IEHR has provoked more cynicism and doubt than I've ever seen before, and given AHLTA's history, that's really saying something.

I have to say that the level of complexity of the systems we are dealing with is increasing at an ENORMOUS and compounding rate, and accelerating as we speak.  Accelerating acceleration is called a "jerk" - and this is what we are dealing with.  We have basic code-level software issues, but they are compounded by privacy and security, genomics, personalization of care, health care funding, political transitions, FDA/HHS/ONC changes, social networks, ubiquitous computing, mHealth, eHealth, and the risks of pandemics, bioterrorism, MRSA, and the health care system bankrupting the US economy.

I guess I get concerned when I see gigabucks being poured into reengineering 1995-level VistA technology - if everything goes according to plan, we'll have a new "integrated" system that replicates what VistA provided in 1995.  It's as if someone threw the train into reverse.

Regardless, I'm going to be thinking about the future... I'll be at the O'Reilly Health Foo Camp next month in Cambridge, which will be a fun place to bring up the above topics...





Steve Steffensen:

So who is the "jerk" again?  :)

Great dialogue here and I'm following with intense interest!  Love the historical perspective and share many of the concerns expressed in this thread.




Requirement complexity v architectural complexity

Rob Tweed:

I think the thing that's always missed is that whilst the requirements of an application might be complex and growing in complexity, there's no need for the architectural design to be complex.  Unfortunately the IT industry seems to always respond to complex requirements with complex design, and it's something I continually get annoyed about.

The problem with complex architectural design is that the application requirements get lost in the layers and stacks.  My view has always been that even the most complex of applications should be capable of being understood by (and therefore maintained by) a single person, and, as an application tools designer, that's the maxim I work to.  Ultimately it's about being able to see the wood for the trees.  It's about being able to see the big picture behind a mass of detail.  It's about seeing each detail as a specific example of a more general case.  Once you do that, you can abstract the detail up a level...and ideally you keep repeating the process as you see the patterns emerging out of the detail.  You end up with a system that delivers whatever level of complex requirements you throw at it, yet it's essentially easy to understand and therefore maintain.
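Rob's "abstract the detail up a level" can be sketched in a few lines. This is my own toy illustration (not his actual tooling): several hand-written, near-identical UI behaviors collapse into one general routine plus declarative detail, so adding the next field is data, not code. All the field names are made up for the example.

```python
# Before abstraction: each display field hand-coded (imagine dozens):
#   def show_pulse(ui, v): ui["pulse"] = f"Pulse: {v} bpm"
#   def show_temp(ui, v):  ui["temp"]  = f"Temp: {v} F"
# Each one is "a specific example of a more general case".

# After abstraction: the general case is one routine; the specifics
# become declarative configuration that a single person can scan.
FIELDS = {
    "pulse": ("Pulse", "bpm"),
    "temp":  ("Temp", "F"),
    "bp":    ("BP", "mmHg"),
}

def show(ui, field, value):
    """Render any configured field; new fields need only a FIELDS entry."""
    label, unit = FIELDS[field]
    ui[field] = f"{label}: {value} {unit}"

ui = {}
show(ui, "pulse", 72)
show(ui, "bp", "120/80")
```

The maintainability payoff is exactly the one argued for here: complexity of requirements grows by adding rows to the table, while the code a maintainer must understand stays one small function.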

A good example is the kind of advanced web and mobile web UIs that I'm helping folks put onto VistA in the WorldVistA community.  The complexity of behaviour is becoming mind-boggling and the stuff that goes on behind the scenes is staggering - way beyond the kind of work we thought was complex even 10 years ago. However, the application developer doesn't need to be aware of any of that - he can focus on the complexity of his application requirements whilst specifying how the UI should behave at an almost trivial level of description.

I think that's behind the differences you're describing, Tom.  And you can see from those architectural diagrams I was bemoaning the reverse thinking: the assumption that complex systems demand complex stacks and layers of stuff, resulting in a system that no one person can possibly understand or maintain.  And as a result it becomes impossible to implement those increasingly complex application requirements.

You'll notice I keep mentioning "maintain".  It's the other critical thing that application "architects" (I use the term with a certain degree of irony) fail to "get": industry figures suggest that just 5% of the lifetime cost of an application is coding, but at least 67% is downstream maintenance.  So if your development requires thousands of man-hours, God help you when it comes to maintenance.  It's all about maintainability, which means it's all about abstracting complexity and keeping your application architecture as simple as possible.




A practical example of the success of the VistA model

Clayton Curtis:

And, as a contrast to the historical example of the consequences of the DHCP-CHCS fork, consider the 28-year success of IHS's use of VistA in its RPMS guise.  (And the unfortunate side effect of VA =NOT= being able to take advantage of innovation in IHS as a consequence of failing to stay with the successful open source model that existed in the 80's and very early 90's.)


You are very right, Clayton...

Tom Munnecke:

Clayton, I still remember all of our conversations and your many insightful comments back when you were with the IHS in Tucson.  I think that it was great to have you looking at things from the IHS perspective, and it just seemed like such a wonderful, practical way to develop systems for both of the agencies. 

I think we validated the open source model.  It seems to me that things started to go awry when folks started centralizing things.  Hierarchy replaced lateral communication, agile development gave way to megaprojects, and support and development of the inner layers of the onion model languished in favor of "warts on the onion" - functions pushed out to the application layer instead of being buried in the inner, shared layers.

I wonder what things would be like today if we could have implemented RuleMan to do rules-processing against the data dictionary, FileMan, and MailMan.  Or Pendex, to track things that don't happen.  Or Universal Namespace, giving every information object in VA, IHS, and DoD a unique, globally accessible name... and who knows what other adventures these would have led to?

I guess what bothers me most about the IEHR trajectory is the trashing of the semantics of 30 years of information and evolutionary growth...