Meeting Notes: 2.3.2012

To summarize the meeting, I have attempted to classify our discussion into high-level 'steps.'  Please feel free to comment.

Identifying the ‘collisions’ between the various systems.

  • Conor’s tool provides a method to analyze the variations between modules and fields across systems.  Universal, only requires FMQL.
  • Diff analysis too granular to be appropriate at this point; high level identification of variations.
  • May be additional tools of value; will follow up at VA lockdown.
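In spirit, the kind of high-level collision detection described above could look like the sketch below: compare per-file field sets already extracted from two systems (e.g. via FMQL) and report only the files that differ. The data structures, names, and sample schema fragments are illustrative assumptions, not Conor's actual tool.

```python
# A sketch of high-level "collision" detection between two VistA systems,
# assuming each system's schema has already been extracted (e.g. via FMQL)
# into a mapping of file name -> set of field names. Everything here
# (structure, names, sample data) is illustrative, not the actual tool.

def find_collisions(system_a, system_b):
    """Return files whose field sets differ between the two systems."""
    collisions = {}
    for fname in sorted(set(system_a) | set(system_b)):
        fields_a = system_a.get(fname, set())
        fields_b = system_b.get(fname, set())
        if fields_a != fields_b:
            collisions[fname] = {
                "only_in_a": sorted(fields_a - fields_b),
                "only_in_b": sorted(fields_b - fields_a),
            }
    return collisions

# Made-up schema fragments standing in for extracted FMQL schema output:
foia = {"PATIENT": {"NAME", "DOB", "SSN"}, "ORDER": {"STATUS"}}
vendor = {"PATIENT": {"NAME", "DOB", "SSN", "LOCAL ID"}, "ORDER": {"STATUS"}}
print(find_collisions(foia, vendor))
```

Working at the file/field level like this keeps the report readable, which is the point of deferring line-level diffs.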

Prioritizing the resolution of these collisions.

  • What module do we start with?  Fileman has been indicated, though George suggested an evidence-based approach to identifying priority targets.
  • Do we work on a high-complexity but critical component, or target relatively easy modifications early in the process?  Should identify a relatively simple element to establish process, but big wins demonstrate the value of the process.
  • Is GT.M compatibility harmonization something the community wants to target early?

Identifying the appropriate method of resolution.

  • How do we govern the resolution of discrepancies?  While many modifications may not require harmonization, how do we pick the ‘best’ solution?  We need every organization represented at this point.
  • Review Module Discrepancy -> Review Field Discrepancy -> Review Function Discrepancy.
  • Begin with notes about what is most likely the cause of the modification.  How do we forensically review what each patch has done to a specific system, thus differentiating between modifications made by a vendor and a VA patch?
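One concrete starting point for that forensic review is sketched below, assuming routines follow the common VistA convention of carrying their applied-patch list in the second line. Comparing patch sets first separates "behind on patches" from "locally modified at the same patch level." The header parsing and triage labels are simplified assumptions for illustration.

```python
import re

# A sketch of one forensic starting point, assuming routines follow the
# common VistA convention of listing applied patches in the second line:
#   ;;8.0;KERNEL;**1,3,15**;Jul 10, 1995
# Comparing patch sets first separates "behind on patches" (an age
# difference) from "locally modified at the same patch level" (a fork).
# The header format and triage labels are simplified assumptions.

def patch_set(routine_text):
    """Extract the set of applied patch numbers from a routine's second line."""
    second = routine_text.splitlines()[1]
    m = re.search(r"\*\*([\d,\s]+)\*\*", second)
    return {int(p) for p in m.group(1).split(",") if p.strip()} if m else set()

def classify(foia_routine, vendor_routine):
    """Rough triage of a routine-level difference between two systems."""
    if patch_set(foia_routine) != patch_set(vendor_routine):
        return "age"       # different patch levels applied
    if foia_routine != vendor_routine:
        return "fork"      # same patches applied, yet the code differs
    return "identical"

# Made-up routine headers for illustration:
foia = "XUP ;SFISC/RWF\n ;;8.0;KERNEL;**1,3,15**;Jul 10, 1995\n Q"
behind = "XUP ;SFISC/RWF\n ;;8.0;KERNEL;**1,3**;Jul 10, 1995\n Q"
forked = "XUP ;SFISC/RWF ;local mod\n ;;8.0;KERNEL;**1,3,15**;Jul 10, 1995\n Q"
```

A "fork" result is the signal that a vendor modification, rather than a missing VA patch, explains the discrepancy.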

Performing the Resolution.

  • What is our development environment?  WorldVistA latest release in beta, can be valuable.  Functional system required as a component of development.  ‘Reference’ System.
  • Consider NIST test scripts and populated data sets.

Testing the modifications.

  • Functional system must be used in testing.  How do we contribute test cases?  What test cases are available in the community today?  How do we perform code reviews?

Publishing the Modifications.

  • Will OSEHRA begin publishing a patch stream to harmonize installations across vendors?  How do we distribute the improvements to each vendor?

Next Steps:

Representation:

  • DSS, WorldVistA represented.  Fabian Lopez key POC for DSS.  Dave Whitten and George Lilly key POCs for WorldVistA.  Will follow up with Edmund Billings and Howard Hayes for Medsphere and IHS points of contact respectively.  For VA, need to find a key POC.  Can bring up in the lockdown.
  • Lutheran, Midland, and Kern could benefit from representation.  Will channel through Medsphere follow-up.

Meeting Schedule:

  • Moved to overlap with Wednesday’s (2/15) usual technical demonstration due to lockdown.
  • Fill out survey of availability to establish meeting schedule going forward.

Comments



Matthew,

just to put in something before tomorrow's call - I think the need to
distinguish substantive changes (forks) from differences due to age (out-of-date builds) came out in the last call too. Forks are what matter.

Just to add to this: "Fork reasons" need to be identified (GT.M
portability, VA-isms/MPI removal, Other ...). If we agree on a framework
like this tomorrow then merging becomes doable. I think it's far too
arduous if you literally start with code diffs where every distinction
becomes important, no matter how insignificant in reality.
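The "fork reasons" framework might be operationalized as a first-pass tagger over the changed lines of a forked routine. The keyword heuristics below (e.g. treating `$ZTRAP` edits as GT.M portability work) are made-up illustrations of the idea, not a vetted rule set.

```python
# A first-pass tagger for the "fork reasons" idea: given the changed lines
# of a routine known to be forked, guess why it was forked so portability
# edits can be triaged apart from substantive changes. The keywords and
# categories below are illustrative assumptions, not a vetted rule set.

FORK_REASONS = [
    ("GT.M portability", ("$ZTRAP", "GT.M", "GTM")),
    ("VA-ism/MPI removal", ("MPI",)),
]

def fork_reason(changed_lines):
    """Return the first matching reason label, or "Other"."""
    text = "\n".join(changed_lines).upper()
    for label, keywords in FORK_REASONS:
        if any(keyword in text for keyword in keywords):
            return label
    return "Other"
```

With agreed-upon categories like these, a merge effort can skip whole classes of trivial differences and focus review time on the "Other" bucket.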

In this vein, I did update the Analytics Wiki on OSEHRA (http://www.osehra.org/wiki/vista-analytics-foia-vs-other-vistas), and it points to "Fork and Age" reports for WorldVistA and OpenVistA:
- Forks'n'Age WorldVistA vs FOIA: http://www.caregraf.org/fmql/reports/packagesFOIAvsWVFOAnalytics.html
- Forks'n'Age OpenVistA vs FOIA: http://www.caregraf.org/fmql/reports/packagesFOIAvsOVFOAnalytics.html

as the wiki says "OpenVistA has many more forks than WorldVistA (out of 160
FOIA packages, 40 forked vs 23). Even where both forked, OpenVistA made
more changes (see list of changed routines in the Kernel). If a merge aims
to account for changes for GT.M then WorldVistA is probably sufficient; if
the goal is to merge "VistA in the private-sector" and VA VistA then the
more substantial OpenVistA changes need analysis.".

Conor




Hi Conor;

Good point; I was attempting, in my notes, to lay out a high-level sequencing of what we talked about as we collectively examine this process.  I agree, and this may belong under *Identifying the appropriate method of resolution.*

I understood the group's conclusion to be a 'top down' sort of approach: working at module-level discrepancies, then drilling into field and function level.  As you get into those weeds, discrepancies in patch levels would probably be where you look first, then classifying variations by their function; 'diffing' is probably too granular to start with.  Maybe a first step would be to use your tools to look at these variations in functions and data from a higher level, noting patch discrepancies and classifying high-level changes into functions, which can then be reconciled with each other based on what they are trying to accomplish.  I would expect diffing to be used for deeper dives beyond your reports if needed, and in the actual reconciliation of functionality as required.

In any event, we were planning to start tomorrow's call with a review of some VA lockdown conclusions, since they're relevant to our topic, then open it up.  Please feel free to bring this topic to the table.

