RDF as Universal Healthcare language

After many years working on both the VA VistA and DoD CHCS software architectures, I came to the conclusion that we were hitting a wall of complexity that could not be handled with the technologies we were using.  In the mid-1990s, I started wandering around looking for alternatives.  I found two very interesting candidates.

My first discovery was a programmer at CERN in Geneva who was working on a thing called the World Wide Web.  There were 150 web sites already linked, and I'd get emails breathlessly announcing the latest addition to the web.

One of the things I wanted to do with CHCS and VistA (and IHS' RPMS) was to create a "Universal Namespace" - giving a name to every data object handled by any system.  The idea, never implemented, was to prepend the domain name to the FileMan metadata, so that any information could be linked to any other information.  I had some fuzzy thoughts about using FileGrams to do this, and saw the need for some kind of reference language to link it all together.  When I saw Tim's design of the web, I realized that this was what I was trying to do, but done in a really smart way.  He had a certain genius for knowing what had to be right from the start (the design of the URL) and what needed to be "good enough" (HTML, HTTP).  He had a very practical vision of how to make it all scale. He allowed the web to be "broken" - the "404 Not Found" error - in contrast to prior hypertext systems (Doug Engelbart's, Ted Nelson's), which required bi-directional referential integrity.   This vision was not appreciated by all: his first paper describing the Web was rejected by peer reviewers as being "not scalable" and leading to a "bowl of spaghetti."  Here is a video interview I did with Mark Frisse, one of those reviewers.
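The "Universal Namespace" idea can be sketched in a few lines: prepend an owning domain to a local FileMan-style identifier so every data object gets a globally unique, linkable name. All specifics below (the URL scheme, path layout, and file/field numbers) are hypothetical illustrations, not actual VistA or CHCS metadata.

```python
# Sketch of the "Universal Namespace" idea: a domain name prepended to
# local FileMan-style coordinates (file number, field number, record IEN)
# yields a globally unique, URL-style name. Illustrative only.

def universal_name(domain: str, file_no: str, field_no: str, ien: str) -> str:
    """Mint a URL-style global name for one FileMan data element."""
    return f"https://{domain}/fileman/{file_no}/{field_no}/{ien}"

# Two independent systems can now name (and link) each other's data
# without any prior coordination between them:
va_bp  = universal_name("vista.va.gov", "120.5", ".01", "12345")
dod_bp = universal_name("chcs.dod.mil", "120.5", ".01", "67890")

print(va_bp)   # https://vista.va.gov/fileman/120.5/.01/12345
print(dod_bp)  # https://chcs.dod.mil/fileman/120.5/.01/67890
```

This is essentially what the web's URL design delivered: the global uniqueness comes from the domain, so no central registry of names is needed.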

The web is arguably one of the most complex and globally transformational inventions in the history of technology.  But Tim did not try to create Google, Facebook, or Amazon.  Nor did he create a Dewey Decimal system to try to structure the web, nor did he assign experts to standardize travel, auctions, or book sales.  He started with simple initial conditions and let the system evolve - something that is very hard for many to understand.  A complex system requires a complex command and control system, they would say... the more complex the system, the more complex the command and control hierarchy.

A complex information system would require a complex "librarian" function to categorize, standardize, normalize, and edit information, they would say.  But as we have seen with Google, this is not the case.  We can find things with Google that would have been unimaginably complex in the old days of reference librarians and Dewey Decimal systems.

Today, it is trivial to look up a book and paste the URL in an email or Tweet.  There is no need for a standards group to define a meaningful use "book exchange" standard - it is simply an intrinsic part of the information space we call the web.

I have long thought that health IT needs to adopt a similar "information space" architecture - a large-scale, fine-grained network that links all of our information, all the time, from anywhere.  Concurrent with this universality, we also need an equally powerful privacy and security model that can deal at the fine-grained level of individual objects. The simplicity and the regularity of this model - putting the patient at the center of the health care universe - offers a dramatic improvement in our ability to manage the complexity of health IT. 

I held a workshop at the World Wide Web Consortium offices at MIT on Apr 20 <http://www.flickr.com/photos/munnecket/sets/72157633291364002/> (minutes of the meeting coming soon), and decided to continue the discussion at the SemTech conference in San Francisco June 3.

The President’s Council of Advisors on Science and Technology (PCAST) identified the need for a universal healthcare exchange language as a key enabler in addressing this problem by improving healthcare data portability. Many familiar with Semantic Web technology have recognized that RDF / Linked Data would be an excellent candidate to meet this need, for both technical and strategic reasons. Although RDF is not yet well known in conventional healthcare IT, it has been beneficially used in a wide variety of applications over the past ten years -- including medical and biotech applications -- and would exceed all of the requirements outlined in the PCAST report.

RDF offers a practical evolutionary pathway to semantic interoperability. It enables information to be readily linked and exchanged with full semantic fidelity while leveraging existing IT infrastructure investments. Being schema-flexible, RDF allows multiple evolving data models and vocabularies to peacefully co-exist in the same instance data, without loss of semantic fidelity. This enables standardized data models and vocabularies to be used whenever possible, while permitting legacy or specialized models and vocabularies to be semantically linked and used when necessary. It also enables a limitless variety of related information to be semantically linked to patient data, such as genomic, geographic and drug interaction data, enabling more effective treatment, and greater knowledge discovery. Other reasons for adopting RDF as a universal healthcare exchange language include: (a) its ability to make information self-describing with precise semantics; (b) its support for automated inference; and (c) its foundation in open standards.
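The schema flexibility described above follows from RDF's data model: everything is a (subject, predicate, object) triple, and each triple names its own vocabulary, so statements from different models can share one graph. The sketch below uses plain Python tuples to show the idea; the vocabularies and URIs are invented for illustration (a real system would use a library such as rdflib and standard ontologies).

```python
# Minimal sketch of RDF's schema flexibility: (subject, predicate, object)
# triples from different vocabularies co-existing in one graph. All URIs
# below are hypothetical.

patient = "https://example.org/patient/123"

graph = {
    # A statement using a (hypothetical) standardized vocabulary...
    (patient, "https://std.example/vocab#bloodPressure", "120/80"),
    # ...co-exists with the same fact in a legacy model's terms,
    (patient, "https://legacy.example/chcs#BP_READING", "120/80"),
    # ...and with linked non-clinical data (geographic, genomic, etc.).
    (patient, "https://geo.example/vocab#residesIn", "https://geo.example/zip/22042"),
}

# Because every triple names its own predicate, a consumer can select just
# the vocabulary it understands - no prior shared schema is required:
standard_only = {t for t in graph if t[1].startswith("https://std.example/")}
print(len(standard_only))  # 1
```

The key point is that adding the legacy or geographic statements never invalidates the standardized ones; models evolve side by side in the same instance data.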

I think that this is an exciting path forward for VA/DoD sharing, and the logical next technological step for the metadata models begun with VistA and CHCS...

 

 


Comments

RDF as Universal Healthcare language

Stephen Hufnagel

RDF is mandated for the DoD in the following memo … The question is … How can VistA and RDF co-exist?

DoD DCMO Mandates BPMN to RDF and OWL Semantic Technologies

See a Business Perspective YouTube Video at

http://www.youtube.com/watch?v=pkhj2sTPlbk

See a Technical Perspective YouTube Video at

http://www.youtube.com/watch?feature=player_embedded&v=OzW3Gc_yA9A#t=0s

where the DoD DCMO mandate for Semantic Web technologies is discussed

by Dennis Wisnosky <http://gov.aol.com/tag/Dennis+Wisnosky%20/>, DCMO Chief Architect and CTO

APRIL 4, 2011

MEMORANDUM FOR SECRETARIES OF THE MILITARY DEPARTMENTS

CHAIRMAN OF THE JOINT CHIEFS OF STAFF

UNDERSECRETARIES OF DEFENSE

COMMANDER, UNITED STATES TRANSPORTATION COMMAND

ASSISTANT SECRETARY OF DEFENSE FOR NETWORKS AND INFORMATION INTEGRATION/DOD CHIEF INFORMATION OFFICER

DIRECTOR, COST ASSESSMENT AND PROGRAM EVALUATION

DIRECTOR, ADMINISTRATION AND MANAGEMENT

DIRECTOR, NET ASSESSMENT

DIRECTORS OF THE DEFENSE AGENCIES

DIRECTORS OF THE DOD FIELD ACTIVITIES

Subject: Use of End-to-End (E2E) Business Models and Ontology in DoD Business Architecture

DoD historically spends more than $6.0B annually developing and maintaining a portfolio of more than 2,000 business systems and Web services. Many of these systems, and the underlying processes they support, are poorly integrated. It is imperative, especially in today's limited budget environment, to optimize our business processes and the systems that support them to reduce our annual business systems spending.

The Defense Business Systems Management Committee (DBSMC) embraced the E2E business lifecycle model as a viewpoint to frame and understand our business environment. Further, the DBSMC endorsed using and extending the E2E framework to evolve the Business Enterprise Architecture (BEA) within the context of the DoD Enterprise Architecture. This essential framework will be used by the DBSMC and the supporting Investment Review Board (IRB) process to guide and constrain business system investments and conduct business process reengineering determination as required by statute.

This memorandum outlines a new approach to leverage the E2Es defined within the BEA and provides the illumination necessary to achieve the management and interoperability required by statute. The Assistant Deputy Chief Management Officer will issue additional guidance to ensure these approaches are clearly applied to these management efforts. Additional guidance will identify future "time boxed" BEA development activities in consultation with the E2E Governance Council.

The BEA is the Enterprise's blueprint for defining the Department's business environment. It captures required Enterprise capabilities; metrics aligned to the Department's Strategic Management Plan; processes, data standards and rules to permit system and Web service interoperability. It is also a tool for driving portfolio management and business process reengineering. BEA release 8.0 captured and defined the Department's 15 E2E business lifecycle models at a high level.

The BEA will remain aligned to the DoD Architecture Framework (DoDAF), currently DoDAF v 2.0. In order to facilitate integration of the systems and business architecture within the E2E lifecycle models, the BEA will be described in an ontology using a common language {World Wide Web Consortium (W3C) open standards Resource Description Framework (RDF)/Web Ontology Language (OWL)} and modeling notation {Business Process Modeling Notation (BPMN) 2.0 Analytic Conformance Class (Primitives)}. These open standards shall be applied to all subordinate Enterprise solution architectures federated or assessing compliance with the BEA. <Emphasis added>

Effective immediately: (1) the E2E framework shall be used to drive and organize BEA content within the federated BEA ontology; and (2) future releases of the BEA will be synchronized with our highest priority system acquisition and modernization efforts related to critical activities within the Hire-to-Retire and Procure-to-Pay E2E lifecycle models. Initial priorities are as follows:

* Define BEA foundation, organization and structure using the new approach leveraging semantic Web technology.
* Create the BEA ontology architecture with necessary DoD Chief Information Officer and stakeholder coordination.
* Define the relationship of the BEA ontology to the DoD Enterprise Architecture and its applicability to Component and solution architectures.
* Propose a schedule for establishing a federation of ontologies within the BEA.

In coordination with the prioritization and timing described, Component and Agency business architectures shall demonstrate compliance with BEA content via participation and conformance with procedures and processes described herein. The DBSMC will receive quarterly BEA updates via the E2E Governance Council.

My point of contact for this matter is Robert Jennings, robert.jennings@osd.mil, or 703-614-0214.

Signed

Elizabeth A. McGrath
Deputy Chief Management Officer (DCMO) for Department of Defense

Thank you for your help and consideration,

Steve

____________________________________________

Stephen P. Hufnagel PhD, Enterprise Architect and Data-Interoperability Engineer

The Information Applications Group (TIAG) / Deloitte IM-BAS Subcontractor

Deloitte: 2941 Fairview Park Drive #4468, Falls Church, VA 22042 , 571-882-7680-desk

TIAG: 1760 Reston Parkway, Suite 510, Reston, VA 20190

703-437-7878, Shufnagel@tiag.net

PERSONAL: PO Box 8097, Falls Church, VA 22041

703-575-7912-mobile and preferred phone

703-995-0841-eFAX, Hufnagel@acm.org

"It is not the critic who counts, not the man who points out how the strong man stumbled, or where the doer of deeds could have done better. The credit belongs to the man who is actually in the arena; whose face is marred by the dust and sweat and blood; who strives valiantly; who errs and comes short again and again; who knows the great enthusiasms, the great devotions and spends himself in a worthy course; who at the best, knows in the end the triumph of high achievement, and who, at worst, if he fails, at least fails while daring greatly; so that his place shall never be with those cold and timid souls who know neither victory or defeat." Theodore Roosevelt. (Paris Sorbonne, 1910)

From: Apache [mailto:apache@groups.osehra.org] On Behalf Of Tom Munnecke
Sent: Thursday, May 09, 2013 6:24 AM
To: Architecture
Subject: [architecture] RDF as Universal Healthcare language



RDF as Universal Healthcare language

Sam Habiel

Stephen,

George Lilly developed the Fileman Triple Store, which stores RDF graphs in
Fileman using triples.

It's alpha/beta software, but I want to let you know that it has already
been done.

Here is the KIDS package in case you want to install it:
https://trac.opensourcevista.net/svn/fmts/trunk/kids/FILEMAN_TRIPLE_STOR...

Sam

On Thu, May 9, 2013 at 6:31 AM, Stephen Hufnagel <hufnagels@osehra.org> wrote:



We already have Semantic VistA

Tom Munnecke

It is great to see this movement towards RDF.  I think that Conor Dowling is pretty far along with his work on Semantic VistA: http://vista.caregraf.info/ I don't know how far along he is with getting access to the CHCS data dictionaries and test data, but I think he has the IHS RPMS access.

At my MIT Semantic Health workshop Apr 20, Eric Prud'hommeaux of W3C talked about his progress in converting CCDA to RDF format.

I don't think "using RDF" is enough to really tackle the problem, though.  It's a much deeper issue of the underlying information space being represented.  A semantic web approach would use a universal resource identifier (URI) that can address the entire information space... anything can name a link to anything else.  So, rather than having a CONNECT enterprise-level interface between VA and Kaiser, for example (controlled by a 39-page DURSA agreement that only a lawyer could love, and having all of VA trust all of Kaiser for the exchanged data), we could have a truly patient-centric information space which would link data from Kaiser and VA, tracking provenance as metadata attached to the data.  If we wanted to include blood pressure from a WalMart convenience clinic or a Withings iPhone blood pressure app, this data would be linked to the patient's information space as well.  If the patient happened to be a Navy SEAL in some far-off mountain pass, his information wouldn't even be visible in the space.  This gives us a much more powerful security model, basing access on metadata specific to the object we're securing.
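The provenance-based security model described above can be sketched with quads, statements that carry their source as a fourth element (in RDF terms, a named graph), so that access control filters on per-statement metadata rather than on an enterprise-level trust agreement. All URIs and sources below are hypothetical.

```python
# Sketch of a patient-centric information space where each statement carries
# its provenance, and visibility is decided per statement. Illustrative only.

patient = "https://example.org/patient/123"

quads = [
    # (subject, predicate, object, provenance/source)
    (patient, "ex:bloodPressure", "118/76", "https://kaiser.example/clinic"),
    (patient, "ex:bloodPressure", "122/81", "https://va.example/ehr"),
    (patient, "ex:bloodPressure", "125/79", "https://walmart.example/kiosk"),
]

def visible(quads, trusted_sources):
    """Return only the statements whose provenance the viewer trusts."""
    return [q for q in quads if q[3] in trusted_sources]

# A viewer who trusts only institutional sources sees two readings; the
# retail-kiosk reading stays linked in the space but is filtered from view.
clinical_view = visible(quads, {"https://kaiser.example/clinic",
                                "https://va.example/ehr"})
print(len(clinical_view))  # 2
```

The same filter generalizes to the Navy SEAL case: with an empty (or restricted) trusted set for a given viewer, none of the patient's statements are visible at all, without any change to the underlying data.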
