How can I make my system HL7 certified?

I'm developing a health care system, and I'm new to HL7. My question is: how can I make my system HL7 (Health Level 7) certified? What are the main steps I should follow?
Thanks,

You can become a member of the http://www.hl7.org community and ask there. As far as I know, and from what http://en.wikipedia.org/wiki/HL7 says about certification, there is no single worldwide HL7 certification authority right now (although hl7.org offers some kind of conformance testing).
There are many conformance levels, many integration profiles, many vendors, and many legacy systems.
What seems to make the most sense is to clearly declare an IHE Integration Statement (see http://www.ihe.net/Technical_Frameworks for more), as HL7 and DICOM are rather technical tools, implementation details that nobody cares much about (besides programmers).
Integrating the Healthcare Enterprise (IHE)
...Systems developed in accordance with IHE communicate with one another better, are easier to implement, and enable care providers to use information more effectively...
Also, the Googling tip by @sqlab may be very useful. By searching for "HL7 Conformance Statement", "DICOM Conformance Statement", or "IHE Integration Statement" you can find many examples of what other vendors do.
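Purely to illustrate the kind of technical detail such conformance statements pin down (which message types and segments a system actually sends or accepts), here is a minimal Python sketch, using no HL7 library and a made-up ADT^A01 message, that splits an HL7 v2 message into its segments:

```python
# Minimal sketch (plain Python, no HL7 library): split an HL7 v2 message
# into its segments and read the message type. The ADT^A01 message below
# is made up for illustration only.
SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "202401150830||ADT^A01|MSG00001|P|2.5",
    "PID|1||123456^^^HOSP^MR||DOE^JOHN||19700101|M",
    "PV1|1|I|ICU^101^A",
])

def segments(message):
    """Return {segment_id: [list of field lists]} for an HL7 v2 message."""
    result = {}
    for seg in filter(None, message.split("\r")):
        fields = seg.split("|")
        result.setdefault(fields[0], []).append(fields)
    return result

segs = segments(SAMPLE_ADT)
msh = segs["MSH"][0]
# After splitting on "|", index 8 corresponds to MSH-9 (message type),
# because MSH-1 is the field separator character itself.
print("Message type:", msh[8])            # ADT^A01
print("Segments present:", sorted(segs))  # ['MSH', 'PID', 'PV1']
```

A real conformance or integration statement goes much further (required vs. optional fields, acknowledgements, error handling), but it is exactly this level of structural detail that it documents.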

Related

what is the difference between ETSI-M2M standard and OneM2M standard?

I am doing a project on designing and implementing an M2M application using OM2M. From the documentation I found on the internet, I know that OM2M is defined based on the ETSI M2M and oneM2M standards. The similarity between these two standards confuses me a bit. Can anyone tell me what the difference is between the ETSI M2M standard and the oneM2M standard?
Thank you so much!
I will try to help from the standards perspective, not in terms of implementation.
ETSI M2M was developed starting from 2009, and two releases were completed. In the meantime, the need to globalize the solution was identified, so ETSI and its members approached other companies and other standards organizations to build a common project, which is today oneM2M.
It is worth remembering that oneM2M is not a new standards organization; it is simply a shared partnership project among existing organizations to merge their efforts and expertise and provide better specifications.
Technically speaking, the principles are the same: the key resources are still Applications, Containers, and Access Rights (ACP in oneM2M), and the principle of separating the semantic treatment from the platform is still the same.
So, de facto, Release 1 of oneM2M is a sort of "Release 3" of ETSI M2M. But be careful: they are not backward compatible.
To be practical, I would suggest you look directly at Releases 1 and 2 of oneM2M. A lot of improvements have been added by the different partners, making it easier to use.
In particular, Release 2 finalizes the semantic interworking framework to be built around the platform, providing inter-technology interworking and data sharing.
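To make the resource model a bit more concrete, here is a rough Python sketch of the oneM2M HTTP binding (the Mca reference point): creating an AE, a Container, and a ContentInstance. The base URL, path, and originator are assumptions based on an Eclipse OM2M IN-CSE running with its default settings, so treat them as placeholders and adjust them to your own CSE:

```python
# Rough sketch of the oneM2M HTTP binding: an AE (resource type 2) with a
# Container (type 3) holding ContentInstances (type 4). The CSE address,
# path and originator below are assumptions based on a default Eclipse
# OM2M IN-CSE deployment; adjust them to your own setup.
import requests

CSE = "http://127.0.0.1:8080/~/in-cse/in-name"   # assumed OM2M CSE base
HEADERS = {"X-M2M-Origin": "admin:admin",        # assumed default originator
           "X-M2M-RI": "req-001"}                # request identifier

def create(parent_path, resource_type, body):
    """POST a oneM2M resource of the given numeric type under parent_path."""
    headers = dict(HEADERS,
                   **{"Content-Type": "application/json;ty=%d" % resource_type})
    r = requests.post(CSE + parent_path, headers=headers, json=body)
    r.raise_for_status()
    return r.json()

# Application Entity, then a Container under it, then a data sample.
create("", 2, {"m2m:ae": {"rn": "MY_SENSOR", "api": "app.sensor", "rr": False}})
create("/MY_SENSOR", 3, {"m2m:cnt": {"rn": "DATA"}})
create("/MY_SENSOR/DATA", 4, {"m2m:cin": {"con": "22.5"}})
```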
I hope this was useful.
Enrico Scarrone,
Telecom Italia - TIM
ETSI SmartM2M Chairman,
oneM2M Steering Committee Vice Chair

Processes implemented inside the DMS system?

Traditional categorization of processes talks about integration, human-centric, and document-centric processes, with the last being a good candidate for placing inside the DMS system (of course, the prerequisite is that there is built-in support for BPM).
But I was unable to find a concrete, more detailed explanation of the distinction between those options.
Imagine a company that has an enterprise BPM solution and also a DMS with quite good support for BPM (e.g. FileNet).
In both systems you can create user screens as well as workflows (process logic).
Also, most processes working with documents are quite "human-centric".
I am perfectly aware that choosing the target platform always depends on the requirements and specific circumstances, but I wonder whether there are some general rules or principles that could help me decide where to put the process layer of the whole solution.
Additional clarification:
I don't want to implement any new platform. As I indicated in the previous post, we already have a BPM platform (Oracle) and a DMS as well (FileNet with BPM support - Case Foundation). So the question is not about choosing a new platform, but rather about setting rules for using the existing products/platforms. There are a lot of new projects in the queue, and for some of them (those that touch the area of working with documents) we need to decide on the target platform(s). For example, when you have a simple process with a few steps, every step involves work with an existing document (the document, or at least its original version, is also an input to the process), and the requirements on the front end are not very complicated, it would be simpler to build the whole solution on the FileNet platform (mostly because of the cost). But I am wondering whether there are some general rules along the lines of "consider this or that when you want to use only the DMS platform, or both platforms", and so on. You could call these rules development principles, reference architectures, or something like that: something that guides you when designing the target architecture(s).
Thank you
I'm reposting the answer because I don't see a reason for deletion (by @Bohemian).
I think it adds value to anyone asking the same question. @Bohemian could have at least specified why he deleted the post.
Here it goes:
You gave us a rather small amount of information. And what exactly is the question? What do you mean by "where to put the process layer"?
You shouldn't constrain yourself to only those DM systems that claim to have BPM built in. That's marketing speak behind which often lie two half-baked products. You should instead ask which standards-based integration points the system has, so you can integrate effortlessly, and then invest in a best-of-breed DM and the best BPM separately. All-in-one solutions are often too closed, difficult to extend and, above all, they bring vendor lock-in for free.
What are your business requirements, i.e. what do you have to do? Implement BPM inside an organization that already has DM, or not? Do you have some BPM platform already? Do you have any constraints/requirements when choosing either of those (vendor, technology foundation, Gartner quadrant...)?
What are the options you're considering for DM, and which options are you evaluating (if any) as a BPM platform? Have you already settled on IBM, or can you go elsewhere? Is open source an option?
What is your role/responsibility in this project?
EDIT - after the author's clarifications:
I have not worked with Oracle's BPM, but I can tell you that, although Case Foundation is more suited to Case Management, you can develop a complete Process Management solution with it (workflows, tasks, roles, deadlines, in-baskets, etc.).
If you go that path and later come across the business need to allow business users to define their own case templates, take a look at IBM Case Manager, as it builds on top of Case Foundation but also brings additional web UI features (built on IBM Content Navigator) suitable for business users (although, more often than not, it turns out that IT does that job).
A few IBM redbooks about Case & Content management that might help you make an informed decision:
Introducing IBM FileNet Business Process Manager - this is the former name for Case Foundation - the same product, new version.
Advanced Case Management with IBM Case Manager
Customizing and Extending IBM Content Navigator - you'll need this one for customizations, if you decide to go with CF (instead of Oracle).
Building IBM Enterprise Content Management Solutions From End to End - from ingestion to case/process management (contains Case Manager).
I agree with @Robert regarding integration; after all, before version 5.2, FileNet Content Platform Engine was FN Content Engine + FN Process Engine.
The word of advice I can give you is to first document all the features the business requires from BPM. Then do due diligence on both products, noting which of those features each product supports. Then the answer, if not laid out right in front of you, will at least be much easier to reach.
You also have to take into account that IBM is oriented towards IBM BPM (formerly Lombardi) where process management is concerned. The former FN BPM is now being pushed more toward case management (but the two are very similar paradigms).
You should definitely post back about your experience, whichever option you choose.
Good "luck" :)

where does specification by example complement/replace traditional requirements documentation?

I'm trying to understand where SBEs complement or replace traditional requirements documentation. The "levels of requirements" diagram shows three levels of traditional software requirements.
Which of the items below (from the diagram) does SBE replace and which ones does it complement:
Vision and Scope Document
Business Requirements
Use Case Document
User Requirements
Business Rules
Software Requirements Specification
System Requirements
Functional Requirements
Quality Attributes
External Interfaces
Constraints
My naive understanding of SBE would say that SBEs are just an alternative form of the Software Requirements Specification. Is this correct?
BDD and SBE are normally used by Agile teams, who don't focus as much on documentation as traditional software development teams do.
BDD is the art of using examples in conversation to illustrate behaviour. SBE then uses the examples as a way of specifying the behaviour that you decide to address (I always think of it as a subset of BDD, since talking through examples often ends up eliminating scope, discovering uncertainty, or finding different options, none of which end up as specifications).
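To make that concrete, here is a small hypothetical example written as a plain Python test; the free-shipping rule, the Order class, and the thresholds are all made up for illustration. The example that came out of a conversation is captured in Given/When/Then form and doubles as an executable specification:

```python
# Hypothetical "example as specification": the business rule ("orders of
# 100 EUR or more ship for free") came out of a conversation and is
# captured as executable Given/When/Then style tests. The Order class is
# a stand-in for whatever your real domain model looks like.

class Order:
    def __init__(self, total_eur):
        self.total_eur = total_eur

    def shipping_cost(self):
        # Business rule under discussion: free shipping at or above 100 EUR.
        return 0 if self.total_eur >= 100 else 5

def test_orders_of_100_eur_or_more_ship_for_free():
    # Given a customer with an order worth 120 EUR
    order = Order(total_eur=120)
    # When shipping is calculated
    cost = order.shipping_cost()
    # Then no shipping fee is charged
    assert cost == 0

def test_orders_under_100_eur_pay_standard_shipping():
    # Given an order worth 40 EUR
    order = Order(total_eur=40)
    # When shipping is calculated, then the standard 5 EUR fee applies
    assert order.shipping_cost() == 5
```

Tools like Cucumber or SpecFlow let you keep the Given/When/Then wording in plain text and bind it to code, but the idea is the same: the agreed example is the specification.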
There are a couple of things that are hard to do with BDD. One of them is anything which isn't discrete in nature, or which needs to be true at all times throughout the lifetime of the system - non-functionals, quality attributes, constraints, etc. It's hard to talk through examples of these. These continuous aspects of requirements lend themselves better to monitoring, and monitoring is discrete, so BDD can even be used to help manage them.
Since an initial vision is usually created to help the company make money, save money, or protect existing revenue (stopping customers going elsewhere, for instance), you can even come up with examples of how the project will do this. In fact, if you can't, the project is likely to fail anyway. So BDD / SBE can also be used to help complement an initial vision and scope.
Therefore, BDD/SBE can complement all of these documents, and in Agile teams the documents themselves are usually replaced by conversations about the requirements and rules (illustrated by examples), story cards that act as placeholders for those conversations, and perhaps some lightweight capture of those conversations on a wiki.
It is unlikely that any Agile team captures all of their examples up front, as this leads to excessive investment in the requirements and tends to turn the project into a traditional Waterfall/SDLC one instead.
This blog post I wrote about BDD in the Large may also be of interest.

Is there a more modern implementation of CORBA?

I figure that CORBA is considered a legacy technology that just refuses to die. That being said, I'm curious whether there are any known standards out there that are preferred (and are just as platform-independent).
Thoughts? TIA!
Many organizations are moving to web services and the open standards relating to them (HTTP, WS-*) as alternatives to CORBA.
This article provides a comparison of the two technologies and offers some recommendations on when to use which.
If you really care about platform independence and protocol standardization, then the WS-* standards are something to look into.
There is now a state-of-the-art, modern CORBA implementation using C++11: TAOX11. It uses the new IDL-to-C++11 language mapping. For more information, see the TAOX11 website. TAOX11 is supported on a wide range of platforms and compilers.
I have recently tried Google Protocol Buffers; they seem rather similar to CORBA by design (a kind of IDL with a compiler, compact binary messages, etc.). It is probably one of the many possible successors.
Web services are good for the right tasks, but creating and parsing messages takes more time, and text-based messages are bulkier than binary ones. A REST API with JSON looks like a good solution where binary protocols do not fit well.
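As a purely illustrative sketch of that REST/JSON style, here is a self-contained Python example using only the standard library: a tiny JSON-over-HTTP endpoint stands in for what would have been an IDL-defined remote method in CORBA (the service, port, and payload fields are made up):

```python
# Minimal sketch of a JSON/REST call standing in for a CORBA-style remote
# invocation, using only the Python standard library. The service, port and
# payload fields are invented for illustration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
from urllib.request import Request, urlopen

class PriceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # The "remote method": add 20% tax to a net price sent by the client.
        reply = {"gross": round(body["net"] * 1.2, 2)}
        payload = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 8000), PriceHandler)
Thread(target=server.serve_forever, daemon=True).start()

req = Request("http://127.0.0.1:8000/price",
              data=json.dumps({"net": 10.0}).encode(),
              headers={"Content-Type": "application/json"})
print(json.loads(urlopen(req).read()))   # {'gross': 12.0}
server.shutdown()
```

The trade-off mentioned above is visible even here: the JSON text is easy to read and debug, but it is larger and slower to parse than an equivalent binary (CORBA or Protocol Buffers) encoding.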
Ice from ZeroC aims to be a "better CORBA".
Unfortunately, their licensing terms are crap (at least the last time I checked with them), as they do not sell developer licenses but only (roughly) per-installation terms.
It is also offered under the GPL license, if you can live with that.

Do Surgical Teams exist?

Has anybody been part of, or seen, a kind of "Surgical Team" as described in The Mythical Man-Month? Have you heard of somebody actually implementing "Mills' Proposal"?
There is a lot of detail about the various roles in the book itself, but for those who haven't read the book, I found a website and a blog post which give a good summary. I've quoted the roles from the website below:
The Surgical Team
The surgeon is the chief programmer and the el-presidente of the whole team. He produces all the specifications, codes the entire system the team is responsible for, tests it, and drafts its supporting documentation.
The copilot is the surgeon's assistant. His main purpose is to share in the thinking about design issues, serving as a sounding board, as it were. The copilot represents the team in meetings with other teams. He knows the code intimately and serves as insurance in case of disaster to the surgeon.
The toolsmith supports the surgeon and builds specialized utilities and tools as may be required by his surgeon. Each team has its dedicated toolsmith in addition to any central services provided by the rest of the project infrastructure.
The tester is responsible for maintaining test cases for testing the surgeon's work as he writes it. He is both an adversary who devises test cases to measure against the formal specs and an assistant who devises test data to be used in debugging.
The language lawyer, who can serve several surgeons, is a widely consulted specialist who delights in the mastery of the intricacies of the programming languages and the operating systems upon which the software must perform.
The administrator handles money, people, space, and machines. The surgeon is the ultimate boss, with the last word on all these issues, but the day-to-day management of the issues and interfacing with the administrative machinery of the project is the role of a professional administrator. One administrator may serve more than one team.
The editor edits and revises the documentation as drafted or dictated by the surgeon and oversees the mechanics of its production.
The program clerk, trained as a secretary, is responsible for maintaining all the machine-readable and human-readable technical records generated by the team. All the filing and indexing is the responsibility of the program clerk.
The secretaries handle the project correspondence and non-project files.
We did use Brooks' surgical team approach at a startup we set up about 10 years ago. We were five people at the company, plus a few others at the university lab supporting us. The experience was technically great, but it didn't last long for business reasons. :-)
