HAPI includes a parser setup that can parse messages of a specific HL7 version into an alternate version's model:
HapiContext context = new DefaultHapiContext();
// force every message, whatever its declared version, into the 2.5 model
CanonicalModelClassFactory mcf = new CanonicalModelClassFactory("2.5");
context.setModelClassFactory(mcf);
PipeParser parser = context.getPipeParser();
parser.getParserConfiguration().setIdGenerator(new InMemoryIDGenerator());
context.getParserConfiguration().setValidating(false);

ADT_AXX axx = null;
try {
    axx = (ADT_AXX) parser.parse(message.toString());
}
catch (HL7Exception e) {
    log.warn("Exception parsing to AXX");
    e.printStackTrace();
}
In version 2.31, attending doctors are carried in PV1.7-9. In 2.5 there is a ROL segment which holds this information. My issue is that the HAPI parser does not seem to parse PV1.7-9 into ROL segments. I don't think this is the correct behaviour. Any guidance would be appreciated.
In case others stumble across this, the most straightforward answer is that 2.5 also has a PV1.7 field, so the most sensible thing is for the parser to map the PV1.7 data from version 2.31 to the same field in version 2.5 rather than (as I had assumed) to ROL segments. And this is exactly what it does.
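A minimal sketch of reading the attending doctor from the canonical 2.5 message with HAPI's Terser (this snippet is illustrative, not from the original post; check the path against your own message structure):

import ca.uhn.hl7v2.HL7Exception;
import ca.uhn.hl7v2.model.Message;
import ca.uhn.hl7v2.util.Terser;

// Returns PV1-7-1 (the attending doctor's ID number) from the parsed message.
// The "/." prefix lets the Terser find the PV1 segment wherever it sits in the group structure.
static String attendingDoctorId(Message parsed) throws HL7Exception {
    Terser terser = new Terser(parsed);
    return terser.get("/.PV1-7-1");
}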
I am trying to add instrumentation (e.g. logging some information) to methods in a Java file. I am using the following Rascal code, which mostly seems to work:
import ParseTree;
import lang::java::\syntax::Java15;
// .. more imports
// project is a loc
M3 model = createM3FromEclipseProject(project);
set[loc] projectFiles = { file | file <- files(model) };

for (pFile <- projectFiles) {
    CompilationUnit cunit = parse(#CompilationUnit, pFile);
    cUnitNew = visit(cunit) {
        case (MethodBody) `{<BlockStm* post>}`
            => (MethodBody) `{
               'System.out.println(new Throwable().getStackTrace()[0]);
               '<BlockStm* post>
               '}`
    }
    writeFile(pFile, cUnitNew);
}
I am running into two issues regarding whitespace, which may be unrelated to each other.
The line of code that I am inserting does not preserve the whitespace that was there previously: if there was a tab character, it is now removed. The same is true for the line directly following the inserted line and for the closing brace. How can I 'capture' whitespace in my pattern?
Example before transforming (all lines start with a tab character, lines 2 and 3 with two):
	void beforeFirst() throws Exception {
		rowIdx = -1;
		rowSource.beforeFirst();
	}
Example after transforming:
	void beforeFirst() throws Exception {
System.out.println(new Throwable().getStackTrace()[0]);
rowIdx = -1;
		rowSource.beforeFirst();
}
An additional issue regarding whitespace: if a file ends with a newline character, the parse function throws a ParseError without further details. Removing this newline from the original source fixes the issue, but I'd rather not have to 'manually' fix code before parsing. How can I circumvent this issue?
Alas, capturing whitespace with a concrete pattern is not a feature of the current version of Rascal. We used to have it, but now it's back on the TODO list. I can point you to papers about the topic if you are interested. So for now you have to deal with this "damage" later.
You could write a Tree-to-Tree transformation on the generic level (see ParseTree.rsc) to fix indentation issues in a parse tree after your transformation, or to re-insert the comments that you lost. This is about matching the Tree data type and appl constructors. The Tree format is a form of reflection on Rascal's parse trees that allows any kind of transformation, including of whitespace and comments.
The parse error you talked about is caused by not using the start non-terminal. If you use parse(#start[CompilationUnit], ...) then whitespace and comments before and after the CompilationUnit are accepted.
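Applied to the loop above, that would look roughly like this (a sketch; the top field of the start tree unwraps the CompilationUnit):

start[CompilationUnit] sCunit = parse(#start[CompilationUnit], pFile);
CompilationUnit cunit = sCunit.top;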
Is there any way of storing line numbers in the created parse tree, using ANTLR 4? I came across this article, which does it, but I think it is for an older ANTLR version, because
parser.setASTFactory(factory);
does not seem to be applicable to ANTLR 4.
I am thinking of having something like
treenode.getLine()
in the same way that we can have
treenode.getChild()
With ANTLR 4, you normally implement either a listener or a visitor.
Both give you a context from which you can find the location of the tokens.
For example (with a visitor), I want to keep the location of an assignment defined by an uppercase identifier (UCASE_ID in my token definition).
The bit you're interested in is ...
ctx.UCASE_ID().getSymbol().getLine()
The visitor looks like ...
static class TypeAssignmentVisitor extends ASNBaseVisitor<TypeAssignment> {

    @Override
    public TypeAssignment visitTypeAssignment(TypeAssignmentContext ctx) {
        String reference = ctx.UCASE_ID().getText();
        int line = ctx.UCASE_ID().getSymbol().getLine();
        int column = ctx.UCASE_ID().getSymbol().getCharPositionInLine() + 1;
        Type type = ctx.type().accept(new TypeVisitor());
        TypeAssignment typeAssignment = new TypeAssignment();
        typeAssignment.setReference(reference);
        typeAssignment.setReferenceToken(new Token(line, column));
        typeAssignment.setType(type);
        return typeAssignment;
    }
}
I was new to Antlr4 and found this useful to get started with listeners and visitors ...
https://github.com/JakubDziworski/AntlrListenerVisitorComparison/
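If you go the listener route instead, the same information is available from the rule context's start token. A rough sketch, assuming the generated ASNBaseListener for the same grammar (names are illustrative):

static class TypeAssignmentListener extends ASNBaseListener {

    @Override
    public void enterTypeAssignment(TypeAssignmentContext ctx) {
        // getStart() is the first token matched by this rule
        int line = ctx.getStart().getLine();
        int column = ctx.getStart().getCharPositionInLine() + 1;
        System.out.println("typeAssignment starts at " + line + ":" + column);
    }
}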
I am creating a service that aggregates data and will need to be able to read any unknown JSON document. I have the pipeline defined as follows:
private def pipeline = (
  addHeader("Accept", "application/json")
    ~> sendReceive
    ~> unmarshal[JsObject] // Need this to work for JsObject or JsArray
    ~> recover
)
This will work with a JsObject but not a JsArray. If I change it to a JsArray then it will not (of course) work with a JsObject. My recover method returns a JsObject.
I would love to be able to define this as a JsValue or enforce a Root format, but for JsValue I get the following compiler error:
could not find implicit value for evidence parameter of type spray.httpx.unmarshalling.FromResponseUnmarshaller[spray.json.JsValue]
And root formats also produce an error.
I am not sure how to accomplish what I need; any help would be appreciated.
Use Either, Eric! :) If the response will be either a JsObject or a JsArray, then Either is a good solution.
private def pipeline =
  addHeader("Accept", "application/json")
    ~> sendReceive
    ~> unmarshal[Either[JsObject, JsArray]]
    ~> recover
However, beware that unmarshal[Either[JsObject, JsArray]] tries to parse the response as a JsObject first and, if that fails, tries to parse it as a JsArray. This may lead to performance issues.
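If downstream code only needs a JsValue, the Either can then be collapsed; a small illustrative helper (not part of the original answer, names are hypothetical):

import spray.json._

// JsObject and JsArray both extend JsValue, so either branch widens to JsValue.
def toJsValue(parsed: Either[JsObject, JsArray]): JsValue =
  parsed.fold(obj => obj, arr => arr)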
After reviewing @Mustafa's answer, I created the following to avoid the potential performance hit. In the end, I really only need a JSON AST to pass on.
In the simplest terms, I created a function to handle it:
def unmarshalJSON(httpResponse: HttpResponse): JsValue = {
  httpResponse.entity.asString.parseJson
}
and altered the pipeline as shown below:
private def pipeline = {
  addHeader("Accept", "application/json")
    ~> sendReceive
    ~> unmarshalJSON
    ~> recover
}
I would of course want to beef this up a bit for production-level code, but this could be another alternative and allows me to return a JsValue. @Mustafa, I would be interested to hear your thoughts.
I need to download the HTML code of some web page. What is the best way to approach this task? As I understand it, there are very few working web frameworks for Rust right now, and hyper is the one most people use. But after searching its documentation I couldn't find a way. The closest I got is this:
extern crate hyper;

use hyper::Client;

fn main() {
    let client = Client::new();
    let res = client.get("http://www.bloomberg.com/")
        .send()
        .unwrap();

    println!("{:?}", res);
}
But it returns a Response, which doesn't seem to contain the HTML body.
Note: this answer is outdated!
I don't have the time to update this with every hyper release. But please see my answer to a very related question: How can I download a website's content into a string?
It's a bit hidden: the Response type implements the trait Read. One method of Read is read_to_string, which reads everything into a String. That's a simple way to get the body.
extern crate hyper;

use hyper::Client;
use std::io::Read;

fn main() {
    let client = Client::new();
    let mut res = client.get("http://www.bloomberg.com/")
        .send()
        .unwrap();

    let mut body = String::new();
    res.read_to_string(&mut body).expect("failed to read into string");

    println!("{}", body);
}
Currently, rustdoc (the HTML documentation of Rust) is a little misleading, because Rust beginners assume that trait implementations don't add any important functionality. This is not true, so look out for them. However, the hyper documentation could be better...
I am looking for a Java DataOutputStream equivalent for Dart where I can write arbitrary types (int, string, float, byte array, etc.). There is RandomAccessFile, but it does not provide methods for byte arrays or float/double values. ByteArray seems to have some of the necessary functions, but I am not sure how to write it to a file or an OutputStream.
Here is some simple code showing how to write a ByteArray into an OutputStream:
#import('dart:io');
#import('dart:scalarlist');

main() {
  File file = new File("c:\\temp\\foo.txt");
  OutputStream os = file.openOutputStream();
  os.onNoPendingWrites = () {
    print('Finished writing. Closing.');
    os.flush();
    os.close();
  };

  Uint8List byteList = new Uint8List(64);
  ByteArray byteArray = byteList.asByteArray();

  // 72, 101, 108, 108, 111, 0 spell "Hello" plus a terminating 0
  int offset = 0;
  offset = byteArray.setUint8(offset, 72);
  offset = byteArray.setUint8(offset, 101);
  offset = byteArray.setUint8(offset, 108);
  offset = byteArray.setUint8(offset, 108);
  offset = byteArray.setUint8(offset, 111);
  offset = byteArray.setUint8(offset, 0);
  byteArray.setFloat32(offset, 1.0);

  os.write(byteList);
}
This has been around for a while, but I searched and didn't find good DataInput/OutputStream interoperability classes. I wanted a version that works with streams, so I could process files that don't comfortably fit in RAM. So I wrote one.
It's published over at https://pub.dev/packages/jovial_misc in io_streams, or if you prefer, https://github.com/zathras/misc/tree/master/dart/jovial_misc. I made it so it interoperates with java.io.DataInputStream and java.io.DataOutputStream. Code using it looks a little like this:
import 'package:convert/convert.dart';
import 'package:jovial_misc/io_utils.dart';

void main() async {
  final acc = ByteAccumulatorSink();
  final out = DataOutputSink(acc);
  out.writeUTF8('Hello, world.');
  out.close();

  final stream = Stream<List<int>>.fromIterable([acc.bytes]);
  final dis = DataInputStream(stream);
  print(await dis.readUTF8());
  await dis.close();
}
The Stream<List<int>> would of course typically come from a socket, or File.openRead(), etc. There's also a DataInputStream variant that is synchronous and takes an Iterable, if you do have all the byte data available up front.
DataInputStream and DataOutputSink are pretty much the obvious mapping of the java.io classes. The tricky part is the buffer management, since a stream shoves data at you in List<int> instances that probably aren't lined up with the data you want. And, of course, it's necessary to do everything asynchronously.
HTH.
You are essentially asking for arbitrary object serialization. And while the Dart VM has one, it isn't exposed to programmers (it is only used for snapshotting and message passing). I'd say that it would be a mistake to expose it -- in different situations, we have different requirements for serialization and "one true solution" isn't gonna work (Java showed us that already).
For example, I'm working on a MsgPack implementation for Dart, I know that Protobuf port is also in the works, maybe someone will start a Thrift port... the possibilities are endless.
The closest thing I could find is this package: https://github.com/TomCaserta/dart_io/. Unfortunately, there is a bug when reading to the end of the byte array; see my pull request on GitHub.
You could use this class:
https://github.com/TomCaserta/dart_io/blob/master/lib/data_output.dart
Unfortunately (a) it doesn't handle streams; (b) writeLong doesn't take a single integer. I have raised an issue for the Dart SDK: https://github.com/dart-lang/sdk/issues/31166
Edit: I have forked the dart_io package and fixed the two problems described above. My new package is published as dart_data_io:
https://github.com/markmclaren2/dart_data_io
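For simple cases where everything fits in memory, the core libraries alone can also cover this. A minimal sketch using only dart:typed_data and dart:io (illustrative, not part of any of the packages above); ByteData's setters are big-endian by default, matching Java's DataOutputStream:

import 'dart:io';
import 'dart:typed_data';

void main() {
  final data = ByteData(12);
  data.setInt32(0, 42);        // like DataOutputStream.writeInt
  data.setFloat64(4, 3.14159); // like DataOutputStream.writeDouble
  File('out.bin').writeAsBytesSync(data.buffer.asUint8List());
}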