Read a list from a stream using YAP-Prolog

I want to run a (python3) process from my (yap) prolog script and read its output formatted as a list of integers, e.g. [1,2,3,4,5,6].
This is what I do:
process_create(path(python3),
               ['my_script.py', MyParam],
               [stdout(pipe(Out))]),
read(Out, OutputList),
close(Out).
However, it fails at the read/2 call with the error:
PL_unify_term: PL_int64 not supported
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'>
BrokenPipeError: [Errno 32] Broken pipe
I am sure that I can run the process correctly, because with the [stdout(std)] option passed to process_create the program prints [1,2,3,4,5,6] as expected.
The weird thing is that when I change the process to output some constant term (such as constant_term), it still gives the same PL_int64 error. Appending a dot to the process's output ([1,2,3,4,5,6].) doesn't solve the error either. Using read_term/3 gives the same error, and read_string/3 is undefined in YAP-Prolog.
How can I solve this problem?

After asking on the yap-users mailing list I got the solution.
I recompiled YAP Prolog 6.2.2 with the libGMP option and now it works. The same error may also occur in 32-bit YAP.
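For reference, a minimal sketch of the full call once YAP is built with libGMP (my_script.py and MyParam are the placeholders from the question). Note that read/2 expects a complete term, so once the libGMP issue is fixed the script should print the list terminated by a period and a newline, e.g. [1,2,3,4,5,6].

read_list_from_python(MyParam, OutputList) :-
    process_create(path(python3),
                   ['my_script.py', MyParam],
                   [stdout(pipe(Out))]),
    read(Out, OutputList),   % e.g. OutputList = [1,2,3,4,5,6]
    close(Out).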

Related

Neo4j to Gephi import : Failed to invoke procedure Invalid UTF-8

I tried to import my data from Neo4j into Gephi but it doesn't work.
I get the following error in Neo4j:
Failed to invoke procedure apoc.gephi.add: Caused by: com.fasterxml.jackson.core.JsonParseException: Invalid UTF-8 start byte 0xfb at [Source: (apoc.export.util.CountingInputStream); line: 1, column: 136]
As previously mentioned, it looks like Neo4j is not exporting using UTF-8, so I would check how Neo4j is generating the output.
Another possibility is that something went slightly wrong while Neo4j was writing the output.
I ran into this very same problem in the past when managing content in a file concurrently: a thread crashed and did not close the file cleanly. When reviewing the file everything looked perfectly normal, but some characters that are not UTF-8 had been introduced. A tool like Atom can help you spot them.
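If you prefer to locate the offending byte programmatically rather than eyeballing the export, a minimal Java sketch (the file path is passed as a command-line argument; names are made up for illustration) could look like this:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CoderResult;
import java.nio.charset.CodingErrorAction;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class Utf8Check {
    public static void main(String[] args) throws IOException {
        byte[] bytes = Files.readAllBytes(Paths.get(args[0]));
        ByteBuffer in = ByteBuffer.wrap(bytes);
        // Decode strictly: report malformed input instead of silently replacing it.
        CoderResult result = StandardCharsets.UTF_8.newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT)
                .onUnmappableCharacter(CodingErrorAction.REPORT)
                .decode(in, CharBuffer.allocate(bytes.length), true);
        if (result.isError()) {
            System.out.println("Invalid UTF-8 at byte offset " + in.position());
        } else {
            System.out.println("File is valid UTF-8");
        }
    }
}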

Woodstox parser works fine in test run in Eclipse, but fails from command line

One of my JUnit tests uses (behind the scenes) the Woodstox parser.
When I run the test from within Eclipse, the test succeeds as expected.
But running the same test on the command line, using
mvn clean test -Dtest=com.example.MyClassTest#someParserTest
results in the test failing with the following exception messages:
Error on line 114 column 21
SXXP0003: Error reported by XML parser: Invalid UTF-8 middle byte 0x3f (at char #4174, byte #3999)
...
at com.ctc.wstx.io.UTF8Reader.reportInvalidOther(UTF8Reader.java:314)
at com.ctc.wstx.io.UTF8Reader.read(UTF8Reader.java:205)
at com.ctc.wstx.io.ReaderSource.readInto(ReaderSource.java:84)
at com.ctc.wstx.io.BranchingReaderSource.readInto(BranchingReaderSource.java:55)
at com.ctc.wstx.sr.StreamScanner.loadMore(StreamScanner.java:961)
at com.ctc.wstx.sr.BasicStreamReader.readTextSecondary(BasicStreamReader.java:4580)
at com.ctc.wstx.sr.BasicStreamReader.finishToken(BasicStreamReader.java:3657)
at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1063)
at com.ctc.wstx.sax.WstxSAXParser.fireEvents(WstxSAXParser.java:524)
at com.ctc.wstx.sax.WstxSAXParser.parse(WstxSAXParser.java:452)
at net.sf.saxon.event.Sender.sendSAXSource(Sender.java:440)
at net.sf.saxon.event.Sender.send(Sender.java:171)
at net.sf.saxon.jaxp.IdentityTransformer.transform(IdentityTransformer.java:363)
I took a look at the to-be-parsed InputStream. The InputStreams are identical in both cases.
Also, there is no "line 114 column 21" in the InputStream. Line 114 ends on column 11.
How can I investigate what causes the different behavior?
It turned out that a library I used made wrong assumptions about the environment's default character encoding (also called the platform's default charset).
In the Eclipse environment, calling Charset.defaultCharset() returned UTF-8, while in the command line environment it returned CP1252.
Many standard and third-party Java APIs behave differently depending on the platform's default charset, among them:
String.getBytes()
ByteArrayOutputStream.toString()
XMLOutputFactory.createXMLStreamWriter(OutputStream stream)
IOUtils.toString(InputStream input)
To resolve my issue, I had to update that library to explicitly use the correct character set:
String.getBytes(StandardCharsets.UTF_8)
ByteArrayOutputStream.toString(StandardCharsets.UTF_8.name())
XMLOutputFactory.createXMLStreamWriter(OutputStream stream, StandardCharsets.UTF_8.name())
IOUtils.toString(InputStream input, StandardCharsets.UTF_8)
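To make the difference visible, here is a small sketch (the string literal is just an example) showing why the charset-dependent overloads behave differently between the two environments; passing the charset explicitly, as in the calls above, removes the dependence on whatever JVM default happens to be in effect:

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class DefaultCharsetDemo {
    public static void main(String[] args) {
        // UTF-8 under Eclipse, CP1252 from the mvn command line in this case.
        System.out.println("Default charset: " + Charset.defaultCharset());

        String text = "f\u00fcr";                                   // non-ASCII content
        byte[] platformBytes = text.getBytes();                     // depends on the default charset
        byte[] utf8Bytes = text.getBytes(StandardCharsets.UTF_8);   // always UTF-8

        // 3 bytes under CP1252, 4 bytes under UTF-8.
        System.out.println(platformBytes.length + " vs " + utf8Bytes.length);
    }
}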

Lua Compiling Error 'do' expected near '['

I have a Lua file that I decompiled using unluac. When I try to recompile the file without any changes I get the following error:
lua: main.lua:647: 'do' expected near '['
I really do not see the problem here, as the while ... do statement seems to follow the correct format.
The error is on line 647 as stated above.
Source is here:
Full Pastebin Source
Expressions like while {}[1] do and if {}[1].parentFolderName then are invalid because of the {}[1] reference: Lua does not allow indexing a table constructor directly, so it needs to be ({})[1]. This is probably the result of some sort of automated processing, but you should be able to fix it manually.
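A minimal sketch of the difference (the values are placeholders, not the decompiled code):

-- Syntax error: a table constructor cannot be indexed directly.
-- print({}[1])            --> unexpected symbol near '['
-- Valid: parenthesize the constructor before indexing it.
print(({})[1])             --> nil

-- The same applies inside statements:
-- while {}[1] do end      --> 'do' expected near '['
while ({})[1] do end       -- parses fine; the body simply never runs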

Paamayim nekudotayims in PHP 5.2

I can upgrade PHP 5.2 on my server. I have to make this server work with the new TestLink today (the vacation I have planned for tomorrow is in question because of this error). I am stuck with the following error: Paamayim Nekudotayim.
What changes should I make to resolve it?
This link contains the file with the bug.
The Scope Resolution Operator (also called Paamayim Nekudotayim), or in simpler terms the double colon, is a token that allows access to static, constant, and overridden properties or methods of a class.
So maybe somewhere in your code you try to call a static method or property with the wrong operator.
From Wikipedia:
In PHP, the scope resolution operator is also called Paamayim
Nekudotayim (Hebrew: פעמיים נקודתיים‎), which means “double colon” in
Hebrew.
The name "Paamayim Nekudotayim" was introduced in the
Israeli-developed Zend Engine 0.5 used in PHP 3. Although it has been
confusing to many developers who do not speak Hebrew, it is still
being used in PHP 5, as in this sample error message:
$ php -r ::
Parse error: syntax error, unexpected T_PAAMAYIM_NEKUDOTAYIM
As of PHP 5.4, error messages concerning the scope resolution operator
still include this name, but have clarified its meaning somewhat:
$ php -r ::
Parse error: syntax error, unexpected '::' (T_PAAMAYIM_NEKUDOTAYIM)
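One common way to hit this token error on PHP 5.2 specifically, sketched here with made-up names (the real TestLink code is not reproduced): calling a static method through a variable holding the class name, which only became valid syntax in PHP 5.3.

<?php
class Config {
    public static function get($key) {
        return "value-for-$key";
    }
}

// Fine on PHP 5.2 and later: the class name is written out literally.
echo Config::get('db_host');

// Parse error on PHP 5.2 ("unexpected T_PAAMAYIM_NEKUDOTAYIM"):
// calling a static method through a variable class name requires PHP 5.3+.
$class = 'Config';
echo $class::get('db_host');

If that is the cause, either rewriting such calls to use literal class names or upgrading to PHP 5.3+ should make the parse error go away.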

got SIGSEGV while calling 'require ("lsqlite3")' with lua 5.1.5

I built Lua 5.1.5 and lsqlite3-0.8.1; all of them run well on my Red Hat Linux machine.
I then ported them to my MIPS development board. Lua and the other modules (such as luafilesystem, md5, cgilua and wsapi) run well, but lsqlite3 does not work.
When I execute require("lsqlite3") in the Lua command line, it returns the error messages below:
lua
Lua 5.1.5 Copyright (C) 1994-2012 Lua.org, PUC-Rio
require("lsqlite3")
do_page_fault() #2: sending SIGSEGV to lua for invalid read access from
00000000 (epc == 00000000, ra == 2ac36144)
Segmentation fault
Can anyone give me any help to fix it? Thanks!
I have made some progress in solving this problem. I rebuilt Lua with the gcc option '-Wl,-E' and then rebuilt lsqlite3. I executed require("lsqlite3") in the Lua command line, and it didn't print any message. I continued running some other database operation commands and found that they all executed successfully. Since the problem seemed to be solved, I should have been happy about it.
But another, stranger problem arose.
If I put the statement require("lsqlite3") into a file and then execute the file this way:
lua file
it still prints error messages like this:
do_page_fault() #2: sending SIGSEGV to lua for invalid read access from
2ada054c (epc == 2ada054c, ra == 2abdceac)
If I put more database operation statements into a file and run that file with Lua, it gives correct query results and inserts values into the table correctly, but it always prints the error messages shown above.
If I run the statements from the file one by one in the Lua command-line interface, it never prints this error message.
It seems to print the error message when executing the 'require' function. But if I put require("lfs") into a file and run that file with Lua, it never prints an error message.
I am confused about what the difference is between Lua command-line execution and running a Lua script.
There are three places in lsqlite3.c where sqlite_int64 is used (never long long directly). When you build sqlite3, some type will be chosen for 64-bit integers; lsqlite3 will use the same type, because it includes sqlite3.h for the definition of that type.
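For reference, sqlite3.h picks that 64-bit type roughly like this (a simplified excerpt in the spirit of the header; check the one that ships with your build):

/* Simplified: the 64-bit integer type is chosen when sqlite3 is built. */
#ifdef SQLITE_INT64_TYPE
  typedef SQLITE_INT64_TYPE sqlite_int64;
#elif defined(_MSC_VER) || defined(__BORLANDC__)
  typedef __int64 sqlite_int64;
#else
  typedef long long int sqlite_int64;
#endif
typedef sqlite_int64 sqlite3_int64;

So as long as lsqlite3 is compiled against the same sqlite3.h as the libsqlite3 on the board, the 64-bit type should match.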
