I am trying to save some tables of strings to files in Torch. I have tried using this Torch extension by DeepMind: torch-hdf5.
require 'hdf5'
label = {'a', 'b','c','d'}
local myFile = hdf5.open(features_repo .. 't.h5', 'w')
myFile:write('label', label)
myFile:close()
I am getting the error:
/home/user/torch/install/bin/luajit: ...e/user/torch/install/share/lua/5.1/hdf5/group.lua:222: torch-hdf5: writing data of type string is not supported
Torch Tensors are written to file as intended.
I have also tried using matio to write .mat files (for MATLAB). I am getting this error:
bad argument #1 to 'varCreate' (cannot convert 'number' to 'const char *')
The error occurs because "label" is a table of strings, while the function HDF5Group:_writeData expects some form of tensor.
Looking at ffi.lua, it seems that "tensor" is a typedef for "integer", so you could try replacing:
label = {'a', 'b','c','d'}
with
label = {1,2,3,4}
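For example, a minimal sketch along those lines (assuming numeric stand-ins for the labels are acceptable, and wrapping them in a torch.Tensor since tensors are what torch-hdf5 writes; the output path here is just a placeholder):

-- minimal sketch: write numeric stand-ins for the labels as a Tensor
require 'hdf5'

local label = torch.Tensor({1, 2, 3, 4})   -- 1..4 stand in for 'a'..'d'
local myFile = hdf5.open('t.h5', 'w')      -- placeholder path
myFile:write('label', label)
myFile:close()

You would then need to keep a separate mapping from the numbers back to the original strings.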
Alternatively, you can use the function t2s from this module (https://github.com/aryajur/tableUtils.git) to generate a string you can save to a file. To convert it back, use the function s2t.
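For example (a rough sketch, assuming the module is on package.path and loads under the name tableUtils):

local tu = require("tableUtils")            -- assumed module name

local label = {'a', 'b', 'c', 'd'}

-- serialize the table to a string and save it to a plain text file
local out = assert(io.open("label.txt", "w"))
out:write(tu.t2s(label))
out:close()

-- later: read the string back and rebuild the table
local inp = assert(io.open("label.txt", "r"))
local restored = tu.s2t(inp:read("*a"))
inp:close()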
Related
I'm using opencv-python in VS Code. I create two 2-dimensional arrays of the same type and size, but I get the error below when cv2.addWeighted executes. I see other people asking the same question, but their problem is differing data types; I have double-checked the data types in my code, and they are the same (np.uint8).
img01 = np.ones((200, 200), dtype=np.uint8) * 1000
img02 = np.ones((200, 200), dtype=np.uint8) * 100
weightedImg = cv2.addWeighted(img01,0.6,img02,0.4,3)
error: (-5:Bad argument) When the input arrays in add/subtract/multiply/divide functions have different types, the output array type must be explicitly specified in function 'cv::arithm_op'
I would like to know what's wrong with my code. Thanks.
I need to use the OpenCV functions cv2.imencode and cv2.imdecode to compress (JPEG) and decompress (JPEG) with different QF values.
The picture is 'bridge.ppm' from https://imagecompression.info/test_images/
I've tried:
bridge = cv2.imread('./bridge.ppm')
bridge_en = cv2.imencode('.jpeg', bridge)
bridge_de = cv2.imdecode('.jpeg', bridge_en)
cv2.imshow('image',bridge_de)
but I'm getting an error in the 2nd line saying: "Expected Ptr<cv::UMat> for argument 'buf'".
Also, how can I change and test different QF values?
Please take a look at the documentation for imencode and imdecode.
imencode returns two values; the encoded buffer is the second one. imdecode accepts the encoded buffer and a flag. So:
bridge = cv2.imread('./bridge.ppm')
bridge_en = cv2.imencode('.jpeg', bridge)[1] # you need the second value
bridge_de = cv2.imdecode(bridge_en, cv2.IMREAD_UNCHANGED) # or any other flag, same as 'imread'
cv2.imshow('image',bridge_de)
I have a file I opened as binary like this: local dem = io.open("testdem.dem", "rb")
I can read strings from it just fine: print(dem:read(8)) -> HL2DEMO. However, after that there is a 4-byte little-endian integer and a 4-byte float (the docs for the file format don't specify endianness for the float, but since they don't say little-endian as they do for the integer, I'll have to assume big-endian).
These cannot be read out with read.
I am new to the LuaJIT FFI and am not sure how to read them out. Frankly, I find the documentation on this specific aspect of the FFI to be underwhelming, although I'm just a Lua programmer and don't have much experience with C. One thing I have tried is creating a cdata, but I don't think I understand it:
local dem = io.open("testdem.dem", "rb")
print(dem:read(8))
local cd = ffi.new("int", 4)
ffi.copy(cd, dem:read(4), 4)
print(tostring(cd))
--[[Output
HL2DEMO
luajit: bad argument #1 to 'copy' (cannot convert 'int' to 'void *')
]]--
Summary:
Goal: Read integers and floats from binary data.
Expected output: A Lua number (integer or float) I can then convert to a string.
string.unpack does this for Lua 5.3, but there are some alternatives for LuaJIT as well. For example, see this answer (and other answers to the same question).
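For instance, one way with the LuaJIT FFI (a sketch, assuming the values are little-endian and match the machine's native byte order) is to copy the raw bytes into a small typed array; ffi.copy needs a pointer or array rather than a scalar cdata, which is why the original attempt failed:

-- sketch: read a 4-byte integer and a 4-byte float after the 8-byte magic
local ffi = require("ffi")

local dem = assert(io.open("testdem.dem", "rb"))
print(dem:read(8))                       -- the magic string

local int_buf = ffi.new("int32_t[1]")    -- array, so ffi.copy gets a pointer
ffi.copy(int_buf, dem:read(4), 4)
print(int_buf[0])                        -- the integer as a Lua number

local float_buf = ffi.new("float[1]")
ffi.copy(float_buf, dem:read(4), 4)
print(float_buf[0])                      -- the float as a Lua number

dem:close()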
I have a database stored in a file. Inside that file I have something like:
DB_A = ...
DB_B = ...
.
.
.
DB_N = ...
I would like to parse the data and group it in Lua code like this:
data={}
-- the result after parsing a file
data={
["DB_A"] = {...},
["DB_B"] = {...},
.
.
.
["DB_N"] = {...}
}
In other words, is it possible to create tables inside a table dynamically and assign a key to each of them without knowing the key names in advance (that is something I can only figure out after parsing the data from the database)?
(Just as a note, I am using Lua 5.3.5; also, I apologize that my code resembles C more than Lua!)
Iterating through your input file line by line, which can be done with the Lua FILE*'s lines method, you can use string.match to grab the information you are looking for from each line.
#!/usr/bin/lua

local PATTERN = "(%S+)%s?=%s?(%S+)"

local function eprintf(fmt, ...)
    io.stderr:write(string.format(fmt, ...))
    return
end

local function printf(fmt, ...)
    io.stdout:write(string.format(fmt, ...))
    return
end

local function make_table_from_file(filename)
    local input = assert(io.open(filename, "r"))
    local data = {}
    for line in input:lines() do
        local key, value = string.match(line, PATTERN)
        data[key] = value
    end
    return data
end

local function main(argc, argv)
    if (argc < 1) then
        eprintf("Filename expected from command line\n")
        os.exit(1)
    end
    local data = make_table_from_file(argv[1])
    for k, v in pairs(data) do
        printf("data[%s] = %s\n", k, data[k])
    end
    return 0
end

main(#arg, arg)
The variable declared at the top of the file, PATTERN, is your capture pattern to be used by string.match. If you are unfamiliar with how Lua's pattern matching works, this pattern looks for a series of non-space characters, an optional space, an equal sign, another optional space, and then another series of non-space characters. The two series of non-space characters are the two captures, key and value, returned by string.match in the function make_table_from_file.
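For instance, run against a line shaped like your sample input (the value 42 here is made up), the pattern produces the two captures:

-- illustration only; "DB_A = 42" is a made-up line in the DB_X = ... shape
local key, value = string.match("DB_A = 42", "(%S+)%s?=%s?(%S+)")
print(key, value)   -- DB_A    42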
The functions eprintf and printf are my Lua versions of C-style formatted output functions. The former writes to standard error, io.stderr in Lua; and the latter writes to standard output, io.stdout in Lua.
In your question, you give a sample of your expected output. Within your table data, you want the keys to correspond to tables as values. Based on the sample input text you provided, I assume the data contained within these tables is whatever comes to the right of the equal signs in the input file, which you represent with .... As I do not know what exactly those ...s represent, I cannot give you a solid example of how to separate that right-hand data into a table. Depending on what you are looking to do, you could take the second value returned by string.match, which I called value, and further separate it using Lua's string pattern matching. It could look something like this:
...
local function make_table_from_value(val)
    -- Split `val` into distinct elements to form a table with `some_pattern`
    return {string.match(val, some_pattern)}
end

local function make_table_from_file(filename)
    local input = assert(io.open(filename, "r"))
    local data = {}
    for line in input:lines() do
        local key, value = string.match(line, PATTERN)
        data[key] = make_table_from_value(value)
    end
    return data
end
...
In make_table_from_value, string.match will return some number of captures, based on whatever string pattern you provide as its second argument, and enclosing the function call in curly braces collects them into a table. The resulting table uses numerical indices as keys (rather than strings or some other data type), starting from 1.
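For instance, if each value happened to contain three comma-separated numbers (a purely hypothetical format, since I don't know what your ...s hold), the captures would fill the table directly:

-- hypothetical: a value such as "10,20,30"
local function make_table_from_value(val)
    -- each (%d+) capture becomes one element of the resulting table
    return {string.match(val, "(%d+),(%d+),(%d+)")}
end

local t = make_table_from_value("10,20,30")
-- t[1] == "10", t[2] == "20", t[3] == "30"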
I have a torch.FloatTensor with 2 rows and 200 columns. I want to print its rows to a text file. Is there a toString() method for torch.FloatTensor? If not, how do you print a row to a file? Thanks.
I can convert the FloatTensor into a Lua table:
local line = a[1]
local table = {}
for i=1,line:size(1) do
table[i] = line[i]
end
Is there an easy way to convert the Lua table to a string, so I can write it to file? Thanks!
There is a built-in table conversion called torch.totable. If what you want to do is save and load your tensor, it's even easier to use Torch's native serialization, e.g. torch.save('file.txt', tensor, 'ascii').
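For example (a sketch, assuming tensor is the 2x200 FloatTensor from the question and lines.txt is a placeholder output path):

-- convert to a plain Lua table; a 2x200 tensor becomes two nested row tables
local rows = torch.totable(tensor)

local f = assert(io.open("lines.txt", "w"))
for i = 1, #rows do
    f:write(table.concat(rows[i], " "), "\n")   -- one space-separated row per line
end
f:close()

-- or serialize the tensor itself in a human-readable ASCII format
torch.save("tensor.txt", tensor, "ascii")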