I have a C function that I want to convert to Lua, but I'm getting strange results out of Lua:
unsigned short crc16(const char* pstrCurrent, int iCount)
{
    unsigned short wCRC = 0;
    int iIndex = 0;
    while (--iCount >= 0)
    {
        wCRC = wCRC ^ ((int)(*pstrCurrent++) << 8);
        printf("WCRC = %u\n", wCRC);
    }
    return (wCRC & 0xFFFF);
}
and here is how I started the Lua:
local function crc16(keyCurrent, byteCount)
    wCRC = 0
    byteIndex = 1
    local crcInput = {}
    while byteCount > 0 do
        print("BYTE COUNT= " .. byteCount)
        wCRC = bit32.bxor(wCRC, bit32.lshift(keyCurrent[byteIndex], 8))
        print("WCRC = " .. wCRC)
        byteCount = byteCount - 1
        byteIndex = byteIndex + 1
    end
end
Yes, I know the C function is incomplete; I just want to compare what's causing the issues.
The WCRC prints in C and Lua show completely different numbers for the same input.
Is my Lua conversion incorrect? It's only my second or third time using Lua, so I'm not quite sure what I'm doing wrong.
***************** UPDATE ********************
So here is the full C and Lua, and a quick little test:
unsigned short crc16(const char* pstrCurrent, int iCount)
{
    unsigned short wCRC = 0;
    int iIndex = 0;
    // Perform the following for each character in the buffer
    while (--iCount >= 0)
    {
        // Get the byte information for the calculation and
        // advance the pointer
        wCRC = wCRC ^ ((int)(*pstrCurrent++) << 8);
        for (iIndex = 0; iIndex < 8; ++iIndex)
        {
            if (wCRC & 0x8000)
            {
                wCRC = (wCRC << 1) ^ 0x1021;
            }
            else
            {
                wCRC = wCRC << 1;
            }
        }
    }
    return (wCRC & 0xFFFF);
}
and the Lua conversion:
function crc16(keyCurrent, iCount)
    wCRC = 0
    byteIndex = 1
    iIndex = 0
    local crcInput = {}
    while iCount >= 1 do
        wCRC = bit32.bxor(wCRC, bit32.lshift(keyCurrent[byteIndex], 8))
        for iIndex = 0, 8 do
            if (bit32.band(wCRC, 0x8000) ~= nil) then
                wCRC = bit32.bxor(bit32.lshift(wCRC, 1), 0x1021)
            else
                wCRC = bit32.lshift(wCRC, 1)
            end
        end
        iCount = iCount - 1
        byteIndex = byteIndex + 1
    end
    return (bit32.band(wCRC, 0xFFFF))
end
local dKey = {8, 210, 59, 0, 18, 166, 254, 117}
print("CRC = " .. crc16(dKey, 8))
In C, for the same array I get: CRC16 = 567
In Lua, I get: CRC = 61471
Can someone tell me what I'm doing wrong?
Thanks
It seems they yield the same results (testing the shortened versions above):
pure-C
WCRC = 18432
WCRC = 11520
WCRC = 16640
WCRC = 11520
pure-Lua
BYTE COUNT= 4
WCRC = 18432
BYTE COUNT= 3
WCRC = 11520
BYTE COUNT= 2
WCRC = 16640
BYTE COUNT= 1
WCRC = 11520
There are mistakes in the updated Lua sample:
1. bit32.band() returns a number, and the number 0 is not equal to nil; those are totally different types. Since band() never returns nil, the check bit32.band(wCRC, 0x8000) ~= nil is always true, so the XOR branch is always taken. Compare against 0 instead.
2. for iIndex=0,8 do iterates 9 times, including the final index 8, whereas the C loop runs 8 times. It should be for iIndex=0,7 do.
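With both fixes applied, the inner loop would look like this (an untested sketch; the rest of the function is unchanged):
for iIndex = 0, 7 do                       -- 8 iterations, like the C loop
    if bit32.band(wCRC, 0x8000) ~= 0 then  -- compare against 0, not nil
        wCRC = bit32.bxor(bit32.lshift(wCRC, 1), 0x1021)
    else
        wCRC = bit32.lshift(wCRC, 1)
    end
end
With these two changes the Lua version should return 567 for the test key, matching the C output.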
I'm currently using Dart with gsheets_api, which doesn't seem to have a function to convert column letters to numbers (column index).
As an example, this is what I use with Apps Script (input: column index number, output: column letter):
function Column_Nu_to_Letter(column_nu)
{
    var temp, letter = '';
    while (column_nu > 0)
    {
        temp = (column_nu - 1) % 26;
        letter = String.fromCharCode(temp + 65) + letter;
        column_nu = (column_nu - temp - 1) / 26;
    }
    return letter;
}
This is the code I came up with for Dart. It works, but I'm sure there is a more elegant or correct way to do it.
String colLetter = 'L'; // Column 'L' as an example
int c = "A".codeUnitAt(0);
int end = "Z".codeUnitAt(0);
int counter = 1;
while (c <= end) {
  //print(String.fromCharCode(c));
  if (colLetter == String.fromCharCode(c)) {
    print('Conversion $colLetter = $counter');
  }
  counter++;
  c++;
}
// this outputs: L = 12
Do you have any suggestions on how to improve this code?
First we need to agree on the meaning of the letters.
I believe the traditional approach is "A" is 1, "Z" is 26, "AA" is 27, "AZ" is 52, "BA" is 53, etc.
Then I'd probably go with something like these functions for converting:
int lettersToIndex(String letters) {
  var result = 0;
  for (var i = 0; i < letters.length; i++) {
    result = result * 26 + (letters.codeUnitAt(i) & 0x1f);
  }
  return result;
}

String indexToLetters(int index) {
  if (index <= 0) throw RangeError.range(index, 1, null, "index");
  const _letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
  if (index < 27) return _letters[index - 1];
  var letters = <String>[];
  do {
    index -= 1;
    letters.add(_letters[index.remainder(26)]);
    index ~/= 26;
  } while (index > 0);
  return letters.reversed.join("");
}
The former function doesn't validate that the input only contains letters, but it works correctly for strings containing only letters (and it ignores case as a bonus).
The latter does check that the index is greater than zero.
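For a quick sanity check of the round trip (a hypothetical main using the two functions above):
void main() {
  print(lettersToIndex('A'));  // 1
  print(lettersToIndex('Z'));  // 26
  print(lettersToIndex('AA')); // 27
  print(indexToLetters(27));   // AA
  print(indexToLetters(52));   // AZ
}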
A simplified version based on Irn's answer:
int lettersToIndex(String letters) =>
    letters.codeUnits.fold(0, (v, e) => v * 26 + (e & 0x1f));

String indexToLetters(int index) {
  var letters = '';
  do {
    index -= 1; // shift to 0-based first, so 26 ('Z') maps to 25 rather than 0
    final r = index % 26;
    letters = '${String.fromCharCode(65 + r)}$letters';
    index ~/= 26;
  } while (index > 0);
  return letters;
}
I reviewed these lines of code, which convert a hex string to bytes, but I can't understand what this line is doing: (hexchar_to_int(str[i]) << 4) | hexchar_to_int(str[i+1]). I'm frustrated by the bit manipulation here: an 8-bit unsigned char is shifted left by 4 bits, but I don't understand the purpose of doing so.
#include <stdexcept>
#include <string_view>
#include <vector>

unsigned char hexchar_to_int(char const ch)
{
    if (ch >= '0' && ch <= '9') return ch - '0';
    if (ch >= 'A' && ch <= 'F') return ch - 'A' + 10;
    if (ch >= 'a' && ch <= 'f') return ch - 'a' + 10;
    throw std::invalid_argument("Invalid hexadecimal character");
}

std::vector<unsigned char> hexstr_to_bytes(std::string_view str)
{
    std::vector<unsigned char> result;
    for (size_t i = 0; i < str.size(); i += 2)
    {
        result.push_back((hexchar_to_int(str[i]) << 4) | hexchar_to_int(str[i + 1]));
    }
    return result;
}
<< 4 is *2^4, i.e. *16; it shifts left, not right. Each hex character encodes 4 bits (a nibble), so the shift moves the first character into the high nibble of the byte, and the | then adds the second character into the low nibble, placing the two hex digits at their correct positions.
See 0xab = 10*16 + 11 = 171
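For example, assuming the two functions above are in scope, a small test could look like this:
#include <cstdio>

int main() {
    // "2af9" -> (2 << 4) | 10 = 42, then (15 << 4) | 9 = 249
    for (unsigned char b : hexstr_to_bytes("2af9"))
        std::printf("%d ", b); // prints: 42 249
}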
I'm trying to make a toString method that prints out a histogram showing how often each character of the alphabet is used in a string. The most frequent character has to be 60 #s long, with the rest of the characters scaled to match.
My issue is with the equation that scales the rest of the letters to the correct length for the histogram. My current equation is (myArray[i] / max) * 60, but I'm getting really weird results.
If I put in "hello world" to be analyzed, l would be the most commonly occurring letter, seen 3 times. So l should have 60 #s in the histogram, h should have 20, o should have 40, etc. Instead I'm getting results like:
d : 10
e : 10
h : 10
l : 360
o : 20
r : 10
w : 10
Sorry for how sloppy this is right now; I'm just trying to figure out what's going on.
public class LetterCounter
{
    private static int[] alphabetArray;
    private static String input;

    /**
     * Constructor for objects of class LetterCounter
     */
    public LetterCounter()
    {
        alphabetArray = new int[26];
    }

    public void countLetters(String input) {
        this.input = input;
        this.input.toLowerCase();
        //String s= input;
        //s.toLowerCase();
        for (int i = 0; i < input.length(); i++) {
            char ch = input.charAt(i);
            if (ch >= 97 && ch <= 122) {
                alphabetArray[ch - 'a']++;
            }
        }
    }

    public void getTotalCount() {
        for (int i = 0; i < alphabetArray.length; i++) {
            if (alphabetArray[i] >= 0) {
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
    }

    public void reset() {
        for (int i = 0; i < alphabetArray.length; i++) {
            if (alphabetArray[i] >= 0) {
                alphabetArray[i] = 0;
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
    }

    public String toString() {
        String s = "";
        int max = alphabetArray[0];
        int markCounter = 0;
        for (int i = 0; i < alphabetArray.length; i++) {
            //finds the largest number of occurrences for any letter in the string
            if (alphabetArray[i] > max) {
                max = alphabetArray[i];
            }
        }
        for (int i = 0; i < alphabetArray.length; i++) {
            //trying to scale the rest of the characters down here
            if (alphabetArray[i] > 0) {
                markCounter = (alphabetArray[i] / max) * 60;
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i] + markCounter);
            }
        }
        for (int i = 0; i < alphabetArray.length; i++) {
            //prints the whole alphabet, total number of occurrences for all chars
            if (alphabetArray[i] >= 0) {
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
        return s;
    }
}
There are many, many problems with your code, but let's go one by one.
First of all, your print statement is simply misleading. Change it to
System.out.println(ch + " : " + alphabetArray[i] + " " + markCounter);
and you will see
d : 1 0
e : 1 0
h : 1 0
l : 3 60
o : 2 0
r : 1 0
w : 1 0
As you can see, the counters are correct (1, 1, 1, 3, 2, 1, 1), but your scaling doesn't work:
1 / 3 --> 0 in integer division, and 0 * 60 is still 0
3 / 3 --> 1, and 1 * 60 is 60
and of course, without a space in the print, 1 and 0 run together as 10, and 3 and 60 as 360.
Thus, to get correct scaling, do the multiplication before the division:
markCounter = alphabetArray[i] * 60 / max;
(With that change, l scales to 60, o to 40, and the single-occurrence letters to 20, as expected.)
Other things worth mentioning:
You are overriding toString(), so you should put @Override in front of that method.
toLowerCase() returns a new string in lower case; calling it without assigning the result back to your string just throws away the lowercasing.
toString() shouldn't print to the console. The whole idea is that you put all the information into the string that you return. In other words, in the end you do something like System.out.println(someLetterCounter.toString()).
Your code is extremely low-level. You don't have to iterate arrays using for (int i = ...); you can do for (int letter : alphabetArray) instead.
You might want to read about Map. If you used a Map<Character, Integer>, where the key represents the different characters and the value represents a counter for each character, you could throw out most of your code and come up with a solution that needs only a few lines (see the sketch below).
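A rough sketch of that idea (the class and method names here are made up):
import java.util.HashMap;
import java.util.Map;

public class LetterCounterMap {
    public static Map<Character, Integer> countLetters(String input) {
        Map<Character, Integer> counts = new HashMap<>();
        for (char ch : input.toLowerCase().toCharArray()) {
            if (ch >= 'a' && ch <= 'z') {
                counts.merge(ch, 1, Integer::sum); // insert 1, or add 1 to the existing count
            }
        }
        return counts;
    }
}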
(And seriously: because of all these issues, debugging your code was much harder than it needed to be.)
countLetters seems to have some issues. You cannot convert a String to lower case by just calling
this.input.toLowerCase();
because String is immutable in Java. You have to assign the result:
this.input = input.toLowerCase();
Another problem is that you are using the input variable from the parameter instead of this.input, which holds the lower-cased string. You can write countLetters this way to make it work:
public void countLetters(String input) {
    this.input = input.toLowerCase();
    for (int i = 0; i < this.input.length(); i++) {
        char ch = this.input.charAt(i);
        if (ch >= 97 && ch <= 122) {
            alphabetArray[ch - 'a']++;
        }
    }
}
I'm trying to parse GPX files and to output encoded polylines (Google's polyline algorithm).
test.gpx:
<trkseg>
  <trkpt lon="-120.2" lat="38.5"/>
  <trkpt lon="-120.95" lat="40.7"/>
  <trkpt lon="-126.453" lat="43.252"/>
</trkseg>
I managed most of it, but I'm having trouble with encoding the numbers.
gpx2epl:
file = io.open(arg[1], "r")
io.input(file)

function round(number, precision)
    return math.floor(number * math.pow(10, precision) + 0.5) / math.pow(10, precision)
end

function encodeNumber(number)
    return number
end

local Olatitude = 0
local Olongitude = 0

while true do
    local line = io.read()
    if line == nil then
        break
    end
    if string.match(line, "trkpt") then
        local latitude
        local longitude
        local encnum
        latitude = string.match(line, 'lat="(.-)"')
        longitude = string.match(line, 'lon="(.-)"')
        latitude = round(latitude, 5) * 100000
        longitude = round(longitude, 5) * 100000
        encnum = encodeNumber(latitude - Olatitude)
        print(encnum)
        encnum = encodeNumber(longitude - Olongitude)
        print(encnum)
        Olatitude = latitude
        Olongitude = longitude
    end
end
This script produces the expected output (see: Google Link), except that the latitude and longitude deltas are not yet encoded:
3850000
-12020000
220000
-75000
255200
-550300
MapQuest provides an implementation in JavaScript:
function encodeNumber(num) {
    var num = num << 1;
    if (num < 0) {
        num = ~(num);
    }
    var encoded = '';
    while (num >= 0x20) {
        encoded += String.fromCharCode((0x20 | (num & 0x1f)) + 63);
        num >>= 5;
    }
    encoded += String.fromCharCode(num + 63);
    return encoded;
}
Can this be done in Lua? Can somebody please help me out? I have no idea how to implement this in Lua.
Edit:
Based on Doug's advice, I did:
function encodeNumber(number)
    local num = number
    num = num * 2
    if num < 0 then
        num = (num * -1) - 1
    end
    while num >= 32 do
        local num2 = 32 + (num % 32) + 63
        print(string.char(num2))
        num = num / 32
    end
    print(string.char(num + 63) .. "\n-----")
end
encodeNumber(3850000)   -- expected: _p~iF
encodeNumber(-12020000) -- expected: ~ps|U
encodeNumber(220000)    -- expected: _ulL
encodeNumber(-75000)    -- expected: nnqC
encodeNumber(255200)    -- expected: _mqN
encodeNumber(-550300)   -- expected: vxq`@
It's near the expected output, but only near... Any hint?
Taking encodeNumber piecemeal...
var num = num << 1;
This is just num = num * 2.
num = ~(num);
This is num = (-num) - 1.
0x20 | (num & 0x1f)
is equivalent to 32 + (num % 32).
num >>= 5
is equivalent to num = math.floor(num / 32).
ADDENDUM
To concatenate the characters, use a table to collect them:
function encodeNumber(number)
    local num = number
    num = num * 2
    if num < 0 then
        num = (num * -1) - 1
    end
    local t = {}
    while num >= 32 do
        local num2 = 32 + (num % 32) + 63
        table.insert(t, string.char(num2))
        num = math.floor(num / 32) -- use floor to keep the integer portion only
    end
    table.insert(t, string.char(num + 63))
    return table.concat(t)
end
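Called on the deltas from above, this version should now produce the expected chunks in one piece:
print(encodeNumber(3850000))   -- _p~iF
print(encodeNumber(-12020000)) -- ~ps|U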
As part of a reverse engineering exercise, I'm trying to write a Z3 solver to find a username and password that satisfy the program below. This is especially tough because the z3py tutorial that everyone refers to (rise4fun) is down.
#include <iostream>
#include <string>
using namespace std;

int main() {
    string name, pass;
    cout << "Name: ";
    cin >> name;
    cout << "Pass: ";
    cin >> pass;
    int sum = 0;
    for (size_t i = 0; i < name.size(); i++) {
        char c = name[i];
        if (c < 'A') {
            cout << "Lose: char is less than A" << endl;
            return 1;
        }
        if (c > 'Z') {
            sum += c - 32;
        } else {
            sum += c;
        }
    }
    int r1 = 0x5678 ^ sum;
    int r2 = 0;
    for (size_t i = 0; i < pass.size(); i++) {
        char c = pass[i];
        c -= 48;
        r2 *= 10;
        r2 += c;
    }
    r2 ^= 0x1234;
    cout << "r1: " << r1 << endl;
    cout << "r2: " << r2 << endl;
    if (r1 == r2) {
        cout << "Win" << endl;
    } else {
        cout << "Lose: r1 and r2 don't match" << endl;
    }
}
I got that code from the assembly of a binary, and while it may be wrong I want to focus on writing the solver. I'm starting with the first part, just calculating r1, and this is what I have:
from z3 import *

s = Solver()
sum = Int('sum')
name = Array('name', IntSort(), IntSort())
for c in name:
    s.add(c < 65)
    if c > 90:
        sum += c - 32
    else:
        sum += c
r1 = Xor(sum, 0x5678)
print s.check()
print s.model()
All I'm asserting is that there are no letters less than 'A' in the array, so I expect to get back an array of any size that has numbers greater than 65.
Obviously this is completely wrong, mainly because it loops forever. Also, I'm not sure I'm calculating sum correctly, because I don't know if it's initialized to 0. Could someone help me figure out how to get this first loop working?
EDIT: I was able to get a z3 script that is close to the C++ code shown above:
from z3 import *

s = Solver()
sum = 0
name = Array('name', BitVecSort(32), BitVecSort(32))
i = Int('i')
for i in xrange(0, 1):
    s.add(name[i] >= 65)
    s.add(name[i] < 127)
    if name[i] > 90:
        sum += name[i] - 32
    else:
        sum += name[i]
r1 = sum ^ 0x5678

passwd = Array('passwd', BitVecSort(32), BitVecSort(32))
r2 = 0
for i in xrange(0, 5):
    s.add(passwd[i] < 127)
    s.add(passwd[i] >= 48)
    c = passwd[i] - 48
    r2 *= 10
    r2 += c
r2 ^= 0x1234

s.add(r1 == r2)
print s.check()
print s.model()
This code was able to give me a correct username and password. However, I hardcoded the lengths: one for the username and five for the password. How would I change the script so I wouldn't have to hardcode the lengths? And how would I generate a different solution each time I run the program?
Arrays in Z3 do not necessarily have any bounds. In this case the index-sort is Int, which means unbounded integers (not machine integers). Consequently, for c in name will run forever because it enumerates name[0], name[1], name[2], ...
It seems that you actually have a bound in the original program (name.size()), so it would suffice to enumerate up to that limit. Otherwise you might need a quantifier, e.g., \forall x of Int sort . name[x] < 65. This comes with all the warnings about quantifiers, of course (see e.g., the Z3 Guide)
Suppose the length is to be determined. Here is what I think you could do:
length = Int('length')
x = Int('x')
s.add(ForAll(x,Implies(And(x>=0,x<length),And(passwd[x] < 127,passwd[x] >=48))))
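As for getting a different solution on each run: a common trick (sketch only, reusing the names from the script above) is to add a "blocking" constraint that rules out the model you already found, then solve again:
m = s.model()
# forbid the exact password we just found; the next check() must find another
s.add(Or([passwd[i] != m.eval(passwd[i]) for i in range(5)]))
print s.check()
print s.model()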