I am writing an Opera recovery tool in Delphi, and I am working from existing C++ code:
http://pastebin.com/ViPf0yn6
But I don't understand what DES_KEY_SZ is in that code.
I think it is defined in des.h, but I couldn't find an equivalent des.pas. :(
Can anyone help me, please?
Regards
Here we go: http://freebsd.active-venture.com/FreeBSD-srctree/newsrc/crypto/des/des.h.html
Apparently,
#define DES_KEY_SZ (sizeof(des_cblock))
where
typedef unsigned char des_cblock[8];
I am not a C programmer, but I think that this means that DES_KEY_SZ has the value 8.
Google Code Search finds many copies of des.h, where the DES_KEY_SZ macro is defined. It's the size of a des_cblock, which happens to be an array of eight unsigned chars.
In other words, DES_KEY_SZ = 8.
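If it helps to see that spelled out, here is a minimal C++ check of the relationship (not from the linked code, just an illustration):

// des_cblock is an array of eight unsigned chars, so its size is 8,
// which is exactly what DES_KEY_SZ expands to.
typedef unsigned char des_cblock[8];
#define DES_KEY_SZ (sizeof(des_cblock))
static_assert(DES_KEY_SZ == 8, "a DES key block is 8 bytes");

In a Pascal translation you can therefore simply declare const DES_KEY_SZ = 8;.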
You're going to run into other problems beyond just that missing identifier, though. The code you showed calls a handful of DES functions, too. To decrypt the data, try using DCPCrypt.
Has anyone built a "universal" string class for C++Builder that manages all of the conversions to/from ASCII and Unicode?
I had a vision of a class that would accept AnsiString, UnicodeString, WideString, char*, wchar_t*, std::string, and variant values, and would provide any of those back out. AND the copy constructor has to do a deep copy, not just provide a pointer to the same buffer space (as AnsiString and UnicodeString do).
I figure someone else besides me must have to pass strings to both old interfaces that use char* and new ones that use (wide) strings. If you have built, or know of, something you're willing to share, please let me know. Most of the time it's not too big a deal, until I have to pass a map<std::string, std::string>, then it starts getting ugly.
We do not, and will not, support any internationalization whatsoever, so I don't need to worry about encoding. I just want a class that will return my little ASCII strings in whatever format makes the compiler happy... sanely.
UPDATE: to address the comments:
So, std::map<std::string, std::string> is ugly, because you can't do:
parammap[AnsiString(widekey).c_str()] = AnsiString(widevalue).c_str();
Oh no no no. You have to do this:
AnsiString akey = widekey;
AnsiString aval = widevalue;
parammap[akey.c_str()] = aval.c_str();
The person who originally wrote this code tried to keep it as port-friendly as possible, so he standardized on char* for all of the function calls he wrote (circa 2000, it wasn't a bad assumption). At times I found myself converting everything to char* before realizing that the function then immediately turned around and converted it back to wide. There are multiple interface layers, and it took me a while to figure out how it all fit together.
Add in some creative compiler bugs where the compiler would get confused, especially when pulling string values out of Variants. In some places, I had to do:
String wstr = passedvariant.AsType(varString);
String astr = wstr;
std::string key = astr.c_str();
Then life happened, we ended up starting the port over (for the 3rd time. Don't ask), and I finally got smart and wrapped the low-level library in a layer that does all of the conversions, and retooled the middle layers to deal in Strings, so the application layer can just use String except for that map. For the map<string, string>, I created a function to do the converting, so it was one line in a bunch of places instead of six (the three line conversion above for both key and value).
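Roughly, that helper looks something like this (a simplified sketch, not the exact code; the names and the System.hpp include are illustrative and may differ by C++Builder version):

#include <map>
#include <string>
#include <System.hpp>   // UnicodeString / AnsiString in C++Builder

// Narrow a VCL wide string into a deep-copied std::string.
std::string Narrow(const UnicodeString &w)
{
    AnsiString a = w;                // the VCL performs the wide-to-ANSI conversion
    return std::string(a.c_str());   // std::string makes its own copy of the buffer
}

// One line per entry instead of the multi-line dance for key and value.
void AddParam(std::map<std::string, std::string> &params,
              const UnicodeString &key, const UnicodeString &value)
{
    params[Narrow(key)] = Narrow(value);
}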
Lastly, I wasn't actually asking for anyone to make suggestions on how to make my code better. I was asking if anyone had or knew of a universal string class. Our code is the way it is for reasons, and I'm not rewriting all of it to make it prettier. I just wanted not to have to touch so many lines... again. It would have been so much nicer to have the compiler keep track of which format is needed and convert it.
I have a uint8_t buffer containing, for example, '01:2.7:300:4'. It comes in on an Arduino acting as a transceiver.
I want to split it on the colon (:) using strtok(). However, the first argument of strtok() has to be a char*.
Is there a way to convert the uint8_t data to a char*, or is there another way to parse the input?
I searched Google for the last few hours but have been unable to find a solution.
Thank you
Although it's a bit ugly, you can simply cast the uint8_t* to a char* and strtok will work fine (at least, on all normal platforms, including Arduino):
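For example (a sketch only; the buffer name is made up, and it prints with printf here, on the Arduino you would use Serial.println):

#include <cstring>
#include <cstdint>
#include <cstdio>

int main()
{
    char raw[] = "01:2.7:300:4";                 // stand-in for the received bytes
    uint8_t *packet = (uint8_t *)raw;            // this is what the transceiver hands you

    char *token = strtok((char *)packet, ":");   // the cast is the whole trick
    while (token != NULL) {
        printf("%s\n", token);                   // prints 01, 2.7, 300, 4
        token = strtok(NULL, ":");
    }
    return 0;
}

Note that strtok() modifies the buffer in place, so only do this if you no longer need the original data intact.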
How could I handle operations with a number like:
48534588306961133067968196965257961415756656521818848750723547477673457670019632882524164647651492025728980571833579341743988603191694784406703
Nothing that I've tried has worked so far: unsigned long, long long, etc.
What you need is a library that provides support for operations on integers of arbitrary length. However, from what I've been able to find out, there are no such libraries written in Objective-C.
You are nevertheless in luck as Objective-C is a superset of C. That makes it possible for you to use C libraries such as those described in the answers to this somewhat dated SO question.
Also, since the Clang compiler supports C++ and combining Objective-C and C++ code, you can probably use something like big int.
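As an illustration of that route, here is a sketch using Boost.Multiprecision's cpp_int (a different library than the one linked above, used purely to show the idea; in a Cocoa project the file would need to be compiled as Objective-C++):

#include <boost/multiprecision/cpp_int.hpp>
#include <iostream>

int main()
{
    using boost::multiprecision::cpp_int;

    // Arbitrary-precision integer constructed straight from the decimal string.
    cpp_int n("48534588306961133067968196965257961415756656521818848750723547477673457670019632882524164647651492025728980571833579341743988603191694784406703");

    std::cout << n * 2 << std::endl;   // ordinary operators work; no overflow
    return 0;
}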
Note that none of the built-in types is even close to being big enough to represent numbers with as many digits as your examples. The biggest available integer type is unsigned long long, if you don't need negative numbers, and its size is 8 bytes/64 bits, which gives you a range of 0-18446744073709551615, or 20 digits max.
You could use JKBigInteger instead; it is an Objective-C wrapper around the LibTomMath C library, and it is really easy to use and understand.
In your case:
JKBigInteger *bigInt = [[JKBigInteger alloc] initWithString:@"48534588306961133067968196965257961415756656521818848750723547477673457670019632882524164647651492025728980571833579341743988603191694784406703"];
You can try here: http://gmplib.org/
GMP is a free library for arbitrary precision arithmetic, operating on signed integers, rational numbers, and floating point numbers. There is no practical limit to the precision except the ones implied by the available memory in the machine GMP runs on. GMP has a rich set of functions, and the functions have a regular interface.
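Since Objective-C compiles plain C directly, GMP's C interface can be used as-is from an Objective-C file. A minimal sketch (link with -lgmp; the arithmetic is just an example):

#include <gmp.h>
#include <stdio.h>

int main(void)
{
    mpz_t n;
    mpz_init_set_str(n,
        "48534588306961133067968196965257961415756656521818848750723547477673457670019632882524164647651492025728980571833579341743988603191694784406703",
        10);                      // parse the decimal string into an arbitrary-precision integer

    mpz_mul_ui(n, n, 2);          // e.g. double it; any other arithmetic works the same way
    gmp_printf("%Zd\n", n);       // print the full result

    mpz_clear(n);
    return 0;
}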
How can I round up an extended number in a constant declaration, e.g.:
const
  aaa = 3.14;
  bbb = 3.14; // round this one up
EDIT: using shl, mod, div, shr, or any other operators; I can't get conditionals to work.
Normally one would use trunc() or round(). If that doesn't work for you, I suggest that you try to find a forum that specializes in PaxCompiler or PascalScript. You can't be the first one who wants to do this, and someone more familiar with these compilers might know how to solve the problem.
I tried it in Free Pascal and there it works. IIRC it works in Borland Delphi and Turbo Pascal too (it has been several years since I programmed in Pascal, so I could be wrong).
If none of the above works, you might try putting it in a global variable. Even though global variables are bad, they are sometimes the least bad option.
In Java, when I do the following left shift operation, I get a negative result due to integer/long overflow:
0xAAAAAAAA << 7 gives me -183251938048
But in Lua, since every number is a double (with a 52-bit mantissa), I cannot trigger overflow with a left shift:
bit_lshift(0xAAAAAAAA,7) gives me 1431655680
How do I simulate a 32-bit signed integer in Lua?
You write some C functions that handle this and then export them to Lua.
Though generally, Lua code shouldn't be touching things this low-level.
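For example, a tiny host-side function that does a Java-style 32-bit shift and is registered with the interpreter could look like this (a sketch; the function name and the Lua-side name are made up):

#include <lua.hpp>     // wraps lua.h / lauxlib.h in extern "C" for C++
#include <cstdint>

// lshift32(value, shift) -> value shifted left, wrapped to a signed 32-bit integer
static int l_lshift32(lua_State *L)
{
    int64_t raw = (int64_t)luaL_checknumber(L, 1);
    uint32_t value = (uint32_t)raw;                    // wrap to 32 bits, negative inputs included
    int shift = (int)luaL_checkinteger(L, 2) & 31;     // Java masks the shift count to 5 bits
    int32_t result = (int32_t)(value << shift);        // reinterpret the low 32 bits as signed
    lua_pushnumber(L, result);
    return 1;                                          // one value returned to Lua
}

// somewhere in the host's initialisation code:
//     lua_register(L, "lshift32", l_lshift32);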
You are looking for bit-manipulation libraries in Lua. One such library is BitOp, by the author of LuaJIT; LuaJIT itself ships with it, so nothing needs to be installed there, and you can also install it in standard Lua.
Another library is the bit32 library, which is contained in Lua 5.2.
Both libraries let you manipulate 32-bit numbers. For example with bitop:
local bit = require 'bit'
print(bit.lshift(0xAAAAAAAA, 7)) --> 1431655680
I do not know how you got the negative number, since 1431655680 is what I get by doing (0xAAAAAAAA<<7)&0xFFFFFFFF in C (and also doing that in a "programming calculator").
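In case it helps to see where each number comes from, here is a small C++ comparison (a sketch, not from either code base): Java widens the 32-bit value to 64 bits before the shift, while bitop masks the result back to 32 bits afterwards.

#include <cstdint>
#include <cstdio>

int main()
{
    // Java reads the literal 0xAAAAAAAA as a 32-bit int, i.e. 0xAAAAAAAA - 2^32 = -1431655766,
    // and widens it to 64 bits before the shift:
    long long java_style = (0xAAAAAAAALL - 0x100000000LL) * 128;   // * 128 == << 7
    printf("%lld\n", java_style);                                  // -183251938048

    // bitop keeps only the low 32 bits of the shifted value:
    uint32_t lua_style = (uint32_t)((0xAAAAAAAAULL << 7) & 0xFFFFFFFFULL);
    printf("%u\n", lua_style);                                     // 1431655680
    return 0;
}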
I hope I'm not seen as trolling for saying this, but the best way to simulate Java from Lua would be to use Java from Lua.
If you need to emulate Java, chances are that your Lua is already embedded in it. Just expose Java's binary operations to the Lua program, so it can use them.