Constant TAO CORBA IOR
How do I configure a TAO CORBA server so that the IOR string generated by object_to_string is constant?
Each time the server restarts, the IOR string generated by object_to_string changes. This is inconvenient, since the client has to refresh its cached server IOR string by reloading the IOR file or querying the NamingService. It would therefore be useful if the server could generate a constant IOR string, no matter how many times it restarts.
My CORBA server is based on ACE+TAO, and I do remember that TAO supports constant IOR strings (the IOR string it generates is the same every time), and that the solution is to add some configuration to the server. But I cannot remember that configuration now.
=============================================
UPDATE:
In ACE_wrappers/TAO/tests/POA/Persistent_ID/server.cpp, I added a new function named testUniqu(), similar to the createPOA method. The updated file content is:
void testUniqu (CORBA::ORB_ptr orb_, PortableServer::POA_ptr poa_)
{
  CORBA::PolicyList policies (2);
  policies.length (2);
  // IOR is the same even if it is SYSTEM_ID
  policies[0] = poa_->create_id_assignment_policy (PortableServer::USER_ID);
  policies[1] = poa_->create_lifespan_policy (PortableServer::PERSISTENT);

  PortableServer::POAManager_var poa_manager = poa_->the_POAManager ();
  PortableServer::POA_ptr child_poa_ =
    poa_->create_POA ("childPOA", poa_manager.in (), policies);

  // Destroy the policies
  for (CORBA::ULong i = 0; i < policies.length (); ++i)
    policies[i]->destroy ();

  test_i *servant = new test_i (orb_, child_poa_);

  // Activate the servant under the persistent child POA ...
  PortableServer::ObjectId_var oid =
    PortableServer::string_to_ObjectId ("xushijie");
  child_poa_->activate_object_with_id (oid, servant);

  // ... but note: the servant is then activated a second time, under the
  // root POA (which is TRANSIENT with SYSTEM_ID), and the IOR printed
  // below comes from that second activation.
  PortableServer::ObjectId_var id = poa_->activate_object (servant);
  CORBA::Object_var object = poa_->id_to_reference (id.in ());
  test_var test = test::_narrow (object.in ());

  CORBA::String_var ior = orb_->object_to_string (test.in ());
  std::cout << ior.in () << std::endl;

  poa_->the_POAManager ()->activate ();
  orb_->run ();
}
int
ACE_TMAIN (int argc, ACE_TCHAR *argv[])
{
  try
    {
      CORBA::ORB_var orb = CORBA::ORB_init (argc, argv);

      int result = parse_args (argc, argv);

      CORBA::Object_var obj =
        orb->resolve_initial_references ("RootPOA");
      PortableServer::POA_var root_poa =
        PortableServer::POA::_narrow (obj.in ());
      PortableServer::POAManager_var poa_manager =
        root_poa->the_POAManager ();

      testUniqu (orb.in (), root_poa.in ());

      orb->destroy ();
    }
  catch (const CORBA::Exception& ex)
    {
      ex._tao_print_exception ("Exception caught");
      return -1;
    }
  return 0;
}
The problem is that the output server IORs still differ after a restart. I also compared this code to the one on page 412 of Advanced CORBA Programming, but it still fails.
///////////////////////////////////
UPDATE:
With "server -ORBListenEndpoints iiop://:1234 > /tmp/ior1", the two generated IORs are:
IOR:010000000d00000049444c3a746573743a312e300000000001000000000000007400000001010200150000007368696a69652d5468696e6b5061642d543431300000d2041b00000014010f0052535453f60054c6f80c000000000001000000010000000002000000000000000800000001000000004f41540100000018000000010000000100010001000000010001050901010000000000
IOR:010000000d00000049444c3a746573743a312e300000000001000000000000007400000001010200150000007368696a69652d5468696e6b5061642d543431300000d2041b00000014010f0052535468f60054da280a000000000001000000010000000002000000000000000800000001000000004f41540100000018000000010000000100010001000000010001050901010000000000
The result for tao_catior for ior1 and ior2:
ior1:
The Byte Order: Little Endian
The Type Id: "IDL:test:1.0"
Number of Profiles in IOR: 1
Profile number: 1
IIOP Version: 1.2
Host Name: **
Port Number: 1234
Object Key len: 27
Object Key as hex:
14 01 0f 00 52 53 54 53 f6 00 54 c6 f8 0c 00 00
00 00 00 01 00 00 00 01 00 00 00
The Object Key as string:
....RSTS..T................
The component <1> ID is 00 (TAG_ORB_TYPE)
ORB Type: 0x54414f00 (TAO)
The component <2> ID is 11 (TAG_CODE_SETS)
Component length: 24
Component byte order: Little Endian
Native CodeSet for char: Hex - 10001 Description - ISO8859_1
Number of CCS for char 1
Conversion Codesets for char are:
1) Hex - 5010001 Description - UTF-8
Native CodeSet for wchar: Hex - 10109 Description - UTF-16
Number of CCS for wchar 0
ior2:
The Byte Order: Little Endian
The Type Id: "IDL:test:1.0"
Number of Profiles in IOR: 1
Profile number: 1
IIOP Version: 1.2
Host Name: **
Port Number: 1234
Object Key len: 27
Object Key as hex:
14 01 0f 00 52 53 54 68 f6 00 54 da 28 0a 00 00
00 00 00 01 00 00 00 01 00 00 00
The Object Key as string:
....RSTh..T.(..............
The component <1> ID is 00 (TAG_ORB_TYPE)
ORB Type: 0x54414f00 (TAO)
The component <2> ID is 11 (TAG_CODE_SETS)
Component length: 24
Component byte order: Little Endian
Native CodeSet for char: Hex - 10001 Description - ISO8859_1
Number of CCS for char 1
Conversion Codesets for char are:
1) Hex - 5010001 Description - UTF-8
Native CodeSet for wchar: Hex - 10109 Description - UTF-16
Number of CCS for wchar 0
The diff result is:
< 14 01 0f 00 52 53 54 53 f6 00 54 c6 f8 0c 00 00
---
> 14 01 0f 00 52 53 54 68 f6 00 54 da 28 0a 00 00
19c19
< ....RSTS..T................
---
> ....RSTh..T.(..............
A similar diff from another pair of restarts:
< 14 01 0f 00 52 53 54 62 fd 00 54 2c 9a 0e 00 00
---
> 14 01 0f 00 52 53 54 02 fd 00 54 f9 a9 09 00 00
19c19
< ....RSTb..T,...............
---
> ....RST...T................
The difference is confined to the ObjectKey.
============================================
UPDATE:
Instead of using the code above, I found a better solution using the helper class TAO_ORB_Manager, which is used by the NamingService and TAO/examples/Simple. TAO_ORB_Manager encapsulates the API and generates persistent IORs, as in this example code from Simple.cpp:
if (this->orb_manager_.init_child_poa (argc, argv, "child_poa") == -1)
  return -1;

CORBA::String_var str =
  this->orb_manager_.activate_under_child_poa (servant_name,
                                               this->servant_.in ());
Here is part of the declaration of TAO_ORB_Manager:
class TAO_UTILS_Export TAO_ORB_Manager
{
  /**
   * Creates a child POA under the root POA with PERSISTENT and
   * USER_ID policies. Call this if you want a @a child_poa with the
   * above policies, otherwise call init.
   *
   * @retval -1 Failure
   * @retval 0 Success
   */
  int init_child_poa (int &argc,
                      ACE_TCHAR *argv[],
                      const char *poa_name,
                      const char *orb_name = 0);

  /**
   * Precondition: init_child_poa has been called. Activate <servant>
   * using the POA <activate_object_with_id> created from the string
   * <object_name>. Users should call this to activate objects under
   * the child_poa.
   *
   * @param object_name String name which will be used to create
   * an Object ID for the servant.
   * @param servant The servant to activate under the child POA.
   *
   * @return 0 on failure, a string representation of the object ID if
   * successful. Caller of this method is responsible for
   * memory deallocation of the string.
   */
  char *activate_under_child_poa (const char *object_name,
                                  PortableServer::Servant servant);

  // ...
};
After building, I can get what I want with the -ORBListenEndpoints iiop://localhost:2809 option. Thanks to @Johnny Willemsen for the help.
You have to create the POA with a persistent lifespan policy; see ACE_wrappers/TAO/tests/POA/Persistent_ID, part of the TAO distribution.