String returns only numbers after separatedBy - ios

I'm trying to separate a string like the following:
let path = "/Users/user/Downloads/history.csv"
do {
let contents = try NSString(contentsOfFile: path, encoding: String.Encoding.utf8.rawValue )
let rows = contents.components(separatedBy: "\n")
print("contents: \(contents)")
print("rows: \(rows)")
}
catch {
}
I have two files which look almost identical.
From the first file the output is like this:
Output File1:
contents: 2017-07-31 16:29:53,0.10109999,9.74414271,0.98513273,0.15%,42302999779,-0.98513273,9.72952650
2017-07-31 16:29:53,0.10109999,0.25585729,0.02586716,0.25%,42302999779,-0.02586716,0.25521765
rows: ["2017-07-31 16:29:53,0.10109999,9.74414271,0.98513273,0.15%,42302999779,-0.98513273,9.72952650", "2017-07-31 16:29:53,0.10109999,0.25585729,0.02586716,0.25%,42302999779,-0.02586716,0.25521765", "", ""]
Output File2:
contents: 40.75013313,0.00064825,5/18/2017 7:17:01 PM
19.04004820,0.00059900,5/19/2017 9:17:03 PM
rows: ["4\00\0.\07\05\00\01\03\03\01\03\0,\00\0.\00\00\00\06\04\08\02\05\0,\05\0/\01\08\0/\02\00\01\07\0 \07\0:\01\07\0:\00\01\0 \0P\0M\0", "\0", "1\09\0.\00\04\00\00\04\08\02\00\0,\00\0.\00\00\00\05\09\09\00\00\0,\0\05\0/\01\09\0/\02\00\01\07\0 \09\0:\01\07\0:\00\03\0 \0P\0M\0", "\0", "\0", "\0"]
So both files are readable as a String, because print(contents) works.
But as soon as the string gets separated, the second file is no longer readable.
I tried different encodings, but nothing worked. Does anyone have an idea how to force the string from the second file to remain a readable string?

Your file is apparently UTF-16 (little-endian) encoded:
$ hexdump fullorders4.csv
0000000 4f 00 72 00 64 00 65 00 72 00 55 00 75 00 69 00
0000010 64 00 2c 00 45 00 78 00 63 00 68 00 61 00 6e 00
0000020 67 00 65 00 2c 00 54 00 79 00 70 00 65 00 2c 00
0000030 51 00 75 00 61 00 6e 00 74 00 69 00 74 00 79 00
...
For ASCII characters, the first byte of the UTF-16 encoding is the
ASCII code, and the second byte is zero.
If the file is read as UTF-8, then each zero byte is converted to an
ASCII NUL character, which is what you see as \0 in the output.
Therefore specifying the encoding as utf16LittleEndian works
in your case:
let contents = try NSString(contentsOfFile: path, encoding: String.Encoding.utf16LittleEndian.rawValue)
// or:
let contents = try String(contentsOfFile: path, encoding: .utf16LittleEndian)
There is also a method which tries to detect the used encoding
(compare iOS: What's the best way to detect a file's encoding). In Swift that would be
var enc: UInt = 0
let contents = try NSString(contentsOfFile: path, usedEncoding: &enc)
// or:
var enc = String.Encoding.ascii
let contents = try String(contentsOfFile: path, usedEncoding: &enc)
However, in your particular case, that would read the file as UTF-8
again because it is valid UTF-8. Prepending a byte order mark (BOM)
to the file (FF FE for UTF-16 little-endian) would solve that
problem reliably.
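For completeness, here is a small sketch of that last suggestion (not from the original answer; the file path is just the one from the question, so adjust it to the affected file):
import Foundation

let path = "/Users/user/Downloads/history.csv"   // adjust to the affected file
let url = URL(fileURLWithPath: path)
do {
    var data = try Data(contentsOf: url)
    // FF FE is the UTF-16 little-endian byte order mark.
    if data.count < 2 || data[data.startIndex] != 0xFF || data[data.startIndex + 1] != 0xFE {
        data.insert(contentsOf: [0xFF, 0xFE], at: data.startIndex)
        try data.write(to: url)
    }
    // With the BOM present, the encoding-detecting initializer should pick UTF-16.
    var enc = String.Encoding.ascii
    let contents = try String(contentsOfFile: path, usedEncoding: &enc)
    let rows = contents.components(separatedBy: "\n")
    print("encoding: \(enc), rows: \(rows)")
} catch {
    print(error)
}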

Related

Could not parse base64 DER-encoded ASN.1 public key from iOS in Golang

I have a project in Golang with RSA encryption. I have a public key in Base64 format which is used to encrypt a message.
I used this code:
publicKeyBase64 = "MIGJAoGBAJJYXgBem1scLKPEjwKrW8+ci3B/YNN3aY2DJ3lc5e2wNc0SmFikDpow1TdYcKl2wdrXX7sMRsyjTk15IECMezyHzaJGQ9TinnkQixJ+YnlNdLC04TNWOg13plyahIXBforYAjYl2wVIA8Yma2bEQFhmAFkEX1A/Q1dIKy6EfQ+xAgMBAAE="
publicKeyBinary, err := base64.StdEncoding.DecodeString(publicKeyBase64)
publicKeyInterface, err := x509.ParsePKIXPublicKey(publicKeyBinary)
if err != nil {
fmt.Println("Could not parse DER encoded public key (encryption key)")
return "","",err
}
publicKey, isRSAPublicKey := publicKeyInterface.(*rsa.PublicKey)
if !isRSAPublicKey {
fmt.Println("Public key parsed is not an RSA public key")
return "","",err
}
encryptedMessage, err := rsa.EncryptPKCS1v15(rand.Reader, publicKey, "message")
When I run this code, I get this error:
Could not parse DER encoded public key (encryption key)
asn1: structure error: tags don't match (16 vs {class:0 tag:2 length:129 isCompound:false}) {optional:false explicit:false application:false defaultValue:<nil> tag:<nil> stringType:0 timeType:0 set:false omitEmpty:false} AlgorithmIdentifier #3
The error points to publicKeyInterface: it fails to parse the Base64-decoded data into a public key. What's the problem with my code?
=======updated=====
My publicKeyBase64 is retrieved from my model, where it is stored as a Binary Data type.
When I store it in MongoDB from my Rails API, I receive the public_key param in Base64 format, but I decode it to binary and then store it with this code:
def create
params = device_params
public_key = Base64.decode64 device_params[:public_key]
#device_params[:public_key] value is "MIGJAoGBAJJYXgBem1scLKPEjwKrW8+ci3B/YNN3aY2DJ3lc5e2wNc0SmFikDpow1TdYcKl2wdrXX7sMRsyjTk15IECMezyHzaJGQ9TinnkQixJ+YnlNdLC04TNWOg13plyahIXBforYAjYl2wVIA8Yma2bEQFhmAFkEX1A/Q1dIKy6EfQ+xAgMBAAE="
params[:public_key] = BSON::Binary.new(public_key, :generic)
device = Device.find_or_create_by(id: device_params[:id])
render_success device.update_attributes(params), device
end
When I convert my Base64 public key string using this Rails code, it succeeds:
rsa_public_key = OpenSSL::PKey::RSA.new(Base64.decode64(public_key))
In my iOS app, I use https://github.com/DigitalLeaves/AsymmetricCrypto
to generate a public key with this code:
AsymmetricCryptoManager.sharedInstance.createSecureKeyPair({ (success, error) -> Void in
if success {
print("RSA-1024 keypair successfully generated.")
let publicKey = AsymmetricCryptoManager.sharedInstance.getPublicKeyData()?.base64EncodedString()
let url = ENV.BASE_URL + "devices"
let headers = ["Authentication-Token": CurrentUser.getCurrentUser().token] as! HTTPHeaders
let params = ["device[user_id]": CurrentUser.getCurrentUser().id!, "device[id]": instanceID,"device[token]": fcmToken, "device[os]": "ios", "device[public_key]": publicKey!]
Alamofire.request(url, method: .post, parameters: params, encoding: URLEncoding.default, headers: headers)
} else { print("An error happened while generating a keypair: \(error)") }
})
We can dump the ASN.1 contents to see what they look like:
$ echo "MIGJAoGBAJJYXgBem1scLKPEjwKrW8+ci3B/YNN3aY2DJ3lc5e2wNc0SmFikDpow1TdYcKl2wdrXX7sMRsyjTk15IECMezyHzaJGQ9TinnkQixJ+YnlNdLC04TNWOg13plyahIXBforYAjYl2wVIA8Yma2bEQFhmAFkEX1A/Q1dIKy6EfQ+xAgMBAAE=" | \
base64 -d | \
dumpasn1 -
0 137: SEQUENCE {
3 129: INTEGER
: 00 92 58 5E 00 5E 9B 5B 1C 2C A3 C4 8F 02 AB 5B
: CF 9C 8B 70 7F 60 D3 77 69 8D 83 27 79 5C E5 ED
: B0 35 CD 12 98 58 A4 0E 9A 30 D5 37 58 70 A9 76
: C1 DA D7 5F BB 0C 46 CC A3 4E 4D 79 20 40 8C 7B
: 3C 87 CD A2 46 43 D4 E2 9E 79 10 8B 12 7E 62 79
: 4D 74 B0 B4 E1 33 56 3A 0D 77 A6 5C 9A 84 85 C1
: 7E 8A D8 02 36 25 DB 05 48 03 C6 26 6B 66 C4 40
: 58 66 00 59 04 5F 50 3F 43 57 48 2B 2E 84 7D 0F
: B1
135 3: INTEGER 65537
: }
0 warnings, 0 errors.
A well-formatted ASN.1 public key should include the algorithm as well. We should have a line similar to:
5 9: OBJECT IDENTIFIER rsaEncryption (1 2 840 113549 1 1 1)
The AsymmetricCryptoManager.getPublicKeyData() returns a very barebones ASN.1 key, without any algorithm information. This makes Go very unhappy as it has no way of knowing what kind of key it is. See more about correctly exporting the key here.
If you can change the iOS code, you should instead use CryptoExportImportManager and use one of exportPublicKeyToPEM or exportPublicKeyToDER. These take the output of getPublicKeyData and generate output usable by other tools. You can find an example of how to use them in the CryptoExportImportManager example.
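A rough sketch of that route (hypothetical: the method names come from the recommendation above, but the exact parameter list, return type and key size are assumptions, so check the project's README before relying on it):
// Hypothetical sketch -- exact CryptoExportImportManager signatures are assumed.
if let rawKeyData = AsymmetricCryptoManager.sharedInstance.getPublicKeyData() {
    let exporter = CryptoExportImportManager()
    // Wrap the bare PKCS#1 bits in a SubjectPublicKeyInfo structure (algorithm + key).
    if let derKey = exporter.exportPublicKeyToDER(rawKeyData as Data,
                                                  keyType: kSecAttrKeyTypeRSA as String,
                                                  keySize: 1024) {
        // Send this instead of the bare key; Go's x509.ParsePKIXPublicKey can read it.
        let exportablePublicKey = derKey.base64EncodedString()
        print(exportablePublicKey)
    }
}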
If you cannot change the key export code, you can instead parse it directly in Go. This assumes that you know for sure that it is an RSA public key:
package main

import (
    "crypto/rsa"
    "encoding/asn1"
    "encoding/base64"
    "fmt"
)

func main() {
    publicKeyBase64 := "MIGJAoGBAJJYXgBem1scLKPEjwKrW8+ci3B/YNN3aY2DJ3lc5e2wNc0SmFikDpow1TdYcKl2wdrXX7sMRsyjTk15IECMezyHzaJGQ9TinnkQixJ+YnlNdLC04TNWOg13plyahIXBforYAjYl2wVIA8Yma2bEQFhmAFkEX1A/Q1dIKy6EfQ+xAgMBAAE="
    // Base64 decode.
    publicKeyBinary, err := base64.StdEncoding.DecodeString(publicKeyBase64)
    if err != nil {
        panic(err)
    }
    // rsa.PublicKey is a big.Int (N: modulus) and an integer (E: exponent),
    // which matches the bare SEQUENCE of two INTEGERs shown above.
    var pubKey rsa.PublicKey
    if rest, err := asn1.Unmarshal(publicKeyBinary, &pubKey); err != nil {
        panic(err)
    } else if len(rest) != 0 {
        panic("rest is not nil")
    }
    fmt.Printf("key: %+v\n", pubKey)
}
This prints out:
key:
{N:+102767083290202280873554060983826675083148443795791447833515664566475334389364583758312108980110921996262487865832851258326049062353432991986398760705560379825908169063986770245967781444794847106351934016144540466696422397564949226710181429429140226472206572796987719088983654589217713611861345869296293449649
E:65537}
You can now use your public key in package rsa functions.

Write python code in delphi AES MODE ECB

I translated two functions to Delphi, but I don't know if they are right. I also need to write the Python def do_aes_encrypt(key2_t_xor) in Delphi to check whether I am right.
This is what I wrote in Delphi:
function key_transform (old_key:string): string;
var
x :integer;
begin
result:='';
for x := 32 downto 0 do
result:= result + chr(ord(old_key[x-1])-( x mod $0C)) ;
end;
function key_xoring ( key2_t :string ; kilo_challenge :string) : string ;
var
i :integer;
begin
result := '';
i:=0 ;
while i <= 28 do begin
result := result + chr(ord(key2_t[i+1]) xor ord(kilo_challenge[3]));
result := result + chr(ord(key2_t[i+2]) xor ord(kilo_challenge[2])) ;
result := result+ chr(ord(key2_t[i+3]) xor ord (kilo_challenge[1])) ;
i := i + 4 ;
end;
end;
This is the original Python code:
def key_transform(old_key):
new_key = ''
for x in range(32,0,-1):
new_key += chr(ord(old_key[x-1]) - (x % 0x0C))
return new_key
def key_xoring(key2_t, kilo_challenge):
key2_t_xor = ''
i = 0
while i <= 28:
key2_t_xor += chr(ord(key2_t[i]) ^ ord(kilo_challenge[3]))
key2_t_xor += chr(ord(key2_t[i+1]) ^ ord(kilo_challenge[2]))
key2_t_xor += chr(ord(key2_t[i+2]) ^ ord(kilo_challenge[1]))
key2_t_xor += chr(ord(key2_t[i+3]) ^ ord(kilo_challenge[0]))
i = i + 4
return key2_t_xor
def do_aes_encrypt(key2_t_xor):
plaintext = b''
for k in range(0,16):
plaintext += chr(k)
obj = AES.new(key2_t_xor, AES.MODE_ECB)
return obj.encrypt(plaintext)
/////////////////////////////////////////////////////////////////////////////
{
kilo_challenge = kilo_header[8:12]
chalstring = ":".join("{:02x}".format(ord(k)) for k in kilo_challenge)
key2 = 'qndiakxxuiemdklseqid~a~niq,zjuxl' # if this doesnt work try 'lgowvqnltpvtgogwswqn~n~mtjjjqxro'
kilo_response = do_aes_encrypt(key_xoring(key_transform(key2),kilo_challenge))}
This code calculates the 16-byte data line that has to be sent in addition to the 32 bytes before it. The line marked in blue in the photo is what I need to calculate from the 4 hex bytes marked in purple, using the key
key2 = 'qndiakxxuiemdklseqid~a~niq,zjuxl'
but in Delphi, because the Python code works perfectly.
This is for upgrading the firmware of LG phones. When I receive the KILOCENT answer (as the photo shows), the 4 bytes in brackets below change every time the phone is connected:
4b 49 4c 4f 43 45 4e 54 ([ac e5 b1 06]) 00 00 00 00 KILOCENT¬å±.....
00 00 00 00 00 00 00 00 30 d4 00 00 b4 b6 b3 b0 ........0Ô..´¶³°
Then I have to send a KILOMETER request to the phone. The first and second lines are fixed and never change, but the third line I have to compute with AES ECB mode encryption:
4b 49 4c 4f 4d 45 54 52 00 00 00 00 02 00 00 00 KILOMETR........
00 00 00 00 10 00 00 00 85 b6 00 00 b4 b6 b3 b0 ........…¶..´¶³°
fc 21 d8 e5 5b aa fd 58 1e 33 58 fd e9 0b 65 38 ü!Øå[ªýX.3Xýé.e8 <== this
And this is the old key:
key2 = 'qndiakxxuiemdklseqid~a~niq,zjuxl'

Constant TAO CORBA IOR

How do I configure a TAO CORBA server so that the IOR string generated by object_to_string is constant?
Each time the server restarts, the IOR string generated by object_to_string changes. This is inconvenient, since the client has to update its cached server IOR string by reloading the IOR file or going through the naming service. It would therefore be useful if the server could generate a constant IOR string, no matter how many times it restarts.
My CORBA server is based on ACE+TAO, and I do remember that TAO supports constant IOR strings (the IOR string it generates is the same every time) and that the solution is to add some configuration to the server. But I cannot remember that configuration now.
=============================================
UPDATE:
In ACE_wrappers/TAO/tests/POA/Persistent_ID/server.cpp, I added a new function named testUniqu(), which is similar to the createPOA method. The updated file content is:
void testUniqu(CORBA::ORB_ptr orb_, PortableServer::POA_ptr poa_){
CORBA::PolicyList policies (2);
policies.length (2);
//IOR is the same even it is SYSTEM_ID
policies[0] = poa_->create_id_assignment_policy (PortableServer::USER_ID);
policies[1] = poa_->create_lifespan_policy (PortableServer::PERSISTENT);
PortableServer::POAManager_var poa_manager = poa_->the_POAManager ();
PortableServer::POA_ptr child_poa_ = poa_->create_POA ("childPOA", poa_manager.in (), policies);
// Destroy the policies
for (CORBA::ULong i = 0; i < policies.length (); ++i) {
policies[i]->destroy ();
}
test_i *servant = new test_i (orb_, child_poa_);
PortableServer::ObjectId_var oid = PortableServer::string_to_ObjectId("xushijie");
child_poa_->activate_object_with_id (oid, servant);
PortableServer::ObjectId_var id = poa_->activate_object (servant);
CORBA::Object_var object = poa_->id_to_reference (id.in ());
test_var test = test::_narrow (object.in ());
CORBA::String_var ior = orb_->object_to_string(test.in());
std::cout<<ior.in()<<std::endl;
poa_->the_POAManager()->activate();
orb_->run();
}
int
ACE_TMAIN (int argc, ACE_TCHAR *argv[])
{
try
{
CORBA::ORB_var orb =
CORBA::ORB_init (argc, argv);
int result = parse_args (argc, argv);
CORBA::Object_var obj =
orb->resolve_initial_references ("RootPOA");
PortableServer::POA_var root_poa =
PortableServer::POA::_narrow (obj.in ());
PortableServer::POAManager_var poa_manager =
root_poa->the_POAManager ();
testUniqu(orb.in(), root_poa.in());
orb->destroy ();
}
catch (const CORBA::Exception& ex)
{
ex._tao_print_exception ("Exception caught");
return -1;
}
return 0;
}
The problem is that the generated server IORs are still different after each restart. I also compared this code to the one on page 412 of Advanced CORBA Programming, but it still fails.
///////////////////////////////////
UPDATE:
With "server -ORBListenEndpoints iiop://:1234 > /tmp/ior1", the generated two IORs are:
IOR:010000000d00000049444c3a746573743a312e300000000001000000000000007400000001010200150000007368696a69652d5468696e6b5061642d543431300000d2041b00000014010f0052535453f60054c6f80c000000000001000000010000000002000000000000000800000001000000004f41540100000018000000010000000100010001000000010001050901010000000000
IOR:010000000d00000049444c3a746573743a312e300000000001000000000000007400000001010200150000007368696a69652d5468696e6b5061642d543431300000d2041b00000014010f0052535468f60054da280a000000000001000000010000000002000000000000000800000001000000004f41540100000018000000010000000100010001000000010001050901010000000000
The result of tao_catior for ior1 and ior2 is:
ior1:
The Byte Order: Little Endian
The Type Id: "IDL:test:1.0"
Number of Profiles in IOR: 1
Profile number: 1
IIOP Version: 1.2
Host Name: **
Port Number: 1234
Object Key len: 27
Object Key as hex:
14 01 0f 00 52 53 54 53 f6 00 54 c6 f8 0c 00 00
00 00 00 01 00 00 00 01 00 00 00
The Object Key as string:
....RSTS..T................
The component <1> ID is 00 (TAG_ORB_TYPE)
ORB Type: 0x54414f00 (TAO)
The component <2> ID is 11 (TAG_CODE_SETS)
Component length: 24
Component byte order: Little Endian
Native CodeSet for char: Hex - 10001 Description - ISO8859_1
Number of CCS for char 1
Conversion Codesets for char are:
1) Hex - 5010001 Description - UTF-8
Native CodeSet for wchar: Hex - 10109 Description - UTF-16
Number of CCS for wchar 0
ior2:
The Byte Order: Little Endian
The Type Id: "IDL:test:1.0"
Number of Profiles in IOR: 1
Profile number: 1
IIOP Version: 1.2
Host Name: **
Port Number: 1234
Object Key len: 27
Object Key as hex:
14 01 0f 00 52 53 54 68 f6 00 54 da 28 0a 00 00
00 00 00 01 00 00 00 01 00 00 00
The Object Key as string:
....RSTh..T.(..............
The component <1> ID is 00 (TAG_ORB_TYPE)
ORB Type: 0x54414f00 (TAO)
The component <2> ID is 11 (TAG_CODE_SETS)
Component length: 24
Component byte order: Little Endian
Native CodeSet for char: Hex - 10001 Description - ISO8859_1
Number of CCS for char 1
Conversion Codesets for char are:
1) Hex - 5010001 Description - UTF-8
Native CodeSet for wchar: Hex - 10109 Description - UTF-16
Number of CCS for wchar 0
The diff result is:
< 14 01 0f 00 52 53 54 53 f6 00 54 c6 f8 0c 00 00
---
> 14 01 0f 00 52 53 54 68 f6 00 54 da 28 0a 00 00
19c19
< ....RSTS..T................
---
> ....RSTh..T.(..............
Similar diff result is:
< 14 01 0f 00 52 53 54 62 fd 00 54 2c 9a 0e 00 00
---
> 14 01 0f 00 52 53 54 02 fd 00 54 f9 a9 09 00 00
19c19
< ....RSTb..T,...............
---
> ....RST...T................
The difference is in ObjectKey.
============================================
UPDATE:
Instead of using the code above, I found a better solution using the helper class TAO_ORB_Manager, which is used by the NamingService and by TAO/examples/Simple. TAO_ORB_Manager encapsulates the API and generates persistent IORs, as in this example code from Simple.cpp:
if (this->orb_manager_.init_child_poa (argc, argv, "child_poa") == -1){
CORBA::String_var str =
this->orb_manager_.activate_under_child_poa (servant_name,
this->servant_.in ());
}
This is part of the description of TAO_ORB_Manager:
class TAO_UTILS_Export TAO_ORB_Manager
{
/**
* Creates a child poa under the root poa with PERSISTENT and
* USER_ID policies. Call this if you want a @a child_poa with the
* above policies, otherwise call init.
*
* @retval -1 Failure
* @retval 0 Success
*/
int init_child_poa (int &argc,
ACE_TCHAR *argv[],
const char *poa_name,
const char *orb_name = 0);
/**
* Precondition: init_child_poa has been called. Activate <servant>
* using the POA <activate_object_with_id> created from the string
* <object_name>. Users should call this to activate objects under
* the child_poa.
*
* @param object_name String name which will be used to create
* an Object ID for the servant.
* @param servant The servant to activate under the child POA.
*
* @return 0 on failure, a string representation of the object ID if
* successful. Caller of this method is responsible for
* memory deallocation of the string.
*/
char *activate_under_child_poa (const char *object_name,
PortableServer::Servant servant);
...................
}
After building, I can get what I want with the -ORBListenEndpoints iiop://localhost:2809 option. Thanks to @Johnny Willemsen for the help.
You have to create the POA with a persistent lifespan policy, see ACE_wrappers/TAO/tests/POA/Persistent_ID as part of the TAO distribution.

Delphi StrToInt on Hex String WORD, UShort, 16 Bit hex fails

I currently use the code below.
Here are a few outputs; I formatted them, originally they were all together like this:
E14802000003FA00014C0000031501A8
currentAttackCount := StrToInt('$' + Copy(CurHex, 17, 4));
Log('Packet = ' + CurHex + ' Count = ' + IntToStr(currentAttackCount) + ' STR = ' + '$' + Copy(CurHex, 17, 4));
Formatted outputs
Packet = E1 48 02 00 00 03 FA 00 [01 4C] 00 00 03 15 01 A8 [Count = 76] [STR = $014C]
Packet = E1 48 02 00 00 03 FA 00 [01 4D] 00 00 03 15 02 26 [Count = 77] [STR = $014D]
Packet = E1 48 02 00 00 03 FA 00 [01 4F] 00 00 03 15 02 26 [Count = 79] [STR = $014F]
As you can see, the STR output is STR = $014C, produced by 'STR = ' + '$' + Copy(CurHex, 17, 4) in the Log call.
Now if you look at the StrToInt call:
currentAttackCount := StrToInt('$' + Copy(CurHex, 17, 4));
it is pretty much the same string as STR, so shouldn't $014C (0x014C) be represented as 332 instead of 76?
The 76 seems to come from the $4C (0x4C) part of $014C. Why does it ignore the first two hex characters?
Ah, I think I figured it out. The variable was declared as
currentAttackCount: Byte;
I increased it to
currentAttackCount: Word;
which should solve the problem: StrToInt does return 332 ($014C), but assigning it to a Byte keeps only the low 8 bits ($4C = 76), while a 16-bit Word can hold the full value. I missed it because it's a global variable and there is so much code.
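For illustration only (this is Swift, not Delphi; it just shows the truncation arithmetic described above):
let value = Int("014C", radix: 16)!              // 332 -- what StrToInt('$014C') returns
let asByte = UInt8(truncatingIfNeeded: value)    // 76  -- only the low 8 bits fit in a Byte
let asWord = UInt16(truncatingIfNeeded: value)   // 332 -- a 16-bit Word holds the full value
print(value, asByte, asWord)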

Unable to convert NSData to NSString or any other format of informationElementData of CWNetwork

Friends,
It might look like a familiar question, but I really need help converting NSData to some other understandable form. Basically I am using the CoreWLAN framework; CWNetwork has a property called informationElementData, and its data type is NSData. I have tried to convert it to a readable format, but nothing works. I have tried all available string encodings. Below is sample code:
- (void) printNSData:(NSData *) dataToPrint forKey:(NSString *) key{
for(int i = 1 ; i < 16 ; i++){
size_t length = [dataToPrint length]+1;
unsigned char aBuffer[length];
[dataToPrint getBytes:aBuffer length:length];
aBuffer[length] = 0;
NSString *content = [[NSString alloc] initWithBytes:aBuffer
length:[dataToPrint length] encoding: i];
NSLog(@"%@ : %@ ", key, content);
}
/*
NSUTF16BigEndianStringEncoding = 0x90000100,
NSUTF16LittleEndianStringEncoding = 0x94000100,
NSUTF32StringEncoding = 0x8c000100,
NSUTF32BigEndianStringEncoding = 0x98000100,
NSUTF32LittleEndianStringEncoding = 0x9c000100,
NSProprietaryStringEncoding = 65536
*/
NSString *content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSUTF16BigEndianStringEncoding];
NSLog(@"%@ : %@ ", key, content);
content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSUTF16LittleEndianStringEncoding];
NSLog(@"%@ : %@ ", key, content);
content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSUTF32StringEncoding];
NSLog(@"%@ : %@ ", key, content);
content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSUTF32BigEndianStringEncoding];
NSLog(@"%@ : %@ ", key, content);
content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSUTF32LittleEndianStringEncoding];
NSLog(@"%@ : %@ ", key, content);
content = [[NSString alloc] initWithBytes:[dataToPrint bytes]
length:[dataToPrint length] encoding: NSProprietaryStringEncoding];
NSLog(@"%@ : %@", key, content);
}
But I am getting either a null or an empty response. Please help.
Regards,
MP
You can't convert arbitrary data into a string. That only works for data that actually represents a string, which is usually not the case when an API exposes an NSData object.
To get some meaning into the data you have to know what the data represents.
You might be able to get some structure into it by simply looking at it.
If I look at the first few bytes you have posted it looks like the data is well structured and not arbitrary.
The data seems to be split into packets. Each packet starts with a type identifier, which is followed by a length byte, and then there are that many bytes of data.
The first packet contains the string "Symantak":
00 08 53 79 6d 61 6e 74 61 6b
^^ Type Identifier
   ^^ Length
      ^^^^^^^^^^^^^^^^^^^^^^^ Data. In this case the ASCII string "Symantak"
If you find a bunch of bytes that all lie between 0x20 and 0x7E, you are probably looking at ASCII. That's basically how I figured out the payload of this packet. And because we have 8 bytes of ASCII, the 0x08 in front of them most likely means 8 bytes of data.
The next packets look like this:
01 08 82 84 0b 16 24 30 48 6c
^^ Type Identifier
   ^^ Length
      ^^^^^^^^^^^^^^^^^^^^^^^ Data. But not an ASCII string
03 01 06
2a 01 00
2f 01 00
30 14 01 00 00 0f ac 04 01 00 00 0f ac 04 01 00 00 0f ac 02 0c 00
32 04 0c 12 18 60
2d 1a 6e 18 1b ff 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
and so on. The general packet structure is quite easy to parse.
However, it will be very hard to turn these bytes into meaningful data. As you can see from the other packets, it's not always as easy as with the first packet, which contained ASCII.
But please don't take this quickly reverse-engineered structure for granted; I might be completely wrong about the meaning of these fields.
You should try to find the specification of this data. It should be somewhere in the IEEE 802.11 documents.
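If you want to walk the records programmatically, here is a minimal sketch (written in Swift; it assumes the type/length/value layout guessed above, and the function name is made up):
import Foundation

// Assumed layout: <1 byte type><1 byte length><length bytes of value>, repeated.
func dumpInformationElements(_ data: Data) {
    var index = data.startIndex
    while index + 2 <= data.endIndex {
        let type = data[index]
        let length = Int(data[index + 1])
        let valueEnd = index + 2 + length
        guard valueEnd <= data.endIndex else { break }   // truncated record, stop
        let value = data[(index + 2)..<valueEnd]
        let hex = value.map { String(format: "%02x", $0) }.joined(separator: " ")
        // If every byte is printable ASCII (0x20...0x7E), also show it as text.
        if !value.isEmpty, value.allSatisfy({ $0 >= 0x20 && $0 <= 0x7E }),
           let text = String(data: value, encoding: .ascii) {
            print("type \(type), length \(length): \(hex) (\"\(text)\")")
        } else {
            print("type \(type), length \(length): \(hex)")
        }
        index = valueEnd
    }
}

// The first record from the dump above:
dumpInformationElements(Data([0x00, 0x08, 0x53, 0x79, 0x6d, 0x61, 0x6e, 0x74, 0x61, 0x6b]))
// prints: type 0, length 8: 53 79 6d 61 6e 74 61 6b ("Symantak")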
