iOS Flash Drive SCSI Inquiry

We are trying to send a SCSI inquiry to an iOS flash drive. Connected via USB, we can send the following inquiry command using libusb and Java:
private byte[] getInquiryCommand(byte replyLength) {
    // Command Block Wrapper (CBW) -- must be exactly 31 bytes!
    // See: http://www.usb.org/developers/docs/devclass_docs/usbmassbulk_10.pdf
    ByteBuffer outBuffer = ByteBuffer.allocate(31);
    outBuffer.order(ByteOrder.LITTLE_ENDIAN);
    outBuffer.putInt(0x43425355);   // CBW signature ("USBC")
    outBuffer.putInt(0x84752008);   // CBW tag ~ command identifier
    outBuffer.putInt(0x24);         // CBW data transfer length (36 bytes)
    outBuffer.put((byte) 0x80);     // CBW flags (0x80 = data IN)
    outBuffer.put((byte) 0x00);     // CBW LUN
    outBuffer.put((byte) 0x06);     // CBW CB length
    // INQUIRY command (CDB)
    outBuffer.put((byte) 0x12);     // operation code
    outBuffer.put((byte) 0x01);     // reserved, CmdDt, EVPD
    outBuffer.put((byte) 0x80);     // page code
    outBuffer.put((byte) 0x00);     // reserved (or allocation length MSB)
    outBuffer.put(replyLength);     // allocation (reply) length
    outBuffer.put((byte) 0x00);     // control
    return outBuffer.array();
}
Here we are wrapping the SCSI command in the USB Mass Storage (Bulk-Only Transport) protocol. Now we want to do the same thing with Swift on iOS. The problem is that we aren't in contact with the hardware manufacturer and we aren't members of the MFi program.
We tried to send the same request in Swift to a flash drive connected to an iPhone. Of course this doesn't work, because we are wrapping the SCSI command in the USB protocol, whereas we presumably need to go through some Lightning protocol layer instead (that's what we believe, at least). This is the request:
func getInquiryCommand() -> [UInt8] {
    return [
        0x55, // CBW signature byte 1 ("U")
        0x53, // CBW signature byte 2 ("S")
        0x42, // CBW signature byte 3 ("B")
        0x43, // CBW signature byte 4 ("C")
        0x08, // CBW tag ~ command identifier, byte 1
        0x20, // CBW tag ~ command identifier, byte 2
        0x75, // CBW tag ~ command identifier, byte 3
        0x84, // CBW tag ~ command identifier, byte 4
        0x24, // CBW data transfer length, byte 1
        0x00, // CBW data transfer length, byte 2
        0x00, // CBW data transfer length, byte 3
        0x00, // CBW data transfer length, byte 4
        0x80, // CBW flags (0x80 = data IN)
        0x00, // CBW LUN
        0x06, // CBW CB length
        // INQUIRY command (CDB)
        0x12, // operation code
        0x01, // reserved, CmdDt, EVPD
        0x80, // page code
        0x00, // reserved (or allocation length MSB)
        127,  // allocation (reply) length
        0x00, // control
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
        0x00, // pad to 31 bytes
    ]
}
The protocol string for one of our devices is "com.allbond.protocol05".
Does anyone know if it's even possible to send a SCSI inquiry this way?
We would also appreciate any further information on the topic.
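Since the drive does declare a protocol string, one avenue we are considering is the ExternalAccessory framework: open an EASession for that protocol and write the 31-byte CBW to its output stream. This is only a rough, untested sketch resting on two assumptions: that the accessory exposes this protocol to third-party apps at all (normally an MFi matter; the string also has to be listed under UISupportedExternalAccessoryProtocols in Info.plist), and that the vendor firmware accepts a raw Bulk-Only CBW on that stream, which we cannot confirm:

import ExternalAccessory

func sendInquiryCommand() {
    let protocolString = "com.allbond.protocol05"

    // Find a connected accessory that declares our protocol and open a session for it.
    guard let accessory = EAAccessoryManager.shared().connectedAccessories
            .first(where: { $0.protocolStrings.contains(protocolString) }),
          let session = EASession(accessory: accessory, forProtocol: protocolString),
          let output = session.outputStream,
          let input = session.inputStream else {
        print("No accessory session available for \(protocolString)")
        return
    }
    // In real code, keep `session` alive for as long as the streams are in use.

    let streams: [Stream] = [output, input]
    for stream in streams {
        stream.schedule(in: .current, forMode: .default)
        stream.open()
    }

    // Write the 31-byte CBW from getInquiryCommand() above. Whatever the accessory
    // replies (if anything) arrives on `input`, normally via StreamDelegate callbacks.
    let cbw = getInquiryCommand()
    cbw.withUnsafeBufferPointer { buffer in
        _ = output.write(buffer.baseAddress!, maxLength: buffer.count)
    }
}

Whether this actually reaches the drive as a SCSI INQUIRY is entirely up to the accessory's firmware; without the vendor's protocol documentation we do not know what framing (if any) it expects on that stream.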

Related

Opus decoder on iOS is crashing for no obvious reason

I have simple code that decodes an Opus frame into audio samples.
It works on Android, but it crashes in a Unity3D iOS project and does not crash in a regular iOS project:
EXC_BAD_ACCESS (code=1, address=0x2f)
Both projects share the same Opus static library and header files.
#include "opus.h"

int test1() {
    unsigned char opus_chunk[] = {0x68, 0x97, 0x50, 0x0d,
        0xba, 0xa4, 0x80, 0x0d, 0x31, 0x21, 0x9c, 0xcf, 0x74, 0x98, 0xda, 0xc6,
        0xd5, 0x27, 0xcb, 0xd9, 0x51, 0xd7, 0xce, 0x90, 0xc5, 0x58, 0x94, 0x53,
        0xb0, 0xe9, 0xb4, 0xe4, 0xf4, 0x42, 0x4d, 0xc7, 0xa4, 0x61, 0xfa, 0xfe};
    int len = sizeof(opus_chunk);
    short samples[5760];
    int err1;
    OpusDecoder *decoder;

    decoder = opus_decoder_create(48000, 1, &err1);
    int n = opus_decode(decoder, opus_chunk, len, samples, 5760, 0);
    opus_decoder_destroy(decoder);
    return n; // number of decoded samples per channel
}
Stack trace:
#0 0x00b944ec in compute_allocation ()
#1 0x00c03698 in celt_decode_with_ec at ./opus_ios/build/src/opus-1.1.2/celt/celt_decoder.c:956
#2 0x00c2400c in opus_decode_frame at ./opus_ios/build/src/opus-1.1.2/src/opus_decoder.c:490
#3 0x00c24ea2 in opus_decode_native [inlined] at ./opus_ios/build/src/opus-1.1.2/src/opus_decoder.c:692
#4 0x00c24e80 in opus_decode at ./opus_ios/build/src/opus-1.1.2/src/opus_decoder.c:782
I compared the build settings and made them almost the same.
The error sounds like something is wrong with allocation:
opus_decoder_create is able to allocate the OpusDecoder, but the error occurs in opus_decode.
This occurs due to a symbol conflict. The Unity3D library defines some symbols, including compute_allocation(), that are also defined and used by libopus. If the Unity3D library comes before libopus on the linker command line, the linker may pull in Unity's version of those symbols, which will not work with libopus. If you need both sets, you may need to rename the conflicting symbols.

const char array in Objective-C has different value depending upon device

In an app we are working on, there is a literal string we would like to keep secret so we have referenced it in the source as a const char array:
const char secret[] = { 0x63, 0x35, 0x4d, 0x58, 0x52, 0x32, 0x2c, 0x52, 0x53, 0x12, 0x3c, 0x74, 0x51, 0x53, 0x69, 0x8a, 0x64, 0x12, 0x7f, 0x6e, 0x25, 0x64, 0x4e, 0x32, 0x23, 0x53, 0x12, 0x7b, 0x4c, 0x87, 0x64, 0x23, 0x41, 0x23, 0x56, 0x34, 0x6c, 0x23, 0x75, 0x5e, 0x56, 0x23, 0x65, 0x5b, 0x23, 0x75, 0x12, 0x65, 0x23, 0x76, 0x3a, 0x2f, 0x53, 0x32, 0x23, 0x54, 0x54, 0x21, 0x64, 0x32, 0x53, 0x13, 0x24, 0x32 };
(I've changed this so it doesn't match our secret :) )
We use +[NSData dataWithBytes:length:] to convert secret to NSData, then base64-decode it and -[NSString initWithData:encoding:] it.
The problem is, on iPhone 5 & 4s converting the decoded data to a string fails.
Upon inspecting the contents of secret in the debugger, there are more characters than there should be.
Finally, copying the exact same literal to another const char and printing both in succession produces different results.
What is going on?
It seems that the devices in question are reading past the end of the array until they meet some kind of termination character. To stop this from happening, you need the final character in the array to be a newline.
$ echo -en "\n" | xxd -pu
shows us that \n in hex is 0a, so adding 0x0A as the final element in the secret array literal will stop the OS from reading random memory. You may also want to make sure that the final NSString doesn't contain the newline character :)
Update
The above fixed all problems on Debug builds, but on a Release build the same issue occurred, so we replaced 0x0a with 0x00 and both builds started working.

How to appropriately encrypt and decrypt an NSString with AES 128

I am using http://aes.online-domain-tools.com to encrypt my NSString, and what I get back from it is an array of unsigned char like this: c2 84 6b 71 72 6d d2 e7 cd 0b a6 08 cd 85 c3 0c.
Then I use this to convert it into an NSString in my code:
const unsigned char encryptedAppIDbytes[] = {0xe5, 0x35, 0xdf, 0x72, 0x57, 0xaf, 0xf7, 0xe6, 0x1f, 0x6d, 0x51, 0x1d, 0x26, 0xe8, 0x5e, 0xa2};
NSData *appIDToDecrypt = [NSData dataWithBytes:encryptedAppIDbytes length:sizeof(encryptedAppIDbytes)];
NSString *decryptedAppID = [[NSString alloc] initWithData:[appIDToDecrypt AES128DecryptedDataWithKey:@"something"] encoding:NSUTF8StringEncoding];
if ([decryptedAppID isEqualToString:@"Something"]) {} // This fails, even though when I look at them in the debugger they appear to be the same.
When I decrypt it, it shows up as the same string, but when I compare it with the same hardcoded NSString to check that it matches, the comparison fails.
This breaks an authentication check I have in my app.
Please point out anything I am doing wrong here.
Thanks,
Alright, so after spending a few hours on it I finally found a solution which might not be optimal but works in my case.
It seems that after decryption the string contains some extra characters which are not visible in the debugger, but when I check the length it is greater than the number of visible characters, which indicates that something is wrong. For now what I have done is this:
const unsigned char nameBytes[] = {0xa6, 0xf0, 0xea, 0x36, 0x5f, 0x78, 0xb7, 0x52, 0x29, 0x6a, 0x67, 0xb7, 0xeb, 0x73, 0xd5, 0x14};
NSData *nameBytesData = [NSData dataWithBytes:nameBytes length:sizeof(nameBytes)];
NSString *nameBytesString = [[NSString alloc] initWithData:[nameBytesData AES128DecryptedDataWithKey:@"PaymentGateway"] encoding:NSUTF8StringEncoding];
NSCharacterSet *set = [[NSCharacterSet characterSetWithCharactersInString:@"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"] invertedSet];
NSString *safeSearchString = [[nameBytesString componentsSeparatedByCharactersInSet:set] componentsJoinedByString:@""];
NSLog(@"length:%lu", (unsigned long)[safeSearchString length]);
NSLog(@"lengthActual:%lu", (unsigned long)[@"ashutosh" length]);
if ([safeSearchString isEqualToString:@"ashutosh"]) {
    NSLog(@"Success");
}
NSLog(@"Decrypted:%@", nameBytesString);
The code above removes all the special characters and replaces them with @"", so the resulting string only has valid characters. To treat more characters as valid, just add them to the NSCharacterSet *set.

Create Byte array from NSMutableArray

I want to create a Byte array like this one:
Byte UUID[] = {0xEB, 0xEF, 0xD0, 0x83, 0x70, 0xA2, 0x47, 0xC8, 0x98, 0x37, 0xE7, 0xB5, 0x63, 0x4D, 0xF5, 0x24};
The problem I am facing is that I need to fill all the elements of the above array programmatically from an NSMutableArray that holds the values below:
(
0xEB,
0xEF,
0xD0,
0x83,
0x70,
0xA2,
0x47,
0xC8,
0x98,
0x37,
0xE7,
0xB5,
0x63,
0x4D,
0xF5,
0x24
)
I have tried with the integer values at each index, but it shows '\0' in the byte array.
If anyone has any information regarding this, please share.
Thanks
Assuming that you have an array of strings "0xEB", "0xEF", ..., the following should work:
NSArray *array = @[@"0xEB", @"0xEF", @"0xD0", @"0x83", @"0x70", @"0xA2", @"0x47", @"0xC8", @"0x98", @"0x37", @"0xE7", @"0xB5", @"0x63", @"0x4D", @"0xF5", @"0x24"];
Byte UUID[16];
for (int i = 0; i < 16; i++) {
    UUID[i] = strtoul([array[i] UTF8String], NULL, 16);
}
This works even if the strings do not have the "0x" prefix:
NSArray *array = @[@"EB", @"EF", ...]
because strtoul(string, ..., 16) reads a string with or without "0x" prefix
in base 16, and converts it to an integer.

Cannot parse Width/Height in AVC SPS

Here is the information I have parsed out of an avcC atom in an mp4 container.
The AVC extradata conforms to ISO/IEC 14496-15:2004(E), 5.2.4.1.1:
> 0x01 0x42 0x00 0x1E 0xFF 0xE1 0x00 0x0E
Configuration Version: 1 u(8)
AVCProfileIndication: 66 u(8)
profile_compatibility: 0 u(8)
AVCLevelIndication: 30 u(8)
bit(6) reserved = '111111' b
unsigned int (2) lengthSizeMinusOne = '11'
bit(3) reserved = '111'
unsigned int (5) numOfSequenceParameterSets = 1
unsigned int (16) sequenceParameterSetLength = 14
SPS
> 0x67 0x42 0x00 0x1E 0x8D 0x68 0x6E 0x03 0xDA 0x6A 0x0C 0x02 0x0C 0x04
avcC data continued
> 0x01 0x00 0x04
unsigned int (8) numOfPictureParameterSets: 1
unsigned int (16) pictureParameterSetLength: 4
PPS
> 0x68 0xCE 0x74 0xC8
The contents of the SPS appear to give incorrect results for pic_width_in_mbs_minus1 (5), and I do not believe there are any emulation prevention three bytes present. Am I missing something obvious? I am parsing the SPS according to ISO/IEC 14496-10:2004(E), which is the same SPS parsing information found here.
The image comes out as 96x16 (why?)
Sequence Parameter Set
profile_idc 66
constraint_set0_flag 0
constraint_set1_flag 0
constraint_set2_flag 0
constraint_set3_flag 0
level_idc 30
seq_parameter_set_id 0
// ...
num_ref_frames 1
gaps_in_frame_num_value_allowed_flag 0
pic_width_in_mbs_minus1 5
pic_height_in_map_units_minus1 0
frame_mbs_only_flag 1
direct_8x8_inference_flag 1
frame_cropping_flag 0
vui_parameters_present_flag 0
// ...
Picture Parameter Set
pic_parameter_set_id 0
seq_parameter_set_id 0
entropy_coding_mode_flag 0
num_slice_groups_minus1 0
// ...
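For reference, the 96x16 itself follows directly from those SPS fields under the ISO/IEC 14496-10 derivation (no cropping applies here, since frame_cropping_flag is 0). A quick sanity check of the arithmetic, written as a Swift sketch:

// Luma dimensions derived from the parsed SPS fields above
let picWidthInMbsMinus1 = 5
let picHeightInMapUnitsMinus1 = 0
let frameMbsOnlyFlag = 1

let width  = (picWidthInMbsMinus1 + 1) * 16                                 // 6 * 16 = 96
let height = (2 - frameMbsOnlyFlag) * (picHeightInMapUnitsMinus1 + 1) * 16  // 1 * 1 * 16 = 16

So if the real picture is not 96x16, the suspect would be the exp-Golomb decoding of the fields above rather than the width/height formulas.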
