Set data for each bit in NSData (iOS)

I created an NSData of length 2 bytes (16 bits), and I want to set the first 12 bits to the binary value of (int)120, and the 13th, 14th, 15th, and 16th bits each to 0 or 1.
This is what I want to do:
0000 0111 1000 --> 12 bits as 120 (int)
1 <-- 13th bit
0 <-- 14th bit
1 <-- 15th bit
1 <-- 16th bit
Expected output => 0000 0111 1000 1011: the final binary, which I then convert to NSData.
How can I do that? Please give me some advice. Thanks, all.

0000 0111 1000 1011 in binary is 0x07 0x8B as bytes.
unsigned char a[2];
a[0] = 0x07;
a[1] = 0x8b;
NSData *d = [NSData dataWithBytes:a length:2];
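If you'd rather compute those two bytes from the inputs instead of hard-coding them, a minimal sketch (assuming the 12-bit value occupies the high bits and the four flag bits follow in order):
uint16_t value = 120;                    // 0000 0111 1000
uint16_t bits  = (uint16_t)((value << 4) // make room for the four flag bits
                          | (1 << 3)     // 13th bit = 1
                          | (0 << 2)     // 14th bit = 0
                          | (1 << 1)     // 15th bit = 1
                          |  1);         // 16th bit = 1
unsigned char packed[2] = { bits >> 8, bits & 0xFF }; // big-endian byte order: 0x07, 0x8B
NSData *packedData = [NSData dataWithBytes:packed length:2];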

Recently I wrote this code for my own project; check whether it helps you.
// Create an NSMutableData with the required number of (zeroed) bytes
NSMutableData *dataBytes = [[NSMutableData alloc] initWithLength:numberOfDataBytes];
// Fetch the byte that contains the bit you want to change
char x;
[dataBytes getBytes:&x range:NSMakeRange(bytePos, 1)];
// Set the bit with a shift-and-OR
x |= 1 << (bitNum % 8);
// Put the byte back into the NSMutableData
[dataBytes replaceBytesInRange:NSMakeRange(bytePos, 1) withBytes:&x length:1];
Let me know if you need more help. :)
For changing the 13th bit based on string equality:
if ([myString isEqualToString:@"hello"])
{
    char x;
    [dataBytes getBytes:&x range:NSMakeRange(0, 1)];
    // Set the bit with a shift-and-OR
    x |= 1 << (4 % 8); // i.e. x |= 1 << 4;
    // Put the byte back into the NSMutableData
    [dataBytes replaceBytesInRange:NSMakeRange(0, 1) withBytes:&x length:1];
}
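Wrapped up as a reusable helper, a sketch of the same pattern (the 0-based, LSB-first-within-each-byte bit numbering is an assumption you may need to adapt):
// Hypothetical helper: set bit `bitNum` (0-based, LSB-first within each byte)
// in a mutable data buffer.
static void SetBitInData(NSMutableData *data, NSUInteger bitNum) {
    NSUInteger bytePos = bitNum / 8;
    char x;
    [data getBytes:&x range:NSMakeRange(bytePos, 1)];
    x |= 1 << (bitNum % 8);
    [data replaceBytesInRange:NSMakeRange(bytePos, 1) withBytes:&x length:1];
}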

This is the exact code you might need:
uint16_t x = 120; // 2-byte unsigned int    (x = 0000 0000 0111 1000)
// You need the 13th bit to be 1
x <<= 1; // shift bits to the left          (x = 0000 0000 1111 0000)
x |= 1;  // OR with 1                       (x = 0000 0000 1111 0001)
// 14th bit is 0, so shift only
x <<= 1; //                                 (x = 0000 0001 1110 0010)
// 15th bit is 1
x <<= 1; //                                 (x = 0000 0011 1100 0100)
x |= 1;  //                                 (x = 0000 0011 1100 0101)
// 16th bit is 1
x <<= 1; //                                 (x = 0000 0111 1000 1010)
x |= 1;  //                                 (x = 0000 0111 1000 1011)
// Now convert x into NSData, most significant byte first (big-endian)
NSMutableData *data = [[NSMutableData alloc] init];
int MSB = x / 256;
int LSB = x % 256;
[data appendBytes:&MSB length:1];
[data appendBytes:&LSB length:1];
// Alternatively, copy x directly; on a little-endian device (all iOS hardware)
// this reverses the byte order:
// NSMutableData *data = [[NSMutableData alloc] initWithBytes:&x length:sizeof(x)];
NSLog(@"%@", [data description]);
Output: <078b> (x = 0000 0111 1000 1011). The direct copy instead prints <8b07> on little-endian hardware.
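An alternative sketch for the big-endian conversion, if you'd rather not split the bytes by hand: CoreFoundation's byte-swap helpers produce a big-endian value regardless of the host's byte order.
#import <CoreFoundation/CFByteOrder.h>

uint16_t be = CFSwapInt16HostToBig(x);          // bytes are now 0x07, 0x8B on any host
NSData *beData = [NSData dataWithBytes:&be length:sizeof(be)];
NSLog(@"%@", beData);                           // <078b>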

Related

How to reinterpret data given by AVAudioRecorder as const char *?

PROBLEM:
The problem I am trying to solve is the following. I have audio data recorded by AVAudioRecorder. I can get the NSData with:
NSData *data = [NSData dataWithContentsOfURL:self.audioRecorder.url];
But then I need to convert/reinterpret this NSData into a const char* form that would essentially look like
00 01 00 ff
i.e. the bytes in hex, or at least an equivalent string. They don't actually have to be in hex, but they need to be convertible to hex.
QUESTION:
My question is that the NSData has "\0" bytes in it. So if I do something like this:
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
it would not work, as the data would be cut off at the first "\0". I am very new to audio files, but I think the 0x00 values come from the header. Basically, I don't want them to be interpreted as "\0" but as "00". Is there a way to do this?
Not sure I understand the question or what you are trying to do. Your memcpy will copy all the bytes into the byteData buffer; it is only when you use byteData as a C string (char*) and pass it into a format function (NSLog(@"%s", val)) that it gets cut off. If you want a string representation of the data as hex:
NSString* bytesToHex(Byte* bytes, NSUInteger count) {
    NSMutableString *hex = [NSMutableString string];
    for (int i = 0; i < count; i++) [hex appendFormat:@"%.2x ", *(bytes + i)];
    return hex;
}
NSString* dataToHex(NSData* data) {
    return bytesToHex((Byte*)data.bytes, data.length);
}
will do it, ie:
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSLog(#"%#",NSLog(#"%#", dataToHex(data));
will print:
74 00 68 00 69 00 73 00 20 69 00 73 00 20 61 00 20 74 00 65 00 73 74 00
or
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSUInteger len = [data length];
Byte *byteData = (Byte*)malloc(len);
memcpy(byteData, [data bytes], len);
NSLog(#"%#", bytesToHex(byteData, len));
will print:
74 00 68 00 69 00 73 00 20 69 00 73 00 20 61 00 20 74 00 65 00 73 74 00
Just remembered something
Even easier, if you use the NSData description property, it gives you the data in hex already!
Byte* bytes = (Byte*)"t\0h\0i\0s\0 i\0s\0 a\0 t\0e\0st";
NSData* data = [NSData dataWithBytes:bytes length:24];
NSLog(#"%#", data.description);
Will print
<74006800 69007300 20690073 00206100 20740065 00737400>
Not as pretty, but the same thing...
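For large buffers (an audio recording can run to megabytes), a sketch that builds the hex string without the intermediate malloc/memcpy, using NSData's enumerateByteRangesUsingBlock: to walk each contiguous region in order:
NSMutableString *hex = [NSMutableString stringWithCapacity:data.length * 3];
[data enumerateByteRangesUsingBlock:^(const void *bytes, NSRange byteRange, BOOL *stop) {
    const Byte *p = bytes;
    for (NSUInteger i = 0; i < byteRange.length; i++)
        [hex appendFormat:@"%.2x ", p[i]]; // same "74 00 68 ..." format as above
}];
NSLog(@"%@", hex);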

IOS:Convert hex values from the characterstic.value result

I am able to retrieve the value from the peripheral as hex, and I need to convert it as per my requirement. I can see that I have received a proper response: 01117100352e36302e313100000000e55a
01 - start byte
11 - 17 (dec) - length of response packet
71 - response ID
00 - ignore this byte
So out of the total length of 17 bytes, the first 4 are the header and the last 2 are the CRC. We need to read the remaining 11 bytes and convert them to ASCII:
35 - 5
2e - .
36 - 6
30 - 0
2e - .
31 - 1
31 - 1
So I am getting the version number from the watch as 5.60.11.
But I need to show the value 5.60.11 as a string and print it to the console. How do I convert it? Please help me.
Please try this:
NSString *strOriginalHex = @"01117100352e36302e313100000000e55a";
NSString *strNewHexForVersion = [strOriginalHex substringWithRange:NSMakeRange(8, 14)];
NSLog(@"%@", [self stringFromHexString:strNewHexForVersion]); // 5.60.11

- (NSString *)stringFromHexString:(NSString *)aStrHexString
{
    // The hex codes should all be two characters.
    if (([aStrHexString length] % 2) != 0)
        return nil;
    NSMutableString *aMutStrNewString = [NSMutableString string];
    for (NSInteger i = 0; i < [aStrHexString length]; i += 2)
    {
        NSString *hex = [aStrHexString substringWithRange:NSMakeRange(i, 2)];
        unsigned int decimalValue = 0; // %x expects an unsigned int, not an NSInteger
        sscanf([hex UTF8String], "%x", &decimalValue);
        [aMutStrNewString appendFormat:@"%c", decimalValue];
    }
    return aMutStrNewString;
}
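If you already hold the raw characteristic.value NSData, a sketch that skips the hex round-trip entirely (the 4-byte header offset and the 7-byte version length are taken from the packet layout above):
NSData *payload = characteristic.value;
NSData *versionData = [payload subdataWithRange:NSMakeRange(4, 7)]; // skip the header, read "5.60.11"
NSString *version = [[NSString alloc] initWithData:versionData encoding:NSASCIIStringEncoding];
NSLog(@"%@", version); // 5.60.11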

simple midi file writer in Objective C

I'm writing a program in Objective-C to generate a MIDI file. As a test, I'm asking it to write a file that plays one note and stops it one delta tick afterwards.
But when I try to open it with Logic or Sibelius, they both say that the file is corrupted.
Here's the hex readout of the file:
4D 54 68 64 00 00 00 06 00 01 00 01 00 40   - MThd header
4D 54 72 6B 00 00 00 0D                     - MTrk, with a length of 13 as a 32-bit value [00 00 00 0D]
81 00 90 48 64 82 00 80 48 64               - the track: delta, noteOn, delta, noteOff
FF 2F 00                                    - end of track
And here's my routines to write the delta time, and write the note -
// Generate a MIDI note event and append it to the 'track' NSMutableData object
- (void)appendNote:(int)note state:(BOOL)on isMelody:(BOOL)melodyNote {
    char c[3];
    if (on) {
        c[0] = 0x90;        // note on, channel 0
        c[2] = volume;
    } else {
        c[0] = 0x80;        // note off, channel 0
        c[2] = lastVolume;
    }
    c[1] = note;
    [track appendBytes:c length:3];
}

// Generate a MIDI variable-length delta time and append it to 'track'
- (void)writeVarTime:(int)value {
    char c[2];
    if (value < 128) {
        c[0] = value;
        [track appendBytes:c length:1];
    } else {
        c[0] = value / 128 | 0x80;  // continuation bit set on the first byte
        c[1] = value % 128;
        [track appendBytes:c length:2];
    }
}
are there any clever MIDI gurus out there who can tell what's wrong with this MIDI file?
The delta time of the end-of-track event is missing: every event in an MTrk chunk, including the final FF 2F 00 meta event, must be preceded by a delta time.
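In terms of the routines above, a sketch of the fix: write a (zero) delta time before appending the end-of-track meta event, and remember to count those extra bytes in the MTrk length (0x0D becomes 0x0E):
[self writeVarTime:0];                     // delta time for the end-of-track event
char eot[3] = { (char)0xFF, 0x2F, 0x00 };
[track appendBytes:eot length:3];          // FF 2F 00 - end of track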

How to read a NSInputStream with UTF-8?

I am trying to read a large file on iOS using NSInputStream and split it into lines at the newline characters (I don't want to use componentsSeparatedByCharactersInSet: as it uses too much memory).
But as not all lines seem to be UTF-8 encoded (they can appear to be plain ASCII, same bytes), I often get the warning: Incorrect NSStringEncoding value 0x0000 detected. Assuming NSASCIIStringEncoding. Will stop this compatibility mapping behavior in the near future.
My question is: is there a way to suppress this warning, e.g. by setting a compiler flag?
Furthermore: is it safe to append/concatenate two buffer reads? Reading from the byte stream, converting each buffer to a string, and then appending the strings could corrupt the result.
Below is an example method demonstrating that the byte-to-string conversion will discard the first and second halves of a multi-byte UTF-8 character as invalid.
- (void)NSInputStreamTest {
    uint8_t testString[] = {0xd0, 0x91}; // @"Б"

    // Test 1: read at most 1 byte at a time of the UTF-8 string
    uint8_t buf1[1], buf2[1];
    NSString *s1, *s2, *s3;
    NSInteger c1, c2;
    NSInputStream *inStream = [[NSInputStream alloc] initWithData:[[NSData alloc] initWithBytes:testString length:2]];
    [inStream open];
    c1 = [inStream read:buf1 maxLength:1];
    s1 = [[NSString alloc] initWithBytes:buf1 length:1 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 1: Read %ld byte(s): %@", (long)c1, s1);
    c2 = [inStream read:buf2 maxLength:1];
    s2 = [[NSString alloc] initWithBytes:buf2 length:1 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 1: Read %ld byte(s): %@", (long)c2, s2);
    s3 = [s1 stringByAppendingString:s2];
    NSLog(@"Test 1: Concatenated: %@", s3);
    [inStream close];

    // Test 2: read at most 2 bytes at a time of the UTF-8 string
    uint8_t buf4[2];
    NSString *s4;
    NSInteger c4;
    NSInputStream *inStream2 = [[NSInputStream alloc] initWithData:[[NSData alloc] initWithBytes:testString length:2]];
    [inStream2 open];
    c4 = [inStream2 read:buf4 maxLength:2];
    s4 = [[NSString alloc] initWithBytes:buf4 length:2 encoding:NSUTF8StringEncoding];
    NSLog(@"Test 2: Read %ld byte(s): %@", (long)c4, s4);
    [inStream2 close];
}
Output:
2013-02-10 21:16:23.412 Test[11144:c07] Test 1: Read 1 byte(s): (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 1: Read 1 byte(s): (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 1: Concatenated: (null)
2013-02-10 21:16:23.413 Test[11144:c07] Test 2: Read 2 byte(s): Б
First of all, in the line s3 = [s1 stringByAppendingString:s2]; you are trying to concatenate two nil values, so the result is nil as well. You may want to concatenate the bytes instead of the strings:
uint8_t buf3[2];
buf3[0] = buf1[0];
buf3[1] = buf2[0];
s3 = [[NSString alloc] initWithBytes:buf3 length:2 encoding:NSUTF8StringEncoding];
Output:
2015-11-06 12:57:40.304 Test[10803:883182] Test 1: Read 1 byte(s): (null)
2015-11-06 12:57:40.305 Test[10803:883182] Test 1: Read 1 byte(s): (null)
2015-11-06 12:57:40.305 Test[10803:883182] Test 1: Concatenated: Б
Second, a UTF-8 character can be 1 to 6 bytes long (RFC 3629 restricts valid UTF-8 to at most 4 bytes, but the original specification allowed up to 6):
(1 byte)  0aaa aaaa // if the symbol lies in 0x00 .. 0x7F (ASCII)
(2 bytes) 110x xxxx 10xx xxxx
(3 bytes) 1110 xxxx 10xx xxxx 10xx xxxx
(4 bytes) 1111 0xxx 10xx xxxx 10xx xxxx 10xx xxxx
(5 bytes) 1111 10xx 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx
(6 bytes) 1111 110x 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx 10xx xxxx
So, if you intend to read raw bytes from an NSInputStream and then translate them into a UTF-8 NSString, you probably want to read byte by byte until you get a valid string:
#define MAX_UTF8_BYTES 6
NSString *utf8String;
NSMutableData *_data = [[NSMutableData alloc] init]; // for easy appending of bytes
int bytes_read = 0;
while (!utf8String) {
    if (bytes_read > MAX_UTF8_BYTES) {
        NSLog(@"Can't decode input byte array into UTF8.");
        return;
    }
    else {
        uint8_t byte[1];
        [_inputStream read:byte maxLength:1];
        [_data appendBytes:byte length:1];
        // initWithBytes:length:encoding: returns nil until the accumulated
        // bytes form valid UTF-8 (the data is not NUL-terminated, so
        // stringWithUTF8String: must not be used here)
        utf8String = [[NSString alloc] initWithBytes:[_data bytes]
                                              length:[_data length]
                                            encoding:NSUTF8StringEncoding];
        bytes_read++;
    }
}
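Rather than trial-decoding after every byte, a sketch that reads the expected sequence length straight from the lead byte (assuming well-formed input; continuation bytes are not validated here):
// Number of bytes in the UTF-8 sequence that starts with `lead`,
// or 0 if `lead` is not a valid lead byte (e.g. a continuation byte 10xx xxxx).
static int UTF8SequenceLength(uint8_t lead) {
    if ((lead & 0x80) == 0x00) return 1; // 0xxx xxxx - ASCII
    if ((lead & 0xE0) == 0xC0) return 2; // 110x xxxx
    if ((lead & 0xF0) == 0xE0) return 3; // 1110 xxxx
    if ((lead & 0xF8) == 0xF0) return 4; // 1111 0xxx
    return 0;
}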
ASCII (and hence the newline character) is a subset of UTF-8, so there should not be any conflict.
It should be possible to divide your stream at the newline characters, as you would in a simple ASCII stream. Then you can convert each chunk ("line") into an NSString using UTF-8.
Are you sure the encoding errors are not real, i.e., that your stream may actually contain erroneous characters with respect to a UTF-8 encoding?
Edited to add from the comments: this presumes that each line consists of few enough characters to keep the whole line in memory before converting it from UTF-8. A minimal sketch of the approach follows.
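The sketch assumes \n line endings and lines that fit in memory (error handling omitted):
uint8_t buf[4096];
NSMutableData *line = [NSMutableData data];
NSInteger n;
while ((n = [inputStream read:buf maxLength:sizeof(buf)]) > 0) {
    for (NSInteger i = 0; i < n; i++) {
        if (buf[i] == '\n') {
            // A complete line is safe to decode: a multi-byte UTF-8
            // sequence can never contain the byte 0x0A.
            NSString *s = [[NSString alloc] initWithData:line
                                                encoding:NSUTF8StringEncoding];
            // ... process s ...
            [line setLength:0];
        } else {
            [line appendBytes:&buf[i] length:1];
        }
    }
}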

Cannot parse Width/Height in AVC SPS

Here is the information I have parsed out of an avcC atom in an MP4 container.
The AVC extradata conforms to ISO/IEC 14496-15:2004(E) 5.2.4.1.1:
0x01 0x42 0x00 0x1E 0xFF 0xE1 0x00 0x0E
Configuration Version: 1                           u(8)
AVCProfileIndication: 66                           u(8)
profile_compatibility: 0                           u(8)
AVCLevelIndication: 30                             u(8)
bit(6) reserved = '111111'b
unsigned int (2) lengthSizeMinusOne = '11'
bit(3) reserved = '111'
unsigned int (5) numOfSequenceParameterSets = 1
unsigned int (16) sequenceParameterSetLength = 14
SPS:
0x67 0x42 0x00 0x1E 0x8D 0x68 0x6E 0x03 0xDA 0x6A 0x0C 0x02 0x0C 0x04
avcC data, continued:
0x01 0x00 0x04
unsigned int (8) numOfPictureParameterSets: 1
unsigned int (16) pictureParameterSetLength: 4
PPS:
0x68 0xCE 0x74 0xC8
The contents of the SPS appear to give an incorrect result for pic_width_in_mbs_minus1 (5), and I do not believe there are any emulation_prevention_three_byte insertions. Am I missing something obvious? I am parsing the SPS according to ISO/IEC 14496-10:2004(E), which is the same SPS parsing information found here.
The image comes out as 96x16 (why?):
Sequence Parameter Set
profile_idc 66
constraint_set0_flag 0
constraint_set1_flag 0
constraint_set2_flag 0
constraint_set3_flag 0
level_idc 30
seq_parameter_set_id 0
// ...
num_ref_frames 1
gaps_in_frame_num_value_allowed_flag 0
pic_width_in_mbs_minus1 5
pic_height_in_map_units_minus1 0
frame_mbs_only_flag 1
direct_8x8_inference_flag 1
frame_cropping_flag 0
vui_parameters_present_flag 0
// ...
Picture Parameter Set
pic_parameter_set_id 0
seq_parameter_set_id 0
entropy_coding_mode_flag 0
num_slice_groups_minus1 0
// ...
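For reference, SPS fields such as pic_width_in_mbs_minus1 are ue(v) Exp-Golomb coded; below is a minimal sketch of a bit reader for them (no emulation-prevention-byte stripping, matching the assumption above):
typedef struct { const uint8_t *buf; size_t bitPos; } BitReader;

static uint32_t ReadBit(BitReader *r) {
    uint32_t bit = (r->buf[r->bitPos >> 3] >> (7 - (r->bitPos & 7))) & 1;
    r->bitPos++;
    return bit;
}

// ue(v): count leading zero bits, then read that many bits after the first 1;
// codeNum = 2^zeros - 1 + value of the trailing bits.
static uint32_t ReadUE(BitReader *r) {
    int zeros = 0;
    while (ReadBit(r) == 0) zeros++;
    uint32_t value = 0;
    for (int i = 0; i < zeros; i++) value = (value << 1) | ReadBit(r);
    return value + (1u << zeros) - 1;
}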
