To control a Behringer X32 audio mixer, I have to send an OSC message like /ch/01/mix/fader ,f .3 to move a fader to 30%. The mixer, per the OSC protocol, expects the .3 to arrive as four raw bytes (a big-endian IEEE-754 float) - in hex, 3E 99 99 9A. So characters above $7F are involved.
TIdUDPClient is given the characters for 3E 99 99 9A, but it sends out 3E 3F 3F 3F. Likewise .4 wants to be 3E CC CC CD but 3E 3F 3F 3F is sent.
When you get up to .5 and greater, the leading bytes go through again, since they fall in the ASCII range ($7F and below). For example, .6 should be 3F 19 99 9A and goes out as 3F 19 3F 3F.
Evidently the Behringer is only looking at the first two characters there.
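(For reference, those expected bytes are just the IEEE-754 bit pattern of the float; a quick console sketch to verify them:)
program FloatBits;
{$APPTYPE CONSOLE}

uses
  SysUtils;

var
  F: Single;
begin
  F := 0.3;
  // IntToHex prints the most significant byte first, i.e. the
  // big-endian order OSC expects on the wire
  Writeln(IntToHex(PLongWord(@F)^, 8)); // prints 3E99999A
end.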
I am using Delphi Rio with the Indy 10 version distributed with it. I can create a module in Lazarus with lNet that works fine, but my main application is in Delphi, where I need this ability. As you can see, I've tried several different ways with the same non-working result.
How do I send the proper characters?
procedure TCPForm1.OSCSendMsg;
var
  OutValueStr: String;
  I: Integer;
  J: TBytes;
  B1: TIdBytes;
begin
  if Length(CommandStr) > 0 then begin
    OscCommandStr := PadStr(CommandStr); // convert CommandStr to OSC string
    if TypeStr = '' then
      OscCommandStr := OscCommandStr + ',' + #0 + #0 + #0;
    if Length(TypeStr) = 1 then begin
      if TypeStr = 'i' then begin // parameter is an integer
        I := SwapEndian(IValue); // change to big endian
        OscCommandStr := OscCommandStr + ',' + TypeStr + #0 + #0 + IntToCharStr(I);
        OutValueStr := IntToStr(IValue);
      end;
      if TypeStr = 'f' then begin // parameter is a float (real)
        I := SwapEndian(PInteger(@FValue)^); // typecast & change to big endian
        //I := htonl(PInteger(@FValue)^); // typecast & change to big endian
        //J := MakeOSCFloat(FValue);
        OscCommandStr := OscCommandStr + ',' + TypeStr + #0 + #0 + IntToCharStr(I);
        //OscCommandStr := OscCommandStr + ',' + TypeStr + #0 + #0 + char(J[0]) + char(J[1]) + char(J[2]) + char(J[3]);
        OutValueStr := FloatToStr(FValue);
      end;
    end;
    //IdUDPClient2.Send(OSCCommandStr, IndyTextEncoding_UTF8);
    //IdUDPClient2.Send(OSCCommandStr);
    B1 := ToBytes(OSCCommandStr);
    IdUDPClient2.SendBuffer(B1);
    if loglevel > 0 then logwrite('OSC= ' + hexstr(OSCCommandStr));
    Wait(UDPtime);
    //if loglevel > 0 then logwrite('OSC ' + OSCCommandStr);
  end;
end;
function TCPForm1.IntToCharStr(I: Integer): String;
var
  CharStr: String;
  MyArray: array [0..3] of Byte;
  J: Integer;
begin
  for J := 0 to 3 do MyArray[J] := 0;
  Move(I, MyArray, 4); // typecast conversion from integer to array of byte
  CharStr := '';
  for J := 0 to 3 do // convert array of byte to string
    CharStr := CharStr + char(MyArray[J]);
  IntToCharStr := CharStr;
end;
UPDATE:
The system would not let me add this as an answer, so...
Thank you, Remy. At least as far as the X32 software simulator is concerned, adding the 8 bit text encoding gives the correct response. I'll have to wait until tomorrow to test on the actual mixer in the theater. A byte array might be better if we had control of both ends of the communication. As it is, I can't change the X32 and it wants to get a padded string (in Hex: 2F 63 68 2F 30 31 2F 6D 69 78 2F 66 61 64 65 72 00 00 00 00 2C 66 00 00 3E CC CC CD) for the text string "/ch/01/mix/fader ,f .4". The documentation of messages the X32 responds to is a long table of similar messages with different parameters. e.g. "/ch/01/mix/mute on", "/bus/1/dyn/ratio ,i 2", etc. This is all in accordance with the Open Sound Control protocol.
As always, you are the definitive source of Indy wisdom, so, thank you. I'll edit this note after my results with the actual device.
UPDATE:
Confirmed that the addition of the 8-bit text encoding to the Send command works with the X32. Cheers! A couple of questions as a result of this:
Is one send construct preferred over the other?
Where should I have read/learned more about these details of Indy?
3F is the ASCII '?' character. You are seeing that character being sent when a Unicode character is encoded to a byte encoding that doesn't support that Unicode character. For example, Indy's default text encoding is US-ASCII unless you specify otherwise (via the GIdDefaultTextEncoding variable in the IdGlobal.pas unit, or via various class properties or method parameters), and US-ASCII does not support Unicode characters > U+007F.
It seems like you are dealing with a binary protocol, not a text protocol, so why are you using strings to create its messages? I would think byte arrays would make more sense.
At the very least, try using Indy's 8-bit text encoding (via the IndyTextEncoding_8Bit() function in the IdGlobal.pas unit) to convert Unicode characters U+0000..U+00FF to bytes 0x00..0xFF without data loss, eg:
B1 := ToBytes(OSCCommandStr, IndyTextEncoding_8Bit); // not ASCII or UTF8!
IdUDPClient2.SendBuffer(B1);
Or, in a single call:
IdUDPClient2.Send(OSCCommandStr, IndyTextEncoding_8Bit); // not ASCII or UTF8!
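For reference, a minimal sketch of the byte-array approach suggested above, built with IdGlobal's helpers; PadTo4 and SendOSCFloat are illustrative names, not Indy or X32 APIs:
uses IdGlobal, IdUDPClient;

// NUL-terminate and pad a field to the next 4-byte boundary, per OSC rules
function PadTo4(const ABytes: TIdBytes): TIdBytes;
begin
  Result := ABytes;
  repeat
    AppendByte(Result, 0);
  until (Length(Result) mod 4) = 0;
end;

// Build and send "<address> ,f <value>" as one OSC packet of raw bytes
procedure SendOSCFloat(AClient: TIdUDPClient; const AAddress: string; AValue: Single);
var
  Packet: TIdBytes;
  Bits: LongWord;
begin
  Packet := PadTo4(ToBytes(AAddress, IndyTextEncoding_ASCII));
  AppendBytes(Packet, PadTo4(ToBytes(',f', IndyTextEncoding_ASCII)));
  Bits := PLongWord(@AValue)^;               // IEEE-754 bits of the float
  AppendByte(Packet, (Bits shr 24) and $FF); // append in big-endian order
  AppendByte(Packet, (Bits shr 16) and $FF);
  AppendByte(Packet, (Bits shr 8) and $FF);
  AppendByte(Packet, Bits and $FF);
  AClient.SendBuffer(Packet);
end;
SendOSCFloat(IdUDPClient2, '/ch/01/mix/fader', 0.4) would then produce exactly the 28-byte packet shown in the update above, with no text encoding involved at all.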
I recently received some Delphi 2007 source code that was written by another developer.
I noticed when I click on a component icon in the IDE that Delphi generates the stub code as you would expect, HOWEVER it is 'stealing' the first character from the next procedure or function and placing that character in front of the generated code.
For example, when clicking on the RaizeObjects Launcher component icon in the IDE, I get this generated code with an "f" stolen from the next function (or it would be a "p" if the next item was a procedure):
fprocedure TFLogin.RzLauncher1Error(Sender: TObject; ErrorCode: Cardinal);
begin
end;
unction TFLogin.DelDir(dir: string): Boolean;
var
Of course, this corrupts my source code every time I click on a component icon in the IDE.
I did a hex dump on the source and found that these source files only have a linefeed (0A) in them and not the Carriage Return (0D) & Linefeed (0A) that my locally produced code has (examples below).
Hex dump of a typical source file Delphi normally produces:
unit Unit1; (CR & LF)
75 6E 69 74 20 55 6E 69 74 31 3B 0D 0A
Example source code file from other person (hex dump):
unit Calc; (LF)
75 6E 69 74 20 43 61 6C 63 3B 0A
Saving the source in my editor does not fix it. My question is, how do I fix this? Is there a setting in Delphi 2007 telling it to use just linefeeds, or some Windows 7 setting to adjust for this?
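Since the root cause is LF-only files, one low-tech fix is to rewrite each offending unit with CRLF line endings before opening it in the IDE. A minimal sketch (the file name is taken from the example above):
program FixLineEndings;
{$APPTYPE CONSOLE}

uses
  Classes;

var
  SL: TStringList;
begin
  // LoadFromFile accepts LF or CRLF line breaks; SaveToFile writes the
  // platform default (CRLF on Windows), which normalizes the file
  SL := TStringList.Create;
  try
    SL.LoadFromFile('Calc.pas');
    SL.SaveToFile('Calc.pas');
  finally
    SL.Free;
  end;
end.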
I am asked to represent a set of char as a "memory map". How are the chars of the set laid out? The teacher told us to use ASCII codes, in a set of 32 bytes.
I have this example: the set {'A', 'B', 'C'}
(The 7 comes from 0111)
= {00 00 00 00 00 00 00 00 70 00
00 00 00 00 00 00 00 00 00 00
00}
Sets in pascal can be represented in memory with one bit for every element; if the bit is 1, the element is present in the set.
A "set of char" is the set of ascii char, where each element has an ordinal value from 0 to 255 (it should be 127 for ascii, but often this set is extended up to a byte, so there are 256 different characters).
Hence a "set of char" is represented in memory as a block of 32 bytes which contain a total of 256 bits. The character "A" (upper case A) has an ordinal value of 65. The integer division of 65 by 8 (the number of bits a byte can hold) gives 8. So the bit representing "A" in the set resides in the byte number 8. 65 mod 8 gives 1, which is the second bit in that byte.
The byte number 8 will have the second bit ON for the character A (and the third bit for B, and the fourth for C). All the three characters together give the binary representation of 0000.1110 ($0E in hex).
To demonstrate this, I tried the following program with turbo pascal:
var
  ms : set of char;
  p  : array[0..31] of byte absolute ms;
  i  : integer;
begin
  ms := ['A'..'C'];
  for i := 0 to 31 do begin
    if i mod 8 = 0 then writeln;
    write(i, '=', p[i], ' ');
  end;
  writeln;
end.
The program prints the value of all 32 bytes in the set, thanks to the "absolute" keyword. Other versions of pascal can do it using different methods. Running the program gives this result:
0=0 1=0 2=0 3=0 4=0 5=0 6=0 7=0
8=14 9=0 10=0 11=0 12=0 13=0 14=0 15=0
16=0 17=0 18=0 19=0 20=0 21=0 22=0 23=0
24=0 25=0 26=0 27=0 28=0 29=0 30=0 31=0
where you see that the only byte different than 0 is the byte number 8, and it contains 14 ($0E in hex, 0000.1110). So, your guess (70) is wrong.
That said, I must add that nobody can state this is always true, because a set in pascal is implementation dependent; so your answer could also be right. The representation used by turbo pascal (on dos/windows) is the most logical one, but this does not exclude other possible representations.
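As a worked check of the arithmetic above, here is a tiny sketch in the same dialect that prints the byte and bit position of each character in the example set:
program SetBitPos;
var
  c : char;
begin
  { each char occupies bit (ord(c) mod 8) of byte (ord(c) div 8) }
  for c := 'A' to 'C' do
    writeln(c, ': byte ', ord(c) div 8, ', bit ', ord(c) mod 8);
end.
It prints byte 8 with bits 1, 2 and 3, i.e. the value 2 + 4 + 8 = 14 ($0E) seen above.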
How can I tell whether a stream contains a picture or not? I am working with Delphi XE8 FMX, developing an iOS application. I have a listbox and am loading pictures into its items.
I can do this:
if not Assigned(S) then
  S := TMemoryStream.Create;
if not Assigned(clHTTP) then
  clHTTP := TIdHTTP.Create;
clHTTP.HandleRedirects := True;
clHTTP.AllowCookies := True;
clHTTP.RedirectMaximum := 110000;
clHTTP.Get(someimageURL, S);
S.Seek(0, soFromBeginning);
try
  LItem.ItemData.Bitmap.LoadFromStream(S);
except
  clHTTP.Get(DefaultImageURL, S);
  S.Seek(0, soFromBeginning);
  LItem.ItemData.Bitmap.LoadFromStream(S);
end;
S.Free;
clHTTP.Free;
I would prefer not to use a try-except block because it appears this causes loading of the bitmaps to be inconsistent. For example, I have to scroll the listbox items out of view, then back into view to see the pictures.
A quick way would be to check for the file signature. For instance, here are some common image format signatures:
PNG: 89 50 4E 47 0D 0A 1A 0A
JPEG (JFIF): FF D8 FF E0 (other JPEG variants also begin with FF D8 FF)
GIF87a: 47 49 46 38 37 61
GIF89a: 47 49 46 38 39 61
A comprehensive list can be found here: http://en.wikipedia.org/wiki/List_of_file_signatures
This approach will not prove that the rest of the stream is valid, but it is a good start and will allow you at least to reject obvious ringers.
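A minimal Delphi sketch of that check, testing only the signatures listed above (the TImageKind type and function name are illustrative):
uses System.Classes, System.SysUtils;

type
  TImageKind = (ikUnknown, ikPNG, ikJPEG, ikGIF);

function DetectImageKind(S: TStream): TImageKind;
const
  PNGSig: array[0..7] of Byte = ($89, $50, $4E, $47, $0D, $0A, $1A, $0A);
var
  Buf: array[0..7] of Byte;
  OldPos: Int64;
  N: Integer;
begin
  Result := ikUnknown;
  OldPos := S.Position;
  try
    N := S.Read(Buf, SizeOf(Buf));
    if (N >= 8) and CompareMem(@Buf, @PNGSig, 8) then
      Result := ikPNG
    else if (N >= 4) and (Buf[0] = $FF) and (Buf[1] = $D8) and
            (Buf[2] = $FF) and (Buf[3] = $E0) then
      Result := ikJPEG
    else if (N >= 6) and (Buf[0] = $47) and (Buf[1] = $49) and (Buf[2] = $46) and
            (Buf[3] = $38) and (Buf[4] in [$37, $39]) and (Buf[5] = $61) then
      Result := ikGIF;
  finally
    S.Position := OldPos; // rewind so the caller can still load the image
  end;
end;
With this, the try-except fallback becomes an ordinary if: after the first Get, call DetectImageKind, and only fetch DefaultImageURL when it returns ikUnknown, before calling LoadFromStream.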
I'm having trouble decoding/encoding a base64 string because of the CRLF in it.
I've tried this lib Base64.h and this one NSData+Base64.h but both do not handle well the CRLF.
Has anyone had this problem before?
Does anyone have advice on how to avoid these CRLFs? I think Android's Java lib is replacing them with a '0', am I correct?
public static final int CRLF = 4;
Base64 encodes binary data using an alphabet of 64 characters, namely 'A-Za-z0-9+/', with a possible trailing '=' to indicate an input whose length is not a multiple of 3. CR+LF may be used as a line separator; generally, decode each line separately.
See Wikipedia Base64 for more information on CR+LF variants.
"+vqbiP7s3oe7/puJ8v2a3fOYnf3vmpap"
decoded is:
"FA FA 9B 88 FE EC DE 87 BB FE 9B 89 F2 FD 9A DD F3 98 9D FD EF 9A 96 A9"
The last character is not 0.
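The question is about the iOS libraries, but the fix is the same everywhere: remove the CR/LF line separators (they are not part of the alphabet) before decoding. A sketch in this page's Delphi context, using the standard System.NetEncoding unit:
uses System.SysUtils, System.NetEncoding;

function DecodeBase64IgnoringLineBreaks(const Input: string): TBytes;
var
  Clean: string;
begin
  // Strip the CR and LF line separators; they carry no payload data
  Clean := StringReplace(Input, #13, '', [rfReplaceAll]);
  Clean := StringReplace(Clean, #10, '', [rfReplaceAll]);
  Result := TNetEncoding.Base64.DecodeStringToBytes(Clean);
end;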
I have a problem using Indy (the TIdTCPServer component) to read data sent by a client. The data itself is binary, so I can't use AThread.Connection.ReadLn() for that...
Here is my sample data sent by the client:
24 24 00 11 12 34 56 FF FF FF FF 50 00 8B 9B 0D 0A
or
24 24 00 13 12 34 56 FF FF FF FF 90 02 00 0A 8F D4 0D 0A
PS: the data is shown here as hexadecimal bytes (the length may vary depending on the command, 160 bytes at maximum). I can't get a string representation of it, since $00 translates to a null (which means I can't use ReadLn).
Here is my sample code:
procedure TfrmMain.IdTCPServerExecute(AThread: TIdPeerThread);
var
  Msg: array[0..255] of Byte;
begin
  AThread.Connection.ReadBuffer(Msg, SizeOf(Msg));
  AThread.Connection.WriteBuffer(Msg, SizeOf(Msg), True);
end;
This code will not work unless the client sends the full 256 bytes, and in my case the data length varies. I have tried the following, but no response is output:
procedure TfrmMain.IdTCPServerExecute(AThread: TIdPeerThread);
var
  Msg: array of Byte;
  MsgSize: Integer;
begin
  MsgSize := AThread.Connection.ReadInteger; // doesn't actually get packet length?
  SetLength(Msg, MsgSize);
  AThread.Connection.ReadBuffer(Msg, MsgSize);
  AThread.Connection.WriteBuffer(Msg, MsgSize, True);
end;
So how exactly can I count how many bytes the client sent (the packet length)? Or could someone show me the right code to read the data?
The simple answer is: you can't. TCP is a stream protocol, so there is no concept of a message. The data is received in chunks, whose sizes may (and will) differ from the actually sent buffers (the network stack is free to slice or merge the stream at will).
You may build a message protocol on top of TCP, e.g. by starting the transmission, and every subsequent message, with a "size field" and then waiting only for the bytes needed; you still need to check the actual size received and keep reading until you have the rest, if applicable.
The point is: packet length in the TCP world has nothing to do with the length of the sent messages.
What TIdTCPConnection does behind all the Read methods is: it reads all available data from the network stack, appends it to an internal input buffer, and returns the requested N bytes from the beginning of that buffer if they are available (waiting for the next chunk if not).
The 3rd and 4th bytes in the data you have shown specify the total size of the data being sent. You were close with ReadInteger(), but the way you used it pulls the 1st and 2nd bytes into the length as well, which is wrong. Try this instead:
procedure TfrmMain.IdTCPServerExecute(AThread: TIdPeerThread);
var
  Unknown: Smallint;  // maybe a msg type?
  DataSize: Smallint;
  Data: array of Byte;
begin
  Unknown := AThread.Connection.ReadSmallInt;
  DataSize := AThread.Connection.ReadSmallInt - 4;
  SetLength(Data, DataSize);
  AThread.Connection.ReadBuffer(Data[0], DataSize);
  //...
end;