Byte array to int16 conversion - F#

How would one take a byte array and convert it into a multidimensional integer array? The real data would actually be the result of a System.IO.BinaryReader using ReadBytes. The function would take the byte array and output the new array.
For instance, the given data would have the first 4 elements in a 4xN array of
256, 328, 344, 546
The following conversion works for a single pair of bytes:
let value = int bytedata.[1] ||| (int bytedata.[0] <<< 8)
I can do it with looping; however, it seems that F# should make this easier.
let realdata =
    [|1uy; 0uy; 1uy; 48uy; 0uy; 158uy; 0uy; 222uy; 0uy; 250uy; 0uy; 0uy; 0uy;
      151uy; 2uy; 238uy; 3uy; 31uy; 1uy; 191uy; 1uy; 228uy; 1uy; 62uy; 1uy;
      111uy; 0uy; 247uy; 1uy; 183uy; 0uy; 83uy; 0uy; 213uy; 2uy; 197uy; 2uy;
      161uy; 1uy; 7uy; 0uy; 201uy; 1uy; 48uy; 0uy; 166uy; 0uy; 133uy; 1uy; 40uy;
      0uy; 150uy; 0uy; 193uy; 2uy; 207uy; 2uy; 217uy; 1uy; 158uy; 1uy; 53uy; 1uy;
      38uy; 0uy; 141uy; 0uy; 162uy; 1uy; 23uy; 0uy; 0uy; 0uy; 128uy; 2uy; 223uy;
      2uy; 204uy; 1uy; 236uy; 2uy; 20uy; 1uy; 56uy; 0uy; 221uy; 0uy; 235uy; 1uy;
      118uy; 0uy; 29uy; 0uy; 173uy; 2uy; 58uy; 2uy; 27uy; 1uy; 56uy;|]
let convertBytes (data : byte[]) =
    seq {
        let i = ref 0
        while !i < 10 do
            if !i % 2 = 0 then
                yield int (data.[!i] ||| (data.[!i+1] <<< 8))
            i := !i + 2
    }
However, I'm yielding a byte array rather than an int16 array.
Don

First of all, I will assume you want a single-dimension array; at least, that's what your sample code is producing.
Second, your conversion math seems wrong to me: if [1uy; 48uy] gives 49, what would [0uy; 49uy] give? It looks like you're adding rather than converting.
So correct me if I'm wrong, but I think this is what you need:
open System
let result =
    [|0..2..Array.length realdata - 1|]
    |> Array.map (fun i -> BitConverter.ToInt16(realdata, i))
// val result : int16 [] = [|1s; 12289s; -25088s; -8704s; -1536s; 0s; ...
Alternatively, if the data is big-endian:
let result =
    [|0..2..Array.length realdata - 1|]
    |> Array.map (fun i -> (int16 (realdata.[i]) <<< 8) ||| int16 (realdata.[i+1]))
Or you can replace the expression in the lambda with your conversion if it makes sense for you.
UPDATE
From your comments, you may want to do something like this:
let result =
    Array2D.init 8 (realdata.Length / 16) (fun i j ->
        (int16 (realdata.[j * 16 + i*2]) <<< 8) ||| int16 (realdata.[j * 16 + i*2 + 1]))
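As a rough end-to-end sketch of the BinaryReader step mentioned in the question (assuming the whole stream fits in memory, a hypothetical file path, and the same big-endian, 16-byte-per-column layout as above):
open System.IO
// Hypothetical helper: read the raw bytes with a BinaryReader, then reshape them
// into an 8 x N Array2D of big-endian int16 values.
let readChannels (path : string) =
    use reader = new BinaryReader(File.OpenRead(path))
    let data = reader.ReadBytes(int reader.BaseStream.Length)
    Array2D.init 8 (data.Length / 16) (fun i j ->
        (int16 data.[j * 16 + i * 2] <<< 8) ||| int16 data.[j * 16 + i * 2 + 1])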

Related

Swift: extracting bytes from struct Data

Data is arriving in the form of a struct Data with count == 5; the last 3 bytes contain an ID that needs to be extracted.
The following code works; however, I am sure it could be greatly improved (as you can tell, I am new to Swift!):
var data:[UInt8] = [ 0xff, 0xff, 0x01, 0x02 ,0x03 ]
var txData = Data(bytes: data)
print(txData.count)
let byte00 = txData.subdata(in: 2..<3).withUnsafeBytes { (ptr: UnsafePointer<UInt8>) -> UInt8 in
    return ptr.pointee
}
let byte01 = txData.subdata(in: 3..<4).withUnsafeBytes { (ptr: UnsafePointer<UInt8>) -> UInt8 in
    return ptr.pointee
}
let byte02 = txData.subdata(in: 4..<5).withUnsafeBytes { (ptr: UnsafePointer<UInt8>) -> UInt8 in
    return ptr.pointee
}
let rxData = (UInt(byte00) << 16) + (UInt(byte01) << 8) + UInt(byte02)
print( String(rxData, radix:16) )
Any tutorial recommendations covering this area of Swift would be greatly appreciated.
You can write something like this:
var data:[UInt8] = [ 0xff, 0xff, 0x01, 0x02 ,0x03 ]
var txData = Data(bytes: data)
print(txData.count)
let byte00 = txData[2]
let byte01 = txData[3]
let byte02 = txData[4]
let rxData = (UInt(byte00) << 16) + (UInt(byte01) << 8) + UInt(byte02)
print( String(rxData, radix:16) ) //->10203
In Swift 3, Data can be treated as a Collection of UInt8, so you can subscript Data directly to get each byte as a UInt8.
And since 3 bytes is not a natural width for current CPUs, the code above cannot get much shorter.

How to convert an Int to Hex String in Swift

In Obj-C I used to convert an unsigned integer n to a hex string with
NSString *st = [NSString stringWithFormat:@"%2X", n];
I tried for a long time to translate this into Swift, but without success.
You can now do:
let n = 14
var st = String(format:"%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
0E is the hexadecimal representation of 14
Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0's if necessary. (Without the 0, the result would be padded with leading spaces). Of course, if the result is larger than two characters, the field length will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.
Use uppercase X if you want A...F and lowercase x if you want a...f:
String(format: "%x %X", 64206, 64206) // "face FACE"
If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:
let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615
Original Answer
You can still use NSString to do this. The format is:
var st = NSString(format:"%2X", n)
This makes st an NSString, so things like += do not work. If you want to be able to append to the string with +=, make st a String like this:
var st = NSString(format:"%2X", n) as String
or
var st = String(NSString(format:"%2X", n))
or
var st: String = NSString(format:"%2X", n)
Then you can do:
let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"
In Swift there is a specific init method on String for exactly this:
let hex = String(0xF, radix: 16, uppercase: false)
println("hex=\(hex)") // Output: f
With Swift 5, according to your needs, you may choose one of the following three methods to solve your problem.
#1. Using String's init(_:radix:uppercase:) initializer
Swift's String has an init(_:radix:uppercase:) initializer with the following declaration:
init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger
Creates a string representing the given value in base 10, or some other specified base.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:
let string1 = String(2, radix: 16)
print(string1) // prints: "2"
let string2 = String(211, radix: 16)
print(string2) // prints: "d3"
let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
#2. Using String's init(format:_:) initializer
Foundation provides String an init(format:_:) initializer. init(format:_:) has the following declaration:
init(format: String, _ arguments: CVarArg...)
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.
Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:
Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:_:):
import Foundation
let string1 = String(format:"%X", 2)
print(string1) // prints: "2"
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"
#3. Using String's init(format:arguments:) initializer
Foundation provides String an init(format:arguments:) initializer. init(format:arguments:) has the following declaration:
init(format: String, arguments: [CVarArg])
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user’s default locale.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:arguments:):
import Foundation
let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"
let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"
let string3 = String(format:"%02X", arguments: [211])
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"
Swift 5.2.4
let value = 200
let hexString = String(format: "%02X", value)
The answers above work fine for values in the range of a 32-bit Int, but anything larger won't work, as the value will roll over.
You need to use the length modifier for values greater than a 32-bit Int:
%x = Unsigned 32-bit integer (unsigned int)
ll = Length modifiers specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.
let hexString = String(format:"%llX", decimalValue)
To use:
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
In Swift 3, importing Foundation is not required, at least not in a project.
String should have all the functionality of NSString.

Trouble With Kinect Skeleton Tracking in F#

I am using the F# skeleton tracking template provided by KinectContrib. The template in C# that does the same thing works, so I know the hardware is OK.
I am using Windows Kinect SDK v1.8.
The program will track once in a rare while, but with no consistent pattern. I have been playing with the code since last night, so I am looking for someone to confirm the same behavior on another system, or for any pointers on how to change the code.
Thanks in advance.
This is the template code:
#light
open System
open System.Windows
open System.Windows.Media.Imaging
open Microsoft.Kinect
open System.Diagnostics
let sensor = KinectSensor.KinectSensors.[0]
//The main canvas that is handling the ellipses
let canvas = new System.Windows.Controls.Canvas()
canvas.Background <- System.Windows.Media.Brushes.Transparent
let ds : byte = Convert.ToByte(1)
let dummySkeleton : Skeleton = new Skeleton(TrackingState = SkeletonTrackingState.Tracked)
// Thanks to Richard Minerich (@rickasaurus) for helping me figure out
// some array concepts in F#.
let mutable pixelData : byte array = [| |]
let mutable skeletons : Skeleton array = [| |]
//Right hand ellipse
let rhEllipse = new System.Windows.Shapes.Ellipse()
rhEllipse.Height <- 20.0
rhEllipse.Width <- 20.0
rhEllipse.Fill <- System.Windows.Media.Brushes.Red
rhEllipse.Stroke <- System.Windows.Media.Brushes.White
//Left hand ellipse
let lhEllipse = new System.Windows.Shapes.Ellipse()
lhEllipse.Height <- 20.0
lhEllipse.Width <- 20.0
lhEllipse.Fill <- System.Windows.Media.Brushes.Red
lhEllipse.Stroke <- System.Windows.Media.Brushes.White
//Head ellipse
let hEllipse = new System.Windows.Shapes.Ellipse()
hEllipse.Height <- 20.0
hEllipse.Width <- 20.0
hEllipse.Fill <- System.Windows.Media.Brushes.Red
hEllipse.Stroke <- System.Windows.Media.Brushes.White
canvas.Children.Add(rhEllipse) |> ignore
canvas.Children.Add(lhEllipse) |> ignore
canvas.Children.Add(hEllipse) |> ignore
let grid = new System.Windows.Controls.Grid()
let winImage = new System.Windows.Controls.Image()
winImage.Height <- 600.0
winImage.Width <- 800.0
grid.Children.Add(winImage) |> ignore
grid.Children.Add(canvas) |> ignore
//Video frame is ready to be processed.
let VideoFrameReady (sender : obj) (args: ColorImageFrameReadyEventArgs) =
    let receivedData = ref false
    using (args.OpenColorImageFrame()) (fun r ->
        if (r <> null) then
            (
                pixelData <- Array.create r.PixelDataLength ds
                //Array.Resize(ref pixelData, r.PixelDataLength)
                r.CopyPixelDataTo(pixelData)
                receivedData := true
            )
        if (receivedData <> ref false) then
            (
                winImage.Source <- BitmapSource.Create(640, 480, 96.0, 96.0, Media.PixelFormats.Bgr32, null, pixelData, 640 * 4)
            )
    )
//Required to correlate the skeleton data to the PC screen
//IMPORTANT NOTE: Code for vector scaling was imported from the Coding4Fun Kinect Toolkit
//available here: http://c4fkinect.codeplex.com/
//I only used this part to avoid adding an extra reference.
let ScaleVector (length : float32, position : float32) =
    let value = (((length / 1.0f) / 2.0f) * position) + (length / 2.0f)
    if value > length then
        length
    elif value < 0.0f then
        0.0f
    else
        value
//This will set the ellipse positions depending on the passed instance and joint
let SetEllipsePosition (ellipse : System.Windows.Shapes.Ellipse, joint : Joint) =
    let vector = new Microsoft.Kinect.SkeletonPoint(X = ScaleVector(640.0f, joint.Position.X), Y = ScaleVector(480.0f, -joint.Position.Y), Z = joint.Position.Z)
    let mutable uJoint = joint
    uJoint.TrackingState <- JointTrackingState.Tracked
    uJoint.Position <- vector
    System.Windows.Controls.Canvas.SetLeft(ellipse, (float uJoint.Position.X))
    System.Windows.Controls.Canvas.SetTop(ellipse, (float uJoint.Position.Y))
//Triggered when a new skeleton frame is ready for processing
let SkeletonFrameReady (sender : obj) (args: SkeletonFrameReadyEventArgs) =
    let receivedData = ref false
    using (args.OpenSkeletonFrame()) (fun r ->
        if (r <> null) then
            (
                skeletons <- Array.create r.SkeletonArrayLength dummySkeleton
                r.CopySkeletonDataTo(skeletons)
                for i in skeletons do
                    Debug.WriteLine(i.TrackingState.ToString())
                receivedData := true
            )
        if (receivedData <> ref false) then
            (
                for i in skeletons do
                    if i.TrackingState <> SkeletonTrackingState.NotTracked then
                        (
                            let currentSkeleton = i
                            SetEllipsePosition(hEllipse, currentSkeleton.Joints.[JointType.Head])
                            SetEllipsePosition(lhEllipse, currentSkeleton.Joints.[JointType.HandLeft])
                            SetEllipsePosition(rhEllipse, currentSkeleton.Joints.[JointType.HandRight])
                        )
            )
    )
let WindowLoaded (sender : obj) (args: RoutedEventArgs) =
    sensor.Start()
    sensor.ColorStream.Enable()
    sensor.SkeletonStream.Enable()
    sensor.ColorFrameReady.AddHandler(new EventHandler<ColorImageFrameReadyEventArgs>(VideoFrameReady))
    sensor.SkeletonFrameReady.AddHandler(new EventHandler<SkeletonFrameReadyEventArgs>(SkeletonFrameReady))
let WindowUnloaded (sender : obj) (args: RoutedEventArgs) =
    sensor.Stop()
//Defining the structure of the test window
let window = new Window()
window.Width <- 800.0
window.Height <- 600.0
window.Title <- "Kinect Skeleton Application"
window.Loaded.AddHandler(new RoutedEventHandler(WindowLoaded))
window.Unloaded.AddHandler(new RoutedEventHandler(WindowUnloaded))
window.Content <- grid
window.Show()
[<STAThread()>]
do
    let app = new Application() in
    app.Run(window) |> ignore
I ended up rewriting it based on this post http://channel9.msdn.com/coding4fun/kinect/Kinecting-with-F and the skeleton tracking is now working. I'm still interested in why the original code doesn't work, though.
// Learn more about F# at http://fsharp.net
#light
open System
open System.Windows
open System.Windows.Media.Imaging
open System.Windows.Threading
open Microsoft.Kinect
open System.Diagnostics
[<STAThread>]
do
    let sensor = KinectSensor.KinectSensors.[0]
    sensor.SkeletonStream.Enable()
    sensor.Start()

    // Set-up the WPF window and its contents
    let width = 1024.
    let height = 768.
    let w = Window(Width=width, Height=height)
    let g = Controls.Grid()
    let c = Controls.Canvas()
    let hd = Shapes.Rectangle(Fill=Media.Brushes.Red, Width=15., Height=15.)
    let rh = Shapes.Rectangle(Fill=Media.Brushes.Blue, Width=15., Height=15.)
    let lh = Shapes.Rectangle(Fill=Media.Brushes.Green, Width=15., Height=15.)
    ignore <| c.Children.Add hd
    ignore <| c.Children.Add rh
    ignore <| c.Children.Add lh
    ignore <| g.Children.Add c
    w.Content <- g
    w.Unloaded.Add(fun args -> sensor.Stop())

    let getDisplayPosition w h (joint : Joint) =
        let x = ((w * (float)joint.Position.X + 2.0) / 4.0) + (w/2.0)
        let y = ((h * -(float)joint.Position.Y + 2.0) / 4.0) + (h/2.0)
        System.Console.WriteLine("X:" + x.ToString() + " Y:" + y.ToString())
        new Point(x,y)

    let draw (joint : Joint) (sh : System.Windows.Shapes.Shape) =
        let p = getDisplayPosition width height joint
        sh.Dispatcher.Invoke(DispatcherPriority.Render, Action(fun () -> System.Windows.Controls.Canvas.SetLeft(sh, p.X))) |> ignore
        sh.Dispatcher.Invoke(DispatcherPriority.Render, Action(fun () -> System.Windows.Controls.Canvas.SetTop(sh, p.Y))) |> ignore

    let drawJoints (sk : Skeleton) =
        draw (sk.Joints.Item(JointType.Head)) hd
        draw (sk.Joints.Item(JointType.WristRight)) rh
        draw (sk.Joints.Item(JointType.WristLeft)) lh

    let skeleton (sensor : KinectSensor) =
        let rec loop () =
            async {
                let! args = Async.AwaitEvent sensor.SkeletonFrameReady
                use frame = args.OpenSkeletonFrame()
                let skeletons : Skeleton[] = Array.zeroCreate(frame.SkeletonArrayLength)
                frame.CopySkeletonDataTo(skeletons)
                skeletons
                |> Seq.filter (fun s -> s.TrackingState <> SkeletonTrackingState.NotTracked)
                |> Seq.iter (fun s -> drawJoints s)
                return! loop ()
            }
        loop ()

    skeleton sensor |> Async.Start

    let a = Application()
    ignore <| a.Run(w)
In F#, any value bindings (e.g., let or do) you declare within a module itself will be executed the first time the module is opened or accessed from another module. If you're familiar with C#, you can think of these value bindings as executing within a type constructor (i.e., a static constructor).
I suspect the reason the second version of your code works, but not the first, is that in the second version, you're creating the Window and drawing the shapes into it from within the STA thread running the application's message loop. In the first version, I'd guess that code is executing on some other thread, and that's why it isn't working as expected.
There's nothing wrong with the second version of your code, but a more-canonical F# approach would be to lift your functions (getDisplayPosition, draw, etc.) out of the top-level do binding. That makes the code a bit easier to read by making it obvious that those functions aren't capturing any of the local values created within the do.
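As a rough illustration of that layout (a sketch with illustrative names only, not the full Kinect program), the helpers sit at module level while the window construction and message loop stay inside the STA do binding:
open System
open System.Windows
// Module-level helper: it captures nothing created inside the do binding below.
let getDisplayPosition w h (x : float) (y : float) =
    Point(((w * x + 2.0) / 4.0) + (w / 2.0), ((h * -y + 2.0) / 4.0) + (h / 2.0))
[<STAThread>]
do
    // The window is created, and the message loop runs, on the STA thread.
    let window = Window(Width = 1024., Height = 768., Title = "Kinect Skeleton Application")
    Application().Run(window) |> ignore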

Convert Enum into underlying type

I have an enum as follows
type Suit =
    | Clubs = 'C'
    | Spades = 'S'
    | Hearts = 'H'
    | Diamonds = 'D'
How do I get the underlying char value from a given enum value?
For example, I have Suit.Clubs and want to get 'C'.
As another option:
type Suit =
    | Clubs = 'C'
    | Spades = 'S'
    | Hearts = 'H'
    | Diamonds = 'D'
let c = Suit.Clubs
let v : char = LanguagePrimitives.EnumToValue c
EDITED:
Comparison of different approaches:
type Suit =
    | Clubs = 'C'
    | Spades = 'S'
    | Hearts = 'H'
    | Diamonds = 'D'
let valueOf1 (e : Suit) = LanguagePrimitives.EnumToValue e
let valueOf2 (e : Suit) = unbox<char> e
let valueOf3 (e : Suit) = (box e) :?> char
And under the hood:
.method public static
char valueOf1 (
valuetype Program/Suit e
) cil managed
{
// Method begins at RVA 0x2050
// Code size 3 (0x3)
.maxstack 8
IL_0000: nop
IL_0001: ldarg.0
IL_0002: ret
} // end of method Program::valueOf1
.method public static
char valueOf2 (
valuetype Program/Suit e
) cil managed
{
// Method begins at RVA 0x2054
// Code size 13 (0xd)
.maxstack 8
IL_0000: nop
IL_0001: ldarg.0
IL_0002: box Program/Suit
IL_0007: unbox.any [mscorlib]System.Char
IL_000c: ret
} // end of method Program::valueOf2
.method public static
char valueOf3 (
valuetype Program/Suit e
) cil managed
{
// Method begins at RVA 0x2064
// Code size 13 (0xd)
.maxstack 8
IL_0000: nop
IL_0001: ldarg.0
IL_0002: box Program/Suit
IL_0007: unbox.any [mscorlib]System.Char
IL_000c: ret
} // end of method Program::valueOf3
You can use functions from the LanguagePrimitives module:
// Convert enum value to the underlying char value
let ch = LanguagePrimitives.EnumToValue Suit.Clubs
// Convert the char value back to enum
let suit = LanguagePrimitives.EnumOfValue ch
EDIT: I didn't see these functions in my first answer attempt, so I first suggested using:
unbox<char> Suit.Clubs
This is shorter than what ildjarn suggests in a comment, but it has the same problem - there is no checking that you're actually converting to the right type. With EnumToValue, you cannot make this mistake, because it always returns the value of the right underlying type.
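For example (a small sketch reusing the Suit type above), picking the wrong target type with unbox only fails at runtime, while a wrong target type with EnumToValue is rejected at compile time:
// Compiles, but throws InvalidCastException at runtime because Suit's underlying type is char, not int:
let bad = unbox<int> Suit.Clubs
// Always yields the underlying char; annotating this as anything but char would be a compile-time error:
let ok : char = LanguagePrimitives.EnumToValue Suit.Clubs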

F# - This code isn't compiling for me

This code isn't compiling for me: let countDown = [5L .. −1L .. 0L];;
I have a book (page 33) that says it should return this:
val countDown : int list = [5L; 4L; 3L; 2L; 1L; 0L]
Compiler Error:
Program.fs(42,24): error FS0010: Unexpected character '−' in expression
>
> let countDown = [5L .. −1L .. 0L];;
let countDown = [5L .. −1L .. 0L];;
-----------------------^
The book's wrong, but why? Is it an update to the language? And what is the way to achieve this?
Edit: the problem was that the − character copied from the PDF isn't the - character.
Your original code works fine for me even without the modifications that Igor suggested:
Microsoft (R) F# 2.0 Interactive build 4.0.30319.1
Copyright (c) Microsoft Corporation. All Rights Reserved.
> let l = [ 10L .. -1L .. 0L ];;
val l : int64 list = [10L; 9L; 8L; 7L; 6L; 5L; 4L; 3L; 2L; 1L; 0L]
A possible subtle error is that if you (for example) pasted the code from Word (or some other program), it may have replaced the - symbol with some other type of dash that looks the same but actually has a different character code.
Another way to break the code is to remove some spaces - for example, there must be a space between .. and -1L. Otherwise, I don't see any reason why it shouldn't work.
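If you suspect a pasted character, a quick diagnostic (just a sketch) is to print the character codes; the two dashes look alike but are different characters:
printfn "%d" (int '-')   // 45   : the ASCII hyphen-minus that F# expects
printfn "%d" (int '−')   // 8722 : the Unicode minus sign (U+2212) that PDFs and word processors often produce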
Try this:
let countDown = [5L .. (-1L) .. 0L];;
Or this:
let countDown = [5 .. -1 .. 0];;
Both of the above will work.
Here is some output:
> let countDown = [5 .. -1 .. 0];;
val countDown : int list = [5; 4; 3; 2; 1; 0]
> let countDown = [5L .. (-1L) .. 0L];;
val countDown : int64 list = [5L; 4L; 3L; 2L; 1L; 0L]
