I have a TensorFlow Lite error as follows:
Cannot copy from a TensorFlowLite tensor (model_outputs) with shape [1, 13, 13, 35] to a Java object with shape [1, 2].
It results from the following C# (Xamarin) code:
var interpreter = new Xamarin.TensorFlow.Lite.Interpreter(mappedByteBuffer);
var tensor = interpreter.GetInputTensor(0);
var shape = tensor.Shape();
var width = shape[1];
var height = shape[2];
var byteBuffer = GetPhotoAsByteBuffer(bytes, width, height);
var sr = new StreamReader(Application.Context.Assets.Open("labels.txt"));
var labels = sr.ReadToEnd().Split('\n').Select(s => s.Trim()).Where(s => !string.IsNullOrEmpty(s)).ToList();
var outputLocations = new float[1][] { new float[labels.Count] };
var outputs = Java.Lang.Object.FromArray(outputLocations);
interpreter.Run(byteBuffer, outputs);
The error is thrown at the last line, which I assume means that the preceding two lines are incorrectly framed:
var outputLocations = new float[1][] { new float[labels.Count] };
var outputs = Java.Lang.Object.FromArray(outputLocations);
Incidentally, labels.Count equals two!
I've used the Netron app to verify that the output shape of the .tflite model is indeed [1, 13, 13, 35]. Hence my question: how do I specify a Java object in Xamarin that correlates with a TensorFlow Lite output?
EDIT:
Interestingly, I notice that:
When there are 2 labels, the error is "Cannot copy from a TensorFlowLite tensor (model_outputs) with shape [1, 13, 13, 35] to a Java object with shape [1, 2]"
When there are 4 labels, the error is: "Cannot copy from a TensorFlowLite tensor (model_outputs) with shape [1, 13, 13, 35] to a Java object with shape [1, 4]"
Also, I don't know if it's useful to add, but tensor.ShapeSignature() is an int[4] with values 1, 416, 416, 3.
Here's the image pre-processing method referred to at the top (it's pretty standard stuff), which takes 416 from the ShapeSignature as width and height:
private ByteBuffer GetPhotoAsByteBuffer(byte[] bytes, int width, int height)
{
    var modelInputSize = FloatSize * height * width * PixelSize;
    var bitmap = BitmapFactory.DecodeByteArray(bytes, 0, bytes.Length);
    var resizedBitmap = Bitmap.CreateScaledBitmap(bitmap, width, height, true);
    var byteBuffer = ByteBuffer.AllocateDirect(modelInputSize);
    byteBuffer.Order(ByteOrder.NativeOrder());
    var pixels = new int[width * height];
    resizedBitmap.GetPixels(pixels, 0, resizedBitmap.Width, 0, 0, resizedBitmap.Width, resizedBitmap.Height);
    var pixel = 0;
    for (var i = 0; i < width; i++)
    {
        for (var j = 0; j < height; j++)
        {
            // Unpack each ARGB int into its R, G, B float components.
            var pixelVal = pixels[pixel++];
            byteBuffer.PutFloat(pixelVal >> 16 & 0xFF);
            byteBuffer.PutFloat(pixelVal >> 8 & 0xFF);
            byteBuffer.PutFloat(pixelVal & 0xFF);
        }
    }
    bitmap.Recycle();
    return byteBuffer;
}
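For reference, the underlying Java TensorFlow Lite API expects the output array's dimensions to match the tensor exactly, so a [1, 13, 13, 35] output corresponds to a float[1][13][13][35]. A plain-Java sketch of that (how to marshal the equivalent 4-D array through Xamarin's Java.Lang.Object.FromArray is exactly what this question is asking):
import java.nio.ByteBuffer;
import org.tensorflow.lite.Interpreter;

class OutputShapeSketch {
    // Sketch: the output buffer's shape must mirror the tensor's
    // reported shape [1, 13, 13, 35], not [1, labels.Count].
    static float[][][][] run(Interpreter interpreter, ByteBuffer input) {
        float[][][][] output = new float[1][13][13][35];
        interpreter.run(input, output);
        return output;
    }
}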
I recently came across Google Foobar's problem Prepare the Bunnies Escape, and I submitted a shortest-path-based solution.
However, only 3 / 5 cases passed, and I am really intrigued to know why.
I have attached my code below for reference.
If anyone can "hack" my solution, provide a countercase, or tell me what I am doing wrong, that would be appreciated.
PLEASE DO NOT SEND ME IMPLEMENTATIONS; verbal explanations of my mistakes or counter-tests are what I'm after.
Thanks.
import java.util.PriorityQueue;

public class Solution
{
    public static int ans = Integer.MAX_VALUE;
    public static int dx[] = {0, 0, -1, 1};
    public static int dy[] = {-1, 1, 0, 0};

    static class State implements Comparable<State>
    {
        int x, y, moves;
        boolean wentThroughWall;

        public State(int x, int y, int moves, boolean wentThroughWall)
        {
            this.x = x;
            this.y = y;
            this.moves = moves;
            this.wentThroughWall = wentThroughWall;
        }

        public int compareTo(State other)
        {
            return moves - other.moves;
        }
    }

    public static int solution(int[][] map)
    {
        PriorityQueue<State> enque = new PriorityQueue<State>();
        // NOTE: as discovered below, this 2-D visited array is the bug --
        // it must also track whether the wall-removal has been used.
        boolean visited[][] = new boolean[map.length][map[0].length];
        enque.add(new State(0, 0, 1, false));
        while (!enque.isEmpty())
        {
            State top = enque.poll();
            if (top.x == map.length - 1 && top.y == map[0].length - 1)
            {
                ans = Math.min(ans, top.moves);
                continue;
            }
            if (visited[top.x][top.y])
                continue;
            visited[top.x][top.y] = true;
            for (int i = 0; i < dx.length; i++)
            {
                int nx = top.x + dx[i];
                int ny = top.y + dy[i];
                if (nx < 0 || nx >= map.length || ny < 0 || ny >= map[0].length || (map[nx][ny] == 1 && top.wentThroughWall))
                    continue;
                if (map[nx][ny] == 1)
                    enque.add(new State(nx, ny, top.moves + 1, true));
                else
                    enque.add(new State(nx, ny, top.moves + 1, top.wentThroughWall));
            }
        }
        return ans;
    }

    public static void main(String[] args) {
        int[][] test = {{0, 0, 0, 0, 0, 0}, {1, 1, 1, 1, 1, 0}, {0, 0, 0, 0, 0, 0}, {0, 1, 1, 1, 1, 1}, {0, 1, 1, 1, 1, 1}, {0, 0, 0, 0, 0, 0}};
        System.out.println(Solution.solution(test));
    }
}
Statement:
You're awfully close to destroying the LAMBCHOP doomsday device and freeing Commander Lambda's bunny prisoners, but once they're free of the prison blocks, the bunnies are going to need to escape Lambda's space station via the escape pods as quickly as possible. Unfortunately, the halls of the space station are a maze of corridors and dead ends that will be a deathtrap for the escaping bunnies. Fortunately, Commander Lambda has put you in charge of a remodeling project that will give you the opportunity to make things a little easier for the bunnies. Unfortunately (again), you can't just remove all obstacles between the bunnies and the escape pods - at most you can remove one wall per escape pod path, both to maintain structural integrity of the station and to avoid arousing Commander Lambda's suspicions.
You have maps of parts of the space station, each starting at a prison exit and ending at the door to an escape pod. The map is represented as a matrix of 0s and 1s, where 0s are passable space and 1s are impassable walls. The door out of the prison is at the top left (0,0) and the door into an escape pod is at the bottom right (w-1,h-1).
Write a function solution(map) that generates the length of the shortest path from the prison door to the escape pod, where you are allowed to remove one wall as part of your remodeling plans. The path length is the total number of nodes you pass through, counting both the entrance and exit nodes. The starting and ending positions are always passable (0). The map will always be solvable, though you may or may not need to remove a wall. The height and width of the map can be from 2 to 20. Moves can only be made in cardinal directions; no diagonal moves are allowed.
Languages
To provide a Python solution, edit solution.py
To provide a Java solution, edit Solution.java
Test cases
Your code should pass the following test cases.
Note that it may also be run against hidden test cases not shown here.
-- Python cases --
Input:
solution.solution([[0, 1, 1, 0], [0, 0, 0, 1], [1, 1, 0, 0], [1, 1, 1, 0]])
Output:
7
Input:
solution.solution([[0, 0, 0, 0, 0, 0], [1, 1, 1, 1, 1, 0], [0, 0, 0, 0, 0, 0], [0, 1, 1, 1, 1, 1], [0, 1, 1, 1, 1, 1], [0, 0, 0, 0, 0, 0]])
Output:
11
-- Java cases --
Input:
Solution.solution({{0, 1, 1, 0}, {0, 0, 0, 1}, {1, 1, 0, 0}, {1, 1, 1, 0}})
Output:
7
Input:
Solution.solution({{0, 0, 0, 0, 0, 0}, {1, 1, 1, 1, 1, 0}, {0, 0, 0, 0, 0, 0}, {0, 1, 1, 1, 1, 1}, {0, 1, 1, 1, 1, 1}, {0, 0, 0, 0, 0, 0}})
Output:
11
Sike, I fixed it. I generated a bunch of test cases using a random test-case generator and realized that my visited array wasn't defined correctly.
I have listed the corrected solution below for reference, with the fix.
import java.util.PriorityQueue;

public class Solution
{
    public static int ans = Integer.MAX_VALUE;
    public static int dx[] = {0, 0, -1, 1};
    public static int dy[] = {-1, 1, 0, 0};

    static class State implements Comparable<State>
    {
        int x, y, moves;
        boolean wentThroughWall;

        public State(int x, int y, int moves, boolean wentThroughWall)
        {
            this.x = x;
            this.y = y;
            this.moves = moves;
            this.wentThroughWall = wentThroughWall;
        }

        public int compareTo(State other)
        {
            return moves - other.moves;
        }
    }

    public static int solution(int[][] map)
    {
        PriorityQueue<State> enque = new PriorityQueue<State>();
        // Fix: visited is now indexed by (x, y, wall-already-removed), so the
        // two kinds of states no longer block each other.
        boolean visited[][][] = new boolean[map.length][map[0].length][2];
        enque.add(new State(0, 0, 1, false));
        while (!enque.isEmpty())
        {
            State top = enque.poll();
            if (top.x == map.length - 1 && top.y == map[0].length - 1)
            {
                ans = Math.min(ans, top.moves);
                continue;
            }
            if (visited[top.x][top.y][(top.wentThroughWall ? 0 : 1)])
                continue;
            visited[top.x][top.y][(top.wentThroughWall ? 0 : 1)] = true;
            for (int i = 0; i < dx.length; i++)
            {
                int nx = top.x + dx[i];
                int ny = top.y + dy[i];
                if (nx < 0 || nx >= map.length || ny < 0 || ny >= map[0].length || (map[nx][ny] == 1 && top.wentThroughWall))
                    continue;
                if (map[nx][ny] == 1)
                    enque.add(new State(nx, ny, top.moves + 1, true));
                else
                    enque.add(new State(nx, ny, top.moves + 1, top.wentThroughWall));
            }
        }
        assert(ans != Integer.MAX_VALUE);
        return ans;
    }

    public static void main(String[] args) {
        int[][] test = {{0, 0, 0, 0, 0, 0}, {1, 1, 1, 1, 1, 0}, {0, 0, 0, 0, 0, 0}, {0, 1, 1, 1, 1, 1}, {0, 1, 1, 1, 1, 1}, {0, 0, 0, 0, 0, 0}};
        System.out.println(Solution.solution(test));
    }
}
As a competitive person myself, I would like to know whether my code really works, or whether it just survived weak testing.
If you find a test case which breaks my code, please let me know in the comments and I will get back to you ASAP.
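A sketch of the kind of random tester described above (a reconstruction, not the original script): it compares solution() against a brute force that tries removing each wall in turn and runs a plain BFS on small random grids. Note that solution() keeps its answer in static state, so it has to be reset between runs.
import java.util.ArrayDeque;
import java.util.Random;

public class Tester {
    // Plain BFS shortest path, counting both endpoints; MAX_VALUE if unreachable.
    static int bfs(int[][] map) {
        int h = map.length, w = map[0].length;
        int[][] dist = new int[h][w];
        ArrayDeque<int[]> q = new ArrayDeque<>();
        dist[0][0] = 1;
        q.add(new int[]{0, 0});
        int[] dx = {0, 0, -1, 1};
        int[] dy = {-1, 1, 0, 0};
        while (!q.isEmpty()) {
            int[] c = q.poll();
            for (int i = 0; i < 4; i++) {
                int nx = c[0] + dx[i], ny = c[1] + dy[i];
                if (nx < 0 || nx >= h || ny < 0 || ny >= w) continue;
                if (map[nx][ny] == 1 || dist[nx][ny] != 0) continue;
                dist[nx][ny] = dist[c[0]][c[1]] + 1;
                q.add(new int[]{nx, ny});
            }
        }
        return dist[h - 1][w - 1] == 0 ? Integer.MAX_VALUE : dist[h - 1][w - 1];
    }

    // Brute force: best of "remove no wall" and "remove exactly this wall".
    static int bruteForce(int[][] map) {
        int best = bfs(map);
        for (int i = 0; i < map.length; i++)
            for (int j = 0; j < map[0].length; j++)
                if (map[i][j] == 1) {
                    map[i][j] = 0;
                    best = Math.min(best, bfs(map));
                    map[i][j] = 1;
                }
        return best;
    }

    public static void main(String[] args) {
        Random rnd = new Random(42);
        for (int t = 0; t < 100000; t++) {
            int h = 2 + rnd.nextInt(5), w = 2 + rnd.nextInt(5);
            int[][] map = new int[h][w];
            for (int i = 0; i < h; i++)
                for (int j = 0; j < w; j++)
                    map[i][j] = rnd.nextInt(3) == 0 ? 1 : 0;
            map[0][0] = 0;
            map[h - 1][w - 1] = 0;
            int expected = bruteForce(map);
            if (expected == Integer.MAX_VALUE) continue; // problem guarantees solvable maps
            Solution.ans = Integer.MAX_VALUE; // reset the static answer between runs
            int got = Solution.solution(map);
            if (got != expected) {
                System.out.println("Mismatch: expected " + expected + ", got " + got);
                return;
            }
        }
        System.out.println("All random tests passed.");
    }
}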
I'm learning OpenGL on iOS from this guide, and I want to implement everything in Swift. Here is the code where I'm getting a crash:
Memory structures:
private struct Vertex {
    var Position: (GLfloat, GLfloat, GLfloat)
    var Color: (GLfloat, GLfloat, GLfloat, GLfloat)
}

private static var Vertices = [
    Vertex(Position: (1, -1, 0), Color: (1, 0, 0, 1)),
    Vertex(Position: (1, 1, 0), Color: (0, 1, 0, 1)),
    Vertex(Position: (-1, 1, 0), Color: (0, 0, 1, 1)),
    Vertex(Position: (-1, -1, 0), Color: (0, 0, 0, 1))
]

private static var Indices: [GLubyte] = [
    0, 1, 2,
    2, 3, 0
]
Create vertex buffers:
var vertexBuffer = GLuint()
glGenBuffers(1, &vertexBuffer)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
glBufferData(GLenum(GL_ARRAY_BUFFER), Vertices.size, Vertices, GLenum(GL_STATIC_DRAW))
var indexBuffer = GLuint()
glGenBuffers(1, &indexBuffer)
glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), Indices.size, Indices, GLenum(GL_STATIC_DRAW))
Setup memory offsets:
var positionPtr = 0
glVertexAttribPointer(GLuint(positionSlot), 3, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(strideofValue(Vertex)), &positionPtr)
var colorPtr = strideof(GLfloat) * 3
glVertexAttribPointer(GLuint(colorSlot), 4, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(strideofValue(Vertex)), &colorPtr)
Crash (Trying to draw):
var startPtr = 0
// EXC_BAD_ACCESS code=1 here!
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count / 3), GLenum(GL_UNSIGNED_BYTE), &startPtr)
All shaders compile without any errors and glClear() draws fine, so I suppose my problem is related to the VBOs.
And here is how I calculate the size of the arrays:
extension Array
{
    var size: Int {
        get { return self.count * strideof(Element) }
    }
}
UPDATE: I'm using OpenGL ES 2.0.
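For comparison, Android's Java GLES20 binding declares that last parameter as a plain int byte offset whenever an index buffer is bound, which makes the semantics easier to see: GL wants an offset into the bound GL_ELEMENT_ARRAY_BUFFER, not the address of a local variable (which is what &startPtr passes above). Note also that the count argument is the number of indices, not the number of triangles. A rough Java sketch:
import android.opengl.GLES20;

class DrawSketch {
    // With an index buffer bound, the last argument is a byte offset
    // into that buffer; 0 means "start at the first index".
    static void draw(int indexBufferId, int indexCount) {
        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, indexBufferId);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, indexCount,
                GLES20.GL_UNSIGNED_BYTE, 0);
    }
}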
I worked through the same guide about 4 months ago. I converted it from Objective-C to Swift, up to drawing the rectangle shown in the picture below.
I've now run it again, converted to Swift 2.1. It still works and shows the same image below.
Here is my code (the setupVBOs and render methods, plus the struct):
// Keeps track of all our per-vertex information (currently just color and position)
struct Vertex {
    var Position: (CFloat, CFloat, CFloat)
    var Color: (CFloat, CFloat, CFloat, CFloat)
}
// Array with all the info for each vertex
var Vertices = [
    Vertex(Position: (1, -1, 0), Color: (1, 0, 0, 1)),
    Vertex(Position: (1, 1, 0), Color: (0, 1, 0, 1)),
    Vertex(Position: (-1, 1, 0), Color: (0, 0, 1, 1)),
    Vertex(Position: (-1, -1, 0), Color: (0, 0, 0, 1))
]

// Array that gives a list of triangles to create, by specifying the 3 vertices that make up each triangle
var Indices: [GLubyte] = [
    0, 1, 2,
    2, 3, 0
]
// Helper extension to pass size arguments to GL land
extension Array {
    func size() -> Int {
        return self.count * sizeofValue(self[0])
    }
}
// The best way to send data to OpenGL is through something called Vertex Buffer Objects.
func setupVBOs() { // VBO: Vertex Buffer Object
    // There are two types of vertex buffer objects – one to keep track of the per-vertex data (like we have in the Vertices array), and one to keep track of the indices that make up triangles (like we have in the Indices array).
    glGenBuffers(1, &vertexBuffer)
    glBindBuffer(GLenum(GL_ARRAY_BUFFER), vertexBuffer)
    glBufferData(GLenum(GL_ARRAY_BUFFER), Vertices.count * sizeofValue(Vertices[0]), Vertices, GLenum(GL_STATIC_DRAW)) // send the data over to OpenGL-land
    glGenBuffers(1, &indexBuffer)
    glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), indexBuffer)
    glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), Indices.count * sizeofValue(Indices[0]), Indices, GLenum(GL_STATIC_DRAW))
}
func render() {
    glClearColor(0, 104.0/255.0, 55.0/255.0, 1.0)
    glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
    //glViewport(0, 0, GLint(frame.size.width), GLint(frame.size.height))
    glViewport(0, GLint(frame.size.height/2)/2, GLint(frame.size.width), GLint(frame.size.height/2))
    // Feed the correct values to the two input variables for the vertex shader – the Position and SourceColor attributes.
    glVertexAttribPointer(positionSlot, 3, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), nil)
    glVertexAttribPointer(colorSlot, 4, GLenum(GL_FLOAT), GLboolean(UInt8(GL_FALSE)), GLsizei(sizeof(Vertex)), UnsafePointer<Int>(bitPattern: sizeof(Float) * 3))
    // This actually ends up calling your vertex shader for every vertex you pass in, and then the fragment shader on each pixel to display on the screen.
    glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), nil)
    _context.presentRenderbuffer(Int(GL_RENDERBUFFER))
}
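The nil and UnsafePointer<Int>(bitPattern:) arguments in the two glVertexAttribPointer calls above are just byte offsets into the bound VBO. Android's Java binding spells the same thing out with plain ints, which may make the vertex layout easier to see (the slot variables here are placeholders):
import android.opengl.GLES20;

class AttribSketch {
    // One vertex = 3 position floats + 4 color floats, 4 bytes each.
    static final int STRIDE = (3 + 4) * 4;

    static void setupAttribs(int positionSlot, int colorSlot) {
        // Position lives at byte offset 0 inside each vertex.
        GLES20.glVertexAttribPointer(positionSlot, 3, GLES20.GL_FLOAT, false, STRIDE, 0);
        // Color starts right after the three position floats.
        GLES20.glVertexAttribPointer(colorSlot, 4, GLES20.GL_FLOAT, false, STRIDE, 3 * 4);
    }
}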
I am trying to get my context (EAGLContext) to render in Swift, but for days I haven't been able to get the glDrawElements function to work.
I have read a couple of similar questions here on Stack Overflow, but to no avail.
My glDrawElements call is as follows:
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), &offset)
I am having a problem with the last parameter, offset, which expects an UnsafePointer<Void>.
I have tried the following:
let offset: CConstVoidPointer = COpaquePointer(UnsafePointer<Int>(0))
The above no longer works, because CConstVoidPointer doesn't seem to be available anymore in Swift 1.2.
And:
var offset = UnsafePointer<Int>(other: 0)
The above gives a warning that I should use bitPattern: instead.
Although I don't believe bitPattern: should be used here (because the parameter expects a Word), I decided to give it a try, following the suggestion provided:
var offset = UnsafePointer<Int>(bitPattern: 0)
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), &offset)
I would get an EXC_BAD_ACCESS (code=EXC_I386_GPFLT) error.
In vain, I also tried something as simple as the following:
var offsetZero : Int = 0
and subsequently feeding it to the last parameter of glDrawElements like so:
glDrawElements(GLenum(GL_TRIANGLES), GLsizei(Indices.count), GLenum(GL_UNSIGNED_BYTE), &offsetZero)
I am getting the same EXC_BAD_ACCESS as in the case above.
How can I properly construct a value for the last parameter of glDrawElements, which expects an UnsafePointer<Void>?
@RobMayoff
Update (adding code for the VBO setup and the variable declarations and definitions):
struct vertex
{
    var position: (CFloat, CFloat, CFloat)
    var color: (CFloat, CFloat, CFloat, CFloat)
}

var vertices =
[
    vertex(position: (-1, -1, 0), color: (1, 1, 0, 1)),
    vertex(position: (1, -1, 0), color: (1, 1, 1, 1)),
    vertex(position: (-1, 0, 1), color: (0, 1, 0, 1)),
    vertex(position: (-1, 1, 1), color: (1, 0, 0, 1))
]

let indices: [UInt8] = [0, 2, 3, 3, 1, 0]
class mainGLView: UIView
{
    var layer: CAEAGLLayer!
    var context: EAGLContext!
    var cBuffer: GLuint = GLuint()
    var pos: GLuint = GLuint()
    var color: GLuint = GLuint()
    var iBuffer: GLuint = GLuint()
    var vBuffer: GLuint = GLuint()
    var vao: GLuint = GLuint()

    override class func layerClass() -> AnyClass
    {
        return CAEAGLLayer.self
    }

    required init(coder aDecoder: NSCoder)
    {
        super.init(coder: aDecoder)
        // setting up context, buffers, shaders, etc.
        self.configureVBO()
        self.setupRendering()
    }

    func configureVBO()
    {
        glGenVertexArraysOES(1, &vao)
        glBindVertexArrayOES(vao)
        glGenBuffers(GLsizei(1), &vBuffer)
        glBindBuffer(GLenum(GL_ARRAY_BUFFER), vBuffer)
        glBufferData(GLenum(GL_ARRAY_BUFFER), vertices.size(), vertices, GLenum(GL_STATIC_DRAW))
        glEnableVertexAttribArray(pos)
        var ptr = COpaquePointer(UnsafePointer<Int>(bitPattern: 0))
        glVertexAttribPointer(GLuint(pos), GLint(3), GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(vertex)), &ptr)
        glEnableVertexAttribArray(color)
        let fColor = UnsafePointer<Int>(bitPattern: sizeof(Float) * 3)
        glVertexAttribPointer(GLuint(color), 4, GLenum(GL_FLOAT), GLboolean(GL_FALSE), GLsizei(sizeof(vertex)), fColor)
        glGenBuffers(1, &iBuffer)
        glBindBuffer(GLenum(GL_ELEMENT_ARRAY_BUFFER), iBuffer)
        glBufferData(GLenum(GL_ELEMENT_ARRAY_BUFFER), indices.size(), indices, GLenum(GL_STATIC_DRAW))
        glBindBuffer(GLenum(GL_ARRAY_BUFFER), 0)
        glBindVertexArrayOES(0)
    }

    func setupRendering()
    {
        glBindVertexArrayOES(vao)
        glViewport(0, 0, GLint(self.frame.size.width), GLint(self.frame.size.height))
        indices.withUnsafeBufferPointer
        {
            (pointer: UnsafeBufferPointer<UInt8>) -> Void in
            glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count), GLenum(GL_UNSIGNED_BYTE), UnsafePointer<Void>(pointer.baseAddress))
            Void()
        }
        self.context.presentRenderbuffer(Int(GL_RENDERBUFFER))
        glBindVertexArrayOES(0)
    }
}
Because of your choices for the count and type arguments, the indices argument (the last argument) needs to be a pointer to the first element of an array, of (at least) length Indices.count, of unsigned bytes. This pointer needs to be converted to UnsafePointer<Void> in Swift.
You didn't show the declaration of your Indices. Let's assume it's declared like this:
let indices: [UInt8] = [ 0, 1, 2, 3, 4, 5, 6, 7, 8 ]
Then you can call glDrawElements by jumping through these hoops:
indices.withUnsafeBufferPointer { (pointer: UnsafeBufferPointer<UInt8>) -> Void in
    glDrawElements(GLenum(GL_TRIANGLES), GLsizei(indices.count),
        GLenum(GL_UNSIGNED_BYTE),
        UnsafePointer<Void>(pointer.baseAddress))
    Void()
}
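For what it's worth, Android's Java GLES20 binding exposes the same distinction as two overloads: one taking a Buffer for client-side index data (the withUnsafeBufferPointer case above), and one taking an int byte offset for indices stored in a bound GL_ELEMENT_ARRAY_BUFFER. A sketch of the client-side variant:
import android.opengl.GLES20;
import java.nio.ByteBuffer;

class ClientIndexSketch {
    // Client-side indices: make sure no index buffer is bound, then pass
    // the index data itself rather than an offset.
    static void draw(byte[] indices) {
        ByteBuffer buf = ByteBuffer.allocateDirect(indices.length);
        buf.put(indices).position(0);
        GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0);
        GLES20.glDrawElements(GLES20.GL_TRIANGLES, indices.length,
                GLES20.GL_UNSIGNED_BYTE, buf);
    }
}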
Can someone explain the following, totally unintuitive, results:
Mat_<Vec3f> mat(Size(3,3),0);
Mat_<Vec3f> mat_add = (mat + 9);
Mat_<Vec3f> mat_add_div = (mat + 9) / 3;
Magically (I don't have any other explanation for it):
mat_add = [9,0,0,9,0,0,9,0,0];
mat_add_div = [3,3,3,3,3,3,3,3,3];
EDIT:
My take: this is a legacy bug that can't be fixed anymore, because a fix would be retroactive and would impact a lot of projects. It would also be pretty nasty to debug (unless the project already has pretty thorough unit testing).
Mat_<Vec3f> mat_add = (mat + 9);
is equivalent to
Mat_<Vec3f> temp(Size(3,3),Vec3f(9,0,0));
Mat_<Vec3f> mat_add = mat+temp;
So, you will get
mat_add =
[9, 0, 0, 9, 0, 0, 9, 0, 0;
9, 0, 0, 9, 0, 0, 9, 0, 0;
9, 0, 0, 9, 0, 0, 9, 0, 0]
However, I have no clue why you get values like that for mat_add_div. In fact, if you replace it with:
Mat_<Vec3f> mat_add_div = mat_add / 3;
You will end up with
mat_add_div =
[3, 0, 0, 3, 0, 0, 3, 0, 0;
3, 0, 0, 3, 0, 0, 3, 0, 0;
3, 0, 0, 3, 0, 0, 3, 0, 0]
This result, however, is reasonable, based on the same reasoning as above.
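One way to sidestep the scalar-broadcast ambiguity entirely is to spell out the per-channel Scalar yourself. A sketch using OpenCV's Java bindings (the C++ equivalent is passing an explicit cv::Scalar):
import org.opencv.core.Core;
import org.opencv.core.CvType;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;

class ScalarSketch {
    static void demo() {
        Mat mat = new Mat(3, 3, CvType.CV_32FC3, Scalar.all(0));
        Mat matAdd = new Mat();
        // Scalar.all(9) adds 9 to every channel, not just the first.
        Core.add(mat, Scalar.all(9), matAdd);
        Mat matAddDiv = new Mat();
        Core.divide(matAdd, Scalar.all(3), matAddDiv);
    }
}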