I have this code:
var w = 0
var h = 0
for i in 1...am
{
    if w > Int(screenSize.width)
    {
        w = 0
        h += CHeight
    }
    //some other code
    w += CWidth
}
So the value w is a part of the screen width, and the running total won't necessarily land exactly on the screen width as the parts are added together.
The if only does its job once w is already larger than the screen width. How can I make the if trigger when w is just about to reach the end of the screen width, before it goes over?
Subtract the width of one part from the threshold, so the check fires before the next part would run past the edge of the screen:
if w > Int(screenSize.width) - CWidth
{
    w = 0
    h += CHeight
}
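Putting it together, here is a minimal sketch of the full loop with the adjusted condition. The concrete values for am, CWidth, CHeight and screenSize are placeholders made up for the example, not values from the question:
import CoreGraphics

// Placeholder values standing in for the question's variables.
let am = 30                                      // number of parts to lay out
let CWidth = 90                                  // width of one part
let CHeight = 60                                 // height of one row
let screenSize = CGSize(width: 320, height: 568)

var w = 0
var h = 0
for _ in 1...am
{
    // Wrap to the next row *before* the next part would run off the screen.
    if w > Int(screenSize.width) - CWidth
    {
        w = 0
        h += CHeight
    }
    // ...place the current part at (w, h) here...
    w += CWidth
}
Each part is placed only while w + CWidth still fits within the screen width, so nothing is drawn past the right edge.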
I'm making an isometric grid and using padding for its cells. Currently I take the item width and height as parameters. How can I avoid that and use the final cell size in an optimal way?
I tried some solutions from this question (the commented-out code is the original approach), but they don't look legit for this case: I end up with extra repeated calculations, a "Padding must be non-negative" error, and it doesn't render correctly in the Android Studio preview.
Modifier.onGloballyPositioned looks just as wrong for this case. What's the right way?
@Composable
fun <T> IsometricGrid(
    gridWidth: Int,
    gridHeight: Int,
    cellWidth: Int,
    cellHeight: Int,
    data: List<T>,
    itemContent: @Composable (Int, T) -> Unit
) {
    var width by remember { mutableStateOf(0) }
    var height by remember { mutableStateOf(0) }

    for (y in 0 until gridHeight) {
        for (x in 0 until gridWidth) {
            // val start = (y % 2 * 0.5 + x) * cellWidth
            // val top = y * cellHeight * 0.5
            val index = x * gridHeight + y
            Box(
                modifier = Modifier
                    .onSizeChanged {
                        width = it.width
                        height = it.height
                        Timber.i("$width, $height")
                    }
                    // .padding(start = start.dp, top = top.dp)
                    .padding(start = ((y % 2 * 0.5 + x) * cellWidth).dp, top = (y * height * 0.5).dp)
            ) {
                itemContent(index, data[index])
            }
        }
    }
}
Added usage:
IsometricGrid(4, 4, 100, 50, listOf<Foo>(...)) { index: Int, foo: Foo ->
    Icon(...)
}
start = (y % 2 * 0.5 + x * width).dp, top = (y * height * 0.5).dp
This line is not correct because you apply the .dp extension to a pixel value instead of converting the pixel value to dp.
What you should be doing is:
val density = LocalDensity.current
density.run { (y % 2 * 0.5 + x * width).toDp() }
because the dp equivalent of any pixel value is calculated as
dpValue = valueInPixel / density
Let's say you have 100px on a device with density = 2.0f: your dp value should be 50.dp. If you calculate it as in your question, it comes out as 100.dp.
Try this. I don't have your data, so I'm not able to run your function.
@Composable
fun <T> IsometricGrid(
    gridWidth: Int,
    gridHeight: Int,
    cellWidth: Int,
    cellHeight: Int,
    data: List<T>,
    itemContent: @Composable (Int, T) -> Unit
) {
    val density = LocalDensity.current

    var width by remember { mutableStateOf(0) }
    var height by remember { mutableStateOf(0) }

    for (y in 0 until gridHeight) {
        for (x in 0 until gridWidth) {
            // val start = (y % 2 * 0.5 + x) * cellWidth
            // val top = y * cellHeight * 0.5
            val index = x * gridHeight + y

            val startInDp = density.run { ((y % 2 * 0.5f + x) * cellWidth).toDp() }
            val topInDp = density.run { (y * height * 0.5f).toDp() }

            Box(
                modifier = Modifier
                    .onSizeChanged {
                        width = it.width
                        height = it.height
                    }
                    // .padding(start = start.dp, top = top.dp)
                    .padding(
                        start = startInDp,
                        top = topInDp
                    )
            ) {
                itemContent(index, data[index])
            }
        }
    }
}
toDp() is an extension for Float in the Density interface:
/** Convert a [Float] pixel value to a Dp */
@Stable
fun Float.toDp(): Dp = (this / density).dp
I'm working on translating this ActionScript tutorial on binary space partitioning into Swift so I can use it in my rogue-like game. I came across a hitch.
In the article, the writer initializes his class like so:
public function Leaf(X:int, Y:int, Width:int, Height:int)
{
    // initialize our leaf
    x = X;
    y = Y;
    width = Width;
    height = Height;
}
When I translated this into Swift, I ran into an error: the code above doesn't initialize all of the class's declared properties. This leads me to an error that I can't seem to fix. Somehow, the writer of the article initializes his leftChild and rightChild variables in this function, which sits outside the initializer:
public function split():Boolean
{
    // begin splitting the leaf into two children
    if (leftChild != null || rightChild != null)
        return false; // we're already split! Abort!

    // determine direction of split
    // if the width is >25% larger than height, we split vertically
    // if the height is >25% larger than the width, we split horizontally
    // otherwise we split randomly
    var splitH:Boolean = FlxG.random() > 0.5;
    if (width > height && width / height >= 1.25)
        splitH = false;
    else if (height > width && height / width >= 1.25)
        splitH = true;

    var max:int = (splitH ? height : width) - MIN_LEAF_SIZE; // determine the maximum height or width
    if (max <= MIN_LEAF_SIZE)
        return false; // the area is too small to split any more...

    var split:int = Registry.randomNumber(MIN_LEAF_SIZE, max); // determine where we're going to split

    // create our left and right children based on the direction of the split
    if (splitH)
    {
        leftChild = new Leaf(x, y, width, split);
        rightChild = new Leaf(x, y + split, width, height - split);
    }
    else
    {
        leftChild = new Leaf(x, y, split, height);
        rightChild = new Leaf(x + split, y, width - split, height);
    }
    return true; // split successful!
}
This is somehow OK in ActionScript, but in Swift it leads to my problem.
Here is my translated code (Swift):
private let mapWidth:Int = 50
private let mapHeight:Int = 50

class Leaf {
    var leftLeaf = [Leaf]()
    var rightLeaf = [Leaf]()
    var minLeafSize:Int = 6
    var x, y, width, height: Int
    var leftChild:Leaf
    var rightChild:Leaf

    init (X:Int, Y:Int, W:Int, H:Int) {
        x = Y
        y = Y
        width = W
        height = H

        let maxLeafSize:UInt = 20
        var leaves = [Leaf]()

        // first, create a Leaf to be the 'root' of all Leafs.
        let root = Leaf(X: 0, Y: 0, W: mapWidth, H: mapHeight)
        leaves.append(root)

        var didSplit:Bool = true
        // we loop through every Leaf in our Vector over and over again, until no more Leafs can be split.
        while (didSplit) {
            didSplit = false
            for l in leaves {
                if l.leftLeaf.isEmpty == true && l.rightLeaf.isEmpty == true {
                    // if this Leaf is too big, or 75% chance...
                    if l.width > maxLeafSize || l.height > maxLeafSize || Int(arc4random_uniform(100)) > 25 {
                        if (l.split()) {
                            // if we did split, push the child leafs to the Vector so we can loop into them next
                            leaves.append(l.leftChild)
                            leaves.append(l.rightChild)
                            didSplit = true
                        }
                    }
                }
            }
        }
    }

    func split() -> Bool {
        if leftLeaf.isEmpty == true || rightLeaf.isEmpty == true {
            return false
        }

        var splitH = arc4random_uniform(100) > 50 ? true : false
        if width > height && Double(width / height) >= 1.25 {
            splitH = false
        }
        if height > width && Double(height / width) >= 1.25 {
            splitH = true
        }

        let max:Int = (splitH ? height : width) - minLeafSize // determine the maximum height or width
        if max <= minLeafSize { return false }

        let split:Int = Int(arc4random_uniform(UInt32(minLeafSize - max) + UInt32(max)))

        if (splitH) {
            leftChild = Leaf(X: x, Y: y, W: width, H: split)
            rightChild = Leaf(X: x, Y: y + split, W: width, H: height - split)
            leftLeaf.append(leftChild)
            rightLeaf.append(rightChild)
        } else {
            leftChild = Leaf(X: x, Y: y, W: split, H: height)
            rightChild = Leaf(X: x + split, Y: y, W: width - split, H: height);
            leftLeaf.append(leftChild)
            rightLeaf.append(rightChild)
        }
        return true
    }
}
It is identical (as far as I can figure) to the ActionScript code in the article, but it gives me an error: the leftChild and rightChild variables aren't initialized in my init method. When I move the split() -> Bool function into the init method, it won't let me use the function, giving me the error "Value of type Leaf has no member split()". Removing the l from the if (l.split()) line gives me a second error, "Use of local variable 'split' before its declaration". So the split() function has to stay outside the initializer.
If I attempt to initialize leftChild and rightChild like so:
init (X:Int, Y:Int, W:Int, H:Int) {
    x = Y
    y = Y
    width = W
    height = H
    leftChild = Leaf(X: x, Y: y, W: width, H: height)
    rightChild = Leaf(X: x, Y: y, W: width, H: height)
}
It creates an infinite loop that eventually causes a crash.
The code should be initializing leftChild and rightChild in the split() -> Bool function but I don't think that's how it works in Swift. You should be able to copy/paste it into a Swift file and get the same errors.
Why is this happening? Is my code poorly written? How can I fix this?
In ActionScript, uninitialised variables are automatically given the special value undefined; also, in ActionScript, undefined == null, which is why if (leftChild != null || rightChild != null) works.
In Swift, you need to explicitly allow your variables to be nil-able. The variables you are worried about need to start off as nil (which they automatically will, if you allow them to, by declaring their type as Optional - note the question mark):
var leftChild:Leaf?
var rightChild:Leaf?
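For illustration, here is a minimal sketch of how the class and split() could look with optional children. The parameter labels, the random helper and the minLeafSize constant are placeholders of my own, not the article's code:
import Foundation

class Leaf {
    static let minLeafSize = 6

    var x, y, width, height: Int

    // Optional children: they start out as nil, so the initializer compiles
    // without assigning them, just like the ActionScript version.
    var leftChild: Leaf?
    var rightChild: Leaf?

    init(x: Int, y: Int, width: Int, height: Int) {
        self.x = x
        self.y = y
        self.width = width
        self.height = height
    }

    func split() -> Bool {
        // Already split? Abort, mirroring the ActionScript null check.
        if leftChild != nil || rightChild != nil { return false }

        // Pick a split direction (same 25% rule as the article).
        var splitH = arc4random_uniform(2) == 0
        if width > height && Double(width) / Double(height) >= 1.25 {
            splitH = false
        } else if height > width && Double(height) / Double(width) >= 1.25 {
            splitH = true
        }

        let max = (splitH ? height : width) - Leaf.minLeafSize
        if max <= Leaf.minLeafSize { return false }

        // Random split position in [minLeafSize, max].
        let split = Leaf.minLeafSize + Int(arc4random_uniform(UInt32(max - Leaf.minLeafSize + 1)))

        if splitH {
            leftChild = Leaf(x: x, y: y, width: width, height: split)
            rightChild = Leaf(x: x, y: y + split, width: width, height: height - split)
        } else {
            leftChild = Leaf(x: x, y: y, width: split, height: height)
            rightChild = Leaf(x: x + split, y: y, width: width - split, height: height)
        }
        return true
    }
}
The BSP loop that builds the leaves can then check l.leftChild == nil && l.rightChild == nil and append l.leftChild! and l.rightChild! after a successful split, the same way the article's Vector loop does.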
The project is to find the boundary of an image. The given idea is to find the biggest difference between pixel values at successive clockwise pixel locations, spiralling in towards the center of the image, and thereby find the boundary pixel locations. Is there any way to iterate through pixels in a clockwise direction?
The usual row/column scan doesn't work for finding pixel differences along the boundary of the image.
for y in 0..<rgbaImage.height {
    for x in 0..<rgbaImage.width {
        let index = x * rgbaImage.height + y
        let pixel = rgbaImage.pixels[index]
        print("\((Int32(pixel.red) + Int32(pixel.green) + Int32(pixel.blue)) / 3)", terminator:" ")
    }
}
Here is a quick example of looping in a spiral to build an n x n array. You can use the same approach to loop through an image.
Paste this into a playground to test:
//: Playground - noun: a place where people can play

let n = 5
var spiral = [[Int]](repeating: [Int](repeating: 0, count: n), count: n)

var value = 0
var minCol = 0
var maxCol = n - 1
var minRow = 0
var maxRow = n - 1

// stride(from:through:by:) simply produces nothing when the bounds cross,
// so the inner loops stay safe as the spiral closes in on the center.
while value != n * n {
    // top edge, left to right
    for i in stride(from: minCol, through: maxCol, by: 1) {
        spiral[minRow][i] = value
        value += 1
    }
    // right edge, top to bottom
    for i in stride(from: minRow + 1, through: maxRow, by: 1) {
        spiral[i][maxCol] = value
        value += 1
    }
    // bottom edge, right to left
    for i in stride(from: maxCol - 1, through: minCol, by: -1) {
        spiral[maxRow][i] = value
        value += 1
    }
    // left edge, bottom to top
    for i in stride(from: maxRow - 1, through: minRow + 1, by: -1) {
        spiral[i][minCol] = value
        value += 1
    }
    // shrink the bounds and spiral inwards
    minCol += 1
    minRow += 1
    maxCol -= 1
    maxRow -= 1
}

// loop through the array we just built
for x in 0..<n {
    for y in 0..<n {
        print(spiral[x][y], terminator: "\t")
    }
    print()
}
The output looks like this:
0 1 2 3 4
15 16 17 18 5
14 23 24 19 6
13 22 21 20 7
12 11 10 9 8
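Applying the same bounds-shrinking idea to the image from the question, a rough sketch could look like the following. The Pixel/RGBAImage types, the pixels indexing and the threshold parameter are assumptions based on the snippet in the question, not a tested API:
// Assumed types matching the question's rgbaImage usage.
struct Pixel { var red: UInt8; var green: UInt8; var blue: UInt8 }
struct RGBAImage { var width: Int; var height: Int; var pixels: [Pixel] }

// Average brightness of the pixel at (x, y), using the question's indexing scheme.
func brightness(_ image: RGBAImage, _ x: Int, _ y: Int) -> Int {
    let p = image.pixels[x * image.height + y]
    return (Int(p.red) + Int(p.green) + Int(p.blue)) / 3
}

// Walk the image in a clockwise spiral and collect the points where the brightness
// jumps by more than the threshold compared to the previous pixel on the path.
func boundaryCandidates(in image: RGBAImage, threshold: Int) -> [(x: Int, y: Int)] {
    var minX = 0, maxX = image.width - 1
    var minY = 0, maxY = image.height - 1
    var previous: Int?
    var candidates: [(x: Int, y: Int)] = []

    func visit(_ x: Int, _ y: Int) {
        let b = brightness(image, x, y)
        if let prev = previous, abs(b - prev) > threshold {
            candidates.append((x: x, y: y))
        }
        previous = b
    }

    while minX <= maxX && minY <= maxY {
        for x in stride(from: minX, through: maxX, by: 1) { visit(x, minY) }          // top edge, left to right
        for y in stride(from: minY + 1, through: maxY, by: 1) { visit(maxX, y) }      // right edge, top to bottom
        if maxY > minY {
            for x in stride(from: maxX - 1, through: minX, by: -1) { visit(x, maxY) } // bottom edge, right to left
        }
        if maxX > minX {
            for y in stride(from: maxY - 1, through: minY + 1, by: -1) { visit(minX, y) } // left edge, bottom to top
        }
        minX += 1; minY += 1
        maxX -= 1; maxY -= 1
    }
    return candidates
}
Each ring is visited clockwise before the bounds shrink, so visit compares each pixel with the previous one along the spiral path.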
I'm trying to add space around a UITableViewCell in iOS 7, just like the grouped table cells in iOS 6, similar to a Facebook news-feed cell.
I have a custom UITableViewCell which I add to the table view (grouped style). Inside cellForRowAtIndexPath: I tried the following:
int x = cell.frame.origin.x;
int y = cell.frame.origin.y;
int h = cell.frame.size.height;
int w = cell.frame.size.width;

NSLog(@" x : %i y : %i h = %i w = %i", x, y, h, w);

CGRect newFram = cell.frame;
newFram.origin.x = 10;
cell.frame = newFram;

x = cell.frame.origin.x;
NSLog(@" x : %i y : %i h = %i w = %i", x, y, h, w);

[[cell layer] setBorderWidth:1.0f];
I tried to change the x origin of the cell. When I print x, it says 0 the first time and 10 the second time, but the cell always stays at 0.
2013-11-04 22:07:30.334 iosproj[6256:70b] x : 0 y : 0 h = 71 w = 320
2013-11-04 22:07:30.335 iosproj[6256:70b] x : 10 y : 0 h = 71 w = 320
2013-11-04 22:07:30.337 iosproj[6256:70b] x : 0 y : 0 h = 71 w = 320
2013-11-04 22:07:30.337 iosproj[6256:70b] x : 10 y : 0 h = 71 w = 320
Any help on this is appreciated.
Thanks,
This is not answering the question directly, but if you really wanted custom design for your view and cells, you could use a UICollectionView and UICollectionViewCells instead. You can then use a UICollectionViewLayout to handle the spacing between sections and cells. It's all very clear in the documentation for iOS.
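For example, here is a minimal sketch of that approach in Swift. The item size, spacing and inset values are arbitrary placeholders; UICollectionViewFlowLayout's sectionInset and minimumLineSpacing are what give you the gap around each cell:
import UIKit

class FeedViewController: UICollectionViewController {

    init() {
        // A flow layout provides spacing and insets out of the box.
        let layout = UICollectionViewFlowLayout()
        layout.itemSize = CGSize(width: 300, height: 71)   // placeholder cell size
        layout.minimumLineSpacing = 10                     // vertical gap between cells
        layout.sectionInset = UIEdgeInsets(top: 10, left: 10, bottom: 10, right: 10) // margin around the section
        super.init(collectionViewLayout: layout)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView.register(UICollectionViewCell.self, forCellWithReuseIdentifier: "Cell")
    }

    override func collectionView(_ collectionView: UICollectionView,
                                 numberOfItemsInSection section: Int) -> Int {
        return 20
    }

    override func collectionView(_ collectionView: UICollectionView,
                                 cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
        let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath)
        cell.layer.borderWidth = 1.0   // same border as in the question, just on a collection cell
        return cell
    }
}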
I'm building an Adobe AIR app and seem to be having an issue with localToGlobal. I've been following this guide to set up my app for multiple screen sizes:
http://www.adobe.com/devnet/air/articles/multiple-screen-sizes.html
But I always seem to be running into an issue with my localToGlobal. I have used most of the code in the "Scaling and centering an interface" section, and my app is adjusting its size properly.
I have converted all my localToGlobal calls and multiplied x and y by the new scale value, but I'm not getting the result I need. It still works fine at the original iPhone 4/4s resolution, but when testing at the 3GS resolution my line intersections, which specifically rely on the localToGlobal code, are not working.
Here's the code for a call to a local point:
var p:Point = localToGlobal(_lineStart);
p = parent.globalToLocal(p);
p.x *= GameHolderSun.scaleValue;
p.y *= GameHolderSun.scaleValue;
return p;
Determine Intersection:
var p:Point = determineLineIntersection(_laser.get2ndLastPoint(), _laser.getLastPoint(), _reflectors[i].lineStart, _reflectors[i].lineEnd);
if(p != null){
_laser.addNewLinePoint(p, _reflectors[i].rotation);
}
Line Intersection function I grabbed online:
private function determineLineIntersection(A:Point, B:Point, E:Point, F:Point, as_seg:Boolean=true):Point
{
    var ip:Point;
    var a1:Number;
    var a2:Number;
    var b1:Number;
    var b2:Number;
    var c1:Number;
    var c2:Number;

    a1 = B.y - A.y;
    b1 = A.x - B.x;
    c1 = B.x*A.y - A.x*B.y;
    a2 = F.y - E.y;
    b2 = E.x - F.x;
    c2 = F.x*E.y - E.x*F.y;

    var denom:Number = a1*b2 - a2*b1;
    if (denom == 0) {
        return null;
    }

    ip = new Point();
    ip.x = (b1*c2 - b2*c1)/denom;
    ip.y = (a2*c1 - a1*c2)/denom;

    ip.x *= GameHolderSun.scaleValue;
    ip.y *= GameHolderSun.scaleValue;

    if(as_seg)
    {
        if(Math.pow(ip.x - B.x, 2) + Math.pow(ip.y - B.y, 2) > Math.pow(A.x - B.x, 2) + Math.pow(A.y - B.y, 2)) return null;
        if(Math.pow(ip.x - A.x, 2) + Math.pow(ip.y - A.y, 2) > Math.pow(A.x - B.x, 2) + Math.pow(A.y - B.y, 2)) return null;
        if(Math.pow(ip.x - F.x, 2) + Math.pow(ip.y - F.y, 2) > Math.pow(E.x - F.x, 2) + Math.pow(E.y - F.y, 2)) return null;
        if(Math.pow(ip.x - E.x, 2) + Math.pow(ip.y - E.y, 2) > Math.pow(E.x - F.x, 2) + Math.pow(E.y - F.y, 2)) return null;
    }
    return ip;
}
Hope someone can help!! I can give more information if needed!!
I believe your issue is DPI, at least if you are setting applicationDPI. Stage values do not compensate for the forced DPI change.
To fix this, you have two options:
1) For stage.mouseX/mouseY and stage.stageHeight/Width, you can just use the FlexGlobals.topLevelApplication.systemManager properties. I believe SystemManager has mouseX/mouseY properties, and you can find the new width/height in the screen property.
2) Account for the DPI change yourself. It's relatively simple to do. For localToGlobal, I think this is what you will have to do:
var x:Number = 200; //this number is in default DPI size
var appDPI:Number = FlexGlobals.topLevelApplication.applicationDPI;
var deviceDPI:Number = Capabilities.screenDPI; //actual DPI of device

//Adobe doesn't use the actual DPI, they use either 160, 240, or 320
//and use ranges to determine which maps where
if ( deviceDPI < 200 ) {
    deviceDPI = 160;
}
else if ( deviceDPI >= 200 && deviceDPI < 280 ) {
    deviceDPI = 240;
}
else if ( deviceDPI >= 280 ) {
    deviceDPI = 320;
}

var newX:Number = x / ( deviceDPI / appDPI );
I highly suggest you create a Util class and throw this into a (maybe static) function, as you will likely end up using it often. The x var can be any number, as long as it has not already been adjusted for the DPI change.
Hope that helps. I beat my head on a rock trying to figure out why my stage.stageWidth and stage.stageHeight values were incorrect back in July.