Impact of using LLVM-GCC to resolve issues on 2nd generation device running iOS 4.2.1 [duplicate] - ios-4.2

This question already has an answer here:
Coordinates all wrong on iPhone 3G? It could be your compiler
I have an app that uses touch events to draw on the screen. I use UITouch locationInView to return the CGPoint of the touch. The problem is that the x and y coordinates are always the same — BUT only on 2nd generation devices running iOS 4.2.1 AND only when my app is built in Release mode. This also only seems to be a problem when taking touches directly from a touchesMoved or touchesEnded event object, since my buttons respond to touches correctly.
Thanks to a post at http://getmoai.com/forums/new-users/ios-touch-sensor-y-co-ordinate-always-the-same-as-x/ I was able to fix the problem by using the LLVM-GCC compiler rather than the newer LLVM 3.0 compiler and by using optimization level 0.
What is interesting is that using the LLVM-GCC compiler corrected the touch locations I received in touchesEnded, while changing the optimization level to -O0 corrected the touch locations I received in touchesMoved. I cannot explain why, but for now I'm ecstatic that my app is working on these devices.
My questions, then: what are the downsides to delivering my app to the store built with the older compiler? (I understand the impact of the lower optimization level.) And is there any way to configure the project so that the older compiler and lower optimization level are used only for iOS 4.2.1 and/or 2nd generation devices?

I am using this hack:
// not working (location.x == location.y)
CGPoint location = [touch locationInView:_myView];

// working!
static CGPoint location;
location = [touch locationInView:_myView];

Related

In React Native is there a way to recognize stylus (pen) vs touch (finger) event?

I'm working on an RN application that has one screen with a list of "drawable" areas in it, so this screen should be scrollable AND drawable.
What I'm trying to do is find a way to distinguish touch events coming from fingers (these will be used to scroll, with drawing disallowed) from those coming from a stylus via Apple Pencil (these will be used to draw, with scrolling disallowed).
In both Gesture Responder and PanResponder, events are passed on each move. Each of those events (along with the nativeEvent) contains a type property. However, it's always null for me, on both the simulator and a device.
Is there any way to recognize a move event as a finger vs stylus?
We had a similar requirement in one of our projects, and what we did was use a Pressable component, to which a handlePress function was passed as a prop.
This function accepted the GestureResponderEvent as its event argument.
By using the event.nativeEvent.altitudeAngle property that was added recently, we were able to detect Apple Pencil touches.
function handlePress(event: GestureResponderEvent) {
    // @ts-expect-error React Native types do not include altitudeAngle
    const isPencilTouch = !!event.nativeEvent.altitudeAngle;
    ...
}
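The pencil check itself can be factored into a plain, testable helper. This is a minimal sketch (the name isPencilTouch and the standalone signature are mine, not React Native API); it assumes only that finger touches leave altitudeAngle undefined or null, while Apple Pencil touches report a positive angle, matching the truthiness check in the answer:

```typescript
// Hypothetical helper: treats any touch carrying a positive
// altitudeAngle as an Apple Pencil touch. Finger touches leave the
// property undefined, so the check falls through to false.
function isPencilTouch(altitudeAngle?: number | null): boolean {
  return typeof altitudeAngle === "number" && altitudeAngle > 0;
}
```

Note that an angle of exactly 0 is also treated as a finger, which mirrors the `!!` coercion used above.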

MapKit iOS rendererForOverlay refreshing out of control

I have a MapKit issue with MKMapView using addOverlay and rendererForOverlay. Testing and debugging are being done on a device (iPhone 7, iOS 11.1.1) with Xcode 9.1 (9B55). The overlay renderer is refreshed repeatedly for all tiles in the map view (2500 calls per second to drawMapRect:). The calls to the renderer ignore the changed rectangle passed to setNeedsDisplayInMapRect: and are not initiated by setNeedsDisplayInMapRect:. This refreshing continues indefinitely, even after all map updates have finished, with Xcode reporting the app is using over 160% CPU.
The MKMapView code is based on the Apple Sample code 'BreadCrumb' available from https://developer.apple.com/library/content/samplecode/Breadcrumb/Introduction/Intro.html. There are no significant structural changes to this code.
Has anyone else experienced this or have any suggestions of where to start looking for a solution?
Running the Apple Breadcrumb sample did not exhibit the same problem. After putting the sample back into my project and re-adding my changes one at a time, I was finally able to isolate the problem to having inserted 'self.alpha = 0.5' into drawMapRect:. It does not matter whether the alpha property is set to 1.0 or some other value; the problem still occurs.
- (void)drawMapRect:(MKMapRect)mapRect
          zoomScale:(MKZoomScale)zoomScale
          inContext:(CGContextRef)context
{
    CrumbPath *crumbs = (CrumbPath *)(self.overlay);
    self.alpha = 0.5; // <-------- THE PROBLEM: mutating a renderer
                      // property here invalidates the renderer and
                      // triggers another round of drawMapRect: calls
With the problem resolved, overlay renderer calls reverted to between 40 and 80 per second, with no calls occurring outside of map updates and calls to setNeedsDisplayInMapRect:.

Can I disable multitouch for a Windows Phone Application?

I'm currently working on a game for Windows Phone 7, using the XNA version of Cocos2d.
Due to the rules of the game, I need the user to be able to touch only one thing at a time, but multitouch seems to always be in effect. Additionally, I don't know if this is a Cocos error, but it also causes the game to behave erratically (responding to a single touch as if it were many).
I guess I could correct every touch event in the game one by one, but I was wondering if there is something I can use to disable multitouch quickly, or to reduce the number of accepted touches to one at a time.
I'm not sure about Cocos2d-x for XNA, but in regular XNA, if you want to force single-touch input, the simplest way to do that is by using the Mouse class. In a touch environment it is still available, emulated using touches, and it only responds to a single touch at a time.
Each frame you can get the list of touches. Since their management is delegated to your code, just ignore them all if there is more than one. Another option is to use only the first one: remember its TouchID and ignore all the rest.
I've used the first option when porting mouse applications over to the phone.
Cocos touch input has to be treated somewhere in the game, in accessible code, so you should have access to their point of entry.
Yes, you can do this with Cocos2D-XNA. You set the TouchMode to either OneByOne or AllAtOnce. OneByOne will give you the single-CCTouch signature methods, and AllAtOnce will give you the List signature methods.
public MyCtor() {
    TouchEnabled = true;
    TouchMode = CCTouchMode.OneByOne;
}

public override bool TouchBegan(CCTouch t) {
    return true; // claim this touch so TouchMoved/TouchEnded are delivered
}

public override void TouchMoved(CCTouch t) {
}

public override void TouchEnded(CCTouch t) {
}
Now you only get one touch at a time. There's no way to disable the touch screen's delivery of touches from all fingers, though; as another user mentioned, you would just ignore the extras.
Remember that you get a touch ID with every touch, which lets you match the touch data across the began, moved, and ended calls. I suggest you make use of the touch ID as well, to ensure that you are processing only the touches you want.
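The bookkeeping both answers describe, accepting the first touch and ignoring every other finger until it ends, is framework-agnostic. Here is a minimal sketch in TypeScript (the SingleTouchTracker class and its method names are hypothetical; it assumes only that each touch carries a stable numeric ID):

```typescript
// Hypothetical tracker that accepts the first touch it sees and
// ignores every other finger until that touch ends.
class SingleTouchTracker {
  private activeId: number | null = null;

  // Returns true if this began event should be processed.
  began(id: number): boolean {
    if (this.activeId !== null) return false; // a touch is already active
    this.activeId = id;
    return true;
  }

  // Only moves belonging to the active touch are processed.
  moved(id: number): boolean {
    return this.activeId === id;
  }

  // Ending the active touch frees the slot for the next one.
  ended(id: number): boolean {
    if (this.activeId !== id) return false;
    this.activeId = null;
    return true;
  }
}
```

The same began/moved/ended gatekeeping maps directly onto the TouchBegan/TouchMoved/TouchEnded overrides shown above.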

Air for iOS: Testing swipe gesture on my mac

I'm working right now on an iOS project with AIR. Everything was fine, and I implemented the "swipe" gesture among some others. In the beginning, the gesture was recognized, the event fired, everything was alright.
Then I changed something (don't ask me what), and the event is no longer dispatched.
So my problem is: if I want to test it, I have to compile, then upload to testflightapp.com, then install it. That takes a lot of time.
Is there any possibility of testing the gesture on my Mac while developing, so I can trace and check what is happening (or what isn't)?
I compile with the Flash IDE CS5.5, but could also use Flash Builder if there is a solution for it.
And just for your interest, here is the relevant part of my code:
Multitouch.inputMode = MultitouchInputMode.GESTURE;
contestView.addEventListener(TransformGestureEvent.GESTURE_SWIPE, onSwipeContest);

function onSwipeContest(e:TransformGestureEvent):void {
    if (e.offsetX == 1) {
        // user swiped towards the right
        // do something
    }
}
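For reference, GESTURE_SWIPE reports its direction through offsetX and offsetY, each of which is -1, 0, or 1, so e.offsetX == 1 above means a rightward swipe. The mapping can be sketched as a plain function; this is an illustration in TypeScript, and the helper name swipeDirection is mine:

```typescript
type SwipeDirection = "left" | "right" | "up" | "down" | "none";

// offsetX === 1 means a rightward swipe, -1 leftward; offsetY === 1
// means downward, -1 upward (the stage's y axis points down). For a
// recognized swipe, exactly one of the two offsets is non-zero.
function swipeDirection(offsetX: number, offsetY: number): SwipeDirection {
  if (offsetX === 1) return "right";
  if (offsetX === -1) return "left";
  if (offsetY === 1) return "down";
  if (offsetY === -1) return "up";
  return "none";
}
```

Checking all four directions this way also makes it obvious when a handler silently ignores three of them, which is worth ruling out while debugging a gesture that stopped firing.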
This is a lame answer, but why not skip the TestFlight step and deploy straight to your phone? Do you have Xcode installed? If so, fire it up, open the Organizer, set your device for development, and have at it. Way, way faster that way.
Anyway, are you sure that gesture events aren't functioning on your Mac? I've found that multitouch events are spotty, but gesture ones usually work OK on the trackpad from AIR 2 onward.

presentationLayer position property yielding bad values on iPad, no problem with iPhone

I have a game with several small objects animated using CAKeyframeAnimation. Objects animate perfectly on BOTH iPhone and iPad. However, the position property of the animated CALayers' presentationLayer yields reasonable values only on the iPhone. I use the current position of the animating objects for hit testing. Any ideas on differences in this area between the iPad and the iPhone/iPod Touch?
The position points for iPhone show expected incremental change as objects animate; on iPad I see peculiar values, for example this sequence:
<-36893488147419103232.000000,2.607987>,
<-0.000000,2.658927>,
<0.000000,2.709929>,
<36893488147419103232.000000,2.755450>, ...
Other properties of the presentationLayer are correct (these are properties whose values don't change during the animation, however).
After the animation finishes, the presentationLayer position value IS accurate.
Exactly the same problem here! It looks like a bug introduced in version 3.2 of the SDK.
I built an iPhone app (using the 3.1 SDK) that is completely based on CAAnimation; when I installed and ran the app on an iPhone with 3.2, nothing worked, but on devices with 3.1 it works perfectly. I googled it and found other people with exactly the same problem.
Take a look at the bottom of the following forum thread:
http://www.iphonedevsdk.com/forum/iphone-sdk-development/19622-current-position-animating-calayer.html
Were you able to solve this issue or find a workaround?
