Can the new UWB iPhones measure their distance towards other UWB iPhones? - ios

Is it practically possible for a developer right now (or at least theoretically in the future) to develop an app that can measure, via UWB, the distance to other iPhones?
UWB technology can take different forms in terms of ranging techniques. How does (or will) ranging work in these iPhones?

I have seen the iPhone 11 schematic and found that the U1 UWB chip supports 5 antennas: three are used for AoA positioning, and the other two support UWB channel 5 and channel 9 data transmission. Current iPhone and AirTag positioning should be carried out through the 3 AoA antennas, while the other 2 should let the iPhone be used as a tag (positioning against other iPhones, or as an anchor/node).

Yes. This might be a late answer, but iOS provides the Nearby Interaction framework to get distance and direction information.
https://developer.apple.com/documentation/nearbyinteraction
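In practice, ranging there works peer-to-peer: each iPhone shares its NIDiscoveryToken with the other device over a transport of your choosing (Multipeer Connectivity, your own server, etc.), then runs an NISession against the peer's token. A minimal sketch, assuming U1-equipped devices on both ends (the RangingController name and the token-exchange step are placeholders you'd fill in):

    import NearbyInteraction

    final class RangingController: NSObject, NISessionDelegate {
        private let session = NISession()

        override init() {
            super.init()
            session.delegate = self
        }

        // 1. Send this token to the peer over your own channel
        //    (Multipeer Connectivity, a server, Bluetooth, ...).
        var myToken: NIDiscoveryToken? { session.discoveryToken }

        // 2. When the peer's token arrives, start ranging against it.
        func startRanging(with peerToken: NIDiscoveryToken) {
            let config = NINearbyPeerConfiguration(peerToken: peerToken)
            session.run(config)
        }

        // 3. Distance and direction arrive via delegate callbacks.
        func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
            guard let peer = nearbyObjects.first else { return }
            if let distance = peer.distance {        // Float, in meters
                print("Distance: \(distance) m")
            }
            if let direction = peer.direction {      // simd_float3 unit vector
                print("Direction: \(direction)")
            }
        }
    }

Note that distance and direction are optionals: direction in particular is only filled in while the peer is within the antennas' field of view, so expect it to drop out as the devices move.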

Related

Will ARCore work on the Huawei Mate 10? It is working on the P20 line, and they have the exact same hardware except one less camera lens

It is weird for me to come here and see that the P20 line is ARCore capable but the Mate 10 line is not. I would like to know why, given that the hardware in the P20 Pro is almost the same except for the RAM and one more lens; it just doesn't make any sense to me.
As far as I know, it has something to do with calibrating each phone based on the camera and motion sensors, and their locations on the phone. So even though the specifications might seem similar, there are still differences in the locations of the sensors and cameras.
They might add support in the future.
Keep checking this page for supported devices. As I remember, when ARCore first became available it was not supported on devices like the Galaxy S8 and S8+, and support was added later, so keep an eye out for it.

Why is Pokemon Go running on unsupported devices?

If most devices don't support ARCore, then why is Pokemon Go running on every device?
My device is not supported by ARCore but Pokemon Go is on it with full performance.
Why?
Until October 2017, Pokemon Go appears to have used a Niantic-made AR engine. At a high level, the game placed the Pokemon globally in space at a server-defined location (the spawn point). The AR engine used the phone's GPS and compass to determine whether the phone should be moved to the left or to the right. Eventually, the phone pointed at the right heading and the AR engine drew the 3D model over the video coming from the camera. At that time there was no attempt to perform mapping of the environment, surface recognition, etc. It was a simple yet very effective technique which created the stunning effects we've all seen.
After that, Niantic showed prototypes of Pokemon GO using ARKit for iOS. The enhancements are easy to notice: missed pokeballs appear to bounce very naturally on the sidewalk and respect physics, and Pikachu seems to walk naturally on the sidewalk rather than floating in the air as in the then-current release. Most observers expected Niantic to replace the current engine with ARKit (iOS) and ARCore (Android), possibly via Unity 3D AR APIs.
In early 2018, Niantic improved the look of the game on Android by adding support for ARCore, Google's augmented reality SDK - a similar update to what we had already seen on iOS 11, which had been updated to support ARKit. The iOS update gave the virtual monsters a much greater sense of presence in the world, thanks to camera tracking, allowing them to stand on real-world surfaces more accurately rather than floating in the center of the frame. Android users need a phone compatible with ARCore in order to use the new "AR+" mode.
Prior to AR+, Pokémon Go would use rough approximations of where objects were to try to place the Pokémon in your environment, but it was a clunky workaround that functioned mostly as a novelty feature. The new AR+ mode also lets iOS users take advantage of a new capture bonus, called expert handler, which involves sneaking up close to a Pokémon, so as not to scare it away, in order to capture it more easily. With ARKit, since it's designed to use the camera together with the gyroscope and all the sensors, it actually feeds in 60 fps at full resolution. It's a lot more performant and it actually uses less battery than the original AR mode.
For iOS users there's a standard list of supported devices (a runtime check is sketched after this list):
iPhone 6s and higher
iPad 2017 and higher
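ARKit exposes this as a runtime capability check, so an app doesn't need to hard-code the device list; a minimal sketch:

    import ARKit

    // World tracking needs an A9 chip or newer (iPhone 6s and up).
    if ARWorldTrackingConfiguration.isSupported {
        // Safe to run a full AR session ("AR+"-style mode).
    } else {
        // Fall back to a non-AR presentation.
    }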
For Android users not everything is clear. Let's see why. Even if you have an officially unsupported device with poorly calibrated sensors, you can still use ARCore on your phone; for example, ARCore for All allows you to do it. So for Niantic, likewise, there would be no difficulty in making every Android phone suitable for Pokemon Go.
Hope this helps.

What to do with too many touches swift spritekit [duplicate]

Is there a way to find out the maximum number of simultaneous touches on an iOS device (iPhone, iPod Touch, iPad)? I've read here and there that the iPhone can handle 5 while the iPad can handle 11, but I haven't found an official way (through a function call, say) to confirm this.
By testing it! See here for videos and source: http://mattgemmell.com/2010/05/09/ipad-multi-touch
There's no public API to request that information from the hardware.
The iPhone can register 5 touch points on its tiny display; I don't know about the iPad.
Having said that, I wouldn't count on the numbers you find empirically, because this information is not documented, nor is there an API for it.
One good reason I can think of is reduced touch sensor accuracy as the number of touch points increases.
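If you do want to probe the limit empirically on your own hardware, a small SpriteKit scene along these lines counts concurrent touches (the scene name is made up; note that multi-touch is off by default and must be enabled on the view):

    import SpriteKit

    class TouchCountScene: SKScene {
        private var maxSeen = 0

        override func didMove(to view: SKView) {
            // Multi-touch is off by default on a UIView/SKView.
            view.isMultipleTouchEnabled = true
        }

        override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
            // allTouches holds every touch currently on screen, not just the new ones.
            let current = event?.allTouches?.count ?? touches.count
            maxSeen = max(maxSeen, current)
            print("touches now: \(current), max observed: \(maxSeen)")
        }
    }

Whatever you measure this way (around 5 on older iPhones, around 11 on iPads, per the figures above) is undocumented behaviour, so don't bake it into your design.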

Programmatically determine available iPhone camera resolutions

It looks like when I shoot video with UIImagePickerControllerQualityTypeMedium, on an iPod Touch it comes out 480x360, but on an iPhone 4 it's something higher (I can't say just what, as I don't have one handy at the moment), and on an iPad 2 presumably the same as the 4, if not something different again.
I'd like to shoot the same quality on all devices -- I have to add some frames and titles, and it'll make my life a lot easier if I just have to code that for one resolution. Is there any way to determine what the different UIImagePickerControllerQualityType values correspond to at run time? (Apart from shooting video with each and then examining the result, that is.)
Or is my only choice to use UIImagePickerControllerQualityType640x480?
If you need more customization/power on iOS than you get with the higher-level objects, such as UIImagePickerController, it is recommended to work at the next level down: the AV Foundation framework. Apple has some excellent documentation on AV Foundation programming that should come in handy for that purpose.
Unfortunately, even there you are limited to capturing at 640x480 if you want a resolution that is standard across all devices. There is, however, a great chart available at the same link (the anchors are broken in the docs, so Ctrl+F to "Capturing Still Images") that lists the resolutions various devices capture at under each quality setting.
Your most solid bet, assuming 640x480 is too small, is to work out some sort of scaling algorithm that would allow you to scale according to the overall resolution.
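In modern Swift terms, the AV Foundation route sketched above looks roughly like this; .vga640x480 is the preset that works on essentially every video-capable device, which is why it ends up as the lowest common denominator (makeCaptureSession is just an illustrative helper):

    import AVFoundation

    func makeCaptureSession() -> AVCaptureSession? {
        let session = AVCaptureSession()
        // 640x480 is the one preset you can rely on across all devices.
        guard session.canSetSessionPreset(.vga640x480) else { return nil }
        session.sessionPreset = .vga640x480

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return nil }
        session.addInput(input)
        return session
    }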

Is it reasonable to consider a future Retina / HD iPad when starting a new project?

A few days ago a client asked me if the transition to the iPhone 4's retina display was a difficult one, development-wise.
This made me ask myself whether I should have considered iPhones with high-resolution displays even before the iPhone 4 had been announced - creating artwork at higher resolution, preparing code paths... (while, of course, creating high-resolution artwork is never a bad idea, considering its use for marketing, porting to other platforms, etc.)
Now, with the iPad having been around for some months, the first rumors of a future iPad with a retina display emerge from the depths of the www. And I start wondering - would it make sense to prepare new projects for such an iPad? I'm pretty sure that Apple will in fact release a retina iPad at some point in the future, because it would be quite a logical step. So, I guess the important question is "how soon can we expect such a device?". There is much to consider when thinking about that, most of all production difficulties and the impact of a resolution of 2048 x 1536 (if Apple sticks to simply doubling the "old" specs) on a mobile device's performance...
So, what do you think? Will it pay off to prepare new projects for a retina iPad, starting now? Or do you think the overhead is not worth it yet?
Maybe some of you are already developing with the retina iPad in mind..?
I'd be glad to hear some of your thoughts! Thanks a lot, guys!
Edit:
Well, Apple just answered my question. Yes, it was in fact reasonable to consider a Retina iPad!
I wouldn't spend too much time making your app work on a theoretical device. But that doesn't mean you can't be prepared. Ever since they started changing things around I've been considering the following:
Use vector art wherever practical. That way resizing should be simple
Don't assume that the screen is 768x1024 or 320x480. Try to make your views gracefully resize (see the sketch after this list)
Don't assume that there will be an on-screen keyboard
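In modern Swift terms, the second point boils down to deriving layout from the view's bounds rather than from hard-coded screen constants; a minimal sketch (ResizingView is a made-up example):

    import UIKit

    class ResizingView: UIView {
        let label = UILabel()

        override func layoutSubviews() {
            super.layoutSubviews()
            // Size the subview from whatever bounds we actually have,
            // so a new screen size or resolution "just works".
            label.frame = bounds.insetBy(dx: bounds.width * 0.1,
                                         dy: bounds.height * 0.4)
        }
    }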
So far Apple has allowed time between announcing products and making them available, and even then un-optimised apps have still worked.
Most of my work is for a client who has their own designer, who provides me with layered Photoshop files to pick image elements out of. I now have a policy with them that ALL images will be provided to me at double resolution. I don't care if it's just text, if it's only going to be on the iPad, I want it at 2x no matter what.
That takes a lot of thinking and judgement out of the hands of the designer (who's a good designer but not a particularly good technician or strategist), and allows me maximum flexibility in what I'm building.
Right now I don't think I'd build @2x support into an iPad app (although presumably 4.2 will allow you to do it and have it downgrade nicely, just like 4.1 does), but I have the graphics here ready to install when needed.
A few of Apple's apps (such as iBooks) have already been seen in the wild with @2x iPad graphical elements (mistakenly?) left in, so it is clear that a retina iPad is coming as soon as it is practical for Apple to affordably include such an incredibly hi-res panel.
It might be later this year, it might be a year from now, or it might be two years from now.
It doesn't hurt at all to prepare now though. It is easy to downres graphics, but it is often impossible to upres graphical elements without redoing them from scratch.
So, short answer - do everything at @2x resolution now, but wait to include it in your app until the time is right. When Apple issues the call for retina iPad apps, you'll be ready to go and able to be featured on day 1.
I'm going to agree with the others. I'll go out on a limb and say I think it is highly likely that a Retina iPad will have 2x the horizontal and vertical resolution of the current iPad screen, just like they did with the iPhone, because it is such a freaking clever idea: it makes the new resolution relatively easy for developers to support, it keeps backwards compatibility with apps that have not been updated, and it gives Apple a mechanism for preventing developers from building an I'll-cram-in-more-UI-onto-the-high-resolution-version interface...
So absolutely, planning ahead for this is a good idea. That said, the ideal would be to plan for full resolution independence where possible, using vector artwork and so on so you can re-export at new resolutions with minimum hassle.
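Where artwork can be drawn rather than shipped as bitmaps, Core Graphics drawing is resolution independent by nature; a small sketch (BadgeView is hypothetical):

    import UIKit

    class BadgeView: UIView {
        override func draw(_ rect: CGRect) {
            // Vector paths render at the screen's native scale,
            // so the same code is crisp at 1x and 2x alike.
            let path = UIBezierPath(ovalIn: bounds.insetBy(dx: 4, dy: 4))
            UIColor.blue.setFill()
            path.fill()
        }
    }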
