This question is mentioned in the explanation of How can I get a [Glance] Interface Controller / blank slate for Apple Watch?, but it is a separate, more basic question.
In iOS, if you have a UIImage, you can create a UIImageView which supports, among other things, rotation, translation, and other transforms. In the Swift that I've seen, you can create a WKInterfaceImage, for instance:
@IBOutlet var foo: WKInterfaceImage!
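(For comparison, the kind of iOS-side transform I mean might look something like this minimal sketch; the image name is just for illustration.)

import UIKit

// iOS: a UIImageView can be rotated and translated directly.
let imageView = UIImageView(image: UIImage(named: "dial"))
imageView.transform = CGAffineTransform(rotationAngle: .pi / 4).translatedBy(x: 20, y: 0)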
However, there were no search results for anything like WKInterfaceImageView.
How can I accomplish on an Apple Watch the work that might be done on an iPhone by creating a UIImageView from a UIImage? Are old-fashioned UIImageViews/UIImages still available? Or is the best available approach something like computing an image on the iPhone and dynamically offering it to the watch?
Thanks,
First I want to explain exactly what I did on Android, since I am now trying to do the same thing on iOS:
I developed a library on Android that contains an activity; in that activity we have several types of views (labels, buttons, etc.).
These views are used as the entry point/inputs for my library, which provides some services to other apps.
To match the general style of the host app, I created my own View classes; the user (of my library) needs to provide a style id to the activity before launching it, so that the id can be used when constructing the views.
This way we maintain the style of the host app without having to re-implement the views for every app that uses my library, and without making the end user (on the physical device) feel as if they have just moved to another app.
I am really a newbie on iOS and only recently started learning it, so I am not sure how to do this. Can you please provide some guidelines?
On iOS there are so many ways to create views (XIBs, storyboards, SwiftUI, and programmatically) that I am not really sure where to start.
Here's a sample of my code on Android:
// A themed container: the host app supplies a theme/style id, which is used
// to wrap the Context before inflating the library's layout.
class MyView(context: Context, val themeId: Int) : FrameLayout(context) {
    init {
        val wrappedContext = ContextThemeWrapper(context, themeId)
        val inflater = wrappedContext.getSystemService(Context.LAYOUT_INFLATER_SERVICE) as LayoutInflater
        inflater.inflate(R.layout.my_view_layout, this, true)
    }
}
This is a very broad question, but the short answer is "sure, it can be done."
I suppose a quick example might be:
I have a custom UIView with a UILabel centered in it, with 20-pts padding on all 4 sides.
I have 4 "themes" defined, defining the view's background color, the label's text color and the label's font style.
I write a custom init func to accept an Int "id" of 1 through 4
The library user could then instantiate it along the lines of let myView = MyView(themeId: 2)
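A minimal sketch of what that could look like, assuming four hypothetical placeholder themes (just one way to organize it, not a definitive implementation):

import UIKit

// Hypothetical theme table: id 1...4 maps to a background color, text color, and font.
enum Theme: Int {
    case light = 1, dark, sepia, highContrast

    var background: UIColor {
        switch self {
        case .light:        return .white
        case .dark:         return .black
        case .sepia:        return UIColor(red: 0.96, green: 0.93, blue: 0.86, alpha: 1)
        case .highContrast: return .yellow
        }
    }

    var textColor: UIColor {
        switch self {
        case .light, .sepia: return .darkGray
        case .dark:          return .white
        case .highContrast:  return .black
        }
    }

    var font: UIFont {
        switch self {
        case .highContrast: return .boldSystemFont(ofSize: 17)
        default:            return .systemFont(ofSize: 17)
        }
    }
}

final class MyView: UIView {
    private let label = UILabel()

    init(themeId: Int) {
        super.init(frame: .zero)
        let theme = Theme(rawValue: themeId) ?? .light

        backgroundColor = theme.background
        label.text = "Hello"
        label.textColor = theme.textColor
        label.font = theme.font
        label.textAlignment = .center
        label.translatesAutoresizingMaskIntoConstraints = false
        addSubview(label)

        // Label centered in the view with 20-pt padding on all four sides.
        NSLayoutConstraint.activate([
            label.topAnchor.constraint(equalTo: topAnchor, constant: 20),
            label.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 20),
            label.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -20),
            label.bottomAnchor.constraint(equalTo: bottomAnchor, constant: -20)
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}

// Usage, matching the call above:
// let myView = MyView(themeId: 2)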
As this is a very broad topic, you could come back with specific coding questions once you start developing your custom view.
I have an iOS app that dynamically draws shapes on a UIView (through drawRect:), and I am looking at the possibility of porting that app (or a small part of it) to Apple Watch. Unfortunately, after reading through related posts, I cannot find a class that can handle a similar job in WatchKit.
So the question is:
Is there a class similar to UIView in WatchKit for drawing lines and circles on the Apple Watch?
If there is no UIView-like class, what is the best (possible) way to implement a simple drawing function on Apple Watch? (Assume this is achievable.)
Thanks in advance.
No, there is no class similar to UIView in WatchKit
You can generate images and transfer them to Watch App using WKInterfaceImage's func setImage(image: UIImage?)
If you already have drawing code in -drawRect:, it shouldn't be that difficult to modify it to draw into an image instead
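For example, a minimal sketch along those lines, with the drawing done in the WatchKit extension (the controller and outlet names are just for illustration):

import UIKit
import WatchKit

class ShapesInterfaceController: WKInterfaceController {
    @IBOutlet var shapeImage: WKInterfaceImage!

    override func willActivate() {
        super.willActivate()
        shapeImage.setImage(renderShapes(size: CGSize(width: 140, height: 140)))
    }

    // The same Core Graphics calls that would have lived in drawRect(_:),
    // redirected into an offscreen image context.
    private func renderShapes(size: CGSize) -> UIImage? {
        UIGraphicsBeginImageContextWithOptions(size, false, 0)
        defer { UIGraphicsEndImageContext() }
        guard let ctx = UIGraphicsGetCurrentContext() else { return nil }

        ctx.setStrokeColor(UIColor.white.cgColor)
        ctx.setLineWidth(2)
        ctx.strokeEllipse(in: CGRect(x: 10, y: 10, width: 120, height: 120))
        ctx.move(to: CGPoint(x: 10, y: 70))
        ctx.addLine(to: CGPoint(x: 130, y: 70))
        ctx.strokePath()

        return UIGraphicsGetImageFromCurrentImageContext()
    }
}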
When I was watching one of the videos from Apple about the Apple Watch and its features I noticed that the page indicator dot colours change depending on the page presented.
They've also got images of the coloured dots within the Apple Watch Human Interface Guidelines App Anatomy section. Note the coloured title text as well.
Normally in WatchKit I would use the following snippet of code to do this:
let features = ["First", "Second", "Third"]
let controllers = [String](count: features.count, repeatedValue: "FeatureInterfaceController")
self.presentControllerWithNames(controllers, contexts: features)
Unfortunately as far as I know there isn't a way to access the currentPageIndicatorTintColor and pageIndicatorTintColor as you normally would as part of UIKit here (scroll down to Tint Color).
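For reference, this is roughly what that looks like with a UIPageControl in UIKit (a minimal sketch; WatchKit exposes no counterpart to these properties):

import UIKit

let pageControl = UIPageControl()
pageControl.pageIndicatorTintColor = .darkGray
pageControl.currentPageIndicatorTintColor = .orange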
I wonder whether Apple simply added those colours for the video, or whether they're actually going to be available as part of WatchKit in the future.
Perhaps I'm wrong and there is a way to change these dot colours.
As written by an Apple Engineer in the Dev Forums, it's not possible to customize the color of the page control at this time.
Source: https://devforums.apple.com/message/1100695#1100695
I'm currently working with the AVCam demo app to present a live camera feed over AirPlay or the Apple HDMI adapter for import into an HD camera switcher.
The issue I'm having is with overscanCompensation: I need to remove the huge black border from the mirrored view.
The only documentation I have found says to set screen.overscanCompensation = 3; somewhere. I have tried putting it in viewDidLoad, which compiles, but it doesn't change anything on the external display.
I had success of sorts with the AirPlay Demo (quellish) using UIImagePicker, but I would much prefer to use AVFoundation for this exercise.
Is there a better way to achieve what I'm looking for without having to implement separate view controllers?
All you need to do is, when setting up the external screen (via, say, if ([[UIScreen screens] count] > 1) externalScreen = (UIScreen *)[[UIScreen screens] objectAtIndex:1];), set the overscanCompensation property of that UIScreen instance to UIScreenOverscanCompensationInsetApplicationFrame (= 2). That entirely gets rid of both the border (overscanning) and the image-quality-deteriorating scaling.
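In Swift, a minimal sketch of the same idea (assuming the external screen is already connected) might look like this:

import UIKit

// Configure the first external screen, if present, so the TV no longer letterboxes the mirrored image.
func configureExternalScreenIfNeeded() {
    guard UIScreen.screens.count > 1 else { return }
    let externalScreen = UIScreen.screens[1]
    // Raw value 2, i.e. UIScreenOverscanCompensationInsetApplicationFrame.
    externalScreen.overscanCompensation = .insetApplicationFrame
}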
See http://www.iphonelife.com/blog/87/tv-display-output-why-does-your-picture-have-black-border-and-how-can-it-be-fixed for more info.
I have a sample program that I was interested in emulating when building my main application, but that sample program does something that I think is probably not permitted. I found some forum posts suggesting that Apple will reject an app from the App Store if it does this:
@interface MainTabBarController : UITabBarController <XXCustomAccessoryDelegate, UIAccelerometerDelegate> {
I'm relatively new to iOS development, and I'm not 100% certain about this. I have googled a bit and found some information suggesting "not to do this", but not a firm "you should not do this".
The UITabBarController subclass in this sample has IBOutlet and IBAction connection points, and these are a central part of the application, but it does not override any drawing code. Do I need to rewrite or adapt the code if I want to reuse part of it before Apple will permit it in the App Store? Or is it only overriding to access the view and modify the drawing code that Apple does not permit?
The docs changed between iOS 5 and iOS 6.
iOS 5: "This class is not intended for subclassing."
iOS 6: "This class is generally used as-is but may be subclassed in iOS 6 and later."