Force crop/start with crop tool open - iOS

With the Aviary SDK, it was possible to start the view controller with a tool open by default, and force the user to crop the photo before proceeding to close the editor.
I have scoured the Adobe Creative SDK documentation for the equivalent functions, but can find nothing.
Is it possible to do with the Adobe SDK?
(FYI: Adobe bought out the Aviary SDK)

The Creative SDK Image Editor UI component doesn't currently support Quick Launch, the feature you are referring to.
However, it is possible to set the list of tools available to the user using the setToolOrder: method.
For example, if you only want to allow cropping, you can do this:
// Set the tool list to show Crop only (Objective-C array literals use @[...]).
[AdobeImageEditorCustomization setToolOrder:@[kAFCrop]];
For further information, see the "Tool Order" section of the Image Editor UI guide on creativesdk.adobe.com.

Related

How to detect and click on Mapbox markers in native Android application?

We are trying to create automated tests for a native mobile application built on Mapbox's Maps SDK for Android. The automated tests need to determine the number of markers present on the screen, the number of marker clusters present on the screen, click on a marker or cluster, and so on.
When looking at the Mapbox maps in the Android application through UIAutomatorViewer or through the Appium inspector, the markers visible on the map are not shown in the object hierarchy.
What can the Android native mobile application development team do to surface the markers/clusters so that they are visible to Appium?
Alternatively, what other options can the automation team explore to develop automated tests? Espresso is not ideal, as the automation team does not have access to the source code of the native mobile application.
Please see the Mapbox demo application's -> Annotations -> Draw a marker for an example of a marker that we would like to detect and click on.
Since the Mapbox SDK renders the map with OpenGL rather than native UI components, most test automation frameworks won't be able to recognise map elements such as the markers or clusters you add.
Some testers use image recognition based on Accelerated-KAZE (AKAZE) features to find items on screen in order to count and select them. Example code can be found at https://github.com/bitbar/bitbar-samples/tree/master/image-recognition
Another approach may be to have the mobile app developers include testing hooks that allow your testing code to use the Mapbox API to query rendered features. Mapbox documentation for that is here: https://www.mapbox.com/android-docs/maps/overview/query/#query-rendered-features
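A rough sketch of what such a testing hook could look like, assuming the app renders its markers in a style layer whose id (here "marker-layer") is hypothetical and would need to be replaced with the app's real layer id; queryRenderedFeatures is part of the Maps SDK for Android:

```java
import android.graphics.PointF;
import android.graphics.RectF;
import com.mapbox.geojson.Feature;
import com.mapbox.mapboxsdk.maps.MapboxMap;
import java.util.List;

// Test-only helper the app team could expose so Appium-driven tests can
// query what the GL map is actually rendering.
public final class MapTestHooks {
    private final MapboxMap mapboxMap;

    public MapTestHooks(MapboxMap mapboxMap) {
        this.mapboxMap = mapboxMap;
    }

    // Count the markers currently rendered inside the given screen rectangle.
    public int visibleMarkerCount(RectF screenRect) {
        List<Feature> features =
                mapboxMap.queryRenderedFeatures(screenRect, "marker-layer");
        return features.size();
    }

    // Report whether a marker is rendered at a given screen pixel,
    // e.g. before the test dispatches a tap at that point.
    public boolean hasMarkerAt(float x, float y) {
        List<Feature> features =
                mapboxMap.queryRenderedFeatures(new PointF(x, y), "marker-layer");
        return !features.isEmpty();
    }
}
```

The hook could be surfaced to the test process through a broadcast receiver or a debug-build-only activity, so the production APK is unaffected.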
I have come across this scenario before: an element that was displayed on the screen was not highlighted when selected in the Appium inspector.
Only later, when I scanned the complete hierarchy tree by clicking on each node, was I able to find the element.
I would suggest you click on every node and check; the marker may be there in the hierarchy tree, and the inspector cursor is simply unable to highlight it.

Markup feature in images in Swift

I've created a chat app where users can chat and share images. Now I need to add a feature where the user can edit and draw annotations on images. I've heard that iOS 9 added a feature called Markup for editing images. I've checked UIActivityViewController and couldn't find it. Is this feature available, and can it fit my requirement? If not, are there any other alternatives that I can pursue?
Yes, you are right: the Markup feature has been available since iOS 9.
iOS 9 introduced a set of markup tools for images that you attach to a message in the Mail app.
To use this feature:
1) Open the Mail app and compose a new message (or reply to an old one if that's what you need to do).
2) Tap and hold on the message body area and, in the floating actions bar, tap the Insert Photo or Video option. Insert a photo from your camera roll.
3) Tap and hold on the photo and, in the floating actions bar that appears on it, tap the new 'Markup' option.
--> Your photo will now be in editing mode.
From the whats-new link you will find that Apple implemented the Markup option for Mail.
So, as per your question: the Markup facility is provided by default by Apple in Mail, and I don't think you can invoke it programmatically. But when the user opens Mail via a UIActivityViewController from your application, they will get the Markup feature by following the procedure above. (The user has to go through that process manually to use Markup.)
You can also check this video on YouTube:
How to markup and annotate an attachment in iOS 9
Reference link regarding the Markup feature:
ios-9-markup-annotate-email-attachments
I hope this info is helpful for you.
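Since Mail's Markup can't be invoked from your own code, a common alternative is to build a minimal annotation layer yourself. Below is a hedged sketch (not Apple's Markup, just a stand-in): it assumes you collect the user's freehand strokes as arrays of points in image coordinates, and it composites them onto the image with UIGraphicsImageRenderer:

```swift
import UIKit

// Burn a set of freehand strokes into an image. Each stroke is an
// array of points in the image's own coordinate space.
func annotate(_ image: UIImage,
              strokes: [[CGPoint]],
              color: UIColor = .red,
              lineWidth: CGFloat = 4) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: image.size)
    return renderer.image { _ in
        // Start from the original image.
        image.draw(at: .zero)

        color.setStroke()
        for stroke in strokes where stroke.count > 1 {
            let path = UIBezierPath()
            path.lineWidth = lineWidth
            path.lineCapStyle = .round
            path.move(to: stroke[0])
            for point in stroke.dropFirst() {
                path.addLine(to: point)
            }
            path.stroke()
        }
    }
}
```

Collecting the strokes is a matter of tracking touches in a view overlaid on the image; the flattened result can then be sent through your existing chat image pipeline.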

How do I invoke iOS photo editing extension?

Let's say I'm writing a word-processing application. Users can embed images using the app. Now since the application is not a photo editor, naturally it would delegate photo editing of embedded images to other applications.
The question is, how to do that? How to invoke photo editing extensions in an iOS app?
Theoretically it should be as simple as passing an image in, invoking the extension, and getting another image back as the "edited" result. However, the SDK documentation doesn't seem to provide any hint on how to do this.
Looks like the SDK doc actually says it doesn't provide a UI per se:
"When using built-in editing controls, the image picker controller enforces certain options. For still images, the picker enforces a square cropping as well as a maximum pixel dimension. For movies, the picker enforces a maximum movie length and resolution. If you want to let the user edit full-size media, or specify custom cropping, you must provide your own editing UI."
From:
https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/CameraAndPhotoLib_TopicsForIOS/Articles/PickinganItemfromthePhotoLibrary.html
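For the built-in editing controls that quote refers to, the relevant switch is allowsEditing on UIImagePickerController. A minimal sketch (square crop only, as the documentation warns):

```swift
import UIKit

class EmbedImageViewController: UIViewController,
        UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func pickAndCropImage() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        picker.allowsEditing = true   // enables the built-in square-crop UI
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
            didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        // .editedImage holds the cropped result; fall back to the original.
        let image = (info[.editedImage] ?? info[.originalImage]) as? UIImage
        picker.dismiss(animated: true)
        // ... embed `image` into the word-processing document here
        _ = image
    }
}
```

This gives you cropping only, not a full editor, which is why a third-party editing UI can still be attractive.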
I'm looking at the Adobe Creative SDK Image Editor, which seems to provide a UI. Might show some of their branding in the photo editor, however. Hope to follow up...

Resizable selection area on realtime video

I want to develop an iOS app that will grab specific text from a paper using a resizable selection area over real-time video. Click here to see an app with a similar feature. Can you please tell me which API or code I should use to implement similar features on iOS? I would highly appreciate links to valuable resources, projects, or code samples.
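One way to approach this on iOS is the Vision framework's text recognition, feeding it frames from an AVCaptureVideoDataOutput and restricting recognition to the user's selection rectangle. A hedged sketch, assuming iOS 13+ and that you have already converted your on-screen selection into Vision's normalized, lower-left-origin coordinates:

```swift
import Vision

// Recognize text only inside the user's selection area of a captured frame.
// `selection` is a normalized CGRect (0...1) with a lower-left origin,
// which is what Vision's regionOfInterest expects.
func recognizeText(in pixelBuffer: CVPixelBuffer,
                   selection: CGRect,
                   completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the best candidate string for each detected text line.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate
    request.regionOfInterest = selection  // restrict OCR to the selection area

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

The resizable selection rectangle itself is ordinary UIKit work (a draggable overlay view); only the coordinate conversion from view space to the capture buffer needs care.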

How to embed Adobe Photoshop in my App

We are developing a software that will automate many crucial activities in Photoshop.
This application is targeted for newbies.
In this application I want to embed Photoshop's window in my applications window. Currently Photoshop runs separately in its own window.
How can I get it to run in a particular location in given space in my application window?
How about this: find Photoshop's window handle using FindWindow, and then use SetParent to embed it into your form/panel. You might also need to maximize Photoshop's window and remove its borders; see the Windows API documentation for details on how to do this.
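A rough Win32 sketch of that idea in C. The window title "Adobe Photoshop" is a guess; real titles vary by version and open document, so an EnumWindows substring search would be more robust, and Photoshop may not behave well when reparented:

```c
#include <windows.h>

/* Try to reparent Photoshop's top-level window into our own panel. */
BOOL EmbedPhotoshop(HWND hostPanel)
{
    HWND ps = FindWindowW(NULL, L"Adobe Photoshop");
    if (ps == NULL)
        return FALSE;

    /* Strip the caption and sizing border so it looks embedded. */
    LONG_PTR style = GetWindowLongPtrW(ps, GWL_STYLE);
    style &= ~(WS_CAPTION | WS_THICKFRAME);
    style |= WS_CHILD;
    SetWindowLongPtrW(ps, GWL_STYLE, style);

    SetParent(ps, hostPanel);

    /* Resize the embedded window to fill the host panel. */
    RECT rc;
    GetClientRect(hostPanel, &rc);
    MoveWindow(ps, 0, 0, rc.right - rc.left, rc.bottom - rc.top, TRUE);
    return TRUE;
}
```

Cross-process reparenting like this is fragile (focus, input, and painting quirks are common), which is why an in-Photoshop plug-in, as the other answer suggests, is usually the safer route.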
I don't know of any API to let you embed Photoshop into another application, and I don't think that API exists. However, why can't you accomplish what you want using ActionScript or a native plug-in inside Photoshop? This is accomplishing almost the same thing, just from a different direction.
