I am working on an Apple ResearchKit application for lupus patients. I have already added some surveys and a walking-activity task.
Now I need to capture an image of a skin rash at frequent intervals, save it inside the app only (not in the Photos app), and compare the newest image with the last image taken.
I need to know whether I can use ResearchKit for this. How can I open the iPhone camera and capture an image using ResearchKit? I know image comparison is a task outside ResearchKit, but my first priority is capturing the image in ResearchKit. Is this possible with ResearchKit, or do I have to do this task outside the scope of RK?
Please provide me with any code or any link if available.
Thanks in advance
@prateek ResearchKit has an Image Capture step which can do what you require. You'll also have to declare an output directory for the captured image on your task view controller. Sample code below.
ORKImageCaptureStep *imageCaptureStep = [[ORKImageCaptureStep alloc] initWithIdentifier:@"ImageCaptureStep"];
imageCaptureStep.title = /* title for the step */;

// Wrap the step in an ordered task; ORKTaskViewController expects an id<ORKTask>, not a bare step.
ORKOrderedTask *task = [[ORKOrderedTask alloc] initWithIdentifier:@"ImageCaptureTask" steps:@[imageCaptureStep]];

ORKTaskViewController *taskViewController = [[ORKTaskViewController alloc] initWithTask:task taskRunUUID:nil];
taskViewController.delegate = self;
taskViewController.outputDirectory = /* where to store your image */;
And don't forget to implement ORKTaskViewControllerDelegate on the presenting view controller.
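For reference, here is a minimal sketch of the required delegate callback; the result handling is an assumption about the usual case, where the image capture step delivers an ORKFileResult whose fileURL points into the outputDirectory you set:

#pragma mark - ORKTaskViewControllerDelegate

- (void)taskViewController:(ORKTaskViewController *)taskViewController
       didFinishWithReason:(ORKTaskViewControllerFinishReason)reason
                     error:(NSError *)error {
    if (reason == ORKTaskViewControllerFinishReasonCompleted) {
        // The image capture step normally returns an ORKFileResult whose fileURL
        // points to the captured image inside outputDirectory.
        ORKStepResult *stepResult = [taskViewController.result stepResultForStepIdentifier:@"ImageCaptureStep"];
        ORKFileResult *fileResult = (ORKFileResult *)stepResult.firstResult;
        NSLog(@"Captured image saved at: %@", fileResult.fileURL);
    }
    [taskViewController dismissViewControllerAnimated:YES completion:nil];
}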
How do I get the last picture from the gallery (camera roll) using Flutter?
I would like to display this photo as a thumbnail.
I know it's possible on Android (Get uri of picture taken by camera) and on iOS (Swift - how to get last taken 3 photos from photo library?).
Is there a tool, library, or media API that could help me with that?
You can try using the photo_manager library:
List<AssetEntity> assets = [];

_fetchAssets() async {
  // Fetch all asset paths (albums) on the device.
  final albums = await PhotoManager.getAssetPathList(type: RequestType.all);
  final recentAlbum = albums.first;
  // Take only the single most recent asset.
  final recentAssets = await recentAlbum.getAssetListRange(
    start: 0, // start at index 0
    end: 1,   // take just one asset (the newest)
  );
  print(recentAssets);
  setState(() => assets = recentAssets);
}
Use platform channels; as you know, it's the native solution:
https://medium.com/flutter-io/flutter-platform-channels-ce7f540a104e
https://flutter.io/docs/development/platform-integration/platform-channels
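A minimal Dart-side sketch of that approach (the channel name 'app.example/gallery' and the method name 'getLastImagePath' are made up for illustration; the native Kotlin/Swift side still has to implement them):

import 'package:flutter/services.dart';

const MethodChannel _channel = MethodChannel('app.example/gallery');

Future<String?> getLastImagePath() async {
  // The native implementation must handle the 'getLastImagePath' call
  // and return the path of the most recent gallery image.
  return _channel.invokeMethod<String>('getLastImagePath');
}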
If you're willing to do this through a dialog, you can use the image picker plugin, which lets the user browse the device's image gallery and select an image (including the last one taken).
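For example, a minimal sketch with image_picker (the ImagePicker/XFile/pickImage names assume a recent version of the plugin, and pickLastImage is just an illustrative helper name):

import 'package:image_picker/image_picker.dart';

Future<void> pickLastImage() async {
  // Opens the system gallery picker; the user can tap the most recent photo.
  final picker = ImagePicker();
  final XFile? picked = await picker.pickImage(source: ImageSource.gallery);
  if (picked != null) {
    print('Selected image path: ${picked.path}');
  }
}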
If you want this done completely programmatically, someone implemented a pull request for this very feature here: https://github.com/flutter/plugins/pull/676, but it hasn't been accepted because it still needs tests. You could copy the code from that PR, or add the tests and get it accepted into the repo!
I have created a UWP app that plays HTML5 web radio streams.
Everything works fine, but now I want to add track and artist information to the MediaPlayer element.
This information is shown on the lock screen and start screen when the user locks the device.
The first track is shown correctly when the user selects a stream, but I can't update this information without restarting the stream.
MediaItemDisplayProperties mdp = _mediaPlaybackItem.GetDisplayProperties();
mdp.Type = Windows.Media.MediaPlaybackType.Music;
mdp.MusicProperties.Artist = "TBA Artist";
mdp.MusicProperties.Title = "TBA Title";
mdp.Thumbnail = Windows.Storage.Streams.RandomAccessStreamReference.CreateFromUri(MainPage.Current.CurrentStream.PreviewImageUri);
_mediaPlaybackItem.ApplyDisplayProperties(mdp);
_mediaPlayer.Source = _mediaPlaybackItem;
_mediaPlayer.Play();
If I move these lines into my refresh method for artist/title, I also have to set the Source of _mediaPlayer again, which pauses the music.
Does anyone have an idea how to fix this problem, or any advice on where to look further?
Thanks Chris
If you want to update the artist/title, you should be able to use the SystemMediaTransportControlsDisplayUpdater class; it provides functionality to update the music information that is displayed on the SystemMediaTransportControls.
We can set the artist/title via the SystemMediaTransportControlsDisplayUpdater.MusicProperties property and then call SystemMediaTransportControlsDisplayUpdater.Update to update the metadata for the currently playing media.
Use the SystemMediaTransportControlsDisplayUpdater class to update the media info that is displayed by the transport controls, such as the song title or the album art for the currently playing media item. Get an instance of this class with the SystemMediaTransportControls.DisplayUpdater property. If your scenario requires it, you can update the metadata displayed by the system media transport controls manually by setting the values of the MusicProperties, ImageProperties, or VideoProperties objects exposed by the DisplayUpdater class.
For example:
SystemMediaTransportControlsDisplayUpdater updater = _systemMediaTransportControls.DisplayUpdater;
updater.MusicProperties.Artist = "artist";
updater.MusicProperties.AlbumArtist = "album artist";
updater.MusicProperties.Title = "song title";
updater.Thumbnail = RandomAccessStreamReference.CreateFromUri(new Uri("ms-appx:///Music/music1_AlbumArt.jpg"));
updater.Update();
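In case the _systemMediaTransportControls instance above is not already at hand, it can be obtained from the existing MediaPlayer. A sketch, not a verified drop-in: disabling the command manager so the MediaPlaybackItem's display properties no longer overwrite the manual values is an assumption about this scenario:

using Windows.Media;
using Windows.Media.Playback;

// _mediaPlayer is the MediaPlayer instance from the question.
SystemMediaTransportControls _systemMediaTransportControls = _mediaPlayer.SystemMediaTransportControls;

// Assumption: with manual updates, automatic SMTC integration is turned off
// so it does not overwrite the values set via the DisplayUpdater.
_mediaPlayer.CommandManager.IsEnabled = false;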
So I have an app where I want to let users share a screenshot of their score to Facebook etc. I'm using the SocialShare widget. The documentation says to set the path to the image like this: image: fileToShare.nativePath, but I'm not really sure how to set it. Another problem is that I need to share a picture that always has a different name; the screenshots are saved with names like tia7828157.png, tia107997596.png... in a folder in the device's internal memory at pictures/enigmania/. I'm new to Appcelerator, so I don't know if there is something like a wildcard I could use for this. Thanks for any help.
This is my code so far, which I know is wrong; I know the widget works because it shares text without problems:
function shareTextWidget(e){
    // share text status
    var socialWidget = Alloy.createWidget('com.alcoapps.socialshare');
    socialWidget.share({status:"Enigmania kvíz", androidDialogTitle:"hoho", image:test.png/pictures/enigmania});
}
You should use the Ti.Filesystem methods/properties to get the path of any file located on internal or external storage.
Also be aware of the storage permissions on Android 6+: request the storage permission before accessing any file there, for example as in the sketch below.
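A hedged sketch of that permission check (it assumes a Titanium SDK version that exposes Ti.Filesystem.hasStoragePermissions and Ti.Filesystem.requestStoragePermissions):

// Ask for storage permission on Android 6+ before writing the screenshot.
if (Ti.Platform.osname === 'android' && !Ti.Filesystem.hasStoragePermissions()) {
    Ti.Filesystem.requestStoragePermissions(function (e) {
        if (e.success) {
            shareTextWidget();
        } else {
            Ti.API.error('Storage permission denied');
        }
    });
}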
Below is a simple code snippet that creates a pictures/enigmania directory and then writes an image file of the captured view into it:
function shareTextWidget(e){
    var directory = Ti.Filesystem.getFile(Ti.Filesystem.externalStorageDirectory, 'pictures/enigmania');
    !directory.exists() && directory.createDirectory();

    var fileToShare = Ti.Filesystem.getFile(directory.resolve(), 'screen.jpg');
    fileToShare.write($.SCREENSHOT_VIEW.toImage()); // write the blob image to the created file

    var socialWidget = Alloy.createWidget('com.alcoapps.socialshare');
    socialWidget.share({status:"Enigmania kvíz", androidDialogTitle:"hoho", image:fileToShare.nativePath});
}
This code should work without any issues.
Note that $.SCREENSHOT_VIEW is the ID of the view you want to capture, so it's up to you how you lay out your views in order to capture the right screenshot; the point is to use the Ti.UI.View toImage() method to take the screenshot of a particular view.
Let me know if this works for you; if not, we can look into other methods once I know your exact requirements. Good luck!
My understanding currently is that:
CameraUI
I can use the CameraUI to access the built-in camera for MediaType.VIDEO; it delegates to the built-in video camera app and lets me record a video. My app does that now.
When I stop recording and tap the "Use" button, I am returned to my app and theoretically I have a valid MediaPromise.
iOS does -not- provide a valid/usable URL/filename for the recorded video (or for photos), so I would have to use a Loader to bring in/use/access the recorded video... AND... iOS does not actually create a file anywhere on the device, most importantly not in the Camera Roll, where one would expect it given the normal behavior of the system's native camera/video app.
The documentation says that the Loader can load various image types and SWFs, but nothing about video data, so I conclude that I cannot actually use the CameraUI to generate a valid MediaPromise that I can pass to a Loader class (or similar) to read in the data created by the system camera and then manipulate it (upload it, save it to applicationStorageDirectory, and/or display it in one of the two video player components available in the API).
CameraRoll
I can have video entities in the iOS Camera Roll, but the AS3/AIR 3.5 CameraRoll class won't let me view/access/reference them in any way.
Normal File I/O
All my attempts to use the AIR 3.5 File classes to browse to the storage location of the iOS Camera Roll have been rebuffed.
------- Questions -------
Am I correct in believing that there is a way to take video but no way to use the video that's been captured (no way to use the resulting MediaPromise successfully)?
I believe you can take video and access it on Android, and there's nothing in the documentation that says you cannot on iOS.
Am I correct in believing that iOS sandboxes apps so that they cannot browse to video/photo storage using standard File I/O, but only through the apparently non-workable means I've tried (CameraUI and CameraRoll)?
Am I wrong to think that these are rather obvious needs that one can meet via the Xcode/Objective-C route, but that the AIR Mobile Framework does not allow, either because Apple blocks the functionality or because Adobe has failed to meet reasonable expectations?
One item of ironic note: if I use the iOS system camera app to record a video, a thumbnail of that video then appears in the Gallery/Camera Roll, and of course I can share it, view it, or whatever. If I use AIR's CameraRoll.browseForImage(), provided I haven't used the camera to take another image since, the folder icon for the pictures folder uses the thumbnail of the last object added, in this case the video I took, but if I then enter the folder, the video cannot be found. It's teasing us. It knows it's there, but it is apparently forbidden fruit.
I can't answer all your questions, so this entry may not be acceptable, but I found this page while searching for a solution to some of the problems you described and thought someone else may find this answer (partially) useful.
To save the movie you just took, you need to open the promise and read the data from it.
iOS won't save the file anywhere, so MediaPromise.file is always null.
This is my solution to the problem:
private var camera:CameraUI;
private var dataInput:IDataInput;

public function recordVideo():void
{
    // Start the camera and ask for a video
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, onCameraComplete);
    camera.launch(MediaType.VIDEO);
}

private function onCameraComplete(event:MediaEvent):void
{
    // event.data is a MediaPromise and MediaPromise.open() returns an IDataInput
    // Let's cast it to a dispatcher and check when it's complete
    dataInput = event.data.open();

    var dispatcher:IEventDispatcher = IEventDispatcher(dataInput);
    dispatcher.addEventListener(Event.COMPLETE, onDataInputComplete);
}

private function onDataInputComplete(event:Event):void
{
    // We can do whatever we want with the data, so we'll store it in a file in the
    // application storage directory (a File created without a path cannot be opened
    // for writing); the file name here is just an example.
    var file:File = File.applicationStorageDirectory.resolvePath("recordedVideo.mov");
    var bytes:ByteArray = new ByteArray();
    var stream:FileStream = new FileStream();

    // Reading the data from the opened MediaPromise
    dataInput.readBytes(bytes);

    stream.open(file, FileMode.WRITE);
    stream.writeBytes(bytes, 0, bytes.length);
    stream.close();
}
Also, I'm still looking for a way to put the movie in the CameraRoll
My following code returns null:
byte[] image1 = _videoControl.getSnapshot(null);
Any suggestions, please?
A few important points about the VideoControl.getSnapshot method:
Some manufacturers may not implement getSnapshot() at all.
The viewfinder must actually be visible on the screen prior to calling getSnapshot().
If you attempt to take pictures too quickly, getSnapshot() may return null; the camera needs time to clear out its buffer and prepare for the next shot.
You can also check the MMAPI system property "video.snapshot.encodings" before capturing:
if (System.getProperty("video.snapshot.encodings") == null) {
// getSnapshot() is not supported
}
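Putting those points together, here is a hedged sketch of a guarded call (picking the first reported encoding is just an example; choose whichever encoding from "video.snapshot.encodings" suits your needs):

import net.rim.device.api.system.Bitmap;
import javax.microedition.media.MediaException;

// Sketch: call getSnapshot with an explicit encoding and handle the null case.
String encodings = System.getProperty("video.snapshot.encodings");
if (encodings != null) {
    // The property is a space-separated list; take the first supported encoding.
    int space = encodings.indexOf(' ');
    String encoding = (space > 0) ? encodings.substring(0, space) : encodings;
    try {
        byte[] raw = _videoControl.getSnapshot(encoding);
        if (raw != null) {
            Bitmap snapshot = Bitmap.createBitmapFromBytes(raw, 0, raw.length, 1);
            // ... use the bitmap (e.g. show it in a BitmapField) ...
        } else {
            // The camera buffer was not ready; wait a moment and retry.
        }
    } catch (MediaException e) {
        // getSnapshot() is not supported with this encoding on this device.
    }
}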
You may read this chapter from book "Advanced BlackBerry Development":
http://books.google.com/books?id=F4Qu-lpoVncC&pg=PA53&lpg=PA53#v=onepage&q&f=false
Since the VideoControl.getSnapshot method is not supported by all devices, I'd recommend another approach. You can start the native BB Camera app with this line of code:
Invoke.invokeApplication(Invoke.APP_TYPE_CAMERA, new CameraArguments());
and then catch the taken image using a FileSystemJournalListener.
The BB SDK on your PC contains samples; search for the 'fileexplorerdemo' sample for the rest of the details.
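For orientation, a rough sketch of that listener pattern (the class name CameraFileListener and the .jpg path check are illustrative assumptions; the fileexplorerdemo sample shows the complete handling):

import net.rim.device.api.io.file.FileSystemJournal;
import net.rim.device.api.io.file.FileSystemJournalEntry;
import net.rim.device.api.io.file.FileSystemJournalListener;

// Watches the file system journal for a new JPEG written by the native Camera app.
public class CameraFileListener implements FileSystemJournalListener {
    private long lastUSN = FileSystemJournal.getNextUSN();

    public void fileJournalChanged() {
        long nextUSN = FileSystemJournal.getNextUSN();
        for (long usn = nextUSN - 1; usn >= lastUSN; usn--) {
            FileSystemJournalEntry entry = FileSystemJournal.getEntry(usn);
            if (entry == null) {
                break;
            }
            if (entry.getEvent() == FileSystemJournalEntry.FILE_ADDED
                    && entry.getPath() != null
                    && entry.getPath().endsWith(".jpg")) {
                // entry.getPath() is the image just taken by the native camera app;
                // handle it here (e.g. copy it into your app's storage).
            }
        }
        lastUSN = nextUSN;
    }
}

Register the listener with Application.addFileSystemJournalListener(new CameraFileListener()) before invoking the camera, and remove it again once you have picked up the image.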