Is it possible to take a remote GIF sequence (I can download it first if needed) and turn it into an mp4 on the device that MPMoviePlayerViewController can play?
I have tried using http://api.online-convert.com/, but the API doesn't suit our needs and the free version is too restricted.
ImageMagick for iOS also doesn't seem to include GIF support.
Swift 3.0 version.
https://gist.github.com/powhu/00acd9d34fa8d61d2ddf5652f19cafcf
Usage
let data = try! Data(contentsOf: Bundle.main.url(forResource: "gif", withExtension: "gif")!)
let tempUrl = URL(fileURLWithPath:NSTemporaryDirectory()).appendingPathComponent("temp.mp4")
GIF2MP4(data: data)?.convertAndExport(to: tempUrl, completion: { })
This wasn't so easy. There is a gist snippet here with the solution.
1st - Extract all of the frames from the GIF.
2nd - Create a video from those separate images.
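The first step above (pulling the individual frames and their delays out of the GIF data) can be sketched with ImageIO; the second step would feed these frames to an AVAssetWriter. A minimal sketch of the extraction, assuming the GIF data is already in memory (the function name is mine, not from the gist):

```swift
import Foundation
import ImageIO

// Extract each frame and its display delay (in seconds) from GIF data.
func extractFrames(from gifData: Data) -> [(image: CGImage, delay: Double)] {
    guard let source = CGImageSourceCreateWithData(gifData as CFData, nil) else { return [] }
    var frames: [(image: CGImage, delay: Double)] = []
    for index in 0..<CGImageSourceGetCount(source) {
        guard let image = CGImageSourceCreateImageAtIndex(source, index, nil) else { continue }
        // Read the per-frame delay from the GIF properties dictionary.
        var delay = 0.1  // fall back to 100 ms if no delay is recorded
        if let props = CGImageSourceCopyPropertiesAtIndex(source, index, nil) as? [CFString: Any],
           let gifProps = props[kCGImagePropertyGIFDictionary] as? [CFString: Any],
           let time = gifProps[kCGImagePropertyGIFUnclampedDelayTime] as? Double, time > 0 {
            delay = time
        }
        frames.append((image, delay))
    }
    return frames
}
```

Each frame plus its delay gives you the presentation timestamps you need when appending pixel buffers to the AVAssetWriter input.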
Please check this one; I think it has the solution you want. With this link you can easily convert a GIF file to MP4:
https://github.com/dennygajera/gifToMp4
I want to get the format of a video from its URL (mp4, m3u8, mpeg, etc.) to decide which player to use in my app (AVPlayer or a third party).
One option is to check AVAsset(...).isPlayable. I'm using this right now, but I want to improve my logic for when AVPlayer can't be used.
When I can't use AVPlayer I encode the video using ffmpeg; if I could get the format of the video (and maybe even more information) I could improve the encode done by ffmpeg.
I would use a regular expression to analyse the URL and find the extension.
You can use this one: \.\w{3,4}($|\?)
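A quick sketch of applying that idea with NSRegularExpression (note that the dot and the question mark need escaping in the pattern, and the helper name here is mine):

```swift
import Foundation

// Find the file extension (3-4 word characters after a dot, terminated
// by end-of-string or the start of a query string) in a URL string.
func extensionFromURL(_ urlString: String) -> String? {
    let pattern = "\\.(\\w{3,4})($|\\?)"
    guard let regex = try? NSRegularExpression(pattern: pattern) else { return nil }
    let range = NSRange(urlString.startIndex..., in: urlString)
    guard let match = regex.firstMatch(in: urlString, options: [], range: range),
          let extRange = Range(match.range(at: 1), in: urlString) else { return nil }
    return String(urlString[extRange])
}

// extensionFromURL("https://example.com/video.mp4")            → "mp4"
// extensionFromURL("https://example.com/list.m3u8?token=abc")  → "m3u8"
```

The `($|\?)` alternation is what lets this work for streaming URLs that carry query parameters after the extension.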
If you do NOT want to use Regex, then this code should do the trick :
extension String {
    func fileExtension() -> String {
        return URL(fileURLWithPath: self).pathExtension
    }
}

let url = "https://www.bruno.com/videos/video.mp4"
let fileExtension = url.fileExtension()
print(fileExtension) // mp4
You can use something like below
let url = URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")
let result = url?.pathExtension
//result will return the extension of the given path
Hope this helps you
If you are using ffmpeg you could run ffprobe on the URL to get the codec_name.
Something like this should work:
ffprobe test.mp4 -show_streams -show_entries stream=codec_type,codec_name -of json -v quiet
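The JSON that command prints can then be decoded in Swift with Codable to pick out the codec name. A sketch, assuming the ffprobe output has already been captured into a Data value (the type and function names are mine):

```swift
import Foundation

// Matches the shape of `ffprobe ... -of json` output: a top-level
// "streams" array whose entries carry codec_type and codec_name.
struct FFProbeOutput: Decodable {
    struct Stream: Decodable {
        let codec_type: String
        let codec_name: String
    }
    let streams: [Stream]
}

// Return the codec name of the first video stream, if any.
func videoCodecName(fromProbeJSON data: Data) -> String? {
    guard let probe = try? JSONDecoder().decode(FFProbeOutput.self, from: data) else { return nil }
    return probe.streams.first(where: { $0.codec_type == "video" })?.codec_name
}

// Example with captured ffprobe output:
let sample = """
{ "streams": [ { "codec_name": "h264", "codec_type": "video" },
               { "codec_name": "aac",  "codec_type": "audio" } ] }
""".data(using: .utf8)!
// videoCodecName(fromProbeJSON: sample) → "h264"
```

Knowing whether the stream is h264, hevc, mpeg2video, etc. lets you pick the right ffmpeg encode settings up front.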
I want to build an iOS 10 app that lets you shoot a RAW (.dng) image, edit it, and then saved the edited .dng file to the camera roll. By combining code from Apple's 2016 "AVCamManual" and "RawExpose" sample apps, I've gotten to the point where I have a CIFilter containing the RAW image along with the edits.
However, I can't figure out how to save the resulting CIImage to the camera roll as a .dng file. Is this possible?
A RAW file is "raw" output direct from a camera sensor, so the only way to get it is directly from a camera. Once you've processed a RAW file, what you have is by definition no longer "raw", so you can't go back to RAW.
To extend the metaphor presented at WWDC where they introduced RAW photography... a RAW file is like the ingredients for a cake. When you use Core Image to create a viewable image from the RAW file, you're baking the cake. (And as noted, there are many different ways to bake a cake from the same ingredients, corresponding to the possible options for processing RAW.) But you can't un-bake a cake — there's no going back to original ingredients, much less a way that somehow preserves the result of your processing.
Thus, the only way to store an image processed from a RAW original is to save the processed image in a bitmap image format. (Use JPEG if you don't mind lossy compression, PNG or TIFF if you need lossless, etc.)
If you're writing the results of an edit to PHPhotoLibrary, use JPEG (high quality / less compressed if you prefer), and Photos will store your edit as a derived result, allowing the user to revert to the RAW original. You can also describe the set of filters you applied in PHAdjustmentData saved with your edit — with adjustment data, another instance of your app (or Photos app extension) can reconstruct the edit using the original RAW data plus the filter settings you save, then allow a user to alter the filter parameters to create a different processed image.
Note: There is a version of the DNG format called Linear DNG that supports non-RAW (or "not quite RAW") images, but it's rather rare in practice, and Apple's imaging stack doesn't support it.
Unfortunately DNG isn't supported as an output format in Apple's ImageIO framework. See the output of CGImageDestinationCopyTypeIdentifiers() for a list of supported output types:
(
"public.jpeg",
"public.png",
"com.compuserve.gif",
"public.tiff",
"public.jpeg-2000",
"com.microsoft.ico",
"com.microsoft.bmp",
"com.apple.icns",
"com.adobe.photoshop-image",
"com.adobe.pdf",
"com.truevision.tga-image",
"com.sgi.sgi-image",
"com.ilm.openexr-image",
"public.pbm",
"public.pvr",
"org.khronos.astc",
"org.khronos.ktx",
"com.microsoft.dds",
"com.apple.rjpeg"
)
This answer comes late, but it may help others with the problem. This is how I saved a raw photo to the camera roll as a .dng file.
Just to note, I captured the photo using the camera with AVFoundation.
import Photos
import AVFoundation
// read the RAW photo data in as a Data object
let photoData = AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)
// write it to a temporary file
let temporaryDNGFileURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("temp.dng")
try! photoData?.write(to: temporaryDNGFileURL)
// get access to photo library
PHPhotoLibrary.requestAuthorization { status in
    if status == .authorized {
        // perform changes to the library
        PHPhotoLibrary.shared().performChanges({
            let options = PHAssetResourceCreationOptions()
            options.shouldMoveFile = true
            // write the RAW file
            PHAssetCreationRequest.forAsset().addResource(with: .photo, fileURL: temporaryDNGFileURL, options: options)
        }, completionHandler: { success, error in
            if let error = error { print(error) }
        })
    } else {
        print("can't access photo album")
    }
}
Hope it helps.
The only way to get DNG data as of the writing of this response (iOS 10.1) is:
AVCapturePhotoOutput.dngPhotoDataRepresentation(forRawSampleBuffer: CMSampleBuffer, previewPhotoSampleBuffer: CMSampleBuffer?)
Noting the OP refers to Core Image. As mentioned by rickster, CI works on processed image data, therefore only offers processed image results (JPEG, TIFF):
CIContext.writeJPEGRepresentation(of:to:colorSpace:options:)
CIContext.writeTIFFRepresentation(of:to:format:colorSpace:options:)
I'm an undergraduate student and I'm writing an iPhone HumanSeg app. But now I have a problem: I have a video in the album, and I need to load that video into my code and do some processing. My code is below:
let filePath = Bundle.main.path(forResource: "1", ofType: "mp4")
let videoURL = NSURL(fileURLWithPath: filePath!)
let avAsset = AVAsset(url: videoURL as URL)
But when I run this code, Xcode just tells me that filePath is nil. I'm sure that 1.mp4 is in both Assets.xcassets and the iPhone album. Is there anyone who'd like to offer some help?
By the way, how can I get the images (as UIImage) from the video as fast as possible? For each image at a given time, I really have to read it in no more than 5 ms so I can output the processed video at a good fps.
Check the target's Build Phases to see whether the file is being copied into the bundle, and check the box to include the file in the target. The code itself is correct for fetching a bundled file.
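Once the file really is in the bundle, a guard-based lookup avoids the force-unwrap crash and makes the missing-resource case explicit. A sketch (the resource name "1" comes from the question; the helper name is mine):

```swift
import Foundation

// Returns the URL of a bundled video, or nil if the file was never
// copied into the app bundle (the usual cause of the nil path).
func bundledVideoURL(named name: String, ext: String) -> URL? {
    guard let url = Bundle.main.url(forResource: name, withExtension: ext) else {
        print("\(name).\(ext) is missing - check Build Phases > Copy Bundle Resources")
        return nil
    }
    return url
}

// if let url = bundledVideoURL(named: "1", ext: "mp4") {
//     let avAsset = AVAsset(url: url)
// }
```

This way a misconfigured target produces a readable log line instead of a crash on `filePath!`.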
I downloaded a jpg file from my IP cam, but the file seems broken, or something is wrong with its format. I couldn't open it using Safari or Preview.app (macOS), but it shows in Chrome.
What I'm really trying to do is download it programmatically and show it in a UIImageView, but the image data is always nil.
I don't know much about image formats and the question is strange, so if you're willing to help me and see what's going on with that picture, here is the image: broken jpg
And thanks for your time!
UPDATE 2017/03/30:
Still haven't found the answer for how to decode a Motion JPEG frame. From what I've googled, the difference is the DHT (Define Huffman Table) segment, but I don't know how to add it to a frame.
As far as I know, there are a few third-party libs like libjpeg-turbo and ffmpeg, but I haven't found an example. If you have done this before and wrote it in C or Objective-C, I hope you can help me out!
Really want and need to know how! Thanks!
Your file is a Motion JPEG file, not a JPEG image...
This explains why browsers are able to open it, and you can check it in VLC by looking at the codec information:
Motion JPEG Video (MJPG)
or even through ffmpeg -i MeQ6p.jpg:
Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 640x480, 25 tbr, 25 tbn, 25 tbc
So your problem lies in your files only, and this can probably be fixed by setting your ip-cam to save still JPEG images instead of MJPEG streams.
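Regarding the DHT question in the update: baseline JPEG files carry their Huffman tables in a DHT segment (marker bytes 0xFF 0xC4), while MJPEG frames often omit it and expect the decoder to supply the standard tables from the JPEG spec. A small sketch (my own helper, not from any library) that walks the JPEG marker segments so you can tell whether a frame is missing its tables:

```swift
import Foundation

// Walk the JPEG marker segments looking for a Define-Huffman-Table
// segment (0xFF 0xC4). MJPEG frames frequently omit DHT; a decoder
// (or a repair step) must then insert the standard tables.
func containsDHTSegment(_ jpegData: Data) -> Bool {
    let bytes = [UInt8](jpegData)
    // Must start with the SOI marker (0xFF 0xD8).
    guard bytes.count > 4, bytes[0] == 0xFF, bytes[1] == 0xD8 else { return false }
    var i = 2
    while i + 3 < bytes.count, bytes[i] == 0xFF {
        let marker = bytes[i + 1]
        if marker == 0xC4 { return true }   // DHT found
        if marker == 0xDA { return false }  // SOS: scan data starts, no DHT seen
        // Segment length (big-endian, includes the two length bytes).
        let length = Int(bytes[i + 2]) << 8 | Int(bytes[i + 3])
        i += 2 + length
    }
    return false
}
```

If this returns false for your camera's frames, that confirms the DHT theory: the fix is to splice the standard table bytes in before the SOS marker, or hand the frame to a decoder (like libjpeg-turbo or ffmpeg's mjpeg decoder) that tolerates the omission.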
I just tried it and can see the data gets fetched from your posted URL; I got some 17750 bytes.
let imageURL = URL(string: "https://i.stack.imgur.com/MeQ6p.jpg")!
let imageData = try? Data(contentsOf: imageURL)
I am working with the new keyboard extensions and I am able to create a keyboard to allow text to be sent through. (The easy stuff). I also figured out how to copy+paste images within the keyboard extension into the messages. However I cannot seem to find much or any information on how I can send an audio clip to someone through messages (or a video file).
I understand this has to be similar to the way sending images works. Where you need to copy and paste it into the field.
Does anyone know how to get this done?
Thanks!
The process to get an audio clip onto the pasteboard should be pretty similar to an image. Here's some Swift code which pastes a file called audio.wav:
let path = NSBundle.mainBundle().pathForResource("audio", ofType:"wav")
let fileURL = NSURL(fileURLWithPath: path!)
let data = NSData(contentsOfURL: fileURL)
let wavUTI = "com.microsoft.waveform-audio"
UIPasteboard.generalPasteboard().setData(data!, forPasteboardType: wavUTI)