JPG file shows in Chrome but not in Safari - iOS

I downloaded a JPG file from my IP cam, but the file seems broken, or there is something wrong with its format. I couldn't open it using Safari or Preview.app (macOS), but it shows in Chrome.
What I'm really trying to do is download it programmatically and show it in a UIImageView, but the image data is always nil.
I don't know much about image formats and the question may sound strange, so if you're willing to help, please take a look at what's going on with that picture.
The image: broken jpg
Thanks for your time!
UPDATE 2017/03/30:
I still haven't found an answer for how to decode a Motion JPEG frame.
From what I've googled, the difference is the DHT (Define Huffman Table) segment, but I don't know how to add it to a frame.
As far as I know, there are a few third-party libs like libjpeg-turbo and ffmpeg, but I haven't found an example.
If you have done this before in C or Objective-C, I hope you can help me out! I really want and need to know how.
Thanks!

Your file is a Motion JPEG file, not a JPEG image...
This explains why some browsers are able to open it. You can check this in VLC by looking at the codec information:
Motion JPEG Video (MJPG)
or even through ffmpeg -i MeQ6p.jpg:
Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 640x480, 25 tbr, 25 tbn, 25 tbc
So the problem lies in your files only, and it can probably be fixed by setting your IP cam to save still JPEG images instead of MJPEG streams.
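As for the DHT difference mentioned in the update: MJPEG frames commonly omit the Define Huffman Table segment, which is what trips up Apple's decoders. A minimal diagnostic sketch in Swift (containsDHT is a hypothetical helper, not a decoder) that scans a frame's JPEG markers to confirm whether the DHT segment is actually missing:

import Foundation

// Walks the JPEG marker segments and reports whether a Define Huffman
// Table (DHT, 0xFFC4) segment is present before the scan data starts.
func containsDHT(_ data: Data) -> Bool {
    var i = 2   // skip the SOI marker (0xFFD8)
    while i + 3 < data.count, data[i] == 0xFF {
        let marker = data[i + 1]
        if marker == 0xC4 { return true }    // DHT found
        if marker == 0xDA { return false }   // SOS reached, no DHT seen
        // The segment length field includes its own two bytes.
        let length = Int(data[i + 2]) << 8 | Int(data[i + 3])
        i += 2 + length
    }
    return false
}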

I just tried it and can see that the data gets fetched from your posted URL; I got some 17750 bytes.
let imageURL = NSURL(string: "https://i.stack.imgur.com/MeQ6p.jpg")
let imageData = NSData(contentsOf: imageURL! as URL)
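So the download succeeds; it's the decode step that fails (a hedged continuation of the snippet above):

// UIImage cannot decode the MJPEG frame even though the fetch worked,
// which matches the nil-image symptom described in the question.
let image = UIImage(data: imageData! as Data)   // nil for this file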

Related

WebP encodedData loads for 30+ seconds

iOS version: 13.1
iPhone: X
I'm currently using DBAttachmentPickerController to choose from a variety of images. The problem comes when I take a picture directly from the camera and try to upload it to our server: SDImageWebPCoder.shared.encodedData takes about 30 seconds, more or less. The same image in the Android app takes about 2-3 seconds.
Here is the code I use
let attachmentPickerController = DBAttachmentPickerController(finishPicking: { attachmentArray in
    self.images = attachmentArray
    var currentImage = UIImage()
    self.images[0].loadOriginalImage(completion: { image in
        self.userImage.image = image
        currentImage = image!
    })
    // We transform it to WebP
    let webpData = SDImageWebPCoder.shared.encodedData(with: currentImage, format: .webP, options: nil)
    self.api.editImageUser(data: webpData!)
}, cancel: nil)
attachmentPickerController.mediaType = DBAttachmentMediaType.image
attachmentPickerController.allowsSelectionFromOtherApps = true
attachmentPickerController.present(on: self)
Should I change the Pod I'm using? Should I just compress it? Or am I doing something wrong?
WebP encoding is relatively slow: it uses software encoding and the VP8 compression algorithm (which is complicated), compared to the hardware-accelerated JPEG/PNG encoding on Apple's SoCs.
picture directly from the camera
The original image taken with the iPhone camera may be really large, like 4K resolution. If you don't pre-scale it before encoding, you will spend much more time.
The suggestions can be like this:
Try the options like compressionQuality: the higher the value, the more time it costs. By default it's 1.0, which is the highest and most time-consuming.
Try to pre-scale the original image. For images from the Photos library, you can always use the API to control the size. Or, you can use SDWebImage's transform method like -[UIImage sd_resizedImage:].
Do all the encoding in a background thread; never block the main thread.
If none of these is suitable, the better solution is to use the JPEG or PNG format instead of WebP, and then, in your image-server-side code, transcode the JPEG/PNG to WebP. Server-side processing is always the best idea for this kind of thing.
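As an illustration of the first three points, here is a minimal sketch (assuming SDWebImage's sd_resizedImage(with:scaleMode:) transform and the encodeCompressionQuality coder option; uploadAsWebP and the 1280-point target size are hypothetical choices):

import UIKit
import SDWebImage
import SDWebImageWebPCoder

// Hypothetical helper: pre-scale, then encode to WebP off the main thread.
func uploadAsWebP(_ image: UIImage, completion: @escaping (Data?) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // Pre-scale the full-resolution camera image before encoding.
        let scaled = image.sd_resizedImage(with: CGSize(width: 1280, height: 1280),
                                           scaleMode: .aspectFit) ?? image
        // Trade some quality for encoding speed.
        let webpData = SDImageWebPCoder.shared.encodedData(with: scaled,
                                                           format: .webP,
                                                           options: [.encodeCompressionQuality: 0.7])
        DispatchQueue.main.async { completion(webpData) }
    }
}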
If you're interested in a real benchmark comparing JPEG/PNG (hardware) and WebP (software), you can try my benchmark demo below to help you make your decision.
https://github.com/dreampiggy/ModernImageFormatBenchmark

WebRTC iOS: Filtering camera stream from RTCCameraVideoCapturer. Conversion from RTCFrame to CVPixelBuffer

I found that the gist below is simple and efficient: it uses func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) of RTCVideoCapturerDelegate. You get an RTCVideoFrame and then convert it to a CVPixelBuffer to modify.
https://gist.github.com/lyokato/d041f16b94c84753b5e877211874c6fc
However, I found that Chromium says nativeHandle, used to get the pixel buffer, is no longer available (link below). I tried frame.buffer.pixelbuffer..., but, looking at framework > Headers > RTCVideoFrameBuffer.h, I found that CVPixelBuffer is also gone from there!
https://codereview.webrtc.org/2990253002
Is there any good way to convert RTCVideoFrame to CVPixelBuffer?
Or do we have a better way to modify captured video from RTCCameraVideoCapturer?
The link below suggests modifying the SDK directly, but hopefully we can achieve this in Xcode.
How to modify (add filters to) the camera stream that WebRTC is sending to other peers/server
Can you specify what your expectation is? You can get the pixel buffer from an RTCVideoFrame easily, but I feel there can be a better solution: if you want to filter the video buffer before it is sent to WebRTC, you should work with RTCVideoSource.
You can get the buffer as seen here:
RTCCVPixelBuffer *buffer = (RTCCVPixelBuffer *)frame.buffer;
CVPixelBufferRef imageBuffer = buffer.pixelBuffer;
(with the latest SDK, and with the local video camera buffer only)
But in the sample I can see that the filter will not work for remote streams.
I have attached a screenshot showing how you can check the preview as well.
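To make the RTCVideoSource suggestion concrete, here is a minimal sketch (an illustration under assumptions: FilteringCapturerDelegate and applyFilter are hypothetical names) of a delegate that intercepts local camera frames, modifies the pixel buffer, and forwards the result to the source:

import WebRTC

// Hypothetical delegate that sits between RTCCameraVideoCapturer and RTCVideoSource.
final class FilteringCapturerDelegate: NSObject, RTCVideoCapturerDelegate {
    private let videoSource: RTCVideoSource

    init(videoSource: RTCVideoSource) {
        self.videoSource = videoSource
    }

    func capturer(_ capturer: RTCVideoCapturer, didCapture frame: RTCVideoFrame) {
        // Local camera frames are backed by RTCCVPixelBuffer.
        guard let rtcBuffer = frame.buffer as? RTCCVPixelBuffer else {
            videoSource.capturer(capturer, didCapture: frame)   // pass through untouched
            return
        }
        applyFilter(to: rtcBuffer.pixelBuffer)                  // modify in place

        // Wrap the modified buffer in a new frame and hand it to the source.
        let filtered = RTCVideoFrame(buffer: RTCCVPixelBuffer(pixelBuffer: rtcBuffer.pixelBuffer),
                                     rotation: frame.rotation,
                                     timeStampNs: frame.timeStampNs)
        videoSource.capturer(capturer, didCapture: filtered)
    }

    private func applyFilter(to buffer: CVPixelBuffer) {
        // Placeholder: apply your Core Image / Metal processing here.
    }
}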

how to set video quality for ios 270 360 480 720 1080

I want to set the video quality for iOS.
I tried loading the m3u8 video URL from the server: I downloaded the m3u8 file, separated out all the RESOLUTION entries for the video qualities, and collected the bandwidth URLs that come after the segments into an array.
When I load the base URL sample.m3u8 it has both video and audio; but after I set the base URL of the segments and append a bandwidth URL from the array, the video loads at the selected quality but no audio comes.
To work around this I tried a solution that partly works: I run the original URL (which contains both video and audio) and, separately, the low-bandwidth URL (which contains no audio), trying to keep them in sync.
ex: RESOLUTION=1280x720,SAMPLE_720p_v4.m3u8
SAMPLE.m3u8
#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-0",NAME="Default",AUTOSELECT=YES,DEFAULT=YES,URI="segments/SAMPLE_audio_v4.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=30681000,CODECS="avc1.640028",URI="segments/SAMPLE_1080p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=30140000,CODECS="avc1.4d001f",URI="segments/SAMPLE_720p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=15431000,CODECS="avc1.42001f",URI="segments/SAMPLE_480p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=11009000,CODECS="avc1.42001e",URI="segments/SAMPLE_360p_iframe.m3u8"
#EXT-X-I-FRAME-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=7850000,CODECS="avc1.420015",URI="segments/SAMPLE_270p_iframe.m3u8"
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=4080000,RESOLUTION=1280x720,CODECS="avc1.640028,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_1080p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=3471000,RESOLUTION=1280x720,CODECS="avc1.4d001f,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_720p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1934000,RESOLUTION=854x480,CODECS="avc1.42001f,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_480p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1106000,RESOLUTION=640x360,CODECS="avc1.42001e,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_360p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=837000,RESOLUTION=480x270,CODECS="avc1.420015,mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_270p_v4.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=185000,CODECS="mp4a.40.2",AUDIO="audio-0"
segments/SAMPLE_audio_v4.m3u8
Use the preferredPeakBitRate property on your AVPlayerItem (https://developer.apple.com/documentation/avfoundation/avplayeritem/1388541-preferredpeakbitrate); you need to pass a valid bandwidth value.
Not sure why you are downloading the m3u8 file; AVFoundation manages this for you.
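For example, a minimal sketch (the URL is a placeholder) that caps playback at the 480p variant from the playlist above by using its BANDWIDTH value:

import AVFoundation

// Cap playback at roughly the 480p variant (BANDWIDTH=1934000 in SAMPLE.m3u8);
// AVFoundation then picks the best stream at or below that bit rate.
let url = URL(string: "https://example.com/SAMPLE.m3u8")!   // placeholder URL
let playerItem = AVPlayerItem(url: url)
playerItem.preferredPeakBitRate = 1_934_000
let player = AVPlayer(playerItem: playerItem)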

Detecting that iOS image data is HEIF or HEIC

My server doesn't support the HEIF format. So I need to transform it to JPEG before uploading from my app.
I do this:
UIImage *image = [UIImage imageWithData:imageData];
NSData *data = UIImageJPEGRepresentation(image, 1.0);
But how can I know that the data is HEIF (or HEIC) ? I can look at a file:
([filePath hasSuffix:@".HEIC"] || [filePath hasSuffix:@".heic"])
But I don't think it's a good answer. Is there any other solution?
Both existing answers have good recommendations, but to attempt to tell the whole story...
UIImage doesn't represent an image file or even binary data in an image-file format. A UIImage is best thought of as an abstract representation of the displayable image encoded in that data — that is, a UIImage is the result of the decoding process. By the time you have a UIImage object, it doesn't care what file format it came from.
So, as @Ladislav's answer notes, if you have a UIImage already and you just want to get data in a particular image file format, call one of the convenience functions for turning a UIImage into file-formatted data. As its name might suggest, UIImageJPEGRepresentation returns data appropriate for writing to a JPEG file.
If you already have a UIImage, UIImageJPEGRepresentation is probably your best bet, since you can use it regardless of the original image format.
As @ScottCorscadden implies, if you don't have a UIImage (yet) because you're working at a lower level such that you have access to the original file data, then you'll need to inspect that data to divine its format, or ask whatever source you got the data from for metadata describing its format.
If you want to inspect the data itself, you're best off reading up on the HEIF format standards. See nokiatech, the MPEG group, or Wikipedia.
There's a lot going on in the HEIF container format and the possible kinds of media that can be stored within, so deciding if you have not just a HEIF file, but an HEIF/HEVC file compatible with this-or-that viewer could be tricky. Since you're talking about excluding things your server doesn't support, it might be easier to code from the perspective of including only the things that your server does support. That is, if you have data with no metadata, look for something like the JPEG magic number 0xffd8ff, and use that to exclude anything that isn't JPEG.
Better, though, might be to look for metadata. If you're picking images from the Photos library with PHImageManager.requestImageData(for:options:resultHandler:), the second parameter to your result handler is the Uniform Type Identifier for the image data: for HEIF and HEIC files, public.heif, public.heif-standard, and public.heic have been spotted in the wild.
(Again, though, if you're looking for "images my sever doesn't support", you're better off checking for the formats your server does support and rejecting anything not on that list, rather than trying to identify all the possible unsupported formats.)
When you are sending this to your server, you are most likely decoding the UIImage and sending it as Data, so just do:
let data = UIImageJPEGRepresentation(image, 0.9)
Just decide what quality works best for you; here it is 0.9.
A bit late to the party, but other than checking the extension (after the last dot), you can also check the "magic number", aka the file signature. Bytes 5 to 8 should give you the constant "ftyp". The following 4 bytes are the major brand, which I believe is one of "mif1", "heic", and "heix".
For example, the first 12 bytes of a .heic image would be:
00 00 00 18 66 74 79 70 6d 69 66 31
which, after removing the zeros and trimming the result, literally decodes to ftypmif1.
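A minimal sketch of that check (looksLikeHEIF is a hypothetical helper, and the brand list follows the answer above, so it may not be exhaustive):

import Foundation

// Checks whether raw image data looks like a HEIF/HEIC container by
// inspecting the 'ftyp' box type (bytes 5-8) and its major brand (bytes 9-12).
func looksLikeHEIF(_ data: Data) -> Bool {
    guard data.count >= 12,
          String(data: data.subdata(in: 4..<8), encoding: .ascii) == "ftyp",
          let brand = String(data: data.subdata(in: 8..<12), encoding: .ascii)
    else { return false }
    return ["mif1", "heic", "heix"].contains(brand)
}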
Well, you could look at magic bytes - JPEG and PNG certainly are known, and I seem to see some references that HEIF (.heic) starts with a NUL byte. If you're using any of the PHImageManager methods like requestImageDataForAsset:options:resultHandler:, that resultHandler will be passed a NSString * _Nullable dataUTI reference. There's a decent WWDC video/slides on this (possibly here) that suggests converting if the UTI is not kUTTypeJPEG (and the slides have some lower-level sample code in Swift to do it that preserves orientation too).
I should also mention, if you have control at your app layer and all uploads come from there, do all this there.
If you're using the Photos framework and are importing images from the photo library, there's a solution that was mentioned briefly during WWDC17. First, import Core Services:
import MobileCoreServices
Then, when you request the image, check the UTType that is returned as a second parameter to your block:
// asset: PHAsset
PHImageManager.default().requestImageData(for: asset, options: nil) { imageData, dataUTI, orientation, info in
    guard let dataUTI = dataUTI else { return }
    if !(UTTypeConformsTo(dataUTI as CFString, kUTTypeJPEG) || UTTypeConformsTo(dataUTI as CFString, kUTTypePNG)) {
        // imageData is neither JPG nor PNG, possibly subject to transcoding
    }
}
Other UTTypes can be found here

Convert GIF to MP4 on device

Is it possible to take a remote GIF sequence (I can download it first if needed) and make an MP4 on the device that MPMoviePlayerViewController can play?
I have tried using http://api.online-convert.com/, but the API doesn't suit us and the free version is too restricted for our needs.
ImageMagick for iOS also doesn't seem to include GIF support.
Swift 3.0 version.
https://gist.github.com/powhu/00acd9d34fa8d61d2ddf5652f19cafcf
Usage
let data = try! Data(contentsOf: Bundle.main.url(forResource: "gif", withExtension: "gif")!)
let tempUrl = URL(fileURLWithPath:NSTemporaryDirectory()).appendingPathComponent("temp.mp4")
GIF2MP4(data: data)?.convertAndExport(to: tempUrl, completion: { })
This wasn't so easy; there is a gist snippet here with the solution:
1st - separate all the frames from the GIF;
2nd - create a video from those separate images.
Please check it; I think you can find the solution you want.
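For the first step, a minimal sketch using ImageIO (extractGIFFrames is a hypothetical helper; a real converter would feed these frames and delays into an AVAssetWriter, which is what the gists above do):

import ImageIO
import UIKit

// Pulls the individual frames (and their display delays) out of GIF data.
func extractGIFFrames(from data: Data) -> [(image: CGImage, delay: TimeInterval)] {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return [] }
    var frames: [(CGImage, TimeInterval)] = []
    for index in 0..<CGImageSourceGetCount(source) {
        guard let image = CGImageSourceCreateImageAtIndex(source, index, nil) else { continue }
        // Each frame's delay lives in the GIF dictionary of its properties.
        var delay: TimeInterval = 0.1   // fallback when no delay is stored
        if let props = CGImageSourceCopyPropertiesAtIndex(source, index, nil) as? [CFString: Any],
           let gif = props[kCGImagePropertyGIFDictionary] as? [CFString: Any],
           let time = gif[kCGImagePropertyGIFUnclampedDelayTime] as? TimeInterval, time > 0 {
            delay = time
        }
        frames.append((image, delay))
    }
    return frames
}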
With this link you can easily convert a GIF file to MP4:
https://github.com/dennygajera/gifToMp4
