How to flip a video using AVFoundation - iOS

I've recorded a video with the front-facing camera and the output is mirrored...
I've tried using AVMutableComposition and layer instructions to flip the video, but no luck.
Googling and searching Stack Overflow has been fruitless, so I bet a simple, straightforward example of how to do this is something that would benefit many.

There's no indication of what you're using to record the video, so I'll assume AVCaptureSession + AVCaptureVideoDataOutput:
lazy var videoFileOutput: AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()
let v = videoFileOutput.connectionWithMediaType(AVMediaTypeVideo)
v.videoOrientation = .Portrait
v.videoMirrored = true

You can use -[AVMutableVideoCompositionLayerInstruction setTransform:atTime:]
CGAffineTransform transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0);
transform = CGAffineTransformScale(transform, -1.0, 1.0);
[videoCompositionLayerInstruction setTransform:transform atTime:videoTime];
// then append video tracks
// [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error];
// apply instructions
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction];
videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize);
videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate);
videoComposition.instructions = @[videoCompositionInstruction];
https://github.com/ElfSundae/AVDemo/tree/ef2ca437d0d8dcb3dd41c5a272c8754a29d8a936/AVSimpleEditoriOS
Export composition:
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition presetName:presetName];
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = outputURL;
exportSession.shouldOptimizeForNetworkUse = YES;
// videoComposition contains transform instructions for video tracks
exportSession.videoComposition = videoComposition;
// audioMix contains background music for audio tracks
exportSession.audioMix = audioMix;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    AVAssetExportSessionStatus status = exportSession.status;
    if (status != AVAssetExportSessionStatusCompleted) {
        // exportSession.error
    } else {
        // exportSession.outputURL
    }
}];

After you get your output, transform your video:
func mirrorVideo(inputURL: URL, completion: @escaping (_ outputURL: URL?) -> ()) {
    let videoAsset = AVAsset(url: inputURL)
    let clipVideoTrack = videoAsset.tracks(withMediaType: AVMediaType.video).first! as AVAssetTrack

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: clipVideoTrack.naturalSize.height, height: clipVideoTrack.naturalSize.width)
    videoComposition.frameDuration = CMTimeMake(1, 30)

    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration) // cover the full clip

    // Mirror horizontally, then rotate to portrait.
    var transform = CGAffineTransform(scaleX: -1.0, y: 1.0)
    transform = transform.translatedBy(x: -clipVideoTrack.naturalSize.width, y: 0.0)
    transform = transform.rotated(by: CGFloat(Double.pi / 2))
    transform = transform.translatedBy(x: 0.0, y: -clipVideoTrack.naturalSize.width)
    transformer.setTransform(transform, at: kCMTimeZero)
    instruction.layerInstructions = [transformer]
    videoComposition.instructions = [instruction]

    // Export
    let exportSession = AVAssetExportSession(asset: videoAsset, presetName: AVAssetExportPreset640x480)!
    let fileName = UniqueIDGenerator.generate().appending(".mp4")
    let croppedOutputFileUrl = documentsURL.appendingPathComponent(fileName)
    exportSession.outputURL = croppedOutputFileUrl
    exportSession.outputFileType = AVFileType.mp4
    exportSession.videoComposition = videoComposition
    exportSession.exportAsynchronously {
        if exportSession.status == .completed {
            DispatchQueue.main.async {
                completion(croppedOutputFileUrl)
            }
            return
        } else if exportSession.status == .failed {
            print("Export failed - \(String(describing: exportSession.error))")
        }
        completion(nil)
    }
}
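For clarity, the four chained operations above (mirror, shift back, rotate 90 degrees, shift again) collapse algebraically to a simple transpose, (x, y) -> (y, x), which is exactly "rotate to portrait plus horizontal flip". A quick standalone check of that claim; the 1280x720 size is an assumed landscape naturalSize, and apply(_:_:_:) is a small helper spelling out the row-vector rule CGAffineTransform uses:

```swift
import Foundation

// Assumed landscape naturalSize width for illustration.
let width: CGFloat = 1280

// Same chain as in mirrorVideo above.
var t = CGAffineTransform(scaleX: -1.0, y: 1.0)
t = t.translatedBy(x: -width, y: 0.0)
t = t.rotated(by: CGFloat(Double.pi / 2))
t = t.translatedBy(x: 0.0, y: -width)

// Row-vector rule used by CGAffineTransform: p' = (a*x + c*y + tx, b*x + d*y + ty).
func apply(_ t: CGAffineTransform, _ x: CGFloat, _ y: CGFloat) -> (CGFloat, CGFloat) {
    (t.a * x + t.c * y + t.tx, t.b * x + t.d * y + t.ty)
}

// The top-right corner of a 1280x720 landscape frame lands at the bottom-left
// of the 720x1280 portrait render: (1280, 0) maps to (0, 1280).
print(apply(t, 1280, 0))
```

Note that translatedBy/rotated(by:) prepend their operation, so the last call in the chain is the first one applied to each point.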

Swift 5. AVCaptureSession:
let movieFileOutput = AVCaptureMovieFileOutput()
let connection = movieFileOutput.connection(with: .video)
if connection?.isVideoMirroringSupported ?? false {
connection?.isVideoMirrored = true
}
Same for PhotoOutput.
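The same connection-level flag works for photo capture; a minimal sketch, assuming the output has already been added to a configured AVCaptureSession (the connection is nil before that):

```swift
import AVFoundation

let photoOutput = AVCapturePhotoOutput()
// ... add photoOutput to a running AVCaptureSession first ...
if let connection = photoOutput.connection(with: .video),
   connection.isVideoMirroringSupported {
    // Must opt out of automatic mirroring before setting the flag by hand,
    // otherwise setting isVideoMirrored raises an exception.
    connection.automaticallyAdjustsVideoMirroring = false
    connection.isVideoMirrored = true
}
```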

Related

How to change video resolution in Swift

Is there a way to export a video with resolution 480 x 960? I know there are libraries for this, but I'd rather do it without installing more pods in my project if possible.
I am converting a captured video from .MOV to .MP4. I used the method suggested in this thread.
The available options from AVAssetExport are these:
AVAssetExportPresetLowQuality
AVAssetExportPresetMediumQuality
AVAssetExportPresetHighestQuality
AVAssetExportPresetHEVCHighestQuality
AVAssetExportPreset640x480
AVAssetExportPreset960x540
AVAssetExportPreset1280x720
AVAssetExportPreset1920x1080
AVAssetExportPreset3840x2160
Is this the correct approach if the exported video is MP4? The documentation for AVAssetExportSession says it's for QuickTime movies, so I am a bit confused about this.
func exportVideo(inputurl: URL,
                 presetName: String = AVAssetExportPresetHighestQuality,
                 outputFileType: AVFileType = .mp4,
                 fileExtension: String = "mp4",
                 then completion: @escaping (URL?) -> Void) {
    let asset = AVAsset(url: inputurl)
    let filename = inputurl.deletingPathExtension().appendingPathExtension(fileExtension).lastPathComponent
    let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent(filename)
    if let session = AVAssetExportSession(asset: asset, presetName: presetName) {
        session.outputURL = outputURL
        session.outputFileType = outputFileType
        session.shouldOptimizeForNetworkUse = true
        session.exportAsynchronously {
            switch session.status {
            case .completed:
                completion(outputURL)
            case .cancelled:
                debugPrint("Video export cancelled.")
                completion(nil)
            case .failed:
                let errorMessage = session.error?.localizedDescription ?? "n/a"
                debugPrint("Video export failed with error: \(errorMessage)")
                completion(nil)
            default:
                break
            }
        }
    } else {
        completion(nil)
    }
}
You must apply a transform to the video track and assign a video composition to your export session.
You have to do something like this:
// transform video
let rotationTransform = CGAffineTransform(rotationAngle: .pi)
videoTrack.preferredTransform = rotationTransform

let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
layerInstruction.setTransform(videoTrack.preferredTransform, at: kCMTimeZero)

let videoCompositionInstruction = AVMutableVideoCompositionInstruction()
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
videoCompositionInstruction.layerInstructions = [layerInstruction]

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = [videoCompositionInstruction]
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
videoComposition.renderSize = videoSize // e.g. CGSize(width: 480, height: 960)

// saving...
guard let exportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetMediumQuality) else { return }
// exportSession.outputURL = your output url
exportSession.videoComposition = videoComposition
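On the MOV-versus-MP4 confusion in the question: AVAssetExportSession is not limited to QuickTime output; each preset supports a set of container types you can query at runtime before committing to .mp4. A short sketch (the input path here is a hypothetical placeholder):

```swift
import AVFoundation

// Presets available on this device/OS version.
let allPresets = AVAssetExportSession.allExportPresets()

// Narrow to presets that can export a particular asset.
let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/input.mov")) // hypothetical path
let compatible = AVAssetExportSession.exportPresets(compatibleWith: asset)

// supportedFileTypes reveals whether .mp4 is a legal outputFileType for a session.
if let session = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset640x480) {
    print("MP4 supported: \(session.supportedFileTypes.contains(.mp4))")
}
```

There is no built-in 480x960 preset; to hit that exact size, pick a large-enough preset and set a videoComposition whose renderSize is CGSize(width: 480, height: 960).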

Flip video horizontally so it does not have a mirror effect

In my custom camera, when I film a video with the front-facing camera, it does the mirror effect like the original iPhone camera. I don't want that. I would like to flip the video horizontally, and implement that in the function below. I have a boolean variable called filmedWithFront that is true when a video is filmed with the front-facing camera.
var filmedWithFront = false

func cropVideo(_ outputFileURL: URL) {
    let videoAsset = AVAsset(url: outputFileURL)
    let clipVideoTrack = videoAsset.tracks(withMediaType: AVMediaType.video).first! as AVAssetTrack

    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSize(width: 720, height: 1280)
    videoComposition.frameDuration = CMTimeMake(1, 30)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(180, 30))

    // rotate to portrait
    let transformer = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)
    let t1 = CGAffineTransform(translationX: 720, y: 0)
    let t2 = t1.rotated(by: CGFloat.pi / 2)
    transformer.setTransform(t2, at: kCMTimeZero)
    instruction.layerInstructions = [transformer]
    videoComposition.instructions = [instruction]

    if filmedWithFront == true {
        // This is where I want to add the code to flip video horizontally
    }

    let removedPath = outputFileURL.path
    let documentsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString
    let cropUniqueId = NSUUID().uuidString
    let outputPath = "\(documentsPath)/\(cropUniqueId).mov"
    arrayOfStringPaths.append(outputPath)
    stringOfArrayPaths = outputPath
    let relativePath = "\(cropUniqueId).mov"
    let relativeURL = URL(fileURLWithPath: relativePath)
    saveData(arrayPath: relativePath)
    let outputUrl = URL(fileURLWithPath: outputPath, relativeTo: relativeURL)

    let exporter = AVAssetExportSession(asset: videoAsset, presetName: AVAssetExportPreset1280x720)!
    exporter.videoComposition = videoComposition
    exporter.outputURL = outputUrl
    exporter.outputFileType = AVFileType.mov
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronously {
        DispatchQueue.main.async {
            self.handleExportCompletion(exporter, removedPath)
        }
    }
}
Here's a snippet of the transform I used to finally fix the mirrored video output from the front camera. videoInputWriter is an AVAssetWriterInput. Hope this helps.
if (cameraPosition == .front) {
var transform: CGAffineTransform = CGAffineTransform(scaleX: -1.0, y: 1.0)
transform = transform.rotated(by: CGFloat(Double.pi/2))
self.videoInputWriter.transform = transform
}
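If you stay with the composition-based approach from the question instead of an asset writer, the horizontal flip can be folded into the same layer-instruction transform: concatenate a mirror across the render width after the portrait rotation. A minimal sketch of just the transform math, using the question's fixed 720x1280 renderSize (the apply(_:_:_:) helper spells out the row-vector rule p' = (a*x + c*y + tx, b*x + d*y + ty)):

```swift
import Foundation

// Portrait transform from the question: shift right by the 720-point render
// width, then rotate 90 degrees.
let t1 = CGAffineTransform(translationX: 720, y: 0)
let t2 = t1.rotated(by: CGFloat.pi / 2)

// Horizontal mirror across the 720-point render width: x -> 720 - x.
let mirror = CGAffineTransform(translationX: 720, y: 0).scaledBy(x: -1, y: 1)

// Rotation first, then mirror (concatenating applies the receiver first).
let flipped = t2.concatenating(mirror)

// Manual point application, for checking what the transform does.
func apply(_ t: CGAffineTransform, _ x: CGFloat, _ y: CGFloat) -> (CGFloat, CGFloat) {
    (t.a * x + t.c * y + t.tx, t.b * x + t.d * y + t.ty)
}
```

Inside the filmedWithFront branch you would then call transformer.setTransform(flipped, at: kCMTimeZero) instead of setting t2.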

Overlay Two Videos with AVFoundation

I am trying to overlay two videos, with the foreground video being somewhat alpha-transparent. I have been following the Apple docs as well as this tutorial.
Whenever I put two of the same video through my code it doesn't crash; however, when I feed it two different videos I receive this error:
VideoMaskingUtils.exportVideo Error: Optional(Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.})
VideoMaskingUtils.exportVideo Description: <AVAssetExportSession: 0x1556be30, asset = <AVMutableComposition: 0x15567f10 tracks = (
"<AVMutableCompositionTrack: 0x15658030 trackID = 1, mediaType = vide, editCount = 1>",
"<AVMutableCompositionTrack: 0x1556e250 trackID = 2, mediaType = vide, editCount = 1>"
)>, presetName = AVAssetExportPresetHighestQuality, outputFileType = public.mpeg-4
Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
I understand that you can't save a video with an alpha channel on iOS -- I want to flatten the two videos into one opaque video.
When I try to overlap the two videos and apply a PiP style using CATransforms, it crashes; simply overlapping them (without alpha or any other effects applied) works.
Any help is appreciated.
Here's my code (with both approaches in it):
class func overlay(video firstAsset: AVURLAsset, withSecondVideo secondAsset: AVURLAsset, andAlpha alpha: Float) {
    let mixComposition = AVMutableComposition()
    let firstTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let secondTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    guard let firstMediaTrack = firstAsset.tracksWithMediaType(AVMediaTypeVideo).first else { return }
    guard let secondMediaTrack = secondAsset.tracksWithMediaType(AVMediaTypeVideo).first else { return }
    do {
        try firstTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, firstAsset.duration), ofTrack: firstMediaTrack, atTime: kCMTimeZero)
        try secondTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, secondAsset.duration), ofTrack: secondMediaTrack, atTime: kCMTimeZero)
    } catch (let error) {
        print(error)
    }

    let width = max(firstMediaTrack.naturalSize.width, secondMediaTrack.naturalSize.width)
    let height = max(firstMediaTrack.naturalSize.height, secondMediaTrack.naturalSize.height)
    let videoComposition = AVMutableVideoComposition()
    videoComposition.renderSize = CGSizeMake(width, height)
    videoComposition.frameDuration = firstMediaTrack.minFrameDuration

    let firstApproach = false
    if firstApproach {
        let mainInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
        mainInstruction.backgroundColor = UIColor.redColor().CGColor
        let firstlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
        firstlayerInstruction.setTransform(firstAsset.preferredTransform, atTime: kCMTimeZero)
        let secondInstruction = AVMutableVideoCompositionInstruction()
        secondInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
        let backgroundColor = UIColor(colorLiteralRed: 1.0, green: 1.0, blue: 1.0, alpha: alpha)
        secondInstruction.backgroundColor = backgroundColor.CGColor
        let secondlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)
        secondlayerInstruction.setTransform(secondAsset.preferredTransform, atTime: kCMTimeZero)
        secondInstruction.layerInstructions = [secondlayerInstruction]
        mainInstruction.layerInstructions = [firstlayerInstruction]//, secondlayerInstruction]
        videoComposition.instructions = [mainInstruction, secondInstruction]
    } else {
        let firstLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstMediaTrack)
        firstLayerInstruction.setTransform(firstMediaTrack.preferredTransform, atTime: kCMTimeZero)
        firstLayerInstruction.setOpacity(1.0, atTime: kCMTimeZero)
        let secondlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondMediaTrack)
        secondlayerInstruction.setTransform(secondMediaTrack.preferredTransform, atTime: kCMTimeZero)
        secondlayerInstruction.setOpacity(alpha, atTime: kCMTimeZero)
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, min(firstAsset.duration, secondAsset.duration))
        instruction.layerInstructions = [firstLayerInstruction, secondlayerInstruction]
        videoComposition.instructions = [instruction]
    }

    let outputUrl = VideoMaskingUtils.getPathForTempFileNamed("output.mov")
    VideoMaskingUtils.exportCompositedVideo(mixComposition, toURL: outputUrl, withVideoComposition: videoComposition)
    VideoMaskingUtils.removeTempFileAtPath(outputUrl.absoluteString)
}
Here is my exportCompositedVideo function.
private class func exportCompositedVideo(compiledVideo: AVMutableComposition, toURL outputUrl: NSURL, withVideoComposition videoComposition: AVMutableVideoComposition) {
    guard let exporter = AVAssetExportSession(asset: compiledVideo, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = outputUrl
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeQuickTimeMovie
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronouslyWithCompletionHandler({
        switch exporter.status {
        case .Completed:
            // we can be confident that there is a URL because
            // we got this far. Otherwise it would've failed.
            UISaveVideoAtPathToSavedPhotosAlbum(exporter.outputURL!.path!, nil, nil, nil)
            print("VideoMaskingUtils.exportVideo SUCCESS!")
            if exporter.error != nil {
                print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
                print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")
            }
            NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
        case .Exporting:
            let progress = exporter.progress
            print("VideoMaskingUtils.exportVideo \(progress)")
            NSNotificationCenter.defaultCenter().postNotificationName("videoExportProgress", object: progress)
        case .Failed:
            print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
            print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")
            NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
        default: break
        }
    })
}
Your min should be max. A video composition's instructions must cover the entire duration of the composition with no gaps; using min ends the instruction before the longer track does, and the uncovered tail is what produces the -11841 "the video could not be composed" error.
Replace this line
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, min(firstAsset.duration, secondAsset.duration))
With this line and it will work:
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, max(firstAsset.duration, secondAsset.duration))
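The same point sketched with plain CMTime arithmetic; the 4-second and 7-second durations are made-up stand-ins for the two assets' durations:

```swift
import CoreMedia

// Stand-in durations for firstAsset.duration and secondAsset.duration.
let first = CMTime(seconds: 4, preferredTimescale: 600)
let second = CMTime(seconds: 7, preferredTimescale: 600)

// min() would cover only 4s, leaving 3s of the 7s composition without an
// instruction (-11841 at export time). The maximum covers the full span.
let covered = CMTimeRange(start: .zero, duration: CMTimeMaximum(first, second))
print(CMTimeGetSeconds(covered.duration))
```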

iOS rotate video AVAsset avfoundation

Hi,
Struggling to rotate this video so it shows in the proper orientation and fills the entire screen.
I tried rotating the AVAsset with a video composition, but cannot get it to work correctly.
let videoAsset: AVAsset = AVAsset(URL: outputFileURL) as AVAsset
let clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo).first! as AVAssetTrack

let videoComposition = AVMutableVideoComposition()
videoComposition.renderSize = clipVideoTrack.naturalSize
videoComposition.frameDuration = CMTimeMake(1, 30)

let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(180, 30))

// rotate to portrait
let transformer: AVMutableVideoCompositionLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: clipVideoTrack)
let t1 = CGAffineTransformMakeTranslation(0, 0)
let t2 = CGAffineTransformRotate(t1, CGFloat(M_PI_2))
transformer.setTransform(t2, atTime: kCMTimeZero)
instruction.layerInstructions = [transformer]
videoComposition.instructions = [instruction]

let formatter = NSDateFormatter()
formatter.dateFormat = "yyyy'-'MM'-'dd'T'HH':'mm':'ss'Z'"
let date = NSDate()
let documentsPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
let outputPath = "\(documentsPath)/\(formatter.stringFromDate(date)).mp4"
let outputURL = NSURL(fileURLWithPath: outputPath)

let exporter = AVAssetExportSession(asset: videoAsset, presetName: AVAssetExportPresetHighestQuality)!
exporter.videoComposition = videoComposition
exporter.outputURL = outputURL
exporter.outputFileType = AVFileTypeQuickTimeMovie
exporter.exportAsynchronouslyWithCompletionHandler({ () -> Void in
    dispatch_async(dispatch_get_main_queue(), {
        self.handleExportCompletion(exporter)
    })
})
Solved the rotation by adapting the code from:
AVMutableVideoComposition rotated video captured in portrait mode
Now having issues with exporting, described in the question below, if anyone knows:
https://stackoverflow.com/questions/35233766/avasset-failing-to-export

Add watermark to recorded video and save

So I am trying to add a watermark to a previously recorded video using the following code, but when I view the video, there is no watermark. Can anyone help? I tried following the post at: iPhone Watermark on recorded Video.
public func addWatermarkToVideo(url: NSURL, completion: (url: NSURL?) -> Void) {
    let videoAsset = AVURLAsset(URL: url)
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let clipVideoTrack: AVAssetTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do {
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: clipVideoTrack, atTime: kCMTimeZero)
    } catch {
        print(error)
    }
    compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform

    // Add watermark
    guard let myImage = UIImage(named: "Logo") else {
        completion(url: nil)
        return
    }
    let aLayer = CALayer()
    aLayer.contents = myImage.CGImage
    aLayer.frame = CGRectMake(5, 25, 100, 57)
    aLayer.opacity = 0.65

    let videoSize = clipVideoTrack.naturalSize
    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(aLayer)

    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, 30)
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]

    let paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
    let documentsDirectory = paths[0] as NSString
    let dataPath = documentsDirectory.stringByAppendingPathComponent("VideoCache")
    if !NSFileManager.defaultManager().fileExistsAtPath(dataPath) {
        do {
            try NSFileManager.defaultManager().createDirectoryAtPath(dataPath, withIntermediateDirectories: false, attributes: nil)
        } catch {
            print("Couldn't create path")
        }
    }
    let tempURL = NSURL(fileURLWithPath: dataPath)
    let completeMovieUrl = tempURL.URLByAppendingPathComponent("tether-\(NSDate()).mov")

    if let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) {
        exporter.outputURL = completeMovieUrl
        exporter.outputFileType = AVFileTypeMPEG4
        exporter.exportAsynchronouslyWithCompletionHandler({ () -> Void in
            switch exporter.status {
            case .Failed:
                print("failed \(exporter.error)")
                completion(url: nil)
            case .Cancelled:
                print("cancelled \(exporter.error)")
                completion(url: nil)
            default:
                print("complete")
                completion(url: exporter.outputURL)
            }
        })
    }
}
You've forgotten to add your videoComposition to the AVAssetExportSession:
exporter.outputFileType = AVFileTypeMPEG4 // You had this
exporter.videoComposition = videoComp // but had forgotten this
exporter.exportAsynchronouslyWithCompletionHandler({ // ...
