Using GPUImage with the Marmalade EDK on iOS

I'm trying to write an extension of GPUImage for the Marmalade framework. For this I used the official documentation and the Marmalade Extension Development Kit (EDK). I wrote some sample code and compiled it with:
mkb s3egpuimage_iphone.mkb --arm --release --compiler clang
It compiles fine: I get the library and headers, link them with the Marmalade deployment tool, and linking completes without errors. But when I install the IPA on an iPod touch and run this code, the application either freezes or crashes. The freeze or crash starts when I call:
[videoCamera startCameraCapture]
Of course, I initialized videoCamera with
[[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
and set up a simple target:
textureOutput = [[GPUImageTextureOutput alloc] init];
...
[videoCamera addTarget:textureOutput];
[videoCamera startCameraCapture];
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
movieWriter.shouldPassthroughAudio = YES;
videoCamera.audioEncodingTarget = movieWriter;
[movieWriter startRecording];
I have been thinking about this, but I don't understand it.
Could you help, please?

Again, I'd comment if I could, but... so this is a partial answer.
It's worth looking through the log to see if any messages come up that you were not expecting. You haven't shown the s4e file, but here are a few things to consider:
1) At the lower level, are you running on the OS thread (either by stating it in the s4e file or rolling your own)? Find out which thread the API should be accessed from, and be consistent - don't mix and match.
2) If you are in the OS thread, look out for any exceptions. [The Marmalade code that calls across the OS thread does not like unhandled exceptions.]
3) The API that calls across threads uses varargs (...). This looks powerful, but there are known issues with varargs and we'd now advise against it - the issues relate to 64-bit and similar alignment problems. I suggest creating a parameter block for each function and passing that instead.
If you find any more feel free to post.

Related

Where do GPUImage and AVAssetWriter write files to?

I am trying to use Brad Larson's GPUImage library to record a video file. So far, the issue I am running into is with the following piece of code:
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
This is picked up directly from his example from his GitHub page
https://github.com/BradLarson/GPUImage
Now, I looked at the source code and I see he does the usual AVFoundation stuff: he creates an AVAssetWriter instance and adds inputs to it. My question here is: where is @"Documents/Movie.m4v" located? How can I play this video back? Or how do I even access it? I know it is located in an arbitrary location where the app bundle is copied at run-time. My goal is to be able to write the video into the Gallery, or at least into an album in the Gallery that is created by my app. Does anyone know where the file is being written to?
Maybe I'm misunderstanding, but the movie is at the path pathToMovie (or URL movieURL).
You can play it back with something like this:
#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>
// later
AVPlayerViewController *avc = [[AVPlayerViewController alloc] init];
avc.player = [[AVPlayer alloc] initWithURL:movieURL];
[myViewController presentViewController:avc animated:YES completion:nil];
If you want to download the file to your computer, you can select your connected device in Xcode > Devices, then select your app and download the container. Your movie will be in the Documents directory.
To save it to the gallery, do
// TODO: use completion arguments for a better user experience
UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, nil, nil);

GPUImage failed to init ACVFile with data:(null)

First of all, I must say that GPUImage is an excellent framework. However, when loading an ACV file that I exported from Photoshop CS6, it gives me an error: failed to init ACVFile with data:(null). The thing is, though, that the same code works for some other ACV files, and the file definitely has data, 64 bytes of it in fact.
Here is how I am trying to load it:
GPUImageToneCurveFilter *stillImageFilter2 = [[GPUImageToneCurveFilter alloc] initWithACV:@"test"];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:baseImage];
photoImage.image = quickFilteredImage;
If I change test to another ACV file, it works perfectly. Not sure what is wrong.
Thanks
MehtaiPhoneApps
Just add the extension of the tone curve file, test.acv, and you are good to go.
=> updated code
GPUImageToneCurveFilter *stillImageFilter2 = [[GPUImageToneCurveFilter alloc] initWithACV:@"test.acv"];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:baseImage];
photoImage.image = quickFilteredImage;

Adding the GPUImage framework to an iOS project

The hours are turning into days trying to add the GPUImage framework to an iOS project. Now that I've got it working, I'm trying the sample live-video filtering code from the Sunset Lake Software page. The app fails to build with the following red error: Use of undeclared identifier 'thresholdFilter'
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 768, 1024)];
// problem here
[videoCamera addTarget:thresholdFilter];
[customFilter addTarget:filteredVideoView];
[videoCamera startCameraCapture];
Using Xcode 6.0.1 and testing the app on an iPad 2 with iOS 8.0.2. If required, I can post screenshots of how I embedded the framework.
First, the code written in my initial blog post announcing the framework should not be copied to use with the modern version of the framework. That initial post was written over two years ago, and does not reflect the current state of the API. In fact, I've just deleted all of that code from the original post and directed folks to the instructions on the GitHub page which are kept up to date. Thanks for the reminder.
Second, the problem you're describing above is that you're attempting to use a variable called thresholdFilter without ever defining such a variable. That's not a problem with the framework; the compiler has no idea what you're referring to.
Third, the above code won't work for another reason: you're not holding on to your camera instance. You're defining it locally, instead of assigning it to an instance variable of your encapsulating class. This will lead to ARC deallocating that camera as soon as your above setup method is complete, leading to a black screen or to a crash. You need to create an instance variable or property and assign your camera to that, so that a strong reference is made to it.

iOS Abandoned memory VM: Image IO

I have the following kind of problem. I'm developing an iPad application which uses a lot of images and Core Animation. I have no leaks, but I do have an abandoned-memory issue: steady memory growth. I have disabled the animations that use quite a lot of memory, but the memory still grows. For the animations I use http://markpospesel.wordpress.com/2012/05/07/mpfoldtransition/. I have replaced the implementations of loading methods such as imageNamed: in UIImage (see below), but it still does not help.
If anyone has any ideas please help.
Thanks to everybody.
+ (UIImage *)imageNamed:(NSString *)name {
    NSString *pathExtension = [name pathExtension];
    name = [name stringByDeletingPathExtension];
    if ([pathExtension isEqualToString:@""]) {
        pathExtension = @"png";
    }
    // Append @2x for Retina screens, then resolve the path in the main bundle.
    NSString *suffix = [BMKAppUtilites isRetina] ? @"@2x" : @"";
    name = [name stringByAppendingString:suffix];
    name = [name stringByAppendingPathExtension:pathExtension];
    name = [[NSBundle mainBundle] pathForResource:[name stringByDeletingPathExtension]
                                           ofType:[name pathExtension]];
    // Uncached read, so UIImage's built-in imageNamed: cache is bypassed.
    return [[self alloc] initWithData:[NSData dataWithContentsOfFile:name
                                                             options:NSDataReadingUncached
                                                               error:NULL]
                                scale:[BMKAppUtilites scaleFactor]];
}
edit: I just noticed this question is over 6 years old. 😆
—
It looks a lot like something still has a strong reference to your images. I’d say your best bet based on the data you’ve shown us is to just run from Xcode and pause in the memory graph to see what has the strong references. Don’t forget to turn on malloc stack traces in your scheme in order to get traces to where the memory is allocated. Good luck.
This might help a bit if you’ve never used the Xcode memory graph: https://developer.apple.com/documentation/xcode/improving_your_app_s_performance/reducing_your_app_s_memory_use/gathering_information_about_memory_use

VLC - can't play back online video

I'm trying to use VLC to play back online YouTube video on iOS 5.
I set an NSURL on MVLCMovieViewController, using code like this:
NSString *contentURL = @"http://www.youtube.com/watch?v=FWKYriGgmCo"; // (it's a working link)
NSURL *url = [NSURL URLWithString:contentURL];
MVLCMovieViewController *movieViewController = [[MVLCMovieViewController alloc] init];
movieViewController.url = url;
[self presentModalViewController:movieViewController animated:YES];
[movieViewController release];
When I run the app, it stops in the http.c file with "Program received signal EXC_BAD_ACCESS" near this code:
p_sys->psz_user_agent = var_InheritString(p_access, "http-user-agent");
for (char *p = p_sys->psz_user_agent; *p; p++)
So does VLC support online playback? Or what should be modified so that I can play a URL directly on iOS?
Thanks a lot in advance for your help!
I've done a lot of work on the VLC iOS source code, to try to get it to handle RTP and UDP streams. The short answer is that I didn't get it to work for those protocols but HTTP works, and the blocking seems to be at the OS level.
If you want the details on what I did to make VLC compile correctly and work on the latest Xcode, please read the following forum thread: https://forum.videolan.org/viewtopic.php?f=12&t=108691
Since YouTube seems to be HTTP, it should work, but your mileage may vary.
Hope this helps.
