GPUImage examples not working - iOS

I've hit a roadblock with using GPUImage. I'm trying to apply a filter (SepiaFilter or OpacityFilter) to a prerecorded video, and what I expect to see is the video played back with the filter applied. I followed the SimpleFileVideoFilter example for my code. What I ended up with is a video that is unplayable by QuickTime (m4v extension), and the live preview of the rendering is all skewed. I thought it was my code at first, so I ran the example app from the examples directory and, lo and behold, got the same issue. Is the library broken? I just refreshed from master on GitHub.
Thanks!
Here's a sample output of the generated video:
http://youtu.be/SDb9GfVf9Lc
No matter what filter is applied, the resulting videos are all similar (all skewed).
@Brad Larson (I hope you see this message), do you know what I could be doing wrong? I am using the latest Xcode and the latest GPUImage source code. I also tried using the latest from CocoaPods; both end up the same.
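For reference, the pipeline I set up follows the example roughly like this (a Swift sketch of GPUImage's Objective-C API; the bridged method names are my best guess here, so check them against the example app):

import GPUImage

// Read the prerecorded movie, filter it, and write it back out,
// mirroring the SimpleFileVideoFilter example.
let movie = GPUImageMovie(url: inputURL)        // inputURL: the source video
let filter = GPUImageSepiaFilter()              // or GPUImageOpacityFilter()
let writer = GPUImageMovieWriter(movieURL: outputURL,  // outputURL: the .m4v destination
                                 size: CGSize(width: 640, height: 480))
movie.addTarget(filter)
filter.addTarget(writer)
writer.startRecording()
movie.startProcessing()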

I assume you're trying to run this example via the Simulator. Movie playback in the Simulator has been broken for as long as I can remember. You need to run this on an actual device to get movie playback to work.
Unfortunately, one of the recent pull requests that I brought in appears to have introduced some crashing bugs even there, and I may need to revert those changes and figure out what went wrong. It's not an iOS version thing, either; it's a bug in a recent code addition. I haven't had time to dig into it and fix it, though.

Related

Unity 3D model fragmented on mobile

I have a strange issue that keeps coming up in Unity 2018.2.15f1 (personal). The best way to describe it is that my 3D models are fragmenting (exploding? shattering?) when I build to iOS. So, I'll start with a visual explanation. This is what the model looks like on an iPad
and this is what it should look like (you can also see it on the App Store w/o fragmenting):
These models came from OSM terrain data, were worked on in Blender, and were then imported into Unity. They worked fine on mobile builds until one of two things happened while I was trying to increase performance: 1) I experimented with mobile shaders, and 2) I followed some of the tips in this video. Since discovering the issue I "undid" all the changes (easy with Git) and that seemed to fix it, until, many versions later, the problem suddenly showed up again, but only on this one model (not the other two "cities"). I assumed the issue was the switch to mobile shaders, but since I'm no longer using them, I now have no idea what is causing it.
Here's what I've done to try to fix it:
Reimported the model
Broken the model into distinct components (buildings and terrain)
Double checked I have default Quality settings (under Project Settings)
Double checked I use only the Standard Unity shader throughout the game
I have found that if I turn off one or the other of the two models in the scene (the buildings or the terrain), the issue goes away.
I have found that if I position the building model so the two don't intersect (see #5), it works sometimes, but not every time. It must be at least 30 units above the terrain on the Y axis before the fragments go away.
I tried writing over the iOS build folder (instead of append) but that had no effect.
I tried switching to PC standalone, resetting the GI Cache in Preferences, and switching back to iOS but no luck.
I have found a solution, but I don't know why it works. I separated the meshes into separate objects AND gave each a different material, with matching settings except for one option.
On the new one, I changed the Rendering Mode from Opaque to Transparent. Now there is no more fragmenting, but I'm not sure why. Leaving this question open until someone knows the answer.

iOS 12 Model rendering issue

Having an iOS 12 model rendering issue.
My app loads OBJ models with associated MTLs and textures.
On iOS 11 we were able to load up the models and they looked good:
On iOS 12, they look completely different:
We are able to make some changes after the model loads initially to make it look good, but it takes time for the iPhone to load the better looking version.
Has anyone heard about/experienced this issue and know what has changed in iOS 12 (and potentially MacOS Mojave) that is causing it?
There might be two issues: 1) a texture issue (as seen in the chair on the left), and 2) a material/MTL issue (as seen in the ‘delivery drone’ on the right).
I don't have any code at the moment as I am not one of the developers on the project - I have been tasked with reaching out here. If you have any questions regarding the specific code, I can definitely try to get some to show here. It seems to me like this might not be a code issue or bug, but rather some settings that have to be changed due to changes made in iOS 12; I can't find documentation for anything that matches this.
I know this is not an answer, but I was asked for a screenshot. For the moment I use the OpenGL renderer instead of Metal as a workaround.
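A minimal sketch of that workaround, assuming the SCNView is created in code (the same option can also be set on the view in Interface Builder):

import SceneKit

// Ask SceneKit for the OpenGL ES renderer instead of Metal.
let options = [SCNView.Option.preferredRenderingAPI.rawValue:
               SCNRenderingAPI.openGLES2.rawValue]
let sceneView = SCNView(frame: .zero, options: options)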
I solved the same issue by converting the .obj files to .scn files in Xcode and using those scenes as nodes: Editor -> Convert to SceneKit file format (.scn)
(screenshot of this menu)
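Once converted, the scene loads like any bundled .scn file; a short sketch (the file name and the surrounding sceneView are illustrative):

import SceneKit

// Load the converted scene and graft its contents into the current scene.
if let modelScene = SCNScene(named: "DeliveryDrone.scn"),
   let modelNode = modelScene.rootNode.childNodes.first {
    sceneView.scene?.rootNode.addChildNode(modelNode)
}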

Why is SKAction.playSoundFileNamed crashing?

NOTE:
It has been a few days and I am still having this problem. One thing that would be helpful is to know some troubleshooting ideas to try, so I can track down what is causing the crash. Any help that would lead me in the right direction would be greatly appreciated.
I'm running in Xcode 8.2.1 on the simulator, as well as on several different iOS devices. I get the same problem wherever I go.
I have imported a small mp3 file into my SpriteKit project, called "cat_meow_1.mp3".
When I select the file in Xcode and hit the play button, it plays normally. Incidentally, I have tried various different files in various formats, with the same results.
In my code, which compiles okay, when I get to the line:
run(SKAction.playSoundFileNamed("cat_meow_1.mp3", waitForCompletion: true))
I get a crash with the error message,
error: use of undeclared identifier '$r0'
Any suggestions how to debug this problem, or to figure out what I did wrong?
I also tried to preload the sound and make a class property, but got the same error. Here's what it looks like:
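Roughly like this (a sketch of the preloading approach; the class and method names are whatever your scene uses):

import SpriteKit

class GameScene: SKScene {
    // Preload the action once as a stored property.
    let meowSound = SKAction.playSoundFileNamed("cat_meow_1.mp3",
                                                waitForCompletion: true)

    func meow() {
        run(meowSound)   // crashes with the same '$r0' error as the inline call
    }
}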
UPDATE:
After reinstalling Xcode, I still have the same problem. With certain files I get white noise, while with others I get the error above. What else could be wrong with my system to cause this problem?
UPDATE 2:
I tried my experiment on a separate Mac, and it basically worked just fine. I reinstalled my OS and got the same problem, with a little more of an error message this time, which reads:
2017-01-18 18:10:09.397565 sound attempt 2[533:124220] [DYMTLInitPlatform] platform initialization successful
2017-01-18 18:10:11.058506 sound attempt 2[533:124042] Metal GPU Frame Capture Enabled
2017-01-18 18:10:11.059146 sound attempt 2[533:124042] Metal API Validation Enabled
error: use of undeclared identifier '$r0'
warning: could not load any Objective-C class information. This will significantly reduce the quality of type information available.
After a couple of days, I finally gave in and used Clean My Mac 3 to completely remove Xcode, as well as any and all related files... I must not have caught everything when I previously uninstalled/reinstalled it, because this time none of my custom settings were there when I ran the reinstalled app.
It works now, thankfully, and I can get back to work.

What's wrong with this aupreset for AUSampler?

I created this aupreset to be loaded into an AUSampler in iOS. I followed the process outlined here and used the EPSSampler class from the same post. If I run my app in the iOS Simulator, on iOS 9, the aupreset loads and I can play notes. If I run the same app on a device running iOS 6, the preset loads but I get no sound. I have used the same process on the simulator and on a device in the past, but always by filtering the built-in sine wave generator, never with audio samples. Can someone spot what I'm doing wrong?
EDIT I have no way of testing the app on any device running iOS above 6, at least for now.
EDIT 2 To clarify further, this is how my project looks in Xcode, so you know that my files are going to the right places – i.e. the audio files are going to the Sounds folder within the app bundle (I double checked, just to be sure).
EDIT 3 So, I took the Trombone.aupreset from LoadPresetDemo and manually plugged my audio files into it. Magically, it worked. So I figured I'd load it into the AUSampler's GUI through AU Lab and make whatever changes I needed to make it sound right – i.e., increasing the release time. It stopped working. So I manually tweaked the working copy to roughly match what I needed (the docs on aupreset plists are surprisingly unhelpful) and I'm rolling with it. It would seem that AUSampler is messing up the preset, at least for iOS 6 on device, which is currently the only device I have to test on. Insights?
I didn't really look that deeply, but the file://localhost//Library/Audio/Sounds/C.caf in your file references looks a little fishy. Mine looks like this: /Users/dave/Library/Audio/Sounds/C6.wav. Maybe the file:// part is throwing it off.
Create a directory in your iOS project named Sounds (just like your image shows) and put your caf files in there. The URLs will be grokked by iOS.
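For reference, the matching entry in the aupreset's file-references dictionary then looks something like this (the Sample ID key below is illustrative; yours will differ):

<key>file-references</key>
<dict>
    <key>Sample:268435457</key>
    <string>/Library/Audio/Sounds/C.caf</string>
</dict>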
Here is a post that goes into detail.

What is the simplest way to play a MIDI note for an indefinite duration in iOS?

I want to play instrument 49 in iOS for varying pitches [B2-E5] for varying durations.
I have been using the Load Preset Demo as reference. The vibraphone.aupreset does not reference any files. So, I had presumed that:
I would be able to find and change the instrument in the aupreset file (unsuccessful so far)
there is some way to tell the MIDI interface to turn notes on and off without generating *.mid files.
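That second presumption turned out to be possible: AudioToolbox can send raw note-on/note-off messages straight to the sampler, with no .mid file involved. A minimal sketch, assuming samplerUnit is the AUSampler's AudioUnit from the demo's AUGraph:

import AudioToolbox

let noteOnStatus: UInt32  = 0x90   // note-on, MIDI channel 0
let noteOffStatus: UInt32 = 0x80   // note-off, MIDI channel 0
let note: UInt32 = 60              // middle C; any pitch in B2-E5 works the same way

// Start the note; it sustains indefinitely...
MusicDeviceMIDIEvent(samplerUnit, noteOnStatus, note, 100, 0)  // velocity 100
// ...until you explicitly end it:
MusicDeviceMIDIEvent(samplerUnit, noteOffStatus, note, 0, 0)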
Here's what I did:
Duplicated the audio-related code from the project,
removed the trombone-related files and code (calling loadPresetTwo: in place of loadPresetOne:, in init as opposed to viewDidLoad),
added a note sequence and a timer to turn off the previous note and turn on the next one.
Build. Run. I hear sound on the simulator.
There is NO sound coming from my iPhone.
I have triple-checked the code that I copied, as well as where the calls are taking place. It's all there. The difference is that the trombone-related files and code are absent. Perhaps there is some dependency that I'm not aware of. Perhaps the problem is rooted in architectural differences between the simulator running on a remote Mac VM and the iPhone. Perhaps I can only speculate because I don't know enough about the problem to know what questions to ask.
Any thoughts or suggested tests would be great!
Thanks.
MusicPlayer + MusicSequence + MusicTrack works. It was much easier than trying to guess what the code in the demo was doing.
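A minimal sketch of that combination (timestamps and durations are in beats; error handling omitted):

import AudioToolbox

// Build a one-track sequence with a single note and play it.
var sequence: MusicSequence?
NewMusicSequence(&sequence)

var track: MusicTrack?
MusicSequenceNewTrack(sequence!, &track)

var note = MIDINoteMessage(channel: 0, note: 60, velocity: 100,
                           releaseVelocity: 0, duration: 2.0)
MusicTrackNewMIDINoteEvent(track!, 0.0, &note)   // note starts at beat 0

var player: MusicPlayer?
NewMusicPlayer(&player)
MusicPlayerSetSequence(player!, sequence)
MusicPlayerPreroll(player!)
MusicPlayerStart(player!)

To route the sequence through the demo's AUSampler rather than the default instrument, point it at the graph with MusicSequenceSetAUGraph before playing.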
