I opened Instruments and selected the Automation template.
After opening the instrument I selected an app from Choose Target.
After that I selected my JavaScript file from the Choose Script option, but the Start Script and Stop Script options are not getting enabled. In this case, how should I record and play back my scripts in Instruments? I am using Xcode 4.1.
Please help me enable those buttons and guide me on how I can record and play back scripts in Instruments.
This might be an iOS device/simulator incompatibility. Recording is available for iOS 5 and above; if you connect a device running iOS 4.x, or launch a simulator for iOS 4.x, Record will be disabled.
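For what it's worth, the scripts the Automation template records and plays back are plain JavaScript written against the UIAutomation API. A minimal sketch looks like the following; the "Done" button name is a placeholder for a control in your own app:

    // Minimal UIAutomation playback sketch (legacy Instruments Automation API).
    var target = UIATarget.localTarget();   // the device or simulator running the app
    var app    = target.frontMostApp();     // the app chosen under Choose Target
    var window = app.mainWindow();
    UIALogger.logStart("Smoke test");
    window.logElementTree();                // dump the accessibility hierarchy to the trace log
    window.buttons()["Done"].tap();         // placeholder interaction; use one of your own controls
    target.delay(1);                        // give the UI a moment to settle
    UIALogger.logPass("Smoke test");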
I'm trying to add an audio unit extension to my iOS app. I used File > New > Target > Audio Unit Extension in Xcode to use the built-in template, filled in the info to populate the Info.plist file, and built and ran my app. Even though the audio unit doesn't do anything yet, I expected that the audio unit would be visible to host apps at this point, but it's not.
I downloaded and ran Apple's FilterDemo app, which creates an audio unit extension similar to what I got from the Xcode template, and that appears in host apps (I'm using Auria as a host to test the audio units).
I've tried running my app's main target, and also running the extension target with Auria selected as the host app, but neither works. When running the FilterDemo app, I just ran the main target and that worked.
I went through the Info.plist and build settings comparing the FilterDemo to my app, but I didn't see any significant differences. I also opened the Xcode build folder and viewed the app package that I'm running in debug mode and confirmed that my audio unit extension (.appex package) is there in the Plugins folder.
In the Info.plist file, my extension type is augn (generator), manufacturer is Test and description is Share Audio. I experimented with some changes to these settings, but that didn't help.
I thought the existing Inter-App Audio functionality might be interfering, so I removed the AudioComponents section from the container app's Info.plist file. That made the IAA component disappear from host apps, but didn't make the AUv3 component appear.
I've read through the App Extension Programming Guide, the AUAudioUnit class reference, this tutorial and this one, and the transcript of the WWDC introduction of this functionality, but none of them mention any extra steps needed to make the extension visible to host apps.
What am I missing?
You should first run the containing app to install it, and then run the extension:
Create an empty iOS project; you can name it HostAU.
Create an Audio Unit Extension target for iOS named EffectAU.
Set the Audio Unit Type to Effect, the Subtype Code to something like oiuy, and the Manufacturer Code to something like Oiuy.
Run HostAU on your iOS device.
Then run EffectAU, selecting GarageBand as the target (make sure you have it installed on your iOS device).
In GarageBand, select any instrument, press the settings button to show the panel on the left side, and find the "PLUG-INS & EQ" section.
Press that bar, press Edit at the top, then press any green plus button to add a new plug-in.
Find the Audio Unit Extension segmented control at the top, and you'll be able to see your Audio Unit Extension's icon, name, and Manufacturer Code.
The same process works on macOS, but you should create an empty macOS project and a macOS Audio Unit Extension target. If you're using Mac Catalyst, create two Audio Unit Extensions, one for each platform, and configure which one is used on which platform under Project > Build Phases > "Embed App Extensions".
When you run the extension target with macOS GarageBand, make sure Audio Units are enabled at Preferences > Audio/MIDI > Enable Audio Units. Then configure it at the bottom under Track > Plug-ins > Audio Units > {Manufacturer Code} > {Audio Unit Extension Name}.
On macOS, the shell commands pluginkit -m and auval -a help you check whether the plug-in is installed and whether the audio unit was recognized, respectively.
I ran this experiment with macOS 11.6, iOS 15.0.1, macOS GarageBand 10.4.3, an iOS GarageBand release from October 2021 or later, and Xcode 13.0.
I'm new to Instruments, and I'm trying to use Instruments Automation to send location events to my Xcode Swift project, which uses MapKit, but using a SIMULATOR (as opposed to a real device).
I've got this working by:
running my project in PROFILE mode in Xcode
then going to the Instruments Automation template with a script that uses "setLocationWithOptions"
But in neither Xcode nor Instruments do I see the normal log output (print or NSLog) I use to monitor the app. How would I see print output when doing this? Or is there a different approach I should be taking?
Xcode 8
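For reference, here is a sketch of the kind of script described above, using the legacy UIAutomation API; the coordinates, options, and delay are placeholder values. Messages written with UIALogger show up in the Automation instrument's own trace log, which can stand in for print/NSLog while profiling:

    // Sketch of a location-simulation script for the Automation instrument.
    // Coordinates, accuracy values, and the delay are placeholders.
    var target = UIATarget.localTarget();
    target.setLocationWithOptions(
        { latitude: 37.3317, longitude: -122.0307 },   // placeholder coordinates
        { altitude: 10.0, horizontalAccuracy: 5.0 }    // optional extras
    );
    // UIALogger output appears in the Automation instrument's trace log,
    // unlike print()/NSLog() output from the app itself.
    UIALogger.logMessage("Sent simulated location to the target app");
    target.delay(2);   // give the app's MapKit/CoreLocation callbacks time to fire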
If we're talking about the simulator, you need to run your application first and then record the running application through your instrument (Zombies, for example).
What you need to choose for recording:
select the running application
I'm trying to figure out how to do automated UI testing so that I can test my app for regression errors as I make changes and such. I'm following the instructions found in Apple's documentation. I built my app in Xcode and it is running in the iOS simulator.
I opened Instruments and chose the iOS Automation instrument. However, I can't seem to figure out how to get Instruments to run the script on my iOS app in the sim. When I first create the Automation instrument the target drop down just says "lkj" but if I try to choose the sim as my target, it tells me that the Automation instrument doesn't allow attaching.
How do I get Instruments to run my test on my iOS app?
I tried just running it with the "lkj" target selected and I got a weird error involving some random guy's name (I'm assuming he's a dev for Instruments or something?)
Path not found
'/Users/jonathan_morgan/Library/Developer/Xcode/DerivedData/lkj-randomstringofcharacters/Build/Products/Release-iphonesimulator/lkj.app/lkj
The simplest way to attach your target to the simulator and run your UIAutomation scripts is to profile the app: in Xcode, Product > Build For > Profiling, and then select the Automation template.
Another way to attach the target, if you've already built the app on your simulator, is to select Choose Target and then browse to the following location: /Users/[yourUserName]/Library/Application Support/iPhone Simulator/[iOSVersion]/Applications/[AppFolder]/[NameOfYourTarget]
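Once the target is attached by either route, a couple of script lines are enough to confirm the hookup before writing real tests. This is just a sanity-check sketch:

    // Sanity-check script: confirms Instruments is attached to the intended app.
    var target = UIATarget.localTarget();
    var app = target.frontMostApp();
    UIALogger.logMessage("Attached to: " + app.bundleID());
    app.mainWindow().logElementTree();   // dumps the UI hierarchy to the trace log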
For more info, you can take a look at this blog, which is pretty detailed: http://blog.manbolo.com/2012/04/08/ios-automated-tests-with-uiautomation
Hope it helps.
I've installed MonkeyTalk on my machine, which runs OS X 10.8.5.
I had successfully run an automated test on Android with MonkeyTalk, so I moved on to Xcode 5.0.2. I created a sample application (iOS 7.0), added the required targets, and changed the scheme name too. I also added the required libraries mentioned here to Build Phases, as well as the Other Linker Flags option (-all_load) in Build Settings. Now I can successfully build the app with the new Monkey target, and it appears in the simulator correctly. But after selecting the iOS Simulator in the MonkeyTalk IDE, although the console output is "Connected to Device: iOS Simulator", the record button is not enabled.
I've also watched the tutorial video about running MonkeyTalk on the iOS simulator; the linker flag mentioned there is -all_load lstdc++, if I understood correctly (the quality of the video wasn't the best and I could not read the line). Have you ever run into this problem?
Edit: I've tried running via a networked device; MonkeyTalk connects to the device successfully, but the record button is still not enabled.
Did you get messages in Xcode's log output like the last photo in Installing the MonkeyTalk iOS Agent? (Sorry, I don't have permission to attach images here yet.)
If not, make sure you added all the required libraries to Build Phases and run the CORRECT target first (I ran into this problem once because of that :<). Then you can check the log output and google those messages for help.
I created a build for ad-hoc distribution of our product and installed it on my device. Now I want to run the Time Profiler on the running process, but Instruments is unable to attach to it. This is the error that I get when I try to attach to a running process on the iPhone:
Target failed to run : Could not attach to process <app-name> (<pid>)
I also tried "Choose Target" > app-name but that too failed with the following error:
Target failed to run : Remote exception encountered : 'Failed to get task for pid <pid>'
Here are the details of my setup:
OS X 10.7.2
Xcode 4.2.1 (Build 4D502)
Instruments 4.2 (4233)
iPhone OS 5.0.1 (Build 9A405)
I had the same problem. I didn't solve it initially, but an easy workaround is to launch the app yourself and then attach to it from the "Attach to Process..." command in the Target menu in the Instruments window.
After some digging around, it seems this is a common topic of discussion on the Apple developer forums: Instruments does not work on Xcode 4 with device
It appears that different Apple products will change/update the MobileDevice framework. Some of these updates break the Instruments integration.
To fix this on my own machine, I installed the iTunes 10.5 beta v6. Fire up Xcode, and I'm back in business instrumenting on my device.
I would recommend this tutorial, since it is one of the better ones for explaining how to use Instruments.
It can be very confusing at first, but take the time to get to know it and it'll ease a lot of headaches later.
Hope at least one of these approaches works for you.
In Xcode 10: don't launch Instruments separately. Instead:
From Xcode’s menu bar, select Product\Profile, or press ⌘I. This will build the app and launch Instruments. You will be greeted with a selection window...
Full Tutorial
Note
If you do launch Instruments separately, you'll get the "failed to attach to target" error with the recommendation to disable System Integrity Protection.