Unable to change buffer size via settings - AudioKit

I have a custom C node built with AudioKitEX.
My node always receives a buffer size of 512.
I've tried changing the settings to a different buffer size, but it always stays at 512. How can I change it?
This is what I tried; I was expecting to get a buffer size of 1024 in my node, but I still got 512:
Settings.bufferLength = .veryLong // expecting 1024 frames per buffer
try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
AudioKit 5.4

Right, so I came to the conclusion that the buffer size is not something that can be set from AudioKit; it is dictated by the hardware. For example, I was getting 512 when running my app on the iOS simulator, but 1024 when running on a real iPhone. Note that setPreferredIOBufferDuration is only a request; the system chooses the actual buffer duration.
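
One way to see this at runtime (a small sketch using only AVAudioSession API; setPreferredIOBufferDuration is a request, and ioBufferDuration reports what the system actually granted):

import AudioKit
import AVFoundation

// The preferred duration is only a request; read back what iOS actually granted.
let session = AVAudioSession.sharedInstance()
try session.setPreferredIOBufferDuration(Settings.bufferLength.duration)
try session.setActive(true)

// Frames per buffer = granted I/O duration x current sample rate.
let framesPerBuffer = session.ioBufferDuration * session.sampleRate
print("Granted \(session.ioBufferDuration) s ≈ \(framesPerBuffer.rounded()) frames per buffer")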

Related

iOS not respecting jpeg image 32MP limit - how to fix

As the title states, and as a Google search can confirm, on iOS there is a limit to the size of JPEG images that devices can handle.
As per the Apple docs (Know iOS Resource Limits):
Because of the memory available on iOS, there are limits on the number of resources it can process:

The maximum size for decoded GIF, PNG, and TIFF images is 3 megapixels for devices with less than 256 MB RAM and 5 megapixels for devices with greater or equal than 256 MB RAM. That is, ensure that width * height ≤ 3 * 1024 * 1024 for devices with less than 256 MB RAM. Note that the decoded size is far larger than the encoded size of an image.

The maximum decoded image size for JPEG is 32 megapixels using subsampling. JPEG images can be up to 32 megapixels due to subsampling, which allows JPEG images to decode to a size that has one sixteenth the number of pixels. JPEG images larger than 2 megapixels are subsampled—that is, decoded to a reduced size. JPEG subsampling allows the user to view images from the latest digital cameras.
I added the emphasis on the point that's bugging me most. I'm trying to display a fairly big image, but one still well within the 32MP limit mentioned above: specifically, it's 3995px x 2138px, for a total of 8.5MP, and 396kb in weight (JPEG quality/compression set to 25 via Photoshop).
Still, whenever I use that image as, e.g., the source of an <img> tag, nothing is displayed on any iOS device I've been able to test: emulators and a couple of real devices (iPhone 4, iPad 2, 3, mini...).
Is there anything I'm blatantly missing, or something I've misunderstood from the docs above?
What can I do apart from replacing it with a reduced file size? If forced to replace it, what is the largest width I can reach without breaking? How can I ensure iOS honors the 32MP limit mentioned?
I'm speaking from a website perspective, not a native app on the device.
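
For reference, a quick check of the asker's numbers against the quoted limits (a sketch, assuming the docs' 1024 x 1024-per-megapixel convention):

// Quick check of the asker's numbers against the documented limits.
let pixels = 3995 * 2138          // 8,541,310 px, i.e. ~8.5 MP
let jpegLimit = 32 * 1024 * 1024  // 32 MP JPEG limit (with subsampling)
let otherLimit = 5 * 1024 * 1024  // 5 MP GIF/PNG/TIFF limit (>= 256 MB RAM)
print(pixels <= jpegLimit)        // true  - within the documented JPEG limit
print(pixels <= otherLimit)       // false - over the non-JPEG limit

So by the documented numbers alone, the image is within the JPEG limit, though it would exceed the limit for the other formats.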
It doesn't fix your current problem, but if you look at image handling in iOS 8 there are no longer any image size limits (Core Image can automatically tile); perhaps you could target that?
You can split up images and tile them. I routinely display images of 180,000 x 120,000 pixels on iOS devices by chopping them up and using a CATiledLayer; a sketch of that approach follows.
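
A minimal sketch of that approach, assuming the large image has been pre-cut into 256 x 256 tiles named tile_{col}_{row}.png (a hypothetical naming scheme), and ignoring retina scale and zoom levels of detail for brevity:

import UIKit

// A view backed by CATiledLayer: draw(_:) is called once per visible tile,
// so only on-screen tiles are ever decoded into memory.
final class TiledImageView: UIView {
    override class var layerClass: AnyClass { CATiledLayer.self }

    override func draw(_ rect: CGRect) {
        // Assumes contentScaleFactor == 1; real code must account for
        // screen scale and the tiled layer's levelsOfDetail.
        let tileSide: CGFloat = 256
        let col = Int(rect.minX / tileSide)
        let row = Int(rect.minY / tileSide)
        UIImage(named: "tile_\(col)_\(row)")?.draw(in: rect)
    }
}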

Is there a maximum image width in iOS (image I have is 25020 x 238)? Image works when resized

This image (http://imgur.com/TyPtrxy) will not load in the simulator, although when I scale it to half the size it loads just fine. When trying to load the full image I just get a black box where it should be.
Yes, there is a maximum image size (number of pixels). The limit depends on the hardware in part, but it is generally in the range of 5 to 10 million pixels. This limit is related to limitations on the maximum sizes of textures that can be sent to the graphics card; therefore, it only applies to images that are drawn.
From the documentation:
You should avoid creating UIImage objects that are greater than 1024 x 1024 in size. Besides the large amount of memory such an image would consume, you may run into problems when using the image as a texture in OpenGL ES or when drawing the image to a view or layer. This size restriction does not apply if you are performing code-based manipulations, such as resizing an image larger than 1024 x 1024 pixels by drawing it to a bitmap-backed graphics context. In fact, you may need to resize an image in this manner (or break it into several smaller images) in order to draw it to one of your views.
It might be that you are hitting some limits on the maximum size of the CALayer (which in turn is dependent on the maximum OpenGL texture size supported by the hardware) that is backing the view. If you're exceeding the maximum size, a message like CoreAnimation: surface <size> is too large will be logged. It's also possible that the decompressed image may be too large to fit in memory. You should use CATiledLayer to display content of that size to ensure that it stays within the resource constraints of the device.
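
To make the numbers concrete (a quick check using the asker's 25020 x 238 image and a hypothetical 16384-pixel texture limit; the real limit can be queried as shown further down):

// Each texture dimension must fit under the device limit independently;
// total pixel count is not what matters here.
let width = 25020                   // the asker's image is 25020 x 238
let maxTextureSize = 16384          // hypothetical device limit
print(width <= maxTextureSize)      // false - the full-width image cannot be one texture
print(width / 2 <= maxTextureSize)  // true  - the half-size image (12510 px wide) fits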
Just to expand a bit on the other answers.
The UIImage documentation (as of iOS 10) no longer seems to mention size limitations, although if you use UIImageView with images whose dimensions are larger than the maximum texture size* supported by the device, you do get very large memory consumption at render time.
(The memory consumption I see in Instruments seems to indicate that the entire image is put into a 32 bits per pixel buffer when the CA::Layer is rendered.)
If your app doesn't get killed by the OS due to memory usage, the UIImageView does still end up displaying the image, though.
Given this, you'll still need strategies to deal with very large images.
* You can check the maximum texture size using something like glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);. Just make sure you've made some non-nil EAGLContext current before querying OpenGL, otherwise you'll get zero.
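
A sketch of that query in Swift, using the OpenGL ES API the answer refers to:

import OpenGLES

// A current EAGLContext is required, or glGetIntegerv returns 0.
if EAGLContext.current() == nil {
    EAGLContext.setCurrent(EAGLContext(api: .openGLES2))
}
var maxTextureSize: GLint = 0
glGetIntegerv(GLenum(GL_MAX_TEXTURE_SIZE), &maxTextureSize)
print("GL_MAX_TEXTURE_SIZE =", maxTextureSize)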

iPad retina screen recording

Two parts:
Correct me if I'm wrong, but there isn't a standard video file format that holds 2048 x 1536 frames? (i.e. recording the full resolution of the iPad retina is impossible?)
My app uses a glReadPixels call to record the screen, and appends the pixel buffers to an AVAssetWriterInputPixelBufferAdaptor. If the video needs to be resized to export, what's the best way to do this? I'm trying right now with AVMutableVideoCompositionLayerInstructions and CGAffineTransforms, but it's not working. Any ideas?
Thanks
Sam
Yes, it is possible. My app also records large-frame video.
Don't use glReadPixels; it causes a lot of delay, especially if you record big frames such as 2048 x 1536.
Since iOS 5.0 you can use a faster way, using a texture cache (link).
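
A rough sketch of that texture-cache idea (function and variable names here are illustrative, not from the asker's app): render into a CVPixelBuffer that shares storage with a GL texture, then append that same buffer to the AVAssetWriterInputPixelBufferAdaptor, so no glReadPixels copy is needed.

import AVFoundation
import CoreVideo
import OpenGLES

// Create a pixel buffer and a GL texture that share the same memory.
// `eaglContext` is assumed to be the app's existing EAGLContext.
func makeRecordingTarget(eaglContext: EAGLContext, width: Int, height: Int)
        -> (CVPixelBuffer, CVOpenGLESTexture)? {
    var cache: CVOpenGLESTextureCache?
    guard CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, eaglContext,
                                       nil, &cache) == kCVReturnSuccess,
          let textureCache = cache else { return nil }

    // The buffer must be IOSurface-backed / GLES-compatible for zero-copy.
    let attrs: [CFString: Any] = [
        kCVPixelBufferOpenGLESCompatibilityKey: true,
        kCVPixelBufferIOSurfacePropertiesKey: [CFString: Any]()
    ]
    var pb: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA,
                              attrs as CFDictionary, &pb) == kCVReturnSuccess,
          let pixelBuffer = pb else { return nil }

    var tex: CVOpenGLESTexture?
    guard CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, textureCache, pixelBuffer, nil,
            GLenum(GL_TEXTURE_2D), GL_RGBA, GLsizei(width), GLsizei(height),
            GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE), 0, &tex) == kCVReturnSuccess,
          let texture = tex else { return nil }

    // Attach `texture` to a framebuffer, render each frame into it, then
    // append `pixelBuffer` to the pixel buffer adaptor as before.
    return (pixelBuffer, texture)
}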

cvCaptureFromCAM() / cvQueryFrame(): get native resolution of connected camera?

I'm using the two OpenCV functions mentioned above to retrieve frames from my webcam. No additional properties are set; I'm just running with the default parameters.
Here, cvQueryFrame() always returns frames of size 640x480, independent of the native resolution of the camera. But how can I get full-size frames when I don't know the camera's exact resolution and therefore can't set the width and height properties? Is there a way to reset these 640x480 settings? Or is there a way to query the device for the maximum resolution it supports?
Thanks!

Are memory issues common when scanning 2400 DPI pictures with TWAIN?

I'm using twaindotnet to scan an image at 2400 DPI. Whenever I scan a full page in color, I get a message that there is not enough memory to perform this action. I tried it on a different computer with around 4 GB of RAM and got the same error message.
If I scan the image as black and white or gray-scale I don't get any error and everything is working fine.
Is that a problem that is related to the scanner driver (Canon 9000F) or is this a general TWAIN problem?
Gray-scale images have a bit depth varying from 2 to 8. For an image of legal size at 2400 dpi, that gives a size of 163 MB ~ 654 MB.
Color images have a higher bit depth. Take 32, for example: an image of the same size and dpi can be around 2.62 GB. Add the memory occupied by other applications, and 4 GB of memory likely runs out.
File Size = (height x width x bit depth x dpi²) / 8
where height and width are in inches.
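
A quick sanity check of that formula (dimensions in inches):

// Raw scan size in bytes, per the formula above.
func rawScanBytes(widthInches: Double, heightInches: Double,
                  dpi: Double, bitDepth: Double) -> Double {
    (heightInches * widthInches * bitDepth * dpi * dpi) / 8
}

// Legal-size page (8.5 x 14 in) at 2400 dpi:
let gray8 = rawScanBytes(widthInches: 8.5, heightInches: 14, dpi: 2400, bitDepth: 8)
let color32 = rawScanBytes(widthInches: 8.5, heightInches: 14, dpi: 2400, bitDepth: 32)
print(gray8 / 1_048_576, "MB")        // ~654 MB, the top of the gray-scale range
print(color32 / 1_073_741_824, "GB")  // ~2.5 GB, in line with the estimate above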
It looks like that TWAIN library is scanning to memory. The TWAIN specification also has a file transfer mode (ICAP_XFERMECH), which is generally used for very large images; twaindotnet may allow you to choose the file transfer mode when scanning.
