I'm making an app that uses Tile Layers in Google Maps to provide custom indoor maps for buildings of interest. I've got the tile layer sending tiles at the appropriate coordinates, but any time one of the tiles is empty, Google Maps is rendering it as white, not transparent. You can see this in the attached screenshot.
What is going on here and how can I get it to render those tiles as transparent? Note, the tiles exhibit no unusual characteristics when viewed in Photoshop or Preview.
I finally figured out what the problem was: ImageMagick was saving the fully transparent tiles in a 16-bit grayscale-plus-alpha format (8-bit gray / 8-bit alpha) that Google Maps didn't understand, while tiles with content were saved in the usual 32-bit RGBA format. Adding the following option to my tile-generation script fixed it:
-define png:color-type=6
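For reference, the define takes effect at write time, so it goes before the output filename. A minimal invocation might look like this (filenames are placeholders, not from my actual script):

```shell
# Force PNG color type 6 (32-bit RGBA) on output, so fully transparent
# tiles aren't written out as 16-bit gray+alpha.
convert tile_raw.png -define png:color-type=6 tile.png
```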
Related
I made a project to process Bayer pixels from RAW images on iOS.
I would like to display the processed pixels on the iOS screen without first converting them to 8-bit RGBA (the UIImage format), which seems to be the only format UIImageView accepts.
I reviewed CIRAWFilter and the sample RawExposeEmbedded project from Apple to see how 14-bit data might be drawn to the screen, but the methods they use to render the image involve GLKView and GL functions.
Is this the only known way to render 14-bit RAW processed pixels to an iOS screen or is there an alternative method, such as using CIImage/UIImage with UIImageView?
I have a problem displaying tiles from OpenStreetMap through MKTileOverlay: on a retina display, text and labels are half size (wrong scale) and the map is unreadable.
I have tried changing the tile size, but with a different value the map is not displayed correctly (I see nothing with 512, while the tiles overlap with 128).
How can I fix this problem?
Raster map tiles need to be designed specifically for retina displays to avoid this. I'd recommend something like Mapbox, whose API automatically takes care of this.
Otherwise, you'll have to implement a custom MKTileOverlayRenderer and draw each 256px tile into a 512px CGContext. The tiles will appear blurry, though.
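One way to get the same effect without writing a full custom renderer is to subclass MKTileOverlay, report 512px tiles to MapKit, and upscale each fetched 256px tile before handing it back. This is only a sketch; the class name and hard-coded sizes are illustrative:

```swift
import MapKit
import UIKit

// Sketch: upscale each 256px OSM tile to 512px so labels render at a
// readable size on retina screens. (They will be blurry, as noted above.)
class RetinaTileOverlay: MKTileOverlay {
    override init(urlTemplate: String?) {
        super.init(urlTemplate: urlTemplate)
        tileSize = CGSize(width: 512, height: 512) // tell MapKit to expect 512px tiles
    }

    override func loadTile(at path: MKTileOverlayPath,
                           result: @escaping (Data?, Error?) -> Void) {
        // Fetch the ordinary 256px tile, then return an upscaled copy.
        super.loadTile(at: path) { data, error in
            guard let data = data, let image = UIImage(data: data) else {
                result(data, error)
                return
            }
            let size = CGSize(width: 512, height: 512)
            UIGraphicsBeginImageContextWithOptions(size, false, 1)
            image.draw(in: CGRect(origin: .zero, size: size))
            let scaled = UIGraphicsGetImageFromCurrentImageContext()
            UIGraphicsEndImageContext()
            result(scaled?.pngData() ?? data, error)
        }
    }
}
```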
In my app, I convert and process images:
from colour to greyscale, then operations such as histogram equalisation, filtering, etc.
That part works fine.
My UIImages display correctly, and saving them to JPEG files works too.
The only problem is that, although my images are now greyscale, they are still saved as RGB JPEGs. That is, the red, green and blue values for each pixel are identical, but keeping the duplicated values wastes space and makes the files larger than they need to be.
So when I open an image file in Photoshop, it is black and white, but when I check "Photoshop > Image > Mode", it still says "RGB" instead of "Greyscale".
Does anyone know how to tell iOS that the UIImageJPEGRepresentation call should create data with one channel per pixel instead of four?
Thanks in advance.
You should do an explicit conversion of your image, using CGColorSpaceCreateDeviceGray() as the colour space, which gives you 8 bits per component and a single channel.
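A minimal sketch of that conversion, with a helper name of my own choosing: the image is redrawn into a DeviceGray context before being handed to the JPEG encoder. In my experience iOS encodes a grayscale CGImage as a single-channel JPEG, but verify the output for your SDK version:

```swift
import UIKit

// Sketch: redraw a UIImage into an 8-bit, single-channel grayscale context,
// then encode the result as JPEG. `grayscaleJPEGData` is an illustrative name.
func grayscaleJPEGData(from image: UIImage, quality: CGFloat = 0.9) -> Data? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,   // 8 bits per component
                                  bytesPerRow: 0,        // let CoreGraphics choose
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue)
    else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    guard let gray = context.makeImage() else { return nil }
    // UIImageJPEGRepresentation in older SDKs; jpegData(compressionQuality:) today.
    return UIImage(cgImage: gray).jpegData(compressionQuality: quality)
}
```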
I still use the color information in the texture even when the alpha is set to 0, and the PNG file is saved correctly with that color preserved. If I load it through the content pipeline and set it to non-premultiplied, everything works fine. Texture2D.FromStream is documented as non-premultiplied, but it's wiping out the color: when debugging in PIX and inspecting the texture, all pixels with 0 alpha are set to black.
Is there a way I can bypass the content pipeline and still keep my color for transparent pixels?
I'm not able to help too much just now, as I don't have the code in front of me, but I did this myself a few days ago and it had all the transparency that was expected. Perhaps it's your image that has an issue? I used a PNG saved with Paint.NET.
As seen in this image http://imgur.com/Qrqqo, the boat, tree trunk and ladder all have transparency, which allows them to sit on a second layer, and the tileset itself is loaded using FromStream (user-generated content ftw).
So if no one has answered this by the time I get to my computer, I'll take a look at what I have and post a sample if needed.
I am able to use PNGs that have drop shadows, but when displayed on the BlackBerry the transparent channel looks like it has been collapsed from its original smooth gradient down to only a handful of transparency values, giving it a choppy look.
The same issue occurs when drawing on the UI using BlackBerry fields or the graphics.drawBitmap method. Does anyone want to share hints for getting great-looking transparency effects on the BlackBerry?
Dither your images or pre-composite them. When loading an image with transparency on a BlackBerry, you get at most 4 bits of alpha data, alongside 4 bits each for red, green and blue (RGB4444). If you don't dither, each 8-bit alpha value is simply mapped to the nearest 4-bit value, which is what produces the choppiness; so if you want to keep your transparent images, dither them down to RGB4444.
If you include no alpha data at all (i.e., pre-composite the shadows onto the background), you can use RGB565, which has better image quality overall, but you will have to live with static positioning for your drop shadows.
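The 8-to-4-bit alpha mapping described above can be sketched in plain arithmetic (no BlackBerry API involved); this is what turns a smooth 256-level alpha ramp into 16 visible bands:

```swift
// Sketch: quantize 8-bit alpha to the nearest of the 16 levels RGB4444 can
// store, then expand back to 8 bits. A smooth 0...255 ramp collapses to
// 16 flat steps, which is the choppiness described above.
func quantizeAlphaTo4Bits(_ alpha: Int) -> Int {
    let level = (alpha * 15 + 127) / 255   // nearest 4-bit level, 0...15
    return level * 255 / 15                // back to the 8-bit range
}

// e.g. alphas 196 through 212 all map to the same output value, so a
// gradient across that range renders as one flat band.
```

Dithering avoids the flat bands by distributing the per-pixel quantisation error over neighbouring pixels instead of discarding it.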