How to create an image with black and white pixels? - ios

I'm trying to create a black-and-white image that, on the one hand, can be displayed in a UIImageView and, on the other, can be shared via e-mail or saved to the camera roll.
I've got a two-dimensional array (an NSArray containing other NSArrays) that holds a matrix of the NSInteger values 0 and 1 (for a white and a black pixel, respectively).
So I just want to place a black pixel for a 1 and a white one for a 0.
All other questions I found deal with changing pixels from an existing image.
I hope you can help me!
[EDIT]:
Of course, I didn't want someone to do my work for me. I couldn't figure out how to create a new image and place the pixels as I wanted, so I tried to edit an existing image that consists only of white pixels and change the color of a pixel where necessary. Below is my code for that attempt; as you can see, I had no idea how to change the pixel. I hope it shows that I was trying on my own.
- (UIImage *)createQRCodeWithText:(NSString *)text andErrorCorrectionLevel:(NSInteger)level {
    QRGenerator *qr = [[QRGenerator alloc] init];
    NSArray *matrix = [qr createQRCodeMatrixWithText:text andCorrectionLevel:level];
    UIImage *image_base = [UIImage imageNamed:@"qr_base.png"];
    CGImageRef imageRef = image_base.CGImage;
    for (int row = 0; row < [matrix count]; row++) {
        for (int column = 0; column < [[matrix objectAtIndex:row] count]; column++) {
            if ([[[matrix objectAtIndex:row] objectAtIndex:column] integerValue] == 1) {
                //set pixel (column, row) black
            }
        }
    }
    return [[UIImage alloc] initWithCGImage:imageRef];
}

I would create the image from scratch using a CGBitmapContext. You can call CGBitmapContextCreate() to allocate the memory for the image, then walk through the pixels just as you're doing now, setting them from your array. When you've finished, call CGBitmapContextCreateImage() to make a CGImageRef out of it. If you need a UIImage, call +[UIImage imageWithCGImage:] and pass it the CGImageRef you created above.
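The steps above can be sketched roughly as follows (the method name is my own; the matrix of 0/1 values comes from the question). It renders into an 8-bit grayscale bitmap context, one byte per pixel:

```objc
// Sketch only: build a grayscale UIImage from a square 0/1 matrix.
- (UIImage *)imageFromMatrix:(NSArray *)matrix {
    size_t side = [matrix count];
    CGColorSpaceRef gray = CGColorSpaceCreateDeviceGray();
    CGContextRef ctx = CGBitmapContextCreate(NULL, side, side, 8, side,
                                             gray, (CGBitmapInfo)kCGImageAlphaNone);
    uint8_t *pixels = CGBitmapContextGetData(ctx);
    for (size_t row = 0; row < side; row++) {
        NSArray *line = [matrix objectAtIndex:row];
        for (size_t col = 0; col < side; col++) {
            // 1 -> black (0x00), 0 -> white (0xFF); buffer row 0 is the top row.
            pixels[row * side + col] =
                ([[line objectAtIndex:col] integerValue] == 1) ? 0x00 : 0xFF;
        }
    }
    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(ctx);
    CGColorSpaceRelease(gray);
    return result;
}
```

For a QR code you would probably also want to scale the result up without smoothing when displaying it, so the modules stay sharp.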

Related

How to make images in WKInterfacePicker Transparent?

I have a WKInterfacePicker which is set to Sequence mode. When I add images to it at runtime and display it on the Apple Watch, the images seem to have a black background instead of their transparent pixels. The images do have transparent pixels in them. Here is the code I used in the willActivate method:
NSMutableArray *pItems = [[NSMutableArray alloc] initWithCapacity:31];
for (int i = 0; i < 30; ++i) {
    WKPickerItem *item = [[WKPickerItem alloc] init];
    WKImage *img = nil;
    if (i < 10)
        img = [WKImage imageWithImageName:[NSString stringWithFormat:@"safeDial_0%d", i]];
    else
        img = [WKImage imageWithImageName:[NSString stringWithFormat:@"safeDial_%d", i]];
    [item setContentImage:img];
    [pItems addObject:item];
}
[self.pickerSafeLock setItems:[pItems copy]];
[self.pickerSafeLock setSelectedItemIndex:0];
[self.pickerSafeLock focus];
Note: pickerSafeLock is the WKInterfacePicker.
Also, when I use the same images in a WKImage, I see that the transparent pixels work fine.
Is there a way to make the Images in WKInterfacePicker transparent?
After trying multiple approaches (including @DaGaMs's solution), I found a somewhat tricky workaround. I used a WKInterfacePicker together with an associated WKInterfaceImage. Here is a way to overcome the issue:
In the storyboard of your WatchKit app, add a WKInterfaceImage, a WKInterfaceGroup, and a WKInterfacePicker inside a WKInterfaceGroup.
(Storyboard screenshot omitted.)
This is just to hide the picker offscreen, so that we can fake the animation from the picker to the WKInterfaceImage.
Once you've done this, in the IBAction of the WKInterfacePicker set the corresponding image like this:
- (IBAction)pickerAction:(NSInteger)value {
    [self.yourWKInterfaceImage setImage:imgPickerList[value]];
}
Note: imgPickerList contains all the images you want to animate (the same ones as the picker's).

How to display a random image in a UIImageView each time it resets

I'm currently making a game. I have a UIImageView that scrolls from right to left; once it leaves the view controller, the image resets on the right and scrolls left again. It currently displays "image1". I would like it to display a different image, chosen randomly from a set of 5, each time it resets to the right.
Here is my Method:
- (void)PlacePipe {
    RandomTopPipePosition = arc4random() % 350;
    RandomTopPipePosition = RandomTopPipePosition - 228;
    RandomBottomPipePosition = RandomTopPipePosition + 660;
    PipeTop.center = CGPointMake(340 - 10, RandomTopPipePosition);
    randomImagebottom.center = CGPointMake(340 - 10, RandomBottomPipePosition);
}
The names of the images I want to randomly show in this UIImageView are toppipestyleone.png, toppipestyletwo.png, toppipestylethree.png, toppipestylefour.png, and toppipestylefive.png.
I'm not sure of the best route to doing this. I looked at using an array, but I'm not sure how to set one up or how to pick images from it randomly.
You could put the image names in an array as you considered, e.g.:
NSArray *imageNameArray = [[NSArray alloc] initWithObjects:@"toppipestyleone.png", @"toppipestyletwo.png", @"toppipestylethree.png", @"toppipestylefour.png", @"toppipestylefive.png", nil];
then create and set an image using the image name at a random index of that array:
imageView.image = [UIImage imageNamed:[imageNameArray objectAtIndex:arc4random_uniform((uint32_t)[imageNameArray count])]];

Creating a single image by combining multiple images in iOS

I am using a UIImageView and I have to set more than one image as its background.
All the images have transparent backgrounds, and each contains a single symbol at one of its corners. Which images are saved depends on certain conditions, and there may be more than one image.
Currently, when I set the images, I can see only the last one. I want all the images to be displayed together.
Please let me know if there is a way to combine multiple images into a single image.
Any help will be appreciated.
Thanks in advance.
You can draw the images with blend modes. For example, if you have a UIImage, you can call drawAtPoint:blendMode:alpha:. You'd probably want to use kCGBlendModeNormal as the blend mode in most cases.
I created a function that takes an array of images and returns a single image. My code is below:
-(UIImage *)blendImages:(NSMutableArray *)array {
    UIImage *img = [array objectAtIndex:0];
    CGSize size = img.size;
    UIGraphicsBeginImageContext(size);
    for (int i = 0; i < array.count; i++) {
        UIImage *uiimage = [array objectAtIndex:i];
        [uiimage drawAtPoint:CGPointZero blendMode:kCGBlendModeNormal alpha:1.0];
    }
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
Hope this will help others too.
You should composite your images into one -- especially because they have alpha channels.
To do this, you could:
use UIGraphicsBeginImageContextWithOptions to create the image at the destination size (set the scale now, rather than when drawing to the screen, and choose the appropriate opacity),
render your images into the context using CGContextDrawImage,
then call UIGraphicsGetImageFromCurrentImageContext to get the result as a UIImage, which you set as the image of the image view.
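A minimal sketch of those steps, using drawInRect: instead of raw CGContextDrawImage (which would require flipping the coordinate system); the method name and the assumption that all images share the destination size are mine:

```objc
// Sketch: composite an array of UIImages into one, preserving transparency.
- (UIImage *)compositeImages:(NSArray *)images size:(CGSize)size {
    // opaque = NO keeps the alpha channel; scale 0.0 uses the screen scale.
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    for (UIImage *image in images) {
        [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
    }
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```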
You can use:
typedef enum _imageType {
    image1,
    image2,
    ...
    imageN
} imageType;
and declare
imageType imgType;
in the @interface in your .h file. Then, in the .m file:
-(void)setImageType:(imageType)type {
    imgType = type;
}
You can then use setImageType: to set whichever image you want.

How to make a saturation value slider with OpenCV in Xcode

I want to make a slider that can change the saturation of the image in an image view.
I'm currently using OpenCV. I found some code on the web and tried it. It works, but in a slightly strange way: there is a white cup in the image, but its color cycles through the rainbow regardless of the slider value (unless the value makes it totally grayscale).
- (IBAction)stSlider:(id)sender {
    float value = stSlider.value;
    UIImage *image = [UIImage imageNamed:@"sushi.jpg"];
    cv::Mat mat = [self cvMatFromUIImage:image];
    cv::cvtColor(mat, mat, CV_RGB2HSV);
    for (int i = 0; i < mat.rows; i++) {
        for (int j = 0; j < mat.cols; j++) {
            int idx = 1;
            mat.at<cv::Vec3b>(i,j)[idx] = value;
        }
    }
    cv::cvtColor(mat, mat, CV_HSV2RGB);
    imageView.image = [self UIImageFromCVMat:mat];
}
This is the code I used.
Please tell me which part I have to change to make it work right.
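One likely culprit, offered as a hedged guess rather than a confirmed fix: assigning a fixed value to the saturation channel forces even near-gray pixels to full saturation, so their essentially arbitrary hues show up as rainbow noise. Scaling the existing saturation instead preserves grays. A sketch of the inner loop, assuming cvMatFromUIImage yields a 3-channel 8-bit RGB Mat and the slider runs from 0 to 2:

```cpp
cv::cvtColor(mat, mat, CV_RGB2HSV);
for (int i = 0; i < mat.rows; i++) {
    for (int j = 0; j < mat.cols; j++) {
        // Multiply the existing saturation (channel 1) rather than overwrite it,
        // clamping to the uchar range, so gray pixels stay gray.
        mat.at<cv::Vec3b>(i, j)[1] =
            cv::saturate_cast<uchar>(mat.at<cv::Vec3b>(i, j)[1] * value);
    }
}
cv::cvtColor(mat, mat, CV_HSV2RGB);
```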

Implement Blur over parts of view

How can I implement the image below programmatically, so that the digits can change at runtime or even be replaced with a movie?
Just add a blurred UIView on top of your thing.
For example, make a UIImage of your desired view's size, blur it using a CIFilter, and then add it to your view. That should achieve the desired effect.
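A sketch of the CIFilter approach (the method name and radius parameter are my own):

```objc
// Sketch: blur a UIImage with Core Image's CIGaussianBlur filter.
- (UIImage *)blurredImage:(UIImage *)image radius:(CGFloat)radius {
    CIImage *input = [CIImage imageWithCGImage:image.CGImage];
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:input forKey:kCIInputImageKey];
    [blur setValue:@(radius) forKey:kCIInputRadiusKey];
    // Crop back to the original extent; the blur spreads past the edges.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:blur.outputImage
                                       fromRect:input.extent];
    UIImage *result = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return result;
}
```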
This question comes up often and has been answered with quite a few methods. Anyway, I would propose one more:
Get the image from UIView
+ (UIImage *)imageFromLayer:(CALayer *)layer {
    UIGraphicsBeginImageContext([layer frame].size);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return outputImage;
}
or rather, play around with this a bit to capture just the desired part of the view as the image. Now create a new view and add image views to it (with the image you got from the layer). Then move the centers of the image views to approximate a Gaussian blur, take the image from this layer again, and place it back on the original view.
Moving the centers should be controlled by a radius fragment (I'd start with 0.5f) and a resample count.
for (int i = 1; i < resampleCount; i++) {
    view1.center = CGPointMake(view1.center.x + radiusFragment*i, view1.center.y);
    view2.center = CGPointMake(view2.center.x - radiusFragment*i, view2.center.y);
    view3.center = CGPointMake(view3.center.x, view3.center.y + radiusFragment*i);
    view4.center = CGPointMake(view4.center.x, view4.center.y - radiusFragment*i);
    //add the subviews
}
//get the image from view
//get the image from view
All the subviews need their alpha set to 1.0f/(resampleCount*4).
This method might not be the fastest, but it is extremely easy to implement, and if you tune the radius fragment and resample count down to a minimum, it should do pretty well.
Use a UIView with a white background and decrease the alpha property:
blurView.backgroundColor = [UIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:0.3];
