I have a folder with about 30 image icons. I am trying to allow the user to select one of these 30 "local" images as their profile picture. I am looking for the best way to do this; however, most tutorials are about accessing the camera roll and letting the user upload their own photos.
I am looking for an approach, maybe a UICollectionView, that lets them select an image to be the user's icon. I understand how to pull in images from the iPhone itself, but the server I am using is currently not set up to support that process.
What is the best way to use images that are bundled within the app and allow one of them to be placed into an image view?
The UICollectionView is the way to go.
You'll need to first load all the local avatar filenames. The following example loads all images in the app bundle whose names start with avatar-, ignoring the retina @2x.png variants.
func getAvatarFilenames() -> [String] {
    var avatarFileNames = [String]()
    let paths = Bundle.main.paths(forResourcesOfType: "png", inDirectory: nil)
    for path in paths {
        let imageName = (path as NSString).lastPathComponent
        // Ignore retina images; when UIImage loads them back out
        // it will pick the @2x version automatically if required.
        if imageName.hasSuffix("@2x.png") {
            continue
        }
        // Only add images that are prefixed with "avatar-"
        if imageName.hasPrefix("avatar-") {
            avatarFileNames.append(imageName)
        }
    }
    return avatarFileNames
}
You can then create a UICollectionView which loads each of the avatar filenames. Something like this configures each cell (assuming your AvatarCell has an image view with tag 1000 - or, better yet, use a UICollectionViewCell subclass).
let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "AvatarCell", for: indexPath)
let avatarImageView = cell.viewWithTag(1000) as! UIImageView
avatarImageView.image = UIImage(named: avatarFileNames[indexPath.row])
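To put the chosen avatar into the user's profile image view, selection can then be handled along these lines (a sketch; profileImageView and the "selectedAvatarName" key are hypothetical names standing in for your own outlet and persistence):

func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    // Show the tapped avatar in the profile image view.
    let name = avatarFileNames[indexPath.row]
    profileImageView.image = UIImage(named: name)
    // Remember the choice however your app prefers, e.g. UserDefaults.
    UserDefaults.standard.set(name, forKey: "selectedAvatarName")
}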
I'm trying to upload a post to Firebase which contains: one thumbnail image, and an unlimited number of subposts (the content behind the thumbnail). I have 3 steps for posting the pictures:
1: handle the image upload task.
2: create a "bulkUpload" function that creates image paths for each subpost.
3: call "bulkUpload" (which also calls the image upload task).
The structure looks something like this:
(postid) {
    author: (author)
    likes: (number)
    pathToImage: (path)
    postId: (id)
    subposts {
        (id): (path)
        (id): (path)
        ...etc
    }
    userId: (userid)
}
Simple, and it should just work. But not quite: there is a strange problem with the subposts.
When I create the post, everything works except for the subposts. The first time I post a set of images, the subposts don't show. Without adding or removing any subposts, I tried posting a second time; this time the subposts do show, but at double the number I selected in the picker and double what is shown in the image views.
I will link the code from Pastebin since it's a little lengthy (and Stack Overflow doesn't like a lot of code), but I hope I can get this working.
https://pastebin.com/rpLZT6nm
@Matt is right: in your case you should update the data model that backs the collection view and then call reloadData() or reloadItems(at:):
self.updateModel()
self.collectionView.reloadData()
// or reload only the affected items:
self.collectionView.reloadItems(at: [IndexPath(item: 0, section: 0)])
But if you need direct access to the visible cells, you can use another workflow:
let image: UIImage? = nil   // the new image for this item
let path = IndexPath(item: 0, section: 0)
let visiblePaths = self.collectionView.indexPathsForVisibleItems
if visiblePaths.contains(path) {
    if let cell = self.collectionView.cellForItem(at: path) as? UploadSubPostCell {
        cell.previewStep.image = image
    }
}
You can use this workflow to update only the visible cells, because invisible cells will be reconfigured when they are reused, before they come back on screen.
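For example, if the image upload finishes asynchronously, the update-then-reload pattern could look roughly like this (a sketch only: uploadSubpostImage and subpostPaths are hypothetical stand-ins for your own upload helper and data model):

uploadSubpostImage(image) { [weak self] downloadURLString in
    guard let self = self else { return }
    // 1. Update the model that backs the collection view first.
    self.subpostPaths.append(downloadURLString)
    // 2. Then reload on the main queue, since the upload completion
    //    may be delivered on a background thread.
    DispatchQueue.main.async {
        self.collectionView.reloadData()
    }
}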
I'm working on an instant messaging app. The user can send text messages, for which I'm using a UITextView inside the custom table view cells. The user should also be able to send all kinds of multimedia data, such as images, documents, and videos.
Inside my table view cell I have both a text view and a UIImageView in a stack view, and I can send the respective kinds of data. If there is no text, I hide the text view. If there is no image, I hide the image view.
The problem is: to scale the app, I'd have to add a new container for documents, another one for videos, another one for audio files, and hide all the containers that have no data in them. It's a garbage solution, but it was the only one that I found.
Knowing from the backend what kind of data is sent, how could I programmatically add a container view that I set up on the spot? I was thinking of having a text view as the default plus an empty view, and inside cellForRowAt just adding the needed elements.
I would try to create several cell types which you dequeue in cellForRowAt by checking a criterion, e.g.:
if userPost.images != nil {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Identifier1", for: indexPath)
    return cell
} else {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Identifier2", for: indexPath)
    return cell
}
With this solution, you can create a cell that nicely fits the data one user sent to another. Just check for whatever criteria you want and return the corresponding cell.
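If you later support more media types, the same idea extends to a switch over a message-type value. The MessageKind enum, the messages array, and the cell identifiers below are hypothetical and only sketch the pattern (each identifier must be registered with the table view or exist as a prototype cell):

enum MessageKind {
    case text, image, video, document
}

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let message = messages[indexPath.row]   // hypothetical model array
    switch message.kind {
    case .text:
        return tableView.dequeueReusableCell(withIdentifier: "TextCell", for: indexPath)
    case .image:
        return tableView.dequeueReusableCell(withIdentifier: "ImageCell", for: indexPath)
    case .video:
        return tableView.dequeueReusableCell(withIdentifier: "VideoCell", for: indexPath)
    case .document:
        return tableView.dequeueReusableCell(withIdentifier: "DocumentCell", for: indexPath)
    }
}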
Hope it helped!
My app loads pictures into a UICollectionView from the iPhone gallery, Facebook, and Instagram. Everything works fine, but I have a small problem and I'm stuck. The user can select some of the images (from the gallery, FB, or IG) and load them into another ViewController. With Facebook and Instagram this is simple: when the user selects an image I add its URL to an array, and when they deselect it I simply remove that URL from the array. The problem appears with the iPhone gallery. The user can select images, but cannot deselect them (I just can't remove a deselected image from the array).
This is the code for ViewController which is responsible for selecting and "deselecting" images from gallery.
func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    _selectedCells.add(indexPath)
    collectionView.reloadItems(at: [indexPath])
    CounterManager.addPhotos()
    // These lines should add selected images to the array
    let asset = fetchResult.object(at: indexPath.row)
    let image = getUIImage(asset: asset)
    tempUIImageArray.append(image!)
    CounterManager.updateText(label: numberOfPickslabel)
    GalleryManager.selectedGalleryImages.append(image!)
    if CounterManager.flag && CounterManager.counter >= 20 {
        present(CounterManager.alert(), animated: true, completion: nil)
    } else if CounterManager.flag == false && CounterManager.counter >= 40 {
        present(CounterManager.alert(), animated: true, completion: nil)
    }
}
func collectionView(_ collectionView: UICollectionView, didDeselectItemAt indexPath: IndexPath) {
    collectionView.allowsSelection = true
    _selectedCells.remove(indexPath)
    collectionView.reloadItems(at: [indexPath])
    collectionView.deselectItem(at: indexPath, animated: true)
    CounterManager.subPhotos()
    CounterManager.updateText(label: numberOfPickslabel)
    // These lines should remove images from the array
    let asset = fetchResult.object(at: indexPath.row)
    let image = getUIImage(asset: asset)
    GalleryManager.selectedGalleryImages = GalleryManager.selectedGalleryImages.filter({ $0 != image })
}
GalleryManager.selectedGalleryImages is the global array for all the images (from the gallery, FB, and IG).
getUIImage is the method that converts a PHAsset to a UIImage.
It seems that selecting and deselecting the same image keeps changing some data. Below is a picture where I am selecting and deselecting the same image. I do not understand why it's happening.
I will be very grateful for all kind of help.
Thanks M.
You're getting an actual UIImage out of the gallery and storing it in an array, not a unique reference to the image location. I suspect this has something to do with the way GalleryManager loads images (you're comparing UIImage instances with !=, which compares object identity rather than image data, so it doesn't work here). But without seeing the code, I can't be sure.
I'd suggest using ALAssetsLibrary instead to get a reference URL to the image, and storing that in your array - this way you can guarantee that the image ref is unique, and you can easily remove it from your array.
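Since you already have the PHAsset from fetchResult, a related option is to store the asset's localIdentifier, which is guaranteed unique, and compare those strings instead of images. A rough sketch (the property name is hypothetical; it assumes the fetchResult: PHFetchResult<PHAsset> from the question):

var selectedAssetIdentifiers = [String]()

func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    // Store the unique identifier instead of the decoded image data.
    let asset = fetchResult.object(at: indexPath.row)
    selectedAssetIdentifiers.append(asset.localIdentifier)
}

func collectionView(_ collectionView: UICollectionView, didDeselectItemAt indexPath: IndexPath) {
    // Removing by identifier is a simple string comparison.
    let asset = fetchResult.object(at: indexPath.row)
    selectedAssetIdentifiers.removeAll { $0 == asset.localIdentifier }
}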
Some side notes:
Storing binary data in an array is going to eat memory like crazy. It is not efficient, and you're going to run into problems (especially on older devices).
This code is likely to crash whenever getUIImage(asset:) returns nil, because of the force unwrap:
let image = getUIImage(asset: asset)
tempUIImageArray.append(image!)
This is a good candidate for guard let or if let:
guard let image = getUIImage(asset: asset) else { return }
tempUIImageArray.append(image)
Your problem is in comparing one image with another to filter out and remove the deselected one here:
GalleryManager.selectedGalleryImages = GalleryManager.selectedGalleryImages.filter({ $0 != image })
This isn't valid, because UIImage comparison doesn't work that way; you should associate each image with an id or index and select or deselect according to that.
The problem is that you compare two different UIImage instances; even though they are created from the same data, they won't be evaluated as the same.
Instead of keeping the selected images in a separate array (GalleryManager.selectedGalleryImages), keep just the original array of images, and to detect which are selected, create an array of the selected images' indexes (e.g., GalleryManager.selectedGalleryImagesIndexes) into the original array. That way you can access the selected images by indexing into the original array, and you won't have the problem of comparing two UIImages, because you will work with Ints.
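A minimal sketch of that index-based approach (the property and array names are hypothetical; a Set avoids duplicates):

var selectedImageIndexes = Set<Int>()

func collectionView(_ collectionView: UICollectionView, didSelectItemAt indexPath: IndexPath) {
    selectedImageIndexes.insert(indexPath.row)
}

func collectionView(_ collectionView: UICollectionView, didDeselectItemAt indexPath: IndexPath) {
    selectedImageIndexes.remove(indexPath.row)
}

// When you actually need the images, map back into the original array:
let selectedImages = selectedImageIndexes.map { allImages[$0] }   // allImages is the original array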
UPDATE
Based on the UIImage documentation:
Comparing Images
The isEqual(_:) method is the only reliable way to determine whether two images contain the same image data. The image objects you create may be different from each other, even when you initialize them with the same cached image data. The only way to determine their equality is to use the isEqual(_:) method, which compares the actual image data. Listing 1 illustrates the correct and incorrect ways to compare images.
Listing 1
Comparing two images
let image1 = UIImage(named: "MyImage")
let image2 = UIImage(named: "MyImage")
if image1 != nil && image1!.isEqual(image2) {
    // Correct. This technique compares the image data correctly.
}
if image1 == image2 {
    // Incorrect! Direct object comparisons may not work.
}
Therefore your problem might also be solvable by replacing the == and != comparisons with isEqual(_:).
Change:
GalleryManager.selectedGalleryImages = GalleryManager.selectedGalleryImages.filter({$0 != image})
To:
GalleryManager.selectedGalleryImages = GalleryManager.selectedGalleryImages.filter({ !($0.isEqual(image)) })
I am using a collectionView to display both images/videos.
I am retrieving the content from Firebase.
With the images, I can scroll. I added a scroll view and manually added 20 imageViews, as Firebase does not allow bulk retrieval of images/content. I then check if a value exists and, if so, display the content at a certain index (of images 1-20).
I did seem to notice that when I define a certain number in this piece of code:
func numberOfSections(in collectionView: UICollectionView) -> Int {
    return 1
}
I can scroll the videos. However, since I cannot retrieve multiple images, I resorted to the more inconvenient manner of adding 20 imageViews. To display videos, I am doing the following:
if let firstVidURLString = post?.firstVideoURL, let firstVideoURL = URL(string: firstVidURLString) {
    self.volumeView.isHidden = false
    videoPlayer = AVPlayer(url: firstVideoURL)
    videoPlayerLayer = AVPlayerLayer(player: videoPlayer)
    videoPlayerLayer?.frame = self.firstImageView.frame
    videoPlayerLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
    layoutIfNeeded()
    videoPlayerLayer?.zPosition = 0
    self.contentView.layer.addSublayer(videoPlayerLayer!)
    self.volumeView.layer.zPosition = 1
    videoPlayer?.play()
    videoPlayer?.isMuted = isMuted
}
I am happy with how it is placed; I have defined the frame/bounds in relation to the first image view I have. Surely that should help me scroll, as it takes up the same size as one image view? Alas, the video player/video layer is just static and will not scroll with the scroll view. Any ideas on how I can tackle this?
Thank you.
Figured it out. This is for anyone else, in the future.
Connect your scrollView, if you have one, to the corresponding .swift file i.e. ViewController, CollectionViewCell etc.
Once you do that, simply add this line (substituting your own scroll view outlet and player layer names):
self.yourScrollView.layer.addSublayer(yourVideoPlayerLayer!)
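In the context of the question's cell code, that means adding the layer to the scroll view's layer instead of the content view's (a sketch, assuming the connected outlet is called scrollView):

// Instead of: self.contentView.layer.addSublayer(videoPlayerLayer!)
scrollView.layer.addSublayer(videoPlayerLayer!)   // the layer now scrolls with the content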
And you'll be fine!
I am building an application which uses a Core Data database to store its data on products. These products are displayed in a UICollectionView. Each cell in this collection view displays basic information on the product it contains, including an image.
Although the cells are relatively small, the original images they display are preferably quite large, as they should also be able to be displayed in a larger image view. The images are loaded directly from Core Data in my cellForItemAt method:
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath) as! CollectionWineCell
    var current: Product!
    // product details are set on labels
    // image is set
    if current.value(forKey: "image") != nil {
        let image = current.value(forKey: "image") as! WineImage
        let loadedImage = UIImage(data: image.image)
        cell.imageview.image = loadedImage
    } else {
        cell.imageview.image = UIImage(named: "ProductPlaceholder.png")
    }
    return cell
}
When the collection of products grows, scrolling gets bumpier and a lot of frames are dropped. This makes sense to me, but so far I haven't found a suitable solution. When looking online a lot of documentation and frameworks are available for asynchronous image loading from a URL (either online or a file path), but doing this from Core Data does not seem very common.
I have already tried doing it using an asynchronous fetch request:
let fetchRequest = NSFetchRequest<NSFetchRequestResult>(entityName: "ProductImage")
fetchRequest.predicate = NSPredicate(format: "product = %@", current)
let asyncRequest = NSAsynchronousFetchRequest(fetchRequest: fetchRequest) { results in
    if let results = results.finalResult {
        let result = results[0] as! ProductImage
        let loadedImage = UIImage(data: result.image)
        DispatchQueue.main.async(execute: {
            cell.wineImage.image = loadedImage
        })
    }
}
_ = try! managedObjectContext.execute(asyncRequest)
However, this approach does not seem to smooth things out either.
QUESTION
When displaying large sets of data, including images, from Core Data, how does one load images in a way that it does not cause lags and frame drops in a UICollectionView?
If the images can be, as you say, quite large, a better approach is not to save them in Core Data but to put them in files. Store the filename in Core Data and use that to look up the file.
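A minimal sketch of that idea, writing the data to the Documents directory and keeping only the filename string in a Core Data attribute (the helper names saveImageData/loadImage are hypothetical):

func saveImageData(_ data: Data, named filename: String) throws -> String {
    // Write the raw image data to a file in Documents.
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    try data.write(to: documents.appendingPathComponent(filename))
    return filename   // store this string on the managed object instead of the blob
}

func loadImage(named filename: String) -> UIImage? {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    return UIImage(contentsOfFile: documents.appendingPathComponent(filename).path)
}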
But that's not the immediate problem. Even with that you'll get slowdowns from spending time opening and decoding image data. A better approach is, basically, don't do that. In your collection views the images are probably displayed much smaller than their full size. Instead of using the full size image, generate a thumbnail at a more appropriate size and use that in the collection view. Do the thumbnail generation whenever you first get the image, whether from a download or from the user's photo library or wherever. Keep the thumbnail for use in the collection view. Only use the full size image when you really need it.
There are many examples online of how to scale images, so I won't include that here.