Combine: publish elements of a sequence with some delay - ios

I'm new to Combine and I'd like to achieve a seemingly simple thing. Let's say I have a collection of integers, such as:
let myCollection = [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
I'd like to publish each element with a delay of, for example, 0.5 seconds.
print 0
wait for 0.5secs
print 1
wait for 0.5secs
and so forth
I can easily get the sequence publisher and print the elements like this:
let publisherCanc = myCollection.publisher.sink { value in
    print(value)
}
But in this case all the values are printed immediately. How can I print the values with a delay? Combine has a .delay operator, but it's not what I need (.delay delays the entire stream rather than the individual elements). If I try:
let publisherCanc = myCollection.publisher.delay(for: .seconds(0.5), scheduler: RunLoop.main).sink { value in
    print(value)
}
All I get is an "initial" delay; after that, the elements are printed immediately.
Thanks for your help.

Using the idea from the answer linked by Alexander in the comments, you can create a publisher that emits a value every 0.5 seconds using Timer.publish(every:on:in:), then zip it together with your Array.publisher so that the downstream publisher emits a value every time both of your publishers have emitted a new value.
Publishers.Zip takes the n-th element of each of its upstream publishers and only emits once both of its upstreams have emitted n values - hence by zipping together a publisher that only emits its values at 0.5-second intervals with your original publisher that emits all of its values immediately, you delay each value by 0.5 seconds.
let delayPublisher = Timer.publish(every: 0.5, on: .main, in: .default).autoconnect()
let delayedValuesPublisher = Publishers.Zip(myCollection.publisher, delayPublisher)
let subscription = delayedValuesPublisher.sink { print($0.0) }
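For reference, the same pipeline can also be written with the zip operator method instead of the Publishers.Zip initializer (a sketch; the behavior is identical):
let subscription = myCollection.publisher
    .zip(Timer.publish(every: 0.5, on: .main, in: .default).autoconnect())
    .sink { pair in print(pair.0) } // pair.1 is the timer's Date, which we ignore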

Try using the flatMap(maxPublishers:) and delay(for:scheduler:) operators.
import Foundation
import Combine
var tokens: Set<AnyCancellable> = []
let valuesToPublish = [1, 2, 3, 4, 5, 6, 7, 8, 9]
valuesToPublish.publisher
    .flatMap(maxPublishers: .max(1)) { Just($0).delay(for: 1, scheduler: RunLoop.main) }
    .sink { completion in
        print("--- completion \(completion) ---")
    } receiveValue: { value in
        print("--- value \(value) ---")
    }
    .store(in: &tokens)
By setting the maxPublishers parameter you specify the maximum number of concurrent publisher subscriptions (see Apple's documentation).
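For comparison, here is a sketch (my variation on the snippet above, not part of the original answer) that keeps the default unlimited demand: every inner publisher is subscribed immediately, so all values arrive together after roughly one second instead of one per second.
valuesToPublish.publisher
    .flatMap { Just($0).delay(for: 1, scheduler: RunLoop.main) } // default maxPublishers: .unlimited
    .sink { completion in
        print("--- completion \(completion) ---")
    } receiveValue: { value in
        print("--- value \(value) ---") // all values print at about the same time
    }
    .store(in: &tokens)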

Based on the examples provided in other answers, I came up with a solution with generics:
import Combine
import SwiftUI
struct TimedSequence<T> {
    typealias TimedJointPublisher = Publishers.Zip<Publishers.Sequence<[T], Never>, Publishers.Autoconnect<Timer.TimerPublisher>>
    var sink: AnyCancellable?
    init(array: [T], interval: TimeInterval, closure: @escaping (T) -> Void) {
        let delayPublisher = Timer.publish(every: interval, on: .main, in: .default).autoconnect()
        let timedJointPublisher = Publishers.Zip(array.publisher, delayPublisher)
        self.sink = timedJointPublisher.sink(receiveValue: { r in
            closure(r.0)
        })
    }
}
Usage:
1 - Basic types:
let m = TimedSequence(array: [1, 2, 3], interval: 2, closure: { element in
    let textReceived = String(element) // assigns 1 ...2 seconds... then 2 ...2 seconds... then 3 to textReceived
})

let n = TimedSequence(array: ["Hello", "World"], interval: 2, closure: { element in
    let textReceived = element.uppercased() // assigns HELLO ...2 seconds... then WORLD to textReceived
})
2 - Custom types:
class MyClass {
    var desc: String
    init(desc: String) {
        self.desc = desc
    }
}

let m = TimedSequence(array: [MyClass(desc: "t"), MyClass(desc: "s")], interval: 2, closure: { str in
    let textReceived = str.desc.uppercased()
})
Same as 1, except that str.desc uppercased (T and S respectively) gets assigned to textReceived at 2-second intervals.

Related

Mix of distinctUntilChanged() and throttle(_ dueTime: RxTimeInterval, latest: Bool = true, scheduler: SchedulerType) methods

Is it possible to build a mechanism in RxSwift that works like a mix of
func distinctUntilChanged() and func throttle(_ dueTime: RxTimeInterval, latest: Bool = true, scheduler: SchedulerType)?
Let's assume there are many signals being emitted to the subscription, like:
A, B, C, D, B, D, A, X, C, Z ...
and let's assume I want to pass a signal only once even if it occurs multiple times within a specified timespan. In practice this means that if I specify the timespan as 5s and a signal with value A occurs 3 times during that timespan, I pass the signal only once; after the timespan has passed, the mechanism resets and continues working in the same manner. The timespan starts when the first occurrence of signal A takes place. Meanwhile, all other signals (D, C, ...) are handled in the same way.
Let me know if this is feasible.
Likely something very much like this will do what you want:
func example<T>(scheduler: SchedulerType, dueTime: RxTimeInterval, source: Observable<T>) -> Observable<T> where T: Hashable {
    source.groupBy(keySelector: { $0 })
        .flatMap { $0.throttle(dueTime, latest: false, scheduler: scheduler) }
}
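A hypothetical usage sketch (assuming `events` is an Observable<String> and a DisposeBag is available); repeats of the same value within the 5-second window are passed through only once:
let disposeBag = DisposeBag()

example(scheduler: MainScheduler.instance, dueTime: .seconds(5), source: events)
    .subscribe(onNext: { value in print(value) })
    .disposed(by: disposeBag)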
Here are some tests and their results:
final class ExampleTests: XCTestCase {
// at three seconds, everything gets through
func testAt3() {
let scheduler = TestScheduler(initialClock: 0)
let source = scheduler.createObservable(timeline: "A-B-C-D-B-D-A-X-C-Z")
let expected = parseEventsAndTimes(timeline: "A-B-C-D-B-D-A-X-C-Z", values: { String($0) })
.offsetTime(by: 200)[0]
let result = scheduler.start {
example(scheduler: scheduler, dueTime: .seconds(3), source: source)
}
XCTAssertEqual(result.events, expected)
}
// at five seconds, the extra D is stopped
func testAt5() {
let scheduler = TestScheduler(initialClock: 0)
let source = scheduler.createObservable(timeline: "A-B-C-D-B-D-A-X-C-Z")
let expected = parseEventsAndTimes(timeline: "A-B-C-D-B---A-X-C-Z", values: { String($0) })
.offsetTime(by: 200)[0]
let result = scheduler.start {
example(scheduler: scheduler, dueTime: .seconds(5), source: source)
}
XCTAssertEqual(result.events, expected)
}
// at seven seconds, the extra B and D are both stopped.
func testAt7() {
let scheduler = TestScheduler(initialClock: 0)
let source = scheduler.createObservable(timeline: "A-B-C-D-B-D-A-X-C-Z")
let expected = parseEventsAndTimes(timeline: "A-B-C-D-----A-X-C-Z", values: { String($0) })
.offsetTime(by: 200)[0]
let result = scheduler.start {
example(scheduler: scheduler, dueTime: .seconds(7), source: source)
}
XCTAssertEqual(result.events, expected)
}
}

Swift Combine - combining publishers without waiting for all publishers to emit first element

I'm combining two publishers:
let timer = Timer.publish(every: 10, on: .current, in: .common).autoconnect()
let anotherPub: AnyPublisher<Int, Never> = ...
Publishers.CombineLatest(timer, anotherPub)
    .sink(receiveValue: { (timer, val) in
        print("Hello!")
    })
Unfortunately, sink is not called until both publishers emit at least one element.
Is there any way to have sink called without waiting for all publishers?
So that if any publisher emits a value, sink is called with the other values set to nil.
You can use prepend(…) to prepend values to the beginning of a publisher.
Here's a version of your code that will prepend nil to both publishers.
let timer = Timer.publish(every: 10, on: .current, in: .common).autoconnect()
let anotherPub: AnyPublisher<Int, Never> = Just(10).delay(for: 5, scheduler: RunLoop.main).eraseToAnyPublisher()
Publishers.CombineLatest(
    timer.map(Optional.init).prepend(nil),
    anotherPub.map(Optional.init).prepend(nil)
)
.filter { $0 != nil || $1 != nil } // drop only the initial event where both values are still nil
.sink(receiveValue: { (timer, val) in
    print("Hello! \(timer) \(val)")
})

Combine framework serialize async operations

How do I get the asynchronous pipelines that constitute the Combine framework to line up synchronously (serially)?
Suppose I have 50 URLs from which I want to download the corresponding resources, and let's say I want to do it one at a time. I know how to do that with Operation / OperationQueue, e.g. using an Operation subclass that doesn't declare itself finished until the download is complete. How would I do the same thing using Combine?
At the moment all that occurs to me is to keep a global list of the remaining URLs and pop one off, set up that one pipeline for one download, do the download, and in the sink of the pipeline, repeat. That doesn't seem very Combine-like.
I did try making an array of the URLs and map it to an array of publishers. I know I can "produce" a publisher and cause it to publish on down the pipeline using flatMap. But then I'm still doing all the downloading simultaneously. There isn't any Combine way to walk the array in a controlled manner — or is there?
(I also imagined doing something with Future but I became hopelessly confused. I'm not used to this way of thinking.)
Use flatMap(maxPublishers:transform:) with .max(1), e.g.
func imagesPublisher(for urls: [URL]) -> AnyPublisher<UIImage, URLError> {
    Publishers.Sequence(sequence: urls.map { self.imagePublisher(for: $0) })
        .flatMap(maxPublishers: .max(1)) { $0 }
        .eraseToAnyPublisher()
}
Where
func imagePublisher(for url: URL) -> AnyPublisher<UIImage, URLError> {
    URLSession.shared.dataTaskPublisher(for: url)
        .compactMap { UIImage(data: $0.data) }
        .receive(on: RunLoop.main)
        .eraseToAnyPublisher()
}
and
var imageRequests: AnyCancellable?
func fetchImages() {
imageRequests = imagesPublisher(for: urls).sink { completion in
switch completion {
case .finished:
print("done")
case .failure(let error):
print("failed", error)
}
} receiveValue: { image in
// do whatever you want with the images as they come in
}
}
Running that, the downloads complete one at a time, confirming they happen sequentially. But we should recognize that you take a big performance hit doing them sequentially like that. For example, if I bump it up to 6 at a time, it's more than twice as fast.
Personally, I'd recommend downloading sequentially only if you absolutely must (which, when downloading a series of images/files, is almost certainly not the case). Yes, performing requests concurrently can result in them not finishing in any particular order, but we can simply use a structure that is order independent (e.g. a dictionary rather than a simple array), and the performance gains are so significant that it's generally worth it.
But, if you want them downloaded sequentially, the maxPublishers parameter can achieve that.
I've only briefly tested this, but at first pass it appears that each request waits for the previous request to finish before starting.
I'm posting this solution in search of feedback. Please be critical if this isn't a good solution.
extension Collection where Element: Publisher {
    func serialize() -> AnyPublisher<Element.Output, Element.Failure>? {
        // If the collection is empty, we can't just create an arbitrary publisher
        // so we return nil to indicate that we had nothing to serialize.
        if isEmpty { return nil }
        // We know at this point that it's safe to grab the first publisher.
        let first = self.first!
        // If there was only a single publisher then we can just return it.
        if count == 1 { return first.eraseToAnyPublisher() }
        // We're going to build up the output starting with the first publisher.
        var output = first.eraseToAnyPublisher()
        // We iterate over the rest of the publishers (skipping over the first).
        for publisher in self.dropFirst() {
            // We build up the output by appending the next publisher.
            output = output.append(publisher).eraseToAnyPublisher()
        }
        return output
    }
}
A more concise version of this solution (provided by @matt):
extension Collection where Element: Publisher {
    func serialize() -> AnyPublisher<Element.Output, Element.Failure>? {
        guard let start = self.first else { return nil }
        return self.dropFirst().reduce(start.eraseToAnyPublisher()) {
            $0.append($1).eraseToAnyPublisher()
        }
    }
}
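A usage sketch (the setup here is mine, not from the answer): three delayed Just publishers run one after another, because append subscribes to the next publisher only once the previous one has completed.
var cancellables = Set<AnyCancellable>()

let publishers = (1...3).map {
    Just($0)
        .delay(for: .seconds(1), scheduler: RunLoop.main)
        .eraseToAnyPublisher()
}

if let serialized = publishers.serialize() {
    serialized
        .sink(receiveCompletion: { print("completion:", $0) },
              receiveValue: { print("value:", $0) })
        .store(in: &cancellables)
}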
You could create a custom Subscriber whose receive(_:) returns Subscribers.Demand.max(1). In that case the subscriber requests the next value only after it has received one. The example is for an Int publisher, but the random delay in map mimics network traffic :-)
import PlaygroundSupport
import SwiftUI
import Combine
class MySubscriber: Subscriber {
typealias Input = String
typealias Failure = Never
func receive(subscription: Subscription) {
print("Received subscription", Thread.current.isMainThread)
subscription.request(.max(1))
}
func receive(_ input: Input) -> Subscribers.Demand {
print("Received input: \(input)", Thread.current.isMainThread)
return .max(1)
}
func receive(completion: Subscribers.Completion<Never>) {
DispatchQueue.main.async {
print("Received completion: \(completion)", Thread.current.isMainThread)
PlaygroundPage.current.finishExecution()
}
}
}
(110...120)
.publisher.receive(on: DispatchQueue.global())
.map {
print(Thread.current.isMainThread, Thread.current)
usleep(UInt32.random(in: 10000 ... 1000000))
return String(format: "%02x", $0)
}
.subscribe(on: DispatchQueue.main)
.subscribe(MySubscriber())
print("Hello")
PlaygroundPage.current.needsIndefiniteExecution = true
Playground print ...
Hello
Received subscription true
false <NSThread: 0x600000064780>{number = 5, name = (null)}
Received input: 6e false
false <NSThread: 0x60000007cc80>{number = 9, name = (null)}
Received input: 6f false
false <NSThread: 0x60000007cc80>{number = 9, name = (null)}
Received input: 70 false
false <NSThread: 0x60000007cc80>{number = 9, name = (null)}
Received input: 71 false
false <NSThread: 0x60000007cc80>{number = 9, name = (null)}
Received input: 72 false
false <NSThread: 0x600000064780>{number = 5, name = (null)}
Received input: 73 false
false <NSThread: 0x600000064780>{number = 5, name = (null)}
Received input: 74 false
false <NSThread: 0x60000004dc80>{number = 8, name = (null)}
Received input: 75 false
false <NSThread: 0x60000004dc80>{number = 8, name = (null)}
Received input: 76 false
false <NSThread: 0x60000004dc80>{number = 8, name = (null)}
Received input: 77 false
false <NSThread: 0x600000053400>{number = 3, name = (null)}
Received input: 78 false
Received completion: finished true
UPDATE
Finally I found .flatMap(maxPublishers:), which forces me to update this interesting topic with a little bit different approach. Please note that I am using a global queue for scheduling, not just some random delay, to be sure that receiving a serialized stream is not "random" or "lucky" behavior :-)
import PlaygroundSupport
import Combine
import Foundation
PlaygroundPage.current.needsIndefiniteExecution = true
let A = (1 ... 9)
.publisher
.flatMap(maxPublishers: .max(1)) { value in
[value].publisher
.flatMap { value in
Just(value)
.delay(for: .milliseconds(Int.random(in: 0 ... 100)), scheduler: DispatchQueue.global())
}
}
.sink { value in
print(value, "A")
}
let B = (1 ... 9)
.publisher
.flatMap { value in
[value].publisher
.flatMap { value in
Just(value)
.delay(for: .milliseconds(Int.random(in: 0 ... 100)), scheduler: RunLoop.main)
}
}
.sink { value in
print(" ",value, "B")
}
prints
1 A
4 B
5 B
7 B
1 B
2 B
8 B
6 B
2 A
3 B
9 B
3 A
4 A
5 A
6 A
7 A
8 A
9 A
Based on what is written here,
.serialize()?
defined in Clay Ellis' accepted answer could be replaced by
.publisher.flatMap(maxPublishers: .max(1)){$0}
while the "unserialized" version must use
.publisher.flatMap{$0}
"real world example"
import PlaygroundSupport
import Foundation
import Combine
let path = "postman-echo.com/get"
let urls: [URL] = "... which proves the downloads are happening serially .-)".map(String.init).compactMap { (parameter) in
var components = URLComponents()
components.scheme = "https"
components.path = path
components.queryItems = [URLQueryItem(name: parameter, value: nil)]
return components.url
}
//["https://postman-echo.com/get?]
struct Postman: Decodable {
var args: [String: String]
}
let collection = urls.compactMap { value in
URLSession.shared.dataTaskPublisher(for: value)
.tryMap { data, response -> Data in
return data
}
.decode(type: Postman.self, decoder: JSONDecoder())
.catch {_ in
Just(Postman(args: [:]))
}
}
extension Collection where Element: Publisher {
func serialize() -> AnyPublisher<Element.Output, Element.Failure>? {
guard let start = self.first else { return nil }
return self.dropFirst().reduce(start.eraseToAnyPublisher()) {
return $0.append($1).eraseToAnyPublisher()
}
}
}
var streamA = ""
let A = collection
.publisher.flatMap{$0}
.sink(receiveCompletion: { (c) in
print(streamA, " ", c, " .publisher.flatMap{$0}")
}, receiveValue: { (postman) in
print(postman.args.keys.joined(), terminator: "", to: &streamA)
})
var streamC = ""
let C = collection
.serialize()?
.sink(receiveCompletion: { (c) in
print(streamC, " ", c, " .serialize()?")
}, receiveValue: { (postman) in
print(postman.args.keys.joined(), terminator: "", to: &streamC)
})
var streamD = ""
let D = collection
.publisher.flatMap(maxPublishers: .max(1)){$0}
.sink(receiveCompletion: { (c) in
print(streamD, " ", c, " .publisher.flatMap(maxPublishers: .max(1)){$0}")
}, receiveValue: { (postman) in
print(postman.args.keys.joined(), terminator: "", to: &streamD)
})
PlaygroundPage.current.needsIndefiniteExecution = true
prints
.w.h i.c hporves ht edownloadsa erh appeninsg eriall y.-) finished .publisher.flatMap{$0}
... which proves the downloads are happening serially .-) finished .publisher.flatMap(maxPublishers: .max(1)){$0}
... which proves the downloads are happening serially .-) finished .serialize()?
This seems very useful in other scenarios as well. Try using the default value of maxPublishers in the next snippet and compare the results :-)
import Combine
let sequencePublisher = Publishers.Sequence<Range<Int>, Never>(sequence: 0..<Int.max)
let subject = PassthroughSubject<String, Never>()
let handle = subject
.zip(sequencePublisher.print())
//.publish
.flatMap(maxPublishers: .max(1), { (pair) in
Just(pair)
})
.print()
.sink { letters, digits in
print(letters, digits)
}
"Hello World!".map(String.init).forEach { (s) in
subject.send(s)
}
subject.send(completion: .finished)
From the original question:
I did try making an array of the URLs and map it to an array of publishers. I know I can "produce" a publisher and cause it to publish on down the pipeline using flatMap. But then I'm still doing all the downloading simultaneously. There isn't any Combine way to walk the array in a controlled manner — or is there?
Here's a toy example to stand in for the real problem:
let collection = (1 ... 10).map {
Just($0).delay(
for: .seconds(Double.random(in:1...5)),
scheduler: DispatchQueue.main)
.eraseToAnyPublisher()
}
collection.publisher
.flatMap() {$0}
.sink {print($0)}.store(in:&self.storage)
This emits the integers from 1 to 10 in random order arriving at random times. The goal is to do something with collection that will cause it to emit the integers from 1 to 10 in order.
Now we're going to change just one thing: in the line
.flatMap {$0}
we add the maxPublishers parameter:
let collection = (1 ... 10).map {
Just($0).delay(
for: .seconds(Double.random(in:1...5)),
scheduler: DispatchQueue.main)
.eraseToAnyPublisher()
}
collection.publisher
.flatMap(maxPublishers:.max(1)) {$0}
.sink {print($0)}.store(in:&self.storage)
Presto, we now do emit the integers from 1 to 10, in order, with random intervals between them.
Let's apply this to the original problem. To demonstrate, I need a fairly slow Internet connection and a fairly large resource to download. First, I'll do it with ordinary .flatMap:
let eph = URLSessionConfiguration.ephemeral
let session = URLSession(configuration: eph)
let url = "https://photojournal.jpl.nasa.gov/tiff/PIA23172.tif"
let collection = [url, url, url]
.map {URL(string:$0)!}
.map {session.dataTaskPublisher(for: $0)
.eraseToAnyPublisher()
}
collection.publisher.setFailureType(to: URLError.self)
.handleEvents(receiveOutput: {_ in print("start")})
.flatMap() {$0}
.map {$0.data}
.sink(receiveCompletion: {comp in
switch comp {
case .failure(let err): print("error", err)
case .finished: print("finished")
}
}, receiveValue: {_ in print("done")})
.store(in:&self.storage)
The result is
start
start
start
done
done
done
finished
which shows that we are doing the three downloads simultaneously. Okay, now change
.flatMap() {$0}
to
.flatMap(maxPublishers:.max(1)) {$0}
The result now is:
start
done
start
done
start
done
finished
So we are now downloading serially, which is the problem originally to be solved.
append
In keeping with the principle of TIMTOWTDI, we can instead chain the publishers with append to serialize them:
let collection = (1 ... 10).map {
Just($0).delay(
for: .seconds(Double.random(in:1...5)),
scheduler: DispatchQueue.main)
.eraseToAnyPublisher()
}
let pub = collection.dropFirst().reduce(collection.first!) {
return $0.append($1).eraseToAnyPublisher()
}
The result is a publisher that serializes the delayed publishers in the original collection. Let's prove it by subscribing to it:
pub.sink {print($0)}.store(in:&self.storage)
Sure enough, the integers now arrive in order (with random intervals between).
We can encapsulate the creation of pub from a collection of publishers with an extension on Collection, as suggested by Clay Ellis:
extension Collection where Element: Publisher {
    func serialize() -> AnyPublisher<Element.Output, Element.Failure>? {
        guard let start = self.first else { return nil }
        return self.dropFirst().reduce(start.eraseToAnyPublisher()) {
            return $0.append($1).eraseToAnyPublisher()
        }
    }
}
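With the extension in place, the earlier construction and subscription collapse to a short chain (a sketch reusing the collection and storage from the snippets above):
collection.serialize()?
    .sink { print($0) }
    .store(in: &self.storage)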
Here is one-page playground code that depicts a possible approach. The main idea is to transform the async API calls into a chain of Future publishers, thus making a serial pipeline.
Input: a range of Ints from 1 to 10 that are asynchronously converted into strings on a background queue.
Demo of direct call to async API:
let group = DispatchGroup()
inputValues.map {
group.enter()
asyncCall(input: $0) { (output, _) in
print(">> \(output), in \(Thread.current)")
group.leave()
}
}
group.wait()
Output:
>> 1, in <NSThread: 0x7fe76264fff0>{number = 4, name = (null)}
>> 3, in <NSThread: 0x7fe762446b90>{number = 3, name = (null)}
>> 5, in <NSThread: 0x7fe7624461f0>{number = 5, name = (null)}
>> 6, in <NSThread: 0x7fe762461ce0>{number = 6, name = (null)}
>> 10, in <NSThread: 0x7fe76246a7b0>{number = 7, name = (null)}
>> 4, in <NSThread: 0x7fe764c37d30>{number = 8, name = (null)}
>> 7, in <NSThread: 0x7fe764c37cb0>{number = 9, name = (null)}
>> 8, in <NSThread: 0x7fe76246b540>{number = 10, name = (null)}
>> 9, in <NSThread: 0x7fe7625164b0>{number = 11, name = (null)}
>> 2, in <NSThread: 0x7fe764c37f50>{number = 12, name = (null)}
Demo of the Combine pipeline:
Output:
>> got 1
>> got 2
>> got 3
>> got 4
>> got 5
>> got 6
>> got 7
>> got 8
>> got 9
>> got 10
>>>> finished with true
Code:
import Cocoa
import Combine
import PlaygroundSupport
// Assuming there is some Asynchronous API with
// (eg. process Int input value during some time and generates String result)
func asyncCall(input: Int, completion: @escaping (String, Error?) -> Void) {
DispatchQueue.global(qos: .background).async {
sleep(.random(in: 1...5)) // wait for random Async API output
completion("\(input)", nil)
}
}
// There are some input values to be processed serially
let inputValues = Array(1...10)
// Prepare one pipeline item based on Future, which transforms Async -> Sync
func makeFuture(input: Int) -> AnyPublisher<Bool, Error> {
Future<String, Error> { promise in
asyncCall(input: input) { (value, error) in
if let error = error {
promise(.failure(error))
} else {
promise(.success(value))
}
}
}
.receive(on: DispatchQueue.main)
.map {
print(">> got \($0)") // << sideeffect of pipeline item
return true
}
.eraseToAnyPublisher()
}
// Create pipeline transforming input values into a chain of Future publishers
var subscribers = Set<AnyCancellable>()
let pipeline =
inputValues
.reduce(nil as AnyPublisher<Bool, Error>?) { (chain, value) in
if let chain = chain {
return chain.flatMap { _ in
makeFuture(input: value)
}.eraseToAnyPublisher()
} else {
return makeFuture(input: value)
}
}
// Execute pipeline
pipeline?
.sink(receiveCompletion: { _ in
// << do something on completion if needed
}) { output in
print(">>>> finished with \(output)")
}
.store(in: &subscribers)
PlaygroundPage.current.needsIndefiniteExecution = true
In all of the other Reactive frameworks this is really easy; you just use concat to concatenate and flatten the results in one step, and then you can reduce the results into a final array. Apple makes this difficult because Publisher.Concatenate has no overload that accepts an array of publishers. There is similar weirdness with Publisher.Merge. I have a feeling this has to do with the fact that they return nested generic publishers instead of just returning a single generic type like rx Observable. I guess you can just call Concatenate in a loop and then reduce the concatenated results into a single array, but I really hope they address this issue in the next release. There is certainly a need to concat more than 2 publishers and to merge more than 4 publishers (and the overloads for these two operators aren't even consistent, which is just weird).
EDIT:
I came back to this and found that you can indeed concat an arbitrary array of publishers and they will emit in sequence. I have no idea why there isn't a function like ConcatenateMany to do this for you, but it looks like as long as you are willing to use a type-erased publisher it's not that hard to write one yourself. This example shows that merge emits in temporal order while concat emits in the order of combination:
import PlaygroundSupport
import SwiftUI
import Combine
let p = Just<Int>(1).append(2).append(3).delay(for: .seconds(0.25), scheduler: RunLoop.main).eraseToAnyPublisher()
let q = Just<Int>(4).append(5).append(6).eraseToAnyPublisher()
let r = Just<Int>(7).append(8).append(9).delay(for: .seconds(0.5), scheduler: RunLoop.main).eraseToAnyPublisher()
let concatenated: AnyPublisher<Int, Never> = [q,r].reduce(p) { total, next in
total.append(next).eraseToAnyPublisher()
}
var subscriptions = Set<AnyCancellable>()
concatenated
.sink(receiveValue: { v in
print("concatenated: \(v)")
}).store(in: &subscriptions)
Publishers
.MergeMany([p,q,r])
.sink(receiveValue: { v in
print("merge: \(v)")
}).store(in: &subscriptions)
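If you want to reuse that reduce, one way is to package it as a helper on Collection (a sketch; the name concatenateMany is hypothetical and mirrors the serialize() extension from other answers here):
extension Collection where Element: Publisher {
    // Concatenates any number of equally-typed publishers, emitting their values in order of combination.
    func concatenateMany() -> AnyPublisher<Element.Output, Element.Failure>? {
        guard let first = self.first else { return nil }
        return dropFirst().reduce(first.eraseToAnyPublisher()) {
            $0.append($1).eraseToAnyPublisher()
        }
    }
}

// [p, q, r].concatenateMany() produces the same sequence as the `concatenated` publisher above.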
What about a dynamic array of URLs, something like a data bus?
var array: [AnyPublisher<Data, URLError>] = []
array.append(Task())
array.publisher
.flatMap { $0 }
.sink {
}
// it will be finished
array.append(Task())
array.append(Task())
array.append(Task())
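One way to keep such a "bus" open is to push tasks through a PassthroughSubject instead of a finite array publisher. A minimal sketch of that idea (my assumption of what the question is after, not a tested answer; the buffer is there because a PassthroughSubject drops values sent while downstream demand is zero):
var cancellables = Set<AnyCancellable>()
let bus = PassthroughSubject<AnyPublisher<Data, URLError>, Never>()

bus
    .setFailureType(to: URLError.self)
    // hold on to tasks that arrive while a previous task is still running
    .buffer(size: Int.max, prefetch: .byRequest, whenFull: .dropNewest)
    // run at most one task at a time
    .flatMap(maxPublishers: .max(1)) { $0 }
    .sink(receiveCompletion: { print("completion:", $0) },
          receiveValue: { data in print("received \(data.count) bytes") })
    .store(in: &cancellables)

// Tasks can be sent at any time; the subject never finishes on its own.
let url = URL(string: "https://postman-echo.com/get")!
bus.send(URLSession.shared.dataTaskPublisher(for: url).map { $0.data }.eraseToAnyPublisher())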
Another approach, if you want to collect all the results of the downloads in order to know which ones failed and which ones did not, is to write a custom publisher that looks like this:
extension Publishers {
struct Serialize<Upstream: Publisher>: Publisher {
typealias Output = [Result<Upstream.Output, Upstream.Failure>]
typealias Failure = Never
let upstreams: [Upstream]
init<C: Collection>(_ upstreams: C) where C.Element == Upstream {
self.upstreams = Array(upstreams)
}
init(_ upstreams: Upstream...) {
self.upstreams = upstreams
}
func receive<S>(subscriber: S) where S : Subscriber, Self.Failure == S.Failure, Self.Output == S.Input {
guard let first = upstreams.first else { return Empty().subscribe(subscriber) }
first
.map { Result<Upstream.Output, Upstream.Failure>.success($0) }
.catch { Just(Result<Upstream.Output, Upstream.Failure>.failure($0)) }
.map { [$0] }
.append(Serialize(upstreams.dropFirst()))
.collect()
.map { $0.flatMap { $0 } }
.subscribe(subscriber)
}
}
}
extension Collection where Element: Publisher {
func serializedPublishers() -> Publishers.Serialize<Element> {
.init(self)
}
}
The publisher takes the first download task, converts its output/failure to a Result instance, and prepends it to the "recursive" call for the rest of the list.
Usage: Publishers.Serialize(listOfDownloadTasks), or listOfDownloadTasks.serializedPublishers().
One minor inconvenience of this implementation is the fact that the Result instance needs to be wrapped into an array, just to be flattened three steps later in the pipeline. Perhaps someone can suggest a better alternative to that.
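A usage sketch (assuming listOfDownloadTasks is an array of publishers of one concrete type, e.g. values returned by dataTaskPublisher(for:)); because the outer Failure is Never, sink only needs a value handler:
var cancellables = Set<AnyCancellable>()

listOfDownloadTasks
    .serializedPublishers()
    .sink { results in
        // One array arrives at the end, with one Result per task, in order.
        for (index, result) in results.enumerated() {
            switch result {
            case .success(let output): print("task \(index) succeeded:", output)
            case .failure(let error): print("task \(index) failed:", error)
            }
        }
    }
    .store(in: &cancellables)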

Swift Combine: `append` which does not require output to be equal?

Using Apple's Combine I would like to append a publisher bar after a first publisher foo has finished (ok to constrain Failure to Never). Basically I want RxJava's andThen.
I have something like this:
let foo: AnyPublisher<Fruit, Never> = /* actual publisher irrelevant */
let bar: AnyPublisher<Fruit, Never> = /* actual publisher irrelevant */
// A want to do concatenate `bar` to start producing elements
// only after `foo` has `finished`, and let's say I only care about the
// first element of `foo`.
let fooThenBar = foo.first()
.ignoreOutput()
.append(bar) // Compilation error: `Cannot convert value of type 'AnyPublisher<Fruit, Never>' to expected argument type 'Publishers.IgnoreOutput<Upstream>.Output' (aka 'Never')`
I've come up with a solution, I think it works, but it looks very ugly/overly complicated.
let fooThenBar = foo.first()
.ignoreOutput()
.flatMap { _ in Empty<Fruit, Never>() }
.append(bar)
Am I missing something here?
Edit
Added a nicer version of my initial proposal as an answer below. Big thanks to @RobNapier!
I think instead of ignoreOutput, you just want to filter all the items, and then append:
let fooThenBar = foo.first()
.filter { _ in false }
.append(bar)
You may find it nicer to name this dropAll():
extension Publisher {
func dropAll() -> Publishers.Filter<Self> { filter { _ in false } }
}
let fooThenBar = foo.first()
.dropAll()
.append(bar)
The underlying issue is that ignoreOutput() generates a Publisher with an Output of Never, which usually makes sense. But in this case you want to just get rid of the values without changing the type, and that's filtering.
Thanks to great discussions with @RobNapier we kind of concluded that a flatMap { Empty }.append(otherPublisher) solution is the best when the outputs of the two publishers differ. Since I wanted to use this after the first/base/'foo' publisher finishes, I've written an extension on Publishers.IgnoreOutput; the result is this:
Solution
protocol BaseForAndThen {}
extension Publishers.IgnoreOutput: BaseForAndThen {}
extension Combine.Future: BaseForAndThen {}

extension Publisher where Self: BaseForAndThen, Self.Failure == Never {
    func andThen<Then>(_ thenPublisher: Then) -> AnyPublisher<Then.Output, Never> where Then: Publisher, Then.Failure == Failure {
        return flatMap { _ in Empty<Then.Output, Never>(completeImmediately: true) } // same as `init()`
            .append(thenPublisher)
            .eraseToAnyPublisher()
    }
}
Usage
In my use case I wanted to control/have insight into when the base publisher finishes, so my solution is based on that.
Together with ignoreOutput
Since the second publisher, appleSubject in the case below, won't start producing elements until the first publisher finishes, I use the first() operator (there is also a last() operator) to make the bananaSubject finish after one output.
bananaSubject.first().ignoreOutput().andThen(appleSubject)
Together with Future
A Future already just produces one element and then finishes.
futureBanana.andThen(applePublisher)
Test
Here is the complete unit test (also on Github)
import XCTest
import Combine
protocol Fruit {
var price: Int { get }
}
typealias 🍌 = Banana
struct Banana: Fruit {
let price: Int
}
typealias 🍏 = Apple
struct Apple: Fruit {
let price: Int
}
final class CombineAppendDifferentOutputTests: XCTestCase {
override func setUp() {
super.setUp()
continueAfterFailure = false
}
func testFirst() throws {
try doTest { bananaPublisher, applePublisher in
bananaPublisher.first().ignoreOutput().andThen(applePublisher)
}
}
func testFuture() throws {
var cancellable: Cancellable?
try doTest { bananaPublisher, applePublisher in
let futureBanana = Future<🍌, Never> { promise in
cancellable = bananaPublisher.sink(
receiveCompletion: { _ in },
receiveValue: { value in promise(.success(value)) }
)
}
return futureBanana.andThen(applePublisher)
}
XCTAssertNotNil(cancellable)
}
static var allTests = [
("testFirst", testFirst),
("testFuture", testFuture),
]
}
private extension CombineAppendDifferentOutputTests {
func doTest(_ line: UInt = #line, _ fooThenBarMethod: (AnyPublisher<🍌, Never>, AnyPublisher<🍏, Never>) -> AnyPublisher<🍏, Never>) throws {
// GIVEN
// Two publishers `foo` (🍌) and `bar` (🍏)
let bananaSubject = PassthroughSubject<Banana, Never>()
let appleSubject = PassthroughSubject<Apple, Never>()
var outputtedFruits = [Fruit]()
let expectation = XCTestExpectation(description: self.debugDescription)
let cancellable = fooThenBarMethod(
bananaSubject.eraseToAnyPublisher(),
appleSubject.eraseToAnyPublisher()
)
.sink(
receiveCompletion: { _ in expectation.fulfill() },
receiveValue: { outputtedFruits.append($0 as Fruit) }
)
// WHEN
// I send apples and bananas to the respective subjects and a `finish` completion to `appleSubject` (`bar`)
appleSubject.send(🍏(price: 1))
bananaSubject.send(🍌(price: 2))
appleSubject.send(🍏(price: 3))
bananaSubject.send(🍌(price: 4))
appleSubject.send(🍏(price: 5))
appleSubject.send(completion: .finished)
wait(for: [expectation], timeout: 0.1)
// THEN
// A: the output contains no bananas (since the bananaSubject publisher's output is ignored)
// and
// B: exactly two apples, more specifically the last two, since when the first Apple (with price 1) is sent, we have not yet received the first (needed and triggering) banana.
let expectedFruitCount = 2
XCTAssertEqual(outputtedFruits.count, expectedFruitCount, line: line)
XCTAssertTrue(outputtedFruits.allSatisfy({ $0 is 🍏 }), line: line)
let apples = outputtedFruits.compactMap { $0 as? 🍏 }
XCTAssertEqual(apples.count, expectedFruitCount, line: line)
let firstApple = try XCTUnwrap(apples.first)
let lastApple = try XCTUnwrap(apples.last)
XCTAssertEqual(firstApple.price, 3, line: line)
XCTAssertEqual(lastApple.price, 5, line: line)
XCTAssertNotNil(cancellable, line: line)
}
}
As long as you use .ignoreOutput(), it is safe to replace the "ugly" .flatMap { _ in Empty<Fruit, Never>() } with the simpler .map { Fruit?.none! }, which will never be called anyway and just changes the Output type.

Sliding windows in RxSwift

Coming from the RxJava background, I can not come up with a standard approach to implement sliding windows in RxSwift. E.g. I have the following sequence of events:
1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, ...
Let's imagine event emission happens twice in a second. What I want to be able to do is to transform this sequence into a sequence of buffers, each buffer containing last three seconds of data. Plus, each buffer is to be emitted once in a second. So the result would look like that:
[1,2,3,4,5,6], [3,4,5,6,7,8], [5,6,7,8,9,10], ...
What I would do in RxJava is I would use one of the overloads of the buffer method like so:
stream.buffer(3000, 1000, TimeUnit.MILLISECONDS)
Which leads exactly to the result I need to accomplish: sequence of buffers, each buffer is emitted once in a second and contains last three seconds of data.
I checked RxSwift docs far and wide and I did not find any overloads of buffer operator which would allow me to do that. Am I missing some non-obvious (for RxJava user, ofc) operator?
I initially wrote the solution using a custom operator. I have since figured out how it can be done with the standard operators.
extension ObservableType {
func buffer(timeSpan: RxTimeInterval, timeShift: RxTimeInterval, scheduler: SchedulerType) -> Observable<[E]> {
let trigger = Observable<Int>.timer(timeSpan, period: timeShift, scheduler: scheduler)
.takeUntil(self.takeLast(1))
let buffer = self
.scan([Date: E]()) { previous, current in
var next = previous
let now = scheduler.now
next[now] = current
return next.filter { $0.key > now.addingTimeInterval(-timeSpan) }
}
return trigger.withLatestFrom(buffer)
.map { $0.sorted(by: { $0.key <= $1.key }).map { $0.value } }
}
}
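A hypothetical usage sketch, mirroring the RxJava call from the question (this assumes the RxSwift version used in the answer, where RxTimeInterval is a TimeInterval and the element type is written E, and that `values` is an Observable<Int>):
let disposeBag = DisposeBag()

values
    .buffer(timeSpan: 3, timeShift: 1, scheduler: MainScheduler.instance)
    .subscribe(onNext: { window in
        print(window) // e.g. [1,2,3,4,5,6], then [3,4,5,6,7,8], ...
    })
    .disposed(by: disposeBag)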
I'm leaving my original solution below for posterity:
Writing your own operator is the solution here.
extension ObservableType {
func buffer(timeSpan: RxTimeInterval, timeShift: RxTimeInterval, scheduler: SchedulerType) -> Observable<[E]> {
return Observable.create { observer in
var buf: [Date: E] = [:]
let lock = NSRecursiveLock()
let elementDisposable = self.subscribe { event in
lock.lock(); defer { lock.unlock() }
switch event {
case let .next(element):
buf[Date()] = element
case .completed:
observer.onCompleted()
case let .error(error):
observer.onError(error)
}
}
let spanDisposable = scheduler.schedulePeriodic((), startAfter: timeSpan, period: timeShift, action: { state in
lock.lock(); defer { lock.unlock() }
let now = Date()
buf = buf.filter { $0.key > now.addingTimeInterval(-timeSpan) }
observer.onNext(buf.sorted(by: { $0.key <= $1.key }).map { $0.value })
})
return Disposables.create([spanDisposable, elementDisposable])
}
}
}
