Observable distinctUntilChanged returns duplicates - ios

Setup: RxSwift 4.2, Swift 4.1, Xcode 9.4.1
I'm currently using distinctUntilChanged to get unique values.
But in my case it only works for "sorted" values.
For example:
func unique(source: Observable<Int>) -> Observable<Int> {
    return source.distinctUntilChanged()
}
Here is the corresponding test:
func testUnique() {
    let input = Observable.from([1, 2, 3, 4, 4, 5, 4])
    let expectation = [Recorded.next(0, 1),
                       Recorded.next(0, 2),
                       Recorded.next(0, 3),
                       Recorded.next(0, 4),
                       Recorded.next(0, 5),
                       Recorded.completed(0)]
    _ = Class().unique(source: input).subscribe(observer)
    XCTAssertEqual(observer.events, expectation)
}
And my test is failing with:
XCTAssertEqual failed: ("[next(1) # 0, next(2) # 0, next(3) # 0, next(4) # 0, next(5) # 0, next(4) # 0, completed # 0]")
is not equal to ("[next(1) # 0, next(2) # 0, next(3) # 0, next(4) # 0, next(5) # 0, completed # 0]") -
So the last 4 is a duplicate.
Is this behavior expected or a bug?

It is indeed expected behaviour; that is why the operator is called '.distinctUntilChanged()'. What you apparently want is '.distinct()', but it is not available in the basic 'RxSwift' framework, only in the 'RxSwiftExt' framework - https://github.com/RxSwiftCommunity/RxSwiftExt#distinct
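For reference, a minimal usage sketch of that operator (assuming the RxSwiftExt package has been added to the project):
import RxSwift
import RxSwiftExt

func unique(source: Observable<Int>) -> Observable<Int> {
    // distinct() remembers every element it has already emitted and drops later
    // duplicates, unlike distinctUntilChanged(), which only compares each element
    // against the one immediately before it.
    return source.distinct()
}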

This is expected behaviour. distinctUntilChanged compares each value with the previous value in the stream; in your case it should emit:
let expectation = [Recorded.next(0, 1),
                   Recorded.next(0, 2),
                   Recorded.next(0, 3),
                   Recorded.next(0, 4),
                   Recorded.next(0, 5),
                   Recorded.next(0, 4),
                   Recorded.completed(0)]
I think you can use scan to remember the previous values and then check whether a new value needs to be emitted. Here's a quick solution using scan:
extension Observable where Element: Equatable {
    func unique() -> Observable<Element> {
        return scan([Element](), accumulator: { previousValues, nextValue in
            if !previousValues.contains(nextValue) {
                return previousValues + [nextValue]
            }
            return previousValues
        })
        .distinctUntilChanged()
        .map { $0.last! }
    }
}
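A quick usage sketch with the input from the question, assuming the extension above is in scope:
import RxSwift

let disposeBag = DisposeBag()

Observable.from([1, 2, 3, 4, 4, 5, 4])
    .unique()
    .subscribe(onNext: { print($0) })
    .disposed(by: disposeBag)
// Prints 1, 2, 3, 4, 5: the trailing 4 produces the same accumulated array again,
// so distinctUntilChanged drops it.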

I came up with a simple, type-specific answer myself.
It's basically the RxSwiftExt distinct solution:
func unique(source: Observable<Int>) -> Observable<Int> {
    var cache = Set<Int>()
    return source.flatMap { element -> Observable<Int> in
        if cache.contains(element) {
            return Observable<Int>.empty()
        } else {
            cache.insert(element)
            return Observable<Int>.just(element)
        }
    }
}
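For completeness, here is a generic variant of the same idea (my own sketch, not part of the original answer), constrained to Hashable so the cache can be a Set:
import RxSwift

extension Observable where Element: Hashable {
    func distinctElements() -> Observable<Element> {
        var seen = Set<Element>()
        return flatMap { element -> Observable<Element> in
            // insert(_:) reports whether the element was newly added to the set.
            seen.insert(element).inserted
                ? Observable.just(element)
                : Observable.empty()
        }
    }
}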

Related

How to slice a list with a custom step/increment in Dart?

In Python, you can specify a "step" argument to a list slice that specifies the separation between indices that are selected to be in the slice:
my_list[start:stop:step]
However, none of the list methods in Dart seem to offer this functionality: sublist and getRange just take the start and end index.
How can I do this in Dart without using an ugly for-loop?
For example, to select only the even indices of a list I currently see no alternative to this:
List<Object> myList = ...;
List<Object> slice = [];
for (var i = 0; i < myList.length; i += 2) {
  slice.add(myList[i]);
}
Or slightly less ugly with a list comprehension:
[for (var i = 0; i < myList.length; i += 2) myList[i]]
I could write my own function or extension method, but that defeats the purpose; I'm ideally looking for a built-in or a third-party package solution.
For this you can create an extension on List to return the custom result.
List<T> slice([int? start, int? end, int? step]) {
  if (start == null && end == null && step == null) {
    return this!;
  } else if (start != null && end == null && step == null) {
    return this!.sublist(start);
  } else if (start != null && end != null && step == null) {
    return this!.sublist(start, end);
  } else if (start != null && end != null && step != null) {
    // Iterate from the start index (inclusive) to the end index (exclusive),
    // advancing by step each time.
    final list = <T>[];
    for (var i = start; i < end; i += step) {
      list.add(this![i]);
    }
    return list;
  } else {
    return this!;
  }
}
You can use the slice extension on any list. Below are examples of how to use it.
Example 1
This example returns a slice of the list based on the starting and ending indices.
final list1 = [1, 2, 3, 4, 5];
final result = list1.slice(1, 4);
print(result); // [2, 3, 4]
Example 2
This example returns a slice of the list based only on the starting index.
final list1 = [1, 2, 3, 4, 5];
final result = list1.slice(1);
print(result); // [2, 3, 4, 5]
Complete program:
You can run this example in DartPad to check the results.
void main() {
  final list1 = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 15, 17, 18, 19, 20];

  // Example - 1
  final result = list1.slice(1, 4);
  print(result); // [2, 3, 4]

  // Example - 2
  final result2 = list1.slice(10);
  print(result2); // [11, 12, 13, 14, 15, 15, 17, 18, 19, 20]

  // Example - 3
  final result4 = list1.slice(4, 10, 2);
  print(result4); // [5, 7, 9]

  // Example - 4
  final result3 = list1.slice();
  print(result3); // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 15, 17, 18, 19, 20]
}
extension ListHelper<T> on List<T>? {
  List<T> slice([int? start, int? end, int? step]) {
    if (start == null && end == null && step == null) {
      return this!;
    } else if (start != null && end == null && step == null) {
      return this!.sublist(start);
    } else if (start != null && end != null && step == null) {
      return this!.sublist(start, end);
    } else if (start != null && end != null && step != null) {
      // Iterate from the start index (inclusive) to the end index (exclusive),
      // advancing by step each time.
      final list = <T>[];
      for (var i = start; i < end; i += step) {
        list.add(this![i]);
      }
      return list;
    } else {
      return this!;
    }
  }
}
You can easily create your own slice method in Dart.
The first thing to decide is whether you want it to be lazy or eager—does it create a list or an iterable.
The traditional Dart way would be an iterable, created from another iterable, which is also slightly more complicated to write.
extension LazySlice<T> on Iterable<T> {
  /// A sub-sequence ("slice") of the elements of this iterable.
  ///
  /// The elements of this iterable starting at the [start]th
  /// element, and ending before the [end]th element, or sooner
  /// if this iterable has fewer than [end] elements.
  /// If [end] is omitted, the sequence continues
  /// to the end of this iterable.
  /// If [step] is provided, only each [step]th element of the
  /// [start]..[end] range is included, starting with the first,
  /// and skipping `step - 1` elements after each that is included.
  Iterable<T> slice([int start = 0, int? end, int step = 1]) {
    // Check inputs.
    RangeError.checkNotNegative(start, "start");
    if (end != null && end < start) {
      throw RangeError.range(end, start, null, "end");
    }
    if (step < 1) {
      throw RangeError.range(step, 1, null, "step");
    }
    // Then return an iterable.
    var iterable = this;
    if (end != null) iterable = iterable.take(end);
    if (start > 0) iterable = iterable.skip(start);
    if (step != 1) iterable = iterable.step(step);
    return iterable;
  } // slice

  /// Every [step] element.
  ///
  /// The first element of this iterable, and then every
  /// [step]th element after that (skipping `step - 1`
  /// elements of this iterable between each element of
  /// the returned iterable).
  Iterable<T> step(int step) {
    if (step == 1) return this;
    if (step < 1) {
      throw RangeError.range(step, 1, null, "step");
    }
    return _step(step);
  }

  /// [step] without parameter checking.
  Iterable<T> _step(int step) sync* {
    var it = iterator;
    if (!it.moveNext()) return;
    while (true) {
      yield it.current;
      for (var i = 0; i < step; i++) {
        if (!it.moveNext()) return;
      }
    }
  } // _step
} // extension LazySlice
Working with a list is much easier:
extension EagerSlice<T> on List<T> {
  List<T> slice([int start = 0, int? end, int step = 1]) {
    if (step == 1) return sublist(start, end); // Checks parameters.
    end = RangeError.checkValidRange(start, end, length);
    if (step < 1) {
      throw RangeError.range(step, 1, null, "step");
    }
    return <T>[for (var i = start; i < end; i += step) this[i]];
  }
}
(Effectively the same approach proposed by @Anakhand in the comments above, just with better parameter checking.)
The list approach is easier, mainly because we don't already have a step method on iterables, which picks every nth element.

How can I find the sum of three elements in a given array

Is there a method in Dart to find the combinations in a given array that add up to a desired number? I can do this for pairs, but I can't write code that finds sums of 3 or more elements.
For example
Input: candidates = [10,1,2,7,6,1,5], target = 8
Output:
[
  [1,1,6],
  [1,2,5],
  [1,7],
  [2,6]
]
This is the code I have written so far:
void main() {
  var candidates = [10, 1, 2, 7, 6, 1, 5], target = 8;
  var answer = [];
  for (int i = 0; i < candidates.length; i++) {
    for (int j = 0; j < candidates.length; j++) {
      if (candidates[i] + candidates[j] == target && i != j && i < j) {
        answer.add([candidates[i], candidates[j]]);
      }
    }
  }
}
I am sure this can be done more efficiently, but since the solution is for some LeetCode assignment, I don't really want to spend too much time on optimizations.
I have added some comments in the code which explain my way of doing it:
void main() {
  getSumLists([10, 1, 2, 7, 6, 1, 5], 8).forEach(print);
  // [5, 1, 2]
  // [1, 6, 1]
  // [1, 7]
  // [6, 2]

  getSumLists([2, 5, 2, 1, 2], 5).forEach(print);
  // [2, 1, 2]
  // [5]
}

Iterable<List<int>> getSumLists(
  List<int> candidates,
  int target, {
  List<int>? tempAnswer,
  int sum = 0,
}) sync* {
  // We cannot use a default value in the parameter since that makes the list const.
  final tempAnswerNullChecked = tempAnswer ?? [];

  if (sum == target) {
    // We got a result we can return.
    // OPTIMIZATION: If you know the returned list from each found result is not
    // being used between found results, you can remove the `.toList()` part.
    yield tempAnswerNullChecked.toList();
  } else if (sum > target) {
    // No need to search further in this branch since we are over the target.
    return;
  }

  // Make a copy so we don't destroy the input list, and so it works even
  // if the provided input list is non-growing / non-modifiable.
  final newCandidates = candidates.toList();

  while (newCandidates.isNotEmpty) {
    // We take numbers from the end of the list since that is more efficient.
    final number = newCandidates.removeLast();

    // Recursive call where we return all results we are going to find given
    // the new parameters.
    yield* getSumLists(
      newCandidates,
      target,
      tempAnswer: tempAnswerNullChecked..add(number),
      sum: sum + number,
    );

    // Instead of creating a new tempAnswerNullChecked, we just reuse it and
    // make sure we remove any value we temporarily added.
    tempAnswerNullChecked.removeLast();

    // Ensure we don't get duplicate combinations. So if we have checked the
    // number `1`, we remove all `1`s so we don't try the second `1`.
    newCandidates.removeWhere((element) => element == number);
  }
}

Elegant way to combineLatest without dropping values and imbalanced publishers in Swift Combine

I have two Publishers, A and B. They are imbalanced: A will emit 3 values, then complete; B will only emit 1 value, then complete (A can actually emit a variable number of values; B will stay at 1, if that helps):
A => 1, 2, 3
B => X
B also runs asynchronously and will likely only emit a value after A already emitted its second value (see diagram above). (B might also only emit any time, including after A already completed.)
I'd like to publish tuples of A's values combined with B's values:
(1, X) (2, X) (3, X)
combineLatest is not up for the job as it will skip the first value of A and only emit (2, X) and (3, X). zip on the other hand will not work for me, because B only emits a single value.
I am looking for an elegant way to accomplish this. Thanks!
Edit and approach to a solution
A bit philosophical, but I think there is a fundamental question of whether you want to go the zip or the combineLatest route. You definitely need some kind of storage for the faster publisher to buffer events while you wait for the slower one to start emitting values.
One solution might be to create a publisher that collects events from A until B emits, then emits all of the collected events, and from there on keeps emitting whatever A gives. This is actually possible through
let bufferedSubject1 = Publishers.Concatenate(
    prefix: Publishers.PrefixUntilOutput(upstream: subject1, other: subject2)
        .collect()
        .flatMap(\.publisher),
    suffix: subject1)
PrefixUntilOutput collects everything until B (subject2) emits, and the concatenation then switches to just passing A's output through.
However if you run
let cancel = bufferedSubject1.combineLatest(subject2)
    .sink(receiveCompletion: { c in
        print(c)
    }, receiveValue: { v in
        print(v)
    })
you are still missing the first value from A (1,X) -- this seems to be a bit like a race condition: Will bufferedSubject1 have all values emitted first or does subject2 provide a value to combineLatest first?
What I think is interesting is that without any async calls, the behavior seems to be undefined. If you run the sample below, sometimes™️ you get all values emitted. Sometimes you are missing out on (1,X). Since there are no async calls and no DispatchQueue switching here, I would even assume this is a bug.
You can "dirty fix" the race condition by providing a delay or even just a receive(on: DispatchQueue.main) between bufferedSubject1 and combineLatest, so that before we continue the pipeline, we hand back control to the DispatchQueue and let subject2 emit to combineLatest.
However, I would not deem that elegant and still looking for a solution that uses zip semantics but without having to create an infinite collection of the same value (which does not play well with sequential processing and unlimited demand, the way I see it).
Sample:
var subject1 = PassthroughSubject<Int, Never>()
var subject2 = PassthroughSubject<String, Never>()

let bufferedSubject1 = Publishers.Concatenate(
    prefix: Publishers.PrefixUntilOutput(upstream: subject1, other: subject2)
        .collect()
        .flatMap(\.publisher),
    suffix: subject1)
let bufferedSubject2 = Publishers.Concatenate(
    prefix: Publishers.PrefixUntilOutput(upstream: subject2, other: subject1)
        .collect()
        .flatMap(\.publisher),
    suffix: subject2)

let cancel = bufferedSubject1.combineLatest(subject2)
    .sink(receiveCompletion: { c in
        print(c)
    }, receiveValue: { v in
        print(v)
    })

subject1.send(1)
subject1.send(2)
subject2.send("X")
subject2.send(completion: .finished)
subject1.send(3)
subject1.send(completion: .finished)
Ok, this was an interesting challenge and though it seemed deceptively simple, I couldn't find a simple elegant way.
Here's a working approach (though hardly elegant) that seems to not suffer from the race condition of using PrefixUntilOutput/Concatenate combo.
The idea is to use combineLatest, but one that emits as soon as the first publisher emits, with the other value being nil, so that we don't lose the initial values. Here's a convenience operator that does that, which I called combineLatestOptional:
extension Publisher {
    func combineLatestOptional<Other: Publisher>(_ other: Other)
        -> AnyPublisher<(Output?, Other.Output?), Failure>
        where Other.Failure == Failure {

        self.map { Optional.some($0) }.prepend(nil)
            .combineLatest(
                other.map { Optional.some($0) }.prepend(nil)
            )
            .dropFirst() // drop the first (nil, nil)
            .eraseToAnyPublisher()
    }
}
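A quick illustration (my own sketch) of what combineLatestOptional emits for two PassthroughSubjects:
import Combine

let ints = PassthroughSubject<Int, Never>()
let strings = PassthroughSubject<String, Never>()

let cancellable = ints.combineLatestOptional(strings)
    .sink { print($0) }

ints.send(1)      // (Optional(1), nil)
ints.send(2)      // (Optional(2), nil)
strings.send("X") // (Optional(2), Optional("X"))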
Armed with the above, the second step in the pipeline uses scan to collect values into an accumulator until the other publisher emits its first value. There are 4 states of the accumulator, which I represent with a State<L, R> type:
fileprivate enum State<L, R> {
    case initial            // before any one publisher emitted
    case left([L])          // left emitted; right hasn't emitted
    case right([R])         // right emitted; left hasn't emitted
    case final([L], [R])    // final steady-state
}
And the final operator combineLatestLossless is implemented like so:
extension Publisher {
    func combineLatestLossless<Other: Publisher>(_ other: Other)
        -> AnyPublisher<(Output, Other.Output), Failure>
        where Failure == Other.Failure {

        self.combineLatestOptional(other)
            .scan(State<Output, Other.Output>.initial, { state, tuple in
                switch (state, tuple.0, tuple.1) {
                case (.initial, let l?, nil):        // left emits first value
                    return .left([l])                // -> collect left values
                case (.initial, nil, let r?):        // right emits first value
                    return .right([r])               // -> collect right values
                case (.left(let ls), let l?, nil):   // left emits another
                    return .left(ls + [l])           // -> append to left values
                case (.right(let rs), nil, let r?):  // right emits another
                    return .right(rs + [r])          // -> append to right values
                case (.left(let ls), _, let r?):     // right emits after left
                    return .final(ls, [r])           // -> go to steady-state
                case (.right(let rs), let l?, _):    // left emits after right
                    return .final([l], rs)           // -> go to steady-state
                case (.final, let l?, let r?):       // final steady-state
                    return .final([l], [r])          // -> pass the values as-is
                default:
                    fatalError("shouldn't happen")
                }
            })
            .flatMap { status -> AnyPublisher<(Output, Other.Output), Failure> in
                if case .final(let ls, let rs) = status {
                    return ls.flatMap { l in rs.map { r in (l, r) } }
                        .publisher
                        .setFailureType(to: Failure.self)
                        .eraseToAnyPublisher()
                } else {
                    return Empty().eraseToAnyPublisher()
                }
            }
            .eraseToAnyPublisher()
    }
}
The final flatMap creates a Publishers.Sequence publisher from all the accumulated values. In the final steady-state, each array would just have a single value.
The usage is simple:
let c = pub1.combineLatestLossless(pub2)
.sink { print($0) }
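A more concrete usage sketch (my own example), with two PassthroughSubjects standing in for A and B:
import Combine

let pub1 = PassthroughSubject<Int, Never>()
let pub2 = PassthroughSubject<String, Never>()

let c = pub1.combineLatestLossless(pub2)
    .sink { print($0) }

pub1.send(1)
pub1.send(2)
pub2.send("X")
pub1.send(3)
// Prints (1, "X"), (2, "X"), (3, "X"): the values of A emitted before B's first
// value are buffered in the .left state and flushed once B emits.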
zip on the other hand will not work for me, because B only emits a single value.
Correct, so fix it so that that's not true. Start a pipeline at B. Using flatMap, turn its signal into a publisher of a sequence of that signal, repeated. Zip that with A.
Example:
import UIKit
import Combine

func delay(_ delay: Double, closure: @escaping () -> ()) {
    let when = DispatchTime.now() + delay
    DispatchQueue.main.asyncAfter(deadline: when, execute: closure)
}

class ViewController: UIViewController {
    var storage = Set<AnyCancellable>()
    let s1 = PassthroughSubject<Int, Never>()
    let s2 = PassthroughSubject<String, Never>()

    override func viewDidLoad() {
        super.viewDidLoad()
        let p1 = s1
        let p2 = s2.flatMap { (val: String) -> AnyPublisher<String, Never> in
            let seq = Array(repeating: val, count: 100)
            return seq.publisher.eraseToAnyPublisher()
        }
        p1.zip(p2)
            .sink { print($0) }
            .store(in: &storage)
        delay(1) {
            self.s1.send(1)
        }
        delay(2) {
            self.s1.send(2)
        }
        delay(3) {
            self.s1.send(3)
        }
        delay(2.5) {
            self.s2.send("X")
        }
    }
}
Result:
(1, "X")
(2, "X")
(3, "X")
Edit
After stumbling on this post I wonder if the problem in your example is not related to the PassthroughSubject:
PassthroughSubject will drop values if the downstream has not made any demand for them.
and in fact, using:
var subject1 = Timer.publish(every: 1, on: .main, in: .default, options: nil)
    .autoconnect()
    .measureInterval(using: RunLoop.main, options: nil)
    .scan(DateInterval()) { res, interval in
        .init(start: res.start, duration: res.duration + interval.magnitude)
    }
    .map(\.duration)
    .map { Int($0) }
    .eraseToAnyPublisher()

var subject2 = PassthroughSubject<String, Never>()

let bufferedSubject1 = Publishers.Concatenate(
    prefix: Publishers.PrefixUntilOutput(upstream: subject1, other: subject2)
        .collect()
        .flatMap(\.publisher),
    suffix: subject1)

let cancel = bufferedSubject1.combineLatest(subject2)
    .sink(receiveCompletion: { c in
        print(c)
    }, receiveValue: { v in
        print(v)
    })

subject2.send("X")
DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
    subject2.send("Y")
}
I get this output:
(1, "X")
(2, "X")
(3, "X")
(3, "Y")
(4, "Y")
(5, "Y")
(6, "Y")
And that seems to be the desired behavior.
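The simplest way to see the quoted PassthroughSubject behavior in isolation is a sketch like this (my own example, not from the linked post): values sent while nothing has subscribed, and therefore nothing has demanded them, are silently dropped.
import Combine

let subject = PassthroughSubject<Int, Never>()
subject.send(1) // no subscriber yet, so no demand: this value is dropped

let cancellable = subject.sink { print($0) }
subject.send(2) // prints 2
subject.send(3) // prints 3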
I don't know if it is an elegant solution, but you can try to use Publishers.CollectByTime:
import PlaygroundSupport
import Combine

PlaygroundPage.current.needsIndefiniteExecution = true

// The two subjects being combined; not declared in the original snippet,
// their types are inferred from how they are used below.
let letters = PassthroughSubject<String, Never>()
let indices = PassthroughSubject<Int, Never>()

let queue = DispatchQueue(label: "com.foo.bar")

let cancellable = letters
    .combineLatest(indices
        .collect(.byTimeOrCount(queue, .seconds(1), .max))
        .flatMap { indices in indices.publisher })
    .sink { letter, index in print("(\(index), \(letter))") }

indices.send(1)
DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
    indices.send(2)
    indices.send(3)
}
DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
    letters.send("X")
}
DispatchQueue.main.asyncAfter(deadline: .now() + 3.3) {
    indices.send(4)
}
DispatchQueue.main.asyncAfter(deadline: .now() + 3.5) {
    letters.send("Y")
}
DispatchQueue.main.asyncAfter(deadline: .now() + 3.7) {
    indices.send(5)
    indices.send(6)
}
Output:
(X, 1)
(X, 2)
(X, 3)
(Y, 3)
(Y, 4)
(Y, 5)
(Y, 6)
Algorithmically speaking, you need to:
- wait until B emits an event, collecting all elements that A emits in the meantime
- store the element you just received from B
- emit the pairs of the elements collected so far from A with that stored element
- emit the rest of the elements that A emits after B emitted its element
An implementation of the above algorithm can be done like this:
// `share` makes sure that we don't cause unwanted side effects,
// like restarting the work `A` does, as we subscribe multiple
// times to this publisher
let sharedA = a.share()

// state, state, state :)
var latestB: String!

var cancel = sharedA
    // take all elements until `B` emits
    .prefix(untilOutputFrom: b.handleEvents(receiveOutput: { latestB = $0 }))
    // wait on those elements
    .collect()
    // uncollect them
    .flatMap { $0.publisher }
    // make sure we deliver the rest of elements from `A`
    .append(sharedA)
    // now, pair the outputs together
    .map { ($0, latestB) }
    .sink(receiveValue: { print("\($0)") })
Maybe there's a way to avoid the state (latestB) and use a pure pipeline; I couldn't find it yet, though.
P.S. As an added bonus, if B is expected to emit more than one element, then with a simple change we can support this scenario too:
let sharedA = a.share()
var latestB: String!
let sharedB = b.handleEvents(receiveOutput: { latestB = $0 }).share()

var cancel = sharedA.prefix(untilOutputFrom: sharedB)
    .collect()
    .flatMap { $0.publisher }
    .append(sharedA)
    .map { ($0, latestB) }
    .sink(receiveValue: { print("\($0)") })

How to filter out empty flux

For example, I have the following code which creates a Mono with a list of 3 numbers: 1, 2, 3. I want to filter out the number 1, so the result would be a list of 2 numbers: 2, 3. What should I do in the flatMapMany so that it skips the number 1?
Mono.just(new LinkedList<>() {{
        add(1);
        add(2);
        add(3);
    }})
    .flatMapMany(number -> {
        if (number == 1) {
            // not return
        }
        return number;
    })
    .collectList()
    .map(numbers -> {
        // numbers should be 2,3
    })
A follow-up question: what if in my code I return Flux.empty() when the number is 1?
.flatMapMany(number -> {
    if (number == 1) {
        return Flux.empty();
    }
    return number;
})
.filter(i -> {
    // how to filter out Flux.empty() ?
})
In the filter, how can I detect whether i is an empty Flux and filter it out?
Take a look at this example. You can use flatMapIterable to convert a Mono of a list to a Flux.
Mono.just(List.of(1, 2, 3))
    .flatMapIterable(Function.identity())
    .filter(i -> i != 1)
    .collectList()
    .subscribe(s -> System.out.println(s)); // prints [2, 3]
For the follow-up question, we need to use .handle, which is a combination of the map and filter methods.
Flux.fromIterable(List.of(1, 2, 3))
    .handle((number, sink) -> {
        if (number != 1) {
            sink.next(number);
        }
    })
    .collectList()
    .subscribe(s -> System.out.println(s));
If you really want to go with your approach, then we need to filter like this
Flux.fromIterable(List.of(1, 2, 3))
    .flatMap((number) -> number == 1 ? Flux.empty() : Flux.just(number))
    .collectList()
    .subscribe(s -> System.out.println(s));

"let" in rspec doesn't write to database

The below does not write to my database so my tests fail:
let(:level_1) { Fabricate(:level, number: 1, points: 100) }
let(:level_2) { Fabricate(:level, number: 2, points: 200) }
Level.count # 0
However, the following does work
before do
  level_1 = Fabricate(:level, points: 100, number: 1)
  level_2 = Fabricate(:level, points: 200, number: 2)
end
Level.count # 2
This seems very strange.
It's because let is lazily evaluated. Meaning, only when you invoke level_1 and level_2 (inside the examples) will the blocks be executed and the records created. A workaround is to use let!, which is invoked before each example.
Try
let!(:level_1) { Fabricate(:level, number: 1, points: 100) }
let!(:level_2) { Fabricate(:level, number: 2, points: 200) }
Now, Level.count will return 2
For more, see https://www.relishapp.com/rspec/rspec-core/v/2-5/docs/helper-methods/let-and-let
