I have a couple of table view cells. They have two rate values, 200 and 300 respectively. I add these two values to an array like so:
let theRateElement = (Int(product?.theRate ?? "") ?? 0)
rateArray.append(theRateElement)
self.sumedArr = rateArray.reduce(0, +)
Now sumedArr will have the value 500.
I also have a picker text field in each table view cell. The picker gives values from 1 to 10. If I select 2 from the picker view of the first cell, that 2 will be multiplied by 200 to give 400, and if I select 3 from the picker view of the second cell, the value of 300 will become 900 (300 * 3).
I'm taking the sum of these two new values like so (in the picker's didSelectRow):
let finalAmt = fixedRate * totalItems
updatedRateArray.append(finalAmt)
self.sumedArr = updatedRateArray.reduce(0, +)
Now sumedArr will have the value (200 * 2) + (300 * 3) = 1300.
But what I want to achieve is this: if I multiply just the first value, 200, by 2 to get 400 and keep the second value of 300 as it is, then the total should be 400 + 300 = 700. How to achieve this is what I was not able to figure out.
The issue could be fixed like so:
There is an array holding the initial values, 200 and 300. The value that gets multiplied by 2, 3, etc. is always one of the values of this initial array, so that value is looked up and removed. The initial array is then left with only those elements that were not multiplied by any number.
The multiplied values are stored in a second array. Combining this array with the initial array and adding their elements gives the sum of the numbers multiplied by 2, 3, 4, etc. plus the numbers that were not multiplied, which is the desired result. See the sketch below.
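A minimal Swift sketch of that approach (array and function names are hypothetical, not from the original code):

var initialRates = [200, 300]       // rates that have not been multiplied yet
var multipliedRates: [Int] = []     // rates after a picker multiplier was applied

// Called from the picker's didSelectRow for a given cell.
func applyMultiplier(_ multiplier: Int, toRate rate: Int) {
    // Remove the base rate from the initial array so it is not counted twice.
    if let index = initialRates.firstIndex(of: rate) {
        initialRates.remove(at: index)
    }
    multipliedRates.append(rate * multiplier)
}

// The total is the sum of both arrays: multiplied values plus untouched ones.
func currentTotal() -> Int {
    return (initialRates + multipliedRates).reduce(0, +)
}

// applyMultiplier(2, toRate: 200) makes currentTotal() == 400 + 300 == 700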
I am a beginner in Google Sheets and I couldn't get around this formula. I have a range of cells and I want to subtract, starting from the last non-empty cell, every value before it (Z-A).
Values are updated in columns C, D, E and so on. I want to get the last non-empty cell (from the right) and subtract the values moving backward (to the left), like this:
sub = 10 (column G) - 0 (column F) - 10 (column E) - 0 (column D) - 10 (column C)
Can we devise a formula which gets the last non-empty cell and subtracts the values back to the first one? Here is the link to the sample sheet. Thank you.
try:
=LOOKUP(1, INDEX(1/(C2:F2<>"")), C2:F2)-(SUM(C2:F2)-
 LOOKUP(1, INDEX(1/(C2:F2<>"")), C2:F2))
Here LOOKUP(1, INDEX(1/(C2:F2<>"")), C2:F2) returns the last non-empty value of the range, so the formula subtracts everything else (the total sum minus that last value) from it.
Suggestion: Use a custom function
You may use the following script as a custom function to get the difference between the value of the last non-empty cell and the sum of the cells before it:
function SUBTRACTFROMLASTNUMBER(data) { // you can rename the custom function
  // keep only the non-empty cells; strict comparison so 0 values are not dropped
  var data2 = data[0].filter(x => x !== "");
  var lastNumber = data2.pop(); // last non-empty value
  var sum = data2.reduce((total, x) => total + x, 0); // sum of the remaining values
  return lastNumber - sum; // returns the output
}
This custom function takes the selected range from the sheet, filters out the empty cells using filter(), separates the value of the last cell using pop(), sums the remaining data, and subtracts that sum from the last value.
Usage
You may use this function as:
=SUBTRACTFROMLASTNUMBER(<range>)
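For the sample data above (row 2, columns C through G), that would be, for example:
=SUBTRACTFROMLASTNUMBER(C2:G2)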
In an iOS 14 widget, I want to show values in graphical form using a bar chart.
I have used this library, https://github.com/dawigr/BarChart, to draw the bar chart.
But in Xcode 12+, it doesn't show negative values; it treats a negative value as 0 and throws the warning shown in the screenshot:
"[SwiftUI] Invalid frame dimension (negative or non-finite)"
You could try to normalise your input values to prevent errors like this.
E.g. if your data set contains values from -10 to 100, your minimum normalised value would be 0 and your maximum normalised value 1. This only works if your numbers are CGFloat, Double or something similar; with Int the division would be truncated to 0.
This could be done by using an extension like this:
extension Array where Element == Double {
    // Maps the values linearly into the 0...1 range.
    var normalized: [Double] {
        // max > min guards against division by zero for constant data
        if let min = self.min(), let max = self.max(), max > min {
            return self.map { ($0 - min) / (max - min) }
        }
        return []
    }
}
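As a quick sanity check (values chosen to match the -10 to 100 example above, not from the original answer):

let sample: [Double] = [-10, 0, 100]
print(sample.normalized) // [0.0, 0.0909..., 1.0]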
I don't know how you get your values for the frame exactly, but I think you did something like this:
// normalise your data set:
let data: [Double] = [Double]()
yourChart(inputData: data.normalized)
// get values for the frame
let height = data.normalized.max()
// If the normalised value is too small for your purpose (the max will always be 1.0,
// but in relation to the rest it might fit), you can scale it up, e.g. height * 20.
// The width might be a fixed value that you enter manually, or it may depend on the
// count of bars in your chart.
I am using the SciChart API for 2D charts from SciChart, and I wanted to know if it is possible to get the min and max visible indices from ISCIAxisCore.
I need to get this information from the callback made to SCIVisibleRangeChangeListener when the visible range has changed, so I can calculate some extended information. All the data is stored in a few arrays, the graph only shows a section of it, and I need to show some average values based on the visible range of the graph.
I know I could use the Swift API to get the index out of the array, but that seems like the most inefficient way of getting the visible min and max index of the data set, as it would have to search a data set that can span more than 5k records.
I suspect you are looking for one of the following:
search index range directly on the DataSeries, using -getIndicesXRange:xCoordinateCalculator:
search index in the underlying ISCIList, via the -findIndexOf:searchMode:isSorted: on the dataSeries.xValues
I added the following listener, with prints, to our Line Chart example; it showcases how to use getIndicesXRange:
xAxis.visibleRangeChangeListener = { (axis, oldRange, newRange, animated) in
    guard animated == false else { return }
    if let axis = axis, let min = newRange?.minAsDouble, let max = newRange?.maxAsDouble {
        let indicesRange = SCIIndexRange()
        dataSeries.getIndicesXRange(indicesRange, xCoordinateCalculator: axis.currentCoordinateCalculator)
        print("Values: \(min) : \(max)")
        print("Min: \(indicesRange.min), Max \(indicesRange.max)")
    }
}
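From there, computing the averages you mentioned could look roughly like this, placed inside the same listener (a sketch: yValues stands for your backing data array, and I'm assuming indicesRange.min/max convert to Int):

let yValues: [Double] = [] // hypothetical backing array for the series
let lower = Int(indicesRange.min)
let upper = Int(indicesRange.max)
if lower <= upper, upper < yValues.count {
    // Average only the points that are currently visible.
    let visible = yValues[lower...upper]
    let average = visible.reduce(0, +) / Double(visible.count)
    print("Average of visible range: \(average)")
}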
Hope that helps.
How can I increase a value just after I return a cell in my tableView?
var indexDigit1 = 0
var indexDigit2 = 1
cell.detailTextLabel?.text = "\(array[indexDigit1])"+"\(array[indexDigit2])"
indexDigit1 = indexDigit1 + 2
indexDigit2 = indexDigit2 + 2
return cell
But I'd like indexDigit1 and indexDigit2 to change for every cell in my tableView.
So, for example: indexDigit1 should be 2 and indexDigit2 should be 3 for the second cell, 4 and 5 for the third, etc., so that the value in the label changes!
Where is this code living? It seems like you want indexDigit1 to be twice the row index (the second cell is indexPath.row == 1, giving 2) and indexDigit2 to be twice the row index plus one (giving 3). You don't even need a variable for that; you can just use cellForRowAt's indexPath.row property, as in the sketch below.
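A minimal sketch of that (the cell identifier and array name are placeholders):

func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
    let indexDigit1 = indexPath.row * 2       // 0, 2, 4, ...
    let indexDigit2 = indexPath.row * 2 + 1   // 1, 3, 5, ...
    cell.detailTextLabel?.text = "\(array[indexDigit1])\(array[indexDigit2])"
    return cell
}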
If I understand what you need, you can create an array in which the position (index) represents the row and the element stores the value you need; in your case, indexDigit1 is the position and the value is 0.
For example:
Let's say your table has only 10 cells, thus:
array = [Int](repeating: 0, count: 10)
For the technical part you will use the method cellForRowAtIndexPath and assign the value you need for the indexPath.row you need:
array[indexPath.row] = theValueYouNeed
Also, it looks like you may need some sort of formula for the value, maybe theValueYouNeed += 2 or something more complicated that suits your needs.
Now, you can extract the value you need for each indexPath.row:
let theValueYouNeedForTheCorrespondingRow = array[indexPath.row]
A little explanation: I have an app with 5 text fields, and I already have the code to compute their average, using floats and a variable for each text field: the sum of the values of all the text fields / 5 (divided by 5).
My problem is that when a text field is left empty, the sum should not be divided by 5; if the user fills in only 4 fields, it should be divided by 4, and so on.
Question: how do I divide the sum depending on how many fields have content?
I was trying with textfield.text.length > 0 in an if statement, but I can't get it to work.
Thanks, I hope I was clear.
You could check each text field and see if the text is nil or blank (@"").
Use an int to keep track of the number of text fields that have a value.
For example:
int counter = 0;
float total = 0.0;

if ([textfield1.text length] > 0)
{
    counter++; // one more non-empty field
    float thisValue = [textfield1.text floatValue];
    total = total + thisValue;
}
Repeat this for all 5 text fields and then divide your total by the counter variable.
You could put the if statement in a loop if you are going to vary the number of text fields, or simply to reduce the amount of code you write.
EDIT: I have changed my answer, based on the good suggestions of others, to use text.length.
Try counting the number of used text fields by increasing a counter every time you read a text field that has valid text in it, and divide the result by that number. (Logically, the counter should be initialized to 0.)
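The question is Objective-C, but here is a rough Swift sketch of that same counting idea (textFields is a hypothetical array holding your five text fields):

import UIKit

let textFields: [UITextField] = [] // your five text fields

var total: Float = 0
var counter = 0
for field in textFields {
    // Count only fields that contain a parseable number.
    if let text = field.text, !text.isEmpty, let value = Float(text) {
        total += value
        counter += 1
    }
}
let average = counter > 0 ? total / Float(counter) : 0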