Dart: how to convert a column letter into a number

I'm currently using Dart with the gsheets_api package, which doesn't seem to have a function to convert column letters to numbers (column index).
As an example, this is what I use with Apps Script (input: column number, output: column letter; the inverse of the conversion I need):
function Column_Nu_to_Letter(column_nu) {
  var temp, letter = '';
  while (column_nu > 0) {
    temp = (column_nu - 1) % 26;
    letter = String.fromCharCode(temp + 65) + letter;
    column_nu = (column_nu - temp - 1) / 26;
  }
  return letter;
}
This is the code I came up with for Dart. It works, but I am sure there is a more elegant or correct way to do it.
String colLetter = 'L'; // column 'L' as an example
int c = "A".codeUnitAt(0);
int end = "Z".codeUnitAt(0);
int counter = 1;
while (c <= end) {
  //print(String.fromCharCode(c));
  if (colLetter == String.fromCharCode(c)) {
    print('Conversion $colLetter = $counter');
  }
  counter++;
  c++;
}
// This outputs: Conversion L = 12
Do you have any suggestions on how to improve this code?

First we need to agree on the meaning of the letters.
I believe the traditional approach is "A" is 1, "Z" is 26, "AA" is 27, "AZ" is 52, "BA" is 53, etc.
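For example, treating each letter as a digit in bijective base 26:
"AZ" = 1 * 26 + 26 = 52
"BA" = 2 * 26 + 1 = 53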
Then I'd probably go with something like these functions for converting:
int lettersToIndex(String letters) {
  var result = 0;
  for (var i = 0; i < letters.length; i++) {
    result = result * 26 + (letters.codeUnitAt(i) & 0x1f);
  }
  return result;
}
String indexToLetters(int index) {
  if (index <= 0) throw RangeError.range(index, 1, null, "index");
  const _letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
  if (index < 27) return _letters[index - 1];
  var letters = <String>[];
  do {
    index -= 1;
    letters.add(_letters[index.remainder(26)]);
    index ~/= 26;
  } while (index > 0);
  return letters.reversed.join("");
}
The former function doesn't validate that the input contains only letters, but it works correctly for any string that does (and it ignores case as a bonus).
The latter does check that the index is greater than zero.
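A quick round-trip check of the two functions above:
void main() {
  print(lettersToIndex('L'));  // 12
  print(lettersToIndex('AA')); // 27
  print(lettersToIndex('az')); // 52 (case is ignored)
  print(indexToLetters(53));   // BA
}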

A simplified version based on lrn's answer
int lettersToIndex(String letters) =>
    letters.codeUnits.fold(0, (v, e) => v * 26 + (e & 0x1f));
String indexToLetters(int index) {
  var letters = '';
  do {
    index -= 1; // shift to 0-based so multiples of 26 land on 'Z'
    letters = '${String.fromCharCode(65 + index % 26)}$letters';
    index ~/= 26;
  } while (index > 0);
  return letters;
}
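The index -= 1 inside the loop is what keeps multiples of 26 correct (26 itself must map to Z, not wrap to a stray character). A quick sanity check:
void main() {
  print(indexToLetters(12)); // L
  print(indexToLetters(26)); // Z
  print(indexToLetters(27)); // AA
}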

Related

Unable to understand firstTerm = secondTerm; secondTerm = nextTerm; in Fibonacci series

class Main {
  public static void main(String[] args) {
    int n = 5, firstTerm = 0, secondTerm = 1;
    System.out.println("Fibonacci Series till " + n + " terms:");
    for (int i = 1; i <= n; ++i) {
      System.out.print(firstTerm + " ");
      // compute the next term
      int nextTerm = firstTerm + secondTerm;
      firstTerm = secondTerm;
      secondTerm = nextTerm;
    }
  }
}
// Q) I am unable to understand why the statements firstTerm = secondTerm; and secondTerm = nextTerm; are written. Can anyone explain this concept?
The Fibonacci sequence is defined by
F(0) = 0 // this is the first term
F(1) = 1 // this is the second term
F(n) = F(n - 1) + F(n - 2)
To calculate a term that is neither the first nor the second, we need to sum the two previous terms.
This is why, while iterating, the second term's value is assigned to the first term, and the newly computed nextTerm becomes the second term.
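Tracing the loop for n = 5 makes the shuffling visible:
i = 1: print 0, nextTerm = 0 + 1 = 1, firstTerm = 1, secondTerm = 1
i = 2: print 1, nextTerm = 1 + 1 = 2, firstTerm = 1, secondTerm = 2
i = 3: print 1, nextTerm = 1 + 2 = 3, firstTerm = 2, secondTerm = 3
i = 4: print 2, nextTerm = 2 + 3 = 5, firstTerm = 3, secondTerm = 5
i = 5: print 3
Output: 0 1 1 2 3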

How to convert a large number to shortened K/M/B form in Dart

How can I create a function that converts a large number into a shortened number with a suffix character in Dart?
like
1000 => 1K
10000 => 10K
1000000 => 1M
10000000 => 10M
1000000000 => 1B
There is a ready-made formatter in the intl package that can be used, and it's simple:
var f = NumberFormat.compact(locale: "en_IN");
print(f.format(12345));
To make it a method:
String getShortForm(num number) {
  var f = NumberFormat.compact(locale: "en_US");
  return f.format(number);
}
For this to work, add the import:
import 'package:intl/intl.dart';
Refer to this doc for more: https://pub.dev/documentation/intl/latest/intl/NumberFormat-class.html
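For example (exact abbreviations can vary by locale and intl version, so treat the expected output as approximate):
import 'package:intl/intl.dart';

void main() {
  final f = NumberFormat.compact(locale: 'en_US');
  print(f.format(1200));    // 1.2K
  print(f.format(1000000)); // 1M
}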
If you are looking for the hard way:
String getShortForm(int number) {
  var shortForm = "";
  if (number < 1000) {
    shortForm = number.toString();
  } else if (number < 1000000) {
    shortForm = (number / 1000).toStringAsFixed(1) + "K";
  } else if (number < 1000000000) {
    shortForm = (number / 1000000).toStringAsFixed(1) + "M";
  } else if (number < 1000000000000) {
    shortForm = (number / 1000000000).toStringAsFixed(1) + "B";
  }
  return shortForm;
}
String toString(int value) {
  const units = <int, String>{
    1000000000: 'B',
    1000000: 'M',
    1000: 'K',
  };
  return units.entries
      .map((e) => '${value ~/ e.key}${e.value}')
      .firstWhere((e) => !e.startsWith('0'), orElse: () => '$value');
}
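This works because map on an iterable is lazy, so firstWhere stops at the first suffix whose quotient is non-zero; note that ~/ truncates rather than rounds. A few sample calls:
void main() {
  print(toString(950));        // 950
  print(toString(1234));       // 1K
  print(toString(2500000));    // 2M
  print(toString(7100000000)); // 7B
}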
A simpler approach, if you only need the suffix (this version assumes the input is at least 1000):
String getSuffix(int t) {
  int i = -1;
  for (; (t ~/= 1000) > 0; i++);
  return ['K', 'M', 'B'][i];
}
Edit
This is the mathematical way to do it, and it compiles. The point is that you are counting the number of groups of three decimal digits:
x 000 - 1
x 000 000 - 2
and so on, which is log base 1000 of the number, i.e. log(n) / log(1000).
import 'dart:math';

String getSuffix(int n) {
  int i = (log(n) / log(1000)).truncate();
  return (n / pow(1000, i)).truncate().toString() + [' ', 'K', 'M', 'B'][i];
}
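Sample output, assuming inputs below a trillion (beware that floating-point rounding in log can misfire right at exact powers of 1000):
void main() {
  print(getSuffix(999));        // "999 " (blank suffix)
  print(getSuffix(25000));      // 25K
  print(getSuffix(3200000000)); // 3B
}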
The Intl package does this as "compact" numbers, but it has a fixed format and it will also change with different locales, which might or might not be what you want.
Make a class and use its static method everywhere.
class NumberFormatter {
  static String formatter(String currentBalance) {
    try {
      // suffixes: k = thousand, M = million, B = billion, T = trillion
      final value = double.parse(currentBalance);
      if (value < 1000) { // less than a thousand
        return value.toStringAsFixed(2);
      } else if (value < 1000000) { // less than a million
        return (value / 1000).toStringAsFixed(2) + "k";
      } else if (value < 1000000000) { // less than a billion
        return (value / 1000000).toStringAsFixed(2) + "M";
      } else if (value < 1000000000000) { // less than a trillion
        return (value / 1000000000).toStringAsFixed(2) + "B";
      } else {
        return (value / 1000000000000).toStringAsFixed(2) + "T";
      }
    } catch (e) {
      print(e);
      return currentBalance;
    }
  }
}
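Usage of the class above (sample values; the output follows from the thresholds in formatter):
void main() {
  print(NumberFormatter.formatter('950'));        // 950.00
  print(NumberFormatter.formatter('12500'));      // 12.50k
  print(NumberFormatter.formatter('3400000'));    // 3.40M
  print(NumberFormatter.formatter('9800000000')); // 9.80B
}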

Find longest common substring of array of Strings

In my Swift 3.0 app, I want to determine the best name for something by finding the longest common substring of 6 to 12 strings.
Example strings:
ON/OFF office lights
DIM office lights
VALUE office lights
FB office lights
FB VALUE office lights
Desired output:
office lights
I've come across multiple StackOverflow answers for the longest subsequence but haven't been able to adapt any of them to my needs.
Any help would be greatly appreciated!
I converted Java & C++ code into Swift 3, collected from the GeeksforGeeks articles on Longest Common Subsequence and Longest Common Substring.
It works!
class LongestCommon {
    // Returns the longest common subsequence of X[0..m-1] and Y[0..n-1]
    private static func lcSubsequence(_ X: String, _ Y: String) -> String {
        let m = X.characters.count
        let n = Y.characters.count
        var L = Array(repeating: Array(repeating: 0, count: n + 1), count: m + 1)
        // The following steps build L[m+1][n+1] in bottom-up fashion. Note
        // that L[i][j] contains the length of the LCS of X[0..i-1] and Y[0..j-1]
        for i in stride(from: 0, through: m, by: 1) {
            for j in stride(from: 0, through: n, by: 1) {
                if i == 0 || j == 0 {
                    L[i][j] = 0
                } else if X[X.index(X.startIndex, offsetBy: (i - 1))] == Y[Y.index(Y.startIndex, offsetBy: (j - 1))] {
                    L[i][j] = L[i-1][j-1] + 1
                } else {
                    L[i][j] = max(L[i-1][j], L[i][j-1])
                }
            }
        }
        // The following code builds the LCS string itself
        var index = L[m][n]
        var lcs = ""
        // Start from the right-most, bottom-most corner and
        // one by one store characters in lcs
        var i = m
        var j = n
        while i > 0 && j > 0 {
            // If the current characters in X and Y are the same, then
            // the current character is part of the LCS
            if X[X.index(X.startIndex, offsetBy: (i - 1))] == Y[Y.index(Y.startIndex, offsetBy: (j - 1))] {
                lcs.append(X[X.index(X.startIndex, offsetBy: (i - 1))])
                i -= 1
                j -= 1
                index -= 1
            }
            // If not the same, then find the larger of the two and
            // go in the direction of the larger value
            else if L[i-1][j] > L[i][j-1] {
                i -= 1
            } else {
                j -= 1
            }
        }
        // return the lcs, reversed back into reading order
        return String(lcs.characters.reversed())
    }

    // Returns the longest common substring of X[0..m-1] and Y[0..n-1]
    private static func lcSubstring(_ X: String, _ Y: String) -> String {
        let m = X.characters.count
        let n = Y.characters.count
        var L = Array(repeating: Array(repeating: 0, count: n + 1), count: m + 1)
        var result: (length: Int, iEnd: Int, jEnd: Int) = (0, 0, 0)
        // The following steps build L[m+1][n+1] in bottom-up fashion. Note
        // that L[i][j] contains the length of the common suffix of X[0..i-1] and Y[0..j-1]
        for i in stride(from: 0, through: m, by: 1) {
            for j in stride(from: 0, through: n, by: 1) {
                if i == 0 || j == 0 {
                    L[i][j] = 0
                } else if X[X.index(X.startIndex, offsetBy: (i - 1))] == Y[Y.index(Y.startIndex, offsetBy: (j - 1))] {
                    L[i][j] = L[i-1][j-1] + 1
                    if result.length < L[i][j] {
                        result.length = L[i][j]
                        result.iEnd = i
                        result.jEnd = j
                    }
                } else {
                    L[i][j] = 0
                }
            }
        }
        // Slice the substring out of X using the recorded end position and length
        let lcs = X.substring(with: X.index(X.startIndex, offsetBy: result.iEnd - result.length)..<X.index(X.startIndex, offsetBy: result.iEnd))
        return lcs
    }

    // driver methods
    class func subsequenceOf(_ strings: [String]) -> String {
        var answer = strings[0] // for a single string, the answer is the string itself
        for i in stride(from: 1, to: strings.count, by: 1) {
            answer = lcSubsequence(answer, strings[i])
        }
        return answer
    }

    class func substringOf(_ strings: [String]) -> String {
        var answer = strings[0] // for a single string, the answer is the string itself
        for i in stride(from: 1, to: strings.count, by: 1) {
            answer = lcSubstring(answer, strings[i])
        }
        return answer
    }
}
Usage:
let strings = ["ON/OFF office lights",
               "DIM office lights",
               "VALUE office lights",
               "FB office lights",
               "FB VALUE office lights"]
print(LongestCommon.subsequenceOf(strings))
print(LongestCommon.substringOf(strings))
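Under these assumptions, both calls should print the common tail " office lights" (with a leading space, since the space before "office" is shared by all five strings); trim the result if you need exactly "office lights".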

Google Sheet formula for cumulative sum with condition

I have a Google Sheet with the following layout:
Number | Counted? | Cumulative Total
4 | Y | 4
2 | | 6
9 | Y | 15
... | ... | ...
The first cell in the Cumulative Total column is populated with this formula:
=ArrayFormula(SUMIF(ROW(C2:C1000),"<="&ROW(C2:C1000),C2:C1000))
However this sums all rows in the Number column. How can I make the Cumulative Total only count rows where the Counted? cell is Y?
Try this in C2 and copy down:
= N(C1) + A2 * (B2 = "Y")
(N(C1) returns the numeric value of the cell above, which is 0 for the header row, and the comparison B2 = "Y" is coerced to 1 or 0, so a row is only added when it is marked Y.)
Update
This doesn't seem to work with SUMIFS, but there is a very slow matrix-multiplication alternative. It multiplies a lower-triangular matrix of 1s by the filtered values, so each row receives the sum of all qualifying rows at or above it:
=ArrayFormula(MMult((Row(2:1000)>=Transpose(Row(2:1000)))*Transpose(A2:A1000*(B2:B1000="Y")), Row(2:1000)^0))
Assuming "Number" in column A and "Counted?" in column B, try in C1
={"SUM"; ArrayFormula(if(ISBLANK(B2:B),,mmult(transpose(if(transpose(row(B2:B))>=row(B2:B), if(B2:B="Y", A2:A,0), 0)),row(B2:B)^0)))}
(Change ranges to suit).
Example, using a custom function:
=INDEX(IF(B3:B="","", runningTotal(B3:B,1,,A3:A)))
The code:
/**
 * Get running total for the array of numbers
 * by makhrov.max@gmail.com
 *
 * @param {array} numbers The array of numbers
 * @param {number} total_types (1-default) sum, (2) avg, (3) min, (4) max, (5) count;
 *   1-d array or number
 * @param {number} limit Number of last values to count next time.
 *   Set to 0 (default) to take all values
 * @param {array} keys (optional) array of keys. Function will group result by keys
 * @return The running total(s) for the array of numbers
 * @customfunction
 */
function runningTotal(numbers, total_types, limit, keys) {
  // possible types to return
  var oTypes = {
    '1': 'sum',
    '2': 'avg',
    '3': 'min',
    '4': 'max',
    '5': 'count'
  };
  // checks and defaults
  var errPre = '🥴 ';
  if (typeof numbers != "object") {
    numbers = [[numbers]];
  }
  total_types = total_types || [1];
  if (typeof total_types != "object") {
    total_types = [total_types];
  }
  if (keys && typeof keys != "object") {
    keys = [[keys]];
  }
  if (keys) {
    if (numbers.length !== keys.length) {
      throw errPre + 'Numbers(' + numbers.length +
        ') and keys(' + keys.length + ') are of different length';
    }
  }
  // assign types
  var types = [], type, k;
  for (var i = 0; i < total_types.length; i++) {
    k = '' + total_types[i];
    type = oTypes[k];
    if (!type) {
      throw errPre + 'Unknown total_type = ' + k;
    }
    types.push(type);
  }
  limit = limit || 0;
  if (isNaN(limit)) {
    throw errPre + '`limit` is not a Number!';
  }
  limit = parseInt(limit);
  // calculating running totals
  var result = [],
      subres = [],
      nodes = {},
      node,
      key = '-',
      val;
  var defaultNode_ = {
    values: [],
    count: 0,
    sum: 0,
    max: null,
    min: null,
    avg: null,
    maxA: Number.MIN_VALUE,
    maxB: Number.MIN_VALUE,
    maxC: Number.MIN_VALUE,
    minA: Number.MAX_VALUE,
    minB: Number.MAX_VALUE,
    minC: Number.MAX_VALUE
  };
  for (var i = 0; i < numbers.length; i++) {
    val = numbers[i][0];
    // find the correct node
    if (keys) { key = keys[i][0]; }
    node = nodes[key] ||
      JSON.parse(JSON.stringify(defaultNode_));
    /**
     * For finding the running max/min;
     * source of the algorithm:
     * https://www.geeksforgeeks.org/sliding-window-maximum-maximum-of-all-subarrays-of-size-k/
     */
    // max
    // reset the first, second and third largest elements
    // in response to new incoming elements
    if (node.maxA < val) {
      node.maxC = node.maxB;
      node.maxB = node.maxA;
      node.maxA = val;
    } else if (node.maxB < val) {
      node.maxC = node.maxB;
      node.maxB = val;
    } else if (node.maxC < val) {
      node.maxC = val;
    }
    // min
    if (node.minA > val) {
      node.minC = node.minB;
      node.minB = node.minA;
      node.minA = val;
    } else if (node.minB > val) {
      node.minC = node.minB;
      node.minB = val;
    } else if (node.minC > val) {
      node.minC = val;
    }
    // if the limit is exceeded
    if (limit !== 0 && node.count === limit) {
      // if the first biggest we found earlier
      // matches the element that needs to be
      // removed from the subarray
      if (node.values[0] == node.maxA) {
        // reset first biggest to second and second to third
        node.maxA = node.maxB;
        node.maxB = node.maxC;
        node.maxC = Number.MIN_VALUE;
        if (val <= node.maxB) {
          node.maxC = val;
        }
      } else if (node.values[0] == node.maxB) {
        node.maxB = node.maxC;
        node.maxC = Number.MIN_VALUE;
        if (val <= node.maxB) {
          node.maxC = val;
        }
      } else if (node.values[0] == node.maxC) {
        node.maxC = Number.MIN_VALUE;
        if (val <= node.maxB) {
          node.maxC = val;
        }
      } else if (node.values[0] == node.minA) {
        // reset first smallest to second and second to third
        node.minA = node.minB;
        node.minB = node.minC;
        node.minC = Number.MAX_VALUE;
        if (val > node.minB) {
          node.minC = val;
        }
      }
      if (node.values[0] == node.minB) {
        node.minB = node.minC;
        node.minC = Number.MAX_VALUE;
        if (val > node.minB) {
          node.minC = val;
        }
      }
      if (node.values[0] == node.minC) {
        node.minC = Number.MAX_VALUE;
        if (val > node.minB) {
          node.minC = val;
        }
      }
      // sum
      node.sum -= node.values[0];
      // delete the first value
      node.values.shift();
      // start a new counter
      node.count = limit - 1;
    }
    // add new values
    node.count++;
    node.values.push(val);
    node.sum += val;
    node.avg = node.sum / node.count;
    node.max = node.maxA;
    node.min = node.minA;
    // remember entered values for the next loop
    nodes[key] = node;
    // get the result depending on
    // the selected total_types
    subres = [];
    for (var t = 0; t < types.length; t++) {
      subres.push(node[types[t]]);
    }
    result.push(subres);
  }
  // console.log(JSON.stringify(nodes, null, 4));
  return result;
}

Character histogram not scaling correctly

I'm trying to make a toString method that prints out a histogram showing how often each character of the alphabet is used in a string. The most frequent character has to be 60 #s long, with the rest of the characters then scaled to match.
My issue is with the equation that scales the rest of the letters to the correct length for the histogram. My current equation is (myArray[i] / max) * 60, but I'm getting really weird results.
If I put in "hello world" to be analyzed, l would be the most commonly occurring letter, seen 3 times. So l should have 60 #s for the histogram, h should have 20, o should have 40, etc. Instead I'm getting results like:
d : 10
e : 10
h : 10
l : 360
o : 20
r : 10
w : 10
Sorry for how sloppy this is right now, I'm just trying to figure out what's going on.
public class LetterCounter {
    private static int[] alphabetArray;
    private static String input;

    /**
     * Constructor for objects of class LetterCounter
     */
    public LetterCounter() {
        alphabetArray = new int[26];
    }

    public void countLetters(String input) {
        this.input = input;
        this.input.toLowerCase();
        //String s = input;
        //s.toLowerCase();
        for (int i = 0; i < input.length(); i++) {
            char ch = input.charAt(i);
            if (ch >= 97 && ch <= 122) {
                alphabetArray[ch - 'a']++;
            }
        }
    }

    public void getTotalCount() {
        for (int i = 0; i < alphabetArray.length; i++) {
            if (alphabetArray[i] >= 0) {
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
    }

    public void reset() {
        for (int i = 0; i < alphabetArray.length; i++) {
            if (alphabetArray[i] >= 0) {
                alphabetArray[i] = 0;
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
    }

    public String toString() {
        String s = "";
        int max = alphabetArray[0];
        int markCounter = 0;
        for (int i = 0; i < alphabetArray.length; i++) {
            // finds the largest number of occurrences for any letter in the string
            if (alphabetArray[i] > max) {
                max = alphabetArray[i];
            }
        }
        for (int i = 0; i < alphabetArray.length; i++) {
            // trying to scale the rest of the characters down here
            if (alphabetArray[i] > 0) {
                markCounter = (alphabetArray[i] / max) * 60;
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i] + markCounter);
            }
        }
        for (int i = 0; i < alphabetArray.length; i++) {
            // prints the whole alphabet, total number of occurrences for all chars
            if (alphabetArray[i] >= 0) {
                char ch = (char) (i + 97);
                System.out.println(ch + " : " + alphabetArray[i]);
            }
        }
        return s;
    }
}
There are many, many problems with your code, but let's go through them one by one.
First of all, your print statement is simply misleading. Change it to
System.out.println(ch + " : " + alphabetArray[i] + " " + markCounter);
and you will see
d : 1 0
e : 1 0
h : 1 0
l : 3 60
o : 2 0
r : 1 0
w : 1 0
As you can see, the counters are correct (1, 1, 1, 3, 2, 1, 1), but the scaling doesn't work:
1 / 3 --> 0 (integer division), and 0 * 60 is still 0
3 / 3 --> 1, and 1 * 60 is 60
Without a space in the print statement, the count and the scaled value run together: 1 and 0 print as 10, and 3 and 60 as 360.
Thus, to get correct scaling, just reorder so the multiplication happens before the division:
markCounter = alphabetArray[i] * 60 / max;
Other things worth mentioning:
You are overriding toString(), so you should put @Override in front of that method.
toLowerCase() returns a new string in lower case; calling it without assigning the result back to your string just throws away the lower-casing.
toString() shouldn't print to the console. The whole idea is that you put all the information into the string that you return. In other words, in the end you do something like System.out.println(someLetterCounter.toString()).
Your code is extremely low-level. You don't have to iterate arrays by index; you can write for (int count : alphabetArray) instead.
You might want to read about Map. If you used a Map<Character, Integer> where the key represents the different characters and the value represents a counter for each character, you could throw out most of your code and come up with a solution that requires only a few lines (see the sketch below).
(And seriously: because of all these issues, debugging your code was much harder than it needed to be.)
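As a minimal sketch of that map-based idea (written in Dart, the language of this page's main question, and assuming plain ASCII input):
Map<String, int> countLetters(String input) {
  final counts = <String, int>{};
  for (final code in input.toLowerCase().codeUnits) {
    if (code >= 0x61 && code <= 0x7a) { // 'a'..'z'
      final ch = String.fromCharCode(code);
      counts[ch] = (counts[ch] ?? 0) + 1;
    }
  }
  return counts;
}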
countLetters has some issues. You cannot convert a String to lowercase by just calling
this.input.toLowerCase();
because String is immutable in Java. You have to assign the result:
this.input = input.toLowerCase();
Another problem is that you are using the input variable from the parameter instead of this.input, which holds the lower-cased string. You can write countLetters this way to make it work:
public void countLetters(String input) {
    this.input = input.toLowerCase();
    for (int i = 0; i < this.input.length(); i++) {
        char ch = this.input.charAt(i);
        if (ch >= 97 && ch <= 122) {
            alphabetArray[ch - 'a']++;
        }
    }
}
