I am trying to find the entry with the highest value in a Map, where the keys are different names. In this case highestValue should be "Luke":
String? highestValue;
Map<Object?, Object?>? points =
{
"Vader": 40,
"Obi-Wan": 20,
"Luke": 50,
};
highestValue = ...
thanks
Your map is terribly typed, so I am not sure what situations we need to take into account here. E.g. what should happen if a value is not an int? But I have made the following example of how you could do it:
void main() {
String? highestValue;
Map<Object?, Object?>? points =
{
"Vader": 40,
"Obi-Wan": 20,
"Luke": 50,
};
highestValue = points.entries.reduce((a, b) {
final aValue = a.value;
final bValue = b.value;
if (aValue is! int) {
return b;
}
if (bValue is! int) {
return a;
}
return aValue > bValue ? a : b;
}).key as String?;
print(highestValue); // Luke
}
But as you can see, a lot of the logic comes from the fact that your map contains Object? objects and not a more specific type like int.
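For comparison, here is a minimal sketch of the same idea assuming the map can be declared as Map<String, int>, where all of the type checks disappear:
void main() {
  final points = <String, int>{
    "Vader": 40,
    "Obi-Wan": 20,
    "Luke": 50,
  };
  // With int values, reduce can compare directly.
  final highest =
      points.entries.reduce((a, b) => a.value > b.value ? a : b).key;
  print(highest); // Luke
}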
Extra solution added after request in comment
If we want to extract the second biggest, we are getting into territory where it makes more sense to just generate a list and then sort it.
So we can do something like this where we do this based on the map:
void main() {
String? highestValue;
Map<Object?, Object?>? points = {
"Vader": 40,
"Obi-Wan": 20,
"Obi-Wan": 20,
"Luke": 50,
};
final sortedEntries = points.entries.toList()
..sort((entry1, entry2) {
final aValue = entry1.value;
final bValue = entry2.value;
if (aValue is! int) {
return 1;
}
if (bValue is! int) {
return -1;
}
return bValue.compareTo(aValue);
});
sortedEntries
.forEach((entry) => print('Key: ${entry.key} | Value: ${entry.value}'));
// Key: Luke | Value: 50
// Key: Vader | Value: 40
// Key: Obi-Wan | Value: 20
String nameOfTheSecondLargest = sortedEntries[1].key as String;
print(nameOfTheSecondLargest); // Vader
}
Map<String, int>? points =
{
"Vader": 40,
"Obi-Wan": 20,
"Luke": 50,
};
String highest = points.keys.first;
int max = points.values.first;
points.forEach((key, value) {
if(max < value){
max=value;
highest = key;
}
});
You can do something like this. It is easier to understand.
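For completeness, a quick check of the result based on the snippet above:
print(highest); // Luke
print(max); // 50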
Simplest way to do this:
import:
import 'dart:math';
then
Map<Object?, Object?> points = {
"Vader": 40,
"Obi-Wan": 20,
"Luke": 50,
};
var highestValue = (points.values.toList().map((e) => e as int).reduce(max));
print(highestValue);
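Note that this prints the highest value (50) rather than the name. If the key is needed instead, a small follow-up sketch (reusing points and highestValue from above; the name winner is just for illustration) could look it up:
final winner = points.keys.firstWhere((k) => points[k] == highestValue);
print(winner); // Luke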
I'm new to Dart. I was trying to convert an integer to a Roman numeral, but it returns nothing. Can you guys help me? Here is my code sample.
This code is from the LeetCode problem section.
class Solution {
String intToRoman(int num) {
List<int> numbers = [1,4,5,9,10,40,50,90,100,400,500,900,1000];
List<String> romans = ["I","IV","V","IX","X","XL","L","XC","C","CD","D","CM", "M"];
int index = romans.length - 1;
String roman = '';
for(num >0;numbers[index]<=num;){
roman += romans[index];
num -= numbers[index];
index -= 1;
}
return roman;
}
}
Just change the logic a little bit.
Try it on DartPad: https://dartpad.dev/?id
void main() {
print (intToRoman(30)); // result: XXX
}
String intToRoman(int num) {
List<int> numbers = [1000, 900, 500, 400, 100, 90, 50, 40, 10, 9, 5, 4, 1];
List<String> romans = ["M","CM","D","CD","C","XC","L","XL","X","IX","V","IV","I"];
String roman = '';
for (int i = 0; i < numbers.length; i++) {
while (num >= numbers[i]) {
roman += romans[i];
num -= numbers[i];
}
}
return roman;
}
This solution is based on Wiki:
class Solution {
/// digit: 3=Thousands(10³), 2=Hundreds(10²), 1=Tens(10), 0=Units(1)
/// Range for roman numerals: 1...3999
static final romanNumerals = <int,Map<int,String>>{
1 : {3:'M', 2:'C', 1:'X', 0:'I'},
2 : {3:'MM', 2:'CC', 1:'XX', 0:'II'},
3 : {3:'MMM', 2:'CCC', 1:'XXX', 0:'III'},
4 : {2:'CD', 1:'XL', 0:'IV'},
5 : {2:'D', 1:'L', 0:'V'},
6 : {2:'DC', 1:'LX', 0:'VI'},
7 : {2:'DCC', 1:'LXX', 0:'VII'},
8 : {2:'DCCC', 1:'LXXX', 0:'VIII'},
9 : {2:'CM', 1:'XC', 0:'IX'},
};
/* ---------------------------------------------------------------------------- */
Solution();
/* ---------------------------------------------------------------------------- */
String intToRoman(int number) {
if (number < 1 || number >= 4000) return '';
var list = number.toString().split('').map(int.parse).toList();
var buffer = StringBuffer();
final len = list.length;
for (var i = 0; i < len; i++) {
var digit = list[i];
if (digit == 0) continue;
buffer.write(romanNumerals[digit]![len - 1 - i]);
}
return buffer.toString();
}
/* ---------------------------------------------------------------------------- */
void intToRoman2(int number) {
print(intToRoman(number));
}
}
void main(List<String> args) {
Solution()
..intToRoman2(3)
..intToRoman2(58)
..intToRoman2(1994)
;
}
Output:
III
LVIII
MCMXCIV
This code was already sent to LeetCode with the following results:
Runtime: 1130 ms, faster than 27.96% of Dart online submissions for Integer to Roman.
Memory Usage: 150.5 MB, less than 44.09% of Dart online submissions for Integer to Roman.
Why not use the simple way?
I use this extension to convert English numbers to Persian numbers:
extension StringExtensions on String {
String persianNumber() {
String number = this;
number = number.replaceAll("1", "۱");
number = number.replaceAll("2", "۲");
number = number.replaceAll("3", "۳");
number = number.replaceAll("4", "۴");
number = number.replaceAll("5", "۵");
number = number.replaceAll("6", "۶");
number = number.replaceAll("7", "۷");
number = number.replaceAll("8", "۸");
number = number.replaceAll("9", "۹");
number = number.replaceAll("0", "۰");
return number;
}
}
extension IntExtensions on int {
String persianNumber() {
String number = this.toString();
number = number.replaceAll("1", "۱");
number = number.replaceAll("2", "۲");
number = number.replaceAll("3", "۳");
number = number.replaceAll("4", "۴");
number = number.replaceAll("5", "۵");
number = number.replaceAll("6", "۶");
number = number.replaceAll("7", "۷");
number = number.replaceAll("8", "۸");
number = number.replaceAll("9", "۹");
number = number.replaceAll("0", "۰");
return number;
}
}
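A more compact variant is also possible by routing every digit through a single lookup table instead of ten replaceAll calls; a rough sketch (the names persianDigits and toPersian are made up for illustration):
const persianDigits = ['۰', '۱', '۲', '۳', '۴', '۵', '۶', '۷', '۸', '۹'];

// Replaces every ASCII digit with its Persian counterpart, leaving other characters untouched.
String toPersian(String input) => input.split('').map((c) {
  final d = int.tryParse(c);
  return d == null ? c : persianDigits[d];
}).join();

void main() {
  print(toPersian('0123456789')); // ۰۱۲۳۴۵۶۷۸۹
  print(toPersian(2024.toString())); // ۲۰۲۴
}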
I'm stuck on this common interview question using Dart. I need to return the most common character in a given string. I'm trying to create a map with a count for each character as the first step.
This is my progress so far:
main(List<String> arguments) {
maxChar('hello');
}
void maxChar(String word) {
Map<String, int> charMap = {};
int max = 0;
String maxChar = '';
word.split('').forEach((char) {
if(charMap.containsValue(char)) {
charMap[char]+1;
return;
} else {
charMap[char] = 1;
}
});
print(charMap);
}
Right now it's not even counting the correct amount of the letter 'l'. It's outputting:
{h: 1, e: 1, l: 1, o: 1}
What am I doing wrong? Is there an easier way to return the most common character in a String in Dart?
Thanks!
EDIT:
Ok, I've solved it, but surely there is a more concise way of solving this problem. See my solution below:
main(List<String> arguments) {
print(maxChar('hello'));
}
String maxChar(String word) {
Map<String, int> charMap = {};
int max = -1;
String maxChar = '';
word.split('').forEach((char) {
if(charMap.containsKey(char)) {
charMap[char]++;
return;
} else {
charMap[char] = 1;
}
});
charMap.forEach((k,v) {
if(v > max) {
max = v;
maxChar = k;
}
});
return maxChar;
}
A shorter approach to counting the characters is definitely possible:
String charCount(String chars) {
int maxChar = -1;
int maxCount = 0;
var counts = <int, int>{};
for (var char in chars.runes) {
int count = counts.update(char, (n) => n + 1, ifAbsent: () => 1);
if (count > maxCount) {
maxCount = count;
maxChar = char;
}
}
return String.fromCharCode(maxChar);
}
If you just want to count the characters, you can remove all the lines mentioning maxCount and maxChar.
I use integers to represent the characters instead of strings. That's cheaper and just as precise, and it allows you to recognize and combine Unicode UTF-16 surrogates.
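For reference, a count-only sketch along those lines, dropping the max tracking (the name charCounts is made up here):
Map<int, int> charCounts(String chars) {
  var counts = <int, int>{};
  for (var char in chars.runes) {
    // Increment the count for this rune, starting at 1 the first time it appears.
    counts.update(char, (n) => n + 1, ifAbsent: () => 1);
  }
  return counts;
}

void main() {
  charCounts('hello').forEach((rune, count) =>
      print('${String.fromCharCode(rune)}: $count')); // h: 1, e: 1, l: 2, o: 1
}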
var num1 = 10.12345678;
What should I do with num1 to drop the digits after the second decimal place without rounding its value?
I need the output to be 10.12.
import 'package:flutter/material.dart';
void main() => runApp(MaterialApp(
title: ' Delete digits after two decimal point ',
theme: ThemeData(primarySwatch: Colors.blue),
home: MyHome(),
));
class MyHome extends StatefulWidget {
@override
_MyHomeState createState() => _MyHomeState();
}
class _MyHomeState extends State<MyHome> {
@override
Widget build(BuildContext context) {
var num1 = 10.12345678;
print(num1); // I need output as 10.12
return Container();
}
}
If you want to round the number:
var num1 = 10.12345678;
var num2 = double.parse(num1.toStringAsFixed(2)); // num2 = 10.12
If you do NOT want to round the number:
Create this method:
double getNumber(double input, {int precision = 2}) =>
double.parse('$input'.substring(0, '$input'.indexOf('.') + precision + 1));
Usage:
var input = 113.39999999999999;
var output = getNumber(input, precision: 1); // 113.3
output = getNumber(input, precision: 2); // 113.39
output = getNumber(input, precision: 3); // 113.399
You can use intl package (https://pub.dartlang.org/packages/intl#-installing-tab-)
var num1 = 10.12345678;
var f = new NumberFormat("###.0#", "en_US");
print(f.format(num1));
Some answers here did not work (the top answer rounds instead of truncating).
Here is a way:
(n * 100).truncateToDouble()/100
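A runnable sketch generalizing this to an arbitrary number of places (the name truncateTo is made up here; pow comes from dart:math):
import 'dart:math';

// Drops everything after the given number of decimal places without rounding.
double truncateTo(double value, int places) {
  final mod = pow(10.0, places);
  return (value * mod).truncateToDouble() / mod;
}

void main() {
  print(truncateTo(10.12345678, 2)); // 10.12
}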
If you want to round the number, use this (pow comes from dart:math):
double mod = pow(10.0, places).toDouble();
return ((val * mod).round().toDouble() / mod);
If you just want to truncate, use this:
return val - val % 0.01;
String toFixed2DecimalPlaces(double data, {int decimalPlaces = 2}) {
List<String> values = data.toString().split('.');
if (values.length == 2 && values[0] != '0' && values[1].length >= decimalPlaces && decimalPlaces > 0)
return values[0] + '.' + values[1].substring(0, decimalPlaces);
else
return data.toString();
}
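A quick usage sketch for the helper above (expected outputs follow from the string-splitting logic):
void main() {
  print(toFixed2DecimalPlaces(113.39999999999999)); // 113.39
  print(toFixed2DecimalPlaces(7.5)); // 7.5 (fewer than two decimals, returned unchanged)
}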
You can also try this ----> (0.2055).toStringAsFixed(2)
var per = 0.2055;
Text( "result view -> ${double.parse((per * 100).toStringAsFixed(2))}%",
style: TextStyle(color: Colors.white, fontSize: 10)),
Result:
input -> 0.2055
output -> result view -> 20.55%
There is a simple solution to this problem.
double value = 17.56565656;
// as string (note: toStringAsFixed rounds)
String formatted = value.toStringAsFixed(2); // 17.57
// as double
double formattedDouble = double.parse(formatted); // 17.57
extension NoRoundingDecimal on double {
String toDecimalAsFixed(int toDecimal) {
var right;
try {
right = this.toString().split(".")[1].padRight(toDecimal, "0").substring(0, toDecimal);
} catch (e) {
right = "00";
}
var left = this.toString().split(".")[0];
double number = double.parse(left + "." + right);
return number.toStringAsFixed(toDecimal);
}
}
Example1:
double price = 71.999999999;
print("number: ${price.toDecimalAsFixed(3)}");
Result: number: 71.999
Example2:
double price = 71;
print("number: ${price.toDecimalAsFixed(3)}");
Result: number: 71.000
I am trying to put Fibonacci numbers in an array and wanted to see the array output in the playground console, but for some reason I do not see any output. Can someone please help me understand the mistake I am making in my program?
import UIKit
class FibonacciSequence {
let includesZero: Bool
let values: [Int]
init(maxNumber: Int, includesZero: Bool) {
self.includesZero = includesZero
values = [0]
var counter: Int
if (includesZero == true) { counter = 0 }
else { counter = 1 }
for counter <= maxNumber; {
if ( counter == 0 ) {
values.append(0)
counter = 1
}
else {
counter = counter + counter
values.append(counter)
}
}
println(values)
}
println(values)
return values
}
let fibanocciSequence = FibonacciSequence(maxNumber:123, includesZero: true)
@ABakerSmith has given you a good rundown of the problems in the code as-is, but you also might want to consider, instead of a class that initializes an array member variable, writing a SequenceType that returns fibonacci numbers:
struct FibonacciSequence: SequenceType {
let maxNumber: Int
let includesZero: Bool
func generate() -> GeneratorOf<Int> {
var (i, j) = includesZero ? (0,1) : (1,1)
return GeneratorOf {
(i, j) = (j, i+j)
return (i < self.maxNumber) ? i : nil
}
}
}
let seq = FibonacciSequence(maxNumber: 20, includesZero: false)
// no arrays were harmed in the generation of this for loop
for num in seq {
println(num)
}
// if you want it in array form:
let array = Array(seq)
You could of course memoize the sequence if you want to improve performance on multiple generations.
Your problem is your code has errors in it; if there are errors in your code Playgrounds won't run it and you won't get any output.
On the line for counter <= maxNumber; you've got a semi-colon, but also, I'm pretty sure you can't declare a for loop like that, unless I'm missing something? You could use a while loop though.
Why are you trying to return values from your init method?
You've declared values as a constant but are then trying to change it using append.
Using this code and fixing the errors stated does not produce the Fibonacci sequence; instead it produces: [0, 0, 2, 4, 8, 16, 32, 64, 128]
Try this code:
class FibonacciSequence {
let values: [Int]
init(maxNumber: Int, includesZero: Bool) {
var tempValues = includesZero ? [0] : [1]
var current = 1
do {
tempValues.append(current)
let nMinus2 = tempValues[tempValues.count - 2]
let nMinus1 = tempValues[tempValues.count - 1]
current = nMinus2 + nMinus1
} while current <= maxNumber
self.values = tempValues
}
}
Then create an instance:
let fibanocciSequence = FibonacciSequence(maxNumber:123, includesZero: true)
println(fibanocciSequence.values) // [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]
Hope that helps!
I'm a beginner to Swift. I'm trying to write a program that prints the nth prime, but I'm having some trouble converting the result of the sqrt function to an Int. Below is the code, which works fine in C/C++.
func nthPrime(n: Int64)
{
var i:Int64=4,j:Int64=0, prime:Int64=0
var count:Int64=0
while count != n
{
for (j=2 ; j < Int64((sqrt(i))) + 1 ; j++) //Shows error cant invoke init with argument list of type (#lvalue Int64,$T9)
{
if(i%j == 0)
{
i++
break
}
else if(j == Int64(sqrt(i)))
{
count++
i++
}
}
}
println("\(n)th prime is \(prime)")
}
Is it possible to do this kind of comparison in Swift? I know if I change the vars i and j to Double it will remove the error, but then the code won't work properly. Any other suggestions?
The sqrt method's input parameter needs to be a Double, so you need to cast it to Double. You will also need to use the math function called ceil.
In mathematics and computer science, the floor and ceiling functions
map a real number to the largest previous or the smallest following
integer, respectively.
It will result in a Double, so you need to convert the result back to an Int again. Try using it like this:
Int(ceil(sqrt(Double(i))))
//
extension Int {
var isPrime:Bool{
if self < 2 { return false }
let squareRoot = Int(sqrt(Double(self)))
if squareRoot * squareRoot == self { return false }
for i in 2..<Int(ceil(sqrt(Double(self)))) {
if self % i == 0 { return false }
}
return true
}
}
//
1.isPrime // false
2.isPrime // true
3.isPrime // true
4.isPrime // false
5.isPrime // true
6.isPrime // false
7.isPrime // true
8.isPrime // false
9.isPrime // false
10.isPrime // false
11.isPrime // true
//
let myInt = 7
if myInt.isPrime {
// do this
} else {
// do that
}
//
var twoDigitsPrimeNumbers:[Int] = []
for number in 1..<100 {
if number.isPrime {
twoDigitsPrimeNumbers.append(number)
}
}
println(twoDigitsPrimeNumbers.description) // [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97]
func nthPrime(nth:Int)-> Int {
var primeCounter = 0
var number = 2
while true {
if number.isPrime {
primeCounter++
if nth == primeCounter { return number}
}
number++
}
}
nthPrime(1000) // 7,919