I'm making a fraction calculator and trying to add two mixed fractions, but it only adds the whole numbers and nothing else. For example, with 2 3/4 + 2 3/5 it adds the whole numbers and outputs 4.
var firstStep = firstDenomInTextField! * firstWholeInTextField! / firstDenomInTextField!
var secondStep = firstStep + firstNumInTextField! / firstDenomInTextField!
var thirdStep = secondDenomInTextField! * secondWholeInTextField! / secondDenomInTextField!
var fourthStep = thirdStep + secondNumInTextField! / secondDenomInTextField!
var calculatedAnswer = (secondStep + fourthStep)
var numerator = Int(calculatedAnswer * 10 * 10)
println(numerator)
answerLabel.hidden = false
answerLabel.text = printSimplifiedFraction(Numerator: numerator)
The printSimplifiedFraction function:
func printSimplifiedFraction(Numerator numerator: Int, Denominator denominator: Int = 100) -> String
{
var finalNumerator = numerator;
var finalDenominator = denominator;
var wholeNumbers:Int = numerator / denominator;
var remainder:Int = numerator % denominator;
//println("wholeNumbers = \(wholeNumbers), remainder = \(remainder)");
//println("\(denominator) % \(remainder) = \(denominator % remainder)");
if(remainder > 0)
{
// see if we can simplify the fraction part as well
if(denominator % remainder == 0) // divides evenly, so the fraction can be simplified further
{
finalDenominator = denominator / remainder;
finalNumerator = remainder / remainder;
}
else
{
finalNumerator = remainder;
finalDenominator = denominator;
}
}
if(wholeNumbers > 0 && remainder > 0)
{
// prints out whole number and fraction parts
return("Simplified fraction of \(numerator)/\(denominator) = \(wholeNumbers) \(finalNumerator)/\(finalDenominator)");
}
else if (wholeNumbers > 0 && remainder == 0)
{
// prints out whole number only
return("Simplified fraction of \(numerator)/\(denominator) = \(wholeNumbers)");
}
else
{
// prints out fraction part only
return("Simplified fraction of \(numerator)/\(denominator) = \(finalNumerator)/\(finalDenominator)");
}
}
My question: I want it to add the whole mixed fractions, not just the whole numbers.
If you need any clarification, please ask in the comments below.
If you are working with fractions, you should use Double instead of Int, and when multiplying by 10 you should write 10.0 so the literal is a Double. Be careful: you are mixing Int (integers) with Double (fractional values). Also, parameters are constants by default; if you would like to change one, you don't need a second variable, just add var in front of it in the parameter declaration.
I think you should restart it from the beginning; the syntax is OK. Don't forget to convert from Int to Double when needed.
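The truncation both comments point at is easy to demonstrate outside Swift. Here is a small Python sketch (using `//` to mimic Swift's Int division; the variable names are illustrative, not from the question):

```python
# Swift's Int division truncates, just like Python's // operator.
# Illustrative values: whole = 2, numerator = 3, denominator = 4.
whole, num, den = 2, 3, 4

# firstStep = den * whole / den: fine, this equals the whole number
first_step = den * whole // den          # 2

# secondStep = firstStep + num / den: num // den truncates 3/4 to 0
second_step = first_step + num // den    # still 2, the 3/4 is lost

# With true (floating-point) division the fractional part survives
second_step_float = first_step + num / den
print(second_step, second_step_float)    # 2 2.75
```

This is exactly why the original code only ever adds the whole numbers: every `numerator / denominator` step is an integer division that rounds down to 0.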
I think your math is somewhat off in the code you present. The following code calculates the fraction sum and returns a string with the result.
...
var wholeTotal = firstWholeInTextField! + secondWholeInTextField!
var numeratorTotal = (firstNumInTextField! * secondDenomInTextField!) + (secondNumInTextField! * firstDenomInTextField!)
let denominatorTotal = firstDenomInTextField! * secondDenomInTextField!
while numeratorTotal > denominatorTotal {
wholeTotal++
numeratorTotal -= denominatorTotal
}
let resultString = simplifiedFractionStringWith(wholeTotal, numerator: numeratorTotal, denominator: denominatorTotal)
answerLabel.text = ("The result is: " + resultString)
func simplifiedFractionStringWith(wholeNumber: Int, numerator: Int, denominator: Int) -> String {
if numerator > 0 {
return ("\(wholeNumber) \(numerator)/\(denominator)")
} else {
return ("\(wholeNumber)")
}
}
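For readers who want the arithmetic spelled out, the same cross-multiplication approach can be sketched in Python (the function name and signature are mine, not from the answer):

```python
def add_mixed(w1, n1, d1, w2, n2, d2):
    """Add two mixed numbers; returns (whole, numerator, denominator)."""
    whole = w1 + w2
    num = n1 * d2 + n2 * d1      # cross-multiply onto a common denominator
    den = d1 * d2
    while num >= den:            # carry any improper part into the whole number
        whole += 1
        num -= den
    return whole, num, den

# 2 3/4 + 2 3/5 = 5 7/20
print(add_mixed(2, 3, 4, 2, 3, 5))
```

Note that the answer's loop uses `>` rather than `>=`, so an exact carry such as 4/4 would be left as a fraction; the sketch above uses `>=` to fold it into the whole number.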
That being said, I really believe the whole thing is much better solved by creating a fraction struct...
Like I mentioned in my other answer, I think the whole thing would be much better solved by creating a struct for your fraction. As I (clearly) have no life, I've put together another example:
struct fraction {
var wholeNumber = 0
var numerator = 0
var denominator = 0
func asString()-> String {
if numerator > 0 {
return ("\(wholeNumber) \(numerator)/\(denominator)")
} else {
return ("\(wholeNumber)")
}
}
func combinedWith(aFraction: fraction) -> fraction {
var wholeTotal = wholeNumber + aFraction.wholeNumber
var numeratorTotal = (numerator * aFraction.denominator) + (aFraction.numerator * denominator)
let denominatorTotal = denominator * aFraction.denominator
while numeratorTotal > denominatorTotal {
wholeTotal++
numeratorTotal -= denominatorTotal
}
let combinedFraction = fraction(wholeNumber: wholeTotal, numerator: numeratorTotal, denominator: denominatorTotal)
return combinedFraction
}
}
Then the code to show the total of two fractions would look something like this (in your app):
let firstFraction = fraction(wholeNumber: firstWholeInTextField!, numerator: firstNumInTextField!, denominator: firstDenomInTextField!)
let secondFraction = fraction(wholeNumber: secondWholeInTextField!, numerator: secondNumInTextField!, denominator: secondDenomInTextField!)
let combinedFraction = firstFraction.combinedWith(secondFraction)
answerLabel.text = ("The result is: " + combinedFraction.asString())
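One refinement neither snippet performs is reducing the fractional part to lowest terms (for example, 1/4 + 1/4 comes out as 8/16 rather than 1/2). A gcd-based reduction step, sketched in Python (the helper name is mine):

```python
from math import gcd

def reduce_fraction(num, den):
    """Reduce num/den to lowest terms using the greatest common divisor."""
    g = gcd(num, den)
    return (num // g, den // g) if g else (num, den)

print(reduce_fraction(8, 16))   # (1, 2)
```

The equivalent step in Swift would divide the numerator and denominator by their gcd inside combinedWith before constructing the result.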
I'm running into a big problem when using the number 0 (zero) as the midpoint for generating colour scales: the numbers close to 0 (zero) end up almost white, so it is impossible to see a difference.
The idea is that above 0 (zero) the colour starts green and gets stronger, and below 0 (zero) it starts red and gets stronger.
I really need every number to be visible: even 0.000001 should already have a visible green, and -0.000001 a visible red.
Link to SpreadSheet:
https://docs.google.com/spreadsheets/d/1uN5rDEeR10m3EFw29vM_nVXGMqhLcNilYrFOQfcC97s/edit?usp=sharing
Note to help with image translation and visualization:
Número = Number
Nenhum = None
Valor Máx. = Max Value
Valor Min. = Min Value
Current Result / Expected Result
After reading your new comments, I understand that these are the requirements:
The values above zero should be green (with increasing intensity the further from zero).
The values below zero should be red (with increasing intensity the further from zero).
Values near zero should be clearly coloured (not almost white).
Given those requisites, I developed an Apps Script project that would be useful in your scenario. This is the full project:
function onOpen() {
var ui = SpreadsheetApp.getUi();
ui.createMenu("Extra").addItem("Generate gradient", "parseData").addToUi();
}
function parseData() {
var darkestGreen = "#009000";
var lighestGreen = "#B8F4B8";
var darkestRed = "#893F45";
var lighestRed = "#FEBFC4";
var range = SpreadsheetApp.getActiveRange();
var data = range.getValues();
var biggestPositive = Math.max.apply(null, data);
var biggestNegative = Math.min.apply(null, data);
var greenPalette = colourPalette(darkestGreen, lighestGreen, biggestPositive);
var redPalette = colourPalette(darkestRed, lighestRed, Math.abs(
biggestNegative) + 1);
var fullPalette = [];
for (var i = 0; i < data.length; i++) {
if (data[i] > 0) {
var cellColour = [];
cellColour[0] = greenPalette[data[i] - 1];
fullPalette.push(cellColour);
} else if (data[i] < 0) {
var cellColour = [];
cellColour[0] = redPalette[Math.abs(data[i]) - 1];
fullPalette.push(cellColour);
} else if (data[i] == 0) {
var cellColour = [];
cellColour[0] = null;
fullPalette.push(cellColour);
}
}
range.setBackgrounds(fullPalette);
}
function colourPalette(darkestColour, lightestColour, colourSteps) {
var firstColour = hexToRGB(darkestColour);
var lastColour = hexToRGB(lightestColour);
var blending = 0.0;
var gradientColours = [];
for (var i = 0; i < colourSteps; i++) {
var colour = [];
blending += (1.0 / colourSteps);
colour[0] = firstColour[0] * blending + (1 - blending) * lastColour[0];
colour[1] = firstColour[1] * blending + (1 - blending) * lastColour[1];
colour[2] = firstColour[2] * blending + (1 - blending) * lastColour[2];
gradientColours.push(rgbToHex(colour));
}
return gradientColours;
}
function hexToRGB(hex) {
var colour = [];
colour[0] = parseInt((removeNumeralSymbol(hex)).substring(0, 2), 16);
colour[1] = parseInt((removeNumeralSymbol(hex)).substring(2, 4), 16);
colour[2] = parseInt((removeNumeralSymbol(hex)).substring(4, 6), 16);
return colour;
}
function removeNumeralSymbol(hex) {
return (hex.charAt(0) == '#') ? hex.substring(1, 7) : hex
}
function rgbToHex(rgb) {
return "#" + hex(rgb[0]) + hex(rgb[1]) + hex(rgb[2]);
}
function hex(c) {
var pool = "0123456789abcdef";
var integer = parseInt(c);
if (integer == 0 || isNaN(c)) {
return "00";
}
integer = Math.round(Math.min(Math.max(0, integer), 255));
return pool.charAt((integer - integer % 16) / 16) + pool.charAt(integer % 16);
}
First of all, the script uses the Ui class to show a customised menu called Extra. That menu calls the main function, parseData, which reads the whole selection with getValues. That function holds the darkest/lightest green/red colours; I used some colours for my example, but I advise you to edit them as you wish.

Based on those colours, the function colourPalette uses linear interpolation between the two colours (lightest and darkest). That interpolation returns an array of colours from darkest to lightest, with as many in-betweens as the maximum integer in the column. Please notice how the function delegates repetitive tasks to small helper functions (converting from hexadecimal to RGB, formatting, etc.).

When the palette is ready, the main function creates an array with only the used colours (skipping unused ones to keep a sharp contrast between big and small numbers) and applies it with the setBackgrounds method. Here you can see some sample results:
In that picture you can see one set of colours per column, varying between random small and big numbers, numerical series, and mixed small/big numbers. Please feel free to ask about anything unclear in this approach.
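To make the blending step concrete, here is a minimal Python sketch of the same channel-wise linear interpolation (the colour values are just examples):

```python
def lerp_rgb(dark, light, steps):
    """Blend channel-wise from mostly-light to fully-dark over `steps` steps,
    matching the script's growing `blending` factor."""
    palette = []
    for i in range(1, steps + 1):
        t = i / steps                       # blending factor for this step
        palette.append(tuple(round(d * t + (1 - t) * l)
                             for d, l in zip(dark, light)))
    return palette

# From dark green (0, 144, 0) toward light green (184, 244, 184)
print(lerp_rgb((0, 144, 0), (184, 244, 184), 4))
```

Each value in the column then simply indexes into this palette, which is what gives small numbers a visibly saturated colour instead of near-white.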
A very small improvement to acques-Guzel Heron's answer: I made it skip all non-numeric values (beforehand it just errored out), and I added an option in the menu to use a custom range.
Thank you very much, acques-Guzel Heron.
function onOpen() {
const ui = SpreadsheetApp.getUi();
ui.createMenu('Extra')
.addItem('Generate gradient', 'parseData')
.addItem('Custom Range', 'customRange')
.addToUi();
}
function parseData(customRange = null) {
const darkestGreen = '#009000';
const lighestGreen = '#B8F4B8';
const darkestRed = '#893F45';
const lighestRed = '#FEBFC4';
let range = SpreadsheetApp.getActiveRange();
if (customRange) {
range = SpreadsheetApp.getActiveSpreadsheet().getRange(customRange);
}
const data = range.getValues();
const biggestPositive = Math.max.apply(null, data.filter(a => !isNaN([a])));
const biggestNegative = Math.min.apply(null, data.filter(a => !isNaN([a])));
const greenPalette = colorPalette(darkestGreen, lighestGreen, biggestPositive);
const redPalette = colorPalette(darkestRed, lighestRed, Math.abs(biggestNegative) + 1);
const fullPalette = [];
for (const datum of data) {
if (datum > 0) {
fullPalette.push([greenPalette[datum - 1]]);
} else if (datum < 0) {
fullPalette.push([redPalette[Math.abs(datum) - 1]]);
} else if (datum == 0 || isNaN(datum)) {
fullPalette.push(['#ffffff']);
}
}
range.setBackgrounds(fullPalette);
}
function customRange() {
const ui = SpreadsheetApp.getUi();
const result = ui.prompt('Please enter a range');
parseData(result.getResponseText());
}
function colorPalette(darkestColor, lightestColor, colorSteps) {
const firstColor = hexToRGB(darkestColor);
const lastColor = hexToRGB(lightestColor);
let blending = 0;
const gradientColors = [];
for (let i = 0; i < colorSteps; i++) {
const color = [];
blending += (1 / colorSteps);
color[0] = firstColor[0] * blending + (1 - blending) * lastColor[0];
color[1] = firstColor[1] * blending + (1 - blending) * lastColor[1];
color[2] = firstColor[2] * blending + (1 - blending) * lastColor[2];
gradientColors.push(rgbToHex(color));
}
return gradientColors;
}
function hexToRGB(hex) {
const color = [];
color[0] = Number.parseInt((removeNumeralSymbol(hex)).slice(0, 2), 16);
color[1] = Number.parseInt((removeNumeralSymbol(hex)).slice(2, 4), 16);
color[2] = Number.parseInt((removeNumeralSymbol(hex)).slice(4, 6), 16);
return color;
}
function removeNumeralSymbol(hex) {
return (hex.charAt(0) == '#') ? hex.slice(1, 7) : hex;
}
function rgbToHex(rgb) {
return '#' + hex(rgb[0]) + hex(rgb[1]) + hex(rgb[2]);
}
function hex(c) {
const pool = '0123456789abcdef';
let integer = Number.parseInt(c, 10);
if (integer === 0 || isNaN(c)) {
return '00';
}
integer = Math.round(Math.min(Math.max(0, integer), 255));
return pool.charAt((integer - integer % 16) / 16) + pool.charAt(integer % 16);
}
int main(){
int input;
int bin = 0, i = 1;
print("Please input a number");
input = num.parse(stdin.readLineSync());
while(input > 0)
{
bin = bin + (input % 2)*i;
input = input/2;
i = i * 10;
}
return 0;
}
It returned infinite numbers.
You just need to take care of the double-to-int conversion: input = (input/2).floor()
See this working code:
void main() {
int input;
int bin = 0, i = 1;
input = 5;
while(input > 0)
{
bin = bin + (input % 2)*i;
input = (input/2).floor();
i = i * 10;
}
print(bin);
}
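The same remainder-accumulation loop can be sketched in Python, where `//` is already integer division, so no floor() call is needed (the function name is mine):

```python
def dec2bin(n):
    """Accumulate the binary digits of n as a decimal int, placing each new
    bit at the next higher decimal place (same scheme as the Dart snippet)."""
    result, place = 0, 1
    while n > 0:
        result += (n % 2) * place
        n //= 2            # integer division; Dart needs ~/ or .floor()
        place *= 10
    return result

print(dec2bin(5))   # 101
```

The infinite loop in the original code came from the division never reaching zero once the value stopped being an exact integer.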
Here is a version of the above function that:
- uses integer division (~/=);
- avoids the int-to-string conversion by using a ternary conditional operator;
- puts the most significant bit on the left (bin = (dec % 2 == 0 ? '0' : '1') + bin;), which is what most people expect, but is not what the original snippet did.
String dec2bin(int dec) {
var bin = '';
while (dec > 0) {
bin = (dec % 2 == 0 ? '0' : '1') + bin;
dec ~/= 2;
}
return bin;
}
P.S: But, of course, one can simply write:
var bin = num.toRadixString(2);
There is no real need to write your own dec2bin function.
As the result can easily grow long, and Dart ints are 64-bit at most (with only 53 bits of safe precision when compiled to the web), it's better to change the approach and return a String.
String dec2bin(int decimal) {
String bin = '';
while (decimal > 0) {
bin = bin + (decimal % 2).toString();
decimal = (decimal / 2).floor();
}
return bin;
}
print(dec2bin(132070242815));
Result: 1111111110011111111111111111110101111
//number is always in int
static String decimalToBinary(int number) {
return number.toRadixString(2);
}
//binary is always in string
static int binaryToDecimal(String binary) {
return int.parse(binary, radix: 2);
}
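For comparison, Python's built-ins mirror the Dart pair above; a quick sketch (the helper names are mine):

```python
def decimal_to_binary(number):
    return format(number, 'b')    # like Dart's number.toRadixString(2)

def binary_to_decimal(binary):
    return int(binary, 2)         # like Dart's int.parse(binary, radix: 2)

print(decimal_to_binary(10))      # '1010'
print(binary_to_decimal('1010'))  # 10
```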
I am doing some bitwise operations in Swift that were originally written in Objective-C/C. I use UnsafeMutablePointer to mark the beginning of the allocated memory and UnsafeMutableBufferPointer to access the elements within its bounds.
You can access the original Objective-C file here.
public init(size: Int) {
self.size = size
self.bitsLength = (size + 31) / 32
self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
self.bits = UnsafeMutableBufferPointer(start: startIdx, count: bitsLength)
}
/**
 * @param from first bit to check
 * @return index of first bit that is set, starting from the given index, or size if none are set
 * at or beyond its given index
 */
public func nextSet(from: Int) -> Int {
if from >= size { return size }
var bitsOffset = from / 32
var currentBits: Int32 = bits[bitsOffset]
currentBits &= ~((1 << (from & 0x1F)) - 1).to32
while currentBits == 0 {
if ++bitsOffset == bitsLength {
return size
}
currentBits = bits[bitsOffset]
}
let result: Int = bitsOffset * 32 + numberOfTrailingZeros(currentBits).toInt
return result > size ? size : result
}
func numberOfTrailingZeros(i: Int32) -> Int {
var i = i
guard i != 0 else { return 32 }
var n = 31
var y: Int32
y = i << 16
if y != 0 { n = n - 16; i = y }
y = i << 8
if y != 0 { n = n - 8; i = y }
y = i << 4
if y != 0 { n = n - 4; i = y }
y = i << 2
if y != 0 { n = n - 2; i = y }
return n - Int((UInt((i << 1)) >> 31))
}
Testcase:
func testGetNextSet1() {
// Passed
var bits = BitArray(size: 32)
for i in 0..<bits.size {
XCTAssertEqual(32, bits.nextSet(i), "\(i)")
}
// Failed
bits = BitArray(size: 34)
for i in 0..<bits.size {
XCTAssertEqual(34, bits.nextSet(i), "\(i)")
}
}
Can someone explain why the second test case fails while the Objective-C version passes?
Edit: As @vacawama mentioned, if you break testGetNextSet into 2 tests, both pass.
Edit 2: When I run the tests with xctool, any test calling BitArray's nextSet() crashes while running.
Objective-C version of numberOfTrailingZeros:
// Ported from OpenJDK Integer.numberOfTrailingZeros implementation
- (int32_t)numberOfTrailingZeros:(int32_t)i {
int32_t y;
if (i == 0) return 32;
int32_t n = 31;
y = i <<16; if (y != 0) { n = n -16; i = y; }
y = i << 8; if (y != 0) { n = n - 8; i = y; }
y = i << 4; if (y != 0) { n = n - 4; i = y; }
y = i << 2; if (y != 0) { n = n - 2; i = y; }
return n - (int32_t)((uint32_t)(i << 1) >> 31);
}
When translating numberOfTrailingZeros, you changed the return value from Int32 to Int. That is fine, but the last line of the function is not operating properly as you translated it.
In numberOfTrailingZeros, replace this:
return n - Int((UInt((i << 1)) >> 31))
With this:
return n - Int(UInt32(bitPattern: i << 1) >> 31)
The cast to UInt32 removes all but the lower 32 bits. Since you were casting to UInt, you weren't removing those bits. It is necessary to use bitPattern to make this happen.
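To see why the 32-bit truncation matters, here is the same OpenJDK-style halving search sketched in Python. Python integers are unbounded, so every shift has to be masked back to 32 bits; the mask plays the role of UInt32(bitPattern:) in the accepted fix (this port is mine, for illustration only):

```python
MASK = 0xFFFFFFFF  # keep only the low 32 bits, like UInt32(bitPattern:)

def number_of_trailing_zeros(i):
    """Count trailing zero bits of a 32-bit value by halving the search."""
    i &= MASK
    if i == 0:
        return 32
    n = 31
    y = (i << 16) & MASK
    if y != 0: n -= 16; i = y
    y = (i << 8) & MASK
    if y != 0: n -= 8; i = y
    y = (i << 4) & MASK
    if y != 0: n -= 4; i = y
    y = (i << 2) & MASK
    if y != 0: n -= 2; i = y
    # unsigned shift of the top bit, exactly the step that broke in Swift
    return n - (((i << 1) & MASK) >> 31)

print(number_of_trailing_zeros(8))   # 3
```

Without the mask, `(i << 1) >> 31` can see bits above position 31 and return the wrong correction term, which is the same failure mode as the original UInt cast in Swift.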
Finally I found out that startIdx just needs to be initialized after allocation.
self.startIdx = UnsafeMutablePointer<Int32>.alloc(bitsLength * sizeof(Int32))
self.startIdx.initializeFrom(Array(count: bitsLength, repeatedValue: 0))
Or use calloc with just one line code:
self.startIdx = unsafeBitCast(calloc(bitsLength, sizeof(Int32)), UnsafeMutablePointer<Int32>.self)
Furthermore, I use a lazy var to defer the initialization of the UnsafeMutableBufferPointer until the property is first used.
lazy var bits: UnsafeMutableBufferPointer<Int32> = {
return UnsafeMutableBufferPointer<Int32>(start: self.startIdx, count: self.bitsLength)
}()
On the other hand, don't forget to deinit:
deinit {
startIdx.destroy()
startIdx.dealloc(bitsLength * sizeof(Int32))
}
I am trying to create a function in a Swift playground where a calculation is made several times and each result is added to a running total until the loop is over. Everything seems to be working, except that when I try to add each calculation to the previous total, I just get the value of the latest calculation. Here is my code:
func Calc(diff: String, hsh: String, sperunit: Float, rate: Float, n: Int16, p: Float, length: Int16) -> Float {
//Divisions per Year
let a: Int16 = length/n
let rem = length - (a*n)
let spl = Calc(diff, hsh: hash, sperunit: sperunit, rate: rate)
for var i = 0; i < Int(a) ; i++ { //also tried for i in i..<a
var result: Float = 0
let h = (spl * Float(n) / pow (p,Float(i))) //This gives me a correct result
result += h //This gives me the same result from h
finalResult = result
}
finalResult = finalResult + (Float(rem) * spl / pow (p,Float(a))) //This line is meant to get the result variable out of the loop and do an extra calculation outside of the loop
print(finalResult)
return finalResult
}
Am I doing something wrong?
Currently your variable result is scoped to the loop and does not exist outside of it. Additionally every run of the loop creates a new result variable, initialized with 0.
What you have to do is move the line var result: Float = 0 in front of the for loop:
var result: Float = 0
for var i = 0; i < Int(a) ; i++ {
let h = (spl * Float(n) / pow (p,Float(i)))
result += h
finalResult = result
}
Additionally you can remove the repeated assignment of finalResult = result and just do it once after the loop is over.
You can probably remove the finalResult completely. Just write
var result: Float = 0
for var i = 0; i < Int(a) ; i++ {
let h = (spl * Float(n) / pow (p,Float(i)))
result += h
}
result += (Float(rem) * spl / pow (p,Float(a)))
print(result)
return result
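The scoping point generalizes to any language: the accumulator has to be declared once, before the loop, or each iteration starts from scratch. A tiny Python illustration:

```python
# Wrong: re-creating the accumulator inside the loop discards earlier sums
final_result = 0
for i in range(3):
    result = 0        # reset on every iteration: only the last term survives
    result += i
    final_result = result

# Right: declare the accumulator once, before the loop
total = 0
for i in range(3):
    total += i

print(final_result, total)   # 2 3
```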
I am presently working on my first Swift project and making a newcomer's errors.
The class I am working on is meant to expose three of its methods through global variables:
var getAngle:[AnyObject]!;
var getHours:[AnyObject]!
var getMinutes:[AnyObject]!;
class GpsViewController : UIViewController {
// ....
required init() {
// super.init();
formatter.dateFormat = "yyyy-MM-dd";
parser.locale = NSLocale(localeIdentifier: "en_US_POSIX");
parser.dateFormat = "yyyy-MM-dd'T'HH:mm:ssZZZZ";
getAngle = self.restrictToAngle;
getHours = self.restrictToHours;
getMinutes = self.restrictToMinutes;
}
func getHourAndAngle() { // Hour hand angle in radians from 12 o'clock position, clockwise.
var now = NSDate();
var today = now;
var todayFormatted = this.formatter(today);
var yesterday = today.dateWithTimeIntervalSinceNow(-24 * 60 * 60);
var yesterdayFormatted = this.formatter(yesterday);
var tomorrow = today.dateWithTimeIntervalSinceNow(24 * 60 * 60);
var tomorrowFormatted = this.formatter(tomorrow);
var yesterdaySunset = this.presentLocation[yesterdayFormatted]["sunset"];
var todaySunrise = this.presentLocation[todayFormatted]["sunrise"];
var todaySunset = this.presentLocation[todayFormatted]["sunset"];
var tomorrowSunrise = this.presentLocation[tomorrowFormatted]["sunrise"];
var duration = 0.0;
var position = 0.0;
var startingHour = 18;
var offset = 0.0;
if now.isLessThanDate(todaySunrise) {
length = todaySunrise.timeIntervalSinceDate(yesterdaySunset);
position = now.timeIntervalSinceDate(yesterdaySunset);
offset = -0.5;
} else if now.isLessThanDate(todaySunset) {
length = todaySunset.timeIntervalSinceDate(todaySunrise);
position = now.timeIntervalSinceDate(todaySunrise);
offset = 0.5;
startingHour = 6;
} else {
length = tomorrowSunrise.timeIntervalSinceDate(todaySunset);
position = now.timeIntervalSinceDate(todaySunset);
offset = 1.5;
}
var proportion = position / length;
var angle = M_PI + (2 * M_PI * (proportion + offset));
var hours = floor(24 * 60 * 60 * ((1 + proportion + offset) % 1.0) / (60 * 60));
var minutes = floor(24 * 60 * 60 * (1 + proportion + offset) % 60 * 60);
return ["angle": angle, "hour": hour, "minutes": minutes];
}
func restrictToHours() {
return getHoursMinutesAngle["hour"];
}
func restrictToAngle() {
return getHoursMinutesAngle["angle"];
}
func restrictToMinutes() {
return getHoursMinutesAngle["minutes"];
}
// ...
}
I'm getting several errors, including, in init()'s assignment of getAngle: "Value of type 'GpsViewController' has no member 'restrictToAngle'".
Could you tell me what the n00b errors are here?
You are trying to assign a function, self.restrictToAngle, to getAngle, a variable of type [AnyObject]! (i.e. an array).
As @EricD and others pointed out, there are quite a few issues. To understand how function definitions, calls, and return types work in Swift, check this Apple documentation page.
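The underlying idea, that a method is a value you can store and call later, but only in a variable of function type (in Swift, something like var getAngle: (() -> Double)!, not [AnyObject]!), can be sketched in Python (the names are illustrative, not from the question):

```python
def restrict_to_angle():
    """Stand-in for the Swift method; returns a dummy angle value."""
    return 1.25

# Store the function itself (no parentheses) in a plain variable,
# the moral equivalent of a Swift variable with a function type.
get_angle = restrict_to_angle
print(get_angle())   # 1.25
```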