Rounding in MQL4

The NormalizeDouble() function in the MQL4 language sometimes does not work properly, as you can see in the example below.
double value=0.9731300000000001;
double value2=NormalizeDouble(value,Digits);
Print(value2);
As a result, although I want to get 0.97313, the result is 0.9731300000000001. How can I fix this?
double Yuvarla(double Sayi)
{
   double _sayi1=MathRound(Sayi*MathPow(10,Digits));
   double _sayi2=_sayi1/MathPow(10,Digits);
   return _sayi2;
}
This method didn't work either.

Related

Where does the additional value come from in Dart code?

The following code produces two lines of output. It's clear why it prints "in func", but where does the second value (null) come from?
func() {
  print("in func");
}

void main() {
  var x = func;
  print(x());
}
Output:
in func
null
You treated Dart like a scripting language and as a result, you got puzzled. Treat Dart as the real programming language it is and it will be crystal clear:
dynamic func() {
  // ^-- "dynamic" was inserted by your compiler,
  // to make up for the fact that you failed to mention a return type
  print("in func");
  // this is inserted by your compiler, to make up for the
  // fact that a non-void function failed to provide a return value:
  return null;
}

void main() {
  var x = func;
  // so this first calls the function, which prints its line,
  // and then it prints the function's return value, which is null.
  print(x());
}
Be explicit when programming. Use the power this language gives you.
When in doubt, turn on analysis. It will tell you what I told you: you failed to state things explicitly, and now the compiler has to cover for you. That makes it hard to see what actually happens. Don't be hard on future readers; one of them is you, 5 seconds from now. Make it easy for yourself and be explicit about what your code is doing.

How can I convert a string in a textfield to an Int in Swift?

I tried for a long time to turn the text into an Int, but it did not work. I tried it like this (AnzahlString is a text field):
var AnzahlAInt = 0
if let AnzahlAString = AnzahlString.text {
    let AnzahlAInt = Int(AnzahlAString)
}
But then I always get the error:
Value of optional type 'Int?' must be unwrapped to a value of type 'Int'
Then I added a ! at the end, Int(AnzahlAString)!, so I don't get an error, but now when I press the button the app crashes. It was predictable, but how can I convert this to an Int without the !?
At first glance, it looks like you have two things to check for:
is AnzahlString.text present, and
does it represent an Int
The first check is in fact not necessary, since .text will never return nil, even though it's marked as Optional. This means you can safely force-unwrap it.
The second check is easily done by using the ?? operator:
let AnzahlAInt = Int(AnzahlString.text!) ?? 0
PS, just as a stylistic hint: variable names in Swift usually start with a lowercase letter; names starting with capital letters are used for types.
PPS: your code as written shadows AnzahlAInt, so the value of your var is never changed.
The reason why the resulting Int is optional, is that parsing might or might not succeed. For example, if you try to parse the string "Fluffy Bunnies" into an Int, there is no reasonable Int that can be returned, therefore the result of parsing that string will be nil.
Furthermore, if you force the parser by using !, you're telling Swift that you know for sure that the string you pass will always result in a valid Int, and when it doesn't, the app crashes.
You need to handle the situation in which the parse result is nil. For example:
if let AnzahlAIntResult = Int(AnzahlAString) {
    // We only get here if the parse was successful and we have an Int.
    // AnzahlAIntResult is now an Int, so it can be assigned to AnzahlAInt.
    AnzahlAInt = AnzahlAIntResult
}
You did a good job so far but missed one thing.
This line tries to convert the String into an Int:
let AnzahlAInt = Int(AnzahlAString)
However, this can fail, since your String can be something like "dfhuse". This is why the result of Int(AnzahlAString) is an Optional (Int?). To use it as a real Int, you have to unwrap it.
The first solution is the !; however, every time the conversion fails, your app crashes, so it is not a good idea.
The best solution is optional binding, as you already used to get the text of your text field:
if let AnzahlAString = AnzahlString.text {
    if let safeInt = Int(AnzahlAString) {
        // You can use safeInt as a real Int
    } else {
        print("Converting your String to an Int failed badly!")
    }
}
Hope this helps you. Feel free to ask again if something is unclear.
For unwrapping you can also use guard, like this. Simple and easy:
guard let AnzahlAInt = Int(AnzahlString.text!) else {
    return
}
print(AnzahlAInt)

Ambiguous reference to member when using ceil or round

I am just trying to use the ceil or round functions in Swift but am getting a compile time error:
Ambiguous reference to member 'ceil'.
I have already imported the Foundation and UIKit modules. I have tried to compile it with and without the import statements, but no luck. Does anyone have any idea what I am doing wrong?
My code is as follows:
import UIKit

@IBDesignable class LineGraphView: GraphView {
    override func setMaxYAxis() {
        self.maxYAxis = ceil(yAxisValue.maxElement())
    }
}
This problem occurs for something that might seem strange at first, but it's easily resolved.
Put simply, you might think calling ceil() rounds a floating-point number up to its nearest integer, but actually it doesn't return an integer at all: if you give it a Float it returns a Float, and if you give it a Double it returns a Double.
So, this code works because c ends up being a Double:
let a = 0.5
let c = ceil(a)
…whereas this code causes your exact issue because it tries to force a Double into an Int without a typecast:
let a = 0.5
let c: Int = ceil(a)
The solution is to convert the return value of ceil() to be an integer, like this:
let a = 0.5
let c = Int(ceil(a))
The same is true of the round() function, so you'd need the same solution.
Depending on the scope of where you call ceil, you may need to explicitly call Darwin's ceil function (deep in a closure, etc). Darwin is imported through Foundation, which is imported by UIKit.
let myFloat = 5.9
let myCeil = Darwin.ceil(myFloat) // 6
On Linux, Glibc is used in place of Darwin for C-level API access. You would have to explicitly import Glibc and call Glibc.ceil(myFloat) instead of using Darwin.

How do I declare variables, compare them, and then use them inside a function?

I am developing an EA that requires me to compare the highs of the previous 2 bars and use whichever one is higher as the stop loss value.
Same for opposite-side trades: I need to compare the previous 2 lows and use the lower one as the stop loss value.
What I am doing is this:
void OnTick()
{
   static int ticket=0;
   double ab=...; // calculation for ab
   double de=...; // calculation for de
   if(Low[1]<Low[2])
      double sll=Low[1];
   if(Low[1]>Low[2])
      double sll=Low[2];
   if(/* buy logic comes here */)
   {
      double entryPrice=...;
      double stoploss=sll-xyz;
      double takeprofit=entryPrice+((entryPrice-stoploss)*3);
      ticket = OrderSend(Symbol(),...,entryPrice,stoploss,takeprofit,...);
   }
   if(ticket == false)
   {
      Alert("Order Sending Failed");
   }
}
The problem is that I am not able to reference the value of sll and get an error message saying "sll undeclared identifier".
I am fairly new to programming and would appreciate it if someone could help me out with this.
I have added most of the code for you to understand the logic.
You would have to declare them outside the scope of the if statements if you want to use the variables anywhere else. Instead of doing that, take a look at this:
double sll; // declare sll outside the if statements
if(Low[1]<Low[2])
   sll=Low[1];
if(Low[1]>Low[2])
   sll=Low[2];
if(/* buy logic comes here */)
{
   int res = OrderSend(..........); // OrderSend returns an int ticket number, or -1 on failure
}
Judging by what you wrote, it looks like you may be using res somewhere else too, in which case you need to declare it outside of the if statement as well, because of scoping.

Format for int since Xcode 4.5

Since upgrading to Xcode 4.5, printing ints to the console results in unusually high values, e.g.:
int someInt = 300;
NSLog([NSString stringWithFormat:@"Some int: %d", someInt]); // prints Some int: 11581443
Usually I only see this when using the wrong format string for the data type. I'm using LLDB.
You are using NSLog incorrectly. Its signature is:
void NSLog (
   NSString *format,
   ...
);
For example:
int someInt = 100;
NSString* str = [NSString stringWithFormat:@"%d", someInt];
NSLog(@"%@", str);
or
NSLog(@"%d", someInt);
or
NSLog(@"%@", [NSString stringWithFormat:@"%d", someInt]);
Try NSLog(@"Integer: %i", someInt)
@askovpen is right about your incorrect use of NSLog; however, this line in your question is interesting:
using the wrong format string for the data type
Of course you get garbage out: you're putting garbage in!
NSLog works by using the first parameter to work out how big the other parameters are going to be, i.e. if you put %c it expects a char next in the parameters, and if you put %d it expects an int. So if you pass in an int and tell it to expect a float, then yes, it's not going to work. Why would you expect that it would?
The reason you might be getting different values in Xcode 4.5 than in other Xcode versions might be due to changes in memory management during compilation, or to any number of other things.
