AnsiString to Double C++ - c++builder

I am trying to convert an AnsiString to a double, but after running the program, it says "20.60 is not a valid floating point".
AnsiString Temperatur;
RE_Temperatur->Text = Temperatur;
Chart1->Series[0]->AddXY(DT_Uhrzeit->Time, RE_Temperatur->Text.ToDouble(),
"", clRed);
The value for Temperatur comes over a SerialPort connection from my Arduino and delivers something like 20.60 and so on. When I cut the string down to the first 2 digits, it works fine. I guess the . has something to do with the error, but I don't know how to solve it.
EDIT //
I now tried to just replace the "." with a "," using the following code:
RE_Temperatur->Text = StringReplace(RE_Temperatur->Text, ".", ",",
TReplaceFlags() << rfReplaceAll);
Now I get an error message "20,60 is not a valid floating point". It's so frustrating :/

Try this simple piece of code in a new console program:
AnsiString Temperatur = "12.45"; // locale independent
AnsiString Temperatur2 = "12,45"; // correct in my German locale
double temp = StrToFloat(Temperatur, TFormatSettings::Invariant());
double temp2 = StrToFloat(Temperatur2); // In my case: German locale!
printf("%8.4f %8.4f\n", temp, temp2);
The output is:
12.4500 12.4500
You can see that it works as expected. Note that on my system, the comma is the decimal separator for the locale, and yet this code works fine with the period as decimal separator, because of the invariant format settings.
So make your choice: use TFormatSettings::Invariant() if you need independence of the locale, but don't use it if you want it to use the decimal separator for the locale of the user.

You can use StrToFloat; it has a second overload that takes a TFormatSettings object, which you can use to specify a specific DecimalSeparator. BTW, it does not return a float as its name might suggest; it returns an Extended, aka long double.
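A minimal sketch of that overload, assuming a C++Builder version that provides TFormatSettings::Create() (as with the Invariant() call used above; older versions fill one in with GetLocaleFormatSettings() instead):
#include <System.SysUtils.hpp>

// Start from the current locale's settings, then force '.' as the separator.
TFormatSettings fs = TFormatSettings::Create();
fs.DecimalSeparator = '.';
Extended value = StrToFloat("20.60", fs); // parses regardless of the user's locale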

My bet is that your problem is the decimal point. The character used for it is part of the Windows regional settings and can vary from computer to computer. There are WinAPI calls to retrieve it, but I do not remember them ...
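For reference, a minimal sketch of retrieving it; GetLocaleInfoA with LOCALE_SDECIMAL should be the relevant call, but treat the details as an assumption to verify against the WinAPI documentation:
#include <windows.h>
#include <stdio.h>

int main()
{
    // Ask Windows for the decimal symbol of the current user's locale.
    char sep[4] = "";
    if (GetLocaleInfoA(LOCALE_USER_DEFAULT, LOCALE_SDECIMAL, sep, sizeof(sep)))
        printf("decimal separator: %s\n", sep); // e.g. "." or ","
    return 0;
}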
The problem with AnsiString("1.23").ToDouble() is that try/catch will not catch the exception (at least on my compiler, BDS2006). There are workarounds. I am using atof() instead, which does not crash but rather truncates at the first invalid character, so it can be used to detect the correct decimal-point character and do the conversion:
#include <stdlib.h> // for atof()

char _floating_point_char=' '; // decimal point separator (detected on first use)

AnsiString strnum (AnsiString s,int sig=1,int pnt=1,int hex=0,char dot=_floating_point_char);
double     str2num(AnsiString s,int sig=1,int pnt=1,int hex=0,char dot=_floating_point_char);

AnsiString strnum(AnsiString s,int sig,int pnt,int hex,char dot)
{
    if (dot==' ') // separator not detected yet: probe atof() with both candidates
    {
        float x;
        x=atof("0.5"); if (x>=0.25) _floating_point_char='.';
        x=atof("0,5"); if (x>=0.25) _floating_point_char=',';
        dot=_floating_point_char;
    }
    int i,l,a,e,e0,exp=pnt;
    AnsiString q="";
    l=s.Length();
    if (hex) exp=0; // exponent is e,E so it collides with hex e,E
    e0=0;
    for (i=1;i<=l;i++)
    {
        e=0;
        a=s[i];
        if ((a>='0')&&(a<='9'))        { e=1; q+=char(a);   sig=0; }
        if ((a>='A')&&(a<='F')&&(hex)) { e=1; q+=char(a);   sig=0; }
        if ((a>='a')&&(a<='f')&&(hex)) { e=1; q+=char(a);   sig=0; }
        if ((a=='e')||(a=='E'))
            if (!hex) { if (!exp) break; e=1; q+=char('e'); exp=0; sig=1; pnt=0; e0=0; }
        if (a=='+') { if (!sig) break; e=1;               sig=0; }
        if (a=='-') { if (!sig) break; e=1; q+=char(a);   sig=0; }
        if (a=='.') { if (!pnt) break; e=1; q+=char(dot); pnt=0; }
        if (a==',') { if (!pnt) break; e=1; q+=char(dot); pnt=0; }
        if ((e0)&&(!e)) break; // stop at the first non-number character once digits have started
        e0|=e;
    }
    if (q.Length()<=0) q="0";
    return q;
}

double str2num(AnsiString s,int sig,int pnt,int hex,char dot)
{
    return atof(strnum(s,sig,pnt,hex,dot).c_str());
}
It simply detects whether . or , is the right one and then replaces it in the string before conversion... Of course, fetching the correct decimal-point character from the WinAPI is safer, as the decimal separator can be anything... You can ignore the hex part; I am using this for all kinds of numbers ...
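For completeness, a minimal usage sketch of str2num with the value from the question:
// Works the same whether the machine's locale expects '.' or ','.
AnsiString Temperatur = "20.60"; // as received from the Arduino
double t = str2num(Temperatur);  // -> 20.6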

Related

ExpandEnvironmentStringsA() String Subscript out of range

I have a Function which checks to see if IIS is installed and gets the installation path.
int IsIISInstalled(string &pathname)
{
    DWORD returnvalue;
    HKEY miva; // handle for the opened registry key
    long keyres = RegOpenKeyExA(HKEY_LOCAL_MACHINE, "Software\\Microsoft\\InetStp\\", 0, KEY_READ, &miva);
    if (keyres == ERROR_SUCCESS)
    {
        char szBuffer[512];
        DWORD dwBufferSize = sizeof(szBuffer);
        ULONG nError;
        nError = RegQueryValueExA(miva, "InstallPath", NULL, NULL, (LPBYTE)szBuffer, &dwBufferSize);
        if (nError == ERROR_SUCCESS)
        {
            char retBuffer[512];
            DWORD nsize = sizeof(retBuffer);
            returnvalue = ExpandEnvironmentStringsA(szBuffer, retBuffer, nsize);
            pathname = retBuffer;
        }
        RegCloseKey(miva); // release the key handle
    }
    if (!pathname.empty())
        return 1;
    else
        return 0;
}
When I attach to my executable and debug this, there is a return value from ExpandEnvironmentStringsA in retBuffer showing the installation path. returnvalue shows 28, which is the number of TCHARs that were put into the buffer. Once I step into the next line, setting the string pathname to retBuffer, it fails, giving me a string subscript out of range. I understand what that error means; I have done it plenty of times. What is odd to me is if I specify a new string var in the function:
string fakeresult;
and set fakeresult to retBuffer:
fakeresult = retBuffer;
just as I am in the code above, it passes through just fine with no errors. I am calling the function with this code.
string iis_path, miva_path;
int disable;
char *full_path;
//getMivaLocation(miva_path);
bool good2go;
int iisinstalled, empressaReturn, miaReturn;
iisinstalled = IsIISInstalled(iis_path);
Does this have to do with the fact that I am passing pathname by reference to the function?
If that is the case, why?
How could I fix this to be able to return my data?
I am not a well-educated C++ coder; I am learning a lot of this as I go and have learned much from you guys. Hoping someone has an idea on this, as I do not wish to spend much more time researching to no avail. Thanks.
After stepping in further, I found that pathname is being set as it should. It is immediately crashing on the next call to another function, so my debugging was incorrect. Now that I have scoped further, hopefully I can fix this. Also, for my own note: the reason why it would not crash when I set fakeresult instead of pathname was due to the fact that I was passing pathname in the function call that is crashing. With no value, it is not setting something out of its scope.

Objective-C: how to convert a keystroke to ASCII character code?

I need to find a way to convert an arbitrary character typed by a user into an ASCII representation to be sent to a network service. My current approach is to create a lookup dictionary and send the corresponding code. After creating this dictionary, I see that it is hard to maintain and determine if it is complete:
__asciiKeycodes[#"F1"] = #(112);
__asciiKeycodes[#"F2"] = #(113);
__asciiKeycodes[#"F3"] = #(114);
//...
__asciiKeycodes[#"a"] = #(97);
__asciiKeycodes[#"b"] = #(98);
__asciiKeycodes[#"c"] = #(99);
Is there a better way to get ASCII character code from an arbitrary key typed by a user (using standard 104 keyboard)?
Objective-C has the base C primitive data types. There is a little trick you can do: set the keystroke to a char, and then cast it as an int. The default conversion in C from a char to an int is that char's ASCII value. Here's a quick example.
char character = 'a';
NSLog(@"a = %d", (int)character);
console output = a = 97
To go the other way around, cast an int as a char:
int asciiValue = (int)97;
NSLog(@"97 = %c", (char)asciiValue);
console output = 97 = a
Alternatively, you can do a direct conversion within initialization of your int or char and store it in a variable.
char asciiToCharOf97 = (char)97; //Stores 'a' in asciiToCharOf97
int charToAsciiOfA = (int)'a'; //Stores 97 in charToAsciiOfA
This seems to work for most keyboard keys, not sure about function keys and return key.
NSString* input = #"abcdefghijklkmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890!##$%^&*()_+[]\{}|;':\"\\,./<>?~ ";
for (int i = 0; i < input.length; i++)
{
    NSLog(@"Found (at %i): %i", i, [input characterAtIndex:i]);
}
Use a stringWithFormat: call and pass the int values.

Get int value including zeros (in case it starts with 0) from string

I have a question regarding converting a string to an int value. My issue is that if I have a string called "001223", I get 1223 as the int value. But I want to get 001223 as the final int value. Please let me know if my question is not clear. Thanks for your time.
There is no difference in value between the numbers 001223, 1223, 2446/2, or 1223.000. They all refer to the same number.
If you want to keep leading zeroes, then you need to either keep it as a string or maintain another piece of information so it can be rebuilt later, basically the number of zeroes at the front, such as:
struct sNumWithLeadingZeros {
    size_t zeroCount;
    unsigned int actualValue;
};
I'd probably suggest the former (keeping it as a string) since that's likely to be less effort.
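If you do go with the struct, here is a minimal sketch of rebuilding the original text from it (the test value is illustrative):
#include <cstdio>

struct sNumWithLeadingZeros {
    size_t zeroCount;         // number of leading zeros that were stripped
    unsigned int actualValue; // the numeric value itself
};

int main()
{
    sNumWithLeadingZeros n = { 2, 1223 }; // originally "001223"
    for (size_t i = 0; i < n.zeroCount; ++i)
        std::putchar('0');                // re-emit the stripped zeros
    std::printf("%u\n", n.actualValue);   // prints 001223
    return 0;
}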
"Leading zeros" are to do with the textual representation of an integer, when stored as integer values in a computer the leading zeros do not exist.
However, if what you want to do is display the number with the same number of digits it had before being converted from text, then: if the string contains only the digits of the number, e.g. you have @"001223", you can take the length of this string to determine the number of digits. Later, when converting the number back to string format, you can use a formatted conversion, e.g. stringWithFormat:, and a format specifier which specifies the required number of digits. You'll need to read up on formats in the documentation, but here is an example:
NSString *input = #"001223";
int x = [input intValue];
int digits = (int)input.length;
NSString *output = [NSString stringWithFormat:#"%0*d", digits, x];
The value of output will be the same as input. The format broken down is: 0 - pad with leading zeros; * - use a dynamic field width, which here takes the value of digits; d - int.
HTH
One cannot prefix leading 0s in an int data type. But note that a 0 prefix on a literal means the number is octal, not decimal. An octal value can be created by changing the base; for this you can use a wrapper class like Integer.
But if one wants leading 0s for displaying data, then he/she can use the following code:
public class Sample
{
    public static void main(final String[] argv)
    {
        System.out.printf("%06d", 1223);
        System.out.println();
    }
}

C++ - Removing invalid characters when a user pastes into a grid

Here's my situation. I have an issue where I need to filter invalid characters that a user may paste from Word or Excel documents.
Here is what I'm doing.
First, I'm trying to convert any Unicode characters to ASCII:
extern "C" COMMON_STRING_FUNCTIONS long ConvertUnicodeToAscii(wchar_t * pwcUnicodeString, char* &pszAsciiString)
{
int nBufLen = WideCharToMultiByte(CP_ACP, 0, pwcUnicodeString, -1, NULL, 0, NULL, NULL)+1;
pszAsciiString = new char[nBufLen];
WideCharToMultiByte(CP_ACP, 0, pwcUnicodeString, -1, pszAsciiString, nBufLen, NULL, NULL);
return nBufLen;
}
Next, I'm filtering out any character that does not have a value between 31 and 127:
String __fastcall TMainForm::filterInput(String l_sConversion)
{
    // Used to store every character that was stripped out.
    String filterChars = "";
    // Not used. We never received the whitelist.
    String l_SWhiteList = "";
    // Our String without the invalid characters.
    AnsiString l_stempString;
    // Convert the string into an array of chars.
    wchar_t *outputChars = l_sConversion.w_str();
    char *pszOutputString = NULL;
    // Convert any Unicode characters to ASCII.
    ConvertUnicodeToAscii(outputChars, pszOutputString);
    l_stempString = (AnsiString)pszOutputString;
    // We're going backwards since we are removing characters, which changes the length and positions.
    for (int i = l_stempString.Length(); i > 0; i--)
    {
        char l_sCurrentChar = l_stempString[i];
        // If we don't have a valid character, filter it out of the string.
        if (((unsigned int)l_sCurrentChar < 31) || ((unsigned int)l_sCurrentChar > 127))
        {
            String l_sSecondHalf = "";
            String l_sFirstHalf = "";
            l_sSecondHalf = l_stempString.SubString(i + 1, l_stempString.Length() - i);
            l_sFirstHalf = l_stempString.SubString(0, i - 1);
            l_stempString = l_sFirstHalf + l_sSecondHalf;
            filterChars += "\'" + ((String)(unsigned int)(l_sCurrentChar)) + "\' ";
        }
    }
    if (filterChars.Length() > 0)
    {
        LogInformation(__LINE__, __FUNC__, Utilities::LOG_CATEGORY_GENERAL, "The Following ASCII Values were filtered from the string: " + filterChars);
    }
    // Delete the char* to avoid memory leaks.
    delete [] pszOutputString;
    return l_stempString;
}
Now this seems to work, except when you try to copy and paste bullets from a Word document.
o Bullet1:
 subbullet1.
You will get something like this
oBullet1?subbullet1.
My filter function is called on an OnChange event.
The bullets are replaced with the value o and a question mark.
What am I doing wrong, and is there a better way of trying to do this?
I'm using C++Builder XE5, so please no Visual C++ solutions.
When you perform the conversion to ASCII (which is not actually converting to ASCII, BTW), Unicode characters that are not supported by the target codepage are lost - either dropped, replaced with ?, or replaced with a close approximation - so they are not available to your scanning loop. You should not do the conversion at all; scan the source Unicode data as-is instead.
Try something more like this:
#include <System.Character.hpp>
String __fastcall TMainForm::filterInput(String l_sConversion)
{
    // Used to store every character sequence that was stripped out.
    String filterChars;
    // Not used. We never received the whitelist.
    String l_SWhiteList;
    // Our String without the invalid sequences.
    String l_stempString;
    int numChars;
    for (int i = 1; i <= l_sConversion.Length(); i += numChars)
    {
        UCS4Char ch = TCharacter::ConvertToUtf32(l_sConversion, i, numChars);
        String seq = l_sConversion.SubString(i, numChars);
        // If we don't have a valid codepoint, filter it out of the string.
        if ((ch <= 31) || (ch >= 127))
            filterChars += (_D("\'") + seq + _D("\' "));
        else
            l_stempString += seq;
    }
    if (!filterChars.IsEmpty())
    {
        LogInformation(__LINE__, __FUNC__, Utilities::LOG_CATEGORY_GENERAL, _D("The Following Values were filtered from the string: ") + filterChars);
    }
    return l_stempString;
}

Odd atoi(char *) issue

I'm experiencing a very odd issue with atoi(char *). I'm trying to convert a char into its numerical representation (I know that it is a number), which works perfectly fine 98.04% of the time, but it will give me a random value the other 1.96% of the time.
Here is the code I am using to test it:
int increment = 0, repetitions = 10000000;
for (int i = 0; i < repetitions; i++)
{
    char randomNumber = (char)rand()%10 + 48;
    int firstAtoi = atoi(&randomNumber);
    int secondAtoi = atoi(&randomNumber);
    if (firstAtoi != secondAtoi) NSLog(@"First: %d - Second: %d", firstAtoi, secondAtoi);
    if (firstAtoi > 9 || firstAtoi < 0)
    {
        increment++;
        NSLog(@"First Atoi: %d", firstAtoi);
    }
}
NSLog(@"Ratio Percentage: %.2f", 100.0f * (float)increment/(float)repetitions);
I'm using the GNU99 C Language Dialect in Xcode 4.6.1. The first if (for when the first number does not equal the second) never logs, so the two atoi's return the same result every time; however, the results are different every time. The "incorrect results" seemingly range from -1000 up to 10000. I haven't seen any above 9999 or any below -999.
Please let me know what I am doing wrong.
EDIT:
I have now changed the character design to:
char numberChar = (char)rand()%10 + 48;
char randomNumber[2];
randomNumber[0] = numberChar;
randomNumber[1] = 0;
However, I am using:
MAX(MIN((int)(myCharacter - '0'), 9), 0)
to get the integer value.
I really appreciate all of the answers!
atoi expects a string. You have not given it a string; you have given it a single char. A string is defined as some number of characters ended by the null character. You are invoking UB.
From the docs:
If str does not point to a valid C-string, or if the converted value would be out of the range of values representable by an int, it causes undefined behavior.
Want to "convert" a character to its integral representation? Don't overcomplicate things;
int x = some_char;
A char is an integer already, not a string. Don't think of a single char as text.
If I'm not mistaken, atoi expects a null-terminated string (see the documentation here).
You're passing in a single stack-based value, which does not have to be null-terminated. I'm extremely surprised it's even getting it right: it could be reading off hundreds of garbage numbers into eternity if it never finds a null terminator. If you just want to get the number of a single char (as in, the numeric value of the char's human-readable representation), why don't you just do int numeric = randomNumber - 48?
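Putting both suggestions into one minimal, self-contained sketch (the variable names are illustrative):
#include <cstdio>
#include <cstdlib>

int main()
{
    char digit = '7';

    // Fix 1: give atoi a real C string, i.e. a null-terminated buffer.
    char buffer[2] = { digit, '\0' };
    int viaAtoi = std::atoi(buffer);

    // Fix 2: skip atoi entirely; a digit's value is its offset from '0'.
    int viaOffset = digit - '0';

    std::printf("%d %d\n", viaAtoi, viaOffset); // prints: 7 7
    return 0;
}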
