I would like to convert a string to an int and compare two ints.
I tried:
var str1="0.0.1";
var str2="0.0.2";
var s1 = int.parse(str1.replaceAll(".", ""));
var s2 = int.parse(str2.replaceAll(".", ""));
print(s1); //1
print(s2); //2
if (s1 < s2) {
  print("ok");
}
but I get:
Any idea?
The code runs without issues on DartPad with Flutter 2.2.1 and Dart SDK 2.13.1. It prints the following:
1
2
ok
int._throwFormatException is usually raised when the source String is invalid, as mentioned in the SDK documentation. I also suggest updating the Flutter and Dart SDKs you're using to see if that fixes the issue on your end.
Related
How do you write a Binary Literal in Dart?
I can write a Hex Literal like so:
Int Number = 0xc
If I try the conventional way to write a Binary Literal:
Int Number = 0b1100
I get an error. I've tried to look it up, but I've not been able to find any information other than for hex.
There are currently no built-in binary number literals in Dart (or any base other than 10 and 16).
The closest you can get is: var number = int.parse("1100", radix: 2);.
Maybe you can use this:
// 0b1100 -> 1 at the 3rd bit and 1 at the 2nd bit
final number = 1 << 3 | 1 << 2;
// Print the binary string
print(number.toRadixString(2)); // 1100
Try the binary package:
import 'package:binary/binary.dart';

void main() {
  // New API.
  print(0x0C.toBinaryPadded(8)); // 00001100
}
see: https://pub.dev/documentation/binary/latest/
I am pretty new to Swift, and I don't have much exposure to C.
I am trying to write a function in C that will get a Swift string that I can then do something with. The problem is that I'm not 100% sure what the type should be in Swift to make C like what it sees.
So far, I have found several examples on Stack that seem like good starting points, but some examples seem dated for the current version of Swift.
I first started by using this example to get C and Swift talking to one another: Swift call C call Swift? I then took that and tried updating the Swift function to return a string of some kind. I understand that it needs to be a UTF-8 return type, but I'm not sure how to go about sending things properly. I've looked at How to pass a Swift string to a c function?, How to convert String to UnsafePointer<UInt8> and length, and How to convert string to unicode(UTF-8) string in Swift?, but none of them really work for a solution. Or I'm just typing it in incorrectly. So far, the closest I can get to returning something is as follows.
In Swift, my ViewController is:
import UIKit
class ViewController: UIViewController {
    @_silgen_name("mySwiftFunc") // give the function a C name
    public func mySwiftFunc(number: Int) -> [CChar]
    {
        print("Hello from Swift: \(number)")
        let address: String = "hello there";
        let newString = address.cString(using: String.Encoding.utf8)
        return newString!
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view.
        blah()
    }
}
And in C, the header is like:
#ifndef cfile_h
#define cfile_h
#include <stdio.h>
const char * mySwiftFunc(int);
int blah(void);
#endif /* cfile_h */
And the source is like:
#include "cfile.h"
int blah() {
    const char * retVal = mySwiftFunc(42); // call swift function
    printf("Hello from C: %s", retVal);
    return 0;
}
There is a bridging header file that just has #include "cfile.h". Obviously, there are still a lot of remnants from the first example, and these will be cleaned up later.
What needs to change to make this work? Right now, the console spits out
Hello from Swift: 42
Hello from C: (B\214
The Swift equivalent of const char * is UnsafePointer<CChar>?, so that's the correct return value. Then you have to think about memory management. One option is to allocate memory for the C string in the Swift function and leave it to the caller to release that memory eventually:
public func mySwiftFunc(number: Int) -> UnsafePointer<CChar>? {
    print("Hello from Swift: \(number)")
    let address = "hello there"
    let newString = strdup(address)
    return UnsafePointer(newString)
}
This passes a Swift string to strdup(), so that a (temporary) C string representation is created automatically; that C string is then duplicated. The calling C function has to release the memory when it is no longer needed:
int blah() {
    const char *retVal = mySwiftFunc(42);
    printf("Hello from C: %s\n", retVal);
    free((char *)retVal);
    return 0;
}
⚠️ BUT: Please note that there are more problems in your code:
- mySwiftFunc() is an instance method of a class, and therefore has an implicit self argument, which is ignored by the calling C function. That might work by chance, or cause strange failures.
- @_silgen_name should not be used outside of the Swift standard library, see this discussion in the Swift forum.
- A slightly better alternative is @_cdecl, but even that is not officially supported. @_cdecl can only be used with global functions (see the sketch after this list).
- Generally, calling Swift functions directly from C is not officially supported, see this discussion in the Swift forum for the reasons and possible alternatives.
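For completeness, a minimal sketch of the @_cdecl variant mentioned above (still not officially supported, and the function has to be a global at file scope; the C side stays the same as the blah()/free() example):
import Foundation

// Minimal sketch, not official API: @_cdecl exports a global Swift function
// under the C symbol name "mySwiftFunc"; Int32 matches C's int.
@_cdecl("mySwiftFunc")
public func mySwiftFunc(_ number: Int32) -> UnsafePointer<CChar>? {
    print("Hello from Swift: \(number)")
    let address = "hello there"
    // strdup() copies the temporary C representation of the Swift string;
    // the C caller is responsible for free()-ing it.
    return UnsafePointer(strdup(address))
}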
This question already has answers here:
Could not find an overload for '^' that accepts the supplied arguments
I'm using IBM's online Swift sandbox and I'm trying to run this code but it's giving me an error:
The code:
var a: Double = 1.0
var b: Double = 2.0
var c: Double = 3.0
var x: Double = 0.0
func x_func(a_var: Double, b_var: Double, c_var: Double) -> Double {
    x = (-b + (b^2 - 4*a*c)^(1/2))/(2*a)
    return x
}
print(x_func(a_var: a, b_var: b, c_var: c))
print(a)
print(b)
print(c)
The error:
<unknown>:0: error: unable to execute command: Killed
<unknown>:0: error: compile command failed due to signal (use -v to see invocation)
Could someone help me figure out what's wrong? I'm brand new to Swift, so I don't see an error here.
Use pow when you want exponents; ^ doesn't do that in Swift. In Swift, ^ is the bitwise XOR operator, not exponentiation.
import Foundation

var a: Double = 1.0
var b: Double = 2.0
var c: Double = 3.0
var x: Double = 0.0

func x_func(a_var: Double, b_var: Double, c_var: Double) -> Double {
    x = (-b + pow((pow(b, 2) - 4*a*c), 1/2))/(2*a)
    return x
}
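As a quick illustration of that point:
import Foundation

print(5 ^ 3)       // 6, because 0b101 ^ 0b011 == 0b110 (bitwise XOR)
print(pow(5.0, 3)) // 125.0, the exponentiation you actually want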
Make sure, when using things like the IBM Sandbox or HackerRank, to import the Foundation framework! It's easy to forget, and forgetting it can cause a lot of headaches.
Also, unless you're doing extra computation in your x_func method that you're not showing, you're asking for parameters to be passed in that you don't even use. You can either get rid of the global properties or change the function to not take any parameters. It would probably be cleaner to get rid of the globals and change it to something like this:
import Foundation

func x_func(a: Double, b: Double, c: Double) -> Double {
    let x = (-b + pow((pow(b, 2) - 4*a*c), 1/2))/(2*a)
    return x
}
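For instance, called with coefficients whose discriminant is non-negative (with the question's a = 1, b = 2, c = 3 the square root of a negative number yields NaN):
print(x_func(a: 1, b: -3, c: 2)) // 2.0, the larger root of x² - 3x + 2 = 0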
I have a receipt validation class that has been broken since Swift 3 was released. I fixed some issues, but I still have many more ...
Here is the GitHub source code I used: https://gist.github.com/baileysh9/4386ea92b047d97c7285#file-parsing_productids-swift and https://gist.github.com/baileysh9/eddcba49d544635b3cf5
First error:
var p = UnsafePointer<UInt8>(data.bytes)
The compiler throws: Cannot invoke initializer for type 'UnsafePointer<UInt8>' with an argument list of type '(UnsafeRawPointer)'
Second error:
while (ptr < end)
Binary operator '<' cannot be applied to two 'UnsafePointer<UInt8>' operands
Thank you very much in advance :)
EDIT
Thanks to LinShiwei's answer I found a solution for the UnsafePointer declaration. It compiles, but I haven't tested it yet (because other errors prevent me from testing):
func getProductIdFromReceipt(_ data: Data) -> String?
{
    let tempData: NSMutableData = NSMutableData(length: 26)!
    data.withUnsafeBytes {
        tempData.replaceBytes(in: NSMakeRange(0, data.count), withBytes: $0)
    }
    var p: UnsafePointer? = tempData.bytes.assumingMemoryBound(to: UInt8.self)
In Swift 3, you cannot initialize an UnsafePointer from an UnsafeRawPointer.
You can use assumingMemoryBound(to:) to convert an UnsafeRawPointer into an UnsafePointer<T>. Like this:
var ptr = data.bytes.assumingMemoryBound(to: UInt8.self)
Use debugDescription or distance(to:) to compare two pointers.
while(ptr.debugDescription < endPtr.debugDescription)
or
while(ptr.distance(to:endPtr) > 0)
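To see both pieces together, here is a minimal, self-contained sketch (using a throwaway NSData rather than the actual receipt data):
import Foundation

// Bind NSData's raw bytes to UInt8 and walk them,
// using distance(to:) as the loop condition instead of <.
let data = NSData(bytes: [0x01, 0x02, 0x03] as [UInt8], length: 3)
var ptr = data.bytes.assumingMemoryBound(to: UInt8.self)
let end = ptr + data.length
while ptr.distance(to: end) > 0 {
    print(ptr.pointee) // 1, 2, 3
    ptr += 1
}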
It is possible to put the C-style address-of sign & in front of an Int8 or UInt8 array to make a pointer to supply as a C function's input, like &aBuffer below. (The data must be copied to a local array first, so that you keep control of the storage until the operation is finished; otherwise you get an error.) Here it is in a routine handling dropInteraction data, which delivers a byte array:
func interpretInstanceData(filename: String, Buffer: [UInt8]) -> String {
    var aBuffer = Buffer // local copy, so the storage stays valid for the C call
    let sInstanceData = String(cString: Ios_C_InterpretInstanceData(filename, &aBuffer, Int32(aBuffer.count)))
    return sInstanceData
}
The question is slightly old, but it came up when googling for how to convert a Swift byte array to a C pointer (which is what UnsafePointer<UInt8> is). I think this answer is still helpful for later editions of Swift (the ones I use), and it would have worked back then too. It works anywhere you need a pointer from Swift; just make the array the right element type first.
May have recently changed to just this, without the ".bytes." part:
var p: UnsafePointer = data.assumingMemoryBound(to: UInt8.self)
from the original:
var p = UnsafePointer<UInt8>(data.bytes)
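For what it's worth, with a Swift Data value the more idiomatic current approach is to do the pointer work inside withUnsafeBytes, where the buffer is guaranteed to stay valid; a minimal sketch (not the receipt-parsing code itself):
import Foundation

// Read the first byte of a Data without keeping a pointer around.
let data = Data([0x01, 0x02, 0x03])
let firstByte: UInt8? = data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
    raw.bindMemory(to: UInt8.self).first
}
print(firstByte ?? 0) // 1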
I am converting a string to Float/Decimal in MonoTouch, but it is giving a FormatException. I am using Decimal.Parse() and Convert.ToDecimal(). Please suggest a solution for this conversion.
decimal d = Convert.ToDecimal(UIDevice.CurrentDevice.SystemVersion, CultureInfo.InvariantCulture);
decimal d = Decimal.Parse(UIDevice.CurrentDevice.SystemVersion, CultureInfo.InvariantCulture);
SystemVersion is a string with multiple dot (.) characters. That won't parse correctly, as is, into a float or a decimal.
Depending on what you want, you could modify the string before parsing. E.g. if you want 7.0 out of the string 7.0.2 then you could do a substring (up to the 2nd . character).
OTOH if what you need to do is a version check (most common operation) then you only need to do this:
if (UIDevice.CurrentDevice.CheckSystemVersion (7,0)) {
    // do this in iOS7+
} else {
    // do this before iOS7
}
System.Version has a constructor that parses that kind of string, from 2 to 4 components, in the form of:
major.minor[.build[.revision]]
So, to parse UIDevice.CurrentDevice.SystemVersion you can do:
var version = new Version (UIDevice.CurrentDevice.SystemVersion);
if (version.Major >= 7) {
    // iOS7+
} else {
    // anything else
}
If you're trying to figure out a particular version, use this. Otherwise @poupou's solution is great and more iOS-minded.