How can I access a binary data file (.DAT)? I am using the GeoNames API. Can anyone help me?
If you are referring to the binary flat-file format used by MaxMind's geolocation database, they offer some handy utility classes in C# and Java for accessing it.
http://www.maxmind.com/app/api
Assuming you are using C# (from the tag), you can use the BinaryReader class to read binary data. See "How to read and write to a binary file":
FileStream fs = File.Open(Environment.CurrentDirectory + @"\settings.bin", FileMode.Open);
BinaryReader reader = new BinaryReader(fs);
long number = reader.ReadInt64();    // 8-byte integer
byte[] bytes = reader.ReadBytes(3);  // three raw bytes
string s = reader.ReadString();      // length-prefixed string
reader.Close();
fs.Close();
Console.WriteLine(number);
foreach (byte b in bytes)
{
    Console.Write("[{0}]", b);
}
Console.WriteLine();
Console.WriteLine(s);
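For completeness, a settings.bin laid out the way the reader above expects could be produced with BinaryWriter along these lines; the values below are placeholders, not anything from a real GeoNames or MaxMind file:
using (FileStream outFs = File.Open(Environment.CurrentDirectory + @"\settings.bin", FileMode.Create))
using (BinaryWriter writer = new BinaryWriter(outFs))
{
    writer.Write(42L);                    // Int64, read back by ReadInt64()
    writer.Write(new byte[] { 1, 2, 3 }); // three raw bytes, read back by ReadBytes(3)
    writer.Write("hello");                // length-prefixed string, read back by ReadString()
}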
I have the in-memory bytes of a "blob", but the API that I want to process this "blob" with only accepts dart:io File objects.
Is there a way to create a "fake" dart:io File, simply wrapping my in-memory bytes, so that I can pass this "fake" File to my API?
Assume that a file system doesn't exist, and assume that I can't write the in-memory bytes to a "real" file.
Thanks!
You can create an in-memory file using MemoryFileSystem from the file package:
Example:
File file = MemoryFileSystem().file('test.dart')
..writeAsBytesSync(blobBytes);
Add the path_provider dependency to pubspec.yaml:
dependencies:
path_provider: 0.2.2
Write the byte data to a temporary file, use it, then delete it:
import 'dart:io';

import 'package:path_provider/path_provider.dart';

main() async {
  String dir = (await getTemporaryDirectory()).path;
  File temp = new File('$dir/temp.file');
  var bytes = [0, 1, 2, 3, 4, 5];
  await temp.writeAsBytes(bytes);
  /* do something with the temp file */
  temp.delete();
}
You can override dart:io behaviour using the IOOverrides class. See the example below:
await IOOverrides.runZoned(
  () async {
    // Write your code here.
  },
  createDirectory: (p0) {
    return MockDirectory();
  },
  createFile: (p0) {
    return MockFile();
  },
);
You can check the IOOverrides documentation for more info.
I always use cross_file when designing APIs that work with files.
In addition to working on multiple platforms, it also has an XFile.fromData constructor that allows you to create a file in memory.
I'm very new to Android programming. I have code that creates a file in a designated folder and then tries to write something to it, like below:
path = System.Environment.GetFolderPath(System.Environment.SpecialFolder.MyDocuments);
var filename = Path.Combine(path, "Test.xml");
Directory.CreateDirectory (path);
if (!File.Exists (path + "/" + "Test.xml")) {
    File.Create (path + "/" + "Test.xml");
}
using (var streamWriter = new StreamWriter(filename, true))
{
streamWriter.WriteLine("<?xml version='1.0' encoding='utf-8'?>");
streamWriter.WriteLine ("<Apples>");
streamWriter.WriteLine ("</Apples>");
}
On the line using (var streamWriter = new StreamWriter(filename, true)), I'm getting a "Sharing violation on path" error.
Could someone please point out exactly where I'm going wrong and suggest a solution?
Thanks,
Anirban
Why do you create the file and then reopen it to write to it? StreamWriter has a constructor that will do just that: it will create a new file if one doesn't exist.
Initializes a new instance of the StreamWriter class for the specified file on the specified path, using the default encoding and buffer size. If the file exists, it can be either overwritten or appended to. If the file does not exist, this constructor creates a new file.
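For example, a sketch of the question's own code with the File.Exists/File.Create block removed, letting the StreamWriter constructor create Test.xml when it is missing:
var path = System.Environment.GetFolderPath(System.Environment.SpecialFolder.MyDocuments);
var filename = Path.Combine(path, "Test.xml");
Directory.CreateDirectory(path);

// true = append; the constructor creates the file itself if it doesn't exist,
// and the using block releases the handle once writing is done.
using (var streamWriter = new StreamWriter(filename, true))
{
    streamWriter.WriteLine("<?xml version='1.0' encoding='utf-8'?>");
    streamWriter.WriteLine("<Apples>");
    streamWriter.WriteLine("</Apples>");
}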
StreamWriter could not access the file because File.Create returned an open FileStream that you never consumed or closed.
As mentioned above, the File.Create call is not necessary. You could also use:
using (var writer = new StreamWriter(File.Create(statusTxtPath)))
{
// do work here.
}
which will consume the file stream and close it. Whenever you work with streams, and with most classes that interact with streams, be sure to use a using block to ensure handles are released properly.
Ok...I have managed to resolve the issue...by using
using (var streamWriter = new StreamWriter (File.Create (path + "/" + "DoctorsList.xml")))
Basically, I use the Any23 distiller to extract RDF statements from files with embedded RDFa (the actual files were created by DBpedia Spotlight using the xhtml+xml output option). With the Any23 RDFa distiller I can extract the RDF statements (I also tried Java-RDFa, but with it I could only extract the prefixes!). However, when I try to pass the statements to a Jena model and print the results to the console, nothing happens!
This is the code I am using:
File myFile = new File("T1");
Any23 runner= new Any23();
DocumentSource source = new FileDocumentSource(myFile);
ByteArrayOutputStream outA = new ByteArrayOutputStream();
InputStream decodedInput = new ByteArrayInputStream(outA.toByteArray()); // convert the output stream to an input stream so I can pass it to the Jena model
TripleHandler writer = new NTriplesWriter(outA);
try {
    runner.extract(source, writer);
} finally {
    writer.close();
}
String ttl = outA.toString("UTF-8");
System.out.println(ttl);
System.out.println();
System.out.println();
Model model = ModelFactory.createDefaultModel();
model.read(decodedInput, null, "N-TRIPLE");
model.write(System.out, "TURTLE"); // prints nothing!
Can anyone tell me what I have done wrong? Probably multiple things!
Is there any easy way I can extract the subjects of the RDF statements directly from Any23 (bypassing Jena)?
As I am quite inexperienced in programming any help would be really appreciated!
You are calling
InputStream decodedInput = new ByteArrayInputStream(outA.toByteArray());
before calling Any23 to insert the triples, so at the point of the call outA is still empty.
Move this line to after the try/finally block, once the writer has been closed.
I'm having a problem with UTF-8 encoding in my ASP.NET MVC 2 application in C#. I'm trying to let the user download a simple text file built from a string. I am getting the byte array with the following line:
var x = Encoding.UTF8.GetBytes(csvString);
but when I return it for download using:
return File(x, ..., ...);
I get a file without a BOM, so Croatian characters don't show up correctly. This is because my byte array does not include a BOM after encoding. I tried inserting those bytes manually and then it shows up correctly, but that's not the best way to do it.
I also tried creating a UTF8Encoding instance and passing a boolean value (true) to its constructor to include the BOM, but that doesn't work either.
Anyone has a solution? Thanks!
Try like this:
public ActionResult Download()
{
    var data = Encoding.UTF8.GetBytes("some data");
    var result = Encoding.UTF8.GetPreamble().Concat(data).ToArray();
    return File(result, "application/csv", "foo.csv");
}
The reason is that the UTF8Encoding constructor that takes a boolean parameter doesn't do what you would expect:
byte[] bytes = new UTF8Encoding(true).GetBytes("a");
The resulting array contains a single byte with the value 97. There's no BOM because UTF-8 doesn't require a BOM.
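A quick sketch of that behaviour: the constructor flag controls what GetPreamble() returns, but GetBytes() never emits a BOM either way:
byte[] data = new UTF8Encoding(true).GetBytes("a");      // { 0x61 } - one byte, no BOM
byte[] preamble = new UTF8Encoding(true).GetPreamble();  // { 0xEF, 0xBB, 0xBF } - the BOM

Console.WriteLine(data.Length);      // 1
Console.WriteLine(preamble.Length);  // 3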
I created a simple extension method to convert any string, in any encoding, to the byte-array representation it would have when written to a file or stream:
public static class StreamExtensions
{
    public static byte[] ToBytes(this string value, Encoding encoding)
    {
        using (var stream = new MemoryStream())
        using (var sw = new StreamWriter(stream, encoding))
        {
            sw.Write(value);
            sw.Flush();
            return stream.ToArray();
        }
    }
}
Usage:
stringValue.ToBytes(Encoding.UTF8)
This will also work for other encodings, like UTF-16, which require a BOM.
UTF-8 does not require a BOM, because it is a sequence of 1-byte words. UTF-8 = UTF-8BE = UTF-8LE.
In contrast, UTF-16 requires a BOM at the beginning of the stream to identify whether the remainder of the stream is UTF-16BE or UTF-16LE, because UTF-16 is a sequence of 2-byte words and the BOM identifies whether the bytes in the words are BE or LE.
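As an illustration (not part of the original answer), the preambles .NET reports for each encoding show the difference:
Console.WriteLine(BitConverter.ToString(Encoding.Unicode.GetPreamble()));          // FF-FE  -> UTF-16LE
Console.WriteLine(BitConverter.ToString(Encoding.BigEndianUnicode.GetPreamble())); // FE-FF  -> UTF-16BE
Console.WriteLine(BitConverter.ToString(Encoding.UTF8.GetPreamble()));             // EF-BB-BF, optional for UTF-8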
The problem does not lie with the Encoding.UTF8 class. The problem lies with whatever program you are using to view the files.
Remember that .NET strings are all Unicode while they stay in memory, so if you can see your csvString correctly in the debugger, the problem is in writing the file.
In my opinion you should return a FileResult with the same encoding as the file. Try setting the encoding of the returned File.
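One way to read that suggestion (a sketch, not the answerer's exact code): declare the charset as part of the content type passed to File(), so the client knows how to decode the bytes even without a BOM:
public ActionResult Download()
{
    string csvString = "šđčćž;some;data";             // placeholder content
    byte[] bytes = Encoding.UTF8.GetBytes(csvString);

    // "charset=utf-8" tells the client which encoding the bytes use.
    return File(bytes, "text/csv; charset=utf-8", "report.csv");
}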
I have a site where I allow members to upload photos. In the MVC Controller I take the FormCollection as the parameter to the Action. I then read the first file as type HttpPostedFileBase. I use this to generate thumbnails. This all works fine.
In addition to allowing members to upload their own photos, I would like to use the System.Net.WebClient to import photos myself.
I am trying to generalize the method that processes the uploaded photo (file) so that it can take a general Stream object instead of the specific HttpPostedFileBase.
I am trying to base everything off of Stream since the HttpPostedFileBase has an InputStream property that contains the stream of the file and the WebClient has an OpenRead method that returns Stream.
However, by going with Stream over HttpPostedFileBase, it looks like I am losing the ContentType and ContentLength properties, which I use for validating the file.
Not having worked with binary streams before: is there a way to get the ContentType and ContentLength from a Stream? Or is there a way to create an HttpPostedFileBase object from the Stream?
You're right to look at it from a raw stream perspective, because then you can create one method that handles streams, and therefore the many scenarios they can come from.
In the file-upload scenario, the stream you're acquiring is exposed on a separate property from the content type. Sometimes magic numbers can be used to detect the data type from the stream's header bytes, but this might be overkill since the information is already available to you through other means (i.e. the Content-Type header, the file extension, etc.).
You can measure the byte length of the stream just by virtue of reading it so you don't really need the Content-Length header: the browser just finds it useful to know what size of file to expect in advance.
If your WebClient is accessing a resource URI on the Internet, it will know the file extension like http://www.example.com/image.gif and that can be a good file type identifier.
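For instance (a small sketch, not from the original answer), the extension can be pulled straight from the URI:
var uri = new Uri("http://www.example.com/image.gif");
string ext = Path.GetExtension(uri.AbsolutePath);  // ".gif"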
Since the file info is already available to you, why not add one more argument to your custom processing method to accept a content-type string identifier, like:
public static class Custom
{
    // Works with a stream from any source and a content-type string identifier.
    public static void SavePicture(Stream inStream, string contentIdentifier)
    {
        // Parse and recognize contentIdentifier to know the kind of file.
        // Read the bytes of the file from the stream (while counting them).
        // Write the bytes to wherever the destination is (e.g. disk).
        // Example:
        long totalBytesSeen = 0L;
        byte[] bytes = new byte[1024]; // 1K buffer to hold the bytes.

        // Read one chunk of bytes at a time.
        do
        {
            int num = inStream.Read(bytes, 0, 1024); // read up to 1024 bytes

            // No bytes read means end of stream.
            if (num == 0)
                break; // good bye

            totalBytesSeen += num; // The actual length accumulates here.

            /* Can check for a "magic number" here, while reading the stream,
             * in case the file extension or content type cannot be trusted.
             */

            /* Write logic here to write the byte buffer to
             * disk or do whatever you want with it.
             */
        } while (true);
    }
}
Some useful filename parsing features are in the IO namespace:
using System.IO;
Use your custom method in the scenarios you mentioned like so:
From an HttpPostedFileBase instance named myPostedFile
Custom.SavePicture(myPostedFile.InputStream, myPostedFile.ContentType);
When using a WebClient instance named webClient1:
var imageFilename = "pic.gif";
var stream = webClient1.OpenRead("http://www.example.com/images/" + imageFilename);
//...
Custom.SavePicture(stream, Path.GetExtension(imageFilename));
Or even when processing a file from disk:
Custom.SavePicture(File.OpenRead(pathToFile), Path.GetExtension(pathToFile));
Call the same custom method for any stream, along with a content identifier that you can parse and recognize.
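A hypothetical helper for that parsing step (the names and supported types are illustrative only, e.g. as extra members of the Custom class above) might map either a MIME type or a file extension to something SavePicture can branch on:
public enum PictureKind { Unknown, Gif, Jpeg, Png }

public static PictureKind ParseContentIdentifier(string contentIdentifier)
{
    // Accepts either a Content-Type value ("image/gif") or an extension (".gif").
    switch ((contentIdentifier ?? "").Trim().ToLowerInvariant())
    {
        case "image/gif":
        case ".gif":
            return PictureKind.Gif;
        case "image/jpeg":
        case ".jpg":
        case ".jpeg":
            return PictureKind.Jpeg;
        case "image/png":
        case ".png":
            return PictureKind.Png;
        default:
            return PictureKind.Unknown;
    }
}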