Looking for an open source WBXML parser/writer to use with ActiveSync. Do you know a good one? (Java)
I have a good WBXML encoder and decoder implemented in Java (for Android) for my app (Corporate AddressBook). It works well in regular Java as well.
Look in the wbxml folder (you will need both the wbxml and the activesync directories).
kxml2 is the best implementation I've seen so far; you need to provide the code pages for ActiveSync yourself.
Code pages are simple arrays of strings (tag names), easily composed from the ActiveSync WBXML specification.
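For illustration, a rough sketch (assuming kxml2's org.kxml2.wap.WbxmlParser and its setTagTable method; the tag names below are placeholders, not a complete ActiveSync code page — copy the real values from the MS-ASWBXML specification) might look like this:

import java.io.InputStream;
import org.kxml2.wap.WbxmlParser;
import org.xmlpull.v1.XmlPullParser;

public class ActiveSyncWbxml {

    // Hypothetical fragment of one code page; kxml2 maps entry 0 of the
    // array to WBXML tag token 0x05, entry 1 to 0x06, and so on.
    private static final String[] CODE_PAGE_0 = {
        "Sync", "Responses", "Add", "Change", "Delete" /* ... */
    };

    public static void dumpTags(InputStream wbxml) throws Exception {
        WbxmlParser parser = new WbxmlParser();
        parser.setTagTable(0, CODE_PAGE_0);    // one call per code page
        // parser.setTagTable(1, CODE_PAGE_1); // ...and so on
        parser.setInput(wbxml, null);

        // From here on it behaves like any XmlPullParser.
        for (int event = parser.getEventType();
             event != XmlPullParser.END_DOCUMENT;
             event = parser.next()) {
            if (event == XmlPullParser.START_TAG) {
                System.out.println("<" + parser.getName() + ">");
            }
        }
    }
}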
We are developing an application which runs on various platforms (Windows, Windows RT, Mac OS X, iOS, Android).
The problem is how to manage the different localizations on the different platforms in an easy way. The language files on the different platforms have various formats (some are XML-based, others are simple key-value pairs, and others are totally crazy formats, like on Mac OS).
I'm sure we aren't the first company with this problem, but I wasn't able to find an easy-to-use solution that gives us one "data source" where the strings are collected in the different languages (ideally with a user interface for the translators) and from which we can export to the different formats for the different platforms.
Does anybody have a solution for this problem?
Greetings
Alexander
I recommend using the GNU Gettext toolchain for management and, at runtime, doing one of the following:
use some alternate implementation for runtime reading, like Boost.Locale,
write your own implementation (the .mo format is pretty trivial), or
use the Translate Toolkit to convert the message catalogs to some other format of your liking.
You can't use the libintl component of GNU Gettext, because it is licensed under the LGPL and the terms of both the Apple AppStore and the Windows Live Store are incompatible with that license. But it is really trivial to reimplement the bit you need at runtime.
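To show how small that runtime bit is, here is a rough sketch (not production code) of loading a .mo catalog into a Java map; it assumes the documented .mo header layout and UTF-8 content, and it skips plural forms and the optional hash table:

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class MoCatalog {

    public static Map<String, String> load(String path) throws IOException {
        ByteBuffer buf = ByteBuffer.wrap(Files.readAllBytes(Paths.get(path)));

        // The magic number (0x950412de) tells us the byte order of the file.
        buf.order(ByteOrder.LITTLE_ENDIAN);
        if (buf.getInt(0) != 0x950412de) {
            buf.order(ByteOrder.BIG_ENDIAN);
            if (buf.getInt(0) != 0x950412de) {
                throw new IOException("not a .mo file: " + path);
            }
        }

        int count      = buf.getInt(8);   // number of messages
        int origTable  = buf.getInt(12);  // offset of original-string table
        int transTable = buf.getInt(16);  // offset of translation table

        Map<String, String> messages = new HashMap<>();
        for (int i = 0; i < count; i++) {
            String original    = readEntry(buf, origTable  + 8 * i);
            String translation = readEntry(buf, transTable + 8 * i);
            messages.put(original, translation);
        }
        return messages;
    }

    // Each table entry is a (length, offset) pair of 32-bit integers.
    private static String readEntry(ByteBuffer buf, int entryOffset) {
        int length = buf.getInt(entryOffset);
        int offset = buf.getInt(entryOffset + 4);
        byte[] bytes = new byte[length];
        for (int i = 0; i < length; i++) {
            bytes[i] = buf.get(offset + i);
        }
        return new String(bytes, StandardCharsets.UTF_8);
    }
}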
The Translate Toolkit actually reimplements all or most of GNU Gettext and supports many additional localization formats, but the Gettext .po format has the most free tools for it (e.g. Poedit for local editing and Weblate for online editing), so I recommend sticking with it anyway. And read the GNU Gettext manual; it describes the intended process and the rationale behind it well.
I have quite good experience with the toolchain. The Translate Toolkit is easy to script when you need some special processing like extracting translatable strings from your custom resource files and Weblate is easy to use for your translators, especially when you rely on business partners and testers in various countries for most translations like we do.
Translate Toolkit also supports extracting translatable strings from HTML, so the same process can be used for translating your web site.
I did a project for iPhone and Android which had many translations and I think I have exactly the solution you're looking for.
The way I solved it was to put all translation texts in an Excel spreadsheet and use a VBA macro to generate the .strings and .xml translation files from there. You can download my example Excel sheet plus VBA macro here:
http://members.home.nl/bas.de.reuver/files/multilanguage.zip
Just recently I've also added preliminary Visual Studio .resx output, although that's untested.
edit:
By the way, my JavaScript Xcode/Eclipse converter might also be of use.
You can store your translations on https://l10n.ws and retrieve them via their API.
Disclaimer: I am the CTO and Co-Founder at Tethras, but will try to answer this in a way that is not just "Use our service".
As loldop points out above, you really need to normalize your content across all platforms if you want to have a one-stop solution for managing your localized content. This can be a lot of work, and would require much coding and scripting and calling of various tools from the different SDKs to arrive at a common format that would service the localization needs of all the various file formats you need to support. The length and complexity of my previous sentence is inversely proportional to the amount of work you would need to do to arrive at a favorable solution for all of this.
At Tethras, we have built a platform that alleviates the need for multi-platform software publishers to have to do this. We support all of the native formats from the platforms you list above, and can leverage translations from one file format to another. For example, translate the content in Localizable.strings from your iOS app into a number of languages, then upload your equivalent strings.xml file from Android or foo.resx from Windows RT to the system, and it will leverage translations for you automatically. Any untranslated strings will be flagged and you can order updates for these strings.
In effect, Tethras is a CMS for localized content across many different native file formats.
I'm looking to port my working Android XML parser to BlackBerry, but the latter's Java feature set isn't as rich, and I didn't want to have to write two parsers.
The following code yields "The method getXMLReader() is undefined for the type SAXParser":
SAXParserFactory spf = SAXParserFactory.newInstance();
SAXParser sp = spf.newSAXParser();
XMLReader xr = sp.getXMLReader();
Am I just out of luck here?
It's true I am trying to use org.xml.sax. I've read all the XML parsing discussions I can find out there. I wonder now if I can do this? Should I be using org.kxml2 instead because org.xml.sax makes no sense in BlackBerry land?
Thanks for any advice!
Russ
You don't need to use the getXMLReader() method.
Now that you have your SAXParser, use it to parse a document or stream.
SAXParserFactory spf = SAXParserFactory.newInstance();
SAXParser parser = spf.newSAXParser();
Open your stream or file and assign it to a variable; let's call ours input.
parser.parse(input, handler);
The handler class will implement all of the callbacks to handle the events the parser encounters.
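As a minimal sketch (plain javax.xml.parsers / org.xml.sax usage as on Android; the same subset should be what JSR 172 exposes on BlackBerry, and the <item> element name is just an example), a handler could look like this:

import java.io.InputStream;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class ItemHandler extends DefaultHandler {

    private final StringBuffer text = new StringBuffer();

    public void startElement(String uri, String localName, String qName,
                             Attributes attributes) {
        text.setLength(0); // reset the buffer for each new element
    }

    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length); // may be called several times per element
    }

    public void endElement(String uri, String localName, String qName) {
        if ("item".equals(qName)) {
            System.out.println("item = " + text.toString().trim());
        }
    }

    public static void parse(InputStream input) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        parser.parse(input, new ItemHandler()); // no getXMLReader() needed
    }
}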
I found this explanation of SAX to be quite helpful.
I'll answer this though I suspect there are others who know better.
My assessment of BlackBerry is that it's very poor in its API set. So, the SAX XML parser isn't available as it is on Android. Okay, that's cool. It's older and from a "smaller" time.
Worse though, it appears very challenging even to add a third-party library to a BlackBerry application. I followed various posts out there and failed to incorporate my own "third-party" JAR convincingly into a BlackBerry project despite the collective wisdom of a number of web pages on the topic.
I was thinking then of writing my own parsing engine to replace SAXParser.parse(). How hard could it be since my expectations for it are childishly simple?
Very hard indeed since it appears that the JavaME support for java.lang.Class is impoverished as well; it doesn't support the important reflection methods such as getDeclaredMethods() for use in creating the engine (into which I naturally wanted to plug my existing XML parser-handler).
Alas, this makes me wonder just what BlackBerry apps out there are able to do? I'm probably giving this world short shrift, but a couple of days were sufficient for me to go from zero to parsing XML texts off the web on Android, so I expected a very easy time of it here too.
Please feel free to shred my answer. If you can and do, especially if you add a real one, it will doubtless greatly benefit other folk new to BlackBerry development including me later when I come back to the problem (so that I can avoid brute-force stringing through the XML stream instead of cleanly parsing it).
I'm looking for the best option to store my application settings. I decided to write my own class that inherits from TPersistent and stores all the available config options. Currently I'm looking for the best way to save it, and I found JvAppStorage, which looked very promising (as I'm using JVCL in my project anyway...), but it doesn't handle Unicode (WideStrings) properly. For XML files it stores the characters as entities; for INI files the values seem to be stored OK, but in both cases loading the strings replaces the text with lots of question marks...
Is there any good replacement that handles Unicode as well?
Thanks in advance.
I recently converted to JSON from INI files (and dreaded XML!) for settings storage. It's just so convenient and flexible. See SuperObject.
It's quite common to use UTF-8 as the on-disk representation of Unicode data. In your code, use the UTF8String data type to hold data encoded that way, so you remember that you'll need to convert it before using it in the rest of your application.
I use MSXML to store settings per user in a personal directory on the network.
It should handle Unicode as well.
I have an external device that spits out UDP packets of binary data, and software running on an embedded system that needs to read this data stream, parse it, and do something useful with it. The binary data gets logged to a file as well. I would like to write a parser that can easily take its input directly from either the UDP stream or a file, parse the data into a specific format, and then direct the output either to a file (e.g. a MATLAB .dat file) or to another process that will do some real-time processing. Are there any resources that would help me with this, and what is the best way to go about it? I think it might make sense to use C++ streams, but I'm not familiar with creating custom output streams. Does this seem like a good approach, or is there a better way to go about it?
Thanks.
The beauty of binary data is that it is generally in a very fixed format.
A typical method of parsing it is to declare a structure that maps onto the received packets, and then to just use type-casts to read the fields as structure elements.
The beauty is that this requires no parsing.
You have to be careful about structure packing rules and endianness to make the structure map in exactly the same way. The C "offsetof" and "sizeof" macros are useful for emitting some debug info to check that your structure is indeed mapping to what you think it is.
Packing rules can typically be altered either by directives (such as #pragmas) or by command-line options. Endianness you are stuck with: if it's different from what your embedded system uses, declare all the fields as bytes, or use something like the ntohs/ntohl macros to do the byte swapping.
The New Jersey Machine Code Toolkit is a scheme for decoding arbitrary binary patterns. It was originally designed for decoding instruction sets, but it ought to be just fine for decoding message formats. You provide a description of the binary format, and it synthesizes code to access the fields of that format (when valid). Thus you can refer to message fields using generated function calls rather than think about where the field is or how it is encoded.
I've been looking for a good general purpose binary network protocol definition framework to provide a way to write real-time game servers and clients (think World Of Warcraft or Quake III) in multiple languages (e.g. Java backend server and iPhone front-end client written in Objective-C and Cocoa).
I want to support Java and Flash clients, iPhone clients, and C# clients on Windows (and XNA clients on Xbox).
I'm looking for a way to efficiently send/receive messages over a TCP/IP or UDP socket connection. I'm not looking for something that can be sent over an HTTP web service, like JSON or XML-marshalled objects, although Hessian's binary web service protocol is a very interesting solution.
I want a network protocol format and client/server basic implementation that will allow a client to connect to a server and send any message in the defined protocol and receive any message in the protocol without having to bind to some kind of RPC endpoint. I want a generic stream of any message in my protocol incoming and outgoing. This is so that I can support things like the server sending all clients the positions of various entities in the game every 100 milliseconds.
The network protocol frameworks I've found are as follows:
Google's Protocol Buffers - but it lacks support for things like sending/receiving arbitrary messages from your given protocol.
Apache Thrift - an interesting option but it is geared mainly towards RPC instead of generic game client/server socket type connections where the client or server can send messages at any time and not just in response to a client RPC request.
RakNet Multiplayer - RakNet provides a full multiplayer network library (it's free for indie development with revenue under $250k).
UPDATE: Oculus VR acquired RakNet and it's free/open source now. You can find it on GitHub.
Hessian Binary Web Service Protocol - an HTTP web service binary protocol; it is well-suited to sending binary data without any need to extend the protocol with attachments.
RakNet provides a good game/simulation-oriented multiplayer library.
Apache Thrift and Google's Protocol Buffers seem to be the simplest approaches to use in a game network protocol client/server architecture.
Hessian seems like a great fit if you want to create a web-based game server with a Java or Flash client using some type of server-push technology like Comet. Hessian might provide a really interesting way to support real-time games on the web, and even make it possible to host them on VM web solutions like Google App Engine or Amazon EC2.
There's some discussion about using various protocol definition frameworks for games and other uses:
Comparison of Various Serialization Frameworks
Thrift vs Protocol Buffers - Thrift is declared the better framework because it has a fully supported RPC client/server implementation
Using Protocol Buffers for client server Game API determining what type of message to decode
Bi-Directional RPC using thrift
DIS
If you do go the route of writing your own protocol, you may want to read the answer I posted here.
In summary, it discusses what you should think about when writing a protocol, and lists a few tricks for versioning and maintaining backwards and forwards compatibility.
If you are really concerned about multiple platforms and languages, be sure to take endian issues into account. A binary protocol designed for this use must use network byte order, so it needs custom per-data-type serialization functions; you cannot just blindly push C structs into network buffers.
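For example, a Java peer gets network byte order for free, because DataOutputStream always writes multi-byte values big-endian; the message layout below is invented purely for illustration:

import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Hand-rolled, per-field serialization of a hypothetical position update.
// The C/C++/C# peers must read the same fields in the same order and
// convert from network byte order (e.g. with ntohl-style helpers).
public class EntityPositionMessage {

    public static final short TYPE_ENTITY_POSITION = 1; // example type id

    public int entityId;
    public float x, y, z;

    public byte[] encode() throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bytes);
        out.writeShort(TYPE_ENTITY_POSITION); // 2-byte message type header
        out.writeInt(entityId);               // 4 bytes, big-endian
        out.writeFloat(x);                    // IEEE 754, big-endian
        out.writeFloat(y);
        out.writeFloat(z);
        out.flush();
        return bytes.toByteArray();
    }
}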
A common solution for this problem at game companies is to have a protocol description language or specification in a simple format like XML, Python, or Lua, and then have code generation for each target language that generates packet classes with both the data structure and the serialization. The specification could use a type system that starts with basic types, then extends to include game-specific types with semantic information, enumerations, or more complex structures. For example, a data file could look like:
Attack = {
    source = 'objectId',
    target = 'objectId',
    weapon = 'weapon::WEAP_MAIN',
    seed = 'int'
}
This could generate code like:
#define PT_ATTACK 10002

class PacketAttack : public Packet {
public:
    // m_packetType is assumed to be a member of the Packet base class.
    PacketAttack() { m_packetType = s_packetType; }

    ObjectId m_source;
    ObjectId m_target;
    WeaponType m_weapon;
    int m_seed;

    bool Write(Stream* outStream) {
        Packet::Write(outStream);
        *outStream << m_source;
        *outStream << m_target;
        *outStream << m_weapon;
        *outStream << m_seed;
        return true;
    }

    bool Read(Stream* inStream);

    static const int s_packetType;
};
This does require some more infrastructure: streams, packet base classes, safe serialization functions, and so on.
I want to echo Bill K's suggestion. It's not hard to roll your own protocol.
For the iPhone side, have a look at AsyncSocket, which has support for delimiter-based TCP packets built in, and it's not hard to build a solution which uses packet headers.
If you quickly want a test server to play against AsyncSocket on the iPhone, you can look at Naga (for the Java server part), which has ready-made support both for delimiter-based packets and for packets with headers. Naga was partially written with networked games in mind.
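If you do roll your own header-based framing over plain TCP, the core of it is small. Here is a sketch of length-prefixed packets in Java; it only shows the underlying idea, not AsyncSocket's or Naga's actual API:

import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Each packet on the wire is a 4-byte big-endian length followed by that
// many payload bytes. The payload itself can then be decoded with whatever
// serialization scheme you chose (Protocol Buffers, Thrift, ...).
public final class LengthPrefixedFraming {

    private static final int MAX_PACKET = 1 << 20; // arbitrary 1 MB sanity limit

    public static void writePacket(DataOutputStream out, byte[] payload)
            throws IOException {
        out.writeInt(payload.length); // header in network byte order
        out.write(payload);
        out.flush();
    }

    public static byte[] readPacket(DataInputStream in) throws IOException {
        int length = in.readInt();             // blocks until the header arrives
        if (length < 0 || length > MAX_PACKET) {
            throw new IOException("bad packet length: " + length);
        }
        byte[] payload = new byte[length];
        in.readFully(payload);                 // blocks until the full body arrives
        return payload;
    }
}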
I disagree with the "roll your own with a simple delimited strings approach": the question is, what exactly would be the benefit? Getting to write and maintain more code?
The only reasons I could see would be lack of tool support (writing for some odd platform), or specific (very) hard performance or message size constraints.
Or, sometimes, really wanting to write a format -- that's ok, but it must be an explicit reason.
Depending on your exact needs, I would suggest considering JSON, since it can read and write arbitrary messages, has good object binders for Java (just like XML), is easier to read than binary formats, and is all around "good enough" for many use cases.
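For instance, on the Java side a couple of calls to a data binder such as Jackson (an assumed dependency here, and the Position class is made up) cover both directions:

import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonMessageDemo {

    // A made-up message type; Jackson binds public fields by default.
    public static class Position {
        public int entityId;
        public float x, y, z;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        Position p = new Position();
        p.entityId = 42;
        p.x = 1.5f;
        p.z = -3.25f;

        String json = mapper.writeValueAsString(p);              // serialize
        Position copy = mapper.readValue(json, Position.class);  // parse back
        System.out.println(json + " -> entity " + copy.entityId);
    }
}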
If message size is very important, Protobuf can work well; while its size is not always as small as gzipped alternatives (gzip+xml and gzip+json compress very well), it's usually close.
ASN.1 fits the definition of a "good general-purpose binary network protocol definition framework". It's also standardized by the ITU-T, so there are a lot of existing tools and libraries for various languages.
The DER encoding is suitable for efficient network communications; the XER encoding for human-readable (and writable) permanent storage.
Because you want to use different languages and also because you want something clean/small, I suggest Google's Protocol Buffers. You need a pre-compile step for the RPC, but I really think that's the best option when you begin to mix different languages. Here's the link: http://code.google.com/apis/protocolbuffers/docs/overview.html
Why not implement UDP directly? Your question mostly mentions what you don't want. What further form of abstraction do you want on top of UDP?
Download the Quake III source code and see how they frame game updates over UDP.
The IP protocol has been designed to support multiple devices/OSes in a uniform way, isn't this what you ask for?
What protocol has implementations across a huge range of systems, hmm, IP perhaps?