do not use com.sun.xml.internal.*? - ant

Is this statement true:
The com.sun.xml.internal package is an internal package, as the name suggests.
Users should not write code that depends on internal JDK implementation classes. Such classes are internal implementation details of the JDK and are subject to change without notice.
One of my colleagues used one of these classes in his code, which caused the javac task in Ant to fail to compile our project because the compiler couldn't find the class. The answer from Sun/Oracle says that this is expected compiler behavior, as users shouldn't use the package.
The question is: why were the classes in this package made public in the first place?
Thanks,
Sarah

Sun's classes in the JDK are prefixed sun.* and are not part of the public supported interface, so they should be used with care. From the Sun FAQ:
The classes that Sun includes with the Java 2 SDK, Standard Edition, fall into package groups java.*, javax.*, org.* and sun.*. All but the sun.* packages are a standard part of the Java platform and will be supported into the future. In general, packages such as sun.*, that are outside of the Java platform, can be different across OS platforms (Solaris, Windows, Linux, Macintosh, etc.) and can change at any time without notice with SDK versions (1.2, 1.2.1, 1.2.3, etc). Programs that contain direct calls to the sun.* packages are not 100% Pure Java. In other words:
The java.*, javax.* and org.* packages documented in the Java 2 Platform Standard Edition API Specification make up the official, supported, public interface. If a Java program directly calls only API in these packages, it will operate on all Java-compatible platforms, regardless of the underlying OS platform.
The sun.* packages are not part of the supported, public interface. A Java program that directly calls into sun.* packages is not guaranteed to work on all Java-compatible platforms. In fact, such a program is not guaranteed to work even in future versions on the same platform.

It's because Java visibility modifiers (especially at the type level, where there are only two options: public and package-private) don't currently have the granularity to express the sort of visibility you're hinting at. I don't know the specifics of the internal class or classes you're using, but making the classes package-private would have made them unreachable from the other JDK packages that need them, so the only remaining choice was public.
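To illustrate (a hypothetical sketch; the package and class names are made up, not actual JDK code): a top-level type can only be public or package-private, so an implementation class that must be reachable from other packages has to be declared public, even if it was never meant for application code.
// Hypothetical pair of classes (in ExposedImpl.java) showing the only two
// visibility options available to top-level types.
package com.example.internal;

class HelperImpl {                 // package-private: visible only inside com.example.internal
    void doWork() { }
}

public class ExposedImpl {         // public: reachable from every package, including user code,
    public void doWork() { }       // even though it is intended as an implementation detail
}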

Sadly, JAXB (bundled with Java 6) seems to rely on a non-public class, com.sun.xml.internal.bind.marshaller.NamespacePrefixMapper, to allow you to specify namespace prefixes when marshalling to XML.
You have to really go out of your way to get this compiling with Ant:
http://pragmaticintegration.blogspot.com/
Summary:
Option 1: add the JRE libs as bootclasspathref and add the property includeJavaRuntime="yes".
Option 2: use the JAXB-RI libs and switch to "com.sun.xml.bind.marshaller.NamespacePrefixMapper" (see the sketch below).
Also mentioned here:
Define Spring JAXB namespaces without using NamespacePrefixMapper
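For Option 2, a rough sketch of how the JAXB-RI mapper is typically wired up. This is hedged: the property key and mapper class are the JAXB-RI ones (the JDK-bundled implementation uses the com.sun.xml.internal.bind variants instead), and MyRootElement plus the namespace URI are made-up placeholders.
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import com.sun.xml.bind.marshaller.NamespacePrefixMapper;  // JAXB-RI class, not the internal JDK copy

public class PrefixExample {
    public static void main(String[] args) throws Exception {
        // MyRootElement is a placeholder for your own @XmlRootElement class.
        JAXBContext ctx = JAXBContext.newInstance(MyRootElement.class);
        Marshaller m = ctx.createMarshaller();
        // Property key for the JAXB-RI; the wrong key/implementation combination
        // throws a PropertyException at runtime.
        m.setProperty("com.sun.xml.bind.namespacePrefixMapper", new NamespacePrefixMapper() {
            @Override
            public String getPreferredPrefix(String namespaceUri, String suggestion, boolean requirePrefix) {
                return "http://example.com/ns".equals(namespaceUri) ? "ex" : suggestion;
            }
        });
        m.marshal(new MyRootElement(), System.out);
    }
}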

Related

Where can I find an implementation of UnmodifiableSetView in the Dart SDK?

Where can I find an implementation of UnmodifiableSetView in the Dart SDK?
In C#, there is System.Collections.Immutable.ImmutableHashSet<T>.
In Java, there is Collections.unmodifiableSet().
But I cannot find anything similar in the Dart SDK.
Where can I find it in the Dart SDK?
P.S.
I am not compiling my Dart code to JavaScript.
I use Dart as-is for computations, and I need UnmodifiableSetView but cannot find it in the Dart SDK.
There is an UnmodifiableSetView in package:collection.
This is an official implementation of an unmodifiable view of a Set created by the Dart team.
There is no similar class in the platform libraries. Unlike Java and C#, the Dart platform libraries are deliberately limited in size, and functionality that can just as easily be implemented in a separate library is made available as a package instead.
It's always a trade-off between convenience (making everything immediately available) and size/discoverability (not overwhelming the user).
If you are looking for functionality in, say, dart:collection and don't find it, then the package:collection package is a good second place to look. Not all dart: libraries have corresponding packages, but some do.

MonoTouch Migration Analyser

Are there any migration analysers available for MonoTouch?
I have seen one for Mono, but not for MonoTouch.
Short answer: No, there is none at the moment.
Long answer
The situation is a bit different from Mono. In general you test a complete and compiled (against a specific version of the framework) .NET application with MoMA, to get a report of what pieces are missing (or incomplete) in Mono that could affect the execution of your application on other platforms (e.g. OSX and Linux).
Testing a complete application for MonoTouch would report tons of issues, since the UI toolkit is totally different. E.g. anything in System.Windows.Forms or WPF would always be missing.
However, if your assemblies are already split into (something like) an MVC design, it would be possible to test some of them (the non-UI parts) against definitions based on the MonoTouch base class library.
Finally, if someone has an immediate need (or is looking for a nice project), MoMA is available as open source, and the evaluation versions of MonoTouch contain all the assemblies needed to build the definition files. A bit of extra filtering could make this into a very nice tool.
Alternative
You can see a list of the assemblies that are part of MonoTouch and some platform restrictions (compared to .NET) you should be aware of.

Running Scala code on iOS [duplicate]

Possible Duplicate:
Any way to use some Scala for iOS coding?
Would it be possible to use the Scala.NET implementation, and then MonoTouch to run Scala code on an iOS device?
I have not been able to find a page with binaries of Scala.NET that I can test, so the following are just general guidelines as to what you can do with MonoTouch and .NET languages.
MonoTouch can run any ECMA CIL that you feed to it. When you consider using a new language with MonoTouch, there are two components that come into play:
Tooling for the IDE
Runtime for the language
The tooling for the IDE is the part responsible for starting builds, providing IntelliSense and, if you use Interface Builder, creating a set of helper methods and properties to access the various outlets in your UI. As of today, we have only done the full implementation for C#. What this means for an arbitrary language is that you won't get the full integrated experience until someone does the work to integrate that language.
This is not as bad as it sounds; it just means that you need to give up on using XIB files from your language, and you probably won't get syntax highlighting and IntelliSense. But if you are porting code from another language, you probably don't need them. It also means that you would probably have to build your assembly independently and just reference it from your C# project.
So you compile your code into a .dll with FoobarCompiler and then reference it from your main C# project.
The language runtime component only matters for languages that generate calls into a set of supporting routines at runtime and those routines are not part of the base class libraries (BCL). C# makes a few of those calls, but they are part of the BCL.
If your compiler generates calls to a supporting runtime that is not part of the BCL, you need to rebuild your compiler runtime using the Mono Mobile Profile. This is required since most runtimes target a desktop edition of the BCL. There are many other API profiles available, like Silverlight, Mono Mobile, Compact Framework and Micro Framework.
Once you have your runtime compiled against our core assemblies, you are done.
If you had read the MonoTouch FAQ, you would have noticed that it currently supports only C# and no other CLR languages.
Binaries for the Scala.NET library and the compiler can be obtained via SVN, in the bin folder of the preview:
svn co http://lampsvn.epfl.ch/svn-repos/scala/scala-experimental/trunk/bootstrap
Bootstrapping has been an important step, and ongoing work will add support for the missing features (CLR generics, etc.); all of that will be done.
For now we're testing Scala.NET on Microsoft implementations only, but we would like our compiler to be useful for as many profiles and runtime implementations as possible.
There is a survivor's report on using Scala.NET with XNA at http://www.srtsolutions.com/tag/scala
Miguel Garcia
http://lamp.epfl.ch/~magarcia/ScalaNET/

How should I maintain JDK7 projects so that they can automatically be downgraded for JDK6?

I have a few APIs of my own, with around 2000 classes overall. Some of them use the new Path API from JDK7. Most other classes, however, do not rely on any new JDK APIs or new language features, so most classes could be used in a JDK6 environment (which I plan to do). Let's assume I've annotated all JDK7-only classes with @Java7Only.
What I need now is a way to create a JDK6-only subset of all my projects more or less automatically, without introducing new version branching or product lines (that would be too complicated to maintain).
All projects are created using NetBeans, and thus use Ant. Many projects depend on others.
Please help me evaluate which of these ideas is most appropriate for my problem. Which problems could occur with each idea?
Common first step for all ideas
Let an annotation processor search for @Java7Only-annotated classes and store the list in a properties file.
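A minimal sketch of what that common first step could look like, assuming @Java7Only lives in a package like com.example (adjust the annotation type name) and that a simple class-per-line file named java7only.properties in the class output is good enough:
import java.io.Writer;
import java.util.Set;
import javax.annotation.processing.*;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.StandardLocation;

@SupportedAnnotationTypes("com.example.Java7Only")   // adjust to your annotation's real package
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class Java7OnlyListProcessor extends AbstractProcessor {
    private final StringBuilder names = new StringBuilder();

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element e : roundEnv.getElementsAnnotatedWith(annotation)) {
                names.append(e.toString()).append('\n');   // one fully qualified class name per line
            }
        }
        if (roundEnv.processingOver()) {                   // write the list once, at the end
            try (Writer w = processingEnv.getFiler()
                    .createResource(StandardLocation.CLASS_OUTPUT, "", "java7only.properties")
                    .openWriter()) {
                w.write(names.toString());
            } catch (Exception ex) {
                processingEnv.getMessager().printMessage(
                        javax.tools.Diagnostic.Kind.ERROR, ex.toString());
            }
        }
        return false;  // don't claim the annotation, let other processors see it
    }
}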
Idea 1 (specific)
Write a tool which would use the properties file to recursively copy the whole project, except JDK7-only files.
Build the copied project using JDK6 by invoking ant, thus getting a JDK6-compliant jar.
Idea 2 (specific)
Write a second annotation processor which would use the properties file to pass everything except JDK7-only files to a JavaCompiler instance.
Either build a jar using Java APIs or use Ant API for that.
(This would be a Java-only idea, but probably too complicated)
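A hedged sketch of how the JavaCompiler part of Idea 2 might look; the exclusion list, the "src" source root and the "build6" output directory are placeholders, and jar packaging (via the Ant API or java.util.jar) is left out:
import java.io.File;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import javax.tools.JavaCompiler;
import javax.tools.StandardJavaFileManager;
import javax.tools.ToolProvider;

public class Jdk6SubsetCompiler {
    public static void main(String[] args) throws Exception {
        // Placeholder: in the real setup this list would come from the properties
        // file produced by the annotation processor.
        List<String> jdk7Only = Arrays.asList("com/example/NewIoHelper.java");

        List<File> sources = new ArrayList<>();
        collect(new File("src"), jdk7Only, sources);       // "src" is an assumed source root

        new File("build6").mkdirs();                       // javac requires the -d directory to exist
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        try (StandardJavaFileManager fm = compiler.getStandardFileManager(null, null, null)) {
            List<String> options = Arrays.asList("-source", "1.6", "-target", "1.6", "-d", "build6");
            boolean ok = compiler.getTask(null, fm, null, options, null,
                    fm.getJavaFileObjectsFromFiles(sources)).call();
            System.out.println(ok ? "JDK6 subset compiled" : "compilation failed");
        }
    }

    // Recursively gather .java files, skipping the JDK7-only ones.
    private static void collect(File dir, List<String> excluded, List<File> out) {
        File[] children = dir.listFiles();
        if (children == null) return;
        for (File f : children) {
            if (f.isDirectory()) {
                collect(f, excluded, out);
            } else if (f.getName().endsWith(".java") && !isExcluded(f, excluded)) {
                out.add(f);
            }
        }
    }

    private static boolean isExcluded(File f, List<String> excluded) {
        String path = f.getPath().replace(File.separatorChar, '/');
        for (String e : excluded) {
            if (path.endsWith(e)) return true;
        }
        return false;
    }
}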
Idea X (abstract)
Somehow influence the Ant build process (by overriding some targets?) and, for each JDK6-compliant class, let Ant compile two versions of it (once with the JDK6 compiler, once with the JDK7 compiler).
(JDK7-only classes would be compiled only once, using the JDK7 compiler, of course)
Package each bunch to a separate jar.
Possible common problems to the ideas
Some projects depend on others, so some actions (such as packaging) should take this into account.
Remember: the JDK7 compiler generates class files that older JVMs cannot load, which is why every idea has to work at the source level (before or during the build process, not afterwards).
My thoughts on Idea 2:
Essentially this is invoking a compiler within a compiler, since annotation processors are run as part of compilation. Can this be done safely? Is there any static state in Sun's javac that would cause problems? (I don't know the answer, but from memory there might be some static state that could cause problems in this scenario.)
Idea 1 seems simpler and better to me.
But taking a step back, is it possible to separate out all the JDK 7 specific stuff into a separate module and compile it separately, into a different JAR?
Have the 'main' project compiled using JDK 6 (which JDK 7 would have no problem reading, because it is backwards compatible).
The JDK 7 specific module(s), with source in a different directory, which includes the 'main' JAR on the compilation classpath, could be built separately, with a different build.xml if necessary.
This only partially applies, but I thought I'd mention it anyway.
The problem with just using the -source 1.6 -target 1.6 options for validation is that you can still use the Java 7 API when compiling with JDK 7.
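For example (a made-up class, just to show the gap): this compiles cleanly with javac -source 1.6 -target 1.6 on JDK 7, because -source only restricts language features, not which bootclasspath classes you reference, yet it fails with NoClassDefFoundError on a JDK 6 runtime.
import java.io.IOException;
import java.nio.file.Files;    // java.nio.file exists only since JDK 7
import java.nio.file.Paths;

public class SneakyJdk7Usage {
    public static void main(String[] args) throws IOException {
        // No JDK 7 *language* features here, so -source 1.6 is happy,
        // but this class cannot run on a JDK 6 runtime.
        byte[] data = Files.readAllBytes(Paths.get("some-file.txt"));  // placeholder file name
        System.out.println(data.length + " bytes read");
    }
}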
I've used the Animal Sniffer Maven Plugin for a few projects now and it has proved quite useful. This plugin scans the byte-code of your classes for JDK API usage; that is, you can tell it to fail the build if you attempt to use the JDK 7 API when you are targeting JDK 6. This won't help much with separating out classes as you need, but it could be useful as a final validation step combined with the -source 1.6 -target 1.6 compiler options.
There is also an Animal Sniffer Ant plugin, as mentioned on the Animal Sniffer main page.

Enable generics in blackberry jde 4.5.0

When I compile my application for BlackBerry, it shows the following error:
generics are not supported in -source 1.3
(use -source 5 or higher to enable generics)
How do I solve this?
Java 1.3 is barbaric and no one should ever have to suffer its indignities. Fortunately, there is a solution!
Generics, enums, changing the return signature in overrides, and pretty much everything else that makes Java usable was introduced in Java 1.5 (see http://en.wikipedia.org/wiki/Java_version_history). Fortunately, most of Java 1.5 was designed to be backwards compatible and not require JVM/bytecode changes. (Or maybe this was unfortunate, as it led Java's implementation of generics to be much weaker than C#'s; just try creating a generic class with static methods or fields that use the generic parameter, as in the example below.)
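As a quick illustration of that last point (a hypothetical class, nothing BlackBerry-specific): because generics are erased, the class's type parameter is simply not available in static context.
// Hypothetical example: the class type parameter T cannot be used in static members.
class Box<T> {
    private T value;                       // fine: instance member

    // static T defaultValue;              // does not compile: cannot use the non-static type variable T
    // static T pick(T a, T b) { ... }     // does not compile for the same reason

    static <U> U pick(U a, U b) {          // workaround: the method declares its own type parameter
        return a != null ? a : b;
    }

    void set(T value) { this.value = value; }
    T get() { return value; }
}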
This IBM article does a good job of explaining the background:
http://www.ibm.com/developerworks/java/library/j-jtp02277.html
But this JVM similarity allowed for creation of tools such as:
http://retrotranslator.sourceforge.net/
This is the section from my Ant buildfile that calls Retrotranslator:
<java jar="${transformer.jar.exe}"
      fork="true"
      classpath="${epic-framework.dir}/tools/retrotranslator-runtime13-1.2.9.jar:${epic-framework.dir}/tools/retrotranslator-runtime-1.2.9.jar"
      args="-srcjar ${build.dir}/classes5.jar -target 1.3 -destjar ${build.dir}/classes5to3.jar" />
Run the converted jar through preverify.exe and then give it to rapc.exe and you will have a working Blackberry app written with Java 1.5.
Edit: I missed a key detail in my original post. In addition to being Java 1.3, the BlackBerry class hierarchy is missing many classes that would normally be part of a Java SE 1.3 JDK. The one you will hit first is StringBuilder: javac transforms string concatenation (a + b + c) into a chain of StringBuilder.append() calls. That class doesn't exist on BlackBerry, so you break. However, BlackBerry has StringBuffer, so writing an adapter between the two is pretty easy. The one catch is that BlackBerry disallows apps from adding classes into java.*. This can be fixed very effectively in the build process:
1) Build your app against Java 1.5 with java.lang.StringBuilder on the classpath.
2) String-transform java.lang.StringBuilder (and everything else in your compat shim) to live in com.mycorp.java.lang.StringBuilder and build it into a JAR file.
3) Use that JAR file with Retrotranslator, and Retrotranslator will update all bytecode references to java.lang.StringBuilder so that they now point to com.mycorp.java.lang.StringBuilder.
Now you have Java 1.3 compatible bytecode that can run on a BlackBerry. A sketch of such an adapter is below.
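Here is a rough sketch of what that StringBuilder-over-StringBuffer adapter could look like. Hedged: com.mycorp is a placeholder package, and a real shim would need every append/insert overload that javac can emit (boolean, long, char[], etc.), not just the handful shown.
// Step 2 from above: written against java.lang.StringBuilder at build time, then
// string-transformed to com.mycorp.java.lang.StringBuilder before packaging.
package com.mycorp.java.lang;

public final class StringBuilder {
    private final StringBuffer buf;

    public StringBuilder()           { buf = new StringBuffer(); }
    public StringBuilder(String s)   { buf = new StringBuffer(s); }

    public StringBuilder append(String s)  { buf.append(s); return this; }
    public StringBuilder append(int i)     { buf.append(i); return this; }
    public StringBuilder append(char c)    { buf.append(c); return this; }
    public StringBuilder append(Object o)  { buf.append(o); return this; }

    public int length()              { return buf.length(); }
    public String toString()         { return buf.toString(); }
}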
If anyone is interested in this stuff, contact me. I could look into open sourcing the compat library I have.
This is a limitation of J2ME, which uses a subset of the J2SE (no collections, reflection, etc.) and a Java language level of 1.3. Any code written for J2SE will most likely need to be manually ported.
It seems JDK5 is not yet supported.
The same question was asked on the BlackBerry forum, but about enum support:
Sadly, the BlackBerry API is very behind in terms of Java versioning. There are no generics, no Maps, no enums - it's based around JDK 1.3.
I believe there is no way of enabling this feature within your BlackBerry app. If you find one, I'd be very interested to hear about it.
