I'm looking for a language-independent way to get a package's public exports (and other package metadata).
For example, I choose some packages for Java, JS, and PHP.
Then I download them, analyse them, and extract the exports of each.
I've found a lot of analysers on https://analysis-tools.dev/ but I don't think pretty-printers and linters are suitable for my purpose.
Do you know of tools that can help me analyse packages in this way?
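To illustrate the kind of output I'm after, here is a rough Java-only sketch using reflection (ExportLister and the whole approach are just illustrative; I'd need the equivalent for each language, which is why I'm hoping a tool exists):
import java.io.File;
import java.lang.reflect.Modifier;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.jar.JarFile;

public class ExportLister {
    public static void main(String[] args) throws Exception {
        String jarPath = args[0]; // path to the downloaded package (a .jar)
        URL[] urls = { new File(jarPath).toURI().toURL() };
        try (JarFile jar = new JarFile(jarPath);
             URLClassLoader loader = new URLClassLoader(urls)) {
            jar.stream()
               .filter(e -> e.getName().endsWith(".class"))
               .map(e -> e.getName().replace('/', '.').replaceAll("\\.class$", ""))
               .forEach(name -> {
                   try {
                       Class<?> c = loader.loadClass(name);
                       if (Modifier.isPublic(c.getModifiers()))
                           System.out.println(c.getName()); // a public "export"
                   } catch (Throwable ignored) {
                       // classes with missing dependencies can't be inspected
                   }
               });
        }
    }
}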
I am writing an R package for my MSc project, together with some .Rmd analyses that use my package and generate HTML reports.
I consider both the package code AND the resulting analysis as deliverables for my project, so I would like to showcase the two in the same location, maybe the same GitHub repository which is already hosting my package. The question is: can I just add a /reports directory to the package structure and put my HTML files inside? As the package will be mostly for internal use, I don't plan on submitting it to CRAN anytime soon, if that matters. However, I would also like to avoid distributing my reports to other potential users.
I've just found out about the .Rbuildignore file, which seems like it would do the trick, but I'd be curious to hear what the standard practice is in this case, or any other suggestions you may have.
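For reference, excluding the directory would apparently be a single line in .Rbuildignore (entries are Perl-style regular expressions matched against file paths relative to the package root):
^reports$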
Thanks,
I'm getting started with FunScript with a working example. Using NuGet to add the needed libraries, it works well.
In a 2013 video on Channel 9, they make use of TypeScript.Api<...> to load types from TypeScript definition files.
However, I'm unable to find this type provider anywhere.
Where is it located?
I realize that a good number of the type definitions have been compiled into libraries and are available on NuGet, but I can't really use those, since some of the code will be local TypeScript definition files.
The questions therefore are:
Where is the TypeScript.Api<...> type provider?
If it is not available, what other options exist as the best way to use TypeScript definitions?
As Thomas said, the type provider was removed mainly because it couldn't generate generic types, but the idea is to bring it back at some point.
For the moment, though it's not ideal, you can generate your own bindings by following these steps.
Download or clone the FunScript repository
git clone https://github.com/ZachBray/FunScript
Build the project
cd FunScript
build.cmd
This needs to be improved, but for now you need to zip the .d.ts files you want to convert and then:
cd build\TypeScript
bin\FunScript.TypeScript.exe C:\Path\to\typedefinitions.zip
cd Output
Please note that the first time you build the definitions it may take several minutes. Once it's done, you'll find the compiled .dll libraries with the bindings in the Output folder.
Also, while you're at it: it's better if you use the FunScript version you just built (in build\main\bin), as it will probably be more up to date than the NuGet package.
Good luck and have fun(script)!
There were a bunch of changes in FunScript, so the TypeScript.Api<...> type provider is no longer the recommended way of calling JavaScript libraries from FunScript.
Instead, the bindings for JavaScript libraries are pre-generated and you can find them as packages on NuGet, if you search for the FunScript tag (NuGet search is not very good, so you may need to go through a number of pages to find the one you need...).
If you want to use a local TypeScript definition, then you'll need to run the command line tool to generate the bindings. The F# Atom plugin does this in the build script, so looking there is a good place to start. It has a local copy of various TypeScript bindings in the typings folder (together with the FunScript binaries needed to process them).
I liked the type provider approach much better, but sadly, type providers are somewhat restricted in what kind of types they can provide, so it wasn't all that powerful...
I want to open libraries because I want to see the algorithms used for drawing, modify them, and implement them in my program. For example, I tried to create a line-drawing algorithm on my own, but I failed. And even if I had succeeded, I fear it might not give the same result as the algorithm in the libraries, and I don't want that to happen. That's why I want to copy the algorithms the library methods use. I really hope this will help me create the application I'm currently working on, and other applications in the future.
I tried to open the libraries with a code editor, but I had trouble finding them - I don't really know where they are placed, nor in what files their code is stored.
How do I open a Java library? Or is there a place on the Internet where the code is uploaded?
It sounds like what you want is to get inside the standard Java libraries (so you can see the code for methods like Graphics.drawLine()).
You can download the source files from the same place you got the JDK, if you are on Windows or Linux. For the Mac, see this question. You can even set up Eclipse so that you can debug into that source as if it were your own code.
However, you will probably not find line-drawing code in Java in these libraries - the Graphics implementation will almost certainly use native methods, and may just call existing methods in the OS.
If you are specifically looking for line drawing algorithms, another option would be to look at the Wikipedia page for the Bresenham (aliased) or Wu (antialiased) algorithm.
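For instance, here is a minimal self-contained Java sketch of Bresenham's (aliased) algorithm as described on that Wikipedia page, drawing into an ASCII grid rather than a real Graphics object:
// Bresenham's line algorithm, generalised for all octants.
public class Bresenham {
    static final int W = 20, H = 10;
    static final char[][] grid = new char[H][W];

    static void setPixel(int x, int y) { grid[y][x] = '#'; }

    static void drawLine(int x0, int y0, int x1, int y1) {
        int dx = Math.abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
        int dy = -Math.abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
        int err = dx + dy; // combined error term for x and y
        while (true) {
            setPixel(x0, y0);
            if (x0 == x1 && y0 == y1) break;
            int e2 = 2 * err;
            if (e2 >= dy) { err += dy; x0 += sx; } // step in x
            if (e2 <= dx) { err += dx; y0 += sy; } // step in y
        }
    }

    public static void main(String[] args) {
        for (char[] row : grid) java.util.Arrays.fill(row, '.');
        drawLine(1, 1, 18, 8);
        for (char[] row : grid) System.out.println(new String(row));
    }
}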
Edit:
The part of a Graphics2D call that actually puts pixels on the screen is probably inside a system call and therefore the source would not be available.
A Java vector graphics library like Batik might have source for some of these algorithms, but probably relies on the Graphics2D calls for most of them. So, you might look for a comprehensive vector graphics library written in a language other than Java, where those graphics calls do not already exist by default.
Alternately, checking the table of contents for a computer graphics book might point you at a variety of algorithms that you could look up on Wikipedia.
For any given library:
Make sure to obey all licenses when using another's code
If you are referring to the Java SDK source code, you can find it here: http://grepcode.com/
If the project is open source, you can usually just get the source from the project website. No problem, though make sure to obey their license.
If the project is NOT open source, well, then you're in a pickle licensing-wise, so I do NOT endorse this; however, you would need to use a Java decompiler such as JD-GUI.
As far as which drawing algorithms to use: there are so many different ones (obviously, people have been trying to draw quickly for many, many years) that your best bet is to figure out exactly what you need to do and then search for that specific need separately. There isn't really a good repository of ALL of them, except maybe Wikipedia.
If you are using the libraries, they are on your classpath. Check out how to figure out your classpath in whichever IDE you are using, and you can find the JARs you depend on. If they are packaged with sources, all you need to do is unjar them and look at the sources.
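For a hypothetical mylib-sources.jar, the JDK's jar tool will do the extraction (a jar is just a zip archive):
jar xf mylib-sources.jar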
If you don't have access to the sources you can get the code using a Java Decompiler.
If you are trying to look at a standard Java library, see the other answers about getting the source to the JDK.
If you are interested in an open source library (such as something maintained by the Apache project), look on the site of the project for a 'source jar' which you can open with a standard zip utility.
If the library you want is not open source or you cannot find the source for it, you can try to decompile it. If you are using Eclipse, try this decompiler.
Apache Maven is a very popular build and dependency management tool in the Java open source ecosystem. I did some tests to find out whether it can handle compiled Free Pascal / Delphi units and found it easy to implement. So it would be possible to:
release open source libraries precompiled for Free Pascal (or Delphi) in a public Maven repository
include metadata in this repository which contains dependency information
use Maven on the command line to download the open source library from the public repository, and automatically resolve all dependencies
local repositories, working as proxies, could be used to cache frequently used binaries
automatic checksum generation and verification (provided by Maven) would reduce the risk of downloading corrupted binaries
source code and even documentation files could be provided with the binaries
binaries can be provided with or without debug information
continuous integration servers like Hudson, TeamCity or CruiseControl can be used to build projects whenever changes have been submitted to the source control system and notify developers about build errors
This way of dependency management could be very beneficial for open source projects which use many third party libraries with complex dependencies. It would avoid typical conflicts caused by using wrong versions.
For the developer, the workflow for editing and building a project would be reduced to a minimum:
checkout the project source from internal version control system
edit source file(s)
run mvn package to automatically download all required third party libraries (precompiled units) if they are not yet in the workstation's local repository
compile and run
The only additional file Apache Maven requires in the project folder is the pom.xml file containing the project information.
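For a sense of scale, a minimal pom.xml is quite small; a sketch, with made-up coordinates:
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.example</groupId>      <!-- made-up coordinates -->
  <artifactId>my-library</artifactId>
  <version>0.7.6</version>
  <dependencies>
    <!-- resolved from a repository (and cached locally) at build time -->
    <dependency>
      <groupId>org.example</groupId>
      <artifactId>some-other-library</artifactId>
      <version>1.2.0</version>
    </dependency>
  </dependencies>
</project>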
Edit: while Maven is usable for some of the required tasks, implementing a Maven-like solution in native Free Pascal would have some advantages: no Java SDK required, support for all development platforms where Free Pascal is available, and maintenance and plugin development done in Pascal.
A Maven-like tool would not be helpful only for open source projects - commercial projects could access and use the artifacts in public Maven repositories in the same way.
Maven features are listed at http://maven.apache.org/maven-features.html
Update:
one use case could be the build of Lazarus, where Maven would download all required libraries and invoke the compiler with the necessary build path arguments. Changes in the dependencies on lower levels would be propagated automatically up to the parent build.
Possible benefits:
less time needed to set up a new workstation, no manual installation of third-party libraries required
fewer errors caused by wrong library versions, and detection of version conflicts (for example, if two libraries depend on different versions of a third library)
artifacts which are created in-house can be added to the local Maven repository and shared between developers and projects; central storage of all artifacts with metadata
builds are reproducible, just by using the same source and project metadata file (pom.xml)
can reduce development time and increase project stability
Update #2: FPMake
The FPMake build system for Free Pascal seems to be a tool with much potential; in many details it is quite similar to Maven:
FPMake is a Pascal-based build system developed for and distributed with FPC
FPMake standardizes the build by defining some constraints, such as standard directories
the command fppkg <packagename> will look in a database for the package, extract it, and then compile fpmake.pp and run it
it has standard build targets (clean, build, install, ...)
it can create a 'manifest' file suitable for import into a repository (like mvn deploy or mvn install); the manifest is an XML file which looks very similar to a pom.xml in Maven:
FPMake manifest file:
<packages>
  <package name="my-package">
    <version major="0" minor="7" micro="6" build="1"/>
    <filename>my-package-0.7.6-1.zip</filename>
    <author>my name</author>
    <license>GPL</license>
    <homepageurl>http://www.freepascal.org/</homepageurl>
    <email>myname@freepascal.org</email>
    <description>this is the package description</description>
    <dependencies>
      <dependency>
        <package packagename="rtl"/>
      </dependency>
    </dependencies>
  </package>
</packages>
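For comparison, the fpmake.pp that fppkg compiles and runs is itself a small Pascal program; a minimal sketch (package and unit names are made up) might look roughly like this:
{$mode objfpc}{$H+}
program fpmake;

uses fpmkunit;

var
  P: TPackage;
begin
  with Installer do
  begin
    P := AddPackage('my-package');   { made-up package name }
    P.Version := '0.7.6';
    P.Targets.AddUnit('myunit.pp');  { made-up unit name }
    Run;
  end;
end.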
Free Pascal has been working on a package system of its own, in a cross between apt-get and FreeBSD-ports style (download source, build, and install automatically), called fppkg.
However, work has stalled. People investing time are the bottleneck, not people wanting to choose tools.
As far as Maven goes, I don't like auxiliary tools that need installation of huge external runtimes. It might be fine for a big major app (like Open Office), but not for a util.
I also prefer a tool that is designed to the FPC reality and workflow.
Documentation tools, build tools, download systems, and test-suite systems are all already there; it just needs a person who dedicates a lot of time to make it happen.
Some typical problems when introducing a new technology in a project like FPC, and why it has a tendency to make its own tools:
the need to train 20+ part-time committers.
The only COMMON programming language you can assume is Free Pascal. Even knowledge of Delphi's inner workings can't be taken for granted (many committers came directly to FPC, or even still via TP or a Mac Pascal).
Obviously, that makes anything with plugins in a different language annoying.
Bash script is a close second, and (g)make third, but already an order of magnitude less common.
All servers are *nix-like (FreeBSD, OS X, Linux), but not all run Apache. (e.g. my FreeBSD mirror runs XSHTTPD)
the most knowledgeable people must be dedicated maintainers for a long time: fix problems, do updates and migrations, etc. Preferably more than one, for obvious reasons.
a major pain is Linux distributions (and FreeBSD to a lesser degree): most maintainers of *nix packages are not capable of more than "./configure; make; make install", and must be spoon-fed with a near-buildable repository and auxiliary files.
In-distribution packaging of FPC/Lazarus has always been important, and its importance is still increasing.
All distributions have their own special rules about metadata, dependencies, and how sources must be published. Debian/Ubuntu in particular is very bureaucratic and slow.
Most don't like third-party auto-installers on top of their systems (since those bypass their dependency control).
This all leads to the effective practice that homegrown tools in Pascal, with minimal scripting, work best. Some of the tools used:
GNU make (gmake) is mainly used to parameterise the build process on a per-directory level. A successor, fpcmake (not really a make derivative, despite the name), has been started, but the migration hasn't been completed.
LaTeX, and a LaTeX-to-HTML conversion (tex4ht, though Debian uses hevea), are used in building the documentation (the non-library documentation).
The community site (Netscape Community Server, which uses Tcl scripting - a heavy, complex application server) has been trouble ever since it started, but especially lately, since the maintainer became less active.
Mantis has been a problem (especially the email module, which would crash or cripple the server due to the volume), but it has been whipped into shape during successive updates and through the hard work of several Lazarus developers. Currently it is a decent workhorse.
The lazarus.freepascal.org phpBB forum, on the other hand, is relatively painless, since a lot of younger people know how to deal with it.
The same goes for Subversion (though the more advanced usage needs some adjusting; not everybody is deep into the ins and outs of merge tracking).
If somebody were really serious about Maven, I would usually ask them:
to CRITICALLY investigate its use for the project, in a very concrete way, with a schedule and time estimates. Bird's-eye-level "everything's possible" overviews are essentially worthless.
to give some thought to future changes of the technologies used. Every technology is eventually replaced, even the in-house ones, in 18-year+ projects. A new technology must not make migrations of other infrastructural components hard or involved. The new technology to end all new technologies doesn't exist.
to make a migration plan. Migration is often underrated and underestimated.
And in the end, there is always the 1,000,000 Euro question: who will do the daily maintenance?
Keep in mind that in a company you can just kick the person responsible for the application server. But in an informal environment this is much harder, especially long term, since people's lives, occupations, and time spent on the project vary.
Sounds like an interesting plan, but the Delphi community (and FPC even more so, I'd imagine!) values libraries as source far more than precompiled libraries. The general consensus is that anyone who uses a binary-only library is a fool, for two reasons: you can't fix any bugs you find in it, and compiler changes will break compatibility.
After some recent juggling with our Ant scripts, I've started to wonder if something better is possible.
I need a builder that will know to recompile all required .java files for me.
For example, for this structure:
public class A { }
public class B extends A { }
public class C {
    B b;
}
For Compile('C'): it will know to compile A, B, and C.
If B changed, Compile('C') will know to recompile just B.
I know of several alternatives: Ivy, which seems like an extension of Ant, our current Java builder; SCons, which we are currently using for building C++ code - SCons is excellent at the behaviour described above for C code; and then there are reports of Maven being almost, but not quite, there.
What would you suggest? What tools, Free Software or commercial, are you using for your build system?
Thank you,
Maxim.
Ant, with the 'depend' task and with the 'closure' option turned on
'make', from the IDEA IDE
None of Ivy, SCons, or Maven will help you with your problem as stated.
What do you mean by "for Compile('C')"? I don't think this is what you have in your Ant file.
For this case, Ant should work as desired: you have described its default behaviour. Within the same javac element, Ant will only recompile changed classes. See the Ant manual entry for the javac task, especially the 'includeDestClasses' attribute.
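For instance, a target along these lines (directory names are placeholders) pairs <depend> with <javac>, so stale classes and everything that depends on them get recompiled:
<target name="compile">
  <!-- removes out-of-date .class files; closure="true" also removes
       the classes that depend on them, forcing their recompilation -->
  <depend srcdir="src" destdir="build" closure="true"/>
  <!-- javac then recompiles only the sources whose .class file is missing -->
  <javac srcdir="src" destdir="build" includeantruntime="false"/>
</target>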
You should probably post an example Ant file that you are finding inadequate.
Maven, both for my personal and my commercial products.
In your question you describe inter-class dependencies. Most build systems, Maven in particular, are aimed more at inter-project dependencies. I believe most systems just recompile all the classes in a project, and most of the benefit of these build systems is in building as few projects as possible.
Both Maven and Ivy will allow you to easily specify both the external and internal dependencies of your project, including which version of a project you depend on. They will both also automatically download external libraries (such as Apache Commons) to your local machine as part of the build process if they are not already cached locally, saving a lot of manual work downloading and organizing third-party jar files.
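With Ivy, for instance, those declarations live in an ivy.xml next to the Ant build; a minimal sketch (organisation and module names are illustrative):
<ivy-module version="2.0">
  <info organisation="org.example" module="my-project"/>
  <dependencies>
    <!-- fetched from a repository into the local cache on first build -->
    <dependency org="commons-lang" name="commons-lang" rev="2.6"/>
  </dependencies>
</ivy-module>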
Ivy is an extension of Ant, as you mention. I recommend Maven. It is a convention-oriented build system that I've used successfully and feel is quite mature. Maven requires far less up-front effort to start using and is quite extensible.