How to create a dependency in Atmel Studio 7.0? - libraries

I need to #include a file called ClearCore.h but it refuses to show up as a dependency.
Screenshots:
https://i.stack.imgur.com/1kTvS.jpg
https://i.stack.imgur.com/dn6XT.jpg
As you can see, the file is in the library and most of the other files are listed as dependencies, but this one refuses to appear even though it is #included.
Here is the error message from compilation.
https://i.stack.imgur.com/tM2wk.jpg

I am an applications engineer at Teknic and I saw your post and wanted to offer a few points to help you get your code running.
It sounds like you are having trouble adding the ClearCore library as a dependency. Since this is more a function of Atmel Studio than of the ClearCore library itself, there could be several reasons why it is not picking up the file.
The easiest way to troubleshoot this would be to start from one of Teknic's already linked, ready-to-run example projects (we include these examples in addition to the ClearCore libraries) and use it as a template to sort out whatever is causing the dependency issue.
You can find these examples here: https://teknic-inc.github.io/ClearCore-library/SdkExamples.html
Keep in mind that if you move the example projects into other directories, some of the relative file path definitions may be broken.
If you have any questions about the example projects, please feel free to give us a call at 585-784-7454, or use our "Contact Us" form online at https://www.teknic.com/contact/.
Best regards,
Mark D. – Teknic Servo Systems Engineer

Related

What is the difference between a unity plugin and a dll file?

I am new to Unity and I am trying to understand plugins. I understand the difference between a managed plugin and a native plugin, but what is not yet clear to me is:
What is the difference between a plugin and a DLL? What should I expect to find in an SDK to make it usable in my Unity project?
Thanks a lot.
To expand on Everts' comment instead of just copying it into an answer, I'll go into a little more detail here.
What is a plugin?
It's a somewhat vague word for a third-party library that is somehow integrated with the rest of your game. It is neither officially supported by Unity nor part of your core code. It can be "plugged" in or out without altering its internals, so it must provide some kind of API that can be used by the game code.
For example, you'll find many plugins that handle external services like ads, notifications, analytics etc. You'll also find a couple of developer-tools that can also be called plugins, like tile-based map editors and such.
Plugins come in many forms - DLL files are one example but some plugins actually provide full source code for easier use. And of course, other plugins will provide native code for different platforms, like Objective-C for iOS or .jars for Android.
So to answer your first question:
A DLL is simply a pre-compiled library that can be part of a plugin.
A plugin is a whole package that can consist of multiple files in different formats (.cs, .dll, .jar, .m, etc.).
What do you need to use an sdk?
First of all, documentation. Like I said before, and like you noticed yourself, not all plugins give you access to the source code. Unfortunately, not many SDKs have extensive, developer-friendly documentation, so it can be a tough task to actually understand how to use a given SDK.
Secondly, the code. Many SDKs give you some kind of "drag & drop" library, a single folder with all the necessary files inside that you simply add to your Unity project. I've also seen SDKs that ship as Unity packages that you have to import via Assets > Import Package > Custom Package.
Once you have the code and documentation, it's time to integrate it with your game. I strongly recommend using an abstraction layer in your game because, in my experience, you often have to change SDKs for various reasons and you don't want to rewrite your game logic every time. So I suggest encapsulating the SDK-related code in a single class; that way you only have to change one class when switching from, say, one ad provider to another (and you can keep the old class in case you need to switch back).
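As a rough, language-agnostic illustration of that wrapper idea (shown here as a Python sketch with made-up names, not the Unity/C# API), the game only ever talks to the wrapper class, and the vendor SDK is hidden behind it:

    class FakeAdsSdk:
        """Stand-in for a third-party ad SDK (all names here are hypothetical)."""
        def load(self, placement):
            print(f"loading {placement} ad")

        def show(self, callback):
            callback()


    class AdService:
        """The single class in the game that knows about the vendor SDK."""
        def __init__(self, sdk):
            self._sdk = sdk  # the only reference to the third-party code

        def show_rewarded_ad(self, on_finished):
            # Translate the vendor's calls into the interface the game expects.
            self._sdk.load("rewarded")
            self._sdk.show(callback=on_finished)


    if __name__ == "__main__":
        ads = AdService(FakeAdsSdk())
        ads.show_rewarded_ad(on_finished=lambda: print("reward granted"))

Switching providers then means writing a new wrapper with the same show_rewarded_ad interface; the rest of the game code stays untouched.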
So you basically need three things:
Documentation (either a readme file or an online documentation)
The code (precompiled or source)
A flexible integration point (the abstraction layer described above)

How to identify what projects have been affected by a code change

I have a large application to manage, consisting of three or four executables and as many as fifty DLLs. Many of the source code files are shared across many of the projects.
The problem is a familiar one to many of us - if I change some source code I want to be able to identify which of the binaries will change and, therefore, what it is appropriate to retest.
A simple approach would be to compare file sizes. That is an 80% acceptable solution, but there is at least a theoretical possibility of missing something. Secondly, it gives me very little indication as to WHAT has changed; it would be ideal to get some form of report so I can filter out irrelevant changes (e.g. dates, version numbers, copyrights, etc.).
On the plus side:
all my .dcus are in a row - I mean they are all built into a single folder
the build is controlled by a script (.bat)(easy, for example, to emit .obj files if that helps)
svn makes it easy to collect together any (two) revisions for comparison
On the minus side
There is no policy to include all used units in all projects; some units get included because they are on a search path.
Just knowing that a changed unit is used/compiled by a project is not sufficient proof that the binary is affected.
Before I begin writing some code to solve the problem I would like to ask the panel what suggestions they might have as to how to approach this.
The rules of Stack Overflow forbid me from asking for recommended software, but if anyone has had positive experiences with continuous integration tools that would help - great.
I am open to any suggestion or observation that is relevant in this context.
It seems to me that your question boils down to knowing which units are contained in your various executables. Since you are using search paths, it will be hard for you to work this out ahead of time. The most robust way to find out is to consult the .map file that the compiler emits. This contains a list of all units contained in your executable.
Once you know which units are contained in each executable, you need to know whether or not anything has changed in those units. That information is contained in your revision control system. Put this all together and you have the information that you need.
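If you want to automate that first step, a small script can pull the unit names out of each .map file. The sketch below (Python) assumes the "detailed" map format, where each segment line carries an M=<unitname> field; adjust the regex if your compiler's output differs.

    import re
    import sys

    # Matches the "M=UnitName" field in a detailed .map segment line, e.g.
    #   0001:00000000 0000F684 C=CODE S=.text G=(none) M=System ACBP=A9
    UNIT_FIELD = re.compile(r"\bM=([A-Za-z_][\w.]*)")

    def units_in_map(map_path):
        """Return the set of unit names mentioned in one .map file."""
        units = set()
        with open(map_path, "r", errors="ignore") as f:
            for line in f:
                match = UNIT_FIELD.search(line)
                if match:
                    units.add(match.group(1))
        return units

    if __name__ == "__main__":
        # Usage: python units_in_map.py App1.map App2.map ...
        for path in sys.argv[1:]:
            print(path, sorted(units_in_map(path)))

Cross-referencing those sets with the units your revision control reports as changed gives you the list of binaries to retest.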
Of course, just because the source code for a unit has changed, you might argue that re-testing is not needed. Perhaps the only change made was the version, or the date in a copyright label or some such. But it is asking too much to be able to ask a computer to make such a judgement. At some point you need a human to step up and take responsibility.
What is odd about this though is that you are asking the question at all. It seems to me to be enormously risky to attempt partial testing. I cannot understand why you don't simply retest the entire product.
After using it for more than 10 years for commercial in-house and freelance work on large projects, I can recommend trying Apache Ant. It is a build tool which supports dependencies and has many very helpful features.
Apache Ant also integrates nicely with CI tools such as Hudson/Jenkins, Bamboo etc.
Another suggestion, based on experience with Maven, is to design the overall software architecture to be as modular as possible. If modules (single or multiple source or DCU files in one directory) carry a version number in the directory name, it is possible to control exactly how applications are composed from these modules.
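A hedged sketch of what such a versioned layout might look like (directory names made up); each application's search path is pinned to specific module versions:

    libraries\
        logging-1.2.0\
        logging-1.3.0\
        dbaccess-2.3.1\
    ProjectA\    (search path points at logging-1.3.0 and dbaccess-2.3.1)
    ProjectB\    (still pinned to logging-1.2.0)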
If you want to program such a tool yourself, the approach would be something like this:
First you need to detect whether any changes were made to individual source files. As you already figured out, comparing the file size is a bad idea, because the size can stay the same despite lots of changes (as long as a .pas file contains the same amount of text, its size won't change). Instead you could check the last modification time of each file, or compute a hash value such as an MD5 hash for comparison (which can be somewhat slow on large trees).
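As a minimal sketch of that change-detection step (Python, with a hypothetical cache file name), hashing the content avoids both the file-size trap and false positives from touched-but-unchanged files:

    import hashlib
    import json
    import os

    HASH_CACHE = "unit_hashes.json"  # hypothetical cache of last-known hashes

    def md5_of(path):
        """Content hash of one source file; size and timestamp are ignored."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def changed_files(paths):
        """Return the files whose content differs from the cached hashes."""
        old = {}
        if os.path.exists(HASH_CACHE):
            with open(HASH_CACHE) as f:
                old = json.load(f)
        new = {p: md5_of(p) for p in paths}
        with open(HASH_CACHE, "w") as f:
            json.dump(new, f, indent=2)
        return [p for p in paths if old.get(p) != new[p]]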
Then you need to generate a dependency tree which tells you which files are used by which project/subproject.
Finally, based on the changes detected in individual files, you walk the dependency tree to see which projects need to be recompiled (a sketch of this lookup follows below).
The problem with this approach is that you would probably have to update the dependency tree manually each time a unit is added to, or removed from, a project.
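As a minimal illustration of that lookup (Python, with made-up project and unit names), the dependency table maps each project to the units it uses, and any project touching a changed unit is flagged for a rebuild and retest:

    # Hypothetical table: which units each project pulls in. In practice this
    # could be generated from the .dpr files or the compiler's .map output
    # rather than maintained by hand, which sidesteps the manual-update problem.
    PROJECTS = {
        "App1.exe": {"System", "SharedUtils", "MainForm"},
        "Tool2.exe": {"System", "SharedUtils", "BatchRunner"},
    }

    def affected_projects(changed_units):
        """Return the projects that use at least one changed unit."""
        changed = set(changed_units)
        return [name for name, units in PROJECTS.items() if units & changed]

    # A change to SharedUtils flags both binaries for retest:
    print(affected_projects({"SharedUtils"}))  # ['App1.exe', 'Tool2.exe']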
But the best way would be to use some version control software instead of reinventing the wheel. I myself like the way Git works, and I believe that a proper implementation of Git in the Project Manager itself could be quite powerful, thanks to Git's support for branching/sub-branching (each project can be its own branch, and each version of your software its own sub-branch).
The latest version of Delphi does have Git integration, but it is done through SVN, which unfortunately limits some of Git's best functionality. So if you decide to integrate Git support directly into Delphi, I'm first in line to use it.

Collection of PNaCl ports with compiled .pexe files (Ex: ImageMagick)?

The list of current NaCl ports is here: https://code.google.com/p/naclports/wiki/PortList
I'm curious if there is (or will be) a repository for PNaCl executables (.pexe files) since they only need a .nmf manifest wrapper to run?
Please list any PNaCl resources here, I'm looking for ImageMagick specifically.
I know I could probably build the .pexe myself but I don't have the time to learn Native Client.
The short answer is: this isn't available yet, but we are working on it.
The long answer:
There isn't currently a binary repository for naclports. We are planning to add that soon (likely this quarter).
I took a quick look at the ImageMagick port, and it looks like it currently only generates libraries, not an application binary (i.e. nexe/pexe). My guess is that it wouldn't be too hard to add this, however.
Even with a pexe, you'll need a way to launch the application, give it command-line options, and hand it a set of files to process. I've discussed a way to do this with my team, but we haven't started work on it yet (though again, we'll likely work on this in Q1 2014).
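For reference, the .nmf wrapper mentioned in the question is just a small JSON manifest that points Chrome's PNaCl translator at the pexe; something along these lines (the file name here is hypothetical):

    {
      "program": {
        "portable": {
          "pnacl-translate": {
            "url": "imagemagick.pexe"
          }
        }
      }
    }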

Qt generic error message

This is the error message I get.
I know it's kind of an eye-roller, that it's difficult, nigh impossible, to tell what I may need without the source, but it seems like a deployment problem, since people who installed the Qt SDK can run it. Plus, I figured I'd have better luck asking here than with a Chinese developer who speaks Google English.
So here's what I've done:
I installed MSVC 2012.
I used a program called CFF Explorer to see what the exe was looking for. I have the 7 or so DLLs that are at the top of the tree.
I found a recent (Jun 2013) qwindows.dll elsewhere on my system and put it in ./plugins (I've tried this file in ./, ./plugins, and ./plugins/platforms).
I created a qt.conf with the following data (I determined the format from an existing Qt-based app that works):
[Paths]
Plugins = plugins
Yet, I continue to get this message. Any suggestions on what I might look for to clear this up?
Ask the developer what compiler was used to build the application; you will need DLLs built with the same compiler as the application. Also note that, by default, the documentation says qwindows.dll should be in a platforms folder in the same path as your executable. Depending on whether the developer used a Qt built with ANGLE, you may also need libEGL.dll and libGLESv2.dll. Dependency Walker might help you find dependencies that are missing.
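For what it's worth, a typical Qt 5 deployment on Windows ends up looking roughly like this; the qt.conf from the question is only needed if you keep the plugins somewhere other than the default platforms folder next to the executable:

    MyApp\
        MyApp.exe
        Qt5Core.dll
        Qt5Gui.dll
        Qt5Widgets.dll
        platforms\
            qwindows.dll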

In Delphi do you use include paths or explicitly include all required files?

Looking at our codebase, some code is included in a project explicitly and some is pulled in from the search path. Does anyone have an opinion as to which is best practice, and why?
Update:
I thought I would clarify my question. All our paths are relative, so we can have multiple branches that each refer to the code within their own branch. So I'm not asking about relative paths, but whether units should be listed in the .dpr or picked up via the search path, which is why the previously asked questions don't quite answer my needs. Thanks to everyone.
I have a very basic way of determining this... If the code is specific to the project (not used elsewhere) I include it explicitly. All shared code gets pulled from the library path.
best regards,
don
I don't think I can count the number of times I've helped someone who discovered that the compiler was finding a duplicate copy of a unit somewhere on their search path where they did not expect to find it. They couldn't understand why they were changing their code in the editor (on a copy of the units not found in the search path) and not seeing any change in the behavior of the application. Explicitly including the unit and not setting a search path means there can only be one copy of the unit found by the compiler.
This has been covered here before:
In Delphi, should I add shared units to my projects, to a shared package, or neither?
What is the best way to share Delphi source files among projects?
My answer to the first question is also my answer to your question.
My libraries are in SVN, and I usually check them out for (branch them into) a project at ../libraries relative to the project. This keeps the scope of the include dirs small and to the point.
In the real source (.pas), paths are totally forbidden.
No project-related paths in the global Delphi search path (only per-project paths, unless they are truly universally shared sources/components).
I hate polluting source with hardcoded paths, so I usually have only a few units in the project, always with relative paths. Not the VSS w:\ drive substitution hack, please! Typically these are the units that pull in framework parts or are needed due to visual inheritance or form initialization.
Unfortunately, relative paths can be dangerous with Delphi, because they are relative to the working directory, which can change depending on Delphi dialogs (e.g. Open). The solution is simple: have an include file with a unique name in the main project.
The shared code vs. specific code distinction is a good rule.
I use VSSConnextion a lot, so files that I usually need to check out/in together naturally belong to the same project.
After upgrading Delphi twice and moving my project to new computers twice, I've learned that hard-coded paths are evil because root directories tend to change. Doubly so if you're working on a shared project.
I had the same problem: the blue dots were not showing up in the gutter.
Simple solution (one of several):
Menu > Project > Compiler > Build Configuration... set to Debug instead of Release.
Delphi 2007
PS: well, I thought I was done coding. Someone had requested a new feature. :)
