Change Contiki MAC protocol to TDMA

I am working with the Cooja simulator. I created a network containing some Sky motes. By default, CSMA is used as the MAC layer of the motes, but I want to change it to TDMA. A TDMA MAC layer existed in Contiki 2.6 as tdma_mac, but it was removed in more recent versions of Contiki. How can I solve this?

In recent versions of Contiki, and in all versions of Contiki-NG, there is a MAC protocol called TSCH.
To switch, add this to your application's Makefile:
MAKE_MAC = MAKE_MAC_TSCH
More information: https://github.com/contiki-ng/contiki-ng/wiki/Tutorial%3A-Switching-to-TSCH
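Beyond the Makefile switch, TSCH behaviour is typically tuned in your application's project-conf.h. Here is a minimal sketch, assuming a recent Contiki-NG tree; the macro names below come from Contiki-NG's TSCH headers, but treat them (and the illustrative PAN ID and slotframe length) as assumptions to verify against your version:
/* project-conf.h - sketch; verify macro names against your Contiki-NG tree */
#define IEEE802154_CONF_PANID 0x81a5 /* PAN ID shared by all nodes (illustrative value) */
#define TSCH_CONF_AUTOSTART 1 /* start TSCH at boot */
#define TSCH_SCHEDULE_CONF_DEFAULT_LENGTH 3 /* length of the minimal slotframe (illustrative value) */
Note that TSCH is time-slotted and channel-hopping rather than plain single-channel TDMA, but it is the closest supported replacement for the old tdma_mac.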


How does Swift 5.1 runtime work with older versions of iOS?

About a year ago, if you wanted to use Swift 4.2 for iOS development, you had to install Xcode 10, which meant that you used the iOS 12 SDK. As part of your app's deployment, the Swift 4.2 runtime would automatically be bundled with your app binary. This meant that a user installing your app would essentially download a copy of that Swift runtime to enable your app to work.
However, ABI stability came with Swift 5, and you no longer needed to bundle a runtime if your deployment target was iOS 12.2, since the runtime was now part of that iOS version. If you wanted to support iOS 10 and iOS 11, though, the Swift runtime would still be bundled with your app binary, and it would behave the same way as described above.
Documentation on swift.org states the same:
Apps deploying back to earlier OS releases will have a copy of the Swift runtime embedded inside them. Those copies of the runtime will be ignored — essentially inert — when running on OS releases that ship with the Swift runtime.
So far so good. If you use Xcode 10.2 with Swift 5.0 and you deploy your app to older iOS releases, you will still bundle the Swift 5.0 runtime with it. Then, if your app is running on iOS 12, the app will use the runtime provided by the OS, and if it's running on, e.g., iOS 11, it will use the runtime that was bundled as part of the app binary. Now the first question: Is that a correct assumption?
Now we come to Swift 5.1 and iOS 13, which will be released in September. Swift 5.1 includes some additional runtime features, e.g. opaque result types, which require the Swift 5.1 runtime.
In WWDC 2019 session 402 "What's New in Swift", the speaker, when discussing the Swift 5.1 feature Opaque Result Type (SE-0244), mentions that the feature will only work on new OSes:
Requires new Swift runtime support
Available on macOS Catalina, iOS 13, tvOS 13, watchOS 6 and later
This is the confusing part for me. Wouldn't the Swift 5.1 runtime be shipped with your app anyway if you also support older iOS versions (e.g. iOS 10), thus enabling it to use these new runtime features, or am I just not understanding this correctly?
Now the first question: Is that a correct assumption?
Yes, that is correct.
Wouldn't the Swift 5.1 runtime be shipped with your app anyway if you also support older iOS versions (e.g. iOS 10), thus enabling it to use these new runtime features, or am I just not understanding this correctly?
The embedded runtime is not exactly the same runtime as the one found in your OS. E.g. the runtime in your OS is tightly integrated:
By being in the OS, the Swift runtime libraries can be tightly integrated with other components of the OS, particularly the Objective-C runtime and Foundation framework. The OS runtime libraries can also be incorporated into the dyld shared cache so that they have minimal memory and load time overhead compared to dylibs outside the shared cache.
Source: https://swift.org/blog/abi-stability-and-apple/
Of course, the embedded runtime cannot be tightly integrated into older systems. The embedded runtime can only support features that are already possible on the system it is executing on. Features that require a newer system are simply not present when your app runs on an older one.
Note that this has never been different for ObjC. If a class or a method only exists starting with a certain OS version, you can still deploy backwards to older system versions, but then you cannot use that class/method there, as it simply doesn't exist. In ObjC:
if (@available(iOS 13, *)) {
// Code requiring iOS 13
} else {
// Alternative code for older OS versions
}
or in Swift:
if #available(iOS 13, *) {
// Code requiring iOS 13
} else {
// Alternative code for older OS versions
}
Just like with ObjC, new Swift features will only be available on new OSes from now on. Only if it is possible to make a feature available for older OSes, regardless of whether they shipped with a runtime or need to use the embedded one, may that feature also deploy backwards, though not necessarily all the way.
E.g., say macOS 10.15 introduces a new feature in its bundled runtime. Maybe this feature can also be made available on 10.14 and 10.13 using a shim library, but not on 10.12 down to 10.9; then this feature will be tagged as requiring macOS 10.13 or newer.
If you deploy to 10.15, nothing has to be done, as the runtime of 10.15 supports the feature. If you deploy to 10.14 or 10.13, the compiler will add a shim library (like it would add an embedded runtime); on 10.13 and 10.14 the code in this library will be used, while on 10.15 and later the code in the runtime will be used. If you deploy to systems earlier than 10.13, that is okay, but you must not use this feature on those systems.
Of course, if a new feature can be made available even through the embedded runtime, it can certainly also be made available through a shim library on all systems that shipped with their own runtime that just didn't support the feature, as the shim library can use the same code that the embedded runtime uses.
The ability to sometimes make new features available even to older systems is explained by the very last question on that page:
Is there anything that can be done to allow runtime support for new Swift features to be backward deployed to older OSes?
It may be possible for some kinds of runtime functionality to be backward deployed, potentially using techniques such as embedding a “shim” runtime library within an app. However, this may not always be possible. The ability to successfully backward-deploy functionality is fundamentally constrained by the limitations and existing bugs of the shipped binary artifact in the old operating system. The Core Team will consider the backward deployment implications of new proposals under review on a case-by-case basis going forward.
Source: https://swift.org/blog/abi-stability-and-apple/

Qt for iOS: Error unknown module(s) in Qt: webkitwidgets

I have downloaded the .dmg file for Qt for Android and iOS. I have installed it and tried to run the examples. I am able to run the examples under the "widgets" folder for iOS, but when I try to run an example for "webkitwidgets" and run qmake on its .pro file (for example, browser.pro), it gives me the error: "Error unknown module(s) in Qt: webkitwidgets". Then I checked the "ios" folder in the installed Qt directory and found that the "Include" folder has no "QtWebkitWidgets" folder, while there is a "QtWidgets" folder. Please let me know the solution ASAP, because I am doing a POC on Qt for iOS and need a conclusion soon. Did I miss some steps during installation, or do I have to do some extra steps to run the QtWebkitWidgets examples? Let me know whether the QtWebkitWidgets module is supported in Qt for iOS or not. Note that I am using Qt 5.2 with Xcode 5. If more detail is required, let me know.
Apple explicitly forbids any programming language from being compiled/interpreted on the iOS device itself, except by its own WebKit. So Qt's WebKit is disallowed.
It is worth noting that part of the QtWebKit team has started the QtWebEngine project to explore providing a Chromium/Blink-based web engine instead of QtWebKit. In addition, the iOS port of Qt will need its own webview API, since Apple does not allow additional web engines on its iOS devices.

Cannot use cvSnakeImage in OpenCV 2.4

I need to use the cvSnakeImage function for active contours, but apparently it is not available in OpenCV 2.4.
I have OpenCV 2.4 installed on my PC. I have already linked my VS2010 project against opencv_legacy242d.lib, but cvSnakeImage is still undefined.
Don't forget to:
#include <opencv2/legacy/legacy.hpp>
and make sure you are linking with either opencv_legacy242.lib or opencv_legacy242d.lib.
I don't have a Windows box at my disposal, but this worked on Mac OS X.
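For reference, a minimal sketch of a cvSnakeImage call against the 2.4 legacy API might look like the following; the image path, initial contour, and coefficient values are illustrative assumptions, not values from the question:
#include <opencv2/legacy/legacy.hpp>
#include <opencv2/highgui/highgui.hpp>

int main()
{
    /* The snake operates on a single-channel image. */
    IplImage* gray = cvLoadImage("contour_input.png", CV_LOAD_IMAGE_GRAYSCALE);
    if (!gray) return 1;

    /* Initial contour: a rough polygon around the object of interest. */
    CvPoint points[4] = { {50, 50}, {150, 50}, {150, 150}, {50, 150} };

    /* Energy weights: continuity (alpha), curvature (beta), image term (gamma). */
    float alpha = 0.1f, beta = 0.4f, gamma = 0.5f;

    /* CV_VALUE means one coefficient for all points; the search window must be odd-sized. */
    cvSnakeImage(gray, points, 4, &alpha, &beta, &gamma,
                 CV_VALUE, cvSize(21, 21),
                 cvTermCriteria(CV_TERMCRIT_ITER, 100, 0.1), 1);

    cvReleaseImage(&gray);
    return 0;
}
If it compiles but the symbol is still unresolved at link time, double-check that the legacy .lib you link matches your build (debug vs. release, and the 242 suffix vs. your installed version).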

Mono Android Monodroid Native Library DllNotFoundException

I am attempting to get a Native C/C++ library working with Monodroid in the emulator, using DllImport. I am developing mainly in Windows/Visual Studio 2010.
I have built a native C/C++ library (ZeroMQ) using the Android NDK tools, for both the x86 and armeabi platforms, using an Ubuntu virtual machine. Is this correct - x86 for the emulator and armeabi for the real device? (This is certainly the case on the iPhone/MonoTouch, though there you statically link a libzmq.a file and DllImport "__Internal".)
I have added the x86 version of libzmq.so to my MonoDroid project under the directory structure lib\x86\libzmq.so.
When I first attempted to build/deploy to the simulator, I got the error 'cannot determine abi type', so I added an Abi element of x86 to the <AndroidNativeLibrary Include="lib\x86\libzmq.so"> project item group. It then deployed.
I have a DllImport for the function to call: [DllImport("libzmq")].
I've tried "libzmq.so", "lib/x86/libzmq", "lib/x86/libzmq.so", etc., but when I call the DllImported method (running in the emulator), I always get a DllNotFoundException.
Can anyone give me some direction?
EDIT: After reading another support answer, which states that the emulator uses armeabi .so libraries, I removed the x86/libzmq.so and added my lib/armeabi/libzmq.so as an AndroidNativeLibrary. I also removed the Abi entry from the project file, and indeed the project built and deployed fine.
However, I still get a DllNotFoundException when I try to call a DllImported function. Any ideas?
Many thanks
I don't have any immediate ideas why it isn't working for you; [DllImport("zmq")] should work.
The SanityTests sample exercises the DllImport attribute.
The DllImport: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/Hello.cs#L240
The Android.mk to build libfoo.so: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/jni/Android.mk
Building libfoo.so by calling ndk-build: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/SanityTests.csproj#L82
Including libfoo.so into the .apk: https://github.com/xamarin/monodroid-samples/blob/master/SanityTests/SanityTests.csproj#L96
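For reference, the native side of such a sample reduces to a plain exported C function. A hypothetical sketch (foo_get_answer is an illustrative name, not the actual SanityTests code):
/* libfoo.c - built by ndk-build into libs/<abi>/libfoo.so */
int foo_get_answer(void)
{
    /* Mono's P/Invoke maps [DllImport("foo")] to libfoo.so on Android;
       a plain C function is exported by default, so no decoration is needed. */
    return 42;
}
The matching managed declaration is then [DllImport("foo")] on a static extern method with the same name.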
(This is a repeat of my reply to your email in the ZeroMQ mailing list).
This error could be caused by using an incorrect platform target in your Mono project. I'm not sure if Mono/MonoDevelop supports this, but you probably need to create an ARM platform target, as opposed to x86/x64/Any CPU.
You would experience the same errors if you tried to do P/Invoke interop between an x64-targeted assembly and an x86 native library, or vice versa.
If you're using the clrzmq bindings to bridge Mono and libzmq, you will need to create a new platform target for ARM and rebuild the bindings. The solution currently defines only x86 and x64 platform targets.
OK, I finally got it working! For whatever reason, ZeroMQ doesn't build correctly with the current/latest NDK r7. The instructions at http://www.zeromq.org/build:android use NDK r6. I downloaded an old version, NDK r5b (it was the easiest old version to find a download for), and rebuilt ZeroMQ with it. Result: it now works on the emulator and the phone!

iPhone and OpenCV

I know that OpenCV has been ported to Mac OS X; however, I did not find any info about a port to the iPhone.
I am not a Mac developer, so I do not know whether a Mac OS X port is enough for the iPhone.
Does anyone know better than me?
OpenCV does indeed work on the iPhone. Use the configure script here to compile the library: cross compiling for iphone
You just have to cross-compile it, just as you do your apps.
OpenCV now (since 2012) has an official port for the iPhone (iOS).
You can find all of OpenCV's releases here.
And find install instructions here:
Tutorials & introduction for the new version 3.x
The latest build script from Computer Vision Talks works great for Xcode 4 and iOS 4.3. I have tried the script myself, and it is just awesome!
Here is OpenCV 2.0 on the iPhone: iphone opencv test
OpenCV is now available as a framework for iOS. Just drag and drop into your project. It supports video capture too. See the article and get the example project here: opencv framework for ios
For the sake of transparency, I wrote this article and it is hosted on my company's website.
I haven't tried using OpenCV specifically, but I do dev for the iPhone and can say that most libraries I've tried that work on OS X DO NOT work on the iPhone out of the box. Some of them just needed a little tweaking/configuration to be done and then it was fine on the iPhone, but the reality is that the phone is missing quite a few backend components that OS X supports. Most complex libraries (OpenCV sounds like one of them) aren't going to work without a major effort - particularly since OpenCV sounds like it depends on several other external libraries as well...so those would have to be ported too.
All you need is to generate an Xcode project for the OpenCV project using CMake or the CMake GUI tool.
Remember to set the option to generate an Xcode project instead of the default option to use Makefiles.
Open the generated project, change the base SDK to the iPhone SDK, and hit build!
Since OpenCV does not support iOS right now (but they have announced iPhone support for version 2.2), the highgui library won't compile. So if you need camera access, you have to write it yourself.
Anyway, the other libraries should compile and work on the device. (Works for me.)
The iPhone does support OpenCV. If you want to use it, first go to the best OpenCV-on-iPhone documentation on the web: Yoshimasa Niwa's.
I used it, and I already have an app on the App Store that uses face detection and image processing: Flags&Faces. If you have any doubts, please contact me.
Note that OpenCV runs very fast on Intel chips, but the iPhone is ARM, so while OpenCV is extremely useful, it won't be that fast there. Also, there's no way to get a live video stream on the iPhone, so all of the normal potential CV applications sort of lose their appeal, don't they?
You can also install OpenCV using a package manager like CocoaPods.
To quote the installation guide:
You want to add pod 'OpenCV', '~> 3.0', similar to the following, to your Podfile:
target 'MyApp' do
pod 'OpenCV', '~> 3.0'
end
Then run pod install inside your terminal, or from CocoaPods.app.
Here's a modified script (based on the one from LambdaJive) that builds a universal OpenCV framework for the iPhone/iPhone Simulator: universal-i386arm opencv framework
The following post by Yoshimasa does indeed work with iOS and the iPhone 4, and it can access both the front and back cameras.
The link to the project is using opencv on iphone en, and the sample code is at webgit, which is linked from that article. I really encourage you to read the article before getting the source code.
A project utilizing OpenCV on the iPhone (Lucas-Kanade optical flow, to be exact). Source code is available, and the app is on the App Store as well.
