Integrating an RTOS like Erika Enterprise on a hypervisor that can run AUTOSAR applications?

I'm new to automotive applications and trying to understand the stack for automotive systems. My goal is to implement a system where I can test AUTOSAR applications.
Currently, I'm looking to implement this using the following configuration:
x86_64, Common Desktop PC (Hardware)
Xen (Hypervisor)
Erika Enterprise (RTOS, run as guest VM)
RT-Druid (for Erika/AUTOSAR application development, on a different development machine)
There is already reference material on how to set up Xen and Erika on x86 hardware.
Questions:
How do I run an Erika/AUTOSAR application on this setup? Let's say a simple "Hello World" example. Do I simply copy and paste it onto my VM and run it?
Is there already a CAN bus implementation in Erika Enterprise? How do I implement this feature in my setup?
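For the CAN part, one option during development is to exercise a virtual CAN bus from the Linux side (e.g. in dom0 or on the development machine) while the guest is brought up. Below is a minimal sketch assuming a SocketCAN virtual interface named vcan0 and the python-can package; neither of these is part of Erika Enterprise itself, and the Erika-side CAN driver remains a separate question.

```python
# Minimal SocketCAN smoke test, assuming a virtual interface created with:
#   sudo ip link add dev vcan0 type vcan && sudo ip link set up vcan0
# Requires the python-can package (pip install python-can).
import can

bus = can.interface.Bus(channel="vcan0", interface="socketcan")

# Send one frame with a hypothetical arbitration ID and payload.
msg = can.Message(arbitration_id=0x123, data=[0x01, 0x02, 0x03], is_extended_id=False)
bus.send(msg)

# Read frames back (e.g. echoed by another node or observed with candump).
reply = bus.recv(timeout=1.0)
print(reply)

bus.shutdown()
```

This only verifies that CAN traffic can be generated and observed on the host side of the setup; wiring that virtual bus through to the Erika guest depends on how CAN is exposed to the VM.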

Related

Distribution of containerized applications to end users

I have containerized an application that consists of a Node.js application, Nginx, and MongoDB. I'm currently using Docker Compose to start and stop the application on my development machine.
I'd like to distribute the application, along with the volume that contains the MongoDB database files, so that an end user can easily start the application on their computer and point their web browser to it.
Some factors I'm considering:
The end user is almost certainly not familiar with containerization and is probably not comfortable playing around in a terminal.
The end user is likely to be using macOS or Windows, but Linux should be supported.
Asking the user to install Docker is possible, but I don't like that it requires Hyper-V on Windows, which conflicts with other software such as VirtualBox.
I could write a cross-platform GUI application that manages Docker with simple "start" and "stop" buttons. However, I am not married to using Docker if there is an easier path forward. Should I look into something like Facebook's executable archives?
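For the "simple start/stop GUI" idea, here is a minimal sketch of what such a wrapper could look like, assuming Docker Compose v2 (the `docker compose` subcommand) is installed and the compose file lives in a hypothetical APP_DIR. This only illustrates the wrapper idea, not a full packaging or installer solution.

```python
# Minimal start/stop wrapper around Docker Compose using only the standard library.
# APP_DIR is hypothetical: point it at the directory containing docker-compose.yml.
import subprocess
import tkinter as tk
from tkinter import messagebox

APP_DIR = "/path/to/app"

def compose(*args):
    try:
        subprocess.run(["docker", "compose", *args], cwd=APP_DIR, check=True)
    except (OSError, subprocess.CalledProcessError) as exc:
        messagebox.showerror("Error", f"docker compose {' '.join(args)} failed: {exc}")

root = tk.Tk()
root.title("App launcher")
tk.Button(root, text="Start", width=20,
          command=lambda: compose("up", "-d")).pack(padx=10, pady=5)
tk.Button(root, text="Stop", width=20,
          command=lambda: compose("down")).pack(padx=10, pady=5)
root.mainloop()
```

The GUI itself stays trivial; the hard part of the problem remains getting Docker (or an equivalent runtime) onto the end user's machine in the first place.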

How to run Vagrant with nvidia-docker as provider

I'm part of a team developing a machine learning application.
Currently we're using Vagrant with a Docker provider as a uniform dev environment.
We want to utilize the GPUs on our computers when we play around during development, and I found that Nvidia released nvidia-docker to enable that for a simple docker container.
How can I use nvidia-docker as a provider for Vagrant?
If it's not possible, is there any equivalent solution?
It is important for us to develop on top of the same Docker image that we deploy, since we depend on multiple interacting open-source libraries and want to manage them in one place (no dependencies breaking when deploying).
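Independent of the Vagrant question, a quick way to confirm that the NVIDIA container runtime can see the GPU from a plain Docker container is a check like the one below. The image tag is hypothetical; substitute the image you actually deploy. It assumes Docker 19.03+ with the NVIDIA Container Toolkit installed.

```python
# Checks whether `docker run --gpus all <image> nvidia-smi` succeeds.
import subprocess
import sys

IMAGE = "nvidia/cuda:12.2.0-base-ubuntu22.04"  # hypothetical; use your deployment image

result = subprocess.run(
    ["docker", "run", "--rm", "--gpus", "all", IMAGE, "nvidia-smi"],
    capture_output=True, text=True)

if result.returncode == 0:
    print(result.stdout)
else:
    print("GPU not reachable from the container:", result.stderr, file=sys.stderr)
    sys.exit(1)
```

If this works outside Vagrant, the remaining question is only whether Vagrant's Docker provider can pass the equivalent GPU flags through to the container it creates.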

Is there a way to use Windows with GUI capability on Docker

I'm wondering whether there is a way to leverage the Docker concept for my Windows-based desktop application. I need to run GUI tests, performance tests, workflow tests, etc. for each build. What I currently do is use Hyper-V with different pre-configured OS images.
Is there an easy way to achieve the same thing using Docker? As far as I know this can be achieved for non-GUI applications, but what about GUI-based desktop apps?

How to deploy a Rails application on a Windows PC (Windows 7 / Windows 8)?

I have built a Rails app which is used as a standalone enterprise application. The application needs to run on Windows desktops (the entire user base runs Windows machines). I am able to run it quite successfully on an Ubuntu machine, but that's not something customers will prefer to run.
Since deploying on a Windows machine is quite messy AFAIK, I would like to deploy it on Windows using a virtual machine (VirtualBox).
Requirements would be:
Application installation on Windows 7 / Windows 8.
The user should be able to access the Rails server from a browser running on his/her system via localhost or any other IP address.
The application should auto-start when the user reboots the machine.
Ideally, the user should be able to download and install the software on his/her machine by himself/herself.
I am working to make this happen, but would like to know the feasibility of this solution. I would also like to know if I am getting the concepts wrong, or if there is something that is simply not possible or does not make sense.
Take a look at Vagrant, which is a highly scriptable VM host. You can then generate batch files to automatically start the VM on boot.
To deploy new code, you'll just want to provide them with a new VM image they can copy into your app directory.
That said, I agree with other comments that this might not be the right platform for your use case. The main reason for building web apps is so that many clients can use your app over the web using just one set of servers. Deploying a web server to each client seems like it's defeating that advantage.
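If you go the batch-file route, the same launch step can also be done from a tiny script. Here is a sketch using Python and the VBoxManage CLI that ships with VirtualBox; the VM name is hypothetical. Registering the script (or an equivalent batch file) as a scheduled task at logon would cover the auto-start requirement.

```python
# Start a VirtualBox VM without a visible window using VBoxManage (ships with VirtualBox).
# "rails-appliance" is a hypothetical VM name; replace it with the imported VM's name.
import subprocess

VM_NAME = "rails-appliance"

subprocess.run(["VBoxManage", "startvm", VM_NAME, "--type", "headless"], check=True)
```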

OpenCV deployment on Windows Azure

Is it possible to deploy an OpenCV application to Windows Azure?
OpenCV falls into the client-application category (accessible through a user interface) and can also be used for backend processing. Windows Azure Cloud Services is meant for web applications, so OpenCV does not fit that application model. For backend processing you might think of using a cloud service worker role, but that needs a lot of work on your part and defeats the purpose.
For the sake of completeness: you certainly can get a Windows Azure Virtual Machine with a Windows OS and deploy an OpenCV application there. Once it is ready, you can Remote Desktop to the VM and use it. You will pay a monthly cost to use the VM, but you certainly can do it. I am sure that is not your objective either, though.
Yes, I'd say it's possible to install OpenCV applications on Azure.
Check out the Deep Learning VM offering.
It comes with pre-installed software: most of the machine learning libraries, along with OpenCV, are pre-installed.
You can also use APIs to host your models on Windows Azure.
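Whichever VM image you pick, a quick smoke test like the following confirms that the installed OpenCV (cv2) works once you are logged into the machine; the input file name is hypothetical.

```python
# Quick OpenCV smoke test: read an image, run Canny edge detection, write the result.
import cv2

print("OpenCV version:", cv2.__version__)

img = cv2.imread("sample.jpg")  # hypothetical input file
if img is None:
    raise SystemExit("sample.jpg not found next to this script")

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 100, 200)
cv2.imwrite("edges.png", edges)
print("Wrote edges.png")
```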
