How to make Vector CANoe recognise the STM8 board connected via the interface?

I tried to program the STM8 board to send CAN messages that can be viewed in Vector CANoe. However, the messages stay in the pending state and I am not able to make Vector CANoe recognise the STM8 board connected via the hardware interface. How can I view the messages sent by the microcontroller board in the CANoe trace window?
Thanks.

Related

Trigger latching 3 V relay with signal from HT12D

I am sending a signal from an HT12E through an RF transmitter to an RF receiver and then an HT12D. This all works fine, and the data signal from the HT12D is sent to a CZH-LABS D-1022A (for filling a pool). The pulse triggers the relay on the CZH-LABS board, so that part works OK.
What I want to do is take the same pulse/signal from the HT12D and send it through a 3 VDC regulator (it is currently at 5 VDC) and then through an EC2-3TNU latching relay.
The relay would then turn on power to an ESP12E, which would connect to the Wi-Fi and send a message to ThingSpeak that the original signal was received and the pool-filling relay triggered.
The problem is that even though the signal from the HT12D reads 3 VDC and lights an LED when the signal is received, it doesn't trigger the latching relay.
I am attaching a schematic to show the wiring of my project.
To summarize:
The 5 VDC signal from the HT12D (converted to 3 VDC) will light an LED but won't trigger the 3 VDC EC2-3TNU latching relay.
I thought it might be the size of the smoothing capacitor after the center-tap rectifier, but if I connect the ESP12E to 3 VDC and then activate the signal, the LED still lights up and the latching relay still won't engage.
I am flummoxed! Can anyone think of why the latching relay isn't activated by the 3 VDC signal? Could it be that I need a larger capacitor? The latching relay itself works, because if I tap it with a 3 VDC lead it triggers. This has nothing to do with code, so I'm not providing any. Although, once the ESP12E is activated, connects to the Wi-Fi, and uploads to ThingSpeak, the ESP is coded to turn on an output pin that drives the latching relay's reset pin, turning off the ESP until the next signal input.
Any suggestions to solve this problem or a work around would be greatly appreciated.
I can't create tags and one for HT12D or HT12E would really be helpful.
I got this figured out. Just added a transistor, diode, and resistor.
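For reference, here is a minimal sketch of the ESP12E behaviour described in the question (join Wi-Fi, report to ThingSpeak, then pulse the latching relay's reset coil to cut its own power). It assumes the Arduino core for ESP8266; the SSID, password, ThingSpeak write key, and the choice of GPIO5 for the reset coil are placeholders, not values from the original post.

#include <ESP8266WiFi.h>

const char* ssid          = "your-ssid";                   // placeholder
const char* password      = "your-password";               // placeholder
const char* apiKey        = "YOUR_THINGSPEAK_WRITE_KEY";   // placeholder
const int   latchResetPin = 5;  // drives the EC2-3TNU reset coil (via a transistor)

void setup() {
  pinMode(latchResetPin, OUTPUT);
  digitalWrite(latchResetPin, LOW);

  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    delay(250);
  }

  // Report "pool fill signal received" as field1=1 on ThingSpeak.
  WiFiClient client;
  if (client.connect("api.thingspeak.com", 80)) {
    client.print(String("GET /update?api_key=") + apiKey + "&field1=1 HTTP/1.1\r\n"
                 "Host: api.thingspeak.com\r\n"
                 "Connection: close\r\n\r\n");
    unsigned long start = millis();
    while (client.connected() && !client.available() && millis() - start < 5000) {
      delay(10);  // wait briefly for the response
    }
    client.stop();
  }

  // Pulse the latching relay's reset pin; this removes the ESP's own supply,
  // so nothing further runs until the next HT12D trigger.
  digitalWrite(latchResetPin, HIGH);
  delay(100);
  digitalWrite(latchResetPin, LOW);
}

void loop() {
  // Nothing to do: power is expected to be cut by the reset pulse above.
}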

Does the client send a disassociation to the first AP during roaming?

During WLAN roaming (Layer 2), when the client decides to roam from one AP to another AP with better signal strength, does the client send a disassociation frame to the first AP?
If not, how does the first AP know whether the client has roamed or left the coverage area? Is it only through communication between the two APs after the client has associated with the second AP?
When does the first AP decide to drop the frames buffered for the client?
Is disassociation used at all in the roam sequence?
Thanks in advance.

Use device serial number instead of PID in HL7?

We are developing a medical device which shall be connected to a patient data management system (PDMS). The idea is to use HL7 messages to push measured data directly to the PDMS.
The device itself is too small to fit a convenient user interface for entering the patient ID. Is it possible in HL7 to transmit just the device serial number instead of the PID and let the PDMS make the connection between the device and the patient?
The device serial number should go in the PRT-10 (Participation Device) field, not in the PID segment.
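For illustration only, an abbreviated ORU^R01 message might carry the serial number like this. Every value below is made up, and the exact segment layout, required fields, and coding depend on the HL7 version and on the PDMS interface specification:

MSH|^~\&|MYDEVICE|WARD1|PDMS|HOSPITAL|20240101120000||ORU^R01^ORU_R01|0001|P|2.7
PID|||
OBR|1
PRT|1|AD||EQUIP||||||SN12345678^ACME
OBX|1|NM|PULSE^Pulse rate^L||72|/min|||||F

The device leaves the PID segment essentially empty; the PDMS looks up which patient the device with serial number SN12345678 (PRT-10) is currently assigned to and files the observation under that patient.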

Able to receive a Wi-Fi packet through a Zigbee chip?

I have a wireless sensor network deployed in a building. Each node is in a separate room. All the sensor data goes to a datastore.
Once the user gets to a room, he/she should be able to get that room's sensor data from the datastore on his/her phone, provided we know which room he/she is in. GPS does not give high enough accuracy indoors, and neither does inferring the location from Wi-Fi signal strength. We thought of having the phone send a dummy frame over Wi-Fi that can be intercepted by the sensor nodes; based on which node receives it, or receives it first (in case several nodes intercept the frame), the system would get an indication of which room the user is in.
Wi-Fi and Zigbee both communicate at 2.4 GHz. Is there a way I can intercept all the RF signals on the Zigbee node and interpret the frame even if it is not a Zigbee frame?
No, it's not possible: although they share the 2.4 GHz band, Wi-Fi (802.11) and Zigbee (802.15.4) use different modulation and framing, so a Zigbee radio cannot demodulate a Wi-Fi frame.

iOS - Receive an external input to my app from a switch with BT or IR?

I am building an assistive iOS app for a kid who uses a switch to control his computer (a simple button that can send only one message to the computer).
I am looking for a way to connect my app to a switch that can send click events to it.
It could work through BT, IR, or even through the earphone connection (headset port).
(BTW, he cannot use the iOS screen as the switch.)
Any ideas?
A BT connection requires you to be a certified MFi developer, and that requires money and a real company.
The headphone port would be a great place to interface with. You could wire a simple switch across the microphone and ground lines, which, I think, would create a square-wave duty cycle for on and off. I've done something similar where we used the headphone port to communicate with a microcontroller through a sound wave that was then converted to 16-bit packets and used to control additional hardware and also give feedback from that hardware.
Another option is a Wi-Fi connection: an Arduino with a Wi-Fi shield and the button wired to that.
Edit:
The more I think about it, the more I would say use the headphone port. It will be super cheap, the programming to detect the presses will be really easy, and this will probably be the fastest way to achieve your solution, provided you can solder.
I'm going to suggest going down a different path. Instead of trying to connect the switch directly to the iPhone, use something like an Arduino board with both a physical switch input and an Ethernet port, plugged into the local network, and create what amounts to a physical I/O server.
The Arduino handles the physical interfacing and your iPhone app only has to handle the communications protocol to the Arduino over Wi-Fi.
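A minimal sketch of that physical I/O server idea, assuming a standard Arduino Ethernet shield and the switch wired between pin 2 and ground; the MAC address, IP address, and pin number are placeholders. The iOS app simply issues an HTTP GET to the Arduino's address and reads back 0 or 1:

#include <SPI.h>
#include <Ethernet.h>

// Placeholder network settings for this sketch.
byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };
IPAddress ip(192, 168, 1, 177);
EthernetServer server(80);

const int switchPin = 2;  // switch between this pin and ground

void setup() {
  pinMode(switchPin, INPUT_PULLUP);
  Ethernet.begin(mac, ip);
  server.begin();
}

void loop() {
  EthernetClient client = server.available();
  if (!client) return;

  // Discard whatever request arrived, then answer with the switch state (1 = pressed).
  while (client.connected() && client.available()) client.read();
  int pressed = (digitalRead(switchPin) == LOW) ? 1 : 0;

  client.println("HTTP/1.1 200 OK");
  client.println("Content-Type: text/plain");
  client.println("Connection: close");
  client.println();
  client.println(pressed);
  delay(1);
  client.stop();
}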
One inexpensive solution would be to use the mic input on the headset port. Connect the switch to some sort of tone generator (a 555 timer or an Arduino, plus a piezo speaker or the headset cable). Have the app run an input Audio Queue and pass the Audio Queue input buffers to a narrow-band DSP filter or an FFT. Monitor the frequency band of the tone generator for any significant energy burst above the background noise level. Potentially use multiple separate tone frequencies for more than one switch.
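The narrow-band filter in that scheme can be as small as a Goertzel filter run over each Audio Queue input buffer. A rough sketch of just the detection core (the Audio Queue plumbing is omitted); the sample rate, tone frequency, and threshold are assumptions to be tuned for the actual setup:

#include <cmath>
#include <cstddef>

// Energy of the single DFT bin at targetHz over one buffer of mono float samples.
float goertzelPower(const float* samples, std::size_t n, float sampleRate, float targetHz) {
  const float w     = 2.0f * 3.14159265f * targetHz / sampleRate;
  const float coeff = 2.0f * std::cos(w);
  float s1 = 0.0f, s2 = 0.0f;
  for (std::size_t i = 0; i < n; ++i) {
    const float s0 = samples[i] + coeff * s1 - s2;
    s2 = s1;
    s1 = s0;
  }
  return s1 * s1 + s2 * s2 - coeff * s1 * s2;
}

// Call this on each Audio Queue input buffer; a press is flagged when the tone
// energy rises well above the background level.
bool switchPressed(const float* samples, std::size_t n) {
  const float sampleRate = 44100.0f;  // assumed recording format
  const float toneHz     = 1000.0f;   // assumed tone generator frequency
  const float threshold  = 1.0f;      // placeholder; calibrate against background noise
  return goertzelPower(samples, n, sampleRate, toneHz) > threshold;
}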
Added: Another simple alternative might be to use the switch to activate a solenoid or small motor (scavenged from an old motorized toy or similar) to tap a capacitive pen or a ball of conductive foam on the iPod Touch display. No MFi, Wi-Fi, or audio DSP coding required.
