ATTiny serial communication does not work at 3.3V with ESP8266 (NodeMCU V3)

I am transmitting data from an ATTiny85 to an ESP8266 (NodeMCU v3). I am powering the NodeMCU over USB.
It works fine when I power the ATTiny from 5V.
However, I'm planning to move the project to an ESP32, which is not 5V tolerant, so I have to run the ATTiny at 3.3V (which is possible according to the datasheet). When I power the ATTiny85 from 3.3V I don't get the intended results; I just get garbage. Using level converters is one alternative, but I'd like to know what I have done wrong.
To clarify, the only change I'm making is powering the ATTiny with 3.3V instead of 5V.
I have prepared the following demo to demonstrate my situation. Thanks in advance :)
I'm using Arduino 1.8.10.
NodeMCU core: 2.7.4.
The ATTiny85 is running from its 8 MHz internal oscillator.
NodeMCU code
char rx;

void setup()
{
  Serial.begin(57600);
}

void loop()
{
  if (Serial.available()) {
    rx = Serial.read();
    Serial.print(rx);
  }
}
ATTiny85 code
#include "SoftwareSerial.h"
SoftwareSerial Monitor(5, 4);
uint8_t x=0; //temp
uint8_t y=128; // ECG
void setup() {
Monitor.begin(57600);
}
void loop() {
Monitor.print("E"+String(int(y)));
Monitor.print("T"+String(int(x)));
x=x+1;
y=y-1;
delay(10);
}
Serial monitor of the NodeMCU when the ATTiny is running at 5V from an Arduino UNO
T95E32T96E31T97E30T98E29T99E28T100E27T101E26T102E25T103E24T104
Screenshot 1
Serial monitor of the NodeMCU when the ATTiny is running at 3.3V from an Arduino UNO
⸮qxt⸮q99⸮qx5⸮q9x⸮qx6⸮q97⸮qx7⸮q96⸮qxx⸮q95⸮qx9⸮q9t⸮q9`⸮q93⸮q9q⸮q9r⸮q9r⸮q9q⸮q93
Screenshot 2
Serial monitor of the NodeMCU when the ATTiny is running at 3.3V from the ESP8266
3x⸮rtv⸮q37⸮rt7⸮qsv⸮rtx⸮qsu⸮rty⸮q3t⸮r5`⸮q3s⸮r5q⸮q3r⸮rur⸮q3q⸮r5s⸮q3`⸮r5t
Screenshot 3

It looks like the ATTiny85 has to run SoftwareSerial at 38400 bps when it is powered at 3.3V.
However, it works fine at 57600 bps when powered with 5V.

You said you run the ATTiny85 at 8 MHz ... Did you enable the CKDIV8 clock divider? If so, the ATTiny would have a hard time maintaining this baud rate.
Your data (shown in the first post) looks like the ATTiny sends some bits out of time. Keep in mind that the internal oscillator is not that stable (increasing the voltage seems to make it run more stably), but a stable clock is mandatory for reliable asynchronous data transfer. Switching to an external crystal oscillator should stabilize the UART communication (I know, the ATTiny85 does not have many pins, and using two for a crystal and two for the UART leaves you only one spare pin; you could probably switch to an ATTiny84 instead).
Note:
If you choose a crystal and you know you will implement a UART, pick a crystal frequency that divides evenly by your preferred baud rate. Especially if you are going to use the USI of the ATTiny85, this helps avoid errors in the transmission (for example, you could use a crystal with a frequency of 7.3728 MHz).
Also keep in mind that you need to consider the supply voltage before selecting a crystal (if you need a frequency above 10 MHz, you need to run between 4.5V and 5.5V).
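As a rough illustration of why that crystal choice helps (back-of-envelope figures, not from the answer above): 7372800 / 57600 = 128 exactly, so the bit timing divides with no rounding error, whereas 8000000 / 57600 ≈ 138.9 clock cycles per bit, which already forces the software UART to round, and any drift of the internal RC oscillator (it can move by a few percent across voltage and temperature) comes on top of that. An asynchronous 10-bit frame only tolerates a total timing error of a few percent before the last bits get sampled in the wrong slot, which matches the garbage characters in the screenshots.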

Related

Analog to digital sampling rate affected by String() function on ESP8266?

I'm using an ESP8266 NodeMCU 12-E development board to capture audio from a pre-amplified electret microphone, then I upload it to the web where it will be converted to a wav file. My first thought was to cast the integer values of analogRead(A0) on the ESP8266 as String type, then concatenate them into a longer string payload which I can publish to an MQTT broker.
My MQTT client subscribers didn't seem to be getting proper sound files, because all I heard were a series of rhythmic pops.
I decided to investigate if my code on the ESP8266 board was even capturing things properly. I stripped the code down to these few lines which seem to cause problems:
#include <ESP8266WiFi.h>

const char *ssid = "____"; // Change it
const char *pass = "____"; // Change it

void setup()
{
  Serial.begin(115200);
  Serial.println(0); //start
  WiFi.mode(WIFI_STA);
  WiFi.begin(ssid, pass);
}

void loop()
{
  int analog = analogRead(A0);
  if (analog > 255) {
    analog = 255;
  }
  else if (analog < 0) {
    analog = 0;
  }
  Serial.print(String(analog));
  Serial.print(" ");
}
Here's how I use the code above to produce a wav file to check if the sound is what I expect:
- I start up the ESP8266 development board
- I turn on the Serial Monitor and clear all previous output
- I power up my electret microphone and speak into it
- I power down my electret microphone
- I copy the contents of the Serial Monitor (which is a series of integers) into a text file called `audio.raw`
- I copy `audio.raw` to a linux machine that has ffmpeg installed
- I issue the command `ffmpeg -f u8 -ar 11111 -ac 1 -i audio.raw -y audio.wav` on the linux machine
When I listen to the audio.raw file, I hear my voice, but the speed is maybe 5-10 times faster than normal. (I also get a lot of noise and distortion, but that might be a separate issue with the input signal quality.)
I then tried changing this one line of code Serial.print(String(analog)) to Serial.print(analog). Then I repeated the steps above. But this time, my voice sounds like it is about 2 times faster than normal.
Why does changing this one line from Serial.print(String(analog)) to Serial.print(analog) make such a big difference?
Is it because the String() function is a very expensive operation that takes up a lot of time? And when the script needs more time to process each line of code, the script then has less time to capture enough analogRead(A0) data points? And if I run the same ffmpeg command using all the same flags, then ffmpeg will try to meet the -ar 11111 requirement by speeding up the audio play? Which would imply that my sampling rate is dependent on execution speed of my script? Which means I have to consider variable execution speeds across other boards of the same model due to variability in manufacturing precision, environmental temperature, etc...?
Your sampling rate is coupled to your loop implementation (as you have discovered). This will also cause jitter in your sampling rate as different code paths will take different amounts of time and interrupt service routines will also steal CPU cycles.
This jitter will be one of the causes of distortion in your output.
When I listen to the audio.raw file, I hear my voice, but the speed is maybe 5-10 times faster than normal.
The ESP8266 has a hardware UART so the code can potentially load the UART's FIFO buffer faster than it can output. This would be a source of the perceived faster sampling rate but also cause jitter or data loss when the buffer fills up. Depending on the implementation, when the buffer fills it will drop data or alternatively block (causing jitter).
Why does changing this one line from Serial.print(String(analog)) to Serial.print(analog) make such a big difference?
Is it because the String() function is a very expensive operation that takes up a lot of time? And when the script needs more time to process each line of code, the script then has less time to capture enough analogRead(A0) data points?
Yes, yes and yes.
One of the reasons for the performance difference is that String() involves allocating and managing memory on the heap to store the characters.
Serial.print(analog) uses a fixed size buffer on the stack as the code knows the maximum number of characters required to display an int.
And if I run the same ffmpeg command using all the same flags, then ffmpeg will try to meet the -ar 11111 requirement by speeding up the audio play?
Yes. ffmpeg assumes that the samples have a fixed sampling rate but this does not match the samples that are being printed out.
Which would imply that my sampling rate is dependent on execution speed of my script?
Yes!
Which means I have to consider variable execution speeds across other boards of the same model due to variability in manufacturing precision, environmental temperature, etc...?
Yes. There will be a multitude of variables that affect execution speeds.
What can you do?
Decouple the sampling of data from the code execution.
This can be done by implementing an interrupt service routine (ISR). Tie the ISR to a hardware timer so it executes at a fixed sampling rate, avoiding jitter.
The ISR can write to a buffer which the code in loop() transmits over the serial connection. The ISR and the serial transmission code need to manage the buffer so that neither overruns the other. One way to do this is to use alternating buffers for the ISR and the transmission code.
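Not part of the original answer, but here is a minimal sketch of that timer-plus-buffer idea, assuming the ESP8266 Arduino core's timer1 API (timer1_attachInterrupt, timer1_enable, timer1_write) and assuming Wi-Fi is left off so that analogRead() inside the ISR does not contend with the Wi-Fi stack; the buffer is a simple single-producer/single-consumer ring buffer, and the numbers (10 kHz, 8-bit samples) are illustrative only:

// Timer ISR samples A0 into a ring buffer; loop() drains the buffer to the UART.
const uint16_t BUF_SIZE = 1024;          // ring buffer size (power of two)
volatile uint8_t buf[BUF_SIZE];
volatile uint16_t head = 0, tail = 0;    // ISR advances head, loop() advances tail

void IRAM_ATTR onSampleTimer() {
  uint16_t next = (head + 1) % BUF_SIZE;
  if (next != tail) {                            // drop the sample if the buffer is full
    buf[head] = (uint8_t)(analogRead(A0) >> 2);  // scale the 10-bit reading to 8 bits
    head = next;
  }
}

void setup() {
  Serial.begin(115200);
  timer1_attachInterrupt(onSampleTimer);
  timer1_enable(TIM_DIV16, TIM_EDGE, TIM_LOOP);  // 80 MHz / 16 = 5 MHz timer clock
  timer1_write(500);                             // 5 MHz / 500 = 10 kHz sample rate
}

void loop() {
  while (tail != head) {                 // send whatever the ISR has produced so far
    Serial.write(buf[tail]);
    tail = (tail + 1) % BUF_SIZE;
  }
}

At 10000 one-byte samples per second a 115200-baud UART can keep up, and ffmpeg would then be run with -ar 10000 to match.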
Since you use Serial.begin(115200), the ESP8266 will transfer 115200 bits per second through the serial port, which is 115200 / 8 = 14400 bytes per second. Since you use the u8 (unsigned 8-bit) format for the audio, each sample consists of a single byte, so just change the ffmpeg -ar parameter to 14400.
I don't have any microphones I can connect to the MCU for testing, but it should work properly this way. The other parameter, -ac, is correct since it is mono audio.
Edit: Also, don't use the String() constructor when printing to Serial.
When you use the String() constructor the sound speeds up about 5 times because String converts your 1-byte value into up to 3 bytes (for example, the byte 255 becomes the characters "2", "5", "5"). You don't have to consider the execution speed of the microcontroller; it will output 115200 bits per second as you configured. You just need to consider its output.
Finally delete the line
Serial.print(" ");
Also change
int analog = analogRead(A0);
to
byte analog = (byte)analogRead(A0);
since an int is 4 bytes, you don't want to send 3 extra bytes to the serial port.
And after changing int to byte you can get rid of this code block
if (analog > 255) {
analog = 255;
}
else if (analog < 0){
analog = 0;
}
If you connect the ESP8266 over USB to a Linux device that has ffmpeg installed, you can use
ttylog -b 115200 -d /dev/ttyUSB0 | ffmpeg -f u8 -ar 14400 -ac 1 -i - -y audio.wav
to capture audio data in realtime from ESP8266.

ESP8266 Arduino available memory

When I compile a simple Blink sketch on Arduino for ESP8266, it looks like 38% of the memory is used by something:
Global variables use 31,576 bytes (38%) of dynamic memory, leaving 50,344 bytes for local variables. Maximum is 81,920 bytes.
Where does this memory go? I have an application that requires a lot of memory and wanted to see if I can disable / reduce usage by some Arduino built-in libraries.
Code below:
void setup() {
  pinMode(LED_BUILTIN, OUTPUT);  // Initialize the LED_BUILTIN pin as an output
}

void loop() {
  digitalWrite(LED_BUILTIN, LOW);   // Turn the LED on (LOW is the voltage level, but the LED
                                    // is on because it is active low on the ESP-01)
  delay(1000);                      // Wait for a second
  digitalWrite(LED_BUILTIN, HIGH);  // Turn the LED off by making the voltage HIGH
  delay(2000);                      // Wait for two seconds (to demonstrate the active low LED)
}
That memory is used by the variables you initialize and by the core/firmware libraries. Even a basic sketch pulls in the ESP8266 core, which already reserves RAM for its configuration and firmware settings. If you use fewer variables and simpler logic in your program, your own share of that memory goes down; and the relative overhead shrinks for larger programs, because the core libraries included are the same regardless of program size.
If the program is really large, concentrate on your logic and offload the complex calculations to a more powerful machine; that also reduces power consumption and heat dissipation on the ESP.
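Not from the answer above, but if you want to see the runtime picture rather than the compile-time summary, the ESP8266 Arduino core exposes ESP.getFreeHeap(), which reports how much of that dynamic memory is actually free while the sketch runs; a minimal sketch:

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Print the amount of heap currently available to the sketch
  Serial.printf("Free heap: %u bytes\n", ESP.getFreeHeap());
  delay(1000);
}

Watching this value as you add libraries and buffers shows where the dynamic memory actually goes.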

How can I tell if a peripheral is connected to GPIO?

I want to be able to detect when a peripheral sensor is NOT connected to my Raspberry Pi 3.
For example, if I have a GPIO passive infrared sensor.
I can get all the GPIO ports like this:
PeripheralManagerService manager = new PeripheralManagerService();
List<String> portList = manager.getGpioList();
if (portList.isEmpty()) {
    Log.i(TAG, "No GPIO port available on this device.");
} else {
    Log.i(TAG, "List of available ports: " + portList);
}
Then I can connect to a port like this:
try {
    Gpio pir = new PeripheralManagerService().openGpio("BCM4");
} catch (IOException e) {
    // not thrown in the case of an empty pin
}
However, even if the pin is empty I can still connect to it (which technically makes sense, as GPIO is just binary on or off). There doesn't seem to be any API for this, and I can't think of how you would logically differentiate between a pin that has a peripheral sensor connected and one that is "empty".
Therefore, at the moment there is no way for me to assert programmatically that my sensors and circuit are set up correctly.
Does anyone have any ideas? Is it even possible from an electronics point of view?
Reference docs:
https://developer.android.com/things/sdk/pio/gpio.html
There are lots of ways to do "presence detection" electrically, but nothing that you will find intrinsically in the SoC. You wouldn’t normally ask a GPIO pin if something is attached—it would have no way to tell you that.
Extra GPIO pins are often used to detect if a peripheral is attached to a connector. The plug for some sensor could include a “detect” line that is shorted to ground and pulls the GPIO low when the sensor is attached, for example. USB and SDIO do something similar with some dedicated circuitry in the interface.
You could also build more elaborate detection circuits using things like current sensing, but they would inevitably have to put out a binary signal that you capture through a dedicated GPIO.
This is easier to achieve for serial peripherals, since you can usually send a basic command and verify that you get a response.
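Not from the answer itself, but a minimal sketch of that "detect line" idea, written in Arduino-style C++ for brevity (the question targets Android Things, where the equivalent would be opening the spare GPIO as an input and reading it); the pin number and the name DETECT_PIN are purely illustrative. With the internal pull-up enabled, the pin reads HIGH when nothing is plugged in and LOW when the sensor's connector ties it to ground:

const int DETECT_PIN = 5;  // hypothetical spare GPIO wired to the connector's detect contact

void setup() {
  Serial.begin(115200);
  pinMode(DETECT_PIN, INPUT_PULLUP);  // pulled HIGH unless the connector shorts it to ground
}

void loop() {
  if (digitalRead(DETECT_PIN) == LOW) {
    Serial.println("Sensor connector present");
  } else {
    Serial.println("Sensor not connected");
  }
  delay(1000);
}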
Detection using solely the input line can be tough. First, you'd want to narrow the scope of the problem: treat the sensor as not present if it is not connected, if it is connected but not responding, or if it responds in an uncharacteristic manner.
So, if it is a digital sensor, then communicating with the sensor may be enough to tell if it is present or not (especially if checksums or parity bits are involved).
Some analog sensors also have specific specs on how it behaves when triggered. You can utilize deviation from those specs to determine if the sensor is not present.
If you have a digital sensor without any error checking on its output, where you clock out data (so all 0s or all 1s is valid) or the output is just a binary 1 or 0, then you'd need external help. The same goes for most analog sensors.
This external help would be something where you put the system in a known controlled state, press a button, and it then checks the sensors for output within a specific range. To be absolutely sure, you'd want at least two different states, to ensure your digital or analog inputs didn't happen to be stuck at the correct state for your test.
Just about any other method would be external to the system. Using additional IO to "detect" a sensor could help increase confidence the sensor is there, but you could get false positives where all you've learned is that "something" is there - not necessarily the sensor you expect.

Check the battery status with NodeMCU?

I use an ESP8266 dev board from NodeMCU with Lua. I power my chip with two AA batteries, which gives me 3V. See this:
https://www.hackster.io/noelportugal/ifttt-smart-button-e11841
How do I check the battery status using NodeMCU?
With a recent firmware you can use adc.readvdd33(). That should be enough for your case.
I read somewhere that adc.readvdd33() was deprecated? Effectively it is for many of the ESP8266 modules available; the docs say, "If the ESP8266 has been configured to use the ADC for sampling the external pin, this function will always return 65535." So any ESP8266 module that brings out the ADC pin (like the ESP8266-07 or -12, etc.) has this shunted in firmware.
But by adding a couple of resistors to make a voltage divider, you can still use the ADC pin for this.
Schematic: http://i.stack.imgur.com/FEILF.png
Those resistor values will allow it to read 0-12V, as a value between 0-1024. (The voltage at the ADC pin must be less than 1V.)
val = adc.read(0)
Addendum: adding this divider to your circuit incurs a constant power draw of approximately 0.01 milliamps, small but more than nothing. Multiply the resistor values by 1000 to make it negligible, or use 18 megaohm for R1 and 2 megaohm for R2, which still divides the voltage by 10 and (at a guess) drains less current than the battery loses to self-discharge anyway.
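As a quick worked example (using the 18 MΩ / 2 MΩ pair suggested in the addendum rather than the values in the schematic): the divider outputs Vbat × 2 / (18 + 2) = Vbat / 10, so two fresh AA cells at 3.0 V put about 0.30 V on the ADC pin and adc.read(0) returns roughly 0.30 / 1.0 × 1024 ≈ 307; multiplying the reading by 10 / 1024 converts it back to the battery voltage in volts.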

Contiki OS CC2538: Reducing current / power consumption

I am trying to drive down the current consumption of Contiki OS running on the CC2538 development kit.
I would like to operate the device from a CR2032 with a run life of 2 years. To achieve this I would need an average current less than 100uA.
However when I run the following at 3V, I get the following results:
contiki/examples/hello-world = 0.4mA - 2mA
contiki/examples/er-rest-example/er-example-client = 27mA
contiki/examples/er-rest-example/er-example-server = 27mA
thingsquare websocket example = 4mA
I have also designed my own target platform based on the cc2538 and get similar results.
I have read the guide at https://github.com/contiki-os/contiki/blob/648d3576a081b84edd33da05a3a973e209835723/platform/cc2538dk/README.md
and have ensured that in the contiki-conf.h file:
- LPM_CONF_ENABLE 1
- LPM_CONF_MAX_PM 2
Can anyone give me some pointers as to how I can get the current down? It would be most appreciated.
Regards,
Shane
How did you measure the current?
You have to be aware that using a basic ammeter to measure the current consumption of Contiki OS won't give you relevant results. The system turns the radio on and off at a relatively high rate (8 Hz by default) in order to perform the CCA. This can be very hard for an ammeter to catch.
To get an idea of the current consumption when the device is in deep sleep (and then do the maths to determine the average current consumption), I'd rather put the device in the PM state before the program reaches the infinite while loop. I used the following code to do that:
lpm_enter();
REG(SYS_CTRL_PMCTL) = SYS_CTRL_PMCTL_PM2;
do { asm("wfi"::); } while(0);
leds_on(LEDS_RED); // should not reach here
while(1){
...
On the CC2538, the CCA check consumes about 10-15mA and lasts approximately 2ms. When the radio transmits a packet, it consumes 25mA. Have a look at this post: Contiki UDP packet transmission duration with CC2538.
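To put those numbers into perspective (a back-of-envelope estimate, not from the original answer): at the default 8 Hz channel-check rate, the CCA alone contributes roughly 8 × 2 ms × 12 mA / 1000 ms ≈ 0.2 mA to the average current, which already exceeds the 100 µA budget mentioned in the question before any packet transmissions or MCU wake time are counted.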
Furthermore, to save a little more current, turn off the serial communication:
#define CC2538_CONF_QUIET 1
Are you using the SmartRF board? If you want to make proper current measurements with this board, you have to remove all the jumpers: P486, P487, P411 and P408. Keep only the jumpers for BTN_SEL and the RESET signals.
