I was trying to connect peripherals over the SPI bus and it didn't work, so I checked the outputs with an oscilloscope and discovered that the chip doesn't respond to spi library commands.
The only thing I get is noise on the TX and RX pins; the voltages on the other pins do not change at all. I tested this on two NodeMCUs (an unofficial LoLin and an Amica) with both the master and dev firmware builds. Here are the SPI commands:
spi.setup(1, spi.MASTER, spi.CPOL_LOW, spi.CPHA_LOW, 20, 8) -- id, mode, cpol, cpha, databits, clock_div
spi.send(1, 170, 170, 170, 170) -- 170 == 0xAA == 0b10101010
What could be the problem?
Edit
The TX/RX noise turned out to be the UART signal from the serial communication with the computer.
The SPI bus works; it's just too fast for my crappy oscilloscope.
I also discovered that the databits argument can be in the range [1, 32] and clock_div in the range [0, ~1200].
Related
I am working with the Songhe Mega2560 + WiFi R3 board (an ATmega2560 and an ESP8266 with 4 MB of flash on one PCB) for a project that involves connecting to a WiFi network and reading the RSSI value.
Below is a basic sketch that I uploaded to the Mega2560 to communicate with the ESP8266 through Serial3 and test the firmware:
#include "WiFiEsp.h"
// Emulate Serial3 on pins 6/7 if not present
#ifndef HAVE_HWSERIAL3
#include "SoftwareSerial.h"
SoftwareSerial Serial3(6, 7); // RX, TX
#endif
void setup() {
// initialize serial for debugging
Serial.begin(115200);
// initialize serial for ESP module
Serial3.begin(115200);
// initialize ESP module
WiFi.init(&Serial3);
// check for the presence of the shield
if (WiFi.status() == WL_NO_SHIELD) {
Serial.println("WiFi shield not present");
// don't continue
while (true);
}
// Print WiFi MAC address
printMacAddress();
}
void loop() {
// do nothing
}
I flashed different versions of Espressif's AT firmware, but the serial monitor keeps showing this:
22:28:07.009 -> [WiFiEsp] Initializing ESP module
22:28:08.023 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:10.026 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:12.022 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:14.023 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:16.020 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:17.006 -> [WiFiEsp] Cannot initialize ESP module
22:28:23.017 -> [WiFiEsp] >>> TIMEOUT >>>
22:28:23.017 -> [WiFiEsp] No tag found
22:28:23.017 -> WiFi shield not present
I am not sure if it is a firmware issue, so I have tried multiple versions of the AT firmware. The baud rate I have set is 115200. I have looked at many other sources online, but I cannot get WiFiEsp to initialize the WiFi module, and I would really appreciate some help on this matter.
I have been following these steps for flashing and testing:

1. Toggle DIP switches 5, 6, 7 to ON and all others OFF, and set RXD/TXD to RXD0.
2. Connect the USB cable from port COM3 (on my computer) to the integrated Mega2560 + ESP8266 board.
3. Use esptool.py to flash the firmware to the ESP8266. The latest released AT firmware for the ESP8266 is "ESP8266-IDF-AT_V2.2.1.0.zip", downloadable from espressif.com. I flash the factory_xxx.bin to address 0x0, since I read that it carries all the hardware configurations for the ESP module. Below is the command I ran:

   esptool.py --chip auto --port COM3 --baud 115200 --before default_reset --after hard_reset write_flash -z --flash_mode dio --flash_size 4MB 0x0 factory_WROOM-02.bin

4. Disconnect the USB cable.
5. Toggle DIP switches 1, 2, 3, 4 to ON and all others OFF, and set RXD/TXD to RXD3.
6. Connect the USB cable, upload the sketch with the Arduino IDE, and read the serial monitor.
This is the procedure I have been trying to debug. If any more information is required, please let me know and I will do my best to provide it.
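As a side note, esptool is itself a Python package, so a "does the chip still respond on COM3" check can be scripted at any point in this procedure. This is only a sketch, assuming esptool v4.x is installed; it invokes the same flash_id command the CLI exposes:

# Sketch: run esptool's flash_id via its Python entry point to confirm
# the ESP8266 answers on COM3 before writing any firmware (esptool v4.x assumed).
import esptool

esptool.main(["--chip", "auto", "--port", "COM3", "flash_id"])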
Below is the command I run, which flashes successfully (I think):
C:\Users\[MY NAME]\Downloads\ESP8266_NONOS_SDK-3.0.5\ESP8266_NONOS_SDK-3.0.5\bin>esptool.py write_flash --flash_mode dout --flash_size 4MB-c1 0x0 boot_v1.7.bin 0x01000 at/1024+1024/user1.2048.new.5.bin 0x1fb000 blank.bin 0x1fc000 esp_init_data_default_v08.bin 0xfe000 blank.bin 0x1fe000 blank.bin
esptool.py v4.4
Found 1 serial ports
Serial port COM3
Connecting....
Detecting chip type... Unsupported detection protocol, switching and trying again...
Connecting....
Detecting chip type... ESP8266
Chip is ESP8266EX
Features: WiFi
Crystal is 26MHz
MAC: a4:e5:7c:b6:77:c0
Uploading stub...
Running stub...
Stub running...
Configuring flash size...
Flash will be erased from 0x00000000 to 0x00000fff...
Flash will be erased from 0x00001000 to 0x00065fff...
Flash will be erased from 0x001fb000 to 0x001fbfff...
Flash will be erased from 0x001fc000 to 0x001fcfff...
Flash will be erased from 0x000fe000 to 0x000fefff...
Flash will be erased from 0x001fe000 to 0x001fefff...
Flash params set to 0x0360
Compressed 4080 bytes to 2936...
Wrote 4080 bytes (2936 compressed) at 0x00000000 in 0.4 seconds (effective 92.5 kbit/s)...
Hash of data verified.
Compressed 413556 bytes to 296987...
Wrote 413556 bytes (296987 compressed) at 0x00001000 in 26.2 seconds (effective 126.1 kbit/s)...
Hash of data verified.
Compressed 4096 bytes to 26...
Wrote 4096 bytes (26 compressed) at 0x001fb000 in 0.1 seconds (effective 373.3 kbit/s)...
Hash of data verified.
Compressed 128 bytes to 75...
Wrote 128 bytes (75 compressed) at 0x001fc000 in 0.1 seconds (effective 11.8 kbit/s)...
Hash of data verified.
Compressed 4096 bytes to 26...
Wrote 4096 bytes (26 compressed) at 0x000fe000 in 0.1 seconds (effective 365.1 kbit/s)...
Hash of data verified.
Compressed 4096 bytes to 26...
Wrote 4096 bytes (26 compressed) at 0x001fe000 in 0.1 seconds (effective 357.1 kbit/s)...
Hash of data verified.
Leaving...
Hard resetting via RTS pin...
After this, I test it with Arduino's SerialPassThrough sketch, replacing Serial1 with Serial3, and I get no response when I send the command AT.
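For reference, the same AT check can be done with the Arduino out of the loop entirely. A minimal sketch, assuming pyserial is installed and the DIP switches route the ESP8266 directly to the USB port (COM3 as above):

# Sketch: talk to the ESP8266's AT firmware directly over COM3,
# bypassing the Mega2560 and the WiFiEsp library (pyserial assumed).
import serial
import time

with serial.Serial("COM3", 115200, timeout=2) as port:
    port.write(b"AT\r\n")                    # AT commands end with CR+LF
    time.sleep(0.5)
    print(port.read(port.in_waiting or 64))  # a healthy module ends its reply with b'OK\r\n'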
I would appreciate any help on how to resolve this and where I could possibly be going wrong. Thanks!
I have two Roland MIDI devices that behave the same way when I try to send a bank and program change: the device always ends up on the first patch of the bank, never the patch I chose. Logic Pro can, however, switch to different banks.
The following example causes the devices to change to the bank, but the program (patch) on the device defaults to the first one in that bank rather than number 9.
var event = AKMIDIEvent(controllerChange: 0, value: 89, channel: 0)  // Bank Select MSB (CC 0)
midiOut.sendEvent(event)
event = AKMIDIEvent(controllerChange: 32, value: 64, channel: 0)     // Bank Select LSB (CC 32)
midiOut.sendEvent(event)
event = AKMIDIEvent(programChange: 9, channel: 0)                    // Program Change to patch 9
midiOut.sendEvent(event)
Does anyone have experience with sending these MIDI messages?
I was going through the same issue and was about to go crazy. It turns out that the Program Change values in various vendors' MIDI data specifications are 1-based, not 0-based. Or perhaps it is the AudioKit implementation that is wrong?
So, instead of a programChange value of 9, you should use a value of 8. Here is my code for changing the current instrument on channel 0 to the Bösendorfer grand piano on a Yamaha Clavinova keyboard, where the programChange value in the MIDI data specification is given as 1.
midiOut.sendControllerMessage(0, value: 108) // MSB sound bank selection
midiOut.sendControllerMessage(32, value: 0) // LSB sound bank selection
midiOut.sendEvent(AKMIDIEvent(programChange: 0, channel: 0)) // Initiate program change based on MSB and LSB selections
While reading various documentation about how MIDI works, I also saw some forum posts describing keyboards that expect the LSB bank selection before the MSB bank selection. That is not my understanding of how MIDI should work, but it is worth a try if you still cannot make it work with your Roland keyboards.
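For comparison, here is the same Bank Select (MSB, then LSB) followed by Program Change sequence spelled out with the Python mido library, which makes the raw message order easy to see. This is only an illustration with a hypothetical port name, not AudioKit code:

# Standard bank change: CC 0 (MSB), CC 32 (LSB), then Program Change.
# The port name is a placeholder; the values mirror the question's example.
import mido

out = mido.open_output("Roland Device")  # hypothetical port name
out.send(mido.Message("control_change", channel=0, control=0, value=89))   # Bank Select MSB
out.send(mido.Message("control_change", channel=0, control=32, value=64))  # Bank Select LSB
out.send(mido.Message("program_change", channel=0, program=8))             # patch 9, 0-based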
I am using a SIM800L module with a Texas Instruments LaunchPad (MSP430G2553 microcontroller), without an external library for the SIM800L.
Problem Statement:
A simple text message (SMS in text mode) is sent, but it is received as a blank message on the cellphone.
SIM details:
1. SIM 1 : Location: India. Operator: AirTel, 4G compatible SIM Card.
2. SIM 2 : Location: India. Operator: Tata Docomo, 3G compatible SIM card.
What I know already:
The UART drivers in the firmware are tested and working; they are interrupt-driven, not polling.
No blocking time delays are used as a substitute for reading the responses to AT commands. I read the response and proceed only if a positive acknowledgement is received, <CR><LF>OK<CR><LF> for most of the commands.
I have confirmed the data bits transmitted and received on the Tx-Rx pins with an oscilloscope. Everything looks as expected, including the voltage levels.
What I have read:
Some speculation from unofficial sources (forums, of course) that the SIM800L is only 2G-compatible.
(From a shallow reading of Wikipedia) I have read through GSM 03.38 and GSM 03.40, and the Data Coding Scheme section, to understand how the encoding of text is handled by the relevant AT command (AT+CSMP).
Various forums, including the Arduino ones, where SIM800L modules are very popular.
Related posts on Stackoverflow:
Recieving Blank SMS SIM800 using AT Commands and Python on Raspberry Pi 2
How to send SMS with GSM module SIM800 and Arduino Uno?
Sending GSM Character Set in SMS with SIM800L Module
The answer to the first one seemed to have worked for its author, but it didn't work for me.
What have I tried:
I have used the same module with an instance of the Docklight serial terminal. SMS messages sent from Docklight are received on my cellphone and appear as expected, not blank.
On day 0, before integrating the module with the LaunchPad hardware, I tested the overall firmware state machine against an exact copy of the expected responses from the SIM800L.
The results for both SIM cards are the same, except for some of the initial configuration, but I load a typical set of configuration values into both of them before I initiate any SMS-related task.
Typical values that I use are:
Echo Off
CSMP: 17, 167, 0, 0 (I have tried 17, 167, 0, 0, but no luck). Default from SIM 1 is 17,11,0,246, and that from SIM 2 is 17, 255, 0, 0.
CSCS: "IRA"
Failed combinations on serial port: (SIM 1 and SIM 2)
CSMP: 17, 11, 0, 246 | CSCS: "IRA" - Sends a blank SMS
CSMP: 17, 11, 0, 246 | CSCS: "GSM" - Sends a blank SMS
CSMP: 17, 11, 0, 246 | CSCS: "HEX" - Sends a blank SMS
Successful combinations on serial port: (SIM 1 and SIM 2)
CSMP: 17, 167, 0, 0 | CSCS: "IRA"
CSMP: 17, 167, 0, 8 | CSCS: "IRA"
CSMP: 17, 11, 0, 0 | CSCS: "GSM"
CSMP: 17, 167, 0, 0 | CSCS: "GSM"
CSMP: 17, 167, 0, 8 | CSCS: "GSM"
To be honest, I played hunches with these combinations before I studied which field reflects which change (they are poorly documented in the SIM800L user guide).
Any idea what I might be missing here? I am open to the possibility that this is more of an RTFM (Read The Fat Manual) issue.
OK, I managed to resolve the issue.
It was not about the SIM800L at all.
The whole payload was followed by a '\0', which the module does not expect (I know, very poor on my side). The serial terminal has no issue with it whatsoever.
Debugging was fun!
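For anyone hitting the same wall, below is a minimal sketch of the text-mode send sequence in Python with pyserial (the port, baud rate, and phone number are placeholders). The point is the ending: the body is terminated by Ctrl+Z (0x1A), and a stray trailing '\0' inside the body is exactly the kind of thing that produces a blank SMS:

# Sketch of a text-mode SMS over pyserial (port/number are placeholders).
# The body must end with Ctrl+Z (0x1A) and contain no stray NUL bytes.
import serial
import time

def at(m, cmd):
    # send one AT command and return whatever the module answers
    m.write(cmd + b"\r\n")
    time.sleep(0.5)
    return m.read(m.in_waiting or 64)

with serial.Serial("COM4", 9600, timeout=5) as m:
    at(m, b"ATE0")                     # echo off
    at(m, b"AT+CMGF=1")                # text mode
    at(m, b'AT+CSCS="GSM"')            # GSM 7-bit character set
    at(m, b'AT+CMGS="+911234567890"')  # module answers with a '>' prompt
    m.write(b"Hello" + b"\x1a")        # body + Ctrl+Z; NOT b"Hello\x00\x1a"
    print(m.read(200))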
I am thinking about getting a new monitor. Short of plugging one in and seeing what happens, how can I tell if my laptop can output 4k resolution?
Is there a command I can run that will tell me this?
I thought maybe I could use xrandr, but I think that only tells me what each connected monitor is capable of (even if the controller is capable of more).
I also thought maybe I could look up the device from lspci and search for it on Google, but I couldn't find much.
My lspci -v says:
00:02.0 VGA compatible controller: Intel Corporation 3rd Gen Core processor Graphics Controller (rev 09) (prog-if 00 [VGA controller])
Subsystem: Lenovo Device 500d
Flags: bus master, fast devsel, latency 0, IRQ 45
Memory at f3000000 (64-bit, non-prefetchable) [size=4M]
Memory at d0000000 (64-bit, prefetchable) [size=256M]
I/O ports at 6000 [size=64]
Expansion ROM at <unassigned> [disabled]
Capabilities: <access denied>
Kernel driver in use: i915
Update: Mr. Llama's suggestion about xrandr made me think that this output could be useful to someone who knows more than I do. Here's my xrandr output:
Screen 0: minimum 8 x 8, current 5206 x 1080, maximum 16384 x 16384
eDP-1-0 connected 1366x768+0+0 309mm x 173mm
1366x768 60.1*+
... other resolution options removed for brevity ...
320x240 60.1
HDMI-1-0 connected 1920x1080+3286+0 160mm x 90mm
1920x1080 60.0*+ 50.0 59.9
... other resolution options removed for brevity ...
640x480 60.0 59.9
DisplayPort-1-1 connected 1920x1080+1366+0 509mm x 286mm
1920x1080 60.0*+
... other resolution options removed for brevity ...
720x400 70.1
VGA-1-0 disconnected
HDMI-1-1 disconnected
DisplayPort-1-0 disconnected
Looks like the xrandr command might be of use to you; it lists the available and current monitor resolutions.
See this Super User question for more information.
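If you want to script the check, the first line of xrandr's output already carries the controller-side ceiling. A small sketch, assuming an X session with xrandr on the PATH; note that the framebuffer maximum is an upper bound advertised by the driver, not a guarantee that a particular port can drive 4k:

# Print the X screen's maximum framebuffer size reported by xrandr.
import re
import subprocess

out = subprocess.check_output(["xrandr"], text=True)
match = re.search(r"maximum (\d+) x (\d+)", out)
if match:
    # e.g. "max framebuffer: 16384 x 16384" for the output shown above
    print("max framebuffer: %s x %s" % match.groups())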
I'm trying to get RSSI or signal strength from WiFi packets.
I also want the RSSI from WiFi probe requests (when somebody is searching for WiFi hotspots).
I managed to see it in the kismet logs, but that was only to make sure it is possible; I don't want to use kismet all the time.
For full-time scanning I'm using scapy. Does anybody know where I can find the RSSI or signal strength (in dBm) in the packets sniffed with scapy? I don't know how the whole packet is built, and there are a lot of hex values which I don't know how to parse/interpret.
I'm sniffing on both interfaces: wlan0 (detecting when somebody connects to my hotspot) and mon.wlan0 (detecting when somebody is searching for hotspots).
The hardware (WiFi card) I use is based on a Prism chipset (ISL3886). However, the test with kismet was run on an Atheros (AR2413) and an Intel iwl4965.
Edit1:
Looks like I need to somehow access the information stored in the PrismHeader class:
http://trac.secdev.org/scapy/browser/scapy/layers/dot11.py
line 92?
Does anybody know how to access this information?
packet.show() and packet.show2() don't show anything from this class/layer.
Edit2:
After more digging, it appears that the interface just isn't set up correctly, which is why it doesn't collect all the necessary headers.
If I run kismet and then sniff packets from that interface with scapy, there is more info in the packet:
###[ RadioTap dummy ]###
version= 0
pad= 0
len= 26
present= TSFT+Flags+Rate+Channel+dBm_AntSignal+Antenna+b14
notdecoded= '8`/\x08\x00\x00\x00\x00\x10\x02\x94\t\xa0\x00\xdb\x01\x00\x00'
...
Now I only need to set up the interface correctly without using kismet.
Here is a valuable scapy extension that improves scapy.layers.dot11.Packet's parsing of present but not decoded fields:
https://github.com/ivanlei/airodump-iv/blob/master/airoiv/scapy_ex.py
Just use:
import scapy_ex
And:
packet.show()
It'll look like this:
###[ 802.11 RadioTap ]###
version = 0
pad = 0
RadioTap_len= 18
present = Flags+Rate+Channel+dBm_AntSignal+Antenna+b14
Flags = 0
Rate = 2
Channel = 1
Channel_flags= 160
dBm_AntSignal= -87
Antenna = 1
RX_Flags = 0
To summarize:
The signal strength was not visible because something was wrong with the way monitor mode was set up (not all headers were passed/parsed by the sniffers). That monitor interface was created by hostapd.
Now I set monitor mode on the interface with airmon-ng, and tcpdump and scapy show these extra headers.
Edit: use scapy 2.4.1+ (or the GitHub dev version). The most recent versions now correctly decode the « notdecoded » part.
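With a properly decoded RadioTap layer, none of the byte-slicing below is necessary. A minimal sketch, assuming scapy >= 2.4.1 and a monitor-mode interface named mon0:

# Print the source MAC and RSSI of probe requests on a monitor-mode interface.
# Requires scapy >= 2.4.1, which decodes RadioTap's dBm_AntSignal field.
from scapy.all import sniff
from scapy.layers.dot11 import Dot11ProbeReq, RadioTap

def show_rssi(pkt):
    if pkt.haslayer(Dot11ProbeReq) and pkt.haslayer(RadioTap):
        print(pkt.addr2, pkt[RadioTap].dBm_AntSignal, "dBm")

sniff(iface="mon0", prn=show_rssi, store=False)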
For some reason the packet structure has changed: dBm_AntSignal is now the first element in notdecoded.
I am not 100% sure of this solution, but I used sig_str = -(256 - ord(packet.notdecoded[-2:-1])) to reach the first element, and I get values that seem to be dBm_AntSignal.
I am using OpenWrt on a TP-Link MR3020 with extroot and Edward Keeble's Passive Wifi Monitoring project with some modifications.
I use scapy_ex.py and I got this information:
802.11 RadioTap
version = 0
pad = 0
RadioTap_len= 36
present = dBm_AntSignal+Lock_Quality+b22+b24+b25+b26+b27+b29
dBm_AntSignal= 32
Lock_Quality= 8
If someone still has the same issue, I think I have found the solution.
I believe this is the right slice for the RSSI value:
sig_str = -(256-ord(packet.notdecoded[-3:-2]))
and this one is for the noise level:
noise_str = -(256-ord(packet.notdecoded[-2:-1]))
The fact that it says "RadioTap" suggests that the device may supply Radiotap headers, not Prism headers, even though it has a Prism chipset. The p54 driver appears to be a "SoftMAC driver", in which case it'll probably supply Radiotap headers; are you using the p54 driver or the older prism54 driver?
I have a similar problem: I set up monitor mode with airmon-ng and I can see the dBm level in tcpdump, but whenever I try sig_str = -(256 - ord(packet.notdecoded[-4:-3])) I get -256, because the value returned from notdecoded is 0. The packet structure looks like this:
version = 0
pad = 0
len = 36
present = TSFT+Flags+Rate+Channel+dBm_AntSignal+b14+b29+Ext
notdecoded= ' \x08\x00\x00\x00\x00\x00\x00\x1f\x02\xed\x07\x05
.......