I'm trying to implement a method to write data to an SD card from a dsPIC33F. I can currently transmit the data via UART to Bluetooth and USB, but I can't find anything online about writing to an SD card via UART; everything seems to use SPI.
I would use SPI, but I'm already using I2C, and it seems difficult to use both SPI and I2C on the same PIC because they share pins.
So, can anyone suggest any information on writing data to an SD card via UART, or maybe a way to use both SPI and I2C concurrently?
All I want is some form of storage, so if someone can suggest another method (maybe EEPROM or a USB flash drive), I'm all ears. I will need at least 2 GB of storage; the more the better.
Most SD cards natively support SPI communication but not UART, so a direct UART connection isn't possible. I would recommend against the USB flash drive, as there is a lot of overhead there that complicates things (the PIC would have to act as a USB host). And EEPROM is likely to use SPI or I2C, and won't get you anywhere near 2 GB, so you're still left with the problem of having the one set of peripheral pins already in use.
Your best option, given the chip you are using, is to use the Peripheral Pin Select (PPS) feature to map some available pins to the four SPI pins you need. Section 11.6 of the datasheet has a good explanation of how to remap pin functions. That is probably the easiest solution.
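For a rough idea of what the remapping involves, here is a sketch for a PPS-capable dsPIC33FJ part using the XC16 compiler. The RPx pin numbers, register names and output function codes below are illustrative assumptions; check the remappable-pin tables in your own device's datasheet (the section referenced above) before relying on them.

```c
#include <xc.h>

/* Sketch only: route SPI1 onto three spare remappable pins.
 * RP numbers and function codes are assumptions for a generic
 * dsPIC33FJ device -- verify against your datasheet. */
void remap_spi1_pins(void)
{
    /* Unlock the PPS registers by clearing OSCCON.IOLOCK */
    __builtin_write_OSCCONL(OSCCON & 0xBF);

    RPINR20bits.SDI1R = 10;  /* SDI1 (data in)  <- RP10             */
    RPOR4bits.RP8R    = 7;   /* RP8  -> SDO1 (data out), code 7     */
    RPOR4bits.RP9R    = 8;   /* RP9  -> SCK1 output, code 8         */
    RPINR20bits.SCK1R = 9;   /* some parts also want the SCK1 input
                                mapped back to the same pin in
                                master mode                          */

    /* Lock the PPS registers again */
    __builtin_write_OSCCONL(OSCCON | 0x40);
}
```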
Another approach would be to use the UART to communicate with a second PIC that has its SPI pins available, but that, too, introduces a lot of extra overhead and complexity.
SD cards can work with SDIO or SPI.
In order to eliminate the SPI / I2C pin sharing problem, I would:
1) check if the sensor can be replaced with an SPI one
2) if not, implement a software (bit-banged) SPI master on other pins; since the MCU is the master, this is much easier (a rough sketch follows below)
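For option 2, a bit-banged SPI master only needs a few spare GPIOs. Below is a minimal sketch (SPI mode 0, MSB first, which is what SD cards expect in SPI mode); the pin assignments are placeholders, and you would still configure the TRIS bits and drive chip select yourself.

```c
#include <xc.h>
#include <stdint.h>

/* Bit-banged SPI master sketch (mode 0, MSB first).
 * The pin assignments are placeholders -- use whatever free GPIOs you
 * have, configure their TRIS bits, and drive chip select separately. */
#define SCK_LAT    LATBbits.LATB5    /* clock     (output) */
#define MOSI_LAT   LATBbits.LATB6    /* data out  (output) */
#define MISO_PORT  PORTBbits.RB7     /* data in   (input)  */

static uint8_t soft_spi_transfer(uint8_t out)
{
    uint8_t in = 0;

    for (uint8_t bit = 0; bit < 8; bit++) {
        MOSI_LAT = (out & 0x80) ? 1 : 0;   /* present MSB while SCK is low    */
        out <<= 1;

        SCK_LAT = 1;                       /* rising edge: slave samples MOSI */
        in = (uint8_t)((in << 1) | MISO_PORT);
        SCK_LAT = 0;                       /* falling edge                    */
    }
    return in;
}
```

The clock rate is simply set by how fast the loop runs; even a few hundred kHz is enough to initialize an SD card, it just limits write throughput.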
Background
I'm very new to electronics/IoT development. I'm trying to create a solution that can read my wife's car's CAN bus messages and store them to an SD card. I hope to analyze the data and build a dashboard based on the car's telemetry.
This specific question is in relation to a chip (STM32F1) on an IoT board (MXChip AZ3166) I already own, which I hope to incorporate into my overall solution as the data acquisition layer.
For reference, the chip is the STMicroelectronics STM32F103C8T6 (a 32-bit ARM Cortex-M3 microcontroller) and the IoT board is the MXChip AZ3166 IoT DevKit.
Reading the MXChip AZ3166 board's spec and doing some research, I have found that the board comprises two main chipsets:
| Vendor | Part Number | Ref Link |
| --- | --- | --- |
| STMicroelectronics | STM32F103C8T6 | https://uk.rs-online.com/web/p/microcontrollers/1023545 |
| MXChip | EMW3166 | https://www.mxchip.com/en/products/module/54 |
Main Question
The product specification mentions that the STM32F1 features "motor control peripherals plus CAN and USB full speed interfaces", and it also states it has 1x CAN channel. Does that mean I can interface the MXChip AZ3166 board featuring this chip, via the GPIO pins, to the CAN bus in my wife's car and receive the CAN bus messages (I presume adhering to the ISO 11898-1 CAN data communication protocol)?
How would I find out which pins to connect to the CAN high and CAN low lines on the car's CAN bus?
Concerning power, how would I make sure that the received CAN signal doesn't fry the MXChip board, which has a stated maximum operating voltage of 3.3 V?
Yes, you'll want an MCU with a built-in CAN controller for communicating on a CAN bus (note that the controller's TX/RX pins connect to the bus through a CAN transceiver chip rather than directly to CANH/CANL). However, the CAN standard only covers the physical and data link layers. You need to know the application layer in order to meaningfully interact with a bus.
The application layer on a car may or may not be proprietary. It may even be encrypted. If you don't know what protocol it uses, then no can do. Reverse-engineering CAN protocols is hackish, hard and dangerous. Plugging into a CAN bus where you have no clue about timing considerations etc. is also very dangerous.
But cars usually have an "on-board diagnostics" (OBD) port used for service purposes, with standardized application layers, through which you may have access to various parts of the car. There are lots of different standards for OBD, and older ones didn't even use CAN; it depends on the car model.
In the case of the OBD port the pinout is standardized and you can find it on the Internet. Otherwise it is very simple to find out which signal is CANH and which is CANL with an oscilloscope: during a dominant bit, CANH is driven from the 2.5 V recessive level up towards 3.5 V and CANL down towards 1.5 V. A more hacky solution is to measure this with a multimeter, which is perfectly possible since one signal will sit slightly above 2.5 V and the other slightly below.
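To make the OBD route concrete, this is roughly what a standard OBD-II "current data" request looks like on the bus (ISO 15765-4). The frame layout and the RPM formula are standard; the can_msg struct is only an illustration, since how you hand the frame to the CAN peripheral depends on your driver (e.g. the STM32 HAL).

```c
#include <stdint.h>

/* Sketch of a standard OBD-II request for engine RPM
 * (service 0x01, PID 0x0C) over CAN. */
struct can_msg {
    uint32_t id;        /* 11-bit identifier        */
    uint8_t  dlc;       /* data length code (bytes) */
    uint8_t  data[8];
};

static const struct can_msg obd_rpm_request = {
    .id   = 0x7DF,                 /* functional (broadcast) OBD request ID */
    .dlc  = 8,
    .data = { 0x02,                /* number of additional data bytes       */
              0x01,                /* service 0x01: show current data       */
              0x0C,                /* PID 0x0C: engine RPM                  */
              0x55, 0x55, 0x55, 0x55, 0x55 }   /* padding (value not critical) */
};

/* The ECU answers on 0x7E8..0x7EF with 0x04 0x41 0x0C A B ...;
 * RPM = ((A * 256) + B) / 4. */
static uint16_t decode_rpm(const uint8_t *data)
{
    return (uint16_t)(((data[3] << 8) | data[4]) / 4);
}
```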
CAN signalling is standardized, so if the board exposes a CAN bus you connect there. In some cases a 12 V supply may be wired in alongside the signal lines, and that's the only thing likely to fry something.
Overall, please note that the project you describe here is very difficult and not a beginner task. It sounds as if you have next to no experience of electronics/embedded systems, so I would recommend picking a far simpler project.
Furthermore, modifying car electronics or installing your own electronics in a car is illegal in most parts of the world. Third party type approvals with EMC tests are mandatory (and very expensive). If your car is involved in an accident and they find custom electronics without type approval in it, you could be facing serious legal consequences.
I like software and I am comfortable writing Lua code for the NodeMCU devkit, but I am not a hardware guy.
I need to set up the NodeMCU as a standalone module, i.e. powered by a battery. What is the best way to do this so that the whole thing is as small as possible? Which battery should I use (I need to run the NodeMCU continuously for 2 hours with Wi-Fi on)? How do I connect the battery to the NodeMCU, i.e. is any regulator needed?
Right now I am powering the NodeMCU via USB. I searched Google for this but couldn't find a satisfactory solution.
I am using this nodemcu devkit
Please point me to the correct forum, if this is the wrong place to ask this question
The easiest would be to NOT use that devkit but one with a LiPo connector instead, e.g. https://www.adafruit.com/product/2821. An alternative would be a WeMos D1 mini with a LiPo/battery shield.
If you want to stick with your devkit, you "just" have to make sure you feed 3.3 V to the 3.3 V pin. Since a battery's voltage is not stable over its discharge, you need to place some kind of buck/boost converter (step-down/step-up) between the battery and the module.
You could of course also target the VIN pin, which expects 5 V, but that would be less efficient with a LiPo battery: first the external converter would boost the LiPo's ~3.7 V up to 5 V, then the devkit's onboard regulator would have to bring it back down to 3.3 V.
Furthermore, if you feed the ESP8266 from a battery you want to pay attention to any power lost along the way. With an L7805 linear regulator, for example, there is a constant quiescent current draw of around 6 mA. That may be OK if your source is a power adapter, but it is less ideal when it's a battery. Switching buck/boost converters are more efficient.
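To put rough numbers on the battery question (these figures are ballpark assumptions, not measurements): an ESP8266 with Wi-Fi associated and occasionally transmitting averages very roughly 80 to 150 mA, with short transmit peaks above 300 mA. Two hours at 150 mA is about 300 mAh; allowing for roughly 85 % converter efficiency and some headroom, a 500 to 1000 mAh LiPo plus a small buck/boost board should comfortably cover a 2-hour run.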
If you just need it to stand alone, would you consider using a phone charger? Simply plug the USB cable into any phone charger you can find. That way it will provide stable power for as long as you want.
If you really want to use a battery, you will need a regulator; a 7805 IC will step the voltage down to 5 V, which you then feed to the VIN pin on the NodeMCU. Note that a 7805 needs an input a couple of volts above 5 V to regulate properly (so something like a 9 V battery rather than a single LiPo cell). You should then get 3.3 V out of the 3V3 pin.
Is it possible to associate a single wireless network interface controller (WNIC) with multiple wireless access points (WAPs) at a time? If not, why not?
I've never heard of such a feature, so I assume it's technically impossible, or fairly difficult and rarely implemented. Is it really that difficult/impossible to implement a driver providing such a feature? Is it a software or a hardware difficulty?
I assume that the TCP/IP protocol specifications don't limit us at all, because if I attach multiple WNICs to my computer I can easily connect to multiple APs.
If it's a software difficulty, then what's the actual problem? Does the Linux/Windows kernel or the WNIC's driver limit it? Or maybe system libraries (like libc on GNU/Linux systems)?
If it's a hardware difficulty, what actually limits us? Antennas? Using a single radio frequency at a time? If so, then why can't we implement frequency hopping (like Kismet does)? Because of packets lost during the time spent on other channels? If so, then can we associate a WNIC with multiple routers working on the same channel (I know that channel overlapping is bad)?
Note: I'm not talking about dual-band routers. I assume we are considering the most common WNICs and APs, which both work on 2.4 GHz channels. If I have to put my question into an OS context, then I choose GNU/Linux.
Yes. The basic technique is that the client tells AP 'A' that it is going to sleep and then talks to AP 'B' while A is buffering frames for it.
Microsoft research worked this out a while ago:
http://research.microsoft.com/en-us/um/redmond/projects/virtualwifi/
Many low-level drivers support Wi-Fi interface virtualization (e.g. the BRCM wl command has options which support this).
Apple's AirDrop and MultiPeer features for OS X and iOS use a similar technique, but instead of talking to a 2nd AP they talk to a peer device.
I have seen several projects now which derive novel spatial information from radio data collected from a typical wireless router:
http://wisee.cs.washington.edu/
http://www.extremetech.com/extreme/133936-using-wifi-to-see-through-walls
The idea of using a wireless router as a sort of passive radar is fantastic.
I am very interested in experimenting with data collected from a wireless router myself, but there is little information on how to go about actually interfacing with a wireless router and getting a raw stream of the information collected by the device. Similar questions have been asked on here before, but I have yet to see a satisfactory answer.
I don't have the rep points necessary to link to the other questions but see:
'Capture Raw Signal from WiFi card as You Would a Sound Card'
'raw wifi “signal data” access'
I am looking for a solution that would let me use a low-cost device such as the oh so common WRT54G wireless router. If your answer involves custom radio hardware, you needn't bother posting.
As far as I know, the only option using commodity hardware is the Intel 5300 Wi-Fi card. You can get the complex CSI (channel state information, i.e. amplitude and phase) from its three antennas for a sample of the OFDM subcarriers. You can take a look at this site:
http://dhalperi.github.io/linux-80211n-csitool/
If you read the WiSee research paper you will find the platform they use for the system: a USRP N210 from Ettus Research plus GNU Radio software.
So they are not using your usual Wi-Fi AP, but the SDR approach this question also hints at.
Wi-Fi devices are built to handle the physical layer in silicon, and monitor mode is the best thing you can get without going down the SDR path. You can get quite a lot of information from it: the radiotap header contains, for example, the received signal strength and the receiving antenna. But if you really want to explore the physical layer of Wi-Fi, commodity hardware is not going to cut it.
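As an illustration of how far monitor mode gets you on a stock Linux machine: assuming you already have a monitor-mode interface (the name mon0 below is an assumption; create it with airmon-ng or iw), a minimal libpcap sketch that pulls in frames together with their radiotap headers could look like this.

```c
/* Minimal sketch: capture 802.11 frames with radiotap headers on a Linux
 * interface that is already in monitor mode. Build with: gcc sniff.c -lpcap */
#include <pcap.h>
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    char errbuf[PCAP_ERRBUF_SIZE];
    pcap_t *p = pcap_open_live("mon0", 2048, 1, 1000, errbuf);  /* adjust iface */
    if (!p) { fprintf(stderr, "pcap: %s\n", errbuf); return 1; }

    if (pcap_datalink(p) != DLT_IEEE802_11_RADIO) {
        fprintf(stderr, "interface is not delivering radiotap headers\n");
        return 1;
    }

    for (int i = 0; i < 10; i++) {
        struct pcap_pkthdr hdr;
        const u_char *pkt = pcap_next(p, &hdr);
        if (!pkt || hdr.caplen < 4) continue;

        /* Radiotap header: version (1), pad (1), length (2, little-endian), ... */
        uint16_t rt_len = (uint16_t)(pkt[2] | (pkt[3] << 8));
        printf("frame of %u bytes, radiotap header %u bytes\n", hdr.len, rt_len);
        /* Per-field parsing (RSSI, antenna, ...) needs the it_present
         * bitmask and the alignment rules from the radiotap spec. */
    }
    pcap_close(p);
    return 0;
}
```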
My fellow students and I are choosing a simple microcontroller to do very basic image processing. We are basically trying to implement template matching to find a set of objects in specific portions of the image. We'd like to connect a webcam to the microcontroller to take the pictures and look for the objects. We also require basic wireless communication (e.g. Bluetooth or Wi-Fi).
I don't think we will have the luxury of using a state-of-the-art microcontroller, more likely something that's been around for a while (due to budget constraints). Could anyone please advise on which specs of the microcontroller would be most relevant for the above task (e.g. CPU, MIPS, etc.)?
Thanks a lot!
For this kind of task, I would say the amount of RAM is the most relevant spec.
A microcontroller with an external memory interface allows you to extend the data space with additional SRAM to hold your image data.
Also note that memory is needed for any protocol stacks you need to implement (Bluetooth, and even more so TCP/IP).
You probably want to have total RAM in tens of kilobytes, preferably 100+ kB.
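To make the RAM argument concrete, here is a rough sketch of template matching by sum of absolute differences (SAD) on 8-bit grayscale data; the image and template sizes are made-up examples. Even a QVGA frame (320 x 240 at 8 bits per pixel, about 75 kB) is more than many small microcontrollers have on chip, which is where the external memory interface mentioned above comes in.

```c
#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

/* Rough sketch: exhaustive template matching by sum of absolute
 * differences (SAD) on 8-bit grayscale data. Dimensions are
 * illustrative only. */
#define IMG_W  320
#define IMG_H  240
#define TPL_W  16
#define TPL_H  16

/* Finds the top-left corner of the best match. */
void match_template(const uint8_t img[IMG_H][IMG_W],
                    const uint8_t tpl[TPL_H][TPL_W],
                    int *best_x, int *best_y)
{
    uint32_t best_sad = UINT32_MAX;
    *best_x = 0;
    *best_y = 0;

    for (int y = 0; y <= IMG_H - TPL_H; y++) {
        for (int x = 0; x <= IMG_W - TPL_W; x++) {
            uint32_t sad = 0;
            /* Abort this candidate early once it is worse than the best so far */
            for (int ty = 0; ty < TPL_H && sad < best_sad; ty++)
                for (int tx = 0; tx < TPL_W; tx++)
                    sad += (uint32_t)abs((int)img[y + ty][x + tx] -
                                         (int)tpl[ty][tx]);
            if (sad < best_sad) {
                best_sad = sad;
                *best_x = x;
                *best_y = y;
            }
        }
    }
}
```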
It is also nice to have plenty of program memory available when learning and experimenting. Later on you can try to optimize and squeeze your code into a more confined device.
As for the architecture, choose something you can easily find development tools and examples for. ARM, AVR and PIC are all good candidates, among others.
Also find out what interfaces you need in order to:
- control the camera (e.g. I2C or SPI)
- read pixel data (e.g. parallel or analog)
Connecting directly to a webcam's USB interface would not be a straightforward task, as the microcontroller would need to act as a USB host.
Good luck with your project!
You may need a microcontroller with the following features:
USB 2.0 Host controller
~1.2 MB of memory for a frame buffer: 640 x 480 x 2 (bytes per pixel) x 2 (double buffering)
(you may use a lower resolution if there is not enough memory)
Wi-Fi controller
CPU power strong enough for your task
Readily available open-source code
It seems that Broadcom controllers may be useful here.
Also, you can buy an off-the-shelf Wi-Fi router with a USB port and use it for your project
(e.g. a Linksys E3000).