The base64-encoded output from Arduino HMAC-SHA1 does not match Java/Python/online tools - Twitter

I am working on an Arduino project that requires OAuth 1.0-based authentication to connect to the cloud. This is similar to [Authorizing a request to Twitter API][1], and I am stuck at the step of [Creating a signature][2]. The whole process of creating a signature requires algorithms like URL encoding, base64 encoding, and HMAC-SHA1. In my Arduino project I use the Cryptosuite(link 3) library for HMAC-SHA1 and the arduino-base64(link 4) library for base64 encoding. Both of them work fine separately. However, I need a base64-formatted output of the HMAC-SHA1, so I have tried this:
#include <avr/pgmspace.h>
#include <sha1.h>
#include <Base64.h>

uint8_t *in, out, i;
char b64[29];
static const char PROGMEM b64chars[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

char key[] = "testKey";
char basestring[] = "testing";

void printHash(uint8_t* hash) {
  int i;
  for (i = 0; i < 20; i++) {
    Serial.print("0123456789abcdef"[hash[i] >> 4]);
    Serial.print("0123456789abcdef"[hash[i] & 0xf]);
  }
  Serial.println();
}

void setup() {
  Serial.begin(115200);
  Serial.print("Result:");
  Sha1.initHmac((uint8_t*)key, strlen(key));
  Sha1.print(basestring);
  printHash(Sha1.resultHmac());
  Serial.println();

  // encoding
  char* input;
  input = (char*)(Sha1.resultHmac());
  int inputLen = strlen(input);
  int encodedLen = base64_enc_len(inputLen);
  char encoded[encodedLen];
  // note input is consumed in this step: it will be empty afterwards
  base64_encode(encoded, input, inputLen);
  Serial.print("base64 result: ");
  Serial.println(encoded);
}
void loop() {
}
The output of printHash that I got is 60d41271d43b875b791e2d54c34bf3f018a29763, which is exactly the same as the online verification tool(link 5).
However, I expected to get YNQScdQ7h1t5Hi1Uw0vz8Biil2M= for the base64 result, but I got L18B0HicKRhuxmB6SIFpZP+DpHxU, which seems wrong. I have also written a Java program and a Python program, and both say that the base64 result should be YNQScdQ7h1t5Hi1Uw0vz8Biil2M=.
I also found this post: Issues talking between Arduino SHA1-HMAC and base64 encoding and Python(link 6), and I have also tried the tidy function it mentions from Adafruit-Tweet-Receipt(link 7):
// base64-encode SHA-1 hash output. This is NOT a general-purpose base64
// encoder! It's stripped down for the fixed-length hash -- always 20
// bytes input, always 27 chars output + '='.
for(in = Sha1.resultHmac(), out = 0; ; in += 3) { // octets to sextets
  b64[out++] = in[0] >> 2;
  b64[out++] = ((in[0] & 0x03) << 4) | (in[1] >> 4);
  if(out >= 26) break;
  b64[out++] = ((in[1] & 0x0f) << 2) | (in[2] >> 6);
  b64[out++] = in[2] & 0x3f;
}
b64[out] = (in[1] & 0x0f) << 2;
// Remap sextets to base64 ASCII chars
for(i=0; i<=out; i++) b64[i] = pgm_read_byte(&b64chars[b64[i]]);
b64[i++] = '=';
b64[i++] = 0;
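For reference, the encoded string then sits in b64 and can be printed directly (a small sketch, using the declarations from the first listing above):
Serial.print("base64 result: ");
Serial.println(b64); // 27 chars + '=', i.e. YNQScdQ7h1t5Hi1Uw0vz8Biil2M= for the hash above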
Is there any mistake I've made here?
Thanks!

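The root cause is the length calculation: Sha1.resultHmac() returns 20 raw bytes that are not NUL-terminated and may even contain 0x00 bytes, so strlen() on that buffer is unreliable. Here it evidently counted 21 bytes, which is why the wrong result is 28 characters with no '=' padding. A minimal sketch of the corrected encoding step, assuming the same Cryptosuite and arduino-base64 APIs used above:
uint8_t* hash = Sha1.resultHmac();        // always 20 raw bytes, not NUL-terminated
const int hashLen = 20;                   // fixed digest length; never strlen() binary data
int encodedLen = base64_enc_len(hashLen); // 28 base64 chars for 20 bytes of input
char encoded[encodedLen + 1];             // +1 in case the library does not count the trailing NUL
base64_encode(encoded, (char*)hash, hashLen);
Serial.println(encoded);                  // expected: YNQScdQ7h1t5Hi1Uw0vz8Biil2M=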
So the full example becomes:
#include <avr/pgmspace.h>
#include <sha1.h>
#include <Base64.h>

char key[] = "testKey";
char basestring[] = "testing";

void printHash(uint8_t* hash) {
  for (int i = 0; i < 20; i++) {
    Serial.print("0123456789abcdef"[hash[i] >> 4]);
    Serial.print("0123456789abcdef"[hash[i] & 0xf]);
  }
  Serial.println();
}

void setup() {
  Serial.begin(115200);
  Serial.print("Input: ");
  Serial.println(basestring);
  Serial.print("Key: ");
  Serial.println(key);
  Serial.print("Hmac-sha1 (hex): ");
  Sha1.initHmac((uint8_t*)key, strlen(key));
  Sha1.print(basestring);
  uint8_t *hash;
  hash = Sha1.resultHmac();
  printHash(hash);

  // base64 encoding
  char* input = (char*) hash;
  int inputLen = 20; // the HMAC-SHA1 digest is always 20 raw bytes; strlen() is not reliable on binary data
  int encodedLen = base64_enc_len(inputLen);
  char encoded[encodedLen];
  // note input is consumed in this step: it will be empty afterwards
  base64_encode(encoded, input, inputLen);
  Serial.print("Hmac-sha1 (base64): ");
  Serial.println(encoded);
}

void loop() { }
which outputs:
Input: testing
Key: testKey
Hmac-sha1 (hex): 60d41271d43b875b791e2d54c34bf3f018a29763
Hmac-sha1 (base64): YNQScdQ7h1t5Hi1Uw0vz8Biil2M=

Related

How to edit this code to view power consumption in Google Firebase?

#include <ESP8266WiFi.h>
#include <WiFiClient.h>
#include <ThingSpeak.h>
const char* ssid = "Wifi_Project"; //Your Network SSID
const char* password = "1111aaaa"; //Your Network Password
WiFiClient client;
unsigned long myChannelNumber = 1947152; //Your Channel Number (Without Brackets)
const char * myWriteAPIKey = "QA02MFMGFJVIDBZR"; //Your Write API Key
#include <SoftwareSerial.h>
SoftwareSerial SMESerial (D6, D7);
int xVal, yVal, zVal;
float xtotal,ytotal,ztotal,xprice,yprice,zprice;
void setup() {
Serial.begin(9600);
SMESerial.begin(9600);
WiFi.begin(ssid, password);
ThingSpeak.begin(client);
}
void loop() {
if (SMESerial.available()<1) return;
char R=SMESerial.read();
int data=SMESerial.parseInt();
if (R == 'x')
xVal = data;
xtotal = xVal + xtotal;
xprice = xtotal*0.1;
Serial.print("X = ");
Serial.println(xtotal);
ThingSpeak.setField(1, xtotal);
ThingSpeak.setField(2, xprice);
if (R == 'y')
yVal = data;
ytotal = yVal + ytotal;
yprice = ytotal*0.1;
Serial.print("Y = ");
Serial.println(ytotal);
ThingSpeak.setField(3, ytotal);
ThingSpeak.setField(4, yprice);
if (R == 'z')
zVal = data;
ztotal = zVal + ztotal;
zprice = ztotal*0.1;
Serial.print("Z = ");
Serial.println(ztotal);
ThingSpeak.setField(5, ztotal);
ThingSpeak.setField(6, zprice);
ThingSpeak.writeFields(myChannelNumber, myWriteAPIKey);
}
I've made a wireless sensor network project using 3 slave NodeMCU ESP8266 boards to collect current readings with ACS712 sensors and send them to a master NodeMCU ESP8266. The master NodeMCU should send the data to a phone so it can be viewed in a mobile application. I used ThingSpeak at first to view the power-consumption data, but I want to use Google Firebase instead so the data is easier to view in my application. Need help please.
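For the Firebase side, one option is the Realtime Database. A minimal sketch for the master NodeMCU (assuming the legacy FirebaseArduino client library for ESP8266; FIREBASE_HOST, FIREBASE_AUTH, and the database paths are placeholders for your own project's values):
#include <ESP8266WiFi.h>
#include <FirebaseArduino.h>
#define FIREBASE_HOST "your-project.firebaseio.com" // placeholder
#define FIREBASE_AUTH "your-database-secret"        // placeholder
const char* ssid = "Wifi_Project";
const char* password = "1111aaaa";
float xtotal, xprice; // filled in from the SoftwareSerial parsing shown above
void setup() {
  Serial.begin(9600);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(500);
  Firebase.begin(FIREBASE_HOST, FIREBASE_AUTH);
}
void loop() {
  // write the running totals to the database instead of ThingSpeak fields
  Firebase.setFloat("power/xtotal", xtotal);
  Firebase.setFloat("power/xprice", xprice);
  if (Firebase.failed()) {
    Serial.print("Firebase write failed: ");
    Serial.println(Firebase.error());
  }
  delay(15000);
}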

Posting an image taken by an ESP32-CAM to Clarifai not working

I have the following ESP32-CAM sketch that should take a picture and post it to Clarifai:
#include "Arduino.h"
#include "esp_camera.h"
#include <WiFi.h>
#include <WiFiClientSecure.h>
#include <base64.h>
#include <HTTPClient.h>
#include <ArduinoJson.h>
// Select camera model
//#define CAMERA_MODEL_WROVER_KIT // Has PSRAM
//#define CAMERA_MODEL_ESP_EYE // Has PSRAM
//#define CAMERA_MODEL_M5STACK_PSRAM // Has PSRAM
//#define CAMERA_MODEL_M5STACK_WIDE // Has PSRAM
#define CAMERA_MODEL_AI_THINKER // Has PSRAM
//#define CAMERA_MODEL_TTGO_T_JOURNAL // No PSRAM
//CAMERA_MODEL_AI_THINKER
#define PWDN_GPIO_NUM 32
#define RESET_GPIO_NUM -1
#define XCLK_GPIO_NUM 0
#define SIOD_GPIO_NUM 26
#define SIOC_GPIO_NUM 27
#define Y9_GPIO_NUM 35
#define Y8_GPIO_NUM 34
#define Y7_GPIO_NUM 39
#define Y6_GPIO_NUM 36
#define Y5_GPIO_NUM 21
#define Y4_GPIO_NUM 19
#define Y3_GPIO_NUM 18
#define Y2_GPIO_NUM 5
#define VSYNC_GPIO_NUM 25
#define HREF_GPIO_NUM 23
#define PCLK_GPIO_NUM 22
const char* ssid = "mySSID";
const char* password = "myPass";
void setup() {
Serial.begin(115200);
Serial.setDebugOutput(true);
Serial.println();
camera_config_t config;
config.ledc_channel = LEDC_CHANNEL_0;
config.ledc_timer = LEDC_TIMER_0;
config.pin_d0 = Y2_GPIO_NUM;
config.pin_d1 = Y3_GPIO_NUM;
config.pin_d2 = Y4_GPIO_NUM;
config.pin_d3 = Y5_GPIO_NUM;
config.pin_d4 = Y6_GPIO_NUM;
config.pin_d5 = Y7_GPIO_NUM;
config.pin_d6 = Y8_GPIO_NUM;
config.pin_d7 = Y9_GPIO_NUM;
config.pin_xclk = XCLK_GPIO_NUM;
config.pin_pclk = PCLK_GPIO_NUM;
config.pin_vsync = VSYNC_GPIO_NUM;
config.pin_href = HREF_GPIO_NUM;
config.pin_sscb_sda = SIOD_GPIO_NUM;
config.pin_sscb_scl = SIOC_GPIO_NUM;
config.pin_pwdn = PWDN_GPIO_NUM;
config.pin_reset = RESET_GPIO_NUM;
config.xclk_freq_hz = 20000000;
config.pixel_format = PIXFORMAT_JPEG;
// if PSRAM IC present, init with UXGA resolution and higher JPEG quality
// for larger pre-allocated frame buffer.
if(psramFound()){
config.frame_size = FRAMESIZE_QVGA;
config.jpeg_quality = 10;
config.fb_count = 2;
} else {
config.frame_size = FRAMESIZE_QVGA;
config.jpeg_quality = 12;
config.fb_count = 1;
}
#if defined(CAMERA_MODEL_ESP_EYE)
pinMode(13, INPUT_PULLUP);
pinMode(14, INPUT_PULLUP);
#endif
// camera init
esp_err_t err = esp_camera_init(&config);
if (err != ESP_OK) {
Serial.printf("Camera init failed with error 0x%x", err);
return;
}
#if defined(CAMERA_MODEL_M5STACK_WIDE)
s->set_vflip(s, 1);
s->set_hmirror(s, 1);
#endif
WiFi.begin(ssid, password);
while (WiFi.status() != WL_CONNECTED) {
delay(500);
Serial.print(".");
}
Serial.println("");
Serial.println("WiFi connected");
classifyImage();
Serial.println("\nSleep....");
esp_deep_sleep_start();
}
void loop(){
}
void classifyImage() {
String response;
// Capture picture
camera_fb_t * fb = NULL;
fb = esp_camera_fb_get();
if(!fb) {
Serial.println("Camera capture failed");
return;
} else {
Serial.println("Camera capture OK");
}
size_t size = fb->len;
String buffer = base64::encode((uint8_t *) fb->buf, fb->len);
String imgPayload = "{\"inputs\": [{ \"data\": {\"image\": {\"base64\": \"" + buffer + "\"}}}]}";
buffer = "";
// Uncomment this if you want to show the payload
Serial.println(imgPayload);
esp_camera_fb_return(fb);
// Generic model
String model_id = "General";
HTTPClient http;
http.begin("https://api.clarifai.com/v2/models/" + model_id + "/outputs");
http.addHeader("Content-Type", "application/json");
http.addHeader("Authorization", "c7f894790533332388e23d4d21278321");
int httpResponseCode = http.POST(imgPayload);
if(httpResponseCode>0){
Serial.print(httpResponseCode);
Serial.print(" Returned String: ");
Serial.println(http.getString());
} else {
Serial.print("POST Error: ");
Serial.print(httpResponseCode);
}
// Parse the json response: Arduino assistant
const int jsonSize = JSON_ARRAY_SIZE(1) + JSON_ARRAY_SIZE(20) + 3*JSON_OBJECT_SIZE(1) + 6*JSON_OBJECT_SIZE(2) + JSON_OBJECT_SIZE(3) + 20*JSON_OBJECT_SIZE(4) + 2*JSON_OBJECT_SIZE(6);
StaticJsonDocument<jsonSize> doc;
// Deserialize the JSON document
DeserializationError error = deserializeJson(doc, response);
// Test if parsing succeeds.
if (error) {
Serial.print(F("deserializeJson() failed: "));
Serial.println(error.f_str());
return;
}
Serial.println(jsonSize);
Serial.println(response);
for (int i=0; i < 10; i++) {
// const name = doc["outputs"][0]["data"]["concepts"][i]["name"];
// const float p = doc["outputs"][0]["data"]["concepts"][i]["value"];
const char* name = doc["outputs"][0]["data"]["concepts"][i]["name"];
const char* p = doc["outputs"][0]["data"]["concepts"][i]["value"];
Serial.println("=====================");
Serial.print("Name:");
Serial.println(name[i]);
Serial.print("Prob:");
Serial.println(p);
Serial.println();
}
}
It posts the image to Clarifai, but what I get in return is:
-400 Returned String: {"status":{"code":11102,"description":"Invalid request","details":"Empty or malformed authorization header. Please provide an API key or session token.","req_id":"39d7b4f1b7ad489fb3a9a878000f6e88"},"outputs":[]}
-deserializeJson() failed: EmptyInput
What I need is to confirm if the HTTP POST request is formatted properly.
The problem is not the formatting of your POST request; it's that your Authorization header is incorrect (as the error "Empty or malformed authorization header" indicates).
As the Clarifai documentation indicates, the Authorization header should be:
Authorization: Key YOUR_API_KEY
Your code is sending
Authorization: YOUR_API_KEY
Change the line that sets the Authorization header to put "Key " before the API key.
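In the sketch above that would be:
http.addHeader("Authorization", "Key YOUR_API_KEY"); // note the "Key " prefix before the API key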
Given that the ESP32 is a fussy environment where a lot can go wrong with an HTTP request, a good way to debug these problems is to use the curl utility to attempt the same operation from a more full-featured environment. In this case, on a Mac or Linux machine you could run
curl -X POST -H 'Authorization: Key YOUR_API_KEY' -H 'Content-Type: application/json' -d @payload.json https://api.clarifai.com/v2/models/MODEL_ID/outputs
where payload.json contains the same JSON body (with the base64-encoded test image) that the sketch builds. Then you can be sure the POST request is correct and work out what other things might be wrong.
Also it appears that you may have posted your API key to the Internet. If that's the case, I'd recommend invalidating the one in the code you posted and generating a new one.

Yaml File Format used in OpenCV

Here is an example (part of the code) from OpenCV (https://docs.opencv.org/3.4.0/d4/da4/group__core__xml.html):
test.yml
%YAML:1.0
---
frameCount: 5
read.cpp
#include "opencv2/opencv.hpp"
#include <time.h>
using namespace cv;
using namespace std;
int main()
{
FileStorage fs("test.yml", FileStorage::READ);
int frameCount = (int) fs["frameCount"];
cout << frameCount << endl;
return 0;
}
The code as given works well and reads the YAML file. But when I remove %YAML:1.0, it throws:
OpenCV Error: Unknown error code -49 (Input file is empty) in cvOpenFileStorage, file /home/pengfei/Documents/opencv-3.3.1/modules/core/src/persistence.cpp, line 4484
terminate called after throwing an instance of 'cv::Exception'
what(): /home/pengfei/Documents/opencv-3.3.1/modules/core/src/persistence.cpp:4484: error: (-49) Input file is empty in function cvOpenFileStorage
[1] 27146 abort (core dumped) ./Read
But I checked yaml.org, and there is no rule stating that %YAML:1.0 is necessary. (http://yaml.org/start.html)
**My questions are (updated according to the answer from @zindarod):**
1. Is this an OpenCV-specific feature?
Yes, this is an OpenCV-specific requirement.
const char* yaml_signature = "%YAML";
const char* json_signature = "{";
const char* xml_signature = "<?xml";
OpenCV checks the file signature and then decides how to interpret the file.
2. How do I know which YAML version I should use?
The YAML version doesn't matter much, but it's better to use the 1.0 specification; OpenCV may not be able to parse newer ones.
In OpenCV 3.3.1, in the function cvOpenFileStorage located at modules/core/src/persistence.cpp:
...
else
{
if( mem )
{
fs->strbuf = filename;
fs->strbufsize = fnamelen;
}
size_t buf_size = 1 << 20;
const char* yaml_signature = "%YAML";
const char* json_signature = "{";
const char* xml_signature = "<?xml";
char buf[16];
icvGets( fs, buf, sizeof(buf)-2 );
char* bufPtr = cv_skip_BOM(buf);
size_t bufOffset = bufPtr - buf;
if(strncmp( bufPtr, yaml_signature, strlen(yaml_signature) ) == 0)
fs->fmt = CV_STORAGE_FORMAT_YAML;
else if(strncmp( bufPtr, json_signature, strlen(json_signature) ) == 0)
fs->fmt = CV_STORAGE_FORMAT_JSON;
else if(strncmp( bufPtr, xml_signature, strlen(xml_signature) ) == 0)
fs->fmt = CV_STORAGE_FORMAT_XML;
else if(fs->strbufsize == bufOffset)
CV_Error(CV_BADARG_ERR, "Input file is empty");
...
OpenCV checks for a file signature (JSON, YAML, or XML). The YAML version does not matter, as long as the first line starts with the string "%YAML".
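For completeness, a minimal write-side sketch: producing the file with OpenCV's own FileStorage emits the %YAML:1.0 signature line automatically.
#include "opencv2/opencv.hpp"
using namespace cv;
int main()
{
    // FileStorage in WRITE mode writes the %YAML:1.0 header for .yml files
    FileStorage fs("test.yml", FileStorage::WRITE);
    fs << "frameCount" << 5;
    fs.release();
    return 0;
}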

One function gives several results in Swift

I have a method in Objective-C which I call from Swift. It worked pretty well in Swift 2, but in Swift 3 the behaviour has changed: it gives me 3 different results, even though I pass the same parameters.
Sometimes it doesn't find pfile, sometimes it fails on the PIN check, and sometimes it works fine and gives me the x509.
char* ParsePKCS12(unsigned char* pkcs12_path, unsigned char * pin) {
printf("PARSE PATH: %s\n", pkcs12_path);
printf("PASSWORD: %s\n", pin);
NSString *pfile = [NSString stringWithUTF8String:pkcs12_path];
FILE *fp;
PKCS12 *p12;
EVP_PKEY *pkey;
X509 *cert;
BIO *databio = BIO_new(BIO_s_mem());
STACK_OF(X509) *ca = NULL;
if([[NSFileManager defaultManager] fileExistsAtPath:pfile]) {
NSLog(@"ok, pfile exists!");
} else {
NSLog(@"error, pfile does not exists!");
return "-1";
}
OpenSSL_add_all_algorithms();
ERR_load_crypto_strings();
fp = fopen([pfile UTF8String], "rb");
p12 = d2i_PKCS12_fp(fp, NULL);
fclose (fp);
if (!p12) {
fprintf(stderr, "Error reading PKCS#12 file\n");
ERR_print_errors_fp(stderr);
return "-1";
}
if (!PKCS12_parse(p12, (const char *)pin, &pkey, &cert, &ca)) { //Error at parsing or pin error
fprintf(stderr, "Error parsing PKCS#12 file\n");
ERR_print_errors_fp(stderr);
ERR_print_errors(databio);
return "-1";
}
BIO *bio = NULL;
char *pem = NULL;
if (NULL == cert) {
//return NULL;
return "-1";
}
bio = BIO_new(BIO_s_mem());
if (NULL == bio) {
return "-1";
}
if (0 == PEM_write_bio_X509(bio, cert)) {
BIO_free(bio);
//return NULL;
}
pem = (char *) malloc(bio->num_write + 1);
if (NULL == pem) {
BIO_free(bio);
return "-1";
}
memset(pem, 0, bio->num_write + 1);
BIO_read(bio, pem, bio->num_write);
BIO_free(bio);
PKCS12_free(p12);
return pem;
}
I call this code in Swift like this:
self.x509 = String(cString:ParsePKCS12(UnsafeMutablePointer<UInt8>(mutating: self.path),
UnsafeMutablePointer<UInt8>(mutating: "123456"))!)
Your call
self.x509 = String(cString:ParsePKCS12(UnsafeMutablePointer<UInt8>(mutating: self.path),
UnsafeMutablePointer<UInt8>(mutating: "123456"))!)
does not work reliably because in both
UnsafeMutablePointer<UInt8>(mutating: someSwiftString)
calls, the compiler creates a temporary C string representation of
the Swift string and passes that to the function. But that C string
is only valid until the UnsafeMutablePointer constructor returns, which means that the second
string conversion can overwrite the first, or cause other undefined
behaviour.
The simplest solution would be to change the C function to
take constant C strings (and use the default signedness):
char* ParsePKCS12(const char * pkcs12_path, const char * pin)
Then you can simply call it as
self.x509 = String(cString: ParsePKCS12(self.path, "123456"))
and the compiler creates temporary C strings which are valid during
the call of ParsePKCS12().
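A minimal sketch of the reworked C function under that change, trimmed to the load-and-parse steps to show that no NSString/NSFileManager bridging is needed (the PEM conversion would continue exactly as in the original):
// same OpenSSL headers as the original file (openssl/pkcs12.h, openssl/pem.h, openssl/err.h)
char* ParsePKCS12(const char *pkcs12_path, const char *pin) {
    FILE *fp = fopen(pkcs12_path, "rb");
    if (fp == NULL) return "-1";           // file does not exist or is not readable
    PKCS12 *p12 = d2i_PKCS12_fp(fp, NULL);
    fclose(fp);
    if (!p12) return "-1";                 // not a valid PKCS#12 file
    EVP_PKEY *pkey = NULL;
    X509 *cert = NULL;
    STACK_OF(X509) *ca = NULL;
    if (!PKCS12_parse(p12, pin, &pkey, &cert, &ca)) {
        PKCS12_free(p12);
        return "-1";                       // wrong PIN or parse error
    }
    PKCS12_free(p12);
    // ... convert cert to PEM with BIO_new/PEM_write_bio_X509 exactly as in the original ...
    return "-1";                           // placeholder; the full version returns the PEM buffer
}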

C# CRC-16-CCITT 0x8408 Polynomial. Help needed

I am new to communications programming. Basically, I need to get the hex equivalent of the CRC output. I have a hex string as the parameter:
EE0000000015202020202020202020202020323134373030353935
This is a concatenation of two strings. The output I need is E6EB in hex, or 59115 as a ushort. I tried different approaches based on what I found on the web, but to no avail. The polynomial I should be using is 0x8408, which is [CRC-16-CCITT][1]: http://en.wikipedia.org/wiki/Polynomial_representations_of_cyclic_redundancy_checks.
I tried this approach, CRC_CCITT Kermit 16 in C#, but the output is incorrect. I also tried the bitwise ~ operator, as some suggested for reverse computation, but that failed as well.
Any help is very much appreciated.
RevEng reports:
% ./reveng -s -w 16 EE0000000015202020202020202020202020323134373030353935e6eb
width=16 poly=0x1021 init=0xffff refin=true refout=true xorout=0xffff check=0x906e name="X-25"
So there's your CRC. Note that the CRC is reflected, where 0x8408 is 0x1021 reflected.
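For reference, a minimal C sketch of those X-25 parameters (poly 0x1021 reflected to 0x8408, init 0xFFFF, final XOR 0xFFFF); it reproduces the check value 0x906e that RevEng reports for the ASCII string "123456789":
#include <stdint.h>
#include <stdio.h>
#include <string.h>
/* bit-reflected CRC-16/X-25 */
static uint16_t crc16_x25(const uint8_t *data, size_t len)
{
    uint16_t crc = 0xFFFF;
    for (size_t i = 0; i < len; i++) {
        crc ^= data[i];
        for (int b = 0; b < 8; b++)
            crc = (crc & 1) ? (uint16_t)((crc >> 1) ^ 0x8408) : (uint16_t)(crc >> 1);
    }
    return (uint16_t)~crc;
}
int main(void)
{
    const char *check = "123456789";
    printf("0x%04X\n", crc16_x25((const uint8_t *)check, strlen(check))); /* prints 0x906E */
    return 0;
}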
I found a solution and I'll post it in case someone encounters the same problem.
private ushort CCITT_CRC16(string strInput)
{
ushort data;
ushort crc = 0xFFFF;
byte[] bytes = GetBytesFromHexString(strInput);
for (int j = 0; j < bytes.Length; j++)
{
crc = (ushort)(crc ^ bytes[j]);
for (int i = 0; i < 8; i++)
{
if ((crc & 0x0001) == 1)
crc = (ushort)((crc >> 1) ^ 0x8408);
else
crc >>= 1;
}
}
crc = (ushort)~crc;
data = crc;
crc = (ushort)((crc << 8) ^ (data >> 8 & 0xFF));
return crc;
}
private byte[] GetBytesFromHexString(string strInput)
{
Byte[] bytArOutput = new Byte[] { };
if (!string.IsNullOrEmpty(strInput) && strInput.Length % 2 == 0)
{
SoapHexBinary hexBinary = null;
try
{
hexBinary = SoapHexBinary.Parse(strInput);
if (hexBinary != null)
{
bytArOutput = hexBinary.Value;
}
}
catch (Exception ex)
{
throw ex;
}
}
return bytArOutput;
}
You need a using System.Runtime.Remoting.Metadata.W3cXsd2001; directive for SoapHexBinary.
