I'm doing the basic setup in Python to pass data to an InfluxDB server I have running on a Raspberry Pi. My issue is that the write_points() function does not write ANY data to InfluxDB, even though I am using the simplest possible measurement and field-set entry as a test:
from influxdb import InfluxDBClient
from influxdb_config import HOST, PORT, USERNAME, PASSWORD, DATABASE
from data_poll import quotes_response
import pprint
influxdbClient = InfluxDBClient(
    host=HOST,
    port=PORT,
    username=USERNAME,
    password=PASSWORD,
    database='example'
)
data = [
    {
        "measurement": "stock price",
        "fields": {
            "price": 0.64,
            "volume": 120
        }
    }
]
pprint.pprint(influxdbClient.ping())
pprint.pprint(influxdbClient.get_list_database())
influxdbClient.switch_database('example')
pprint.pprint(influxdbClient.write_points(data))
pprint.pprint(influxdbClient.query('SELECT * FROM example'))
I am able to communicate with the server via Python and, if I create values manually on the server, retrieve them in the same script. Below is a snippet of the terminal output that matches some of the requests in the above code snippet.
'1.8.4'
[{'name': '_internal'}, {'name': 'jsonAAPLDataTest'}, {'name': 'example'}]
True
ResultSet({})
Update 2021/03/14 - I'm currently using Python 3.9.2, but had the exact same issue utilizing 3.7.3 (tested by the API developers). My next attempt is to downgrade my InfluxDB instance from v1.8.4 to v1.7.4 to see if this, by chance, resolves the issue.
I was now able to write data to my InfluxDB v1.8.4 database using the proper API, github.com/influxdata/influxdb-client-python. Prior to this I was utilizing the previous-generation API, which apparently differs in the underlying functionality for writing to the database. Figured I would at least follow up and share the information so others would know in case they encounter this issue.
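For anyone landing here, a minimal sketch of the same write using influxdb-client-python in its InfluxDB 1.8 compatibility mode; the URL, credentials and the 'autogen' retention policy below are assumptions, so adjust them to your setup:

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# 1.8 compatibility: token is "username:password", org is "-",
# and the bucket is "database/retention_policy"
client = InfluxDBClient(url='http://raspberrypi:8086', token='user:pass', org='-')
write_api = client.write_api(write_options=SYNCHRONOUS)

point = Point('stock price').field('price', 0.64).field('volume', 120)
write_api.write(bucket='example/autogen', record=point)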
I'm trying to use a custom authorizer to authenticate a web client.
I have successfully created a dedicated Lambda and a custom authorizer. If I launch aws iot describe-authorizer --authorizer-name <authorizer-name> I can see:
{
    "authorizerDescription": {
        "authorizerName": "<authorizer-name>",
        "authorizerArn": "...",
        "authorizerFunctionArn": "...",
        "tokenKeyName": "<token-key-name>",
        "tokenSigningPublicKeys": {
            "<public-key-name>": "-----BEGIN PUBLIC KEY-----\n<public-key-content>\n-----END PUBLIC KEY-----"
        },
        "status": "ACTIVE",
        "creationDate": "...",
        "lastModifiedDate": "...",
        "signingDisabled": false,
        "enableCachingForHttp": false
    }
}
Moreover, I can test it successfully:
$ aws iot test-invoke-authorizer --authorizer-name '<authorizer-name>' --token '<public-key-name>' --token-signature '<private-key-content>'
{
    "isAuthenticated": true,
    "principalId": "...",
    "policyDocuments": [ "..." ],
    "refreshAfterInSeconds": 600,
    "disconnectAfterInSeconds": 3600
}
$
But I cannot connect using the browser.
I'm using aws-iot-device-sdk, and according to the SDK documentation I should set customAuthHeaders and/or customAuthQueryString (my understanding is that the latter should be used in a web environment due to a browser limitation) with the headers / query params X-Amz-CustomAuthorizer-Name, X-Amz-CustomAuthorizer-Signature and TestAuthorizerToken. But no matter what combination I set for these values, the IoT endpoint always closes the connection (I see a 1000 / 1005 code for the closed connection).
What I've written so far is
const CUSTOM_AUTHORIZER_NAME = '<authorizer-name>';
const CUSTOM_AUTHORIZER_SIGNATURE = '<private-key-content>';
const TOKEN_KEY_NAME = 'TestAuthorizerToken';
const TEST_AUTHORIZER_TOKEN = '<public-key-name>';
function f(k: string, v?: string, p: string = '&'): string {
    if (!v)
        return '';
    return `${p}${encodeURIComponent(k)}=${encodeURIComponent(v)}`;
}
const client = new device({
    region: '...',
    clientId: '...',
    protocol: 'wss-custom-auth' as any,
    host: '...',
    debug: true,
    // customAuthHeaders: {
    //     'X-Amz-CustomAuthorizer-Name': CUSTOM_AUTHORIZER_NAME,
    //     'X-Amz-CustomAuthorizer-Signature': CUSTOM_AUTHORIZER_SIGNATURE,
    //     [TOKEN_KEY_NAME]: TEST_AUTHORIZER_TOKEN
    // },
    customAuthQueryString: `${f('X-Amz-CustomAuthorizer-Name', CUSTOM_AUTHORIZER_NAME, '?')}${f('X-Amz-CustomAuthorizer-Signature', CUSTOM_AUTHORIZER_SIGNATURE)}${f(TOKEN_KEY_NAME, TEST_AUTHORIZER_TOKEN)}`,
} as any);
As you can see, I also started having doubts about the header names!
After running my code I see that the client tries to do a GET to the host with the querystring that I wrote.
I also see that IoT Core responds with a 101 Switching Protocols, then my client sends the CONNECT command to IoT via WebSocket, and then another packet goes from my browser to the backend system.
Then the connection is closed by IoT.
Looking at CloudWatch I cannot see any interaction with the Lambda; it's like the request is blocked.
My doubts are:
First of all, is it possible to connect via MQTT over WSS using only a custom authorizer, without Cognito/certificates? Keep in mind that I am able to use a Cognito identity pool without errors, but I need to remove it.
Is it correct that I just need to set the customAuthQueryString parameter? My understanding is that this is the one to use on the web.
What values should I set for the various headers/query params? X-Amz-CustomAuthorizer-Name is self-explanatory, but I'm not sure about X-Amz-CustomAuthorizer-Signature (is it correct to fill it with the content of my private key?). Moreover, I'm not sure about TestAuthorizerToken. Is it the correct key to set?
I've also tried to run the custom_authorizer_connect sample of the SDK v2, but it's still not working, and I've run out of ideas.
Turns out the problem was in the permissions set on the backend systems.
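For completeness, a minimal sketch of what the authorizer Lambda can return (Python here just for illustration); the response shape mirrors the test-invoke-authorizer output above, while the principal ID and the wide-open IoT policy are assumptions you would tighten in a real setup:

import json

def lambda_handler(event, context):
    # Token/signature validation omitted for brevity.
    policy = {
        'Version': '2012-10-17',
        'Statement': [{
            'Effect': 'Allow',
            'Action': ['iot:Connect', 'iot:Subscribe', 'iot:Publish', 'iot:Receive'],
            'Resource': ['*']  # illustrative only; scope this down
        }]
    }
    return {
        'isAuthenticated': True,
        'principalId': 'example-principal',  # hypothetical
        'policyDocuments': [json.dumps(policy)],
        'refreshAfterInSeconds': 600,
        'disconnectAfterInSeconds': 3600
    }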
I am getting this error message:
{
    "code": "Neo.TransientError.Database.DatabaseUnavailable",
    "message": "Requested database is not available. Requested database name: 'graph.db'."
}
while sending a request through the REST API, with this statement:
{
    "statements" : [ {
        "statement" : "CREATE (n) RETURN id(n)"
    } ]
}
I'm guessing you probably don't have a database called graph.db. That is the name of the file system directory, not the database. Unless you've set up a database yourself, use neo4j, which is the default.
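If you're calling the HTTP API directly, here is a minimal sketch of pointing the same statement at the default database in 4.x, assuming a local server on port 7474 and basic auth (adjust host and credentials to your setup):

import requests

# In Neo4j 4.x the transactional endpoint is scoped per database: /db/<name>/tx/commit
url = 'http://localhost:7474/db/neo4j/tx/commit'
payload = {'statements': [{'statement': 'CREATE (n) RETURN id(n)'}]}
response = requests.post(url, json=payload, auth=('neo4j', 'your-password'))
print(response.json())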
As Nigel said, it'll be because you don't have a graph by that name.
I ran into this issue recently, after upgrading from 3.5.3 to 4.1, and having to figure out some new behaviours (why yes, I have been living under a rock).
Read through the server's logs - it reports the database name while it's starting up. If you're using a Docker instance, as I am, docker logs <instance-id> is your friend.
I am using replace in my rollup configuration for Sapper, together with sapper-environment, to pass environment variables to the client side in Sapper. Is this secure? Is there a better/safer way to approach this?
Using this config below:
rollup.config.js
const sapperEnv = require('sapper-environment');
export default {
    client: {
        input: config.client.input(),
        output: config.client.output(),
        plugins: [
            replace({
                ...sapperEnv(),
                'process.browser': true,
                'process.env.NODE_ENV': JSON.stringify(mode)
            })
            ...
And then this allows me to use the variables in stores.js:
import { writable } from 'svelte/store';
import Client from 'shopify-buy';
const key = process.env.SAPPER_APP_SHOPIFY_KEY;
const domain = process.env.SAPPER_APP_SHOPIFY_DOMAIN;
// Initialize a client
const client = Client.buildClient({
    domain: domain,
    storefrontAccessToken: key
});
export { key, domain, client };
I have tried running this in server.js and passing the variables through the session data, but on the client side, no matter what I do, they always seem to return 'undefined'.
There are two questions here — a) is it secure, and b) why are the values undefined?
The answer to the first question is 'no'. Any time you include credentials in JavaScript that gets served to the client (or in session data), you're making those credentials available to anyone who knows how to look for them. If you need to avoid that, you'll need your server (or another server) to make requests on behalf of authenticated clients.
As for the second part, it's very hard to tell without a reproduction unfortunately!
import pandas as pd
from google.cloud import bigquery
import google.auth

# Create credentials with Drive & BigQuery API scopes
# Both APIs must be enabled for your project before running this code
credentials, project = google.auth.default(scopes=[
    'https://www.googleapis.com/auth/drive',
    'https://www.googleapis.com/auth/spreadsheets',
    'https://www.googleapis.com/auth/bigquery',
])
client = bigquery.Client(credentials=credentials, project=project)

# Configure the external data source and query job
external_config = bigquery.ExternalConfig('GOOGLE_SHEETS')
# Use a shareable link or grant viewing access to the email address you
# used to authenticate with BigQuery (this example Sheet is public)
sheet_url = (
    'https://docs.google.com/spreadsheets'
    '/d/1uknEkew2C3nh1JQgrNKjj3Lc45hvYI2EjVCcFRligl4/edit?usp=sharing')
external_config.source_uris = [sheet_url]
external_config.schema = [
    bigquery.SchemaField('name', 'STRING'),
    bigquery.SchemaField('post_abbr', 'STRING')
]
external_config.options.skip_leading_rows = 1  # optionally skip header row

table_id = 'BambooHRActiveRoster'
job_config = bigquery.QueryJobConfig()
job_config.table_definitions = {table_id: external_config}

# Get Top 10
sql = 'SELECT * FROM workforce.BambooHRActiveRoster LIMIT 10'
query_job = client.query(sql, job_config=job_config)  # API request
top10 = list(query_job)  # Waits for query to finish
print('There are {} states with names starting with W.'.format(len(top10)))
The error I get is:
BadRequest: 400 Error while reading table: workforce.BambooHRActiveRoster, error message: Failed to read the spreadsheet. Errors: No OAuth token with Google Drive scope was found.
I can pull data in from a BigQuery table created from CSV upload, but when I have a BigQuery table created from a linked Google Sheet, I continue to receive this error.
I have tried to replicate the sample in Google's documentation (Creating and querying a temporary table):
https://cloud.google.com/bigquery/external-data-drive
You are authenticating as yourself, which is generally fine for BQ if you have the correct permissions. Using tables linked to Google Sheets often requires a service account. Create one (or have your BI/IT team create one), and then you will have to share the underlying Google Sheet with the service account. Finally, you will need to modify your python script to use the service account credentials and not your own.
The quick way around this is to use the BQ interface, select * from the Sheets-linked table, and save the results to a new table, and query that new table directly in your python script. This works well if this is a one-time upload/analysis. If the data in the sheets will be changing consistently and you will need to routinely query the data, this is not a long-term solution.
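As a sketch of the tail end of that workaround: once the console has saved the results into a native table (the table name below is just a placeholder), default credentials are enough, since querying a native table does not touch Drive:

from google.cloud import bigquery

client = bigquery.Client()

# Query the materialized copy; no Drive scope is needed for a native table.
sql = 'SELECT * FROM workforce.BambooHRActiveRosterCopy LIMIT 10'
rows = list(client.query(sql))
print(len(rows))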
I solved the problem by adding the scope object to the client.
from google.cloud import bigquery
import google.auth

credentials, project = google.auth.default(scopes=[
    'https://www.googleapis.com/auth/drive',
    'https://www.googleapis.com/auth/bigquery',
])
CLIENT = bigquery.Client(project=project, credentials=credentials)
https://cloud.google.com/bigquery/external-data-drive
import pandas as pd
from google.oauth2 import service_account
from google.cloud import bigquery
# from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/drive', 'https://www.googleapis.com/auth/bigquery']
SERVICE_ACCOUNT_FILE = 'mykey.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)
delegated_credentials = credentials.with_subject('myserviceaccountt@domain.iam.gserviceaccount.com')
client = bigquery.Client(credentials=delegated_credentials, project=credentials.project_id)

sql = 'SELECT * FROM `myModel`'
DF = client.query(sql).to_dataframe()
You can try to update your default credentials through the console:
gcloud auth application-default login --scopes=https://www.googleapis.com/auth/userinfo.email,https://www.googleapis.com/auth/drive,https://www.googleapis.com/auth/cloud-platform
Is it possible to add a monitoring package through the SoftLayer API? On the portal, I can go into the Monitoring section and order a "Monitoring Package - Basic", which will associate it with that Virtual Guest.
Is it possible to do this either during the placeOrder call or after the initial placeOrder call (i.e if the customer wants to add Basic Monitoring after the server is provisioned).
I tried to look into examples, but they all assumed that there was a monitoring agent available, which wasn't the case for me. I also looked into Going Further with SoftLayer part 3, but I am not sure how to extract the Basic Monitoring package from the Product_Package service.
I'm using Python to do this, so any pointers on associating a monitoring service during creation or after creation would be very helpful.
Thanks in Advance!
try this:
"""
Order a Monitoring Package
Build a SoftLayer_Container_Product_Order_Monitoring_Package object for a new
monitoring order and pass it to the SoftLayer_Product_Order API service to order it
In this case we'll order a Basic (Hardware and OS) package with a Basic Monitoring Package - Linux
configuration. For more details, see below.
Important manual pages:
https://sldn.softlayer.com/reference/datatypes/SoftLayer_Container_Product_Order_Monitoring_Package
http://sldn.softlayer.com/reference/datatypes/SoftLayer_Product_Item_Price
http://sldn.softlayer.com/reference/services/SoftLayer_Product_Order/verifyOrder
http://sldn.softlayer.com/reference/services/SoftLayer_Product_Order/placeOrder
http://sldn.softlayer.com/reference/datatypes/SoftLayer_Monitoring_Agent_Configuration_Template_Group
License: http://sldn.softlayer.com/article/License
Author: SoftLayer Technologies, Inc. <sldn@softlayer.com>
"""
import SoftLayer
USERNAME = 'set me'
API_KEY = 'set me'
"""
Build a skeleton SoftLayer_Container_Product_Order_Monitoring_Package object
containing the order you wish to place.
"""
orderTemplate = {
    'complexType': 'SoftLayer_Container_Product_Order_Monitoring_Package',
    'packageId': 0,  # the packageId for ordering monitoring packages is 0
    'prices': [
        {'id': 2302}  # this is the price for Monitoring Package - Basic (Hardware and OS)
    ],
    'quantity': 0,  # the quantity for ordering a service (in this case a monitoring package) must be 0
    'sendQuoteEmailFlag': True,
    'useHourlyPricing': True,
    'virtualGuests': [
        {'id': 4906034}  # the virtual guest ID where you want to add the monitoring package
    ],
    'configurationTemplateGroups': [
        {'id': 3}  # the template ID for the monitoring group (in this case the Basic Monitoring package for Unix/Linux operating systems)
    ]
}
# Declare the API client to use the SoftLayer_Product_Order API service
client = SoftLayer.Client(username=USERNAME, api_key=API_KEY)
productOrderService = client['SoftLayer_Product_Order']
"""
verifyOrder() will check your order for errors. Replace this with a call to
placeOrder() when you're ready to order. Both calls return a receipt object
that you can use for your records.
Once your order is placed it'll go through SoftLayer's provisioning process.
"""
try:
    order = productOrderService.verifyOrder(orderTemplate)
    print(order)
except SoftLayer.SoftLayerAPIError as e:
    print("Unable to verify the order! faultCode=%s, faultString=%s"
          % (e.faultCode, e.faultString))
    exit(1)
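If you would rather look up the price and template IDs than hard-code 2302 and 3, something like the following sketch may help; the keyword match on the item description is only an assumption about how the monitoring items are labelled:

import SoftLayer

client = SoftLayer.Client(username=USERNAME, api_key=API_KEY)
packageService = client['SoftLayer_Product_Package']

# List the item prices available in package 0 and keep anything mentioning monitoring.
prices = packageService.getItemPrices(id=0, mask='mask[id,item[description]]')
for price in prices:
    description = price['item']['description']
    if 'monitoring' in description.lower():
        print(price['id'], description)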
This is an example of creating network monitoring:
"""
Create network monitoring
The script creates network monitoring with a service ping
on a given IP address
Important manual pages
http://sldn.softlayer.com/reference/services/SoftLayer_Network_Monitor_Version1_Query_Host
http://sldn.softlayer.com/reference/datatypes/SoftLayer_Network_Monitor_Version1_Query_Host
License: http://sldn.softlayer.com/article/License
Author: SoftLayer Technologies, Inc. <sldn@softlayer.com>
"""
import SoftLayer.API
from pprint import pprint as pp
# Your SoftLayer API username and key.
USERNAME = 'set me'
API_KEY = 'set me'
# The ID of the server you wish to monitor
serverId = 7698842
"""
ID of the query type which can be found with SoftLayer_Network_Monitor_Version1_Query_Host_Stratum/getAllQueryTypes.
This example uses SERVICE PING: Test ping to address, will not fail on slow server response due to high latency or
high server load
"""
queryTypeId = 1
# IP address on the previously defined server to monitor
ipAddress = '10.104.50.118'
# Declare the API client
client = SoftLayer.Client(username=USERNAME, api_key=API_KEY)
networkMonitorVersion = client['SoftLayer_Network_Monitor_Version1_Query_Host']
# Define the SoftLayer_Network_Monitor_Version1_Query_Host templateObject.
newMonitor = {
    'guestId': serverId,
    'queryTypeId': queryTypeId,
    'ipAddress': ipAddress
}
# Send the request for object creation and display the return value
try:
    result = networkMonitorVersion.createObject(newMonitor)
    pp(result)
except SoftLayer.SoftLayerAPIError as e:
    print("Unable to create new network monitoring. faultCode=%s, faultString=%s"
          % (e.faultCode, e.faultString))
    exit(1)
Regards