I am trying to execute an Airflow DAG using the GoogleAdsToGcsOperator, and I am getting this error:
Failed to execute job 1874 for task run_operator (Invalid constructor input for SearchGoogleAdsRequest: customer_id: "xxxxxxx"
query: "SELECT campaign.id, ad_group.id, ad_group.name FROM ad_group"
page_size: 10000
; 13968)
I am using Composer image version "composer-2.1.2-airflow-2.3.4".
Does anyone know how to solve this error?
While trying to create a DockerOperator task, I got this error:
Invalid arguments were passed to DockerOperator (task_id: t2). Invalid arguments were:
**kwargs: {'retrieve_output': True, 'retrieve_output_path': '/tmp/script.out'}
Here is my code:
from airflow.decorators import task, dag
from airflow.providers.docker.operators.docker import DockerOperator
from datetime import datetime
@dag(start_date=datetime(2023, 1, 1), schedule="@daily", catchup=False)
def docker_dag():
    @task()
    def t1():
        pass

    t2 = DockerOperator(
        task_id='t2',
        container_name="task_t2",
        image='stock_image:v1.0.2',
        command='python3 stock_data.py',
        docker_url="tcp://docker-proxy:2375",  # I have to use this on macOS or I'll get a Permission Denied error
        network_mode='bridge',
        xcom_all=True,
        retrieve_output=True,
        retrieve_output_path="/tmp/script.out",
        auto_remove=True,
        mount_tmp_dir=False
    )

    t1() >> t2

dag = docker_dag()
Note: the DockerOperator documentation shows that both of these arguments do exist. So why would I be getting an invalid-argument error for just these two specific arguments?
retrieve_output and retrieve_output_path were added to the DockerOperator in Docker provider version 2.2.0, so upgrading apache-airflow-providers-docker to 2.2.0 or later makes the error go away. :)
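If you are not sure which provider version your environment ships, you can gate on it with a plain version comparison (a minimal sketch; the 2.2.0 threshold is the one from the provider changelog mentioned above):

```python
def supports_retrieve_output(provider_version: str) -> bool:
    """Return True if apache-airflow-providers-docker is new enough for the
    retrieve_output / retrieve_output_path arguments (added in 2.2.0)."""
    parts = tuple(int(p) for p in provider_version.split(".")[:3])
    return parts >= (2, 2, 0)

# To read the installed version at runtime you could use, e.g.:
#   from importlib import metadata
#   metadata.version("apache-airflow-providers-docker")
```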
I have code that connects successfully through a libname to a Hive2 DB authenticated with Kerberos:
libname hdb hadoop server="server-name" port=10000 schema="schema-name";
I have tested the code using Data Integration Studio and Base SAS, but when I call the same code with the same user (verified by putting %put &=SYSUSERID; in the code and running echo %username% in a pipe) from a stored process, the LIBNAME gives me this error:
ERROR: Error trying to establish connection: Could not open client transport with JDBC Uri:
jdbc:hive2://"server-name":10000/"schema-name";ssl=true;principal=hive/_HOST@"dominion": GSS initiate failed
ERROR: Error in the LIBNAME statement.
Getting the following error when I try to launch a Dataflow SQL job:
Failed to start the VM, launcher-____, used for launching because of status code: INVALID_ARGUMENT, reason: Error: Message: Invalid value for field 'resource.networkInterfaces[0].network': 'global/networks/default'. The referenced network resource cannot be found. HTTP Code: 400.
This issue just started today.
Adding the default network solved the issue.
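For anyone hitting the same message: it means the project has no VPC network named default (it may have been deleted, or default network creation was disabled on the project). One way to recreate it, assuming the usual auto-mode setup, is:

```shell
# Recreate a "default" auto-mode VPC network; the name and subnet mode here
# are assumptions -- match whatever your Dataflow job expects
gcloud compute networks create default --subnet-mode=auto
```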
How can I connect to SQL Server using JDBC in spark-shell and then query the database using a stored procedure?
I have seen this code:
val url = "jdbc:mysql://yourIP:yourPort/test?user=yourUsername;password=yourPassword"

val df = sqlContext.read
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "people")
  .load()
But I need to run a stored procedure.
When I pass an exec command as the dbtable option above, it gives me this error:
com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near the keyword 'exec'.
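The reason for the syntax error: Spark's JDBC source does not send dbtable to the server as-is; it embeds it in generated SQL, so dbtable must be a table name or a parenthesized subquery, never an EXEC statement. A sketch of the shape of the schema probe Spark issues (illustrative, not the exact internals):

```python
def jdbc_schema_probe(dbtable: str) -> str:
    # Spark resolves the schema of a JDBC source with a zero-row probe of this
    # shape, so "exec my_proc" turns into invalid T-SQL -- hence the
    # "Incorrect syntax near the keyword 'exec'" error.
    return f"SELECT * FROM {dbtable} WHERE 1=0"
```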
My DSN string looks like this:
dsn: odbc:DRIVER={SQL Server};Server=myserver.database.windows.net;Database=mydb;
username: myusername@myserver
password: mypwd
When I run the symfony task:
symfony doctrine:build-schema
The connection seems to succeed, but it breaks with the following error:
SQLSTATE[42000]: Syntax error or access violation: 2812
[Microsoft][SQL Server Native Client 10.0][SQL Server]Could not find stored procedure
'sp_primary_keys_rowset'. (SQLExecute[2812] at ext\pdo_odbc\odbc_stmt.c:254).
Failing Query: "EXEC sp_primary_keys_rowset @table_name = Appointment"
Does anybody know what the problem is here? I could not find any useful information about these error codes.
I use symfony 1.4.10, PHP 5.3, SQL Azure.
Does the stored procedure sp_primary_keys_rowset exist in your database?
I found the solution:
In symfony/lib/plugins/sfDoctrinePlugin/lib/vendor/doctrine/Doctrine/Import/Mssql.php, in listTableColumns($table), I had to change
$sql = 'EXEC sp_primary_keys_rowset @table_name = ' . $this->conn->quoteIdentifier($table, true);
to
$sql = 'EXEC sp_pkeys @table_name = ' . $this->conn->quoteIdentifier($table, true);
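For clarity, the fix only swaps the procedure name: sp_primary_keys_rowset is an internal OLE DB catalog helper that SQL Azure does not provide, while sp_pkeys is the documented ODBC catalog procedure. A minimal sketch of the statement the patched method ends up emitting (quoteIdentifier output simulated here as a bare table name):

```python
def primary_keys_query(table: str) -> str:
    # Builds the statement the patched Mssql.php emits; sp_pkeys is available
    # on SQL Azure, unlike sp_primary_keys_rowset.
    return f"EXEC sp_pkeys @table_name = {table}"
```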