In Dask Gateway's config docs, there is a setting called adaptive period. How does this interact with the standard dask.config variable DASK_DISTRIBUTED__ADAPTIVE__INTERVAL? In general, the GatewaySchedulerService seems to have parameters that mimic some dask.config options.
Are the following dask config vars used or ignored when using a gateway?
DASK_DISTRIBUTED__SCHEDULER__WORKER_TTL
DASK_DISTRIBUTED__ADAPTIVE__INTERVAL
DASK_DISTRIBUTED__ADAPTIVE__WAIT_COUNT
DASK_DISTRIBUTED__ADAPTIVE__TARGET_DURATION
If they are used, where should they be set? In the configuration yaml at gateway.backend.scheduler.extraContainerConfig, or some other place?
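For reference, this is the kind of placement I have in mind; a sketch of injecting the variables via extraContainerConfig in the Helm values is below. The values are illustrative, and whether dask-gateway actually honors them there (or overrides them with its own adaptive settings) is exactly what I'm asking.

# Sketch only: passing the dask config vars to the scheduler container as env vars.
# Values are illustrative; whether they take effect is the open question.
gateway:
  backend:
    scheduler:
      extraContainerConfig:
        env:
          - name: DASK_DISTRIBUTED__SCHEDULER__WORKER_TTL
            value: "5 minutes"
          - name: DASK_DISTRIBUTED__ADAPTIVE__INTERVAL
            value: "1s"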
New to AWS CDK, so please bear with me. I am trying to create a multi-account CDK deployment, wherein Dev should be deployed to Account A and Prod to Account B.
I have created 2 stacks with the respective account numbers and so on.
mktd_dev_stack = Mktv2Stack(app, "Mktv2Stack-dev",
    env=cdk.Environment(account='#####', region='us-east-1'),
    stack_name="myStack-Dev",
    # For more information, see https://docs.aws.amazon.com/cdk/latest/guide/environments.html
)
The Prod stack is similar, with the Prod account and a different name. When I run them I plan on doing
cdk deploy Mktv2Stack-dev
and similar for Prod.
I am using CDK 2.x with Python.
My question is: does this setup give me the ability to pass a parameter, say details, which is a dict of names and criteria for the resources that will be set up? Or is there a way for me to pass a parameter/dict from app.py to my program_name.py so that I can look up values from the dict and apply them to resources accordingly?
Regards Tanmay
TL;DR: Use a single stack and pass the stg/prod environment to app.py as an env var.
Pass config down from app.py > Stacks > Constructs as Python Parameters (constructor args). Avoid using CDK Parameters* for config, says AWS's CDK Application Best Practices.
Practically speaking, you pass the account or alias as an environment variable, which app.py reads to perform the metadata lookups and set the stack props. Here's a node-flavoured version of this pattern:
AWS_ACCOUNT=123456789012 npx cdk deploy '*' -a 'node ./bin/app' --profile test-account
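Since the question uses Python, here is a minimal sketch of the same idea for app.py, assuming an AWS_ACCOUNT env var and an illustrative per-account config dict (module path, account IDs and the is_production arg are placeholders, not from the question's code):

# app.py -- sketch only: select per-account config from an env var
import os

import aws_cdk as cdk

from mktv2.mktv2_stack import Mktv2Stack  # hypothetical module path

# Illustrative per-account settings; replace with your own accounts and criteria
CONFIG = {
    "111111111111": {"stack_name": "myStack-Dev", "is_production": False},
    "222222222222": {"stack_name": "myStack-Prod", "is_production": True},
}

account = os.environ["AWS_ACCOUNT"]
cfg = CONFIG[account]

app = cdk.App()
Mktv2Stack(
    app,
    "Mktv2Stack",
    env=cdk.Environment(account=account, region="us-east-1"),
    stack_name=cfg["stack_name"],
    is_production=cfg["is_production"],  # plain constructor arg your stack's __init__ would accept
)
app.synth()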
Why not 2 stacks in app.py, one for PROD and one for STAGING?
A 2-stack approach can certainly work. The downsides are that you rarely want to deploy both environments at the same time (outside a CI/CD context), and cross-account permissions are trickier to handle safely when mixed in a single cdk deploy.
Customising Constructs for different environments
Within your code, use a dict, class or whatever to return the configuration you want based on an account or region input. Finally, pass the variables to the constructs. Here's an example of code that uses account, region and isProduction props to customise an S3 bucket:
const queriesBucket = new s3.Bucket(this, 'QueriesBucket', {
  bucketName: `${props.appName.toLowerCase()}-queries-${props.env.account}-${
    props.env.region
  }`,
  removalPolicy: props.isProduction
    ? cdk.RemovalPolicy.RETAIN
    : cdk.RemovalPolicy.DESTROY,
  versioned: props.isProduction,
  lifecycleRules: [
    {
      id: 'metadata-rule',
      prefix: 'metadata',
      noncurrentVersionExpiration: props.isProduction
        ? cdk.Duration.days(30)
        : cdk.Duration.days(14),
    },
  ],
});
* "Parameter" has different meaning in Python and CDK. Passing variables between constructs in code using Python Parameters (=method arguments) is a best practice. In CDK-speak a Parameter has the special meaning of a variable value passed to CloudFormation at deploy time. These are not CDK best practice.
I have a CloudFlare Worker where I have environment variables set in the CF Settings > Environment Variables interface. I also have this wrangler.toml
In my worker's index.js I have code reading the variable REGISTRATION_API_URL. If the code is running in a deployed environment then it injects the value from the CF Settings into REGISTRATION_API_URL just fine.
But if I run
wrangler dev
or
wrangler dev --env local
then REGISTRATION_API_URL is undefined.
Originally I expected that the variable would be populated from the CF Settings values, but it isn't. So I tried the two vars settings in the wrangler.toml I show here, but no difference. And I have spent a lot of time searching the docs and the wider web.
Are environment variables supported in a local dev environment? Any workarounds that people have come up with? Currently I am checking for undefined and defining the variable with a hard-coded value, but this is not a great answer.
Using wrangler 1.16.0
Thanks.
The docs could be clearer, but if you are using the newer module syntax, the variables will not be available as global variables.
Environmental variables with module workers
When deploying a Module Worker, any bindings will not be available as global runtime variables. Instead, they are passed to the handler as a parameter – refer to the FetchEvent documentation for further comparisons and examples.
Here's an example:
export default {
  async fetch(request, env, context) {
    return new Response(env.MY_VAR);
  },
};
KV namespaces are also available in the same object.
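For example, a KV namespace bound as MY_KV (an illustrative binding name, not from the original answer) is read from that same env parameter:

// Sketch: reading a KV binding inside a module worker; MY_KV is illustrative.
export default {
  async fetch(request, env) {
    const value = await env.MY_KV.get("some-key");
    return new Response(value ?? "not found");
  },
};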
Maybe a bit late, but: no, I don't think you can.
But: you can always use self["YOUR_ENV_VARIABLE"] to get the value and then go from there (unfortunately the docs don't mention that).
Here is what I personally do in my Workers Sites project to get the release version (usually injected via a pipeline/action and then inserted via HTMLRewriter into index.html):
const releaseVersion = self["RELEASE_VERSION"] || 'unknown'
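A minimal sketch of the rest of that idea, with an illustrative meta-tag selector (the real project would use getAssetFromKV for Workers Sites assets):

// Sketch only: inject the releaseVersion value (defined above) into the HTML
// with HTMLRewriter; the meta-tag selector is illustrative.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  const response = await fetch(request) // stand-in for getAssetFromKV() in a Workers Sites project
  return new HTMLRewriter()
    .on('meta[name="release"]', {
      element(element) {
        element.setAttribute('content', releaseVersion)
      },
    })
    .transform(response)
}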
How would one define environment variables for Traefik 2 so that they could be used within the dynamic file configuration? E.g.:
[http.routers]
  [http.routers.the-rtr]
    entryPoints = ["https"]
    rule = "Host(`rtr.$DOMAINNAME`)"
where DOMAINNAME would have been defined somewhere (in a file, CLI arguments etc.)
Traefik's dynamic configuration does accept Go templating:
Traefik supports using Go templating to automatically generate repetitive portions of configuration files. These sections must be valid Go templates, augmented with the Sprig template functions.
See https://doc.traefik.io/traefik/providers/file/#go-templating
Note that Go Templating only works with dedicated dynamic configuration files. Templating does not work in the Traefik main static configuration file.
For example, if $DOMAINNAME is set as an environment variable, you can do
rule: Host(`{{ env "DOMAINNAME" | trimAll "\"" }}`)
Note: due to "env" quoting, the trimAll is needed — it might be better solution, but it's the better I've found so far.
Not sure if it's directly supported by the Traefik product.
I use the file provider, and in traefik.toml I have:
[providers.file]
filename = "/etc/traefik/dynamic.config.toml"
watch = true
And I use a separate mechanism like envsubst to generate (or regenerate as needed) the dynamic.config.toml file. Since I have watch = true, it gets reloaded by Traefik with the latest info.
Basically, the snippet you shared in the question can be used as a template file. Then use envsubst or similar to generate dynamic.config.toml, as sketched below.
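For illustration, a regeneration step could be as simple as the following (the paths and the variable value are assumptions):

# Sketch: render the template with the current environment; Traefik then
# reloads the result automatically because watch = true is set on the file provider.
export DOMAINNAME=example.com
envsubst < /etc/traefik/dynamic.config.template.toml > /etc/traefik/dynamic.config.toml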
Q&A on envsubst that I found useful: How to substitute shell variables in complex text files
Hope that helps.
This question mentions two types of performance stats:
carbon.*: Stats from Graphite itself.
stats.*: Stats from StatsD.
I am seeing 1., but I'm not seeing 2.
Is there a StatsD configuration setting (e.g. some entry in the .js config file) which will let me see 2.?
(2) should be generated by default, but the default prefix is statsd. and not stats. as you've said. Maybe check that you don't have any Graphite rules expecting the wrong prefix.
You can also configure the prefix to whatever you want using the prefixStats property in the .js config file.
See the documentation in the example config:
https://github.com/etsy/statsd/blob/master/exampleConfig.js
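A minimal sketch of such a config file, with placeholder host/port values and the prefix setting shown explicitly:

// config.js -- sketch only; graphiteHost/graphitePort are placeholders
{
  graphiteHost: "127.0.0.1",
  graphitePort: 2003,
  port: 8125,
  prefixStats: "statsd"   // prefix for the internal stats statsd reports about itself
}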
I specified this in application.properties:
spring.cloud.config.uri=http://configserver:8888
but when I deploy a stream from the dashboard I can see in the logs
Fetching config from server at: http://localhost:8888
which means that it still tries to use the default settings.
Also, any other properties like the Kafka binder or zkNodes are not read from application.properties; the default values are used instead, which makes the deployment fail.
How can I override these properties for all the deployed app/streams?
The properties must be prefixed with spring.cloud.dataflow.applicationProperties.stream, like
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.config.uri=http://configserver:8888
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.brokers=kafka:9092
spring.cloud.dataflow.applicationProperties.stream.spring.cloud.stream.kafka.binder.zkNodes=zookeeper:2181