How to solve a resource region dependency issue in Terraform (Jenkins)

For example: I have two separate modules, module-us-east-1 and module-us-west-2. They run in sequence in the same Jenkins pipeline: module-us-east-1 first, then module-us-west-2. But module-us-east-1 needs some resources that will be created by module-us-west-2, and obviously both modules create resources in different regions.
Any hints on how this can be accomplished? :/

Use the aws_region data source. It pulls in the region you are currently running in, so you can use the same module for both.
data "aws_region" "current" {}
Then you can template in the region when you need it:
"arn:aws:logs:${data.aws_region.current.name}:1234567890:log-group:*"
Edit:
You need four modules. Two modules, one per region with interdependencies between them, form a circular dependency graph and will break.
Split the modules into shared and private resources, creating a shared and private module for each. Then use data resources to import
the shared resources into the private modules.
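A minimal sketch of that split, assuming the shared resource is a VPC looked up by tag (module paths, tag values, and resource names here are hypothetical):
# modules/shared/main.tf -- creates the shared resources
resource "aws_vpc" "shared" {
  cidr_block = "10.0.0.0/16"

  tags = {
    Name = "shared-vpc"
  }
}

# modules/private/main.tf -- imports the shared resources via a data source
data "aws_vpc" "shared" {
  tags = {
    Name = "shared-vpc"
  }
}

resource "aws_security_group" "app" {
  vpc_id = data.aws_vpc.shared.id
}
Because the private modules only read the shared resources through data sources, each region's apply can run independently and the cycle disappears.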

Related

My service modules not recognised by dask

I have a service within which I have several modules, and in the main file I import most of them like below.
from base_client import BaseClient
import request_dispatcher as rd
import utils as util
In one of the functions in main I call the Dask client's submit. When I try to get the result back from the future object it gives me a ModuleNotFoundError as below:
ModuleNotFoundError: No module named 'base_client'
This is how I define my client and call the function
from dask.distributed import Client

def mytask(url, dest):
    .....

client = Client(<scheduler ip>)
f_obj = client.submit(mytask, data_url, destination)
How exactly can I make these modules available to scheduler and workers?
When you do submit, Dask wraps up your task and sends it to the worker(s). As part of this package, any variables that are required by the task are serialised and sent too. In the case of functions defined inline, this includes the whole function, but in the case of functions in a module, it is only the module and function names. This is done to save CPU and bandwidth (imagine trying to send all the source of all of the modules you happen to have imported).
On the worker side, the task is unwrapped, and for the function this means importing the module: import base_client. This follows the normal Python logic of looking in the locations defined by sys.path. If the file defining the module isn't there, you get the error above.
To solve this, copy the file to a place the worker can see it. You can do this with upload_file on a temporary basis (it uses a temporary directory), but you are better off installing the module using the usual pip or conda methods. Importing from the "current directory" is likely to fail even with a local cluster.
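A minimal sketch of the upload_file route, assuming the module lives in a file base_client.py next to your script (the scheduler address is a placeholder):
from dask.distributed import Client

client = Client("<scheduler ip>")
# Ship the module source to every worker; it is written to a
# temporary directory that sits on the worker's sys.path.
client.upload_file("base_client.py")

future = client.submit(mytask, data_url, destination)
print(future.result())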
To get more information, you would need to post a complete example, which shows the problem. In practice, functions using imports from modules are used all the time with submit without problems.

Aurelia compose viewmodel paths

I have components that have dynamic parts rendered with compose. The dynamic parts live in other modules, i.e. separate node projects.
If I want to use a custom element in a page like:
<my-custom-element body.bind="someVariableContainingThePath"></my-custom-element>
I get an error saying that the viewmodel specified in someVariableContainingThePath cannot be found in ./my-custom-element/someVariableContainingThePath.
What is the recommended way to deal with paths when using the compose element?
I'm using webpack.
Is there a way to alias a module, so I can set someVariableContainingThePath = 'moduleA' and then specify that moduleA resolves to /some/path/my-body-custom-element?
For webpack bundling you have to give it a hint. Otherwise your view will not get bundled.
You can add your component to globalResources when configuring Aurelia:
http://aurelia.io/docs/fundamentals/app-configuration-and-startup#making-resources-global
You have to decorate your module names with PLATFORM.moduleName, as in .globalResources(PLATFORM.moduleName('my-module')).
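A minimal sketch of the main.js wiring, assuming the element lives at resources/elements/my-custom-element (the path is hypothetical):
import { PLATFORM } from 'aurelia-pal';

export function configure(aurelia) {
  aurelia.use
    .standardConfiguration()
    // PLATFORM.moduleName gives the aurelia-webpack-plugin a static
    // string it can trace, so the view/viewmodel pair gets bundled.
    .globalResources(PLATFORM.moduleName('resources/elements/my-custom-element'));

  aurelia.start().then(() => aurelia.setRoot(PLATFORM.moduleName('app')));
}
The key point is that every module id webpack must bundle has to appear somewhere inside a PLATFORM.moduleName() call; a bare string built at runtime cannot be traced.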

Create object link to different DOORS project

Apparently, links are not supposed to connect objects in modules which reside in different projects. I failed to create one, both manually and via DXL.
My script
Module modA = edit("/foo/foo", true, false)
Module modB = read("/bar/bar", false)
Object objA = object(1472, modA)
Object objB = object(781, modB)
objA -> objB
The script prints the error:
-R-E- DXL: <Line:78> A linkset pairing restriction prevents the creation of links
from /foo/foo to /bar/bar.
No link will be created.
-I- DXL: execution halted
Is there any trick to bypass that and create a link using magic or hidden features?
That is not a restriction on linking across projects. This error is telling you two things:
There is no Linkset defined between the two documents specified.
The setting for Mandatory linksets is turned on in the document you are linking from.
I HIGHLY recommend leaving Mandatory linksets turned on for all modules. Linksets give you the ability to organize the types of links that you are creating. If you turn this off, users can create links from anything to anything, with any linkset they define on the fly. I have seen this cause big problems at different companies, because you can't easily identify which links you want to analyze for traceability.
We have instead created a handful of link modules that we use for all links in our database. For Example:
Traceability Links
Reference Links
Glossary Links
etc...
This way, in any document we can reuse the same views and filters to view traceability across the Project or Projects. We then set up the linksets to use these link modules only.
Long story short, you need to create a linkset between Module A and Module B in Module A's properties.

Share erlang record declaration between two modules

I have a mnesia table accessed from two modules, and obviously both modules need to refer to the table's records.
Is there any way to declare the record type in one module and use it in the other, without having to define and maintain the declaration in both places? At the moment I have to declare the record type in each module.
You can declare it in an include file (in a typical app this lives in an include directory at the root of the app), then include it in each module:
-include("myrecords.hrl").
To see a practical example, I'll refer you to the rebar repo so you can see how this is typically structured: https://github.com/rebar/rebar
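A minimal sketch, assuming the record is called person and the header sits in include/ (all names here are hypothetical):
%% include/myrecords.hrl
-record(person, {id, name}).

%% src/module_a.erl
-module(module_a).
-include("myrecords.hrl").
-export([new/2]).

%% Both modules compile against the same record definition.
new(Id, Name) -> #person{id = Id, name = Name}.

%% src/module_b.erl
-module(module_b).
-include("myrecords.hrl").
-export([name_of/1]).

name_of(#person{name = Name}) -> Name.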

Group Controllers to functional Packages in Grails

I'm developing a Grails app. I have about 20 controllers right now and there will be more. Is there a way to group the controllers into functional packages? I would like to have something like:
grails-app/administration/<controller classes>
grails-app/usercontent/<controller classes>
grails-app/publiccontent/<controller classes>
Ideally the package would not appear in the URL.
You can do something similar by putting your controllers into Java/Groovy packages:
package administration
class UserController { ... }
and placing the source code into corresponding sub-directories of grails-app/controllers/, e.g. grails-app/controllers/administration/UserController.groovy. This won't change the default URL mapping (i.e. the package name is not included in the URL). Note, however, that your controller names have to be unique even across different packages!
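A minimal sketch of one such controller, assuming an administration package (the action body is hypothetical):
// grails-app/controllers/administration/UserController.groovy
package administration

class UserController {
    // Still reachable at /user/index by default; the package name
    // "administration" never shows up in the URL.
    def index() {
        render "admin user list"
    }
}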
I'm not aware of any easy approach to achieve the directory layout you suggested (no controller/ in the path).
