How to persist policies from the REST API? - open-policy-agent

I'm new to OPA (Open Policy Agent) and trying to create a new policy using the REST API endpoint /v1/policies/{id}. It works! But the OPA server keeps it only in memory, and after a reboot all my policies are gone. How can I fix this? Which parameters should I use to persist the created policies?
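For reference, a minimal sketch of what such a call looks like (assuming OPA's default listen address of localhost:8181 and a hypothetical example.rego file); a policy uploaded this way lives only in the server's in-memory store:

curl -X PUT localhost:8181/v1/policies/example \
  -H "Content-Type: text/plain" \
  --data-binary @example.rego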

OPA saves all policy and data used for evaluation in memory. Consider using Bundles to distribute policy and data to OPA. Furthermore, you have the option to persist activated bundles to disk for recovery purposes. When bundle persistence is enabled, OPA will attempt to read the bundle from disk on startup. The documentation covers more details about bundle persistence.
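As a minimal sketch (the service URL, bundle name, and paths are hypothetical), an OPA configuration for a persisted bundle might look like this; persist: true writes the activated bundle to disk, and persistence_directory controls where (by default a .opa directory under the working directory):

services:
  bundle_server:
    url: https://bundles.example.com

bundles:
  authz:
    service: bundle_server
    resource: bundles/authz.tar.gz
    persist: true

persistence_directory: /var/opa

Start the server with something like opa run --server --config-file config.yaml, and on startup OPA will attempt to read the persisted bundle from disk.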

Related

How can I register an app from an S3 bucket without public permission in Spring Data Flow server?

I have a running Data Flow server in PCF, and I want to register an app (http://....jar) that lives in an S3 bucket without public access.
I see there are only 3 params available (--name, --type, --uri) for app register. How can I pass credentials like --aws.accessKeyId and --aws.secretKey?
At the time of this writing, we do not have an AWS-native approach to resolve Spring Cloud Stream or Spring Cloud Task applications from S3 buckets directly. We have an open story on this matter, however. Today, you can only resolve publicly accessible application artifacts from S3 buckets.
Alternatively, you could host and resolve the applications using the SCDF App Tool that we ship.
There are a few other alternatives, too, so feel free to try out the options and choose the method that works best for your use case.
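For illustration, registering a publicly readable artifact from the SCDF shell might look like the following (the bucket and jar names are hypothetical):

dataflow:> app register --name http --type source --uri https://my-public-bucket.s3.amazonaws.com/http-source.jar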

Permanently register an app in a Spring Data Flow application

I have integrated Spring Data Flow and uploaded an application jar in the panel. However, whenever I restart the Data Flow application I lose the app mapping with the JAR. How can I have it permanently in spring-data-flow?
I tried various places to register the app permanently but all in vain.
Thanks,
Dhruv
You need to add a data source configuration to the spring-data-flow application.
By default, it uses an embedded H2 database, and hence the deployment gets lost on restart.
Once I added the DB configuration, it was resolved.
Add the following lines to application.properties for MySQL:
server.port=8081
spring.datasource.url=jdbc:mysql://localhost:3306/app_batch
spring.datasource.username=root
spring.datasource.password=
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.jpa.hibernate.ddl-auto=none
SCDF requires a persistent RDBMS such as MySQL, Oracle, or others for production deployments.
The app-registry (i.e., a registry for app coordinates), task/batch execution history, stream/task definitions, audit trails, and other metadata about all of your deployments via SCDF are tracked in the persistent database.
If you don't provide one, by default, SCDF uses H2, an in-memory database. Though it allows you to bootstrap rapidly, it should not be used in production deployments. If the server restarts or crashes, the in-memory footprint goes away and a new session is created. That's why persistent storage is a requirement: the state can survive independently even when SCDF restarts.
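Equivalently, the same data source can be supplied as command-line arguments when starting the server; a sketch, assuming the local server jar and the MySQL settings above (the version and credentials are placeholders):

java -jar spring-cloud-dataflow-server-local-<version>.jar \
  --spring.datasource.url=jdbc:mysql://localhost:3306/app_batch \
  --spring.datasource.username=root \
  --spring.datasource.password= \
  --spring.datasource.driver-class-name=com.mysql.jdbc.Driver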

Root access to just the dataflow instances

Is it possible to configure an access policy that would allow the job creator (or less preferably anybody with access to the project) access to the created instances, without granting similar access to the rest of the machines in the Google Cloud project (e.g., production machines)?
Thanks again,
G
The access policies that you define for the project are carried over to the instances backing a job submitted in that project.
Details on the policies and related actions are outlined in the GCE documentation: https://cloud.google.com/compute/docs/access
Cheers,
r

Can libgit2sharp rely on the installed git global configuration provider?

I'm wiring up some LibGit2Sharp code to VSO, so I need to use alternate credentials to access it (NTLM won't work). I don't want to have to manage these cleartext credentials - I'm already using git-credential-winstore to manage them, and I'm happy logging onto the box if I ever need to update those creds.
I see that I can pass in DefaultCredentials and UsernamePassword credentials - is there any way I can get it to fetch the creds from the global git cred store that's already configured on the machine?
Talking to external programs is outside of the scope of libgit2, so it won't talk to git's credential helper. It's considered to be the tool writer's responsibility to retrieve the credentials from the user, wherever they may be.
The credential store is a helper for the git command-line tool to integrate with whatever credential storage you have in your environment while keeping the logic outside of the main tool, which needs to run in many different places. It is not something that's core to a repository, but a helper for the user interface.
When using libgit2, you are the one writing the tool that users interact with, and thus you know best how to get to the environment-specific storage. What libgit2 wants to know is exactly what it should answer to the authentication challenge, as any kind of guessing on its part is going to make everyone's lives harder.
Since the Windows credential storage is accessed through an API, it's not out of the question to support some convenience functions to transform from that credential storage into what libgit2's callback wants, but it's not something where libgit2 can easily take the initiative.
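In LibGit2Sharp, that answer takes the form of a CredentialsProvider callback. A minimal sketch, assuming the alternate credentials are resolved from environment variables (the repository path, variable names, and remote name are placeholders - substitute whatever storage you actually use, e.g. the Windows credential API):

using System;
using LibGit2Sharp;

class FetchWithAltCreds
{
    static void Main()
    {
        using (var repo = new Repository(@"C:\src\my-repo"))   // hypothetical local clone
        {
            var options = new FetchOptions
            {
                // libgit2 invokes this when the remote asks for authentication;
                // return the credentials from wherever you keep them.
                CredentialsProvider = (url, usernameFromUrl, types) =>
                    new UsernamePasswordCredentials
                    {
                        Username = Environment.GetEnvironmentVariable("VSO_ALT_USERNAME"),
                        Password = Environment.GetEnvironmentVariable("VSO_ALT_PASSWORD")
                    }
            };

            // Fetch from "origin" using the configured refspecs.
            Commands.Fetch(repo, "origin", new string[0], options, null);
        }
    }
}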

API keys and secrets used in iOS app - where to store them?

I'm developing for iOS and I need to make requests to certain APIs using an API key and a secret. However, I wouldn't like for it to be exposed in my source code and have the secret compromised when I push to my repository.
What is the best practice for this case? Write it in a separate file which I'll include in .gitignore?
Thanks
Write it in a separate file which I'll include in .gitignore?
No, don't write it ever.
That means:
you don't write that secret within your repo (no need to gitignore it, or to worry about adding/committing/pushing it by mistake)
you don't write it anywhere on your local drive (no need to worry about your computer being stolen with that "secret" on it)
Instead, store in your repo a script able to fetch that secret from an external source (outside of the git repo) and load it in memory.
This is similar to a git credential-helper process: that script would launch a process listening on localhost:port in order to serve that "secret" to you whenever you need it, in the current session only.
Once the session is done, there is no trace left.
And that is the best practice to manage secret data.
You can trigger automatically that script on git checkout, if you declare it in a .gitattributes file as a content filter:
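As an illustration (the filter name, file path, and script names are hypothetical), the declaration could look like this: the smudge filter injects the secret into your working copy on checkout, and the clean filter strips it again before anything is committed:

# .gitattributes
config/api-keys.plist filter=injectsecret

# one-time setup on the machine
git config filter.injectsecret.smudge /path/to/fetch-secret.sh
git config filter.injectsecret.clean  /path/to/strip-secret.sh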
This is a very old question, but if anyone finds this through Google, I would suggest you try CloudKit for storing any app secrets (API keys, OAuth secrets). Only your app can access its app container, and communication between Apple and your app is secure.
You can check out the CloudKit documentation for details.
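A minimal Swift sketch of that idea, assuming a hypothetical record named "Secrets" with an "apiKey" field stored in the app's public CloudKit database (record and field names are placeholders):

import CloudKit

func fetchAPIKey(completion: @escaping (String?) -> Void) {
    let database = CKContainer.default().publicCloudDatabase
    let recordID = CKRecord.ID(recordName: "Secrets")   // hypothetical record

    database.fetch(withRecordID: recordID) { record, error in
        guard let record = record, error == nil else {
            completion(nil)
            return
        }
        // "apiKey" is a placeholder field name on the record
        completion(record["apiKey"] as? String)
    }
}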
