A newbie question: I wrote my first rebar-based Erlang application. I want to configure some basic properties, like the server host. Where is the best place to put them, and how should I load them into the app?
The next steps are to make a release and create a node in it. A node runs your application in a standalone Erlang VM. A good starting point for creating a release using rebar:
Erlang Application Management with Rebar
Once you have created a release, the configuration properties for all applications in your node can be added to
{your-app}/{release}/files/sys.config
You can read individual properties as follows (note that application:get_env/2 returns {ok, Value}, or undefined if the key is not set):
{ok, Val} = application:get_env(APP, KEY)
Alternatively, all properties for your application can be read as
Config = application:get_all_env(APP)
In sys.config, properties are added as a proplist per application. Note that the file as a whole is a list of such {App, Proplist} tuples, terminated by a dot.
Example:
[
 {myapp,
  [
   {port, 1234},
   {pool_size, 5}
  ]}
].
Related
I have an application written in Erlang. I added a supervisor for distribution, and now, after parsing configFile.cfg in the supervisor, I want to pass the configuration on to my old application.
I have something like this now:
-module(supervisor_sup).

start() ->
    application_app:start().
What I want is:
-module(supervisor_sup).
-record(config, {param1, param2}).

%% After parsing configFile.cfg
Conf = #config{param1 = Param1,
               param2 = Param2},

start(Conf) ->
    application_app:start(Conf).
It is uncommon to start applications from supervisors or modules under supervisors. The preferred way is to use application dependency to make sure all applications are started, and in the right order.
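For example, declaring the dependency in the dependent application's .app.src file lets OTP and the release tooling start everything in the right order. A minimal sketch (application names here are illustrative):

```erlang
%% myapp_b.app.src -- myapp_a is listed in applications,
%% so it is guaranteed to be started before myapp_b.
{application, myapp_b,
 [{description, "Application B, depends on A"},
  {vsn, "0.1.0"},
  {registered, []},
  {applications, [kernel, stdlib, myapp_a]},
  {mod, {myapp_b_app, []}},
  {env, []}
 ]}.
```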
However, if you want some configuration to be available from several different applications without having to parse the configuration more than once, maybe the gproc library is what you are looking for?
https://github.com/uwiger/gproc
gproc can be used to efficiently set global configuration and much more, even in a distributed environment.
What is the recommended way to have different values for application environment variables in an erlang application?
What I mean here is: how do you support different environments (e.g. development, staging, production) in your Erlang application? For example, I would like tests to use a specific fake service on a known host, and production code to use the real server on a different host.
You can use the application config file for this as well. You can also pass the config file as a parameter when starting the Erlang console, which sets up the environment variables for you: you pass test.config or production.config depending on the environment, so there is no need to recompile the code.
You can find more info here
http://www.erlang.org/doc/man/config.html
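For example, you might keep one config file per environment (file names and hosts here are illustrative):

```erlang
%% test.config -- points at the fake service
[{myapp, [{host, "http://localhost:8080"}]}].

%% production.config -- points at the real server
[{myapp, [{host, "http://api.example.com"}]}].
```

Then start the node with erl -config test or erl -config production (the .config extension is implied), and read the value with application:get_env(myapp, host).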
Dependency injection.
test_setup() -> [ {host,"http://..."}, ... ].
prod_setup() -> [ {host,"http://..."}, ... ].
test_start() -> start(test_setup()).
prod_start() -> start(prod_setup()).
start(Config) -> ... .
Alternately, policy modules. Make a policy whose interface matches the stuff you need, then pass in the name of the module containing the policy you want. Think ETS/DETS.
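A sketch of the policy-module idea (module and function names are made up): two modules expose the same interface, and the rest of the code only ever sees the module name it was given.

```erlang
%% test_policy.erl
-module(test_policy).
-export([host/0]).
host() -> "http://localhost:8080".

%% prod_policy.erl -- same interface, different values
-module(prod_policy).
-export([host/0]).
host() -> "http://api.example.com".

%% Application code takes the policy module as a parameter:
%% call start(test_policy) in tests, start(prod_policy) in production.
start(Policy) ->
    Host = Policy:host(),
    %% ... connect to Host ...
    ok.
```

This is the same trick OTP itself uses with ETS/DETS-style interchangeable backends: anything that implements the expected functions can be swapped in.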
I'm using rebar to compile my application. Actually, it's two applications:
deps/
apps/A/
apps/B/
apps/B/suites
...where B depends on A. This is correctly configured in apps/B/src/B.app.src. However, when I attempt to run rebar ct, it fails to test B, reporting that it can't find A.app.
Running rebar ct in verbose mode shows that it's setting the code search path (-pa) to include apps/B/ebin, and deps/foo/ebin, deps/bar/ebin, etc.
It is not including apps/A/ebin.
How do I use Common Test to test an Erlang "application" that's made up of multiple applications?
Add in apps/B/rebar.config
{lib_dirs, [
".."
]}.
or
{ct_extra_params, "-pa ../A/ebin"}.
IMO, if B depends on A, I would have two separate sets of tests: one for A and one for B. Mention A in the deps section of B's rebar config, write separate test cases for B, and run them only for B, so that application A's modules are pulled in automatically by rebar.
How would you execute a PHP script via the command line in Zend Framework 2 that bypasses all of the MVC functionality, but still has access to resources created by a module, such as database connections or a Doctrine EntityManager?
For reference this is what my entry point index.php looks like
<?php
chdir(dirname(__DIR__));
require_once (getenv('ZF2_PATH') ?: 'vendor/ZendFramework/library') . '/Zend/Loader/AutoloaderFactory.php';
Zend\Loader\AutoloaderFactory::factory(array('Zend\Loader\StandardAutoloader' => array()));
$appConfig = include 'config/application.config.php';
$listenerOptions = new Zend\Module\Listener\ListenerOptions($appConfig['module_listener_options']);
$defaultListeners = new Zend\Module\Listener\DefaultListenerAggregate($listenerOptions);
$defaultListeners->getConfigListener()->addConfigGlobPath('config/autoload/*.config.php');
$moduleManager = new Zend\Module\Manager($appConfig['modules']);
$moduleManager->events()->attachAggregate($defaultListeners);
$moduleManager->loadModules();
// Create application, bootstrap, and run
$bootstrap = new Zend\Mvc\Bootstrap($defaultListeners->getConfigListener()->getMergedConfig());
$application = new Zend\Mvc\Application;
$bootstrap->bootstrap($application);
$application->run()->send();
There are several options for a module to configure resources, for example a database connection. For ZF2, the DoctrineModule and DoctrineORMModule are maintained by Doctrine themselves, and the Entity Manager is only created when you want to use it. Instead of the ZF1 approach, where database connections are created during bootstrap, these modules take a just-in-time approach: the instance is created just before you need it.
So: you need to configure the connection and entity manager in a module, which is done with a Dependency Injection container. That configuration is done by the module, so you should take care that the module is configured properly. Then, in your script, you pull the entity manager from the DIC, the EM is instantiated, and you have just what you need.
How are modules configured? First, modules are loaded, then initiated. Both happen in the call $moduleManager->loadModules(), because several listeners listen to the load event, instantiating the modules and running their init() method. There is also a listener that collects the modules' configuration, so DI can be set up properly. At this point the modules are set up, but not yet ready to run. The last step is bootstrapping, which is done by Zend\Mvc\Bootstrap, bootstrapping the Zend\Mvc\Application.
This means that if you copy that index.php script and only remove the last line ($application->run()->send()), the modules are loaded, the DI container is configured, and the application is bootstrapped. Now you can get the DI locator and grab the entity manager:
$em = $application->getLocator()->get('doctrine_em');
Be aware of the CLI feature of ZF2, coming in one of the next beta releases. The RFC is ready at this moment, and when it is ready for experiments it will be merged into zf2 master. This makes construction of CLI applications much easier: you have one entry point (e.g. app.php), and the "real work" is done by controllers, just like in HTTP MVC applications. Once you have this app.php, it is extremely easy to add more CLI features, because it's as simple as adding more controllers with actions and (possibly) a route pointing to them.
I have several nodes running in an erlang cluster, each using the same magic cookie and trusted by each other. I want to have one master node send code and modules to the other nodes. How can I do this?
Use nl(ModuleName). from the Erlang shell to load a module on all connected nodes.
Check out my etest project for an example of programmatically injecting a set of modules on all nodes and then starting it.
The core of this is pretty much the following code:
{Mod, Bin, File} = code:get_object_code(Mod),
{_Replies, _} = rpc:multicall(Nodes, code, load_binary,
[Mod, File, Bin]),
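Wrapped up as a standalone function, a sketch of pushing one module from the master to a list of nodes might look like this (module name is illustrative; error handling omitted):

```erlang
-module(code_push).
-export([load_on_nodes/2]).

%% Fetch the compiled object code for Mod on the local node and load
%% it on every node in Nodes via rpc:multicall/4. Returns the list of
%% per-node replies and the list of unreachable nodes.
load_on_nodes(Mod, Nodes) ->
    {Mod, Bin, File} = code:get_object_code(Mod),
    {Replies, BadNodes} =
        rpc:multicall(Nodes, code, load_binary, [Mod, File, Bin]),
    {Replies, BadNodes}.
```

For example, code_push:load_on_nodes(my_module, nodes()) would push my_module to every node currently connected to this one.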