Hyperledger Composer "namespace already exists" error on "composer archive create" command

When I execute the command:
composer archive create --sourceType dir --sourceName /home/testuser/test-network -a /home/testuser/test-network/dist/test-network.bna
I get this error:
Creating Business Network Archive
Looking for package.json of Business Network Definition
Input directory: /home/testuser/test-network
/usr/lib/node_modules/composer-cli/node_modules/yargs/yargs.js:1079
else throw err
^
Error: namespace already exists
at ModelManager.addModelFiles (/usr/lib/node_modules/composer-cli/node_modules/composer-common/lib/modelmanager.js:234:31)
at Function.fromDirectory (/usr/lib/node_modules/composer-cli/node_modules/composer-common/lib/businessnetworkdefinition.js:493:43)
at Function.handler (/usr/lib/node_modules/composer-cli/lib/cmds/archive/lib/create.js:80:42)
at Object.module.exports.handler (/usr/lib/node_modules/composer-cli/lib/cmds/archive/createCommand.js:31:30)
at Object.self.runCommand (/usr/lib/node_modules/composer-cli/node_modules/yargs/lib/command.js:233:22)
at Object.Yargs.self._parseArgs (/usr/lib/node_modules/composer-cli/node_modules/yargs/yargs.js:990:30)
at Object.self.runCommand (/usr/lib/node_modules/composer-cli/node_modules/yargs/lib/command.js:204:45)
at Object.Yargs.self._parseArgs (/usr/lib/node_modules/composer-cli/node_modules/yargs/yargs.js:990:30)
at Object.get [as argv] (/usr/lib/node_modules/composer-cli/node_modules/yargs/yargs.js:927:19)
at Object.<anonymous> (/usr/lib/node_modules/composer-cli/cli.js:58:5)
I have changed the files to build the network, and I even get the error with the example files:
File /home/testuser/test-network/lib/logic.js:
function sampleTransaction(tx) {
    // Save the old value of the asset.
    var oldValue = tx.asset.value;

    // Update the asset with the new value.
    tx.asset.value = tx.newValue;

    // Get the asset registry for the asset.
    return getAssetRegistry('org2.acme.sample2.SampleAsset')
        .then(function (assetRegistry) {
            // Update the asset in the asset registry.
            return assetRegistry.update(tx.asset);
        })
        .then(function () {
            // Emit an event for the modified asset.
            var event = getFactory().newEvent('org2.acme.sample2', 'SampleEvent');
            event.asset = tx.asset;
            event.oldValue = oldValue;
            event.newValue = tx.newValue;
            emit(event);
        });
}
File /home/testuser/test-network/test.cto:
namespace org2.acme.sample2

asset SampleAsset identified by assetId {
    o String assetId
    --> SampleParticipant owner
    o String value
}

participant SampleParticipant identified by participantId {
    o String participantId
    o String firstName
    o String lastName
}

transaction SampleTransaction {
    --> SampleAsset asset
    o String newValue
}

event SampleEvent {
    --> SampleAsset asset
    o String oldValue
    o String newValue
}
I have tried changing the namespace too, and I get the same error.

OK, it's because you have multiple .cto files in your directory with the same namespace declared in them (perhaps you are keeping different editions around, or you want to split the model across multiple .cto files). The archive command checks the namespace declared in each CTO file. Each business network model file has a single namespace, and all resource declarations within the file are implicitly in that namespace. You can have multiple .cto files if you want to break the model out, but don't repeat the namespace in the additional files. You can even, if you want, have multiple model files with different namespaces.
See https://hyperledger.github.io/composer/reference/cto_language.html
Otherwise, I suggest moving any 'editions' of the CTO files that share the same namespace out of the directory; a sketch of a valid split is shown below.
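As a minimal illustration (the second namespace, the file names and the ArchiveAsset transaction are hypothetical), each model file declares its own namespace and imports what it needs from the other:

// models/sample.cto
namespace org2.acme.sample2

asset SampleAsset identified by assetId {
    o String assetId
    o String value
}

// models/extra.cto -- a different namespace, so both files can live in the same directory
namespace org2.acme.extra

import org2.acme.sample2.SampleAsset

transaction ArchiveAsset {
    --> SampleAsset asset
}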
Then try building your .bna file again.

Related

Asset Creation through Transaction in Hyperledger Composer

While creating any asset or participant, I need to check some condition (IF..THEN..ELSE) on some field.
Is it possible to create an asset or participant through a transaction?
Yes, it is possible.
I did the same thing in my network, creating assets with a transaction and applying whatever rules I need.
Transactions are run from your logic.js file in lib.
Assume you have an asset myAsset in the org.myAssets namespace:
asset myAsset identified by assetId {
    o String assetId
    o String someData
    // other fields as required
}
You now want a transaction which will create an asset.
Your CTO transaction looks like this:
namespace org.transactions

import org.myAssets.*

transaction MyAssetCreate {
    o myAsset anAsset
}
You can't have a reference (-->) to the asset here, since you won't have an asset yet.
In your lib/logic.js you have something like:
/**
 * Creates an asset.
 * @param {org.transactions.MyAssetCreate} myAssetCreate
 * @transaction
 */
async function MyAssetCreate(myAssetCreate) {
    return getAssetRegistry('org.myAssets.myAsset')
        .then(function(result) {
            var factory = getFactory();
            var newAsset = factory.newResource(
                'org.myAssets',
                'myAsset',
                myAssetCreate.anAsset.assetId);
            newAsset.someData = myAssetCreate.anAsset.someData;
            // continue with property assignments and any logic you have
            // when you are done and everything looks good you can continue
            return result.add(newAsset);
        });
}
Now you can invoke MyAssetCreate and you will get your asset in the right registry.
Of course, if you do this, you need to make sure you don't allow assets to be created via the standard asset endpoint.
I myself plan not to expose any asset endpoints at all and to allow changes only via transactions.
Check the code for typos etc.; I took this from my running network and replaced my type names, so it's possible I mistyped something.
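If you want to enforce that restriction at the network level rather than only by hiding REST endpoints, one option is an ACL in permissions.acl. This is only a hedged sketch: the rule names are made up, the resource and transaction types are the hypothetical ones from above, and you will still need rules that let participants submit the transaction itself.

rule CreateMyAssetOnlyViaTransaction {
    description: "Allow creating myAsset only from within the MyAssetCreate transaction"
    participant: "ANY"
    operation: CREATE
    resource: "org.myAssets.myAsset"
    transaction: "org.transactions.MyAssetCreate"
    action: ALLOW
}

rule DenyDirectMyAssetCreate {
    description: "Deny creating myAsset outside of that transaction"
    participant: "ANY"
    operation: CREATE
    resource: "org.myAssets.myAsset"
    action: DENY
}

Rules are evaluated in order, so the ALLOW rule with the transaction qualifier must appear before the DENY rule.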

SSIS: counting files in a folder using a script task, looping the task until the number of files reaches 25

I need to develop a package which should read the number of files in an FTP/folder. If the count is less than 25, keep looping; go to the next task once the count reaches 25.
What I tried is:
I used a script task and created a few variables, and one variable holds the file count (successfully). I used an expression in the precedence constraint to check whether the number of files in a particular folder is 25; if not, it won't go to the next task. What I can't do is keep the script task looping until the file count becomes 25. I tried using a Foreach Loop but couldn't get through. Please suggest.
Please have a look at this. I have two script tasks. The first one counts and displays the number of files in a folder. Here is the script for that:
public void Main()
{
    // Read the folder path and file-name prefix from package variables
    String FolderPath = Dts.Variables["User::FolderPath"].Value.ToString();
    string Prefix = Dts.Variables["User::prefix"].Value.ToString();
    Int32 FileCnt = 0;

    var directory = new DirectoryInfo(FolderPath);
    FileInfo[] files = directory.GetFiles(Prefix + "*");

    // Loop over each matching file, show its name and count it
    foreach (FileInfo file in files)
    {
        FileCnt += 1;
        MessageBox.Show(file.Name);
    }
    MessageBox.Show(FileCnt.ToString());

    // Write the count back to the package variable used by the precedence constraint
    Dts.Variables["User::FileCnt"].Value = Convert.ToInt32(FileCnt);
}

#region ScriptResults declaration
/// <summary>
/// This enum provides a convenient shorthand within the scope of this class for setting the result of the script.
///
/// This code was generated automatically.
/// </summary>
enum ScriptResults
{
    Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
    Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
};
#endregion
Then I edited the precedence constraint and wrote the expression @FileCnt == 25, which means the next script task will only be executed when the number of files in the folder is 25. What I want now is for the first script task to keep running in a loop until the folder has 25 files. I think we need to use a loop container here. Do you see what I'm looking for now?
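If you'd rather keep the polling inside the script task itself instead of wiring up a loop container, here is a minimal C# sketch. It assumes the standard script-task template (with the generated ScriptResults enum) and the same User::FolderPath, User::prefix and User::FileCnt variables as above; the 5-second wait between checks is an arbitrary choice.

public void Main()
{
    string folderPath = Dts.Variables["User::FolderPath"].Value.ToString();
    string prefix = Dts.Variables["User::prefix"].Value.ToString();

    // Poll the folder until it contains at least 25 matching files
    int fileCnt = System.IO.Directory.GetFiles(folderPath, prefix + "*").Length;
    while (fileCnt < 25)
    {
        System.Threading.Thread.Sleep(5000);  // wait 5 seconds before checking again
        fileCnt = System.IO.Directory.GetFiles(folderPath, prefix + "*").Length;
    }

    // Hand the final count to the rest of the package
    Dts.Variables["User::FileCnt"].Value = fileCnt;
    Dts.TaskResult = (int)ScriptResults.Success;
}

Bear in mind the task will block until the files arrive, so in a real package you may also want a timeout or a maximum number of attempts.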

Copying default external configuration on first run of Grails web app

In our Grails web applications, we'd like to use external configuration files so that we can change the configuration without releasing a new version. We'd also like these files to be outside of the application directory so that they stay unchanged during continuous integration.
The last thing we need to do is to make sure the external configuration files exist. If they don't, then we'd like to create them, fill them with predefined content (production environment defaults) and then use them as if they existed before. This allows any administrator to change settings of the application without detailed knowledge of the options actually available.
For this purpose, there's a couple of files within web-app/WEB-INF/conf ready to be copied to the external configuration location upon the first run of the application.
So far so good. But we need to do this before the application is initialized so that production-related modifications to data sources definitions are taken into account.
I can do the copy-and-load operation inside the Config.groovy file, but I don't know the absolute location of the WEB-INF/conf directory at the moment.
How can I get the location during this early phase of initialization? Is there any other solution to the problem?
There is a best practice for this.
In general, never write to the folder where the application is deployed. You have no control over it, and the next rollout will remove everything you wrote there.
Instead, leverage the built-in configuration capabilities that the real pros use (Spring and/or JPA).
JNDI is the norm for looking up resources like databases, files and URLs.
Operations will have to configure JNDI, but they appreciate the attention.
They also need an initial set of configuration files, and must be prepared to make changes at times as required by the development team.
As always, all configuration files should be in your source code repo.
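For example, in Grails a JNDI-backed data source can be configured in DataSource.groovy. This is only a sketch: the JNDI name java:comp/env/jdbc/myAppDS is an assumption and has to match whatever operations define in the container.

// grails-app/conf/DataSource.groovy (sketch)
environments {
    production {
        dataSource {
            jndiName = "java:comp/env/jdbc/myAppDS"  // hypothetical JNDI name configured by operations
        }
    }
}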
I finally managed to solve this myself by using Java's ability to locate resources placed on the classpath.
I took the .groovy files that are later to be copied outside, placed them into the grails-app/conf directory (which is on the classpath) and appended a suffix to their names so that they wouldn't get compiled when the application is packaged. So now I have *Config.groovy files containing configuration defaults (for all environments) and *Config.groovy.production files containing defaults for the production environment (overriding the precompiled defaults).
Now - Config.groovy starts like this:
grails.config.defaults.locations = [ EmailConfig, AccessConfig, LogConfig, SecurityConfig ]

environments {
    production {
        grails.config.locations = ConfigUtils.getExternalConfigFiles(
            '.production',
            "${userHome}${File.separator}.config${File.separator}${appName}",
            'AccessConfig.groovy',
            'Config.groovy',
            'DataSource.groovy',
            'EmailConfig.groovy',
            'LogConfig.groovy',
            'SecurityConfig.groovy'
        )
    }
}
Then the ConfigUtils class:
import java.util.logging.Logger

import org.apache.commons.io.FileUtils

public class ConfigUtils {

    // Log4j may not be initialized yet
    private static final Logger LOG = Logger.getGlobal()

    public static def getExternalConfigFiles(final String defaultSuffix, final String externalConfigFilesLocation, final String... externalConfigFiles) {
        final def externalConfigFilesDir = new File(externalConfigFilesLocation)
        LOG.info "Loading configuration from ${externalConfigFilesDir}"

        if (!externalConfigFilesDir.exists()) {
            LOG.warning "${externalConfigFilesDir} not found. Creating..."
            try {
                externalConfigFilesDir.mkdirs()
            } catch (e) {
                LOG.severe "Failed to create external configuration storage. Default configuration will be used."
                e.printStackTrace()
                return []
            }
        }

        final def cl = ConfigUtils.class.getClassLoader()
        def result = []

        externalConfigFiles.each {
            final def file = new File(externalConfigFilesDir, it)
            if (file.exists()) {
                result << file.toURI().toURL()
                return
            }

            def error = false
            def defaultFile
            final def defaultFileURL = cl.getResource(it + defaultSuffix)
            if (defaultFileURL) {
                defaultFile = new File(defaultFileURL.toURI())
                error = !defaultFile.exists()
            } else {
                error = true
            }
            if (error) {
                LOG.severe "Neither of ${file} or ${defaultFile} exists. Skipping..."
                return
            }

            LOG.warning "${file} does not exist. Copying ${defaultFile} -> ${file}..."
            try {
                FileUtils.copyFile(defaultFile, file)
            } catch (e) {
                LOG.severe "Couldn't copy ${defaultFile} -> ${file}. Skipping..."
                e.printStackTrace()
                return
            }

            result << file.toURI().toURL()
        }

        return result
    }
}

Read the files at a specific commit with libgit2sharp

There is a bare repository, I have a commit id, and I want to read all the files at that commit without cloning.
The repository.Lookup<Tree>(repository.Commits.First().Tree.Sha) code gives me only the top-level files of the commit, but I also want the other files that exist at that revision.
How can I do that?
My understanding of your question is that you want to access the whole content of a commit, not only its first level. The code below will work against a bare (or a standard) repository and allows one to recursively access and examine the content of a commit.
To make it easier for you to test drive it, it dumps information (git object metadata along with blob content) to the console output.
RecursivelyDumpTreeContent(repo, "", commit.Tree);

[...]

private void RecursivelyDumpTreeContent(IRepository repo, string prefix, Tree tree)
{
    foreach (var treeEntry in tree)
    {
        var path = prefix + treeEntry.Name;
        var gitObject = treeEntry.Target;
        var meta = repo.ObjectDatabase.RetrieveObjectMetadata(gitObject.Id);

        Console.WriteLine("{0}\t{1}\t{2}\t{3}\t{4}", gitObject.Id, treeEntry.Mode, treeEntry.TargetType, meta.Size, path);

        if (treeEntry.TargetType == TreeEntryTargetType.Tree)
        {
            RecursivelyDumpTreeContent(repo, path + "/", (Tree)gitObject);
        }

        if (treeEntry.TargetType == TreeEntryTargetType.Blob)
        {
            Console.WriteLine(((Blob)gitObject).GetContentText());
        }
    }
}
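Since you start from a commit id in a bare repository, a usage sketch could look like this (the repository path and commit id are placeholders, and it assumes the RecursivelyDumpTreeContent method above is in scope):

// Open the bare repository, resolve the commit by its id, then walk its tree
using (var repo = new Repository("/path/to/bare-repo.git"))
{
    string commitSha = "<your commit id>";        // placeholder
    var commit = repo.Lookup<Commit>(commitSha);
    RecursivelyDumpTreeContent(repo, "", commit.Tree);
}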
If you know the exact path of the file you'd like to access, use the indexer exposed by the Commit type to directly retrieve the GitObject you're after.
For instance:
var blob = commit["path/to/my/file.txt"].Target as Blob;

How can I build all build.xml (.java) files under a directory

We needed to automate testing that all of the Java samples we ship compile properly. We need it to build all files without our listing each one. Listing each one means if someone forgets to add a new one (which will happen someday), explicit calls will miss it. By walking all build.xml files, we always get everything.
Doing this is pretty easy:
1. Install the samples on a clean VM (which we revert back to the snapshot for each test run).
2. Create a build.xml file that calls all the build.xml files installed.
3. Use ant to run the generated build.xml.
Step 2 requires a means to generate the build.xml file. Is there any way to tell ant to run all build.xml files under a sub-directory, or to create a build.xml that calls all the underlying build.xml files?
It sounds like what you want to do is run the same build process for a number of sub-projects that (hopefully) follow a standard layout pattern.
If that's the case, you can create a single build.xml that knows how to compile those projects, and make a top-level build script which finds all the sub-directories and then calls the common build script in each one. Subant was tailor-made for this, and it doesn't require a magic C# program to generate scripts in each directory.
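For instance, a hedged top-level build file using subant might look like this; it assumes each sample's build.xml exposes a compile target, so adjust the target name to whatever your samples actually define:

<project name="BuildAll" default="compile">
    <target name="compile">
        <!-- Run the compile target of every build.xml found below this directory -->
        <subant target="compile">
            <fileset dir="." includes="**/build.xml" excludes="build.xml"/>
        </subant>
    </target>
</project>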
We couldn't find anything, so we wrote a program that creates a build.xml that calls all build.xml files under a directory. The full solution is at Windward Wrocks (my blog).
The code is (yep, using C# to create a build file for Java):
using System;
using System.IO;
using System.Text;
using System.Xml;
using System.Xml.Linq;

namespace BuildJavaTestScript
{
    public class Program
    {
        /// <summary>
        /// Build build.xml for all build.xml files in sub-directories.
        /// </summary>
        /// <param name="args">Optional: build.xml root_folder</param>
        public static void Main(string[] args)
        {
            string projFile = Path.GetFullPath(args.Length > 0 ? args[0] : "build.xml");
            string rootDirectory = Path.GetFullPath(args.Length > 1 ? args[1] : Directory.GetCurrentDirectory());
            Console.Out.WriteLine(string.Format("Creating build file {0}", projFile));
            Console.Out.WriteLine(string.Format("Root directory {0}", rootDirectory));

            XDocument xdoc = new XDocument();
            XElement elementProject = new XElement("project");
            xdoc.Add(elementProject);
            elementProject.Add(new XAttribute("name", "BuildAll"));
            elementProject.Add(new XAttribute("default", "compile"));

            XElement elementTarget = new XElement("target");
            elementProject.Add(elementTarget);
            elementTarget.Add(new XAttribute("name", "compile"));

            XElement elementEcho = new XElement("echo");
            elementTarget.Add(elementEcho);
            elementEcho.Add(new XAttribute("message", "Build All: jdk = ${java.home}, version = ${java.version}"));
            // add build.xml files - recursively
            AddBuildXmlFiles(elementTarget, rootDirectory, rootDirectory);

            Console.Out.WriteLine("writing build file to disk");

            // no BOM
            using (var writer = new XmlTextWriter(projFile, new UTF8Encoding(false)))
            {
                writer.Formatting = Formatting.Indented;
                xdoc.Save(writer);
            }
            Console.Out.WriteLine("all done");
        }

        private static void AddBuildXmlFiles(XElement elementTarget, string rootDirectory, string folder)
        {
            // add build.xml files
            foreach (string fileOn in Directory.GetFiles(folder, "build.xml"))
            {
                string filename = Path.GetFileName(fileOn);
                string workingFolder;
                if (folder.StartsWith(rootDirectory))
                {
                    workingFolder = folder.Substring(rootDirectory.Length).Trim();
                    if ((workingFolder.Length > 0) && (workingFolder[0] == Path.DirectorySeparatorChar || workingFolder[0] == Path.AltDirectorySeparatorChar))
                        workingFolder = workingFolder.Substring(1);
                }
                else
                    workingFolder = folder;

                if (workingFolder.Length == 0)
                    continue;

                XElement elementExec = new XElement("ant");
                elementExec.Add(new XAttribute("dir", workingFolder));
                elementExec.Add(new XAttribute("antfile", filename));
                elementTarget.Add(elementExec);
            }

            // look in sub-directories
            foreach (string subDirectory in Directory.GetDirectories(folder))
                AddBuildXmlFiles(elementTarget, rootDirectory, subDirectory);
        }
    }
}
