How to set up EdiFabric to use multiple X12 versions

I've purchased ediFabric and have a question about how to set up a project to use it. The SDK that comes with it has a Rules project with a few Codes, Complex Elements and Segments that are specific to a version of the X12 spec, e.g. 005010. My application uses different versions based on transaction type. For example, 753 & 856 use 005010, while 810 & 855 use 004010.
Should I have one rules project with different folders for the different versions? Or should I have multiple rules projects with one version per project?

You would normally group rules by either version or partner. If your partners conform to the standard specifications and you haven't had to amend the rules in any way, split the rules by version.
The pattern is to have each version in a separate project/assembly; EdiFabric allows only one message per version and type in the same assembly. This is also neat and easy to maintain.
Each rules project will contain the three common files (Segments, Complex Elements and Codes) plus the files for the transactions you will be using. There is no point in adding all the files; you can add more gradually whenever you need them.
Loading the correct specification dynamically can be achieved with a factory; refer to the X12 SDK for sample code or to the documentation.
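As a rough illustration of the factory idea (this is not EdiFabric's actual API - the assembly names, the transaction-to-version mapping and the Resolve method below are all placeholders for the sketch):

using System;
using System.Collections.Generic;
using System.Reflection;

// Hypothetical factory: picks the rules assembly that matches the X12 version
// used by a given transaction type. The assembly names are placeholders for
// your own rules projects.
public static class RulesAssemblyFactory
{
    private static readonly Dictionary<string, string> AssemblyByTransaction =
        new Dictionary<string, string>
        {
            { "753", "MyCompany.Rules.X12_005010" },
            { "856", "MyCompany.Rules.X12_005010" },
            { "810", "MyCompany.Rules.X12_004010" },
            { "855", "MyCompany.Rules.X12_004010" }
        };

    public static Assembly Resolve(string transactionType)
    {
        string assemblyName;
        if (!AssemblyByTransaction.TryGetValue(transactionType, out assemblyName))
            throw new NotSupportedException("No rules assembly registered for " + transactionType);

        // Load the rules project that matches the X12 version for this transaction.
        return Assembly.Load(assemblyName);
    }
}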

Related

How to identify what projects have been affected by a code change

I have a large application to manage, consisting of three or four executables and as many as fifty .dlls. Many of the source code files are shared across many of the projects.
The problem is a familiar one to many of us - if I change some source code I want to be able to identify which of the binaries will change and, therefore, what it is appropriate to retest.
A simple approach would be to compare file sizes. That is an 80% acceptable solution, but there is at least a theoretical possibility of missing something. Secondly, it gives me very little indication of WHAT has changed; it would be ideal to get some form of report so I can filter out irrelevant changes (e.g. dates, versions, copyrights, etc.).
On the plus side:
all my .dcus are in a row - I mean they are all built into a single folder
the build is controlled by a script (.bat), so it is easy, for example, to emit .obj files if that helps
svn makes it easy to collect together any (two) revisions for comparison
On the minus side:
There is no policy to include all used units in all projects; some units get included because they are on a search path.
Just knowing that a changed unit is used/compiled by a project is not sufficient proof that the binary is affected.
Before I begin writing some code to solve the problem I would like to ask the panel what suggestions they might have as to how to approach this.
The rules of StackOverflow forbid me to ask for recommended software, but if anyone has any positive experiences of continuous integration tools that would help - great
I am open to any suggestion or observation that is relevant in this context.
It seems to me that your question boils down to knowing which units are contained in your various executables. Since you are using search paths, it will be hard for you to work this out ahead of time. The most robust way to find out is to consult the .map file that the compiler emits. This contains a list of all units contained in your executable.
Once you know which units are contained in each executable, you need to know whether or not anything has changed in those units. That information is contained in your revision control system. Put this all together and you have the information that you need.
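If you want to automate that cross-check, a small tool along these lines could work. This is only a sketch: it assumes a detailed .map file where unit names appear on lines of the form "Line numbers for MyUnit(MyUnit.pas) ..." (verify this against your own map files), and it expects the list of changed units (e.g. derived from svn diff output) in a plain text file.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

// Sketch: list the units named in a Delphi .map file and intersect them with
// the units your VCS reports as changed between two revisions.
class MapFileCheck
{
    // Assumed format of the detailed .map file - verify against your own files.
    static readonly Regex UnitLine =
        new Regex(@"^Line numbers for (\w+)\(", RegexOptions.Multiline);

    static HashSet<string> UnitsInMap(string mapPath)
    {
        var text = File.ReadAllText(mapPath);
        return new HashSet<string>(
            UnitLine.Matches(text).Cast<Match>().Select(m => m.Groups[1].Value),
            StringComparer.OrdinalIgnoreCase);
    }

    static void Main(string[] args)
    {
        // args[0]: path to the .map file; args[1]: file with one changed unit name per line
        var unitsInBinary = UnitsInMap(args[0]);
        var changedUnits = File.ReadAllLines(args[1]);

        foreach (var unit in changedUnits.Where(u => unitsInBinary.Contains(u.Trim())))
            Console.WriteLine("Retest candidate: " + unit);
    }
}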
Of course, even if the source code for a unit has changed, you might argue that re-testing is not needed. Perhaps the only change made was the version, or the date in a copyright label or some such. But it is asking too much to expect a computer to make such a judgement. At some point you need a human to step up and take responsibility.
What is odd about this though is that you are asking the question at all. It seems to me to be enormously risky to attempt partial testing. I cannot understand why you don't simply retest the entire product.
After using it for more than 10 years of commercial in-house and freelance work on large projects, I can recommend trying Apache Ant. It is a build tool which supports dependencies and has many very helpful features.
Apache Ant also integrates nicely with CI tools such as Hudson/Jenkins, Bamboo etc.
Another suggestion - based on experience with Maven - is to design the general software architecture to be as modular as possible. If modules (single or multiple source or DCU files in one directory) carry a version number in the directory name, it is possible to control exactly how applications are composed from these modules.
If you want to program such a tool yourself, the approach would be something like this:
First you need to detect whether any changes were made to the separate source files. As you already figured out, comparing file sizes is a bad idea, since a file's size can stay the same despite lots of changes (as long as a .pas file contains the same amount of text, its size won't change). Instead you could check each file's last modification time or compute a hash value such as MD5 for comparison (which can be quite slow).
Then you need to generate a dependency tree which tells you which files are used by which project/subproject.
Finally, based on the changes detected in the separate files, you check the dependency tree to see which projects need to be recompiled (a rough sketch of this follows below).
The problem with such an approach is that you would probably have to update the dependency tree manually each time a new unit is added to the project or an existing one is removed from it.
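A minimal sketch of that approach (the file names and the hand-maintained dependency map below are placeholders; in practice the previous hashes would be loaded from and saved to a snapshot file):

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

// Sketch: hash every source file, compare the hashes with the previous run,
// then map changed files to affected binaries through a manually maintained
// dependency table.
class ChangeDetector
{
    static string Md5Of(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(md5.ComputeHash(stream));
    }

    static void Main()
    {
        // Placeholder dependency map: which binaries use which units.
        var dependencies = new Dictionary<string, string[]>
        {
            { "UnitA.pas", new[] { "Project1.exe", "Library2.dll" } },
            { "UnitB.pas", new[] { "Project1.exe" } }
        };

        // Hashes stored by the previous run (file name -> MD5); load from a snapshot in practice.
        var previousHashes = new Dictionary<string, string>();

        var affectedProjects = new HashSet<string>();
        foreach (var file in dependencies.Keys.Where(File.Exists))
        {
            var hash = Md5Of(file);
            string oldHash;
            if (!previousHashes.TryGetValue(file, out oldHash) || oldHash != hash)
                affectedProjects.UnionWith(dependencies[file]);
        }

        foreach (var project in affectedProjects)
            Console.WriteLine("Needs rebuild/retest: " + project);
    }
}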
But the best way would be to use some version control software instead of reinventing the wheel. I myself like the way Git works, and I believe that proper integration of Git into the project manager itself could be quite powerful, due to Git's support for branching/sub-branching (each project is its own branch, and each version of your software can be its own sub-branch).
The latest version of Delphi does have Git integration, but it is done through SVN, which unfortunately limits some of Git's best functionality. So if you decide to integrate Git support directly into Delphi, I'm first in line to use it.

Handling a modular web application

I know this is pretty general, but I couldn't find any suitable information on this topic:
We need to develop a module-based system (ASP.NET MVC) that should be adaptable for multiple customers. Each of the modules can be customized for every company.
Is there some kind of tutorial on how to handle such complex requirements (multiple customers that can have different compositions of modules and different module implementations)?
Can you recommend an approach on how to represent this structure in TFS?
How can deployment be handled when each customer can have a different composition of modules?
Is there a recommended tool to keep track of all the versions that are rolled out to the different servers (staging, customers, ...)?
I would be really glad if someone could shed some light on this topic, or at least give some hints on what exactly to search for!
I am not sure if there are any tutorials, but an MVC module is essentially a class library with views.
You can divide the system into components. Let's say you have a module called Payment; it will include controllers, scripts, and views. To use this module you have two options: 1. create a NuGet package with all the content, or 2. create a zip file with the content and just copy it all into your project.
Using it for each customer is then simple: since everything is divided into separate modules, you can include or exclude them from each project.
As I mentioned, I would use a private NuGet server to handle the packages; then it is really easy to add or remove components with just a few clicks. You can also add a build server that runs unit tests and, if everything passes successfully, publishes a new NuGet package.
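To make the module idea a bit more concrete, one common way to package such a module is as an ASP.NET MVC Area that registers its own routes; the sketch below is only an illustration, and the Payment/Invoices names are placeholders:

using System.Web.Mvc;

// Hypothetical "Payment" module packaged as an ASP.NET MVC Area. Shipping this
// class together with its controllers, views and scripts in a NuGet package
// lets a customer project opt in to the module simply by installing the package.
public class PaymentAreaRegistration : AreaRegistration
{
    public override string AreaName
    {
        get { return "Payment"; }
    }

    public override void RegisterArea(AreaRegistrationContext context)
    {
        // Routes for the module live with the module, not in the host project.
        context.MapRoute(
            "Payment_default",
            "Payment/{controller}/{action}/{id}",
            new { controller = "Invoices", action = "Index", id = UrlParameter.Optional });
    }
}

The host application only needs to call AreaRegistration.RegisterAllAreas() at startup for installed modules to be picked up.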

Divide an app into multiple apps that have different UI designs and share logic code

I have an app which I will call it the "base app". The app works with many brands.
I need now to separate those brands, and to make a distinct app for every brand.
Every app will have a slightly different design (including different images) and here and there maybe some specific-to-a-brand code.
All of the apps should also use the same base code from the "base app" that deals with logic.
I have thought of some options, but I am not sure if any of them suits my needs. I would be happy for some clarification of the differences among the options.
The options I have thought of are:
1) Creating an app for each of the brands and adding the class files from the "base app" as references, while copying the .xib files as actual copies. The problem is that I then do not know how and where to write brand-specific code (because the referenced files will be shared among the others).
2) Creating a workspace that includes the projects for each of the brands. I'm not sure how this works and whether it is correct; I would be glad for help clarifying this.
3) Nesting a "base app" project inside every brand's project. Any help clarifying what this does will be appreciated.
4) Using the base app as a static library which is linked into every brand's project. I'm not sure what will happen with the UI (shared, not shared). I would be glad for help clarifying this too.
5) Simply maintaining each of the brands' projects separately, including the shared code (which will be a disaster, I guess).
The simple solution in iOS is to use targets.
You can create a different target for each brand and then select different resources (images, xibs, etc.) for each target.
Also, if the code changes are minimal, you can refactor parts of your code and create different classes with different implementations for each target (you can use a pattern like a factory), or you can simply use preprocessor macros.
It's not the best, but it is the simplest and quickest approach; if your code changes a lot, it's better to create a core library as the other answers say.
A good approach would be to split your app up into the following components:
Core Model Library
Reusable views & view controllers. The views can be designed to support skinning and customization.
Any other reusable code that can be encapsulated as its own 'identity'.
These core projects should ideally have their own continuous integration (quality control) builds and tests.
And then use CocoaPods
Instead of manually performing all this complex integration, use CocoaPods. CocoaPods will create the Xcode workspace, build the libraries and link them into your project. You then create a custom build just by gluing the pieces together.
In addition to this, CocoaPods also performs tasks such as:
Resolving transitive dependencies - which just means building and fetching any libraries that your libraries themselves use.
Managing versions of the libraries being integrated.
Private Spec Repo is possible, or just use GitHub
The main CocoaPods repository is of course public and contains open-source and/or freely available libraries.
You can host your own CocoaPods spec repository, or simply set up a private GitHub account, and include a PodSpec in each project, then resolve as follows:
pod 'MyLibraryName', :git => 'https://github.com/myOrgName/MyLibrary.git'
This will install all of your libraries into your workspace. To update your project to include any changes to the core libraries, simply:
pod update
Advantages of this approach
You'll have a separate set of quality controls that gets applied to each core project.
There'll be much less repetition.
You can use more automation. More automation equals less waste equals more customer value.
As the team grows, you can split core product development and solution integration into separate roles/teams. A team working on an integration build need not pull the latest library features if that would disrupt them.
You can have two different customers on different builds of the core library. CocoaPods will manage this seamlessly. So you wouldn't necessarily have to update a build, until you get an enhancement request or scheduled maintenance. (Again reducing waste, thus increasing customer value).
Inspired by Piggly Wiggly (but lean through and through)
This approach is modeled after the production-line style approach that was popularized in Japan after World War II. It's called Lean Methodology, and is all about having a fast, small inventory and reducing waste (delivering more with less). Japanese execs got the inspiration for this when they went to America and visited Piggly Wiggly supermarket stores.
This is something you often encounter when creating cheap Flash games or apps.
These have very generic frameworks: kicking a ball, shooting at the screen, generating a list with some data downloaded from a specific server, etc.
Every time they want to create a new shooter game, they just load up their shooting framework, add a bunch of graphics, and can release a crappy game within a day.
How do they do it?
They often create a framework which contains shared models, handlers, interfaces etc.
Put a lot of general utility functions like downloading files etc in a library.
And you can also create some default framework views and view-controllers.
When you want to create a similar app, just import the library and re-use the base framework. Containing base-views, base-models etc.
You can find a good example in the demo examples delivered with the iOS SDK or Android SDK.
Good luck.

What is the recommended naming for scoped units in Delphi XE2 onwards?

With the advent of Delphi XE2 scoped units like Xml.Internal.AdomCore_4_3 or System.StrUtils came in fashion.
I like the ability to use such descriptive names, but I am puzzled about what the naming conventions and preferred directory structure should be.
Should it be
com.company.project.Security.Compression.ZLib.pas like in Java
System.Security.Compression.ZLib.pas like in .NET
or something else?
Should I place my files in a directory structure like this:
System\Security\Compression\System.Security.Compression.ZLib.pas
or just System.Security.Compression.ZLib.pas in the root folder?
Looking at the way Embarcadero organized their units, I am left with the impression that they simply kept the directory structure as in Delphi 5/6/7/.../XE.
Please advise.
I believe that everyone should be scoping their units - not only third-party component vendors - as you want to reduce name collisions with any external units that you utilize. By scoping your units you go a very long way toward preventing unit naming issues.
I'd strongly suggest a global prefix that you will use for all internally-created units. You can use an abbreviated form of your company name or your full name; it's totally your preference. I would recommend simply something readable and easy to type.
If you work in a handful of major projects, then project-specific names would follow such as Acme.Widgets.SlicerUtils.pas for the Widgets product/project and Acme.Wonkers.WippleFactory.pas for the Wonkers product.
Regardless of the naming, this should strongly correlate with how you version your projects and manage your source code. You want to be able to easily set up a build for version 1.2.1.0 of the Widgets project that includes all the units related to that specific build. (1.2.1.0 of Widgets may include an explicit revision 1.5.3.0 of the System library.) This needs to be easily understood by all developers and seamlessly managed. You should aim for one-click operations to begin work on 1.2.1.1 of the Widgets project.
The subdirectory question is answered by the general rule of separating version-controlled projects into their own subdirectories. So if your Acme.System.Security.Compression library is version controlled separately from Acme.System.Security.Auth, then you'll have a root\System\Compression subdirectory structure... but more likely you'll simply have Acme.Widgets.System.
Hope this helps a little.
Unlike in Java and .NET, the "." in the new scoped filenames provides no functionality at run-time or design-time. System.StrUtils.pas could just as easily have been called System_StrUtils.pas or SystemXStrUtils.pas and we'd have exactly the same situation as we have now.
Because of that I don't recommend you adopt the new naming conventions unless perhaps you are a component developer who distributes your components publicly.
Even then I wouldn't name your units System.MyUnit.pas, as I believe that prefix should be limited to the official Delphi System units. If you really have to adopt this new convention (personally I don't think I will), I'd name your units something along the lines of
MyCompany.UnitName.pas
or
MyComponentLib.UnitName.pas
I don't see any need to put VCL or FMX in the names unless you're releasing a dual framework library.

Find differences in a .po file

I have a .po file where most translated strings are identical to original ones. However, few are different. How do I quickly find the ones that differ from original?
Use podiff.
I used it, and it worked for me. It's in C, so you have to compile it; make is your friend.
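If you would rather not compile anything, a few lines of code can do this particular check. Below is a minimal C# sketch that only handles simple single-line msgid/msgstr pairs (no plural forms, multi-line strings or escape handling), so treat it as an illustration rather than a full .po parser:

using System;
using System.IO;
using System.Text.RegularExpressions;

// Rough sketch: print entries whose translation differs from the original string.
class PoDiff
{
    static void Main(string[] args)
    {
        var text = File.ReadAllText(args[0]);

        // Matches simple entries of the form:  msgid "..."  /  msgstr "..."
        var entry = new Regex(@"^msgid ""(?<id>.*)""\r?\nmsgstr ""(?<str>.*)""",
                              RegexOptions.Multiline);

        foreach (Match m in entry.Matches(text))
        {
            var id = m.Groups["id"].Value;
            var str = m.Groups["str"].Value;

            // Skip the header (empty msgid) and untranslated entries.
            if (id.Length > 0 && str.Length > 0 && id != str)
                Console.WriteLine("msgid:  {0}\nmsgstr: {1}\n", id, str);
        }
    }
}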
I would suggest using one of the many web-based localization management platforms. To name a few:
Amanuens (disclaimer: my company builds this product)
Web Translate It
Transifex
GetLocalization
These kinds of platforms allow you to keep your resource files in sync, edit them in a web-based editor (useful for non-technical translators) and, most importantly, to highlight/see only the changed/untranslated strings.
