How to set up an EDI Server

I was tasked to check out the feasibility of doing in-house EDI, as the third-party costs are getting out of hand.
In doing web searches on the subject, there is a lot of info about the various document types and formats and creating them from XML or database files. This looks pretty straightforward. However, I don't see much on the subject of server-to-server communication.
The question is: what does it take to set up such a server? I am looking for a third-party component that I can run on a Windows server as a server (I see it like an IIS server, which just sits there and waits for incoming connections, does the handshake, and accepts the file). The only thing I have found so far is that MS BizTalk Server includes EDI capability.
I have also found Edidev.com which has an AS2 server which looks like it might fit the bill.
I am completely new to this area, and don't want to miss anything important.

There are many different options available when talking about EDI.
The primary components involved in a full EDI setup are a Translation Tool, a Job Scheduler, a Managed File Transfer (MFT) Solution, and a Server.
Now, what most EDI professionals, primarily EDI outsourcing companies, want the public to believe is that setting up your own in-house EDI solution is extremely difficult and will cost an arm and a leg.
Granted, some outsourced options out there really do cost an arm and a leg, like Gentran, GXS OpenText, IBM Sterling, and the other mainstream EDI service providers. Others, like Liaison, 1EDI Source, and the like, are quite a bit more cost-effective and offer various support packages, including cloud-based solutions.
The overall truth of the matter, though, getting back to the original point on setting up your own in-house solution, is that everything these EDI service providers offer can be done in-house at a fraction of the cost. There are open-source solutions, free cloud service options, and off-the-shelf software that can be installed, implemented, and integrated with your business systems.
The real cost in this scenario is the in-house EDI specialist to run, maintain, and implement new relationships with your newly built EDI solution.
One example solution, built entirely from open-source software (free to download, install, and use):
Translation Tool: Bots EDI Translator - bots.sourceforge.net/en/index.shtml
MFT Solution: Waarp - sourceforge.net/projects/waarp/
AS2 Server: OpenAS2 - sourceforge.net/p/openas2/wiki/Home/
There are many other options out there that can be pieced together and customized to fit your specific business needs.
EDI simply defined is the translation and transmission of electronic business documents from one company to another.
The Translation Tool simply takes a data field from one file and puts it in the predefined field of the other file.
The Managed File Transfer is the GUI you want to use in order to view transactions, resend transactions, manually download previous transactions, etc.
The File Transfer portion is simply setting up the communication settings between your company and the company you want to exchange files with. It can be configured to use FTP, SFTP, AS2, or even email distribution lists.
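To make the translation piece concrete, here is a minimal sketch in Python of that field-to-field mapping idea. The field names and mapping table are made up for illustration; a real translator drives this from trading-partner-specific map definitions.

    # Minimal sketch of what a translator does at its core: take fields
    # from an inbound record and place them into the predefined fields of
    # the outbound document. All field names here are illustrative only.
    FIELD_MAP = {
        "po_number": "BEG03",   # hypothetical inbound -> outbound slots
        "po_date": "BEG05",
        "buyer_id": "N104",
    }

    def translate(inbound_record: dict) -> dict:
        """Copy each inbound field into its predefined outbound field."""
        return {dest: inbound_record[src]
                for src, dest in FIELD_MAP.items()
                if src in inbound_record}

    # Example: a record parsed from a CSV export or database row
    print(translate({"po_number": "4500012345", "po_date": "20240131"}))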
I am the former Business Manager of one of those 3rd Party EDI Service Providers, and was amazed at how the industry was able to trick the users into thinking that EDI was so complex and hard to implement, maintain, or even understand.
I am still in the EDI Industry, and currently work as the Business Analyst for a Manufacturing Company doing, you guessed it, In-House EDI.

Most traditional EDI software houses have "enterprise"-scale integration options that run on (or as) a server. Gentran, TrustedLink, BizTalk... all names in that space, and usually a sizable (expensive) investment.
What I use here is Liaison's Delta (translation) and ECS (communication). Both run as client/server. The translation software is a true Windows drag-and-drop any-to-any mapper that can handle all integration scenarios. This commercial software would run you around $20k. We are currently supporting well over 100 trading partners, doing about 3,000 batches of data per day. The system is integrated with our ERP and handles not only EDI but XML, flat-file, and CSV data as well. You might be interested in reading this: My Case Study
If you have your own translation engine (parser) and just want a piece for communication, you can still check out ECS, Cleo Lexicom, or Axway. All have Managed File Transfer solutions that will work for you, and run as a Windows service.
So our server handles AS2 communication, picks up files on schedules, sends data via FTP and FTPS, handles web services via HTTP, and has client utilities to show data coming in and out of the system. It also automatically generates the 997 acknowledgment for inbound transactions. Setup of ECS is very easy. Learning a translator - any translator - can be a daunting task. There are quirks to every one of them. That's where the time will be invested.
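As a rough illustration of the 997 piece (not how ECS actually implements it), here is a Python sketch that reads the functional group header of an inbound X12 interchange and builds a bare-bones acceptance 997. A real translator would also emit AK2/AK5 loops per transaction set, manage control numbers, and wrap the 997 in its own ISA/GS envelope.

    # Bare-bones sketch of generating an X12 997 functional acknowledgment
    # for an inbound interchange. Illustrative only.
    def build_997(inbound_x12: str) -> str:
        segments = [s.strip() for s in inbound_x12.split("~") if s.strip()]
        gs = next(s for s in segments if s.startswith("GS*")).split("*")
        st_count = sum(1 for s in segments if s.startswith("ST*"))
        ack = [
            "ST*997*0001",
            f"AK1*{gs[1]}*{gs[6]}",                     # group code + control number
            f"AK9*A*{st_count}*{st_count}*{st_count}",  # accept all transaction sets
        ]
        ack.append(f"SE*{len(ack) + 1}*0001")           # segment count incl. ST and SE
        return "~".join(ack) + "~"

    sample = "GS*PO*SENDER*RECEIVER*20240131*1200*42*X*004010~ST*850*0001~SE*2*0001~GE*1*42~"
    print(build_997(sample))  # ST*997*0001~AK1*PO*42~AK9*A*1*1*1~SE*4*0001~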

This is a really old question (but then, EDI is a REALLY old technology).
Your transmission software is what is internet-facing, and most of your trading partners will use a form of FTP, so your translation software will just pick up the files from their FTP mailbox. NOTE that many trading partners insist on a Drummond-certified AS2 server, so this could be the most expensive part of a small EDI setup.
Anyway, the way to set this up is to have your FTP/SFTP/FTPS server (Cleo Lexicom or something similar) accept and/or send the files, and your translation software take over from there.
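The hand-off itself can be as simple as a scheduled job that sweeps the communication server's receive directory into the translator's input directory. A minimal sketch with made-up paths (your MFT tool or job scheduler would normally do this for you):

    # Minimal hand-off sketch: the FTP/SFTP/AS2 server drops received files
    # into one directory, and this job moves them to where the translation
    # software picks them up. Paths are illustrative only.
    import shutil
    import time
    from pathlib import Path

    INBOX = Path("C:/edi/comms/received")         # written by the comms server
    TRANSLATOR_IN = Path("C:/edi/translator/in")  # polled by the translator

    while True:
        for f in INBOX.glob("*.edi"):
            shutil.move(str(f), str(TRANSLATOR_IN / f.name))
        time.sleep(30)  # in practice, use your job scheduler or a watcher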

Related

What is meant by the phrase adapter/connector?

This is a basic question. I want to apply for an entry-level Java developer position with the following requirement:
Familiarity with the Sailpoint Identity IQ standard adapters/connectors
By standard connectors, do they basically mean how SailPoint exchanges data with third-party tools? And by adapter, do they mean that the adapter pattern would be used? Thanks
This is probably going to appear well after your interview - but to answer the question:
1) Standard adapters/connectors:
SailPoint ships with a "standard" set of connectors which are part of the purchase price; there are others, e.g. EPIC, which do not ship as part of the standard product and must be enabled separately. To give you a deeper view into connectors:
Connectivity Methods:
Direct Connectivity - This is where a connector communicates directly with a system using APIs or data sources. Some advantages of direct connect are that you don't have to generate or transmit files, and you can be more efficient by processing only things that have changed. Some disadvantages are that they are subject to availability and downtime concerns like any connected system. They are also typically subject to whatever advantages and disadvantages the APIs themselves impose.
Some people also refer to this as an 'online' method of connectivity.
File-Based Connectivity - This is where a connector reads from a snapshot of data presented in a file, rather than connecting directly to the system. Some advantages of using a file are that files are portable, easily inspected for data issues, and not typically subject to availability concerns. Some disadvantages are that files are usually processed in their entirety, and may require processing or transformation to work effectively.
Some people also refer to this as a 'decoupled' or 'offline' method of connectivity.
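To make the contrast concrete, here is a small Python sketch of the two methods. This is illustrative pseudocode with made-up endpoints and file names, not SailPoint connector code.

    # Illustrative contrast between the two connectivity methods above.
    import csv
    import json
    import urllib.request

    def direct_aggregation(api_url):
        """'Online' method: call the target system's API directly."""
        with urllib.request.urlopen(api_url) as resp:
            return json.load(resp)  # can fetch only what has changed

    def file_based_aggregation(snapshot_path):
        """'Offline' method: read a snapshot file exported by the system."""
        with open(snapshot_path, newline="") as f:
            return list(csv.DictReader(f))  # processed in its entirety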
Connector Implementations
Source-Specific Implementation - These are connectors built with a specific target system in mind. They typically use specific APIs targeted at the system they integrate with. Because the systems and APIs are known, these typically require less configuration to get working.
Examples of these are Active Directory, Workday, Salesforce, SAP, etc.
General Implementation - These are general-purpose connectors which can be used to connect to a variety of sources or systems. These tend to be more flexible in general, but typically do require a bit more setup and configuration to meet needs.
Examples of these are Web Services, SCIM, JDBC, Delimited Files, etc.
Custom Implementation - These are completely custom connectors and tailored to the system and API of your choice. This approach offers the most flexibility of all connector options, however making custom connectors is definitely a development-level activity, and is not to be taken lightly. The code written for custom connectors is maintained and supported by the customer who owns the connector.
Examples of these are custom in-house applications, etc.
Understanding these connector implementations is important, because if a source-specific implementation isn't available, another general or custom connector implementation may be used instead.

Localisation platforms and translation services

I am currently conducting some research on externalised translation services, and how to integrate them into our development workflow.
I have come across various services, and find it difficult to compare them:
transifex
crowdin
localizejs
tran.sl
oneskyapp
smartling
We are managing a large content website, using 2 methods:
gettext for the "static" text
different versions of the content (1 for each language) managed through a CMS.
The difficulty for us is commissioning translations manually; it just doesn't work well. We would like to automate the process instead:
whenever the gettext files are updated, content is sent automatically to a translation service.
whenever the content is updated, it is also pushed to a translation service.
It seems that all services above are designed to meet those requirements. So the question is which criteria to use to compare those various services?
The answer has a couple of different aspects.
Firstly, it will depend on the specific CMS you are using and how much you want to change your system. Namely, some of those services have to incorporate your website into their system, or incorporate quite a few things into your system, to achieve that degree of full automation. So you'll have to check with each one, I guess (although I have no direct experience of these services).
Otherwise, you might be advised to consult a localisation agency, choose a localisation plug-in for your CMS, and settle for a degree less automation. The process has to detect the changes, export the modified pages/contents as an XML or XLIFF file, and send the file to the translation service; it also has to recognise receipt of the translated file and re-import it. How much of that will happen without a supportive, prompting click or two from you, I'm not sure.
(The gettext PO files can usually be provided direct to a translator or agency who import them into their translation memory system and recognise changes since the last time they had them, only translating the changed segments. Here it pays off to work with the same translator or agency over time.)
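A small Python sketch of that "only the changed segments" idea from the parenthetical, using the polib library to compare the PO file you sent last time against the current one (file names are made up for illustration):

    # Sketch: find entries that are new or still untranslated since the
    # last hand-off to the translator. Requires: pip install polib
    import polib

    previously_sent = {e.msgid: e.msgstr
                       for e in polib.pofile("messages_last_sent.po")}
    current = polib.pofile("messages_current.po")

    to_translate = [e for e in current
                    if e.msgid not in previously_sent or not e.msgstr]
    for entry in to_translate:
        print(entry.msgid)  # ship just these segments for translation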
Secondly, it depends on how many languages you need. All of the above gets potentially more complicated when you are translating into more than one language.
Thirdly, who is translating? The services that offer real 'crowdservicing' often end up in a scenario where very short pieces of text without context are delivered to whichever translator happens to be awake and online at that moment. My recommendation would be specialised localisation agencies working with the relevant CMS plug-ins who should be able to offer higher quality by giving your contents to the same translator every time, or one of a small team, who then get a chance to develop a feel for your contents and translate accordingly.
Hope that helps,

COM servers in Delphi service applications

This is a somewhat general question but I'm hoping someone will have specific info or recommendations.
I have an application suite that includes a service application that acts as a communications interface and data historian for industrial pollution-control hardware. The service contains a singleton COM server to allow the rest of the suite to have access to the hardware and data via the service.
I've read the stuff about how SvCom is required to make COM servers work in Delphi service apps. I have and use SvCom - it does what it claims. But I'm not all that comfortable with it, the product and my coding styles and expectations don't match, and it makes debugging somewhat more of a headache.
But my real problem is with the idea that the lengths SvCom goes to to make a COM server work in a service app are absolutely required. Their documentation, and some of the stuff that comes up in searches on the subject, makes it sound like their toolbox is required for any COM-server-in-service scenario. But I have a couple of different 3rd-party libraries for implementing OPC servers, Prosys Sentrol and the older Production Robots library (if you're not familiar with OPC, it's a pretty much ubiquitous data-interchange standard built on COM), and both support putting the OPC COM server in a standard TService-based app without special handling, beyond doing in the AfterInstall and BeforeUninstall events what would normally be done in a stand-alone EXE when run with the /regserver or /unregserver command-line switches, and of course setting DelayInitialize := True. So at least SOME COM servers can be done as typical TService-based apps without the extraneous steps SvCom goes through.
So my question is: is the line between "what sorts of COM servers work in a TService-based app" and "what causes the need for the extra stuff SvCom does" clearly known? If so, what is it and/or where is it documented? If not, I'm kind of surprised - implementing COM servers in service apps seems like a fairly common need, but I've done several deep searches and, based on the dearth of info I've found on the subject, maybe it's not.

Web service documentation (schemas, locations) discovery in SOA

I need to make a recommendation on approaches for allowing web service (WCF) documentation (wsdl, schemas, locations etc.) to be stored and found. Being able to monitor the services would be a definite bonus.
This needs to be considered in the wider context of moving to an SOA built, where possible, with Microsoft technologies that should be accessible by clients from other frameworks. The aim is to develop a system in which clients do not need to change if a service is moved or new versions are brought online - it should be possible to write the client 'knowing' just one address / location which is capable of directing them appropriately.
Having a central location for the service documentation is important too; our Business Analysts should be able to find all they need to about the services we provide from a central place. We would also want (potentially) to expose that repository of service information to partners as well. I know we could generate wsdls and manually manage them (create a folder somewhere and zip them up before sending them out) but that seems very labour intensive and prone to error (on my part).
As I see it at the moment there are two broad approaches:
Write something bespoke that uses WS-Discovery and a dynamic routing service which can respond to client requests.
Get an off the shelf solution.
I have to say that an off-the-shelf solution is the most likely approach to be accepted, but I have to at least consider the alternatives. For the off-the-shelf solutions I have identified
BizTalk
WSO2 ESB and WSO2 Governance Registry
as possibly providing the features.
What I need to know
Am I right with my understanding of the broad approaches?
Are there any other approaches I should consider evaluating?
Specifically I also need to know pros and cons of any approach I consider and have an idea of how it could be implemented.
To start with, I would definitely not go with BizTalk or any WS-Whatever SOAP-based protocol.
Go simpler and you'll be a happy man in the end.
For the middleware I would go with MassTransit,
or, if you prefer, NServiceBus, which I'm not a big fan of, but which provides another level of enterprise support. If you choose to go with event-driven SOA you get async operations as a bonus.
With the middleware layer defined, it is time to define the API layer. I would not expose my services to the outside world, and if the middleware is event-based, the services within it can only respond to events placed on the bus. So I would use ASP.NET Web API with a REST interface to receive requests from the outside, and based on the request type create the related message (command) and place it on the bus.
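A minimal sketch of that pattern, written in Python rather than ASP.NET Web API purely for brevity; the in-memory queue stands in for the MassTransit/NServiceBus transport, and all names are made up:

    # Thin API layer over a message bus: each HTTP request becomes a
    # command message; the services behind the bus react asynchronously.
    import json
    import queue

    bus = queue.Queue()  # stand-in for the real bus transport

    def handle_http_post(path, body):
        """Validate the request, wrap it as a command, publish, ack."""
        command = {"type": "PlaceOrder", "payload": body}  # made-up command
        bus.put(json.dumps(command))   # fire-and-forget onto the bus
        return 202                     # Accepted: work continues async

    handle_http_post("/orders", {"orderId": 1, "sku": "ABC"})
    print(bus.get())  # a service behind the bus would consume this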
Way too high level, but I hope it helps.

Windows Azure for web developers vs Amazon EC2

I just watched the Windows Azure intro video and it left me feeling like it was a front-end shell for hosted IIS instances. Can anyone who knows more (possibly from being part of the beta) shed light on why you would use this vs. EC2?
It seemed easy enough, but the video really didn't give specifics on how it works, why it works, or why you would use this vs. the traditional solutions out there.
According to the vision (and I can only talk about the vision here since the product isn't really out yet), here's a couple of reasons you might consider Azure over EC2.
Azure includes built-in load balancing abilities. If you want to do that in Amazon, you have to roll your own solution or buy a third-party solution like www.RightScale.com.
Azure-friendly-coded apps can be delivered internally or in Microsoft's cloud. If you write apps that have confidential information like financial data or health care data, not all of your clients will be willing to put their data in the public cloud. In that case, they can deploy your apps internally on Windows. That's sold as a skillset win, because you can go from public to private projects. Don't get me wrong - if you master Amazon EC2 development, then you can deploy your apps internally with Linux virtual servers in your datacenter, but it's not as turnkey. (Hard to describe a tech preview as turnkey when it's not licensed yet, hahaha.)
Having said that, it wasn't clear that the load balancing functionality is included in the box with internal deployments. If you have to do a combination of Azure plus ISA Server, that'll be a tougher deployment and management sell.
AppHarbor is a .NET cloud hosting environment that sits on Amazon EC2. The nice thing is they offer a free plan (much like Heroku does) so you can check it out yourself with very little friction.
My company is using Amazon EC2 now and I am down at the PDC watching the details on Azure unfold. I have not seen anything yet that would convince us to move away from Amazon. Azure definitely looks compelling, but the fact is I can now utilize Windows and SQL server on Amazon with SLAs in place. Ray Ozzie made it clear that Azure will be changing A LOT based on feedback from the developer community. However, Azure has a lot of potential and we'll be watching it closely.
Also, Amazon will be adding load balancing, autoscaling and dashboard features in upcoming updates to the service (see this link: http://aws.amazon.com/contact-us/new-features-for-amazon-ec2/). Never underestimate Amazon as they have a good headstart on Cloud Computing and a big user base helping refine their offerings already. Never underestimate Microsoft either as they have a massive developer community and global reach.
Overall I do not think the cloud services of one company are mutually exclusive from one another. The great thing is that we can leverage all of them if we want to.
Microsoft should offer up the ability to host Linux based servers in their cloud. That would really turn the world upside down!
Well, it's more than just web services. It will also allow you to host other types of connected applications. Plus it provides integrated access to other MS software in the cloud, e.g. SharePoint, Exchange, CRM, SQL Data Services, and will allow you to fully customize and extend those offerings in the same way you would if they were hosted on-premises.
At the Architect Insight Conference last year they mentioned that they have started to alter core server products to deal with the large-scale failover environment, which is very interesting to me at least.
It's a bunch of stuff that is coming into the cloud. I think of this as more of a platform in the cloud.
Sql Server
CRM
MOSS
Exchange
BizTalk
Geneva (identity)
The terms that are mentioned here are "STORE" and "COMPUTE".
For me this gets really interesting around the idea of an Internet Service Bus.
It is also about moving the development workflow process along too:
Oslo DSLs and Quadrant - moving to a model-driven view
Entity Framework - giving developers a strongly typed model in code at the click of a button
ADO.NET Data Services and Dynamic Data web templates using MVC
Then, with the Azure templates and the new "Web Roles", deployment of applications moves to the cloud.
Then, for the admins, one-click provisioning of servers is awesome.
On the data privacy rules... which is the one big elephant in the room and has been mentioned... typically there is often a ruling in each country about information security.
UK RIPA
US Patriot Act
Are these really conceptually different? And these two countries do share information anyway... IMHO (legally they are different, but to a customer both laws give access to customer data; it's just a question of who).
At this point, information on Windows Azure is pretty scarce. I was in the keynote during the announcement, and my best guess at this point is that they're trying to provide a more extensive virtualization environment than simply hosted IIS instances.
At this point, though, I can't say more than that.
We use S3 for storage very successfully, and I've always kept an eye on EC2 for Windows and SQL Server support. So now that these are available, I dug further.
I was pretty worried when I read this:
http://www.brentozar.com/archive/2008/11/bad-storage-performance-on-amazon-ec2-windows-servers/
Perhaps, as we're developing what will hopefully become a very popular website, we should be considering the new data store models - Azure's or Amazon's SimpleDB. Hmmmmm - complete rewrite!
The major difference going forward is that Amazon EC2 is free starting today, Nov 1. Check this out:
http://www.buzzingup.com/2010/10/amazon-announces-free-cloud-services-for-new-developers/
