I have created a simple process, consisting of one task. The task requires a Dynaform to be completed and a trigger is executed in the After Dynaform event to send an email to a group.
The process functions as intended when I create cases from within ProcessMaker: the case is created and the email is sent. When a case is created via web entry, however, the trigger doesn't execute. I have verified this from the logs.
I also tried using the End Event: Email Message, but I see the same behavior, i.e. the mail is sent when starting the case in ProcessMaker but not when using web entry.
Would anyone be able to provide some insight on why this is happening?
It sounds like you are using an old version of ProcessMaker, given that you are talking about Triggers and DynaForms. Are you using Version 3? If so, we recommend that you start looking at Version 4 (4.0.13 is the latest release) - https://github.com/ProcessMaker/processmaker
thanks,
brian
I'm in the process of planning the development of a mail server to handle the sending of email across our multiple websites. Below is a description of what we are planning to implement, and I'd like your opinions/suggestions.
We use ASP.NET MVC and have many websites hosted on Azure. We currently send mail internally within each of the web applications using SMTPServer.Send(). Obviously this is not the ideal way to send emails when you have a decently busy set of websites, because the send-mail call is blocking and cannot guarantee that mails are sent. With this, I'm worried about getting an influx of mail requests when we launch our next website (we think it'll get a decent amount of traffic and lots of emails will be sent).
My plan of action: build a centralised mail server that runs in the background (we use Azure, and this will simply be another web application). When one of our web applications wants to send a mail, instead of doing so internally, it will call a web method on the mail server named sendMail(). This function will accept certain parameters and insert the mail parameters, content, etc. into a database. The mail server will then poll the database at fixed intervals, select a set of unsent emails and attempt to send them using the same SMTPServer.Send() function. If an email fails for some reason, we won't flag it as sent, so in the next poll interval it will be selected again and another send attempt will be made (we will cap the number of send attempts at, say, 20).
This will allow each of the websites to run smoothly, without loads of blocking send-mail calls, while the mail server handles all the sending sequentially, in a controlled environment, as a separate standalone web application.
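Concretely, sendMail() would do nothing more than record the message; something roughly like this (the QueuedEmail class, the EF context and the column names are just placeholders for illustration):

// Sketch only: the mail server's sendMail() persists the request and returns.
public class QueuedEmail
{
    public int Id { get; set; }
    public string To { get; set; }
    public string Subject { get; set; }
    public string Body { get; set; }
    public bool Sent { get; set; }
    public int Attempts { get; set; }        // retries capped at ~20
    public DateTime CreatedUtc { get; set; }
}

public void SendMail(string to, string subject, string body)
{
    using (var db = new MailQueueContext())   // hypothetical EF DbContext
    {
        db.QueuedEmails.Add(new QueuedEmail
        {
            To = to, Subject = subject, Body = body,
            Sent = false, Attempts = 0, CreatedUtc = DateTime.UtcNow
        });
        db.SaveChanges();
    }
}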
Thanks in advance!
Looks like a good design. I don't know the entire scenario that led you to build something like an email server; the problem has already been solved well by existing services like Office 365.
Your design is good. My suggestions would be the following:
You can use Azure WebJobs to build the polling agent. You can make it run as a scheduled WebJob that does the polling and sends the mail, and it can be written very cleanly as a simple console app.
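A scheduled WebJob along these lines would cover the polling and the retry cap you described; this is only a sketch, and the QueuedEmail/MailQueueContext types, SMTP host and batch size are assumptions, not something you've specified:

using System;
using System.Linq;
using System.Net.Mail;

class Program
{
    static void Main()
    {
        // Runs on the WebJob schedule: drain a batch of unsent mail, then exit.
        using (var db = new MailQueueContext())            // hypothetical EF DbContext
        using (var smtp = new SmtpClient("your-smtp-host"))
        {
            var batch = db.QueuedEmails
                          .Where(m => !m.Sent && m.Attempts < 20)
                          .OrderBy(m => m.CreatedUtc)
                          .Take(100)
                          .ToList();

            foreach (var mail in batch)
            {
                try
                {
                    smtp.Send(new MailMessage("noreply@example.com", mail.To,
                                              mail.Subject, mail.Body));
                    mail.Sent = true;
                }
                catch (SmtpException)
                {
                    mail.Attempts++;                       // retried on the next run
                }
                db.SaveChanges();
            }
        }
    }
}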
You can use an Azure API App to host the SendMail() call, and you can use Azure AD auth on the API, through the Authentication and Authorization feature, to authenticate callers and easily secure your email server. You can also enable CORS easily to make sure you can receive requests from your other websites and process them.
Some issues I foresee for you:
Volume and scaling: you can only process so much email between polls. If that isn't enough, you will need to create another polling agent, which makes things complicated because the agents need to know they are picking different sets of emails to send. If your volume is going to be low, you should be fine.
Challenge: why can't the websites send the mail themselves and then record it in the database for tracking? All you have to do is build a module or component that they use on their web pages to create and send the mail. Polymer 1.0 works well for this scenario.
Hope this helps to get you started.
I want an e-mail to be sent by a background process whenever an Invite is generated.
What I currently have is this approach: the Invite model has the method send_mail, which sends an e-mail using the Mandrill API and gem. It also has the method queue_mail, which adds InviteMailer with the invite's ID to the queue using Resque.
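Roughly, what I have looks like this (the Mandrill payload, the invite_url helper and the queue name are simplified placeholders, not my exact code):

# app/models/invite.rb
class Invite < ActiveRecord::Base
  # Sends the invite mail directly through the Mandrill API.
  def send_mail
    mandrill = Mandrill::API.new(ENV['MANDRILL_APIKEY'])
    mandrill.messages.send(
      'subject'    => 'You have been invited',
      'from_email' => 'noreply@example.com',
      'to'         => [{ 'email' => email }],
      'html'       => %(<a href="#{invite_url}">Accept your invite</a>)
    )
  end

  # Puts the delivery on the Resque queue instead of sending inline.
  def queue_mail
    Resque.enqueue(InviteMailer, id)
  end
end

# app/jobs/invite_mailer.rb -- the Resque job that performs the delivery.
class InviteMailer
  @queue = :mailer

  def self.perform(invite_id)
    Invite.find(invite_id).send_mail
  end
end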
However… Since I'm having sort of a really hard time writing specs for this, I assume this might not be the best approach to send mails.
What I mainly want and need to test:
was the mail added to the queue?
is InviteMailer working properly?
does the mail contain the correct vital information?
The vital information is: sent to the correct person, contains a link to a specific site and some specific data/text; also, I'm not sure how to get the current host into the link.
I don't think this is a rare thing to do, so I wonder what the best practices are.
My testing environment: rspec, capybara, factory girl. I already added VCR, to cache the API-request.
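For reference, the kind of spec I've been trying to write looks roughly like this (the factory, queue name and matchers are only illustrative):

# spec/models/invite_spec.rb
require 'rails_helper'

describe Invite do
  let(:invite) { FactoryGirl.create(:invite, email: 'guest@example.com') }

  it 'adds the mail to the Resque queue' do
    expect(Resque).to receive(:enqueue).with(InviteMailer, invite.id)
    invite.queue_mail
  end

  it 'sends the mail when the job is performed' do
    expect_any_instance_of(Invite).to receive(:send_mail)
    InviteMailer.perform(invite.id)
  end

  # Checking the recipient and the link means stubbing Mandrill::API (or
  # replaying the request with VCR) and inspecting the message payload.
end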
You can use Mailcatcher to fake your mail server, and check received mail via web API:
Features
Catches all mail and stores it for display.
Shows HTML, Plain Text and Source version of messages, as applicable.
Rewrites HTML enabling display of embedded, inline images/etc and open links in a new window. (currently very basic)
Can send HTML for analysis by Fractal.
Lists attachments and allows separate downloading of parts.
Download original email to view in your native mail client(s).
Command line options to override the default SMTP/HTTP IP and port settings.
Mail appears instantly if your browser supports WebSockets, otherwise updates every thirty seconds.
Growl notifications when you receive a new message.
Runs as a daemon in the background.
Sendmail-analogue command, catchmail, makes using mailcatcher from PHP a lot easier.
Written super-simply in EventMachine, easy to dig in and change.
How
gem install mailcatcher
mailcatcher
Go to http://localhost:1080/
Send mail through smtp://localhost:1025
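Note that MailCatcher only catches mail delivered over SMTP, not calls made to Mandrill's HTTP API. If you route delivery over SMTP in development/test, pointing a Rails app at it is just:

# config/environments/test.rb (or development.rb)
config.action_mailer.delivery_method = :smtp
config.action_mailer.smtp_settings = { address: 'localhost', port: 1025 }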
API
A fairly RESTful URL schema means you can download a list of messages in JSON from /messages, each message's metadata with /messages/:id.json, and then the pertinent parts with /messages/:id.html and /messages/:id.plain for the default HTML and plain text version, /messages/:id/:cid for individual attachments by CID, or the whole message with /messages/:id.source.
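So a spec (or a quick script) can assert on what was delivered by querying that API, along these lines (the JSON field names are from memory of MailCatcher's responses, so check them against your version):

require 'net/http'
require 'json'

base = 'http://localhost:1080'

# List the caught messages and take the newest one.
messages = JSON.parse(Net::HTTP.get(URI("#{base}/messages")))
last_id  = messages.last['id']

# Metadata (recipients, subject) and the rendered HTML body.
meta = JSON.parse(Net::HTTP.get(URI("#{base}/messages/#{last_id}.json")))
body = Net::HTTP.get(URI("#{base}/messages/#{last_id}.html"))

raise 'wrong recipient' unless meta['recipients'].any? { |r| r.include?('guest@example.com') }
raise 'missing link'    unless body.include?('/invites/')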
I've read most of the other answers on this topic, but a lot of them relate either to third-party services like MailChimp (which I'm not necessarily opposed to) or to how not to upset the host's email server.
I believe this case is unique enough that it'll contribute...
I have my own DigitalOcean droplet running a rails app. I need to send out 100-1000 emails every so often, each with a unique message (a link I'm using for tracking clicks originating from the email).
I'm also operating my own iRedMail server.
Can someone recommend how to best-handle this task? I was going to simply cycle through the list of emails and use the template.html.erb to drop in my link, but what types of problems might I run into?
Thank you!
You should decouple your Rails app from the mail sending so that you don't have to wait in your view for the mails to be sent (assuming that you click on something that triggers the start of your mail sending). Use something like delayed_job or another queueing mechanism that Rails offers, and only queue up the sending job for the e-mails. Then, when the queue comes to execute the particular job, you can customize the message with an HTML part and a text part or whatever else you need, and pass them on individually to your MTA.
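A minimal sketch of that with delayed_job (the TrackingMailer name, template and link format are placeholders for whatever you set up):

# app/mailers/tracking_mailer.rb
class TrackingMailer < ActionMailer::Base
  default from: 'noreply@example.com'

  # Renders app/views/tracking_mailer/campaign_email.html.erb, which is where
  # the per-recipient tracking link gets dropped in.
  def campaign_email(address, tracking_link)
    @tracking_link = tracking_link
    mail(to: address, subject: 'News from us')
  end
end

# Wherever the send is triggered: enqueue one job per recipient and return
# immediately; the delayed_job workers do the actual SMTP traffic later.
recipients.each do |address|
  link = "https://example.com/r/#{SecureRandom.hex(8)}"   # your unique tracking link
  TrackingMailer.delay.campaign_email(address, link)
end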
I'm trying to implement a webmail client in PHP. I would like to write a PHP CLI script which is run on every email arrival and stores some parts of (not all of) the incoming email in a database for search purposes. Then, when the user has finished searching and chooses an email to view, a connection is made to the mail server to retrieve the complete email. In order to implement this scenario I need some way of linking the emails in the database to the messages on the mail server.
Since my knowledge of working with mail servers is limited to Zend Framework's API, what I believe I need in order to retrieve an email from an IMAP server is a message number or a message unique ID (this latter one seems not to be supported by all mail servers).
So far, I've managed to find .forward (and some other ways) to introduce my PHP CLI script to MTAs so that it is run on every email arrival. This way I can store emails in the database. But this won't do, since the message unique ID is created by the MDA, so the MTA does not know of it and cannot provide it to me. This means I cannot find the emails later when I want to retrieve them from the mail server.
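For example, a .forward file along these lines keeps normal delivery to the mailbox (the \username entry) and pipes a copy of every incoming message to the script on stdin (username and path are placeholders):

\username, "|/usr/bin/php /path/to/index_incoming.php"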
Finally, here's my question: is there a way to introduce a PHP CLI script to an MDA so that it runs on email arrival? If this depends on the mail server, which servers support it, and how? My personal choice would be Dovecot or Courier, but any other mail server would do as well.
This is tricky -- there are many ways to set up delivery. Some of them work with the underlying mail store directly, bypassing your IMAP server altogether, while others use e.g. Dovecot's facilities.
Have you considered building on top of the notify plugin which ships with Dovecot?
It seems it's impossible to introduce such a PHP CLI script to the IMAP server (at least I'm sure of that for Dovecot). Anyway, the workaround I found for this problem is to use my own PHP script to insert the new mails into the IMAP server, retrieve their IDs, and then store those IDs in the database for future reference. To be clear, emails are given to my PHP CLI script by the MTA, not the MDA. As I said before, this is done easily using a .forward file.
[UPDATE]
Unfortunately it seems this solution cannot be implemented either. The way to insert a new email into an IMAP server is the APPEND command, and to get the UID of the newly added mail the server must support the UIDPLUS extension. Neither Dovecot nor Courier supports this extension at the moment! If they did, it seems the server would return the UID in an APPENDUID response.
[UPDATE]
My bad: Courier does support UIDPLUS. So this solution is valid, and it's the one I'm going to implement.
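For reference, with UIDPLUS the UID comes back in the tagged OK of the APPEND, roughly like this (mailbox name and numbers are just illustrative, in the style of RFC 4315):

C: a003 APPEND Saved-Messages (\Seen) {310}
S: + Ready for literal data
C: ... message literal ...
S: a003 OK [APPENDUID 38505 3955] APPEND completed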
Is it possible to set up TFS/Test Manager so that it sends out an email after a test fails?
Yes, it is possible but it requires quite a lot of changes/additions to the process template and possibly a custom-made activity.
After tests have run, we check if BuildDetail.BuildPhaseStatus has status failed
We send mail to everyone who has changesets committed to this build, so the build goes through BuildDetail.AssociatedChangesets (you need to have AssociateChangesetsAndWorkItems on) and gets the committer usernames.
Unfortunately for us, there's no good correlation between TFS username and email address at our place, so we had to create a custom activity that looks that up in the AD.
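The lookup activity itself can be quite small; a sketch of the idea using System.DirectoryServices.AccountManagement (simplified, error handling omitted):

using System.Activities;
using System.DirectoryServices.AccountManagement;
using Microsoft.TeamFoundation.Build.Client;

// Custom build activity: maps a TFS username to the email address stored in AD.
[BuildActivity(HostEnvironmentOption.All)]
public sealed class GetEmailFromAd : CodeActivity<string>
{
    public InArgument<string> Username { get; set; }

    protected override string Execute(CodeActivityContext context)
    {
        using (var domain = new PrincipalContext(ContextType.Domain))
        using (var user = UserPrincipal.FindByIdentity(domain, Username.Get(context)))
        {
            return user != null ? user.EmailAddress : null;
        }
    }
}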
The actual email is sent with the BuildReport action from Community TFS Build Extensions. We modified the xslt, but that's not really necessary. We also wanted to include a listing of the failed tests, and that required modification of the action itself (test data isn't included by default).
Looking at this description and all the work made to get this working, I'm beginning to wonder if it was worth it ;).