Concurrency Issue Grails - grails

I am using this code to update a row:
SequenceNumber.withNewSession {
def hibSession = sessionFactory.getCurrentSession()
Sql sql = new Sql(hibSession.connection())
def rows = sql.rows("select for update query");
}
In this query I am updating the number. Initially sequenceNumber is 1200, and every time this code runs it should be incremented by 1.
I am running this code 5 times in a loop,
but the Hibernate session is not being flushed, so every time I get the same number: 1201.
I have also tried
hibSession.clear()
hibSession.flush()
but with no success.
If I use the following code instead, it works fine:
SequenceNumber.withNewSession {
Sql sql = new Sql(dataSource)
def rows = sql.rows("select for update query")
}
Every time I get a new number.
Can anybody tell me what's wrong with the code above?

Try wrapping the code in a transaction and flushing it at the end, like:
SequenceNumber.withTransaction { TransactionStatus status ->
...
status.flush()
}
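To see why the transaction boundary matters, here is a rough stand-alone sketch in plain Java, with no database involved: the ReentrantLock plays the role that the row lock from SELECT ... FOR UPDATE plays inside a transaction. As long as each read-increment-write happens while the lock is held and the lock is only released at "commit", concurrent callers cannot all observe the same stale value. All names here are invented for the illustration.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.locks.ReentrantLock;

// Hypothetical in-memory stand-in for the SequenceNumber row.
class SequenceRow {
    // Plays the role of the SELECT ... FOR UPDATE row lock.
    private final ReentrantLock rowLock = new ReentrantLock();
    private int value;

    SequenceRow(int initial) { value = initial; }

    // "Transaction": lock the row, read, increment, write, then release on commit.
    int nextValue() {
        rowLock.lock();
        try {
            value = value + 1; // read-modify-write under the lock
            return value;
        } finally {
            rowLock.unlock(); // commit releases the row lock
        }
    }

    int current() { return value; }
}

public class SequenceDemo {
    public static void main(String[] args) throws Exception {
        SequenceRow row = new SequenceRow(1200);
        ExecutorService pool = Executors.newFixedThreadPool(5);
        for (int i = 0; i < 5; i++) {
            pool.execute(() -> row.nextValue());
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        System.out.println(row.current()); // 1205: no increment is lost
    }
}
```

The original symptom (always seeing 1201) is the other failure mode: each iteration reads the same uncommitted starting value because the increment is never flushed/committed before the next read.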

Related

Grails - not-null property references a null or transient value on lastUpdated Property

I have a Grails application that is running a periodic job that collects information from several domain objects and then updates records in a table for easy tracking and monitoring.
This project is using Grails 2.1.1
The job runs fine the first two or three times, but then it begins failing. The error that is being thrown is
org.springframework.dao.DataIntegrityViolationException:
not-null property references a null or transient value:
Tracker.lastUpdated;
nested exception is org.hibernate.PropertyValueException:
not-null property references a null or transient value:
Tracker.lastUpdated
So, in my domain I have this:
package arena
import grails.plugin.multitenant.core.annotation.MultiTenant
@MultiTenant
class Tracker {
static belongsTo = [ plan: Plan ]
boolean active
Integer produced
Integer counted
Integer failed
Integer remaining
Float percentDone
Date lastUpdated
static constraints = {
plan(nullable:false)
produced(nullable:true)
failed(nullable:true)
counted(nullable:true)
remaining(nullable:true)
percentDone(nullable:true)
}
}
In the job I get each plan and then update all of the properties except lastUpdated, since that should be automagically updated at the time of the row update. This is causing my job to fail, so not all of the plans' trackers are getting updated accordingly.
Do I need to set lastUpdated manually? I don't have to do that with any other domain objects.
The Job is executed every 2 hours. The gist of the process is that I collect all plans across all the tenant locations. And for each plan I calculate values for each plan and update the tracker record.
def plans = Plan.createCriteria().list {
projections {
property("id")
}
}
// update or create strawTracker objects
for (plan in plans) {
    def collection = Plan.get(plan)
    multiTenantService.doWithTenantId(collection.tenantId) {
        def tracker = Tracker.findByPlan(collection)
        if (!tracker) {
            tracker = new Tracker()
            tracker.merge(updateTracker(tracker, collection)).save(flush: true)
        } else {
            if (tracker.active && !collection.status.equals("done")) {
                // get info
                // println "Update tracker " + collection
                def tempTracker = new Tracker()
                tempTracker.merge(updateTracker(tracker, collection)).save(flush: true)
            }
        }
    }
}

def updateTracker(Tracker tracker, Plan plan) {
    def prodSum = 0
    def failSum = 0
    def countSum = 0
    plan?.schedules?.each {
        // doing calculations here //
    }
    int tenant = plan.tenantId
    tracker.plan = plan
    tracker.active = !plan.status.equals("done")
    tracker.produced = prodSum
    tracker.failed = failSum
    tracker.counted = countSum
    tracker.remaining = plan.quantityOrdered == null ? null : (plan.quantityOrdered - countSum)
    tracker.percentDone = plan.quantityOrdered == null ? null : (countSum * 100 / plan.quantityOrdered)
    tracker.tenantId = tenant
    return tracker
}
As stated before, it runs successfully the first few times; then, as users update plans and fill orders, it begins failing, so some tracker information is updated correctly and some is not.

Grails execute multiple native sql queries asynchronously

I have a service that executes multiple native SQL statements. Is it possible to run those SQL statements asynchronously? My code looks something like this, with multiple queries:
def sessionFactory
sessionFactory = ctx.sessionFactory // this is only necessary if you are working with the Grails console/shell
def session = sessionFactory.currentSession
def query = session.createSQLQuery("select f.* from fee f where f.id = :filter order by f.name");
You can use the new Grails Promises API:
import static grails.async.Promises.*
List queriesToExecute = []
someList.each {
//create task to execute query
def t = task {
//create and execute query here
}
queriesToExecute << t
}
def result = waitAll(queriesToExecute) // if you want to wait until the queries are executed
// or, if you don't want to wait, use this instead: onComplete(queriesToExecute) { List results -> ... }
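If the Promises API is not available, the same task/waitAll shape can be sketched with a plain executor. This is a hand-rolled stand-in, not the Grails API, and runQuery is a dummy placeholder for the real native query:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class AsyncQueries {
    // Placeholder for "create and execute query here"; a real service would
    // run the native SQL and return its rows.
    static String runQuery(String name) {
        return "result-of-" + name;
    }

    // Same shape as task{} + waitAll: one task per query, then block for all.
    public static List<String> runAll(List<String> queries) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Callable<String>> tasks = new ArrayList<>();
            for (String q : queries) tasks.add(() -> runQuery(q));
            // invokeAll returns futures in the same order the tasks were added
            List<String> results = new ArrayList<>();
            for (Future<String> f : pool.invokeAll(tasks)) results.add(f.get());
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

One caveat either way: a Hibernate Session is not thread-safe, so each task should obtain its own session/connection rather than sharing the one from the calling thread.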

Insert 10,000,000+ rows in grails

I've read a lot of articles recently about populating a Grails table from huge data sets, but seem to have hit a ceiling. My code is as follows:
import groovy.sql.Sql

class LoadingService {
def sessionFactory
def dataSource
def propertyInstanceMap = org.codehaus.groovy.grails.plugins.DomainClassGrailsPlugin.PROPERTY_INSTANCE_MAP
def insertFile(fileName) {
InputStream inputFile = getClass().classLoader.getResourceAsStream(fileName)
def pCounter = 1
def mCounter = 1
Sql sql = new Sql(dataSource)
inputFile.splitEachLine(/\n|\r|,/) { line ->
line.each { value ->
if (value.equalsIgnoreCase('0')) {
pCounter++
return
}
sql.executeInsert("insert into Patient_MRNA (patient_id, mrna_id, value) values (${pCounter}, ${mCounter}, ${value.toFloat()})")
pCounter++
}
if(mCounter % 100 == 0) {
cleanUpGorm()
}
pCounter = 1
mCounter++
}
}
def cleanUpGorm() {
sessionFactory.currentSession.clear()
propertyInstanceMap.get().clear()
}
}
I have disabled secondary cache, I'm using assigned ids, and I am explicitly handling this many to many relationship through a domain, not the hasMany and belongsTo.
My speed has increased monumentally after applying these methods, but after a while the inserts slow down almost to a stop, compared to about 623,000 per minute at the beginning.
Is there some other memory leak that I should be aware of or have I just hit the ceiling in terms of batch inserts in grails?
To be clear, it takes about 2 minutes to insert 1.2 million rows, but then the inserts start to slow down.
Try doing batch inserts; it's much more efficient:
def updateCounts = sql.withBatch { stmt ->
stmt.addBatch("insert into TABLENAME ...")
stmt.addBatch("insert into TABLENAME ...")
stmt.addBatch("insert into TABLENAME ...")
...
}
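For illustration, the chunking idea behind withBatch (and behind calling cleanUpGorm() every 100 rows) can be sketched as a small accumulator that hands statements to an executor callback in fixed-size groups, the way JDBC batching groups many inserts into one round trip. All names here are made up for the example, and the Consumer stands in for stmt.executeBatch():

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch only: accumulates "statements" and flushes them to an
// executor callback whenever a full chunk has been collected.
class ChunkedBatch {
    private final int chunkSize;
    private final Consumer<List<String>> executor; // stands in for executeBatch()
    private final List<String> pending = new ArrayList<>();
    int flushes = 0; // how many round trips we have made

    ChunkedBatch(int chunkSize, Consumer<List<String>> executor) {
        this.chunkSize = chunkSize;
        this.executor = executor;
    }

    void addBatch(String statement) {
        pending.add(statement);
        if (pending.size() >= chunkSize) flush();
    }

    void flush() {
        if (pending.isEmpty()) return;
        executor.accept(new ArrayList<>(pending)); // one round trip per chunk
        pending.clear();
        flushes++;
    }
}
```

With a chunk size of 100, inserting 250 rows costs three round trips (100, 100, 50) instead of 250, which is where most of the speedup from withBatch comes from.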
I have fought with this in earlier versions of Grails. Back then I resorted to either simply run the batch manually in proper chunks or use another tool for the batch import, such as Pentaho Data Integration (or other ETL tool or DIY).

XmlSlurper parsing a query result

I have a query that brings back a cell in my table that has XML in it. I can spit out what is in the cell, but without any delimiters. Now I need to take each individual element and link it to my object. Is there an easy way to do this?
def sql
def dataSource
static transactional = true
def pullLogs(String username, String id) {
if(username != null && id != null) {
sql = new Sql(dataSource)
println "Data source is: " + dataSource.toString()
def schema = dataSource.properties.defaultSchema
sql.query('select USERID, AUDIT_DETAILS from DEV.AUDIT_LOG T WHERE XMLEXISTS(\'\$s/*/user[id=\"' + id + '\" or username=\"'+username+'\"]\' passing T.AUDIT_DETAILS as \"s\") ORDER BY AUDIT_EVENT', []) { ResultSet rs ->
while (rs.next()) {
def auditDetails = new XmlSlurper().parseText(rs.getString('AUDIT_DETAILS'))
println auditDetails.toString()
}
}
sql.close()
}
}
Now this will give me that cell with those audit details in it. The bad thing is that it just puts all the information from the field into one giant string without the element tags. How would I go through and assign the values to an object? I have been trying to work with this example http://gallemore.blogspot.com/2008/04/groovy-xmlslurper.html with no luck, since that works with a file.
I have to be missing something. I tried running another parseText(auditDetails) but haven't had any luck with that either.
Any suggestions?
EDIT:
The XML in that field looks like
<user><username>scottsmith</username><timestamp>tues 5th 2009</timestamp></user>
similar to how mine is, except mine is a lot longer. It comes out as "scottsmithtues 5th 2009" and so forth. I need to actually take those tags and link them to my object instead of just printing them in one conjoined string.
Just do
auditDetails.username
or
auditDetails.timestamp
to access the properties you require.
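For comparison, the same extraction can be sketched in plain Java with the JDK's DOM parser; parsing the fragment and asking for a tag's text content is the equivalent of auditDetails.username in Groovy. The class and method names are illustrative:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class AuditDetailsParser {
    // Parse the XML stored in the column and pull out one element's value.
    public static String element(String xml, String tag) throws Exception {
        DocumentBuilder builder =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(
            new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        // First matching element's text, e.g. element(xml, "username")
        return doc.getElementsByTagName(tag).item(0).getTextContent();
    }
}
```

Whether via XmlSlurper or DOM, the point is the same: the tags are still there in the stored XML; printing the parsed node just concatenates the text nodes, so you must address each element by name to populate your object.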

Why is the sql.rows Groovy method so slow

I tried to fetch some data with the sql.rows() Groovy method and it took a very long time to return the values.
So I tried the "standard" JDBC way and it's much, much faster (150 times faster).
What am I missing?
Look at the code below: the first method returns results in about 2500 ms and the second in 15 ms!
import groovy.sql.Sql
import java.sql.*

class MyService {
javax.sql.DataSource dataSource
def SQL_QUERY = "select M_FIRSTNAME as firstname, M_LASTNAME as lastname, M_NATIONALITY as country from CT_PLAYER order by M_ID asc";
def getPlayers1(int offset, int maxRows)
{
def t = System.currentTimeMillis()
def sql = new Sql(dataSource)
def rows = sql.rows(SQL_QUERY, offset, maxRows)
println "time1 : ${System.currentTimeMillis()-t}"
return rows
}
def getPlayers2(int offset, int maxRows)
{
def t = System.currentTimeMillis();
Connection connection = dataSource.getConnection();
Statement statement = connection.createStatement();
statement.setMaxRows(offset + maxRows -1);
ResultSet resultSet = statement.executeQuery(SQL_QUERY);
def l_list =[];
if(resultSet.absolute(offset)) {
while (true) {
l_list << [
'firstname':resultSet.getString('firstname'),
'lastname' :resultSet.getString('lastname'),
'country' :resultSet.getString('country')
];
if(!resultSet.next()) break;
}
}
resultSet.close()
statement.close()
connection.close()
println "time2 : ${System.currentTimeMillis()-t}"
return l_list
    }
}
When you call sql.rows, Groovy eventually calls SqlGroovyMethods.toRowResult for each row returned by the ResultSet.
This method interrogates the ResultSetMetaData of the ResultSet each time to find the column names, and then fetches the data for each of those columns from the ResultSet into a Map, which it adds to the returned List.
In your second example, you get the required columns directly by name (as you know what they are), and avoid having to do this lookup for every row.
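As a rough model of that per-row work, here is approximately what the conversion to a List of Maps looks like (column names and types are illustrative, not Groovy's actual implementation). Even with the names computed once up front, a Map is allocated and filled for every row, which the hand-written loop in getPlayers2 avoids entirely:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative model of building one Map per row from column names,
// the shape of result that sql.rows returns.
class RowMapper {
    static List<Map<String, Object>> toRows(String[] columns, List<Object[]> data) {
        List<Map<String, Object>> rows = new ArrayList<>();
        for (Object[] values : data) {
            Map<String, Object> row = new LinkedHashMap<>();
            for (int i = 0; i < columns.length; i++) {
                row.put(columns[i], values[i]); // one map entry per column, per row
            }
            rows.add(row);
        }
        return rows;
    }
}
```

In Groovy's case the cost is higher still, because the column names are re-read from the metadata for each row rather than cached as they are here.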
I think I found the reason this method is so slow: statement.setMaxRows() is never called!
That means a lot of useless data is sent by the database (when you only want to see the first pages of a large datagrid).
I wonder how your tests would turn out if you tried setFetchSize instead of setMaxRows. A lot of this has to do with the underlying JDBC driver's default behavior.
