How does one sum invoice values by dimension?
Let's say that I wanted to loop through all of the records in the period 01.01.2014 to 05.01.2014 and sum up the total invoiceAmount for each dimension.
If I were writing this in C#, I would probably just create a dictionary and do a foreach over the dataset: if the key already exists in the dictionary I would add to its value, and if it does not I would add a new key.
But I am unsure how to do this in X++:
while select custInvoiceJour
    where custInvoiceJour.InvoiceDate >= 01\01\2014
       && custInvoiceJour.InvoiceDate <  04\01\2014
{
    // Here I would need some kind of container or other method of
    // working with the data
    info(strFmt("Dimension %1", custInvoiceJour.Dimension[2]));
}
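A minimal sketch of that dictionary-style accumulation in X++ might use a Map (untested; it assumes InvoiceAmount is the field to sum and Dimension[2] is the grouping key, as above):

Map totals = new Map(Types::String, Types::Real);
MapEnumerator en;
str dim;

while select custInvoiceJour
    where custInvoiceJour.InvoiceDate >= 01\01\2014
       && custInvoiceJour.InvoiceDate <  04\01\2014
{
    dim = custInvoiceJour.Dimension[2];

    // Add to the running total if the key exists, otherwise start a new one
    if (totals.exists(dim))
        totals.insert(dim, totals.lookup(dim) + custInvoiceJour.InvoiceAmount);
    else
        totals.insert(dim, custInvoiceJour.InvoiceAmount);
}

// Walk the accumulated totals
en = totals.getEnumerator();
while (en.moveNext())
{
    info(strFmt("Dimension %1: %2", en.currentKey(), en.currentValue()));
}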
First of all: I'm an inexperienced coder and just started reading PiL. I only know a thing or two, but I'm learning fast. This method is really unnecessary, but I sort of want to give myself a hard time in order to learn more.
Okay, so for testing and for getting to know the language better, I'm trying to grab two different values from two different files and store them in tables:
local gamemap = file.Read("addons/easymap/data/maplist.txt", "GAME")
local mapname = string.Explode( ",", gamemap )
local mapid = file.Read("addons/easymap/data/mapid.txt", "GAME")
local id = string.Explode( ",", mapid )
I'm grabbing two values, which in the end are mapname and id.
Once I have them, I know that using
for k, v in pairs(mapname)
will give specific values to the data taken from the file, or at least assign them.
But what I need to do with both tables is this: if a certain map is on the server, check for its value in the table (unless the map name is nil), and then, once I have the name, grab that map's value and match it with the id from the other file.
For example, I have gm_construct in the maplist.txt file and it is the first entry [1]; its corresponding id in mapid.txt, let's say 54321, is also the first entry [1].
But now I must check the server's current map with the game.GetMap function. I have that solved and all: I grab the current map, match it with the mapname table and then check for its corresponding value in the id table, which would be gm_construct = 1.
For example, it would be something like:
local mapdl = game.GetMap()
local match = mapname[mapdl]
if ( match != nil ) then -- supposing the match isn't nil and it is in the table
    -- grab its table value, let's say it is 1, and match it with the one in the id table
It is a more complex version of this: http://pastebin.com/3652J8Pv
I know it is unnecessary, but doing this script will give me more options to expand it further.
TL;DR: I need to find a function that lets me match two values coming from different tables and files, which in the end are in the same order ([1] = [1]) in both files. Or a way to fetch a full table from another file; I don't know if a table can be loaded globally and then grabbed by another file to use it there.
I'm sorry if I'm asking too much, but where I live, if you want to learn to program you have to do it on your own; no schools have classes or anything similar, at least not until university, and I'm far away from even finishing high school.
Edit: this is intended to be used in Garry's Mod. string.Explode is explained here: http://wiki.garrysmod.com/page/string/Explode
It basically splits a string on a designated character, in this case a comma.
Okay. If I understand correctly, you have two files with data.
One with map names:
gm_construct,
gm_flatgrass,
de_dust2,
ttt_waterworld
And one with IDs, numbers, whatever (related to the entries at the same position in the map names file):
1258,
8592,
1354,
2589
And now you want to find the ID of the current map, right?
Here is your function:
local function GetCurrentMapID()
    -- Get the current map
    local cur_map = game.GetMap()

    -- Read the files and split them
    local mapListRaw = file.Read("addons/easymap/data/maplist.txt", "GAME")
    local mapList = string.Explode(",", mapListRaw)
    local mapIDsRaw = file.Read("addons/easymap/data/mapid.txt", "GAME")
    local mapIDs = string.Explode(",", mapIDsRaw)

    -- Iterate over the whole map list
    for k, v in pairs(mapList) do
        -- until you find the current map
        if (v == cur_map) then
            -- then return the value from mapIDs which is located at the same key (k)
            return mapIDs[k]
        end
    end

    -- Throw a non-breaking error if the current map is not in the Maplist
    ErrorNoHalt( "Current map is not registered in the Maplist!\n" )
end
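For example, a hypothetical usage (GetCurrentMapID is local, so the call has to live in the same file):

local id = GetCurrentMapID()
if id then
    print("The current map's ID is " .. id)
end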
The code could have errors because I couldn't test it. Please comment with the error if so.
Source: My Experience and the GMod Wiki
I'd like to get the row data given to me by sqlJocky in my Dart server application and convert it (with column names) to a Map, i.e. row['email'] == "somename".
The row data is a "_BinaryDataPacket", which is an extension of the Row type. Right now, the method recommended by the sqlJocky developer to access the data involves either knowing the column name of what you're accessing in the database:
row.email == "somename"
or just ignoring the column name altogether:
row[0] == "somename"
I've tried a few hacks to get at the column data and even edited the original sqlJocky code to make _BinaryDataPacket._fieldIndex public. While that did give me access to Symbol instances of the column titles to build a map with, I would like to avoid modifying the developer's stable code if at all possible.
I assume there has to be an easy way to get the column names and put them in a Map with the row data.
TL;DR:
I want to convert alpha.bravo == "charle" into alpha["bravo"] == "charle".
Thanks
In your results, you have a fields list attribute.
So you could do this:
.then((Results results) {
  results.forEach((Row row) {
    Map r = new Map();
    for (int i = 0; i < results.fields.length; i++) {
      r[results.fields[i].name] = row[i];
    }
  });
});
This will map the data from each row into the map.
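If you need the converted rows afterwards (for example to look up values by column name, as in the question), a small variation of the same idea could collect them into a list (a sketch, using the same Results object):

.then((Results results) {
  List<Map> rows = [];
  results.forEach((Row row) {
    Map r = new Map();
    for (int i = 0; i < results.fields.length; i++) {
      r[results.fields[i].name] = row[i];
    }
    rows.add(r);
  });
  // Each entry now supports lookups such as rows[0]['email'].
});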
Hi, is it possible using Entity Framework and/or LINQ to select a certain number of rows? For example, I want to select rows 0 - 500000 and assign these records to the List VariableAList object, then select rows 500001 - 1000000 and assign those to the List VariableBList object, etc.
The Numbers object is like ID, Number, DateCreated, DateAssigned, etc.
Sounds like you're looking for the .Take(int) and .Skip(int) methods:
using (YourEntities db = new YourEntities())
{
    var VariableAList = db.Numbers
        .Take(500000);

    var VariableBList = db.Numbers
        .Skip(500000)
        .Take(500000);
}
You may want to be wary of the size of these lists in memory.
Note: you may also need an .OrderBy clause prior to using .Skip or .Take; I vaguely remember running into this problem in the past.
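For example, a sketch that adds the ordering and materializes the results (assuming the Numbers entity has an ID property to order by):

using (YourEntities db = new YourEntities())
{
    // Ordering first gives Skip/Take a stable sequence to page through.
    var variableAList = db.Numbers
        .OrderBy(n => n.ID)
        .Take(500000)
        .ToList();

    var variableBList = db.Numbers
        .OrderBy(n => n.ID)
        .Skip(500000)
        .Take(500000)
        .ToList();
}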
I know of:
http://lua-users.org/wiki/SimpleLuaApiExample
It shows me how to build up a table (key, value) pair entry by entry.
Suppose instead I want to build a gigantic table (say a 1000-entry table, where both key and value are strings). Is there a fast way to do this in Lua, rather than four API calls per entry:
push
key
value
rawset
What you have written is the fast way to solve this problem. Lua tables are brilliantly engineered, and fast enough that there is no need for some kind of bogus "hint" to say "I expect this table to grow to contain 1000 elements."
For string keys, you can use lua_setfield.
Unfortunately, for associative tables (string keys, non-consecutive-integer keys), no, there is not.
For array-type tables (where the regular 1...N integer indexing is being used), there are some performance-optimized functions, lua_rawgeti and lua_rawseti: http://www.lua.org/pil/27.1.html
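For instance, a minimal (untested) sketch of the array-style case, filling t[1..n] with lua_rawseti:

#include <lua.h>

/* Array-style: builds { values[0], values[1], ... } and leaves it on the stack. */
static void build_array(lua_State *L, const char **values, int n)
{
    lua_newtable(L);
    for (int i = 0; i < n; i++) {
        lua_pushstring(L, values[i]);  /* push the value */
        lua_rawseti(L, -2, i + 1);     /* t[i + 1] = value; pops the value */
    }
}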
You can use lua_createtable to create a table that already has the required number of slots. However, after that, there is no way to do it faster other than:
for (int i = 0; i < 1000; i++) {
    lua_push... // key
    lua_push... // value
    lua_rawset(L, tableindex);
}
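For string keys specifically, here is a minimal (untested) sketch combining lua_createtable with lua_setfield, which pre-sizes the hash part and avoids pushing the key separately:

#include <lua.h>

/* Builds { keys[0] = values[0], ... } and leaves the table on the stack. */
static void build_string_table(lua_State *L, const char **keys,
                               const char **values, int n)
{
    lua_createtable(L, 0, n);            /* pre-allocate n hash slots */
    for (int i = 0; i < n; i++) {
        lua_pushstring(L, values[i]);    /* push the value */
        lua_setfield(L, -2, keys[i]);    /* t[keys[i]] = value; pops the value */
    }
}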
I'm trying to figure out the best approach to display combined tables based on matching logic and input search criteria.
Here is the situation:
We have a table of customers stored locally. The fields of interest are ssn, first name, last name and date of birth.
We also have a web service which provides the same information. Some of the customers from the web service are the same as the local file, some different.
SSN is not required in either.
I need to combine this data to be viewed on a Grails display.
The criteria for combination are: 1) match on SSN; 2) for any remaining records, an exact match on first name, last name and date of birth.
There's no need at this point for soundex or approximate logic.
It looks like what I should do is extract all the records from both inputs into a single collection, somehow making it a set keyed on SSN. Then remove the blank SSNs.
This will handle the SSN matching (once I figure out how to make that a set).
Then, I need to go back to the original two input sources (cached in a collection to prevent a re-read) and remove any records that exist in the SSN set derived previously.
Then, create another set based on first name, last name and date of birth - again if I can figure out how to make a set.
Then combine the two derived collections into a single collection. The collection should be sorted for display purposes.
Does this make sense? I think the search criteria will limit the number of record pulled in so I can do this in memory.
Essentially, I'm looking for some ideas on how the Grails code would look for achieving the above logic (assuming this is a good approach). The local customer table is a domain object, while what I'm getting from the WS is an array list of objects.
Also, I'm not entirely clear on how the maxResults, firstResult, and order settings used for the display would be affected. I think I need to read in all the records that match the search criteria first, do the combining, and display from the derived collection.
The traditional Java way of doing this would be to copy both the local and remote objects into TreeSet containers with a custom comparator, first for SSN, second for name/birthdate.
This might look something like:
def localCustomers = Customer.list()
def remoteCustomers = RemoteService.get()
TreeSet ssnFilter = new TreeSet(new ClosureComparator({c1, c2 -> c1.ssn <=> c2.ssn}))
ssnFilter.addAll(localCustomers)
ssnFilter.addAll(remoteCustomers)
TreeSet nameDobFilter = new TreeSet(new ClosureComparator({c1, c2 -> c1.firstName + c1.lastName + c1.dob <=> c2.firstName + c2.lastName + c2.dob}))
nameDobFilter.addAll(ssnFilter)
def filteredCustomers = nameDobFilter as List
At this point, filteredCustomers has all the records, except those that are duplicates by your two criteria.
Another approach is to filter the lists by sorting and doing a foldr operation, combining adjacent elements if they match. This way, you have an opportunity to combine the data from both sources.
For example:
def combineByNameAndDob(customers) {
    customers.sort() { c1, c2 ->
        (c1.firstName + c1.lastName + c1.dob) <=>
            (c2.firstName + c2.lastName + c2.dob)
    }.inject([]) { cs, c ->
        if (cs && c.equalsByNameAndDob(cs[-1])) {
            cs[-1].combine(c) // combine the attributes of both records
            cs
        } else {
            cs << c
        }
    }
}
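A hypothetical call site might look like this (it assumes equalsByNameAndDob and combine are implemented on the customer objects, as noted in the comment above):

def allCustomers = Customer.list() + RemoteService.get()
def combinedCustomers = combineByNameAndDob(allCustomers)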