To build a dynamic SELECT, I am iterating over a field list and prefixing each entry with an alias.
However, the assignment clears the table.
REPORT ZITER_TEST.
CONSTANTS lc_alias TYPE string VALUE 'ref'.
DATA lt_field TYPE string_table.
lt_field = VALUE string_table(
  ( CONV string( 'PERNR' ) )
  ( CONV string( 'SUBTY' ) )
).
" lt_field: ['PERNR','SUBTY']
lt_field = VALUE string_table( FOR <lv_tmp> IN lt_field
  ( lc_alias && `~` && <lv_tmp> )
).
" lt_field: [] instead of: ['ref~PERNR','ref~SUBTY']
The ABAP documentation of VALUE says:
In assignments to a data object, the target variable is used directly and no temporary data object is created. This variable is initialized or overwritten in full before the assignment of the values specified in the parentheses. Its original value, however, is still available in an optional LET expression.
So, instead of:
lt_field = VALUE string_table( FOR <lv_tmp> IN lt_field
  ( lc_alias && `~` && <lv_tmp> )
).
Use:
lt_field = VALUE string_table( LET lt_field_temp = lt_field IN
  FOR <lv_tmp> IN lt_field_temp
  ( lc_alias && `~` && <lv_tmp> )
).
You just have to use a temporary table. It works in my case:
REPORT ZITER_TEST.
CONSTANTS lc_alias TYPE string VALUE 'ref'.
DATA: lt_field_temp TYPE string_table,
      lt_field      TYPE string_table.
lt_field_temp = VALUE string_table(
  ( CONV string( 'PERNR' ) )
  ( CONV string( 'SUBTY' ) )
).
lt_field = VALUE string_table( FOR <lv_tmp> IN lt_field_temp
  ( lc_alias && '~' && <lv_tmp> )
).
Looping over the table with a field symbol will be significantly faster than using a temporary table in either form (LET or direct assignment):
LOOP AT lt_field ASSIGNING FIELD-SYMBOL(<field>).
  <field> = lc_alias && `~` && <field>.
ENDLOOP.
Does z3py have functions to create a parametric data type, like what would be generated using the following SMTLIB code?
(declare-datatype List (par (T)
    ((nil) (cons (car T) (cdr (List T))))))
Yes. See here: https://ericpony.github.io/z3py-tutorial/advanced-examples.htm
Search for the section titled "Datatypes."
Here's the example, right from that page, that does exactly what you want:
from z3 import *

def DeclareList(sort):
    List = Datatype('List_of_%s' % sort.name())
    List.declare('cons', ('car', sort), ('cdr', List))
    List.declare('nil')
    return List.create()

IntList = DeclareList(IntSort())
RealList = DeclareList(RealSort())
IntListList = DeclareList(IntList)
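For instance (not from the tutorial page, just a minimal sketch assuming the standard z3py API), the generated constructors and accessors can then be used like any other z3 functions:
from z3 import *
# IntList is the sort created by DeclareList(IntSort()) above
l = IntList.cons(1, IntList.cons(2, IntList.nil))   # the term cons(1, cons(2, nil))
x = Const('x', IntList)
s = Solver()
s.add(x == l, IntList.car(x) == 1)
print(s.check())  # sat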
I'm encountering a problem in a Lua script that I'm learning from (I am new to Lua). This error got me heavily confused. When I run the code, it gives me the following error:
attempt to index global "zoneName" (a nil value)
This is my code:
local zoneName = zoneName:gsub ( "'", "" )
if dbExec ( handler, "INSERT INTO `safeZones` (`rowID`, `zoneName`, `zoneX`, `zoneY`, `zoneWidth`, `zoneHeight`) VALUES (NULL, '".. tostring ( zoneName ) .."', '".. tostring ( zoneX ) .."', '".. tostring ( zoneY ) .."', '".. zoneWidth .."', '".. zoneHeight .."');" ) then
createSafeZone ( { [ "zoneName" ] = zoneName, [ "zoneX" ] = zoneX, [ "zoneY" ] = zoneY, [ "zoneWidth" ] = zoneWidth, [ "zoneHeight" ] = zoneHeight } )
outputDebugString ( "Safe Zones: Safe zone created name: ".. tostring ( zoneName ) )
return true
else
return false, "Unable to create the safe zone"
end
You reference zoneName already in its own definition; your code is equivalent to
local zoneName = nil:gsub("'", "")
hence the error (the local zoneName does not exist yet while its initializer is evaluated, so the zoneName in zoneName:gsub() resolves to the global of that name, which is nil).
Either define zoneName before the gsub() call or use string.gsub(), for example:
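A minimal sketch of both fixes; rawZoneName is a hypothetical stand-in for wherever the value really comes from (a function argument, a GUI field, etc.):
-- Option 1: make sure the value exists before calling gsub on it
local rawZoneName = "O'Hara's Zone"          -- hypothetical source of the value
local zoneName = rawZoneName:gsub("'", "")   -- safe: rawZoneName is a string, not nil
-- Option 2: call string.gsub and guard against a missing value explicitly
local zoneName2 = string.gsub(rawZoneName or "", "'", "")
print(zoneName, zoneName2)  --> OHaras Zone   OHaras Zone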
We are regularly setting up new DOORS installations on standalone networks, and each of these networks uses slightly different drive mappings and installation directories. We have a set of DXL scripts that we copy over to each network that uses DOORS, but these DXL scripts reference some Microsoft Word templates that are used as the basis for custom-developed module export scripts.
We no longer have a DXL expert in-house, and I'm trying to make the scripts more portable so that they no longer contain hard-coded file paths. Because we copy all of the templates and DXL files into a pre-defined directory structure, I can use the dxlHere() function to figure out the execution path of the DXL script, which would print something like this:
<C:\path\to\include\file\includeFile.inc:123>
<C:\path\to\include\file\includeFile.inc:321>
<Line:2>
<Line:5>
<Line:8>
What I'd like to do is extract everything before file\includeFile.inc:123>, excluding the starting <. Then I want to append templates\template.dotx.
For example, the final result would be:
C:\path\to\include\templates\template.dotx
Are there any built-in DXL functions to handle string manipulation like this? Is regex the way to go? If so, what regexp would be appropriate to handle this?
Thanks!
I got this... kind of working.
dxlHere is something I don't work with much, but this seems to work, as long as it's saved to an actual dxl or inc file (i.e. not just run from the editor):
string s = dxlHere()
string s2 = null
string s3 = null

// matches the ".ext:line>" tail at the end of a dxlHere() location line
Regexp r = regexp2 ( "\\..*:.*> $" )
// matches a single path separator
Regexp r2 = regexp2 ( "/" )

if ( r s ) {
    // keep everything between the leading '<' and the start of the matched tail
    s2 = s[ 1 : ( ( start ( 0 ) ) - 1 ) ]
    s3 = s[ 1 : ( ( start ( 0 ) ) - 1 ) ]

    // count the separators in the remaining path
    int x = 0
    while ( r2 s2 ) {
        x++
        s2 = s2[ ( ( start ( 0 ) ) + 1 ) : ]
    }

    // walk the path again and cut it at the separator before the last two components
    int z = 0
    int y = 0
    for ( y = 0; y <= length( s3 ); y++ ) {
        if ( s3[y] == '/' ) {
            z++
            if ( z == ( x - 2 ) ) {
                s = s3[ 0 : y ]
                break
            }
        }
    }
}
print s
So we're doing a single regexp to check that we have a valid 'location', then running through it to find every '/' character, then leaving off the last 2 of them.
Hope this helps!
Concatenation works in a SELECT statement:
SELECT 'HELLO' || 'WORLD';
returns HELLOWORLD, but when I try to use it in a stored procedure like below:
SET Time_of_Day=TRIM(Hour_of_Day) || ' : ' || TRIM(Minute_of_Hour) || ' : ' || TRIM(Second_of_Minute);
where Hour_of_Day, Minute_of_Hour and Second_of_Minute are variables. I also tried it without TRIM:
ERROR:
CALL FAILED 2620:PROCEDURE_NAME:THE FORMAT OR DATA CONTAINS A BAD CHARACTER
Casting a string to a time in Teradata requires two-digit hour/minute/second:
SET Time_of_Day=TRIM(Hour_of_Day (FORMAT '99')) || ' : ' ||
TRIM(Minute_of_Hour (FORMAT '99')) || ' : ' ||
TRIM(Second_of_Minute (FORMAT '99'))
But this snippet is probably from the SP question you deleted an hour ago (a few seconds before I could post my answer).
There's no need to run 86400 single-row INSERTs; simply create all the data in a single SELECT, e.g.:
SELECT
Row_Number() Over (ORDER BY h,m,s),
Extract(SECOND From t) AS s,
Extract(MINUTE From t) AS m,
Extract(HOUR From t) AS h,
t
FROM
(
SELECT Cast(Begin(pd) AS TIME(0)) AS t
FROM sys_calendar.CALENDAR
WHERE calendar_date = Current_Date
EXPAND ON PERIOD(Cast(Current_Date AS TIMESTAMP(0)), Cast(Current_Date + 1 AS TIMESTAMP(0))) AS pd
) AS dt
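If the goal is to fill a table with one row per second of the day, that SELECT can feed a single set-based INSERT ... SELECT; the target table time_of_day_dim and its columns below are hypothetical:
-- hypothetical target table: one row per second of the day
INSERT INTO time_of_day_dim (time_key, second_of_minute, minute_of_hour, hour_of_day, time_of_day)
SELECT
  Row_Number() Over (ORDER BY h,m,s),
  Extract(SECOND From t) AS s,
  Extract(MINUTE From t) AS m,
  Extract(HOUR From t) AS h,
  t
FROM
(
  SELECT Cast(Begin(pd) AS TIME(0)) AS t
  FROM sys_calendar.CALENDAR
  WHERE calendar_date = Current_Date
  EXPAND ON PERIOD(Cast(Current_Date AS TIMESTAMP(0)), Cast(Current_Date + 1 AS TIMESTAMP(0))) AS pd
) AS dt;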
I am working with a database that I am trying to make secure. I have been able to create the database and access the information within it. However, once I try to encrypt the individual pieces of information within the database, I run into trouble. I am using sqlite3 for my database and openssl to try to encrypt the database.
local users = [[CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, username, password);]]
db:exec( users )
local nameData = cipher:encrypt ( nameField.text, "sbs_math_key" )
local passData = cipher:encrypt ( passwordField.text, "sbs_math_key" )
local tablefill =[[INSERT INTO users VALUES (NULL, ']].. nameData ..[[',']].. passData ..[['); ]]
db:exec( tablefill )
This works if I am inserting into an existing database; however, if I am creating a new database, it won't let me insert the encrypted information.
The raw output of cipher:encrypt is binary data that can contain quote characters and other bytes which break the SQL string you are concatenating, so Base64-encode it before inserting and decode it again after reading. Try this:
local mime = require( "mime" )  -- LuaSocket's mime module, provides b64 / unb64

local nameData = mime.b64 ( cipher:encrypt ( nameField.text, "sbs_math_key" ) )
local passData = mime.b64 ( cipher:encrypt ( passwordField.text, "sbs_math_key" ) )
print( "Before : "..nameData )

local tablefill =[[INSERT INTO users VALUES (NULL, ']].. nameData ..[[',']].. passData ..[['); ]]
db:exec( tablefill )

message = cipher:decrypt ( mime.unb64 ( nameData ), "sbs_math_key" )
print( "After : "..message )