F# - How can we validate the whole schema of an API response using HttpFs.Client or Hopac?

I have a test where after getting a response I would like to validate the entire schema of the response (not individual response node/value comparison).
Sample test:
[<Test>]
let howtoValidateSchema () =
    let request =
        Request.createUrl Post "https://reqres.in/api/users"
        |> Request.setHeader (Accept "application/json")
        |> Request.bodyString """{"name": "morpheus", "job": "leader"}"""
        |> Request.responseAsString
        |> run
Is there a way that I can save my expected schema somewhere and, once I get the response, compare against it to check that the response has the same nodes (neither fewer nor more than the expected schema)?
I am OK to opt for other libraries like FSharp.Data if there is no direct way in HttpFs.Client. I looked at FSharp.Data's JsonProvider (https://fsharp.github.io/FSharp.Data/library/JsonProvider.html) but could not see how it meets the requirement of comparing a saved expected schema against the response JSON.

You can use Newtonsoft.Json.Schema to validate schemas:
open Newtonsoft.Json.Schema
open Newtonsoft.Json.Linq
let schema = JSchema.Parse expectedSchema
let json = JObject.Parse responseJson
let valid = json.IsValid schema
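To connect this with the test from the question, here is a minimal, untested sketch (assuming NUnit provides the [<Test>] attribute and the Newtonsoft.Json.Schema package is referenced). The expected schema below is only an illustration of the reqres.in response shape; the "required" list combined with "additionalProperties": false is what rejects responses with missing or extra nodes:
open Hopac
open HttpFs.Client
open NUnit.Framework
open Newtonsoft.Json.Schema
open Newtonsoft.Json.Linq

// Expected schema kept with the tests (illustrative content only).
let expectedSchema = """{
    "type": "object",
    "properties": {
        "name":      { "type": "string" },
        "job":       { "type": "string" },
        "id":        { "type": "string" },
        "createdAt": { "type": "string" }
    },
    "required": [ "name", "job", "id", "createdAt" ],
    "additionalProperties": false
}"""

[<Test>]
let validateResponseSchema () =
    let responseJson =
        Request.createUrl Post "https://reqres.in/api/users"
        |> Request.setHeader (Accept "application/json")
        |> Request.bodyString """{"name": "morpheus", "job": "leader"}"""
        |> Request.responseAsString
        |> run

    let schema = JSchema.Parse expectedSchema
    let json = JObject.Parse responseJson

    // IsValid is the SchemaExtensions extension method from Newtonsoft.Json.Schema.
    let valid = json.IsValid schema
    Assert.IsTrue(valid)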
However, this assumes you have a schema predefined somewhere. If you don't have such a schema, it is best to use the JsonProvider, which can infer one for you.
Run the call manually, save the result in a sample.json file, and create a type using the JsonProvider:
type ResponseSchema = JsonProvider<"sample.json">
and you can use this type to parse any new content based on the sample (provided that the sample is representative):
ResponseSchema.Parse response
This won't validate the schema, but it will try to match the sample's shape as best as it can given the input.
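As a rough, untested illustration of that behaviour (the sample.json contents below are hypothetical; save whatever your manual call actually returned, and the generated property names follow from that sample):
open FSharp.Data

// sample.json saved from a manual call (content assumed here for illustration):
// {"name": "morpheus", "job": "leader"}
type ResponseSchema = JsonProvider<"sample.json">

let checkResponse (responseJson : string) =
    // Parse succeeds for any well-formed JSON; a missing member only surfaces
    // as an error when the corresponding generated property is accessed.
    let parsed = ResponseSchema.Parse responseJson
    printfn "name=%s job=%s" parsed.Name parsed.Job
Parsing alone will not reject extra nodes, so if strict schema checking is the goal, the JSchema approach above is the better fit.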

Related

Read data from XLSX provided as XSTRING

An Excel file (.xlsx) is uploaded on the frontend, which is UI5 Fiori.
The file contents come to the SAP ABAP backend via OData in XSTRING format.
I need to store that XSTRING in an internal table and then in a DDIC table. E.g., suppose the Excel file has 5 columns; then I want to store the data of those 5 columns in the corresponding columns of the DDIC table.
I have tried various Function Modules like:
SCMS_XSTRING_TO_BINARY
SCMS_BINARY_TO_STRING
and following Classes & methods:
cl_bcs_convert=>raw_to_string
cl_soap_xml_helper=>xstring_to_string
but none were able to convert the XSTRING to STRING.
Can you please suggest which function module or class/method can be used to solve the problem?
For most comfort, use abap2xlsx.
If you cannot or do not want to use that, you can alternatively parse the Excel file on your own. .xlsx files are basically .zip files with a different file ending. Use cl_abap_zip->load to open the xstring you receive and ->get to extract the individual files from the zip. Afterwards, use XML parsers like cl_ixml or transformations to parse the XML content of the files.
Note that Excel's XML is a complicated file format, with several files that work together to form the worksheets. Refer to Microsoft's File format reference for Word, Excel, and PowerPoint for details. It's non-trivial to interpret this, so you will usually be a lot happier with abap2xlsx.
abap2xlsx is the most powerful and feature-rich way of doing this, as Florian said: it supports styles, charts and complex tables. However, it may not always be available due to system limitations, restrictions on installing custom packages in the system, and so on.
Here is how to accomplish this with the pure standard system, without custom frameworks.
Since NetWeaver 7.02, SAP supports the Open Microsoft formats natively and provides classes for handling them: CL_XLSX_DOCUMENT, CL_DOCX_DOCUMENT and CL_PPTX_DOCUMENT (abap2xlsx is built on these classes too). So let's do a bit of reinventing the wheel.
An XLSX file is an OpenXML archive of files, of which the most interesting are sheet1.xml and sharedStrings.xml. Let's build a sample based on MARC table fields.
Now you want to transfer this table to internal table with the same structure. The steps would be:
Extract needed files from XLSX archive
Read worksheet structure from sheet1.xml
Read sheet values from sharedStrings.xml
Map them together and write the result to the internal table
Here is a sample class that handles the job. I used the cl_openxml_helper class to load the XLSX from a local file, but you can receive the XSTRINGed XLSX in whatever way.
CLASS xlsx_reader DEFINITION.
  PUBLIC SECTION.
    TYPES: BEGIN OF ty_marc,
             matnr TYPE char20,
             werks TYPE char20,
             disls TYPE char20,
             ekgrp TYPE char20,
             dismm TYPE char20,
           END OF ty_marc,
           tt_marc TYPE STANDARD TABLE OF ty_marc WITH EMPTY KEY.
    METHODS: read RETURNING VALUE(tab) TYPE tt_marc,
             extract_xml IMPORTING index              TYPE i
                                   xstring            TYPE xstring
                         RETURNING VALUE(rv_xml_data) TYPE xstring.
ENDCLASS.

CLASS xlsx_reader IMPLEMENTATION.
  METHOD read.
    TYPES: BEGIN OF ty_row,
             value TYPE string,
             index TYPE abap_bool,
           END OF ty_row,
           BEGIN OF ty_worksheet,
             row_id TYPE i,
             row    TYPE TABLE OF ty_row WITH EMPTY KEY,
           END OF ty_worksheet,
           BEGIN OF ty_si,
             t TYPE string,
           END OF ty_si.

    DATA: data  TYPE TABLE OF ty_si,
          sheet TYPE TABLE OF ty_worksheet.

    TRY.
        DATA(xstring_xlsx) = cl_openxml_helper=>load_local_file( 'C:\marc.xlsx' ).
      CATCH cx_openxml_not_found.
    ENDTRY.
"Read the sheet XML
DATA(xml_sheet) = extract_xml( EXPORTING xstring = xstring_xlsx iv_xml_index = 2 ).
"Read the data XML
DATA(xml_data) = extract_xml( EXPORTING xstring = xstring_xlsx iv_xml_index = 3 ).
    TRY.
        "transforming structure into ABAP
        CALL TRANSFORMATION zsheet
          SOURCE XML xml_sheet
          RESULT root = sheet.

        "transforming data into ABAP
        CALL TRANSFORMATION zxlsx_data
          SOURCE XML xml_data
          RESULT root = data.
      CATCH cx_xslt_exception.
      CATCH cx_st_match_element.
      CATCH cx_st_ref_access.
    ENDTRY.

    "mapping structure and data
    LOOP AT sheet ASSIGNING FIELD-SYMBOL(<fs_row>).
      APPEND INITIAL LINE TO tab ASSIGNING FIELD-SYMBOL(<line>).
      LOOP AT <fs_row>-row ASSIGNING FIELD-SYMBOL(<fs_cell>).
        ASSIGN COMPONENT sy-tabix OF STRUCTURE <line> TO FIELD-SYMBOL(<fs_field>).
        CHECK sy-subrc = 0.
        <fs_field> = COND #( WHEN <fs_cell>-index = abap_false
                             THEN <fs_cell>-value
                             ELSE VALUE #( data[ <fs_cell>-value + 1 ]-t OPTIONAL ) ).
      ENDLOOP.
    ENDLOOP.
  ENDMETHOD.

  METHOD extract_xml.
    TRY.
        DATA(lo_package) = cl_xlsx_document=>load_document( iv_data = xstring ).
        DATA(lo_parts) = lo_package->get_parts( ).
        CHECK lo_parts IS BOUND AND lo_package IS BOUND.
        DATA(lv_uri) = lo_parts->get_part( 2 )->get_parts( )->get_part( index )->get_uri( )->get_uri( ).
        DATA(lo_xml_part) = lo_package->get_part_by_uri( cl_openxml_parturi=>create_from_partname( lv_uri ) ).
        rv_xml_data = lo_xml_part->get_data( ).
      CATCH cx_openxml_format cx_openxml_not_found.
    ENDTRY.
  ENDMETHOD.
ENDCLASS.
zsheet transformation:
<?sap.transform simple?>
<tt:transform xmlns:tt="http://www.sap.com/transformation-templates" template="main">
  <tt:root name="root"/>
  <tt:template name="main">
    <worksheet xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main"
               xmlns:r="http://schemas.openxmlformats.org/officeDocument/2006/relationships"
               xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
               xmlns:x14ac="http://schemas.microsoft.com/office/spreadsheetml/2009/9/ac"
               xmlns:xr="http://schemas.microsoft.com/office/spreadsheetml/2014/revision"
               xmlns:xr2="http://schemas.microsoft.com/office/spreadsheetml/2015/revision2"
               xmlns:xr3="http://schemas.microsoft.com/office/spreadsheetml/2016/revision3">
      <tt:skip count="4"/>
      <sheetData>
        <tt:loop name="row" ref="root">
          <row>
            <tt:attribute name="r" value-ref="row_id"/>
            <tt:loop name="cells" ref="$row.ROW">
              <c>
                <tt:cond>
                  <tt:attribute name="t" value-ref="index"/>
                  <tt:assign to-ref="index" val="C('X')"/>
                </tt:cond>
                <v><tt:value ref="value"/></v>
              </c>
            </tt:loop>
          </row>
        </tt:loop>
      </sheetData>
      <tt:skip count="2"/>
    </worksheet>
  </tt:template>
</tt:transform>
zxlsx_data transformation:
<?sap.transform simple?>
<tt:transform xmlns:tt="http://www.sap.com/transformation-templates" template="main">
  <tt:root name="ROOT"/>
  <tt:template name="main">
    <sst xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">
      <tt:loop name="line" ref=".ROOT">
        <si>
          <t>
            <tt:value ref="t"/>
          </t>
        </si>
      </tt:loop>
    </sst>
  </tt:template>
</tt:transform>
Here is how to call it:
START-OF-SELECTION.
  DATA(reader) = NEW xlsx_reader( ).
  DATA(marc) = reader->read( ).
The code is pretty self-explanatory, but here are a couple of notes:
The file sheet1.xml contains a special attribute t in each cell, which denotes whether the value should be treated as a literal or as a reference into sharedStrings.xml.
I used two Simple Transformations, but XSLT can be used as well, possibly allowing you to reduce all the XML handling to a single transformation.
I deliberately used generic char20 types to be able to handle headers. If you want to preserve native types, then you cannot read the table header (skip the first line in the sheet LOOP), because you would get a type violation and a dump. If you receive the table without headers, then it is fine to declare the structure with native types.
If you don't want to use transformations, then sXML is your friend. You can parse the XML with classes as well, but Simple Transformations are considerably faster.
With some additional effort you can make this snippet dynamic and parse an XLSX with any structure.
You can read more about this approach in this doc.

How to send the file returned from a function over POST in Gatling (Scala)?

def siteNameChange(): File = {
  for (line <- Source.fromFile("RecordedSimulation_0000_NewSiterequest2.txt").getLines())
    if (line.contains("siteUrl"))
      println(line)
  return new File("RecordedSimulation_0000_NewSiterequest2.txt")
}

val scn = scenario("RecordedSimulation")
  .exec(http("request_0")
    .post("/student/new")
    .body(RawFileBodyPart(session => siteNameChange())).asJSON)
Hello I am a newbie to Gatling, using it for performance testing. I have a function named siteNameChange() which returns a file after doing some modifications on the file.
This function I am calling in the scenario body to send the data.
But when I run the script I get: scala:48:26: missing parameter type
.body(RawFileBodyPart(session => siteNameChange())).asJSON)
Can someone please suggest the best way to do this here: how do I get the function to return the modified file and pass the file data over the POST request?
body doesn't take a BodyPart (which is for multipart) parameter but a Body one.
You should be passing a RawFileBody.

Wireshark: display filters vs nested dissectors

I have an application that sends JSON objects over AMQP, and I want to inspect the network traffic with Wireshark. The AMQP dissector gives the payload as a series of bytes in the field amqp.payload, but I'd like to extract and filter on specific fields in the JSON object, so I'm trying to write a plugin in Lua for that.
Wireshark already has a dissector for JSON, so I was hoping to piggy-back on that, and not have to deal with JSON parsing myself.
Here is my code:
local amqp_json_p = Proto("amqp_json", "AMQP JSON payload")
local amqp_json_result = ProtoField.string("amqp_json.result", "Result")
amqp_json_p.fields = { amqp_json_result }
register_postdissector(amqp_json_p)
local amqp_payload_f = Field.new("amqp.payload")
local json_dissector = Dissector.get("json")
local json_member_f = Field.new("json.member")
local json_string_f = Field.new("json.value.string")
function amqp_json_p.dissector(tvb, pinfo, tree)
  local amqp_payload = amqp_payload_f()
  if amqp_payload then
    local payload_tvbrange = amqp_payload.range
    if payload_tvbrange:range(0,1):string() == "{" then
      json_dissector(payload_tvbrange:tvb(), pinfo, tree)
      -- So far so good. Let's look at what the JSON dissector came up with.
      local members = { json_member_f() }
      local strings = { json_string_f() }
      local subtree = tree:add(amqp_json_p)
      for k, member in pairs(members) do
        if member.display == 'result' then
          for _, s in ipairs(strings) do
            -- Find the string value inside this member
            if not (s < member) and (s <= member) then
              subtree:add(amqp_json_result, s.range)
              break
            end
          end
        end
      end
    end
  end
end
(To start with, I'm just looking at the result field, and the payload I'm testing with is {"result":"ok"}.)
It gets me halfway there. The following shows up in the packet dissection, whereas without my plugin I only get the AMQP section:
Advanced Message Queueing Protocol
    Type: Content body (3)
    Channel: 1
    Length: 15
    Payload: 7b22726573756c74223a226f6b227d
JavaScript Object Notation
    Object
        Member Key: result
            String value: ok
            Key: result
AMQP JSON payload
    Result: "ok"
Now I want to be able to use these new fields as display filters, and also to add them as columns in Wireshark. The following work for both:
json (shows up as Yes when added as a column)
json.value.string (I can also filter with json.value.string == "ok")
amqp_json
But amqp_json.result doesn't work: if I use it as a display filter, Wireshark doesn't show any packets, and if I use it as a column, the column is empty.
Why does it behave differently for json.value.string and amqp_json.result? And how can I achieve what I want? (It seems like I do need a custom dissector, as with json.value.string I can only filter on any member having a certain value, not necessarily result.)
I found a thread on the wireshark-dev mailing list ("Lua post-dissector not getting field values", 2009-09-17, 2009-09-22, 2009-09-23), that points to the interesting_hfids hash table, but it seems like the code has changed a lot since then.
If you'd like to try this, here is my PCAP file, base64-encoded, containing a single packet:
1MOyoQIABAAAAAAAAAAAAAAABAAAAAAAjBi1WfYOCgBjAAAAYwAAAB4AAABgBMEqADcGQA
AAAAAAAAAAAAAAAAAAAAEAAAAAAAAAAAAAAAAAAAAB/tcWKO232y46mkSqgBgxtgA/AAAB
AQgKRjDNvkYwzb4DAAEAAAAPeyJyZXN1bHQiOiJvayJ9zg==
Decode with base64 -d (on Linux) or base64 -D (on OSX).
It turns out I shouldn't have tried to compare the display property of the json.member field. Sometimes it gets set by the JSON dissector, and sometimes it just stays as Member.
The proper solution would involve checking the value of the json.key field, but since the key I'm looking for presumably would never get escaped, I can get away with looking for the string literal in the range property of the member field.
So instead of:
if member.display == 'result' then
I have:
if member.range:range(1, 6):string() == 'result' then
and now both filtering and columns work.

Map json input but not output in Suave

Suave.Json.mapJson deserializes the input JSON into an object for your function, then serializes the output of your function back into JSON.
The problem is that I'm happy with the way it maps the input into my function, but I need to return a JSON string response rather than have Suave serialise my output into JSON for me. How can I do this?
Currently I'm getting my output serialised twice. My code so far:
let executeQuery : Query -> string = //Query is my deserialised json input, the return value is a json string
let app = POST >=> path "/graphql" >=> Json.mapJson executeQuery >=> setMimeType "application/json; charset=utf-8"
startWebServer defaultConfig app
If you look at the Suave source code, you'll see that mapJson is shorthand for mapJsonWith fromJson toJson. The fromJson and toJson functions are the default JSON deserializer and serializer (respectively), but you could create your own instead -- or just use id to say "map this direction without changing it". E.g.,
let oneWayMapJson = mapJsonWith fromJson id
Note that I haven't tested this, just typed it into the Stack Overflow answer box, so some tweaking may be required. I don't have time to expand on this answer right now, but if you need more help than this rather barebones answer, let me know and I'll try to give you more help sometime tomorrow.
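To make that concrete, here is a minimal, untested sketch of how it might be wired into the app from the question (Query and executeQuery are stand-ins for the question's own definitions). One caveat: if mapJsonWith expects the serializer to return a byte array, as the Suave source suggests, then id only fits when your function already returns bytes; for a string result you would plug in a UTF-8 encoding step instead:
open System.Text
open Suave
open Suave.Filters
open Suave.Operators
open Suave.Writers
open Suave.Json

// Stand-ins for the question's own Query type and executeQuery function.
type Query = { query : string }
let executeQuery (q : Query) : string = """{"data": null}"""

// Deserialize the request body into Query as before, but leave the string
// result alone apart from turning it into the bytes written to the response.
let oneWayMapJson f =
    mapJsonWith fromJson (fun (s : string) -> Encoding.UTF8.GetBytes s) f

let app =
    POST >=> path "/graphql"
         >=> oneWayMapJson executeQuery
         >=> setMimeType "application/json; charset=utf-8"

startWebServer defaultConfig app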

Difference between Msxml2.DOMDocument and Msxml2.XMLHTTP

What is the difference between:
Msxml2.DOMDocument
Msxml2.XMLHTTP
And of course, the other question is which one will work best for my purpose as described below?
The context is this - I have code that makes many calls to retrieve web pages. I am looking for the most efficient object for this task. For example, something like this:
Dim oXmlHttp : Set oXmlHttp = CreateObject("MSXML2.XMLHTTP")
oXmlHttp.Open "GET", sUri, False
oXmlHttp.Send
If Err Then
    getWebPage = "ERROR - could not get the source text of the webpage."
    Exit Function
End If
sResponse = oXmlHttp.responseBody
This seems to work the same way if I create an object using:
Dim oXmlHttp : Set oXmlHttp = CreateObject("MSXML2.XMLHTTP")
Can anyone explain or point me to a reference that clearly outlines the differences (and intended usages) for each of those?
If you want to learn more about MSXML, these links may help:
http://msdn.microsoft.com/en-us/library/aa468547.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/ms766487(v=vs.85).aspx
In short, XMLHTTP is used to retrieve information, while DOMDocument is used to structure and parse it.
This page explains it better: http://msdn.microsoft.com/en-us/library/windows/desktop/ms760218(v=vs.85).aspx
DOMDocument "Represents the top node of the XML DOM tree." while XMLHTTP "Provides client-side protocol support for communication with HTTP servers."
