I am trying to generate a separate client for each controller using the {controller}Service class name option:
nswag swagger2tsclient /Input:api-swagger/api-swagger.json /Output:src/app/api/api.ts /ClassName:{controller}Service /WrapDtoExceptions:true /TypeScriptVersion:2.0 /Template:Angular /HttpClass:HttpClient /DateTimeType:Date /GenerateClientClasses:true /GenerateClientInterfaces:true /ImportRequiredTypes:true /BaseUrlTokenName:API_BASE_URL /InjectionTokenType:InjectionToken /RxJsVersion:6.0 /UseSingletonProvider:true /ExcludedParameterNames:Authorisation
While I expect a client for each controller, it only generates one client called Service, which contains all operations across all endpoints.
The spec file looks like the following:
I checked the Swagger spec, and I see that the URL can be set dynamically for various sites based on the server specified. But I'm not sure whether it is possible to construct the URL dynamically for each operation, like this:
http://{serverurl}/{tagName}/{operation}
I know that the server URL can be configured at the top of swagger.json, but how can I configure/add the tag name to the URL dynamically?
Example:
Consider the following scenario.
I have the following tags:
customer
user
admin
Each tag has various operations associated with it. I need to add the tag name along with the operation name dynamically to the Swagger URL, as follows.
If the operation is getRecentActivity, it needs to be like
http://www.example.com/{tagname}/{operation}
which resolves to
http://www.example.com/customer/getRecentActivity
Note: given the way my configuration is set up, I know I can specify this directly in each path itself, but I need it to be generated dynamically based on the tag name.
I'm building a Grails application that starts by loading an XML file and, based on it, dynamically generates a form that has checkboxes and a submit button. When a user submits the form, a controller receives the params (checkboxes) and calls a method that reads an Excel file to extract records that match the params.
In other words, the scenario I'm trying to implement is that the user submits a form that has some checkboxes (generated dynamically from an XML file), and the params will be passed to a controller that runs a script or a service, which uses these params as criteria to retrieve some rows from an xlsx file and displays these rows in a view.
My question is what are the best practices for doing that?
I pass the params from a controller to a service (although I don't have a domain class for the data, since it can be updated every day), and inject this service into BootStrap.
I call a method in an external Groovy class (in the src folder) that uses the params to retrieve data from the Excel file.
Also, what is the most efficient way to parse an Excel or CSV file and extract records with data that match the params? (I can't use a DB or domain classes because the Excel file can be updated with new columns over time.)
params in Grails do a good job. However, I would suggest that you consider using command objects. Command objects are like Grails domain classes, but they do not persist data to a physical database. You can easily pass the instantiated command object to the external Groovy class (in the src folder) instead of passing params.
You can find more details here: http://guides.grails.org/command-objects-and-forms/guide/index.html
I'm importing a Swagger specification file into Postman to create a collection. At this point everything works as expected and the collection is generated with all requests and sub-folders, fine! But when the API is updated, I need to update the Postman collection so that all requests reflect the new specification. I can't find an action like "update" or anything similar. When I try to import the new specification into Postman, it says:
A collection APIName already exists.
What would you like to do?
Replace or Import as copy
A copy is not a feasible option, so I use Replace and the existing collection is updated, but all tests, parameters, and pre-request scripts are removed and I need to reconfigure everything again.
Am I missing something? Is there a way to import and update an existing collection from a specification file without losing the existing tests and configuration?
thanks in advance!
Postman does not support this as of now. Link
An alternative I learned from this blog. In short:
Update your OpenAPI YAML/JSON files.
Import to Postman as a new collection.
Export the new collection from Postman as JSON in Collection v2.1 format (recommended).
Using the Postman API (Update Collection), update the existing collection with the JSON from step 3 as the request body. Make sure to set collection_uid accordingly (a Python sketch of this call follows the body sample below).
Postman update collection API body sample:
{
  "collection":
    <------- YOUR EXPORTED COLLECTION HERE --------->
}
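For step 4, a minimal sketch in Python (using the requests library) could look like the following. The API key, collection uid, and exported file name are placeholders to fill in with your own values; the PUT endpoint and the X-Api-Key header are the ones documented for the Postman API's Update Collection call.

import json
import requests

POSTMAN_API_KEY = "your-postman-api-key"         # placeholder: generate a key in your Postman account
COLLECTION_UID = "your-existing-collection-uid"  # placeholder: uid of the collection to update

# The collection exported from Postman in step 3 (Collection v2.1 JSON).
with open("exported_collection.json") as f:
    exported = json.load(f)

# App exports are usually the bare collection, while the Update Collection body
# expects it wrapped in a top-level "collection" key, as in the sample above.
body = exported if "collection" in exported else {"collection": exported}

response = requests.put(
    "https://api.getpostman.com/collections/" + COLLECTION_UID,
    headers={"X-Api-Key": POSTMAN_API_KEY},
    json=body,
)
response.raise_for_status()
print(response.json())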
I have made a small tool to do this: swagger2postman, which converts a Swagger spec to a Postman collection and updates an existing collection.
The tool combines the new and old collections; on conflict, it uses what is saved in Postman. It can detect updates in query parameters, but not in the POST body. It keeps all of your collection items and test cases.
There is no straightforward solution. There also probably won't be one for quite some time - the devs said it is hard to correlate old Postman requests in a collection with new requests generated from an incoming Swagger file (or any other source of updates, for that matter).
You can do it outside of Postman, though. Postman collections are really just JSON data and can be manipulated as such.
Transform your Swagger file into a Postman collection file. (You can just import it and export it again, or use tools like Swagger2Postman if you want to automate it. Save it as Collection v2.0 or newer; that format makes step 3 a lot easier.)
Export your old collection in the same format version.
Merge the two JSONs in your preferred scripting language (examples below).
Import the merged JSON back into Postman.
I made myself a simple helper for step 3. It is fine if you are just adding more items to the collection (my case), but it can't remove anything. If anyone has a more versatile solution, please post it - I would gladly use it instead.
function execute() {
  // Parse the old collection and the collection freshly generated from Swagger.
  var collection = JSON.parse($(".collection").val());
  var swagger = JSON.parse($(".swagger").val());
  // Deep-merge: values from the old collection win over the Swagger-generated ones.
  var result = JSON.stringify($.extend(true, {}, swagger, collection));
  $(".result").val(result);
}
<html><body>
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<br>Collection: <br> <textarea class="collection"></textarea>
<br>Swagger: <br> <textarea class="swagger"></textarea>
<br>Result: <br> <textarea class="result"></textarea>
<br>
<button onClick="execute()">EXECUTE</button>
</body></html>
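The same merge can also be scripted outside the browser. A rough Python sketch of the same idea, with the same limitations (the old collection wins on conflict, nothing gets removed, and lists are replaced wholesale rather than merged element by element), could look like this; the file names are placeholders:

import json

def deep_merge(base, override):
    # Recursively merge 'override' into 'base'; values from 'override' win on conflict.
    result = dict(base)
    for key, value in override.items():
        if key in result and isinstance(result[key], dict) and isinstance(value, dict):
            result[key] = deep_merge(result[key], value)
        else:
            result[key] = value
    return result

# Collection freshly generated from the Swagger file (placeholder file name).
with open("swagger_as_collection.json") as f:
    swagger_collection = json.load(f)

# Your existing, customized collection exported from Postman (placeholder file name).
with open("old_collection.json") as f:
    old_collection = json.load(f)

merged = deep_merge(swagger_collection, old_collection)

with open("merged_collection.json", "w") as f:
    json.dump(merged, f, indent=2)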
It seems that they have now implemented this feature:
https://github.com/postmanlabs/postman-app-support/issues/6722#issuecomment-652929581
https://learning.postman.com/docs/designing-and-developing-your-api/validating-elements-against-schema/
You can paste your new schema in the API define tab and update it.
I want to use TensorFlow Serving for a custom model (No pre-trained starting point).
I've made it through the pre-Kubernetes part of the TensorFlow Serving tutorial for Inception, using Docker: http://tensorflow.github.io/serving/serving_inception
I understand (roughly) that the Bazel compilation step is central to how everything works. But I am trying to understand how the generated predict_pb2 from tensorflow_serving.apis works, so that I can swap in my own custom model.
To be clear, this is what the main in inception_client.py currently looks like:
def main(_):
  host, port = FLAGS.server.split(':')
  channel = implementations.insecure_channel(host, int(port))
  stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)
  # Send request
  with open(FLAGS.image, 'rb') as f:
    # See prediction_service.proto for gRPC request/response details.
    data = f.read()
    request = predict_pb2.PredictRequest()
    request.model_spec.name = 'inception'
    request.model_spec.signature_name = 'predict_images'
    request.inputs['images'].CopyFrom(
        tf.contrib.util.make_tensor_proto(data, shape=[1]))
    result = stub.Predict(request, 10.0)  # 10 secs timeout
    print(result)
https://github.com/tensorflow/serving/blob/65f50621a192004ab5ae68e75818e94930a6778b/tensorflow_serving/example/inception_client.py#L38-L52
It's hard for me to unpack and debug what predict_pb2.PredictRequest() is doing, since it's Bazel-generated. But I would like to re-point this at a totally different saved model, with its own .pb file, etc.
How can I refer to a different saved model?
PredictionService, defined here, is the gRPC API service definition which declares what RPC functions the server will respond to. From this proto, bazel/protoc can generate code that will be linked in the server and in the client (predict_pb2 that you mentioned).
The server extends the autogenerated service here and provides an implementation for each function.
Python clients use the provided predict_pb2 to build a request and send an RPC using the right API.
predict_pb2.PredictRequest() is a PredictRequest proto, defined here, which is the request type for the Predict() API call (see the PredictionService proto definition linked above). That part of the code simply builds the request, and result = stub.Predict(request, 10.0) is where the request is actually sent.
In order to use a different model, you'd just need to change the ModelSpec's model name to your model's name. In the example above, the server loaded the Inception model under the name "inception", so the client queries it with request.model_spec.name = 'inception'. To use your model instead, you'd just need to change the name to your model's name. Note that you'll probably also need to change the signature_name to your custom name, or remove it entirely to use the default signature (assuming it's defined).
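For example, if your SavedModel were exported under the hypothetical name my_model with a serving_default signature and an input tensor keyed as input, the relevant lines of the client above might look like this (all three names are assumptions; they must match whatever you used when exporting your model):

# Hypothetical example: 'my_model', 'serving_default', and the 'input' key must
# match the name, signature, and input tensor key your model was exported with.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'my_model'
request.model_spec.signature_name = 'serving_default'  # or omit it to fall back to the default signature
request.inputs['input'].CopyFrom(
    tf.contrib.util.make_tensor_proto(data, shape=[1]))
result = stub.Predict(request, 10.0)  # 10 secs timeout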
I'm working on a Pylons project using Jinja2 templating. I want to test for the request URI and/or controller inside the Jinja2 templates - is there an equivalent to a getRequestUri() call? I can set a context variable as a flag inside all the controller methods to do what I want, but that seems a bit like writing my home address on each and every one of my house keys... i.e. not quite the right way to do it.
Solution: not quite a function call, but I can test against url.environ.PATH_INFO. It only gives me the URL path, not the hostname, and I don't know that it would give me the query string, but it gives me what I need.